Sometimes, I browse my old scripts in my private GitHub repository and think… Why did I write it like this? Where’s the error handling? I could improve this with what I’ve learned since writing it years ago, etc. In this blog post, I will show you a few examples of just that and how I improved 🙂

How I started writing my first few scripts
I think everyone started with scripts that had variables hard-coded at the top and used them throughout the script. The script was meant to get a particular thing done once… For example (I changed the domains to generic ones because they belong to a previous customer):
$Group = 'Domain Admins'
$Recursive = $true
$Domains = 'Customer.local', 'subdomain.customer.local'
$LogFile = 'C:\temp\DomainAdminscan.log'
I also used many Read-Host prompts to create scripts that ask for a specific server, username, process, etc., store the answer in a variable, and use it in the script. For example, below I first prompt for the first letters of the desktop or laptop name and use the Quest cmdlet to find the AD computer objects, then check whether the Avast Anti-Virus software is installed by looking for its service:
$criteria = Read-Host 'Please enter the search criteria (pct*, or lat*)'
$computers = Get-QADComputer -SamAccountName "*$criteria*"
foreach ($i in $computers) {
    $service = Get-WmiObject Win32_Service -Filter "Name = 'avast! Antivirus'" -ComputerName $i.Name
    if (-not $service) { Write-Host "Avast! Antivirus is not installed on $($i.Name)" }
}
And yes, clear-text passwords were also a thing in scripts 🙁 For example, the part below was used to gather snapshot information from VMware vCenter (I changed the server names, usernames, and password, too):
Connect-VIServer -Server srv325 -User 'domain\task' -Password Passw0rd!
$snapshotsNL = Get-Snapshot -VM * -Server srv325
Connect-VIServer -Server srv525 -User 'domain\task' -Password Passw0rd!
$snapshotsUSA = Get-Snapshot -VM * -Server srv525
But also, because there were no cmdlets/modules for everything back then, I called executables from my scripts and scraped their output. For example, this was part of a script that logged off accounts from servers in the domain (not Terminal Servers, but member servers that had admins logged in):
query user /server:$computer | ?{$_ -match "\b$User\b"} | ?{$_ -match "ICA-tcp#\d*"} | %{logoff $matches[0] /server:$computer}
or this snippet to query Scheduled Tasks:
schtasks.exe /Query /FO CSV /s $server.Name /v > $env:TEMP\schtemp.csv
Things that I improved on
I try to create scripts now that are more flexible, offer error handling (and yes, sometimes I add more error handling than actual script logic :P), are easy to read (formatting in VS Code FTW), work in every situation (not only on my computer with all modules already installed), etc.
I like to share my scripts with the community by posting them on my GitHub page now, but in the past, I just kept them to myself so that nobody could comment on my scripting skills. You should never do that; you can learn from others, which means sharing your scripts and being ready for feedback. And yes, that’s scary at first. But don’t worry, the PowerShell community is not like the ‘feedback’ you get from Stack Overflow 😀
In the past, I thought PowerShell was for small admin tasks, but now I see more and more possibilities for using it for almost anything.
Personal best practices
If you take a look at my scripts from the last two years on my GitHub page or in the blog posts on this website, you’ll see a pattern in them regarding:
Functions
Instead of running things as a script, I like creating Functions that I can add to my PowerShell profile and have available in my sessions immediately. Keep in mind that if you run a script that only defines a Function, nothing happens until you call that Function by name (Get-XYZ, Invoke-123, or similar). (I had a lot of questions on my blog posts that the script was executed, but nothing happened…)
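As a minimal sketch of that pattern (the function and file names below are just made-up examples), the file only defines the function, and the profile dot-sources it so the function is ready in every session:

# Get-DiskReport.ps1 - hypothetical example; running this file defines the function but does nothing visible
function Get-DiskReport {
    param (
        [string]$ComputerName = $env:COMPUTERNAME
    )
    Get-CimInstance -ClassName Win32_LogicalDisk -ComputerName $ComputerName
}

# In your PowerShell profile ($PROFILE), dot-source the file so the function loads in every session:
# . 'C:\Scripts\Get-DiskReport.ps1'
# Only when you actually call Get-DiskReport does something happen.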
GridView
Instead of filtering data from a file or screen output, I like to send data to Out-GridView/Out-ConsoleGridView for easy filtering and for selecting one or more items using the OutputMode parameter.
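A small sketch of that pattern (using services just as example data): pipe the results to Out-GridView and capture whatever the user selects.

# Show all services in a grid; the items you select end up in $selection
$selection = Get-Service | Out-GridView -Title 'Select one or more services' -OutputMode Multiple

# $selection now contains the selected service objects for further processing
$selection | ForEach-Object { $_.DisplayName }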
Output to CSV/XLSX
I started by exporting data from my scripts to CSV. Still, I use Doug Finke’s ImportExcel module more and more to export to Excel immediately, making it easier for the people who receive the reports (and not making them import/convert the data themselves).
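A small sketch of the difference, assuming the ImportExcel module is installed (the report content and file paths are just examples):

$report = Get-Service | Select-Object Name, Status, StartType

# The old way: a plain CSV the recipient still has to import/convert
$report | Export-Csv -Path C:\Temp\ServiceReport.csv -NoTypeInformation

# With the ImportExcel module: a ready-to-open workbook with a formatted table
$report | Export-Excel -Path C:\Temp\ServiceReport.xlsx -WorksheetName 'Services' -TableName 'Services' -AutoSize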
Parameters
I love using parameters to make my scripts more flexible and easy to run. I tend to give them a default value so that you can run the script as-is or override it from the command line with a more specific value if needed: ParameterSets, ValidateSets, [String[]] so that you can specify multiple values for a parameter, Mandatory $true or $false, etc.
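As a small sketch of what that looks like (the function, parameter names, and values are just examples, not from one of my actual scripts):

function Get-ServerReport {
    param (
        # Accepts one or more server names, defaults to the local machine
        [string[]]$ComputerName = @($env:COMPUTERNAME),

        # Restrict the report type to a known set of values
        [ValidateSet('Disk', 'Memory', 'Services')]
        [string]$ReportType = 'Disk',

        # The output path must always be supplied
        [Parameter(Mandatory = $true)]
        [string]$OutputPath
    )

    foreach ($computer in $ComputerName) {
        Write-Host "Generating $ReportType report for $computer, saving to $OutputPath"
    }
}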
PSCustomObject
That’s my go-to for storing results, values, etc., in scripts. I like building them up into a nice in-script database that I can use for reporting, output, etc.
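A minimal sketch of that pattern (the properties are just examples): build one object per item with exactly the properties the report needs, and collect them into one variable.

$total = foreach ($service in Get-Service) {
    [PSCustomObject]@{
        Name      = $service.Name
        Status    = $service.Status
        StartType = $service.StartType
        Checked   = Get-Date
    }
}

# $total is now a collection you can filter, sort, export to CSV/XLSX, or show in a GridView
$total | Where-Object Status -eq 'Running' | Sort-Object Name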
Try/Catch
Instead of getting red error lines on your screen, I use Try/Catch to try an action and, if it fails, catch it, show an error message, and stop processing the script. It prevents raw error output on your screen and gives you the chance to provide user-friendly error messages 🙂
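A small sketch of that, using a module import as the action being tried (the module name is just an example):

try {
    # -ErrorAction Stop turns a non-terminating error into one that Catch can handle
    Import-Module ImportExcel -ErrorAction Stop
}
catch {
    # Show a friendly message instead of the raw red error text, then stop processing
    Write-Warning "Could not load the ImportExcel module: $($_.Exception.Message)"
    return
}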
Things I want to improve on
With PowerShell, you’re never done learning 🙂 I want to learn and improve on:
- Creating PowerShell Modules and publishing them on the PSGallery
- Building GUI-based scripts
- Using Pester
- DevOps CI/CD stuff
- Creating larger scripts and using more Functions inside them
Wrapping up
And that’s my story about how I started writing my first scripts, how I improved, and that there’s still much to learn. What did you improve on compared to your first scripts? Leave a comment below!
Have a lovely weekend!