PowerShell tips and tricks

Over the years I have learned a lot of PowerShell tricks, but I have also stopped using a few because they are outdated or no longer smart to use performance-wise. In this blog post I will share a few of them 🙂 (These are just a few; I will write a new blog post with more in the future.)

Color coding output on-screen

When creating scripts, I use Write-Host a lot to display status on the screen, so I can see what the script is doing and whether everything is alright. By default, Write-Host writes its output in gray, so you can't tell at a glance if a message is good or bad. With the -ForegroundColor parameter you can output in the following colors: Black, Blue, Cyan, DarkBlue, DarkCyan, DarkGray, DarkGreen, DarkMagenta, DarkRed, DarkYellow, Gray, Green, Magenta, Red, White and Yellow.

For normal status messages I always use Green, Red for errors, and usually Yellow for warnings 🙂 (For warnings you can also use the Write-Warning cmdlet.)
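For example, a minimal sketch of color-coded status messages (the message texts are just examples):

```powershell
# Green for success, Yellow for warnings, Red for errors
Write-Host "User created successfully" -ForegroundColor Green
Write-Host "User already exists, skipping" -ForegroundColor Yellow
Write-Host "Could not connect to the domain controller" -ForegroundColor Red

# Write-Warning uses the warning stream and prefixes the message with "WARNING:"
Write-Warning "User already exists, skipping"
```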

Filtering data server-side

If you’re retrieving a lot of data, for example users and groups in an (Azure) Active Directory, pulling all the data to your client and then running Where-Object on it is resource-hungry and inefficient. It’s better to filter the data server-side before pulling it to your client. The Exchange, Active Directory and Azure cmdlets have this feature, for example: (And yes, I should have used that in the Memory Management example below too 🙂 )

Get-ADUser -Filter {Department -Like "IT"} -Properties Department

This retrieves only the users from Active Directory that have the department “IT” configured. (Note that -Like without a wildcard behaves like -eq; use a pattern such as “IT*” to also match departments that start with “IT”.)
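To illustrate the difference, here is a sketch comparing the two approaches, assuming the standard ActiveDirectory module:

```powershell
# Client-side: ALL users cross the wire, then get filtered locally
Get-ADUser -Filter * -Properties Department |
    Where-Object { $_.Department -like "IT" }

# Server-side: the domain controller does the filtering,
# only the matching users cross the wire
Get-ADUser -Filter { Department -like "IT" } -Properties Department
```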

Formatting output on-screen

Getting the output on your screen using Write-Host is easy, but sometimes you run into special characters and can’t get the output to show correctly. For example:

Write-Host "Found $($user.SamAccountName)'s Office location to be $($user.physicalDeliveryOfficeName)"

Normally I would put the text between single quotes, but because the text contains the "'s Office location" part, that's not possible here. It's better to use the -f format operator, so that you don't have to think about correctly escaping the " and ' characters:

Write-Host ("Found {0}'s Office location to be {1}" -f $user.SamAccountName, $user.physicalDeliveryOfficeName)

The {0} and {1} placeholders refer to the variables after the -f operator, in order: {0} is the first one, {1} the second, and so on. You can use as many as needed, as long as the numbers match the positions in the comma-separated list.
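As an extra (standard .NET composite formatting, shown here with made-up values): the placeholders also accept format and alignment specifiers, which is handy for columns and numbers:

```powershell
# {0,-15} left-aligns the first value in a 15-character column,
# {1:N0} formats the second value with thousands separators
Write-Host ("{0,-15} {1:N0} bytes" -f "pagefile.sys", 1073741824)
```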

Invoke-WebRequest speed-up

With this cmdlet you can download data from the internet, but when you download a large file (an ISO, for example) it’s not that fast. The reason is that Invoke-WebRequest renders the download progress bar on-screen while it runs. You can download much faster by setting this preference variable before downloading:

$ProgressPreference = "SilentlyContinue"

I measured the difference with a script that downloads a 100MB.bin file from a speedtest site:

$stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
Invoke-WebRequest -Uri https://speed.hetzner.de/100MB.bin -OutFile C:\temp\100MB.bin
$stopwatch.Stop()
Write-Host "Done, the download took $($stopwatch.Elapsed.Hours) hours, $($stopwatch.Elapsed.Minutes) minutes and $($stopwatch.Elapsed.Seconds) seconds" -ForegroundColor Green

The download completed in 6 minutes and 46 seconds. Then I added $ProgressPreference = "SilentlyContinue" to the script and ran it again:

$stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
$ProgressPreference = "SilentlyContinue"
Invoke-WebRequest -Uri https://speed.hetzner.de/100MB.bin -OutFile C:\temp\100MB.bin
$stopwatch.Stop()
Write-Host "Done, the download took $($stopwatch.Elapsed.Hours) hours, $($stopwatch.Elapsed.Minutes) minutes and $($stopwatch.Elapsed.Seconds) seconds" -ForegroundColor Green

Now the result is… 19 seconds!! 🙂
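One thing to be aware of: $ProgressPreference stays changed for the rest of the session or scope, so my own habit (not required, just tidy) is to restore it afterwards:

```powershell
# Save the current preference, silence the progress bar, and restore it
# afterwards even if the download fails
$oldProgressPreference = $ProgressPreference
$ProgressPreference = "SilentlyContinue"
try {
    Invoke-WebRequest -Uri https://speed.hetzner.de/100MB.bin -OutFile C:\temp\100MB.bin
}
finally {
    $ProgressPreference = $oldProgressPreference
}
```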

Memory management

Most scripts involve retrieving a lot of data and looping through it, which can eat up your precious memory during execution. If you run the script on your workstation, only you suffer the consequences, but when running it on a server it might cause slowdowns for your users, and that’s something you don’t want 😉

In the past I did things like this:

$Users = Get-ADUser -Filter * -Properties *
Foreach ($user in $Users) {
    if ($user.physicalDeliveryOfficeName -ne 'Amsterdam') {
        Set-ADUser $user.SamAccountName -Office 'Amsterdam'
    }
}

It retrieves all users with all their properties from Active Directory, loops through them all and changes the Office location to Amsterdam where needed. This works, but it’s not very nice in a larger environment and consumes more memory and CPU than needed. It’s better to retrieve only the basic user information plus the one extra attribute you need (the office location, which is physicalDeliveryOfficeName) and add that to your query. The script would then look like this:

$Users = Get-ADUser -Filter * -Properties physicalDeliveryOfficeName
Foreach ($user in $Users) {
    if ($user.physicalDeliveryOfficeName -ne 'Amsterdam') {
        Set-ADUser $user.SamAccountName -Office 'Amsterdam'
    }
}

This is better, with less memory and CPU consumption, but the $Users variable is now set in your script while you only use it once. It’s better to put the query directly in the Foreach loop, which makes your code a bit more compact and avoids keeping the data in a separate variable. The script looks like this after the change:

Foreach ($user in Get-ADUser -Filter * -Properties physicalDeliveryOfficeName) {
    if ($user.physicalDeliveryOfficeName -ne 'Amsterdam') {
        Set-ADUser $user.SamAccountName -Office 'Amsterdam'
    }
}
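A side note on this pattern: the foreach statement still evaluates the whole query before the loop starts, while the pipeline streams objects one at a time. So for very large result sets, a ForEach-Object pipeline (a sketch, same assumed attributes as above) can keep memory usage even lower:

```powershell
# Streams users through the pipeline one object at a time
Get-ADUser -Filter * -Properties physicalDeliveryOfficeName |
    Where-Object { $_.physicalDeliveryOfficeName -ne 'Amsterdam' } |
    ForEach-Object { Set-ADUser $_.SamAccountName -Office 'Amsterdam' }
```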

Garbage collection is another way to keep memory usage low while running scripts that collect a lot of data. After each memory-intensive part of the script, you can run this command to free as much memory as possible. (I’ve had scripts that cleared over 3 GB of RAM when running it 🙂 )

[System.GC]::GetTotalMemory($true) | Out-Null
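GetTotalMemory($true) forces a full collection as a side effect of measuring memory. The collector can only reclaim data that nothing references any more, so it helps to drop the variable first. A minimal sketch ($Users is just an example variable name):

```powershell
# Drop the reference so the garbage collector can reclaim the data,
# then trigger a collection explicitly
Remove-Variable -Name Users -ErrorAction SilentlyContinue
[System.GC]::Collect()
```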

Multiple variables in a Foreach Loop

In the past, when I had multiple array variables in a script and wanted to loop through them separately, I would use multiple Foreach loops even though they ran the same commands inside the loop. Instead, you can specify multiple variables in a single loop like this (snippet from the Get-SecurityEvents function I wrote recently):

foreach ($eventids in $filteruseraccountmanagement,
                      $filtercomputeraccountmanagement,
                      $filtersecuritygroupmanagement,
                      $filterdistributiongroupmanagement,
                      $filterapplicationgroupmanagement,
                      $filterotheraccountmanagement) {
    ......
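The comma operator builds one collection out of the individual arrays, so the loop visits each array in turn. A small self-contained sketch with made-up variable names:

```powershell
$fruits = @('apple', 'banana')
$colors = @('red', 'green', 'blue')

# The comma-separated list is treated as one combined collection;
# each $item is one of the arrays, not a single element
foreach ($item in $fruits, $colors) {
    Write-Host ("Processing {0} entries" -f $item.Count)
}
```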

Requires

You can use #Requires statements at the top of your script to check certain things before the script starts. If a requirement is not met, the script throws an error and stops. Some examples that I use are:

  • #Requires -Modules { <Module-Name> | <Hashtable> } lets you specify the modules that need to be installed on your system. For example: #Requires -Modules Az.Accounts, PSParseHTML
  • #Requires -RunAsAdministrator checks if the script was started elevated
  • #Requires -Version 7.0 checks the PowerShell version, which is handy if you use a specific feature that doesn’t work in 5.1

More about this can be found in the about_Requires documentation on docs.microsoft.com.
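Put together, the top of a script could look like this (the module names are just examples):

```powershell
#Requires -Version 7.0
#Requires -RunAsAdministrator
#Requires -Modules Az.Accounts, PSParseHTML

# This line is only reached when all requirements above are met
Write-Host "All requirements met, starting..." -ForegroundColor Green
```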

Where-Object

It’s probably something I learned in PowerShell 2.0 and kept using that way for multiple PowerShell versions, but the Where-Object cmdlet has changed. In the past you had to do this to filter objects within a pipeline:

Get-ChildItem -Path C:\temp | Where-Object {$_.BaseName -Match '2022'}

You can also do this by running:

Get-ChildItem -Path C:\temp | Where-Object BaseName -Match '2022'

This simplified syntax only works with a single condition; for multiple conditions you still need the script block. (Example for getting files with 2022 in their name that are not empty/0 KB:)

Get-ChildItem -Path C:\temp | Where-Object {$_.BaseName -match '2022' -and $_.Length -gt 0}
