© Adam Bertram 2020
A. Bertram, Building Better PowerShell Code
https://doi.org/10.1007/978-1-4842-6388-4_16

16. Build Scripts for Speed

Adam Bertram
Evansville, IN, USA

Although this chapter may seem to conflict with earlier advice about not focusing purely on performance, there is a fine line to walk. On the one hand, you don’t need to get bogged down shaving microseconds off your runtime. On the other hand, you shouldn’t completely disregard script performance.

There is a gray area that you need to stay within to ensure a well-built PowerShell script.

Don’t Use Write-Host in Bulk

Although some would tell you never to use the Write-Host cmdlet, it still has its place. But with that functionality comes a small performance hit. Write-Host does nothing “functional”; the cmdlet simply writes text to the PowerShell console.

Don’t add Write-Host references in your scripts without thought. For example, don’t put Write-Host references in a loop with a million items in it. You’ll never read all of that information, and you’re slowing down the script unnecessarily.

If you must write information to the PowerShell console, use [Console]::WriteLine() instead.
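As a rough illustration, here’s a minimal sketch comparing the two approaches with Measure-Command; the loop size and messages are arbitrary and exact timings vary by system, but the [Console]::WriteLine() loop generally finishes faster:
## Time Write-Host over a large loop (item count is arbitrary)
$withWriteHost = Measure-Command {
    foreach ($i in 1..100000) { Write-Host "Processing item $i" }
}
## Time [Console]::WriteLine() over the same loop
$withConsole = Measure-Command {
    foreach ($i in 1..100000) { [Console]::WriteLine("Processing item $i") }
}
"Write-Host:             $($withWriteHost.TotalSeconds) seconds"
"[Console]::WriteLine(): $($withConsole.TotalSeconds) seconds"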

Tip Source: https://twitter.com/brentblawat

Further Learning

Don’t Use the Pipeline

The PowerShell pipeline, although a wonderful feature, is slow. The pipeline performs magic behind the scenes to bind the output of one command to the input of another command. All of that magic is overhead that takes time to process.

You can see an example of the pipeline’s speed in the following. Using the foreach() method on an array of 1,000,000 items is three times as fast as the pipeline.
[Figure omitted: Measure-Command timing comparison of the foreach() method vs. the pipeline]
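The book’s screenshot isn’t reproduced here, but a minimal sketch like the following, with the array contents assumed, shows how you could measure the difference yourself:
## Build a large test array (1,000,000 items)
$array = 1..1000000
## Send every item through the pipeline
$pipeline = Measure-Command { $array | ForEach-Object { $_ } }
## Use the foreach() method on the collection instead
$method = Measure-Command { $array.foreach({ $_ }) }
"Pipeline:         $($pipeline.TotalMilliseconds) ms"
"foreach() method: $($method.TotalMilliseconds) ms"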

The pipeline simply has a lot more going on behind the scenes. When the pipeline isn’t necessary, don’t use it.

Further Learning

Use the foreach Statement in PowerShell Core

PowerShell has a few different ways to iterate through collections. The fastest way is with the foreach statement in PowerShell Core. The speed of the foreach statement varies across PowerShell versions, but in PowerShell Core, the PowerShell team has really made it fly.

Consider the following example of iterating through an array of 1,000,000 strings with the foreach statement vs. using the foreach() method on the collection. These two examples perform the exact same function.
[Figure omitted: Measure-Command timing comparison of the foreach statement vs. the foreach() method]
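Again, the screenshot isn’t reproduced here, but a comparable sketch, with the string contents assumed, might look like this:
## Build an array of 1,000,000 strings (contents are arbitrary)
$array = foreach ($i in 1..1000000) { "item$i" }
## Iterate with the foreach statement
$statement = Measure-Command { foreach ($item in $array) { $item } }
## Iterate with the foreach() method
$method = Measure-Command { $array.foreach({ $_ }) }
"foreach statement: $($statement.TotalMilliseconds) ms"
"foreach() method:  $($method.TotalMilliseconds) ms"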

You can see here that the foreach() method took four times as long!

When you need to process large collections, consider using the foreach statement rather than the foreach() method. You could alternatively use the ForEach-Object cmdlet as well, as long as you do so without the pipeline:
ForEach-Object -InputObject $array -Process { $_ }

Further Learning

Use Parallel Processing

Leveraging PowerShell background jobs and .NET runspaces, you can significantly speed up processing through parallelization. Background jobs are a native PowerShell feature that lets you run code in the background in a job. A runspace is a similar .NET concept but requires a deeper understanding of .NET. Luckily, the PoshRSJob PowerShell module makes runspaces much easier to work with.
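As a quick taste of the runspace approach, a minimal PoshRSJob sketch might look like the following; the module (installed from the PowerShell Gallery), the server names, and the Test-Connection check are all assumptions for illustration:
## Install the module once if you don't already have it
# Install-Module -Name PoshRSJob -Scope CurrentUser
$servers = 'SRV1', 'SRV2', 'SRV3'
## Start one runspace job per server, running at most 10 at a time
$servers | Start-RSJob -Throttle 10 -ScriptBlock {
    ## $_ is the current server name coming from the pipeline
    Test-Connection -ComputerName $_ -Count 1 -Quiet
}
## Wait for every runspace job to finish, then collect results and clean up
$null = Get-RSJob | Wait-RSJob
Get-RSJob | Receive-RSJob
Get-RSJob | Remove-RSJob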

Let’s say you have a script that attempts to connect to hundreds of servers. These servers can be in all different states: completely offline, misconfigured so that no connection succeeds, or connectable. If you process each connection in serial, you’ll have to wait for each offline or misconfigured server to time out before attempting to connect to the next one.

Instead, you could put each connection attempt in a background job, which starts them all at nearly the same time, and then wait on all of them to finish.

Maybe you have a text file full of server names with one server name per line like this:
SRV1
SRV2
SRV3
....

You then read these server names with Get-Content -Path C:\servers.txt.

Connecting to each of these servers in serial may look like this. The following code snippet uses the Invoke-Command command’s ability to process multiple computers at once by passing in an array of server names to the ComputerName parameter:
$servers = Get-Content -Path C:\Servers.txt
Invoke-Command -ComputerName $servers -ScriptBlock {
    ## Do something on the server here
}
The ability to process many servers at once is handy, but by default, Invoke-Command doesn’t return control of the console until every computer has finished. Instead, you can use the AsJob parameter on Invoke-Command to run the connections as background jobs and immediately keep working.
$servers = Get-Content -Path C:\Servers.txt
Invoke-Command -ComputerName $servers -AsJob -ScriptBlock {
    ## Do something on the server here
}
When Invoke-Command has finished starting all of the background jobs, it releases control of the console back to you. At that point, you can check on the status of the jobs using the Get-Job command and retrieve their output using the Receive-Job command, as shown in the following:
[Figure omitted: console output from Get-Job and Receive-Job for the Invoke-Command background jobs]
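The screenshot isn’t reproduced here, but checking on and collecting the job output typically looks something like this; exact job IDs and states will differ on your system:
## See the state of the job (and its per-computer child jobs)
Get-Job
## Block until everything finishes, then collect the output from each server
Get-Job | Wait-Job | Receive-Job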

Further Learning

Use the .NET StreamReader Class When Reading Large Text Files

The Get-Content PowerShell cmdlet works well for most files, but if you’ve got a large multi-hundred megabyte or multi-gigabyte file, drop down to .NET.

Perhaps you have a large file called C:\MyHugeFile.txt that is several gigabytes in size. You need to read the entire file, so you immediately reach for the cmdlet you’re most familiar with, which is Get-Content.
Get-Content -Path C:\MyHugeFile.txt
You’ll find that Get-Content takes a long time to read the entire file. Instead of using Get-Content, consider using the System.IO.StreamReader .NET class. Create an instance of the System.IO.StreamReader class pointed at the file you wish to read. Then, inside a while loop, use the Peek() method to check whether any data remains and the ReadLine() method to read each line, as you can see in the following:
$sr = New-Object -Type System.IO.StreamReader -ArgumentList 'C:\MyHugeFile.txt'
while ($sr.Peek() -ge 0) {
    $sr.ReadLine()
}
## Release the file handle when you're done reading
$sr.Dispose()

Using the Get-Content cmdlet is much simpler but much slower. If you need speed, though, the StreamReader approach is the way to go.
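If you’d like to confirm the difference on your own system, a rough comparison, with the file path assumed, could look like this:
$path = 'C:\MyHugeFile.txt'
## Time Get-Content reading every line (output discarded)
$getContent = Measure-Command { $null = Get-Content -Path $path }
## Time StreamReader reading every line
$streamReader = Measure-Command {
    $sr = New-Object -Type System.IO.StreamReader -ArgumentList $path
    while ($sr.Peek() -ge 0) { $null = $sr.ReadLine() }
    $sr.Dispose()
}
"Get-Content:  $($getContent.TotalSeconds) seconds"
"StreamReader: $($streamReader.TotalSeconds) seconds"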

Further Learning

  • PERF-02 Consider Trade-offs Between Performance and Readability
