PowerShell Serial Function to Parallel Workflow

I’ve been feeling very underwhelmed about blogging for the past week, so I needed something short and simple to discuss this time. Luckily I had an old script that I could adapt and explain.

PowerShell introduced Workflows in version 3, and they are particularly wonderful for the ability to create a ForEach loop that runs in parallel. This can significantly reduce your run times when making numerous remote connections, but as with most things, it depends on your use case and environment.

A standard PowerShell function executes against multiple computers in a serial fashion, meaning one at a time. This is perfectly fine for a small number of computers, but as the list grows, so does the time required. Adapting a function to a Workflow is not especially difficult, but you may run into some invalid commands. In my examples below, I built a function and a workflow with nearly identical syntax. Look closely, though, and you'll see that the Get-Service cmdlet in the workflow uses a -PSComputerName parameter instead of the standard -ComputerName parameter. Also, Out-Null is not available inside Workflows; I had to pipe to an InlineScript block to get that to work. Further details about Workflow scripting can be found here.

The example functions execute serially while the workflow executes in parallel. Get-ServiceTimed utilizes a standard cmdlet, but it also calculates the total run time like the rest of the examples. I've included it solely for comparison to Get-ServiceSerial, which uses a ForEach loop so as to be as similar to the workflow as possible.

Each function calculates and returns the runtime.
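As an aside, Measure-Command is an alternative to the Get-Date arithmetic used in the examples below; it wraps any script block and returns a TimeSpan. A minimal sketch:

```powershell
# Minimal sketch: time a script block with Measure-Command instead of
# subtracting two Get-Date values. Start-Sleep stands in for real work.
$Elapsed = Measure-Command { Start-Sleep -Milliseconds 200 }
"Run Time - $($Elapsed.TotalSeconds) seconds."
```

Either approach works; the Get-Date version used below keeps the timing inline with the rest of the function's output.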

Executed in 33.51 seconds against 32 computers.
Executed in 0.03 seconds against 2 computers.

Function Get-ServiceTimed
{
    Param ($Computers)
    $StartTime = Get-Date
    Get-Service -ComputerName $Computers | Out-Null
    "Run Time - Cmdlet - $(((Get-Date) - $StartTime).TotalSeconds) seconds."
}

ForEach Serial
Executed in 29.30 seconds against 32 computers, just as slow as using the cmdlet normally.
Executed in 0.03 seconds against 2 computers.

Function Get-ServiceSerial
{
    Param ($Computers)
    $StartTime = Get-Date
    ForEach ($Computer In $Computers)
    { Get-Service -ComputerName $Computer | Out-Null }
    "Run Time - Serial - $(((Get-Date) - $StartTime).TotalSeconds) seconds."
}

ForEach Parallel
Executed in 12.36 seconds against 32 computers. More than twice as fast as the serial execution.
Executed in 5.88 seconds against 2 computers. Significantly slower than any serial test.

WorkFlow Get-ServiceParallel
{
    Param ($Computers)
    $StartTime = Get-Date
    ForEach -Parallel ($Computer In $Computers)
    { Get-Service -PSComputerName $Computer | InlineScript { Out-Null } }
    "Run Time - Parallel - $(((Get-Date) - $StartTime).TotalSeconds) seconds."
}

Summary

I only supplied a few example execution times because they vary wildly depending on network traffic and configuration, so your results may differ. On average, serial execution was actually faster than parallel execution until I was connecting to more than ten computers. If you only work with a handful of computers, you may not need to worry about this. I'm often connecting to at least 30 computers at once, in which case parallel execution is always much faster.


T-SQL Tuesday #66: Monitoring with Utility Explorer

This month's topic for T-SQL Tuesday is hosted by Cathrine Wilhelmsen (blog | twitter) with the subject of monitoring.

The topic of the month is monitoring. We all monitor something while working with SQL Server: Maybe you’re a DBA who monitors database performance, an SSIS developer who monitors job and package execution, an application developer who monitors queries or a BI analyst who monitors report usage? Do you use T-SQL or PowerShell scripts? Have you created your own monitoring reports or applications? Have you invested in a complete monitoring solution from a vendor? What is monitored manually and what is automated with alerts? If a brand new SQL Server professional in your role asks you for advice, what do you say is the number one most important thing to monitor? Have you ever stayed awake for a week solving a problem that could have been avoided with proper monitoring? Did you implement a monitoring solution that saved your company?

I currently use so many different methods of monitoring that it was daunting to decide what to discuss. I could have written a high-level overview of everything, but that would have amounted to nothing more than mentioning each one in order to fit it all into one post. Policies, jobs, traces, audits, extended events, PowerShell…we use them all. Basically, if it is not a third-party tool (spending money on any solution is tough to justify here), my office uses it or will be giving it a shot soon. Coincidentally, we just started using another form of monitoring in the past few weeks, so I will discuss some of the trials and tribulations of using the SQL Server Utility.

Utility Explorer provides a simplified dashboard for monitoring and baselining SQL Server CPU and storage resources. This feature is not available in Standard Edition, like almost anything else new and exciting. To utilize it, you will need to create a Utility Control Point (UCP). I'd suggest using your Central Management Server to host it. Don't have a CMS? If you have more than two servers, make one; it's not hard. Unlike a CMS, though, the UCP can monitor itself. This is one of the more obscure features added in SQL Server 2008 R2, but you can access it in Management Studio by selecting View -> Utility Explorer.


After selecting your instance, you need to specify the account that will run the utility collection set. Using the Agent account is the easiest way to set this up, but you have to use a Windows domain account as the collection account regardless of which path you choose.


Select the instance where data will be collected. Again, I suggest your CMS. A utility database will be created to store data, and the default collection period is one year, so make sure you have sufficient drive space or be ready to reduce your retention period. No worries though, this new dashboard will remind you if your disk space starts getting low! The utility database will grow about 2GB, per monitored instance, per year (typically), while the msdb database should grow around 20MB, per instance, per year. Warning: if you also use Data Collector, your data growth can be exponentially higher.
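To put those growth figures in perspective, here is a quick back-of-the-envelope estimate for a hypothetical 30 monitored instances over one year (a rough sketch based only on the typical per-instance figures quoted above; the instance count is my own example):

```powershell
# Rough capacity estimate from the typical growth figures above:
# ~2GB in the utility database and ~20MB in msdb, per instance, per year.
$Instances       = 30
$UtilityGrowthGB = $Instances * 2    # utility database growth in GB
$MsdbGrowthMB    = $Instances * 20   # msdb growth in MB
"Utility DB: $UtilityGrowthGB GB per year, msdb: $MsdbGrowthMB MB per year."
```

That adds up fast, which is why trimming the retention period is worth considering before enrolling a large estate.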


Now the selected instance will be validated. The instance cannot be enrolled if it is already a member of another UCP. I'm using Collection Sets as well, so validation failed on that. You can bypass this issue by simply disabling your collection sets during the enrollment process; that will allow validation to succeed. Once enrolled, you can turn the collection sets back on. The two pieces are related (they use the same database) and yet separate enough that they don't always play well together.


Once you have your UCP created, you can begin enrolling instances, which is a very simple process. Just right-click on Managed Instances, or within the white space of Utility Explorer Content, and select Enroll Instance. Specify the instance name and the agent account, like you did above while creating the UCP, and then validate the enrollment. The instance may throw a warning about WMI. Since it’s just a warning, the enrollment will still succeed. My instances have not had any issues reporting data after throwing this warning. Researching the subject has thus been low priority, but it appears to be a permissions issue. Troubleshooting with that article has shown nothing of concern though…


Once you have a few instances enrolled, open the Utility Dashboard, which will look similar to this.


Select Managed Instances and select the tabs to view CPU usage, storage space, policy configuration for utilization pass/fails, and computer properties. Each enrolled instance will show a pass/fail mark based on the configured policies.

The Storage Utilization tab provides a quick look at your volume free space. You can drill down to see data file and log file sizes as well.
The Property Details tab summarizes the instance's software and hardware.

Finally, in Utility Administration, there are a few settings you need to be aware of. This is where you can set global policies for storage warnings, grant read privileges on the Utility dashboards to user accounts, and specify data retention. I suggest you start small and monitor how fast your sysutility database grows. Like I said earlier, combining this with Data Collection gave us massive data growth; more on that in the future.

Overall, Utility Explorer provides an easy-to-read dashboard for quick looks at your SQL Servers. If you are a graphical person and can spare the disk space, it's a great feature. Sure, you can get the current settings via queries, but the really valuable part is the automatic retention, which allows quick baselining of your CPU and storage. Capacity planning made easy.


Remotely Get & Set Delayed Start for a Service in PowerShell

This week I was asked to check all the SQL Servers to verify that a specific service was running and that it was set to Delayed Start. Requests like this are great fun for me: I get to play with PowerShell, and sometimes I even get to reuse scripts I've written.

The first step was something I’ve done before, setting a service through PowerShell, so I was confident that I had a script for it and the task would be complete in minutes.


I had around a hundred servers to update. My advice is to use a notepad file with a server list if you have to update more than even half a dozen computers; it's just a lot easier than formatting them all for a variable. The below declaration pulls in your notepad file using the Get-Content cmdlet. I keep a computer list on my desktop, and the $Env:UserName variable resolves to the currently logged-in user, so anytime I share a script, the next person just has to make a similar list. Remember to put one computer per line.

$Computers = ( Get-Content C:\Users\$Env:UserName\Desktop\Computers.txt )

I’m going to work with the Windows Update service in the following examples since every computer should have this.

$Service = "wuauserv"

What doesn’t work

It turns out that PowerShell does not natively support setting Automatic (Delayed Start) with the Set-Service cmdlet. Why was it overlooked? There are a few Microsoft Connect threads out there about the issue. Go ahead and vote them up; we might get lucky and it will make a difference.


Like the Connect threads mention, there is no property for startup type; you can verify that it's missing by piping Get-Service to Get-Member.

Get-Service | Get-Member -MemberType Property
Notice that Startup Type is missing.

This means we will have to set the service to running first, then set the startup type via a workaround afterwards.
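For reference, here is a sketch of the only startup types Set-Service will accept in Windows PowerShell; the absence of any delayed option is the whole problem.

```powershell
# The -StartupType values Set-Service accepts in Windows PowerShell 5.1
# and earlier. Note the absence of any delayed-start option.
$SupportedStartupTypes = 'Automatic', 'Manual', 'Disabled'

# Setting a supported type works fine, e.g.:
#   Set-Service -Name wuauserv -StartupType Automatic
# ...but there is no 'Automatic (Delayed Start)' equivalent to pass.
```

Worth noting: later, cross-platform versions of PowerShell (6+) did eventually add an AutomaticDelayedStart value to Set-Service, so on current versions the workaround below may no longer be necessary.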

There are a number of ways to set a service to running in PowerShell, but I’ll share two short ways. They are basically the same method; the first one just pipes the command. The benefit of the first method is that you can run the command before the pipe to verify what you are about to affect. It’s a bit like doing a select statement before you run an update in SQL. It’s all about being safe.

Method 1: Piping

Run the code before the pipe (the | character) first to verify what is going to be set. Then run the whole part once you are satisfied with the results.

Get-Service $Service -ComputerName $Computers |
Set-Service -Status Running

Method 2: Single Cmdlet

This method is more straightforward. Nothing is wrong with Method 2; it’s just for those who are a bit more confident in their PowerShell skills, or for the ones who like to throw caution to the wind.

Set-Service -Name $Service -ComputerName $Computers -Status Running

Verify Service is Running

It's a good idea to verify afterwards that the services are now running. There's almost always a computer that fails to connect, and you'll have the peace of mind of knowing that some random code you found online actually ran successfully. This query returns any computers with a service that is still not running.

Get-Service $Service -ComputerName $Computers |
Where-Object { $_.Status -ne 'Running' } |
Format-Table MachineName,Status,Name -AutoSize

Set Delayed Start

This step gets problematic. We have to step outside PowerShell a bit to use an obscure command-line utility (at least it's obscure to me): sc.exe config. Throwing a ForEach loop around the command will allow it to run against all your computers. Just remember the \\$Computer parameter, and that Start= Delayed-Auto has to be written exactly as I have it (including the space after the equals sign) unless you like getting errors. I added an optional Write-Host to insert whitespace and the computer name. You'll want that if any connections fail, so you know which computer is the issue.

ForEach ( $Computer In $Computers )
{
    Write-Host "`n $Computer"
    SC.EXE \\$Computer Config $Service Start= Delayed-Auto
}

Verifying Delayed Start

Now we have to use yet another method to verify that the service is set to Delayed Start. We already know plain PowerShell doesn't work. Using WMI doesn't help either; it returns misleading results. Everything shows as Auto! It doesn't tell you whether a service is Delayed Start or regular Automatic. For example:

ForEach ( $Computer In $Computers )
{
    Get-WmiObject -Query "Select StartMode From Win32_Service Where Name='$Service'" -ComputerName $Computer |
    Select-Object @{l='MachineName';e={$Computer}},StartMode
}

The only viable choice is to search for a registry value that is created for Delayed Start services. The following script will return your desired service if it has a DelayedAutoStart value. Since it's such a weird query, I wanted to see everything this time. I'm using Invoke-Command to demonstrate a different cmdlet. Notice that I defined $Service inside the ScriptBlock; if you fail to do this, the variable will not be defined on the remote machine. Everything is compartmentalized inside the Invoke-Command, per computer.

Invoke-Command -ComputerName $Computers -ScriptBlock {
    $Service = "wuauserv"
    Write-Host "`n $Env:ComputerName"
    Get-ChildItem HKLM:\SYSTEM\CurrentControlSet\Services |
    Where-Object { $_.Property -Contains "DelayedAutoStart" -And $_.PSChildName -Like "$Service*" } |
    Select-Object -ExpandProperty PSChildName
}


It was a lot more work than I anticipated to set Delayed Auto Start on a service, but I learned some neat tricks in the meantime. If you just have to set a service to running, or to one of the three supported startup types, you've gotten off easy and won't need this workaround.
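To tie the pieces together, here is a rough sketch that combines the steps from this post into one function. The function name Set-ServiceDelayedStart and its parameters are my own invention for illustration, not a built-in cmdlet, and it assumes the same sc.exe workaround described above.

```powershell
# Hypothetical wrapper combining the steps above: start the service on
# every computer, then set Delayed-Auto via sc.exe. The function name and
# parameter names are my own, not an established cmdlet.
Function Set-ServiceDelayedStart
{
    Param
    (
        [string[]] $Computers,
        [string]   $Service
    )
    # Start the service everywhere first; Set-Service handles -Status
    # but cannot set a delayed startup type in Windows PowerShell.
    Set-Service -Name $Service -ComputerName $Computers -Status Running

    ForEach ( $Computer In $Computers )
    {
        # Print the computer name so failed connections are easy to spot.
        Write-Host "`n $Computer"
        # Note the required space after 'Start=' in sc.exe syntax.
        SC.EXE \\$Computer Config $Service Start= Delayed-Auto
    }
}
```

Usage would look something like: Set-ServiceDelayedStart -Computers $Computers -Service "wuauserv", followed by the registry verification script above.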