PowerShell: Copy-Item to Many Remote Computers

I thought I already blogged about this, but when a friend asked me if I had a script for quickly moving a file to many different computers, I couldn’t find the page. Knowing it would be a lot easier to copy and paste a link rather than explain the script each time someone asked me for this, I had my next blog topic.

Basic Copying
Copying a file from one location to another using PowerShell is a pretty simple process, but you can encounter some hiccups with the file paths depending on your setup. At the most basic level, all you need is the Copy-Item cmdlet.

Copy-Item -Path 'C:\OriginalLocation\FileName.txt' -Destination 'C:\NewLocation'

Alternatively, you can use the Move-Item cmdlet in the same manner. Can you guess what the difference is? That’s right! Move-Item moves the item rather than copying it. I rarely move, but I copy all the time.
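For completeness, a move looks identical to the copy above; only the cmdlet name changes (paths here are placeholders):

```powershell
# Moves the file instead of copying it; the source file no longer exists afterward
Move-Item -Path 'C:\OriginalLocation\FileName.txt' -Destination 'C:\NewLocation'
```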

Multi-Computer Copying
The purpose of my script is for when you have a single file that needs to be pushed out to dozens of computers. It wraps Copy-Item in a loop over a list of computer names and adds Verbose logging, copy verification, and, of course, multi-computer copy capability.

Remember that $Path needs to be the full local path, including the file name and extension. $Destination needs to be formatted for remote access (DriveLetter$ rather than DriveLetter:) and should not include a file name. If you plan to rename the file across all the computers you are copying to, I suggest you just rename the original file first.

Finally, I suggest you always use the -Verbose parameter with this function. It’s just a lot nicer seeing what is happening to your computers rather than the script sitting there silently, leaving you in the dark while you fret over what it might be doing…or not doing.

Function Copy-File {
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory=$true,ValueFromPipeline=$true)][String[]]$ComputerName,
        [Parameter(Mandatory=$true)][String]$Path,
        [Parameter(Mandatory=$true)][String]$Destination
    )
    Process {
        #Extract FileName from Path
        $File = $Path.Split('\')
        $File = $File[$File.Length-1]

        ForEach ($Computer in $ComputerName) {
            Write-Verbose "Starting copy to $Computer"
            IF (Test-Path "\\$Computer\$Destination") {
                Write-Verbose "Folder $Destination exists on $Computer"
            }
            Else {
                Write-Verbose "Creating folder $Destination on $Computer"
                New-Item "\\$Computer\$Destination" -Type Directory | Out-Null
            }

            Write-Verbose "Copying $File to $Computer"
            Copy-Item -Path $Path -Destination "\\$Computer\$Destination" -Force
            IF (Test-Path "\\$Computer\$Destination\$File") {
                Write-Host "Copied to \\$Computer\$Destination\$File`n" -ForegroundColor Green
            }
            Else {
                Write-Warning "Copy failed to $Computer`n"
            }
        }
    }
}

And now for an example of execution. Declare the variables, then run the command. The function supports piping, but 99% of the time I just declare the variables first. Again, use a $ rather than a : in the destination’s path if you want this to copy files to a remote location. Don’t include a preceding \\ on the drive letter, though; that’s taken care of in the concatenation.

##### EXAMPLE #####
$Comp = "Comp1","Comp2","Comp3" #Get-Content to read a list from txt file
$Path = "C:\Users\$Env:Username\Desktop\Apples.txt" #Include the FileName and Extension
$Dest = "C$\Users\$Env:Username\Desktop" #Be Sure to use a $ instead of a :

#Quick Variables
Copy-File -Verbose $Comp $Path $Dest

#Piping a variable into the function
$Comp | Copy-File -Verbose -Path $Path -Destination $Dest #Include the parameter names for safety

Database Mirror Monitoring with Scripts

The database mirroring monitor is a nice tool, but registering all your databases and clicking through each one is time-consuming when you have dozens. I want quick, easy-to-read results when an error comes in. If things look bad, I’ll open the monitor for a specific database to start an in-depth look.

And yes, I do have reasonable alerts set up, but scripts are for proactive searching, dealing with new servers, and, most importantly, a jumping-off point when an alert is generated. My normal process is:

  1. Read alert email
  2. Open multi-server query and run all the mirroring scripts below
  3. Investigate specific databases that are still reporting errors from the scripts

Sometimes I get to stop at step 2 because the issue has already been resolved, whether by a network glitch clearing up or by someone else responding faster than me. I keep all the scripts saved in a single file, which I always have open and run periodically.

The scripts are not as refined as I’d like, so please tweak them; they do the job when errors are coming in, though. The only time I think about changing them is when they’re showing a slew of horrific errors, and at that point I don’t have time to make a script prettier.

Find Unsynchronized Mirrors
Running this as a multi-server query will provide a list of all mirrored databases not currently synchronized. This is my most used method for verifying that a mirror is still in trouble.

SELECT [Database] = DB_NAME(database_id), mirroring_state_desc
FROM sys.database_mirroring
WHERE mirroring_state_desc IS NOT NULL
    AND mirroring_state_desc <> 'SYNCHRONIZED'

Find Databases that are not Online or are Restoring
This script is a bit more specialized. I rarely get results (which is good), but it has caught anomalies before. It returns any database that is offline or restoring but is not the mirror partner in a mirroring session. Sometimes that indicates a test database or one left partway through mirror setup, but at least in my environment, I never want a database left in that state for very long.

The worst scenario I’ve had with this script was a time that it found a database that had mirroring removed, so the partner had just been left in restoring state. Somehow that had been overlooked until I found it with this.

SELECT [Database] = d.name, d.state_desc, m.mirroring_role_desc
FROM sys.databases d
   LEFT JOIN sys.database_mirroring m ON m.database_id = d.database_id
WHERE (d.state_desc <> 'ONLINE'
   OR d.state_desc IS NULL)
   AND (m.mirroring_role_desc IS NULL
   OR m.mirroring_role_desc <> 'MIRROR')

DB Mirror Monitor Script
This mimics the database mirroring monitor. The main benefit is that you do not have to register every single database with the monitor, wait for each one to refresh, and click through them one at a time. Just run the script and apply any filters you want. When run as a multi-server query, it automatically increments through each mirrored database.

If you want to see your entire mirroring monitor in one view, this is for you.

I wrote this script because sometimes mirrors are still considered synchronized but are not pushing data fast enough. They throw strange alerts, if any, when this happens. Resolving it normally comes down to finding out why the network is moving slowly. Ultimately there is little I can do while I wait for a fix, but it’s good to know what is going on in the meantime.

-- Table variable to hold the sp_dbmmonitorresults output
DECLARE @monitor TABLE
 (database_name SYSNAME,[role] INT,mirroring_state INT
 ,witness_status INT,log_generation_rate INT,unsent_log INT
 ,send_rate INT,unrestored_log INT,recovery_rate INT
 ,transaction_delay INT,transactions_per_sec INT,average_delay INT
 ,time_recorded DATETIME,time_behind DATETIME,local_time DATETIME)

-- @id will increment and pull each mirrored database
DECLARE  @id        INT
 ,@db        VARCHAR(256)
 ,@command    VARCHAR(2000)
SELECT @id = MIN(database_id) FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL

-- Loop through each database to pull mirror monitor information
WHILE @id IS NOT NULL
BEGIN
  SELECT @db = d.name FROM sys.databases d WHERE d.database_id = @id
  SELECT @command = 'EXEC msdb.dbo.sp_dbmmonitorresults @database_name='+QUOTENAME(@db)
  PRINT @command
  INSERT INTO @monitor
    EXEC (@command)
  --Increment @id
  SELECT @id = MIN(database_id) FROM sys.database_mirroring
  WHERE mirroring_guid IS NOT NULL AND database_id > @id
END

SELECT * FROM @monitor
-- Your WHERE clause/values will vary
WHERE unsent_log > 10

Get Drive Sizes using SQL or PowerShell

There are countless methods to obtain drive size information, each with its own set of benefits and limitations. I use a few different methods depending on whether I need to type out a solution quickly or need a comprehensive view.

Method 1 – xp_fixeddrives
The most basic command to get drive sizes is easy to remember, so I use it when I need a fast result. It leaves a lot to be desired, though: it returns every drive letter on the queried SQL Server, but displays only one column, MB free.

EXEC master.dbo.xp_fixeddrives


You could get extra information by inserting the results into a temp table and formatting them, but that defeats the purpose of this quick command. If I want more information, I move on to a more robust script entirely.
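If you ever do need to capture the output, a minimal sketch looks like this (the temp table and column names are my own; xp_fixeddrives returns just the drive letter and MB free):

```sql
-- Capture xp_fixeddrives output so it can be formatted further
CREATE TABLE #FixedDrives (Drive CHAR(1), MBFree INT)
INSERT INTO #FixedDrives
  EXEC master.dbo.xp_fixeddrives
SELECT Drive, MBFree, [GBFree] = MBFree/1024 FROM #FixedDrives
DROP TABLE #FixedDrives
```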

Method 2 – dm_os_volume_stats
In newer versions of SQL Server (2008 R2+) you can use the dynamic management function sys.dm_os_volume_stats. It returns data for every drive letter holding SQL Server files (data, log, or tempdb), and you can easily modify the SELECT statement to tailor the results to your needs.

SELECT
   [Drive] = volume_mount_point
  ,[FreeSpaceMB] = available_bytes/1024/1024 --/1024
  ,[SizeMB] = total_bytes/1024/1024 --/1024
  ,[PercentFree] = CONVERT(INT,CONVERT(DECIMAL(15,2),available_bytes) / total_bytes * 100)
FROM sys.master_files mf
  CROSS APPLY sys.dm_os_volume_stats(mf.database_id,mf.file_id)
--Optional: keep only drives with less than 20% free space
WHERE CONVERT(INT,CONVERT(DECIMAL(15,2),available_bytes) / total_bytes * 100) < 20
GROUP BY volume_mount_point
  ,total_bytes/1024/1024 --/1024
  ,available_bytes/1024/1024 --/1024
  ,CONVERT(INT,CONVERT(DECIMAL(15,2),available_bytes) / total_bytes * 100)
ORDER BY [Drive]

I’ve returned the results in MB to match the result set of xp_fixeddrives, but feel free to divide by another 1024 to convert to gigabytes (that’s normally what I do). My query includes free space, total size, and percent free. I also included an optional WHERE clause to filter out any drives with more than 20% free space. Simply open a multi-server query, paste the script, and you have current data for drives with low space.


This is great for most situations…but this method fails to return all the drives I’m concerned about. I get no feedback on the C or D drive, which is where I have SQL Server and other applications installed. Not everyone will worry about system drives, but I want to know if someone is throwing data where it doesn’t belong. I bet you care about that too. In that case, I use another resource.

Method 3 – PowerShell (preferred)
When I need data on every drive, I turn to PowerShell, which also lets me run a full drive query against servers on versions prior to 2008 R2. The script isn’t overly difficult, but you will need to supply a list of computers to run against. I suggest saving your SQL Server computer names into a text file and having PowerShell read from that.

If you want a quick and dirty way to get your computer names, open a multi-server query using your CMS and run:

SELECT SERVERPROPERTY('ComputerNamePhysicalNetBios')
SELECT NodeName FROM sys.dm_os_cluster_nodes

That returns all computers running SQL Server plus all cluster nodes, active or inactive. Combine them into a distinct list and you should have all your computers. I didn’t say it was clean, I said it was quick.
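The two result sets can be combined into one distinct list in a single statement; UNION (as opposed to UNION ALL) removes the duplicates for you, and sys.dm_os_cluster_nodes simply returns no rows on non-clustered instances:

```sql
-- UNION de-duplicates computers that appear in both lists
SELECT [ComputerName] = CONVERT(SYSNAME,SERVERPROPERTY('ComputerNamePhysicalNetBios'))
UNION
SELECT NodeName FROM sys.dm_os_cluster_nodes
```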

The PowerShell script returns all drive letters, but you can filter a specific drive or name if you so desire. You can manipulate the WHERE clause by following the format of the one I provided.

For example, change $_.PercentFree to $_.VolumeName and -lt (which means less than) to -eq (equals).  Replace 20 with the drive name in quotes, for example, 'Data'. If you need multiple clauses, the format is WHERE { $_.Value1 -eq '1' -AND $_.Value2 -eq '2' }
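Putting that together, the modified filter stages would look like this ('Data' is a hypothetical volume label; these are pipeline fragments meant to replace the Where line in the script):

```powershell
# Match on volume label instead of percent free
Where-Object { $_.VolumeName -eq 'Data' }
# Multiple conditions combine with -and
Where-Object { $_.VolumeName -eq 'Data' -and $_.PercentFree -lt 20 }
```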

$Computers = ( Get-Content C:\Users\$Env:UserName\Desktop\Computers.txt )
Get-WmiObject -class Win32_LogicalDisk -ComputerName $Computers -Filter "DriveType=3" |
 Select-Object SystemName,DeviceID,VolumeName,
  @{n='FreeSpace(MB)';e={$_.FreeSpace / 1MB -as [int]}},
  @{n='Size(MB)';e={$_.Size / 1MB -as [int]}},
  @{n='PercentFree';e={$_.FreeSpace / $_.Size * 100 -as [int]}} |
   Where { $_.PercentFree -lt 20 } |
    Sort SystemName,DeviceId | FT -AutoSize