Remotely Enable Always On

Always On Availability Groups is the flagship High Availability feature introduced in SQL Server 2012. It's been out for a while now, but unless you have Enterprise Edition, you might not have been able to use it much.

Of course you need a failover cluster to use Always On, but once that is set up, you also have to enable Always On in SQL Server Configuration Manager on every server that will participate in the AG.

Continuing with my lazy, automated DBA goal of logging into computers as rarely as possible, I developed the PowerShell script below to connect to SQL Servers, enable Always On, and then restart the SQL Service so the changes take effect.

The only thing you need to change below is the computer names; the script should automatically detect your SQL instance names. If that doesn't work (I haven't been able to test every possible name-parsing possibility), you can supply the instance names yourself.


## List Servers in AG Here ##
$Computers = 'Computer1','Computer2'
## Everything Else is Automated ##

# Finds the servers running the services, and the services' names
Invoke-Command -ComputerName $Computers -Scriptblock {
$Services = (Get-Service -Name MsSql* | Where { $_.Status -eq 'Running' } )
$Nodes = @()

# Parses the names of the SQL Instances
ForEach( $Service in $Services )
{ $Nodes += $Env:ComputerName + '\' + $Service.DisplayName.Split('(')[1].Replace(')','') }

# Loops through each instance and enables Always On, restarting with -Force
# (Enable-SqlAlwaysOn lives in the SQLPS module, so import it first)
Import-Module SQLPS -DisableNameChecking
ForEach ( $Node in $Nodes )
{ Enable-SqlAlwaysOn -ServerInstance $Node -Force }

# Starts SQL Service on the affected server(s) if it's still stopped
If($Services -ne $NULL)
{ Start-Service -DisplayName ($Services.DisplayName) }
}
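
If you want to confirm the change took, you can query the server property afterward. A quick check (Invoke-Sqlcmd also comes from the SQLPS module, and the server name is a placeholder):

#Returns 1 once Always On has been enabled and the service restarted
Invoke-Sqlcmd -ServerInstance 'Computer1' -Query "SELECT SERVERPROPERTY('IsHadrEnabled') AS IsHadrEnabled"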

The key code here for enabling Always On is the snippet below.

Enable-SqlAlwaysOn -ServerInstance $Node -Force

If the longer script cannot automatically detect your ServerInstance, you can provide it manually and run the command. Restart your SQL Service for the change to take effect.
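
For example, with placeholder names:

#'Server1\MyInstance' is a placeholder; substitute your own SERVER\INSTANCE
Enable-SqlAlwaysOn -ServerInstance 'Server1\MyInstance' -Force
#Then bounce the service (named instances are MSSQL$<name>; the default instance is MSSQLSERVER)
Restart-Service -Name 'MSSQL$MyInstance' -Force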


PowerShell: Copy-Item to Many Remote Computers

I thought I already blogged about this, but when a friend asked me if I had a script for quickly moving a file to many different computers, I couldn’t find the page. Knowing it would be a lot easier to copy and paste a link rather than explain the script each time someone asked me for this, I had my next blog topic.

Basic Copying
Copying a file from one location to another using PowerShell is a pretty simple process, but you can encounter some hiccups with the file paths depending on your setup. At the most basic level, all you need is the Copy-Item cmdlet.

Copy-Item -Path 'C:\OriginalLocation\FileName.txt' -Destination 'C:\NewLocation'

Alternatively, you can use the Move-Item cmdlet in the same manner. Can you guess what the difference is? That's right! Move-Item moves the item rather than copying it. I rarely move, but I copy all the time.
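
If you do need to move, the call is identical:

Move-Item -Path 'C:\OriginalLocation\FileName.txt' -Destination 'C:\NewLocation'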

Multi-Computer Copying
The purpose of my script is for when you have a single file that needs to be pushed out to dozens of computers. It uses Copy-Item and loops through the computers. The function has extra Verbose logging, copy verification, and, of course, multi-computer copy capability.

Remember that the $Path needs to be the local file path on your machine, including the file name and extension. The $Destination needs to be in administrative-share form (DriveLetter$\Folder, for example C$\Temp) and should not include a file name. If you plan to rename the file across all the computers you are copying to, I suggest you just rename the original file first.

Finally, I suggest you always use the -Verbose parameter for this function. It’s just a lot nicer seeing what is happening to your computers rather than the script just sitting there running, leaving you in the dark while you fret over what it might be doing…or not doing.


Function Copy-File
{
  [CmdletBinding()]
  Param(
   #Only ComputerName accepts pipeline input; Path and Destination are plain strings
   [Parameter(ValueFromPipeline=$true,Mandatory=$true)]
    [String[]]$ComputerName,
   [Parameter(Mandatory=$true)]
    [String]$Path,
   [Parameter(Mandatory=$true)]
    [String]$Destination
)
  Process {
   #Extract the file name (including extension) from the path
   $File = Split-Path -Path $Path -Leaf

  ForEach ($Computer in $ComputerName)
  {
    Write-Verbose "Starting copy to $Computer"
    IF(Test-Path "\\$Computer\$Destination")
    {
     Write-Verbose "Folder $Destination exists on $Computer"
    }
    ELSE
    {
     Write-Verbose "Creating folder $Destination on $Computer"
     New-Item "\\$Computer\$Destination" -Type Directory
    }

   Write-Verbose "Copying $File to $Computer"
   TRY
   {
    #-ErrorAction Stop turns a failed copy into a terminating error so CATCH fires
    Copy-Item -Path $Path -Destination "\\$Computer\$Destination" -Force -ErrorAction Stop
    Write-Host "Copied to \\$Computer\$Destination\$File`n" -ForegroundColor GREEN
   }
   CATCH
   {
    Write-Warning "Copy failed to $Computer`n"
   }
  }#EndForEach
 }#EndProcess
}#EndFunction

Example
And now for an example of execution. Declare the variables, then run the command. The function supports piping, but 99% of the time I just declare the variables first. Again, use a $ rather than a : in the destination’s path if you want this to copy files to a remote location. You should not include a preceding \\ on the drive letter, though; that’s taken care of in the concatenation.

##### EXAMPLE #####
$Comp = "Comp1","Comp2","Comp3" #Get-Content to read a list from txt file
$Path = "C:\Users\$Env:Username\Desktop\Apples.txt" #Include the FileName and Extension
$Dest = "C$\Users\$Env:Username\Desktop" #Be Sure to use a $ instead of a :

#Quick Variables
Copy-File -Verbose $Comp $Path $Dest

#Piping a variable into the function
$Comp | Copy-File -Verbose -Path $Path -Destination $Dest #Include the parameter names for safety
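
If you keep your server list in a text file (one computer name per line), Get-Content slots right in; the file path here is just an example:

#Reading the computer list from a file instead of declaring it inline
$Comp = Get-Content 'C:\Scripts\Computers.txt'
$Comp | Copy-File -Verbose -Path $Path -Destination $Dest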

PowerShell: Get Size of Folders

This week another server hit low disk space on the C drive, so I decided it was an appropriate time to work out a new PowerShell script. I wanted to calculate the size of each folder to determine the biggest threats and work down from there. Clean out the big stuff first, kind of thing. If you have ever tried to do this via the Windows GUI, you know it can take a while. Right-click -> Properties works great for calculating one folder’s size, but when you need to check 100, it’s a teeny bit tedious.

I decided to write a script from scratch to test/improve my scripting skills, but I did use the internet in the end for some formatting help on the Size In Megabytes conversion.
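
That conversion is just a format string plus PowerShell’s built-in 1MB constant. For example:

#Divide a byte count by 1MB and format to two decimal places
"{0:N2}" -f (123456789 / 1MB)   #Outputs 117.74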

Note: If you find that your folder size is not correct, the most likely issue is that you need to include a -Force parameter on Get-ChildItem within the ForEach loop. This will force hidden items to be counted, and should accurately display the total file size. Example below.

$Items = (Get-ChildItem $Folder -Force -Recurse | 
  Measure-Object -Property Length -Sum)

Further Note: If you attempt to use the above fix, you could encounter a length error as shown below.


The file name must be less than 260 characters; the directory must be less than 248 characters.

The only viable fix I have found for this requires using CMD shell syntax rather than PowerShell. It’s a lot uglier to work with, and frankly, I got the information I needed without looking at hidden system files (it’s not like I was going to delete those anyway). Knowing the size of the files I could actually delete was enough for me in this exercise. But if you really want the full answer…

-Force Error Workaround Using CMD
You can replace Get-ChildItem with the command below to work around the hidden-items limit. You’ll have to mess with the code a bit, but you wanted some homework, right?
My opinion, though: just don’t worry about counting the hidden items. Save yourself the frustration.

CMD /c DIR C:\USERS /-c
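
If you do take on the homework, a rough, untested sketch of parsing that output might look like this (it assumes the English-locale DIR summary line, and the folder path is just an example):

#Run DIR recursively (/s), including hidden/system files (/a), no digit grouping (/-c)
$DirOutput = CMD /c DIR 'C:\Users' /a /s /-c
#The last "File(s)" line in the output is the grand total
$Summary = ($DirOutput | Select-String 'File\(s\)' | Select-Object -Last 1).ToString()
If ($Summary -match 'File\(s\)\s+(\d+)\s+bytes') { [int64]$Matches[1] }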

Full script (Not using -Force or CMD)

## SET COMPUTER(S) TO CALCULATE FOLDER SIZES
$Computers = "COMPUTERNAME"
Invoke-Command -ComputerName $Computers -ScriptBlock {
  ## SET THIS VALUE FOR THE FOLDER TO RECURSE THROUGH
  $FolderPath = 'C:\Users'

  $Result = @() #Initializes the result array
  #Populates the folder list
  $Folders = (Get-ChildItem $FolderPath).FullName
  ForEach ( $Folder in $Folders )
  {
    $Items = 0 #Resets the file size count on each folder
    $Items = (Get-ChildItem $Folder -Recurse |
      Measure-Object -Property Length -Sum)
    $Result += [PSCustomObject] @{
      FolderPath = $Folder
      Size = $Items.Sum
      SizeInMB = "{0:N2}" -f ($Items.Sum / 1MB)
    }#EndCustomObject
  }#EndForEach
  $Result | Sort Size -Descending |
    FT FolderPath,SizeInMB -AutoSize
}#EndInvoke

PowerShell Remote Copy Workaround

Recently I ran into an unexpected problem when trying to copy some files remotely via PowerShell. I could have sworn that I have done this exact copy scenario in the past successfully, but this time, the computers just would not talk. The solution was fairly simple, but not at all intuitive.

Task

Copy a file remotely from one computer to another, with both drives local to their respective computers. Network connectivity, proper permissions, etc., exist between the computers. I wanted to do this completely remotely, issuing the command from a third computer.

Script #1 (Remote Copy to Remote) – Failed

This should have been a very fast copy. I knew the cmdlet I needed and even the syntax. I wanted to copy a file from one computer to another, connecting to each from my computer.


$From = "\\Server1\C$\MyFile.txt"
$To = "\\Server2\C$"
Copy-Item -Path $From -Destination $To

However, that did not work. Instead I got an error indicating that my path was bad! “Invalid Path” isn’t a lot of help though. I know it’s valid!


OK, so maybe I had a typo. After quadruple-checking my paths though, I was finally led to believe that it was a double hop permissions issue causing the remote copy to fail. I decided to try connecting directly to one of the computers before attempting the copy.

Script #2 (Local Copy to Remote) – Failed

I used the same script here; the only difference is that I no longer need to make a remote connection to the source file, so the $From path is no longer a remote call. Other options that I thought would work here were Invoke-Command or Enter-PSSession; a sketch of that approach follows.
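
For what it’s worth, the Invoke-Command variant would look something like this untested sketch (hypothetical server names), and it typically fails as well, because the session on Server1 can’t pass your credentials along to Server2 (the classic double hop problem).

#Untested sketch: run the copy from inside a remote session on Server1
Invoke-Command -ComputerName Server1 -ScriptBlock {
  Copy-Item -Path 'C:\MyFile.txt' -Destination '\\Server2\C$'
}

The plain Copy-Item version of Script #2 is below.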


$From = "C:\MyFile.txt"
$To = "\\Server2\C$"
Copy-Item -Path $From -Destination $To

However, it still did not work.


The error was different this time, and even when recreating it, I got confused and started wondering whether I had a valid problem. Even though the error now says “Source and destination path did not resolve to the same provider”, it’s still the same problem, and the resolution is the same. My final attempt was a lot better.

Script #3 (Remote Copy to Remote, Fully Qualified) – Success!

After poking around the internet, I learned what the fully qualified name for a PowerShell path should be. Providing the fully qualified name is required for these remote calls. In other words, if your location starts with \\, you need to add Microsoft.PowerShell.Core\FileSystem:: as a prefix. If you reference a local drive, you shouldn’t prefix it. This worked perfectly, even remoting to both computers.


$From = "Microsoft.PowerShell.Core\FileSystem::\\Server1\C$\MyFile.txt"
$To = "Microsoft.PowerShell.Core\FileSystem::\\Server2\C$"
Copy-Item -Path $From -Destination $To
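
And, matching the note above about local drives, a local-to-remote copy only prefixes the UNC side:

$From = "C:\MyFile.txt"   #Local path: no provider prefix needed
$To = "Microsoft.PowerShell.Core\FileSystem::\\Server2\C$"
Copy-Item -Path $From -Destination $To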

Hopefully this will save someone an hour of frustration.