Troubleshooting Policies and xp_dirtree

Today we had a policy failure on a newly built SQL server. The policy verifies that every SQL server has a backup file stored in a remote directory. Simple enough, but in the past, this policy has been prone to permissions errors.

In case you would like to implement something like this on your system, the dumbed-down condition of the policy is below:


ExecuteSql('NUMERIC', '
--Count the databases on the server (TempDB, database_id 2, is excluded) and build the backup folder path
DECLARE @DBCount INT, @File VARCHAR(250)
SELECT @DBCount = COUNT(*) FROM sys.databases WHERE database_id <> 2
SELECT @File = ''\\BACKUPSERVER\BACKUP\''+CONVERT(VARCHAR(100),SERVERPROPERTY(''MachineName''))
--Insert DirTree results into a table variable to return a numeric result
DECLARE @Check TABLE (SubDirectory VARCHAR(250), Depth BIT, [File] BIT)
INSERT @Check
EXEC master.dbo.xp_dirtree @File,1,1
--Numeric result for easy policy evaluation: 1=Pass if( # files = # dbs )
IF((SELECT COUNT(*) FROM @Check) = @DBCount)
    SELECT 1
ELSE
    SELECT 0
')

Policy Explanation

The dumbed-down policy checks a backup server, which has a folder named for each SQL server, to see whether each folder has the correct count of backups. Each folder should contain a backup file for every database except TempDB. This isn’t exactly how we do it, but it gets my point across and provides a good example of how a moderately complex policy works using ExecuteSQL. There aren’t a lot of examples on the web, so I try to provide as many as I can.
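
If you’ve never used xp_dirtree before, it helps to see what it returns before wiring it into a policy. Here’s a quick standalone sketch; the UNC path is made up for illustration, the second parameter is the recursion depth, and the third tells it to list files as well as folders.

--Sketch: inspect xp_dirtree output directly (the share path is a placeholder)
DECLARE @Check TABLE (SubDirectory VARCHAR(250), Depth BIT, [File] BIT)
INSERT @Check
EXEC master.dbo.xp_dirtree '\\BACKUPSERVER\BACKUP\MYSERVER', 1, 1 --path, depth, include files
--One row per file or folder found one level deep in the share
SELECT SubDirectory, Depth, [File] FROM @Check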

The Troubleshooting

The new server wasn’t processing the policy correctly. My first thought was, “Well, there are no files!” Easy! Problem solved! Or not… the files were in the folder, and the count was even correct. So it wasn’t the backups or the copy failing. That meant something was wrong with the permissions; the problem was, what?

I verified the logins were all present in SQL by reviewing Security -> Logins in Management Studio. This was a new server, so I assumed it would be something simple like that. The account in question was the SQL Server service account, since that’s the account Policy Based Management and xp_dirtree use. The account was in SQL and it had all the proper permissions. OK, next thing.
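
If you prefer T-SQL over clicking through Management Studio, the same login check is just a query. A minimal sketch, assuming a made-up service account name:

--Sketch: confirm the service account has a login (the account name is a placeholder)
SELECT name, type_desc, create_date
FROM sys.server_principals
WHERE name = N'DOMAIN\SQLServiceAccount'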

The folder where all the files are stored is a shared folder, so I went into the Properties of the folder and verified the account it was shared with. Everything looked right there. Every time I’ve had an issue with xp_dirtree’s permissions to read these folders, it’s been because the service account on the shared folder was wrong, so my next stop was Active Directory.

[Screenshot: the folder’s Sharing permissions] Sharing a folder makes the file path easier. Just make sure your service account has permission here.

Drilling down through Active Directory, I found the group used to share the folder, and it had every SQL service account as a member. It looked fine in AD, and it looked fine on the shared folder. Everything worked perfectly except on this new server. What was wrong?!?

At this point, I just decided to throw a hammer at it and add extra service accounts until, hopefully, one worked. Eventually I added the SQL Agent account. Suddenly, the policy passed. The new server could read the backup directory! So I took the normal account out of the permissions. Now everything failed except this new server. Wait, what…?

I logged into the new server and loaded up SQL Configuration Manager…

Oh…

When the server was set up, someone had switched the accounts: SQL Server was running under the SQL Agent service account, and SQL Agent was running under the SQL Server service account. I restarted each service under its correct account, then set the permissions on the shared backup directory back to the original setup.

Voila! Everything worked perfectly!

The Moral of the Story

Make sure new servers and settings are configured properly before trying to fix things that weren’t broken before the new stuff existed. I’m sad to say it took me this long to understand the problem, even with other DBAs chiming in. None of us thought to check Configuration Manager… that, or they assumed I had already checked it.
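
For what it’s worth, you don’t even have to open Configuration Manager to see this. On SQL Server 2008 R2 SP1 and later, the service accounts are exposed through a DMV, so a quick sketch like this would have shown the swap immediately:

--Sketch: show which account each SQL Server service is actually running under
SELECT servicename, service_account, status_desc
FROM sys.dm_server_services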

Sometimes, it’s the little things that get you… OK, normally it’s the little things that get you.

PowerShell: Detect and Uninstall Java FAST

I’ve been building out a remote, silent Java Updater for a while now. Making sure it works perfectly in my environment is not an easy task. I’m 90% there; I just have to get over the last few hurdles.

One of the major problems was uninstalling Java efficiently. You can Google scripted Java uninstalls, and you’ll probably find the same recycled code over and over again. (I’ve provided it below, but don’t touch it.) The problem is, this code is terrible.

Let me explain. The code works as intended. That’s not why it’s terrible. Why am I complaining then? It uses a very, very bad WMI class. What’s WMI you ask? Not the problem. The class is. It’s a huge waste of time. How bad is it?

[Image: Bing search suggestions for win32_product] When even Bing thinks it’s bad. It’s bad.

Get-WmiObject -Class win32_product would have been a very useful command; however, it’s by far the slowest and most inconsistent thing I have ever used. Worse, you can’t fix it. Use a WHERE clause, use a FILTER, use wildcards, use explicit values; it doesn’t matter. It’s still slower than molasses. If you plan to use this command, you might as well go to lunch after you hit execute.

If you didn’t check out those links above, let me summarize:

Invoking win32_product causes the computer to enumerate the entire list of installed programs and validate each install, which can mean extremely long waits before any results come back. Regardless of where clauses or filters, every program is always checked this way.

In testing, the command takes 20-40 minutes to return results on a production server. Even on my computer it takes at least 5 minutes, normally much longer. I have run this command and gotten results in as little as 11 seconds… after running it three times in quick succession. Running it again five minutes later took three minutes, and that was my best-case test. That’s just unacceptable. Here’s that terrible code with a timer included. Take my word for it and just skip past this.

Win32_Product (Very Slow – Just Skip It!)

#### GET INSTALLED JAVA VERSION - SLOW!
$Start = Get-Date
$Java = Get-WmiObject -Class win32_product | WHERE { $_.Name -LIKE "Java [0-9]*" }
$Current = (Get-Date) - $Start
"Execution Time: $($Current.Hours):$($Current.Minutes):$($Current.Seconds)"
$Java

If you ran that, I’m sorry you wasted away half your life waiting for it to complete, but I did warn you.

My Method (Fast but Wordy)

I came up with a method to uninstall through registry GUIDs. Detecting the Java installation GUID this way goes from 30 minutes to under a second. The entire installation process was cut by at least 98%, from over half an hour down to roughly 25 seconds! I can work with that! The code is a lot longer, though.


########## GET INSTALLED JAVA VERSION - FAST!
$Start = Get-Date
Function Get-Uninstaller
{
    $ErrorActionPreference = "SilentlyContinue"
    # Grab the registry key name of everything under the Uninstall branch
    $Guids = Get-ChildItem -Path 'Registry::HKLM\Software\Microsoft\Windows\CurrentVersion\Uninstall' | ForEach {$_.Name}
    $Products = ForEach ($Guid in $Guids)
    {
        # Return the program name plus the GUID pulled from its MSI uninstall string
        $Results = "" | SELECT DisplayName, IdentifyingNumber
        $Results.DisplayName = Get-ItemProperty -Path $($Guid.Replace("HKEY_LOCAL_MACHINE","Registry::HKLM")) | ForEach {$_.DisplayName}
        $Results.IdentifyingNumber = Get-ItemProperty -Path $($Guid.Replace("HKEY_LOCAL_MACHINE","Registry::HKLM")) | ForEach {$_.UninstallString.Split("X")[1]}
        $Results
    }
    $Products
}#EndFunction Get-Uninstaller

$Java = Get-Uninstaller | WHERE { $_.DisplayName -LIKE "Java [0-9]*" }
$Current = (Get-Date) - $Start
"Execution Time: $($Current.Hours):$($Current.Minutes):$($Current.Seconds)"
$Java

Win32Reg_AddRemovePrograms (Fast, but a Last-Minute Discovery)

But wait, there’s more! The documentation on how broken win32_product is pointed me to an even easier method. I didn’t find it until researching some links for this blog, though. I was very skeptical about this code until I ran it, because I never saw anyone suggest it while searching for a PowerShell uninstaller online. It’s actually just as instantaneous as my registry-read method!


########## GET INSTALLED JAVA VERSION - Alternative
$Start = Get-Date
$Java = Get-WmiObject -Class win32reg_addremoveprograms | WHERE { $_.DisplayName -LIKE "Java [0-9]*" }
$Current = (Get-Date) - $Start
"Execution Time: $($Current.Hours):$($Current.Minutes):$($Current.Seconds)"
$Java.DisplayName
$Java.ProdID

Uninstall Script

Whichever method you choose, you’ll still need to run the uninstaller afterward. Just change the GUID identifier appropriately for whichever script you chose: $App.IdentifyingNumber or $App.ProdID. I’m logging to a C:\Temp\Java folder, so make sure that folder exists, or remove the /l and everything after it if you don’t care about logging. The uninstall runs silently with no user interface and suppresses reboots.


########## UNINSTALL JAVA
ForEach ($App in $Java)
{
    # msiexec: /x = uninstall by GUID, /qn = no UI, /norestart = suppress reboots, /l*vx = verbose log
    $ArgumentList = "/x $($App.IdentifyingNumber) /qn /norestart /l*vx C:\Temp\Java\UninstallLog.log"
    Start-Process -FilePath "msiexec.exe" -ArgumentList $ArgumentList -Wait -PassThru
}

Automatic Tempdb Configuration Script

The default configuration for TempDB leaves a lot to be desired, and although good changes are on the way in SQL Server 2016, it will be quite some time before any of us get to use that in production.

I manage many SQL servers, and the TempDb files have long been ignored on them, mostly due to overpowered server hardware. Recently I’ve been working to correct their file allocations and, most importantly, set up some standards for future server builds.

Microsoft best practices suggest assigning one data file per logical processor, up to eight. If contention remains, you can then assign additional data files, in groups of four, until contention lessens. I decided an automatic script could handle the initial setup.

If you hate reading, you can skip to the code. I’ve commented everything you need to change and explained each step in detail. Even better, if you smash F5 and run the code, it runs in debug mode by default; no changes will be made.

Script Explanation

This script is meant to run against a new server that has the default settings in place for Tempdb, with a single data and log file. The script will rename those files per your input and then create additional data files up to the calculated limit based on logical CPUs or eight, whichever is lower.

The script sets the location for the data and log files, but if you are one of the lucky few who have a single drive per data file, this script cannot help you in its present state. You can modify the script to deal with multiple file locations, or manually make the changes.

One of my biggest pet peeves is percentage-based autogrowth. It’s a very dangerous setting as your files grow, and it isn’t even an option in this script. Generally it’s a good idea to presize your files anyway, so autogrowth shouldn’t matter most of the time. You may also be asking why the size variables are all VARCHARs: that’s only so they don’t have to be converted when they’re concatenated into the commands later in the script.
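
If you want to hunt down existing files that still use percentage growth, a quick sketch against sys.master_files checks every database on the instance:

--Sketch: find any database files still set to percentage-based autogrowth
SELECT DB_NAME(database_id) AS DatabaseName, name AS LogicalFileName, growth
FROM sys.master_files
WHERE is_percent_growth = 1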

My second favorite feature, after the data file count calculation, is that you supply the total data file size and the script calculates the size of each file for you. If you want eight gigs of data files spread across eight files, each file will be set to one gig. Simple. No math required.

The @Debug variable defaults to 1, which lets you run the script and see the generated ALTER DATABASE commands without executing them. Once you are happy with the results, rerun the script with @Debug=0 and the files will be modified and created immediately.


USE [master]
GO
SET NOCOUNT ON;

--- Set these values
DECLARE
 @Debug BIT = 1 --1=Information Only; 0=Execute Command
,@DataName VARCHAR(25) = 'Tempdb_Data' --Desired file name for data files
,@DataTotal VARCHAR(10) = '1024' --Total size of all Tempdb files; to be divided per file (in mb)
,@DataGrowth VARCHAR(10) = '100' --Data Autogrowth size (in mb)
,@DataLocation VARCHAR(250) = 'D:\DATA' --Data file location
,@LogName VARCHAR(25) = 'Tempdb_Log' --Desired file name for log file
,@LogSize VARCHAR(10) = '512' --Size of Log (in mb)
,@LogGrowth VARCHAR(10) = '50' --Log Autogrowth size (in mb)
,@LogLocation VARCHAR(250) = 'E:\LOG' --Log file location

--- Everything else is calculated for you!
,@Count INT = 2
,@LogicalCPU INT
,@DataSize VARCHAR(10)
,@OldData VARCHAR(25)
,@OldLog VARCHAR(25)

--- This will set # of Data files = Logical CPUs or 8, whichever is smaller
--- Afterwards, manually increase past 8 files, in groups of 4, if contention remains.
SELECT @LogicalCPU = CASE WHEN cpu_count>8 THEN 8 ELSE cpu_count END FROM sys.dm_os_sys_info
SET @DataSize = @DataTotal / @LogicalCPU
SELECT @OldData = Name FROM sys.master_files WHERE database_id = 2 AND file_id = 1
SELECT @OldLog = Name FROM sys.master_files WHERE database_id = 2 AND file_id = 2

--- Modify original single log and data files to follow desired inputs
DECLARE @SQL VARCHAR(MAX) = '
ALTER DATABASE [tempdb]
MODIFY FILE (NAME= N'''+@OldLog+''', NEWNAME= N'''+@LogName+''', FILENAME= N'''+@LogLocation+'\Tempdb_log.ldf'', SIZE= '+@LogSize+'MB, FILEGROWTH= '+@LogGrowth+'MB);

ALTER DATABASE [tempdb]
MODIFY FILE (NAME= N'''+@OldData+''', NEWNAME= N'''+@DataName+'1'', FILENAME= N'''+@DataLocation+'\Tempdb1.mdf'', SIZE= '+@DataSize+'MB, FILEGROWTH= '+@DataGrowth+'MB);
'

--- Generates additional data files up to calculated limit
WHILE(@Count<=@LogicalCPU)
BEGIN
    SET @SQL+= '
ALTER DATABASE [tempdb]
ADD FILE (NAME= N'''+@DataName+CONVERT(VARCHAR,@Count)+''', FILENAME= N'''+@DataLocation+'\Tempdb'+CONVERT(VARCHAR,@Count)+'.ndf'', SIZE= '+@DataSize+'MB, FILEGROWTH= '+@DataGrowth+'MB);
'
    SET @Count+=1
END

--- Debug=1 Prints SQL commands to execute for testing; Debug=0 will process the commands
IF(@Debug=1)
PRINT @SQL
IF(@Debug=0)
EXEC (@SQL)
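
Once you’ve run it with @Debug=0, a quick sketch like this confirms the new TempDB layout:

--Sketch: verify the resulting TempDB file names, paths, sizes, and growth settings
SELECT name, physical_name, size/128 AS SizeMB, growth/128 AS GrowthMB, is_percent_growth
FROM tempdb.sys.database_files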

Comments and suggestions are welcome. I’ve tested this on over a dozen servers so far with no issues (other than forgetting to change the drive path on a server with different drive letters).