SQL Downgrades Made Easy

I was reading Kendra Little’s ( b | t ) post on SQL Server Edition Changes and it reminded me of the work I did at my last position. I did LOTS of downgrades. Seriously, LOTS.

When we started auditing our systems, we found TONS of Enterprise and Standard Editions in use in development and QA environments. In the past, Microsoft gave us a pass and only cared about what was being used in production, but this year they indicated that would no longer be the case, so a remediation was in order.

Since I had done a good deal of work with Michael Wells’s ( b | t ) Scripted PowerShell Automated Deployment Engine (SPADE) for new server setups, I was the best candidate to handle the job. I lost count of the total number of downgrades I did, but I have a few dozen pages in my notebooks that look like the one below. This page would have been one night’s work.

Spade_Downgrade_Checklist
The process that got me through a few hundred downgrades was fairly simple:

  1. The first step of a downgrade is copying off your system files. If you are not sure where your system files are, you can simply view the Startup Parameters of the SQL Server service. I usually copy this value to a draft email or temp file for reference, just in case. Once I find master, I copy the master, model, and msdb databases to a temp folder. If you skip this step, they will be deleted during the uninstall. Don’t ask me how I know this.
  2. Now that we have the lifeblood of our instance saved safely away, we can proceed with step 2: the uninstall, which is just as it sounds. Most of the time I simply run the uninstaller and select everything, as there is nothing better than a fresh install. If you have SSAS or SSRS, there are extra steps to safely migrate those services, which aren’t covered here.
  3. By default, when SPADE is run, it copies all of the install media over one time and sets a file indicating that the media is in place. If you run SPADE a subsequent time, it sees that file and skips the copy step completely. If you’ve used SPADE on the server in the past, this would result in you installing the previously installed version again. So prior to running SPADE, ensure you have deleted the install folder if it is there.
  4. Once SPADE has done its thing, installing everything and checking all the standard settings, we need to shut SQL Server down and move our old system databases back into place. If your new INSTALLSQLDATADIR is in a different location than before, you will need to place your old system files in the new INSTALLSQLDATADIR location. Once done, start the SQL Server service again. If all went well, it should start right up with no issues.
  5. In some cases, we were also changing the name of the instance being downgraded, which required us to remove the old SPN entry and add a new one. If you get linked server issues with AD logins, you likely forgot this step. As you can see from my checklist, there aren’t any checks in this column, as this was not the norm, but it was common enough to warrant its own check-box.
  6. If you are going to be as industrious as I was and perform multiple downgrades at the same time, I recommend that you pre-stage the install media prior to your actual downgrade window. SPADE has a -PRE flag that runs all the pre-checks without doing the actual install; this stages your media locally, which will save you some time during the actual work. This check-box was for just that: one check was for version, another was for other services (SSRS, SSIS, second instances) that would require more investigation, and I forget what the last check-box was for.
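The copy-off and restore portions of the steps above can be sketched in PowerShell. This is a minimal sketch, not SPADE itself: the data directory, temp folder, and service name are all assumptions, so read the real data path from the service’s startup parameters first.

```powershell
# Hypothetical paths -- confirm the data directory from the service's
# startup parameters (the -d parameter points at master.mdf).
$DataDir = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA'
$SafeDir = 'C:\Temp\SystemDBBackup'

# Step 1: copy master, model, and msdb (data and log files) somewhere safe
New-Item -ItemType Directory -Path $SafeDir -Force | Out-Null
Get-ChildItem $DataDir |
    Where-Object { $_.Name -match '^(master|mastlog|model|modellog|msdb)' } |
    Copy-Item -Destination $SafeDir

# Step 4: after the reinstall, stop SQL Server, put the old system
# databases back into the (possibly new) data directory, then restart.
Stop-Service MSSQLSERVER -Force
Get-ChildItem $SafeDir | Copy-Item -Destination $DataDir -Force
Start-Service MSSQLSERVER
```

If the instance starts cleanly after the copy back, all your logins, jobs, and settings come along with those system databases.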

All in all, I was able to knock out this process multiple times, over multiple nights, spread over a 2-3 month time frame. If anything out of the ordinary came up during a downgrade window, I would simply skip that server and knock out the remainder.

One thing to note: all of the servers that contained SSRS databases had to be coordinated with the team that ran those servers, as the SQL Engine and SSRS engine have to be the same version, otherwise SSRS won’t operate. Another gotcha was SCOM, I believe. Luckily, upgrading to a higher version has been a built-in option of the SQL Server installer since 2005, I believe; a quick manual run of the install media in your SPADE folder and it can be upgraded in place, in less than a minute in most cases.

This has been my experience; yours may vary. If you have a large environment and find yourself doing builds fairly regularly, I highly suggest checking out SPADE; it can automate nearly all of your install and standardization tasks right out of the gate. Quick and easy!

Powershell Goodies

If you haven’t seen it on some of the other blogs, or the Microsoft Powershell Blog, Microsoft recently released Script Browser for Powershell ISE. This little add-in allows you to quickly search and import Powershell scripts directly from the TechNet Script Center. Currently the database section has 577 scripts, with over 9,100 in the total repository. There’s a pretty good chance that someone has a script for what you want to do. Alternatively, if you are just getting started with Powershell, this is a great way to review others’ scripts and learn.

Try this: have you ever wanted to resize TempDB to the recommended number of files and sizes? Search for ‘TempDB’ and you should come up with Resize the TempDB database using PowerShell from my coworker, Michael Wells (b | t). Michael presents this as a function that outputs the T-SQL to resize your TempDB. Combine this with the better Invoke-SqlCmd2 from Powershell MVP Chad Miller (b) and you could easily loop and apply this to all your SQL Servers. (Note: I do not recommend resizing TempDB like this on existing servers; we use this script to set up new servers to a standard base level.)
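The loop might look something like the sketch below. The function name, its parameters, and the list file path are assumptions; check the actual scripts from the Script Center for the real signatures, and dot-source both into your session first.

```powershell
# Sketch: assumes Resize-TempDB (Michael's script) and Invoke-SqlCmd2
# (Chad Miller's) have already been dot-sourced into the session.
# Function/parameter names here are placeholders, not the real signatures.
$Servers = Get-Content 'C:\Scripts\NewServers.txt'   # hypothetical list file

foreach ($Server in $Servers) {
    # Generate the resize T-SQL (example values: 8 files of 1024 MB each)
    $Sql = Resize-TempDB -FileCount 8 -FileSizeMB 1024

    # Apply it to the target instance
    Invoke-SqlCmd2 -ServerInstance $Server -Database tempdb -Query $Sql
}
```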

Speaking of new servers, you can combine the vast array of scripts available from the Script Center with the power of SPADE for SQL Server, also by Michael Wells, to automate the deployment of your SQL Servers. With this combination and a little time to get things set up properly, you can easily end up with fully standardized SQL Servers in a matter of minutes. Since I work with Michael, I have used SPADE extensively and have written Powershell scripts for it to run; needless to say, I am a big fan of the project. Guesstimating the number of installs I’ve done with SPADE, I ended up somewhere around 300 over the last year. It is an awesome tool, if you haven’t picked up on that from this paragraph.

If you use Powershell, check out these tools. If Powershell isn’t currently in your tool belt, I highly recommend getting started.

Powershell – Extract Versions for SQL Features

Over the last few months I’ve been directed to ensure that all of our SQL Servers in the Development and QA segments have been installed with the proper licensing, namely Developer Edition. One of the challenges I’ve run across is how to verify that the SSRS and SSAS features are on the right versions. For SQL Server itself I can simply perform a multi-server query from our CMS and easily retrieve this information using the various SERVERPROPERTY options, but I wasn’t sure how to do this for the ancillary services.

Since I have limited experience with those two features of SQL Server, I decided to go with Powershell, since it can do just about anything you ask of it.

In this example, I am going to use Powershell to extract these values directly from the registry of the host computer.
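A minimal sketch of that registry read is below. The Instance Names key maps each feature (OLAP for SSAS, RS for SSRS) to its registry instance ID, whose Setup subkey holds Edition and Version. The server names are placeholders, and remote registry access must be enabled on the targets.

```powershell
$Servers = 'SQL01','SQL02'   # hypothetical server list

foreach ($Server in $Servers) {
    # Open the remote machine's HKLM hive
    $Reg  = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $Server)
    $Base = $Reg.OpenSubKey('SOFTWARE\Microsoft\Microsoft SQL Server\Instance Names')

    # OLAP = SSAS, RS = SSRS; each value maps instance name -> instance ID
    foreach ($Feature in 'OLAP','RS') {
        $Key = $Base.OpenSubKey($Feature)
        if ($Key) {
            foreach ($Instance in $Key.GetValueNames()) {
                $Id    = $Key.GetValue($Instance)
                $Setup = $Reg.OpenSubKey("SOFTWARE\Microsoft\Microsoft SQL Server\$Id\Setup")
                [PSCustomObject]@{
                    Server   = $Server
                    Feature  = $Feature
                    Instance = $Instance
                    Edition  = $Setup.GetValue('Edition')
                    Version  = $Setup.GetValue('Version')
                }
            }
        }
    }
}
```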

Running this against a list of servers defined in the top line will result in this output:

PS-Output
Now the beauty of Powershell is that I could source my list from a list in a text file:
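Something along these lines, with one server name per line (the path is a placeholder):

```powershell
# Read one server name per line from a text file
$Servers = Get-Content 'C:\Scripts\Servers.txt'
```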

Or directly from SQL:
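For example, against a CMS-style inventory table (the instance, database, and table names here are assumptions):

```powershell
# Pull the server list from an inventory table
$Servers = Invoke-Sqlcmd -ServerInstance 'CMSServer' -Database 'DBA' `
    -Query 'SELECT ServerName FROM dbo.ServerInventory' |
    Select-Object -ExpandProperty ServerName
```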

Or from Active Directory (I know there is an inline way to filter this down, but I’m not sure of the syntax offhand):
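One way, filtering client-side rather than inline (requires the ActiveDirectory module from RSAT; the naming convention in the filter is hypothetical):

```powershell
Import-Module ActiveDirectory

# Grab every computer object, then filter down to likely SQL hosts
$Servers = Get-ADComputer -Filter * |
    Where-Object { $_.Name -like 'SQL*' } |   # hypothetical naming convention
    Select-Object -ExpandProperty Name
```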

In our case, scanning Active Directory allows us to check the entire AD server group to ensure we are not missing any rogue SSRS or SSAS servers that haven’t been added to our CMS lists, which could throw us into a licensing surprise when Microsoft walks in to do their audit.

Hopefully this will help you.

Time Based Power Settings

Just recently I was part of an online presentation in which the intended audience was unable to install any of the screen sharing programs my company had at our disposal (WebEx, GoToMeeting, Join.Me). The prospective client proposed we use their screen sharing program, Adobe Connect, to perform the presentation, which seemed like a feasible idea until our primary presenter was unable to get it to connect.

The quick solution was to have someone who could connect to both systems ‘bridge’ the two screens together, essentially daisy-chaining the two screen sharing applications. Guess who was the lucky ‘tech’ guy on the call? That’s right, yours truly.

One of the problems I had was that my multi-monitor, super awesome machine was now under the scrutiny of both my co-workers and our prospects, preventing me from doing any multi-tasking on the other screens for fear of letting my lack of attention be known. (Side note: Normally I’m very well tuned into conference calls, just not so much for presentations I see a few times a week.)

Being the enterprising young lad I am, I thought this an opportune time to head downstairs to grab some lunch, only to come back and find my monitor had gone into power save mode a few seconds before. A quick wiggle of the mouse and we were back in business.

Now I’m pondering: how do I prevent this? I could simply change the power settings to a longer timeout, but that would leave all three of my monitors on for X amount of hours AFTER I quit using the computer. I could set up two profiles and switch between them in the morning and at night, allowing my screens to turn off quickly when I’m not likely to be there and stay on indefinitely when I am supposed to be there… or I could Powershell.

Of course, Powershell is the answer here.
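A sketch of the idea: powercfg can switch the active power plan by GUID, so a scheduled script only has to check the hour. The GUIDs below are placeholders; run `powercfg /list` to find the ones for your own plans.

```powershell
# Placeholder GUIDs -- get the real ones from: powercfg /list
$LongTermPlan  = '11111111-1111-1111-1111-111111111111'   # screens stay on
$ShortTermPlan = '22222222-2222-2222-2222-222222222222'   # screens off quickly

$Hour = (Get-Date).Hour
if ($Hour -ge 7 -and $Hour -lt 18) {
    powercfg /setactive $LongTermPlan      # working hours: 7AM - 6PM
} else {
    powercfg /setactive $ShortTermPlan     # after hours
}
```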

Now, with this handy little piece of awesomeness run hourly by Task Scheduler, my power settings are changed to my “Long Term” power saving plan between 7AM and 6PM and to my “Short Term” power saving plan all other hours.

Disk Space Checker

One method my company uses to deploy our software is a dedicated hosted environment. In this situation, we partner with a data center that provides hardware, we install the software, and the client doesn’t get access to the hardware directly. This normally works well, unless they tend to have a lot of activity, which can cause the disks to fill prematurely. (Tons of side conversations could be had here about proper sizing, alternative storage methods, etc., but that’s for another day.)

Not wanting to go into each system daily and check the available space, I whipped up some Powershell to do the work for me.

A few features I put in here:

  1. Emailed Report
  2. No Credentials Easily Accessible
  3. Only Send if below threshold

Setup

First we load the Net.Mail namespace and set up a few user-configurable variables.
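Roughly like this; every value below is a placeholder to be swapped for your own:

```powershell
# System.Net.Mail lives in System.dll; load it explicitly to be safe
Add-Type -AssemblyName System

# User-configurable variables (all placeholders)
$SMTPServer   = 'mail.example.com'
$FromAddress  = 'diskcheck@example.com'
$ToAddress    = 'dba-team@example.com'
$ThresholdPct = 10     # alert when free space falls below 10%
```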

Basic Checks

In this section, we get the path to the script itself and look in that same directory for the Credentials and Servers files. If the credentials file does not exist, the script will prompt you for the details, which it will then encrypt and save.
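A sketch of that first-run check (file names are assumptions). ConvertFrom-SecureString encrypts with DPAPI, tied to the current user and machine, so no plain-text secret ever sits on disk:

```powershell
# Locate the script's own folder, then look for the support files beside it
$ScriptDir = Split-Path -Parent $MyInvocation.MyCommand.Path
$CredFile  = Join-Path $ScriptDir 'Credentials.txt'
$SrvFile   = Join-Path $ScriptDir 'Servers.txt'

# First run: prompt for the SMTP password and save it encrypted
if (-not (Test-Path $CredFile)) {
    Read-Host 'SMTP password' -AsSecureString |
        ConvertFrom-SecureString |
        Set-Content $CredFile
}
```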

Extract the Secrets

This section is where we decrypt the credentials for our email server, as well as read the names of the servers from the $Server file.
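Something like the following, assuming the file-path variables from the basic-checks section; the account name is a placeholder:

```powershell
# $CredFile / $SrvFile point at the files located in the basic-checks
# section; 'smtpuser' is a placeholder account name
$SecurePass = Get-Content $CredFile | ConvertTo-SecureString
$Credential = New-Object System.Management.Automation.PSCredential('smtpuser', $SecurePass)
$Servers    = Get-Content $SrvFile
```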

Build the Email

Here we have some predefined HTML and CSS, which you could easily save to another file, but for me it made sense to keep it all in one script.
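A sketch of the report builder, assuming the $Servers list and $ThresholdPct variable from the earlier sections. Each low-space volume becomes one table row, so an all-clear run produces a table with no rows at all:

```powershell
# Build an HTML table row for every fixed disk under the threshold
$Rows = foreach ($Server in $Servers) {
    Get-WmiObject Win32_LogicalDisk -ComputerName $Server -Filter 'DriveType=3' |
        Where-Object { ($_.FreeSpace / $_.Size) * 100 -lt $ThresholdPct } |
        ForEach-Object {
            $FreePct = [math]::Round(($_.FreeSpace / $_.Size) * 100, 1)
            "<tr><td>$Server</td><td>$($_.DeviceID)</td><td>$FreePct%</td></tr>"
        }
}

# Inline CSS kept in the same script for portability
$HTMLMessage = @"
<html><head><style>table { border-collapse: collapse } td { padding: 4px }</style></head>
<body><table><tr><th>Server</th><th>Drive</th><th>Free %</th></tr>
$($Rows -join "`n")
</table></body></html>
"@
```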

Send it Or Not?

Notice in this snippet, the conditional statement uses a regex to search our $HTMLMessage for a table-row tag. If we see that tag, we know we have data and we can send our email using the standard Net.Mail.SmtpClient, using the credentials stored in the credentials file we decrypted above.
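That final gate might look like this sketch, reusing the variables from the earlier sections (server, addresses, credentials, and the built $HTMLMessage):

```powershell
# Only send when the report body actually contains a data row
if ($HTMLMessage -match '<tr><td>') {
    $Smtp = New-Object Net.Mail.SmtpClient($SMTPServer)
    $Smtp.Credentials = $Credential.GetNetworkCredential()

    $Msg = New-Object Net.Mail.MailMessage($FromAddress, $ToAddress)
    $Msg.Subject    = 'Disk Space Alert'
    $Msg.Body       = $HTMLMessage
    $Msg.IsBodyHtml = $true

    $Smtp.Send($Msg)
}
```

Silence, in this case, really does mean everything is fine.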