SQLSaturday Tools

Much has been said about the SQL Saturday tools and how PASS is pulling back from developing them beyond a few active items, letting SQLSaturday go “off to college as we play catch with our other kids” to get back to its “by the community, for the community” grassroots. Isn’t that what all of PASS is? I’m confused.

It seems I’m not alone in that confusion. Andy Warren outlined a plan at the end of last year, a call to action for the PASS Board if you will, and he recently published a follow-up post, which seems to have left him a bit perplexed about the situation. I too am perplexed.

The problem, as we are told, is that the budget is a hot mess and there’s no money to do anything. Andy shared some details about what he knows regarding the budget in his posts, and I can understand that running an organization takes money. This post isn’t about the budget. It’s about the organization: what IS it?

To be clear, I wasn’t involved, or even on the periphery, at the forming of PASS in 1999, so I have no insight into what the initial goal of the organization was beyond what was written into the Bylaws, nor do I have any understanding of how the Board sees the organization. Thus I can only base my position on the data available to me: the PASS website. The website says that “PASS is a not-for-profit organization run by and for the community.” We are the community. The locals who attend the monthly user group meetings, those who live away from civilization but drive to their nearest metropolis to attend SQL Saturday, those who can’t get out but attend online virtual meetings. WE are the community this organization is ‘for’.

WE are Community

Looking further at the definition of PASS, we see it says the organization is ‘run by’ the community, which, as far as I know, it is. At the international level, the Board, executives, and directors are all members of the community, chosen by us every year to represent our wants and needs for the direction of the organization. At the local level, I am part of a team that runs my local user group and take part in organizing our local SQL Saturday, so yes, I am part of, and help run, the community.

In any community, some folks are destined to do great things: to step up and handle the big-picture items, to do things at scale. I admire those who have the time and inclination to tackle the big things; it’s a thankless job that has to be done. I’m not there yet. The rest of us, a much larger majority, are just meager local volunteers trying to do our part where we can. One of the things I volunteer with is SQL Saturday; I’ve been helping in some way to organize our local SQL Saturday for the last 4 years, and running a local user group for ~3. All volunteer positions, because PASS is a volunteer organization and I love helping these events come to life.

One part of running these events is the tools we use. For our local user group, we simply found an alternative in Meetup.com; it’s just easier and works better for us, so we moved. But for SQL Saturday, I can’t easily take my ball and go home. I must use their tools, their website, their mailing functionality, their printed tickets, and they don’t always meet the need. The thing is, we have a very technically inclined community, a community of volunteers that would love to help make the tools better. Personally, I like to consider myself fairly technically inclined, having spent time as a software developer and manager. I’d be willing to do the work to add the features needed, but I can’t.

Help Me, Help You

When asked in various forums about open sourcing the SQL Saturday or PASS Group tools, the response has been ‘budget’, again, again, and again. In the Q&A session put on by Grant Fritchey, current PASS President and member of the PASS Board since 2015, the SQL Saturday tools topic came up multiple times, and Grant was quite amazing at sticking to his position and deflecting every, single, time. But the answers were circular (paraphrasing below):

Q:  “How much money will it take?”
A:  “We don’t know”

Q:  “How do you know you can’t afford it, if you haven’t even looked into it?”
A:  “Because it will take time, which is money, for one of our IT guys to look into”

Q:  “Could the community sponsor it?”
A:  “We would need to know how much it costs to do that, and we don’t know”

Q: “Could we form a small committee of volunteers to investigate the costs so we have an idea moving forward?”
A:  “That’s not something we are looking to do right now; we don’t have budget”

With a community of volunteers, very technically inclined volunteers at that, knocking at the door trying to help, we as an organization have done nothing to let those with the necessary skills step up and do their part: to drive forward the wants and needs of those at the ground level, closest to the membership, closest to the problems that need to be solved. We won’t even have the conversation about how volunteer labor, outside the management of local events, could further the cause of our organization; save that conversation for another day.

It’s Hard

I get it, governance is hard, it’s nuanced, and it can be complex, but refusing to start the conversation is a cop-out. If you cannot deliver on the wants and needs of your community, it will fracture and become its own thing. If this is an organization about the community, for the community, let’s put our ‘free labor’ where our mouth is and make some honest movements towards the community rather than simply being a fund-raising wrapper for PASS Summit and its organizing company.

What’s Next?

With much respect for all those stepping up and serving on the Board, you have to do more than stonewall. Set up a small volunteer committee to investigate the process. Use your standard NDA to protect what needs protecting, give the committee access to the code repo for initial analysis, use your pre-existing conference call software to host a call and start the conversation. START the process. Make an honest effort to engage the community in the places we can best help.

Game on, Speaker Idol at PASS Summit

This week I am attending PASS Summit 2014 in Seattle, Washington. PASS Summit is a gathering of SQL data professionals from around the world who come together to learn more about our craft. As most of you reading this site will know, I enjoy sharing and speak at the regional SQL Saturday events. At some point I would like to challenge myself by speaking to a broader audience, and while I am not an official speaker, I have been selected to compete in a competition for new speakers while at Summit.

This competition is known as SQL Speaker Idol. It will be held in the style of the American Idol singing competition; fortunately for everyone, no singing will be involved. Instead, 12 speakers will be selected to present a 5-minute talk on a subject of their choosing, with judging done by an expert panel.

So not only will I be speaking, folks will be actively judging me and telling me how terrible I am. On the bright side, I have had lots of support from my local user groups. Both the Pinellas SQL PASS and Hillsborough SQL PASS user groups have allowed me to give my presentations at their meetings over the last few weeks. I’ve had comments and suggestions from SQL MVP and MCM folks who have been presenting for years, so I feel somewhat prepared and very well supported.

Luckily my session is on Friday morning, so I have a few more days to prepare, and prepare I will. I have asked a few folks to let me take a few minutes of their time over the course of this week to present the session to them and get more feedback; an audience is always better than jabbering away to myself.

So there it is, folks. For anyone at PASS Summit, come over to the Community Session room 400 on Friday at 10:15 to cheer me on. I’ve already put the Speaker Idol final on my calendar, as I am an optimist. If you’ve got the same faith, put the final on your calendar as well, same room at 2:45.

Second Year Presenting at SQL Saturday Orlando

SQL Saturday 318 Logo

Coming up on my schedule is another appearance at SQL Saturday Orlando. Orlando is quite special to me, as it was where I gave my first ever presentation, and their team is quite welcoming, and just plain awesome.

Looking at the lineup this year, I’m quite excited to see a few of my favorite speakers: Louis Davidson ( b | t ), David Klee ( b | t ), and none other than Mr. Argenis Fernandez ( b | t ) coming in to share their wisdom.

For those of you local to the Tampa area, you might recognize our veteran speakers Pam Shaw ( t ), Miguel Cebollero ( b | t ), Jeff Garbus ( b ), and Ron Dameron ( b | t ), as well as our most recent contribution to the Microsoft team, Jorge Segarra ( b | t ). Locally recognizable, but new to the speaker list, is Pinellas PASS User Group leader Leigh Freijo ( t ) with a presentation on how to get hired. Having been through one of Leigh’s interviews (successfully!), and having heard some of his interview tales, I can say that he has some insight into what will get you hired.

As for my presentation, I’m working on a new one titled ‘Encryption 101: A Look at SQL Encryption Options’, which I hope will be as well received as my previous presentations. We’re going to take a look at TDE, SSL, column-level, and even backup encryption options to safeguard your data. I’ve been really heads-down in the details on this one and need to start getting it all together in a palatable 101-type presentation. We are a few weeks out, so I still have time to smoke test it on my wife.

If you haven’t already, get over there and register. It’s set to be a fantastic day of training, as it has been for the last 8 years.

SQL Downgrades Made Easy

I was reading Kendra Little’s ( b | t ) post on SQL Server Edition Changes and it reminded me of the work I did at my last position.   I did LOTS of downgrades, seriously, LOTS.

When we started auditing our systems we found TONS of Enterprise and Standard Editions being used in development and QA environments. In the past, Microsoft gave us a pass and only cared about what was being used in production, but this year they indicated that would no longer be the case, so a remediation was in order.

Since I had done a good deal of work with Michael Wells’ ( b | t ) Scripted PowerShell Automated Deployment Engine (SPADE) for new server setups, I was the best candidate to handle the job. I lost count of the total number of downgrades I did, but I have a few dozen pages in my notebooks that look like the one below. This page would have been one night’s work.

The process which got me through a few hundred downgrades was fairly simple:

  1. The first step of downgrading is copying off your system files. If you are not sure where your system files are, you can simply view the startup parameters of the SQL Server service. I usually copy this value to a draft email or temp file for reference, just in case. Once I find the master, I copy the master, model, and msdb databases to a temp folder. If you skip this step, they will be deleted during the uninstall. Don’t ask me how I know this.
  2. Now that we have the lifeblood of our instance saved safely away, we can proceed with step 2: uninstall, which is just as it sounds. Most of the time I will simply run the uninstaller and select everything, as there is nothing better than a fresh install. If you have SSAS or SSRS, there are extra steps to safely migrate those services, which aren’t covered here.
  3. By default, when SPADE is run, it copies all of the install media over one time and sets a file indicating that the media is in place. If you run SPADE a subsequent time, it will see the file indicating the media is already copied locally and skip that step completely. If you’ve used SPADE on that server in the past, this would result in you installing the previously installed version again. So prior to running SPADE, ensure you have deleted the install folder if it is there.
  4. Once SPADE has done its thing, installing everything and checking all the standard settings, we need to shut SQL Server down and move our old system databases back into place. If you chose a different INSTALLSQLDATADIR location than you had before, you will need to place your old system files in the new INSTALLSQLDATADIR location. Once done, start the SQL Server service again. If all went well, it should start right up with no issues.
  5. In some cases, we were also changing the name of the instance being downgraded, which would require us to remove the old SPN entry and add a new one. If you get linked server issues with AD logins, you likely forgot this step. As you can see from my checklist, there aren’t any checks in this column, as this was not the norm, but it was normal enough to get its own check-box.
  6. If you are going to be as industrious as I was and perform multiple downgrades at the same time, I recommend that you pre-stage the install media prior to your actual downgrade window. SPADE has a -PRE flag that will allow you to run all the pre-checks without doing the actual install; this will stage your media locally, which will save you some time during the actual work. This check-box was for just that: one check was for version, another for other services (SSRS, SSIS, second instances) that would require more investigation, and I forget what the last check-box was for.
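As an aside on step 1, the same "where are my system files" check can also be done from T-SQL rather than the service startup parameters. A quick sketch (nothing here is SPADE-specific):

```sql
-- Physical file locations for the system databases we need to copy off
SELECT DB_NAME(database_id) AS database_name,
       type_desc,            -- ROWS (data) or LOG
       physical_name
FROM sys.master_files
WHERE database_id IN (DB_ID('master'), DB_ID('model'), DB_ID('msdb'));
```

Copy the resulting paths into your draft email or temp file the same way.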

All in all, I was able to knock out this process multiple times over multiple nights, spread over a 2-3 month time frame. If anything out of the ordinary came up during the downgrade window, I would simply skip that server and knock out the remainder.

One thing to note: all of the servers that contained SSRS databases had to be coordinated with the team that ran those servers, as the SQL engine and the SSRS engine have to be the same version, otherwise it won’t operate. Another gotcha, I believe, was SCOM. Luckily, upgrading to a higher version has been a built-in option of the SQL installers since 2005, I believe; a quick manual run of the install media in your SPADE folder and it can be upgraded in place, in less than a minute in most cases.

This has been my experience; yours may vary. If you have a large environment and find yourself doing builds fairly regularly, I highly suggest checking out SPADE. It can automate nearly all of your install and standardization tasks right out of the gate. Quick and easy!


Export XML Columns to Individual Files

I was reading the SqlServerCentral.com Articles Requested forum and saw this request.

Looking for an article that takes data from an XML column in a table and exports each field as a separate document.

While the request did indicate a PoSH or SSIS method, I wanted to give it a try using T-SQL. Let’s look at how I managed it.


The Setup

To get started, I do some requisite cleanup, declare a few variables, and create a simple table to hold the XML column that we will be exporting.
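A minimal sketch of that setup might look like the following; the table and column names (dbo.XmlExport, DocumentName, DocumentData) are placeholders of my own choosing:

```sql
-- Requisite cleanup: drop the working table if a prior run left it behind
IF OBJECT_ID('dbo.XmlExport', 'U') IS NOT NULL
    DROP TABLE dbo.XmlExport;

-- Simple holding table: one row per document to export
CREATE TABLE dbo.XmlExport
(
    DocumentName varchar(128) NOT NULL PRIMARY KEY,
    DocumentData xml          NOT NULL
);
```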

For data, let’s query some of the system tables. In this case I’m going to create one XML document each for databases, users, and jobs.
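Assuming a holding table dbo.XmlExport (DocumentName, DocumentData) as a placeholder of my own naming, a sketch using FOR XML against the catalog views might look like this:

```sql
-- One XML document each for databases, server principals, and Agent jobs;
-- dbo.XmlExport (DocumentName varchar, DocumentData xml) is assumed to exist.
INSERT INTO dbo.XmlExport (DocumentName, DocumentData)
SELECT 'Databases',
       (SELECT name, database_id, create_date
        FROM sys.databases
        FOR XML PATH('Database'), ROOT('Databases'));

INSERT INTO dbo.XmlExport (DocumentName, DocumentData)
SELECT 'Users',
       (SELECT name, type_desc, create_date
        FROM sys.server_principals
        FOR XML PATH('User'), ROOT('Users'));

INSERT INTO dbo.XmlExport (DocumentName, DocumentData)
SELECT 'Jobs',
       (SELECT name, date_created
        FROM msdb.dbo.sysjobs
        FOR XML PATH('Job'), ROOT('Jobs'));
```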


And finally, we can use BCP to get our columns to disk:
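A procedure along these lines would do it; this is a sketch that shells out to bcp through xp_cmdshell, and the names (dbo.ExportXmlDocument, dbo.XmlExport, the parameters) are my own assumptions:

```sql
-- Sketch: write one XML column value to disk via bcp queryout.
-- Assumes a holding table dbo.XmlExport (DocumentName, DocumentData)
-- and that xp_cmdshell is enabled on the instance.
CREATE PROCEDURE dbo.ExportXmlDocument
    @DocumentName varchar(128),
    @Location     varchar(260)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @cmd varchar(4000);

    -- Build the bcp command line; -T = trusted connection, -w = Unicode output
    SET @cmd = 'bcp "SELECT DocumentData FROM ' + DB_NAME()
             + '.dbo.XmlExport WHERE DocumentName = '
             + '''' + @DocumentName + ''''
             + '" queryout "' + @Location + '\' + @DocumentName + '.xml" -T -w';

    EXEC master.sys.xp_cmdshell @cmd, no_output;
END;
```

For a named instance you would also need to pass bcp a -S parameter with the server name.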

Using this procedure, we can pass in a document name and the location where we wish to save the XML file, and get our results by calling the procedure like this:
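Assuming the procedure is named dbo.ExportXmlDocument with @DocumentName and @Location parameters (my naming, not necessarily the original), the calls would look something like:

```sql
EXEC dbo.ExportXmlDocument @DocumentName = 'Databases', @Location = 'C:\Temp';
EXEC dbo.ExportXmlDocument @DocumentName = 'Users',     @Location = 'C:\Temp';
EXEC dbo.ExportXmlDocument @DocumentName = 'Jobs',      @Location = 'C:\Temp';
```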

Which results in three new files sitting in our C:\Temp directory.


The contents of those files are simply the XML we have stored in the XML-typed document column.


Things to Think About

This approach demonstrates the basic mechanics of how it would work, but there are a few things to keep in mind. Many DBAs will restrict xp_cmdshell, as they rightly should, since it can be very dangerous in the wrong hands, but with proper precautions and monitoring it is a valid tool in your tool belt. One final thing to think about is permissions. When xp_cmdshell is run, it runs as the SQL Server service account, unless configured more precisely, with sp_xp_cmdshell_proxy_account for instance. Whichever account is being used will need permissions to write to the @Location you provide.
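For reference, enabling xp_cmdshell is a sysadmin-level server configuration change, which is exactly why it deserves the precautions mentioned above:

```sql
-- xp_cmdshell is off by default; it sits behind 'show advanced options'
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
```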

Comments, Suggestions

This is my approach using T-SQL. I’d love to hear feedback on it, or about holes where it might be wrong. Thanks for stopping by.