T-SQL Tuesday is a recurring blog party started by Adam Machanic (Blog | @AdamMachanic). Each month a blogger hosts the party, and everyone who wants to can write a post about a specific subject.
In the past year, my career has had a little shake-up. I left a development manager position, took on a sales engineering position in the same industry, and finally left all that madness and ended up as a full-time DBA. In that time, I've seen three different, yet similar, takes on the cloud from three separate companies.
We don’t know
When I was making development choices, including database decisions, we discussed the cloud. I wrote a call center application that leveraged dynamically spun-up Asterisk instances in the cloud, as well as MySQL in the cloud (the now-defunct Xeround). Those decisions were mine to make, as it was more of a proof of concept than a dedicated product offering. The reason the project didn't move beyond the POC phase was the fear of sustainable knowledge. Who was going to run this contraption if I got hit by a bus? Who knew how to manage cloud computers? The sysadmin team certainly did not, the other developers were all .NET guys, and NOBODY knows how to manage MySQL.
In this case, all the pushback was on the technology used, not the platform, though there was certainly pushback on the platform as well. At this company, the limited staff already had full plates and no direct monetary incentive to take on learning a new, unproven platform. "These old systems using Visual FoxPro on legacy hardware are good enough to carry us into the future." (true story)
We Don't Get It
After leaving the call center, I went to work as a Sales Engineer deploying and customizing the main call center solution I had been hacking away at for the previous years. Being my first sales-type job, I did not listen when my wife indicated that 8 days of travel per month actually means 15, but I digress. This company had a cloud-based product built on SalesForce.com, but they didn't really get 'the cloud'. When I think of the cloud, and I believe most technologists tend to agree, the cloud is a commodity. Use what you want, stop when you want. It's about on-demand, not long-term contracts that lock you in; that's the old way. This new mentality centers on delivering a service that someone can pay for a portion of (an hour, a day, a month), and then leave if they don't extract value from the service that was provided. The provider gets paid for the time the service was provided, the user gets the service they paid for, and there is no need to play a legal game of "how much money can I extract from you?"
This company required 12 months upfront. You had to know how many users you wanted and make a lump-sum payment right at the beginning, before you had a chance to prove their product worked. Prepay for a product I have yet to test drive? I was not asked to get behind that product, nor asked to help deploy it for customers. I was asked to work on the Avaya-based version of their product, which I knew and loved (still do). At this company, they are used to their old sales cycles, their old contracts, and the old way of doing things. They just don't 'get' the most beneficial part of the cloud, which is the ability to fail fast and move on if necessary.
We have the cloud
At my current gig, they get the concepts, but eschew the public cloud in favor of a very robust private cloud, which they have truly embraced. While I knew about VMware and the general concepts, I have never learned as much about the products available in this space as I have while on this contract. The team here understands how easy disaster recovery can become by utilizing some of the enterprise cloud-based tools. They understand how maximizing resource utilization with virtualization saves time and money. The team here gets virtualization, yet they still have a 'keep it close' mentality, which I understand due to the industry requirements.
In the End
The cloud is coming; nay, it is here. Just like those crazy horseless carriages and desktop computers, it is here to stay. With time, the holdouts will lower their guard, understand, and at some point embrace the advantages that cloud-based computing offers. Others will, through time, understand what terms customers will accept for their new cloud offerings. In time, just like other commodities, cloud services will be abundant and cheap.
Since my presentation at Orlando SQL Saturday seemed to go well (standing room only, all positive feedback) I submitted to speak locally at SQL Saturday #248 Tampa BI. I was hesitant to submit to this specific SQL Saturday since it was predominantly Business Intelligence focused and the topics I have prepared are not BI focused, but what is the worst that can happen, they say no? I went for it, and they said yes.
I took a few points from Orlando that annoyed me and I fixed them before Tampa BI.
- Moving a VirtualBox machine from Desktop->Key->Laptop = 2 hours. <– All VMs now reside on an external SSD.
- SSMS real estate was cramped during the presentation when zoomed. <– Adjusted SSMS to show results in a new tab.
- I talked too long. <– Set my phone up as a timer to keep me from the ever-present tangents.
When presenting in Orlando for the first time ever, I was a bit on edge, so I overdid the preparation, or so I thought. I wasn't going to show up at Tampa BI and shoot from the hip or anything crazy like that; I prepared to a level I thought would be good enough given my recent presentation of the material. I took half a day off on Thursday and went through my slides and code examples, and I felt good about them. I felt good about my level of knowledge and the points I wanted to address.
Knowing that the session time was short (45 minutes vs. the normal 60), I had to condense my long-winded presentation and get to the point. In reviewing the material, I focused on what I thought I needed to say and neglected to focus on how I wanted to say it. In the end, my thoughts, then my words, all got jumbled, and the well-thought-out wordsmithing I had delivered in Orlando was not well polished or practiced for Tampa BI; instead I relied on a few concepts which I could not analogize or explain as well as in my previous presentation.
Now, despite all this self-abusing review, there were positive points. One of the high points of my day came as I was setting up: a gentleman from the previous session came up and told me that he had seen my presentation in Orlando and had used a CTE to fix a problem at work less than two weeks later. He shared that he was able to take a process from 20 minutes down to less than 30 seconds just by reengineering it to use a CTE. That alone made the whole event worth attending.
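For readers new to the technique, the kind of rewrite he described usually means replacing row-by-row processing with a single set-based query built on a common table expression. This is a hypothetical sketch against the AdventureWorks sample tables, not his actual code:

```sql
-- Hypothetical illustration: aggregate once in a CTE, then join,
-- instead of looping over customers and summing orders one at a time.
;WITH OrderTotals AS (
    SELECT CustomerID,
           SUM(TotalDue) AS TotalSpent,
           COUNT(*)      AS OrderCount
    FROM Sales.SalesOrderHeader
    GROUP BY CustomerID
)
SELECT c.CustomerID, ot.TotalSpent, ot.OrderCount
FROM Sales.Customer AS c
JOIN OrderTotals AS ot
    ON ot.CustomerID = c.CustomerID
WHERE ot.TotalSpent > 10000;
```

The speedup in cases like his typically comes from letting the optimizer process the whole set in one pass rather than from the CTE syntax itself.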
I was fortunate enough to have a well known speaker attend a portion of my session who reached out after the event with some great feedback. I couldn’t have bought more valuable feedback if I tried. The feedback provided was constructive, positive feedback that will help me rework my presentation into a more useful source of information for those starting in our field.
Another high point: I was sitting at a table with one of my two SQL User Group leaders, Mrs. Pam Shaw, who happened to be sitting with a few of her friends. When we realized there was no coordinated after-event, the whole table of us decided to grab dinner. I had the pleasure of dining with some wonderful people, whose names I won't publish here so as not to let on that I have cool SQL friends, when I actually don't. Thanks for the conversation; it was truly enjoyed.
In the end, I believe public speaking is a good place for me. It forces me to learn more about a subject I think I already know about. It reminds me that even though I may have rocked it in Orlando, I can still miss my target at Tampa BI. It reminds me that the industry luminaries whom you read on the blogs, watch at the events, and follow on Twitter are all just regular folks who once walked in shoes quite similar to your own. We are quite blessed that our field is filled with folks who are so willing to give back and mentor us newer members with open arms.
Thank you to all who came to my presentation, those who endured my conversation throughout the day, and of course those who have inspired me to get to this point.
As a DBA, your job is to protect the data, whether that be from corruption, attack, developers, or any other host of unknown afflictions. While I was not involved in day-to-day backup or recovery while acting as an Accidental DBA (it was handled by an MSP), nor do I handle those duties in my current role as an actual DBA (handled by the Storage Team), I am very aware of the need for a solid strategy for backup AND recovery.
As I was tuning my next presentation, which will introduce the uninformed to Ola Hallengren's portfolio of free utilities, I realized that teaching new or aspiring DBAs about the importance of a backup plan is reckless unless you also tell them about the more important piece: a recovery plan. I recalled having read a write-up by Greg Robidoux at MSSQLTips.com describing a script that automatically generates a recovery script based on a folder full of backups.
Once I looked at the requirements Greg lays out in his script, I saw that it would not work as-is with Ola's solution, so I modified it to do just that.
Now, as a warning, there is nothing earth-shattering here, just a simple rework of a great solution. If you use Ola's tools, maybe you can add this to your toolbox. As always, all feedback is greatly appreciated.
SET NOCOUNT ON

DECLARE @dbName sysname
	, @backupPath NVARCHAR(500)
	, @cmd NVARCHAR(500)
	, @lastFullBackup NVARCHAR(500)
	, @lastDiffBackup NVARCHAR(500)
	, @backupFile NVARCHAR(500)

DECLARE @fileList TABLE (backupFile NVARCHAR(255))
DECLARE @directoryList TABLE (backupFile NVARCHAR(255))

SET @dbName = 'AdventureWorks2012'
SET @backupPath = 'C:\Backup'

/* Match that of Ola's output */
SET @backupPath = @backupPath + '\' + @@SERVERNAME + '\' + @dbName + '\'

-- Get list of files
SET @cmd = 'DIR /s /b /O D ' + @backupPath
INSERT INTO @fileList(backupFile)
EXEC master.sys.xp_cmdshell @cmd

-- Find latest full backup
SELECT @lastFullBackup = MAX(backupFile)
FROM @fileList
WHERE backupFile LIKE '%' + @@SERVERNAME + '_' + @dbName + '_FULL_%.bak'

SET @cmd = 'RESTORE DATABASE [' + @dbName + '] FROM DISK = '''
	+ @backupPath + @lastFullBackup + ''' WITH NORECOVERY, REPLACE'
PRINT @cmd

-- Find latest diff backup
SELECT @lastDiffBackup = MAX(backupFile)
FROM @fileList
WHERE backupFile LIKE '%' + @@SERVERNAME + '_' + @dbName + '_DIFF_%.bak'
	AND backupFile > @lastFullBackup

-- Check to make sure there is a diff backup
IF @lastDiffBackup IS NOT NULL
BEGIN
	SET @cmd = 'RESTORE DATABASE [' + @dbName + '] FROM DISK = '''
		+ @backupPath + @lastDiffBackup + ''' WITH NORECOVERY'
	PRINT @cmd
	SET @lastFullBackup = @lastDiffBackup
END

-- Check for log backups
DECLARE backupFiles CURSOR FOR
	SELECT backupFile
	FROM @fileList
	WHERE backupFile LIKE '%' + @@SERVERNAME + '_' + @dbName + '_LOG_%.trn'
		AND backupFile > @lastFullBackup

OPEN backupFiles

-- Loop through all the log files for the database
FETCH NEXT FROM backupFiles INTO @backupFile

WHILE @@FETCH_STATUS = 0
BEGIN
	SET @cmd = 'RESTORE LOG [' + @dbName + '] FROM DISK = '''
		+ @backupPath + @backupFile + ''' WITH NORECOVERY'
	PRINT @cmd
	FETCH NEXT FROM backupFiles INTO @backupFile
END

CLOSE backupFiles
DEALLOCATE backupFiles

-- Put database in a usable state
SET @cmd = 'RESTORE DATABASE [' + @dbName + '] WITH RECOVERY'
PRINT @cmd
I love this and would use it all the time, but that whole 'hostile workplace' thing is frowned upon.