Tag Archives: azure

Simple Database Backups With SQL Azure Part 2: Simpler and Free with Red Gate

So a while ago I wrote about my adventures in SQL Azure backups.  At the time, there was very little offered by either Microsoft or tool vendors to provide an easy solution for scheduling SQL Azure backups, so in the end I cobbled together a solution involving batch files, Task Scheduler, and most importantly Red Gate's SQL Compare and SQL Data Compare.

But much has changed in the past year.  Red Gate released their new SQL Azure Backup product, whose functionality looks freakishly similar to other less polished solutions that people had written about.  The cool part is that while the SQL Compare solution I proposed originally required a purchased copy of the Red Gate SQL tools, Red Gate has been nice enough to release their Azure backup tool for free.

Also, Microsoft has released a CTP version of their SQL Import/Export Service.  This service allows you to back up and restore your database using Azure Blob storage instead of having to download it to a local database server, which is actually what most of us really wanted in the first place anyway.  The latest versions of Red Gate's Azure Backup also support this functionality, which gives you a lot of options.

So just to close the loop on this, here's the updated batch script file we're using for SportsCommander now for doing regular production backups of our database.  We're opting to use the Import/Export functionality as our primary backup strategy:

SET SqlAzureServerName=[censored]
SET SqlAzureUserName=[censored]
SET SqlAzurePassword=[censored]
SET SqlAzureDatabaseName=[censored]

SET AzureStorageAccount=[censored]
SET AzureStorageKey=[censored]
SET AzureStorageContainer=[censored]

for /f "tokens=1-4 delims=/- " %%a in ('date /t') do set XDate=%%d_%%b_%%c
for /f "tokens=1-2 delims=: " %%a in ('time /t') do set XTime=%%a_%%b

SET BackupName=SportsCommander_Backup_%XDate%_%XTime%


C:\SQLBackups\RedGate.SQLAzureBackupCommandLine.exe /AzureServer:%SqlAzureServerName% /AzureDatabase:%SqlAzureDatabaseName% /AzureUserName:%SqlAzureUserName% /AzurePassword:%SqlAzurePassword% /CreateCopy /StorageAccount:%AzureStorageAccount% /AccessKey:%AzureStorageKey% /Container:%AzureStorageContainer% /Filename:%BackupName%.bacpac

 

A few notes:

– This runs the same Import/Export functionality you can get through the Azure portal.  If you have any problems with the parameters here, you can experiment in the Azure portal.

– The AzureStorageAccount parameter is the account name of your storage account.  So if your blob storage URL is http://myawesomeapp.blob.core.windows.net, you would want to use “myawesomeapp” for this parameter.

– The /CreateCopy parameter will use SQL Azure’s CREATE DATABASE AS COPY OF method to create a snapshot first and then back that up, instead of just backing up the live database.  This takes a little extra time, but it is important to ensure that you are getting a transactionally consistent backup.
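
In T-SQL terms, that copy step boils down to SQL Azure's CREATE DATABASE ... AS COPY OF syntax.  A minimal sketch, with an illustrative name for the snapshot database, looks like this:

-- Illustrative names; the tool issues the equivalent of this for you when /CreateCopy is set
CREATE DATABASE SportsCommander_Snapshot AS COPY OF SportsCommander
GO

The backup then runs against that snapshot rather than the live database.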

 

Of course, if you still want to copy down a local instance of the database like we did in the previous post, you can easily do that too:

SET SqlAzureServerName=[censored]
SET SqlAzureUserName=[censored]
SET SqlAzurePassword=[censored]
SET SqlAzureDatabaseName=[censored]

SET LocalSqlServerName=[censored]
SET LocalSqlUserName=[censored]
SET LocalSqlPassword=[censored]

for /f "tokens=1-4 delims=/- " %%a in ('date /t') do set XDate=%%d_%%b_%%c
for /f "tokens=1-2 delims=: " %%a in ('time /t') do set XTime=%%a_%%b

SET BackupName=SportsCommander_Backup_%XDate%_%XTime%

C:\SQLBackups\RedGate.SQLAzureBackupCommandLine.exe /AzureServer:%SqlAzureServerName% /AzureDatabase:%SqlAzureDatabaseName% /AzureUserName:%SqlAzureUserName% /AzurePassword:%SqlAzurePassword% /CreateCopy /LocalServer:%LocalSqlServerName% /LocalDatabase:%BackupName% /LocalUserName:%LocalSqlUserName% /LocalPassword:%LocalSqlPassword% /DropLocal

 

Good luck.

Generating Azure-Friendly SQL Scripts

The Error

So if you are working in SQL Azure, you've probably learned the hard way that you can't just script out your DDL changes in your local SQL Management Studio and run them against your Azure database.  It throws in a whole bunch of extra fancy-pants DBA-y stuff that SQL Azure just doesn't let you use.

For example, say I throw together a simple table in my local database.  Depending on your SQL source control approach (you have one, right?), you might script it out in SQL Management Studio and get something like this:

CREATE TABLE MyWonderfulSampleAzureTable (
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [GoEagles] [varchar](50) NOT NULL,
    [BeatThemGiants] [varchar](50) NOT NULL,
    [TheCowboysAreAwful] [bit] NOT NULL,
    [AndyReidForPresident] [varchar](50) NULL,
    CONSTRAINT [PK_MyWonderfulSampleAzureTable] PRIMARY KEY CLUSTERED
    (
        [ID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

 

Pretty enough, no?  Sure, it’s full of a lot of gibberish you don’t really care about, like PAD_INDEX=OFF, but hey if it runs, what’s the problem?

Now, let’s run that against our SQL Azure database:

Msg 40517, Level 16, State 1, Line 10
Keyword or statement option 'pad_index' is not supported in this version of SQL Server.

 

Oops.  This is a pain to fix when you're deploying a single script.  However, when you're running a whole development cycle's worth of these against your production database at 3 AM and one of these scripts chokes, it is absolutely brutal.

Censorship Brings Peace

So why does this happen?  Why can’t SQL Azure handle these types of cool features?  Mostly because they just don’t want to.  Sure, some of the features missing from SQL Azure are because they just haven’t been implemented yet, but some of them are deliberately disabled to prevent unmitigated chaos.

While you may have a DBA managing your on-premises database who is working in your best interest (or at least your company's interest), SQL Azure has a much bigger problem to solve.  They need to provide a shared SQL environment that does not let any one consumer hose up everyone else.  If you've ever hosted a SQL database in a high-traffic shared hosting environment, you've probably felt the pain of some joker going cuckoo-bananas with the database resources.

In other words, what you do in the privacy of your own home is all well and good, but if you are going to go play bocce in the public park, you're certainly going to have to watch your language and act like a civilized person.

And a lot of these features you don't really have to care about anyway.  No doubt you are really, really smart and know when your indexes should be rebuilt, but the reality is that much of the time whatever algorithm the folks on the SQL team came up with is going to be a little bit smarter than you, Mr. SmartyPants.

Anyhow, for your edification, here’s a wealth of information about the stuff you can’t do.

The Manual Workaround

So how do we get our script to run?  My general rule of thumb is to rip out all of the WITH stuff and all of the file group references:

CREATE TABLE MyWonderfulSampleAzureTable (
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [GoEagles] [varchar](50) NOT NULL,
    [BeatThemGiants] [varchar](50) NOT NULL,
    [TheCowboysAreAwful] [bit] NOT NULL,
    [AndyReidForPresident] [varchar](50) NULL,
    CONSTRAINT [PK_MyWonderfulSampleAzureTable] PRIMARY KEY CLUSTERED
    (
        [ID] ASC
    )
)
GO

See, I had a general rule of thumb for this because I encountered it a lot, on just about every DDL script I had to generate.  And I still missed a lot of them.  Quite the pain in the neck.

 

The Much Better, Not-So-Manual Workaround

So I was at a Philly.NET user group meeting last night, and Bill Emmert from Microsoft was walking through the options for migrating SQL Server databases, and he showed us a setting that I wish I had known about a year ago:

[Screenshot: SQLAzureSettings, the SQL Server Management Studio scripting options with the target database engine type set to SQL Azure Database]

If you change this to SQL Azure Database, it will change your environment settings to always generate SQL scripts that are compatible with Azure.  No more manual script trimming!  Unless, of course, you are into that kind of thing, in which case you really wasted the last 10 minutes reading this.

Good Luck.

Simple Database Backups With SQL Azure

Why?

Last year we launched a new version of SportsCommander.com, which offered volleyball organizations across the country the ability to promote their tournaments and accept registrations for a negligible fee.  Having grown out of our previous hosting company, we tried hosting the platform on Windows Azure, and for the most part it’s been great.  Also, the price was right.

We are also hosting our data in SQL Azure, which again for the most part has been great.  It has performed well enough for our needs, and it abstracts away a lot of the IT/DBA maintenance issues that we would really rather not worry about.

Of course, nothing is perfect.  We’ve had a few snags with Azure, all of which we were able to work around, but it was a headache. 

One of the biggest issues for us was the ability to run regular backups of our data, for both disaster recovery and testing purposes.  SQL Azure does a great job of abstracting away the maintenance details, but one of the things you lose is direct access to the SQL backup and restore functionality.  This was almost a deal-breaker for us.

Microsoft’s response to this issue is that they handle all of the backups and restores for you, so that if something went wrong with the data center, they would handle getting everything up and running again.  Obviously this only solves part of the problem, because many companies want to have their own archive copies of their databases, and personally I think doing a backup before a code deployment should be an absolute requirement.  Their answer has been “if you need your own backups, you need to build your own solution.”

Microsoft is aware of this need, and it has been the top-voted issue on their Azure UserVoice site for a while. 

In poking around the interwebs, I saw some general discussion of how to work around this, but very little concrete detail.  After hacking around for a while, I came up with a solution that has worked serviceably well for us, so I figured I’d share it with y’all.

 

What?

In order to address these concerns, Microsoft introduced the ability to copy a database in SQL Azure.  So, as a limited backup option, you can create a quick copy of your database before a deployment, and quickly restore it back if something fails.  However, this does not allow for archiving or exporting the data from SQL Azure, so all of the data is still trapped in the Azure universe.
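
If all you want is that quick pre-deployment copy, it is a one-liner in T-SQL.  A minimal sketch, with illustrative database names:

-- Illustrative names; run this while connected to the master database on your SQL Azure server
CREATE DATABASE SportsCommander_PreDeploy AS COPY OF SportsCommander
GO

If the deployment goes badly, you can copy that snapshot back out under a fresh name and point the application at it.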

Apparently another option is SSIS.  Since you can connect to Azure through a standard SQL connection, theoretically you could export the data this way.  Now, I am no SSIS ninja, so I was just never able to get this working with Azure, and I was spending far too much time on something that I shouldn't have needed to spend much time on.

I've heard rumblings that Microsoft's Sync Framework could address the issue, but, uh, see the previous point.  Who's got time for that?

So of course, Red Gate to the rescue.  Generally speaking, their SQL Compare and SQL Data Compare solve this type of problem beautifully; they are excellent at copying SQL content from one server to another to keep them in sync.  The latest formal release of their products (v8.5 as of this writing) does not support SQL Azure.  However, they do have beta versions of their new v9.0 products, which do support SQL Azure.  Right now you can get time-locked beta versions for free, so get yourself over to http://www.red-gate.com/Azure and see if they are still available.  If you're reading this after the beta program has expired, just pony up the cash and buy them; they are beyond worth it.

 

How?

OK, so how do we set this all up?  Basically, we create a scheduled task that creates a copy of the database on SQL Azure, downloads the copy to a local SQL Server database, and then creates a zipped backup of that database.

First, you need a SQL Server database server.  And go install the Azure-enabled versions of SQL Compare and SQL Data Compare.

Also, go get a copy of 7-Zip, if you have any interest in zipping the backups.

The scheduled task will execute a batch file.  Here’s that batch file:

SET SqlAzureServerName=[censored]
SET SqlAzureUserName=[censored]
SET SqlAzurePassword=[censored]

SET LocalSqlServerName=[censored]
SET LocalSqlUserName=[censored]
SET LocalSqlPassword=[censored]

echo Creating backup on Azure server
sqlcmd -U %SqlAzureUserName%@%SqlAzureServerName% -P %SqlAzurePassword% -S %SqlAzureServerName% -d master -i C:\SQLBackups\DropAndRecreateAzureDatabase.sql

echo Backup on Azure server complete

echo Create local database SportsCommander_NightlyBackup
sqlcmd -U %LocalSqlUserName% -P %LocalSqlPassword% -S %LocalSqlServerName% -d master -i C:\SQLBackups\DropAndRecreateLocalDatabase.sql

echo Synchronizing schema
"C:\Program Files (x86)\Red Gate\SQL Compare 9\SQLCompare.exe" /s1:%SqlAzureServerName% /db1:SportsCommanderBackup /u1:%SqlAzureUserName% /p1:%SqlAzurePassword% /s2:%LocalSqlServerName% /db2:SportsCommander_NightlyBackup /u2:%LocalSqlUserName% /p2:%LocalSqlPassword% /sync

echo Synchronizing data
"C:\Program Files (x86)\Red Gate\SQL Data Compare 9\SQLDataCompare.exe" /s1:%SqlAzureServerName% /db1:SportsCommanderBackup /u1:%SqlAzureUserName% /p1:%SqlAzurePassword% /s2:%LocalSqlServerName% /db2:SportsCommander_NightlyBackup /u2:%LocalSqlUserName% /p2:%LocalSqlPassword% /sync

echo Backup Local Database
for /f "tokens=1-4 delims=/- " %%a in ('date /t') do set XDate=%%d_%%b_%%c
for /f "tokens=1-2 delims=: " %%a in ('time /t') do set XTime=%%a_%%b
SET BackupName=SportsCommander_Backup_%XDate%_%XTime%
sqlcmd -U %LocalSqlUserName% -P %LocalSqlPassword% -S %LocalSqlServerName% -d master -Q "BACKUP DATABASE SportsCommander_NightlyBackup TO DISK = 'C:\SQLBackups\%BackupName%.bak'"

"C:\Program Files\7-Zip\7z.exe" a "C:\SQLBackups\%BackupName%.zip" "C:\SQLBackups\%BackupName%.bak"

del /F /Q  "C:\SQLBackups\%BackupName%.bak"

echo Anonymize Database For Test Usage
sqlcmd -U %LocalSqlUserName% -P %LocalSqlPassword% -S %LocalSqlServerName% -d SportsCommander_NightlyBackup -i "C:\SQLBackups\AnonymizeDatabase.sql"

 

The first thing this does is run a SQL script against the SQL Azure server (DropAndRecreateAzureDatabase.sql).  This script will create a backup copy of the database on Azure, using their new copy-database functionality.  Here’s that script:

DROP DATABASE SportsCommanderBackup
GO
CREATE DATABASE SportsCommanderBackup AS COPY OF SportsCommander
GO
DECLARE @intSanityCheck INT
SET @intSanityCheck = 0
WHILE(@intSanityCheck < 100 AND (SELECT state_desc FROM sys.databases WHERE name='SportsCommanderBackup') = 'COPYING')
BEGIN
-- wait for 10 seconds
WAITFOR DELAY '00:00:10'
SET @intSanityCheck = @intSanityCheck + 1
END
GO
DECLARE @vchState VARCHAR(200)
SET @vchState = (SELECT state_desc FROM sys.databases WHERE name='SportsCommanderBackup')
IF(@vchState != 'ONLINE')
BEGIN
DECLARE @vchError VARCHAR(200)
SET @vchError = 'Failed to copy database, state = ''' + @vchState + ''''
RAISERROR (@vchError, 16, 1)
END
GO

 

A few notes here:

  • We are always overwriting the last copy of the backup.  This is not an archive; that will be on the local server.  Instead, this is always the latest copy.  Besides, extra Azure databases are expensive.
  • For some reason SQL Azure won't let you run a DROP DATABASE command in a batch with other commands, even though SQL 2008 allows it.  As a result, we can't wrap the DROP DATABASE in an "IF(EXISTS(" clause.  So we need to always just drop the database, which means you'll have to create the initial copy of the database by hand before the first time you run the script, or the DROP will fail.
  • The CREATE DATABASE … AS COPY OF command will return almost immediately, and the database will be created, but it is not done copying yet.  The copy is actually still running in the background, and it could take a minute or two to complete depending on the size of the database.  Because of that, we sit in a loop and wait for the copy to finish before continuing, with a sanity check in there to throw an exception just in case it runs forever (see the sketch below if you want to watch the copy's progress).
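
As a side note, if you want a little more visibility into the copy while it is running, SQL Azure also exposes a sys.dm_database_copies view in the master database that reports progress.  A quick sketch, assuming that view is available on your server (the database name is ours; swap in your own):

-- Run against the master database while the copy is in flight
SELECT d.name, c.percent_complete, c.error_desc
FROM sys.dm_database_copies c
JOIN sys.databases d ON d.database_id = c.database_id
WHERE d.name = 'SportsCommanderBackup'

Our script above just polls state_desc, which is plenty for an unattended job, but percent_complete is handy when you are watching a large copy interactively.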

 

Once that is complete, we create a local database and copy the Azure database down into that.  There are several ways to do this, but we chose to keep a single most-recent version on the server, and then zipped backups as an archive.  This gives a good balance of being able to look at and test against the most recent data, and having access to archived history if we really need it, while using up as little disk space as possible.

In order to create the local database, we run a very similar script (DropAndRecreateLocalDatabase.sql):

IF(EXISTS(SELECT * FROM sys.databases WHERE Name='SportsCommander_NightlyBackup'))
BEGIN
DROP DATABASE SportsCommander_NightlyBackup
END
CREATE DATABASE SportsCommander_NightlyBackup

 

In this case, we actually can wrap the DROP DATABASE command in an "IF(EXISTS(" check, which makes me feel all warm and fuzzy.

After that, it’s a matter of calling the SQL Compare command line to copy the schema down to the new database, and then calling SQL Data Compare to copy the data down into the schema.  At this point we have a complete copy of the database exported from SQL Azure.

As some general maintenance, we then call sqlcmd to back the database up to a time-stamped file on the drive, and then call 7-Zip to compress it.  You might want to consider dumping this out to a DropBox folder, and boom-goes-the-dynamite, you've got some seriously backed-up databii.
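
If you do go the DropBox route, it is one more line at the end of the batch file (the destination folder here is hypothetical; use wherever your DropBox folder actually lives):

REM Hypothetical DropBox path; adjust to your own machine
copy /Y "C:\SQLBackups\%BackupName%.zip" "C:\Users\Backup\Dropbox\SQLBackups\"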

Lastly, we run an AnonymizeDatabase.sql script to clear out and reset all of the email addresses, so that we can use the database in a test environment without fear of accidentally sending bogus test emails out to our users, which I’ve done before and it never reflected well on us.
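
I won't bore you with our exact script, but the general idea, assuming a hypothetical Users table with UserID and Email columns (swap in your own table and column names), is something like this:

-- Hypothetical schema; the point is that no real address survives
UPDATE Users
SET Email = 'test+user' + CAST(UserID AS VARCHAR(20)) + '@example.com'

After that, a runaway test run can only send mail to example.com addresses, which go nowhere we care about.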

Run that batch file anytime you want to get a backup, or create a scheduled task in Windows to run it every night.
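
For the scheduled task, something like this from an elevated command prompt does the trick (the task name and batch file path are just examples; point /TR at wherever you saved your script):

REM Task name and path are examples only
schtasks /Create /SC DAILY /ST 03:00 /TN "SportsCommander Nightly SQL Azure Backup" /TR "C:\SQLBackups\BackupSqlAzure.bat"

Just make sure the account the task runs under can reach your local SQL Server instance, the Red Gate tools, and the backup folder.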

Anyhoo, that’s about it.  It’s quick, it’s dirty, but it worked for us in a pinch.  Microsoft is just getting rolling on Azure and adding more stuff every month, so I’m sure they will provide a more elegant solution sooner or later, but this will get us by for now.

Have you had a similar experience?  How are you handling SQL Azure backups?