Getting Started with Red Gate DLM Automation Suite

Red Gate's DLM Automation Suite is the second half of the automated database deployments process. It takes the scripts created by SQL Source Control, builds a package, and then deploys that package to the various database environments (test, staging, production, etc.). Red Gate has built the tooling so it can be used with almost any build tool out there: Microsoft's VSTS Build vNext, TeamCity, Octopus Deploy, Jenkins, Bamboo, and so on.

Before diving headfirst into building out a continuous integration process, here are some tips to get you started.

Understand Red Gate DLM Automation Suite Basics

The documentation written by Red Gate is a great way to get started. This section gives a high-level overview of the four core pieces of functionality provided by the Red Gate DLM Automation Suite.

Build

This is the process that takes what is in source control and creates a package so it can be deployed to database servers. No actual deployment occurs during the build process.

At the time of this writing, it takes the folder SQL Source Control created and wraps it up in a NuGet package with some additional metadata.
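
To make that concrete, here is a minimal sketch of a build step. It assumes the DLM Automation PowerShell cmdlets (Invoke-DatabaseBuild, New-DatabaseBuildArtifact, Export-DatabaseBuildArtifact); depending on your version of the suite you may be driving this through a build-server plug-in or the older command line instead. The folder path, package id, and version numbering are placeholders.

    # Rough sketch of the build step using the DLM Automation PowerShell cmdlets.
    # Paths, package id, and version numbering are placeholders.
    $scriptsFolder = "C:\Projects\MyDatabase"              # folder SQL Source Control is linked to
    $validated = Invoke-DatabaseBuild $scriptsFolder       # fails here if the scripts do not compile

    $package = New-DatabaseBuildArtifact $validated -PackageId "MyDatabase" -PackageVersion "1.0.$env:BUILD_BUILDID"
    Export-DatabaseBuildArtifact $package -Path "C:\Drop"  # writes MyDatabase.1.0.x.nupkg; no deployment happens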

Fun Fact: a NuGet package is nothing more than a .zip file. If you want to see what is inside, just replace the .nupkg extension with .zip and Windows Explorer will let you browse the contents.
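
The same trick works from PowerShell 5 or later if you would rather not leave the console (file names here are just placeholders):

    # Copy the package with a .zip extension and expand it to look inside.
    Copy-Item .\MyDatabase.1.0.42.nupkg .\MyDatabase.1.0.42.zip
    Expand-Archive .\MyDatabase.1.0.42.zip -DestinationPath .\MyDatabase-contents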

Test

This command will run unit tests found in the NuGet package. I haven't had a chance to work with database unit tests (something I am not proud of), so this is the only mention of this functionality in this article.

Publish

Publish will take the NuGet package built by the build command and publish it to a NuGet feed. It is very straightforward; the documentation confirms this.
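
Continuing the sketch from the Build section, publishing is a single call. The cmdlet and parameter names are from the DLM Automation PowerShell module, and the feed URL and API key are placeholders:

    # Push the build artifact ($package from the earlier sketch) to an internal NuGet feed.
    Publish-DatabaseBuildArtifact $package -NuGetFeedUrl "https://nuget.example.com/nuget" -NuGetApiKey $env:NUGET_API_KEY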

Sync

This command takes the package created by the build process and deploys it to the database servers. Under the hood, it is using SQL Compare and SQL Data Compare to do the deployments.

During deployment, any schema changes on the target server that are not in the NuGet package will be overwritten by this process. For example, if you changed a stored procedure directly on a server and that change is not in source control, the change will be wiped out. Every developer who has adopted this process has made this mistake at least once.

Fun Fact: this step is called sync because you are syncing the contents of the NuGet package with the SQL Server database being deployed to.
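
As a sketch, again assuming the DLM Automation PowerShell cmdlets and using placeholder server and database names, a sync boils down to pointing the package at a target connection. Remember the warning above: anything on the target that is not in the package gets overwritten.

    # Make the target database match the package built earlier ($package from the build sketch).
    $target = New-DatabaseConnection -ServerInstance "TestSqlServer01" -Database "MyDatabase"
    Sync-DatabaseSchema -Source $package -Target $target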

Build Tool Evaluation

Red Gate is very good about building out their tooling, specifically the automation tooling, so it can really run using any build tool. This has been very helpful because at Farm Credit Services of America we changed build/deploy servers four times in under a year (TFS, Red Gate Deployment, TeamCity and VSTS vNext). Had Red Gate not had that mindset, well, I shudder to think about how hard it would have been.

There are typically two schools of thought when it comes to a continuous integration process: have a single tool that both builds the artifacts and deploys them, or use multiple tools, where one tool builds the artifacts and another deploys them.

It was a single-tool setup for the longest time at Farm Credit Services of America when deploying to the lower environments. The build server built the artifacts, then deployed them to development. Once development looked good, the artifacts were built again (yikes!) and deployed to a testing environment. This process made use of TFS and XAML-defined builds (double yikes!).

Having the build server do the deployments had some advantages. It was a one-stop shop for everything about the build. But it was very brittle, and because of that, a different process pushed to pre-production and production. Also, editing XAML builds...well, the less said about that the better.

The key to a continuous integration process is consistency. Having two different deployment processes is just asking for trouble.

Last year Microsoft released their build vNext through their VSTS (formerly Visual Studio Online) service. Before moving all of our builds over to it, we decided to sit down and plan out how we wanted the new process to work. We wanted to adopt Build Once, Deploy Anywhere.

VSTS could be massaged to do build once, deploy anywhere. It would take some custom steps, but it was possible. Still, we would rather not build custom tooling (we would have to maintain that tooling until the end of time), and VSTS was lacking a key feature: approvals. We wanted our support staff and business owners to be able to approve deployments before they went to production. They already had this ability, but it was manual. If it could be baked into the tool, even better, and there would be an audit trail. After some research, we decided to have Octopus Deploy handle all of our deployments.

What we have now is:

  1. Microsoft VSTS build vNext - responsible for building, testing, and publishing packages
  2. Octopus Deploy - responsible for deploying packages and approving deployments

What is really nice is how well the two tools work together. It is easy for a VSTS build to push a package to Octopus Deploy, create a release in Octopus Deploy, kick off a deployment, and promote a deployment between environments.
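
If you are curious what that looks like underneath the build steps, it conceptually boils down to a handful of Octopus CLI (octo.exe) calls along these lines; the server URL, API key, project name, and version are placeholders:

    # Push the package, create a release, and deploy it to the first environment.
    octo push --package .\MyDatabase.1.0.42.nupkg --server https://octopus.example.com --apiKey $env:OCTOPUS_API_KEY
    octo create-release --project "MyDatabase" --version 1.0.42 --server https://octopus.example.com --apiKey $env:OCTOPUS_API_KEY
    octo deploy-release --project "MyDatabase" --version 1.0.42 --deployto "Test" --server https://octopus.example.com --apiKey $env:OCTOPUS_API_KEY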

You will need some sort of build tool in your continuous integration process. When evaluating the tool make sure it (at a minimum) has the following features:

  • Build Once, Deploy Anywhere. Build a deployment package once, and take that package and deploy it to any server.
  • Ability to run PowerShell. This is the backbone of the DLM Automation Suite.
  • Granular permissions to ensure the right people are deploying to the right environments.

Tools such as VSTS Build vNext, TeamCity, Jenkins, Bamboo, and Octopus Deploy support that functionality. Even better, Red Gate has plug-ins for all of those tools.

Scratch Database Servers

When using the Red Gate DLM Automation Suite to build and sync, you can specify a scratch database server. The tooling works by creating a temporary database and running all the SQL statements stored in source control against it. If one of the scripts contains invalid SQL, say you drop a column and don't update all the stored procedures, then that script will fail and the build/sync will fail.

If you don't specify a scratch database server, the tool will automatically choose LocalDB, a lightweight database server.

For the most part, LocalDB will meet your needs. But it does not support all the features of SQL Server. The one that tripped us up was Full-Text Search. It was a key piece of our application and we couldn't get rid of it.
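
When you do need a real SQL Server instance as the scratch server, it is a single parameter on the build. Building on the earlier sketch and assuming the DLM Automation PowerShell cmdlets, with the server name as a placeholder:

    # Use a dedicated scratch SQL Server (instead of LocalDB) to validate the scripts.
    $validated = Invoke-DatabaseBuild $scriptsFolder -TemporaryDatabaseServer "Data Source=ScratchSql01"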

Initially, we used our development SQL Servers as the scratch database server. This caused a few issues.

  1. If anyone locks the model database, the build will fail. In order to create a database, an exclusive lock is needed on the model database for a few seconds. This happened about once a week.
  2. The service account running the Red Gate tooling needs SysAdmin rights when it creates a database. Right after the Red Gate tool creates a database, it gives ownership over to dbo, and in order to do that the account needs SysAdmin rights.
  3. After it finishes building/deploying, the temp database should be deleted. Every once in a while, the tool fails, typically when someone has a lock on the model database. So you end up with rogue temp databases. The DBAs had to create a script to clean up the development SQL Servers weekly (a hypothetical version is sketched after this list).
  4. Every developer has SysAdmin rights to the development SQL Server.
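
I can't share the DBAs' actual cleanup script, but a hypothetical version looks something like the following. The server name and especially the LIKE pattern are assumptions; check what your version of the tooling actually names its temporary databases before dropping anything.

    # Hypothetical weekly cleanup of leftover scratch databases on the development server.
    # Server name and the LIKE pattern are assumptions - verify before running.
    $server = "DevSqlServer01"
    $leftovers = Invoke-Sqlcmd -ServerInstance $server -Query "
        SELECT name FROM sys.databases
        WHERE name LIKE 'SQLCI[_]%' AND create_date < DATEADD(DAY, -1, GETDATE());"

    foreach ($db in $leftovers) {
        Invoke-Sqlcmd -ServerInstance $server -Query "DROP DATABASE [$($db.name)];"
    }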

Fun Fact: the model database kept getting locked by the same user running Red Gate's SQL Search.

So we ended up spinning up some scratch SQL Servers. They are nothing special: 8 cores, 16 GB of RAM. Just enough to support the process.

Service Account Permissions

By default, we gave the deployment service account SysAdmin rights to all the SQL Servers. This is a very dangerous practice, as that account can do anything it damn well pleases on the SQL Servers. The DBAs were not big fans of this, and for good reason:

  1. I could check in a script granting myself SysAdmin rights.
  2. I could check in a script turning on xp_cmdshell.

With those two scripts I, as a user, could take over the SQL Server. In production. Nope!

After meeting with the DBAs, we decided the service account doing the deployments would get the following permissions.

The user could create and modify any table, view, proc, or trigger. It could also read and write to any table. But it could not create schemas, users, or roles, add users to roles, or grant permissions. All of the security stuff is managed manually. It is not like we are creating users left and right. We have the process generate a delta script to see if any permissions or users need to be added.
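
As a rough illustration (not our exact script), a permission set along those lines can be granted with something like the following. The database, schema, and account names are placeholders, and your DBAs will likely want to tune the exact grants:

    # One way to approximate the deployment account's rights: create/alter objects in dbo
    # and read/write data, with no ability to manage schemas, users, roles, or permissions.
    # Database, schema, and account names are placeholders.
    $permissions = "
        CREATE USER [DOMAIN\svc-deploy] FOR LOGIN [DOMAIN\svc-deploy];
        GRANT CREATE TABLE, CREATE VIEW, CREATE PROCEDURE, CREATE FUNCTION TO [DOMAIN\svc-deploy];
        GRANT ALTER ON SCHEMA::dbo TO [DOMAIN\svc-deploy];
        ALTER ROLE db_datareader ADD MEMBER [DOMAIN\svc-deploy];
        ALTER ROLE db_datawriter ADD MEMBER [DOMAIN\svc-deploy];"
    Invoke-Sqlcmd -ServerInstance "ProdSqlServer01" -Database "MyDatabase" -Query $permissions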

Licensing

I have been working with the DLM Automation Suite for almost two years now. In that time the licensing has changed quite a bit. I won't go into the gory details. It appears you now have to buy an SQL Toolbelt license for each developer.

I recommend emailing Red Gate to ensure you are getting the most accurate, up-to-date licensing information.

Troubleshooting

The best way to troubleshoot your deployment process is to:

  1. Look at the logs.
  2. Run SQL Profiler on the database being deployed to.

SQL Profiler will tell you exactly what is being run and when. It really helped us track down a number of problems when we were starting out.

Conclusion

Hopefully, these tips and tricks will help you as you get started creating your own continuous integration process. Getting your database under source control was the easy part. Now comes the fun part: actually implementing all of this. In my next post, I will walk through how to get set up using VSTS Build vNext or Team Foundation Build.