Redgate Command Line Tools

One of the lesser-known facts about Redgate's DLM Automation suite is that purchasing a license grants the purchaser the right to use the command line versions of SQL Data Compare and SQL Compare. Seems strange, right? Well, when you boil it down to its basic parts, the DLM Automation suite is nothing more than PowerShell wrappers around those tools.

Redgate's DLM Automation suite provides a lot of functionality; Redgate has worked hard to solve the common problems most companies face with database automation. But it cannot cover every possible scenario, because each company has its own process, and sometimes a custom step is needed. Having access to the command line versions of those two tools opens up a lot of opportunities. This article explains how I helped create a custom step in the database automation process at Farm Credit Services of America using those command line tools.

Database Deployment Process

Before getting too far in, let's take a moment to discuss (at a high level) our environment and process. We have four environments: development, testing, pre-production, and production. All developers have admin rights to development and testing. In the past, this meant a developer would make a change to a database in development and almost immediately make the same change in testing. Now that is all automated: the developer makes a change to their local dedicated instance and checks it in, and the change gets pushed up to the development and testing environments automatically by our CI process. Pushes to testing occur anywhere from 5 to 20 times a day.

Pushes to pre-production happen a lot less often, perhaps 1 to 2 times a day, with many days seeing no pushes at all. Pre-production pushes are not automatically triggered by a check-in to one of our Git repositories; a developer manually kicks off that promotion. During the promotion to pre-production, a database architect has the ability to approve or deny the deployment.

Production is even stricter. A developer cannot trigger a promotion to production; only a database architect can. There are no manual intervention steps during a production deployment. This is so the database architect can schedule a deployment to kick off automatically at 7 PM and only get paged if the deployment fails.

What is going to production?

With me so far? I hope so!

Our automation process generates a NuGet package using Redgate's custom VSTS step, and that package is what gets deployed to production. This is great because the automation process generates the necessary change scripts on the fly and applies them to the database. That matters because, as we all know, there is drift between the environments. In a typical software development lifecycle, development and testing are multiple changes ahead of pre-production and production, and in most cases pre-production has changes applied to it that are not yet in production. The change script has to be created on the fly.

But that all happens behind the scenes. There is no visibility into what is happening to pre-production and production. No database architect in their right mind is going to trust that on blind faith.

This was the pushback I received during the start of our adoption of the automated database deployment process. What is going to be changed in pre-production? What is going to be changed in production? Fair enough.

SQL Compare Command Line

Redgate's SQL Compare is a very powerful tool. Looking at the documentation, you can see it can compare any two of the following:

  • Database Snapshot
  • Database
  • Source Control Folder
  • Backup

During the compare you can tell it to:

  • Assert Equal - verify the two items are exactly the same
  • Sync - apply any changes from the source (e.g., a source control folder) to a destination (e.g., a database)
  • Generate Script - create a script containing the differences between the source and the destination - in other words, generate a delta script.

In addition to all that, you can even control what the comparison tool looks at, stripping out the noise and focusing on what you and your team care about. This is done by setting the SQL Compare options.
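To make those switches concrete, here is a hypothetical invocation (the install path, folder, server, and database names are all placeholders, not our actual values) that compares a source control folder against a live database with some noise-reducing options applied:

```powershell
# Hypothetical example: compare a source control folder (source 1) against
# a live database (source 2). All paths and names below are placeholders.
$sqlComparePath = "C:\Program Files (x86)\Red Gate\SQL Compare 12\sqlcompare.exe"
$options = "IgnoreWhiteSpace,IgnorePermissions,IgnoreExtendedProperties"

$arguments = @(
    "/scripts1:C:\Builds\MyDatabase\db\state",  # source: source control folder
    "/server2:PreProdSql01",                    # destination server
    "/database2:MyDatabase",                    # destination database
    "/options:$options",
    "/assertidentical"                          # non-zero exit code if they differ
)

# The real call would be: & $sqlComparePath $arguments
Write-Host "$sqlComparePath $($arguments -join ' ')"
```

Swap `/assertidentical` for `/sync` or `/scriptFile:<path>` to apply the changes or generate a delta script instead of just asserting equality.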

Auto-generated Delta Scripts

Why is this all important?

Well, a fun fact about Octopus Deploy (our deployment tool) is that it extracts the contents of the NuGet package created by Redgate's custom VSTS step into a folder it has access to. This is the folder the DLM suite deploys from. After the package is extracted, any step in the process has access to that folder; all that is needed is its location (stored in an Octopus Deploy variable) and knowledge of the folder structure (easy to determine, just download the package).

This is true for any deployment tool; it doesn't matter if you are using TeamCity, Jenkins, Bamboo, VSTS, or something else. They all have to extract the contents of that NuGet package.

Knowing that, we created a custom step template for Octopus Deploy that compares the extracted folder with the pre-production and production databases and generates a delta script. Octopus Deploy also has an "artifact" repository for each release, and the step template publishes the delta script to it. Anyone can open the release in Octopus Deploy and view the delta scripts for both pre-production and production. Most deployment tools have some sort of artifact repository; if yours doesn't, it is easy to have the process dump the file to a shared folder on a server.
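The shared-folder fallback could be sketched like this (the function name and all paths are hypothetical, not part of our actual process):

```powershell
# Fallback sketch: if your deployment tool has no artifact repository,
# copy the generated delta script to a shared folder instead.
# The function name and example paths are hypothetical.
function Publish-DeltaScript {
    param(
        [Parameter(Mandatory)][string]$ScriptPath,
        [Parameter(Mandatory)][string]$DestinationFolder
    )

    # Create a folder per release so scripts from different deployments
    # don't overwrite each other.
    if (-not (Test-Path $DestinationFolder)) {
        New-Item -ItemType Directory -Path $DestinationFolder | Out-Null
    }

    Copy-Item -Path $ScriptPath -Destination $DestinationFolder
}

# Example usage:
#   Publish-DeltaScript -ScriptPath ".\000_DeltaScript.sql" `
#       -DestinationFolder "\\fileserver\DeploymentArtifacts\release-1.0.42"
```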

How the delta scripts are used

These delta scripts exist strictly for review purposes only. They provide a pretty good idea of what is going to be changed.

They are not used in the deployment. From the moment they are generated, there is a clock ticking as to when they become obsolete. An emergency fix could go into production that updates a stored procedure or a view or a table.

Please note: all database changes should follow the automation process. Should being the key word. It should be true 99% of the time. But if your application is going sideways with all hell breaking loose and all that is needed is an index, then you bet your ass you're going to create that index that second. Or make that change to a stored procedure. Ideally, the issue should have been caught in testing or with some sort of automated tool, and steps should be taken to ensure it doesn't happen again, but any developer knows it happens.

Custom Step Template PowerShell

Below are the steps to create your own custom step template in Octopus Deploy. The PowerShell used is fairly generic; the only parts specific to Octopus Deploy are the way it gets its parameters and where it places the artifact. You could easily update this to work with TeamCity, VSTS, or any other build tool that makes use of PowerShell.

As you can see in the Octopus Deploy step template documentation, creating a step template is very easy to do. In this case, we selected the "deploy a package" option, which provides a pre-deployment step, a deployment step, and a post-deployment step. Our step template is a little more complicated, so we needed to make use of the post-deployment step, but for the sake of simplicity, just worry about the deployment step.

After you select the "deploy a package" option, start filling in the form presented.

  • Set the Package Id to the #{NuGet Package ID} variable. This will allow the process implementing this step to specify the package to deploy.
  • Add the PowerShell below to the deployment step.

## --------------------------------------------------------------------------------------
## Input
## --------------------------------------------------------------------------------------

$databaseServer = $OctopusParameters['database.NameOfServer']
$databaseName = $OctopusParameters['database.NameOfDatabase']
$sqlComparePath = $OctopusParameters['Database.SQLComparePath']


## --------------------------------------------------------------------------------------
## Helpers
## --------------------------------------------------------------------------------------
# Helper for validating input parameters
function Validate-Parameter([string]$foo, [string[]]$validInput, $parameterName) {
    Write-Host "${parameterName}: $foo"
    if (! $foo) {
        throw "No value was set for $parameterName, and it cannot be empty"
    }

    if ($validInput) {
        # -contains binds tighter than !, so the negation must be parenthesized
        if (!($validInput -contains $foo)) {
            throw "'$foo' is not a valid input for '$parameterName'"
        }
    }
}


## --------------------------------------------------------------------------------------
## Configuration
## --------------------------------------------------------------------------------------
Validate-Parameter $databaseName -parameterName "Database Name"
Validate-Parameter $databaseServer -parameterName "Database Server"
Validate-Parameter $sqlComparePath -parameterName "Database.SQLComparePath"

## --------------------------------------------------------------------------------------
## Run
## --------------------------------------------------------------------------------------

$sqlCompareOptions = "DecryptPost2KEncryptedObjects,IgnoreFillFactor,IgnoreWhiteSpace,IgnoreFileGroups,IgnoreWithElementOrder,IgnoreUserProperties,IgnoreDatabaseAndServerName,AddDatabaseUseStatement,ObjectExistenceChecks,IgnorePermissions,IgnoreUsersPermissionsAndRoleMemberships,IgnoreExtendedProperties,IgnoreConstraintNames"

$scriptsFolder = "$(get-location)\db\state"
$updateScriptFile = "$(get-location)\000_DeltaScript.sql"



# creating argument string from variables
$arguments = @(
"/scripts1:$($scriptsFolder)",
"/Server2:$($databaseServer)",
"/Database2:$($databaseName)",
"/scriptFile:$($updateScriptFile)",
"/Include:Identical",
"/Options:$($sqlCompareOptions)"
)

# Exposing variables in logs for troubleshooting
write-host $arguments

# Run SQL Compare
& $sqlComparePath $arguments

New-OctopusArtifact -Path $updateScriptFile -Name "$databaseName $databaseServer 000_Delta.SQL"
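One defensive addition worth considering (it is not part of the step template above): sqlcompare.exe reports problems through its exit code, so the step could wrap the call and fail loudly instead of publishing a possibly incomplete delta script. The helper function name here is my own invention:

```powershell
# Sketch of a defensive wrapper (not part of the original step template):
# run an external tool and throw if it exits with a non-zero code, so a
# failed comparison fails the deployment step instead of passing silently.
function Invoke-ExternalTool {
    param(
        [Parameter(Mandatory)][string]$Path,
        [string[]]$Arguments
    )

    & $Path $Arguments

    # $LASTEXITCODE holds the exit code of the most recent external command
    if ($LASTEXITCODE -ne 0) {
        throw "'$Path' exited with code $LASTEXITCODE"
    }
}

# In the step above, the call would become:
#   Invoke-ExternalTool -Path $sqlComparePath -Arguments $arguments
```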

Then go in and add the necessary variables.

Now give it a name and a description, and click save.

Select a project in Octopus Deploy and add this new step. As you can see, it is pretty easy to set the required variables.

And now during the deployment you can see the scripts being generated. Click on them to download and view them.

Conclusion

Hopefully, this post gives you some ideas on how to make use of the tooling provided by Redgate to augment your own process. Happy Coding!