Redgate has built a fantastic Database Lifecycle Management (DLM) suite to help with automated database deployments. Their tools cover 95% of the scenarios most people will encounter day in and day out. But it cannot cover 100% of the scenarios. The other 5% is what this post is about. And quite frankly, it is unfair to expect a large company making tools for 200,000+ customers to cover that 5%.
Time for a poor man's analogy. At the end of 2015, I purchased my first house with my wife. Just like every other first-time home buyer, I made many trips to Home Depot to buy various items for the house. On one of those trips I purchased an 18-Volt Milwaukee Cordless Drill. To say I am happy with the purchase is a bit of an understatement. I was able to put a 3-inch screw through a 2"x4" in under 10 seconds. With a mason bit, it chewed through mortar like it wasn't even there. It has an LED light that comes on when the trigger is pulled to light up the area where the business end of the drill meets the wood. In a pinch, it can be a flashlight.
Despite all its features, no matter how hard I try, I cannot get that drill to saw through wood. Drilling through wood? No problem. But it cannot cut a board in half.
It is important to remember that about the Redgate DLM Automation Suite: it cannot do everything. What it can do, it does very, very well.
Below are some of the 5% of scenarios not supported by Redgate's DLM Automation Suite that I have run into while adopting the tool. Some of them are due to quirks in SQL Server itself, while others exist because it would be nearly impossible for any company to implement all the variations of possible implementations. Or they are simply scaffolding: once done, you won't ever have to worry about them again.
Disclaimer: These scenarios were accurate to the best of my knowledge at the time of this writing. If I am incorrect in any way or further clarification is required, then please contact me.
Creating a Database

The Redgate DLM Automation Suite requires the destination database to exist in order to deploy to it. You, or more likely a DBA, have to create the database. It seems a little silly that a tool like this would be unable to create a database. How hard can it be? Let's take a step back and examine some reasons why.
Disclaimer: I do not know the actual reasons why; what follows is entirely speculation.
The entire database is stored in a .mdf file while the transaction logs are stored in a .ldf file. I've worked at a few places where DBAs like to shift around those files for various reasons. Or, they might store the files on a different SAN for each SQL Server. Either way, there would be no way for Redgate to know those particulars. It is different for each company. In some cases, it is different for each SQL Server. Yes, that could be scripted, but you might end up spending time just maintaining that script.
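To make the point concrete, here is a hypothetical script of the sort a DBA might run to create a database with explicit file placement. The database name, drive letters, paths, and sizes are all assumptions; they would differ for every company, and sometimes for every SQL Server.

```sql
-- Hypothetical example: file locations and sizes vary per company/server.
-- Data file (.mdf) and log file (.ldf) are placed on separate drives/SANs.
CREATE DATABASE WidgetStore
ON PRIMARY
(
    NAME = WidgetStore_Data,
    FILENAME = 'E:\SQLData\WidgetStore.mdf',   -- assumed data drive
    SIZE = 512MB,
    FILEGROWTH = 256MB
)
LOG ON
(
    NAME = WidgetStore_Log,
    FILENAME = 'F:\SQLLogs\WidgetStore.ldf',   -- assumed log drive
    SIZE = 128MB,
    FILEGROWTH = 64MB
);
```

Every value in that script is a local decision, which is exactly why a generic tool has a hard time making it for you.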
In order for a user to create a database, they have to have elevated permissions on the SQL Server. I have yet to meet a DBA who is comfortable having an automated process with that level of permission. Also, if the account running the automated process has permission to create a database, it will be db_owner on that database. It can then give db_owner permissions to other users. In testing environments, I'm not too worried about that being abused, but I am in production. I don't want anything messing up production permissions and data.
I work in a multi-tenant world, where each tenant gets their own database. The schema is the same but the data is different. A specific naming scheme is given to each database. A simple example is [DatabaseName]_[TenantName], but it could get a lot more complex. Redgate would not know what naming scheme is being used. It would have to build some sort of pattern-matching engine. It is not impossible, but is it worth the time?
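As a sketch of why this is hard to generalize, here is what a hypothetical tenant-provisioning snippet might look like under the simple [DatabaseName]_[TenantName] scheme. The names are made up, and a real naming scheme could be far more involved.

```sql
-- Hypothetical tenant-provisioning sketch; the naming scheme is
-- company-specific and could be arbitrarily more complex.
DECLARE @TenantName sysname = N'Acme';                            -- assumed tenant
DECLARE @DbName     sysname = N'WidgetStore_' + @TenantName;      -- [DatabaseName]_[TenantName]

-- QUOTENAME guards the generated identifier before executing dynamic SQL.
DECLARE @Sql nvarchar(max) = N'CREATE DATABASE ' + QUOTENAME(@DbName) + N';';
EXEC sys.sp_executesql @Sql;
```

A generic tool would have to support every variation of this logic, which is why it tends to stay a home-grown script.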
Yes, it seems a little silly the DLM Automation Suite cannot create a database. None of the issues raised are impossible to overcome, but think about this...once the database is created, you don't have to worry about it again.
Fulltext Indexes

This is a quirk with Redgate's default options and SQL Server. By default, Redgate wraps every deployment in a transaction: either it all goes out, or none of it does. This is a problem because a fulltext index cannot be created within a transaction. That is a restriction of SQL Server itself.
This poses a problem because oftentimes you will have a stored procedure making use of that fulltext index. If you are deploying the stored procedure and fulltext index for the first time, the deployment will fail.
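A minimal repro of the restriction, assuming a hypothetical Products table with a unique key index named PK_Products and a fulltext catalog named ProductCatalog:

```sql
BEGIN TRANSACTION;

-- SQL Server rejects CREATE FULLTEXT INDEX inside a user transaction,
-- so this statement raises an error and the batch fails.
CREATE FULLTEXT INDEX ON dbo.Products ([Description])
    KEY INDEX PK_Products
    ON ProductCatalog;

ROLLBACK TRANSACTION;
```

Run the same CREATE FULLTEXT INDEX statement outside a transaction and it succeeds, which is exactly the tension with a transaction-wrapped deployment.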
This really leaves two options:
- Manually create the fulltext index on each SQL server you are deploying to.
- Make use of the "no transaction option" and disable transactions.
Opinion: In my mind, Option 2 is a no-go. I prefer doing all my deployments in a transaction because having a half-assed rollout is worse than no rollout at all. The potential for "data-funk" is large with a transaction-less rollout.
CDC (Change Data Capture)
Change Data Capture, or CDC, is a little-known feature in SQL Server. Well, it was little known to me until I started working for Farm Credit Services of America, where CDC is used by a couple of databases. The DLM Automation Suite currently does not support CDC. This is because CDC is treated by SQL Server as a system-generated schema. Looking at it from a very simplistic point of view, it is another sys schema. Redgate's tools purposely ignore those types of schemas because they are really the low-level guts of the database.
What this means is any deployments which change something in CDC schema of the database will have to be a manual deployment.
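For reference, this is what enabling CDC looks like. Running these procedures is what creates the system-generated cdc schema and capture objects that the tooling ignores; the Orders table here is a made-up example.

```sql
-- Enable CDC at the database level; this creates the cdc schema.
EXEC sys.sp_cdc_enable_db;

-- Enable CDC for a specific table (dbo.Orders is hypothetical).
-- This generates capture instance objects under the cdc schema.
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',
    @role_name     = NULL;   -- NULL = no gating role for querying change data
```

Everything those procedures generate lives in the cdc schema, which is precisely the part of the database the automated deployment skips, hence the manual step.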
Disclaimer: I do not know all the ins and outs as to why this feature is not supported. My best guess is that it does not see a lot of use "in the wild," and there are many alternatives to it. Plus, in all honesty, I don't use the feature, so it is not a big deal for me. Don't get me wrong, I am sure this is a major deal for some people.
Users and Permissions
The DLM Automation Suite has no problem deploying a user and giving them permissions. However, the tool requires that any user in source control be deployed to all environments. This can pose a problem when you want to use a different user for each environment, which is typically done as a security measure.
Side Note: I prefer this approach; that way, if a test user account gets compromised, I don't have to worry as much because it cannot read production data.
The way around this is to create a role and grant it the necessary permissions. That role can be stored in source control and deployed as part of the build, while the user for each environment is manually added to the role by a DBA. You can tell the tool to skip users during deployment by making use of the "IgnoreUsersRolesAndPermissions" option.
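A sketch of that split, with made-up role and account names. The role and its grants live in source control and ship with every build; the per-environment user is a one-time manual step for the DBA.

```sql
-- Deployed from source control with every build:
CREATE ROLE AppReadWrite;
GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::dbo TO AppReadWrite;
GO

-- Run once per environment by a DBA; the account name differs
-- per environment (DOMAIN\WidgetApp_Test is hypothetical).
CREATE USER [DOMAIN\WidgetApp_Test] FOR LOGIN [DOMAIN\WidgetApp_Test];
ALTER ROLE AppReadWrite ADD MEMBER [DOMAIN\WidgetApp_Test];
```

The nice property is that the permissions themselves stay versioned and repeatable, while the only unversioned piece is a single role membership per environment.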
SSIS Packages

SSIS packages, or SQL Server Integration Services packages, are a feature in SQL Server which allows for a lot of automated data migration and manipulation. Typical packages I have seen will copy data from one database to another or read some sort of text file into a database. They are run by jobs through the SQL Server Agent. These packages are not stored in a database on SQL Server but rather in a package store or on a file share. SQL Server Agent jobs are defined inside of the msdb database, which is a system database. I can't imagine Redgate, or really anyone, would recommend putting that database under source control and running it through an automated database deployment process.
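To see why msdb is involved, you can list Agent job definitions straight out of its system tables. This read-only query (the LIKE filter is just an example) shows that the jobs, not the packages themselves, live in the msdb system database rather than in your application database.

```sql
-- Agent job definitions live in the msdb system database,
-- not in the user database being deployed.
SELECT  j.name,
        j.enabled,
        s.step_name,
        s.command
FROM    msdb.dbo.sysjobs      AS j
JOIN    msdb.dbo.sysjobsteps  AS s
        ON s.job_id = j.job_id
WHERE   j.name LIKE N'%SSIS%';   -- example filter for SSIS-related jobs
```

Since the schema compare works against your application database, anything defined over in msdb is simply outside its view.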
Opinion: Do not put the msdb database under source control. I cannot begin to imagine the problems you would run into.
A drill is not a saw. Despite all the great things it can do, Redgate's DLM Automation Suite can't do it all (hey, that rhymes).
The above list of scenarios came from rolling out automated database deployments to other development teams at my company. Most of these require some sort of manual deployment process. I'm not going to lie, it is a bit of a pain, but not to beat a dead horse: a lot of them are scaffolding. Once they are deployed, you typically don't have to worry about them again.