
While Continuous Integration (CI) started with application code, you should apply the same principles to databases. The point is that CI shortens your feedback loop so that changes that break the build are caught as soon as they are committed to source control. We do this by building, testing and deploying every commit and reporting any errors.

A little while ago (when I still worked for Redgate) a customer asked for a database CI demo using the Redgate tools and Jenkins. I got it up and running in about an hour and was impressed by how easy Jenkins was to configure, despite never having used it before. I was so impressed that I documented the process in this step-by-step tutorial.

This tutorial is for SQL Server databases. If you use Oracle databases you should go here instead.

Before you start, here’s what you need:

  • SQL Server LocalDB. Failing that, a SQL Server instance that you can connect to.
  • SQL Source Control (Redgate tool) to connect your database to a source control repository. I’m using Tortoise SVN which is open source. (Database source control is a prerequisite for database CI. If you don’t source control your DB already you should sort that out first. If you work in SSMS, SQL Source Control will make versioning your database trivial.)
  • DLM Automation v2 or later (Redgate tool, comes with SQL Toolbelt) for the Redgate PowerShell cmdlets. (You can use the free 14-day trial to follow this tutorial.)
  • PowerShell v3 or later.
  • And, finally, the latest build of Jenkins CI (open source) which you can get from the Jenkins website.

In my case I did everything locally (I was preparing a POC), but you don’t have to. Your Jenkins node needs to run on Windows and have a clear line of communication to your source control repository. DLM Automation needs to be installed on your Jenkins node. LocalDB (if you are using it) also needs to be installed on your Jenkins node; otherwise, the account Jenkins runs as needs access to an alternate SQL Server instance.

This post assumes that you already have SQL Source Control set up with SVN and that you broadly understand how it works. If you have these bits and pieces in place, you’re ready to follow in my footsteps!

Step 1: Create your PowerShell build script

We need to write a PowerShell script to build, test and deploy your database. You should test this PowerShell script locally before trying to run it from Jenkins.

We are going to use the Redgate DLM Automation PowerShell cmdlets to call the Redgate SQL Compare engine, NuGet, SQLCMD and various other tools to do the heavy lifting.

DLM Automation will take our source code, create a new temporary database in LocalDB and deploy our source code to it to ensure the code compiles (a process called schema validation). If you are using an alternate SQL Server instance you can carry out your schema validation on that instance instead.

Once validated, DLM Automation will create a NuGet package for this version of the database in an output directory. Then (optionally) it will run your tSQLt tests against the package and/or deploy the code to a persistent integration or test database.

Open PowerShell ISE and copy the following script into the top section:

# Script created 11/11/2016 by Alex Yates of DLM Consultants
# Shared freely, but a beer would be appreciated if you find this useful and we are ever in the same pub
# Full documentation for Redgate DLM Automation PowerShell cmdlets at:

# Variables (fill these in)

	# Required - the location of your source code
	$scriptsFolder = "C:\where\is\your\DB\source\code"
	# Required - a name and version number for your package (must not already exist in output directory)
	$packageID = "aNameForYourPackage"
	$packageVersion = "0.1"
	# Required - An output directory in which to save your build artifacts (must already exist)
	$outputDir = "C:\JenkinsDrops"
	# Optional - If using an alternate SQL instance for schema validation provide details here. Also, uncomment the -TemporaryDatabaseServer parameter in the schema validation line below.
	#$buildDb = "Data Source=.\SQL2014"
	# Required for sync step only - the database you wish to deploy to. Uncomment below, and also the sync section at the end of the script, if running a sync step.
	#$targetServerInstance = ".\SQL2014"
	#$targetDatabaseName = "AdventureWorks"
	# Optional - If using SQL Auth for target DB add a username and password. Also, uncomment the -Username and -Password parameters in the New-DlmDatabaseConnection line below.
	#$username = "myUsername"
	#$password = "myPassword"

# Script to build DB (you can probably leave this as is)
	$errorActionPreference = "stop"
	# Validate the scripts folder
	$validatedScriptsFolder = Invoke-DlmDatabaseSchemaValidation $scriptsFolder # -TemporaryDatabaseServer $buildDb

	# Export NuGet package
	$package = New-DlmDatabasePackage $validatedScriptsFolder -PackageId $packageID -PackageVersion $packageVersion
	Export-DlmDatabasePackage $package -Path $outputDir

# Script to run tests and/or deploy to an integration DB (uncomment as appropriate)

	# # Run tSQLt unit tests
	# Invoke-DlmDatabaseTests $package | Export-DlmDatabaseTestResults -OutputFile "$outputDir\$packageID.$packageVersion.junit.xml"

	# # Sync a test database
	# $targetDB = New-DlmDatabaseConnection -ServerInstance $targetServerInstance -Database $targetDatabaseName # -Username $username -Password $password
	# Test-DlmDatabaseConnection $targetDB
	# Sync-DlmDatabaseSchema -Source $package -Target $targetDB

  1. Edit the variables at the top appropriately and run the script to ensure it works. Note that if you run it multiple times you’ll need to increment the version number or delete old packages from the output directory.
  2. Optionally uncomment the last two sections to run tSQLt tests and/or deploy to an integration/test database.
  3. Once the script works locally, change the values of the $scriptsFolder and $packageVersion variables to $args[0] and $args[1]. We’ll pass these values into the script from Jenkins later. You may also want to think about parameterising and encrypting your passwords if you have included them (if using Windows Authentication you do not need passwords; just make sure the Jenkins service is running as an appropriate user). To learn how to do that check out this post, but I’d worry about that later, after you have something basic working in Jenkins.
  4. Save your script to a file called buildDatabase.ps1 somewhere in your source control repository. (Not in the same directory that you have pointed SQL Source Control at).
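For step 3, the top of the reworked script might look something like this. This is a minimal sketch, not the exact Redgate-documented pattern; the example paths and the usage message are illustrative:

```powershell
# Sketch of the variable section once it takes arguments from Jenkins.
# Jenkins will pass the checkout path and package version as positional args.
$scriptsFolder  = $args[0]   # e.g. C:\Jenkins\workspace\MyDatabase\dbSourceCode
$packageVersion = $args[1]   # e.g. 0.17, built from the Jenkins %BUILD_NUMBER%

$packageID = "aNameForYourPackage"
$outputDir = "C:\JenkinsDrops"

# Fail fast with a clear message if the arguments weren't supplied,
# rather than letting a later cmdlet fail with a confusing error.
if (-not $scriptsFolder -or -not $packageVersion) {
    throw "Usage: buildDatabase.ps1 <scriptsFolder> <packageVersion>"
}
```

Because $args is positional, the order you pass the parameters from Jenkins must match the order you read them here.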

Step 2: Create a new job in Jenkins CI

  1. Install Jenkins. For a proof of concept it’s perfectly fine to do this on your local machine. (Download. Next, next, next etc. It should only take a minute or two).
  2. Once installed, from the Jenkins home screen click New Job (top left) and give it a name, something like “[MyDatabase] build”, and select the radio button to “Build a free-style software project”.
  3. Select your source control system and add the location of your source control repository where it says “Repository URL”. For this dummy scenario I’m running everything from my local machine so I used file:///C:/SVNRepository/, which I copied from the Tortoise SVN repo browser.
  4. Finally, set up a build trigger. The easiest way is to set it up to poll your SVN repo once a minute (copy five asterisks (* * * * *) into the text field). In time you may want a push trigger to improve performance, but this is a quick way to get started.

Step 3: Configure your build step

There are various ways to run PowerShell scripts from Jenkins:

  1. Use the PowerShell plug-in to run a script from your source code
  2. Use the PowerShell plug-in to run a script saved in Jenkins
  3. Use a regular Windows batch command to call a PowerShell script from your source code

Any of these options is fine. We are going to use the third option since it avoids installing any additional Jenkins plugins. However, if you already have the PowerShell plug-in installed, or can easily install it, it’s fine to use that instead.

  1. Click “Add build step” and select “Execute Windows batch command”.
  2. Paste the following command, replacing the two file paths appropriately, and save:
powershell -command "relative\path\to\buildDatabase.ps1 \"%WORKSPACE%\relative\path\to\dbSourceCode\" 0.%BUILD_NUMBER%; exit $LASTEXITCODE"

Now, each time you commit to source control, within 60 seconds Jenkins will run that batch command. This will execute your PowerShell script with two custom parameters, one for the location that Jenkins has checked out your DB source code to and the other for the Jenkins build number, which will be used as the NuGet package version number.
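If the Jenkins build fails and the log is unhelpful, it can be easier to reproduce the step by hand. One way is to run the same invocation from an ordinary PowerShell prompt, substituting the values Jenkins would supply (the paths and build number below are illustrative, not from the original post):

```powershell
# Simulate the Jenkins batch step locally (illustrative values).
$workspace   = "C:\Jenkins\workspace\MyDatabase"   # what Jenkins puts in %WORKSPACE%
$buildNumber = 17                                  # what Jenkins puts in %BUILD_NUMBER%

# Call the build script with the same two positional arguments Jenkins passes.
& "$workspace\relative\path\to\buildDatabase.ps1" "$workspace\relative\path\to\dbSourceCode" "0.$buildNumber"
```

This runs under your own account rather than the Jenkins service account, so it is also a quick way to spot permissions problems.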

And that’s that. Now commit something to source control and see if it works. If not, check the build log to troubleshoot. If you get it working first time, you did better than me on my first attempt! It took me four or five builds before everything worked and I started to see sunny builds.

Final thoughts

It isn’t that hard and doesn’t take that long to set up in practice. Now you can catch mistakes early using Jenkins and DLM Automation, and if you set up automated deployment using a release management tool you’ll easily be able to deploy your databases throughout your pipeline.

Since writing this post I have moved on from Redgate. I now run my own Database Lifecycle Management (DLM) consulting company, DLM Consultants. If you need any help with database CI or anything else DLM related contact me through our website. (Or Twitter: @_AlexYates_)
