Creating SQL Server containers with docker compose

Up until now my posts about containers have been talking about working with one container only. In the real world this will never be the case; at any one time there will be multiple containers (I have over 30) running on a host.

I need a way to get multiple containers up and running easily. There are two different approaches to doing this: –

  • Application server driven
  • Container host driven

In the application server driven approach, the application server will contact the container host, build & run a container and capture details of the container (such as the port number) in order for the application(s) to connect.

This ad-hoc approach works well as containers are only spun up and used when needed, conserving resources on the host. However, this does mean that the applications will have to wait until the containers come online.
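As a rough sketch of that approach (the container name, password and port handling here are just examples), the application server might run something like: –

# spin up a container, publishing container port 1433 to a random port on the host
docker run -d -p 1433 --env ACCEPT_EULA=Y --env sa_password=Testing11@@ --name devsql microsoft/mssql-server-windows

# capture the host port that was mapped, so the application knows where to connect
docker port devsql 1433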

Ok, I know that spinning up containers is a short process, but I’m all about reducing deployment time.

What if we know how many containers will be needed? What if we want our applications to instantly connect to containers the second they are deployed?

This post is going to go through the steps needed in order to use docker compose to build multiple containers at once. Compose is a tool defined as: –

A tool for defining and running multi-container Docker applications.

As SQL Server people we’re only going to be interested in one application but that doesn’t mean we can’t use compose to our advantage.

What I’m going to do is go through the steps to spin up 5 containers running SQL Server, all listening on different ports with different sa passwords.

A bit of prep before we run any commands. I'm going to create a couple of folders on my C:\ drive that'll hold the compose and dockerfiles: –

mkdir C:\docker
mkdir C:\docker\builds\dev1
mkdir C:\docker\compose

Within the C:\docker\builds\dev1 directory, I’m going to drop my database files and my dockerfile: –

N.B. – note the name of the dockerfile (dockerfile.dev1)

Here’s the code within my dockerfile: –

# building our new image from the Microsoft SQL Server 2017 image
FROM microsoft/mssql-server-windows


# creating a directory within the container
RUN powershell -Command (mkdir C:\\SQLServer)


# copying the database files into the container
# no file path is specified for the files, so they need to be in the same location as the dockerfile
COPY DevDB1.mdf C:\\SQLServer
COPY DevDB1_log.ldf C:\\SQLServer

COPY DevDB2.mdf C:\\SQLServer
COPY DevDB2_log.ldf C:\\SQLServer

COPY DevDB3.mdf C:\\SQLServer
COPY DevDB3_log.ldf C:\\SQLServer

COPY DevDB4.mdf C:\\SQLServer
COPY DevDB4_log.ldf C:\\SQLServer

COPY DevDB5.mdf C:\\SQLServer
COPY DevDB5_log.ldf C:\\SQLServer


# attach the databases into the SQL instance within the container
ENV attach_dbs="[{'dbName':'DevDB1','dbFiles':['C:\\SQLServer\\DevDB1.mdf','C:\\SQLServer\\DevDB1_log.ldf']}, \
	{'dbName':'DevDB2','dbFiles':['C:\\SQLServer\\DevDB2.mdf','C:\\SQLServer\\DevDB2_log.ldf']}, \
	{'dbName':'DevDB3','dbFiles':['C:\\SQLServer\\DevDB3.mdf','C:\\SQLServer\\DevDB3_log.ldf']}, \
	{'dbName':'DevDB4','dbFiles':['C:\\SQLServer\\DevDB4.mdf','C:\\SQLServer\\DevDB4_log.ldf']}, \
	{'dbName':'DevDB5','dbFiles':['C:\\SQLServer\\DevDB5.mdf','C:\\SQLServer\\DevDB5_log.ldf']}]"
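
Before wiring this dockerfile into compose, it's worth checking that it builds on its own. A quick sketch (the dev1 image tag is just an example): –

cd C:\docker\builds\dev1

# build an image from dockerfile.dev1, using the current directory as the build context
docker build -t dev1 -f dockerfile.dev1 .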

In the C:\docker\compose directory, I'm going to create one file called docker-compose.yml, which defines the services I want to run in my containers.

The code inside that file is: –

# specify the compose file format
# this depends on what version of docker is running
version: '3'


# define our services, all database containers
# each section specifies a container... 
# the dockerfile name and location...
# port number & sa password
services:
  db1:
    build:
      context: C:\docker\builds\dev1
      dockerfile: dockerfile.dev1
    environment:
      SA_PASSWORD: "Testing11@@"
      ACCEPT_EULA: "Y"
    ports:
      - "15785:1433"
  db2:
    build:
      context: C:\docker\builds\dev1
      dockerfile: dockerfile.dev1
    environment:
      SA_PASSWORD: "Testing22@@"
      ACCEPT_EULA: "Y"
    ports:
      - "15786:1433"
  db3:
    build:
      context: C:\docker\builds\dev1
      dockerfile: dockerfile.dev1
    environment:
      SA_PASSWORD: "Testing33@@"
      ACCEPT_EULA: "Y"
    ports:
      - "15787:1433"
  db4:
    build:
      context: C:\docker\builds\dev1
      dockerfile: dockerfile.dev1
    environment:
      SA_PASSWORD: "Testing44@@"
      ACCEPT_EULA: "Y"
    ports:
      - "15788:1433"
  db5:
    build:
      context: C:\docker\builds\dev1
      dockerfile: dockerfile.dev1
    environment:
      SA_PASSWORD: "Testing55@@"
      ACCEPT_EULA: "Y"
    ports:
      - "15789:1433"

N.B. – To check which versions of docker are compatible with which compose file formats, there is a compatibility matrix here

Now that we have our files created, let’s run our first compose command. To check if it’s installed run: –

docker-compose

N.B. – this is a test command; you should see a help reference output if it is installed (and you can skip the next part).

Hmm… no help output here, so compose isn't installed.

So we need to install it. To do this, run: –

Invoke-WebRequest "https://github.com/docker/compose/releases/download/1.14.0/docker-compose-Windows-x86_64.exe" -UseBasicParsing -OutFile $Env:ProgramFiles\docker\docker-compose.exe

The 1.14.0 in the command above was the latest version at the time of writing. To check what the latest version is, jump onto this GitHub page.

Once the install has finished, verify the version: –

docker-compose version

We need to navigate to the C:\docker\compose directory before we run our first compose command: –

cd C:\docker\compose
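
If you want to sanity-check the yml file before building anything, compose can validate it and print back the resolved configuration: –

# validates the docker-compose.yml in the current directory
docker-compose config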

And now we can run our compose command. The command to utilise compose to build our containers is very simple: –

docker-compose up -d

This command has worked through the docker-compose.yml file and built 5 containers referencing dockerfile.dev1.

I can confirm this by running: –

docker ps

Excellent, five containers up and running! By using docker compose we can build multiple containers running SQL Server with one command. Very useful for building a development environment: once our applications are deployed they can connect to SQL within the containers instantly.
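
And when the environment is no longer needed, tearing it down is just as easy. Run this from the same directory as the yml file and compose will stop and remove all five containers (plus the network it created): –

docker-compose down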

Final note

One thing to mention is that you may come across this error: –

The way I got around this was to disable the existing vEthernet (HNS Internal NIC) adapter in my network connections. Running compose seems to create a new virtual NIC, so you will end up with: –

Let me know if you come across any other issues and I’ll investigate 🙂

Thanks for reading!

Monday Coffee: Database Deployments

I’ve always been particularly cautious when it comes to deploying code to databases, some would say overly cautious.

Because of this I’ve always performed manual deployments. Checking the code, testing the code and then manually running it in production. I’m responsible for the availability, resilience and performance of these databases so I should be the one to deploy to them, right?

I think this is a mindset that a lot of DBAs have and one that is, in my opinion, completely justified. I don't want to be woken up in the middle of the night because something's been released to Production in my absence and it's caused issues.

However, over the last few months I have seen the benefits of continuous integration & continuous deployment, so I have been looking at ways to automate our database deployments. We use Octopus Deploy at my company, so a database deployment process has been built within that.

The tests we’ve done are really promising and last week we started deploying to our Staging environment. If all goes well we’ll be moving to Production soon.

I’m still a little paranoid that something will go wrong if I’m honest. Because of that the database deployments are separate from the app deployments and I’ll be performing them (for now). We have a really good code review process in place so I highly doubt anything will go wrong but it’s just my nature to take changes like this slowly. Validate each step and move onto the next, proving that what you’re doing is working correctly.

The end game here is to integrate the database deployments with the app deployments and have one person performing them. Specialists (like myself) would only be called upon to perform code reviews and resolve any (hopefully none) issues.

I’m off to go and see if we have any Staging deployments to be performed 🙂

Have a good week!

Running dbatools commands with VS Code tasks

I’ve started to look at the excellent dbatools.io to automate some of the checks that I routinely perform on my SQL instances.

But before I go into this post, I want to say a thank you to Cláudio Silva (t). The powershell commands below are based on the code he posted in his excellent blog Have you backed up your SQL logins today?

The first dbatools commands that I looked at are: –

  • Get-DbaLastBackup
  • Get-DbaLastGoodCheckDb

These commands do exactly what they say on the tin. Pretty standard stuff for DBAs but what's cool is how we can use Visual Studio Code to quickly and easily check that all our databases are being backed up and have a recent (good) CHECKDB.
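
If you want to see what each command can do before running anything, the built-in help is worth a look: –

# dbatools commands ship with help and examples
Get-Help Get-DbaLastBackup -Examples
Get-Help Get-DbaLastGoodCheckDb -Examples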

I’m going to setup two scripts to run the dbatools commands against my SQL instances via Visual Studio Code Tasks.

I don’t have a central management server so I created a database in my local instance of SQL to hold all my server names: –

CREATE DATABASE [DBA];
GO

USE [DBA];
GO

-- create the schema that will hold the monitoring table
CREATE SCHEMA [monitoring];
GO

CREATE TABLE [monitoring].[CorporateSQLServers](
	[ServerID] [int] IDENTITY(1,1) NOT NULL,
	[ServerName] [sysname] NOT NULL,
PRIMARY KEY CLUSTERED ([ServerID] ASC) ON [PRIMARY])
ON [PRIMARY]
GO

Once the table was created, I added in all the server names that I wanted to perform my checks against.
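
For example (the server names here are made up), rows can be added with Invoke-Sqlcmd: –

# add a couple of instance names to the monitoring table - these are just examples
Invoke-Sqlcmd -ServerInstance "localhost" -Query "INSERT INTO [DBA].[monitoring].[CorporateSQLServers] ([ServerName]) VALUES (N'SQLSERVER01'), (N'SQLSERVER02');"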

Next thing was to write the scripts. I took Cláudio’s script and modified (actually simplified would be a better description) it, firstly to get the last backup details of my databases: –

#Requires -module dbatools

#Get a list of instances where you will run the command through
$SQLServers = Invoke-Sqlcmd -ServerInstance "localhost" -Query "SELECT ServerName FROM [DBA].[monitoring].[CorporateSQLServers]"

#For each instance 
$SQLServers | ForEach-Object {
     Get-DbaLastBackup -SqlInstance $_.ServerName
} | Out-GridView -Wait

Then to get the last known good CHECKDB of all my databases: –

#Requires -module dbatools

#Get a list of instances where you will run the command through
$SQLServers = Invoke-Sqlcmd -ServerInstance "localhost" -Query "SELECT ServerName FROM [DBA].[monitoring].[CorporateSQLServers]"

#For each instance 
$SQLServers | ForEach-Object {
     Get-DbaLastGoodCheckDb -SqlInstance $_.ServerName
} | Out-GridView -Wait

N.B. – The -Wait option is there to prevent the grid output from closing when the script is run as a task.
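
If you'd rather keep a record of the results than view them in a grid, one option is to swap Out-GridView for Export-Csv (the output path below is just an example): –

#Requires -module dbatools

#Get a list of instances where you will run the command through
$SQLServers = Invoke-Sqlcmd -ServerInstance "localhost" -Query "SELECT ServerName FROM [DBA].[monitoring].[CorporateSQLServers]"

#For each instance, write the results out to a csv file instead of a grid
$SQLServers | ForEach-Object {
     Get-DbaLastGoodCheckDb -SqlInstance $_.ServerName
} | Export-Csv -Path "C:\Temp\LastGoodCheckDb.csv" -NoTypeInformation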

I saved both of these scripts in a directory that Visual Studio Code was pointed at and opened the program.

Then I opened the command palette by running Ctrl + Shift + P and typed Task.

The first option is Configure Task Runner; select that and it will open up the tasks.json file within Visual Studio Code. This is where I can create tasks to run my powershell scripts.

I edited the file with the following code: –

{
    // See https://go.microsoft.com/fwlink/?LinkId=733558
    // for the documentation about the tasks.json format
    "version": "0.1.0",
    "command": "powershell",
    "isShellCommand": true,
    "tasks" : [
        {
            "taskName": "Corporate Database Backup Checks",
            "args": ["-ExecutionPolicy",
                    "Unrestricted",
                    "-NoProfile",
                    "-File","${cwd}/CorporateSQLChecks/LastFullBackup.ps1"],
            "showOutput": "always",
            "suppressTaskName": true
        },
        {
            "taskName": "Corporate Database Integrity Checks",
            "args": ["-ExecutionPolicy",
                    "Unrestricted",
                    "-NoProfile",
                    "-File","${cwd}/CorporateSQLChecks/LastGoodCheckDB.ps1"],
            "showOutput": "always",
            "suppressTaskName": true
        }
    ]
}

By doing this, I will then get the option to run both my scripts from the Visual Studio Code command palette.

Excellent stuff! I now have a quick and easy way to check that all my databases are being backed up and have a recent CHECKDB.

Thanks for reading!

Friday Reading 2017-07-07

Final Lions Test tomorrow! The win last week sets up a thrilling series decider; if they win they'll be the first Lions team to win a series against New Zealand in 46 years! In the build-up I've been reading…

Setting up Visual Studio Code Tasks
I’ve been using VS Code for a bit now and this week set up tasks to run some of my powershell scripts. Really useful.

Best OS for Docker Host?
A Reddit thread about which OS is recommended for a Docker host.

Building a dedicated backup test server
Chrissy LeMaire goes through how to automate testing your backups using dbatools.io commands.

The Feynman Technique: The Best Way to Learn Anything
Article on a learning technique. The advice about teaching is very true!

SQLSaturday Porto
SQLSaturday #685 has been announced and the call for sessions is open

Have a good weekend!

Persisting data in docker containers – Part Three

Last week in Part Two I went through how to create named volumes and map them to containers in order to persist data.

However, there is another option available in the form of data volume containers.

The idea here is that we create a volume in a container and then mount that volume into another container. Confused? Yeah, me too, but it's best to learn by doing so let's run through how to use these things now.

So first we create the data volume container along with a data volume: –

docker create -v C:\SQLServer --name Datastore microsoft/windowsservercore

Note – we're not using an image running SQL Server here. The windowsservercore & mssql-server-windows images share common layers, so we can save space by using the windowsservercore image for our data volume container.

Now create a container and mount the directory from the data volume container to it: –

docker run -d -p 16789:1433 --env ACCEPT_EULA=Y --env sa_password=Testing11@@ --volumes-from Datastore --name testcontainer microsoft/mssql-server-windows

Actually, let’s also create another container but not start it up: –

docker create -p 16789:1433 --env ACCEPT_EULA=Y --env sa_password=Testing11@@ --volumes-from Datastore --name testcontainer2 microsoft/mssql-server-windows

So we now have three containers: –

Let’s confirm that the volume is within the first container: –

docker exec -i testcontainer powershell
ls

Update – April 2018
Loopback has now been enabled for Windows containers, so we can use localhost,16789 to connect locally. You can read more about it here
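
For example, a quick connection test from the host might look like this (assuming sqlcmd is installed on the host): –

# connect to the SQL instance inside the container via the mapped port
sqlcmd -S "localhost,16789" -U sa -P "Testing11@@" -Q "SELECT @@SERVERNAME;"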

Ok, let’s get the private IP of the first container and connect to it: –

docker inspect testcontainer
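
docker inspect dumps out a lot of JSON. If you only want the IP address, a format string will pull it straight out: –

# grab just the container's IP address from its network settings
docker inspect --format "{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}" testcontainer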

Once you have the IP, connect to the SQL instance within the container and create a database (same as in Part One): –

USE [master];
GO
 
CREATE DATABASE [TestDB]
    ON PRIMARY
(NAME = N'TestDB', FILENAME = N'C:\SQLServer\TestDB.mdf')
    LOG ON
(NAME = N'TestDB_log', FILENAME = N'C:\SQLServer\TestDB_log.ldf')
GO

USE [TestDB];
GO
 
CREATE TABLE dbo.testtable
(ID INT);
GO
 
INSERT INTO dbo.testtable
(ID)
VALUES
(10);
GO 100

Now, let’s stop the first container and start the second container: –

docker stop testcontainer

docker start testcontainer2

And let’s have a look in the container: –

docker exec -i testcontainer2 powershell

ls

cd sqlserver

ls

So the files of the database that we created in the first container are available to the second container via the data container!

Ok, so that’s all well and good but why would I want to use data volume containers instead of the other methods I covered in Part One & Part Two?

The docker documentation here says that it's best to use a data volume container; however, they don't give a reason as to why!

The research that I’ve been doing into this has brought me to discussions where people say that there’s no point in using data volume containers and that just using volumes will do the trick.

I tend to agree that there's not much point in using data volume containers; volumes will give you everything that you need. But it's always good to be aware of the different options that a technology can provide.

So which method of data persistence would I recommend?

I would recommend using volumes mapped from the host if you're going to need to persist data with containers. Doing this gives you greater resilience in your setup by separating your database files from the drive that your docker system is hosted on. I also prefer to have greater control over where the files live, purely because I'm kinda anal about these things 🙂
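
As a minimal sketch of that host-mapped approach (the C:\SQLData folder, port and container name are just examples): –

# map a folder on the host straight into the container as C:\SQLServer
docker run -d -p 16790:1433 --env ACCEPT_EULA=Y --env sa_password=Testing11@@ -v C:\SQLData:C:\SQLServer --name hostvolumecontainer microsoft/mssql-server-windows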

I know there’s an argument about using named volumes as it keeps everything within the docker ecosystem but for me, I don’t really care. I have control over my docker host so it doesn’t matter that there’s a dependency on the file system. What I would say though is, try each of these methods out, test thoroughly and come to decision on your own (there, buck successfully passed!).

Thanks for reading!