
Limiting resources available to containers

One question that I get asked regularly is, “Can you limit the host resources available to individual containers?”

This is a great question as you don’t want one container consuming all the resources on your host, starving all the other containers.

It’s actually really simple to limit the resources available to containers; there are a couple of switches that can be specified at runtime. For example: –

docker run -d -p 15789:1433 `
    --cpus=2 --memory=2048m `
    --env ACCEPT_EULA=Y --env SA_PASSWORD=Testing11@@ `
    --name testcontainer microsoft/mssql-server-linux:2017-GA

What I’ve done here is use the --cpus and --memory switches to limit that container to a maximum of 2 CPUs and 2GB of RAM. There are other options available; more info is available here.

Simple, eh? But it does show something interesting.

I’m running Docker on my Windows 10 machine, using Linux containers. The way this works is by spinning up a Hyper-V Linux VM to run the containers (you can read more about this here).

When that Linux VM first spins up it only has 2GB of RAM available to it. This isn’t enough to run SQL Server containers; if you try, you’ll get the following error: –

The Linux VM has to have at least 3250MB of RAM available to it in order to run a SQL Server container. But when you run an individual container you can limit that container to less than 3250MB, as I’ve done here.

But how do you decide what limits to impose? Well, there’s always trial and error (it’s served me well), or you can use the docker stats command.

What I’d recommend doing is spinning up a few containers using docker compose, running some workload against them, and then monitoring using: –

docker stats

This way you can monitor the resources consumed by the containers and make an informed decision when it comes to setting limits.
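To make that concrete, here’s a minimal sketch of the kind of docker-compose.yml I mean: two SQL Server containers with the same limits as the docker run example above. The service names and host ports here are just examples, and the cpus key needs compose file format 2.2 or later.

```yaml
version: "2.2"
services:
  sql1:
    image: microsoft/mssql-server-linux:2017-GA
    ports:
      - "15789:1433"
    environment:
      ACCEPT_EULA: "Y"
      SA_PASSWORD: "Testing11@@"
    cpus: 2          # cap at 2 CPUs (compose file format 2.2+)
    mem_limit: 2048m # cap at 2GB of RAM
  sql2:
    image: microsoft/mssql-server-linux:2017-GA
    ports:
      - "15790:1433"
    environment:
      ACCEPT_EULA: "Y"
      SA_PASSWORD: "Testing11@@"
    cpus: 2
    mem_limit: 2048m
```

Bring them up with docker-compose up -d, run your workload, and watch docker stats to see how close each container gets to its limits.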

Thanks for reading!


Monday Coffee: Long hours worked

Last week I asked the following question on Twitter: –

I had a load of responses from DBAs out there who have worked some crazy shifts; you can see all the responses here.

My personal record is due to a data centre migration. We moved all our physical kit from on-site to a data centre. I racked up 23 hours straight, followed by another 14 hours.

Pretty heavy going but nowhere near some of the responses I had (seriously there were some mammoth shifts in there).

I’m not sure how everyone else feels about these kinds of shifts. They’re definitely not enjoyable when they’re happening, but I kind of hold onto mine as a badge of honour: something that every DBA goes through, and one of those experiences that I feel has made me more seasoned.

Saying that, I don’t want any 20+ hour shifts again any time soon 🙂

Have a good (hopefully not too long) week!


Friday Reading 2017-09-29

Another busy week almost over and today I’m heading to Utrecht for SQL Saturday Holland! I’ve never been to Utrecht before, so I’m going to spend some time exploring 🙂

This week I’ve been reading…

Ask HN: Would you run SQL Server on Linux?
James Anderson (t) asks Hacker News readers if they would run SQL Server on Linux

When deadlocks become art
Matthew McGiffen’s (t) post on some deadlock horrors!

Scheduling powershell tasks with sql agent
Dbatools (t) post on how to set up PowerShell jobs in the SQL Server Agent (I’ve always found it to be a pain)

Measure Your Docker Containers’ Resources
Nick Janetakis’ (t) post on how to measure resources consumed by your containers

Who is using Docker containers and why?
Kumina infographic on how Docker is being implemented

Announcing the public preview of PowerShell in Azure Cloud Shell
Finally, PowerShell is available in Azure Cloud Shell

Have a good weekend!


The docker kill command

When running demos and experimenting with containers I always clear down my environment. It’s good practice to leave a clean environment once you’ve finished working.

To do this I blow all my containers away, usually by running the docker stop command.

But there’s a quicker way to stop containers, the docker kill command.

Time the difference for yourself: –

docker run -d -p 15789:1433 --env ACCEPT_EULA=Y --env SA_PASSWORD=Testing11@@ --name testcontainer microsoft/mssql-server-linux

First try the stop command: –

docker stop testcontainer

And then try the kill command: –

docker kill testcontainer

The kill command is pretty much instant, right? But what’s the difference?

Well, according to the documentation, the stop command sends a SIGTERM signal to the main process within the container (following up with a SIGKILL if it hasn’t exited after a grace period, 10 seconds by default), whereas the kill command sends a SIGKILL straight away.

There’s a really good article here explaining the differences between the two signals, but from what I can gather, the stop command gracefully shuts down the process within the container, whereas the kill command just stops it dead.
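You can actually see the difference between the two signals without Docker at all. This little shell sketch mimics what happens to the main process in each case, with a trap playing the part of a process that shuts down gracefully on SIGTERM:

```shell
# A process that traps SIGTERM and exits cleanly (what docker stop relies on).
sh -c 'trap "exit 0" TERM; while true; do sleep 1; done' &
pid=$!
sleep 1
kill -TERM "$pid"   # what docker stop sends
wait "$pid"
term_status=$?      # 0: the trap ran and the process chose its own exit code

# The same loop, but SIGKILL cannot be trapped: the process dies immediately.
sh -c 'trap "exit 0" TERM; while true; do sleep 1; done' &
pid=$!
sleep 1
kill -KILL "$pid"   # what docker kill sends
wait "$pid"
kill_status=$?      # 137 = 128 + 9, i.e. killed by signal 9 (SIGKILL)

echo "SIGTERM exit status: $term_status"
echo "SIGKILL exit status: $kill_status"
```

The SIGTERM’d process gets to run its cleanup and pick its own exit code, while the SIGKILL’d one is reported as killed by signal 9 (exit status 128 + 9 = 137), which is exactly the graceful-versus-dead-stop behaviour you see with docker stop versus docker kill.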

Thanks for reading.


An analytical mind

Last week I was reading this article, in which a professor argues that students should be allowed to take smartphones and laptops into exams: –

Professor Mazur urged academics to take the next leap allowing students to use their laptops and phones in exams. He himself encourages students to bring their mobile devices into exams to “look up whatever you want, whenever you want” arguing that in the era of the Google search, students “don’t need to memorize anything.”

Now, I don’t completely agree that students don’t need to memorise anything, but I do agree with the following statement: –

…he prefers students to be creative and use critical thinking and other analytical skills.

Analytical skills are absolutely critical to any IT professional. The ability to figure out what the problem is, research, and then deploy a solution (with a rollback in mind) is pretty much a cornerstone of what we do.

Ok, I fully admit that “googling” an issue doesn’t appear to be much of a skill, but if you think about it, it is. Googling follows the exact same analytical procedure I identified above: figuring out the problem, searching for a solution, and then identifying the correct one from the results.

However, a certain amount of base knowledge is required for this process. You need knowledge of the systems that you work with in order to identify the solution. Blindly googling and deploying the first solution displayed is a recipe for disaster.

Have a good week!