Monday Coffee: Alerts & Checks

Hope you all had a good weekend; the Lions won, so I’m pretty happy.

Over the weekend I was thinking about the alert system that I have in place at my current position. A good alerting system is (obviously) essential for any DBA to have in place, but we all know that’s not enough, right?

No matter how good your alerts are, detailed monitoring of your systems is also essential.

Detailed monitoring allows you to catch issues before they occur, taking you from “fire fighting” issues to actively working towards keeping your system stable.

However, the processes around your monitoring system(s) are just as important. I’ve worked for companies where it was a requirement to spend pretty much all of each morning meticulously checking off tasks in Excel spreadsheets in order to retain a detailed history.

For me, that’s an example of a bad monitoring process. There are a lot of very good monitoring tools out there that automatically record stats for us and provide reporting.

Monitoring tools are there to allow us as DBAs to quickly observe and review our systems, and that good work is wiped out by having to manually enter data. We don’t want to be performing manual tasks; we want to be working on our current projects.

So if you’re in a position where you need to be entering data manually, you should check out dbatools.io.

dbatools is a project written by DBAs for DBAs. There are over 200 commands there that will allow you to quickly analyse your suite of servers, pulling back whatever stats you need. Do away with that manual data entry; there’s a better way 🙂
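For example, a couple of checks that would otherwise live in a spreadsheet can be pulled in seconds. This is just a rough sketch — the server names are placeholders, and you’ll need the module installed from the PowerShell Gallery first:

```powershell
# One-off install from the PowerShell Gallery
Install-Module dbatools -Scope CurrentUser

# Placeholder instance names - swap in your own servers
$servers = 'sql01', 'sql02', 'sql03'

# Disk space and last backup times across the estate,
# exported to CSV if you still want a history on file
Get-DbaDiskSpace -SqlInstance $servers
Get-DbaLastBackup -SqlInstance $servers |
    Export-Csv -Path .\MorningChecks.csv -NoTypeInformation
```

Two commands and you’ve covered what used to take a morning of clicking through spreadsheets.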

Have a good week!

Friday Reading 2017-06-30

Second Test for the British & Irish Lions tomorrow; they need this one to keep the series alive. I’ll be up early to watch it tomorrow, and the team selection is here.

But anyway, this week I’ve been reading…

Using SQL Server Diagnostics
Arun Sirpal goes through how to use a new extension for SSMS that analyses SQL Server dump files.

PowerShell Module for the SQL Server Diagnostics API – 1st Command Get-SQLDiagRecommendations
Continuing to look at the new SQL Server Diagnostics tool, Rob Sewell goes through a PowerShell module he’s built to interact with the API.

Demystifying Microsoft Distributed Transaction Coordinator and SQL Server HA Configurations
Allan Hirt clarifies an area that has always been a bit of a black box for me: Distributed Transactions.

SQL Server 2012 SP4 now to be released in September 2017
I wonder what they found when testing 🙂

‘Vaccine’ created for huge cyber-attack
There’s a defence against the latest ransomware, but that doesn’t mean you shouldn’t be fully patched.

Have a good weekend!

Persisting data in docker containers – Part Two

Last week in Part One I went through how to mount directories from the host server into a docker container in order to persist data.

However, mounting host directories isn’t the only way to persist data. Another method is to create named volumes within the docker ecosystem (as it were) and then map those volumes into containers upon creation.

Here’s how to do it.

First, we create the volume that we want to share within docker: –

docker volume create sqldata
docker volume ls


Now let’s create two containers, both referencing the volume. One will be up and running whilst the other remains in the stopped state: –

docker run -d -p 15789:1433 -v sqldata:C:\sqldata --env ACCEPT_EULA=Y --env sa_password=Testing11@@ --name testcontainer microsoft/mssql-server-windows


docker create -p 15789:1433 -v sqldata:C:\sqldata --env ACCEPT_EULA=Y --env sa_password=Testing11@@ --name testcontainer2 microsoft/mssql-server-windows


Let’s have a look in the first container to see if the volume is there: –

docker exec -it testcontainer powershell


Update – April 2018
Loopback has now been enabled for Windows containers, so we can use localhost,15789 to connect locally. You can read more about it here.
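With loopback enabled, a quick smoke test from the host might look like this (assuming sqlcmd is installed locally; the sa password is the one set when the container was created):

```shell
sqlcmd -S localhost,15789 -U sa -P Testing11@@ -Q "SELECT @@SERVERNAME;"
```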

OK, it’s there. So grab the private IP address of the container so that we can connect to it in SSMS: –

docker inspect testcontainer
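docker inspect dumps out a big block of JSON; if all you’re after is the IP address, a --format template trims the output down to just that (purely a convenience — the plain inspect works fine too):

```shell
docker inspect --format "{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}" testcontainer
```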


Now we’ll create a database with its files in that location: –

USE [master];
GO
 
CREATE DATABASE [TestDB]
    ON PRIMARY
(NAME = N'TestDB', FILENAME = N'C:\sqldata\TestDB.mdf')
    LOG ON
(NAME = N'TestDB_log', FILENAME = N'C:\sqldata\TestDB_log.ldf')
GO

USE [TestDB];
GO
 
CREATE TABLE dbo.testtable
(ID INT);
GO
 
INSERT INTO dbo.testtable
(ID)
VALUES
(10);
GO 100


Right, now let’s blow that first container away and spin up the second one: –

docker stop testcontainer

docker rm testcontainer

docker start testcontainer2


Hmm, all looks good. But let’s check that the volume is there with the database’s files: –

docker exec -it testcontainer2 powershell

cd sqldata

ls


Cool! The files are there, so let’s connect to the SQL instance within the second container and see if we can attach the database: –

docker inspect testcontainer2


Let’s try the attach: –

USE [master]
GO
CREATE DATABASE [TestDB] ON 
( FILENAME = N'C:\sqldata\TestDB.mdf' ),
( FILENAME = N'C:\sqldata\TestDB_log.ldf' )
 FOR ATTACH
GO
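Once the attach succeeds, a quick row count confirms the data came across. This can be run in SSMS, or via sqlcmd inside the container (assuming sqlcmd is available in the image, which it should be as part of the SQL Server install); the count should come back as 100, given the GO 100 earlier:

```shell
docker exec -it testcontainer2 sqlcmd -S localhost -U sa -P Testing11@@ -Q "SELECT COUNT(*) FROM TestDB.dbo.testtable;"
```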


Awesome stuff! We’ve got a database that was created in one container successfully attached in another.

So at this point you may be wondering: what’s the advantage of doing this over mounting folders from the host? Well, to be honest, I really can’t see one.

The volume is completely contained within the docker ecosystem, so if anything happens to the docker install, we’ve lost the data. OK, OK, I know it’s in C:\ProgramData\docker\volumes\ on the host, but I’d still prefer to have more control over its location.
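If you do want to see exactly where a named volume lives on the host, docker will tell you (look at the Mountpoint field in the output):

```shell
docker volume inspect sqldata
```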

I like the idea of mounted volumes better, if I’m honest. I can specify where my database files live with much more control, and I can also access them directly if needed.

However, each to their own, and if you have a good reason for using named volumes over mounted volumes, let me know 🙂

Thanks for reading!

Monday Coffee: Attending Conferences

Last week I asked the following question on Twitter: –

I was interested to see the responses, as I wanted to know how more experienced presenters get the time off to speak at so many events. Up until now I’ve been using annual leave, and this was a way of gauging the general standard out there.

Well, the results showed that there is no general standard.

I had a lot of responses saying that companies would give time off and pay for travel expenses as it was viewed as training but then there were others that said the complete opposite (and everything in-between).

Btw, I’m discounting the consultants that replied. You run your own companies! 🙂

My personal opinion is that attending conferences, even as a speaker, is training. I always go to other sessions when I’ve been speaking at events; I’m not going to stay up in the speaker room all day when there are great sessions going on!

However, I can also see the company side of things, especially if there was no formal agreement in the employment contract. You are out of the office after all, and for at least part of the time, unable to work remotely (if, say, needed on call).

So, I would suggest having a set number of days that I can use to attend conferences, along with the option of working around the hours that I’m off (if needed). That way the company gets the work it needs from me and I can attend the events that I wish to.

What does your company offer?

Friday Reading 2017-06-23

The weather has been absolutely boiling here in Dublin all week; I’m dying for a cold beer! Anyway, whilst drinking beer I’ll be reading…

Docker Basics Webinar Q&A: Understanding Union Filesystems, Storage & Volumes
A webinar (with transcript) going into the storage concepts behind docker containers.

Continuous Delivery to Microsoft Azure with Docker
How to run the Azure CLI from a container.

Fresh Paint – Give VS Code a New Look
VS Code is a great editor. This post details some of the themes that can be set.

PASS Summit 2017 – Speakers
More speakers have been announced for this year’s Summit. The line up looks fantastic!

British & Irish Lions Squad Announcement
First Test against New Zealand tomorrow! Here’s the squad.

Have a good weekend!