Friday Reading 2017-06-09

Exciting day today as I’m presenting my first webinar on SQL Server & Containers at the GroupBy June conference! There’s a whole bunch of other great sessions going on there today; I highly recommend that you check them out.

Been a busy week but when I’ve had some time I’ve been reading…

Why I try to help with dbatools
Things like this are on my to-do list but I never really know how to approach it, so this is a cool post by Shane O’Neill (t) on how he got into working on dbatools.

Unable To Call Into C Compiler !?
Nice problem solving post by Arun Sirpal (t)

How I optimised my life to make my job redundant
An old post by Troy Hunt (t) but a good read.

var Microsoft Data Platform MVP = “Yip.”;
Hamish Watson (t) on becoming an MVP.

SETI “Wow!” Signal Wasn’t Chatty Aliens After All — It Was a Fizzing Comet
A 40-year-old mystery explained!

Have a good weekend!

Identifying failed queries with extended events

Back to some core SQL this week and one of my favourite features, extended events.

Introduced in SQL Server 2008, extended events are a big improvement on SQL Profiler, both in the amount of information that can be tracked (more) and in the impact they have on the system (less).

If you are still using profiler, please stop now! 🙂

One of the sessions that I have constantly running on SQL Server instances that I monitor is one to capture information on failed queries. You need to know what queries are failing on your production instances and why.

So let’s go through setting the session up. Here’s the script to create the event session (remember to change the file paths for the filename and metadata file!): –

CREATE EVENT SESSION [FailedQueries] ON SERVER 
ADD EVENT sqlserver.error_reported 
	(ACTION(sqlserver.client_app_name, sqlserver.client_hostname,  
		sqlserver.database_name, sqlserver.sql_text, sqlserver.username) 
	WHERE ([package0].[greater_than_int64]([severity], (10)))) 

ADD TARGET package0.event_file (SET 
	filename = N'C:\SQLServer\XEvents\FailedQueries.xel'
	,metadatafile = N'C:\SQLServer\XEvents\FailedQueries.xem'
	,max_file_size = (5)
	,max_rollover_files = (10))

WITH (STARTUP_STATE = ON)
GO

The new event session can be viewed in SSMS under Object Explorer > Management > Extended Events.

What this is going to do is create an event session that will start up automatically when the SQL instance starts and capture all errors recorded that have a severity level greater than 10.

Full documentation on severity levels can be found here, but levels 1 through 10 are really just informational messages and you don’t need to worry about them.
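If you’re curious which errors fall into which level on your instance, one way is to eyeball them in sys.messages (this just lists the English messages above the informational range): –

```sql
-- List the English error messages above the informational severity range (1-10)
SELECT message_id, severity, text
FROM sys.messages
WHERE language_id = 1033 -- English
AND severity > 10
ORDER BY severity, message_id;
GO
```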

I’ve also added in some extra information in the ACTION section (for bits and bobs that aren’t automatically included) and have set the maximum number of files that can be generated to 10, each with a max size of 5MB.

I’ve left pretty much everything else out for clarity but more details on the options that are available when creating sessions can be found here.

Let’s start the session up: –

ALTER EVENT SESSION [FailedQueries] ON SERVER 
	STATE = START;
GO

Now you can watch the queries being recorded live by right-clicking the session in SSMS and selecting “Watch Live Data”…

…but I want to show you how to parse the events recorded in the file. Let’s test the event by running: –

SELECT 1/0;

Which will obviously fail!

So now we can parse the event file. This requires a little XQuery but it’s not too bad (also, remember to change the filepath!): –

SELECT
	[XML Data],
	[XML Data].value('(/event[@name=''error_reported'']/@timestamp)[1]','DATETIME')				AS [Timestamp],
	[XML Data].value('(/event/action[@name=''database_name'']/value)[1]','varchar(max)')		AS [Database],
	[XML Data].value('(/event/data[@name=''message'']/value)[1]','varchar(max)')				AS [Message],
	[XML Data].value('(/event/action[@name=''sql_text'']/value)[1]','varchar(max)')				AS [Statement]
FROM
	(SELECT 
		OBJECT_NAME				 AS [Event], 
		CONVERT(XML, event_data) AS [XML Data]
	FROM 
		sys.fn_xe_file_target_read_file
	('C:\SQLServer\XEvents\FailedQueries*.xel',NULL,NULL,NULL)) as FailedQueries;
GO

So there we have the XML generated by the extended event and we’ve parsed it into readable columns!

Now, with larger files, XQuery can be intensive on resources, so what I tend to do is copy the event files to my local machine before parsing; just be wary when running this against large files.

As I said earlier I always have this event running on instances of SQL that I monitor. There really shouldn’t be any impact in doing so as (hopefully) there aren’t that many queries that fail in the instance. As ever though, test this thoroughly before going anywhere near production with it.
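And if you do ever want to get rid of the session, stopping and dropping it is straightforward: –

```sql
-- Stop the session and then remove its definition from the server
ALTER EVENT SESSION [FailedQueries] ON SERVER STATE = STOP;
GO

DROP EVENT SESSION [FailedQueries] ON SERVER;
GO
```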

Thanks for reading!

Monday Coffee: Lab Environments

Having a lab environment is an absolute must for any IT professional. You need somewhere that you can test out code and new functionalities with zero risk. But where’s the best place to host?

My lab is in Azure. It’s convenient and (relatively) cheap, but there is one problem with it: it requires an internet connection in order for me to use it.

Now 99% of the time that’s not an issue. Internet connectivity is available pretty much everywhere these days, but it can still be a problem. Ever been to a presentation where the presenter hasn’t been able to connect to the Wi-Fi? It’s the reason that my demos for more involved subjects are all pre-recorded (that’s not the only reason but it’s a big one).

So what are the other options? Well, I know a couple of people who’ve hosted their own lab but that’s not really an option for many people. It’s expensive, noisy (servers are noisy) and you need quite a bit of space.

Then there’s building one on your local machine, using VirtualBox or Hyper-V. That’s a pretty good way to go, if you have a powerful enough laptop.

Another option is containers. Now that Docker allows you to run Linux containers on Windows you can host pretty much everything on a not-so-powerful laptop. I have multiple images on my laptop that allow me to spin up versions of SQL Server from 2012 to 2017 in seconds. I don’t need to waste resources running full-blown installs of all the different versions of SQL Server that I work with; the containers give me (almost) everything I need.
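For example, with one of Microsoft’s Linux images pulled down, spinning up a fresh instance is a one-liner (the container name and sa password here are just placeholders): –

```shell
# Spin up a throwaway SQL Server instance from Microsoft's Linux image
# (the container name and password are examples - substitute your own)
docker run -d --name testsql -p 1433:1433 \
    -e 'ACCEPT_EULA=Y' \
    -e 'SA_PASSWORD=Testing11@@' \
    microsoft/mssql-server-linux
```

A few seconds later you can connect to localhost,1433 with the sa credentials, and docker rm -f testsql throws the whole thing away again.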

OK, I admit that there are some limitations so that’s where my Azure lab steps in but for most things, spinning up a container is a fast and simple way for me to test out code and investigate issues.

I highly recommend that anyone working with SQL Server (or any other app really) download Docker and get involved with using containers. There really isn’t a downside to doing so.

Have a good week!

Friday Reading 2017-06-02

The weather in Dublin has reverted to type (aka rain) but hopefully we’ll get some sun this weekend! This week I’ve been doing a lot of work with PowerShell, so what I’ve been reading kinda focuses on that…

GitHub – PowerShell Module for Docker
A PowerShell module for Docker!

Parsing Docker Commands
Thought I’d (shamelessly) plug my own GitHub repo for working with Docker 🙂

AWS Tools for Windows PowerShell
I’ve been troubleshooting our AWS snapshots this week so have been reading the official documentation for the AWS PowerShell tools.

dbatools
An excellent PowerShell module for interacting with SQL Server. Getting to grips with this is something I’ve had on my to-do list for far too long.

Microsoft Doubles Down on Containers with Deis Acquisition
An article about Microsoft acquiring a company called Deis. Further proof that Microsoft is in the container business for the long run.

Have a good weekend!

Parsing Docker Commands

Bit of a PowerShell-themed post this week as I haven’t had that much time to research, so this one falls firmly into the “what I’ve been doing this week” category.

My company moved to using containers a while ago now. It’s been really fun to set up, and I’ve written about the architecture and process (here).

But, so that you don’t have to click the above link, I’ll quickly recap what we’re doing now.

We use containers but aren’t on Windows Server 2016 or SQL Server 2016 so we’re using a product called Windocks that allows earlier versions of SQL Server to run in containers on earlier versions of Windows Server.

We have a physical host running the Windocks daemon and all our app VMs contact the host to build and reference SQL instances within containers. Each container is built from a custom image that contains stripped down versions of our production databases that we call baselines.

Each month (it’ll be more frequent soon) we update the custom image by: –

  • Creating new baselines of our production databases from backups
  • Committing those backups to TFS
  • Deleting the old image from the Docker repository
  • Building a new image from a dockerfile referencing those backups
  • Committing the new image

What I’ve been working on is automating the build of the new image once new baselines are checked into source control.

One of the requirements that’s come out of this is the ability to parse the output of the docker images and docker ps commands.

These commands give you an overview of what’s on your docker host, the images you have in your repository and what containers you have (and what state they’re in).

What I needed to do was parse those commands so I could work out things like: –

  • What images do we have available?
  • What version are those images (when were they built)?
  • What size are the images?
  • How many containers have been built?
  • When were the containers built?
  • What state are the containers in?

I needed to be able to gather this information and pass it into commands so that my scripts would be able to work out how to proceed. So I’ve written a bit of code in order to do just that.
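As a rough illustration of the idea (my actual script is in PowerShell, and the repository names below are invented), parsing the raw output of docker images boils down to skipping the header row and splitting each line into fields: –

```shell
# Sample output as captured from `docker images` (a header row plus one
# line per image) - the repository names are made up for this example
sample='REPOSITORY        TAG      IMAGE ID       CREATED       SIZE
mssql-2017-base   latest   0123456789ab   2 days ago    1.4GB
mssql-2014-base   latest   ba9876543210   5 weeks ago   1.1GB'

# Skip the header row (NR > 1) and pull out the repository, tag and
# size columns ($NF is the last field, as CREATED contains spaces)
echo "$sample" | awk 'NR > 1 { print $1, $2, $NF }'
```

Newer versions of the Docker client also support a --format flag (e.g. docker images --format '{{.Repository}} {{.Size}}') which sidesteps the text parsing entirely, but parsing the raw output keeps the script working against older engines.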

This is a bit of a change for me as I usually just drop code into my posts, but as it’s still a work in progress, what I’ve done is create a GitHub account and upload the script. You can find it here: – https://github.com/dbafromthecold/parsedockercommands

It’s really simple to use: just change the variables at the top for your environment and you’re off. The only slightly tricky bit is making sure that your Docker engine is configured for remote administration, but I’ve also fully detailed how to set that up here.

What you’ll end up with is two arrays holding details of all the images and containers on your host which you can then use for, well, whatever!

There are probably better ways of doing this but it’s always fun to work out how to do it yourself. I’m more than open to suggestions on how to improve the script, so let me know if you have anything. 🙂

Thanks for reading!