John McCormack DBA

SQL Server Databases and Cloud


Falling back in love with Data Community events

14th March 2022 By John McCormack

Data Community Events

[Image: people enjoying data community events]

Last week, I had hoped to go to the SQLBits conference in London, but a variety of factors meant I could only attend the Saturday morning sessions (and virtually at that). I’ve really missed data community events like SQLBits and others.

I wanted to go because I’ve been slightly disengaged from the data community for a year or so, probably due to a combination of factors such as lockdowns, Zoom fatigue and a few speaking knock-backs in the last year which dented my confidence a bit. However, I wanted to attend at least part of SQLBits in the hope it would inspire me and kick-start some new blogging and possibly presenting opportunities. I’ve had some great times attending and presenting at data community events in the past, such as DataGrillen, DataScotland and SQLBits, and I knew it would be worth the effort to get back into the swing of things.

Just attend some sessions and take it from there

Attending sessions where you know little or nothing of the subject matter can be extremely rewarding.

I’m glad I did. The sessions I attended were all extremely enlightening and I enjoyed following along. One thing that immediately came back to me is that attending sessions where you know little or nothing of the subject matter can be extremely rewarding. These sessions keep you informed of the overall technology trends and who is doing what. For example, I won’t have much opportunity to use Azure Arc in the near future, but it’s been around long enough that I can’t ignore it completely. Attending Ben Weissman’s 20-minute taster session was just enough, and it gave me some ideas about how it could be used alongside our on-premises environment. I also really enjoyed learning about the developments to SQL Managed Instance since I last used it over a year ago. Some of the improvements released in the last 12 months could make it a far more viable product for my company.

Keep it going

Thursday night (17th March 2022) sees the latest meeting of the Glasgow Data User Group. I will make a point of attending. Even though the speaker is discussing ETL in the cloud, which is not a big area of professional interest for me, I know I will learn something and I hope it will be entertaining as well as informative. Plus, it will be good to see some old faces, albeit remotely.

12 blog posts

I committed to 12 blog posts this year; this one can serve as #1 and it gets me started. Hopefully, by attending more events, I can find the inspiration needed to get back to creating my own content, keeping up with developments in the data community, and keeping up with old friends.

Filed Under: front-page, SQL Server Tagged With: community, data community, sqlbits

Free SQL Tools to make your life easier

9th February 2021 By John McCormack

[Image: T-SQL Tuesday logo]

I’ve written and spoken about free SQL tools to make your life easier on a few occasions. It was the subject of my presentation at SQLBits 2020, so I was delighted to see Mikey Bronowski (b|t) choose tooling as the subject for February’s T-SQL Tuesday #135 (#tsql2sday).

In this month’s T-SQL Tuesday, Mikey asks us to “Write a blog post about the most helpful and effective tools you use or know of.” I use quite a few free tools. I love the way the SQL community embraces sharing, and I know I wouldn’t have achieved a fraction of my career success without free tools. In my SQLBits talk, I discussed the following free tools (a quick example of running a couple of them appears just after the list):

  1. sp_whoisactive
  2. First Responder Kit
  3. Statistics Parser
  4. Ola Hallengren’s Maintenance Solution
  5. DBATools
  6. DLM Dashboard
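
As a quick taste, the first two are invoked from SSMS something like this. It’s only a minimal sketch: it assumes the procedures have already been installed in the database you are connected to (or master), and the parameter shown is optional.

    -- What is running right now? @get_plans = 1 includes the execution plans (sp_whoisactive)
    EXEC dbo.sp_WhoIsActive @get_plans = 1;

    -- Overall health check of the instance (sp_Blitz, part of the First Responder Kit)
    EXEC dbo.sp_Blitz;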

For this post, I will focus on Statistics Parser, written by Richie Rump (b|t). My blog has info on some other free SQL tools as well.

Statistics Parser

Legend

Legend has it that Richie Rump wrote it during a Brent Ozar conference session. I asked him about this and he told me:

Well, I started it in one of his training classes. It was in Atlanta. It was the last day and it was mentioned that there was an Excel spreadsheet that parsed out statistics io output. I found that as interesting but odd. So I started out writing out a web based parser right there in class. The core of it was finished on the plane ride home.

Richie Rump 2021

So what does Statistics Parser do?

If you run SET STATISTICS TIME, IO ON before you run a query in SSMS, you will get information back in the Messages tab on how much data was accessed and how long it took: things like logical reads, physical reads, CPU time and elapsed time.
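
For example, something like this in SSMS. The table and column names are made up for illustration; substitute a query of your own.

    -- Ask SQL Server to report IO and timing statistics for this session
    SET STATISTICS TIME, IO ON;

    -- Any query you want to measure (dbo.Orders is an invented example table)
    SELECT OrderId, OrderDate, Total
    FROM dbo.Orders
    WHERE OrderDate >= '2021-01-01';

    -- The Messages tab now shows scan counts, logical/physical reads, CPU time and elapsed time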

If you are only querying one or two tables, it is easy enough to just read this in the Messages window. But what about those complex stored procs or queries hitting multiple tables and views? The output can be long and intimidating, and certainly hard to understand at a glance.

Statistics Parser is a web page which allows you to paste in the STATISTICS TIME, IO output from the SSMS Messages tab, and it formats it into neat tables showing how much IO happens for each table. It is immediately easier to read, and you get a handy % column on the right-hand side showing you which tables are being read the most. I find this really useful for query tuning because it lets me know where my biggest pain points are. For complex queries which touch a lot of tables, it makes it easy to see at a glance where you should initially focus your attention. It also shows worktable and workfile tables, which serve as a handy hint that tempdb is in play.

A really handy feature is that you can open multiple browser tabs and give each tab its own name. You can paste the output from the original query into a tab named “before” or “original”, then name a new tab for each thing you try, e.g. “index_abc_added” or “fixed_cursors”. I like to do this when working through query tuning options against a restored copy of production. I use it for tuning CPU and reads, and I quite often find that if I can make a big enough impact on the reads, the CPU will also come down.
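
As a rough sketch of that workflow, with invented table and index names, and assuming you are working on a restored copy rather than production:

    SET STATISTICS TIME, IO ON;

    -- 1. Run the query as-is and paste the Messages output into a Statistics Parser tab named "before"
    SELECT c.CustomerName, SUM(o.Total) AS TotalSpend
    FROM dbo.Orders AS o
    JOIN dbo.Customers AS c ON c.CustomerId = o.CustomerId
    WHERE o.OrderDate >= '2021-01-01'
    GROUP BY c.CustomerName;

    -- 2. Try one change at a time, e.g. a hypothetical covering index
    CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
        ON dbo.Orders (OrderDate)
        INCLUDE (CustomerId, Total);

    -- 3. Re-run the same query, paste the new output into a tab named "index_abc_added",
    --    then compare reads and CPU time side by side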


How to use Statistics Parser

  1. Run the following command in the same SSMS window as the query you are about to troubleshoot: SET STATISTICS TIME, IO ON
  2. Run your query
  3. Copy/paste the output from the SSMS Messages tab into statisticsparser.com (an example of what this output looks like follows the list)
  4. Click the Parse button
  5. Open more tabs as needed
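
For reference, the text you copy out of the Messages tab looks roughly like this. The table name and numbers are invented, and the exact fields vary between SQL Server versions.

    Table 'Orders'. Scan count 1, logical reads 1243, physical reads 3, read-ahead reads 0,
        lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

     SQL Server Execution Times:
       CPU time = 31 ms,  elapsed time = 118 ms.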

Thanks Richie

Thanks for making a great tool, free and easy to use. And thanks for answering my questions.

FYI

Attendees of SQLBits 2020 can still log in using their personal link and see all the recorded content, so my talk is there. In it, I spend a few minutes demonstrating Statistics Parser. I’m not sure if they will make it free to everyone else in future, like they’ve done in previous years.

Featured Image

Image by Lu Lettering from Pixabay

Filed Under: front-page, T-SQL Tuesday Tagged With: #tsql2sday, community, free tools, statistics parser
