John McCormack DBA

SQL Server Databases and Cloud


John McCormack's blogs

How good was Data:Scotland conference in 2024

15th September 2024 By John McCormack Leave a Comment

It’s the Sunday following the Data:Scotland conference for 2024 and I’m still thinking about what a great day it was. From a personal perspective, I was involved as a speaker and a volunteer, so it was a very full day.

Volunteering

Volunteering is great: you get to meet loads of people and you make a big difference to the smooth running of the day. I worked on the registration desk, which is something I do most years, and I was room monitor for four amazing sessions.

  1. Johan Ludwig Bratas: An introduction to Snowflake – the data cloud
  2. Brian Bønk: Know the game you are in and you will not win
  3. Chris Webb: Power BI Copilot Deep Dive
  4. Jamie McLaughlin: Don’t serve the data, serve the business

Speaking

I was on first thing with my talk, which was called The Individual Contributor – How to progress your career without going into management. I have to say I was absolutely gobsmacked to have a full room; not only that, but we had to turn away some people due to fire regulations. I genuinely expected only a smattering of attendees, so I’m very grateful to everyone who came, and I hope it was of some value.

For those who missed it, I have made a very abridged version and stuck it on my YouTube channel. For the video, I’ve focussed on the specifics of going from an associate to a mid-level engineer. I plan to follow up with another on making the next transition to senior engineer, and what is required.

Colleagues old and new

I met loads of colleagues and friends, old and new, and I always enjoy a catch-up. Lunch was fantastic, so well done to the organisers and the venue. Those of us who were speaking attended a dinner on the Thursday night, and it was a welcome opportunity to catch up, and for some of them to try haggis on a pizza for the first time.

Data:Scotland conference 2025

I’ll be back in 2025 in whatever capacity I’m needed. I’m sure they will be looking for volunteers so if you think you’d like to do this and have any questions, please feel free to message me on LinkedIn.

Photos

A full house for the Individual Contributor career talk
John McCormack answering questions
John McCormack answering questions
John and Louise at the registration desk
John and Mikey enjoying a beer

Photo credits: Mikey Bronowski

Filed Under: front-page

How to easily upload, download and migrate SSIS Packages with DTUTIL

9th June 2023 By John McCormack 2 Comments

SSIS and DTUTIL background


SQL Server Integration Services (SSIS) is a powerful tool, but migrating packages across SQL Servers can be a slow and thankless task if you don’t use automation. The single best way to do this is with DTUTIL, a command-line utility provided directly by Microsoft. When it comes to moving thousands of packages, I would only use DTUTIL, because it can achieve in minutes what would take days using point and click.

Migrating a single package

Migrating individual packages directly from one SQL Server to another is also possible with DTUTIL. Here is an example command:

dtutil /SQL "SourceFolder/SourcePackage" /SOURCESERVER "SourceServer" /COPY SQL;"DestinationFolder/DestinationPackage" /DESTSERVER "DestinationServer"
  • Specifies the source package location and server with /SQL “SourceFolder/SourcePackage” and /SOURCESERVER “SourceServer”.
  • Copies the package to the destination SQL Server with /COPY SQL;”DestinationFolder/DestinationPackage”.
  • Identifies the target SQL Server with /DESTSERVER “DestinationServer”.

Migrating multiple packages

This process can be scripted similarly for multiple package migrations. The easiest way is to generate the commands using T-SQL, print them to screen, then copy and paste them into a command window.

If you run the script below on the source server, it will list out a command to migrate every package.

DECLARE 
	@sourceserver sysname = N'DBSERVER1'
	,@destserver sysname = N'DBSERVER2'
	,@foldername sysname = N'DATAWAREHOUSE'
	
SELECT 
'DTUTIL /SQL "'+sspf.foldername+'\'+ssp.[name]+'" /SOURCESERVER "'+@sourceserver+'" /COPY SQL;"'+sspf.foldername+'\'+ssp.[name]+'" /DESTSERVER "'+@destserver+'"' as cmd
FROM msdb.dbo.sysssispackages ssp
JOIN msdb.dbo.sysssispackagefolders sspf
ON ssp.folderid = sspf.folderid
-- WHERE sspf.foldername = @foldername -- Uncomment line to only use specific folder

Upload SSIS packages from disk

Uploading an SSIS package from your disk to SQL Server using DTUTIL is quite straightforward. The command line syntax you will use looks like this:

dtutil /FILE c:\mypackage.dtsx /DestServer DBSERVER2 /COPY SQL;"foldername/mypackage"

Download SSIS packages to disk (Great for backup)

Downloading an SSIS package to your disk from SQL Server using DTUTIL is also straightforward. The command line syntax you will use looks like this:

dtutil /SQL mypackage /COPY FILE;c:\destPackage.dtsx

Again, to download multiple packages, a script that generates a command for each package would be the most efficient way to proceed. Try adapting the ‘migrate’ script above; a sketch of that adaptation is below.
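
If you want to back up every package on a server in one go, this is a minimal sketch of that adaptation. It is not from the original post: it reuses the same msdb query as the migrate script, and the export folder C:\SSISBackup is only an assumption, so change it to a path that exists before pasting the generated commands into a command window.

DECLARE 
	@exportfolder nvarchar(260) = N'C:\SSISBackup' -- assumed export path, change to suit

SELECT 
'DTUTIL /SQL "'+sspf.foldername+'\'+ssp.[name]+'" /SOURCESERVER "'+@@SERVERNAME+'" /COPY FILE;"'+@exportfolder+'\'+ssp.[name]+'.dtsx"' as cmd
FROM msdb.dbo.sysssispackages ssp
JOIN msdb.dbo.sysssispackagefolders sspf
ON ssp.folderid = sspf.folderid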

Summary

In conclusion, DTUTIL is a highly useful tool for managing SSIS package migration. While it might take a bit of practice to get comfortable with the syntax, its flexibility and efficiency make it worth the effort. Remember to always check your commands before executing them to prevent any unwanted actions. A full list of commands is available from Microsoft Learn.

And that’s it, folks! I don’t have much other SSIS content, only this one, which is more of a hack. With these simple and effective DTUTIL commands, you can now manage your SSIS packages with ease. Feel free to share this blog post with anyone who might find it helpful, and don’t hesitate to leave a comment below if you have any questions or want to share your own experiences with DTUTIL.

Filed Under: front-page, SSIS

T-SQL STUFF – Taking it to the limit

11th April 2023 By John McCormack 3 Comments


I don’t often go down code-based rabbit holes; it’s just not what I do day to day. But a while back, someone asked on Twitter’s #sqlhelp hashtag if there was a character length limit to the STUFF function. The documentation didn’t have an answer. For that reason only, I decided to take a look. By some coincidence, this month for T-SQL Tuesday, Reitse Eskens (b|t) asks:

“What are your most fun script, procedures or statements that you’ve written”

T-SQL Tuesday #161 Invitation. Having fun with T-SQL

So, I thought I’d share my code. Spoiler alert: I ran it up to 50 million characters on my laptop and it kept working, but it started to take a long time (4 hours, 28 minutes). Going up to these limits is not your typical day-to-day use of the function; it’s quite unusual and not really a production scenario. I’m not sure this code serves any useful purpose, but sharing it saves it from sitting smugly in my “SillyStuff” folder doing nothing.

What is the point of the STUFF function

There’s no point in me paraphrasing; here is Microsoft’s own explanation: The STUFF function inserts a string into another string. It deletes a specified length of characters in the first string at the start position and then inserts the second string into the first string at the start position.

So in my example, where I am trying to stuff text into strings 50 million characters long, it’s a mostly pointless exercise in terms of reality. Good use cases include obfuscating something sensitive like PII data, formatting dates and times with non-standard characters, or simply inserting a string at a specific place within another string.
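
To make that a little more concrete, here are two quick STUFF examples along those lines. The phone number and time values are made up purely for illustration.

-- Obfuscate the middle of a (made up) phone number: delete 5 characters starting at position 4 and insert asterisks
SELECT STUFF('07700900123', 4, 5, '*****')  -- returns 077*****123

-- Insert a separator into a time stored as hhmm: delete nothing (length 0) and insert ':' at position 3
SELECT STUFF('1430', 3, 0, ':')             -- returns 14:30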

The Code

/*
	This came about after @PamelaMooney tweeted: #SQLHelp Is anyone aware of a character length on the STUFF function?
	I still don't have a definitive answer but I can get it working up to 50 million characters. If you are stuffing beyond that, good luck to you.
*/

DECLARE @var NVARCHAR(MAX) = N''
DECLARE @counter int = 0
WHILE @counter < 200001
BEGIN
SET @var += '500 characters pqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuv500 characters pqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuv500 characters pqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuv500 characters pqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuv500 characters pqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuv'

SET @counter +=1
END
SELECT STUFF(@var, 5, 1, ' Still works! ')
SELECT LEN(@var)

I’m going to leave it there for now; it’s a waste of CPU cycles if I keep trying to break it. If anyone wants to push this to the limit, please feel free to let me know the results in the comments. I hope it was a fun read. Back when I hosted T-SQL Tuesday, I asked about snippets of useful code. Although this is not so useful, it’s the type of topic I like to see offered.

Filed Under: front-page, T-SQL Tuesday Tagged With: stuff, t-sql, t-sql tuesday

Thoughts on sitting DP-300 Azure Database Administrator exam

3rd April 2023 By John McCormack Leave a Comment

On Sunday morning, I sat and passed exam DP-300, which gives the certification of Microsoft Certified: Azure Database Administrator Associate. I’ve been sitting Microsoft exams for a number of years now, but this was my first for a while. I wanted to jot down what thoughts still remained in my head following the exam.

Preparation

I prepared for the exam by following the course from ACloudGuru. I’m a big fan of this site as it has helped me learn so much about AWS, so I thought I would trust it for Azure learning as well. I have to admit I had been going through the course at a snail’s pace, and by the time of completion, I had forgotten some of the things I had learned initially. I sat their exam simulator and got 77%, which was not a pass.

I realised from doing the exam simulator that there were some areas where my knowledge was weaker, specifically some newer features I hadn’t worked with day to day, and also the migration options for the likes of Postgres and MariaDB.

I wrote down each wrong answer, then learned why it was wrong and what was correct. I resat the test and got 87%. I repeated the process and got 89%. A lot of the questions were repeating by now, and I felt ready to sit the exam.

Sitting the DP-300 exam

The exam itself was MUCH harder than the exam simulator. In fairness, though, the simulator pushes you to achieve 80%, which allows a bit of a buffer come exam day (Microsoft only looks for 700/1000). There are some areas I think the simulator could have covered more extensively, and a bigger bank of questions would have been helpful.

However, when it came to the exam, I managed to rely on my experience to pull some old knowledge out of the darkest recesses of my brain, and I made a few educated guesses. Thankfully, I passed with 800+. Not the best score, but the exam itself is pass or fail.

Focus areas for sitting DP-300

I can’t be too specific here due to the NDA, but if you are an experienced SQL DBA with limited or no recent Azure working experience, I would suggest some areas to focus on are:

  1. PaaS vs IaaS
    1. Remember PaaS needs less administrative effort than IaaS or on-premises. Look for clues in the question.
    2. Business Critical and Premium vs General Purpose and Standard
    3. Azure SQL Server IaaS Agent Extension
    4. Database mail for managed instance
  2. Migration options (Online and offline) – And not just for SQL Server
  3. Encryption vs Dynamic data masking
  4. Query store
  5. Key Vault
  6. Some Azure networking like Virtual network and VPN
  7. Know your performance DMVs and how to diagnose blocking and slow-running queries (see the query sketch after this list)
  8. Backup of system DBs
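
For point 7, here is a rough sketch of the kind of blocking check I have in mind, using sys.dm_exec_requests and sys.dm_exec_sql_text. It is a generic query for illustration rather than anything lifted from the exam.

-- Show blocked requests, who is blocking them and what they are running
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,  -- milliseconds spent waiting so far
       t.[text] AS sql_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0  -- only requests that are currently blocked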

What’s next

I’m still deciding, but it’s given me the bug after a couple of years of not doing much active learning. So I’m thinking either AZ-104, to help me re-establish some Azure infrastructure knowledge, or DP-203, to introduce me to some parts of data engineering that I’m looking to learn.

Filed Under: Azure SQL DB, DP-300, front-page Tagged With: azure, certifications, dp-300

Using ChatGPT for the first time

7th December 2022 By John McCormack Leave a Comment

I played with ChatGPT for the first time today. For those who don’t know what ChatGPT is, I asked it, and this was the response:

ChatGPT is an open-source chatbot framework that enables developers to quickly create natural language understanding chatbots using the GPT-3 language model.

https://beta.openai.com/playground

If that still doesn’t make sense, I asked it to explain in a less technical style.

ChatGPT is a tool that makes it easy for developers to create chatbots that can understand and respond to natural language conversations.

https://beta.openai.com/playground

Filed Under: front-page, Guides Tagged With: AI, AWS CLI, chatgpt, t-sql

