John McCormack DBA

SQL Server Databases and Cloud


John McCormack's blogs

How to easily upload, download and migrate SSIS Packages with DTUTIL

9th June 2023 By John McCormack

SSIS and DTUTIL background


SQL Server Integration Services (SSIS) is a powerful tool, but migrating packages across SQL Servers can be a slow and thankless task if you don't use automation. The single best way to do this is with DTUTIL, a command-line utility provided directly by Microsoft. When it comes to moving thousands of packages, I would only use DTUTIL because it can achieve in minutes what would take days using point and click.


Migrating a single package

Migrating individual packages from one SQL Server to another is also possible with DTUTIL. Here is an example command:

dtutil /SQL "SourceFolder/SourcePackage" /SOURCESERVER "SourceServer" /COPY SQL;"DestinationFolder/DestinationPackage" /DESTSERVER "DestinationServer"
  • Specifies the source package location and server with /SQL "SourceFolder/SourcePackage" and /SOURCESERVER "SourceServer".
  • Copies the package to the destination SQL Server with /COPY SQL;"DestinationFolder/DestinationPackage".
  • Identifies the target SQL Server with /DESTSERVER "DestinationServer".


Migrating multiple packages

This process can be scripted similarly for multiple package migrations. The easiest way is to generate the commands using T-SQL, print them to screen, then copy and paste them into a command window.

If you run the query below on the source server, it will output a command to migrate every package.

DECLARE 
	@sourceserver sysname = N'DBSERVER1'
	,@destserver sysname = N'DBSERVER2'
	,@foldername sysname = N'DATAWAREHOUSE'
	
SELECT 
'DTUTIL /SQL "'+sspf.foldername+'\'+ssp.[name]+'" /SOURCESERVER "'+@sourceserver+'" /COPY SQL;"'+sspf.foldername+'\'+ssp.[name]+'" /DESTSERVER "'+@destserver+'"' as cmd
FROM msdb.dbo.sysssispackages ssp
JOIN msdb.dbo.sysssispackagefolders sspf
ON ssp.folderid = sspf.folderid
-- WHERE sspf.foldername = @foldername -- Uncomment line to only use specific folder
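
With the sample values above, each row returned is a ready-to-run command. For a hypothetical package called LoadSales in the DATAWAREHOUSE folder, the generated command would look like this:

DTUTIL /SQL "DATAWAREHOUSE\LoadSales" /SOURCESERVER "DBSERVER1" /COPY SQL;"DATAWAREHOUSE\LoadSales" /DESTSERVER "DBSERVER2"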


Upload SSIS packages from disk

Uploading an SSIS package from your disk to SQL Server using DTUTIL is quite straightforward. The command line syntax you will use looks like this:

dtutil /FILE c:\mypackage.dtsx /DestServer DBSERVER2 /COPY SQL;"foldername/mypackage"


Download SSIS packages to disk (Great for backup)

Downloading an SSIS package to your disk from SQL Server using DTUTIL is also straightforward. The command line syntax you will use looks like this:

dtutil /SQL mypackage /COPY FILE;c:\destPackage.dtsx

Again, to download multiple packages, a script that generates a command for each one is the most efficient way to proceed. Try adapting the 'migrate' script above; a sketch of one possible adaptation follows.
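
As a starting point, here is a minimal sketch of that adaptation. The backup path C:\SSISBackup is a made-up example (change it to suit), and it reads the same msdb tables as the migrate script:

DECLARE @backuppath NVARCHAR(260) = N'C:\SSISBackup' -- made-up destination folder on disk

-- Run this on the source server; each row returned is a DTUTIL download command
SELECT 
'DTUTIL /SQL "'+sspf.foldername+'\'+ssp.[name]+'" /COPY FILE;"'+@backuppath+'\'+ssp.[name]+'.dtsx"' as cmd
FROM msdb.dbo.sysssispackages ssp
JOIN msdb.dbo.sysssispackagefolders sspf
ON ssp.folderid = sspf.folderid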


Summary

In conclusion, DTUTIL is a highly useful tool for managing SSIS package migration. While it might take a bit of practice to get comfortable with the syntax, its flexibility and efficiency make it worth the effort. Remember to always check your commands before executing them to prevent any unwanted actions. A full list of commands is available from Microsoft Learn.

And that's it, folks! I don't have much other SSIS content, only this one post, which is more of a hack. With these simple and effective DTUTIL commands, you can now manage your SSIS packages with ease. Feel free to share this blog post with anyone who might find it helpful, and don't hesitate to leave a comment below if you have any questions or want to share your own experiences with DTUTIL.

Filed Under: front-page, SSIS

T-SQL STUFF – Taking it to the limit

11th April 2023 By John McCormack


I don't often go down code-based rabbit holes; it's just not what I do day to day. But a while back, someone asked on Twitter's #sqlhelp hashtag if there was a character length limit to the STUFF function. The documentation didn't have an answer, so for that reason only, I decided to take a look. By coincidence, this month's T-SQL Tuesday host, Reitse Eskens (b|t), asks:

“What are your most fun script, procedures or statements that you’ve written”

T-SQL Tuesday #161 Invitation. Having fun with T-SQL

So, I thought I'd share my code. Spoiler alert: I ran it up to 50 million characters on my laptop and it kept working, but it started to take a long time (4 hours, 28 minutes). Going up to these limits is not your typical day-to-day use of the function; it's quite unusual and not really a production scenario. I'm not sure this code serves any useful purpose, but sharing it saves it sitting smugly in my "SillyStuff" folder doing nothing.

What is the point of the STUFF function

There's no point in me paraphrasing; here is Microsoft's own explanation: The STUFF function inserts a string into another string. It deletes a specified length of characters in the first string at the start position and then inserts the second string into the first string at the start position.

So in my example, where I am trying to stuff text into 50-million-character strings, it's a mostly pointless exercise in terms of reality. Good use cases include obfuscating something sensitive like PII data, formatting dates and times with non-standard characters, or simply inserting a string at a specific place within another string. A sketch of the obfuscation case is below.
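
To make the obfuscation case concrete, here is a quick sketch using a made-up card number. STUFF keeps the first and last four digits and overwrites the middle eight with asterisks:

DECLARE @cardnumber VARCHAR(16) = '4111111111111111' -- made-up test value
SELECT STUFF(@cardnumber, 5, 8, REPLICATE('*', 8)) -- returns 4111********1111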

The Code

/*
	This came about after @PamelaMooney tweeted: #SQLHelp Is anyone aware of a character length on the STUFF function?
	I still don't have a definitive answer but I can get it working up to 50 million characters. If you are stuffing beyond that, good luck to you.
*/

DECLARE @var NVARCHAR(MAX) = N''
DECLARE @counter int = 0
WHILE @counter < 200001
BEGIN
SET @var += '500 characters pqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuv500 characters pqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuv500 characters pqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuv500 characters pqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuv500 characters pqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuv'

SET @counter +=1
END
SELECT STUFF(@var, 5, 1, ' Still works! ') -- Delete 1 character at position 5 and insert the replacement string
SELECT LEN(@var) -- Confirm the length of the string we built

I'm going to leave it there for now; it's a waste of CPU cycles if I keep trying to break it. If anyone wants to push this to the limit, please feel free to let me know the results in the comments. I hope it was a fun read. Back when I hosted T-SQL Tuesday, I asked about snippets of useful code. Although this is not so useful, it's the type of topic I like to see offered.

Filed Under: front-page, T-SQL Tuesday Tagged With: stuff, t-sql, t-sql tuesday

Thoughts on sitting DP-300 Azure Database Administrator exam

3rd April 2023 By John McCormack

On Sunday morning, I sat and passed exam DP-300, which gives the certification of Microsoft Certified: Azure Database Administrator Associate. I've been sitting Microsoft exams for a number of years now, but this was my first for a while. I wanted to jot down what thoughts still remained in my head following the exam.

Preparation

I prepared for the exam by following the course from ACloudGuru. I'm a big fan of this site as it has helped me learn so much about AWS, so I thought I would trust it for Azure learning as well. I have to admit I had been going through the course at a snail's pace, and by the time of completion, I had forgotten some of the things I had learned initially. I sat their exam simulator and got 77%, which was not a pass.

I realised from doing the exam simulator that there were some areas where my knowledge was weaker, specifically on some newer features I hadn’t worked with day to day and also migration options for the likes of Postgres and MariaDB.

I wrote down each wrong answer and learned why it was wrong and what was correct. I resat the test and got 87%. I repeated the process and got 89%. A lot of the questions were repeats by now, and I felt ready to sit the exam.

Sitting the DP-300 exam

The exam itself was MUCH harder than the exam simulator. In fairness though, the simulator pushes you to achieve 80%, which allows a bit of a buffer come exam day (Microsoft only looks for 700/1000). There are some areas I think the simulator could have covered more extensively, and a bigger bank of questions would have been helpful.

However, when it came to the exam, I managed to rely on my experience to pull some old knowledge out of the darkest recesses of my brain, and I made a few educated guesses. Thankfully I passed with 800+. Not the best score, but the exam itself is pass or fail.

Focus areas for sitting DP-300

I can't be too specific here due to NDA, but if you are an experienced SQL Server DBA with limited or no recent Azure working experience, some areas to focus on are:

  1. PaaS vs IaaS
    1. Remember PaaS needs less administrative effort than IaaS or on-premises. Look for clues in the question.
    2. Business Critical and Premium vs General Purpose and Standard
    3. SQL Server IaaS Agent Extension
    4. Database Mail for managed instance
  2. Migration options (online and offline), and not just for SQL Server
  3. Encryption vs dynamic data masking
  4. Query Store
  5. Key Vault
  6. Some Azure networking, like virtual networks and VPN
  7. Know your performance DMVs and how to diagnose blocking and slow-running queries (see the sketch after this list)
  8. Backup of system DBs
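
On the DMV point, here is a minimal sketch of the kind of diagnostic query I mean, using sys.dm_exec_requests and sys.dm_exec_sql_text to list blocked sessions along with the statements they are running:

SELECT r.session_id,
	r.blocking_session_id,
	r.wait_type,
	r.wait_time,
	t.[text] AS query_text
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
WHERE r.blocking_session_id <> 0 -- only sessions that are currently blocked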

What’s next

I'm still deciding, but the exam has given me the bug after a couple of years of not doing much active learning. I think it will be AZ-104, to help me re-establish some Azure infrastructure knowledge, or DP-203, to introduce me to some parts of data engineering that I'm looking to learn.

Filed Under: Azure SQL DB, DP-300, front-page Tagged With: azure, certifications, dp-300

Using ChatGPT for the first time

7th December 2022 By John McCormack

I played with ChatGPT for the first time today. For those who don't know what ChatGPT is, I asked it, and this was the response.

ChatGPT is an open-source chatbot framework that enables developers to quickly create natural language understanding chatbots using the GPT-3 language model.

https://beta.openai.com/playground

If that still doesn’t make sense, I asked it to explain in a less technical style.

ChatGPT is a tool that makes it easy for developers to create chatbots that can understand and respond to natural language conversations.

https://beta.openai.com/playground

Filed Under: front-page, Guides Tagged With: AI, AWS CLI, chatgpt, t-sql

DataGrillen 2022 – Photos and notes

8th June 2022 By John McCormack

I'm just back from DataGrillen 2022. As ever, I wanted to put together a short post to share some things I learned, along with some photos of the event. First of all, thanks to William Durkin and Ben Weissman for such a hospitable welcome, and congratulations on hosting a very successful event.

On top of everything else that goes into planning a complex event like this, they also had an added responsibility this time: keeping people free of COVID as far as was reasonably possible. This meant mandatory testing, vaccine passports and mask wearing (inside only). It was a small price to pay after two years of online-only conferences.

The barbecue event was on the middle night, after the sessions. There was a great choice of food, everything was cooked really well, and of course there were complimentary beers to wash it down.

Following on from the main event, which comprised two days of technical sessions, the host town of Lingen was celebrating the historic Kivelingsfest, a medieval festival dating back hundreds of years. As a result, many of the conference attendees stayed on an extra day to attend the festival, which was great fun.

It was home on Sunday, but not before we got the chance to celebrate with André Kamman by taking a boat trip along Amsterdam's canals.

DataGrillen 2022 photos

  • DataGrillen 2022 Keynote slide
  • Andrew Pruski DeepDive into Docker presentation
  • Jess and Sander presenting
  • Argenis talks about ransomware
  • Monica Rathbun talks Azure SQL DB performance
  • John Martin speaks to the room
  • The Data Dance Teacher Robert French
  • Beer in the final session of the day
  • Barbecue food
  • Glass of beer
  • Kivelingsfest fun
  • Amsterdam canal boat
  • Amsterdam canal boat with large greenhouse in background
  • Fire at Kivelingsfest, Lingen

DataGrillen 2022 Sessions

These are the sessions I attended. I tried to fit in as much as possible from the two day event.

John Martin – Performing successful cloud migrations – Chalk & talk

It was a very informal delivery by John, who guided us through a timeline and the phases of a successful cloud migration. It was a cloud-agnostic session, so the key information could be used with any major cloud provider.

Andrew Pruski – A deep dive into Docker

I wanted to attend this as I haven't used Docker a great deal and wanted to learn more. Andrew provided a great overview and worked through some very well-prepared demos. I am keen to follow these up by visiting his GitHub repo for the content.

Gianluca Sartori – Time Series for relational people

Gianluca gave a great overview of the differences between relational and time series databases and went on to review a few. He settled on InfluxDB and demonstrated in detail how it could be used along with Grafana and the Telegraf agent (the TIG stack) to provide really useful metrics. I use the TIG stack in my work, but I learned about some great settings for dealing with or discarding older data that I plan to take forward.

Monica Rathbun – Performance tuning Azure SQL Database

This session was packed to the rafters and extremely hot, yet Monica's enthusiastic and engaging style made it easy to stay focused on a message that was delivered well and easy to understand. I took away some key information here regarding Query Store and readable secondary databases. Overall, it was my favourite session of DataGrillen 2022.

Argenis Fernandez – Ransomware sucks. Beef up your databases against it.

This had to be a close contender for my top session. Argenis knew what he was talking about, and it showed. He gave very good advice, and I have returned home with a checklist that I will need to turn into an action plan.

André Kamman – Looking under the hood of the parquet format

I'm familiar with Parquet as I use AWS Athena a lot; however, I've never really looked under the hood. I honestly expected this session to be a bit above me, but André's communication style made it very accessible for attendees of all knowledge levels and easy to follow along.

Jess Pomfret and Sander Stad – Deploying Azure Resources with PowerShell Azure Functions

Jess and I were fellow newcomers at DataGrillen in 2019. I always enjoy her sessions, so this was not to be missed. It was my first time seeing Sander, and the two presenters worked well together to deliver this session. I left convinced that I need to change at least one part of my automation routine to include Azure Functions, so I will hopefully be checking them out this week. (If time allows; the trouble with conferences is that you return home with SO many ideas.)

Heini Ilmarinen – Less Clicking, More Coding! Azure Data Platform Development Using Infrastructure as Code

I've dipped my toe into Terraform, but I've used Bicep and CloudFormation a bit more. After Heini's session, I get the benefits of Terraform and have added it to my ever-increasing list of things to review. It was a great session, in which Heini did well to keep us engaged all the way through.

Robert French – Goldilocks and the three business bears; story telling for business

This was the last session of a very busy two days. They broke the mould when they made Robert, and as expected, his session also broke the mould. Robert delivered it with unmatchable energy, at what felt like a blistering pace that was somehow just right at the same time. He got through a huge amount of visual aids and gave great advice on where your eyes are drawn in a report, how to make key data pop out, and how to use colours and fonts carefully.

Filed Under: front-page, SQL Server, Training Tagged With: data grillen, datagrillen, kivelingsfest
