John McCormack DBA

SQL Server Databases and Cloud


John McCormack's blogs

The curious case of the space in char(1)

10th August 2019 By John McCormack 1 Comment

What happens when you store a space in a char(1) column?

I was asked by a colleague why his WHERE clause wasn't being selective when filtering on a space value. The column was a char(1) data type. To understand the curious case of the space in char(1), we need to understand how the char data type works, and a bit more about why it was used in this scenario.

Scenario

My colleague had built a user table which included a gender column. A char(1) was chosen because one letter could be used to record the gender of the user. It was also small, which helped with performance and the overall size of the table. All non-specified or non-binary genders were assigned a ' ' (a space). The assumption was that the space character would behave the same as an 'M' or 'F' when used in the WHERE clause.

How the char column works

A char is a fixed-length data type and any value that falls short of the fixed length is padded out with spaces. This means that inserting '' into the gender column has the same effect as entering ' '. This can be tested using the LEN function:

[sql]

DECLARE @tbl TABLE (name char(10))

INSERT INTO @tbl VALUES ('') -- empty char(1)
INSERT INTO @tbl VALUES (' ') -- a space in char(1)
INSERT INTO @tbl VALUES ('John')
INSERT INTO @tbl VALUES ('John      ')

SELECT name, LEN(name) AS len_name FROM @tbl

[/sql]

The spaces do not affect the LEN of the value of the char column
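If you want to see the padding itself, DATALENGTH is useful: for a char(10) it always returns the full storage size of 10 bytes, whereas LEN ignores trailing spaces. Run this in the same batch as the declaration above:

[sql]

-- DATALENGTH returns the fixed storage size; LEN strips trailing spaces
SELECT name, LEN(name) AS len_name, DATALENGTH(name) AS datalength_name FROM @tbl

[/sql]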

Build a test table – including putting a space in char(1)

[sql]

CREATE TABLE dbo.People (Description NVARCHAR(20), Gender CHAR(1))

INSERT INTO dbo.People (Description, Gender) VALUES (N'Male', 'M')
INSERT INTO dbo.People (Description, Gender) VALUES (N'Female', 'F')
INSERT INTO dbo.People (Description, Gender) VALUES (N'Other with space', ' ')
INSERT INTO dbo.People (Description, Gender) VALUES (N'Other without space', '')
INSERT INTO dbo.People (Description, Gender) VALUES (N'Other as O', 'O')

[/sql]

Query / Results

[sql]

— Select all rows and show the LEN of the gender column
SELECT
Description,
Gender,
LEN(Gender) AS Len_Gender
FROM dbo.People

[/sql]

Select space in char(1)

[sql]

-- We get the same result with either query so the space is ignored
SELECT * FROM dbo.People WHERE Gender = ' '
SELECT * FROM dbo.People WHERE Gender = ''

[/sql]

Select char where = ' '
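This is expected behaviour rather than a bug. SQL Server follows the ANSI SQL-92 specification for comparing strings: the shorter value is padded with spaces before the comparison, so a space and an empty string always compare as equal:

[sql]

-- Returns 'Equal' because trailing spaces are ignored in string comparisons
SELECT CASE WHEN '' = ' ' THEN 'Equal' ELSE 'Not equal' END AS result

[/sql]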

[sql]

-- We get the row we want when we use a value such as 'O' instead of a space
SELECT * FROM dbo.People WHERE Gender = 'O'

[/sql]

Select char where = 'o'

Workarounds

The best option is to use a different character like 'O' or something else suitable; anything other than a space in char(1) would work. If you weren't too concerned about the space used by this column, a longer varchar data type would allow users to enter anything they wanted. Failing that, a tinyint would allow 256 options and still only uses 1 byte.
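As a rough sketch of the tinyint approach (the table and code values here are just illustrative):

[sql]

-- Hypothetical alternative: tinyint codes are unambiguous and still only 1 byte
CREATE TABLE dbo.People2 (Description NVARCHAR(20), GenderCode TINYINT)

INSERT INTO dbo.People2 (Description, GenderCode) VALUES (N'Male', 1)
INSERT INTO dbo.People2 (Description, GenderCode) VALUES (N'Female', 2)
INSERT INTO dbo.People2 (Description, GenderCode) VALUES (N'Other', 0)

SELECT * FROM dbo.People2 WHERE GenderCode = 0 -- no space ambiguity

[/sql]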

It's 2019

Yes, I know there's a whole discussion about whether we should be storing gender at all and what the possible values should be, but this is simply a post about using CHAR(1).

Further reading:

https://docs.microsoft.com/en-us/sql/t-sql/data-types/char-and-varchar-transact-sql?view=sql-server-2017 

https://stackoverflow.com/questions/4175878/storing-sex-gender-in-database 

Slightly tenuous link: https://johnmccormack.it/2016/02/how-does-the-sql-server-engine-process-my-select-query/

 

Filed Under: front-page, T-SQL

Efficient maintenance of SSISDB

7th August 2019 By John McCormack 3 Comments

Maintenance of SSISDB within SQL Server

The SSIS Server Maintenance Job is used for maintenance of SSISDB. It manages the retention of operations records in SSISDB. I noticed it had been turned off by someone last year and it hadn't run since. As a result, SSISDB had become bloated and there was only 10MB left on the data drive, meaning the database could no longer auto grow.

Nobody was aware; why would they be? After all, nothing was failing. We didn't have disk space monitoring enabled, so the only time we found out there was a problem was when the disk had filled up.

I made 2 unsuccessful attempts at running the SSIS Server Maintenance Job. After several hours of processing and still no available free space in the database, I knew the job wasn't coping with the sheer number of rows it had to delete. The deletes all happen from the parent table (internal.operations) and then flow down to all child tables using cascading deletes. This approach maintains referential integrity but is not great for performance.

Due to this, I needed a new approach to the maintenance of SSISDB. As we hadn't maintained these tables for 13 or 14 months, I was asking too much of SQL Server to let me delete everything at once. (Truncates wouldn't do because I had to keep the last 2 weeks' data.)

A bit of investigation showed me that these were the related tables.

  • internal.event_message_context
  • internal.event_messages
  • internal.executable_statistics
  • internal.execution_data_statistics
  • internal.execution_component_phases
  • internal.execution_data_taps
  • internal.execution_parameter_values
  • internal.execution_property_override_values
  • internal.executions
  • internal.operation_messages
  • internal.extended_operation_info
  • internal.operation_os_sys_info
  • internal.validations
  • internal.operation_permissions
  • internal.operations

My approach

  1. Amend the retention period (days) in catalog.properties to 400 (because 14 was unmanageable with > 400 days of history)
  2. Write a delete script, or find a reliable one that does this work, because SSISDB's native stored procedures couldn't cope
  3. Ensure SSISDB is in the SIMPLE recovery model, because it will reduce t-log growth
  4. Run the delete script and see how it performs and how much space is freed up, so that the number of days deleted each time can be optimised
  5. Repeat steps 1-4 (each time lowering the retention period (days)) until I achieved my target retention period of 14
  6. Ensure this never happens again 😎 (because it's no fun getting 300 failure emails an hour)

1. Reduce retention period (days) in catalog.properties to 400 (this allowed me to delete rows based on only 22,000 IDs)

To do this in T-SQL:

[sql]
EXEC [SSISDB].[catalog].[configure_catalog] @property_name=N'RETENTION_WINDOW', @property_value=400
[/sql]

To do this in SSMS:

Right click SSISDB under Integration Services Catalogs in SQL Server Management Studio, then amend Retention Period (days) to a suitable initial value; in my case, 400 days.

SSISDB Catalog Properties

2. Write your own script, or find a reliable one that does this work manually.

I struck gold with a superb script from Tim Mitchell which honours the Retention Period (days) value. I decided this was better than writing my own. Please follow the link to review the script along with other useful information (or likewise, get the latest version from Tim's GitHub).

3. Ensure SSISDB is in the SIMPLE recovery model (as it reduces transaction log growth)

[sql]
SELECT DB_NAME(database_id) AS DBName, recovery_model_desc
FROM sys.databases
WHERE DB_NAME(database_id) = 'SSISDB'
[/sql]

If it's not, I recommend you ALTER the database to SIMPLE recovery in order to minimise logging. It's not an essential step, but it will save your transaction log from bloating.

[sql]
ALTER DATABASE [SSISDB] SET RECOVERY SIMPLE WITH NO_WAIT
[/sql]
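If your backup strategy requires the FULL recovery model, you can switch back once the cleanup is finished. A sketch (remember that a full backup is needed afterwards to restart the log backup chain):

[sql]
ALTER DATABASE [SSISDB] SET RECOVERY FULL WITH NO_WAIT
[/sql]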

4. Run script (from step 2) and see how it performs and how much space is freed up

[sql]
-- FILEPROPERTY only works for files in the current database, so run this inside SSISDB
USE [SSISDB]
SELECT NAME,
size/128/1024.0 AS FileSizeGB,
size/128/1024.0 - CAST(FILEPROPERTY(NAME, 'SpaceUsed') AS INT)/128/1024.0 AS FreeSpaceGB,
(size/128/1024.0 - CAST(FILEPROPERTY(NAME, 'SpaceUsed') AS INT)/128/1024.0) / (size/128/1024.0) * 100 AS PercentFree
FROM sys.database_files
WHERE NAME = 'SSISdata';
[/sql]

5. Repeat steps 1-4 (each time lowering retention period (days)) until you achieve your target retention period

I started small by reducing the retention period (days) to 390. This allowed me to measure the time and impact of removing 10 days of data. I then went down to a retention period of 360 days. I found 14-day decrements to be the best performing, so I kept doing this until I had only 14 days of data remaining. Following this, I kept the new script in place (scheduled nightly via SQL Agent). There was no need to continue using the SSISDB cleanup stored procedure internal.cleanup_server_retention_window.

SSISDB purge job duration
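If you would rather script the decrements than click through SSMS each time, a rough sketch along these lines could work (the 14-day step size is just what worked for me; run your chosen cleanup script where the comment indicates):

[sql]
-- Sketch: lower the retention window 14 days at a time down to the target of 14
DECLARE @retention INT = 400
WHILE @retention > 14
BEGIN
SET @retention = @retention - 14
IF @retention < 14 SET @retention = 14
EXEC [SSISDB].[catalog].[configure_catalog] @property_name=N'RETENTION_WINDOW', @property_value=@retention
-- run the cleanup script here, then check free space before the next decrement
END
[/sql]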

6. Ensure this never happens again

They say prevention is better than cure. Here are some ideas on how to implement this and ensure that your maintenance of SSISDB is ongoing:

  • SQL Agent job to check whether the date of the oldest ID is greater than expected (and alert if not); see the sketch below
  • Write a check in PowerShell or, even better, consider writing a command for DBATools
  • Professional monitoring tools may alert on this, but I haven't checked
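
A minimal sketch of that first idea, assuming a 14-day target with a couple of days' slack (you would wire this into a SQL Agent job step and alert on failure):

[sql]
-- Raise an error if the oldest operation in SSISDB is older than expected,
-- which suggests the retention cleanup has stalled
IF (SELECT MIN(created_time) FROM SSISDB.catalog.operations) < DATEADD(DAY, -16, SYSDATETIMEOFFSET())
RAISERROR('SSISDB retention cleanup appears to have stalled', 16, 1)
[/sql]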

Further reading

  • My other posts which mention SSIS
  • SSISDB best practices

 

Filed Under: front-page, Guides, SQL Server Tagged With: sql, SQL server, ssis, ssisdb

AZCopy for SQL Backups and other stuff

17th July 2019 By John McCormack 1 Comment

AZCopy is a useful command line utility for automating the copying of files and folders to Azure storage account containers. Specifically, I use AZCopy for SQL backups, but you can use it to copy most types of files to and from Azure.

In this blog post example (which mirrors a real-world requirement I had), I need to write my SQL backups to a network share, but I also want to push them up to Azure Storage (in a different region) to allow developers quicker downloads/restores. This is why I need AZCopy. If I only needed my backups written to Azure, I could have used BACKUP TO URL instead.

Authentication

I use a Shared Access Signature (SAS token) to authorise access to the storage account container. You can create a SAS token for either access key by clicking the Generate SAS and connection string button under Storage accounts > Shared access signature.

For this scenario, my SAS token is limited to write/add/create operations and cannot be used to read or delete data. This simply limits the scope for security reasons, but you can decide whatever meets your needs.

SAS tokens have an expiry date. I've set mine to expire on 16th August 2019, so I will need to create a new token then and update my AZCopy command if I want the automated copying via a scheduled script to continue.

SAS configuration options
Command

Storage account: johnbox
Container: instance1

The basic copy command is simple. An example is:
[shell]azcopy cp "<source location>" "<destination location>"[/shell]

To use a SAS token, we append it to the destination URL:
[shell]azcopy cp "C:\temp\johns_backup.bak" "https://johnbox.blob.core.windows.net/instance1?sas-token"[/shell]

However, the azcopy sync option is more efficient because it won't force a copy over the network if the source and destination already match. The example below also syncs the full folder, not just a specific file.
[shell]azcopy sync "C:\temp" "https://johnbox.blob.core.windows.net/instance1?sas-token"[/shell]

Schedule

I chose to schedule this in a SQL Agent job, with the job step placed directly after the step where I back up to my network share. You could also use Windows Task Scheduler if you prefer.
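If you're scripting the SQL Agent job, a hypothetical step might look like this (the job and step names are illustrative, and sas-token is a placeholder as above):

[sql]
-- Hypothetical job step: sync the backup folder to Azure after the backup step
EXEC msdb.dbo.sp_add_jobstep
@job_name = N'Backup - Instance1',
@step_name = N'AZCopy sync to Azure',
@subsystem = N'CmdExec',
@command = N'azcopy sync "C:\temp" "https://johnbox.blob.core.windows.net/instance1?sas-token"'
[/sql]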

Haven't used AZCopy yet?

Download AZCopy from https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10. The page also gives instructions on how to run and authenticate AZCopy commands.

If you have time, please have a look at my other Azure posts.

Filed Under: Azure, front-page, Guides Tagged With: azcopy, azure

Alert if your availability group fails over

10th July 2019 By John McCormack Leave a Comment

Here is a simple way to send an alert if your Always On Availability Group fails over. You can run the T-SQL below or set it up visually in SQL Agent.

[sql]
USE [msdb]

/*
Sends an alert if the AG group fails over.
I've limited this to one alert per group (by specifying the DB name in @event_description_keyword).
Let me know what you think of this approach; I found it better than multiple alerts for each DB in the Availability Group.
*/

EXEC msdb.dbo.sp_add_alert @name=N'AG role change - AG1',
@message_id=1480,
@severity=0,
@enabled=1,
@delay_between_responses=0,
@include_event_description_in=1,
@database_name=N'',
@event_description_keyword=N'PRD_DB1',
@notification_message=N'There has been a failover of your Always On Availability Group AG1 - Please investigate.',
@job_id=N'00000000-0000-0000-0000-000000000000'

-- Add an operator if it doesn't already exist
EXEC msdb.dbo.sp_add_operator @name=N'John McCormack',
@enabled=1,
@pager_days=0,
@email_address=N'john.mccormack@example.com'

-- Add a notification
EXEC msdb.dbo.sp_add_notification @alert_name=N'AG role change - AG1', @operator_name=N'John McCormack', @notification_method = 1
[/sql]
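Note that @notification_method = 1 means email, so SQL Agent needs Database Mail configured and enabled for the alert to reach you. To confirm the alert and notification were created, a quick check (a sketch):

[sql]
-- Confirm the alert exists and is wired up to a notification
SELECT a.name, a.message_id, n.notification_method
FROM msdb.dbo.sysalerts a
LEFT JOIN msdb.dbo.sysnotifications n ON n.alert_id = a.id
WHERE a.name = N'AG role change - AG1'
[/sql]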

 

Filed Under: Always On Availability Groups, front-page

You can run SQL Server on AWS?

18th June 2019 By John McCormack 1 Comment


My presentation

I presented on running SQL Server on AWS at DataGrillen in Germany. Specifically, I covered the differences between RDS and EC2, migration options and the AWS CLI. Although it was a daunting prospect, I enjoyed speaking at a major conference for the first time.

I received a few questions and managed to answer most of them. For the one I could not remember, I promised to provide an answer.

I was asked who provides the SSL certificates when using SSL connections with your RDS instance. The answer: when you create a SQL Server DB instance, Amazon RDS creates an SSL certificate for it. The SSL certificate includes the DB instance endpoint as the Common Name (CN) to guard against spoofing attacks. Read more.

Here is a link to my slides; please feel free to view or download them.

The event

DataGrillen is a community Data Platform event which is free to attend and draws speakers and attendees from across the world. It is so inclusive and welcoming that it's easy to see why it's a can't-miss event for so many people.

Photos from the event: speakers' dinner; welcoming new speakers; lunch; Chris Taylor talks Azure DevOps; with Chrissy LeMaire; beers and networking; barbecue food; some post-Grillen drinks.

Links to my other AWS content

  • https://johnmccormack.it/2017/12/ec2-sql-server-backups-to-amazon-s3/
  • https://johnmccormack.it/2018/08/how-to-create-a-table-in-aws-athena/

AWS documentation

  • https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/CHAP_SQLServer.html
  • https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ConnectToMicrosoftSQLServerInstance.html
  • https://docs.aws.amazon.com/cli/latest/reference/rds/restore-db-instance-to-point-in-time.html
  • https://docs.aws.amazon.com/cli/latest/userguide/install-windows.html#install-msi-on-windows 
  • https://aws.amazon.com/blogs/database/powering-up-database-mail-on-amazon-rds-for-sql-server-how-under-armour-runs-database-mail-on-amazon-rds-for-sql-server/ 

Thanks

John

Filed Under: AWS RDS, AWS SQL Server Tagged With: aws, DataGrillen2019, ec2, rds, SQL server
