John McCormack DBA

SQL Server Databases and Cloud


John McCormack's blogs

Data Glasgow User Group

13th February 2020 By John McCormack

John McCormack Presenting SQL Server Tools

I spoke at the relaunch of the Glasgow Data User Group (formerly Glasgow SQL Server User Group).

It was a great event, hosted by Craig, Louise and Robert. My presentation was on some useful free community tools which can make your life as a DBA much easier. It showed that DBAs with zero budget to spend on expensive software still have a wide range of free software to make their lives easier. Moreover, even with a large budget, some of these free tools are still the best in their class, so they are worth reviewing.

Slides for tonight’s talk on SQL Server Community Tools are available on GitHub.

 

Filed Under: front-page

Change a user’s resource class within Azure Synapse Analytics using Azure Automation runbooks

17th December 2019 By John McCormack

Change a user’s resource class within Azure Synapse Analytics | A solution

This post shows a method for allowing developers to change a user’s resource class within Azure Synapse Analytics (formerly Azure SQL Data Warehouse) without granting them membership of the db_owner fixed database role.

Scenario:

I was approached by a data warehouse developer who wanted to take advantage of a feature in Azure Synapse Analytics (formerly Azure SQL Data Warehouse) called resource classes. Resource classes allow users in certain roles to have access to extra compute resources for their queries. This can increase their max memory per query (although there is an overall trade-off on concurrency). Unfortunately, because this is managed via database roles, the person looking to change resource classes needs to be in the db_owner role so they can manage database role membership. As I didn’t want to add the developer to the db_owner role, a different solution was required.
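
For context, resource classes are implemented as ordinary database roles whose names contain rc (smallrc, mediumrc, largerc and so on), so you can see who currently has which resource class with a role membership query. A quick sketch:

-- List members of the resource class roles (dynamic and static names both contain 'rc')
SELECT r.name AS role_name, m.name AS member_name
FROM sys.database_role_members rm
JOIN sys.database_principals r ON rm.role_principal_id = r.principal_id
JOIN sys.database_principals m ON rm.member_principal_id = m.principal_id
WHERE r.name LIKE '%rc%'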

Solution:

A separate SQL user can be added to the db_owner role on the SQL Data Warehouse in question. We can then create a credential in Azure Automation based upon this user. Finally, we create a runbook which executes sp_addrolemember (but only allows a specific few roles related to resource classes). This means we can give an end user the ability to add/remove people by granting them access to run the runbook.

How to:

Create SQL login/user
-- Create sql login
CREATE LOGIN [dev_dba_automation] WITH PASSWORD = 'S3cur£P@ssword!'

-- Change into DWH context, not master
CREATE USER [dev_dba_automation] WITH DEFAULT_SCHEMA=[dbo]

-- FYI
-- ALTER ROLE X ADD MEMBER Y doesn't currently work with Azure SQL DWH
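
The new user also needs to be a member of db_owner in the data warehouse so it can manage role membership. As ALTER ROLE isn’t an option here (see the note above), a minimal sketch using sp_addrolemember:

-- Run in the data warehouse database, not master
EXEC sp_addrolemember @rolename = 'db_owner', @membername = 'dev_dba_automation'
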
Create Azure Automation Credential

From your Automation account, find the Credentials option (under Shared Resources) on the left-hand menu.

Add Credential To Azure Automation

Then create your new credential, using the newly created SQL login name as the user name, and add the correct password.

Add New Credential To Azure Automation

Please note that if the end user has privileged access to Azure Automation with a role such as Owner or Contributor, they could use the credential to carry out other tasks in the data warehouse, so be careful about who has access to create new runbooks.

Create Runbook
  • Log into Azure Automation and create a new runbook.
  • Copy this PowerShell code in.
<#
.DESCRIPTION
This runbook allows the person running it to add a database user to a
database role on dev-datawarehouse

.PARAMETER Database
String name of the SQL Server database to connect to

.PARAMETER databaserole
String name of the database role you wish to add a member to

.PARAMETER rolemember
String name of the user you wish to add to the database role

.PARAMETER add_or_drop
String Must = add for sp_addrolemember or drop for sp_droprolemember

.NOTES
AUTHOR: John McCormack
LASTEDIT: Dec 16, 2019
#>

param(
    [parameter(Mandatory=$True)]
    [string] $Database,

    [parameter(Mandatory=$True)]
    [string] $databaserole,

    [parameter(Mandatory=$True)]
    [string] $rolemember,

    [parameter(Mandatory=$True)]
    [string] $add_or_drop
)

# Variables
$SqlCredentialAsset = 'dev_az_automation'
$SqlServer = 'dev-datawarehouse.database.windows.net'
$SqlServerPort = 1433
$SqlCredential = Get-AutomationPSCredential -Name $SqlCredentialAsset

if ($null -eq $SqlCredential)
{
    throw "Could not retrieve '$SqlCredentialAsset' credential asset. Check that you created this first in the Automation service."
}
# Get the username and password from the SQL Credential
$SqlUsername = $SqlCredential.UserName
$SqlPass = $SqlCredential.GetNetworkCredential().Password

# Define the connection to the SQL Database
$Conn = New-Object System.Data.SqlClient.SqlConnection("Server=tcp:$SqlServer,$SqlServerPort;Database=$Database;User ID=$SqlUsername;Password=$SqlPass;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;")

# Open the SQL connection
$Conn.Open()

# Run sp_addrolemember or sp_droprolemember as the credential user
if ($add_or_drop -eq "add")
{
    $Cmd = New-Object System.Data.SqlClient.SqlCommand("EXEC sp_addrolemember @rolename = $databaserole, @membername = $rolemember", $Conn)
}
else
{
    $Cmd = New-Object System.Data.SqlClient.SqlCommand("EXEC sp_droprolemember @rolename = $databaserole, @membername = $rolemember", $Conn)
}
$Cmd.CommandTimeout=120

# Only proceed for roles that look like resource classes (e.g. smallrc, largerc, staticrc10)
if ($databaserole -Like "*rc*")
{
    # Execute the SQL command
    $Ds = New-Object System.Data.DataSet
    $Da = New-Object System.Data.SqlClient.SqlDataAdapter($Cmd)
    [void]$Da.Fill($Ds)

    # We should look at adding some logging for this
    Write-Output "$rolemember ${add_or_drop}ed to database role: $databaserole."
}
else
{
    Write-Output "Invalid role"
}

# Close the SQL connection
$Conn.Close()
  • Run the runbook and observe the output (or start it from PowerShell, as shown below).
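
The runbook can also be started from PowerShell rather than the portal. A sketch, assuming the Az.Automation module; the resource group, account and runbook names here are examples:

# Start the runbook with its parameters (names are illustrative)
Start-AzAutomationRunbook -ResourceGroupName "dev-rg" -AutomationAccountName "dev-automation" `
    -Name "Change-ResourceClass" -Parameters @{
        Database     = "dev-datawarehouse"
        databaserole = "largerc"
        rolemember   = "dev_user"
        add_or_drop  = "add"
    }
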
Further reading
  • https://docs.microsoft.com/en-us/azure/sql-data-warehouse/resource-classes-for-workload-management#example-code-for-finding-the-best-resource-class
  • https://docs.microsoft.com/en-us/sql/t-sql/statements/alter-role-transact-sql?view=sql-server-ver15#examples
  • https://gallery.technet.microsoft.com/scriptcenter/How-to-use-a-SQL-Command-be77f9d2
    • (Base script I adapted)
Other Azure Content
  • https://johnmccormack.it/2019/03/put-tempdb-files-on-d-drive-in-azure-iaas/

Filed Under: Azure Automation, Azure SQL Data Warehouse, Azure Synapse Analytics Tagged With: azure, azure automation, azure sql data warehouse, azure synapse analytics

3 things a new DBA should learn first

6th November 2019 By John McCormack

Why would I put a post together about the 3 things a new DBA should learn first? Well, I was asked by a colleague what I thought she should be focusing on learning and mastering as a newish SQL DBA. She was looking to become more experienced and confident in her role. Obviously this is an opinion piece, but I felt it was worth sharing. The 3 suggestions are not in any particular order, and of course I’m sure not everyone will agree, but it’s an opinion based on my own experience as a SQL DBA. (Feel free to disagree in the comments; I’m sure people will feel strongly about some items I’ve omitted.)

  • What is happening on the server right now
  • Backup and restore of SQL Databases
  • Scripting, i.e. T-SQL and PowerShell

What is happening on the SQL Server right now

Help, the SQL Server is on fire and my really important process is not working. Why is the server so slow?

This scenario will happen to you. Invariably, the person (often a manager) will stand over you and expect you to knock out one or two lines of T-SQL wizardry to get things running smoothly again. First of all, I should say that in a perfect world, you would ask them to raise a ticket and work on it according to its priority against your other tasks, then let them know what the issue was and what can be done to prevent it happening again. But we rarely work in a perfect world. In this scenario, you need one primary line of T-SQL to get started.

EXEC sp_whoisactive

The procedure sp_whoisactive doesn’t come with SQL Server. It is a community script created by Adam Machanic, and you can download it from GitHub. If you don’t have it installed on your SQL Servers, it is something you should really consider, as it gives much more useful and readable information than sp_who2 and it’s a lot easier than pulling together your own code. It’s mature, very safe and has been installed on SQL Servers worldwide since 2007.

sp_whoisactive offers loads of optional parameters that allow you to customise and sort the output according to your own preference, but just running the proc on its own without parameters will give you a list of everything that is executing at that point in time, ordered by duration descending. If you see things running for minutes that usually take seconds, maybe you need to see if they are blocking other transactions.

example output from sp_whoisactive
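
A couple of example calls to get started (the parameter names below are from the sp_whoisactive documentation; treat this as a sketch and check the docs for the full list):

-- Everything executing right now, longest running first
EXEC sp_whoisactive

-- Include the query plans in the output, at a small extra cost
EXEC sp_whoisactive @get_plans = 1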

One parameter I find really useful during an incident is @find_block_leaders. By running EXEC sp_whoisactive @find_block_leaders = 1, you can see exactly how many sessions are being blocked by each blocking session. In the example below, you can see that an INSERT transaction in session_id 52 is blocking 5 other sessions. Each of these is trying to read from a table with an open INSERT transaction, so they are blocked. You either need to wait for 52 to finish, or kill it in order for the other transactions to move on.

example output from sp_whoisactive @find_block_leaders

A quick note on killing spids: I really only recommend this if you know what the process is and you have an idea of how long it will take to roll back. (Remember, those other 5 spids are still blocked until the rollback completes, and rollback is a single-threaded process.)

Of course, it might not be blocking, and in that case you will need more scripts to analyse what is running regularly, how queries are performing and whether any optimisations are needed. That will need to be another blog post, as I want to keep this one fairly succinct.

Backup and restore of SQL Databases

Knowing how to back up a database is an essential skill of a DBA, certainly one of the top required skills. Equally important is knowing how to restore your database backups. This is something you should regularly practice to ensure that when/if the time comes that you need to act quickly, you are well rehearsed. Trying to restore a database to a point in time in the middle of a P1 emergency is the stuff of nightmares if you haven’t done it before.
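
If you want something concrete to rehearse, here is a minimal point-in-time restore sketch (the database name, file paths and STOPAT time are examples, not from a real system):

-- Restore the most recent full backup without recovering, so log backups can be applied
RESTORE DATABASE [Orders] FROM DISK = N'D:\Backups\Orders_Full.bak' WITH NORECOVERY, REPLACE

-- Roll forward to just before the incident, then bring the database online
RESTORE LOG [Orders] FROM DISK = N'D:\Backups\Orders_Log.trn' WITH STOPAT = '2019-11-06 14:35:00', RECOVERY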

Learn the different types of backups available and how often you should be doing each of them. This will vary depending on your business needs. Even on the same instance, some databases may need point-in-time recovery and others won’t. For example, it might be acceptable to back up your small master database once per day, but you cannot afford to lose more than 5 minutes of orders in the event of your instance going down. In the case of your orders database, you will need a transaction log backup every 5 minutes. Depending on the size of your database, you will be looking at a combination of regular full and differential backups, or just regular full backups (as well as your transaction log backups, of course).
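
As a sketch of that orders example (names and paths are illustrative; you would schedule these, e.g. as SQL Server Agent jobs):

-- Nightly full backup of the orders database
BACKUP DATABASE [Orders] TO DISK = N'D:\Backups\Orders_Full.bak' WITH COMPRESSION, CHECKSUM

-- Transaction log backup, scheduled every 5 minutes
BACKUP LOG [Orders] TO DISK = N'D:\Backups\Orders_Log.trn' WITH COMPRESSION, CHECKSUM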

If you are on a PaaS database such as Azure SQL Database or AWS RDS, the backups can be done automatically without any administration effort, but you will still want to practice restoring to a new instance, possibly into a new region or availability zone, in the event of a disaster.

Other backup related topics to look into are compression, encryption, striping and retention management.

Scripting, i.e. T-SQL and PowerShell

This may be a strong opinion but I believe point-and-click DBAs are going the way of the dinosaurs. SQL Server Management Studio (SSMS) still makes it possible to do lots of work without knowing much T-SQL or PowerShell, but this does not scale. If you end up managing tens or even hundreds of SQL Servers, scripting will be your friend. The more tedious work you can automate, the more time you have to work on more interesting tasks.

Learn T-SQL

Despite me saying that point-and-click is not the way to go, SSMS is very useful for providing the T-SQL code for how to do something. Say I wanted to create a new SQL login that doesn’t expire, and give it sysadmin permissions. The first time I do this, I can step through the GUI, but instead of clicking OK, I can click Script and the code is opened in SSMS for me to review and run. I can also save this as a .sql script and either use it as a template for creating future SQL logins, or refer to it often enough that I learn the syntax.

script out create login from ssms

USE [master]
GO
CREATE LOGIN [john] WITH PASSWORD=N'ae34bhijkfgcd5', DEFAULT_DATABASE=[master], CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
GO
ALTER SERVER ROLE [sysadmin] ADD MEMBER [john]
GO

Learn PowerShell

If this seems daunting, please don’t worry. There is a huge community project called dbatools that has more functionality for managing SQL Server than the official Microsoft SQL Server module. It’s a great way to start running PowerShell commands and building your knowledge as you go.

You can start with commands as and when you need them and build up your knowledge from there. The website is extremely useful, and by tagging @psdbatools on Twitter, you will usually get help pretty quickly from someone involved. As your skills and confidence increase, you may even choose to add code to the project by making a pull request. As I write this post, dbatools has over 15,000 commits from 175 contributors. (I’m one of them, in a VERY small way.)
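
If you want a first command to try, here is a minimal sketch (the instance name is an example; dbatools installs from the PowerShell Gallery):

# Install the community module for the current user
Install-Module dbatools -Scope CurrentUser

# List databases on an instance with their recovery model and last full backup
Get-DbaDatabase -SqlInstance localhost | Select-Object Name, RecoveryModel, LastFullBackup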

Summary

There are more than 3 things a new DBA should learn, a lot more, but these 3 items will help you remain calm in a crisis and let you automate work, meaning you can be more efficient and have time to spend on interesting work.

Further reading

  • A brief history of activity monitoring
  • DBATools in a month of lunches
  • Backup and restore of SQL Server Databases

Filed Under: front-page, Guides, SQL Server

Data Scotland 2019 presentation

13th September 2019 By John McCormack

My Data Scotland 2019 presentation: You can run SQL Server on AWS

Slide deck: You can run SQL Server on AWS (Data Scotland 2019)

Useful links:

https://aws.amazon.com/rds/sqlserver/

https://docs.aws.amazon.com/en_pv/AmazonRDS/latest/UserGuide/USER_ConnectToMicrosoftSQLServerInstance.html

https://aws.amazon.com/blogs/developer/handling-credentials-with-aws-tools-for-windows-powershell/

https://johnmccormack.it/?s=aws

 

Filed Under: AWS RDS, AWS SQL Server, front-page Tagged With: #datascotland, datascotland2019, ec2, rds, SQL server

Database recovery models

26th August 2019 By John McCormack

This post is about database recovery models for SQL Server databases. Having the correct recovery model for a database is crucial to your backup and restore strategy. The recovery model also determines whether you need to maintain the transaction log yourself or can leave that task to SQL Server. Let’s look at the various recovery models and how they work.

The three database recovery models for SQL Server:

  • Full
  • Bulk-Logged
  • Simple

Full Recovery

Every operation is logged and written to the transaction log. This is required for very low or no data loss requirements, and means that regular log backups are required. If they are not taken, the transaction log will grow indefinitely (or until it fills up the disk or hits the maximum size specified). If this occurs, a 9002 error is generated and no further data modifications can take place until the log is backed up or truncated. The full recovery model is usually used in production, especially for OLTP workloads.

Until a full backup has been taken, databases in the full recovery model are treated as if they are in the simple recovery model and the log is automatically truncated. So it is imperative to take a full backup of your database when you first use full recovery. (A differential backup would also suffice if there has previously been a full backup that would form part of your backup chain.)

Choose the full recovery model if you want or need to do any of the following things:

  • Have point-in-time recovery
  • Minimise data loss, including potentially no data loss
  • Recover to a marked transaction
  • Restore individual pages
  • Perform piecemeal restores of VLDBs with multiple filegroups
  • Use availability groups, database mirroring or log shipping

Bulk-Logged Recovery

Minimally logged operations record only the modified extents in the transaction log, which means recovery also relies on the database’s data files. Certain types of transaction are classed as minimally logged, including:

  • BCP
  • BULK INSERT
  • INSERT INTO … SELECT
  • SELECT INTO
  • CREATE INDEX
  • ALTER INDEX
  • DROP INDEX

You are also required to take log backups with the bulk-logged recovery model. The log backups refer to the Bulk Changed Map (BCM) to identify modified extents in need of backup. Read up more on BCM: Further Reading

Choose the bulk-logged recovery model if you want or need to do any of the following things:

  • Improve performance of bulk-logged operations
  • Minimise the size of the transaction log

Simple Recovery

The transaction log is automatically managed by SQL Server and cannot be backed up. SQL Server reclaims the space by automatically truncating the log. This recovery model is mostly used for non-prod environments. Although SQL Server manages the transaction log in the simple recovery model, there is still a chance your transaction log could fill up. This can be caused by open transactions that are left in place (not committed or rolled back) or by long-running transactions.
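
If the log is growing when you don’t expect it to, a quick sketch for checking what is preventing truncation:

-- Why can't the transaction log be truncated?
SELECT name, log_reuse_wait_desc
FROM sys.databases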

Choose the simple recovery model in the following scenarios:

  • You can afford data loss since the latest full backup (or latest differential)
  • It’s a non-prod database and can easily be restored or refreshed
  • Read only databases
  • Rarely updated databases including data warehouses where the data is updated only once per day
  • If you are utilising a 3rd party solution such as SAN snapshots
    • Important Note: If you are a DBA or have responsibility for backups/restores, make sure that you fully understand any backup technology in place and that you know how to recover using it.

Switching recovery model

The default recovery model is determined by the recovery model of the model database but can be changed at any time using an ALTER DATABASE command. If you do change the recovery model, you’ll need to understand the impact on the backup chain.

  • Switching from FULL or BULK_LOGGED to SIMPLE will break the log backup chain. It’s worthwhile taking one last log backup before making this switch, as this will allow you point-in-time recovery up to that point.
  • Switching from SIMPLE to FULL or BULK_LOGGED requires a new full backup, or a differential backup if a full already exists, in order to initialise a new log chain. So, take one of these backups as soon as possible (see the sketch below) so that you can start backing up your logs.
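
A minimal sketch of the switch plus the backup that starts the new log chain (the database name and path are examples):

-- Switch to the full recovery model
ALTER DATABASE [Orders] SET RECOVERY FULL

-- Take a full backup to initialise the new log chain
BACKUP DATABASE [Orders] TO DISK = N'D:\Backups\Orders_Full.bak' WITH CHECKSUM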

Verifying Recovery Model

There’s a handy dbatools PowerShell command, Test-DbaDbRecoveryModel, for checking that the databases you think are in the full recovery model actually are.

Test-DbaDbRecoveryModel -SqlInstance localhost | Select InstanceName, Database, ConfiguredRecoveryModel, ActualRecoveryModel | Out-GridView

Note: In the output below, you can see that the NebulaDataSolutions database is configured for FULL recovery but is actually in SIMPLE. This is because there is no valid full backup for NebulaDataSolutions.

Test-DbaDbRecoveryModel output

Filed Under: front-page, SQL Server Recovery Models Tagged With: 70-764, BULK-LOGGED RECOVERY, database recovery models, FULL RECOVERY, SIMPLE RECOVERY, SQL Server recovery models



 
