John McCormack DBA

SQL Server Databases and Cloud


Search Results for: iaas

A month in the life of a DBA contractor

8th February 2019 By John McCormack

I left what I considered to be a great permanent job to become a DBA contractor in Glasgow. It was one where I could pick some of my own projects, work with new and interesting technology, and work in a team which collaborated well, liked each other and were good at sharing knowledge.

My reasons for moving on were two-fold. First of all, I wanted some career progression and it unfortunately just wasn’t available in my permanent job. Secondly, I wanted to work more with SQL Server and Azure. My last job started out as a SQL DBA role but, as a lot of people in this line of work will know, the duties have evolved considerably over time, mostly due to large scale cloud adoption. In my case, my workplace started heavily using AWS along with other RDBMSs and data platforms (RDS MySQL & Aurora, Elasticsearch, Elastic MapReduce (EMR), Glue and Athena). That gives you a lot to learn. Don’t get me wrong, I threw myself into learning about cloud solutions and I loved working with AWS (hopefully I will again), but as time went on, I was worried I would start to lose some SQL Server knowledge.

So what has the first month been like?

I have been working exclusively on the Microsoft Data Platform. At my first client, I’ve worked with SQL Server on-premises, SQL Server running on Azure VMs, Azure SQL DB, Azure Analysis Services and Azure SQL Data Warehouse.

On-Premises

The on-prem SQL Servers mostly host the databases of legacy applications and 3rd party vendor products. Some of the versions are ‘rather old’, although these instances actually give the business the fewest headaches. I’ve worked on decommissioning unused databases that were still online, server-side traces and native backups.
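
For the decommissioning work, the pattern I like is to take a database offline for a quarantine period before dropping it. A rough sketch of the idea, where the database name is just a placeholder:

-- Hypothetical database name, used for illustration only
ALTER DATABASE [LegacyAppDB] SET OFFLINE WITH ROLLBACK IMMEDIATE;

-- Weeks later, if nobody has complained, bring it back just long enough
-- to take a final copy-only backup, then drop it
-- ALTER DATABASE [LegacyAppDB] SET ONLINE;
-- BACKUP DATABASE [LegacyAppDB] TO DISK = N'X:\SQLBackups\LegacyAppDB_final.bak' WITH COPY_ONLY, COMPRESSION;
-- DROP DATABASE [LegacyAppDB];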

Azure VMs running SQL Server (IaaS)

These host the databases with the most active development work. I’ve looked at performance issues and SQL Server configurations to help improve performance. I’ve shared tempdb best practice with the team and helped to reduce the volume of unnecessary emails from servers (you know the ones: set up years ago, but never prompting any action from the DBAs).
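
To give a flavour of the tempdb checks involved (a quick sketch rather than the exact scripts I used on site), you can see at a glance how many data files there are and whether they are equally sized:

-- Check tempdb file count, size and growth settings
SELECT name, physical_name, size * 8 / 1024 AS size_mb, growth
FROM tempdb.sys.database_files;

-- Illustrative only: add an equally sized data file (path and size are placeholders)
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2, FILENAME = N'D:\tempdb\tempdev2.ndf', SIZE = 8192MB, FILEGROWTH = 0);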

Azure Analysis Services (PaaS)

This was fun. I only did some basic configuration and permissions but as I’ve never used Analysis Services much over the years, it was cool to see what was involved.

Azure SQL Data Warehouse (PaaS)

It was great to get my hands on tech like this. I spent time modernising their manual point-and-click refresh process with PowerShell. This was a bit of a learning curve but extremely rewarding, and it will make future refreshes much easier. (I’ll share the code once I’m confident it’s good; I’d still like a few more run-throughs first.) Next month, I’d like to implement a self-service process for the developers that allows them to do their own refreshes, freeing up DBA time to work on more critical items.

Azure SQL DB (PaaS)

So far, I haven’t had to do too much with Azure SQL DB on this project. I’ve listed out the instances and instance types using PowerShell. My plan is to review whether these are right-sized or if efficiencies can be made, and I’ll also make sure the alerting is set up correctly.

Plans for next month

I’d like to offer some in-depth server health reviews and work with the developers to help them make their code run faster. There is a lot of blocking due to long-running stored procedures and suboptimal code. Improving a few of the big hitters will make them much happier and will ease the strain on some of our servers – at least that’s the idea.
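
For finding the blocking as it happens, a simple DMV query is usually my starting point, something along these lines:

-- Who is blocked, who is blocking them, and what are they running?
SELECT r.session_id,
r.blocking_session_id,
r.wait_type,
r.wait_time,
r.total_elapsed_time,
t.text AS running_sql
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
WHERE r.blocking_session_id <> 0;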

Further Reading

  • Azure SQL Data Warehouse
  • PowerShell cmdlets and REST APIs for SQL Data Warehouse

Filed Under: Azure, Azure SQL Analysis Services, Azure SQL Data Warehouse, Azure SQL DB, front-page, Personal

Test read intent connections to an AG Listener

19th April 2019 By John McCormack

To test read intent connections to an AG Listener, I prefer to use SQLCMD but you can also test easily using SSMS.

SQLCMD

The -K ReadOnly switch is your key to success here, but remember to also specify the database using -d. When the database isn’t specified (and the initial catalog for my login was master), I always got the primary instance back during my check. This simple omission cost me hours of troubleshooting, because I was convinced my listener wasn’t working correctly. In fact, I just wasn’t testing it correctly.

Read/Write Connections
sqlcmd -S "SQL01-AG1-list" -d WideWorldImporters -E -q "SELECT @@SERVERNAME;"

Read-only Connections
sqlcmd -S "SQL01-AG1-list" -d WideWorldImporters -E -q "SELECT @@SERVERNAME;" -K ReadOnly

The instance that you are connected to will show in the command prompt. Type exit to leave sqlcmd.

SSMS

In object explorer, click Connect and choose Database Engine.

Then, in the bottom right hand side of the dialog box, click on Options >>

In Connection Properties, click Connect to database and then <Browse server ..>. Choose a database that is in your availability group.

Connect To Server SSMS

Then click on Additional Connection Parameters and type in ApplicationIntent=ReadOnly

ApplicationIntent=ReadOnly SSMS

Click Connect and run SELECT @@SERVERNAME and you should see the instance name of your secondary replica, provided you have set up read-only routing correctly.

If you change connection and remove ApplicationIntent=ReadOnly from the Additional Connection Parameters, you should see the result as the name of your primary instance.
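
If the read-only connection stubbornly returns the primary even with the right database context, it’s worth sanity checking the routing configuration itself. A quick sketch, querying the availability group DMVs, to list each replica’s read-only routing list:

SELECT ag.name AS ag_name,
ar_primary.replica_server_name AS when_primary_is,
ar_target.replica_server_name AS routed_to,
rl.routing_priority,
ar_target.read_only_routing_url
FROM sys.availability_read_only_routing_lists rl
JOIN sys.availability_replicas ar_primary ON rl.replica_id = ar_primary.replica_id
JOIN sys.availability_replicas ar_target ON rl.read_only_replica_id = ar_target.replica_id
JOIN sys.availability_groups ag ON ar_primary.group_id = ag.group_id
ORDER BY ag_name, when_primary_is, rl.routing_priority;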

Summary

Hopefully these two simple techniques to test read intent connections to an AG Listener will be useful and save you time. It’s a simple blog post, but I wanted to write it because I was looking at the problem in too much depth and missing the obvious mistake with my database context.

Popular posts

  • https://johnmccormack.it/2019/03/put-tempdb-files-on-d-drive-in-azure-iaas/
  • https://johnmccormack.it/2017/12/ec2-sql-server-backups-to-amazon-s3/
  • https://johnmccormack.it/2018/10/glasgow-super-meetup-aws-athena-presentation/

Filed Under: Always On Availability Groups, front-page Tagged With: AG Listener, Always On, Availability Groups, SQL server

Change a user’s resource class within Azure Synapse Analytics using Azure Automation runbooks

17th December 2019 By John McCormack

Change a user’s resource class within Azure Synapse Analytics | A solution

This post shows a method for allowing developers to change a user’s resource class within Azure Synapse Analytics (formerly Azure SQL Data Warehouse) without granting them membership of the db_owner fixed database role.

Scenario:

I was approached by a data warehouse developer who wanted to be able to take advantage of the feature in Azure Synapse Analytics (formerly Azure SQL Data Warehouse) called ‘resource classes’. Resource classes allow users in certain roles to have access to extra compute resources for their queries. This can increase their max memory per query (although there is an overall trade-off on concurrency). Unfortunately, because this is managed via roles, the person looking to change resource classes needs to be in the db_owner role so they can manage database role membership. As I didn’t want to add the developer to the db_owner role, a different solution was required.
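
To see who currently sits in which resource class, a membership query like this works (the LIKE filter mirrors the role name check used in the runbook below):

-- Dynamic resource classes: smallrc, mediumrc, largerc, xlargerc
-- Static resource classes: staticrc10 through staticrc80
SELECT r.name AS resource_class, m.name AS member_name
FROM sys.database_role_members rm
JOIN sys.database_principals r ON rm.role_principal_id = r.principal_id
JOIN sys.database_principals m ON rm.member_principal_id = m.principal_id
WHERE r.name LIKE '%rc%'
ORDER BY r.name, m.name;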

Solution:

A separate SQL user can be added to the db_owner role on the SQL Data Warehouse in question. We can then create a credential in Azure Automation based upon this user. Finally, we create a runbook which executes sp_addrolemember (but only allows a specific few roles related to resource classes). This means we can give an end user the ability to add/remove people by granting them access to run the runbook.

How to:

Create SQL login/user
-- Create sql login
CREATE LOGIN [dev_dba_automation] WITH PASSWORD = 'S3cur£P@ssword!'

-- Change into DWH context, not master
CREATE USER [dev_dba_automation] WITH DEFAULT_SCHEMA=[dbo]

-- FYI
-- ALTER ROLE X ADD MEMBER Y doesn't currently work with Azure SQL DWH
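
The new user then needs to be added to the db_owner role in the data warehouse. Since ALTER ROLE isn’t an option there (see the FYI above), sp_addrolemember does the job:

-- Still in the data warehouse context
EXEC sp_addrolemember 'db_owner', 'dev_dba_automation';
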
Create Azure Automation Credential

From your automation account, find the Credentials option (under shared resources) on the left hand menu.

Add Credential To Azure Automation

Then create your new credential, using the newly created SQL login name as the user name, and add the correct password.

Add New Credential To Azure Automation

Please note: if the end user has privileged access to Azure Automation with a role such as Owner or Contributor, they could use the credential to carry out other tasks in the data warehouse, so please be careful about who can create new runbooks.

Create Runbook
  • Log into Azure Automation and create a new runbook.
  • Copy this PowerShell code in.
<#
.DESCRIPTION
This runbook allows the person running it to add a database user to a
database role on dev-datawarehouse

.PARAMETER Database
String name of the SQL Server database to connect to

.PARAMETER databaserole
String name of the database role you wish to add a member to

.PARAMETER rolemember
String name of the user you wish to add to the database role

.PARAMETER add_or_drop
String. Must be 'add' for sp_addrolemember or 'drop' for sp_droprolemember

.NOTES
AUTHOR: John McCormack
LASTEDIT: Dec 16, 2019
#>

param(

[Parameter(Mandatory=$True)]
[string] $Database,

[Parameter(Mandatory=$True)]
[string] $databaserole,

[Parameter(Mandatory=$True)]
[string] $rolemember,

[Parameter(Mandatory=$True)]
[string] $add_or_drop
)

# Variables
$SqlCredentialAsset = 'dev_az_automation'
$SqlServer = 'dev-datawarehouse.database.windows.net'
$SqlServerPort = 1433
$SqlCredential = Get-AutomationPSCredential -Name $SqlCredentialAsset

if ($null -eq $SqlCredential)
{
throw "Could not retrieve '$SqlCredentialAsset' credential asset. Check that you created this first in the Automation service."
}
# Get the username and password from the SQL Credential
$SqlUsername = $SqlCredential.UserName
$SqlPass = $SqlCredential.GetNetworkCredential().Password

# Define the connection to the SQL Database
$Conn = New-Object System.Data.SqlClient.SqlConnection("Server=tcp:$SqlServer,$SqlServerPort;Database=$Database;User ID=$SqlUsername;Password=$SqlPass;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;")

# Open the SQL connection
$Conn.Open()

# Build the sp_addrolemember / sp_droprolemember command to run as the credential user
if($add_or_drop -eq "add")
{
$Cmd=new-object system.Data.SqlClient.SqlCommand("EXEC sp_addrolemember @rolename = $databaserole, @membername = $rolemember", $Conn)
}
else
{
$Cmd=new-object system.Data.SqlClient.SqlCommand("EXEC sp_droprolemember @rolename = $databaserole, @membername = $rolemember", $Conn)
}
$Cmd.CommandTimeout=120

if ($databaserole -Like "*rc*")
{
# Execute the SQL command
$Ds=New-Object system.Data.DataSet
$Da=New-Object system.Data.SqlClient.SqlDataAdapter($Cmd)
[void]$Da.fill($Ds)

# We should look at adding some logging for this
Write-Output "Role change '$add_or_drop' applied for $rolemember on database role $databaserole."
}
else
{
WRITE-OUTPUT "Invalid role"
}

# Close the SQL connection
$Conn.Close()
  • Run the runbook and observe the output.
Further reading
  • https://docs.microsoft.com/en-us/azure/sql-data-warehouse/resource-classes-for-workload-management#example-code-for-finding-the-best-resource-class
  • https://docs.microsoft.com/en-us/sql/t-sql/statements/alter-role-transact-sql?view=sql-server-ver15#examples
  • https://gallery.technet.microsoft.com/scriptcenter/How-to-use-a-SQL-Command-be77f9d2
    • (Base script I adapted)
Other Azure Content
  • https://johnmccormack.it/2019/03/put-tempdb-files-on-d-drive-in-azure-iaas/

Filed Under: Azure Automation, Azure SQL Data Warehouse, Azure Synapse Analytics Tagged With: azure, azure automation, azure sql data warehouse, azure synapse analytics

A successful performance tuning project

5th June 2020 By John McCormack

Performance tuning project

I’m coming to the end of what has been a successful performance tuning project for SQL Server. I wanted to share some of the things that made it a success.

Corporate buy in

The company had a goal to improve the page load times of a number of key pages within our application, to improve the user experience for our customers. They acknowledged that the database code, both indexes and stored procedures, needed optimisation, but so too did aspects of the application code. It was good to see this acknowledged, as I’ve been in many situations where the database takes all the blame.

The company committed a considerable amount of personnel to work solely on this stability and optimisation project, including senior developers, testers and a project manager. I was brought in to look at the database performance. Whilst some business as usual (BAU) priorities did come in from time to time, a large core of the team was protected from them and allowed to get on with making the system go faster, thus improving the customer experience.

Daily standups

We held daily standups where we covered what we had done since the last meeting, what we were working on, and anything blocking our progress. These were kept short so as not to get in the way of the development work, but gave everyone an overview of what the other team members were working on. Often, as a result of these, side conversations spun up and team members helped out others who were looking for a bit of assistance (or simply wanted to bounce ideas around).

Collaboration

The team were willing to help each other. When Pull Requests (PRs) were submitted, these were swiftly approved where there were no objections, or challenged in a positive way which helped get the best overall result. When the API calls were showing as slow, but nothing was obvious on the SQL server, heads were put together to use the tools at our disposal to get to the root cause. This often included Azure App Insights which I had not previously used, and this helped us get the end to end transaction details. We could pull out the SQL for any areas which were slow and work on making it perform better.

Measuring improvements

The Azure VM size for the SQL Server had previously been doubled, so there was no appetite to scale it up again. The hope was that we might eventually be able to scale back down after a period of stability.

The system previously had issues with blocking, high CPU and slow durations so I wanted to reduce page reads, CPU and duration for all of the SQL calls I was working on. I wouldn’t consider a PR if at least 2 of these metrics were not improved. My main focus was on reducing duration of calls but I didn’t want to improve one thing, and make others worse as a consequence. In my own tickets, I always made sure to document the before and after metrics from my standalone testing so as to give confidence that they would be included in upcoming releases.
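
For anyone wondering what the standalone testing looked like, it was essentially along these lines (the procedure name here is a made-up example):

-- Capture the 'before' numbers, apply the change, then repeat and compare
SET STATISTICS IO ON;
SET STATISTICS TIME ON;
EXEC dbo.usp_GetCustomerOrders @CustomerId = 123; -- hypothetical procedure
SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
-- Note the logical reads, CPU time and elapsed time from the Messages tab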

CPU graph showing performance over time.

We also used Apdex which is a standardised way of measuring application performance. It ranks page views on whether the user is satisfied, tolerating or frustrated. The more we move users out of the frustrated and tolerating groups, and in to satisfied, the higher the Apdex score will be. As our project moved through release cycles, we were able to see steady increases in our Apdex scores. Apdex also allowed us to identify what was hurting us most and create tickets based on this information.
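
For reference, the standard Apdex calculation is:

Apdex = (Satisfied + (Tolerating / 2)) / Total samples

so a score of 1.0 means every sampled request was satisfactory, and every user moved from frustrated or tolerating into satisfied pushes the score up.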

Top Quality Load Test Environment

We had a top quality load test environment which used production-masked backups for the databases. I set up the availability groups to match production, and the servers were all sized the same as production with the same internal settings, such as tempdb size, sp_configure settings and trace flags. We were able to replay the same tests over and over again using Gatling, and our testers made really useful reports available to help us analyse the performance of each hotfix. If it was a proven fix, it was promoted to a release branch; if it wasn’t, it was binned.
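
One quick way to confirm the load test instances matched production on sp_configure settings was simply to run the same query on both environments and diff the output, along the lines of:

-- Run on both environments and compare the results
SELECT name, value_in_use
FROM sys.configurations
ORDER BY name;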

End Game

This intensity was kept up for almost 2 months and it was ultimately transformative for the business. Whilst there are still many further improvements that can be made, the specialised squad is being disbanded and team members are being reallocated to other squads. Performance should be a way of life now, rather than an afterthought or another performance tuning project.

We can be happy that we improved the Apdex scores, sped up a huge number of regularly used SQL transactions, and removed the large CPU peaks that dominated our core business hours.

If you enjoyed this, you may also enjoy some of these other posts.

  • https://johnmccormack.it/2020/05/how-dbatools-can-help-with-performance-tuning/
  • https://johnmccormack.it/2019/03/put-tempdb-files-on-d-drive-in-azure-iaas/

Filed Under: front-page, Guides, SQL Server Tagged With: Performance tuning, project, Scrum, sql, SQL Performance, SQL server

How do I set up database mail for Azure SQL DB Managed Instance

5th October 2020 By John McCormack

It’s not too difficult to set up database mail for Azure SQL DB Managed Instance in comparison to SQL Server (on-prem or IaaS); however, there are a few extra things to consider. This post will describe how to set up database mail for Azure SQL DB Managed Instance. I will use SendGrid as the mail provider, but you can follow the same steps for any other mail provider or your company’s SMTP server.

Before I go on, my personal opinion is that including database mail is a massive feature for Managed Instances. The lack of DB Mail on Azure SQL DB Single Database or Amazon RDS is a major blocker to PaaS adoption. Now with Managed Instance, we can have PaaS and database mail.

Create an Azure SQL DB Managed Instance

Whilst I’m planning a separate post on how to provision an Azure SQL DB Managed Instance, I’m going to assume for the purposes of this post that you already have an instance up and running. If you don’t, have a look at this how-to article before going any further.

Change outbound settings on your Network Security Group (NSG)

All managed instances are created with an NSG. The NSG allows you to create inbound and outbound rules for traffic coming in and out of your database. You can configure ports, protocols and IP ranges etc.

To allow secure SMTP for SendGrid, you must open port 587 for outbound traffic; otherwise, any attempt to send mail will result in an error. You can do this in the Azure Portal by navigating to the NSG, or you can do it in PowerShell by modifying and running the code below.

# This code was originally taken from https://docs.microsoft.com/en-us/azure/service-fabric/scripts/service-fabric-powershell-add-nsg-rule
# and amended to be used to set up an outbound rule for Database Mail on Port 587

Login-AzAccount
Get-AzSubscription
Set-AzContext -SubscriptionId "yourSubscriptionID"

$RGname="RG-JOHN-DEMO"
$port=587
$rulename="DBMailOutbound"
$nsgname="nsg-johnmccormack-demo"

# Get the NSG resource
$nsg = Get-AzNetworkSecurityGroup -Name $nsgname -ResourceGroupName $RGname

# Add the outbound security rule
$nsg | Add-AzNetworkSecurityRuleConfig -Name $rulename -Description "Allow DBMail" -Access Allow `
-Protocol * -Direction Outbound -Priority 500 -SourceAddressPrefix "*" -SourcePortRange * `
-DestinationAddressPrefix * -DestinationPortRange $port

# Update the NSG
$nsg | Set-AzNetworkSecurityGroup

Set up a SendGrid account

The simplest way if you haven’t done it before is just to use the Azure Portal.

  1. Start typing sendgrid in the search bar
  2. Click on SendGrid accounts from services
  3. Click + Add
  4. Complete the form to create your new account. (See image at bottom of post)
  5. Click Manage SendGrid to get the credentials you will need for Database Mail (see images at bottom of post). Once you are logged in:
    1. Click Settings
    2. Click API Keys
    3. Click Create API Key
    4. Create and verify a sender identity. (If you don’t do this, no SendGrid emails will work)

Enable database mail for your Managed Instance

You must run this t-sql code to enable database mail on your Managed Instance:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE

EXEC sp_configure 'Database Mail XPs', 1
RECONFIGURE

Create a suitable database mail account and profile

Your database mail profile must be called AzureManagedInstance_dbmail_profile. If you try to call it anything else, your database mail won’t work.

-- Create a database mail account
IF NOT EXISTS (SELECT 1 FROM msdb.dbo.sysmail_account WHERE [name] = 'DBA Sendgrid Account')
BEGIN
DECLARE @servername NVARCHAR(255) = @@servername
SET @servername = substring(@servername, 1, charindex('.', @servername)-1)
EXECUTE msdb.dbo.sysmail_add_account_sp
@account_name = 'DBA Sendgrid Account',
@description = 'DB account for DBAs and SQL Agent',
@email_address = 'john.mccormack@*****.***',
@display_name = @servername,
@mailserver_name = 'smtp.sendgrid.net',
@mailserver_type = 'SMTP',
@username = 'apikey',
@password = '******************',
@port = 587,
@enable_ssl = 1;
END
ELSE
PRINT 'sysmail_account already configured'

-- Create a database mail profile (Profile must be called AzureManagedInstance_dbmail_profile)
IF NOT EXISTS (SELECT 1 FROM msdb.dbo.sysmail_profile WHERE [name] = 'AzureManagedInstance_dbmail_profile')
BEGIN
EXECUTE msdb.dbo.sysmail_add_profile_sp
@profile_name = 'AzureManagedInstance_dbmail_profile',
@description = 'Main profile for sending database mail';

-- Associate account with profile
EXECUTE msdb.dbo.sysmail_add_profileaccount_sp
@profile_name = 'AzureManagedInstance_dbmail_profile',
@account_name = 'DBA Sendgrid Account',
@sequence_number = 1 ;
END
ELSE
PRINT 'DBMail profile already configured'

Send a test email

-- Test Email
EXEC msdb.dbo.sp_send_dbmail
@profile_name = 'AzureManagedInstance_dbmail_profile',
@recipients = 'john.mccormack@*****.***',
@body = 'Email successfully sent from managed instance.',
@subject = 'OK, this works now. That''s great!';
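
If the test email doesn’t arrive, the sent items and event log in msdb are the first places to look:

-- Check the status of recent mail items
SELECT TOP 10 mailitem_id, recipients, subject, sent_status, sent_date
FROM msdb.dbo.sysmail_allitems
ORDER BY mailitem_id DESC;

-- Any errors will show in the event log
SELECT TOP 10 log_id, event_type, log_date, description
FROM msdb.dbo.sysmail_event_log
ORDER BY log_date DESC;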

So that’s all there is to it. Now you know how to set up database mail for Azure SQL DB Managed Instance.

If you have any comments or suggestions, please tweet me or leave them in the comments section.

John

More handy Azure links

https://johnmccormack.it/2020/09/how-do-i-find-the-agent-service-account-for-azure-sql-database-managed-instance/

https://johnmccormack.it/2019/12/change-a-users-resource-class-with-azure-automation-runbooks/

Images to help you with the SendGrid configuration for database mail

Create sendgrid account in Azure Portal
Create Sendgrid
Manage Sendgrid button in Azure Portal
Manage Sendgrid

Filed Under: Azure SQL DB Managed Instance, front-page Tagged With: azure, azure sql db, azure sqldb managed instance, database mail, managed instance, sendgrid
