John McCormack DBA

SQL Server Databases and Cloud


Change a user’s resource class within Azure Synapse Analytics using Azure Automation runbooks

17th December 2019 By John McCormack

Change a user’s resource class within Azure Synapse Analytics | A solution

This post shows a method for allowing developers to change a user’s resource class within Azure Synapse Analytics (formerly Azure SQL Data Warehouse) without granting them membership of the db_owner fixed database role.

Scenario:

I was approached by a data warehouse developer who wanted to take advantage of a feature in Azure Synapse Analytics (formerly Azure SQL Data Warehouse) called ‘resource classes’. Resource classes allow users in certain roles to access extra compute resources for their queries, increasing the maximum memory per query (although there is an overall trade-off on concurrency). Unfortunately, because resource classes are managed via database roles, the person changing them needs to be in the db_owner role so they can manage database role membership. As I didn’t want to add the developer to db_owner, a different solution was required.
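For context, assigning or removing a resource class is just a database role membership change, which is what the runbook below automates. A minimal sketch of what a db_owner would normally run (the user name is illustrative; staticrc40 is one of the static resource class roles):

-- Assign a static resource class to a user (names are illustrative)
EXEC sp_addrolemember 'staticrc40', 'dw_developer';

-- Remove it again
EXEC sp_droprolemember 'staticrc40', 'dw_developer';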

Solution:

A separate SQL user can be added to the db_owner role on the SQL Data Warehouse in question. We then create a credential in Azure Automation based upon this user. Finally, we create a runbook which executes sp_addrolemember (but only allows a specific few roles related to resource classes). This means we can give an end user the ability to add or remove people by granting them access to run the runbook.

How to:

Create SQL login/user
-- Create SQL login (run in master)
CREATE LOGIN [dev_dba_automation] WITH PASSWORD = 'S3cur£P@ssword!';

-- Change into the data warehouse context, not master
CREATE USER [dev_dba_automation] WITH DEFAULT_SCHEMA = [dbo];

-- FYI
-- ALTER ROLE X ADD MEMBER Y doesn't currently work with Azure SQL DWH
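The solution relies on this automation user being in db_owner, which the snippet above stops short of. A minimal sketch of that step, using sp_addrolemember since ALTER ROLE isn’t available here:

-- Run in the data warehouse context
EXEC sp_addrolemember 'db_owner', 'dev_dba_automation';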
Create Azure Automation Credential

From your automation account, find the Credentials option (under shared resources) on the left hand menu.

Add Credential To Azure Automation

Then create your new credential, using the newly created SQL login name as the user name, and add the correct password.

Add New Credential To Azure Automation

Please note that if the end user has privileged access to Azure Automation, with a role such as Owner or Contributor, they could use the credential to carry out other tasks in the data warehouse, so please be careful about who has access to create new runbooks.

Create Runbook
  • Log into Azure Automation and create a new runbook.
  • Copy this PowerShell code in.
<#
.DESCRIPTION
This runbook allows the person running it to add a database user to a
database role on dev-datawarehouse

.PARAMETER Database
String name of the SQL Server database to connect to

.PARAMETER databaserole
String name of the database role you wish to add a member to

.PARAMETER rolemember
String name of the user you wish to add to the database role

.PARAMETER add_or_drop
String. Must be 'add' for sp_addrolemember or 'drop' for sp_droprolemember

.NOTES
AUTHOR: John McCormack
LASTEDIT: Dec 16, 2019
#>

param(

    [parameter(Mandatory=$True)]
    [string] $Database,

    [parameter(Mandatory=$True)]
    [string] $databaserole,

    [parameter(Mandatory=$True)]
    [string] $rolemember,

    [parameter(Mandatory=$True)]
    [string] $add_or_drop
)

# Variables
$SqlCredentialAsset = 'dev_az_automation'
$SqlServer = 'dev-datawarehouse.database.windows.net'
$SqlServerPort = 1433
$SqlCredential = Get-AutomationPSCredential -Name $SqlCredentialAsset

if ($SqlCredential -eq $null)
{
throw "Could not retrieve '$SqlCredentialAsset' credential asset. Check that you created this first in the Automation service."
}
# Get the username and password from the SQL Credential
$SqlUsername = $SqlCredential.UserName
$SqlPass = $SqlCredential.GetNetworkCredential().Password

# Define the connection to the SQL Database
$Conn = New-Object System.Data.SqlClient.SqlConnection("Server=tcp:$SqlServer,$SqlServerPort;Database=$Database;User ID=$SqlUsername;Password=$SqlPass;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;")

# Open the SQL connection
$Conn.Open()

# Build the sp_addrolemember / sp_droprolemember command to run as the credential user
if ($add_or_drop -eq "add")
{
    $Cmd = New-Object System.Data.SqlClient.SqlCommand("EXEC sp_addrolemember @rolename = $databaserole, @membername = $rolemember", $Conn)
}
else
{
    $Cmd = New-Object System.Data.SqlClient.SqlCommand("EXEC sp_droprolemember @rolename = $databaserole, @membername = $rolemember", $Conn)
}
$Cmd.CommandTimeout=120

# Only allow roles whose names look like resource classes (e.g. largerc, staticrc10)
if ($databaserole -Like "*rc*")
{
    # Execute the SQL command
    $Ds = New-Object System.Data.DataSet
    $Da = New-Object System.Data.SqlClient.SqlDataAdapter($Cmd)
    [void]$Da.Fill($Ds)

    # We should look at adding some logging for this
    $verb = if ($add_or_drop -eq "add") { "added to" } else { "dropped from" }
    Write-Output "$rolemember $verb database role: $databaserole."
}
else
{
    Write-Output "Invalid role"
}

# Close the SQL connection
$Conn.Close()
  • Run the runbook and observe the output (a sketch of triggering it from PowerShell follows below).
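For completeness, a user who has been granted just enough rights to execute runbooks (for example the built-in Automation Operator role) could start it from PowerShell rather than the portal. A minimal sketch using the Az.Automation module, with illustrative resource group, account and runbook names:

# Start the runbook with its parameters (names below are illustrative)
Start-AzAutomationRunbook -ResourceGroupName "dev-rg" `
    -AutomationAccountName "dev-automation" `
    -Name "Add-DwRoleMember" `
    -Parameters @{ Database = "dev-datawarehouse"; databaserole = "staticrc40"; rolemember = "dw_developer"; add_or_drop = "add" }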
Further reading
  • https://docs.microsoft.com/en-us/azure/sql-data-warehouse/resource-classes-for-workload-management#example-code-for-finding-the-best-resource-class
  • https://docs.microsoft.com/en-us/sql/t-sql/statements/alter-role-transact-sql?view=sql-server-ver15#examples
  • https://gallery.technet.microsoft.com/scriptcenter/How-to-use-a-SQL-Command-be77f9d2
    • (Base script I adapted)
Other Azure Content
  • https://johnmccormack.it/2019/03/put-tempdb-files-on-d-drive-in-azure-iaas/

Filed Under: Azure Automation, Azure SQL Data Warehouse, Azure Synapse Analytics Tagged With: azure, azure automation, azure sql data warehouse, azure synapse analytics

AZCopy for SQL Backups and other stuff

17th July 2019 By John McCormack

AZCopy for SQL Backups and other stuff

AZCopy is a useful command line utility for automating the copying of files and folders to Azure Storage Account containers. Specifically, I use AZCopy for SQL Backups but you can use AZCopy for copying most types of files to and from Azure.

In this blog post’s example (which mirrors a real-world requirement I had), I need to write my SQL backups to a network share, but I also want to push them up to Azure Storage (in a different region) to allow developers quicker downloads/restores. This is where AZCopy comes in. If I only needed my backups written to Azure, I could have used BACKUP TO URL instead.
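As an aside, a minimal sketch of the BACKUP TO URL route, assuming a SAS-based credential named after the container URL and illustrative account, container and database names:

-- The credential name must match the container URL when authenticating with a SAS token
CREATE CREDENTIAL [https://johnbox.blob.core.windows.net/instance1]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<sas-token>';

BACKUP DATABASE [johns_db]
TO URL = 'https://johnbox.blob.core.windows.net/instance1/johns_db.bak'
WITH COMPRESSION, CHECKSUM;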

Authentication

I use Shared Access Signatures (SAS tokens) to authorise access to the storage account container. You can create a SAS token for either access key by clicking the Generate SAS and connection string button under Storage account > Shared access signature.

For this scenario, my SAS token is limited to write / add / create operations and cannot be used to read or delete data. This is just to limit the scope for security reasons, but you can decide whatever meets your needs.

SAS tokens have an expiry date. I’ve set mine to expire on 16th August 2019 so I will need to create a new token at this time and update my AZCopy command if I want to continue with the automated copying via a scheduled script.
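If you would rather script the token than click through the portal, a minimal sketch using the Az.Storage module (the account name and container come from the example below; the key, expiry and permissions are illustrative and mirror the write/add/create scope described above):

# Generate a write/add/create SAS token for the container
$ctx = New-AzStorageContext -StorageAccountName "johnbox" -StorageAccountKey "<access-key>"
New-AzStorageContainerSASToken -Context $ctx -Name "instance1" `
    -Permission "acw" -ExpiryTime (Get-Date "2019-08-16")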

SAS configuration options
Command
Storage Account: johnbox
Container: instance1

The basic copy command is simple. An example is:
azcopy cp "<source location>" "<destination location>"

To use a SAS token, we append it to the destination URL:
azcopy cp "C:\temp\johns_backup.bak" "https://johnbox.blob.core.windows.net/instance1?sas-token"

However, the azcopy sync option is more efficient because it won’t force a copy over the network if the source and destination already match. The example below also syncs the full folder, not just a specific file.
azcopy sync "C:\temp" "https://johnbox.blob.core.windows.net/instance1?sas-token"

Schedule

I chose to schedule this in a SQL Agent job. The job step runs directly after the step where I back up to my network share. You could also use Windows Task Scheduler if you prefer.
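A minimal sketch of that job step, assuming an existing backup job (the job and step names are illustrative) and a CmdExec step running the sync command shown above:

-- Add a CmdExec step to the existing backup job (names are illustrative)
EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'Backup - Instance1',
    @step_name = N'AZCopy sync to Azure Storage',
    @subsystem = N'CmdExec',
    @command = N'azcopy sync "C:\temp" "https://johnbox.blob.core.windows.net/instance1?sas-token"';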

If you haven't used AZCopy yet

Download AZCopy from https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10. The page also gives instructions on how to run and authenticate AZCopy commands.

If you have time, please have a look at my other Azure posts.

Filed Under: Azure, front-page, Guides Tagged With: azcopy, azure

Put tempdb files on D drive in Azure IAAS

21st March 2019 By John McCormack

Tempdb files on D drive you say?

Azure warns you not to store data on the D drive in Azure VMs, but following this advice could mean you are missing out on some very fast local storage. It’s good general advice because this local storage is not permanently attached to your instance, meaning you could lose data or log files if your VM is stopped and restarted. But what if you could afford to lose certain files? Say, files that are recreated during startup anyway.

TempDB is the ideal candidate for this. No other database is suitable! Putting the tempdb data and log files onto D drive can be achieved quite easily with a little bit of effort. And you will most likely see a big improvement in tempdb read/write latency.

Just look at these results! Even for someone like me who sometimes looks at charts and often can’t see the obvious, I don’t think there is any denying when the change took place here.

Chart: tempdb latency in Azure IaaS - blazing fast now

Overview of how to configure SQL Server to use D drive for tempdb on Azure VM

  1. Run the ALTER DATABASE command for each of the files you need to change (a fuller script follows after this list) e.g.
    • ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev', FILENAME = 'D:\TempDB\tempdev.mdf')
  2. Create Folder D:\TempDB or something similar
  3. Restart SQL Server and make sure your files are created in D: as expected
  4. Manually remove any old tempdb files still lying around
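For reference, a fuller version of step 1. This sketch assumes the default logical file names (tempdev, temp2 and templog); check sys.master_files for yours and repeat for every data file:

-- Move each tempdb file to the local D drive (logical names assumed; verify with sys.master_files)
ALTER DATABASE [tempdb] MODIFY FILE (NAME = N'tempdev', FILENAME = 'D:\TempDB\tempdev.mdf');
ALTER DATABASE [tempdb] MODIFY FILE (NAME = N'temp2', FILENAME = 'D:\TempDB\temp2.ndf');
ALTER DATABASE [tempdb] MODIFY FILE (NAME = N'templog', FILENAME = 'D:\TempDB\templog.ldf');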

Now to make sure tempdb is created when the VM restarts

  1. Create a PowerShell script which will create the D:\TempDB folder on startup and start the SQL services (a minimal sketch follows after this list)
    • You could use the script at https://community.idera.com/database-tools/blog/b/community_blog/posts/configuring-tempdb-on-azure-iaas-for-sql-server
    • Schedule this script with a Task Scheduler startup job
    • Change your execution policy to RemoteSigned if it isn’t already: Set-ExecutionPolicy RemoteSigned
  2. Change the startup type of your SQL Services to Automatic (Delayed Start)
  3. Test this thoroughly in dev before you rely on it in production. If it doesn’t work, you’ll need to remember how you configured tempdb and manually create the folders, then start SQL Server. Not a fun task under pressure.
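A minimal sketch of such a startup script, assuming a default instance (service names MSSQLSERVER and SQLSERVERAGENT); adjust the folder path and service names to match your build:

# Recreate the tempdb folder on the ephemeral D drive, then start the SQL services
$tempDbPath = 'D:\TempDB'
if (-not (Test-Path $tempDbPath)) {
    New-Item -ItemType Directory -Path $tempDbPath | Out-Null
}
Start-Service -Name 'MSSQLSERVER'
Start-Service -Name 'SQLSERVERAGENT'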
More tempdb goodness

Configuring TempDB for SQL Server 2016 instances


Filed Under: Azure, Azure VM, front-page Tagged With: azure, azure iaas, sql server on Azure, tempdb

A month in the life of a DBA contractor

8th February 2019 By John McCormack

I left what I considered to be a great permanent job to become a DBA contractor in Glasgow. It was one where I could pick some of my own projects, work with new and interesting technology, and work in a team which collaborated well, liked each other and was good at sharing knowledge.

My reasons for moving on were two-fold. First of all, I wanted some career progression and it unfortunately just wasn’t available in my permanent job. Secondly, I wanted to work more with SQL Server and Azure. My last job started out as a SQL DBA but as a lot of people in this line of work will know, the duties have evolved considerably over time. This is mostly due to large scale cloud adoption. In my case, my workplace started heavily using AWS along with other RDBMSs and data platforms. (RDS MySQL & Aurora, ElasticSearch, Elastic Map Reduce (EMR), Glue and Athena). This gives you a lot to learn. Don’t get me wrong, I threw myself into learning about cloud solutions and I loved working with AWS (hopefully I will again) but as time went on, I was worried I would start to lose some SQL Server knowledge.

So what has the first month been like?

I have been working exclusively on the Microsoft Data Platform. At my first client, I’ve worked with SQL Server on-premises, SQL Server running on Azure VMs, Azure SQL DB, Azure Analysis Services and Azure SQL Data Warehouse.

On-Premises

The on-prem SQL Servers are mostly for hosting the databases of legacy applications and 3rd party vendor products. Some of the versions are ‘rather old‘ although these instances actually give the business the fewest headaches. I’ve worked on decommissioning unused databases that were still online, server side traces and native backups.

Azure VMs running SQL Server (IaaS)

These host the databases that have the most active development work. I’ve looked at performance issues and SQL Server configurations to help improve performance. I’ve shared best practice with the team in terms of tempdb and helped to reduce the volume of unnecessary emails from servers (you know, the ones which someone set up years ago but which never prompt any action from the DBAs).

Azure Analysis Services (PaaS)

This was fun. I only did some basic configuration and permissions but as I’ve never used Analysis Services much over the years, it was cool to see what was involved.

Azure SQL Data Warehouse (PaaS)

It was great to get my hands on tech like this. I spent time modernising their manual point and click refresh process with PowerShell. This was a bit of a learning curve but extremely rewarding. This process will make future refreshes much easier. (I’ll share the code once I’m confident it’s good, I’d still like a few more run-throughs before doing this). Next month, I’d like to try to implement a self service process for the developers that allows them to do their own refreshes, freeing up DBA time to work on more critical items.

Azure SQL DB (PaaS)

So far, I haven’t had to do too much with Azure SQL DB on this project. I’ve listed out the instances and instance types using PowerShell. My plan is to review if these are right sized or if efficiencies can be made and I’ll also make sure the alerting is set up correctly.
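For what it’s worth, a minimal sketch of that kind of inventory with the Az PowerShell module (the property selection is illustrative):

# List every Azure SQL DB in the current subscription with its service objective
Get-AzSqlServer |
    Get-AzSqlDatabase |
    Where-Object { $_.DatabaseName -ne 'master' } |
    Select-Object ServerName, DatabaseName, Edition, CurrentServiceObjectiveName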

Plans for next month

I’d like to offer some in depth server health reviews and work with the developers to help them make their code run faster. There is a lot of blocking due to long running stored procedures and sub optimal code. Improving a few of the big hitters will make them much happier and will ease the strain on some of our servers – at least that’s the idea.

Further Reading

  • Azure SQL Data Warehouse
  • PowerShell cmdlets and REST APIs for SQL Data Warehouse

Filed Under: Azure, Azure SQL Analysis Services, Azure SQL Data Warehouse, Azure SQL DB, front-page, Personal
