John McCormack DBA

SQL Server Databases and Cloud

Thoughts on sitting DP-300 Azure Database Administrator exam

3rd April 2023 By John McCormack

On Sunday morning, I sat and passed exam DP-300, which gives the certification of Microsoft Certified: Azure Database Administrator Associate. I’ve been sitting Microsoft exams for a number of years now, but this was my first for a while. I wanted to jot down what thoughts still remained in my head following the exam.

Preparation

I prepared for the exam by following the course from ACloudGuru. I’m a big fan of this site as it has helped me learn so much about AWS, so I thought I would trust it for Azure learning as well. I have to admit I had been going through the course at a snail’s pace, and by the time of completion I had forgotten some of the things I had learned initially. I sat their exam simulator and got 77%, which was not a pass.

I realised from doing the exam simulator that there were some areas where my knowledge was weaker, specifically on some newer features I hadn’t worked with day to day and also migration options for the likes of Postgres and MariaDB.

I wrote down each wrong answer and learned why it was wrong and what was correct. I resat the test and got 87%. I repeated the process and got 89%. A lot of the questions were repeats by now, and I felt ready to sit the exam.

Sitting the DP-300 exam

The exam itself was MUCH harder than the exam simulator; in fairness though, the exam simulator was pushing you to achieve 80%, which allowed a bit of a buffer come exam day (Microsoft only looks for 700/1000). There are some areas I think the simulator could have covered more extensively, and a bigger bank of questions would have been helpful.

However, when it came to the exam, I managed to rely on my experience to pull some old knowledge out of the darkest recesses of my brain, and made a few educated guesses. Thankfully I passed with 800+. Not the best score, but the exam itself is pass or fail.

Focus areas for sitting DP-300

I can’t be too specific here due to the NDA, but if you are an experienced SQL DBA with limited or no recent Azure working experience, I would suggest focusing on these areas:

  1. PaaS vs IaaS
    1. Remember PaaS needs less administrative effort than IaaS or on-premises. Look for clues in the question.
    2. Business Critical and Premium vs General Purpose and Standard
    3. Azure SQL Server IaaS Agent Extension
    4. Database mail for managed instance
  2. Migration options (Online and offline) – And not just for SQL Server
  3. Encryption vs Dynamic data masking
  4. Query store
  5. Key Vault
  6. Some Azure networking like Virtual network and VPN
  7. Know your performance DMVs and how to diagnose blocking and slow-running queries (see the sketch after this list)
  8. Backup of system DBs
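
For point 7, here is a minimal sketch of the kind of DMV queries worth being comfortable with. The DMVs themselves are standard, but exactly which columns you filter and sort on is up to you:

-- Sessions that are currently blocked, and who is blocking them
SELECT session_id, blocking_session_id, wait_type, wait_time, last_wait_type
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;

-- Top 10 cached queries by average elapsed time
SELECT TOP 10
    qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time,
    qs.execution_count,
    st.text AS query_text
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY avg_elapsed_time DESC;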

What’s next

I’m still deciding, but sitting this exam has given me the bug again after a couple of years of not doing much active learning. I’m thinking either AZ-104, to help me re-establish some Azure infrastructure knowledge, or DP-203, to introduce me to some parts of data engineering that I’m looking to learn.

Filed Under: Azure SQL DB, DP-300, front-page Tagged With: azure, certifications, dp-300

You don’t need to run CHECKDB on Azure SQL DB

12th February 2021 By John McCormack

Featured image by Tumisu from Pixabay

CHECKDB on Azure SQL DB?

This is a selfish post because I’ll probably forget and go googling for info on this topic, so I thought I would link to an authoritative answer to the question: do you need to run DBCC CHECKDB on Azure SQL DB?

https://azure.microsoft.com/en-gb/blog/data-integrity-in-azure-sql-database/

Microsoft distinguished engineer Peter Carlin says you don’t need to; data integrity is monitored by Azure on your behalf. And if something were to go wrong, they will also work with you to put things right.

Although Peter says you can run them if you want, his post gives me enough peace of mind to stop running these on my Azure SQL DB instances. Let me know if you agree in the comments.
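
If you do still want to run your own checks, a plain DBCC CHECKDB works on Azure SQL DB just as it does on-premises. A minimal sketch (adjust the options to suit):

-- Optional manual integrity check on the current database
-- PHYSICAL_ONLY is lighter weight; remove it for a full logical check
DBCC CHECKDB WITH NO_INFOMSGS, PHYSICAL_ONLY;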

For more Azure content, check out my Azure tag.

Filed Under: Azure SQL DB, front-page Tagged With: azure sql db, DBCC, integrity checks

How to find out Azure SQL DB Startup time

10th February 2021 By John McCormack

Wakey Wakey SQL DB

Azure SQL DB Startup time

The query

SELECT sqlserver_start_time FROM sys.dm_os_sys_info;

Why write a post about this?

The traditional methods used to find the startup time of SQL Server don’t work in Azure SQL DB.

I searched high and low to find this and thought I’d better share it, and hopefully make it search-engine friendly. A traditional Google or Bing search wasn’t bringing up the best way to find this out. I saw a lot of complicated queries to pull data, convert it and estimate the startup time using functions and all kinds of magic. Actually, it turns out that the startup time is provided for you in sys.dm_os_sys_info (which works in Azure SQL DB). It also works in SQL Server, which is a bonus.
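
As a quick usage example (just a sketch building on the same DMV), you can turn that column into an uptime figure in one query:

-- Start time and uptime in minutes for the current instance
SELECT sqlserver_start_time,
       DATEDIFF(MINUTE, sqlserver_start_time, SYSDATETIME()) AS uptime_minutes
FROM sys.dm_os_sys_info;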

Permissions to check Azure SQL DB Startup time

The permissions you need to run this query vary depending on the pricing tier of your Azure SQL DB. I ran this successfully on a P2 and a basic tier database.

On SQL Server, requires VIEW SERVER STATE permission.
On SQL Database Basic, S0, and S1 service objectives, and for databases in elastic pools, the Server admin or an Azure Active Directory admin account is required. On all other SQL Database service objectives, the VIEW DATABASE STATE permission is required in the database.

https://docs.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-os-sys-info-transact-sql?view=sql-server-ver15
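
If you hit a permissions error on one of the higher service objectives, granting the documented permission is a one-liner. A sketch, assuming a database user named [monitoring_user] (a placeholder name):

-- Run in the user database; [monitoring_user] is a hypothetical user
GRANT VIEW DATABASE STATE TO [monitoring_user];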

I hope this one-liner T-SQL script was useful. Please let me know in the comments if you know a better way to get the startup time of Azure SQL DB using T-SQL.

While you’re on my site, why not take a look to see if you can save money on your Azure SQL DBs, and other Azure resources by looking at my guide: 7 ways for data teams to save money in Azure.

Featured image by congerdesign from Pixabay

Filed Under: Azure SQL DB, front-page Tagged With: azure, azure sql db, azure sqldb managed instance

7 ways for data teams to save money in Azure

19th January 2021 By John McCormack

Save money in Azure

In this series of posts, I list 7 ways to quickly save money in Azure by using cost optimisation principles. Get on top of your cloud costs and start saving money by putting these into action. Whilst this series is specific to Azure, most of the principles can be applied to other public cloud providers; the only thing that differs is the product. Think of RDS as the equivalent of SQL DB, or S3 for Storage Accounts.

It’s important to keep on top of your cloud costs. Operating in the cloud means it is easy to just spin up new instances in minutes. Whilst that is great and allows companies to work in an agile manner, the switch to an Operational Expenditure (OPEX) model means that cost increases can be gradual and go unnoticed.

1. Review your backup retention policy

Azure IaaS SQL Backups – Stop burning money

2. Right size your Azure SQL DBs and managed instances

Your Azure SQL Database and Managed Instance is too big

3. Turn non-production instances off out of hours

Turn the cloud off at bedtime to save 70%

4. Right Size your VMs to save money in Azure

Your Azure SQL Virtual Machine might be too big

5. Consider moving to Azure SQL DB serverless

Save money with Azure SQL DB serverless

6. Reserved instances

Save with reserved instances – even up to 73%

7. Delete unused instances

Delete unused instances to save money in Azure

Bonus steps:

This mini series of steps is also important.

  1. Implement Elastic Pools to share resources
  2. Look to see if you are eligible for Azure Hybrid Benefit

Everybody wins

By optimising your costs in Azure and ensuring you are paying the right price for all of your services, everybody wins. Not only are you helping your company save money, it is good for your career and it could even save jobs. Not to mention that turning off what you don’t use is good for the environment too.

Need some help? – Book a call

Please get in touch if you would like to schedule a free introductory 15 minute call for some help in reducing your Azure bill.

Filed Under: Azure, cost-optimization Tagged With: azure, azure billing, azure iaas, azure sql db, cost optimisation, cost optimization

Your Azure SQL Database and Managed Instance is too big

15th January 2021 By John McCormack

Don’t buy for the future

With the cloud, it’s important to understand the concepts of scaling and elasticity. Your cloud services should be scaled at the right size to handle current workloads, with a small buffer. With Azure SQL Database, there is no need to buy that 32 vCore service when all you need right now is 4. The rest will be available when you need them. You should be ready to scale up or out when the time is right, but not before. With the cloud, you should shift your thinking from a CAPEX operating model to an OPEX operating model.

CAPEX vs OPEX

Capital Expenditure (CAPEX) describes where an up-front purchase is made to provide value into the future, e.g. a server. Once it is purchased, you don’t need to buy it again and can use it until it outlives its usefulness. It’s a big upfront expense to provide long-term value. For something like a database server, you might forecast what size you will need in 5 years and buy accordingly.

Operational Expenditure (OPEX) on the other hand describes ongoing costs and operating expenses. Items such as electrical power, rent and staff salaries come under OPEX. Pay As You Go cloud solutions fall under operational expenditure (OPEX) as they are ongoing costs. With this operating model, you should only pay for what you need, and adjust your scale as driven by demand, not future projections.

Resource utilisation and purchase models

Example graph for Azure SQL Database

DTU Model

Small to medium single databases can use the DTU purchase model, which is a blend of CPU, memory and IO. They come with fixed storage per instance type. When monitoring performance, you should be looking mainly at DTU percentage used and storage percentage used.

If you are averaging above 85% and have regular peaks higher than that for DTU usage, you should scale up. You can adjust this figure based on how relaxed or nervous you are about the DTU getting pinned at 100%.

If you are averaging below 50% and have few or no peaks above this, you can scale down (providing the lower tier has enough storage). Again, adjust this to suit your own needs, but there is no need to pay double unless you absolutely need to bake in a lot of spare capacity.
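
A minimal way to check those percentages is sys.dm_db_resource_stats, which holds roughly the last hour of 15-second samples; DTU usage is effectively the highest of the CPU, data IO and log write percentages. A sketch:

-- Average and peak resource usage over the samples held in the DMV (~1 hour)
SELECT AVG(avg_cpu_percent)       AS avg_cpu,
       MAX(avg_cpu_percent)       AS max_cpu,
       AVG(avg_data_io_percent)   AS avg_data_io,
       MAX(avg_data_io_percent)   AS max_data_io,
       AVG(avg_log_write_percent) AS avg_log_write,
       MAX(avg_log_write_percent) AS max_log_write
FROM sys.dm_db_resource_stats;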

vCore

The vCore purchase model provides more flexibility than the DTU model. You can choose your compute and storage resources separately. There is no way to allocate extra memory, however; this is bound to the number of vCores you have. Managed Instances only use the vCore purchase model.

I would apply a similar rule to scaling up or down here based on needs. Any instance using < 50% of CPU is a good candidate for scaling down.
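
Scaling a single database up or down can be done in T-SQL as well as through the portal. A sketch, using a hypothetical database name and target service objective (this applies to single and elastic pool databases; Managed Instances are resized at the instance level through the portal, CLI or PowerShell):

-- Move [MyDatabase] to General Purpose with 4 vCores on Gen5 hardware
ALTER DATABASE [MyDatabase]
MODIFY (EDITION = 'GeneralPurpose', SERVICE_OBJECTIVE = 'GP_Gen5_4');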

Azure SQL Database Service Tiers

Ensure you are on the right service tier. Azure SQL Single DB comes with a choice of Basic, Standard and Premium.

Basic (DTU)

If you are on the basic service tier, then from a cost optimization perspective there is not too much more to say. It has the cheapest instance types and the lowest resources, so if it is doing a job for you, then there is no need to change. Be aware that it only offers 7 days of backup retention.

Standard (DTU)

A large number of workloads will be running on standard. If you are on the standard tier, you can scale down based on utilisation; however, you should bear a couple of things in mind.

  • S2 and below does not allow columnstore indexing
  • S1 instance types and below use HDD storage and this might have an impact on your query durations
    • S2 and above uses SSD

Premium (DTU)

Moving to the premium service tier should be well thought out, as it includes the most expensive instances and may mean that you are over-provisioned. If you are on a premium instance, look closely at your utilisation, particularly DTU percent used. For seemingly over-provisioned instances, you should also take note of a couple of factors before making a decision. Premium is particularly good for IO-intensive queries due to a higher average IOPS and a lower IO latency.

Scaling down from premium service tier means:

  • A decrease in average IOPS from 25 to 1-4 IOPS per DTU
  • An increase in IO latency from 2ms to 5ms
  • A new maximum database size of 1 TB

* Changing instance size but staying with premium service tier affects resources only

If you can handle the drop in instance size and/or reduction in service tier, I would recommend scaling down. Remember to keep a closer eye on newly changed instances, and scale back up if performance is no longer acceptable.

https://docs.microsoft.com/en-us/azure/azure-sql/database/service-tiers-dtu

General Purpose (vCore)

General purpose is the budget option but is suitable for a lot of production workloads. If you are currently on GP and you are happy with your performance, then do not change to a higher service tier.

Business Critical (vCore)

Most production workloads can run in Business Critical, but you might be over-provisioned. One read-only replica is built into the price, so you can offload read-only workloads to it (see the sketch below). It doesn’t come cheap though, and is more than double the price of a General Purpose instance due to increased hardware and licensing. Scaling down to General Purpose will take away this replica, so beware if you use it. A good option may be to scale to a lower Business Critical instance, so that you still have a read replica built into the price.
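
To use the built-in replica, you connect with ApplicationIntent=ReadOnly in your connection string (read scale-out). A quick sketch to confirm which copy of the database a session has landed on:

-- Returns READ_ONLY when the connection has been routed to a read-only replica
SELECT DATABASEPROPERTYEX(DB_NAME(), 'Updateability') AS updateability;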

Hyperscale (vCore)

Hyperscale is not currently available with Managed Instance (January 2021).

Your instances on Hyperscale are roughly double the cost of General Purpose but less than Business Critical. The main considerations are database size and log write throughput. On an exceptionally busy and large database, hyperscale is often the best choice.

Be aware of the reduced performance and features available if you think you can scale down to General Purpose. Changing tiers down from Hyperscale is not a native or easy task. It isn’t supported, so you will need to export and import your data to a new database.

Scaling down to GP means:

  • Your database must be less than 4TB
  • You will move to premium remote storage, rather than local SSD
  • You cannot continue to use In-Memory OLTP
  • It’s a lot of work

https://docs.microsoft.com/en-us/azure/azure-sql/database/service-tiers-general-purpose-business-critical

Summary

The main thing is to regularly review your instances and make sure you are not paying for too much headroom. Make sure that by scaling down, you still have the functionality you need. If you scale down and performance is not what you expected, you should scale back up. Cost optimization shouldn’t come at the expense of performance. It is about ensuring that you are paying for what you need, and nothing more.

If you missed post 1 in this cost optimization series, please have a read about Azure IaaS SQL Backups.


Filed Under: Azure, cost-optimization, front-page Tagged With: azure billing, cost optimisation, cost optimization, managed instance, SQL Database, SQL DB
