John McCormack DBA

SQL Server Databases and Cloud


Thoughts on sitting DP-300 Azure Database Administrator exam

3rd April 2023 By John McCormack Leave a Comment

On Sunday morning, I sat and passed exam DP-300, which gives the certification Microsoft Certified: Azure Database Administrator Associate. I’ve been sitting Microsoft exams for a number of years now, but this was my first for a while. I wanted to jot down what thoughts still remained in my head following the exam.

Preparation

I prepared for the exam by following the course from ACloudGuru. I’m a big fan of this site as it has helped me learn so much about AWS, so I thought I would trust it for Azure learning as well. I have to admit I had been going through the course at a snail’s pace, and by the time I completed it, I had forgotten some of the things I had learned initially. I sat their exam simulator and got 77%, which was not a pass.

I realised from doing the exam simulator that there were some areas where my knowledge was weaker, specifically on some newer features I hadn’t worked with day to day and also migration options for the likes of Postgres and MariaDB.

I wrote down each wrong answer, and learned why it was wrong and what was correct. I resat the test and got 87%. I repeated the process and got 89%. Many of the questions were repeats by now and I felt ready to sit the exam.

Sitting the DP-300 exam

The exam itself was MUCH harder than the exam simulator. In fairness, the simulator pushes you to achieve 80%, which allows a bit of a buffer come exam day (Microsoft only looks for 700/1000). There are some areas I think the simulator could have covered more extensively, and a bigger bank of questions would have been helpful.

However, when it came to the exam, I managed to rely on my experience to pull some old knowledge out of the darkest recesses of my brain, and made a few educated guesses. Thankfully, I passed with 800+. Not the best score, but the exam itself is pass or fail.

Focus areas for sitting DP-300

I can’t be too specific here due to NDA but I would suggest if you are an experienced SQL DBA but have limited or no recent Azure working experience, some areas to focus on are:

  1. PaaS vs IaaS
    1. Remember PaaS needs less administrative effort than IaaS or on-premises. Look for clues in the question.
    2. Business Critical and Premium vs General Purpose and Standard
    3. Azure SQL Server IaaS Agent Extension
    4. Database mail for managed instance
  2. Migration options (Online and offline) – And not just for SQL Server
  3. Encryption vs Dynamic data masking
  4. Query store
  5. Key Vault
  6. Some Azure networking like Virtual network and VPN
  7. Know your performance DMVs and how to diagnose blocking and slow executing queries
  8. Backup of system DBs
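For the DMVs point above, it's worth having a minimal blocking check ready. This is a sketch rather than a full diagnostic query, and the columns you select will vary:

```sql
-- Sessions that are currently blocked, who is blocking them, and what they are running
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text AS current_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;
```

Following blocking_session_id back to a session that is not itself blocked finds the head of the blocking chain.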

What’s next

I’m still deciding, but it’s given me the bug after a couple of years of not doing much active learning. So I’m thinking AZ-104, to help me re-establish some Azure infrastructure knowledge, or DP-203, to introduce me to some parts of data engineering that I’m looking to learn.

Filed Under: Azure SQL DB, DP-300, front-page Tagged With: azure, certifications, dp-300

Why I certified to become more than a SQL DBA

11th May 2021 By John McCormack 2 Comments


For this month’s T-SQL Tuesday, Andy Leonard (b|t) asks How Do You Respond When Technology Changes Under You?

It’s a great topic and inspired me to write a T-SQL Tuesday post this month. I’ve missed the last couple due to a combination of just not feeling like it, and the previous topics weren’t best suited to me. Hopefully this can inspire me to get writing again.

Hey, I thought I was the SQL DBA!

When I got my start with SQL Server, I was just that: a SQL DBA. I worked as a junior member of a big team, we had a large SQL Server estate, and I just wanted to learn and become as experienced as the senior folk. I was able to learn and be mentored and was enjoying making progress. Then I heard we were going to start using “The Cloud”. Not only that, but I was going to be responsible for the operation of our Kafka clusters, EMR and ElasticSearch.

A lot of these technologies were developer led, they knew why they wanted to use them and from an operations perspective, it was about making sure things didn’t fail, and errors were handled. I had no background in them, and didn’t really understand them well. Simply learning to use PuTTY to log into a linux VM was one of my early successes.

AWS Certifications FTW

The move to the cloud, in this case AWS, was going ahead with or without me, so I had to embrace it. Of course, I learned and collaborated with colleagues and got to an acceptable standard, but it was only when I decided to do AWS certifications that things really took off. I found that by studying for the AWS Certified Solutions Architect certification, I got an appreciation for all cloud services, not just the data ones like RDS. Studying for the AWS certifications improved my networking skills; I learned about object vs block storage, and of course I learned all of the different data products (often managed services, but not always).

I approached the certification process from a genuine perspective of learning the technology well, rather than just passing the exam. Simply passing the exam does not do much for you. Even if it gets you a job, you will likely be found out pretty quickly.

A further benefit to learning one cloud technology well is that when we started using Azure, I found the concepts were the same. I understood the networking, the security groups and PaaS vs IaaS. It made the transition much easier and meant that when I moved to a job that was predominantly based in Azure, I didn’t have the huge knowledge gap that I would have had if I had not done my AWS certifications.

So for me, a structured learning approach is best. It’s not for everyone but I am certainly glad that my job is no longer just “SQL DBA”.

Filed Under: Certification, front-page

Free practice questions to help you pass DP-900

26th January 2021 By John McCormack 3 Comments


These are a series of free practice questions and tests to help you pass DP-900 – Microsoft Azure Data Fundamentals. They are written completely by me based on my review of the publicly available course materials. Unlike some sites, these are not exam dumps, so are a perfectly honest and legitimate way to study towards your Azure DP-900 certification.

These practice questions should be seen as a supplement to Microsoft Learn and not a replacement. My number one tip for passing Azure DP-900 is to complete the free learning paths on Microsoft Learn for the DP-900 course.

This series is a work in progress so please keep coming back for more questions and quizzes. I didn’t want to wait until everything was published before sharing this help with you. I hope that you enjoy the practice questions and good luck with your mission to pass DP-900.

I’d also appreciate any feedback if any of the answers appear to be incorrect or if the questions are too easy or too hard.


More questions, quizzes and links are added almost daily. Bookmark this page and come back for new practice questions to help you pass DP-900.

DP-900 practice question links ⬇

Azure Data Fundamentals Learning Paths

  • Explore core data concepts
    • Explore core data concepts
    • Explore roles and responsibilities in the world of data
    • Describe concepts of relational data
    • Explore concepts of non-relational data
    • Explore concepts of data analytics
  • Explore relational data in Azure
  • Explore non-relational data in Azure
  • Explore modern data warehouse analytics in Azure


Filed Under: Certification, DP-900, front-page, Training Tagged With: certifications, dp-900, training

DP-900 – Exam prep reading list

17th August 2020 By John McCormack Leave a Comment

This DP-900 exam prep reading list comes from Microsoft Learn. Links will be added as I go through the course material and suggested reading on the Microsoft Learn site until I have completed the full course. I am currently working my way through the material.

If you are thinking of sitting DP-900 Azure Data Fundamentals, I recommend following the free course on the Microsoft Learn site (scroll down past the exam info) and also reading the recommended links. The links may add a bit of time to your learning, and you can be the judge of how intensively you need to study the extra material; however, the more prepared you are, the greater your chance of success with the certification exam. Links duplicated in Microsoft Learn are only shown once in the list below.

It’s great that Microsoft is making these resources available for free these days. Some video learning would also be nice, but this is a great start.

Azure Data Fundamentals: Explore core data concepts

Explore core data concepts

  • https://docs.microsoft.com/en-us/office/troubleshoot/access/database-normalization-description
  • https://docs.microsoft.com/en-us/azure/cosmos-db/introduction
  • https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction
  • https://docs.microsoft.com/en-us/azure/azure-sql/database/sql-database-paas-overview

Explore roles and responsibilities in the world of data

  • https://docs.microsoft.com/en-us/azure/databricks/scenarios/what-is-azure-databricks
  • https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-overview
  • https://docs.microsoft.com/en-us/power-bi/fundamentals/power-bi-overview
  • https://docs.microsoft.com/en-us/azure/data-factory/introduction

Describe concepts of relational data

  • https://docs.microsoft.com/en-us/office/troubleshoot/access/database-normalization-description
  • https://docs.microsoft.com/en-us/sql/odbc/reference/structured-query-language-sql?view=sql-server-ver15
  • https://docs.microsoft.com/en-us/azure/azure-sql/azure-sql-iaas-vs-paas-what-is-overview

Explore concepts of non-relational data

  • https://docs.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-store-overview
  • https://docs.microsoft.com/en-us/azure/cosmos-db/index-overview
  • https://docs.microsoft.com/en-us/azure/cosmos-db/table-introduction
  • https://docs.microsoft.com/en-us/azure/cosmos-db/graph-introduction

Explore concepts of data analytics

  • https://docs.microsoft.com/en-us/power-bi/create-reports/
  • https://azure.microsoft.com/en-us/services/databricks/
  • https://azure.microsoft.com/en-gb/services/cognitive-services/  (Broken link on site)
  • https://docs.microsoft.com/en-us/azure/architecture/data-guide/relational-data/etl

Azure Data Fundamentals: Explore relational data in Azure

Explore relational data offerings in Azure

Explore provisioning and deploying relational database offerings in Azure

Query relational data in Azure

Azure Data Fundamentals: Explore non-relational data in Azure

Explore non-relational data offerings in Azure

Explore provisioning and deploying non-relational data services in Azure

Manage non-relational data stores in Azure

Azure Data Fundamentals: Explore modern data warehouse analytics in Azure

Examine components of a modern data warehouse

Explore data ingestion in Azure

Explore data storage and processing in Azure

Get started building with Power BI

 


Filed Under: Azure, Certification, DP-900 Tagged With: azure data fundamentals, dp-900

Database recovery models

26th August 2019 By John McCormack 1 Comment

This post is about database recovery models for SQL Server databases. Having the correct recovery model for a database is crucial to your backup and restore strategy. It also determines whether you need to manage the transaction log yourself or can leave that task to SQL Server. Let’s look at the various recovery models and how they work.

The three database recovery models for SQL Server:

  • Full
  • Bulk-Logged
  • Simple

Full Recovery

Every operation is logged and written to the transaction log. This is required when you have very low or no data loss requirements, and it means that regular log backups are required. If they are not taken, the transaction log will grow indefinitely (or until it fills the disk or hits the maximum size specified). If this occurs, a 9002 error is generated and no further data modifications can take place until the log is backed up or truncated. The full recovery model is usually used in production, especially for OLTP workloads.

Until a full backup has been taken, a database in the full recovery model behaves as if it were in the simple recovery model: the log is automatically truncated. So it is imperative to take a full backup of your database when you first use full recovery. (A differential backup would also suffice if a previous full backup exists to form part of your backup chain.)
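As a minimal sketch of establishing the chain (the database name and backup paths are placeholders):

```sql
-- Start the log backup chain with a full backup
BACKUP DATABASE [MyDatabase]
TO DISK = N'D:\Backups\MyDatabase_full.bak';

-- From then on, regular log backups keep the log from growing unchecked
BACKUP LOG [MyDatabase]
TO DISK = N'D:\Backups\MyDatabase_log.trn';
```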

Choose the full recovery model if you want or need to do any of the following things:

  • Have point-in-time recovery
  • Minimise data loss, potentially to no data loss
  • Recover to a marked transaction
  • Restore individual pages
  • Perform piecemeal restores of VLDBs with multiple filegroups
  • Use availability groups, database mirroring or log shipping

Bulk-Logged Recovery

For minimally logged operations, only the modified extents are recorded in the transaction log, which means recovery of those operations relies on the database’s data files. Certain types of transaction are classed as minimally logged, including:

  • BCP
  • BULK INSERT
  • INSERT INTO SELECT
  • SELECT INTO
  • CREATE INDEX
  • ALTER INDEX
  • DROP INDEX

You are also required to take log backups with the bulk-logged recovery model. The log backups refer to the Bulk Changed Map (BCM) to identify the modified extents that need to be backed up. Read up more on the BCM: Further Reading

Choose the bulk-logged recovery model if you want or need to do any of the following things:

  • Improve performance of bulk-logged operations
  • Minimise the size of the transaction log

Simple Recovery

The transaction log is managed automatically by SQL Server and cannot be backed up. SQL Server reclaims space by automatically truncating the log. This recovery model is mostly used for non-production environments. Although SQL Server manages the transaction log in the simple recovery model, there is still a chance your transaction log could fill up. This can happen when transactions are left open (neither committed nor rolled back) or when transactions run for a long time.
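When the log does fill up, the first question is why SQL Server cannot truncate it. A quick check, using a placeholder database name:

```sql
-- log_reuse_wait_desc explains what is preventing log truncation
-- (e.g. LOG_BACKUP, ACTIVE_TRANSACTION)
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'MyDatabase';

-- Show the oldest open transaction in the current database
DBCC OPENTRAN;
```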

Choose the simple recovery model in the following scenarios:

  • You can afford data loss since the latest full backup (or latest differential)
  • It’s a non-prod database and can easily be restored or refreshed
  • Read only databases
  • Rarely updated databases including data warehouses where the data is updated only once per day
  • If you are utilising a 3rd party solution such as SAN snapshots
    • Important Note: If you are a DBA or have responsibility for backups/restores, make sure that you fully understand any backup technology in place and that you know how to recover using it.

Switching recovery model

The default recovery model is determined by the recovery model of the model database but can be changed at any time using an ALTER DATABASE command. If you do change the recovery model, you’ll need to understand the impact on the backup chain.

  • Switching from FULL or BULK_LOGGED to SIMPLE will break the log backup chain. It’s worthwhile taking one last log backup before making this switch; it allows point-in-time recovery up to that point.
  • Switching from SIMPLE to FULL or BULK_LOGGED requires a new full backup (or a differential, if a full already exists) to initialise a new log chain. So, take one of these backups as soon as possible so that you can start backing up your logs.
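The two switches above might look like this in T-SQL (the database name and backup paths are placeholders):

```sql
-- Leaving FULL: take one last log backup first to preserve point-in-time recovery
BACKUP LOG [MyDatabase] TO DISK = N'D:\Backups\MyDatabase_last.trn';
ALTER DATABASE [MyDatabase] SET RECOVERY SIMPLE;

-- Returning to FULL: a new full (or differential) backup restarts the log chain
ALTER DATABASE [MyDatabase] SET RECOVERY FULL;
BACKUP DATABASE [MyDatabase] TO DISK = N'D:\Backups\MyDatabase_full.bak';
```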

Verifying Recovery Model

There’s a handy DBATools PowerShell command, Test-DbaDbRecoveryModel, for checking that the databases you think are in the full recovery model actually are.

Test-DbaDbRecoveryModel -SqlInstance localhost | Select InstanceName, Database, ConfiguredRecoveryModel, ActualRecoveryModel | Out-GridView

Note: In the output below, you can see that the NebulaDataSolutions database is configured for FULL recovery but is actually in SIMPLE. This is because there is no valid full backup for NebulaDataSolutions.

Test-DbaDbRecoveryModel output
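If you prefer plain T-SQL, a rough equivalent is to look for FULL recovery databases with no full backup recorded in msdb (a sketch only; bear in mind that backup history may have been purged):

```sql
-- Databases set to FULL whose log backup chain has never been started
SELECT d.name, d.recovery_model_desc
FROM sys.databases AS d
WHERE d.recovery_model_desc = 'FULL'
  AND NOT EXISTS (SELECT 1
                  FROM msdb.dbo.backupset AS b
                  WHERE b.database_name = d.name
                    AND b.type = 'D'); -- 'D' = full database backup
```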

Filed Under: front-page, SQL Server Recovery Models Tagged With: 70-764, BULK-LOGGED RECOVERY, database recovery models, FULL RECOVERY, SIMPLE RECOVERY, SQL Server recovery models


John McCormack · Copyright © 2025

 
