As a SQL Server DBA migrating to Azure SQL Database, you will need an initial estimate of DTUs so that you can assign an appropriate Service tier to the database. An appropriate Service tier ensures that you meet most of your application performance goals. Estimating too low a Service tier results in decreased performance, while estimating too high a tier results in increased cost.
This chapter teaches you how to use the DTU Calculator to make an appropriate initial estimate of the Service tier. You can change your Service tier at any time based on monitoring the Azure SQL Database's performance once it is up and running.
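A later scale-up needs no migration; the service objective can be changed in place with T-SQL. The database name below is hypothetical, and the target tier is only an example; this is a sketch of the documented ALTER DATABASE ... MODIFY (SERVICE_OBJECTIVE = ...) syntax:

```sql
-- Run against the master database of the Azure SQL logical server.
-- Scales the (hypothetical) database PacktDb to the Standard S2 tier.
ALTER DATABASE [PacktDb] MODIFY (SERVICE_OBJECTIVE = 'S2');

-- Check the current service objective once the change completes.
SELECT DATABASEPROPERTYEX('PacktDb', 'ServiceObjective');
```

The change happens online; existing connections may be dropped when the scale operation completes, so applications should include retry logic.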
Developed by Justin Henriksen, an Azure Solution Architect at Microsoft, the DTU Calculator can be used to find out the initial Service tier for an Azure SQL Database. The calculator is available at https://dtucalculator.azurewebsites.net.
DTU Calculator Workflow
The DTU Calculator works as shown in the following diagram:
First, you have to set up a trace to record the following counters for at least an hour:
Processor – % Processor Time
Logical Disk – Disk Reads/sec
Logical Disk – Disk Writes/sec
Database – Log Bytes Flushed/sec
You can run a trace by using either the command-line utility or the PowerShell script provided on the DTU Calculator website.
Capture the counters on a workload similar to that of the production environment. The trace generates a CSV report.
The DTU Calculator uses this report to analyze and suggest an initial Service Tier.
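The shape of that analysis can be sketched offline. A minimal sketch in Python, assuming a hypothetical abbreviated capture with illustrative column names (the real CSV produced by the utility uses Perfmon's own headers):

```python
import csv
import io

# Hypothetical capture of the four counters the calculator expects
# (column names are illustrative, not the utility's real headers).
SAMPLE = """cpu_percent,reads_per_sec,writes_per_sec,log_bytes_per_sec
12.5,30,10,2048
80.0,220,95,65536
45.0,120,40,16384
"""

def summarize(csv_text):
    """Return per-counter averages from a counter capture."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return {
        col: sum(float(r[col]) for r in rows) / len(rows)
        for col in rows[0]
    }

averages = summarize(SAMPLE)
print(averages["cpu_percent"])  # average CPU utilization over the trace
```

The real calculator maps such averages (together with the number of cores you supply) onto DTU levels; this sketch only shows the kind of aggregation the CSV report feeds into.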
Let's get back to Mike. Mike is unsure which service tier to select while migrating to Azure SQL Database, so he wants to use the DTU Calculator to choose the service tier to migrate to. The following steps describe how he can use the DTU Calculator to determine the initial service tier for his database:
The sql-perfmon-cl folder has two files, SqlDtuPerfmon.exe and SqlDtuPerfmon.exe.config. The first is the executable which, when run, captures the counters in a CSV file; the second is a configuration file that specifies the counters to be captured.
Open SqlDtuPerfmon.exe.config in Notepad and make changes, as suggested in the following points:
Under the SQL COUNTER comment, modify the value of SqlCategory as per the instance name of your SQL Server. For a default instance, replace MSSQL$SQL2016:Databases with SQLServer:Databases. For a named instance, say Packtpub, replace MSSQL$SQL2016:Databases with MSSQL$Packtpub:Databases.
Specify the path of the output CSV file in the CsvPath key shown in the preceding screenshot.
Double-click the C:\Code\Lesson02\DTUCalculator\sql-perfmon-cl\SqlDtuPerfmon.exe
file to run the performance trace. A new command prompt will appear and will display the counters as they are being monitored and saved into the CSV file.
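After the edits, the configuration file might look as follows. This is an illustrative sketch only: the key names SqlCategory and CsvPath come from the steps above, while the surrounding XML layout and the sample values are assumptions:

```xml
<configuration>
  <appSettings>
    <!-- SQL COUNTER: use SQLServer:Databases for a default instance,
         or MSSQL$<InstanceName>:Databases for a named instance. -->
    <add key="SqlCategory" value="MSSQL$Packtpub:Databases" />
    <!-- Path of the CSV file the trace writes to. -->
    <add key="CsvPath" value="C:\Code\Lesson02\DTUCalculator\sql-perfmon-log.csv" />
  </appSettings>
</configuration>
```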
You should get a similar command-line window, as shown in the following screenshot:
Keep SqlDtuPerfmon.exe on and capture the performance counters. The counters are saved in the C:\Code\Lesson02\DTUCalculator\sql-perfmon-log.csv file. Once the capture is complete, upload the CSV file to the DTU Calculator website for analysis.
Overall Recommendation
The DTU Calculator suggests migrating to the Premium – P2 tier, as it covers approximately 100% of the workload.
You can hover your mouse over different areas of the chart to check what percentage of the workload each tier covers:
Recommendation Based on CPU Utilization
The calculator recommends Premium – P2 tier based on CPU utilization:
Recommendation Based on IOPS Utilization
Based solely on IOPS utilization, the DTU Calculator suggests going for the Standard – S2 tier, which covers approximately 90% of the workload. However, it also mentions that approximately 10% of the workload will require a higher service tier. Therefore, the decision lies with you: the Premium – P2 Service tier is costlier, but it covers 100% of the workload. If you are okay with 10% of the workload performing slowly, you can choose the Standard – S2 service tier.
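Coverage percentages like these can be reproduced from the raw samples. A minimal sketch, assuming a hypothetical list of per-second IOPS samples and illustrative tier ceilings (placeholders, not Microsoft's published limits):

```python
# Hypothetical per-second IOPS samples from a trace (abbreviated).
iops_samples = [80, 120, 150, 400, 90, 130, 700, 110, 95, 140]

# Illustrative tier ceilings -- placeholders, not official Azure limits.
tier_limits = {"Standard - S2": 150, "Premium - P2": 1000}

def coverage(samples, limit):
    """Fraction of samples a tier could serve without throttling."""
    return sum(1 for s in samples if s <= limit) / len(samples)

for tier, limit in tier_limits.items():
    print(f"{tier}: {coverage(iops_samples, limit):.0%} of workload covered")
```

With these placeholder numbers, the lower tier covers 80% of the samples while the higher tier covers all of them, which mirrors the kind of trade-off the calculator reports.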
Remember, this is only an estimate of the Service tier; you can scale up to a higher Service tier at any time for improved performance:
Recommendation Based on Log Utilization
Based solely on log utilization, the DTU Calculator recommends the Basic Service tier. However, it also mentions that approximately 11% of the workload requires a higher Service tier.
At this point, you may start with the Basic Service tier and then scale up as and when required. This will certainly save money. However, you should consider that the Basic Service tier allows only 30 concurrent connections and has a maximum database size limit of 2 GB: