
Azure SQL Database Log I/O Seems High

I've been optimizing our Azure SQL Database and have started getting really good performance. The main concern now is the logging it does. When running an insert/update load test, everything stays low except the CPU, which peaks around 15%, and the log I/O, which peaks around 25%. Since the log I/O hits 25%, the DTU percentage is also 25%. I turned off Auditing in the settings for the database, but that did nothing. Is there a way to reduce the logging that is being done? I'm not even sure where the logs are being saved.

Any insight on this would help, as I've googled and couldn't find anything worth mentioning about the logging that is happening.

Here is a screenshot of the metrics:

[Screenshot: Azure SQL Database metrics]

Workflow Details:

I don't have byte sizes on me as I'm not in the office at the moment. Every task is a SELECT followed by either an INSERT or an UPDATE, a typical add-or-update flow using Entity Framework (roughly the pattern sketched below). These tasks fire off and complete at a rate of 63 tasks per second to produce those metrics.
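For context, here is a minimal T-SQL sketch of what each task roughly executes at the database level. The Items table and its columns are hypothetical, and Entity Framework's actual generated SQL will differ in detail:

```sql
-- Hypothetical per-task pattern: read the row, then insert or update it.
DECLARE @Id INT = 42, @Name NVARCHAR(100) = N'example';

IF EXISTS (SELECT 1 FROM dbo.Items WHERE Id = @Id)        -- the SELECT step
    UPDATE dbo.Items SET Name = @Name WHERE Id = @Id;     -- existing row: UPDATE
ELSE
    INSERT INTO dbo.Items (Id, Name) VALUES (@Id, @Name); -- new row: INSERT
```

Every such INSERT or UPDATE generates transaction log records, which is what the log I/O metric measures.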

asked Jul 10 '15 by Jacob Roberts

People also ask

How do I shrink an Azure SQL log file?

In Azure SQL Database, you can shrink files with either the DBCC SHRINKDATABASE or DBCC SHRINKFILE command. DBCC SHRINKDATABASE shrinks all data and log files in a database with a single command; it shrinks one data file at a time, which can take a long time for larger databases. DBCC SHRINKFILE shrinks a single file that you name explicitly.
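As a hedged sketch, shrinking just the log file might look like this; the 1024 MB target is arbitrary, and you would substitute the logical file name that sys.database_files actually reports:

```sql
-- Find the logical name and current size of the transaction log file.
SELECT name, type_desc, size * 8 / 1024 AS size_mb
FROM sys.database_files
WHERE type_desc = 'LOG';

-- Shrink the log file to roughly 1024 MB (target size is illustrative).
DBCC SHRINKFILE (N'log', 1024);
```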

What is data IO percentage in Azure?

This metric is the average Data I/O percentage based on the limit of the service tier. Along with CPU percentage and log write percentage, it is one of the component metrics that make up the DTU percentage.
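As a sketch, the component percentages (and an approximation of the DTU percentage, which is driven by the highest of them) can be read from the sys.dm_db_resource_stats DMV in Azure SQL Database. GREATEST is available in Azure SQL Database; on older SQL Server versions a CASE expression would be needed instead:

```sql
-- Recent resource usage, one row per roughly 15-second interval.
SELECT end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent,
       -- DTU % is effectively the maximum of the component percentages.
       GREATEST(avg_cpu_percent, avg_data_io_percent, avg_log_write_percent)
           AS approx_dtu_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;
```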

Is Azure SQL high availability?

Azure SQL Managed Instance features a built-in high availability solution that is deeply integrated with the Azure platform. It depends on Service Fabric for failure detection and recovery, and on Azure Blob storage for data protection.

How many DTUs are equal to a vCore?

To choose the service objective, or compute size, for the migrated database in the vCore model, you can use a simple but approximate rule of thumb: every 100 DTUs in the Basic or Standard tiers require at least 1 vCore, and every 125 DTUs in the Premium tier require at least 1 vCore. For example, a Standard S3 database (100 DTUs) maps to at least 1 vCore, while a Premium P2 (250 DTUs) maps to at least 250 / 125 = 2 vCores.


1 Answer

The metric represents the writes to the transaction log of the database. The transaction log tracks the changes to the data, so this is for the most part a function of the amount of data you insert or update. As you said, auditing has no impact on the log rate in SQL Database.
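Since the log volume tracks how much data you change, it cannot be turned off, but per-commit overhead can be reduced by batching many small modifications into one transaction. A minimal sketch, reusing the hypothetical Items table from the question; total log volume is still governed by the amount of data changed:

```sql
-- Instead of one transaction per row (one synchronous log flush per commit),
-- batch several modifications into a single transaction.
BEGIN TRANSACTION;

INSERT INTO dbo.Items (Id, Name)
VALUES (1, N'a'), (2, N'b'), (3, N'c');   -- one multi-row insert

UPDATE dbo.Items SET Name = N'z' WHERE Id = 42;

COMMIT TRANSACTION;  -- a single commit means a single synchronous log flush
```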

answered Oct 29 '22 by Jan Engelsberg