This post has been republished via RSS; it originally appeared at: Microsoft Tech Community - Latest Blogs.
We are excited to announce that the transaction log write throughput (referred to as "log rate") limit has been doubled for the Azure SQL Managed Instance Business Critical service tier with Premium-series hardware. The new limit of 192 MiB/s significantly improves overall performance for write-intensive scenarios such as data ingestion and index maintenance, and allows your instances to handle more workload for the same price!
What is log rate?
Log rate in SQL Managed Instance refers to the rate at which transaction log records are generated and written to the transaction log file. Log rate is directly correlated with the number of transactions processed by the database because every executed transaction generates a certain amount of data that is written to the transaction log.
Log rate is important because it directly impacts the performance and stability of your database operations, especially for write-intensive workloads.
Log rate limit is now 192 MiB/s
In SQL Managed Instance, the log rate limit is based on the number of vCores and the service tier. For the Business Critical tier, the log rate is limited to 4.5 MiB/s per vCore, with a new maximum of 192 MiB/s per instance with no per-database cap limitations (i.e. a single database on a Business Critical instance can use up the whole 192 MiB/s log rate quota). You can benefit from the new log rate limit by choosing Business Critical Premium-series hardware for your SQL Managed Instance.
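Based on the limits above, the effective per-instance cap can be estimated as the smaller of 4.5 MiB/s × vCores and 192 MiB/s. The following T-SQL sketch illustrates this arithmetic (the MIN-of-two-values formula is inferred from the stated limits, not an official sizing API):

```sql
-- A sketch: estimated log rate cap for a Business Critical Premium-series instance,
-- assuming limit = MIN(4.5 MiB/s * vCores, 192 MiB/s)
DECLARE @vCores int = 64;
SELECT IIF(4.5 * @vCores < 192, 4.5 * @vCores, 192) AS max_log_rate_mibps;
-- 64 vCores: 4.5 * 64 = 288, so the 192 MiB/s instance cap applies;
-- 32 vCores: 4.5 * 32 = 144 MiB/s
```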
| Business Critical log write throughput limit | Before | Now |
| --- | --- | --- |
| Standard-series | 4 MiB/s per vCore | 4.5 MiB/s per vCore |
| Premium-series | 4 MiB/s per vCore | 4.5 MiB/s per vCore |
| Premium-series memory optimized | 4 MiB/s per vCore | 4.5 MiB/s per vCore |
Why is the log rate limit important?
A higher log rate limit ensures that your instance can handle more transactions when needed. The maximum log rate therefore matters most in scenarios that generate large volumes of transaction log, such as:
- High-volume transaction processing workloads (OLTP): If your application has a high volume of data modifications (INSERT, UPDATE, DELETE), a higher log rate can help accommodate the increased transaction log generation and improve performance.
- Bulk data operations: When performing bulk data operations, such as bulk inserts or large-scale data imports, a higher log rate can help process these operations more efficiently and reduce the time it takes to complete them.
- Database replication or synchronization: If you are using features like transactional replication or other synchronization mechanisms that rely on transaction log shipping, a higher log rate can ensure timely data synchronization between primary and secondary instances.
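If you suspect a workload like those above is hitting the log rate cap, one way to check is to look for log rate governor waits. The sketch below assumes the standard Azure SQL wait types whose names contain LOG_RATE_GOVERNOR:

```sql
-- A sketch: check for waits that indicate log rate throttling.
-- Wait types containing LOG_RATE_GOVERNOR signal that transactions
-- were delayed by the log rate cap.
SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type LIKE '%LOG_RATE_GOVERNOR%'
ORDER BY wait_time_ms DESC;
```

High wait times here suggest the workload is log-rate bound, in which case the higher 192 MiB/s cap directly helps.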
How to monitor the log rate?
Monitoring and managing the log rate is essential to ensure optimal performance and avoid potential bottlenecks in your SQL Managed Instance. You can use the QPI library to monitor the log rate and take appropriate action if needed.
STEP 1: Take a snapshot of the current values in the sys.dm_io_virtual_file_stats dynamic management function.
EXEC qpi.snapshot_file_stats
STEP 2: Find information about each file (size, throughput, read, write [mbps]).
SELECT * FROM qpi.file_stats
We are interested only in the log files:
SELECT * FROM qpi.file_stats
WHERE file_name = 'log'
Note: The tempdb log file, templog, is not governed; that is, templog does not count toward the log rate of a SQL Managed Instance.
Note: If you renamed a database log file so that its file_name is no longer "log", you would need to adjust these queries to sum all log files (except templog) to get the instance's log rate.
STEP 3: Sum the write_mbps column across all log files on the instance to get the log rate.
SELECT SUM(write_mbps) AS lograte
FROM qpi.file_stats
WHERE file_name = 'log'
This “lograte” is now limited to 192 MiB/s!
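Putting the three steps together, a minimal monitoring script might look like the following sketch (the one-minute delay is an arbitrary sampling interval; qpi.file_stats reports rates measured since the last snapshot):

```sql
-- A sketch combining the steps above
EXEC qpi.snapshot_file_stats;       -- STEP 1: take a baseline snapshot
WAITFOR DELAY '00:01:00';           -- let the workload run for a minute
SELECT SUM(write_mbps) AS lograte   -- STEP 3: total log write throughput
FROM qpi.file_stats                 --         since the snapshot
WHERE file_name = 'log';            -- templog is excluded by name
```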
More good news: this is free of charge!
These improvements come at no additional cost: we have doubled the log rate limit on SQL Managed Instance Business Critical with no associated pricing changes.
These changes were automatically enabled for all existing and future Business Critical Azure SQL Managed Instances that use Premium-series hardware as of the beginning of August 2023.
Conclusion
This improvement makes Azure SQL Managed Instance an excellent choice for your performance-hungry database workloads and allows you to migrate more of your larger SQL workloads to Azure SQL MI. If you're still new to Azure SQL Managed Instance, now is a great time to get started and take Azure SQL Managed Instance for a spin!