Yesterday, I wrote a blog post SQL SERVER – Statistics Modification Counter – sys.dm_db_stats_properties where I discussed how to find how many rows have been modified based on sys.dm_db_stats_properties. I received lots of follow-up questions and comments on the blog. Here is a question about rows sampled from sys.dm_db_stats_properties, asked during yesterday's Comprehensive Database Performance Health Check.
Rows Sampled – sys.dm_db_stats_properties
In the previous blog, we discussed how to figure out how many rows have been modified since the statistics were last updated. Today we are going to extend the same script: in addition to the same details, we will list the number of rows in the original table, the number of rows used in the statistical sampling, and the percentage of rows from the original table that were sampled.
SELECT OBJECT_SCHEMA_NAME(obj.object_id) SchemaName,
       obj.name TableName,
       stat.name,
       sp.modification_counter,
       sp.[rows],
       sp.rows_sampled,
       sp.rows_sampled * 100.0 / NULLIF(sp.[rows], 0) AS [% Rows Sampled],
       sp.last_updated
FROM sys.objects AS obj
INNER JOIN sys.stats AS stat ON stat.object_id = obj.object_id
CROSS APPLY sys.dm_db_stats_properties(stat.object_id, stat.stats_id) AS sp
WHERE obj.is_ms_shipped = 0
ORDER BY sp.modification_counter DESC;
The above query will give you a result very similar to the following.
To make sure that the query only returns user objects (and not internal system objects), I have also included an additional WHERE condition on is_ms_shipped = 0. Note that the percentage uses 100.0 rather than 100 so that the division is not truncated to an integer, and NULLIF protects against a divide-by-zero error on empty tables.
How to use this script?
There are many different reasons for the poor performance of a T-SQL query. This script is one of the scripts that can give us some indication of the cause of poor performance.
If you see consistently poor performance for queries related to one of the tables, there is a good chance that the SQL Server Query Optimizer is not producing an efficient execution plan for those queries. One of the reasons might be out-of-date statistics or sampled statistics.
You can decide that a statistic is out of date based on a very old date in the last_updated column and a very high modification_counter relative to the number of rows in the table. When your table is huge and has a very high number of rows, the SQL Server engine takes only a sample of the data to build the statistics. You can spot that from the rows_sampled column and a low value in % Rows Sampled.
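As a rough filter, you can extend the script to surface only the statistics most likely to be stale. The 20% modification threshold below is an illustrative assumption, not a fixed rule; adjust it for your own workload:

```sql
-- Illustrative filter: statistics where more than 20% of the table's rows
-- have been modified since the last update (the threshold is an assumption)
SELECT OBJECT_SCHEMA_NAME(obj.object_id) SchemaName,
       obj.name TableName,
       stat.name,
       sp.modification_counter,
       sp.[rows],
       sp.last_updated
FROM sys.objects AS obj
INNER JOIN sys.stats AS stat ON stat.object_id = obj.object_id
CROSS APPLY sys.dm_db_stats_properties(stat.object_id, stat.stats_id) AS sp
WHERE obj.is_ms_shipped = 0
  AND sp.modification_counter > 0.2 * sp.[rows]
ORDER BY sp.modification_counter DESC;
```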
In both of the above scenarios, it is a good idea to update your statistics with a full scan. You can read more about how to update table statistics with a full scan here: SQL SERVER – Update Table Statistics in Parallel with FULLSCAN
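For reference, the FULLSCAN update itself is a one-line statement. The table and statistic names below (dbo.SalesOrder, IX_SalesOrder_OrderDate) are hypothetical placeholders; substitute your own:

```sql
-- Update every statistic on a (hypothetical) table with a full scan
UPDATE STATISTICS dbo.SalesOrder WITH FULLSCAN;

-- Or target a single (hypothetical) statistic by name
UPDATE STATISTICS dbo.SalesOrder IX_SalesOrder_OrderDate WITH FULLSCAN;
```

Be aware that a full scan on a very large table can be expensive, which is exactly why the linked post discusses running these updates in parallel.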
After updating your statistics, if the query still performs poorly, you might have a very different reason for the poor performance. To fix the query performance, I would suggest you start with SQL SERVER – Query Specific Wait Statistics and Performance Tuning.
Let me know if you find this script helpful.
Reference: Pinal Dave (https://blog.sqlauthority.com)