I'm doing some SQL tuning these days and found one weird SQL statement during the test:
SELECT StatMan([SC0],[SC1], [SB0000])
FROM (SELECT TOP 100 PERCENT [SC0],[SC1], step_direction([SC0]) over (order by NULL) AS [SB0000]
FROM (SELECT [tableA] AS [SC0],[tableB] AS [SC1]
FROM [dbo].[url] WITH (READUNCOMMITTED,SAMPLE 3.408654e+000 PERCENT)
) AS _MS_UPDSTATS_TBL_HELPER
ORDER BY [SC0],[SC1], [SB0000]
) AS _MS_UPDSTATS_TBL
OPTION (MAXDOP 1)
It looks like SQL Server is doing some kind of "reindex" or index rebuild here. My question is: how can we avoid this during a long load test, other than rebuilding the indexes on each table before the test starts?
This statement takes 16862 ms because my table contains enough rows, and my test performs many inserts.
By default, SQL Server and Azure SQL Database have automatic statistics updates enabled (AUTO_UPDATE_STATISTICS). Updating statistics ensures that queries compile with up-to-date statistics; an update triggered by any process may cause query plans to recompile automatically.
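As a rough sketch (using the dbo.url table from the question; adjust names for your schema), you can see when each statistics object was last refreshed, and refresh them manually before a load test, with T-SQL like:

```sql
-- Check when each statistics object on the table was last updated
SELECT s.name AS stats_name,
       STATS_DATE(s.object_id, s.stats_id) AS last_updated
FROM sys.stats AS s
WHERE s.object_id = OBJECT_ID('dbo.url');

-- Refresh all statistics on the table manually, e.g. before the test,
-- so the automatic sampled update is less likely to fire mid-run
UPDATE STATISTICS dbo.url WITH FULLSCAN;
```

Updating with FULLSCAN is slower than the sampled automatic update, but doing it up front keeps the cost out of the measured test window.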
SQL Server routinely runs into performance issues while executing queries: poorly written queries, slow processing, indexes with heavy read and write activity, and memory or CPU pressure.
This seems to be from updating statistics.
Will updating statistics happen in a normal production environment? If so, shouldn't a load test, to reflect a production environment, update statistics as well?
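If you want production-like behavior but without queries stalling behind the statistics update, one common compromise (a sketch; the database name is a placeholder) is to make the automatic update asynchronous:

```sql
-- Queries continue to compile against the existing statistics while
-- the update runs in the background, instead of waiting for it
ALTER DATABASE [YourLoadTestDb] SET AUTO_UPDATE_STATISTICS_ASYNC ON;
```

This keeps statistics maintenance in the test, as in production, while removing the synchronous stall from individual query timings.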
To turn off the AUTO_UPDATE_STATISTICS option, use sp_autostats on the desired table(s) (see http://msdn.microsoft.com/en-us/library/ms188775.aspx ).
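For example (using the dbo.url table from the question), disabling and then verifying the setting looks like:

```sql
-- Disable automatic statistics updates for every statistics
-- object on the table
EXEC sp_autostats 'dbo.url', 'OFF';

-- Called with only the table name, sp_autostats reports the
-- current AUTO_UPDATE_STATISTICS setting per statistics object
EXEC sp_autostats 'dbo.url';
```

Note that disabling automatic updates makes it your responsibility to run UPDATE STATISTICS yourself, or the optimizer will compile plans against increasingly stale statistics as the inserts accumulate.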