 

What are the best SQL Server performance optimization techniques? [closed]

I've always taken the approach of first deploying the database with a minimal set of indexes and then adding/changing indexes as performance dictates.

This approach works reasonably well. However, it still doesn't tell me where I could improve performance. It only tells me where performance is so bad that users complain about it.

Currently, I'm in the process of refactoring database objects on a lot of our applications.

So should I not bother to look for performance improvements since "premature optimization is the root of all evil"?

When refactoring application code, the developer is constantly looking for ways to improve the code quality. Is there a way to constantly be looking for improvements in database performance as well? If so, what tools and techniques have you found to be most helpful?

I've briefly played around with the Database Engine Tuning Advisor but didn't find it to be helpful at all. Maybe I just need more experience interpreting the results.

asked Sep 19 '08 by Chad Braun-Duin


1 Answer

My approach is to gather the commands run against the server or database into a table using SQL Server Profiler. Once you have that, you can query it for the max and average execution times, max and average CPU times, and (also very important) the number of times each query was run.
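As a rough sketch of that workflow: a saved Profiler trace file can be loaded into a table with `sys.fn_trace_gettable` and then aggregated. The file path, target table name, and column choices here are assumptions for illustration, not anything from the original answer.

```sql
-- Load a saved Profiler trace into a table (file path is hypothetical).
SELECT TextData, Duration, CPU, StartTime
INTO dbo.TraceData
FROM sys.fn_trace_gettable(N'C:\Traces\capture.trc', DEFAULT);

-- Aggregate: how expensive is each distinct command, and how often does it run?
SELECT
    CAST(TextData AS NVARCHAR(4000)) AS CommandText,
    COUNT(*)      AS ExecCount,
    MAX(Duration) AS MaxDuration,
    AVG(Duration) AS AvgDuration,
    MAX(CPU)      AS MaxCpu,
    AVG(CPU)      AS AvgCpu
FROM dbo.TraceData
GROUP BY CAST(TextData AS NVARCHAR(4000));
```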

Since I try to put all database access code in stored procedures, it's easy for me to break out queries. If you use inline SQL, it may be harder, since a change to a value in the query makes it look like a different query. You can work around this with the LIKE operator, putting the same types of queries into the same buckets when calculating the aggregates (max, avg, count).
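One way to do that bucketing, assuming the same hypothetical `dbo.TraceData` table (the bucket patterns and table names below are made-up examples):

```sql
-- Group inline SQL that differs only in literal values into coarse buckets.
SELECT
    CASE
        WHEN CAST(TextData AS NVARCHAR(4000)) LIKE 'SELECT%FROM Orders%' THEN 'Orders selects'
        WHEN CAST(TextData AS NVARCHAR(4000)) LIKE 'UPDATE Customers%'   THEN 'Customer updates'
        ELSE 'Other'
    END           AS QueryBucket,
    COUNT(*)      AS ExecCount,
    MAX(Duration) AS MaxDuration,
    AVG(Duration) AS AvgDuration
FROM dbo.TraceData
GROUP BY CASE
        WHEN CAST(TextData AS NVARCHAR(4000)) LIKE 'SELECT%FROM Orders%' THEN 'Orders selects'
        WHEN CAST(TextData AS NVARCHAR(4000)) LIKE 'UPDATE Customers%'   THEN 'Customer updates'
        ELSE 'Other'
    END;
```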

Once you have a "top 10" list of potential problems, you can start looking at them individually to see whether the query can be reworked, an index would help, or a minor architectural change is in order. To come up with the top 10, try looking at the data in different ways: avg * count for total cost over the period, max for the worst single offender, just plain avg, and so on.
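The "total cost" ranking might look like this, assuming the aggregated results were saved to a table named `dbo.QueryStats` with the column names shown (all hypothetical):

```sql
-- Rank candidates by total cost over the capture window:
-- average duration times execution count.
SELECT TOP 10
    CommandText,
    ExecCount,
    AvgDuration,
    AvgDuration * ExecCount AS TotalCost
FROM dbo.QueryStats
ORDER BY AvgDuration * ExecCount DESC;
```

Swapping the ORDER BY for `MaxDuration DESC` or `AvgDuration DESC` gives the other views of the data mentioned above.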

Finally, be sure to monitor over different time periods if necessary. The database usage might be different in the morning when everyone is getting in and running their daily reports than it is at midday when users are entering new data. You may also decide that even though some nightly process takes longer than any other query it doesn't matter since it's run during off hours.
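To see how the load profile shifts across the day, you could bucket the captured commands by hour; again, the table and column names are assumed for the sake of the example:

```sql
-- Compare morning vs. midday load by grouping on the hour of StartTime.
SELECT
    DATEPART(HOUR, StartTime) AS HourOfDay,
    COUNT(*)      AS ExecCount,
    AVG(Duration) AS AvgDuration
FROM dbo.TraceData
GROUP BY DATEPART(HOUR, StartTime)
ORDER BY HourOfDay;
```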

Good luck!

answered Oct 15 '22 by Tom H