 

What do you do to make sure a new index does not slow down queries?

When we add or remove an index to speed something up, we may end up slowing something else down. To protect against such cases, after creating a new index I do the following:

  1. start the Profiler,
  2. run a SQL script containing lots of queries I do not want to slow down,
  3. load the trace from a file into a table,
  4. analyze CPU, reads, and writes from the trace against the results from previous runs, made before I added (or removed) the index.
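
For SQL Server, steps 3 and 4 can be sketched roughly like this (the file path and table name are placeholders, not from the question; `fn_trace_gettable` is the built-in function for reading trace files):

```sql
-- Step 3: load a finished Profiler/server-side trace file into a table.
SELECT *
INTO dbo.BaselineTrace            -- hypothetical table name
FROM fn_trace_gettable(N'C:\traces\before_index.trc', DEFAULT);

-- Step 4: after adding the index, capture a second run the same way,
-- then compare aggregate cost per statement between the two tables.
SELECT CAST(TextData AS NVARCHAR(MAX)) AS QueryText,
       SUM(CPU)    AS TotalCPU,
       SUM(Reads)  AS TotalReads,
       SUM(Writes) AS TotalWrites
FROM dbo.BaselineTrace
GROUP BY CAST(TextData AS NVARCHAR(MAX))
ORDER BY TotalCPU DESC;
```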

This is partially automated and mostly does what I want. However, I am not sure whether there is a better way to do it. Is there a tool that does this?

Edit 1: To the person who voted to close my question: could you explain your reasons?

Edit 2: I searched but did not find anything that explains how adding an index can slow down selects. However, this is a well-known fact, so there should be something somewhere. If nothing comes up, I can write up a few examples later on.

Edit 3: One such example: two columns are highly correlated, like height and weight. We have an index on height, which is not selective enough for our query. We add an index on weight and run a query with two range conditions, one on height and one on weight. Because the optimizer is not aware of the correlation, it grossly underestimates the cardinality of our query.
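
A minimal repro of that scenario might look like this (the table and index names are invented for illustration):

```sql
-- Hypothetical table where height and weight are strongly correlated.
CREATE TABLE dbo.People (
    PersonId INT IDENTITY PRIMARY KEY,
    HeightCm INT NOT NULL,
    WeightKg INT NOT NULL
);
CREATE INDEX IX_People_Height ON dbo.People (HeightCm);
CREATE INDEX IX_People_Weight ON dbo.People (WeightKg);  -- the newly added index

-- The optimizer multiplies the two selectivities as if the columns
-- were independent, so the estimated row count for this query can be
-- far below the actual one, which may push it toward a poor plan.
SELECT PersonId
FROM dbo.People
WHERE HeightCm BETWEEN 180 AND 200
  AND WeightKg BETWEEN 80 AND 110;
```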

Another example: adding an index on an ever-increasing column, such as OrderDate, can seriously slow down a query with a condition like OrderDate > SomeDateAfterCreatingTheIndex.
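
The ascending-key case comes down to stale statistics: the histogram built when the index was created has no steps above the then-current maximum value, so the optimizer estimates almost no rows for newer dates and may pick a plan that performs badly against the real row count. A sketch (table and column names are illustrative):

```sql
-- Statistics on OrderDate were built at index creation time; rows
-- inserted since then fall above the histogram's highest step, so the
-- optimizer's row estimate for this predicate is close to zero.
SELECT OrderId
FROM dbo.Orders
WHERE OrderDate > '2011-09-01';   -- a date after the index was created

-- Refreshing the statistics lets the optimizer see the new date range.
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;
```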

A-K asked Sep 16 '11



2 Answers

Ultimately, what you're asking can be rephrased as: "How can I ensure that queries which already use an optimal, fast plan do not get 'optimized' into a worse execution plan?"

Whether the plan changes due to parameter sniffing, a statistics update, or metadata changes (like adding a new index), the best answer I know of for keeping a plan stable is plan guides. Deploying plan guides for critical queries that already have good execution plans is probably the best way to force the optimizer to keep using the good, validated plan. See Applying a Fixed Query Plan to a Plan Guide:

You can apply a fixed query plan to a plan guide of type OBJECT or SQL. Plan guides that apply a fixed query plan are useful when you know about an existing execution plan that performs better than the one selected by the optimizer for a particular query.

The usual warnings apply as to any possible abuse of a feature that prevents the optimizer from using a plan which may be actually better than the plan guide.
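
A hedged sketch of pinning a known-good plan this way (`sp_create_plan_guide` is the documented SQL Server procedure; the guide name, statement, parameter, and plan XML below are placeholders):

```sql
-- Capture the good plan's XML first (e.g. from sys.dm_exec_query_plan),
-- then attach it to the exact statement text with a USE PLAN hint.
EXEC sp_create_plan_guide
    @name            = N'Guide_CriticalOrderLookup',   -- hypothetical name
    @stmt            = N'SELECT OrderId FROM dbo.Orders WHERE OrderDate > @d;',
    @type            = N'SQL',
    @module_or_batch = NULL,    -- NULL means @stmt is the whole batch
    @params          = N'@d DATETIME',
    @hints           = N'OPTION (USE PLAN N''<ShowPlanXML ...>...</ShowPlanXML>'')';
```

The statement text and parameter list must match what the application actually submits, or the guide will not be matched.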

Remus Rusanu answered Oct 12 '22


How about the following approach:

  • Save the execution plans of all typical queries.
  • After applying new indexes, check which execution plans have changed.
  • Test the performance of the queries with modified plans.
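
One way to implement the first two bullets, assuming SQL Server's plan-cache DMVs (the snapshot table name is invented for illustration):

```sql
-- Before the change: snapshot the cached plans for the workload.
SELECT qs.query_hash,
       qs.query_plan_hash,
       qp.query_plan
INTO dbo.PlanBaseline                          -- hypothetical snapshot table
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp;

-- After adding the index: rerun the workload, then list queries whose
-- plan hash changed -- those are the ones worth retesting.
SELECT qs.query_hash
FROM sys.dm_exec_query_stats AS qs
JOIN dbo.PlanBaseline AS b
  ON b.query_hash = qs.query_hash
WHERE b.query_plan_hash <> qs.query_plan_hash;
```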
Klas Lindbäck answered Oct 12 '22