Given the SQL Statement:
SELECT *
FROM MY_TABLE
WHERE SomeNumberField in (0,99999)
If I can guarantee that the majority of rows in MY_TABLE have SomeNumberField set to 99999, and can project that this will remain the case indefinitely, is it better to write the above query like this:
SELECT *
FROM MY_TABLE
WHERE SomeNumberField in (99999,0)
If you have an index on SomeNumberField, then this statement will either be split into two range scans on the index, or processed as a TABLE SCAN / CLUSTERED INDEX SCAN with a filter.
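If you need to create such an index, a minimal sketch (assuming SQL Server syntax; the index name is just illustrative):

-- Nonclustered index so the optimizer has the option of range scans on SomeNumberField
CREATE NONCLUSTERED INDEX IX_MY_TABLE_SomeNumberField
ON MY_TABLE (SomeNumberField);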
The latter is more probable here: since the majority of your rows have SomeNumberField = 99999, the predicate is not selective enough to make an index seek worthwhile.
The range scans will always be performed in index order, regardless of the order of the constants in the IN predicate.
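You can check this yourself by comparing the plans of the two variants, for example (assuming SQL Server Management Studio or sqlcmd, where GO is the batch separator; SHOWPLAN_TEXT prints the estimated plan instead of executing the queries):

SET SHOWPLAN_TEXT ON;
GO
-- Both variants should produce identical plans
SELECT * FROM MY_TABLE WHERE SomeNumberField IN (0, 99999);
SELECT * FROM MY_TABLE WHERE SomeNumberField IN (99999, 0);
GO
SET SHOWPLAN_TEXT OFF;
GO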
Filter comparison time is negligible compared to the time required to fetch the data pages.
This kind of decision is made automatically by the query optimizer, provided you keep the statistics it relies on up to date. You can use the tools provided by your database vendor for this task. For SQL Server, refer to the following article on MSDN: http://msdn.microsoft.com/en-us/library/cc966419.aspx
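On SQL Server, for example, statistics can also be refreshed manually (a sketch; with AUTO_UPDATE_STATISTICS enabled, which is the default, this is rarely needed by hand):

-- Refresh statistics for the one table, or for every table in the database
UPDATE STATISTICS MY_TABLE;
EXEC sp_updatestats;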