 

SQL Server Index - Any improvement for LIKE queries?

We have a query that runs against a fairly large table and unfortunately needs to use LIKE '%ABC%' on a couple of varchar fields so the user can search on partial names, etc. (SQL Server 2005)

Would adding an index on these varchar fields help select query performance at all when using LIKE, or does SQL Server basically ignore the indexes and do a full scan in those cases?

Any other possible ways to improve performance when using LIKE?

asked Apr 29 '09 by schooner


People also ask

Do indexes work with LIKE?

Indexes cannot be used with LIKE '%text%' predicates. They can, however, be used with LIKE 'text%'. With full-text indexing the situation may be different, but it requires more work than just creating an index. If you are always checking all 4 columns, you may as well create a single FULLTEXT index on the 4 columns.
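A quick sketch of the difference (table, column, and index names here are illustrative, not from the question): a trailing wildcard leaves a fixed prefix that bounds an index seek, while a leading wildcard does not.

```sql
-- Hypothetical table and index for illustration
CREATE INDEX ix_customers_name ON dbo.Customers (Name);

-- Can use an index seek: the fixed prefix 'ABC' bounds the search range
SELECT Name FROM dbo.Customers WHERE Name LIKE 'ABC%';

-- Cannot seek: a leading wildcard means the match could start anywhere,
-- so the engine must scan the index or the table
SELECT Name FROM dbo.Customers WHERE Name LIKE '%ABC%';
```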

Does index improve query performance?

Indexing makes columns faster to query by creating pointers to where data is stored within a database. Imagine you want to find a piece of information that is within a large database. Without an index, the computer will look through every row until it finds it; an index lets it jump straight to the relevant rows instead.

Do indexes improve joins?

Indexes can help improve the performance of a nested-loop join in several ways. The biggest benefit often comes when you have a clustered index on the joining column in one of the tables. The presence of a clustered index on a join column frequently determines which table SQL Server chooses as the inner table.
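As a minimal illustration of that point (table and column names are assumptions), a clustered index on the joining column of one table lets the optimizer pick that table as the inner side of a nested-loop join and seek into it per outer row:

```sql
-- Illustrative schema: cluster the child table on its join column
CREATE CLUSTERED INDEX cx_orderitems_orderid ON dbo.OrderItems (OrderID);

-- The optimizer can now seek into OrderItems for each Orders row
SELECT o.OrderID, i.ProductID
FROM dbo.Orders AS o
JOIN dbo.OrderItems AS i ON i.OrderID = o.OrderID;
```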


2 Answers

Only if you add full-text searching to those columns, and use the full-text query capabilities of SQL Server.

Otherwise, no, an index will not help.
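A rough sketch of the full-text route on SQL Server (catalog, table, and key names here are assumptions, not from the question). Note that full-text search matches on word prefixes via CONTAINS, not arbitrary substrings, so it is not a drop-in replacement for LIKE '%ABC%':

```sql
-- Full-text indexing requires a unique key index on the table
CREATE FULLTEXT CATALOG ftCatalog AS DEFAULT;

CREATE FULLTEXT INDEX ON dbo.Customers (Name, Notes)
    KEY INDEX pk_customers;

-- Word-prefix search that can use the full-text index
SELECT Name
FROM dbo.Customers
WHERE CONTAINS(Name, '"ABC*"');
```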

answered Sep 22 '22 by Lasse V. Karlsen


You can potentially see performance improvements by adding index(es), it depends a lot on the specifics :)

How much of the total size of the row are your predicated columns? How many rows do you expect to match? Do you need to return all rows that match the predicate, or just top 1 or top n rows?

If you are searching for values with high selectivity/uniqueness (so few rows to return), and the predicated columns are a smallish portion of the entire row size, an index could be quite useful. It will still be a scan, but your index will fit more rows per page than the source table.

Here is an example where the total row size is much greater than the column size to search across:

create table t1 (v1 varchar(100), b1 varbinary(8000))
go

--add 10k rows of filler
insert t1 values ('abc123def', cast(replicate('a', 8000) as varbinary(8000)))
go 10000

--add 1 row to find
insert t1 values ('abc456def', cast(replicate('a', 8000) as varbinary(8000)))
go

set statistics io on
go

select * from t1 where v1 like '%456%'
--shows 10001 logical reads

--create index that only contains the column(s) to search across
create index t1i1 on t1(v1)
go

select * from t1 where v1 like '%456%'
--shows 37 logical reads

If you look at the actual execution plan you can see the engine scanned the index and did a bookmark lookup on the matching row. Or you can tell the optimizer directly to use the index, if it hadn't decided to use this plan on its own:

select * from t1 with (index(t1i1)) where v1 like '%456%'

If you have a bunch of columns to search across but only a few of them are highly selective, you could create multiple indexes and use a reduction approach. E.g. first determine a set of IDs (or whatever your PK is) from your highly selective index, then search your less selective columns with a filter against that small set of PKs.
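A rough sketch of that reduction approach, with hypothetical table and column names (the real schema isn't given in the question):

```sql
-- Step 1: narrow the candidate set using the highly selective column,
-- whose small dedicated index keeps this scan cheap
SELECT pk
INTO #candidates
FROM dbo.BigTable
WHERE SelectiveCol LIKE '%ABC%';

-- Step 2: apply the less selective predicates only to those candidates
SELECT t.*
FROM dbo.BigTable AS t
JOIN #candidates AS c ON c.pk = t.pk
WHERE t.WideCol1 LIKE '%XYZ%'
   OR t.WideCol2 LIKE '%XYZ%';
```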

If you always need to return a large set of rows you would almost certainly be better off with a table scan.

So the possible optimizations depend a lot on the specifics of your table definition and the selectivity of your data.

HTH! -Adrian

answered Sep 21 '22 by ahains