
Running queries on tables with more than 1 million rows in MySQL

I am indexing all the columns that I use in my WHERE / ORDER BY clauses — is there anything else I can do to speed the queries up?

The queries are very simple, like:

SELECT COUNT(*)
  FROM TABLE
 WHERE user = id
   AND other_column = 'something'

I am using PHP 5 with MySQL client version 4.1.22, and my tables are MyISAM.

asked May 06 '11 by webnoob

2 Answers

Talk to your DBA. Run your local equivalent of showplan (in MySQL, EXPLAIN). For a query like your sample, I would suspect that a covering index on the columns id and other_column would greatly speed up performance. (I assume user is a variable or niladic function.)
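A sketch of that covering index, using the identifiers from the question (the table name here is a placeholder):

```sql
-- Covers both predicates, so COUNT(*) can be answered from the
-- index alone without touching the data rows.
CREATE INDEX idx_id_other ON my_table (id, other_column);
```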

A good general rule is that the columns in the index should go from left to right in descending order of variance: the column varying most rapidly in value (highest cardinality) should be the first column in the index, and the column varying least rapidly should be the last. It seems counterintuitive, but there you go — the query optimizer likes narrowing things down as fast as possible.
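As a hypothetical illustration of that rule (the table, columns, and cardinalities below are invented for the example):

```sql
-- `email` is nearly unique (varies most); `status` has only a few
-- distinct values (varies least), so `email` goes first:
CREATE INDEX idx_users_email_status ON users (email, status);

-- Per-column cardinality estimates can be inspected with:
SHOW INDEX FROM users;
```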

answered Nov 09 '22 by Nicholas Carey


If all your queries include a user id, then you can start from the assumption that user should be included in each of your indexes, probably as the first column. (Can we assume the user id is highly selective, i.e. that no single user has more than several thousand records?)

So your indexes might be:

user + otherfield1
user + otherfield2
etc.
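The list above, written out as MySQL DDL (the table, column, and index names are placeholders):

```sql
-- One composite index per query pattern, with `user` leading each.
CREATE INDEX idx_user_field1 ON my_table (user, otherfield1);
CREATE INDEX idx_user_field2 ON my_table (user, otherfield2);
```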

If your user id is really selective — say, only several dozen records per user — then an index on that field alone should be pretty effective (sub-second return).

What's nice about a "user + otherfield" index is that MySQL doesn't even need to look at the data records. The index has an entry for each record, so it can just count the entries.
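This can be checked with EXPLAIN: when the count is satisfied from the index alone, MySQL reports `Using index` in the Extra column (placeholder table name, and an assumed index on the two columns):

```sql
-- Assumes an index like CREATE INDEX idx_user_other ON my_table (user, other_column)
EXPLAIN SELECT COUNT(*)
          FROM my_table
         WHERE user = 42
           AND other_column = 'something';
-- Extra: "Using index" -> the count comes from the index, not the rows.
```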

answered Nov 09 '22 by dkretz