I have a table that will have millions of entries, and a column that holds `BIGINT(20)` values that are unique to each row. It is not the primary key, but during certain operations there are thousands of `SELECT`s using this column in the `WHERE` clause.
Q: Would adding an index to this column help when the number of entries grows to the millions? I know it would for a text value, but I'm unfamiliar with what an index does for an `INT` or `BIGINT` column.
A sample `SELECT` that would run thousands of times is similar to this:

```sql
SELECT * FROM table1 WHERE my_big_number = 19287319283784;
```
If you have a very large table, then searching against values that aren't indexed can be extremely slow. In MySQL terms this kind of query ends up being a "table scan" which is a way of saying it must test against each row in the table sequentially. This is obviously not the best way to do it.
Adding an index will help with read speeds, but the price you pay is slightly slower write speeds. There's always a trade-off when making an optimization, but in your case the reduction in read time would be immense while the increase in write time would be marginal.
Keep in mind that adding an index to a large table can take a considerable amount of time, so test this against a copy of your production data before applying it to your production system. The table will likely be locked for the duration of the `ALTER TABLE` statement.
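As a sketch, assuming the table and column names from your sample query (the index name `idx_my_big_number` is just an illustrative choice), the index could be added like this:

```sql
-- Add a secondary index on the BIGINT column.
-- UNIQUE is used here because the question says the values are
-- unique to each row; drop the keyword if duplicates are possible.
ALTER TABLE table1 ADD UNIQUE INDEX idx_my_big_number (my_big_number);
```

Depending on your MySQL version and storage engine, the build may run as an online operation (InnoDB online DDL in MySQL 5.6 and later), which can allow concurrent reads and writes and lessen the locking concern.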
As always, use `EXPLAIN` on your queries to determine their execution strategy. In your case it'd be something like:

```sql
EXPLAIN SELECT * FROM table1 WHERE my_big_number = 19287319283784;
```

In the output, check the `type` and `key` columns: `ALL` means a full table scan, while a value like `ref` or `const` with your index listed under `key` means the index is being used.