We have a table with a TEXT column whose length currently averages about 2,000 characters. We wanted to see what the performance of queries selecting that column would be if the average were 5k, 10k, 20k, etc.
We set up an isolated test and found that as the length of the TEXT column increased linearly, the query time increased exponentially.
Does anyone have any quick thoughts on why this might be? I can provide more info, but it's pretty straightforward.
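The original test setup isn't shown, so here is a minimal sketch of how such a benchmark might look, using an in-memory SQLite database (the questioner's actual engine and schema are unknown; the table name `t`, column `body`, and row count are illustrative assumptions):

```python
import sqlite3
import time

def time_select(avg_len, rows=2000):
    """Build an in-memory table whose TEXT column is avg_len characters
    long, then time a full scan that selects that column."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, body TEXT)")
    payload = "x" * avg_len
    conn.executemany("INSERT INTO t (body) VALUES (?)",
                     [(payload,)] * rows)
    conn.commit()
    start = time.perf_counter()
    conn.execute("SELECT body FROM t").fetchall()
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

# Compare scan times as the average TEXT length grows.
for avg_len in (2_000, 5_000, 10_000, 20_000):
    print(f"{avg_len:>6} chars: {time_select(avg_len):.4f}s")
```

Timings from a toy run like this are noisy, but plotting elapsed time against `avg_len` is enough to see whether growth is linear or worse on a given engine.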
The amount of data stored in a database has a significant impact on its performance: it is generally accepted that queries become slower as the database grows.
Yes, extra data can slow down queries: it means fewer rows fit into each page, so more disk accesses are needed to read a given number of rows, and fewer rows can be cached in memory.
One reason could be that TEXT and BLOB fields are not stored alongside the other 'regular' fields, so the database engine actually needs to pull them from a different area of the disk.
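One quick way to see the cost of touching the large column is to time the same full scan with and without it. A rough sketch (SQLite in memory; the 10k-character payload and 5,000-row count are arbitrary assumptions, and SQLite's storage layout differs from engines that keep TEXT off-page, so treat this only as a qualitative comparison):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, flag INTEGER, body TEXT)")
conn.executemany("INSERT INTO t (flag, body) VALUES (1, ?)",
                 [("x" * 10_000,)] * 5_000)
conn.commit()

def timed(sql):
    # Run the query once and return wall-clock seconds.
    start = time.perf_counter()
    conn.execute(sql).fetchall()
    return time.perf_counter() - start

print("without TEXT column:", timed("SELECT id, flag FROM t"))
print("with TEXT column:   ", timed("SELECT id, flag, body FROM t"))
```

If the gap between the two timings widens as the payload grows, the cost is in materializing the large column, not in locating the rows.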
We'd need to see your query. Is it just a lookup by an ID field, or do you search within the TEXT field? In the latter case, as the average length of the stored text increases, so does the amount of data the database has to process for every row it examines.
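The distinction between those two query shapes shows up directly in a query plan. A small sketch using SQLite's `EXPLAIN QUERY PLAN` (the table and the search term `needle` are made up for illustration): a primary-key lookup reads one row, while a `LIKE '%...%'` search must scan every row's full TEXT value.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany("INSERT INTO t (body) VALUES (?)",
                 [("lorem ipsum " * 200,)] * 1_000)
conn.commit()

# Lookup by primary key: the plan reports an index SEARCH,
# so TEXT length of the other 999 rows is irrelevant.
print(conn.execute(
    "EXPLAIN QUERY PLAN SELECT body FROM t WHERE id = 42").fetchall())

# Substring search: the plan reports a full SCAN, so the engine
# reads and inspects every row's entire TEXT value.
print(conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM t WHERE body LIKE '%needle%'").fetchall())
```

So if the test query searches inside the TEXT column, total work scales with rows multiplied by average text length, which can make growth look much worse than linear once the data no longer fits in cache.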