
MySQL table becoming large

I have a table to which approximately 100,000 rows are added every day, and I use PHP to generate reports from it. Recently the script that produces these reports has been taking too long to complete. How can I improve performance? Should I shift from MySQL to something else that is more scalable in the long run?

user557348 asked Feb 27 '26


2 Answers

MySQL is very scalable, that's for sure.

The key is not switching from MySQL to another database; instead you should:

  1. Optimize your queries (this may sound obvious, but a huge improvement I made some time ago was changing SELECT * to selecting only the column(s) I actually need; it's a frequent issue I see in other people's code too).
  2. Optimize your table(s) design (normalization, etc.).
  3. Add indexes on the column(s) you use frequently in your queries.
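As a sketch of points 1 and 3, assuming a hypothetical `reports` table with `id`, `amount`, and `created_at` columns (your actual table and column names will differ):

```sql
-- Point 3: index the column your report queries filter on.
CREATE INDEX idx_reports_created_at ON reports (created_at);

-- Point 1: select only the columns you need instead of SELECT *,
-- and check with EXPLAIN that the index is actually being used.
EXPLAIN
SELECT id, amount, created_at
FROM reports
WHERE created_at >= '2026-02-01';
```

If EXPLAIN shows a full table scan (`type: ALL`) on a large table, the query is a likely culprit for the slowdown.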

Similar advice here.

Cristian Boariu answered Mar 01 '26


For generating reports or file downloads with large chunks of data, you should consider using flush() and increasing the execution time limit (`max_execution_time`) and `memory_limit`.
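A minimal sketch of raising those limits in the report script itself; the specific values here are illustrative, not recommendations:

```php
<?php
// Raise PHP's execution-time and memory limits for a long-running
// report script, then flush output as it is produced rather than
// buffering everything until the end.
set_time_limit(300);             // allow up to 5 minutes of run time
ini_set('memory_limit', '512M'); // raise the memory ceiling

echo "report header\n";
flush();                         // push buffered output to the client
```

These can also be set in php.ini or per-directory configuration instead of in the script.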

I doubt the problem lies in the number of rows, since MySQL can handle a lot of rows. But you can of course fetch x rows at a time and process them in chunks.

I do assume your MySQL server itself is properly tuned for performance.
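The chunked processing idea can be sketched like this. `processInChunks` is a hypothetical helper, not a standard API; `$fetchChunk` stands in for any callable that returns up to `$chunkSize` rows starting at `$offset` (for example, a wrapper around a `SELECT ... LIMIT ? OFFSET ?` query) and an empty array when the data is exhausted:

```php
<?php
// Process a large result set in fixed-size chunks instead of loading
// every row into memory at once. Returns the number of rows handled.
function processInChunks(callable $fetchChunk, callable $handleRow, int $chunkSize = 1000): int
{
    $offset = 0;
    $total  = 0;
    while (true) {
        $rows = $fetchChunk($offset, $chunkSize);
        if (empty($rows)) {
            break; // no more data
        }
        foreach ($rows as $row) {
            $handleRow($row);
            $total++;
        }
        $offset += count($rows);
        // Push any partial report output to the client so the page
        // streams as it is generated instead of timing out silently.
        flush();
    }
    return $total;
}
```

Each chunk is released from memory before the next one is fetched, so peak memory stays proportional to the chunk size rather than the table size.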

Wesley van Opdorp answered Mar 01 '26


