Large amount of inserts per second causing massive CPU load

I have a PHP script that, on every run, inserts a new row into a MySQL DB (with a relatively small amount of data). I get more than 20 requests per second, and this is causing my CPU to scream for help.

I'm using the SQL INSERT DELAYED method with a MyISAM engine (although I just noticed that INSERT DELAYED is not working with MyISAM).

My main concern is the CPU load, so I started looking for more CPU-friendly ways to store this data.

My first idea was to write this data to hourly log files and, once an hour, retrieve the data from the logs and insert it into the DB in one go.
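Something like this is what I have in mind (tab-separated log lines plus a PDO connection; the file path, the variables $userId and $action, and the table/column names are just placeholders, not my real schema):

<?php
// (1) In the request handler -- fast path, no MySQL connection at all:
// append one tab-separated line to the current hourly log file.
// FILE_APPEND | LOCK_EX keeps concurrent writers from interleaving lines.
$logFile = '/var/log/myapp/inserts-' . date('YmdH') . '.log';
file_put_contents($logFile,
    implode("\t", [$userId, $action, time()]) . "\n",
    FILE_APPEND | LOCK_EX);

// (2) In a separate hourly cron script: read the previous hour's log
// and insert it in chunks of 500 rows per statement.
$logFile = '/var/log/myapp/inserts-' . date('YmdH', time() - 3600) . '.log';
$lines   = file($logFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) ?: [];
$pdo     = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
foreach (array_chunk($lines, 500) as $chunk) {
    $placeholders = implode(',', array_fill(0, count($chunk), '(?, ?, ?)'));
    $stmt = $pdo->prepare("INSERT INTO events (user_id, action, ts) VALUES $placeholders");
    $params = [];
    foreach ($chunk as $line) {
        array_push($params, ...explode("\t", $line));
    }
    $stmt->execute($params);
}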

Maybe a better idea would be to use a NoSQL DB instead of log files and then, once an hour, insert the data from the NoSQL store into MySQL.
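For the NoSQL variant I was picturing something like a Redis list as the buffer (using the phpredis extension; the key name and row format here are made up):

<?php
// On each request: push the row onto a Redis list instead of touching MySQL.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->rPush('pending_inserts', json_encode(['user_id' => $userId, 'ts' => time()]));

// In the hourly job: drain the list, then batch-insert the decoded rows
// into MySQL the same way as with the log-file approach above.
$rows = [];
while (($raw = $redis->lPop('pending_inserts')) !== false) {
    $rows[] = json_decode($raw, true);
}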

I haven't tested any of these ideas yet, so I don't really know whether they will manage to decrease my CPU load. I wanted to ask if someone can help me find the right solution, the one with the lowest impact on my CPU.

asked Feb 22 '23 by jsbuster

1 Answer

I recently had a very similar problem, and my solution was simply to batch the requests. This sped things up about 50 times because of the reduced overhead of MySQL connections and the greatly decreased amount of reindexing. Storing the rows to a file and then running one larger statement (100-300 individual inserts) at once is probably a good idea. To speed things up even more, turn off indexing for the duration of the insert with

ALTER TABLE tablename DISABLE KEYS;
-- your batched insert statement here
ALTER TABLE tablename ENABLE KEYS;

Doing the batch insert will reduce the number of instances of the PHP script running, reduce the number of concurrently open MySQL handles (a large improvement), and decrease the amount of reindexing.
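For illustration, the whole sequence could look like this in PHP (table and column names are invented; note that DISABLE KEYS only pauses non-unique indexes, and only on MyISAM tables):

<?php
// Batched insert with indexing paused around it. One statement carrying
// a few hundred rows replaces a few hundred separate connections/inserts.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->exec('ALTER TABLE events DISABLE KEYS');
$pdo->exec("INSERT INTO events (user_id, action, ts) VALUES
    (1, 'click', 1677060000),
    (2, 'view',  1677060001),
    (3, 'click', 1677060002)");
$pdo->exec('ALTER TABLE events ENABLE KEYS');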

answered Feb 24 '23 by hackartist