Is breaking up large MySQL queries beneficial?

Tags:

mysql

I have a MySQL table with about 5.5 million rows of data. Every month I need to reload this data from a new data file. Since it takes a while to remove the old data, I'm adding `LIMIT 2000000` to split the job into chunks. Example:

DELETE FROM `list_content` WHERE list_id = 3 LIMIT 2000000

My theory is that memory might be released after each query finishes, so splitting the work into chunks like this might help avoid consuming resources. However, I haven't found anything that supports this theory. Is there any benefit to splitting up a query like this instead of just letting it run for 20 minutes?
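For illustration, the chunked approach boils down to repeating the `LIMIT`-ed `DELETE` until no rows remain. Here is a minimal sketch in Python, using the standard-library `sqlite3` module as a stand-in database (the table and chunk size are scaled-down stand-ins; SQLite lacks `DELETE ... LIMIT` by default, so the sketch selects rowids in a subquery, whereas against MySQL you would issue the `DELETE ... LIMIT` statement directly through your driver):

```python
import sqlite3

# Stand-in table mimicking list_content (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE list_content (list_id INTEGER, value TEXT)")
conn.executemany(
    "INSERT INTO list_content VALUES (?, ?)",
    [(3, f"row{i}") for i in range(10_000)] + [(4, "keep")],
)
conn.commit()

CHUNK = 2_000  # stand-in for the 2,000,000-row chunks in the question

while True:
    # Delete one chunk of matching rows. The MySQL equivalent is simply:
    #   DELETE FROM list_content WHERE list_id = 3 LIMIT 2000000
    cur = conn.execute(
        "DELETE FROM list_content WHERE rowid IN "
        "(SELECT rowid FROM list_content WHERE list_id = 3 LIMIT ?)",
        (CHUNK,),
    )
    conn.commit()  # each chunk commits separately
    if cur.rowcount == 0:
        break  # nothing left to delete

remaining = conn.execute("SELECT COUNT(*) FROM list_content").fetchone()[0]
print(remaining)  # only the list_id = 4 row survives
```

Note that chunking does change one thing: each chunk commits as its own transaction, so locks and undo space are released between chunks. Whether that matters for a once-a-month batch job is the question the answer below addresses.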

Asked by khartnett, Nov 12 '22
1 Answer

To answer your question directly: there is no benefit. Assuming you aren't hitting any physical limits (such as the maximum amount of memory your OS supports), and given that MySQL is a well-written program, splitting the query gains you nothing.

A couple of things you might consider instead:

(1)

DELETE QUICK FROM `list_content` WHERE list_id = 3

followed by a

OPTIMIZE TABLE `list_content`

DELETE QUICK will not perform any house-keeping on the index blocks as it deletes. You can then do all the house-keeping at once with the OPTIMIZE TABLE statement.

(2)

If you're wiping out the whole table, then load the new data into a new table instead (call it `list_temp`) and swap it in:

DROP TABLE `list_content`; -- very quick since this is simply a file delete op
RENAME TABLE `list_temp` TO `list_content`; -- also quick since this is a file rename op
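The load-then-swap pattern can be sketched end to end; again Python's standard-library `sqlite3` stands in for MySQL (the table names and data are illustrative, and SQLite spells the rename `ALTER TABLE ... RENAME TO` where MySQL uses `RENAME TABLE`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Old table holding last month's data.
conn.execute("CREATE TABLE list_content (list_id INTEGER, value TEXT)")
conn.execute("INSERT INTO list_content VALUES (3, 'old')")

# Load the new data file into a fresh table while the old one stays live.
conn.execute("CREATE TABLE list_temp (list_id INTEGER, value TEXT)")
conn.execute("INSERT INTO list_temp VALUES (3, 'new')")

# The swap: dropping and renaming are file-level operations, so they are
# fast regardless of row count. MySQL equivalent:
#   DROP TABLE `list_content`;
#   RENAME TABLE `list_temp` TO `list_content`;
conn.execute("DROP TABLE list_content")
conn.execute("ALTER TABLE list_temp RENAME TO list_content")

row = conn.execute("SELECT value FROM list_content").fetchone()
print(row[0])
```

The appeal of this approach is that the expensive part (loading the new file) happens off to the side, and the visible cutover is just the two cheap statements at the end.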
Answered by Andy Jones, Jan 04 '23