 

MySQL archive data...what to do when it's too big

I use an INSERT INTO & DELETE FROM combination in a PHP script to take data out of an operational MySQL table and put it into an archive table.
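For reference, a minimal sketch of that archiving pattern, assuming hypothetical names operational_table, archive_table, and a created_at cutoff column (the real script's table names and criteria will differ):

-- Copy rows older than the cutoff into the archive table...
INSERT INTO archive_table
SELECT * FROM operational_table
WHERE created_at < '2011-01-01';

-- ...then remove the same rows from the operational table.
DELETE FROM operational_table
WHERE created_at < '2011-01-01';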

The archive table has gotten too big. Even though no day-to-day operations are performed on it, mysqldump chokes when we back up (error 2013):

Error 2013: Lost connection to MySQL server during query when dumping table 'some_table' at row: 1915554

What can I do? Should my PHP script move the data to another database (and if so, how)? Or is it okay to keep the large table in the operational DB? In that case, how do I get around the mysqldump issue?

Thanks!

asked Nov 24 '11 by Kyle Cureau

1 Answer

Are you by chance dumping using memory buffering and running out of swap and physical RAM? If so, you can try dumping row by row instead.

Try adding --quick to your mysqldump command.

According to the documentation, you should combine --single-transaction with --quick.
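A sketch of what that looks like, assuming a database named mydb, the archive table some_table, and a backup user (substitute your own names and credentials):

mysqldump --quick --single-transaction -u backup_user -p mydb some_table > some_table.sql

--quick makes mysqldump retrieve and write rows one at a time instead of buffering the whole table in memory, and --single-transaction keeps the dump consistent for InnoDB tables without locking them.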

Source: http://dev.mysql.com/doc/refman/5.5/en/mysqldump.html

answered Sep 23 '22 by Will Bickford