I'm working on setting up a regular dump of our database. I'm using this script to create the backup and running it through a regular cron job. We end up with a text file as well as an emailed archive of everything.
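For context, the setup is roughly along these lines (this is just a sketch, not the actual script; the database name, paths, credentials, and mail address are placeholders):

    #!/bin/sh
    # Nightly dump: write a plain-text SQL file, then mail a copy of it.
    DATE=$(date +%F)
    OUTFILE=/var/backups/mydb-$DATE.sql
    mysqldump --user=backup --password=secret mydb > "$OUTFILE"
    mail -s "mydb backup $DATE" admin@example.com < "$OUTFILE"

with a crontab entry such as:

    # run the backup script at 02:30 every night
    30 2 * * * /usr/local/bin/db-backup.sh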
The problem we've encountered is the size of two of our tables. They each have 60k fields and grow daily. I'm thinking incremental backups are the best solution for backup, but if it ever came to restoring them, it would be a huge project.
My question is a two-parter:
a) Is there a more straightforward way to back up huge tables on a daily basis and, if not,
b) Is there an easy way to restore a backup from daily/weekly incremental backups?
Thanks!
You may wish to check out Maatkit. It's a collection of Perl scripts, one of which is mk-parallel-dump, which spawns multiple copies of mysqldump (by default, one per CPU in the machine), allowing the dump to go much faster. You can set this up in a cron job as well, like Daniel suggested.
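A rough idea of what the cron entry could look like; the option names and connection flags here are from memory and the paths and credentials are placeholders, so check them against the mk-parallel-dump documentation before relying on them:

    # Hypothetical crontab entry: parallel dump of all databases at 01:00 every night.
    # --basedir is where the dump files are written; verify against the Maatkit docs.
    0 1 * * * /usr/bin/mk-parallel-dump --user backup --password secret --basedir /var/backups/mysql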