I'm trying to create a cron job for database backup.
This is what I have so far:
mysqldump.sh
mysqldump -u root -ptest --all-databases | gzip > "/db-backup/backup/backup-$(date)" 2> dump.log
echo "Finished mysqldump $(date)" >> dump.log
Cron job:
32 18 * * * /db-backup/mysqldump.sh
The problem I am having is that the job does not execute through cron, or when I run the script from outside its directory.
Can someone please advise? Are my paths incorrect?
Also, the following line I'm not sure will output errors to the dump.log:
mysqldump -u root -ptest --all-databases | gzip > "/db-backup/backup/backup-$(date)" 2> dump.log
What worked:
mysqldump -u root -ptest --all-databases | gzip > "../db-backup/backup/backup-$(date).sql.gz" 2> ../db-backup/dump.log
echo "Finished mysqldump $(date)" >> ../db-backup/dump.log
Mysqldump will attempt to dump all the triggers in your databases by default. To dump a table's triggers, you must have the TRIGGER privilege for that table.
Also by default, mysqldump locks all the tables it is about to dump. This ensures the data is in a consistent state during the dump.
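If the backup user lacks the TRIGGER privilege, or the default locking is a problem, both behaviours can be switched off. A minimal sketch (the flags are standard mysqldump options; the credentials and backup path are the ones from the question, and the date format is my own choice):

```shell
# --skip-triggers: don't dump triggers (so no TRIGGER privilege is needed).
# --single-transaction: dump InnoDB tables from a consistent snapshot
#   instead of taking the default table locks.
/usr/bin/mysqldump -u root -ptest \
    --skip-triggers \
    --single-transaction \
    --all-databases | gzip > "/db-backup/backup/backup-$(date +%F).sql.gz"
```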
There are a couple of things you can check, though more information is always helpful (permissions and location of the file, the entire file contents, etc.).
Begin your mysqldump.sh file with a shebang line appropriate for your environment. I would venture to guess #!/bin/bash would be sufficient.
For mysqldump -u ...., use the absolute path /usr/bin/mysqldump (or wherever it is on your system). Absolute paths are always a good idea in any form of scripting, since it's difficult to say whether the user (or cron) has the same environment as you do.
As for storing the errors in dump.log, I don't believe your syntax is correct. I'm fairly sure you're piping the errors from gzip into dump.log, not the errors from mysqldump. This is a fairly common question, and it arrives at the answer of redirecting mysqldump's stderr before the pipe: mysqldump $PARAMS 2> dump.log | gzip > dump-$(date).sql.gz
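You can see the redirection behaviour without a database at all. In this sketch, fake_dump is a hypothetical stand-in for mysqldump that writes to both stdout and stderr:

```shell
# Stand-in for mysqldump: one line to stdout, one error line to stderr.
fake_dump() { echo "row"; echo "boom" >&2; }

# Question's form: 2> applies to gzip, so fake_dump's "boom" is NOT captured.
fake_dump | gzip > /tmp/wrong.gz 2> /tmp/wrong.log

# Corrected form: redirect fake_dump's stderr *before* the pipe.
fake_dump 2> /tmp/right.log | gzip > /tmp/right.gz

cat /tmp/right.log   # → boom
```

The same ordering applies to the real command: put 2> dump.log immediately after the mysqldump arguments, before the | gzip.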