I maintain a couple of low-traffic sites that have a fair amount of user-uploaded media files and fairly large databases. My goal is to back up all the data that is not under version control in a central place.
At the moment I use a nightly cronjob that runs dumpdata to dump all the DB content into JSON files in a subdirectory of the project. The media uploads are already in the project directory (in media).
After the DB is dumped, the files are copied with rdiff-backup (which makes an incremental backup) to another location. I then download the rdiff-backup directory on a regular basis with rsync to keep a local copy.
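For reference, a minimal sketch of that nightly job as a Python script run from cron; the project path, settings module, and backup destinations below are assumptions, not my actual setup:

```python
#!/usr/bin/env python
# Nightly backup sketch for the workflow described above.
# Run from cron, e.g.:  0 3 * * * /srv/myproject/scripts/nightly_backup.py
# All paths and the settings module are placeholders -- adjust for your project.
import os
import subprocess

import django
from django.core.management import call_command

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()

PROJECT_DIR = "/srv/myproject"                    # project root (contains media/)
DUMP_DIR = os.path.join(PROJECT_DIR, "backups")   # subdirectory for the JSON dumps
RDIFF_TARGET = "/var/backups/myproject"           # rdiff-backup destination

os.makedirs(DUMP_DIR, exist_ok=True)

# 1. Dump all DB content into a JSON fixture inside the project.
with open(os.path.join(DUMP_DIR, "db.json"), "w") as fixture:
    call_command("dumpdata", indent=2, stdout=fixture)

# 2. Copy the project (dumps + media uploads) incrementally with rdiff-backup.
subprocess.check_call(["rdiff-backup", PROJECT_DIR, RDIFF_TARGET])

# The rdiff-backup directory is then pulled to a local machine with rsync,
# e.g.:  rsync -az server:/var/backups/myproject/ ~/backups/myproject/
```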
What do you use to back up your data? Please post your backup solution, whether you only get a few hits per day on your site or you maintain a high-traffic one with sharded databases and multiple fileservers :)
Thanks for your input.
Recently I found a solution called Django-Backup, and it has worked for me. You can even combine the task of backing up the database or media files with a cronjob.
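A minimal sketch of wiring it up to cron, assuming the package exposes a `backup` management command; the command name, settings module, and schedule are placeholders, so check the package's own README for the exact command and options:

```python
# nightly_django_backup.py
# Run from cron, e.g.:  0 2 * * * python /srv/myproject/scripts/nightly_django_backup.py
# Assumes Django-Backup provides a "backup" management command; verify the name
# and flags against the version you installed.
import os

import django
from django.core.management import call_command

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")  # assumed settings module
django.setup()

call_command("backup")  # placeholder command name from the Django-Backup package
```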
Regards,
My backup solution works as follows:

1. Every night, dump the data to a separate directory. I prefer to keep the data dump directory distinct from the project directory (one reason being that the project directory changes with every code deployment).
2. Run a job to upload the data to my Amazon S3 account, and to another location using rsync.
3. Send me an email with the log.

A rough sketch of these steps is shown below.
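This sketch assumes a dumpdata-style JSON dump, uses the AWS CLI (`aws s3 sync`) for the S3 upload, and mails the log via a local SMTP server; the bucket, paths, hosts, and addresses are placeholders:

```python
#!/usr/bin/env python
# Nightly backup sketch: dump, upload to S3 and a second host, then mail the log.
# Bucket, paths, hosts, and email addresses below are placeholders.
import os
import smtplib
import subprocess
from email.message import EmailMessage

import django
from django.core.management import call_command

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()

DUMP_DIR = "/var/backups/myproject"              # kept outside the project directory
S3_BUCKET = "s3://my-backup-bucket/myproject/"
MIRROR = "backup@otherhost:/srv/backups/myproject/"

os.makedirs(DUMP_DIR, exist_ok=True)
log_lines = []

# 1. Dump the data to the separate backup directory.
with open(os.path.join(DUMP_DIR, "db.json"), "w") as fixture:
    call_command("dumpdata", indent=2, stdout=fixture)
log_lines.append("dumpdata finished")

# 2. Upload to S3 (AWS CLI) and mirror to another location with rsync.
for cmd in (["aws", "s3", "sync", DUMP_DIR, S3_BUCKET],
            ["rsync", "-az", DUMP_DIR + "/", MIRROR]):
    result = subprocess.run(cmd, capture_output=True, text=True)
    log_lines.append(f"{' '.join(cmd)} -> exit {result.returncode}\n"
                     f"{result.stdout}{result.stderr}")

# 3. Email the log.
msg = EmailMessage()
msg["Subject"] = "Nightly backup log"
msg["From"] = "backup@example.com"
msg["To"] = "me@example.com"
msg.set_content("\n".join(log_lines))
with smtplib.SMTP("localhost") as smtp:
    smtp.send_message(msg)
```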
To restore a backup locally, I use a script that downloads the data from S3 and loads it into the local database.
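A corresponding restore sketch, again assuming the dump is a Django JSON fixture and using `aws s3 sync` to pull it down; names and paths are placeholders:

```python
#!/usr/bin/env python
# Restore sketch: pull the latest dump from S3 and load it into the local database.
# Bucket and paths are placeholders.
import os
import subprocess

import django
from django.core.management import call_command

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()

S3_BUCKET = "s3://my-backup-bucket/myproject/"
RESTORE_DIR = "/tmp/myproject-restore"

os.makedirs(RESTORE_DIR, exist_ok=True)

# Download the backup files from S3.
subprocess.check_call(["aws", "s3", "sync", S3_BUCKET, RESTORE_DIR])

# Load the JSON fixture into the local database.
call_command("loaddata", os.path.join(RESTORE_DIR, "db.json"))
```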