I'm unsure whether this question belongs in the Database Administrators' section or here, so please advise if I got it wrong.
I have a Django-based website which doesn't change much. I use `python manage.py dumpdata --all --indent=2 > backup.json` and reload the data with `loaddata` if I need to redeploy or the db gets corrupted. (I'm aware of the integrity errors that occur when not excluding `auth` and `contenttypes`.)
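For reference, this is roughly the workflow I mean, with the usual exclusions applied (the `--natural-foreign` flag and the exact exclusions are my reading of the common advice, not something set in stone):

```bash
# Dump app data while skipping the rows that trigger IntegrityErrors on reload;
# --natural-foreign serializes foreign keys to auth/contenttypes by natural key.
python manage.py dumpdata --natural-foreign \
    --exclude auth.permission --exclude contenttypes \
    --indent=2 > backup.json

# On a fresh deployment: create the schema first, then load the data.
python manage.py migrate
python manage.py loaddata backup.json
```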
Since I'm using PostgreSQL on the backend, is it "best practice" or "wiser" for me to use `pg_dump` instead, and then `pg_restore` if something goes wrong or if I need to redeploy?
So `dumpdata` dumps all data associated with the selected apps (and/or models), while `pg_dump` performs a full dump of the db. Is this the same thing, or is there a fundamental difference that I've missed (mind you, I have zero DBA experience)? Which option should I go for, and why?
It is both best practice and wiser for you to use `pg_dump` instead of `dumpdata`. `pg_dump` is faster, its output is more compact (particularly with the `-Fc` custom format), and it can be loaded faster with `pg_restore` than `loaddata` can load a JSON dump. Last but not least, the integrity errors you speak of will not happen with `pg_dump`/`pg_restore`, because the database is restored wholesale at the SQL level rather than re-inserted row by row through the ORM.
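A minimal sketch of that approach (the database name, user, and file names here are placeholders):

```bash
# Custom-format dump: compressed, and pg_restore can selectively
# restore objects from it or run the restore in parallel.
pg_dump -Fc -U myuser -d mydb -f backup.dump

# Restore into the target database; --clean drops existing objects first,
# and --if-exists suppresses errors for objects that aren't there yet.
pg_restore --clean --if-exists -U myuser -d mydb backup.dump
```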
Generally `pg_dump` is used to dump the entire database; however, the `-t` option allows you to dump one or a few tables at a time.
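For example (the table and database names are placeholders):

```bash
# Dump only the "blog_post" table; repeat -t to include additional tables.
pg_dump -Fc -U myuser -d mydb -t blog_post -f blog_post.dump
```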