I've been trying to dump a relatively small amount of data (around 80 rows of django-cms text plugin content1) remotely via the Heroku Toolbelt:
heroku run python manage.py dumpdata text
But I get randomly truncated output, and successive runs get closer to the full dump each time (caching, presumably?):
11:09 PM $> heroku run python manage.py dumpdata text | wc -c
108351
11:09 PM $> !!
120629
11:09 PM $> !!
122693
11:10 PM $> !!
122949
11:10 PM $> !!
153419
11:13 PM $> !!
120877
Has anyone run into something similar? I'm using Django 1.4 with PostgreSQL.
1 Although the rows are blobs of HTML o_0: see docs.
Edit: should I assume this is just a limitation? pg_dump/pg_restore was my backup plan.
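Since dumpdata emits JSON by default, a cheap way to tell whether a given run actually completed is to check that the captured output parses; a truncated stream won't. This is just a sketch, and `text_dump.json` is a placeholder for wherever you redirected the output:

```shell
# A truncated dump is invalid JSON, so parsing it is a quick
# completeness check (text_dump.json is a placeholder name).
python -c "import json, sys; json.load(open(sys.argv[1]))" text_dump.json \
  && echo "dump parses: transfer complete" \
  || echo "dump truncated: re-run the command"
```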
Yet another workaround is to append a sleep command so the session doesn't time out before the output finishes streaming:
heroku run "python manage.py dumpdata; sleep 10"
Presumably the sleep duration needs to grow along with your database...
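Combined with redirecting the stream to a local file, repeated runs become easy to compare with `wc -c`, as in the question. The file name and the 10-second figure here are guesses to tune, not tested values:

```shell
# Keep the one-off dyno alive while buffered output flushes, and
# save the stream locally so runs can be compared byte-for-byte.
# (10 seconds is a guess; a bigger database may need more.)
heroku run "python manage.py dumpdata text; sleep 10" > text_dump.json
```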
It looks like the command times out for some reason. This is either a bug or a "feature" on Heroku's part. One workaround is to export the database with Postgres tooling instead, as described here:
https://devcenter.heroku.com/articles/heroku-postgres-import-export
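For the pg_dump route, the linked article boils down to roughly the following with a current Heroku CLI (older toolbelts exposed this through the pgbackups addon instead; `mydb` is a placeholder for your local database, and `latest.dump` is the file the download command writes by default):

```shell
# Capture a backup on Heroku's side, then download it locally.
heroku pg:backups:capture
heroku pg:backups:download
# Restore into a local Postgres database (database name is a placeholder).
pg_restore --verbose --clean --no-acl --no-owner -h localhost -d mydb latest.dump
```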
Another, simpler workaround is to run dumpdata from within a Heroku bash session:
heroku run bash
python manage.py dumpdata ...
Then capture the output from your terminal. Copy and paste worked for me; I'm sure there's a fancier way to do it.
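One less fragile variant of the copy/paste step, sketched under the assumption that you're inside the shell opened by `heroku run bash` (so this can't run anywhere else; `text_dump.json` is again a placeholder name):

```shell
# Run these inside the shell opened by `heroku run bash`.
python manage.py dumpdata text > text_dump.json   # dump to disk on the dyno first
wc -c text_dump.json    # a stable byte count across runs suggests a complete dump
base64 text_dump.json   # paste-safe encoding to copy out of the terminal
```

Decoding the pasted text locally with `base64 -d` then recovers the original JSON.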