I have the initial data from my old database, which takes around 6GB. I could "dumpdata" my old database without any problem, but when I attempted to restore it to the new database I got a MemoryError:
python manage.py loaddata fixtures/initial_data.json
MemoryError: Problem installing fixture 'fixtures/initial_data.json':
Is there any way to make loaddata work with chunks, or is it otherwise possible to load a file that big?
I've written this script, which is a fork of Django's dumpdata, but it dumps data in chunks to avoid the MemoryError. You can then load these chunks one by one.
The script is available at https://github.com/fastinetserver/django-dumpdata-chunks
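For reference, the core idea is roughly the following (a minimal sketch using Django's public serialization API, not the actual script; MyModel, the folder name, and CHUNK_SIZE are placeholders):

# Sketch of the chunked-dump idea: serialize a large queryset slice by
# slice so only one chunk is held in memory at a time.
import os
from django.core import serializers
from myapp.models import MyModel  # placeholder app and model

CHUNK_SIZE = 100000
os.makedirs("some-folder", exist_ok=True)

qs = MyModel.objects.order_by("pk")
total = qs.count()
for i, start in enumerate(range(0, total, CHUNK_SIZE)):
    with open("some-folder/%04d_%d.json" % (i, start), "w") as out:
        # serialize() can write straight to a file-like stream
        serializers.serialize("json", qs[start:start + CHUNK_SIZE], stream=out)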
Example usage:
1) Dump data into many files:
mkdir some-folder
./manage.py dumpdata_chunks your-app-name
--output-folder=./some-folder --max-records-per-chunk=100000
2) Load data from the folder:
find ./some-folder | egrep -o "([0-9]+_[0-9]+)" | xargs ./manage.py loaddata
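If you prefer to drive the load step from Python instead of the shell, something like this works too (a sketch, assuming the chunk files live in ./some-folder as in the dump step above):

# Load the chunk fixtures one at a time via Django's loaddata command,
# so each call only has to hold a single chunk in memory.
import glob
from django.core.management import call_command

for path in sorted(glob.glob("some-folder/*.json")):
    call_command("loaddata", path)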
P.S. I used it to move data from PostgreSQL to MySQL.