 

MemoryError while loading huge initial data

I have initial data from my old database, which comes to around 6GB. I could "dumpdata" the old database without any problem, but when I attempted to restore it to the new database, I got a MemoryError:

    python manage.py loaddata fixtures/initial_data.json
    MemoryError: Problem installing fixture 'fixtures/initial_data.json': 

Is there any way to make loaddata work with chunks, or is it otherwise possible to load a file that big?

asked Aug 31 '13 by cem

1 Answer

I wrote this script, which is a fork of Django's dumpdata command, but it dumps the data in chunks to avoid a MemoryError. You can then load those chunks one by one.

Script is available at https://github.com/fastinetserver/django-dumpdata-chunks
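The idea is simply to serialize records in batches and write each batch to its own fixture file. Below is a minimal sketch of that approach using Django's own serializers; it is not the actual dumpdata_chunks code, and the settings module, app label and file-naming scheme are assumptions for illustration:

    # chunked_dump.py -- a standalone sketch of the chunked-dump idea
    # (not the actual dumpdata_chunks implementation).
    import os

    import django

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")  # hypothetical settings module
    django.setup()

    from django.apps import apps
    from django.core import serializers

    MAX_RECORDS_PER_CHUNK = 100000
    OUTPUT_FOLDER = "./some-folder"

    def write_chunk(objects, chunk_no):
        # digits_digits names so the egrep pattern in step 2 below still matches
        path = os.path.join(OUTPUT_FOLDER, "%07d_%07d.json" % (chunk_no, chunk_no))
        with open(path, "w") as fh:
            serializers.serialize("json", objects, stream=fh)

    def dump_in_chunks(app_label):
        os.makedirs(OUTPUT_FOLDER, exist_ok=True)
        chunk_no = 0
        for model in apps.get_app_config(app_label).get_models():
            batch = []
            # iterator() streams rows from the database instead of caching
            # the whole queryset in memory
            for obj in model.objects.order_by("pk").iterator():
                batch.append(obj)
                if len(batch) >= MAX_RECORDS_PER_CHUNK:
                    write_chunk(batch, chunk_no)
                    chunk_no += 1
                    batch = []
            if batch:
                write_chunk(batch, chunk_no)
                chunk_no += 1

    if __name__ == "__main__":
        dump_in_chunks("your-app-name")  # placeholder app label from the usage example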

Example usage:

1) Dump data into many files:

    mkdir some-folder

    ./manage.py dumpdata_chunks your-app-name \
        --output-folder=./some-folder --max-records-per-chunk=100000

2) Load data from the folder:

    find ./some-folder | egrep -o "([0-9]+_[0-9]+)" | xargs ./manage.py loaddata
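The pipeline above passes bare fixture names, so loaddata has to be able to locate them (for example via FIXTURE_DIRS). If you would rather pass full paths, here is a minimal sketch that loads the chunk files in order, assuming they were written to ./some-folder as .json files (the settings module is again hypothetical):

    # load_chunks.py -- a minimal sketch for loading the chunk fixtures in order.
    import glob
    import os

    import django

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")  # hypothetical settings module
    django.setup()

    from django.core.management import call_command

    # sort so chunks load in the order they were dumped
    for path in sorted(glob.glob("./some-folder/*.json")):
        print("loading", path)
        call_command("loaddata", path)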

P.S. I used it to move data from PostgreSQL to MySQL.

answered Nov 02 '22 by Kostyantyn