I'm trying to create a very large image (25000x25000) by pasting together many smaller images. When I call Image.new() with such large dimensions, Python runs out of memory and I get a MemoryError.
Is there a way to write out an image like this incrementally, without having the whole thing resident in RAM?
EDIT:
Using ImageMagick's montage command, it seems possible to create arbitrarily sized images. It looks like it's not loading the final image into RAM (it uses very little memory during the process) but rather streaming it out to disk, which is ideal.
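For reference, a sketch of how that invocation might look, driven from Python; the tile names and grid size here are made-up placeholders, and it assumes ImageMagick's montage is on the PATH:

```python
import subprocess

# Tile a 5x5 grid of same-sized PNGs into one large output image.
# Tile filenames (tile_00.png ... tile_24.png) are hypothetical.
subprocess.run(
    [
        "montage",
        *[f"tile_{i:02d}.png" for i in range(25)],
        "-tile", "5x5",        # grid layout: 5 columns x 5 rows
        "-geometry", "+0+0",   # no padding or scaling between tiles
        "out.png",
    ],
    check=True,
)
```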
You may try the GDAL library, which provides Python bindings. There is a combined tutorial presenting how to read and write images using the C++, C, and Python APIs. Depending on the operations and functions being used, GDAL can handle very large images and process images that are too large to be held in RAM.
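A rough sketch of block-wise writing with GDAL's Python bindings; the file name, sizes, and tile layout are assumptions for illustration, and each block would come from one of your source images rather than the dummy array used here:

```python
import numpy as np
from osgeo import gdal

TILE = 500
SIZE = 25000  # final image is 25000x25000

# Create the output file on disk up front; pixels are written in pieces,
# so the full image never has to fit in RAM.
driver = gdal.GetDriverByName("GTiff")
ds = driver.Create("mosaic.tif", SIZE, SIZE, 3, gdal.GDT_Byte)

for y in range(0, SIZE, TILE):
    for x in range(0, SIZE, TILE):
        # Placeholder data; in practice, load the source tile here.
        block = np.zeros((TILE, TILE), dtype=np.uint8)
        for band in range(1, 4):
            ds.GetRasterBand(band).WriteArray(block, xoff=x, yoff=y)

ds.FlushCache()
ds = None  # closing the dataset finalizes the file
```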
Not too surprising that you're running out of memory; that image would take over 2 GB in memory, and depending on the system you're using, your OS might not be able to allocate that much virtual memory to Python, regardless of your actual RAM.
You are definitely going to need to write it out incrementally. If you're using a raw format, you could probably do this one row of tiles at a time, provided they all have the same dimensions (see the sketch below). Then you could concatenate the files; otherwise you'd have to be more careful with how you encode the data.
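A sketch of the per-row approach with PIL/Pillow (the tile naming scheme and sizes are assumptions): only one full-width strip is resident in RAM at a time, and each strip's raw RGB bytes are appended to the output file. Note that tobytes() is the Pillow name; the original PIL called it tostring().

```python
from PIL import Image

TILE_W = TILE_H = 500
COLS = ROWS = 50  # 50 x 50 tiles of 500x500 = 25000x25000

with open("mosaic.raw", "wb") as out:
    for row in range(ROWS):
        # Build one horizontal strip of tiles in memory.
        strip = Image.new("RGB", (COLS * TILE_W, TILE_H))
        for col in range(COLS):
            tile = Image.open(f"tile_{row:02d}_{col:02d}.png")
            strip.paste(tile, (col * TILE_W, 0))
        # Append the strip; successive strips concatenate into one
        # 25000x25000 raw RGB image on disk.
        out.write(strip.tobytes())
```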