How can I configure ImageMagick to not use any temporary files?

I'm trying to run ImageMagick without touching the filesystem: image data is read from memory/the network, and written back to a socket as a blob.

However, ImageMagick continually tries to write temp files, which either fill up my test system (due to aborted/failed tests) or cause problems on systems with extremely slow disks. It's a weird constraint, but many of my conversion hosts are embedded-like systems with block devices that are extremely slow to respond to any operation, even stat()s.

Questions:

  1. Is there a way to configure ImageMagick to not touch the disk during image processing? Assume that all required modules that ImageMagick will use have already been loaded, and that none of the ImageMagick functionality that farms processing out to filesystem-touching subprocesses will be used.

  2. What are the side effects of doing this? I'm OK with processing that won't fit in memory failing, rather than falling back to the disks.

I'm converting using the C++ or Perl ImageMagick APIs, not the convert utility. If another binding has support for this, though, I'm fine switching platforms.

What I've Tried

  • I've set MAGICK_TEMPORARY_PATH to various places, including /dev/null on POSIX. /dev/null sometimes seems to work, but my target systems aren't POSIX, so I'm not sure I can count on it. Rather than fooling the temp-file management system, I'd prefer something I can trust to remove the need for temp files in the first place.
  • I've used the registry:temporary-path option, with similar results.
  • Setting the various temporary-path options to nonexistent/unusable locations (e.g. /dev/null or a non-writable directory) often results in temporary files being created in the directory from which my program is launched. If that directory is non-writable, I have seen temp files created in the system /tmp directory even when the code was told not to use it. This seems to happen in exceptional cases (e.g. segfaulting/OOMing/kill -9'd situations), so I gather that files may be created in these places normally but are usually cleaned up.
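Concretely, the attempts above amount to something like the following shell sketch (the script name is illustrative, not a real file):

```shell
# Attempt: point ImageMagick's temp-file location somewhere unusable.
export MAGICK_TEMPORARY_PATH=/dev/null   # or a non-writable directory

# Run the conversion (script name is hypothetical). On crashes
# (segfault/OOM/kill -9), stray temp files can still show up in the
# launch directory or in the system /tmp.
perl ./convert-script.pl
```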
Asked by Zac B on Aug 29 '16.
1 Answer

The trick is to set MAGICK_DISK_LIMIT=0 and MAGICK_AREA_LIMIT/MAGICK_MEMORY_LIMIT to a value larger than the maximum memory required by the program.

#!/usr/bin/perl
use strict;
use warnings;

# The limits must be in the environment before Image::Magick loads:
# a disk limit of 0 forbids spilling pixel data to temp files, while
# the area/memory limits cap how much RAM ImageMagick may use.
BEGIN {
    $ENV{'MAGICK_DISK_LIMIT'  } = '0';
    $ENV{'MAGICK_AREA_LIMIT'  } = '1Mb';
    $ENV{'MAGICK_MEMORY_LIMIT'} = '1Mb';
}

use Image::Magick;

my $img = Image::Magick->new();
my $err = $img->Read('in.png');
die $err if $err;

If ImageMagick exceeds this memory limit, it will die with a message like:

Exception 450: Memory allocation failed `in.png' @ error/png.c/MagickPNGErrorHandler/1645 at ./imagemem.pl line 11.

You can view ImageMagick's default limits by running identify -list resource.
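For example, to inspect the limits and to set the same variables from the shell instead of a BEGIN block (the limit values are illustrative, and the script name is assumed):

```shell
# Show the resource limits ImageMagick is currently using
# (area, disk, memory, map, etc.; values vary per host and build).
identify -list resource

# Equivalent to the BEGIN block above: export the limits before
# launching the script, so they are set when Image::Magick loads.
export MAGICK_DISK_LIMIT=0
export MAGICK_AREA_LIMIT=1Mb
export MAGICK_MEMORY_LIMIT=1Mb
perl ./imagemem.pl
```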

Answered by Finbar Crago on Nov 05 '22.