PHP zip archive memory, RAM and max file size

I have folders of user-generated images that get rather large. When I try to zip them, I start getting errors once the zip file reaches about 1.5 GB.

My questions have to do with memory: I think PHP is holding both the open zip and all the images in memory. A possible solution is to add each image to the zip file at upload time, in addition to saving it in the appropriate folder; this seems to work so far. But what are the limits on zip file size? From what I can find it is around 4 GB. How does this affect or interact with the amount of RAM on the server? And is the limit actually 4 GB, or can I keep adding files to the zip indefinitely, or should the script check the size of the zip and, if it is over X GB, rename it and create a new zip? I have searched Google and read the documentation but have found conflicting or incomplete information, so I'm looking for some definitive answers and advice. Thanks

asked Apr 20 '13 by quick_learner42


2 Answers

In case you still want to use PHP's ZipArchive, there are a few things you can do to work around certain server/OS limitations:

  • Memory Although it might seem obvious in your case, I have seen many examples of how to use ZipArchive that call addFromString to add a new file to the archive. DON'T! addFromString loads the file's entire content into memory, which will make you run out of memory fast; use addFile instead. Also make sure that you free any memory you no longer need.

  • Execution time Increase the maximum execution time for your script, either via php.ini or with ini_set() (e.g. ini_set('max_execution_time', 600); for a maximum execution time of 10 minutes).

  • File Handles Some operating systems limit the number of open files, which can become a problem because PHP only writes the files into the zip once you close the archive. To avoid hitting that limit, close and reopen the zip file every x files (e.g. every 1,000); this forces PHP to compress and write the files already assigned to the archive (see the sketch after this list).

  • File Size The OS may impose file size limits, and a bigger archive also means PHP needs more memory to manage it, so I personally prefer to set a maximum file size after which I simply open a new zip file with an index number. If the exact file size does not matter to you, you can count the size of the files going into the archive and switch once you reach a certain limit, or you can close the archive every x files and check its size on disk to decide whether to start a new archive (remember, the files only get compressed once you close the archive).
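
Here is a minimal sketch combining the first three points (addFile, a raised execution time, and a periodic close/reopen). The $files array, the archive name archive.zip, and the batch size of 1,000 are assumptions for illustration, not part of the original answer:

<?php
// Sketch only: $files is assumed to be an array of absolute paths to the images.
ini_set('max_execution_time', 600);        // allow up to 10 minutes

$zipPath = __DIR__ . '/archive.zip';
$zip = new ZipArchive();
if ($zip->open($zipPath, ZipArchive::CREATE) !== true) {
    exit('Cannot open ' . $zipPath);
}

$count = 0;
foreach ($files as $file) {
    // addFile() only records the path; the content is read and compressed
    // when the archive is closed, so memory use stays low (unlike addFromString()).
    $zip->addFile($file, basename($file));

    // Close and reopen every 1000 files so the open-file limit is never hit
    // and the files queued so far are actually written to disk.
    if (++$count % 1000 === 0) {
        $zip->close();
        $zip->open($zipPath);              // reopen the existing archive
    }
}

$zip->close();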

I personally like to limit the file size by taking the size of the files going into the archive, applying a likely compression factor (jpg files ~0.9, zip files = 1, text files ~0.1, ...) to estimate when the maximum archive size will probably be reached, and then switching to the next volume. A rough sketch of that volume-splitting idea follows.
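
This is only an illustration of that approach, not the answerer's actual code; the 2 GB volume limit, the ratio table, and the fallback factor of 0.8 are assumptions:

<?php
// Sketch only: $files, the volume limit and the compression ratios are assumptions.
$limit     = 2 * 1024 * 1024 * 1024;                      // switch volumes around 2 GB
$ratios    = ['jpg' => 0.9, 'jpeg' => 0.9, 'zip' => 1.0, 'txt' => 0.1];
$volume    = 1;
$estimated = 0;

$zip = new ZipArchive();
$zip->open("archive-$volume.zip", ZipArchive::CREATE);

foreach ($files as $file) {
    $ext   = strtolower(pathinfo($file, PATHINFO_EXTENSION));
    $ratio = $ratios[$ext] ?? 0.8;                        // guessed fallback factor
    $estimated += filesize($file) * $ratio;               // likely compressed size so far

    if ($estimated > $limit) {
        // This volume is probably full: close it and start the next one.
        $zip->close();
        $volume++;
        $estimated = filesize($file) * $ratio;
        $zip->open("archive-$volume.zip", ZipArchive::CREATE);
    }

    $zip->addFile($file, basename($file));
}

$zip->close();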

answered Sep 19 '22 by Larzan


Because the stream or the memory may still be in use, you should "flush" the archive from memory after every file (or every few files):

$zip->close();          // closing writes the queued files to disk
unset($zip);            // release the ZipArchive object and its memory
$zip = new ZipArchive;
$zip->open("arch.zip"); // reopen the same archive to continue adding files
answered Sep 17 '22 by Artur Muszyński