
PHP Aborting when creating large .zip file

My PHP script, running on CentOS 5.6 with PHP 5.2.12 and using ZipArchive(), successfully creates .zip files over 1.6 GB, but for an archive of 2 GB or larger PHP aborts with no apparent error. Nothing appears in the PHP error log or on stderr. The script is being executed at the command line, not interactively.

The script runs for about 8 minutes while the temp archive grows. I was watching the file size; the last listing showed the tmp file at 2120011776 bytes, then the tmp file disappeared, and the PHP script fell through the logic and executed the code after the archive creation.

For some reason top shows the CPU still at 95% and a new tmp archive file being created - it does this for another 5+ minutes, then silently stops and leaves the uncompleted tmp archive file. In this test there were fewer than 4,000 expected files.

The script, as noted, works just fine creating smaller archive files.

Tested on several different sets of large source data - same result for large files.
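
For reference, the archive is built with the usual ZipArchive pattern, roughly like the sketch below (the output path, variable names, and file list are illustrative, not the actual script):

```php
<?php
// Minimal sketch of the archive step, assuming a plain addFile() loop.
// $sourceFiles and the output path are placeholders.
$zip = new ZipArchive();
if ($zip->open('/data/archive/backup.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    die("Could not create archive\n");
}
foreach ($sourceFiles as $file) {
    $zip->addFile($file, basename($file));  // files are only queued here
}
$zip->close();  // the .zip.tmpXX file is written and renamed (or vanishes) here

// ...the code after the archive create runs from this point on...
```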

This issue sounds similar to this question: Size limit on PHP's zipArchive class?

I thought maybe the ls -l command was returning a count of 2K blocks, so that 2120011776 would be close to 4 GB, but that size is in bytes - the size of the xxxx.zip.tmpxx file.

Thanks!

asked Apr 21 '11 by blainelang


1 Answer

It could be many things. I'm assuming that you have enough free disk space to handle the process. As others have mentioned, there could be limits you can adjust either by editing your php.ini file or by using the ini_set() function in the code itself.
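
For example, the usual suspects can be raised at the top of the script; the values below are illustrative only and would need tuning to the actual workload:

```php
<?php
// Illustrative only - raise the limits most likely to stop a long CLI job.
ini_set('memory_limit', '-1');        // or a concrete cap such as '2048M'
ini_set('max_execution_time', '0');   // 0 = unlimited (already the CLI default)
set_time_limit(0);
```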

How much memory does your machine have? If the process exhausts your actual memory, it makes sense that it would abort consistently after a certain size. So check the free memory before the script starts and monitor it as the script executes.
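
A rough way to watch it from inside the script itself (the variable names are placeholders for whatever the real script uses):

```php
<?php
// Print PHP's own memory use every 500 files while the archive is queued.
$count = 0;
foreach ($sourceFiles as $file) {
    $zip->addFile($file, basename($file));
    if (++$count % 500 === 0) {
        printf("%d files queued, mem %.1f MB (peak %.1f MB)\n",
            $count,
            memory_get_usage(true) / 1048576,
            memory_get_peak_usage(true) / 1048576);
    }
}
```

Keep in mind that most of the real work (and memory/disk use) happens inside close(), so also watch the process from outside with top or free while it runs.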

A third option could be the file system itself. I don't have much experience with CentOS, but some file systems do not allow files over 2 GB. Although, from the product page, it seems like most file systems available on CentOS can handle larger files.
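
One quick way to see which file system the archive is written to, and how much space is left there (the path is a placeholder for the real output directory):

```php
<?php
// Show the file system type and free space for the output directory.
// '/data/archive' is a placeholder path.
echo shell_exec('df -T /data/archive');
echo 'free space: ', disk_free_space('/data/archive'), " bytes\n";
```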

A fourth option, which seems the most promising: looking at the product page linked above, another possible culprit is the "Maximum x86 per-process virtual address space," which is approximately 3 GB. On x86_64 it is about 2 TB, so check the type of processor.
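
A quick way to check both the machine type and whether the PHP binary itself is a 32-bit or 64-bit build:

```php
<?php
// php_uname('m') reports the machine type (e.g. i686 vs x86_64);
// PHP_INT_SIZE is 4 on a 32-bit PHP build and 8 on a 64-bit build.
echo 'machine: ', php_uname('m'), PHP_EOL;
echo 'PHP_INT_SIZE: ', PHP_INT_SIZE, PHP_EOL;
```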

Again, it seems like the fourth option is the culprit.

answered Sep 27 '22 by Shawn Patrick Rice