I have the code below to output a big file, but it's falling over because PHP's memory use seems to grow and grow as the file is read:
<?php
// various header() calls etc.
$stream = fopen($tarfile, 'r');
ob_end_flush();
while (!feof($stream)) {
    $buf = fread($stream, 4096);
    print $buf;
    flush();
    unset($buf);
    $aa_usage = memory_get_usage(TRUE); // ← this keeps going up!
}
fclose($stream);
I had thought that the combination of flush() and unset() would limit the additional memory use to the 4 KB buffer, but I'm clearly wrong.
If all you need is to output the contents of a file, the right tool for the job is the PHP function readfile(). Replace all the code you posted with:
readfile($tarfile);
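Putting it together with the header() calls you mentioned, a minimal sketch might look like the following (the specific header names and values here are illustrative assumptions, not taken from your code):

<?php
// Illustrative headers only; adjust to match your own header() calls.
header('Content-Type: application/x-tar');
header('Content-Disposition: attachment; filename="archive.tar"');
header('Content-Length: ' . filesize($tarfile));

readfile($tarfile); // streams the file to the client without building it up in PHP memory
exit;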
As the documentation says:
Note: readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error, ensure that output buffering is off with ob_get_level().
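In practice that means closing any active output buffers before the call. A minimal sketch, assuming you want to discard whatever is currently buffered (use ob_end_flush() instead if it should be sent to the client):

<?php
// Close every active output buffering level so readfile() writes straight to the client.
while (ob_get_level() > 0) {
    ob_end_clean(); // discard and close the current buffer level
}

readfile($tarfile);
exit;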