I have a 200 MB file that I want to give to a user via download. However, since we want the user to be able to download this file only once, we are doing this:
echo file_get_contents('http://some.secret.location.com/secretfolder/the_file.tar.gz');
to force a download. But this means the whole file has to be loaded into memory, which usually doesn't work for a file this size. How can we stream this file to the user, a few KB per chunk?
Try something like this (source: http://teddy.fr/2007/11/28/how-serve-big-files-through-php/):
<?php
define('CHUNK_SIZE', 1024*1024); // Size (in bytes) of each chunk

// Read a file and output its content chunk by chunk
function readfile_chunked($filename, $retbytes = true) {
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        // Flush PHP's output buffer (if one is active) and the server's,
        // so each chunk reaches the client before the next one is read
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // Return number of bytes delivered, like readfile() does
    }
    return $status;
}

// Here goes your code for checking that the user is logged in
// ...
// ...
if ($logged_in) {
    $filename = 'path/to/your/file';
    $mimetype = 'mime/type'; // e.g. 'application/x-gzip' for a .tar.gz
    header('Content-Type: ' . $mimetype);
    readfile_chunked($filename);
} else {
    echo 'Tabatha says you haven\'t paid.';
}
?>
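
One note on the code above: since the file in the question actually lives at a remote URL, fopen() can read http:// URLs as well, provided allow_url_fopen is enabled in php.ini, so readfile_chunked() can proxy the remote file directly without ever holding it all in memory. Below is a minimal sketch of calling it that way with download headers; the URL and filename come from the question, and the Content-Disposition header is my addition, prompting the browser to save the file instead of rendering it:

<?php
// Sketch: proxying the remote file in chunks, assuming allow_url_fopen is On
// and readfile_chunked() is defined as above.
$url = 'http://some.secret.location.com/secretfolder/the_file.tar.gz';

header('Content-Type: application/x-gzip');
header('Content-Disposition: attachment; filename="the_file.tar.gz"');

// If the file were local, you could also send Content-Length so the browser
// can show download progress; filesize() does not work on http:// streams:
// header('Content-Length: ' . filesize('/local/path/to/the_file.tar.gz'));

readfile_chunked($url);
?>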