 

Downloading large files reliably in PHP


I have a PHP script on a server that sends files to recipients: they get a unique link and can then download large files. Sometimes there is a problem with the transfer and the file is corrupted or never finishes. I am wondering if there is a better way to send large files.

Code:

$f = fopen(DOWNLOAD_DIR.$database[$_REQUEST['fid']]['filePath'], 'r');
while (!feof($f)) {
    print fgets($f, 1024);
}
fclose($f);

I have seen functions such as

http_send_file
http_send_data

But I am not sure if they will work.
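For reference, those functions come from the pecl_http 1.x extension; a minimal sketch of how they might be used (the path and throttle values below are illustrative, not from the original question):

```php
<?php
// Requires the pecl_http 1.x extension; these procedural functions
// do not exist in plain PHP or in the object-based pecl_http 2.x.
$file = '/path/to/large-file.zip'; // hypothetical path

http_send_content_disposition(basename($file), true); // force "Save as" dialog
http_send_content_type('application/octet-stream');
http_throttle(0.1, 2048); // sleep 0.1 s between 2048-byte chunks (illustrative)
http_send_file($file);    // streams the file, with HTTP range/resume support
```

The built-in range support is the main attraction here: interrupted downloads can be resumed instead of restarted, which directly addresses transfers that "never finish".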

What is the best way to solve this problem?

Regards
erwing

Asked by Erwing on Feb 28 '09




1 Answer

Chunking the file is the fastest and simplest method in PHP, if you can't or don't want to use something a bit more professional like cURL, mod_xsendfile on Apache, or a dedicated download script.
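For comparison, a minimal sketch of the mod_xsendfile approach mentioned above (assuming the Apache module is installed and enabled; the path is illustrative):

```php
<?php
// Assumes Apache with mod_xsendfile, configured with something like:
//   XSendFile On
//   XSendFilePath /var/files
// PHP only emits headers; Apache streams the file body itself, so
// PHP's memory limit and max_execution_time no longer apply.
$file = '/var/files/large-file.zip'; // hypothetical path

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.basename($file).'"');
header('X-Sendfile: '.$file); // Apache replaces this header with the file contents
exit;
```

This also sidesteps the 2 GB limit discussed below, since the web server rather than PHP performs the actual I/O.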

$filename = $filePath.$filename;
$chunksize = 5 * (1024 * 1024); // 5 MB (= 5,242,880 bytes) per chunk

if (file_exists($filename)) {
    set_time_limit(300);

    $size = intval(sprintf("%u", filesize($filename)));

    header('Content-Type: application/octet-stream');
    header('Content-Transfer-Encoding: binary');
    header('Content-Length: '.$size);
    header('Content-Disposition: attachment;filename="'.basename($filename).'"');

    if ($size > $chunksize) {
        $handle = fopen($filename, 'rb');

        while (!feof($handle)) {
            print(@fread($handle, $chunksize));
            ob_flush();
            flush();
        }

        fclose($handle);
    } else {
        readfile($filename);
    }

    exit;
} else {
    echo 'File "'.$filename.'" does not exist!';
}

Ported from richnetapps.com / NeedBee. Tested on 200 MB files, on which readfile() died even with the memory limit raised to 1 GB, i.e. five times the size of the downloaded file.

BTW: I also tested this on files >2 GB, but PHP only managed to write the first 2 GB of the file and then broke the connection. The file-related functions (fopen, fread, fseek) use a signed 32-bit int, so you ultimately hit the 2 GB limit. The above-mentioned solutions (e.g. mod_xsendfile) seem to be the only option in this case.

EDIT: Make 100% sure that your script file is saved in UTF-8. If you omit that, downloaded files will be corrupted. This is because this solution uses print to push each chunk of the file to the browser.
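A related precaution, not from the original answer: stray output buffering (or whitespace emitted before the headers) can also corrupt the download, so it can help to clear all active buffers before streaming. A sketch:

```php
<?php
// Discard any active output buffers before sending binary data,
// so buffered whitespace or debug output can't end up in the file.
while (ob_get_level() > 0) {
    ob_end_clean();
}
```

If you do this, the ob_flush() calls in the loop above become unnecessary (and would raise a notice with no buffer active); flush() alone suffices.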

Answered by trejder on Sep 19 '22