
Download from Laravel storage without loading whole file in memory

Tags: php, laravel

I am using Laravel Storage and I want to serve users some files that are larger than the memory limit. My code was inspired by an SO post and goes like this:

$fs = Storage::getDriver();
$stream = $fs->readStream($file->path);

return response()->stream(
    function () use ($stream) {
        fpassthru($stream);
    },
    200,
    [
        'Content-Type' => $file->mime,
        'Content-disposition' => 'attachment; filename="' . $file->original_name . '"',
    ]
);

Unfortunately, I run into an error for large files:

[2016-04-21 13:37:13] production.ERROR: exception 'Symfony\Component\Debug\Exception\FatalErrorException' with message 'Allowed memory size of 134217728 bytes exhausted (tried to allocate 201740288 bytes)' in /path/app/Http/Controllers/FileController.php:131
Stack trace:
#0 /path/vendor/laravel/framework/src/Illuminate/Foundation/Bootstrap/HandleExceptions.php(133): Symfony\Component\Debug\Exception\FatalErrorException->__construct()
#1 /path/vendor/laravel/framework/src/Illuminate/Foundation/Bootstrap/HandleExceptions.php(118): Illuminate\Foundation\Bootstrap\HandleExceptions->fatalExceptionFromError()
#2 /path/vendor/laravel/framework/src/Illuminate/Foundation/Bootstrap/HandleExceptions.php(0): Illuminate\Foundation\Bootstrap\HandleExceptions->handleShutdown()
#3 /path/app/Http/Controllers/FileController.php(131): fpassthru()
#4 /path/vendor/symfony/http-foundation/StreamedResponse.php(95): App\Http\Controllers\FileController->App\Http\Controllers\{closure}()
#5 /path/vendor/symfony/http-foundation/StreamedResponse.php(95): call_user_func:{/path/vendor/symfony/http-foundation/StreamedResponse.php:95}()
#6 /path/vendor/symfony/http-foundation/Response.php(370): Symfony\Component\HttpFoundation\StreamedResponse->sendContent()
#7 /path/public/index.php(56): Symfony\Component\HttpFoundation\Response->send()
#8 /path/public/index.php(0): {main}()
#9 {main}

It seems that it tries to load the whole file into memory. I expected that using a stream and fpassthru() would avoid this. Is something missing in my code? Do I have to specify a chunk size somehow?

The versions I am using are Laravel 5.1 and PHP 5.6.

Džuris asked Apr 21 '16


2 Answers

It seems that output buffering is still accumulating the whole response in memory.

Try disabling output buffering before calling fpassthru():

function () use ($stream) {
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
    fpassthru($stream);
},

There may be multiple output buffers active, which is why the while loop is needed.
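
For context, here is a sketch of how that closure slots into the asker's original response()->stream() call. This is just the question's code plus the buffer flush, not a separately tested variant:

return response()->stream(
    function () use ($stream) {
        // Flush and close every active output buffer so the file is sent
        // straight to the client instead of piling up in memory.
        while (ob_get_level() > 0) {
            ob_end_flush();
        }
        fpassthru($stream);
    },
    200,
    [
        'Content-Type' => $file->mime,
        'Content-disposition' => 'attachment; filename="' . $file->original_name . '"',
    ]
);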

Christiaan answered Sep 20 '22


Instead of loading the whole file into memory at once, try using fread() to read and send it chunk by chunk.

Here is a very good article: http://zinoui.com/blog/download-large-files-with-php

<?php

// Disable the execution time limit when downloading a big file.
set_time_limit(0);

/** @var \League\Flysystem\Filesystem $fs */
$fs = Storage::disk('local')->getDriver();

$fileName = 'bigfile';

$metaData = $fs->getMetadata($fileName);
$handle = $fs->readStream($fileName);

header('Pragma: public');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Cache-Control: private', false);
header('Content-Transfer-Encoding: binary');
header('Content-Disposition: attachment; filename="' . $metaData['path'] . '";');
header('Content-Type: ' . $metaData['type']);

/*
    I've commented the following line out,
    because \League\Flysystem\Filesystem uses int for the file size.
    For files larger than PHP_INT_MAX (2147483647) bytes
    it may return 0, which results in:

        Content-Length: 0

    and that stops the browser from downloading the file.

    Try to figure out a way to get the file size represented as a string
    (e.g. using a shell command / 3rd-party plugin?).
*/

//header('Content-Length: ' . $metaData['size']);

$chunkSize = 1024 * 1024;

while (!feof($handle)) {
    $buffer = fread($handle, $chunkSize);
    echo $buffer;
    ob_flush();
    flush();
}

fclose($handle);
exit;
?>
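
About that commented-out Content-Length header: one possible workaround (my assumption, not something the answer itself proposes) is to ask the shell for the size, since the shell returns it as a string and so sidesteps the 32-bit integer overflow. The sketch below assumes Laravel's default local-disk layout and GNU stat; on BSD/macOS the equivalent would be stat -f%z:

// Hypothetical workaround, not from the original answer: read the size via
// the shell so it stays a string even beyond PHP_INT_MAX on 32-bit builds.
$path = storage_path('app/' . $fileName); // assumes the default local disk root
$size = trim((string) shell_exec('stat -c%s ' . escapeshellarg($path)));

if ($size !== '' && ctype_digit($size)) {
    header('Content-Length: ' . $size);
}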

Update

A simpler way to do this: just call

if (ob_get_level()) ob_end_clean(); 

before returning a response.

Credit to @Christiaan

// Disable the execution time limit when downloading a big file.
set_time_limit(0);

/** @var \League\Flysystem\Filesystem $fs */
$fs = Storage::disk('local')->getDriver();

$fileName = 'bigfile';

$metaData = $fs->getMetadata($fileName);
$stream = $fs->readStream($fileName);

if (ob_get_level()) ob_end_clean();

return response()->stream(
    function () use ($stream) {
        fpassthru($stream);
    },
    200,
    [
        'Content-Type' => $metaData['type'],
        'Content-disposition' => 'attachment; filename="' . $metaData['path'] . '"',
    ]
);

Kevin answered Sep 17 '22