How do I upload big (video) files in streams to AWS S3 with Laravel 5 and filesystem?

I want to upload a big video file to my AWS S3 bucket. After a good many hours, I finally managed to configure my php.ini and nginx.conf files so they allow bigger files.

But then I got a "Fatal Error: Allowed Memory Size of XXXXXXXXXX Bytes Exhausted". After some time I found out that larger files should be uploaded with streams, using fopen(), fwrite(), and fclose().
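As I understand it, the point of those functions is to copy data in small chunks so the whole file never sits in memory at once. In plain PHP the pattern looks roughly like this (a generic sketch, not my actual code; the paths are placeholders):

// Generic chunked copy: only 8 KB is held in memory at any time.
$in  = fopen('/path/to/source.mp4', 'rb');        // placeholder source
$out = fopen('/path/to/destination.mp4', 'wb');   // placeholder destination

while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}

fclose($in);
fclose($out);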

Since I'm using Laravel 5, the filesystem takes care of much of this. Except that I can't get it to work.

My current ResourceController@store looks like this:

public function store(ResourceRequest $request)
{
    /* Prepare data */
    $resource = new Resource();
    $key = 'resource-'.$resource->id;
    $bucket = env('AWS_BUCKET');
    $filePath = $request->file('resource')->getRealPath();

    /* Open & write stream */
    $stream = fopen($filePath, 'w');
    Storage::writeStream($key, $stream, ['public']);

    /* Store entry in DB */
    $resource->title = $request->title;
    $resource->save();

    /* Success message */
    session()->flash('message', $request->title . ' uploaded!');
    return redirect()->route('resource-index');
}

But now I get this long error:

CouldNotCreateChecksumException in SignatureV4.php line 148:

A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Stream\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.
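If I read option 2 right, opening the uploaded file read-only should already give a seekable stream, so the fix might look something like this (untested on my side; 'visibility' => 'public' is my guess at the correct options array):

/* A plain file handle opened with 'r' is seekable, so the SDK should be
   able to calculate the sha256 checksum over it. */
$stream = fopen($filePath, 'r');
Storage::writeStream($key, $stream, ['visibility' => 'public']);

/* Flysystem may close the stream itself, so guard the fclose() call. */
if (is_resource($stream)) {
    fclose($stream);
}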

So I am currently completely lost. I can't figure out if I'm even on the right track. Here are the resources I'm trying to make sense of:

  • AWS SDK guide for PHP: Stream Wrappers
  • AWS SDK introduction on stream wrappers
  • Flysystem original API on stream wrappers

And just to confuse me even more, there seems to be another way to upload large files besides streams: the so-called "multipart" upload. I actually thought that was what the streams were all about...

What is the difference?

asked Jun 22 '15 by MartinJH

People also ask

How can I upload files larger than 5 GB to S3?

Note: If you use the Amazon S3 console, the maximum file size for uploads is 160 GB. To upload a file that is larger than 160 GB, use the AWS CLI, AWS SDK, or Amazon S3 REST API.

What is the best way for the application to upload the large files in S3?

When you upload large files to Amazon S3, it's a best practice to leverage multipart uploads. If you're using the AWS Command Line Interface (AWS CLI), then all high-level aws s3 commands automatically perform a multipart upload when the object is large. These high-level commands include aws s3 cp and aws s3 sync.
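In the AWS SDK for PHP, the closest equivalent to those high-level commands is the S3Client::upload() helper, which performs a single PutObject call for small bodies and automatically switches to a multipart upload above a size threshold. A minimal sketch (the region, bucket, key, and paths are all placeholders):

use Aws\S3\S3Client;

// Placeholder client configuration.
$client = new S3Client([
    'version' => 'latest',
    'region'  => 'eu-west-1',
]);

// upload() switches to a multipart upload once the body
// exceeds mup_threshold (16 MB here).
$client->upload(
    'my-bucket',                          // placeholder bucket
    'videos/big-file.mp4',                // placeholder key
    fopen('/path/to/big-file.mp4', 'rb'),
    'public-read',
    ['mup_threshold' => 16 * 1024 * 1024]
);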

What is the maximum file size allowed in Amazon S3?

Individual Amazon S3 objects can range in size from a minimum of 0 bytes to a maximum of 5 TB. The largest object that can be uploaded in a single PUT is 5 GB.

How do I put videos on my AWS S3?

In the Amazon S3 console, choose the bucket where you want to upload an object, choose Upload, and then choose Add Files. In the file selection dialog box, find the file that you want to upload, choose it, choose Open, and then choose Start Upload. You can watch the progress of the upload in the Transfer pane.


2 Answers

I had the same problem and came up with this solution. Instead of using

Storage::put('file.jpg', $contents);

which, of course, ran into an "out of memory" error, I used this method:

use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;
use Illuminate\Support\Facades\Config;
use Illuminate\Support\Facades\Storage;

// ...

public function uploadToS3($fromPath, $toPath)
{
    // Pull the underlying S3Client out of Laravel's Flysystem disk.
    $disk = Storage::disk('s3');
    $client = $disk->getDriver()->getAdapter()->getClient();

    // MultipartUploader splits the file into parts and uploads them one
    // by one, so the whole file never has to fit in memory.
    $uploader = new MultipartUploader($client, $fromPath, [
        'bucket' => Config::get('filesystems.disks.s3.bucket'),
        'key'    => $toPath,
    ]);

    try {
        $result = $uploader->upload();
        echo "Upload complete";
    } catch (MultipartUploadException $e) {
        echo $e->getMessage();
    }
}

Tested with Laravel 5.1
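Calling it from the controller in the question would look something like this (hypothetical; note that $resource has to be saved first so it actually has an id):

$this->uploadToS3(
    $request->file('resource')->getRealPath(),
    'resource-' . $resource->id
);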

Here are the official AWS PHP SDK docs: http://docs.aws.amazon.com/aws-sdk-php/v3/guide/service/s3-multipart-upload.html

answered Oct 09 '22 by HazA

The streaming part applies to downloads.

For uploads, you need to know the content size up front. For large files, multipart uploads are the way to go.
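For the download side, streaming through Laravel's Flysystem driver looks roughly like this (a sketch; the 's3' disk and the object key are assumptions):

// Read the object as a stream and forward it in 8 KB chunks,
// instead of loading the whole file into memory.
$stream = Storage::disk('s3')->getDriver()->readStream('resource-1'); // placeholder key

while (!feof($stream)) {
    echo fread($stream, 8192);
}
fclose($stream);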

answered Oct 09 '22 by Mircea