 

Filepicker? Upload big files via HTML5 to S3 with no backend

Uploading files with multipart/form-data is straightforward and works well most of the time, until you start focusing on big file uploads. Let's look closely at what happens during a file upload:

  • the client sends a POST request with the file content in the body

  • the web server accepts the request and initiates the data transfer (or returns a 413 error if the file size exceeds the limit)

  • the web server starts to fill its buffers (depending on file and buffer sizes), stores the data on disk and sends it over the socket/network to the back-end

  • the back-end verifies authentication (note that this happens only after the file has already been uploaded)

  • the back-end reads the file, strips a few headers (Content-Disposition, Content-Type) and stores it on disk again

  • the back-end performs whatever you need to do with the file

To avoid this overhead we dump the file to disk on the web server (Nginx client_body_in_file_only) and pass only a callback with the file path further down the line. A queue worker then picks the file up and does whatever is required. This works pretty slick for inter-server communication, but we have to solve a similar problem for client-side uploads.
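For reference, a minimal Nginx sketch of that offloading approach; the location, size limits and upstream name are assumptions, not our exact config:

    # Accept the upload, write the body straight to disk, and hand the
    # back-end only the path of the temp file instead of the file content.
    location /upload {
        client_max_body_size        2g;
        client_body_temp_path       /var/spool/nginx/uploads;
        client_body_in_file_only    clean;   # keep the body in a temp file, clean it up afterwards
        client_body_buffer_size     128k;

        proxy_pass_request_body     off;     # don't stream the body to the back-end
        proxy_set_header            X-File-Path $request_body_file;
        proxy_pass                  http://backend;
    }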

We also have a client-side S3 upload solution, with no back-end interaction at all. For video uploads we have the video converted to h.264 Baseline / AAC with Zencoder.
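For illustration, a minimal HTML5 sketch of a browser-to-S3 POST upload; the bucket endpoint, access key and the pre-generated policy/signature values are placeholders (in practice they are embedded in the page or produced by a signing step):

    // Post a user-selected file straight to an S3 bucket using the browser
    // File API and a pre-signed POST policy (AWS signature v2 style fields).
    var BUCKET_URL = 'https://my-bucket.s3.amazonaws.com/'; // assumed bucket endpoint
    var BASE64_POLICY = '...';                               // pre-generated, base64-encoded policy document
    var POLICY_SIGNATURE = '...';                            // signature of that policy

    var input = document.querySelector('#file');             // <input type="file" id="file">

    input.addEventListener('change', function () {
      var file = input.files[0];
      var form = new FormData();

      form.append('key', 'uploads/' + file.name);            // object key in the bucket
      form.append('AWSAccessKeyId', 'AKIA...');              // placeholder access key id
      form.append('acl', 'private');
      form.append('policy', BASE64_POLICY);
      form.append('signature', POLICY_SIGNATURE);
      form.append('Content-Type', file.type);
      form.append('file', file);                             // the file field must come last

      var xhr = new XMLHttpRequest();
      xhr.open('POST', BUCKET_URL);
      xhr.upload.onprogress = function (e) {
        console.log('uploaded ' + e.loaded + ' of ' + e.total + ' bytes');
      };
      xhr.onload = function () {
        console.log('done, HTTP status ' + xhr.status);      // 204 when S3 accepts the upload
      };
      xhr.send(form);
    });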

Currently we use a modified Flash uploader based on s3-swf-upload-plugin in combination with the Zencoder JS SDK, which is really efficient but relies on Flash.

Question: how can we reach the same goal with an HTML5 file uploader? Do Filepicker.io and Zencoder solve the problem? What is the recommended way to manage HTML5 file uploads with no back-end interaction?

The requirements are the following:

  • HTML5, not Flash
  • upload video with post-processing to make it compatible with HTML5 players and mobile devices
  • upload images with post-processing (resize, crop, rotate)
  • upload documents such as PDFs with preview functionality

Does https://www.filepicker.com do a good job here?

Asked Aug 24 '15 by Anatoly


People also ask

What is the best way for the application to upload the large files in S3?

When you upload large files to Amazon S3, it's a best practice to leverage multipart uploads. If you're using the AWS Command Line Interface (AWS CLI), then all high-level aws s3 commands automatically perform a multipart upload when the object is large. These high-level commands include aws s3 cp and aws s3 sync.

How do I upload a file greater than 100 megabytes on Amazon S3?

Instead of using the Amazon S3 console, try uploading the file using the AWS Command Line Interface (AWS CLI) or an AWS SDK. Note: If you use the Amazon S3 console, the maximum file size for uploads is 160 GB. To upload a file that is larger than 160 GB, use the AWS CLI, AWS SDK, or Amazon S3 REST API.

Is multipart upload faster?

Multipart Upload is generally faster because it can make full use of your available upload bandwidth.
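As a sketch of the SDK route mentioned above, here is roughly how a multipart upload with parallel parts might look using the AWS SDK for JavaScript v3; the bucket, key and file path are assumptions:

    // Multipart upload with concurrent parts via @aws-sdk/lib-storage.
    // The Upload helper splits the body into parts and uploads them in parallel.
    const { S3Client } = require('@aws-sdk/client-s3');
    const { Upload } = require('@aws-sdk/lib-storage');
    const fs = require('fs');

    const upload = new Upload({
      client: new S3Client({ region: 'us-east-1' }),
      params: {
        Bucket: 'my-bucket',                      // assumed bucket name
        Key: 'uploads/big-video.mp4',             // assumed object key
        Body: fs.createReadStream('./big-video.mp4'),
      },
      partSize: 10 * 1024 * 1024,                 // 10 MB parts
      queueSize: 4,                               // upload 4 parts in parallel
    });

    upload.on('httpUploadProgress', (p) => console.log(p.loaded, '/', p.total));

    upload.done()
      .then(() => console.log('upload complete'))
      .catch((err) => console.error('upload failed', err));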

How do I upload videos to Amazon S3?

In the Amazon S3 console, choose the bucket where you want to upload an object, choose Upload, and then choose Add Files. In the file selection dialog box, find the file that you want to upload, choose it, choose Open, and then choose Start Upload. You can watch the progress of the upload in the Transfer pane.


1 Answer

I've been using Filepicker for 2 years now, and without doubt it's worth the price. Don't try to manage file uploads yourself (from Google Drive, from iOS, from the camera, from Dropbox...): Filepicker handles all of that very well and provides you with a ready-to-use URL. Spend more time working on your core business; file upload is really easy to delegate.
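To give a sense of how little code is involved, here is a minimal sketch using the legacy filepicker.io v1 JavaScript widget as I remember it; the API key is a placeholder, and since the service has become Filestack the exact calls may have changed:

    <script src="//api.filepicker.io/v1/filepicker.js"></script>
    <script>
      // Identify your application to the service (placeholder key).
      filepicker.setKey('YOUR_FILEPICKER_API_KEY');

      // Open the picker dialog, store the result in your own S3 bucket,
      // and get back a ready-to-use URL for the uploaded file.
      filepicker.pickAndStore(
        { mimetype: 'video/*' },                 // what the user may pick
        { location: 'S3', path: '/uploads/' },   // where to store it
        function (blobs) {
          console.log('stored at', blobs[0].url);
        },
        function (error) {
          console.log('upload failed', error.toString());
        }
      );
    </script>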

Answered Sep 20 '22 by 131