 

Upload files to Amazon S3 With Dropzone.js issue

I'm trying to upload files to the S3 service using Dropzone.js.

I followed this tutorial to upload the files directly from the client:

https://devcenter.heroku.com/articles/s3-upload-node - the tutorial doesn't cover the Dropzone.js integration (which was a nightmare to figure out).

The flow is pretty simple (a code sketch follows the list):

  1. Ask my server to get a signature from Amazon
  2. Get back the signed request URL plus the expected file URL from Amazon
  3. Override dropzone.options.url with the signed request URL
  4. Call dropzone.processFile to upload the file straight to S3
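
In code, the flow looks roughly like this (a sketch: myDropzone is the Dropzone instance built with the config below, and the signedRequest field is whatever the signature endpoint returns):

myDropzone.on('addedfile', function(file) {
    // steps 1 + 2: ask my server for a signed request url for this file
    $.getJSON('http://signature_url', { fileName: file.name, fileType: file.type })
        .done(function(data) {
            // step 3: point Dropzone at the signed request url
            myDropzone.options.url = data.signedRequest;
            // step 4: upload the file straight to S3
            myDropzone.processFile(file);
        });
});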

The file gets uploaded, and up to this point everything is OK. But when I try to view the file (in the S3 bucket interface), it looks like it wasn't written correctly, and I can't view it.

According to the Dropzone source code, the file is uploaded as part of a FormData object:

Dropzone.prototype.submitRequest = function(xhr, formData, files) {
  // Dropzone's default: send the multipart FormData, not the raw file
  return xhr.send(formData);
};

If I change the source code from:

xhr.send(formData)

to

xhr.send(files[0])

Everything works great, but then I lose the ability to upload multiple files.

This is the dropzone config:

{
   url: 'http://signature_url',
   accept: _dropzoneAcceptCallback,
   method: 'put',
   headers: {
      'x-amz-acl': 'public-read',
      'Accept': '*/*',
      'Content-Type': file.type
   },
   clickable: ['.choose-files'],
   autoProcessQueue: false
}

[Screenshot of the request HTTP headers]

Hope it's enough :)

Thanks.

Asked Dec 30 '15 by Yochai Akoka



1 Answer

Here's what worked for me, with the Dropzone init parameters on the frontend and the Node.js S3 signature on the backend.

Frontend code using Dropzone:

var myDropzone = new Dropzone(dropArea, {
    url: "#", // replaced per file in processing() below
    dictDefaultMessage: "Drag n drop or tap here",
    method: "PUT",
    uploadMultiple: false,
    paramName: "file",
    maxFiles: 10,
    thumbnailWidth: 80,
    thumbnailHeight: 80,
    parallelUploads: 20,
    autoProcessQueue: true,
    previewTemplate: dropPreviewTemplate,
    //autoQueue: false, // make sure the files aren't queued until manually added
    previewsContainer: dropPreviewContainer, // container to display the previews
    clickable: true, // or e.g. ".fileinput-button": the element used as the click trigger to select files
    accept: function(file, cb) {
        // fetch the S3 signature for this file before accepting it;
        // the file name is overridden server-side to match the signature
        var params = {
            fileName: file.name,
            fileType: file.type
        };

        // path to the S3 signature endpoint
        $.getJSON('/uploader', params).done(function(data) {
            if (!data.signedRequest) {
                return cb('Failed to receive an upload url');
            }

            file.signedRequest = data.signedRequest;
            file.finalURL = data.downloadURL;
            cb();
        }).fail(function() {
            return cb('Failed to receive an upload url');
        });
    },
    sending: function(file, xhr) {
        // send the raw file instead of Dropzone's multipart FormData
        var _send = xhr.send;
        xhr.setRequestHeader('x-amz-acl', 'public-read');
        xhr.send = function() {
            _send.call(xhr, file);
        };
    },
    processing: function(file) {
        // point this upload at the signed S3 url fetched in accept()
        this.options.url = file.signedRequest;
    }
});
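
The important part is the sending hook: it wraps xhr.send so the raw file is sent instead of Dropzone's multipart FormData, and because the override happens per request, multiple file uploads still work. It has the same effect as patching submitRequest, without editing the Dropzone source. processing then swaps in the signed URL fetched during accept for each file.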

Here are the libraries I used on the Node.js side:

var Crypto = require("crypto"),
    AWS = require("aws-sdk");

Here's a sample of the CORS config on S3:

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
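
If you'd rather set the CORS rule from code than from the S3 console, the same rule can be applied with the SDK (a sketch using the aws-sdk v2 putBucketCors call; BUCKET is assumed to hold your bucket name):

var AWS = require("aws-sdk");
var s3 = new AWS.S3();

// the same rule as the XML above
s3.putBucketCors({
    Bucket: BUCKET,
    CORSConfiguration: {
        CORSRules: [{
            AllowedOrigins: ["*"],
            AllowedMethods: ["PUT"],
            AllowedHeaders: ["*"]
        }]
    }
}, function(err) {
    if (err) console.log(err);
});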

Here's the code to generate the S3 signature on Node.js:

getPolicy: function(req, res)
{
    // random key so uploads never collide; the client's file name is ignored
    var fileId = Crypto.randomBytes(20).toString('hex').toUpperCase();

    var prefix = "bl_";
    var newFileName = prefix + fileId; // instead of req.query.fileName

    var s3 = new AWS.S3();
    var s3_params = {
        Bucket: BUCKET, // your bucket name, e.g. from config/env
        Key: newFileName,
        Expires: 60, // the signed url is valid for 60 seconds
        ContentType: req.query.fileType,
        ACL: 'public-read'
    };
    s3.getSignedUrl('putObject', s3_params, function(err, data) {
        if (err) {
            console.log(err);
            res.statusCode = 500;
            res.end();
        }
        else {
            var return_data = {
                signedRequest: data,
                uploadURL: 'https://' + BUCKET + '.s3.amazonaws.com/' + newFileName,
                downloadURL: 'http://' + BUCKET + '.s3-website-us-east-1.amazonaws.com/' + newFileName
            };
            res.write(JSON.stringify(return_data));
            res.end();
        }
    });
}
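
For completeness, this is how the handler can be wired up as the /uploader route the frontend calls (a sketch assuming Express, with getPolicy exported as a plain (req, res) handler):

var express = require("express");
var app = express();

// the accept() callback above does $.getJSON('/uploader', params)
app.get("/uploader", getPolicy);

app.listen(3000);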

Hopefully some of this is helpful.

Answered Sep 18 '22 by Aaron Rau