streaming-s3 is not uploading the file to AWS S3 properly

I am using Node.js to upload files to AWS S3 and found that the file size in the bucket is wrong: the uploaded object is only 2.1 KB, regardless of the input.

Here is my code:

var streamingS3 = require('streaming-s3');

var uploadFile = function (fileReadStream, awsHeader, cb) {

    //set options for the streaming module
    var options = {
        concurrentParts: 2,
        waitTime: 20000,
        retries: 2,
        maxPartSize: 10 * 1024 * 1024
    };
    //call stream function to upload the file to s3
    var uploader = new streamingS3(fileReadStream, config.aws.accessKey, config.aws.secretKey, awsHeader, options);
    //start uploading
    uploader.begin(); // required if no callback is provided to the constructor

    // handle the uploader's progress events
    uploader.on('data', function (bytesRead) {
        //console.log(bytesRead, ' bytes read.');
    });

    uploader.on('part', function (number) {
        //console.log('Part ', number, ' uploaded.');
    });

    // All parts uploaded, but upload not yet acknowledged.
    uploader.on('uploaded', function (stats) {
        //console.log('Upload stats: ', stats);
    });

    uploader.on('finished', function (response, stats) {
        console.log(response);
        cb(null, response);
    });

    uploader.on('error', function (err) {
        console.log('Upload error: ', err);
        cb(err);
    });
};

The file name does show up in my AWS bucket, but when I try to open the file, it fails.

I am trying to upload this file from the URL: https://arxiv.org/pdf/1701.00003.pdf
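
For reference, the function is invoked roughly like this; the request stream and the contents of awsHeader here are illustrative, since that part of the code is not shown above:

var request = require('request');

// Hypothetical caller: fetch the PDF as a stream and hand it to uploadFile.
var fileReadStream = request('https://arxiv.org/pdf/1701.00003.pdf');
var awsHeader = {
    Bucket: 'BUCKET_NAME', // placeholder bucket name
    Key: '1701.00003.pdf',
    ContentType: 'application/pdf'
};

uploadFile(fileReadStream, awsHeader, function (err, response) {
    if (err) return console.error(err);
    console.log('Uploaded:', response);
});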

Asked May 04 '17 by Ankita Kashyap

1 Answer

There is no longer any need for an external module to upload streams to S3. The aws-sdk now provides a method, s3.upload, which can upload an arbitrarily sized buffer, blob, or stream. You can check the documentation here.

The code I used:

const aws = require('aws-sdk');
const s3 = new aws.S3({
    credentials:{
        accessKeyId: "ACCESS_KEY",
        secretAccessKey: "SECRET_ACCESS_KEY"
    }
});
const fs = require('fs');
const got = require('got');

//fs stream test
s3.upload({
    Bucket: "BUCKET_NAME",
    Key: "FILE_NAME",
    ContentType: 'text/plain',
    Body: fs.createReadStream('SOME_FILE')
})
.on("httpUploadProgress", progress => console.log(progress))
.send((err, resp) => {
    if (err) return console.error(err);
    console.log(resp);
});


//http stream test
s3.upload({
    Bucket: "BUCKET_NAME",
    Key: "FILE_NAME",
    ContentType: 'application/pdf',
    Body: got.stream('https://arxiv.org/pdf/1701.00003.pdf')
})
.on("httpUploadProgress", progress => console.log(progress))
.send((err, resp) => {
    if (err) return console.error(err);
    console.log(resp);
});

To prove the point further, I tried the code with the PDF you posted in your question, and here is the link to my test bucket showing that the PDF works as expected.
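
If you want control comparable to the concurrentParts/maxPartSize options from the question, s3.upload also accepts a second options argument for the managed multipart upload. A minimal sketch, reusing the bucket and key placeholders from above:

// Tune the managed multipart upload; partSize and queueSize play the role of
// maxPartSize and concurrentParts in the streaming-s3 options above.
s3.upload({
    Bucket: "BUCKET_NAME",
    Key: "FILE_NAME",
    ContentType: 'application/pdf',
    Body: got.stream('https://arxiv.org/pdf/1701.00003.pdf')
}, {
    partSize: 10 * 1024 * 1024, // 10 MB per part
    queueSize: 2                // at most 2 parts uploading concurrently
})
.send((err, resp) => {
    if (err) return console.error(err);
    console.log(resp);
});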

Answered Oct 08 '22 by Alex Michailidis