
Saving an image stored on S3 using Node.js?

I'm trying to write an image server that uses Node.js to store images on S3. Uploading the image works fine, and I can download and view it correctly using an S3 browser client (specifically DragonDisk, though I've successfully downloaded it with others too). However, when I download it with Node and try to write it to disk, I'm unable to open the file (Preview says it may be damaged or uses a file format it doesn't recognize). I'm using the AWS SDK for Node and fs to write the file. I know you can pass an optional encoding to fs.writeFile, but I've tried them all and none of them work. I've also tried setting ContentType on putObject and ResponseContentType on getObject, as well as ContentEncoding and ResponseContentEncoding (and all of these in various combinations). Same result. Here's some code:

var AWS = require('aws-sdk')
  , gm = require('../lib/gm')
  , uuid = require('node-uuid')
  , fs = require('fs');

AWS.config.loadFromPath('./amazonConfig.json');
var s3 = new AWS.S3();

var bucket = 'myBucketName'; // There's other logic here to set the bucket name.

exports.upload = function(req, res) {
    var id = uuid.v4();
    gm.format("/path/to/some/image.jpg", function(format){
        var key = req.params.dir + "/" + id + "/default." + format;
        fs.readFile('/path/to/some/image.jpg', function(err, data){
            if (err) { console.warn(err); }
            else {
                s3.client.putObject({
                    Bucket: bucket,
                    Key: key,
                    Body: data,
                    ContentType: 'image/jpeg'
                    // I've also tried adding ContentEncoding (in various formats) here.
                 }).done(function(response){
                    res.status(200).end(JSON.stringify({ok:1, id: id}));
                }).fail(function(response){
                    res.status(response.httpResponse.statusCode).end(JSON.stringify(({err: response})));
                });
            }
        });
    });
};

exports.get = function(req, res) {
    var key = req.params.dir + "/" + req.params.id + "/default.JPEG";
    s3.client.getObject({
        Bucket: bucket, 
        Key:  key,
        ResponseContentType: 'image/jpeg'
        // Tried ResponseContentEncoding here in base64, binary, and utf8
    }).done(function(response){
        res.status(200).end(JSON.stringify({ok:1, response: response}));
        var filename = '/path/to/new/image/default.JPEG';
        fs.writeFile(filename, response.data.Body, function(err){
            if (err) console.warn(err);
            // This DOES write the file, just not as an image that can be opened.
            // I've tried pretty much every encoding as the optional third parameter
            // and I've matched the encodings to the ResponseContentEncoding and
            // ContentEncoding above (in case it needs to be the same)
        });
    }).fail(function(response){
        res.status(response.httpResponse.statusCode).end(JSON.stringify({err: response}));
    });
};

Incidentally, I'm using Express for routing, so that's where req.params comes from.

asked Dec 20 '12 by tandrewnichols


2 Answers

For people who are still struggling with this issue, here is the approach I used with the native aws-sdk.

var AWS = require('aws-sdk');
AWS.config.loadFromPath('./s3_config.json');
var s3Bucket = new AWS.S3( { params: {Bucket: 'myBucket'} } );

Inside your route handler, ContentType should be set to the content type of the image file:

  var buf = new Buffer(req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ""), 'base64');
  var data = {
    Key: req.body.userId, 
    Body: buf,
    ContentEncoding: 'base64',
    ContentType: 'image/jpeg'
  };
  s3Bucket.putObject(data, function(err, data){
      if (err) { 
        console.log(err);
        console.log('Error uploading data: ', data); 
      } else {
        console.log('successfully uploaded the image!');
      }
  });
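
For context, req.body.imageBinary is assumed here to be a base64 data URL. In a browser, it might be produced from a canvas like this (a hypothetical client-side snippet; the element id and field names are illustrative, not part of the answer above):

// In the browser: produce a base64 data URL from a canvas.
var imageBinary = document.getElementById('preview').toDataURL('image/jpeg');
// POST it as JSON, e.g. { userId: 'abc123', imageBinary: imageBinary }.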

The s3_config.json file is:

{
  "accessKeyId":"xxxxxxxxxxxxxxxx",
  "secretAccessKey":"xxxxxxxxxxxxxx",
  "region":"us-east-1"
}
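
To read the image back out (the failure case in the original question): with this SDK, the data.Body that getObject hands to its callback is a Buffer, and writing it to disk as-is, with no encoding argument, produces a valid image. A minimal sketch, assuming the same s3Bucket client as above; the key and output path are placeholders:

var fs = require('fs');

s3Bucket.getObject({ Key: 'someUserId' }, function(err, data) {
    if (err) { return console.warn(err); }
    // data.Body is a Buffer, so write it directly (no encoding argument).
    fs.writeFile('/path/to/new/image/default.jpg', data.Body, function(err) {
        if (err) console.warn(err);
    });
});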
answered Oct 10 '22 by Divyanshu Das


OK, after significant trial and error, I've figured out how to do this. I ended up switching to knox, but presumably you could use a similar strategy with aws-sdk. This is the kind of solution that makes me say, "There has to be a better way than this," but I'm satisfied with anything that works at this point.

var knox = require('knox');
var client = knox.createClient({ key: 'YOUR_KEY', secret: 'YOUR_SECRET', bucket: 'myBucketName' }); // placeholder credentials

var imgData = "";
client.getFile(key, function(err, fileRes){
    if (err) { return console.warn(err); }
    fileRes.on('data', function(chunk){
        // Accumulate each Buffer chunk as a binary string.
        imgData += chunk.toString('binary');
    }).on('end', function(){
        // pic.mime holds the image's mime type, set elsewhere in the app.
        res.set('Content-Type', pic.mime);
        res.set('Content-Length', fileRes.headers['content-length']);
        // Convert the binary string back into a Buffer before sending.
        res.send(new Buffer(imgData, 'binary'));
    });
});

getFile() returns data chunks as buffers. One would think you could just pipe the results straight to the front end, but for whatever reason, this was the ONLY way I could get the service to return an image correctly. It feels redundant to write a buffer out to a binary string, only to write it back into a buffer, but hey, if it works, it works. If anyone finds a more efficient solution, I would love to hear it.
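
For reference, here is the piping version one would normally expect to work (a sketch only, since, as noted above, it was not returning the image correctly for me; fileRes is a standard readable stream):

client.getFile(key, function(err, fileRes){
    if (err) { return res.status(500).end(); }
    res.set('Content-Type', pic.mime);
    res.set('Content-Length', fileRes.headers['content-length']);
    // In principle the S3 response stream can be piped straight to the client.
    fileRes.pipe(res);
});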

answered Oct 10 '22 by tandrewnichols