stream uploading a gm-resized image to s3 with aws-sdk

What I want to do is stream an image from a URL, process it with GraphicsMagick, and stream-upload it to S3. I just can't get it working.

Streaming the processed image to local disk (using fs.createWriteStream) works without a problem.
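
For reference, this is roughly what the working local-disk version looks like (a minimal sketch from memory, not my exact code; my-image-resized.jpg is just a placeholder filename):

var fs = require('fs');
var gm = require('gm');
var request = require('request');

// resize the remote image and pipe the result straight to a local file
gm(request('http://www.some-domain.com/some-image.jpg'), "my-image.jpg")
  .resize("100^", "100^")
  .stream(function(err, stdout, stderr) {
    stdout.pipe(fs.createWriteStream("my-image-resized.jpg"));
  });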

When I buffer my stream, the final image in S3 at least has the expected size (KB-wise), but I cannot open that image.

This is my current progress:

var request = require('request');
var gm = require('gm');
var AWS = require('aws-sdk');
var mime = require('mime');

var s3 = new AWS.S3();

gm(request('http://www.some-domain.com/some-image.jpg'), "my-image.jpg")
  .resize("100^", "100^")
  .stream(function(err, stdout, stderr) {
    var str = '';
    stdout.on('data', function(data) {
      str += data;
    });
    stdout.on('end', function() {
      var data = {
        Bucket: "my-bucket",
        Key: "my-image.jpg",
        Body: new Buffer(str, 'binary'), // that's where I'm probably wrong
        ContentType: mime.lookup("my-image.jpg")
      };
      s3.client.putObject(data, function(err, res) {
        console.log("done");
      });
    });
  });

I did try some stuff like creating a file write stream and a file read stream (roughly the sketch below), but I think there should be a cleaner and nicer solution to that problem...
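
The file-based detour I mean looks roughly like this (a sketch, not my exact code; the /tmp path is a placeholder, and passing ContentLength from fs.stat is my guess at why the file variant avoids the length problem):

var fs = require('fs');
// (request, gm, mime and s3 are set up as above)

// write the resized image to a temp file first...
gm(request('http://www.some-domain.com/some-image.jpg'), "my-image.jpg")
  .resize("100^", "100^")
  .write("/tmp/my-image.jpg", function(err) {
    if (err) throw err;
    // ...the file on disk has a known size, so the SDK can set Content-Length
    var stat = fs.statSync("/tmp/my-image.jpg");
    s3.client.putObject({
      Bucket: "my-bucket",
      Key: "my-image.jpg",
      Body: fs.createReadStream("/tmp/my-image.jpg"),
      ContentLength: stat.size,
      ContentType: mime.lookup("my-image.jpg")
    }, function(err, res) {
      console.log("done");
    });
  });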

EDIT: The first thing I tried was setting the Body to stdout (the suggested answer from @AndyD):

var data = {
  Bucket: "my-bucket",
  Key: "my-image.jpg",
  Body: stdout,
  ContentType: mime.lookup("my-image.jpg")
};

But that returns the following error:

Cannot determine length of [object Object]

EDIT2:

  • node version: 0.8.6 (I also tried 0.8.22 and 0.10.0)
  • aws-sdk: 0.9.7-pre.8 (installed today)

The complete error:

{ [Error: Cannot determine length of [object Object]]
  message: 'Cannot determine length of [object Object]',
  object:
   { _handle:
      { writeQueueSize: 0,
        owner: [Circular],
        onread: [Function: onread] },
     _pendingWriteReqs: 0,
     _flags: 0,
     _connectQueueSize: 0,
     destroyed: false,
     errorEmitted: false,
     bytesRead: 0,
     _bytesDispatched: 0,
     allowHalfOpen: undefined,
     writable: false,
     readable: true,
     _paused: false,
     _events: { close: [Function], error: [Function: handlerr] } },
  name: 'Error' }
asked Apr 04 '13 by hereandnow78

1 Answer

You don't need to read the stream yourself. In your case you're converting from binary to string and back: var str = '' starts out as a string, and appending each data chunk (a binary Buffer) to it coerces the bytes through a string encoding, which corrupts the image.

Try letting putObject pipe the stream like this:

gm(request('http://www.some-domain.com/some-image.jpg'), "my-image.jpg")
  .resize("100^", "100^")
  .stream(function(err, stdout, stderr) {
    var data = {
      Bucket: "my-bucket",
      Key: "my-image.jpg",
      Body: stdout,
      ContentType: mime.lookup("my-image.jpg")
    };
    s3.client.putObject(data, function(err, res) {
      console.log("done");
    });
  });

See these release notes for more info.

If streaming/piping doesn't work, then something like this might: it loads everything into memory and then uploads. I think you're limited to 4 MB in that case.

    var buf = new Buffer('');
    stdout.on('data', function(data) {
      buf = Buffer.concat([buf, data]); // append raw binary chunks, no string coercion
    });
    stdout.on('end', function() {
      var data = {
        Bucket: "my-bucket",
        Key: "my-image.jpg",
        Body: buf,
        ContentType: mime.lookup("my-image.jpg")
      };
      s3.client.putObject(data, function(err, res) {
        console.log("done");
      });
    });
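
Buffer.concat keeps the chunks as raw binary all the way through, which is the key difference from the str += data version that was corrupting your image.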
answered Sep 20 '22 by AndyD