The node.js API for S3 gives the following description for the data returned in the callback of getObject. From http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#getObject-property:
Body — (Buffer, Typed Array, Blob, String, ReadableStream) Object data.
Is this for real? Is there no way to control which of these things it is?
I don't know if you can control in advance the type of the data.Body field provided in the getObject() callback. If all you want to do is determine if you've received a buffer, you can try Node's Buffer.isBuffer(data.Body) class method.
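For instance, a minimal check in the callback might look like this (the bucket and key names are placeholders; in practice, under Node.js the v2 SDK usually hands back a Buffer):

var AWS = require('aws-sdk');
var s3 = new AWS.S3();

s3.getObject({Bucket: 'myBucket', Key: 'myKey'}, function(err, data) {
    if (err) {
        return console.log(err);
    }
    if (Buffer.isBuffer(data.Body)) {
        // Under Node.js, data.Body is typically a Buffer
        console.log('received a Buffer of %d bytes', data.Body.length);
    } else {
        console.log('received something else:', typeof data.Body);
    }
});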
Alternatively, you might want to avoid the issue altogether and use this approach from Amazon's S3 documentation:
var AWS = require('aws-sdk');

var s3 = new AWS.S3();
var params = {Bucket: 'myBucket', Key: 'myImageFile.jpg'};
var file = require('fs').createWriteStream('/path/to/file.jpg');
s3.getObject(params).createReadStream().pipe(file);
Presuming you'll be using this code in a typical node.js async callback environment, it might make more sense to write it like so:
var AWS = require('aws-sdk');
var fs = require('fs');

function downloadFile(key, localPath, callback) {
    var s3 = new AWS.S3();
    var params = {Bucket: 'myBucket', Key: key};
    var file = fs.createWriteStream(localPath);

    // Invoke the callback once the file has been fully written...
    file.on('close', function() {
        callback();
    });

    // ...or with the error if the write stream fails.
    file.on('error', function(err) {
        callback(err);
    });

    s3.getObject(params).createReadStream().pipe(file);
}
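A call from elsewhere in the app might then look like this (the key and local path are just placeholders; the bucket name is hard-coded inside the function above):

downloadFile('myImageFile.jpg', '/tmp/myImageFile.jpg', function(err) {
    if (err) {
        return console.log('download failed:', err);
    }
    console.log('download complete');
});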
I couldn't find any way to change the Body type either. However, after noticing that the Body was a buffer, I transformed the buffer into a ReadableStream with this handy and pretty straightforward function: AWS.util.buffer.toStream (or perhaps you might want to use another lib like streamifier).
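For what it's worth, if you'd rather not depend on the SDK's internal AWS.util helpers, a rough equivalent using the streamifier package (a separate npm install; the params and output path here are placeholders) might look like:

var fs = require('fs');
var streamifier = require('streamifier');

s3.getObject(params, function(err, data) {
    if (err) {
        return console.log(err);
    }
    // data.Body is a Buffer here; wrap it in a ReadableStream and pipe it to disk
    streamifier.createReadStream(data.Body).pipe(fs.createWriteStream('/path/to/file.jpg'));
});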
I was looking for something where I could validate errors before doing anything else; in Amazon's example, that translates to "create the write stream only if there were no errors".
s3.getObject(params, function(err, data) {
    if (err) {
        console.log(err);
        return;
    }

    // 'name' is the local destination path for the downloaded object
    var file = require('fs').createWriteStream(name);
    var read = AWS.util.buffer.toStream(data.Body);

    read.pipe(file);
    read.on('data', function(chunk) {
        console.log('got %d bytes of data', chunk.length);
    });
});