In a Node.js project I am attempting to get data back from S3. When I use getSignedUrl, everything works:
aws.getSignedUrl('getObject', params, function(err, url){
  console.log(url);
});
My params are:
var params = {
  Bucket: "test-aws-imagery",
  Key: "TILES/Level4/A3_B3_C2/A5_B67_C59_Tiles.par"
};
If I take the URL output to the console and paste it in a web browser, it downloads the file I need.
However, if I try to use getObject, I get all sorts of odd behavior. I believe I am just using it incorrectly. This is what I've tried:
aws.getObject(params, function(err, data){
  console.log(data);
  console.log(err);
});
Outputs:
{ AcceptRanges: 'bytes',
  LastModified: 'Wed, 06 Apr 2016 20:04:02 GMT',
  ContentLength: '1602862',
  ETag: '"9826l1e5725fbd52l88ge3f5v0c123a4"',
  ContentType: 'application/octet-stream',
  Metadata: {},
  Body: <Buffer 01 00 00 00 ... > }
null
So it appears that this is working properly. However, when I put a breakpoint on one of the console.log calls, my IDE (NetBeans) throws an error and refuses to show the value of data. While this could just be the IDE, I decided to try other ways to use getObject.
aws.getObject(params)
  .on('httpData', function(chunk){ console.log(chunk); })
  .on('httpDone', function(data){ console.log(data); });
This does not output anything. Putting a breakpoint in shows that the code never reaches either of the console.log calls. I also tried:
aws.getObject(params).on('success', function(data){ console.log(data); });
However, this also does not output anything, and placing a breakpoint shows that the console.log is never reached.
What am I doing wrong?
From the AWS documentation for getObject: "Retrieves objects from Amazon S3. To use GET, you must have READ access to the object. If you grant READ access to the anonymous user, you can return the object without using an authorization header."
When doing a getObject() from the S3 API, per the docs the contents of your file are located in the Body property, which you can see from your sample output. You should have code that looks something like the following:
const aws = require('aws-sdk');
const s3 = new aws.S3(); // Pass in opts to S3 if necessary

var getParams = {
  Bucket: 'abc', // your bucket name
  Key: 'abc.txt' // path to the object you're looking for
};

s3.getObject(getParams, function(err, data) {
  // Handle any error and exit
  if (err)
    return err;

  // No error happened
  // Convert Body from a Buffer to a String
  let objectData = data.Body.toString('utf-8'); // Use the encoding necessary
});
You may not need to create a new Buffer from the data.Body object, but if you do, you can use the sample above to achieve that.
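For instance, since data.Body is already a Buffer in the v2 SDK, binary objects like the .par tile in the question can be written straight to disk with no conversion. A minimal sketch, reusing the bucket and key from the question (the output file name is illustrative):

const fs = require('fs');
const aws = require('aws-sdk');
const s3 = new aws.S3();

var getParams = {
  Bucket: 'test-aws-imagery', // bucket name from the question
  Key: 'TILES/Level4/A3_B3_C2/A5_B67_C59_Tiles.par'
};

s3.getObject(getParams, function(err, data) {
  if (err) return console.error(err);

  // data.Body is a Buffer, so it can be persisted as-is; no
  // string conversion is needed (or wanted) for binary data
  fs.writeFile('A5_B67_C59_Tiles.par', data.Body, function(err) {
    if (err) return console.error(err);
    console.log('Wrote ' + data.ContentLength + ' bytes');
  });
});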
Since I wrote this answer in 2016, Amazon has released a new JavaScript SDK, @aws-sdk/client-s3. This new version improves on the original getObject() by always returning a promise instead of opting in by chaining .promise() to getObject(). In addition, response.Body is no longer a Buffer but one of Readable | ReadableStream | Blob. This changes the handling of response.Body a bit. This should be more performant, since we can stream the data returned instead of holding all of the contents in memory, with the trade-off that it is a bit more verbose to implement.
In the below example, the response.Body data will be streamed into an array and then returned as a string. This is the equivalent of my original answer. Alternatively, the response.Body could be piped via stream.Readable.pipe() to an HTTP response, a file, or any other kind of stream.Writable for further usage; this is the more performant approach when getting large objects.
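Before the full array-based example below, here is a minimal sketch of that piping approach, assuming a Node.js runtime where response.Body is a stream.Readable (the bucket, key, and file path arguments are illustrative):

const { GetObjectCommand, S3Client } = require('@aws-sdk/client-s3')
const { createWriteStream } = require('fs')
const { pipeline } = require('stream/promises')

const client = new S3Client() // Pass in opts to S3 if necessary

async function getObjectToFile (Bucket, Key, filePath) {
  const response = await client.send(new GetObjectCommand({ Bucket, Key }))

  // pipeline() moves the body to disk chunk by chunk and cleans up
  // both streams on error, so the object is never fully buffered in memory
  await pipeline(response.Body, createWriteStream(filePath))
}

Note that stream/promises requires Node.js 15+; on older runtimes the callback form of stream.pipeline() works the same way.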
If you wanted to use a Buffer, like the original getObject() response, this can be done by wrapping responseDataChunks in Buffer.concat() instead of using Array#join(); this is useful when interacting with binary data (see the sketch after the main example below). Note that, since Array#join() returns a string, each Buffer instance in responseDataChunks will have Buffer.toString() called on it implicitly, using the default encoding of utf8.
const { GetObjectCommand, S3Client } = require('@aws-sdk/client-s3')
const client = new S3Client() // Pass in opts to S3 if necessary

function getObject (Bucket, Key) {
  return new Promise(async (resolve, reject) => {
    const getObjectCommand = new GetObjectCommand({ Bucket, Key })

    try {
      const response = await client.send(getObjectCommand)

      // Store all of the data chunks returned from the response data stream
      // into an array, then use Array#join() to return the contents as a String
      let responseDataChunks = []

      // Handle an error while streaming the response body
      response.Body.once('error', err => reject(err))

      // Attach a 'data' listener to add the chunks of data to our array
      // Each chunk is a Buffer instance
      response.Body.on('data', chunk => responseDataChunks.push(chunk))

      // Once the stream has no more data, join the chunks into a string and return it
      response.Body.once('end', () => resolve(responseDataChunks.join('')))
    } catch (err) {
      // Handle the error or throw
      return reject(err)
    }
  })
}
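And, for completeness, a sketch of the Buffer.concat() variant described above; it assumes the same client and imports as the example just shown, and only the way the chunks are combined changes:

// Same imports/client as above; resolves with a Buffer instead of a string
function getObjectBuffer (Bucket, Key) {
  return new Promise((resolve, reject) => {
    client.send(new GetObjectCommand({ Bucket, Key })).then(response => {
      const responseDataChunks = []
      response.Body.once('error', err => reject(err))
      response.Body.on('data', chunk => responseDataChunks.push(chunk))
      // Buffer.concat() keeps the raw bytes intact instead of implicitly
      // decoding each chunk as utf8 the way Array#join() would
      response.Body.once('end', () => resolve(Buffer.concat(responseDataChunks)))
    }).catch(reject)
  })
}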
@aws-sdk/client-s3 documentation links:
GetObjectCommand
GetObjectCommandInput
GetObjectCommandOutput