Update: For future reference, Amazon has now updated the documentation from what was there at the time of asking. As per @Loren Segal's comment below:

> We've corrected the docs in the latest preview release to document this parameter properly. Sorry about the mixup!
I'm trying out the developer preview of the AWS SDK for Node.js and want to upload a zipped tarball to S3 using putObject.

According to the documentation, the Body parameter should be...
Body - (Base64 Encoded Data)
...therefore, I'm trying out the following code...
```javascript
var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: 'key', secretAccessKey: 'secret' });

// Read in the file, convert it to base64, store to S3
fs.readFile('myarchive.tgz', function (err, data) {
  if (err) { throw err; }

  var base64data = new Buffer(data, 'binary').toString('base64');

  var s3 = new AWS.S3();
  s3.client.putObject({
    Bucket: 'mybucketname',
    Key: 'myarchive.tgz',
    Body: base64data
  }).done(function (resp) {
    console.log('Successfully uploaded package.');
  });
});
```
Whilst I can then see the file in S3, if I download it and attempt to decompress it, I get an error that the file is corrupted. It therefore seems that my method for producing 'base64 encoded data' is off.
Can someone please help me to upload a binary file using putObject?
You don't need to convert the buffer to a base64 string. Just set Body to data (the raw Buffer that fs.readFile gives you) and it will work.