I am trying to save a JSON file from AWS Lambda to S3.
(To be more precise: I want to create a new file 'supertest.json'
containing the 'data' inside the S3 bucket 'gpiocontroll-XYZ'.)
The Lambda function looks like this:
'use strict'

const aws = require('aws-sdk');
const s3 = new aws.S3();
//const fs = require('fs');

function saveJSONtoS3(data) {
    console.log('SAVEJSON', data);
    var params = {
        Bucket: 'gpiocontroll-XYZ', // your bucket name
        Key: 'test.txt',            // path to the object you're looking for
        Body: data
    };
    s3.putObject(params, function(err, data) {
        // Handle any error and exit
        if (err) {
            console.log('ERROR', err);
        } else {
            console.log('UPLOADED SUCCESS');
        }
        console.log('INSIDE FUNCTION');
    });
    console.log('END');
}

module.exports = {
    saveJSONtoS3: saveJSONtoS3
};
The log on Lambda looks like:
2017-12-27T20:04:29.382Z 255d436d-eb41-11e7-b237-1190c4f33d2d SAVEJSON {"table":[{"pin":"1","state":"aus"}]}
2017-12-27T20:04:29.402Z 255d436d-eb41-11e7-b237-1190c4f33d2d END
END RequestId: 255d436d-eb41-11e7-b237-1190c4f33d2d
REPORT RequestId: 255d436d-eb41-11e7-b237-1190c4f33d2d Duration: 362.29 ms Billed Duration: 400 ms Memory Size: 128 MB Max Memory Used: 43 MB
So it seems like everything is fine, but the s3.putObject callback just never fires. Lambda and S3 are both in the same region. The S3 bucket is public and has an IAM user. Do I need to log in inside the Lambda function somehow?
Thanks a lot!
As @dashmug said, your example is not a Lambda function. You must have exports.handler in your file somewhere, unless a different handler is specified in the function configuration. All Lambda functions start by calling exports.handler with ( event, context, callback ) parameters. These include the data of the event or action, some additional context, and a success/fail callback.
Here is what you are looking for:
Update: changed the S3.putObject Promise-wrapped function to S3.putObject().promise() per @dashmug's recommendation. Requires the AWS SDK for JavaScript v2.3.0 (March 31, 2016) or later.
'use strict';

const
    AWS = require( 'aws-sdk' ),
    S3 = new AWS.S3();

exports.handler = ( event, context, callback ) => {
    console.log( `FUNCTION STARTED: ${new Date()}` );

    S3.putObject( {
        Bucket: 'gpiocontroll-XYZ',
        Key: 'test.txt',
        Body: 'stuff'
    } )
        .promise()
        .then( () => console.log( 'UPLOAD SUCCESS' ) )
        .then( () => callback( null, 'MISSION SUCCESS' ) )
        .catch( e => {
            console.error( 'ERROR', e );
            callback( e );
        } );
};
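On newer runtimes (Node.js 8.10 and later) you can also write the handler with async/await instead of a callback. A sketch under that assumption, with the S3 client injected as a parameter so the flow can be exercised without live AWS credentials (the makeHandler factory is my own naming, not part of the SDK):

```javascript
'use strict';

// Hypothetical async/await variant. The S3 client is passed in, so the
// upload flow can be tried against a stub instead of a real bucket.
const makeHandler = ( s3 ) => async ( event ) => {
    await s3.putObject( {
        Bucket: 'gpiocontroll-XYZ',
        Key: 'supertest.json',
        Body: JSON.stringify( event ),
        ContentType: 'application/json'
    } ).promise();                // await replaces .then()/callback chaining
    return 'MISSION SUCCESS';     // the resolved value becomes the Lambda result
};

// In a real deployment you would wire it up like:
// exports.handler = makeHandler( new AWS.S3() );
module.exports = { makeHandler };
```

If putObject rejects, the handler's returned promise rejects too, and Lambda reports the invocation as failed, so no explicit .catch() is needed.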
Note: you must give the Lambda function IAM permissions for the S3 bucket you are trying to access. In the case above, your IAM role's policy should include a statement like this:
{
    "Effect": "Allow",
    "Action": [ "s3:PutObject" ],
    "Resource": [
        "arn:aws:s3:::gpiocontroll-XYZ/*"
    ]
}