I have uploaded a zip file to an S3 bucket. I need to trigger my Lambda function below immediately once the zip file has been uploaded. Kindly help me with how to proceed.
var aws = require('aws-sdk');
var s3 = new aws.S3();

exports.handler = function (event, context) {
    MyLambdaFunction();
};

function MyLambdaFunction() {
    var bucketName = "TestBucket1";
    var fileKey = "test.js";
    var params = { Bucket: bucketName, Key: fileKey };
    s3.getObject(params, function (err, data) {
        if (err)
            console.log(err, err.stack);
        else {
            console.log(data);
        }
    });
}
Amazon S3 can send an event to a Lambda function when an object is created or deleted. You configure notification settings on the bucket, and grant Amazon S3 permission to invoke the function in the function's resource-based permissions policy.
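If you prefer to grant that permission with the AWS SDK for JavaScript instead of the console, it looks roughly like this (a minimal sketch using aws-sdk v2; the function name, statement id and region are placeholders, not values from the question):

var aws = require('aws-sdk');
var lambda = new aws.Lambda({ region: 'us-east-1' }); // assumed region

// Allow the S3 service principal to invoke the function for events from this bucket.
lambda.addPermission({
    FunctionName: 'my-zip-processor',        // hypothetical function name
    StatementId: 's3-invoke-permission',     // any unique statement id
    Action: 'lambda:InvokeFunction',
    Principal: 's3.amazonaws.com',
    SourceArn: 'arn:aws:s3:::TestBucket1',   // bucket from the question
}, function (err, data) {
    if (err) console.log(err, err.stack);
    else console.log('Permission added:', data.Statement);
});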
Alternatively, if you need to transform the object's data when it is read: create a Lambda function for your use case, create an S3 Object Lambda Access Point from the S3 Management Console, select the Lambda function you created, and provide a supporting S3 Access Point to give S3 Object Lambda access to the original object.
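For reference, an Object Lambda handler looks roughly like this (a minimal sketch, assuming aws-sdk v2 and Node's built-in https module; the upper-casing transform is just an example, not anything from the question):

const aws = require('aws-sdk');
const https = require('https');
const s3 = new aws.S3();

// Download the original object through the presigned URL S3 supplies in the event.
function fetchOriginal(url) {
    return new Promise((resolve, reject) => {
        https.get(url, (res) => {
            const chunks = [];
            res.on('data', (c) => chunks.push(c));
            res.on('end', () => resolve(Buffer.concat(chunks)));
        }).on('error', reject);
    });
}

exports.handler = async (event) => {
    // S3 Object Lambda passes a presigned URL for the original object,
    // plus a route and token used to return the transformed result.
    const { inputS3Url, outputRoute, outputToken } = event.getObjectContext;

    const original = await fetchOriginal(inputS3Url);

    // Example transformation: upper-case the content (replace with your own logic).
    const transformed = original.toString('utf8').toUpperCase();

    // Send the transformed object back to the caller.
    await s3.writeGetObjectResponse({
        RequestRoute: outputRoute,
        RequestToken: outputToken,
        Body: transformed,
    }).promise();

    return { statusCode: 200 };
};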
There are some steps you need to follow to set this up correctly.
step 1: First create your Lambda function: select the runtime and choose a blank function or any blueprint from the list.
step 2: Select the blank square and choose S3 from the list of services.
step 3: Select the bucket you want to trigger from and choose the event type. In your case it should be Object Created (All).
step 4: Enter a prefix, in case you have folders inside the S3 bucket and only want to trigger on uploads to that folder.
step 5: Enter a suffix to trigger only for a specific suffix, e.g. '.jpg' (for the zip in the question, '.zip'); an equivalent SDK configuration is sketched after these steps.
step 6: Tick the enable trigger checkbox and choose Next.
step 7: Now give the function a name and description. If you want to upload the code or type it in the editor right there, change the code entry type.
step 8: In Handler, choose index.handler. This is the function it will call once the file is uploaded: index is the file name and handler is the function name.
step 9: Choose create a custom role. It directs you to a new page; leave all the fields as they are, don't change anything, and choose Allow.
step 10: Now come back to the old tab, select the role --> choose from existing role, and select the newly created role name.
step 11: Select Next, review all the selected options and click Create Function.
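The notification described in steps 3-5 can also be applied programmatically; here is a rough sketch with aws-sdk v2 (the function ARN, prefix and suffix below are placeholders):

var aws = require('aws-sdk');
var s3 = new aws.S3();

// Trigger on object creation, limited to an optional prefix (folder) and suffix.
s3.putBucketNotificationConfiguration({
    Bucket: 'TestBucket1',
    NotificationConfiguration: {
        LambdaFunctionConfigurations: [{
            LambdaFunctionArn: 'arn:aws:lambda:us-east-1:123456789012:function:my-zip-processor', // hypothetical ARN
            Events: ['s3:ObjectCreated:*'],
            Filter: {
                Key: {
                    FilterRules: [
                        { Name: 'prefix', Value: 'uploads/' }, // step 4: optional folder prefix
                        { Name: 'suffix', Value: '.zip' }      // step 5: optional suffix
                    ]
                }
            }
        }]
    }
}, function (err, data) {
    if (err) console.log(err, err.stack);
    else console.log('Notification configured');
});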
Once the function has been created successfully, go to the Triggers tab and you can see the S3 bucket configured for triggering.
Now start writing the code in the code editor, or upload it from your local machine to the Lambda function in the Code tab.
Simple S3 code to read a file is below.
var aws = require('aws-sdk');
// Inside Lambda the execution role normally supplies credentials; this mirrors the original env-based setup.
var s3 = new aws.S3({ apiVersion: '2006-03-01', accessKeyId: process.env.ACCESS_KEY, secretAccessKey: process.env.SECRET_KEY, region: process.env.LAMBDA_REGION });

exports.handler = function (event, context, callback) {
    //console.log('Received event:', JSON.stringify(event, null, 2));
    // Get the object from the event and show its content type
    const bucket = event.Records[0].s3.bucket.name;
    // S3 URL-encodes the key in the event, so decode it (spaces arrive as '+')
    const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
    const params = {
        Bucket: bucket,
        Key: key,
    };
    s3.getObject(params, function (err, data) {
        if (err) {
            console.log('ERROR ' + err);
            callback(err);
        } else {
            // data.Body has the content of the uploaded file
            console.log('CONTENT TYPE:', data.ContentType);
        }
    });
};
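Since the question is about a zip file: data.Body returned by getObject is a Buffer, so you can hand it to a zip library. A small sketch using the third-party adm-zip npm package (an assumption here, not part of the AWS SDK; any zip library that accepts a Buffer will do):

const AdmZip = require('adm-zip');

// List the entries of the uploaded zip; call this with data.Body from s3.getObject.
function listZipEntries(body) {
    const zip = new AdmZip(body);
    zip.getEntries().forEach((entry) => {
        console.log(entry.entryName);
    });
}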
Hope this helps!!!
The best option I see is to have a Lambda function ready to run automatically every time a file is placed in the S3 bucket. When the Lambda function is invoked, an event with information about the created file is passed to it.
Here is an example of how to configure the trigger: add the S3 bucket as a trigger on the Lambda function in the console, as described in the steps above. Here's example Node.js Lambda code to do this:
exports.handler = (event, context, callback) => {
    // The S3 event carries a Records array; the key of the newly created object is inside it.
    var lastCreatedFile = event.Records[0].s3.object.key;
    console.log(lastCreatedFile);
};
I hope it helped you!