Trigger S3 create event

I use S3 create events to trigger AWS Lambdas. If my processing fails, I want to do some magic and then trigger the "event" again to start my processing once more. So far the only option I see is re-uploading the file.

Can I trigger the event "again" without re-uploading the file?

I use Python and boto3.

asked Apr 20 '16 by lony

3 Answers

I came across a similar situation today where I needed to re-trigger a Lambda function after the file was already in S3. A co-worker of mine came up with the following, which worked for us:

  1. Install the AWS CLI
  2. Invoke the function manually with a hand-built S3 event payload, for example:

    aws lambda invoke \
        --function-name <lambda function name> \
        --payload '{
            "Records":[{
                "s3":{
                    "bucket":{
                        "name":"<bucket name>"
                    },
                    "object":{
                        "key": "<key name>"
                    }
                }
            }]
        }' outfile
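
With AWS CLI v2 you may also need to pass --cli-binary-format raw-in-base64-out, since v2 expects the --payload value to be base64-encoded by default.

Since the question mentions Python and boto3, here is a minimal sketch of the same idea using boto3's Lambda client; the function name, bucket, and key below are placeholders:

    import json
    import boto3

    lambda_client = boto3.client("lambda")

    # Hand-built minimal S3 event; include only the fields your handler reads.
    payload = {
        "Records": [{
            "s3": {
                "bucket": {"name": "my-bucket"},          # placeholder bucket
                "object": {"key": "path/to/object.csv"},  # placeholder key
            }
        }]
    }

    response = lambda_client.invoke(
        FunctionName="my-s3-processor",  # placeholder function name
        InvocationType="Event",          # asynchronous, like a real S3 trigger
        Payload=json.dumps(payload),
    )
    print(response["StatusCode"])  # 202 for successful async invocations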
    
answered by Scott Woodall

It is not possible to have an S3 event trigger again without uploading the file again. However, if your processing fails and you are using Lambda, it will automatically be retried three times, per the FAQ:

For Amazon S3 bucket notifications and custom events, AWS Lambda will attempt execution of your function three times in the event of an error condition in your code or if you exceed a service or resource limit.

If your processing is failing and you want more control over retries, you could instead have the S3 events delivered to SQS. Your application then reads messages off the queue; if processing fails or dies, the visibility timeout is eventually reached and the SQS message becomes visible again for another attempt. This way you can retry indefinitely and also control the visibility timeout between successive retries. For example:
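
A hedged sketch of that consumer pattern with boto3; the queue URL and the process() helper are placeholders, and it assumes the bucket's notification configuration sends s3:ObjectCreated:* events to this queue:

    import json
    import boto3

    sqs = boto3.client("sqs")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/s3-events"  # placeholder

    def process(bucket, key):
        ...  # placeholder: your actual processing; raise on failure

    while True:
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=10,
            WaitTimeSeconds=20,  # long polling
        )
        for msg in resp.get("Messages", []):
            try:
                body = json.loads(msg["Body"])
                for record in body.get("Records", []):
                    process(record["s3"]["bucket"]["name"],
                            record["s3"]["object"]["key"])
            except Exception:
                # Do not delete: once the visibility timeout expires the
                # message becomes visible again and is retried.
                continue
            # Delete only after successful processing.
            sqs.delete_message(QueueUrl=QUEUE_URL,
                               ReceiptHandle=msg["ReceiptHandle"])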

If you are using Lambda and want to combine it with SQS, you can still do this by scheduling a Lambda function to run every 5 minutes and having it read messages off the queue. Combined with the 5-minute limit on a Lambda function's run time, this lets you nearly continuously consume messages from an SQS queue.
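
A minimal sketch of such a scheduled handler, reusing the placeholder queue URL and stubbed process() from above; it polls until the function is about to run out of time:

    import json
    import boto3

    sqs = boto3.client("sqs")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/s3-events"  # placeholder

    def process(bucket, key):
        ...  # placeholder: your actual processing

    def handler(event, context):
        # Keep polling until ~30 seconds of run time remain; the next
        # scheduled invocation then picks up where this one left off.
        while context.get_remaining_time_in_millis() > 30000:
            resp = sqs.receive_message(QueueUrl=QUEUE_URL,
                                       MaxNumberOfMessages=10,
                                       WaitTimeSeconds=10)
            for msg in resp.get("Messages", []):
                try:
                    body = json.loads(msg["Body"])
                    for record in body.get("Records", []):
                        process(record["s3"]["bucket"]["name"],
                                record["s3"]["object"]["key"])
                except Exception:
                    continue  # left on the queue; retried after the visibility timeout
                sqs.delete_message(QueueUrl=QUEUE_URL,
                                   ReceiptHandle=msg["ReceiptHandle"])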

answered by JaredHatfield


One method not mentioned here is that you can "touch" the metadata of the S3 object, which triggers a new event. This way you can get the event message again without having to modify or fiddle with the original object data.

Note: the data in the metadata fields does not have to change to trigger the event.

Some strategies here:

  • Use a common metadata tag that exists just to trigger the event
  • Get the object's metadata dictionary first, then write it back unchanged (see the sketch below)
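
A sketch of the second strategy with boto3: copying the object onto itself with MetadataDirective="REPLACE" rewrites the metadata without touching the data, which fires an s3:ObjectCreated:Copy event, so the bucket notification must be configured for s3:ObjectCreated:* (or :Copy). Bucket and key are placeholders:

    import boto3

    s3 = boto3.client("s3")
    bucket, key = "my-bucket", "path/to/object.csv"  # placeholders

    # Fetch the current user metadata so we can write it back unchanged.
    head = s3.head_object(Bucket=bucket, Key=key)

    # Copy the object onto itself. MetadataDirective="REPLACE" is what
    # makes a self-copy legal, even if the metadata content is identical.
    s3.copy_object(
        Bucket=bucket,
        Key=key,
        CopySource={"Bucket": bucket, "Key": key},
        Metadata=head["Metadata"],
        MetadataDirective="REPLACE",
    )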
answered by James Roland