
SQS dead letter queue not triggered on AWS Lambda invocation errors

I have an AWS Lambda function which subscribes to a DynamoDB stream and is configured with an SQS dead letter queue (DLQ). I can see that the correct queue is configured in the Management Console. I also took care to grant my function the sqs:SendMessage permission on the DLQ.

The subscription works, but it still "hangs" on invocation errors as if no DLQ were configured. That is, if a message leads to an unhandled exception, the function keeps retrying that message until it's dropped from the stream. I can see the number of invocation errors rising, and no DLQ errors are shown in the function's CloudWatch dashboard. The SQS queue remains empty.

What I want is for failed messages to be forwarded to my DLQ so the subscription can continue with the next message. Any ideas?

Edit

As Jonathan Seed said below, DLQs currently don't work with stream-based subscriptions. AWS Support confirmed that they're working on implementing this, though.

asked Mar 24 '17 by EagleBeak

People also ask

Can a dead-letter queue trigger Lambda?

You can use an AWS Lambda function to process messages in an Amazon SQS queue. Lambda polls the queue and invokes your Lambda function synchronously with an event that contains queue messages. You can specify another queue to act as a dead-letter queue for messages that your Lambda function can't process.

What happens when Lambda fails to process SQS message?

If a Lambda function throws an error, the Lambda service continues to process the failed message until:

  • The message is processed without any error from the function, and the service deletes the message from the queue.
  • The message retention period is reached and SQS deletes the message from the queue.


2 Answers

I believe this is because DynamoDB streams are stream-based event sources. The Lambda documentation states that for stream-based event sources, "if a Lambda function fails, AWS Lambda attempts to process the erring batch of records until the time the data expires."

From my understanding, the Lambda function will retry until the event is either processed successfully or expires and disappears from the stream; the event is never "discarded" by the Lambda function, as it is with non-stream-based event sources.

You may have to implement your own failure handling as part of your main Lambda function if you wish to discard certain events: post the failed event manually to a queue or topic, then return successfully.
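The approach above can be sketched as a handler that catches per-record failures and forwards them to SQS itself, so the batch still succeeds and the stream isn't blocked. This is a minimal sketch; the queue URL is hypothetical, and the optional `sqs` parameter is only there so the client can be injected.

```python
import json

# Hypothetical DLQ URL; replace with your own queue.
DLQ_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-dlq"

def process_record(record):
    # Your actual business logic; raises on records it cannot handle.
    if record.get("poison"):
        raise ValueError("cannot process record")

def handler(event, context, sqs=None):
    if sqs is None:
        import boto3  # imported lazily so the module loads without AWS deps
        sqs = boto3.client("sqs")
    failed = 0
    for record in event.get("Records", []):
        try:
            process_record(record)
        except Exception:
            # Forward the failed record to the DLQ ourselves and keep going,
            # so the invocation returns successfully and the stream advances.
            sqs.send_message(QueueUrl=DLQ_URL, MessageBody=json.dumps(record))
            failed += 1
    return {"failed": failed}
```

Note that returning successfully is what prevents Lambda from retrying the whole batch.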

answered Oct 21 '22 by Jonathan Seed


  • Using DynamoDB streams to trigger Lambda means you are using synchronous invocation. However, a DLQ is only available for asynchronous invocations.
  • The good news is that in November 2019, AWS published new error handling mechanisms for Kinesis and DynamoDB event sources.

With this feature, you can configure a destination on failure. This destination can be an SNS topic, SQS queue, another lambda function, or an EventBridge event bus.

To add this through the console UI:

  1. Go to the lambda function
  2. Click on the Add Destination button
  3. Select Stream invocation
  4. Select on failure condition
  5. Select SQS queue as the destination and point it to the SQS that you want to use like a DLQ.
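The same configuration can also be applied programmatically. Below is a hedged sketch using boto3's update_event_source_mapping; the queue ARN and mapping UUID are hypothetical placeholders, and the actual AWS call is left commented out.

```python
# Sketch: enabling an on-failure destination on an existing event source mapping.
def build_destination_config(queue_arn):
    # Shape expected by update_event_source_mapping's DestinationConfig parameter
    return {"OnFailure": {"Destination": queue_arn}}

# Uncomment to apply against a real mapping:
# import boto3
# boto3.client("lambda").update_event_source_mapping(
#     UUID="00000000-0000-0000-0000-000000000000",  # hypothetical mapping UUID
#     DestinationConfig=build_destination_config(
#         "arn:aws:sqs:us-east-1:123456789012:my-dlq"),  # hypothetical ARN
# )
```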

To add it through CloudFormation, follow this documentation.
Here is a basic example of the event source mapping you need to attach to your Lambda function:

LambdaTrigger:
  Type: AWS::Lambda::EventSourceMapping
  Properties:
    FunctionName: !GetAtt Lambda.Arn        # your function's logical ID
    EventSourceArn: !GetAtt TableName.StreamArn  # your table's logical ID
    StartingPosition: LATEST                # required for stream event sources
    DestinationConfig:
      OnFailure:
        Destination: !GetAtt DLQ.Arn        # your queue's logical ID
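One caveat worth knowing: with this mechanism, the destination queue receives an invocation record describing the failed batch (stream position, shard, batch size), not the failed records themselves. The sketch below parses such a record; the field names assume the DDBStreamBatchInfo payload shape AWS documents for DynamoDB stream failures, so treat it as an assumption.

```python
import json

def describe_failure(message_body):
    """Summarize an on-failure destination record from a DynamoDB stream.

    Assumes AWS's documented payload shape (DDBStreamBatchInfo); you can use
    the sequence numbers to re-read the failed records from the stream.
    """
    payload = json.loads(message_body)
    info = payload.get("DDBStreamBatchInfo", {})
    return {
        "shard": info.get("shardId"),
        "start_sequence": info.get("startSequenceNumber"),
        "batch_size": info.get("batchSize"),
    }
```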
answered Oct 21 '22 by PodGen4