 

AWS Lambda to run in background even after sending response to API Gateway

I have searched all over the net but couldn't find a solution for how to make this work. I need help.

My requirement is: when I trigger an AWS Lambda function written in Node.js that uses the aws-serverless-express module, it must send its response back to API Gateway quickly, but it should not exit; it should keep running in the background so that we can see its CloudWatch logs. It must be asynchronous.

The code snippet is:

    app.get('/check', function (req, res) {
      method.invoke(req)               // kick off the long-running work
      res.status(200).send('success')  // respond to API Gateway immediately
    })

I tried it like this, but the Lambda function stops as soon as it returns the response to API Gateway; it doesn't even run the method.invoke() call in the background.

Please correct me if I am misunderstanding or doing anything wrong. I checked this link: Invoke AWS Lambda and return response to API Gateway asynchronously

Is creating two Lambda functions the only way to solve this problem?

asked Jun 21 '17 by learner

People also ask

How do you invoke Lambda function asynchronously using API gateway?

You can invoke a Lambda function asynchronously via API Gateway only if the integration is a non-proxy integration. By default, HTTP APIs support only proxy integrations for Lambda and HTTP endpoints, so it is not possible to set the X-Amz-Invocation-Type header in the API Gateway integration config.

Does API gateway invoke Lambda synchronously?

Each request method (GET, PUT, POST) can be matched to a different Lambda function, or a single function can serve all requests from an endpoint (or a group of endpoints). API Gateway will invoke Lambda synchronously. Beware that, even though Lambda's timeout limit is 15 minutes, API Gateway is limited to 29 seconds.

How AWS Lambda works behind the scenes?

Lambda creates the execution environment (worker) on a fleet of EC2 instances. These workers are bare-metal Nitro instances which are launched in a separate, inaccessible AWS account. These workers run hardware-virtualized MicroVMs (micro virtual machines) created by Firecracker, which is built on Linux's Kernel-based Virtual Machine (KVM).


3 Answers

You can achieve this by using AWS Step Functions connected to API Gateway, with parallel execution of two branches (two Lambda functions), where one branch returns a response to API Gateway and the other executes asynchronously.
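The answer doesn't include a state machine definition, but a rough, hypothetical sketch of the parallel-branches idea (state names, Lambda ARNs, and payload handling are all assumptions, not taken from the answer) might look like this, written as an Amazon States Language definition held in a Node.js object:

    // Hypothetical sketch: a Parallel state with two branches. One branch
    // produces the payload handed back towards API Gateway, the other runs
    // the long-running background work. The ARNs below are placeholders.
    const definition = {
      StartAt: 'Fork',
      States: {
        Fork: {
          Type: 'Parallel',
          End: true,
          Branches: [
            {
              StartAt: 'Respond',
              States: {
                Respond: {
                  Type: 'Task',
                  Resource: 'arn:aws:lambda:us-east-1:123456789012:function:respond', // placeholder
                  End: true
                }
              }
            },
            {
              StartAt: 'BackgroundWork',
              States: {
                BackgroundWork: {
                  Type: 'Task',
                  Resource: 'arn:aws:lambda:us-east-1:123456789012:function:background-work', // placeholder
                  End: true
                }
              }
            }
          ]
        }
      }
    };

API Gateway would then start an execution of this state machine rather than invoking a single Lambda function directly.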

answered Oct 19 '22 by Ashan

Besides Step Functions, you could just invoke another Lambda function using the SDK built into the Lambda environment.
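A hedged sketch of that approach, built on the question's Express route (the worker function name background-worker and the payload shape are assumptions), using the aws-sdk that is preinstalled in the Node.js Lambda runtime:

    // Fire-and-forget invocation of a second "worker" Lambda function.
    // InvocationType 'Event' queues the invocation asynchronously, so the
    // callback returns quickly and we can respond to API Gateway right away.
    const AWS = require('aws-sdk');
    const lambda = new AWS.Lambda();

    app.get('/check', function (req, res) {
      lambda.invoke({
        FunctionName: 'background-worker',   // placeholder name
        InvocationType: 'Event',             // asynchronous invoke
        Payload: JSON.stringify({ path: req.path, query: req.query })
      }, function (err) {
        if (err) console.error('failed to queue background work', err);
        res.status(200).send('success');
      });
    });

The calling function's execution role would also need lambda:InvokeFunction permission on the worker function.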

I'm no expert in Express or Node.js, but I would also think there should be a way to send the HTTP response back and still continue code execution.

answered Oct 19 '22 by jackko

I can't find a link to the AWS documentation, but normally it is not possible to continue processing after the Lambda function has returned its response. That's just not how the available runtimes (for the different programming languages) are constructed.

Besides invoking separate asynchronous processes (e.g., other Lambda function invocations, or putting work on a queue) or using AWS Step Functions as mentioned here, there's a third method that I know works: supply a special custom runtime for the AWS Lambda functions that addresses this need.

In addition to the standard runtimes, you can create and specify a custom runtime for your AWS Lambda functions. In the standard runtimes, your handler's response is posted to the Lambda execution context by the runtime, after which no further activity is possible in your handler because the handler is terminated (or at least paused).

So the trick that makes additional processing possible after sending the response is to move the responsibility of posting the response to the Lambda execution context from the bootstrap script to the Lambda function handler itself, and to continue your processing in the handler after the response has already been sent. With such a custom runtime, processing in your Lambda functions is not terminated after the response is sent, because that is simply not how the custom runtime is constructed.
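To make that concrete, here is a minimal, hypothetical sketch of such a bootstrap in Node.js; the next/response endpoint paths follow the documented Lambda Runtime API, but the handler module and its handle(event, respond) interface are assumptions made up for this example:

    // Custom-runtime bootstrap sketch: instead of the bootstrap posting the
    // handler's return value, the handler receives a respond() helper, posts
    // the response itself, and then keeps running its background work.
    const http = require('http');
    const RUNTIME_API = process.env.AWS_LAMBDA_RUNTIME_API;
    const handler = require('./handler'); // placeholder handler module

    function request(method, path, body) {
      return new Promise((resolve, reject) => {
        const req = http.request(`http://${RUNTIME_API}${path}`, { method }, (res) => {
          let data = '';
          res.on('data', (chunk) => (data += chunk));
          res.on('end', () => resolve({ headers: res.headers, body: data }));
        });
        req.on('error', reject);
        if (body) req.write(body);
        req.end();
      });
    }

    async function loop() {
      for (;;) {
        // Poll the Runtime API for the next invocation.
        const next = await request('GET', '/2018-06-01/runtime/invocation/next');
        const requestId = next.headers['lambda-runtime-aws-request-id'];
        const event = JSON.parse(next.body);

        // The handler calls respond() as soon as it has a result for API
        // Gateway, then continues with its background work; we only poll
        // for the next invocation once it is completely done.
        const respond = (payload) =>
          request('POST', `/2018-06-01/runtime/invocation/${requestId}/response`,
                  JSON.stringify(payload));

        await handler.handle(event, respond);
      }
    }

    loop();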

It's not the architecturally cleanest solution, as it blurs the responsibilities between the Lambda execution context and your Lambda function handlers... but it makes it possible to do processing in your Lambda function handlers after the response has been sent.

answered Oct 19 '22 by Jochem Schulenklopper