 

Google Cloud Functions memory limit exceeded

We do a lot of image processing in Google Cloud Functions using NodeJS and the Sharp (libvips) library. Even though we have the memory limit for our functions set to 2 GB, the function occasionally runs out of memory and crashes with the 'Error: memory limit exceeded. Function invocation was interrupted.' message.

Is there a way to catch this exception? I want to return a more polite (json) response so my server knows what the problem was.

asked Feb 13 '19 by Kirk Olson

People also ask

What's the maximum memory size cloud functions can have allocated?

By default, the memory allocated for a function is 256MB or 256 MiB depending on the Cloud Functions product version.

How long can a Google cloud function run?

In Cloud Functions (1st gen), the maximum timeout duration is nine minutes (540 seconds). In Cloud Functions (2nd gen), the maximum timeout duration is 60 minutes (3600 seconds) for HTTP functions and 9 minutes (540 seconds) for event-driven functions.


1 Answer

Application-wide uncaught exceptions in NodeJS apps on Google Cloud Platform need to be reported manually.
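As a sketch of what "reporting manually" can look like (the helper name is mine, not an official API): write a structured entry for the error and register a process-level handler. Note that a hard memory-limit kill happens outside the Node process and cannot be intercepted; this only covers exceptions your own code raises, e.g. a rejected Sharp pipeline.

```javascript
// Sketch: manually report an error as a structured log entry.
// Cloud Logging treats JSON written to stderr as a structured entry,
// so the fields become searchable in the Logs viewer.
function reportError(err) {
  const entry = {
    severity: 'ERROR',
    message: err.message,
    stack: err.stack,
  };
  console.error(JSON.stringify(entry));
  return entry; // returned so callers/tests can inspect it
}

// Catch anything that would otherwise crash the function silently.
process.on('uncaughtException', (err) => {
  reportError(err);
  process.exit(1); // let the platform start a clean instance
});
```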

That being said, more details on the memory limit exceeded error may already be in the logs. You can search for the error message in the Logs viewer of the GCP console, as shown in the docs, or use advanced filters, e.g. to narrow the search by time. The documentation also explains how to write log entries from your Cloud Functions. You can then use the Stackdriver Logging API, for instance, to export the logs and get JSON.
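Writing log entries can be as simple as emitting JSON from the function itself; Cloud Logging parses JSON printed to stdout/stderr and indexes its fields. A minimal sketch (the helper and field names beyond `severity`/`message` are illustrative):

```javascript
// Minimal sketch: emit a structured (JSON) log entry from a Cloud Function.
// JSON written to stdout is picked up by Cloud Logging with its fields
// indexed, so entries can later be filtered or exported as JSON.
function logStructured(severity, message, extra = {}) {
  const entry = { severity, message, ...extra };
  console.log(JSON.stringify(entry));
  return entry; // returned for inspection
}

// Example: record a failed image-processing attempt with some context.
logStructured('ERROR', 'image processing failed', {
  file: 'photo.jpg',
  reason: 'possible memory pressure',
});
```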

I would also suggest using Stackdriver Monitoring to track the memory usage of your Cloud Function.

answered Oct 27 '22 by alextru