We do a lot of image processing in Google Cloud Functions using Node.js and the Sharp (libvips) library. Even though the memory limit for our functions is set to 2GB, a function occasionally runs out of memory and crashes with the message 'Error: memory limit exceeded. Function invocation was interrupted.'
Is there a way to catch this exception? I want to return a more polite (JSON) response so my server knows what the problem was.
By default, the memory allocated for a function is 256MB or 256 MiB, depending on the Cloud Functions product version.
In Cloud Functions (1st gen), the maximum timeout duration is nine minutes (540 seconds). In Cloud Functions (2nd gen), the maximum timeout duration is 60 minutes (3600 seconds) for HTTP functions and 9 minutes (540 seconds) for event-driven functions.
Application-wide uncaught exceptions in Node.js apps on Google Cloud Platform need to be reported manually.
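A minimal sketch of such manual reporting, using only Node.js built-ins: a process-wide `uncaughtException` handler that emits a structured JSON log line (the top-level `severity` field lets Cloud Logging classify the entry). Note the important caveat for your case: a hard out-of-memory kill by the Cloud Functions runtime terminates the process outright, so this handler only covers exceptions your own code throws, not the OOM interruption itself.

```javascript
// Build a JSON log line; a top-level "severity" field lets Cloud Logging
// classify the entry as an ERROR in the Logs viewer.
function formatUncaught(err) {
  return JSON.stringify({
    severity: 'ERROR',
    message: `Uncaught exception: ${err && err.stack ? err.stack : String(err)}`,
  });
}

// Report application-level uncaught exceptions manually. A hard OOM kill
// by the runtime cannot be caught here; the process is simply terminated.
process.on('uncaughtException', (err) => {
  console.error(formatUncaught(err));
  // Process state is unknown after an uncaught exception; exit and let
  // the platform start a fresh instance.
  process.exit(1);
});
```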
That being said, more details on the memory limit exceeded error may already be in the logs. You only need to search for the error message in the Logs viewer in the GCP console, as shown in the docs, or use advanced filters, e.g. to search by time. The documentation also explains how to write log entries from your Cloud Functions. You can then use the Stackdriver Logging API, for instance, to export the logs and get JSON.
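For failures that are catchable in-process (bad input, decode errors from sharp, and so on), you can combine a log entry with the polite JSON response the question asks for. A sketch, assuming an HTTP-triggered function with Express-style `req`/`res`; `resizeImage` is a hypothetical stand-in for the real sharp pipeline:

```javascript
// Hypothetical stand-in for the real sharp/libvips processing step.
async function resizeImage(body) {
  if (!body || !body.image) throw new Error('missing image payload');
  return { width: 100, height: 100 }; // placeholder result
}

// HTTP Cloud Function. Catchable failures become a JSON error response
// plus a structured log entry searchable in the Logs viewer. A hard OOM
// kill terminates the instance before this catch block could ever run.
const processImage = async (req, res) => {
  try {
    const result = await resizeImage(req.body);
    res.status(200).json({ ok: true, result });
  } catch (err) {
    console.error(JSON.stringify({ severity: 'ERROR', message: err.message }));
    res.status(500).json({ ok: false, error: err.message });
  }
};
exports.processImage = processImage;
```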
I would also suggest using Stackdriver Monitoring to track the memory usage of your Cloud Function.
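Alongside external monitoring, the function can also self-check its memory with Node's built-in `process.memoryUsage()` and refuse new work when headroom is low, returning a JSON error instead of risking an OOM kill mid-request. A sketch; the 2GB figure matches the limit from the question, and the 0.85 safety factor is an illustrative assumption:

```javascript
// Soft memory limit: the configured function limit times a safety factor.
const MEMORY_LIMIT_BYTES = 2 * 1024 * 1024 * 1024; // 2GB, per the function config
const SOFT_LIMIT_BYTES = Math.floor(MEMORY_LIMIT_BYTES * 0.85); // illustrative factor

// Returns true while the process resident set size is under the soft limit.
function memoryHeadroomOk() {
  const { rss } = process.memoryUsage();
  return rss < SOFT_LIMIT_BYTES;
}

// Usage inside a handler: bail out politely before starting heavy work.
// if (!memoryHeadroomOk()) {
//   return res.status(503).json({ error: 'low memory headroom, try again later' });
// }
```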