
DEADLINE_EXCEEDED while adding tasks to Google Cloud Tasks

I'm occasionally getting a DEADLINE_EXCEEDED error while adding tasks to Google Cloud Tasks from the Node.js client. It's only 17 tasks that I add in a for-loop. I'm not sure why this happens occasionally, while at other times it works perfectly fine!

Here is the detailed error log:

Error: 4 DEADLINE_EXCEEDED: Deadline exceeded
    at Object.callErrorFromStatus (/Users/gijo/Desktop/workspace/flying-press-server/node_modules/@grpc/grpc-js/build/src/call.js:30:26)
    at Object.onReceiveStatus (/Users/gijo/Desktop/workspace/flying-press-server/node_modules/@grpc/grpc-js/build/src/client.js:175:52)
    at Object.onReceiveStatus (/Users/gijo/Desktop/workspace/flying-press-server/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:341:141)
    at Object.onReceiveStatus (/Users/gijo/Desktop/workspace/flying-press-server/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:304:181)
    at Http2CallStream.outputStatus (/Users/gijo/Desktop/workspace/flying-press-server/node_modules/@grpc/grpc-js/build/src/call-stream.js:115:74)
    at Http2CallStream.maybeOutputStatus (/Users/gijo/Desktop/workspace/flying-press-server/node_modules/@grpc/grpc-js/build/src/call-stream.js:154:22)
    at Http2CallStream.endCall (/Users/gijo/Desktop/workspace/flying-press-server/node_modules/@grpc/grpc-js/build/src/call-stream.js:140:18)
    at Http2CallStream.cancelWithStatus (/Users/gijo/Desktop/workspace/flying-press-server/node_modules/@grpc/grpc-js/build/src/call-stream.js:443:14)
    at Timeout.<anonymous> (/Users/gijo/Desktop/workspace/flying-press-server/node_modules/@grpc/grpc-js/build/src/deadline-filter.js:59:28)
    at listOnTimeout (internal/timers.js:549:17) {
  code: 4,
  details: 'Deadline exceeded',
  metadata: Metadata { internalRepr: Map {}, options: {} }
}

I thought I had hit some quota, but on checking, none of them had been exceeded.


Here is how I've created the queue (upsert):

const client = require('./client');

module.exports = async (queue_name, concurrent) => {
  return client.updateQueue({
    queue: {
      name: client.queuePath(
        process.env.GCP_PROJECT,
        process.env.GCP_QUEUE_LOCATION,
        queue_name,
      ),
      rateLimits: {
        maxConcurrentDispatches: concurrent,
        maxDispatchesPerSecond: concurrent,
        maxBurstSize: 100,
      },
      retryConfig: {
        maxAttempts: 3,
        unlimitedAttempts: false,
        maxRetryDuration: {
          seconds: 3600,
        },
        minBackoff: {
          seconds: 60,
        },
        maxBackoff: {
          seconds: 300,
        },
        maxDoublings: 3,
      },
    },
  });
};

and here is how I'm adding tasks:

const client = require('./client');

module.exports = async (queue_name, url, config) => {
  return client.createTask({
    parent: client.queuePath(
      process.env.GCP_PROJECT,
      process.env.GCP_QUEUE_LOCATION,
      queue_name,
    ),
    task: {
      httpRequest: {
        httpMethod: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        url: `${process.env.OPTIMIZER_URL}/optimize-page`,
        body: Buffer.from(JSON.stringify({ url, config })).toString(
          'base64',
        ),
      },
    },
  });
};
asked Apr 15 '26 by Gijo Varghese

1 Answer

I've hit this exact issue before in Cloud Functions and App Engine when creating tasks. It happened because I wasn't properly awaiting the createTask promise. The request handler returns before the async call completes, which leaves the RPC hanging until it eventually hits its deadline, but only sporadically. This is a common pitfall in "serverless" runtimes like App Engine, Cloud Functions, and Cloud Run, where the instance can be suspended as soon as the response is sent.

To fix this, make sure you await the call to createTask, or otherwise await all of the task promises before returning from the request, e.g. with Promise.all().
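The pattern can be sketched as follows. Here `addTask` stands in for the asker's module that wraps `client.createTask()`; it is stubbed so the example is self-contained:

```javascript
// Stand-in for the module that wraps client.createTask() in the question;
// stubbed here so the sketch runs without Cloud Tasks credentials.
const addTask = async (queueName, url, config) => ({
  name: `${queueName}/tasks/${url}`, // createTask resolves with the created task
});

// Awaiting Promise.all guarantees every createTask RPC has settled before
// the request handler returns. On serverless runtimes (Cloud Functions,
// Cloud Run, App Engine) the instance may be suspended as soon as the
// response is sent, which is what leaves un-awaited RPCs to time out.
async function enqueueAll(queueName, urls, config) {
  const pending = urls.map((url) => addTask(queueName, url, config));
  return Promise.all(pending); // rejects if any single createTask call fails
}
```

If the calls are genuinely slow rather than abandoned, the Node.js client also accepts a second gax CallOptions argument, e.g. `client.createTask(request, { timeout: 30000 })`, to raise the per-call deadline; verify the exact option against the client version you're using.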

answered Apr 18 '26 by Tyler Treat