I have a small script that enqueues tasks for processing. This script makes a whole lot of database queries to get the items that should be enqueued. The issue I'm facing is that the Celery workers begin picking up the tasks as soon as they are enqueued by the script. This is correct, and it is the way Celery is supposed to work, but it often leads to deadlocks between my script and the Celery workers.
Is there a way I could enqueue all my tasks from the script but delay their execution until the script has completed, or for a fixed time delay?
I couldn't find this in the documentation for Celery or django-celery. Is this possible?
Currently, as a quick fix, I've thought of adding all the items to be processed to a list and, once my script has finished executing all its queries, iterating over the list and enqueueing the tasks (see the sketch below). Maybe this would resolve the issue, but when you have thousands of items to enqueue, it might be a bad idea.
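A rough sketch of that workaround, where get_items_to_process and process_item are hypothetical placeholders for your own query and task:

    pending_ids = []

    # Phase 1: run all the database queries first, without enqueueing anything.
    for item in get_items_to_process():  # hypothetical query helper
        pending_ids.append(item.id)

    # Phase 2: enqueue only once every query (and its transaction) is done.
    for item_id in pending_ids:
        process_item.delay(item_id)  # hypothetical Celery task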
revoke cancels the task's execution. If a task is revoked, the workers ignore it and do not execute it. If you don't use persistent revokes, your task can still be executed after a worker restart. revoke has a terminate option, which is False by default.
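A minimal sketch of revoke in use (process_item is a hypothetical task; revoke and its terminate flag are standard Celery APIs):

    result = process_item.delay(42)  # returns an AsyncResult

    # Ask workers to ignore the task if it hasn't started yet.
    result.revoke()

    # terminate=True additionally kills the task if it's already running
    # (supported by the prefork pool).
    result.revoke(terminate=True)

    # For revokes to survive worker restarts, start the worker with a
    # state database, e.g.: celery -A proj worker --statedb=/var/run/celery/state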
A task is just a Python function. You can think of scheduling a task as a time-delayed call to the function. For example, you might ask Celery to call your function task1 with arguments (1, 3, 3) after five minutes. Or you could have your function batchjob called every night at midnight.
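For example (assuming task1 and batchjob are tasks registered on a Celery app named app; apply_async with countdown and the beat_schedule setting are standard Celery features):

    from celery.schedules import crontab

    # Ask Celery to call task1(1, 3, 3) five minutes from now.
    task1.apply_async((1, 3, 3), countdown=5 * 60)

    # Have batchjob run every night at midnight via the celery beat scheduler.
    app.conf.beat_schedule = {
        "nightly-batchjob": {
            "task": "tasks.batchjob",  # hypothetical registered task name
            "schedule": crontab(hour=0, minute=0),
        },
    }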
Celery will stop retrying after 7 failed attempts and raise an exception; the limit is controlled by the task's max_retries option.
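A sketch of a task configured to give up after 7 attempts (do_request is a hypothetical helper and the broker URL is an assumption; max_retries and retry are standard Celery APIs):

    from celery import Celery

    app = Celery("myapp", broker="redis://localhost:6379/0")  # assumed broker URL

    @app.task(bind=True, max_retries=7, default_retry_delay=10)
    def fetch(self, url):
        try:
            return do_request(url)  # hypothetical flaky operation
        except IOError as exc:
            # Re-enqueue the task; once max_retries is exceeded, retry()
            # re-raises exc instead of scheduling another attempt.
            raise self.retry(exc=exc)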
The process of task execution by Celery can be broken down into three steps:
1. Your application sends the task to the task broker.
2. The task is reserved by a worker for execution.
3. The result of the task execution is stored in the result backend.
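A minimal sketch of that flow, assuming a Redis broker and result backend:

    from celery import Celery

    app = Celery(
        "myapp",
        broker="redis://localhost:6379/0",   # task broker
        backend="redis://localhost:6379/1",  # result backend
    )

    @app.task
    def add(x, y):
        return x + y

    result = add.delay(2, 3)  # 1. sent to the broker
                              # 2. reserved and executed by a worker
    print(result.get())       # 3. result fetched from the result backend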
The eta/countdown options let you delay task execution:
https://docs.celeryq.dev/en/stable/userguide/calling.html#eta-and-countdown
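A sketch of both options (process_item and item_id are hypothetical placeholders; apply_async with countdown/eta is the documented API):

    from datetime import datetime, timedelta, timezone

    # Execute no earlier than 60 seconds from now.
    process_item.apply_async(args=(item_id,), countdown=60)

    # Or give an absolute point in time with eta.
    process_item.apply_async(
        args=(item_id,),
        eta=datetime.now(timezone.utc) + timedelta(minutes=5),
    )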