
How do you avoid interrupting running Celery tasks when redeploying?

Tags:

celery

I have an application that includes Celery workers. When I deploy, the running worker processes are killed.

So tasks that have already started will never finish, and they won't be restarted when the deployment finishes.

What's the best way to avoid this issue and have those tasks restart when the deployment finishes?

Is it to use acks_late on all my tasks? Or another way?

@celery.task(acks_late=True)
def my_task():
    pass
asked Sep 12 '25 by John

1 Answer

What we do as part of the deployment process is send a shutdown signal to all Celery workers in the cluster when we are about to deploy new code. Once the signal has been sent, we begin the deployment, and once it finishes successfully we bring up a new set of workers subscribed to a predefined set of queues.
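As a concrete sketch of that first step, the warm shutdown can be broadcast from a deploy script through Celery's remote control interface; the app name and broker URL below are placeholder assumptions, not part of the original setup:

# Hypothetical deploy helper: broadcast a warm shutdown to every worker.
# On a warm shutdown, workers stop consuming from their queues, finish the
# tasks they are currently executing, and then exit.
from celery import Celery

app = Celery("proj", broker="amqp://guest@localhost//")  # placeholder app/broker
app.control.shutdown()

The same broadcast is available from the command line as celery -A proj control shutdown, and sending SIGTERM to a worker's main process triggers the same warm shutdown.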

Worker processes created by the old Celery workers may continue running for the next few hours (even days!), but they will not pick up any newly scheduled tasks, because the moment they received the shutdown signal they unsubscribed from all queues.
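Once the new workers are up, one way to confirm that they are subscribed to the predefined set of queues is the inspect API; the app name and queue names here are assumptions for illustration only:

# Hypothetical post-deploy check: list the queues each responding worker
# is currently consuming from and compare against the expected set.
from celery import Celery

app = Celery("proj", broker="amqp://guest@localhost//")  # placeholder app/broker
expected = {"default", "emails"}  # placeholder: the predefined set of queues

replies = app.control.inspect().active_queues() or {}
for worker, queues in replies.items():
    names = {q["name"] for q in queues}
    status = "OK" if names <= expected else "unexpected: %s" % (names - expected)
    print(worker, sorted(names), status)

inspect() collects replies over the broker, so only workers that are still reachable and consuming will respond.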

As you can see, this process does not involve revoking and re-scheduling tasks; that would be a much more complicated process, since we sometimes have a few thousand tasks running concurrently on hundreds of Celery nodes...

answered Sep 15 '25 by DejanLekic


