
Python Celery task to restart celery worker

In celery, is there a simple way to create a (series of) task(s) that I could use to automagically restart a worker?

The goal is to have my deployment automagically restart all the child celery workers every time it pulls new source from GitHub. I could then send a restartWorkers() task to the management celery instance on that machine, which would kill (actually stopwait) all the celery worker processes on that machine and restart them with the new modules (a rough sketch of that task follows the plan below).

The plan is for each machine to have:

  • Management node [Queues: Management, machine-specific] - Responsible for managing the rest of the workers on the machine, bringing up new nodes and killing old ones as necessary
  • Worker nodes [Queues: git revision specific, worker specific, machine specific] - Actually responsible for doing the work.
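
Roughly what I have in mind for that management task, if shelling out turns out to be the way to go, is sketched below. This is only a sketch: the app module proj, the node name worker1, the queue names, and the assumption that the workers were started with celery multi (so the default pidfiles are in place) are all placeholders.

    # Rough sketch of restartWorkers(); every name here is a placeholder.
    import subprocess

    from celery import Celery

    app = Celery('proj', broker='amqp://')

    @app.task
    def restart_workers():
        # stopwait lets the worker nodes finish their current tasks before exiting.
        # This runs on the management node, so it only stops the worker nodes.
        subprocess.check_call(
            ['celery', 'multi', 'stopwait', 'worker1', '-A', 'proj'])
        # A brand-new process imports the freshly deployed source.
        subprocess.check_call(
            ['celery', 'multi', 'start', 'worker1', '-A', 'proj',
             '-Q', 'git_rev_queue,machine_queue', '--loglevel=INFO'])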

It looks like the code I need is somewhere in dist-packages/celery/bin/celeryd_multi.py, but the source is rather opaque when it comes to starting workers, and I can't tell how it's supposed to work or where it actually starts the nodes. (It looks like shutdown_nodes is the right code to call for killing the processes, and I'm slowly debugging my way through it to figure out what my arguments should be.)

Is there a restart_nodes(self, nodes) function somewhere that I could call, or am I going to be running shell scripts from within Python?

Also, is there a simpler way to reload the source into Python than killing and restarting the processes? If I knew that reloading the module actually worked (experiments say it doesn't: changes to functions don't percolate until I restart the process), I'd just do that instead of the indirection through management nodes.

EDIT: I can now shut workers down, thanks to broadcast (thank you, mihael; if I had more rep, I'd upvote). Is there any way to broadcast a restart? There's pool_restart, but that doesn't kill the node, which means it won't pick up the new source.
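
For reference, this is roughly what I'm doing now (the node names are placeholders, and pool_restart needs CELERYD_POOL_RESTARTS = True in the config):

    # Sketch of the broadcast calls mentioned above; node names are placeholders.
    from celery import Celery

    app = Celery('proj', broker='amqp://')

    # This works: the named workers finish their current tasks and then exit.
    app.control.broadcast('shutdown', destination=['worker1@somehost'])

    # This does NOT solve my problem: it only restarts the pool children,
    # the parent process stays alive, so the new source is never imported.
    app.control.broadcast('pool_restart', arguments={'reload': True})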

I've been looking into some of the behind-the-scenes source in celery.bin.celeryd:WorkerCommand().run(), but there's some weird stuff going on before and after the run call, so I can't just call that function and be done; it crashes. It makes zero sense to call a shell command from a Python script to run another Python script, and I can't believe I'm the first one to want to do this.
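
Right now the least-bad fallback I can see is exactly that: have the management node spawn a completely fresh interpreter per worker node, since only a new process is guaranteed to import the new source. Something like this sketch, where the app, node, and queue names are all made up:

    # Fallback sketch: start a worker node in a brand-new process so it picks
    # up the freshly deployed code; all names and options here are placeholders.
    import subprocess

    def start_worker_node(node_name, queues):
        return subprocess.Popen([
            'celery', 'worker',
            '-A', 'proj',
            '-n', node_name,
            '-Q', ','.join(queues),
            '--loglevel=INFO',
        ])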

asked Jan 21 '13 by Kevin Meyer


1 Answer

You can try to use the broadcast functionality of Celery.

Here you can see some good examples: https://github.com/mher/flower/blob/master/flower/api/control.py
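
For example, something along these lines (untested sketch; the app name and node names are placeholders): broadcast a shutdown to the workers you want to replace, then use ping() to see when they are gone before starting replacements with the new source.

    # Untested sketch; 'proj' and the destination node names are placeholders.
    from celery import Celery

    app = Celery('proj', broker='amqp://')

    # Ask specific workers to finish their current tasks and exit.
    app.control.broadcast('shutdown', destination=['worker1@host'])

    # ping() reports which workers are still alive, so you can tell when it is
    # safe to start replacement processes against the new source.
    alive = app.control.inspect().ping() or {}
    print(alive)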

answered by Michael Korbakov