If I make a change to tasks.py while Celery is running, is there a mechanism by which it can reload the updated code? Or do I have to shut Celery down and restart it?
I read that Celery had an --autoreload argument in older versions, but I can't find it in the current version:
celery: error: unrecognized arguments: --autoreload
Unfortunately --autoreload no longer works; it is deprecated.
You can use Watchdog, which provides watchmedo, a shell utility that performs actions based on file events.
pip install watchdog
You can start the worker with:
watchmedo auto-restart -- celery worker -l info -A foo
By default it watches all files in the current directory. This can be changed by passing the corresponding parameters:
watchmedo auto-restart -d . -p '*.py' -- celery worker -l info -A foo
Add the -R option to watch files recursively.
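To make it concrete what watchmedo auto-restart is doing under the hood, here is a minimal stdlib-only sketch of the same idea: poll watched files for modification-time changes and restart the child command whenever one changes. All names here (snapshot, auto_restart) are illustrative, not part of watchdog's API, and watchmedo itself uses OS file-event notifications rather than polling.

```python
# Minimal sketch of the auto-restart idea: poll *.py files for
# modification-time changes and restart the command on any change.
import subprocess
import sys
import time
from pathlib import Path


def snapshot(root=".", pattern="*.py"):
    """Map each matching file (recursively) to its last-modified time."""
    return {p: p.stat().st_mtime for p in Path(root).rglob(pattern)}


def auto_restart(cmd, interval=1.0):
    """Run cmd, restarting it whenever a watched file changes."""
    state = snapshot()
    proc = subprocess.Popen(cmd)
    try:
        while True:
            time.sleep(interval)
            current = snapshot()
            if current != state:       # a file was added, removed, or modified
                state = current
                proc.terminate()
                proc.wait()
                proc = subprocess.Popen(cmd)
    except KeyboardInterrupt:
        proc.terminate()


if __name__ == "__main__":
    auto_restart(["celery", "-A", "foo", "worker", "-l", "info"])
```

In practice, prefer watchmedo: event-based watching reacts faster and avoids wasting CPU on polling.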
If you are using Django and don't want to depend on watchdog, there is a simple trick to achieve this. Django has an autoreload utility which is used by runserver to restart the WSGI server when code changes.
The same functionality can be used to reload Celery workers. Create a separate management command called celery. Write a function that kills the existing worker and starts a new one, then hook this function into autoreload as follows. For Django >= 2.2:
import shlex
import subprocess
import sys

from django.core.management.base import BaseCommand
from django.utils import autoreload


class Command(BaseCommand):
    def handle(self, *args, **options):
        autoreload.run_with_reloader(self._restart_celery)

    @classmethod
    def _restart_celery(cls):
        if sys.platform == "win32":
            cls.run('taskkill /f /t /im celery.exe')
            cls.run('celery -A foo worker --loglevel=info --pool=solo')
        else:  # probably ok for linux2, cygwin and darwin. Not sure about os2, os2emx, riscos and atheos
            cls.run('pkill celery')
            cls.run('celery worker -l info -A foo')

    @staticmethod
    def run(cmd):
        subprocess.call(shlex.split(cmd))
For Django < 2.2:
import shlex
import subprocess
import sys

from django.core.management.base import BaseCommand
from django.utils import autoreload


class Command(BaseCommand):
    def handle(self, *args, **options):
        autoreload.main(self._restart_celery)

    @classmethod
    def _restart_celery(cls):
        if sys.platform == "win32":
            cls.run('taskkill /f /t /im celery.exe')
            cls.run('celery -A foo worker --loglevel=info --pool=solo')
        else:  # probably ok for linux2, cygwin and darwin. Not sure about os2, os2emx, riscos and atheos
            cls.run('pkill celery')
            cls.run('celery worker -l info -A foo')

    @staticmethod
    def run(cmd):
        subprocess.call(shlex.split(cmd))
Now you can run the Celery worker with python manage.py celery, and it will autoreload when the codebase changes.
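For Django to discover the command, the file must live in an app's management/commands package, named after the command. A typical layout (the app name yourapp is illustrative):

```
yourapp/
    management/
        __init__.py
        commands/
            __init__.py
            celery.py    # contains the Command class above
```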
This is only for development purposes; do not use it in production.