I managed to make Celery reload itself automatically when there are changes to the modules listed in CELERY_IMPORTS in settings.py.
I tried listing parent modules, hoping that changes in their child modules would also be detected, but changes in the child modules were not picked up. From this I gather that Celery does not detect changes recursively. I searched the documentation but found no answer to this problem.
It is tedious to have to add every Celery-related module of my project to CELERY_IMPORTS just so that changes are detected.
Is there a way to tell Celery to "auto-reload yourself whenever anything anywhere in the project changes"?
Thank You!
Watchmedo is a command-line tool that is part of the watchdog package. I plan to use it for local Django/Celery development. As described in Auto-reload celery on code changes, watchmedo supports an auto-restart subcommand. With it, watchmedo takes control of a long-running subprocess and restarts it on matched file-system events.
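A minimal invocation might look like the sketch below; the app name proj, the watched directory and the file pattern are placeholders, so adjust them to your project layout:

    watchmedo auto-restart --directory=./ --pattern='*.py' --recursive -- celery -A proj worker -l info

This watches the current directory recursively for changes to .py files and restarts the wrapped celery worker command whenever one of them is modified.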
This way, you delegate queue creation to Celery. You can use apply_async with any queue and Celery will handle it, provided a worker is consuming from the queue you pass to apply_async. If no queue is given, the worker will listen only on the default queue.
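As a rough sketch (the app, task and queue names here are made up for illustration), routing a call to a specific queue looks like this:

    # tasks.py -- 'proj', 'add' and 'priority' are illustrative names
    from celery import Celery

    app = Celery('proj', broker='redis://localhost:6379/0')

    @app.task
    def add(x, y):
        return x + y

    # Route this call to the 'priority' queue; Celery declares the queue
    # automatically because task_create_missing_queues is enabled by default.
    add.apply_async(args=(2, 3), queue='priority')

A worker then has to consume from that queue, e.g. celery -A proj worker -Q priority -l info; otherwise the task will sit there unprocessed, since workers started without -Q only listen on the default celery queue.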
celery beat is a scheduler; it kicks off tasks at regular intervals, which are then executed by the available worker nodes in the cluster. By default the entries are taken from the beat_schedule setting, but custom stores can also be used, such as storing the entries in a SQL database.
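For illustration, a beat_schedule entry in the style of the Celery documentation might look like this (tasks.add is just an example task name):

    # celery.py / celeryconfig -- run 'tasks.add' every 30 seconds
    app.conf.beat_schedule = {
        'add-every-30-seconds': {
            'task': 'tasks.add',
            'schedule': 30.0,
            'args': (16, 16),
        },
    }
    app.conf.timezone = 'UTC'

The scheduler itself is started with celery -A proj beat, alongside the normal worker processes that actually execute the scheduled tasks.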
The celery worker --autoreload option doesn't work and has been deprecated.
Since you are using Django, you can write a management command for that. Django has an autoreload utility which is used by runserver to restart the WSGI server when code changes.

The same functionality can be used to reload Celery workers. Create a separate management command called celery (a celery.py file inside one of your apps' management/commands/ directories). Write a function that kills the existing worker and starts a new one, then hook this function into autoreload as follows.
    import shlex
    import subprocess

    from django.core.management.base import BaseCommand
    from django.utils import autoreload


    def restart_celery():
        cmd = 'pkill celery'
        subprocess.call(shlex.split(cmd))
        cmd = 'celery worker -l info -A foo'
        subprocess.call(shlex.split(cmd))


    class Command(BaseCommand):

        def handle(self, *args, **options):
            print('Starting celery worker with autoreload...')

            # For Django>=2.2
            autoreload.run_with_reloader(restart_celery)

            # For django<2.1
            # autoreload.main(restart_celery)
Now you can run the Celery worker with python manage.py celery, and it will auto-reload when the codebase changes.
This is only for development purposes; do not use it in production. Code taken from my other answer here.