I have a Django 1.11 and Celery 4.1 project, and I've configured it according to the setup docs. My celery_init.py looks like this:
from __future__ import absolute_import
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings.settings'
app = Celery('myproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
#app.autodiscover_tasks(lambda: settings.INSTALLED_APPS) # does nothing
app.autodiscover_tasks() # also does nothing
print('Registering debug task...')
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
However, when I launch a worker with:
.env/bin/celery worker -A myproject -l info
it shows no tasks being found except for the sample debug_task, even though I have several installed apps with Celery tasks, which should have been found via the call to app.autodiscover_tasks(). This is the initial output my worker generates:
-------------- celery@localhost v4.1.0 (latentcall)
---- **** -----
--- * *** * -- Linux-4.13.0-16-generic-x86_64-with-Ubuntu-16.04-xenial 2017-10-31 15:56:42
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: myproject:0x7f952856d650
- ** ---------- .> transport: amqp://guest:**@localhost:5672//
- ** ---------- .> results: amqp://
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. myproject.celery_init.debug_task
[2017-10-31 15:56:42,180: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2017-10-31 15:56:42,188: INFO/MainProcess] mingle: searching for neighbors
[2017-10-31 15:56:43,211: INFO/MainProcess] mingle: all alone
[2017-10-31 15:56:43,237: INFO/MainProcess] celery@localhost ready.
All my legacy tasks in my app tasks.py
files were defined like:
from celery.task import task
@task(name='mytask')
def mytask():
    blah
The docs suggest using the shared_task
decorator, so instead I tried:
from celery import shared_task
@shared_task
def mytask():
    blah
But my Celery worker still doesn't see it. What am I doing wrong?
Edit: I've been able to get tasks to show up by explicitly listing them in my settings' CELERY_IMPORTS
list, but even then I have to heavily edit each tasks.py
to remove all imports of my Django project (models.py, etc.), or it raises the exception Apps aren't loaded yet.
This is better than nothing, but requires a huge amount of refactoring. Is there a better way?
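For reference, the official "First steps with Django" Celery docs also have you load the app in the project package's __init__.py, so that the app is always imported when Django starts and @shared_task decorators bind to it. A minimal sketch of that file, assuming the module is named celery_init.py as in the question (the docs call it celery.py):

```python
# myproject/__init__.py
from __future__ import absolute_import

# Ensure the Celery app is loaded whenever Django starts, so that
# @shared_task decorators in each app's tasks.py bind to this app.
from .celery_init import app as celery_app

__all__ = ['celery_app']
```

Without this import, a worker started with -A myproject may load the project package without ever constructing the Celery app, which can leave autodiscovered tasks unregistered.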
I had a similar issue, and the solution was to add the include
kwarg to your Celery() call.
The include argument is a list of modules to import when the worker starts. You need to add your tasks modules here so that the worker is able to find your tasks.
app = Celery('myproject',
             backend=settings.CELERY.get('backend'),
             broker=settings.CELERY.get('broker'),
             include=['ingest.tasks.web', ... ])
Check out http://docs.celeryproject.org/en/latest/getting-started/next-steps.html#proj-celery-py for more information.
Just posting this here (I don't know why it works)
from django.conf import settings
app.config_from_object(settings, namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS, force=True)
force=True seems to be the solution.
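Put together, a complete celery_init.py under this approach might look like the following sketch (assuming the same myproject layout and settings path as in the question):

```python
from __future__ import absolute_import
import os

from celery import Celery

# Set the default Django settings module before touching django.conf.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings.settings')

from django.conf import settings

app = Celery('myproject')
app.config_from_object(settings, namespace='CELERY')
# force=True runs autodiscovery immediately instead of deferring it
# until the first task is used, which sidesteps the lazy evaluation
# that can leave the task registry empty at worker startup.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS, force=True)
```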
Another thing that works is calling django.setup()
before instantiating celery.
from __future__ import absolute_import, unicode_literals
import os
import django
from celery import Celery
django.setup() # This is key
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
app = Celery('notifs')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
This method avoids force=True
and importing django.conf.settings
, and seems cleaner to me. Though I still have no idea why you need to call django.setup
, because it isn't stated in the docs.
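As for the Apps aren't loaded yet errors mentioned in the question's edit: a lighter-weight workaround than rewriting whole modules (a common pattern, not something from the Celery docs) is to defer model imports into the task body, so the tasks module itself can be imported via CELERY_IMPORTS before Django's app registry is ready. A sketch, where myapp.models.MyModel is a hypothetical model:

```python
from celery import shared_task

@shared_task(name='mytask')
def mytask():
    # Imported lazily, inside the task body, so that importing this
    # module does not touch the app registry before django.setup()
    # (or Django's own startup) has run.
    from myapp.models import MyModel  # hypothetical model
    return MyModel.objects.count()
```

Module-level imports of models are what trigger the exception; only the imports need to move, not the task logic.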