I wrote a Django app named "task" and added it to *INSTALLED_APPS*.
However, when I tried to call the task in the Django shell, it raised a NotRegistered exception. Details are below:
from task.tasks import add
from celery import registry
# 'task.tasks.add' is registered, as shown below
registry.tasks  # 'task.tasks.add': <@task: task.tasks.add>
# Call add()
r = add.delay(3, 4)
r.successful()  # prints False
################ celery debug info: #############################
The full contents of the message body was:
{'retries': 0, 'task': 'task.tasks.add', 'args': (3, 4), 'expires': None, 'eta': None, 'kwargs': {}, 'id': '36d25389-7a0b-4a0a-98f8-d7a17ef9192e'}
Traceback (most recent call last):
  File "/usr/local/lib/python2.6/site-packages/celery/worker/consumer.py", line 427, in receive_message
    eventer=self.event_dispatcher)
  File "/usr/local/lib/python2.6/site-packages/celery/worker/job.py", line 297, in from_message
    on_ack=on_ack, delivery_info=delivery_info, **kw)
  File "/usr/local/lib/python2.6/site-packages/celery/worker/job.py", line 261, in __init__
    self.task = registry.tasks[self.task_name]
  File "/usr/local/lib/python2.6/site-packages/celery/registry.py", line 66, in __getitem__
    raise self.NotRegistered(key)
NotRegistered: 'task.tasks.add'
UPDATED:
My task definition:
from celery.task import task

@task
def add(x, y):
    return x + y
I bet that the name registered in the worker is not the same as the one used by the client.
Start celeryd with

celery worker -l info

to see a list of registered tasks, then make sure that the task you want is listed under exactly the same name.
See here for the reason why this is important, and some common causes: http://docs.celeryproject.org/en/latest/userguide/tasks.html#task-names , and especially: http://docs.celeryproject.org/en/latest/userguide/tasks.html#automatic-naming-and-relative-imports
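One way to rule out naming mismatches entirely (a sketch, assuming the same task/tasks.py module as in the question) is to pass an explicit name to the decorator, so the worker and the client always agree on it:

from celery.task import task

# Pinning the name explicitly makes registration independent of how the
# module happens to be imported (relative vs. absolute imports).
@task(name="task.tasks.add")
def add(x, y):
    return x + y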
If your task is listed under the same name, then you could have an old worker still running that has not been updated with the latest code. Kill all running workers with
ps auxww | awk ' /celeryd/ {print $2}' | xargs kill -9
(note this will terminate all running tasks, and you may not get them back when using the redis transport)
In the future, make sure you don't launch new workers on top of old ones by passing the --pidfile argument to celeryd.
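For example (the pidfile path below is just an illustration; any writable location works):

celery worker -l info --pidfile=/var/run/celery/worker.pid

With a pidfile in place, the worker refuses to start while another worker holding that pidfile is still alive, instead of silently stacking a second worker on top of it.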
@linux-warrior: Actually, the task decorator supports both invocations (with parentheses or without), by using some dark magic :)
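To illustrate that comment, both forms work (a minimal sketch; ignore_result is just one example of an option you could pass):

from celery.task import task

@task                      # bare decorator: the task name is derived automatically
def add(x, y):
    return x + y

@task(ignore_result=True)  # decorator called with arguments: options are passed through
def log_sum(x, y):
    print(x + y)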