Register a task in the task registry. The task will be automatically instantiated if it is not already an instance. The task name must be configured prior to registration. A task can be unregistered by name.
The "shared_task" decorator allows creation of Celery tasks for reusable apps as it doesn't need the instance of the Celery app. It is also easier way to define a task as you don't need to import the Celery app instance.
Celery provides a way both to design a workflow that coordinates tasks and to execute tasks in parallel.
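As a rough sketch of both ideas, assuming the add task above: a chain coordinates steps in sequence, while a group dispatches tasks in parallel.

from celery import chain, group

# chain: add(2, 2) runs first, then its result is fed to add(result, 8)
workflow = chain(add.s(2, 2), add.s(8))

# group: both calls are sent to the workers at the same time
parallel = group(add.s(1, 1), add.s(2, 2))

result = workflow.apply_async()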
I think you need to restart the worker server. I ran into the same problem and solved it by restarting.
I had the same problem: the reason for "Received unregistered task of type.." was that the celeryd service didn't find and register the tasks on service start (by the way, their list is visible when you start ./manage.py celeryd --loglevel=info).
These tasks should be declared in CELERY_IMPORTS = ("tasks",) in the settings file.
If you have a special celery_settings.py file, it has to be declared on celeryd service start with --settings=celery_settings, as digivampire wrote.
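A minimal sketch of such a file, assuming a top-level tasks module (names are placeholders):

# celery_settings.py
BROKER_URL = "amqp://"
CELERY_IMPORTS = ("tasks",)

Then start the service with ./manage.py celeryd --loglevel=info --settings=celery_settings.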
You can see the current list of registered tasks in the celery.registry.TaskRegistry class. It could be that your celeryconfig (in the current directory) is not on the PYTHONPATH, so celery can't find it and falls back to defaults. Simply specify it explicitly when starting celery:
celeryd --loglevel=INFO --settings=celeryconfig
You can also set --loglevel=DEBUG and you should probably see the problem immediately.
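You can also inspect the registry from Python. A quick sketch, assuming your app instance is importable from proj.celery (the import path is an assumption):

from proj.celery import app

# app.tasks is the live task registry; built-in tasks start with "celery."
for name in sorted(app.tasks):
    print(name)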
Whether you use CELERY_IMPORTS or autodiscover_tasks, the important point is that the tasks can be found, and the names of the tasks registered in Celery must match the names the workers try to fetch.
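With autodiscover_tasks, a typical setup looks roughly like this (the proj package layout is an assumption):

# proj/celery.py
from celery import Celery

app = Celery("proj", broker="amqp://")

# Looks for a tasks.py module inside each listed package and registers its tasks
app.autodiscover_tasks(["proj"])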
When you launch Celery, say with celery worker -A project --loglevel=DEBUG, you should see the names of the tasks. For example, if I have a debug_task task in my celery.py:
[tasks]
. project.celery.debug_task
. celery.backend_cleanup
. celery.chain
. celery.chord
. celery.chord_unlock
. celery.chunks
. celery.group
. celery.map
. celery.starmap
If you can't see your tasks in the list, check that your celery configuration imports the tasks correctly, whether via --settings, --config, celeryconfig, or config_from_object.
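For instance, loading a module-based configuration with config_from_object (the module name celeryconfig is assumed):

from celery import Celery

app = Celery("proj")

# Reads uppercase settings such as CELERY_IMPORTS from celeryconfig.py on the path
app.config_from_object("celeryconfig")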
If you are using celery beat, make sure the task name (the task key) you use in CELERYBEAT_SCHEDULE matches the name in the celery task list.
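A sketch of a matching beat entry; the task value must be the fully qualified name shown in the worker's [tasks] list (the schedule itself is illustrative):

# celeryconfig.py
CELERYBEAT_SCHEDULE = {
    "debug-every-30s": {
        # must match the registered name exactly
        "task": "project.celery.debug_task",
        "schedule": 30.0,
    },
}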
I also had the same problem; I added CELERY_IMPORTS = ("mytasks",) in my celeryconfig.py file to solve it. Note the trailing comma: without it, ("mytasks") is just a string, not a tuple.
from celery import Celery

app = Celery('proj',
             broker='amqp://',
             backend='amqp://',
             include=['proj.tasks'])
Note the include=['proj.tasks'] above. You need to go to the top directory, then run
celery -A app.celery_module.celeryapp worker --loglevel=info
not
celery -A celeryapp worker --loglevel=info
In your celeryconfig.py, set imports = ("path.ptah.tasks",), then invoke the task from another module!
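Roughly, with placeholder module names, that pattern looks like:

# proj/tasks.py
from proj.celery import app

@app.task
def add(x, y):
    return x + y

# some other module, e.g. views.py
from proj.tasks import add

# sends the task by its registered name, "proj.tasks.add"
add.delay(2, 3)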
Using --settings did not work for me. I had to use the following to get it all to work:
celery --config=celeryconfig --loglevel=INFO
Here is the celeryconfig file that has the CELERY_IMPORTS added:
# Celery configuration file
BROKER_URL = 'amqp://'
CELERY_RESULT_BACKEND = 'amqp://'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'America/Los_Angeles'
CELERY_ENABLE_UTC = True
CELERY_IMPORTS = ("tasks",)
My setup was a little trickier because I'm using supervisor to launch celery as a daemon.
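For reference, a minimal supervisor program entry along those lines; the paths and program name are assumptions, not my exact setup:

[program:celery]
command=celery --config=celeryconfig --loglevel=INFO
directory=/path/to/project
autostart=true
autorestart=true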