I am trying to retrieve the result of a task which has completed. This works:
>>> from proj.tasks import add
>>> res = add.delay(3, 4)
>>> res.get()
7
>>> res.status
'SUCCESS'
>>> res.id
'0d4b36e3-a503-45e4-9125-cfec0a7dca30'
But I want to fetch that result from another application, so I start a new Python shell and try:
>>> from proj.tasks import add
>>> res = add.AsyncResult('0d4b36e3-a503-45e4-9125-cfec0a7dca30')
>>> res.status
'PENDING'
>>> res.get()  # Error
How can I retrieve the result?
Answer: You can also supply your own task id when calling the task, but make sure it's unique, as the behavior for two tasks existing with the same id is undefined.
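A minimal sketch of supplying an explicit task id via apply_async (the task_id keyword is part of Celery's API; the id string and the proj.tasks module are assumptions carried over from the question):

from proj.tasks import add

# Supply the task id yourself so another process knows it ahead of time.
# It must be unique; two tasks sharing an id is undefined behavior.
res = add.apply_async((3, 4), task_id='my-unique-task-id-001')
print(res.id)  # 'my-unique-task-id-001'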
The "shared_task" decorator allows creation of Celery tasks for reusable apps as it doesn't need the instance of the Celery app. It is also easier way to define a task as you don't need to import the Celery app instance.
This way, you delegate queue creation to Celery. You can call apply_async with any queue and Celery will handle it, provided a worker is consuming from the queue you passed to apply_async. If no queue is given, the worker listens only on the default queue.
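A sketch of routing a call to a specific queue with apply_async (the queue name 'priority' is an assumption; a worker must consume from it, e.g. started with celery -A proj worker -Q priority):

from proj.tasks import add

# Route this call to a specific queue; Celery creates the queue
# on the broker if it doesn't already exist.
res = add.apply_async((3, 4), queue='priority')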
It works using AsyncResult (see this answer).
So first create the task:
from cel.tasks import add

res = add.delay(3, 4)
print(res.status)  # 'SUCCESS'
print(res.id)      # '432890aa-4f02-437d-aaca-1999b70efe8d'
Then start another Python shell:
from celery.result import AsyncResult
from cel.tasks import app

res = AsyncResult('432890aa-4f02-437d-aaca-1999b70efe8d', app=app)
print(res.state)  # 'SUCCESS'
print(res.get())  # 7
The 'PENDING' state you saw is due to RabbitMQ not actually storing the results. If you need the ability to get the results later on, use Redis or SQL as the result backend.
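A minimal sketch of configuring a Redis result backend when creating the app (the broker and backend URLs are placeholders for your own instances):

from celery import Celery

app = Celery(
    'cel',
    broker='amqp://localhost',           # RabbitMQ as the message broker
    backend='redis://localhost:6379/0',  # Redis stores task results
)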