Celery AsyncResult is always PENDING


I'm working on a demo and the code is simple:

# The Config
class Config:
    BROKER_URL = 'redis://127.0.0.1:6379/0'
    CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0'
    CELERY_ACCEPT_CONTENT = ['application/json']

# The Task
@celery_app.task()
def add(x, y):
    return x + y

To start the worker:

$ celery -A appl.task.celery_app worker --loglevel=info --broker=redis://localhost:6379/0

 -------------- celery@ALBERTATMP v3.1.13 (Cipater)
 ---- **** ----- 
 --- * ***  * -- Linux-3.2.0-4-amd64-x86_64-with-debian-7.6
 -- * - **** --- 
 - ** ---------- [config]
 - ** ---------- .> app:         celery_test:0x293ffd0
 - ** ---------- .> transport:   redis://localhost:6379/0
 - ** ---------- .> results:     disabled
 - *** --- * --- .> concurrency: 2 (prefork)
 -- ******* ---- 
 --- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery

To schedule the task:

>>> from appl.task import add
>>> r = add.delay(1, 2)
>>> r.id
'c41d4e22-ccea-408f-b48f-52e3ddd6bd66'
>>> r.task_id
'c41d4e22-ccea-408f-b48f-52e3ddd6bd66'
>>> r.status
'PENDING'
>>> r.backend
<celery.backends.redis.RedisBackend object at 0x1f35b10>

Then the worker will execute the task:

[2014-07-29 17:54:37,356: INFO/MainProcess] Received task: appl.task.add[beeef023-c582-42e1-baf7-9e19d9de32a0]
[2014-07-29 17:54:37,358: INFO/MainProcess] Task appl.task.add[beeef023-c582-42e1-baf7-9e19d9de32a0] succeeded in 0.00108124599865s: 3 

But the result remains PENDING:

>>> res = add.AsyncResult(r.id)
>>> res.status
'PENDING'

I've tried the official FAQ, but it did not help:

>>> celery_app.conf['CELERY_IGNORE_RESULT']
False

What did I do wrong? Thanks!

asked Jul 29 '14 by hbrls


1 Answer

It's been a while, but I'm leaving this here for others who run into a similar issue:

In your worker startup output, you can see that results are disabled (results: disabled).

When you instantiate your Celery app, make sure you pass it the right configuration:

from celery import Celery

# Here I'm using an AMQP broker with a memcached backend to store the results
celery = Celery('task1',
                broker='amqp://guest:guest@localhost:5672//',   # adjust credentials/host for your broker
                backend='cache+memcached://127.0.0.1:11211/')

For some reason, I always have trouble getting the Celery instance configured through a config file, so I pass the broker and backend explicitly at instantiation, as shown above.
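
That said, if you'd rather keep your settings in a config object like the Config class in the question, a rough sketch (assuming the appl/task.py layout from the question) is to load it explicitly with config_from_object so the result backend actually reaches the app:

from celery import Celery

celery_app = Celery('appl.task')

class Config:
    BROKER_URL = 'redis://127.0.0.1:6379/0'
    CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0'
    CELERY_ACCEPT_CONTENT = ['application/json']

# Load the settings explicitly; if this call is missing, the worker starts
# with "results: disabled" and every AsyncResult stays PENDING.
celery_app.config_from_object(Config)

@celery_app.task()
def add(x, y):
    return x + y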

Now the worker startup output shows the results backend correctly configured as memcached (in my case; it should be Redis in yours). Also make sure that your task shows up in the worker's list of registered tasks (task1.add).
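
A quick way to double-check that, assuming the module is named task1 as above, is to look the task up in the app's task registry from a Python shell:

>>> from task1 import celery
>>> 'task1.add' in celery.tasks   # the registry maps task names to task objects
True

If that prints False, the task decorator never ran for your function (wrong module path, typo in the import, and so on).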

If you still can't get it to work, try starting the worker with the debug log level:

celery worker -A task1.celery -l debug

and see whether anything looks wrong in the (very verbose) output it produces.

In my case this fixed the problem: the result status became SUCCESS and r.get() returned 3.
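
For reference, this is roughly what a successful round trip looks like once the backend is picked up (same add task as in the question):

>>> from appl.task import add
>>> r = add.delay(1, 2)
>>> r.get(timeout=10)   # blocks until the worker stores the result in the backend
3
>>> r.status
'SUCCESS'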

answered Oct 05 '22 by Shankar ARUL