I've just installed Celery and am trying to follow the tutorial:
I have a file called tasks.py with the following code:
from celery import Celery

app = Celery('tasks', backend='amqp', broker='amqp://')

@app.task
def add(x, y):
    return x + y
I installed RabbitMQ (I did not configure it, since the tutorial didn't mention any configuration).
I start the Celery worker as follows:
celery -A tasks worker --loglevel=info
It seems to start up normally (here is the output: http://i.imgur.com/qnoNCzJ.png)
Then I run a script with the following:
from tasks import add
from time import sleep

result = add.delay(2, 2)

while not result.ready():
    sleep(10)
When I check result.ready(), I always get False (so the while loop above runs forever). In the Celery logs, however, everything looks fine:
[2014-10-30 00:58:46,673: INFO/MainProcess] Received task: tasks.add[2bc4ceba-1319-49ce-962d-1ed0a424a2ce]
[2014-10-30 00:58:46,674: INFO/MainProcess] Task tasks.add[2bc4ceba-1319-49ce-962d-1ed0a424a2ce] succeeded in 0.000999927520752s: 4
So the task was received and succeeded. Yet result.ready() is still False. Any insight as to why this might be? I am on Windows 7 and am using RabbitMQ. Thanks in advance.
A better solution is simply to let the task run asynchronously, as Celery is intended to be used, and have JavaScript on the page poll the task periodically to check its status. First, create your Celery task with the bind=True parameter; this passes the task instance into the function as self.
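A minimal sketch of that pattern (the task name long_add and the PROGRESS state are illustrative, not from the question):

from celery import Celery

app = Celery('tasks', backend='amqp', broker='amqp://')

@app.task(bind=True)
def long_add(self, x, y):
    # bind=True makes the task instance available as self,
    # so the task can report its own progress
    self.update_state(state='PROGRESS', meta={'current': 'adding'})
    return x + y

The page's polling endpoint can then look the task up with AsyncResult(task_id, app=app) and hand its .state and .info back to the JavaScript side.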
To cancel an already-executing task with Celery and Python, we can use the revoke function: call revoke with the task_id of the task to stop, and set terminate=True to terminate it.
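A short sketch of that, reusing the add task from the question:

from tasks import app, add

result = add.delay(2, 2)

# revoke by task id; terminate=True additionally terminates the worker
# child process currently executing the task (prefork pool only)
app.control.revoke(result.id, terminate=True)

# the same thing via the result handle:
# result.revoke(terminate=True)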
Dedicated worker processes constantly monitor task queues for new work to perform. Celery communicates via messages, usually using a broker to mediate between clients and workers: to initiate a task, the client adds a message to the queue, and the broker then delivers that message to a worker.
Celery is an open-source asynchronous task queue (or job queue) based on distributed message passing. While it supports scheduling, its focus is on real-time operations.
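In code, that round trip is just this (reusing the question's add task; the client only enqueues a message and gets a handle back):

from tasks import add

async_result = add.delay(2, 2)  # puts a message on the queue and returns immediately
print(async_result.id)          # the task id carried by that message
print(async_result.state)       # PENDING until a worker has processed it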
Setting ignore_result=False when declaring the task should solve your problem.
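Applied to the add task from the question, that is just an extra decorator argument:

@app.task(ignore_result=False)
def add(x, y):
    return x + y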
Okay, I've set up a clean VM with a fresh Celery install and created the following files:
tasks.py:
from celery import Celery

app = Celery('tasks', backend='amqp', broker='amqp://')

@app.task
def add(x, y):
    return x + y
And runme.py:

from tasks import add
import time

result = add.delay(1, 2)

while not result.ready():
    time.sleep(1)

print(result.get())
Then I started the Celery worker with:
celery -A tasks worker --loglevel=info
And subsequently I ran runme.py, which gives the expected result:
[puciek@somewhere tmp]# python3.3 runme.py
3
So the issue is clearly within your setup, most likely somewhere in the RabbitMQ installation. I recommend reinstalling it with the latest stable version built from source, which is what I am using, and as you can see, it works just fine.
Update:
Actually, your issue may be as trivial as imaginable: are you sure that you are running the Celery worker and your consumer with the same Python version? I just managed to reproduce this: I ran the Celery worker on Python 3.3 and then ran runme.py on Python 2.7, and the result was exactly as you described.
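One quick way to confirm that both sides agree (a small check script of my own, not part of the original setup): run the following with the interpreter that starts the worker, then again with the one that runs runme.py, and compare the output.

import sys
import celery

# both invocations should print the same interpreter and Celery versions
print(sys.version)
print(celery.__version__)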