 

Why does Celery + RabbitMQ generate a new queue each time?

Tags:

python

celery

abmp.py:

from celery import Celery 
app = Celery('abmp', backend='amqp://guest@localhost',broker='amqp://guest@localhost' ) 

@app.task(bind=True) 
def add(self, a, b): 
    return a + b 

execute_test.py

from abmp import add 

add.apply_async( 
args=(5,7), 
queue='push_tasks', 
exchange='push_tasks', 
routing_key='push_tasks' 
) 

execute celery

celery -A abmp worker -E -Q push_tasks -l info 

execute execute_test.py

python2.7 execute_test.py

Finally, looking at the RabbitMQ management view, I found that each run of execute_test.py generates a new queue, rather than putting the task into the push_tasks queue.

asked Sep 18 '25 by shenyang


2 Answers

You are using AMQP as the result backend. Celery stores each task's result in a new queue named after the task's ID. Use a better-suited backend (Redis, for example) to avoid spamming new queues.
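A minimal sketch of that change, reusing the app from abmp.py (the Redis URL is illustrative and assumes a local Redis server):

```python
from celery import Celery

# Same app as in abmp.py, but with a Redis result backend so task
# results are stored as keys in Redis instead of one AMQP queue
# being created per task ID.
app = Celery(
    'abmp',
    backend='redis://localhost:6379/0',  # assumed local Redis instance
    broker='amqp://guest@localhost',
)

@app.task(bind=True)
def add(self, a, b):
    return a + b
```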

answered Sep 20 '25 by tuomur


When you use AMQP as the result backend for Celery, the default behavior is to store every task result for one day (per the FAQ at http://docs.celeryproject.org/en/latest/faq.html).

As per the documentation for the current stable version (4.1), this backend is deprecated and should not be used.

Your options are:

  • Use the result_expires setting if you plan to go ahead with amqp as the backend.
  • Use a different backend (like Redis).
  • If you don't need the results at all, use the ignore_result setting.
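The first and third options above can be sketched as follows (a non-authoritative example; the one-hour expiry value is arbitrary, and amqp is kept only to mirror the question):

```python
from datetime import timedelta

from celery import Celery

app = Celery('abmp', backend='amqp://guest@localhost',
             broker='amqp://guest@localhost')

# Option 1: expire stored results after an hour instead of the
# default one day, so leftover result queues are cleaned up sooner.
app.conf.result_expires = timedelta(hours=1)

# Option 3: don't store a result for this task at all, so no
# per-task result queue is ever created for it.
@app.task(ignore_result=True)
def add(a, b):
    return a + b
```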
answered Sep 20 '25 by codeara