from celery import Celery

app = Celery('abmp', backend='amqp://guest@localhost', broker='amqp://guest@localhost')

@app.task(bind=True)
def add(self, a, b):
    return a + b
from abmp import add

add.apply_async(
    args=(5, 7),
    queue='push_tasks',
    exchange='push_tasks',
    routing_key='push_tasks'
)
celery -A abmp worker -E -Q push_tasks -l info
python2.7 execute_test.py
Finally, looking at the RabbitMQ management console, I found that every run of execute_test.py creates a new queue, instead of putting the task into the push_tasks queue.
You are using AMQP as the result backend. Celery stores each task's result as a new queue, named after the task's ID. Use a better-suited backend (Redis, for example) to avoid spawning new queues.
When you use AMQP as the result backend for Celery, the default behavior is to store every task result (for 1 day, per the FAQ at http://docs.celeryproject.org/en/latest/faq.html).
According to the documentation for the current stable version (4.1), this backend is deprecated and should not be used.
Your options are: switch to the `rpc://` result backend, move to a dedicated result backend such as Redis or a database, or, if you never consume the results, disable result storage entirely.