I have an API, and some endpoints need to forward requests to Celery. The idea is to have a dedicated API service that only instantiates a Celery client and uses the send_task() method, and a separate worker service that consumes the tasks. The task definitions should live in the worker service, so the Celery app (API) and the Celery worker are split into two separate services.
I don't want my API to know about any Celery task definitions; endpoints only need to call celery_client.send_task('some_task', (some_arguments)).
So one service hosts my API, and another service/host has the Celery code base where my Celery worker executes the tasks.
I came across this great article that describes what I want to do. https://medium.com/@tanchinhiong/separating-celery-application-and-worker-in-docker-containers-f70fedb1ba6d and this post Celery - How to send task from remote machine?
I need help with how to create routes for tasks from the API. I expected celery_client.send_task() to have a queue= keyword, but it does not. I need two queues, and two workers that will consume tasks from those two queues.
Commands for my workers:

```shell
celery -A <path_to_my_celery_file>.celery_client worker --loglevel=info -Q queue_1
celery -A <path_to_my_celery_file>.celery_client worker --loglevel=info -Q queue_2
```
I have also visited celery "Routing Tasks" documentation, but it is still unclear to me how to establish this communication.
Celery's send_task() does accept a queue= parameter, because it takes the same keyword arguments as apply_async():
https://docs.celeryq.dev/en/stable/reference/celery.html#celery.Celery.send_task
https://docs.celeryq.dev/en/stable/reference/celery.app.task.html#celery.app.task.Task.apply_async
```python
celery_client.send_task(
    'some_task',
    args=(some_arguments,),
    queue='your_queue'
)
```