We have a server running celery workers and a Redis queue. The tasks are defined on that server.
I need to be able to call these tasks from a remote machine.
I know this is done using send_task,
but I still haven't figured out how. How do I tell send_task
where the queue is? Where do I pass the connection parameters (or whatever else is needed)? I've been looking for hours and all I can find is this:
```python
from celery.execute import send_task
send_task('tasks.add')
```
Well, that means I need celery installed
on my calling machine as well. But what else do I need to set up?
If you look at the Celery docs on tasks, you'll see that to call a task synchronously you use the apply() method as opposed to the apply_async() method. The docs also note: "If the CELERY_ALWAYS_EAGER setting is set, it will be replaced by a local apply() call instead."
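As a rough sketch of that eager behaviour (using the modern setting name task_always_eager, which supersedes CELERY_ALWAYS_EAGER; the app name 'demo' and the task are just placeholders):

```python
from celery import Celery

app = Celery('demo')
# With eager mode on, apply_async() is replaced by a local apply() call,
# so no broker or worker is needed to run this snippet.
app.conf.task_always_eager = True

@app.task
def add(x, y):
    return x + y

result = add.apply_async((2, 3))
print(result.get())  # executed locally, in-process
</imports>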
The process of task execution by Celery can be broken down into three steps: your application sends the task to the broker, a worker reserves it and executes it, and finally the result of the execution is stored in the result backend.
The shared_task decorator allows the creation of Celery tasks for reusable apps, since it doesn't need an instance of the Celery app. It is also an easier way to define a task, as you don't need to import the Celery app instance.
Celery communicates via messages, usually using a broker to mediate between clients and workers. To initiate a task, the Celery client adds a message to the queue, and the broker then delivers that message to a worker. The most commonly used brokers are Redis and RabbitMQ.
This may be a way: create a Celery object on the calling machine and call send_task on that object; the object can carry the configuration needed to find the broker.
```python
from celery import Celery

celery = Celery()
celery.config_from_object('celeryconfig')
celery.send_task('tasks.add', (2, 2))
```
Here celeryconfig is a module containing the Celery configuration; there are other ways to set config on the Celery object.
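A minimal celeryconfig.py might look like the following sketch; the Redis host, port, and database number are placeholders for your own server:

```python
# celeryconfig.py -- minimal sketch; replace host/port/db with your server's
broker_url = 'redis://your-server:6379/0'      # where tasks are sent
result_backend = 'redis://your-server:6379/0'  # where results are stored
```

Alternatively, the broker URL can be passed directly when constructing the app, e.g. Celery(broker='redis://your-server:6379/0').
</imports>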