I am working with Celery and a RabbitMQ server. I created a Django project on a server (where the message queue and database live) and it is working fine; I have also created multiple workers.
from kombu import Exchange, Queue

CELERYD_CONCURRENCY = 8
CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']
CELERY_RESULT_BACKEND = 'amqp'
CELERYD_HIJACK_ROOT_LOGGER = True
BROKER_URL = 'amqp://guest:guest@localhost:5672//'

CELERY_QUEUES = (
    Queue('default', Exchange('default'), routing_key='default'),
    Queue('q1', Exchange('A'), routing_key='routingKey1'),
    Queue('q2', Exchange('B'), routing_key='routingKey2'),
)
CELERY_ROUTES = {
    'my_taskA': {'queue': 'q1', 'routing_key': 'routingKey1'},
    'my_taskB': {'queue': 'q2', 'routing_key': 'routingKey2'},
}

AMQP_SERVER = "127.0.0.1"
AMQP_PORT = 5672
AMQP_USER = "guest"
AMQP_PASSWORD = "guest"
AMQP_VHOST = "/"

CELERY_INCLUDE = ('functions',)
But now I want to run workers from another server, so I need some information on how to run a worker on another system. A few sites I referred to say that the Django project also needs to run on the remote system. Is that necessary?
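For context, with the queue definitions above a worker meant to consume the two custom queues would typically be started with something like celery -A proj worker -Q q1,q2 --concurrency=8, where proj is a placeholder for your actual Celery app module and -Q limits the worker to the named queues.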
You probably just need to add the --concurrency or -c argument when starting the worker to spawn multiple (parallel) worker processes. For running tasks in parallel, look at the Canvas primitives in the docs; there you can see how to make groups for parallel execution.
If you look at the Celery docs on tasks, you will see that to call a task synchronously you use the apply() method as opposed to the apply_async() method. The docs also note that if the CELERY_ALWAYS_EAGER setting is enabled, apply_async() will be replaced by a local apply() call instead.
celery -A yourproject.app inspect status will give the status of your workers. celery -A yourproject.app inspect active will give you a list of tasks currently running, etc.
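The same information is also available from Python through the remote-control inspect API (a minimal sketch; the proj import path is an assumption, not from the original post):

from proj import app  # placeholder import for your configured Celery app

insp = app.control.inspect()   # broadcasts to all workers on the broker
print(insp.ping())             # liveness of each worker
print(insp.active())           # tasks currently executing, per worker
print(insp.registered())       # task names each worker has loaded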
Here is the gist of the idea:

On Machine A: run your Django project and the broker, and queue tasks as usual; this machine only produces task messages.

On Machine B: keep a copy of the code that defines the tasks, point its broker settings at Machine A, and start a celery worker there; it will consume the messages and execute them, as in the sketch below.
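A minimal sketch of the Machine B side (the hostname, module name, and task body are assumptions, not from the original post):

# tasks.py on Machine B: a copy of the task definitions.
# The broker URL points at Machine A (hostname is a placeholder).
from celery import Celery

app = Celery('functions', broker='amqp://guest:guest@machine-a:5672//')

@app.task(name='my_taskA')
def my_taskA(x, y):
    # Same implementation as registered on Machine A.
    return x + y

Machine B then runs celery -A tasks worker -Q q1 and picks up my_taskA messages from the shared broker.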
I had the same requirement and experimented with Celery; it is a lot easier to do than it sounds. I wrote a detailed blog post on it a few days back. Check out how to send tasks to remote machine?
You can make use of app.send_task() with something like the following in your Django project:
from celery import Celery
import my_client_config_module

# The client app only needs broker/connection settings;
# it does not need the task code itself.
app = Celery()
app.config_from_object(my_client_config_module)

# Send the task by name. The dotted path must match the name under
# which the task is registered on the remote worker.
app.send_task('dotted.path.to.function.on.remote.server.relative.to.worker',
              args=(1, 2))
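The advantage of send_task() over calling the task's own delay() or apply_async() is that the client never imports the task function, so the producer machine needs only the broker settings and the task's registered name, not the worker's codebase.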
First, think about how Celery really works.

The Celery producer adds a task message to the queue, with the task's name and other important headers that identify which function to run. Celery does not put a complete executable function on the MQ.

Now look at the worker (consumer) side. The worker gets the task details from the MQ and tries to run the task; for that, the module/files/environment/codebase that implements the task must be available on the worker's machine.

Now let's come to your question. You are trying to set up a worker on a separate machine, so logically, to execute the function a task points to, you need the complete code environment of the tasks on that machine, and the worker must connect to the MQ where the tasks live (otherwise, how would it get tasks from the MQ?).
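To tie that back to the configuration in the question: only the registered name and routing information are published from the Django server, so you can target the remote worker's queue explicitly (a sketch; the names come from CELERY_ROUTES above):

from celery import Celery

app = Celery(broker='amqp://guest:guest@localhost:5672//')

# Only the name 'my_taskA' plus routing headers travel over the wire;
# the function body lives on whichever worker consumes 'q1'.
app.send_task('my_taskA', args=(1, 2), queue='q1', routing_key='routingKey1')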