
How to configure and run a Celery worker on a remote system

I am working with Celery using a RabbitMQ server. I created a Django project on a server (where the message queue and database live) and it is working fine; I have also created multiple workers. My settings:

from kombu import Exchange, Queue

CELERYD_CONCURRENCY = 8  # number of worker processes

CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']

CELERY_RESULT_BACKEND = 'amqp'
CELERYD_HIJACK_ROOT_LOGGER = True
BROKER_URL = 'amqp://guest:guest@localhost:5672//'

CELERY_QUEUES = (
    Queue('default', Exchange('default'), routing_key='default'),
    Queue('q1', Exchange('A'), routing_key='routingKey1'),
    Queue('q2', Exchange('B'), routing_key='routingKey2'),
)
CELERY_ROUTES = {
    'my_taskA': {'queue': 'q1', 'routing_key': 'routingKey1'},
    'my_taskB': {'queue': 'q2', 'routing_key': 'routingKey2'},
}

# plain AMQP connection details (same broker as BROKER_URL above)
AMQP_SERVER = "127.0.0.1"
AMQP_PORT = 5672
AMQP_USER = "guest"
AMQP_PASSWORD = "guest"
AMQP_VHOST = "/"

CELERY_INCLUDE = ('functions',)  # trailing comma makes this a one-element tuple
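For context, the two routed task names above would correspond to task functions in the functions module named by CELERY_INCLUDE; a sketch, with hypothetical bodies:

# functions.py -- sketch; the task bodies are hypothetical
from celery import shared_task

@shared_task(name='my_taskA')
def my_taskA(x, y):
    return x + y

@shared_task(name='my_taskB')
def my_taskB(x, y):
    return x * y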

But I want to run workers from another server, so I need some information on how to run a worker on another system. A few sites I referred to say that the Django project also needs to run on the remote system. Is that necessary?

asked Nov 18 '14 by krishna


People also ask

How do you make multiple workers with Celery?

You probably just need to add the --concurrency or -c argument when starting the worker to spawn multiple (parallel) worker processes. For running tasks in parallel, look at the Canvas primitives; they show how to build groups for parallel execution.
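For example, to start a worker with eight parallel worker processes (the app name proj is a placeholder):

celery -A proj worker --loglevel=info --concurrency=8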

How do you call Celery synchronously?

If you look at the Celery docs on tasks, you will see that to call a task synchronously you use the apply() method, as opposed to the apply_async() method. The docs also note that if the CELERY_ALWAYS_EAGER setting is set, apply_async() is replaced by a local apply() call instead.
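A minimal sketch, assuming a hypothetical add task:

# sketch: synchronous vs. asynchronous task invocation
from celery import Celery

app = Celery('proj', broker='amqp://guest:guest@localhost:5672//')

@app.task
def add(x, y):
    return x + y

result = add.apply(args=(2, 3))   # executes locally and blocks until done
print(result.get())               # 5
add.apply_async(args=(2, 3))      # sent to the broker for a worker to execute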

How do I check the status of my Celery worker?

celery -A yourproject.app inspect status will give the status of your workers. celery -A yourproject.app inspect active will give you the list of tasks currently running, etc.
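For example, assuming the Celery app is importable as yourproject.app:

celery -A yourproject.app inspect status   # status of your workers
celery -A yourproject.app inspect active   # tasks currently running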


3 Answers

Here is the gist of the idea:

On Machine A:

  1. Install Celery & RabbitMQ.
  2. Configure rabbitmq so that Machine B can connect to it.
  3. Create my_tasks.py with some tasks and put some tasks in the queue (see the sketch after these steps).

On Machine B:

  1. Install Celery.
  2. Copy my_tasks.py file from machine A to this machine.
  3. Run a worker to consume the tasks.
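A minimal sketch of the above (the broker host machine-a.example.com and the add task are hypothetical placeholders):

# my_tasks.py -- shared verbatim by both machines; broker host is a placeholder
from celery import Celery

app = Celery('my_tasks', broker='amqp://guest:guest@machine-a.example.com:5672//')

@app.task
def add(x, y):
    return x + y

On Machine A, put a task in the queue:

from my_tasks import add
add.delay(2, 3)  # serialized and placed on the broker's queue

On Machine B, after copying my_tasks.py over, start the worker:

celery -A my_tasks worker --loglevel=info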

I had the same requirement and experimented with Celery; it is a lot easier than it sounds. I wrote a detailed blog post on it a few days back. Check out "how to send tasks to remote machine?"

answered Oct 20 '22 by Pandikunta Anand Reddy


You can make use of app.send_task() with something like the following in your Django project:

from celery import Celery
import my_client_config_module

app = Celery()
app.config_from_object(my_client_config_module)

# send the task by name; the worker on the remote server resolves the
# dotted path and executes it there
app.send_task('dotted.path.to.function.on.remote.server.relative.to.worker',
              args=(1, 2))
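Here my_client_config_module is whatever module holds your client-side settings; a minimal sketch (the broker host is a hypothetical placeholder):

# my_client_config_module.py -- sketch; broker host is a placeholder
BROKER_URL = 'amqp://guest:guest@remote-server.example.com:5672//'
CELERY_RESULT_BACKEND = 'amqp'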
answered Oct 20 '22 by lajarre


First, think about how Celery really works.

The Celery producer adds a task to the queue with its name and other important headers that identify where your task lives; Celery does not put a complete executable function on the MQ.

Now look at the worker (consumer) side.

Celery gets the task details from the MQ and tries to run the task. For that to work, the module/files/environment/codebase that defines the task must be available on the worker to execute it.

Now let's come to your question...

You are trying to set up a worker on a separate machine. Logically, to execute the function a task points to, you need the complete code environment of the tasks on that machine, and you must connect to the MQ where the tasks live (otherwise, how would you get tasks from the MQ?).
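In practice that means copying the task code to the worker machine and pointing the worker at the same broker, for example (assuming the Django project's Celery app is importable as proj, with BROKER_URL set to the machine hosting RabbitMQ, and the queue names taken from CELERY_QUEUES in the question):

celery -A proj worker --loglevel=info -Q q1,q2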

answered Oct 20 '22 by GrvTyagi