
Running a Celery task when unable to import that task

I have two servers: one running a Django app and one running both a RabbitMQ broker and a Celery worker. My tasks.py on the server running the broker/worker contains a task as follows:

import time
from celery.task import task

@task(queue="reports")
def test_task():
    time.sleep(120)

My goal is to execute this task from a Django view. Since the task's code lives on a different server than the Django view that needs to call it, I'm trying to use the following code to send the task from Django to the worker machine.

send_task("tasks.test_task", task_id=task_id, args=[], kwargs={}, publisher=publisher, queue=queue)

I found this method here, but so far testing it hasn't worked.

I'm testing by running tail -F on the Celery worker's logfile on the worker server, then navigating in a browser to the URL of the view that calls send_task. I expect the task to show up as 'received' in the tail output, but it doesn't.

The Celery worker's log level is DEBUG, the logfile shows that the task is registered under the proper name, and the Django app's settings.py contains the correct IP and credentials for the RabbitMQ server. While trying different approaches, I've occasionally seen an error in the Celery logfile when I changed the string passed to send_task to something that isn't a valid task name (e.g. send_task('asdf')); this raised an UnregisteredError in the logfile. However, it only happens sometimes, and in testing different combinations of settings and calls I haven't found a way to reliably reproduce the behavior.

Also, this is the relevant section of settings.py in the Django project (with actual values removed):

CELERY_RESULT_BACKEND = 'amqp'
BROKER_HOST = 'the.correct.IP.address'
BROKER_USER = 'the_correct_user'
BROKER_PASSWORD = 'the_correct_pass'
BROKER_VHOST = 'the_correct_vhost'
BROKER_PORT = 5672
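
For reference, on later Celery versions these separate settings collapse into a single broker URL; a sketch using the same placeholder values:

BROKER_URL = 'amqp://the_correct_user:the_correct_pass@the.correct.IP.address:5672/the_correct_vhost'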

I've googled around and haven't found much on send_task. Any ideas on what I might be doing wrong?

asked Dec 19 '11 by Emmett Butler


2 Answers

Resolved: it turns out the publisher keyword argument I was passing to send_task was invalid and was throwing an error. I didn't see the error because I was requesting the page via AJAX rather than navigating to it directly. Everything else about this setup was correct. I also removed the unnecessary args and kwargs keyword arguments from the send_task call.

send_task("tasks.test_task", task_id=task_id, queue=queue)
answered by Emmett Butler


What [I thought you were] trying to do is impossible. Celery workers require access to the task code they are to run. There's no way around that.

REVISED:

But what you really want is to have the code available to the workers but NOT to the Django view, which should refer to tasks only by name.
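
As a rough illustration of that split (the module layout, broker URL, and worker command are examples, not taken from the question), the task is defined and registered only on the worker box, and the Django side refers to it purely by its registered name:

# tasks.py -- exists only on the worker server
import time

from celery import Celery

app = Celery(broker="amqp://user:password@broker-host:5672/vhost")

@app.task(queue="reports")
def test_task():
    time.sleep(120)

# Started on that server with something like:
#   celery -A tasks worker -Q reports --loglevel=DEBUG
#
# The Django server never imports tasks.py; it only sends the string
# "tasks.test_task" over the broker, as in the accepted answer above.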

answered by dkamins