
Celery workers not consuming enough tasks

I have a strange issue with Celery.

I am using RabbitMQ as message broker and result backend.

Tasks are serialized via pickle, but each one only receives the id of a file stored in a database. The task fetches the file, does some work on it, and writes the result back to the database. I'm only storing the id in the result backend.
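Roughly, the setup looks like the sketch below (the app name 'proj', the task process_file, and the helpers load_file, do_work, and save_result are placeholders, not my real code):

    from celery import Celery

    app = Celery('proj', broker='amqp://guest@localhost//', backend='amqp')

    app.conf.update(
        CELERY_TASK_SERIALIZER='pickle',
        CELERY_RESULT_SERIALIZER='pickle',
        CELERY_ACCEPT_CONTENT=['pickle'],
    )

    @app.task
    def process_file(file_id):
        data = load_file(file_id)        # placeholder: fetch the file by id from the DB
        result = do_work(data)           # placeholder: the CPU-heavy processing
        save_result(file_id, result)     # placeholder: write the result back to the DB
        return file_id                   # only the id ends up in the result backend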

I use a group to supply the tasks and don't run any subtasks from within it.
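Dispatching looks roughly like this (reusing the placeholder process_file task from the sketch above; file_ids is just the list of database ids to work on):

    from celery import group

    file_ids = [1, 2, 3]    # in reality, the ids of the files waiting in the database
    job = group(process_file.s(file_id) for file_id in file_ids)
    result = job.apply_async()
    result.get()            # block until every id has come back from the result backend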

I have one worker with concurrency=8 (prefork)

If I start the tasks, all 8 processes are working (100% CPU usage).

After the first task finishes, the strange behavior begins. The process does not begin a new task. The task gets initialized (I set CELERYD_MAX_TASKS_PER_CHILD=1), but its run method doesn't get called.
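For reference, that recycling setting is just this one line in the (old-style) configuration:

    CELERYD_MAX_TASKS_PER_CHILD = 1    # replace each worker child process after a single task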

So the problem is that not all processes are working all the time.

I tried many configuration settings, but nothing changed this behavior.

Do you have any idea?

It's not the database or anything like that: the message broker and the database are both running locally. I also had a look at the workers with Flower; it shows that most of the time only around 4 processes are active. The other tasks are reserved, but don't start.

Hope you can help me!

asked Sep 05 '25 by user2221323

1 Answer

Finally figured it out:

It's just an option I had to pass when starting the worker.

Starting the worker with the -Ofair option did it!
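So the worker is now started like this ('proj' is a placeholder app name; the concurrency matches my setup). With -Ofair the pool only hands a task to a child process that is actually free, instead of prefetching tasks to children that are still busy:

    celery -A proj worker --concurrency=8 -Ofair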

See: http://docs.celeryproject.org/en/latest/userguide/optimizing.html#prefork-pool-prefetch-settings

Thanks for your help :)

answered Sep 07 '25 by user2221323