
Celery worker hangs without any error

I have a production setup running Celery workers that make POST/GET requests to a remote service and store the result. It handles a load of around 20k tasks per 15 minutes.

The problem is that the workers go numb for no reason: no errors, no warnings.

I have also tried adding multiprocessing, with the same result.

In the log I can see the task execution time increasing (the "succeeded in ...s" lines).

For more details look at https://github.com/celery/celery/issues/2621

Maddy asked May 16 '15 07:05


2 Answers

If your Celery worker sometimes gets stuck, you can use strace and lsof to find out at which system call it is stuck.

For example:

$ strace -p 10268 -s 10000
Process 10268 attached - interrupt to quit
recvfrom(5,

10268 is the PID of the Celery worker; recvfrom(5 means the worker is blocked receiving data from file descriptor 5.

Then you can use lsof to check what file descriptor 5 refers to in this worker process.

lsof -p 10268
COMMAND   PID USER   FD   TYPE    DEVICE SIZE/OFF      NODE NAME
......
celery  10268 root    5u  IPv4 828871825      0t0       TCP 172.16.201.40:36162->10.13.244.205:wap-wsp (ESTABLISHED)
......

It indicates that the worker is stuck on a TCP connection (you can see 5u in the FD column).

Some Python packages like requests block while waiting for data from the peer, which can cause a Celery worker to hang. If you are using requests, make sure to set the timeout argument on every call.


Have you seen this page:

https://www.caktusgroup.com/blog/2013/10/30/using-strace-debug-stuck-celery-tasks/

Gary Gauh answered Oct 09 '22 11:10


I also faced this issue when using delay() with a shared_task (celery, kombu, amqp, billiard). After calling the API, everything worked fine until the call to delay() for the @shared_task, at which point it hung.

So, the issue was that the settings below were missing from the main application's __init__.py.


In __init__.py:

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celeryApp

__all__ = ['celeryApp']

Note 1: In place of celeryApp, use the name under which your Celery app instance is defined in celery.py; import that instance and export it here.
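For context, the __init__.py above assumes a celery.py next to it that defines the app. A minimal sketch of that file (the project name proj and the settings path are assumptions; adjust to your project):

```python
# proj/celery.py  ("proj" is an assumed project name)
from __future__ import absolute_import, unicode_literals

import os

from celery import Celery

# Point Celery at the Django settings module (path is an assumption).
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# Read broker/backend config from Django settings, CELERY_-prefixed keys.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Discover @shared_task functions in each installed app's tasks.py.
app.autodiscover_tasks()
```

With this in place, the `from .celery import app as celeryApp` import in __init__.py resolves, and @shared_task functions bind to this app instead of hanging on delay().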

Note 2: If you are only facing the hang in a shared task, the solution above may fix it and you can ignore the points below.

I also want to mention another issue: if anyone is facing an Error 111 (connection refused), check whether your versions of amqp==2.2.2, billiard==3.5.0.3, celery==4.1.0, and kombu==4.1.0 are mutually compatible (the versions mentioned are just an example). Also check whether Redis is installed on your system (if you are using Redis).

Also make sure you are using Kombu 4.1.0; later versions of Kombu rename async to asynchronous.
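To lock in a compatible combination, you can pin the versions (these exact pins are the example ones from above, not a recommendation) in a requirements.txt:

```
amqp==2.2.2
billiard==3.5.0.3
celery==4.1.0
kombu==4.1.0
```

Installing from a pinned file keeps all four packages moving together instead of one upgrading past the others.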

Vinay Kumar answered Oct 09 '22 11:10