I have a situation here. I have a Celery task that uses the Python logger to log to a log file. However, when I run the Celery workers, I can also see the log messages on the screen. They get written to the log file, but I also get a flood of messages on my screen.
The issue arises when I put Celery under supervisord. I have set the stdout and stderr log paths, but all the worker log messages get written to the stderr log file instead of the stdout log file. This way I am unable to find the real error messages in the stderr log, and those messages are redundant anyway since they are already written to the application log file.
I tried setting the log level when starting the Celery workers, but nothing seems to stop the workers from flushing the log messages to the screen, which, when run under supervisord, get written to stderr.
Here is my supervisor conf file:
[program:celery_worker]
command=celery -A CeleryWorker worker --concurrency=4 -l error
directory=/home/swaroop/codebase/src
stdout_logfile=/home/swaroop/codebase/logs/clLogger.log
stderr_logfile=/home/swaroop/codebase/logs/clerrorLogger.log
autostart=true
autorestart=true
Does anyone have a solution to this issue? Any technique to separate the real Celery error messages from the application log messages would be helpful.
Unless you specify a log file for the Celery worker, it will log to stderr. Therefore, pass -f LOGFILE (or --logfile=LOGFILE) when you start a worker to avoid logging to the screen.
For example:
$ celery -A proj worker -f proj.log -l ERROR
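Applied to the supervisord configuration from the question, the same flag keeps the worker's own output out of the stderr log file (a sketch; the paths are the ones from the question, and the --logfile destination is an assumption, pick any location the worker can write to):

```ini
[program:celery_worker]
; --logfile sends the worker's log output to its own file instead of stderr
command=celery -A CeleryWorker worker --concurrency=4 -l error --logfile=/home/swaroop/codebase/logs/celery_worker.log
directory=/home/swaroop/codebase/src
stdout_logfile=/home/swaroop/codebase/logs/clLogger.log
stderr_logfile=/home/swaroop/codebase/logs/clerrorLogger.log
autostart=true
autorestart=true
```

With this, clerrorLogger.log should only receive genuine startup/crash output from the process itself.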
If you are handling all Celery log messages with your own logger but want the startup error messages to go to a specific log, I do:
celery worker --app proj --logfile=/dev/null
This will stop all of those messages from being printed to the screen.
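On the application side, the duplication can also be reduced in the task code itself. A minimal sketch with the standard logging module (the logger name and file path are hypothetical, not from the question): attaching your own FileHandler and disabling propagation keeps task messages in the application log file without them bubbling up to Celery's stderr handlers.

```python
import logging

# Hypothetical application logger for the task module; the name and
# file path are placeholders, not taken from the question.
logger = logging.getLogger("myapp.tasks")
logger.setLevel(logging.INFO)
logger.propagate = False  # do not bubble records up to parent/Celery handlers

# Write task messages to the application log file only.
handler = logging.FileHandler("application.log")
handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
)
logger.addHandler(handler)

logger.info("this line goes to application.log, not to stderr")
```

Because propagate is False, nothing reaches the root logger, so the message appears only in application.log.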