 

Celery logger configuration

I'm using Django 1.10, Python 3.5, and Celery 4.1.0. I'm trying to log Celery task info to a file, so I tried what the Celery documentation suggests:

from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

and then logged a message inside a task:

logger.info(message)
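For context on why a `celery.task` logger in settings should catch these messages: `get_task_logger` parents the returned logger under `celery.task`, so records reach that logger's handlers through standard propagation. A stdlib-only sketch of the same hierarchy (no Celery needed; the child name here just mirrors what `get_task_logger(__name__)` would produce):

```python
import logging

# get_task_logger(__name__) effectively parents your logger under
# 'celery.task'; this stdlib hierarchy mirrors what Celery builds.
task_root = logging.getLogger("celery.task")
task_logger = logging.getLogger("celery.task.myapp.tasks")

# Records emitted on the child propagate up to 'celery.task' handlers.
assert task_logger.parent is task_root
assert task_logger.propagate is True
```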

I expected it to log to my default logger, but it didn't. So I added a dedicated logger named 'celery.task' to my settings (as I understand from the documentation):

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'filters': {
        'require_debug_false': {
            '()': 'django.utils.log.RequireDebugFalse',
        },
        'require_debug_true': {
            '()': 'django.utils.log.RequireDebugTrue',
        },
        'require_test_false': {
            '()': 'myapp.utils.classes.logging.RequireTestFalse',
        },
        'suppress_deprecated': {
            '()': 'myapp.utils.classes.logging.SuppressDeprecated',
        },
    },
    'handlers': {
        'console': {
            'level': 'INFO',
            'class': 'logging.StreamHandler',
            'formatter': 'json',
            'filters': ['suppress_deprecated'],
        },
        'celery_file': {
            'level': 'INFO',
            'class': 'myapp.utils.classes.logging.SBRotatingFileHandler',
            'maxBytes': 1024 * 1024 * 200,  # 200 MB
            'backupCount': 10,
            'formatter': 'json',
            'filename': BASE_DIR + '/../log/celery.log',
        },
    },
    'loggers': {
        'django': {
            'handlers': ['console', 'file'],
            'level': LOG_LEVEL,
            'propagate': True,
        },
        'celery.task': {
            'handlers': ['console', 'celery_file'],
            'level': 'INFO',
            'propagate': True,
        },
    },
}

But I still don't see logs from the Celery task, neither in the celery.log file nor in the default log file.

Logs are only written to that file when I start the Celery worker with the '-f' option. Any ideas?

EDIT: I'm trying to use the 'after_setup_task_logger' signal to update the 'celery.task' logger to use a handler that exists in my LOGGING config (in settings), without success. I've tried the following:

@celery.signals.after_setup_task_logger.connect
def after_setup_logging(logger, **kwargs):
    logging_settings = settings.LOGGING
    celery_handler = logging_settings['handlers']['celery_file']
    logger.addHandler(celery_handler)

But that doesn't work; I'm getting

AttributeError: 'dict' object has no attribute 'createLock'

which means a proper handler object was never created. So I tried getting the handler from the 'logging' module directly, but I don't see my handler in either logging._handlers or logging._handlerList.
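The error makes sense once you note that entries in settings.LOGGING['handlers'] are plain dicts describing handlers, not handler instances; addHandler() needs an actual logging.Handler. A minimal sketch of the distinction:

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

# What settings.LOGGING stores: configuration *data*, not a Handler object.
handler_cfg = {
    "class": "logging.handlers.RotatingFileHandler",
    "maxBytes": 1024,
    "backupCount": 2,
}
assert not isinstance(handler_cfg, logging.Handler)

# What addHandler() actually needs: an instantiated Handler.
log_path = os.path.join(tempfile.mkdtemp(), "celery.log")
handler = RotatingFileHandler(log_path, maxBytes=1024, backupCount=2)
assert isinstance(handler, logging.Handler)
assert hasattr(handler, "createLock")  # the attribute the dict was missing
```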

UPDATE: That's what finally worked for me:

from logging.handlers import RotatingFileHandler
from pythonjsonlogger import jsonlogger


def create_celery_logger_handler(logger, propagate):
    celery_handler = RotatingFileHandler(
        settings.CELERY_LOG_FILE,
        maxBytes=209715200,  # 1024 * 1024 * 200, i.e. 200 MB, same as in settings
        backupCount=10
    )
    celery_formatter = jsonlogger.JsonFormatter(settings.LOGGING['formatters']['json']['format'])
    celery_handler.setFormatter(celery_formatter)

    logger.addHandler(celery_handler)
    logger.setLevel(settings.LOG_LEVEL)
    logger.propagate = propagate


@celery.signals.after_setup_task_logger.connect
def after_setup_celery_task_logger(logger, **kwargs):
    """ This function sets the 'celery.task' logger handler and formatter """
    create_celery_logger_handler(logger, True)


@celery.signals.after_setup_logger.connect
def after_setup_celery_logger(logger, **kwargs):
    """ This function sets the 'celery' logger handler and formatter """
    create_celery_logger_handler(logger, False)
asked Jan 16 '18 by user2880391


People also ask

How do you make a Celery log?

Celery has a specific option, -f/--logfile, which you can use: -f LOGFILE, --logfile=LOGFILE, the path to the log file. If no logfile is specified, stderr is used.

How does Celery beat?

celery beat is a scheduler. It kicks off tasks at regular intervals, which are then executed by the worker nodes available in the cluster. By default the entries are taken from the CELERYBEAT_SCHEDULE setting, but custom stores can also be used, like storing the entries in an SQL database.
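For reference, a CELERYBEAT_SCHEDULE entry is just a dict mapping an entry name to a task path and an interval; a minimal sketch (the entry name and task path are hypothetical):

```python
from datetime import timedelta

# Hypothetical schedule: run myapp.tasks.cleanup every 5 minutes.
CELERYBEAT_SCHEDULE = {
    "cleanup-every-five-minutes": {
        "task": "myapp.tasks.cleanup",
        "schedule": timedelta(minutes=5),
    },
}
```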


2 Answers

For what it's worth, this is how I configured celery to use my Django logging settings:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from celery.signals import setup_logging

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'app.settings')

app = Celery('app')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

@setup_logging.connect
def config_loggers(*args, **kwags):
    from logging.config import dictConfig
    from django.conf import settings
    dictConfig(settings.LOGGING)

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

That was about it; after that change my Django logging settings worked for Celery, including the logging I had set up to send log messages over to Slack.
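The heavy lifting in that setup_logging handler is just logging.config.dictConfig applying the Django LOGGING dict; a stdlib sketch of that step with a trimmed-down config:

```python
import logging
from logging.config import dictConfig

# A trimmed-down stand-in for settings.LOGGING:
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {
        "console": {"class": "logging.StreamHandler", "level": "INFO"},
    },
    "loggers": {
        "celery.task": {"handlers": ["console"], "level": "INFO"},
    },
}

dictConfig(LOGGING)  # what the signal handler runs with the real settings

task_logger = logging.getLogger("celery.task")
assert task_logger.handlers               # handler was installed
assert task_logger.level == logging.INFO  # level applied from the dict
```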

answered Oct 20 '22 by chander


By default, Celery resets the handlers on the celery.task logger; you can disable this behavior with the worker_hijack_root_logger option. Alternatively, you can reconfigure this logger in the after_setup_task_logger signal, or prevent Celery from configuring the loggers at all with the setup_logging signal:

from celery.signals import setup_logging

@setup_logging.connect()
def config_loggers(*args, **kwargs):
    from logging.config import dictConfig
    dictConfig(app.config['LOGGING_CONFIG'])
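The first option mentioned above, turning off the hijack, is a one-line configuration change; a sketch, assuming `app` is your Celery instance:

```python
# A sketch, assuming `app` is your Celery() instance:
app.conf.worker_hijack_root_logger = False

# Pre-4.0 uppercase settings style for the same option:
# CELERYD_HIJACK_ROOT_LOGGER = False
```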
answered Oct 20 '22 by georgexsh