 

django+uwsgi logging with TimedRotatingFileHandler "overwrites rotated log file"

I recently switched my Django production web app from Apache+mod_wsgi to nginx+uwsgi in emperor mode. Everything works fine except the time-rotated log files. My web app uses a log file named appname.log to log all requests, and with Apache it rotated at midnight without problems.

With uwsgi the file rotates at midnight, but some uwsgi processes/workers keep writing into the rotated file (for example appname.log.2017-01-08) instead of into appname.log, which causes the rotated file to be overwritten.

A solution seems to be touching the uwsgi .ini file (I'm not completely sure), but I don't want to restart/reload uwsgi while users are still connected to my app.

Is there a configuration I can use to notify all uwsgi processes that the log file has changed, without restarting the web app? If possible I would like the same behaviour I had with Apache+mod_wsgi.

ConcurrentLogHandler is too old, and I don't want to use syslog or logrotate :)

Has anyone had the same problem? Any suggestions?

Thanks

These are my settings:

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'verbose': {
            'format': '[%(asctime)s];[%(levelname)s];[Proc:%(process)d];[Thread:%(thread)d];%(module)s-%(funcName)s:%(lineno)d;Msg:%(message)s;'
        },
        'simple': {
            'format': '[%(asctime)s] [%(levelname)s] %(message)s'
        },
    },
    'filters': {
        'require_debug_false': {
            '()': 'django.utils.log.RequireDebugFalse'
        }
    },
    'handlers': {
        'file': {
            'level': 'DEBUG',
            'class': 'logging.handlers.TimedRotatingFileHandler',
            'filename': LOG_FILE,
            'when': 'midnight',
            'interval': 1,
            'backupCount': 365,
            'formatter': 'verbose'
        },
        'null': {
            'level': 'DEBUG',
            'class': 'logging.NullHandler',
        },
        'console': {
            'level': 'DEBUG',
            'class': 'logging.StreamHandler',
            'formatter': 'verbose'
        },
        'mail_admins': {
            'level': 'ERROR',
            'class': 'django.utils.log.AdminEmailHandler',
            'filters': ['require_debug_false']
        }
    },
    'loggers': {
        APP_NAME: {
            'handlers': ['console', 'file'],
            'propagate': True,
            'level': 'INFO',
        },
        'django': {
            'handlers': ['mail_admins', 'file'],
            'level': 'ERROR',
            'propagate': True,
        },
    }
}
Pistis Valentino asked Jan 10 '17 00:01


1 Answer

We have experienced the same problem with nginx+gunicorn. We had very similar settings:

LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "verbose": {"format": "%(levelname)s %(asctime)s %(module)s %(process)d %(thread)d %(message)s"},
        "simple": {"format": "%(levelname)s %(asctime)s  %(message)s", "datefmt": "%Y-%m-%d %H:%M"},
    },
    "filters": {
          "require_debug_true": {"()": "django.utils.log.RequireDebugTrue"},
          "require_debug_false": {"()": "django.utils.log.RequireDebugFalse"},
    },
    "handlers": {
        "logfile": {
            "class": "logging.handlers.TimedRotatingFileHandler",
            "when": "D",
            "interval": 10, 
            "backupCount": 100,
            "filename": LOG_FILE,
            "formatter": "simple",
            "level": "DEBUG",
        },
    },
    "loggers": {    
        APP_NAME: {
            "handlers": ["console", "plus_logfile"],
            "propagate": True,
            "level": "DEBUG",
        },
    },
}

According to Python Logging Cookbook - Logging to a single file from multiple processes:

Although logging is thread-safe, and logging to a single file from multiple threads in a single process is supported, logging to a single file from multiple processes is not supported, because there is no standard way to serialize access to a single file across multiple processes in Python. If you need to log to a single file from multiple processes, one way of doing this is to have all the processes log to a SocketHandler, and have a separate process which implements a socket server which reads from the socket and logs to file. (If you prefer, you can dedicate one thread in one of the existing processes to perform this function.)

So the best way is to have a socket listener process that receives log records from all the uwsgi workers and writes them to a single file.

On the Django side, the setting would then point a SocketHandler at that listener (the host and port here are illustrative; 9020 is logging.handlers.DEFAULT_TCP_LOGGING_PORT):

"handlers": {
    "socket": {
        "class": "logging.handlers.SocketHandler",
        "host": "localhost",
        "port": 9020,
        "level": "DEBUG",
    },
},
"loggers": {
    APP_NAME: {
        "handlers": ["socket"],
        "propagate": True,
        "level": "DEBUG",
    },
},

And you can find example listener code in the same Logging Cookbook section. That code, however, writes to the console, so if you want to write to a file you need to edit the listener script.
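A minimal file-writing variant of the cookbook listener might look like the sketch below. The file name, formatter, host, and port are assumptions for illustration; the key point is that only this one process owns the TimedRotatingFileHandler, so rollover is safe:

```python
import logging
import logging.handlers
import pickle
import socketserver
import struct


class LogRecordStreamHandler(socketserver.StreamRequestHandler):
    """Receives pickled LogRecords sent by logging.handlers.SocketHandler."""

    def handle(self):
        while True:
            # SocketHandler prefixes each pickled record with a
            # 4-byte big-endian length.
            chunk = self.connection.recv(4)
            if len(chunk) < 4:
                break
            slen = struct.unpack(">L", chunk)[0]
            data = self.connection.recv(slen)
            while len(data) < slen:
                data += self.connection.recv(slen - len(data))
            record = logging.makeLogRecord(pickle.loads(data))
            # Re-dispatch through this process's logging config, so the
            # listener is the only process that ever touches the file.
            logging.getLogger(record.name).handle(record)


def run_listener(host="localhost",
                 port=logging.handlers.DEFAULT_TCP_LOGGING_PORT):
    # Rotation is safe here because a single process owns the file.
    # "appname.log" and the format string are placeholders.
    file_handler = logging.handlers.TimedRotatingFileHandler(
        "appname.log", when="midnight", backupCount=365)
    file_handler.setFormatter(
        logging.Formatter("[%(asctime)s] [%(levelname)s] %(message)s"))
    root = logging.getLogger()
    root.setLevel(logging.DEBUG)
    root.addHandler(file_handler)
    server = socketserver.ThreadingTCPServer((host, port),
                                             LogRecordStreamHandler)
    server.serve_forever()


if __name__ == "__main__":
    run_listener()
```

Run this as a separate process (or thread) before starting uwsgi; the workers then only ever open TCP connections, never the log file itself.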

cgl answered Nov 02 '22 22:11