I have a problem with RotatingFileHandler in Django.
When the log file reaches maxBytes, it does not roll over to a new file; instead, any call to logger.info("any message") raises an error:
The strange part is that the loggers are initialized only once, at the top of the file (chartLogger = getLogger...), and different functions in the same file use the same logger name.
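For context, the logger setup looks roughly like this (a sketch; the logger name is taken from the dictConfig below, and the function body is hypothetical):

```python
import logging

# Module-level logger, created once at import time. getLogger() returns
# the same instance for the same name, so every function in views.py
# that uses this name shares one logger.
chartLogger = logging.getLogger('views.logger.chartConfigure')

def chart_configure():
    # Hypothetical view body; only the logging call matters here.
    chartLogger.info("configuring chart")
```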
Logged from file views.py, line 1561
Traceback (most recent call last):
File "C:\Python27\lib\logging\handlers.py", line 77, in emit
self.doRollover()
File "C:\Python27\lib\logging\handlers.py", line 142, in doRollover
os.rename(self.baseFilename, dfn)
WindowsError: [Error 32] The process cannot access the file because it is being used by another process
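What doRollover attempts can be sketched roughly like this (a simplified approximation of the stdlib logic, not the exact implementation):

```python
import os

def simple_rollover(base_filename, backup_count=1):
    # Shift existing backups: views.log.1 -> views.log.2, and so on.
    for i in range(backup_count - 1, 0, -1):
        sfn = "%s.%d" % (base_filename, i)
        dfn = "%s.%d" % (base_filename, i + 1)
        if os.path.exists(sfn):
            if os.path.exists(dfn):
                os.remove(dfn)
            os.rename(sfn, dfn)
    dfn = base_filename + ".1"
    if os.path.exists(dfn):
        os.remove(dfn)
    # This is the rename that raises WindowsError 32 on Windows when
    # another process still holds the live log file open.
    os.rename(base_filename, dfn)
    # Start a fresh, empty log file.
    open(base_filename, "w").close()
```

The rename of the live log file is the step that fails: Windows refuses to rename a file while any other process has an open handle to it.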
Inside my settings.py, I have:
LOGGING = {
'version': 1,
'disable_existing_loggers': True,
'formatters' : {
'standard' : {
'format' : '%(asctime)s [%(levelname)s] %(name)s: %(message)s'
},
},
'handlers': {
'celery.webapp' : {
'level' : 'ERROR',
'class' : 'django.utils.log.AdminEmailHandler',
},
'celery' : {
'level' : 'INFO',
'class' : 'logging.handlers.RotatingFileHandler',
'filename' : 'logs/celery.log',
'maxBytes' : 1024*1024*10, # 10MB
'backupCount' : 10,
'formatter' : 'standard',
},
'views.error' : {
'level' : 'ERROR',
'class' : 'django.utils.log.AdminEmailHandler',
},
'views' : {
'level' : 'INFO',
'class' : 'logging.handlers.RotatingFileHandler',
'filename' : 'logs/views.log',
'maxBytes' : 1024*1024*10, # 10MB
'backupCount' : 10,
'formatter' : 'standard',
},
},
'loggers': {
'celery.webapp' : {
'level' : 'ERROR',
'handlers' : ['celery.webapp'],
'propagate' : True,
},
'celery.webapp.task' : {
'level' : 'INFO',
'handlers' : ['celery'],
'propagate' : True,
},
'views.logger' : {
'level' : 'ERROR',
'handlers' : ['views.error'],
'propagate' : True,
},
'views.logger.login' : {
'level' : 'INFO',
'handlers' : ['views'],
'propagate' : True,
},
'views.logger.register' : {
'level' : 'INFO',
'handlers' : ['views'],
'propagate' : True,
},
'views.logger.chartConfigure' : {
'level' : 'INFO',
'handlers' : ['views'],
'propagate' : True,
},
'views.logger.sendEmail' : {
'level' : 'INFO',
'handlers' : ['views'],
'propagate' : True,
},
},
}
I have tried different file sizes, but it always gets stuck at maxBytes.
Although the error says the file is being used by another process, all logging works fine until the file hits maxBytes.
EDIT:
I've split the logging between celery and django.
LOGGING = {
'version': 1,
'disable_existing_loggers': True,
'formatters' : {
'standard' : {
'format' : '%(asctime)s [%(levelname)s] %(name)s: %(message)s'
},
},
'handlers': {
'celery.webapp' : {
'level' : 'ERROR',
'class' : 'django.utils.log.AdminEmailHandler',
},
'celery' : {
'level' : 'INFO',
'class' : 'logging.handlers.RotatingFileHandler',
'filename' : 'logs/celery.log',
'maxBytes' : 1024*1024*10, # 10MB
'backupCount' : 10,
'formatter' : 'standard',
},
'celery_chartConfigure' : {
'level' : 'INFO',
'class' : 'logging.handlers.RotatingFileHandler',
'filename' : 'logs/celery_chartConfigure.log',
'maxBytes' : 1024*1024*10, # 10MB
'backupCount' : 10,
'formatter' : 'standard',
},
'celery_register' : {
'level' : 'INFO',
'class' : 'logging.handlers.RotatingFileHandler',
'filename' : 'logs/celery_register.log',
'maxBytes' : 1024*1024*10, # 10MB
'backupCount' : 10,
'formatter' : 'standard',
},
'views.error' : {
'level' : 'ERROR',
'class' : 'django.utils.log.AdminEmailHandler',
},
'views' : {
'level' : 'INFO',
'class' : 'logging.handlers.RotatingFileHandler',
'filename' : 'logs/views.log',
'maxBytes' : 1024*1024*10, # 10MB
'backupCount' : 10,
'formatter' : 'standard',
},
'views_login' : {
'level' : 'INFO',
'class' : 'logging.handlers.RotatingFileHandler',
'filename' : 'logs/views_login.log',
'maxBytes' : 1024*1024*10, # 10MB
'backupCount' : 10,
'formatter' : 'standard',
},
'views_sendEmail' : {
'level' : 'INFO',
'class' : 'logging.handlers.RotatingFileHandler',
'filename' : 'logs/views_sendEmail.log',
'maxBytes' : 1024*1024*10, # 10MB
'backupCount' : 10,
'formatter' : 'standard',
},
'views_register' : {
'level' : 'INFO',
'class' : 'logging.handlers.RotatingFileHandler',
'filename' : 'logs/views_register.log',
'maxBytes' : 1024*1024*10, # 10MB
'backupCount' : 10,
'formatter' : 'standard',
},
'views_chartConfigure' : {
'level' : 'INFO',
'class' : 'logging.handlers.RotatingFileHandler',
'filename' : 'logs/views_chartConfigure.log',
'maxBytes' : 1024*1024*10, # 10MB
'backupCount' : 10,
'formatter' : 'standard',
},
},
'loggers': {
'celery.webapp' : {
'level' : 'ERROR',
'handlers' : ['celery.webapp'],
'propagate' : True,
},
'celery.webapp.task' : {
'level' : 'INFO',
'handlers' : ['celery'],
'propagate' : True,
},
'celery.webapp.chartConfigure' : {
'level' : 'INFO',
'handlers' : ['celery_chartConfigure'],
'propagate' : True,
},
'celery.webapp.register' : {
'level' : 'INFO',
'handlers' : ['celery_register'],
'propagate' : True,
},
'views.logger' : {
'level' : 'ERROR',
'handlers' : ['views.error'],
'propagate' : True,
},
'views.logger.login' : {
'level' : 'INFO',
'handlers' : ['views_login'],
'propagate' : True,
},
'views.logger.register' : {
'level' : 'INFO',
'handlers' : ['views_register'],
'propagate' : True,
},
'views.logger.chartConfigure' : {
'level' : 'INFO',
'handlers' : ['views_chartConfigure'],
'propagate' : True,
},
'views.logger.sendEmail' : {
'level' : 'INFO',
'handlers' : ['views_sendEmail'],
'propagate' : True,
},
},
}
However, it still fails in doRollover.
Shouldn't splitting the logs between Celery and Django solve the issue? With separate files, it isn't several processes accessing the same log; each file is written only by Django or only by Celery.
EDIT 2:
I'm also making Ajax calls. Could these somehow spawn another process that interferes with the logging?
I guess you are facing the problem described in this post: Django logging with RotatingFileHandler error.
That is, when you run the Django development server, there are actually two processes: one is the actual server, while the other watches for code changes and reloads the server. As a result, settings.py is imported twice, and both processes end up with handles on the same log file, so the rename in doRollover fails.
As advised there, try disabling the autoreloader:
python manage.py runserver --noreload
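If you still want the autoreloader during development, a common workaround is to attach the file handlers only in the child process that actually serves requests. This relies on the RUN_MAIN environment variable that Django's reloader sets in that child (a long-standing but undocumented convention), so treat it as a sketch:

```python
import os

def file_logging_is_safe():
    """Return True only in the process that serves requests.

    Django's autoreloader spawns a child with RUN_MAIN='true'; the
    parent process only watches for code changes. Configuring the
    RotatingFileHandler only when this returns True keeps a single
    process writing to each log file.
    """
    return os.environ.get('RUN_MAIN') == 'true'
```

In settings.py you could then add the RotatingFileHandler entries to LOGGING only when file_logging_is_safe() returns True.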