I have a logging setup in a Python application that logs to a file and to MongoDB. The setup looks like this:
[logger_root]
handlers=myHandler,mongoHandler
level=DEBUG
qualname=myapp
[handler_myHandler]
class=handlers.RotatingFileHandler
level=DEBUG
formatter=myFormatter
args=('myapp.log', 'a', 20000000, 10)
[handler_mongoHandler]
class=myapp.MongoLogger.MongoLogger
level=INFO
args=('log',)
[formatter_myFormatter]
format=%(asctime)s - %(name)s - %(levelname)s - %(message)s
And the MongoLogger has an emit() function like this:
def emit(self, record):
    logdata = record.__dict__
    try:
        if(self.data == None):
            self.initDb()
        self.logtable.insert(logdata)
    except:
        self.handleError(record)
Logging is then done like this:
logger.info("Processing account %s..." % account)
It works reasonably well, but now I have an additional requirement: I want it to have some context, i.e., to be able to define a custom value (say, the account name) so that every log call made while processing that account has the account name as part of the record passed to emit() above, and also available to the formatter in the myFormatter format string.
Is it possible to do this with the logging module? Is there maybe another, better way to do the same thing?
You can define a function inside the account processing code, after you get the account name, like this:
# account_name should already be defined
log = lambda msg: logger.info(msg, extra={'account': account_name})
###
log('Processing account...')
Note the extra keyword argument. It is used to add additional context to log records, in this case the account name.
You can use the context passed via extra in the formatter:
format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s - %(account)s'
Note that if you set up the formatter like this and forget to pass account, you'll get a string formatting exception.
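One way to guard against that (a sketch of my own, not part of the original answer): add a small filter that supplies a fallback account value for records logged without the extra argument, so they still format cleanly. On Python 3.10+ you could instead pass defaults={'account': '-'} to logging.Formatter. The filter name and the '-' placeholder below are illustrative.
import logging

class AccountDefaultFilter(logging.Filter):
    """Hypothetical helper: supplies a fallback 'account' for records logged without extra."""
    def filter(self, record):
        if not hasattr(record, 'account'):
            record.account = '-'  # placeholder value, pick whatever makes sense
        return True

handler = logging.StreamHandler()
handler.addFilter(AccountDefaultFilter())
handler.setFormatter(logging.Formatter(
    '%(asctime)s - %(name)s - %(levelname)s - %(message)s - %(account)s'))

logger = logging.getLogger('myapp')
logger.addHandler(handler)
logger.warning('no account passed here')                   # still formats, account shows '-'
logger.warning('with account', extra={'account': 'acme'})  # account shows 'acme'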
Python's official documentation (logging cookbook) suggests two approaches to add contextual information to logs:
1. Using a LoggerAdapter, which passes the context via the extra keyword argument (as in the answer above).
2. Using Filters. A Filter's filter() method returns a bool, which indicates whether to emit the record. However, it also allows you to process the record and add attributes based on whatever is required; for example, you can set the attributes from a global threading.local variable.
Below is a Filter example that appends attributes from a global threading.local variable:
log_utils.py
import logging
import threading
from typing import List

log_context_data = threading.local()

class ThreadingLocalContextFilter(logging.Filter):
    """
    This is a filter which injects contextual information from `threading.local` (log_context_data) into the log record.
    """
    def __init__(self, attributes: List[str]):
        super().__init__()
        self.attributes = attributes

    def filter(self, record):
        for a in self.attributes:
            setattr(record, a, getattr(log_context_data, a, 'default_value'))
        return True
log_context_data can be set when you start processing an account and reset when you're done. However, I recommend setting it using a context manager:
Also in log_utils.py:
class SessionContext(object):
    def __init__(self, logger, context: dict = None):
        self.logger = logger
        self.context: dict = context or {}

    def __enter__(self):
        for key, val in self.context.items():
            setattr(log_context_data, key, val)
        return self

    def __exit__(self, et, ev, tb):
        for key in self.context.keys():
            delattr(log_context_data, key)
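One caveat worth noting: __exit__ above simply deletes the attributes, so nested contexts would lose the outer values. If you need nesting, a variant that saves and restores previous values is safer; here is a rough sketch under that assumption (the class name is mine, and it uses the same log_context_data as above):
class NestedSessionContext(object):
    """Sketch: like SessionContext, but restores previously-set values on exit."""
    def __init__(self, logger, context: dict = None):
        self.logger = logger
        self.context: dict = context or {}
        self._saved = {}

    def __enter__(self):
        for key, val in self.context.items():
            if hasattr(log_context_data, key):
                self._saved[key] = getattr(log_context_data, key)
            setattr(log_context_data, key, val)
        return self

    def __exit__(self, et, ev, tb):
        for key in self.context:
            if key in self._saved:
                setattr(log_context_data, key, self._saved[key])
            else:
                delattr(log_context_data, key)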
And a usage example, my_script.py:
root_logger = logging.getLogger()
handler = ...
handler.setFormatter(
    logging.Formatter('{name}: {levelname} {account} - {message}', style='{'))
handler.addFilter(ThreadingLocalContextFilter(['account']))
root_logger.addHandler(handler)
...
...
with SessionContext(logger=root_logger, context={'account': account}):
    ...
    ...
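To tie this back to the question: any handler that has the filter attached sees the injected attribute on the record, so a handler like the asker's MongoLogger, whose emit() stores record.__dict__, would persist the account value too. A minimal, self-contained sketch (the DictPrinter handler and the 'acme' value are just stand-ins):
import logging

class DictPrinter(logging.Handler):
    """Stand-in for a handler that reads record.__dict__, like the MongoLogger in the question."""
    def emit(self, record):
        print(record.__dict__['account'], record.getMessage())

demo_handler = DictPrinter()
demo_handler.addFilter(ThreadingLocalContextFilter(['account']))
root_logger.addHandler(demo_handler)

with SessionContext(logger=root_logger, context={'account': 'acme'}):
    root_logger.warning("Processing account...")  # prints: acme Processing account...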
Notes:
- A Filter is only applied to the logger it is attached to. So if we attach it to logging.getLogger('foo'), it won't affect logging.getLogger('foo.bar'). The solution is to attach the Filter to a Handler rather than to a logger.
- ThreadingLocalContextFilter could also have been written to reject records when log_context_data doesn't contain the required attribute. This depends on what you need.
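A minimal sketch of the first note (names are illustrative): a filter added to logging.getLogger('foo') is only consulted for calls made on 'foo' itself, while a filter added to a handler runs for every record that reaches that handler, including records propagated from 'foo.bar'.
import logging

class MarkRecord(logging.Filter):
    def filter(self, record):
        record.marked = True
        return True

parent = logging.getLogger('foo')
child = logging.getLogger('foo.bar')
handler = logging.StreamHandler()
parent.addHandler(handler)

parent.addFilter(MarkRecord())   # applied only to records logged via 'foo' itself
handler.addFilter(MarkRecord())  # applied to every record this handler emits,
                                 # including those propagated from 'foo.bar'

child.warning("propagated")      # the logger-level filter on 'foo' does not run for this call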
Tony's answer just provides a function, but by using a LoggerAdapter we can get a context-aware logger.
import copy
from logging import LoggerAdapter

class ILoggerAdapter(LoggerAdapter):
    def __init__(self, logger, extra):
        super(ILoggerAdapter, self).__init__(logger, extra)
        self.env = extra

    def process(self, msg, kwargs):
        msg, kwargs = super(ILoggerAdapter, self).process(msg, kwargs)
        result = copy.deepcopy(kwargs)
        # Move any non-standard keyword arguments into the 'extra' dict
        default_kwargs_keys = ['exc_info', 'stack_info', 'extra']
        custom_keys = [k for k in result.keys() if k not in default_kwargs_keys]
        result['extra'].update({k: result.pop(k) for k in custom_keys})
        return msg, result
Then just wrap your logger like this:
new_logger = ILoggerAdapter(old_logger, extra={'name': 'name'})
# new_logger behaves just like old_logger,
# but extra keyword arguments are passed through to the record
new_logger.info(msg='haha', id=100)
And in your handler's emit() you can then read those values from the record:
def emit(self, record):
    print(record.name)
    print(record.id)
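For comparison, if you only need fixed context values (and not the extra keyword arguments like id=100 above), the stock logging.LoggerAdapter already merges its extra dict into every record, no subclass required. A quick sketch with a placeholder account name:
import logging

logger = logging.getLogger(__name__)
adapter = logging.LoggerAdapter(logger, extra={'account': 'acme'})  # 'acme' is a placeholder
adapter.warning('Processing account...')  # handlers now see record.account == 'acme'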
Forgive the self-promotion, but I have written a Python package called log-with-context that solves this problem. You can find it on PyPI and GitHub. Here is how to use log-with-context to log messages using an arbitrary number of thread-local context values.
First, install log-with-context and a log formatter. You have to pair log-with-context with a formatter that displays the extra fields; my favorite log formatter is JSON-log-formatter.
You can install both packages with pip:
pip3 install log-with-context JSON-log-formatter
Here is how to configure JSON-log-formatter to print JSON logs at level INFO and above to standard error.
import logging.config

logging.config.dictConfig({
    "version": 1,
    "disable_existing_loggers": True,
    "formatters": {
        "json": {"()": "json_log_formatter.JSONFormatter"},
    },
    "handlers": {
        "console": {
            "formatter": "json",
            "class": "logging.StreamHandler",
        }
    },
    "loggers": {
        "": {"handlers": ["console"], "level": "INFO"},
    },
})
Then, here is how to create your logger with log-with-context:
from log_with_context import add_logging_context, Logger
LOGGER = Logger(__name__)
You can push and pop context keys and values using the add_logging_context context manager. Here is some example code to show how to use it:
with add_logging_context(current_request="hi"):
    LOGGER.info("Level 1")
    with add_logging_context(more_info="this"):
        LOGGER.warning("Level 2")
    LOGGER.info("Back to level 1")
LOGGER.error("No context at all...")
And this is what the above log messages look like when printed as JSON:
{"current_request": "hi", "message": "Level 1", "time": "2021-08-03T21:29:45.987392"}
{"current_request": "hi", "more_info": "this", "message": "Level 2", "time": "2021-08-03T21:29:45.988786"}
{"current_request": "hi", "message": "Back to level 1", "time": "2021-08-03T21:29:45.989178"}
{"message": "No context at all...", "time": "2021-08-03T21:29:45.989600"}