I am trying to set up logging so that I can log to both stdout and a file. I have accomplished this using the following code:
import logging

logging.basicConfig(
    level=logging.DEBUG, format='%(asctime)-15s %(levelname)-8s %(message)s',
    datefmt='%a, %d %b %Y %H:%M:%S',
    handlers=[logging.FileHandler(path), logging.StreamHandler()])
The output of this looks something like this:
2018-05-02 18:43:33,295 DEBUG Starting new HTTPS connection (1): google.com
2018-05-02 18:43:33,385 DEBUG https://google.com:443 "GET / HTTP/1.1" 301 220
2018-05-02 18:43:33,389 DEBUG Starting new HTTPS connection (1): www.google.com
2018-05-02 18:43:33,490 DEBUG https://www.google.com:443 "GET / HTTP/1.1" 200 None
What I am trying to accomplish is to log this output to a file not as it is printed to stdout, but as a dictionary or JSON object, similar to this (while keeping stdout as it is at the moment):
[{'time': '2018-05-02 18:43:33,295', 'level': 'DEBUG', 'message': 'Starting new HTTPS connection (1): google.com'}, {...}, {...}]
Is this doable? I understand that I can post-process this log file after my process is finished, but I am looking for a more elegant solution because certain things I am logging are quite big objects themselves.
The biggest benefit of logging in JSON is that it's a structured data format. This makes it possible for you to analyze your logs like big data. It's not just readable text, but a database that can be queried.
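For instance, if each log line were a valid JSON object (one object per line), the file could be filtered like a small dataset. A minimal sketch, assuming a hypothetical foo.log already written in that format:

import json

# Sketch only: assumes foo.log holds one JSON object per line,
# each with 'time', 'level' and 'message' keys.
with open('foo.log') as f:
    records = [json.loads(line) for line in f if line.strip()]

# Query it like a small database, e.g. keep only the DEBUG entries.
for entry in (r for r in records if r.get('level') == 'DEBUG'):
    print(entry['time'], entry['message'])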
So, based on @abarnert's suggestion, I found this link, which provided a good path to making this concept work for the most part. The code as it stands is:
import logging

logger = logging.getLogger()
logger.setLevel(logging.DEBUG)

file_handler = logging.FileHandler('foo.log')
stream_handler = logging.StreamHandler()

# Plain text for the console.
stream_formatter = logging.Formatter(
    '%(asctime)-15s %(levelname)-8s %(message)s')

# Dict-like lines for the file (not valid JSON, and quotes inside a
# message will break the line).
file_formatter = logging.Formatter(
    "{'time': '%(asctime)s', 'name': '%(name)s', "
    "'level': '%(levelname)s', 'message': '%(message)s'}"
)

file_handler.setFormatter(file_formatter)
stream_handler.setFormatter(stream_formatter)

logger.addHandler(file_handler)
logger.addHandler(stream_handler)
Although it does not fully meet the requirement, it doesn't require any pre-processing and allows me to create two log handlers with different formats.
Afterwards, I can use something like:
import ast

with open('foo.log') as f:
    logs = f.read().splitlines()
for l in logs:
    # literal_eval is a safer alternative to eval for these dict-like lines
    for key, value in ast.literal_eval(l).items():
        ...  # do something with each key/value pair
to pull dict objects out of the file instead of fighting with improperly formatted JSON, which accomplishes what I had set out to do.
I am still hoping for a more elegant solution, though.
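One direction that might get there (a minimal sketch; JsonFormatter is just an illustrative name, not part of the setup above) is to subclass logging.Formatter and serialize each record with json.dumps, so the file ends up with one valid JSON object per line:

import json
import logging

class JsonFormatter(logging.Formatter):
    # Writes each log record as a single JSON object on its own line.
    def format(self, record):
        return json.dumps({
            'time': self.formatTime(record),
            'name': record.name,
            'level': record.levelname,
            'message': record.getMessage(),
        })

# Attach it to the file handler only; the stream handler keeps its text format.
file_handler = logging.FileHandler('foo.log')
file_handler.setFormatter(JsonFormatter())

Each line of foo.log could then be read back with json.loads, removing the need for eval entirely.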