Python logging across multiple modules

I'm trying to add logging (to console rather than to a file) to a piece of code I've been working on for a while. Having read around a bit, I have a pattern that I think should work, but I'm not quite sure where I'm going wrong.

I have the following three files (simplified, obviously):

controller.py

import my_module
import logging
from setup_log import configure_log

def main():
    logger = configure_log(logging.DEBUG, __name__)
    logger.info('Started logging')
    my_module.main()

if __name__ == "__main__":
    main()

setup_log.py

import logging

def configure_log(level=None, name=None):
    logger = logging.getLogger(name)
    logger.setLevel(level)
    console_handler = logging.StreamHandler()
    console_handler.setLevel(logging.DEBUG)
    chFormatter = logging.Formatter('%(levelname)s - %(filename)s - Line: %(lineno)d - %(message)s')
    console_handler.setFormatter(chFormatter)
    logger.addHandler(console_handler)
    return logger

my_module.py

import logging

def main():
    logger = logging.getLogger(__name__)
    logger.info("Starting my_module")
    print "Something"

if __name__ == "__main__":
    main()

When I run them, only the first call to logging produces any output to the console - 'Started logging'. The second call - 'Starting my_module' - is just passed over.

What have I misunderstood/mangled?

Jamie Bull asked Jun 05 '13


People also ask

How do I create multiple logging levels in Python?

You can set a different logging level for each logging handler but it seems you will have to set the logger's level to the "lowest". In the example below I set the logger to DEBUG, the stream handler to INFO and the TimedRotatingFileHandler to DEBUG. So the file has DEBUG entries and the stream outputs only INFO.
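
A minimal sketch of that arrangement (the logger name 'myapp' and the file name 'myapp.log' are placeholders, not from the question):

import logging
from logging.handlers import TimedRotatingFileHandler

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)                # logger at the lowest level so the handlers decide what to keep

console_handler = logging.StreamHandler()
console_handler.setLevel(logging.INFO)        # console shows INFO and above only

file_handler = TimedRotatingFileHandler('myapp.log', when='midnight')
file_handler.setLevel(logging.DEBUG)          # the file also receives DEBUG entries

logger.addHandler(console_handler)
logger.addHandler(file_handler)

logger.debug('written to the file only')
logger.info('written to both the file and the console')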

Is Python logging multiprocess safe?

Multiprocessing with the logging module: use a QueueHandler. Although the logging module is thread-safe, it is not process-safe. If you want multiple processes to write to the same log file, you have to manually take care of access to that file.
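
A rough sketch of that pattern using the standard library's QueueHandler and QueueListener (the worker function, logger name and format string are illustrative assumptions, not from the question):

import logging
import logging.handlers
import multiprocessing

def worker(queue):
    # Worker processes only put records on the shared queue; they never touch
    # the real handlers (file, console) directly.
    logger = logging.getLogger('worker')
    logger.setLevel(logging.DEBUG)
    logger.addHandler(logging.handlers.QueueHandler(queue))
    logger.info('Hello from %s', multiprocessing.current_process().name)

def main():
    queue = multiprocessing.Queue()
    console = logging.StreamHandler()
    console.setFormatter(logging.Formatter('%(processName)s - %(levelname)s - %(message)s'))
    # The listener runs in the main process and forwards queued records to the
    # real handler(s), so only one process ever writes the output.
    listener = logging.handlers.QueueListener(queue, console)
    listener.start()
    processes = [multiprocessing.Process(target=worker, args=(queue,)) for _ in range(3)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()
    listener.stop()

if __name__ == '__main__':
    main()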

What are the five levels of logging in Python?

Log messages can have 5 levels - DEBUG, INFO, WARNING, ERROR and CRITICAL. They can also include traceback information for exceptions.
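
A short illustrative snippet (not from the question) showing the five levels, plus logger.exception(), which logs at ERROR level and attaches the current traceback:

import logging

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger(__name__)

log.debug('debug message')
log.info('info message')
log.warning('warning message')
log.error('error message')
log.critical('critical message')

try:
    1 / 0
except ZeroDivisionError:
    log.exception('division failed')   # ERROR entry followed by the traceback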


1 Answer

According to the documentation, it looks like you might get away with an even simpler setup like so:

If your program consists of multiple modules, here’s an example of how you could organize logging in it:

# myapp.py
import logging
import mylib

def main():
    logging.basicConfig(filename='myapp.log', level=logging.INFO)
    logging.info('Started')
    mylib.do_something()
    logging.info('Finished')

if __name__ == '__main__':
    main()

# mylib.py
import logging

def do_something():
    logging.info('Doing something')

If you run myapp.py, you should see this in myapp.log:

INFO:root:Started
INFO:root:Doing something
INFO:root:Finished

It looks like your call to logger = logging.getLogger(__name__) inside your module creates a separate logger named 'my_module', which is not a child of the '__main__' logger you configured. Its level is NOTSET, so it falls back to the root logger, which still has its default level of WARNING and no handler attached, and the INFO message is simply dropped.
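
If you want to keep the original three files, one possible fix (a suggestion on my part, not something from the documentation excerpt above) is to have controller.py configure the root logger - call configure_log without a name - so that every logger created with getLogger(__name__) inherits its DEBUG level and propagates records up to the root's console handler:

# controller.py
import logging
import my_module
from setup_log import configure_log

def main():
    # No name argument: configure_log calls logging.getLogger(None), which
    # returns the root logger, so the console handler and DEBUG level apply
    # to records from every module that propagates to the root.
    logger = configure_log(logging.DEBUG)
    logger.info('Started logging')
    my_module.main()

if __name__ == "__main__":
    main()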

Jason Sperske answered Nov 04 '22