I'm trying to add logging (to the console rather than a file) to a piece of code I've been working on for a while. Having read around a bit, I have a pattern that I think should work, but I'm not quite sure where I'm going wrong.
I have the following three files (simplified, obviously):
controller.py
import my_module
import logging
from setup_log import configure_log
def main():
    logger = configure_log(logging.DEBUG, __name__)
    logger.info('Started logging')
    my_module.main()

if __name__ == "__main__":
    main()
setup_log.py
import logging
def configure_log(level=None, name=None):
    logger = logging.getLogger(name)
    logger.setLevel(level)

    console_handler = logging.StreamHandler()
    console_handler.setLevel(logging.DEBUG)
    chFormatter = logging.Formatter('%(levelname)s - %(filename)s - Line: %(lineno)d - %(message)s')
    console_handler.setFormatter(chFormatter)
    logger.addHandler(console_handler)
    return logger
my_module.py
import logging
def main():
    logger = logging.getLogger(__name__)
    logger.info("Starting my_module")
    print "Something"

if __name__ == "__main__":
    main()
When I run them, only the first call to logging produces any output to the console ('Started logging'). The second call ('Starting my_module') is just passed over.
What have I misunderstood/mangled?
You can set a different logging level for each logging handler, but it seems you have to set the logger's own level to the lowest of them. In the example below the logger is set to DEBUG, the stream handler to INFO and the TimedRotatingFileHandler to DEBUG, so the file gets DEBUG entries while the stream outputs only INFO.
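The example itself isn't reproduced here, so this is a minimal sketch of that arrangement under assumed names (logger 'app', file 'app.log', daily rotation):

import logging
from logging.handlers import TimedRotatingFileHandler

logger = logging.getLogger('app')
logger.setLevel(logging.DEBUG)          # logger must sit at the lowest level any handler uses

stream_handler = logging.StreamHandler()
stream_handler.setLevel(logging.INFO)   # console shows INFO and above

file_handler = TimedRotatingFileHandler('app.log', when='midnight')
file_handler.setLevel(logging.DEBUG)    # file receives DEBUG and above

logger.addHandler(stream_handler)
logger.addHandler(file_handler)

logger.debug('goes to the file only')
logger.info('goes to both the file and the console')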
Log messages can have 5 levels - DEBUG, INFO, WARNING, ERROR and CRITICAL. They can also include traceback information for exceptions.
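For instance, logger.exception() logs at ERROR level and automatically appends the current traceback; a quick sketch (the logger name here is just illustrative):

import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger('example')

try:
    1 / 0
except ZeroDivisionError:
    # logged at ERROR level, with the traceback appended to the message
    logger.exception('Division failed')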
According to the documentation it looks like you might get away with an even simpler setup like so:
If your program consists of multiple modules, here’s an example of how you could organize logging in it:
# myapp.py
import logging
import mylib
def main():
    logging.basicConfig(filename='myapp.log', level=logging.INFO)
    logging.info('Started')
    mylib.do_something()
    logging.info('Finished')

if __name__ == '__main__':
    main()
# mylib.py
import logging
def do_something():
    logging.info('Doing something')
If you run myapp.py, you should see this in myapp.log:
INFO:root:Started
INFO:root:Doing something
INFO:root:Finished
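Since you want console output rather than a file, you could presumably just drop the filename argument; basicConfig then attaches a StreamHandler writing to stderr. A sketch of the adjusted myapp.py, reusing your format string:

# myapp.py (console variant)
import logging
import mylib

def main():
    # no filename: basicConfig adds a StreamHandler to the root logger
    logging.basicConfig(level=logging.INFO,
                        format='%(levelname)s - %(filename)s - Line: %(lineno)d - %(message)s')
    logging.info('Started')
    mylib.do_something()
    logging.info('Finished')

if __name__ == '__main__':
    main()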
It looks like your call to logger = logging.getLogger(__name__) inside your module creates a separate logger named 'my_module'. Its level is NOTSET, so it falls back to the root logger's effective level (WARNING by default), and it has no parent relationship to the '__main__' logger you configured in controller.py, so your console handler never sees its records and no log entry is produced.
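One way around this (an assumption about your intent, not the only possible fix) is to attach the handler to the root logger in setup_log.py, so that loggers in every module propagate into it:

# setup_log.py -- sketch: configure the root logger instead of a named one
import logging

def configure_log(level=None, name=None):
    root = logging.getLogger()            # no name -> root logger
    root.setLevel(level)

    console_handler = logging.StreamHandler()
    console_handler.setLevel(logging.DEBUG)
    chFormatter = logging.Formatter('%(levelname)s - %(filename)s - Line: %(lineno)d - %(message)s')
    console_handler.setFormatter(chFormatter)
    root.addHandler(console_handler)
    return logging.getLogger(name)        # still return a named logger for the caller

With the handler on the root logger, the records from the 'my_module' logger propagate up to it and both messages appear on the console.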