 

Setting up advanced Python logging

Tags:

python

logging

I'd like to use logging for my modules, but I'm not sure how to design the following requirements:

  • normal logging levels (info, error, warning, debug) but also some additional more verbose debug levels
  • logging messages can have different types; some are meant for the developer, some are meant for the user; those types go to different outputs
  • errors should go to stderr
  • I also need to keep track of which module/function/code line wrote a debug message so that I can activate or deactivate individual debug messages in a configuration
  • I need to keep track of whether errors occurred at all, to eventually execute a sys.exit() at the end of the program
  • all messages should go to stdout until the loggers are set up

I've read the logging documentation, but I'm not sure what's the most streamlined way to use the logging module with the requirements above (how to use the concepts of Logger, Handler, Filter, ...). Can you point out an idea to set this up? (e.g. write a module with two loggers 'user' and 'developer'; derive from Logger; do getLogger(__name__); keep an error flag like this, ... etc.)

asked Feb 09 '12 by Gerenuk


1 Answer

1) Adding more verbose debug levels.

Have you thought this through?

Take a look at what the docs say:

Defining your own levels is possible, but should not be necessary, as the existing levels have been chosen on the basis of practical experience. However, if you are convinced that you need custom levels, great care should be exercised when doing this, and it is possibly a very bad idea to define custom levels if you are developing a library. That's because if multiple library authors all define their own custom levels, there is a chance that the logging output from such multiple libraries used together will be difficult for the using developer to control and/or interpret, because a given numeric value might mean different things for different libraries.

Also take a look at When to use logging; there are two very good tables explaining when to use what.

Anyway, if you think you'll need those extra logging levels, take a look at: logging.addLevelName().
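A minimal sketch of that, assuming a hypothetical VERBOSE level between DEBUG and INFO (the name and the value 15 are arbitrary choices, not part of the stdlib):

import logging

# Hypothetical extra level between DEBUG (10) and INFO (20)
VERBOSE = 15
logging.addLevelName(VERBOSE, "VERBOSE")

logger = logging.getLogger(__name__)
logger.setLevel(VERBOSE)
logger.addHandler(logging.StreamHandler())

# Logger.log() accepts any numeric level, so no subclassing is needed
logger.log(VERBOSE, "a more verbose debug message")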

2) Some logging messages for the developer, and some for the user

Use different logger families with different handlers. At the base of each family, set Logger.propagate to False.
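A quick sketch of what that could look like (the 'user' and 'developer' names and their destinations are just assumptions; pick whatever fits your program):

import sys
import logging

# Two independent logger families, each with its own handler and level
user_logger = logging.getLogger('user')
user_logger.setLevel(logging.INFO)
user_logger.propagate = False                        # keep 'user' records away from the root handlers
user_logger.addHandler(logging.StreamHandler(sys.stdout))

dev_logger = logging.getLogger('developer')
dev_logger.setLevel(logging.DEBUG)
dev_logger.propagate = False
dev_logger.addHandler(logging.FileHandler('developer.log'))

# Children reuse the family's handlers via propagation up to 'user'/'developer'
logging.getLogger('user.payments').info("payment accepted")
logging.getLogger('developer.payments').debug("raw gateway response: ...")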

3) Errors should go to stderr

This already happens by default with StreamHandler:

class logging.StreamHandler(stream=None)

Returns a new instance of the StreamHandler class. If stream is specified, the instance will use it for logging output; otherwise, sys.stderr will be used.
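So a plain handler is enough to route errors to stderr; a sketch, assuming only ERROR and above should reach that handler:

import logging

error_handler = logging.StreamHandler()   # no stream argument, so sys.stderr is used
error_handler.setLevel(logging.ERROR)     # let only ERROR and CRITICAL through this handler

logger = logging.getLogger(__name__)
logger.propagate = False                  # no other handlers see these records
logger.addHandler(error_handler)

logger.error("this line ends up on stderr")
logger.warning("this one is dropped by error_handler")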

4) Keep track of the source of a log message

Get Loggers with different names, and in your Formatter use format strings with %(name)s.
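For instance (the exact format string is just an illustration; %(funcName)s and %(lineno)d are standard LogRecord attributes you can add if the logger name alone isn't enough):

import logging

handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter(
    "%(name)s [%(funcName)s:%(lineno)d] %(levelname)s: %(message)s"
))

logger = logging.getLogger('first.Foo')
logger.addHandler(handler)
logger.warning("where did this come from?")
# prints something like: first.Foo [<module>:10] WARNING: where did this come from?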

5) All messages should go to stdout until the loggers are set up

The setup of your logging system should be one of the very first things you do, so I don't really see what this means. If you need to send messages to stdout, use print, as explained in When to use logging.

Last advice: carefully read the Logging Cookbook as it covers pretty well what you need.


From the comment: How would I design to have output to different sources and also filter my module?

I wouldn't filter in the first place: filters are hard to maintain, and if they are all in one place, that place will have to hold too much information. Every module should get and set its own Logger (with its own handlers or filters), either inheriting its parent's settings or not.

Very quick example:

# at the very beginning
import sys
import logging

root = logging.getLogger()
fallback_handler = logging.StreamHandler(stream=sys.stdout)  # everything goes to stdout until the loggers are set up
root.addHandler(fallback_handler)

# first.py
first_logger = logging.getLogger('first')
first_logger.propagate = False  # don't pass 'first' records up to the root/fallback handler
# ... set 'first' logger as you wish
class Foo:
    def __init__(self):
        self.logger = logging.getLogger('first.Foo')
    def baz(self):
        self.logger.info("I'm in baz")

# second.py
second_logger = logging.getLogger('first.second') # to use the same settings

# third.py
abstract_logger = logging.getLogger('abs')
abstract_logger.propagate = False
# ... set 'abs' logger
third_logger = logging.getLogger('abs.third')
# ... set 'abs.third' particular settings

# fourth.py
fourth_logger = logging.getLogger('abs.fourth')
# [...]
answered Sep 29 '22 by Rik Poggi