I believe it's standard practice when using Python's built-in logging module for the logger in the main module to be the root logger. Assuming this is correct, it seems to me that any module which may or may not be run as main needs an explicit check. The reason is that if I follow the usual practice of calling logging.getLogger(__name__), I'll get a logger named __main__ rather than the root logger:
import logging
print(logging.getLogger().name)          # root
print(logging.getLogger(__name__).name)  # __main__ (when run as a script)
Is the best practice always to check?
if __name__ == "__main__":
    logger = logging.getLogger()
else:
    logger = logging.getLogger(__name__)
This is not so bad, because I'll always have other code that only runs if __name__ == "__main__" (often including a call to logging.basicConfig), but it would be nice to need only one line instead of several.
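One way to collapse the check into a single line, relying on the fact that passing None to getLogger() returns the root logger (just a sketch):

import logging

# Equivalent one-liner: None selects the root logger, __name__ selects a module logger.
logger = logging.getLogger(None if __name__ == "__main__" else __name__)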
To obtain a logger, getLogger(name) is typically called. The getLogger() function accepts a single optional argument: the logger's name. It returns a reference to a logger instance with the specified name if one is provided, or to the root logger if not. Multiple calls to getLogger() with the same name return a reference to the same logger object.
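A small illustration of that behaviour (the name "app.db" is arbitrary):

import logging

a = logging.getLogger("app.db")
b = logging.getLogger("app.db")
root = logging.getLogger()       # no name: the root logger

print(a is b)       # True: same name, same logger object
print(root.name)    # 'root'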
The built-in Python logger does blocking I/O, which means that using the built-in logging module can hurt the performance of an asynchronous application. aiologger aims to be the standard asynchronous, non-blocking logging library for Python and asyncio.
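Roughly, usage looks like the sketch below; the Logger.with_default_handlers() constructor and the awaitable logging calls follow aiologger's documented examples, but treat the exact API as an assumption that may vary between versions:

import asyncio
from aiologger import Logger   # third-party: pip install aiologger

async def main():
    # Assumed API: a logger pre-wired with stdout/stderr handlers.
    logger = Logger.with_default_handlers(name="my-app")
    await logger.info("this log call does not block the event loop")
    await logger.shutdown()

asyncio.run(main())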
The built-in logging module in Python requires a handful of lines of code to configure log4j-like features such as a file appender and file rotation based on both time and size. For a one-liner implementation of these features in your code, you can use the package autopylogger.
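For comparison, the standard-library version of those features looks roughly like this (a sketch using logging.handlers; autopylogger's own API is not shown here):

import logging
from logging.handlers import RotatingFileHandler, TimedRotatingFileHandler

logger = logging.getLogger("app")
logger.setLevel(logging.INFO)

# Size-based rotation: roll over at ~1 MB, keep 5 backups.
size_handler = RotatingFileHandler("app.log", maxBytes=1_000_000, backupCount=5)

# Time-based rotation: roll over at midnight, keep 7 backups.
time_handler = TimedRotatingFileHandler("app_daily.log", when="midnight", backupCount=7)

fmt = logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
for handler in (size_handler, time_handler):
    handler.setFormatter(fmt)
    logger.addHandler(handler)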
You can configure logging as shown above using module-level functions and handler classes, or by creating a config file or a dictionary and loading it with fileConfig() or dictConfig() respectively. The latter are useful if you want to change your logging configuration in a running application.
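A minimal dictConfig() example (the formatter and handler names here are arbitrary):

import logging
import logging.config

logging.config.dictConfig({
    "version": 1,
    "formatters": {
        "simple": {"format": "%(levelname)s %(name)s: %(message)s"},
    },
    "handlers": {
        "console": {"class": "logging.StreamHandler", "formatter": "simple"},
    },
    "root": {"level": "INFO", "handlers": ["console"]},
})

logging.getLogger(__name__).info("configured via dictConfig")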
Yes, I believe that's a good idea. Here is what happens: if you run the program as python prog.py, __name__ is __main__, so with the check you get the root logger (as expected), or you can give it any name you like (say prog). When you import that module instead, __name__ is the module's name (the Python file name without the extension, prog in this case), which helps you identify the origin of the logs, which is what you want. So in general it's a good idea to do the check.
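A small sketch of that naming behaviour (the file name prog.py is hypothetical):

# prog.py
import logging

if __name__ == "__main__":
    logger = logging.getLogger()          # run as `python prog.py`: the root logger
else:
    logger = logging.getLogger(__name__)  # imported: a logger named 'prog'

print(logger.name)   # 'root' when run directly, 'prog' when imported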
The practice of using logging.getLogger(__name__) is meant for a module-level logger, as explained in the advanced logging tutorial.
In a script (or the main module of an application) I generally don't create a logger at all, but I do change the configuration of the root logger:
import argparse
import logging

# __version__ and argv are assumed to be defined elsewhere in the script.
opts = argparse.ArgumentParser(prog='foo', description=__doc__)
opts.add_argument('-v', '--version', action='version',
                  version=__version__)
opts.add_argument('--log', default='warning',
                  choices=['debug', 'info', 'warning', 'error'],
                  help="logging level (defaults to 'warning')")
opts.add_argument("files", metavar='file', nargs='*',
                  help="one or more files to process")
args = opts.parse_args(argv)

# Configure the root logger from the command-line option.
logging.basicConfig(level=getattr(logging, args.log.upper(), None),
                    format='%(levelname)s: %(message)s')
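In a library module, the corresponding module-level pattern looks roughly like this (a sketch; the module path mylib/parser.py is hypothetical):

# mylib/parser.py
import logging

logger = logging.getLogger(__name__)   # named 'mylib.parser' when imported

def parse(path):
    logger.debug("parsing %s", path)   # output level/format come from the root logger's config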