I like the Python logging infrastructure and I want to use it for a number of different overnight jobs that I run. A lot of these jobs use a module X, let's say. I want the logging for module X to write to a log file chosen not by module X, but by the job that ultimately led to calling functionality in module X.
So if overnight_script_1.py calls foo() in module X, I want the log of foo() to go to overnight_script_1.log. I also want overnight_script_2.py's call of foo() to log to overnight_script_2.log.
A potential solution would be to set up the log file based on the 0th argument of sys.argv, which can be mapped to my preferred file. Something seems hacky about doing it this way. Is there a preferred design pattern for this? I don't want to rummage through different log files, split up by the module where the function was called, to find the diagnostic information for one of my scripts. Here is some code, because I am not sure I am making myself clear.
Here is script1.py
import X
import logging_utils as lu
import sys
logname = sys.argv[0][:-3]  # logname == 'script1' with the .py cut off
logger = lu.setup_logger(logname)  # assume this does the formatting and sets the file handlers
# furthermore assume the file handler is set so that the output goes to script1.log
# do a bunch of things
logger.info('I am doing a bunch of things in script1 and I will now call X.foo()')
X.foo() # see module X below
logger.info('I finished X.foo()')
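For reference, here is roughly what I am assuming setup_logger in logging_utils.py does (a simplified sketch, not my real helper; it just attaches a file handler named after the script):
# logging_utils.py -- simplified sketch of the assumed helper
import logging

def setup_logger(log_name):
    # get a named logger and send its output to <log_name>.log
    logger = logging.getLogger(log_name)
    logger.setLevel(logging.INFO)
    handler = logging.FileHandler(log_name + '.log')
    handler.setFormatter(logging.Formatter('%(asctime)s %(name)s %(levelname)s %(message)s'))
    logger.addHandler(handler)
    return logger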
Similarly, here is script2.py
import X
import logging_utils as lu
import sys
logname = sys.argv[0][:-3]  # logname == 'script2' with the .py cut off
logger = lu.setup_logger(logname)  # assume this does the formatting and sets the file handlers
# furthermore assume the file handler is set so that the output goes to script2.log
# do a bunch of things
logger.info('I am doing a bunch of things in script2 and I will now call X.foo()')
X.foo() # see module X below
logger.info('I finished X.foo()')
Here is X.py
import logging
import sys
logname = sys.argv[0][:-3]  # could be script1 or script2
logger = logging.getLogger(logname)

def foo():
    try:
        i = 1 / 0
    except:
        logger.error('oops - division by zero')
Then I want to run:
python script1.py
python script2.py
and get two log files, script1.log and script2.log, with the division by zero error that occurred in module X logged in each.
I believe you should just follow the standard setup for library code:
Say you have a package mypkg and you want this package to log information while letting the user of the package decide which level of logging to use and where the output should go. The mypkg package itself should not set any level nor attach any handler to its loggers. Module x of mypkg should be something like this:
import logging
logger = logging.getLogger(__name__)
def my_function():
    logger.info('something')
No configuration of the logging whatsoever, just getLogger(__name__).
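Applied to the X.py from the question, the module stops looking at sys.argv entirely; a sketch under that pattern:
# X.py, following the library pattern (sketch)
import logging

logger = logging.getLogger(__name__)  # 'X' when imported as a top-level module

def foo():
    try:
        i = 1 / 0
    except ZeroDivisionError:
        logger.error('oops - division by zero')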
Then your script.py that uses mypkg and wants to log at level INFO on the console will do:
import logging
import mypkg.x

root_logger = logging.getLogger()
console_handler = logging.StreamHandler()
root_logger.setLevel(logging.INFO)
root_logger.addHandler(console_handler)

mypkg.x.my_function()  # produces output on stderr
While script2.py will log at level DEBUG to a file:
import logging
import mypkg.x

root_logger = logging.getLogger()
file_handler = logging.FileHandler('some_file.log')
root_logger.setLevel(logging.DEBUG)
root_logger.addHandler(file_handler)

mypkg.x.my_function()  # the log record ends up in some_file.log
Note that by setting the level and handler on the root logger we are setting the level "globally". If the user has their own loggers that they want at level DEBUG, but wants level INFO for mypkg, they can do:
root_logger = logging.getLogger()
root_logger.setLevel(logging.DEBUG)
mypkg_logger = logging.getLogger('mypkg')
mypkg_logger.setLevel(logging.INFO)
handler = #... whatever you want
root_logger.addHandler(handler)
If your mypkg contains the modules x.py, y.py, a.py, b.py and c.py, and you want to log the functions inside x at DEBUG, those inside y at WARNING, and those inside a/b/c.py at INFO, you can do so by setting the levels on the corresponding loggers:
mypkg_logger = logging.getLogger('mypkg')
mypkg_logger.setLevel(logging.INFO)
x_logger = logging.getLogger('mypkg.x')
x_logger.setLevel(logging.DEBUG)
y_logger = logging.getLogger('mypkg.y')
y_logger.setLevel(logging.WARNING)
Attach the handlers only to the root_logger and everything should work fine.
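To get exactly what the question asks for (script1.log and script2.log), each overnight script can derive the log file name from its own name and attach a FileHandler to the root logger; a sketch, assuming X.py uses getLogger(__name__) as above:
# overnight_script_1.py (sketch; overnight_script_2.py is identical apart from its file name)
import logging
import os
import sys

import X

logname = os.path.splitext(os.path.basename(sys.argv[0]))[0]  # e.g. 'overnight_script_1'

root_logger = logging.getLogger()
root_logger.setLevel(logging.INFO)
root_logger.addHandler(logging.FileHandler(logname + '.log'))

logger = logging.getLogger(logname)
logger.info('about to call X.foo()')
X.foo()  # the 'oops - division by zero' record propagates up to the root logger's file handler
logger.info('finished X.foo()')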
Note: on some versions of Python the logging module might warn you that you defined a logger with no handlers for the library modules. To fix this you can use a NullHandler that just drops the logging messages:
# x.py
import logging

logger = logging.getLogger(__name__)
logger.addHandler(logging.NullHandler())
So the NullHandler literally does nothing, except for keeping the logging module from complaining. AFAIK newer versions of Python do not need this at all.