I would like to use the logging library in a class to report on the different steps of my code, using the info, debug, and error functions, and keep the output in a log file. I also want to use multiprocessing in my code. But I could not quite figure out how it works and how it should be set up. I used it in my code, defined as follows:
import logging
logging.basicConfig(filename='logfile.log',level=logging.DEBUG)
but it halted the code and prevented the process from terminating. I am wondering how logging should be used in a class, and how it should be stopped and the log file closed. Any help would be appreciated.
You may go through Good logging practice in Python to get a better feel for the logging module, and find more detailed information in the Python documentation.
Below is a basic example of how to use the logging module, in which I knowingly raise an exception:
import logging
log = logging.getLogger("mylog")
log.setLevel(logging.DEBUG)
formatter = logging.Formatter(
    "%(asctime)s %(threadName)-11s %(levelname)-10s %(message)s")
# Alternative formatting available on Python 3.2+:
# formatter = logging.Formatter(
#     "{asctime} {threadName:>11} {levelname} {message}", style='{')
# Log to file
filehandler = logging.FileHandler("debug.txt", "w")
filehandler.setLevel(logging.DEBUG)
filehandler.setFormatter(formatter)
log.addHandler(filehandler)
# Log to stdout too
streamhandler = logging.StreamHandler()
streamhandler.setLevel(logging.INFO)
streamhandler.setFormatter(formatter)
log.addHandler(streamhandler)
# Test it
log.debug("Some message")
log.error("An error!")
try:
    something()
except Exception:
    log.exception("An exception occurred!")
In your debug.txt, you will get output like this:
2011-01-18 12:07:24,943 MainThread  DEBUG      Some message
2011-01-18 12:07:24,943 MainThread  ERROR      An error!
2011-01-18 12:07:24,943 MainThread  ERROR      An exception occurred!
Traceback (most recent call last):
  File "./logtest.py", line 17, in <module>
    something()
NameError: name 'something' is not defined
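The example above does not cover the multiprocessing part of your question. Handlers are not multiprocessing-safe: several processes writing to the same FileHandler can interleave or lose records, and dangling handlers are a common reason a script seems to hang at exit. On Python 3.2+ the usual pattern is QueueHandler/QueueListener from logging.handlers: workers put records on a shared queue, and a single listener in the main process is the only thing that writes to the file. Here is a minimal sketch of that pattern (the file name, logger name, and worker function are just placeholders):

```python
import logging
import logging.handlers
import multiprocessing


def worker(queue, n):
    # Each worker sends its records to the shared queue instead of
    # writing to the file directly.
    log = logging.getLogger("mylog")
    log.setLevel(logging.DEBUG)
    log.addHandler(logging.handlers.QueueHandler(queue))
    log.info("message from worker %d", n)


def main():
    queue = multiprocessing.Queue()

    filehandler = logging.FileHandler("debug.txt", "w")
    filehandler.setFormatter(logging.Formatter(
        "%(asctime)s %(processName)-11s %(levelname)-10s %(message)s"))

    # The listener runs in the main process and is the only
    # component that touches the log file.
    listener = logging.handlers.QueueListener(queue, filehandler)
    listener.start()

    procs = [multiprocessing.Process(target=worker, args=(queue, i))
             for i in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    listener.stop()      # drain remaining records from the queue
    filehandler.close()  # close the log file cleanly


if __name__ == "__main__":
    main()
```

After listener.stop() returns, the queue has been drained, so closing the handler is safe; calling logging.shutdown() at the end of your program does the same flush-and-close for any handlers still attached. This also answers the "how do I stop it and close the log file" part of your question.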