I'm writing some code that uses the Python logging system. The idea is that if the log file doesn't already exist, create it; if it does exist, get the existing log and resume logging to that file. Here is my code:
import logging
import os

log_filename = 'Transactions.log'

if os.path.isfile(log_filename) != True:
    LOG = logging.getLogger('log_filename')
    LOG.setLevel(logging.DEBUG)
    # create file handler which logs even debug messages
    fh = logging.FileHandler('log_filename')
    fh.setLevel(logging.DEBUG)
    # create console handler with a higher log level
    ch = logging.StreamHandler()
    ch.setLevel(logging.DEBUG)
    # create formatter and add it to the handlers
    formatter = logging.Formatter('-->%(asctime)s - %(name)s:%(levelname)s - %(message)s')
    fh.setFormatter(formatter)
    ch.setFormatter(formatter)
    # add the handlers to the logger
    LOG.addHandler(fh)
    LOG.addHandler(ch)
else:
    LOG = logging.getLogger()
I suspect the problem is with my else block, but I don't know how to fix it. Could anybody shed some light on this situation?
When you set a logging level in Python using the standard module, you're telling the library you want to handle all events from that level on up. If you set the log level to INFO, it will include INFO, WARNING, ERROR, and CRITICAL messages. NOTSET and DEBUG messages will not be included here.
You can set a different logging level for each handler, but the logger's own level has to be set to the lowest level any handler should receive. In the example below the logger is set to DEBUG, the stream handler to INFO and the TimedRotatingFileHandler to DEBUG, so the file gets DEBUG entries while the stream outputs only INFO and above.
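Roughly, such a setup might look like this (the logger name, the file name and the rotation settings are illustrative assumptions, not taken from the question):

import logging
import logging.handlers

# The logger accepts everything from DEBUG up; the handlers then filter further.
logger = logging.getLogger('transactions')
logger.setLevel(logging.DEBUG)

formatter = logging.Formatter('%(asctime)s - %(name)s:%(levelname)s - %(message)s')

# Console handler: shows INFO and above only.
console = logging.StreamHandler()
console.setLevel(logging.INFO)
console.setFormatter(formatter)

# File handler: rotates at midnight and keeps DEBUG and above.
file_handler = logging.handlers.TimedRotatingFileHandler(
    'Transactions.log', when='midnight', backupCount=7)
file_handler.setLevel(logging.DEBUG)
file_handler.setFormatter(formatter)

logger.addHandler(console)
logger.addHandler(file_handler)

logger.debug('written to the file only')
logger.info('written to both the file and the console')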
The logging module's FileHandler takes care of that for you. No need for the extra complexity.
The handler takes an optional mode parameter that specifies whether it truncates the file or appends to it.
From the docs:
class logging.FileHandler(filename, mode='a', encoding=None, delay=False)
The specified file is opened and used as the stream for logging. If mode is not specified, 'a' is used.
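Applied to the question's code, a simplified sketch without the if/else could look like the following. Because mode='a' is the default, FileHandler creates the file on the first run and appends on later runs. (The logger name 'transactions' and the final log call are just illustrative choices.)

import logging

log_filename = 'Transactions.log'

LOG = logging.getLogger('transactions')
LOG.setLevel(logging.DEBUG)

formatter = logging.Formatter('-->%(asctime)s - %(name)s:%(levelname)s - %(message)s')

# FileHandler creates the file if it doesn't exist and appends otherwise (mode='a' is the default).
fh = logging.FileHandler(log_filename)
fh.setLevel(logging.DEBUG)
fh.setFormatter(formatter)

# Console handler at the same level.
ch = logging.StreamHandler()
ch.setLevel(logging.DEBUG)
ch.setFormatter(formatter)

LOG.addHandler(fh)
LOG.addHandler(ch)

LOG.info('logging started or resumed')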