Python - asynchronous logging

I need to log plenty of data while running my system code. What logging packages can I use for efficient, asynchronous logging? Is the standard Python logging package (https://docs.python.org/2/library/logging.html) asynchronous by default?

Ziva asked Aug 23 '17


1 Answer

Async code can use the usual logging features without resorting to special async modules or wrappers. Code like this is possible.

import logging
# ...

    async def do_some_async_stuff(self):
        logging.getLogger(__name__).info("Started doing stuff...")
        # ...
        logging.getLogger(__name__).warning("Things went awry...")

The concern here is whether submitting log entries will incur a delay while the entries are written to a file, depriving the asynchronous system of the opportunity to run other tasks in the meantime. This can happen if a blocking handler that writes to a file is attached directly somewhere along the logging hierarchy.
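For illustration, this is the kind of setup that causes the problem: a blocking FileHandler attached straight to the root logger, so every log call made inside a coroutine may wait on disk I/O (the handler and file name here are only an example):

import logging

# The setup to avoid in async code: a blocking FileHandler attached
# directly to the root logger means every .info()/.warning() call can
# stall on disk I/O while the event loop has nothing else running.
logging.getLogger().addHandler(logging.FileHandler("app.log"))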

There's a simple solution for this provided by the standard logging module: use a non-blocking handler that enqueues its messages to the desired blocking handler running in its own private thread.

Purism aside, there's no hard rule that precludes async code from logging through a non-blocking QueueHandler, paired with the blocking handler hosted in a QueueListener.

The solution below is entirely compatible with coroutines that obtain loggers and submit entries in the usual way; wrappers around .run_in_executor() aren't needed. Async code won't experience any blocking behavior from the logging system.

For example, a QueueHandler can be set up as the root logger's handler:

import logging
import queue
from logging.handlers import QueueHandler
# ...
log_queue     = queue.Queue()
queue_handler = QueueHandler(log_queue)  # Non-blocking handler.

root = logging.getLogger()
root.addHandler(queue_handler)           # Attached to the root logger.

And the blocking handler you want can be put inside a QueueListener:

from logging.handlers import QueueListener
from logging.handlers import RotatingFileHandler
# ...
rot_handler    = RotatingFileHandler(...)   # The blocking handler.
queue_listener = QueueListener(log_queue,
                               rot_handler) # Sitting comfortably in its
                                            # own thread, isolated from
                                            # async code.
queue_listener.start()

Then configure the handler nested in the listener with whatever log entry formatting you need.
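As a sketch of that step (the format string and date format are just placeholders), a Formatter can be attached to the rotating handler before the listener starts:

import logging

# Placeholder formatting -- adjust to taste. The handler inside the
# listener does the formatting in its own thread, not the QueueHandler
# used by the async code.
formatter = logging.Formatter(
    "%(asctime)s %(levelname)-8s %(name)s: %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S",
)
rot_handler.setFormatter(formatter)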

I personally like the rotating file handler because it limits the size and number of log files produced, deleting the oldest when a new backup is created.
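Putting it all together, here's a minimal, self-contained sketch; the file name, 1 MB size limit, and backup count are placeholder values, and the coroutine is just the example from above:

import asyncio
import logging
import queue
from logging.handlers import QueueHandler, QueueListener, RotatingFileHandler

# Non-blocking side: async code logs through a QueueHandler on the root logger.
log_queue = queue.Queue()
root = logging.getLogger()
root.setLevel(logging.INFO)
root.addHandler(QueueHandler(log_queue))

# Blocking side: a size-capped rotating file, drained in the listener's thread.
rot_handler = RotatingFileHandler("app.log", maxBytes=1_000_000, backupCount=5)
rot_handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
queue_listener = QueueListener(log_queue, rot_handler)
queue_listener.start()

async def do_some_async_stuff():
    logging.getLogger(__name__).info("Started doing stuff...")
    await asyncio.sleep(0.1)   # Other tasks run freely during the wait.
    logging.getLogger(__name__).warning("Things went awry...")

try:
    asyncio.run(do_some_async_stuff())
finally:
    queue_listener.stop()      # Flush queued records before exiting.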

Todd answered Oct 10 '22