 

Asynchronous logging with Python and MongoDB

I need to log specific activities of my web application (Python-based, using SQLAlchemy with Postgres), and I don't want to dump the log information into my Postgres database (why fill it up with mostly rubbish?) or use a log file (hard to search).

Ideally I would like to throw everything into another database and do this asynchronously. With the logging being asynchronous, I don't need to worry about a write operation failing and breaking the code that does all the important business work. Also, if I miss a few logging events, it is probably no big deal.

Mongo seems like an excellent solution, since it is well suited for write-heavy workloads and easy to set up.

The problem is that I have not managed to find any Python tools that cover my needs, in particular the asynchronous requirement.

Any thoughts?

Dimitris asked Oct 07 '22


1 Answer

Using a log collector daemon such as Fluentd, Scribe, or Flume could be another solution.

[Diagram: Fluentd plus MongoDB]

These daemons run on every application node and take the logs from the app processes. They buffer the logs and asynchronously write the data out to other systems like MongoDB, PostgreSQL, etc. The writes are done in batches, so this is a lot more efficient than writing directly from the apps.
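The same non-blocking idea can also be sketched with nothing but the Python standard library: log calls push records onto an in-memory queue and return immediately, while a background listener thread drains the queue into a sink. Below, the MongoDB sink is stubbed with an in-memory list (the names `MongoLikeHandler` and `activity` are illustrative, not from any library); a real version might call pymongo's `insert_one` inside `emit`.

```python
import logging
import queue
from logging.handlers import QueueHandler, QueueListener

class MongoLikeHandler(logging.Handler):
    """Stand-in for a handler that would write each record to MongoDB.

    This sketch just appends dicts to a list so it stays self-contained;
    a real implementation would do collection.insert_one(...) here. Any
    failure inside emit() never reaches the application thread.
    """
    def __init__(self):
        super().__init__()
        self.records = []

    def emit(self, record):
        self.records.append({"level": record.levelname,
                             "message": record.getMessage()})

# Bounded queue: if the sink falls behind, records are dropped
# (QueueHandler uses put_nowait), matching "missing a few events is fine".
log_queue = queue.Queue(maxsize=1000)
sink = MongoLikeHandler()
listener = QueueListener(log_queue, sink)
listener.start()

logger = logging.getLogger("activity")
logger.setLevel(logging.INFO)
logger.addHandler(QueueHandler(log_queue))

logger.info("user 42 viewed /reports")  # returns immediately, never blocks
listener.stop()                          # drains the queue on shutdown
```

`QueueHandler`/`QueueListener` have been in the standard library since Python 3.2, so this works without any extra dependency, though it only buffers in memory and won't survive a process crash the way an external daemon does.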

Here are two links showing how to use Fluentd from Python and how to put the data into MongoDB.

  • Fluentd: Data Import from Python Applications
  • Fluentd: Store Apache Logs into MongoDB
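To give a flavor of the MongoDB side, a minimal Fluentd configuration along the lines of those links might look as follows, assuming the fluent-plugin-mongo output plugin is installed; the tag `myapp.**` and the database/collection names are placeholders you would adapt:

```
# Receive events forwarded from the application (e.g. via fluent-logger)
<source>
  @type forward
  port 24224
</source>

# Route the app's events into MongoDB in buffered batches
<match myapp.**>
  @type mongo
  host localhost
  port 27017
  database applogs
  collection activity
</match>
```

Fluentd buffers the matched events and flushes them to the `activity` collection in batches, which is exactly the asynchronous, failure-isolated behavior asked for in the question.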
Kazuki Ohta answered Oct 13 '22