
Python psycopg2 - Logging events

I'm using psycopg2, and I have a problem with logging events (executed queries, notifications, errors) to a file. I want to get an effect like the one in PgAdmin's history window.

For example, I'm executing this query:

insert into city(id, name, countrycode, district, population) values (4080,'Sevilla', 'ESP', 'andalucia', 1000000)

And in PgAdmin I see effect like this:

Executing query: 
insert into city(id, name, countrycode, district, population) values (4080,'Sevilla', 'ESP', 'andalucia', 1000000)

Query executed in 26 ms.
One row affected.

Can I get a similar effect using psycopg2?

I tried to use LoggingCursor, but it isn't satisfactory for me because it logs only the queries.

Thanks for the help.


EDIT:

My code:

import time

import psycopg2.extras

# DSN is assumed to be defined elsewhere, e.g. 'dbname=test user=postgres'
conn = psycopg2.extras.LoggingConnection(DSN)
log_file = open('log.log', 'a')
log_file.write('================================')
conn.initialize(log_file)

log_file.write('\n' + time.strftime("%Y-%m-%d %H:%M:%S") + '---Executing query:\n\t')
q = """insert into city(id, name, countrycode, district, population) values (4080, 'Sevilla', 'ESP', 'andalucia', 10000)"""
c = conn.cursor()
c.execute(q)

log_file.write('\n' + time.strftime("%Y-%m-%d %H:%M:%S") + '---Executing query:\n\t')
q = """delete from city where id = 4080"""
c = conn.cursor()
c.execute(q)

conn.commit()
log_file.close()

And this is my output log:

================================
2012-12-30 22:42:31---Executing query:
    insert into city(id, name, countrycode, district, population) values (4080,'Sevilla', 'ESP', 'andalucia', 10000)

2012-12-30 22:42:31---Executing query:
    delete from city where id = 4080

I want to see in the log file information about how many rows were affected, as well as information about errors. Ultimately I want a complete log file with all events.

asked Dec 30 '12 by mgierm


2 Answers

From what I can see, you have three requirements that are not fulfilled by the LoggingCursor class:

  1. Query execution time
  2. Number of rows affected
  3. A complete log file with all events.

For the first requirement, take a look at the source code of the MinTimeLoggingConnection class in psycopg2.extras. It subclasses LoggingConnection and logs the execution time of queries that exceed a minimum threshold (note that it needs to be used in conjunction with MinTimeLoggingCursor).
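
For illustration, here is a minimal sketch of how that might look (the DSN string and the 100 ms threshold are placeholder values, not anything from your setup):

import sys

import psycopg2
import psycopg2.extras

# Placeholder DSN; adjust for your database.
conn = psycopg2.connect('dbname=test user=postgres',
                        connection_factory=psycopg2.extras.MinTimeLoggingConnection)

# Log to stdout every query that takes longer than 100 ms.
conn.initialize(sys.stdout, mintime=100)

cur = conn.cursor()  # defaults to MinTimeLoggingCursor
cur.execute('select pg_sleep(0.2)')  # slow enough to be logged
conn.close()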

For the second requirement, the rowcount attribute of the cursor class specifies

the number of rows that the last execute*() produced (for DQL statements like SELECT) or affected (for DML statements like UPDATE or INSERT)

Thus it should be possible to create your own type of LoggingConnection and LoggingCursor that includes this additional functionality.
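
For example (a quick sketch with a placeholder DSN; the DELETE mirrors the query from your question):

import psycopg2

conn = psycopg2.connect('dbname=test user=postgres')  # placeholder DSN
cur = conn.cursor()
cur.execute('delete from city where id = %s', (4080,))
print(cur.rowcount)  # number of rows affected by the DELETE, e.g. 1
conn.rollback()  # undo the example delete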

My attempt is as follows. Just replace LoggingConnection with LoggingConnection2 in your code and this should all work. As a side note, you don't need to create a new cursor for your second query; you can just call c.execute(q) again after defining the second query.

import os
import time

from psycopg2.extras import LoggingConnection, LoggingCursor

class LoggingConnection2(LoggingConnection):
    def filter(self, msg, curs):
        # Append the execution time and row count to the default log message.
        t = (time.time() - curs.timestamp) * 1000
        if isinstance(msg, bytes):
            msg = msg.decode('utf-8', 'replace')  # psycopg2 may pass the query as bytes on Python 3
        return msg + os.linesep + 'Query executed in: {0:.2f} ms. {1} row(s) affected.'.format(t, curs.rowcount)

    def cursor(self, *args, **kwargs):
        kwargs.setdefault('cursor_factory', LoggingCursor2)
        return super(LoggingConnection2, self).cursor(*args, **kwargs)

class LoggingCursor2(LoggingCursor):
    def execute(self, query, vars=None):
        self.timestamp = time.time()  # remember when the query started
        return LoggingCursor.execute(self, query, vars)

    def callproc(self, procname, vars=None):
        self.timestamp = time.time()
        return LoggingCursor.callproc(self, procname, vars)
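
Here is a sketch of how the classes above might be wired into your script (the DSN is a placeholder):

import sys

import psycopg2

conn = psycopg2.connect('dbname=test user=postgres',
                        connection_factory=LoggingConnection2)
conn.initialize(sys.stdout)  # or pass your open log file instead

cur = conn.cursor()
cur.execute('delete from city where id = 4080')
# stdout now shows the query followed by something like:
#   Query executed in: 0.85 ms. 1 row(s) affected.
conn.commit()
conn.close()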

I'm not sure how to build a complete log of all events, but the notices attribute of the connection class may be of interest for capturing server messages.
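
For example, server notices could be appended to the same log file after each statement (a sketch with a placeholder DSN; the DROP ... IF EXISTS is just a convenient way to trigger a NOTICE):

import psycopg2

conn = psycopg2.connect('dbname=test user=postgres')  # placeholder DSN
cur = conn.cursor()
cur.execute('drop table if exists no_such_table')  # emits a NOTICE if the table is absent

with open('log.log', 'a') as log_file:
    for notice in conn.notices:  # e.g. 'NOTICE:  table "no_such_table" does not exist, skipping\n'
        log_file.write(notice)
del conn.notices[:]  # the list accumulates across statements; clear it once logged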

answered by Talvalin

Maybe you can get what you're looking for without writing any code.

There's a setting in PostgreSQL itself called log_min_duration_statement that might help you out.

You can set it to zero, and every query will be logged along with its run time. Or you can set it to some positive number, say 500, and PostgreSQL will only log queries that take at least 500 ms to run.
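
If you can't edit postgresql.conf, the setting can also be changed per session, for example from psycopg2 (a sketch with a placeholder DSN; note that setting this parameter usually requires superuser privileges):

import psycopg2

conn = psycopg2.connect('dbname=test user=postgres')  # placeholder DSN
cur = conn.cursor()
cur.execute('SET log_min_duration_statement = 0')  # log every statement in this session
cur.execute('select pg_sleep(0.1)')  # will appear in the server log with its duration
conn.commit()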

You won't get the results of the query in your log file, but you'll get the exact query text, including the interpolated bind parameters.

If this works well for you, later on, check out the auto_explain module.

Good luck!

answered by W. Matthew Wilson