 

Streaming data from Postgres into Python

I'm looking for advice on efficient ways to stream data incrementally from a Postgres table into Python. I'm in the process of implementing an online learning algorithm and I want to read batches of training examples from the database table into memory to be processed. Any thoughts on good ways to maximize throughput? Thanks for your suggestions.

asked Feb 24 '14 by Chris

1 Answer

If you are using psycopg2, you will want to use a named (server-side) cursor; otherwise psycopg2 will read the entire result set into memory at once.

import psycopg2

conn = psycopg2.connect("dbname=pgbench")  # connection string is just an example

# Giving the cursor a name makes it a server-side cursor, so rows are streamed
cursor = conn.cursor("some_unique_name")
cursor.execute("SELECT aid FROM pgbench_accounts")
for record in cursor:
    something(record)

This will fetch the records from the server in batches of 2000 rows (the default value of the cursor's itersize attribute) and then parcel them out to the loop one at a time.
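
If you want to process the rows in explicit batches (as the question describes), you can raise itersize and pull groups of rows with fetchmany(). Here is a minimal sketch, assuming the same pgbench table, an example connection string, and a hypothetical train_on() function standing in for your processing step:

import psycopg2

conn = psycopg2.connect("dbname=pgbench")  # connection string is just an example

with conn.cursor("batch_cursor") as cursor:
    cursor.itersize = 10000  # rows transferred from the server per round trip
    cursor.execute("SELECT aid FROM pgbench_accounts")
    while True:
        batch = cursor.fetchmany(10000)  # one training batch at a time
        if not batch:
            break
        train_on(batch)  # hypothetical: process one batch of examples

Matching itersize to the fetchmany() size keeps each batch to roughly one round trip to the server, which helps throughput without loading the whole table into memory.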

answered Oct 19 '22 by jjanes