I have an application that parses CSV files and loads the data into a Postgres 9.3 database. In serial execution, the insert statements/cursor executions work without issue.
I added Celery to the mix to parallelize the parsing and inserting of the data files. Parsing works fine, but when the workers run the insert statements I get:
[2015-05-13 11:30:16,464: ERROR/Worker-1] ingest_task.work_it: Exception
Traceback (most recent call last):
  File "ingest_tasks.py", line 86, in work_it
    rowcount = ingest_data.load_data(con=con, statements=statements)
  File "ingest_data.py", line 134, in load_data
    ingest_curs.execute(statement)
DatabaseError: error with no message from the libpq
I encountered a similar problem when calling engine.execute() under multiprocessing. I finally solved it by calling engine.dispose() on the first line of the function the subprocess enters (a sketch follows the quote below), as suggested in the official documentation:
When a program uses multiprocessing or fork(), and an Engine object is copied to the child process, Engine.dispose() should be called so that the engine creates brand new database connections local to that fork. Database connections generally do not travel across process boundaries.
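
Here is a minimal sketch of that fix, assuming a module-level SQLAlchemy engine shared with a multiprocessing.Pool; the connection URL, table, and statements are hypothetical stand-ins. The same idea applies to Celery's prefork workers, since they also fork() from the parent process.

from multiprocessing import Pool

from sqlalchemy import create_engine, text

# Hypothetical engine created at module import time; it gets copied into
# each child process when the pool forks.
engine = create_engine("postgresql://user:pass@localhost/mydb")

def load_data(statements):
    # First line in the forked function: discard the connection pool
    # inherited from the parent so the engine opens brand new
    # connections local to this process.
    engine.dispose()
    with engine.begin() as con:  # transaction, committed on success
        for statement in statements:
            con.execute(text(statement))

if __name__ == "__main__":
    batches = [
        ["INSERT INTO t (x) VALUES (1)"],
        ["INSERT INTO t (x) VALUES (2)"],
    ]
    with Pool(2) as pool:
        pool.map(load_data, batches)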