
SQLAlchemy: prevent automatic closing

I need to bulk insert/update rows via SQLAlchemy and get the inserted rows back.

I tried to do it with session.execute:

 >>> posts = db.session.execute(Post.__table__.insert(), [{'title': 'dfghdfg', 'content': 'sdfgsdf', 'topic': topic}]*2)
 >>> posts.fetchall()

 ResourceClosedError                       Traceback (most recent call last)

And with engine:

In [17]: conn = db.engine.connect()  

In [18]: result = conn.execute(Post.__table__.insert(), [{'title': 'title', 'content':  'content', 'topic': topic}]*2)

In [19]: print result.fetchall()
ResourceClosedError: This result object does not return rows. It has been closed automatically.

In both cases the result object has been closed automatically. How can I prevent this?

I159 asked Jan 17 '13 09:01


People also ask

Does SQLAlchemy close connection automatically?

The Engine.connect() method returns a Connection object, and by using it in a Python context manager (e.g. the with: statement) the Connection.close() method is automatically invoked at the end of the block.
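A minimal sketch of that pattern, assuming a reasonably modern SQLAlchemy (1.4+) and a placeholder SQLite connection string:

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite:///:memory:")  # placeholder URL

    with engine.connect() as conn:   # connection checked out from the pool
        print(conn.execute(text("SELECT 1")).scalar())
    # leaving the block calls conn.close() automatically and returns the
    # DBAPI connection to the pool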

Does SQLAlchemy automatically commit?

The transaction is not committed automatically; when we want to commit data we normally need to call Connection.commit(). "autocommit" mode is available for special cases; the documentation section Setting Transaction Isolation Levels including DBAPI Autocommit discusses this.
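A short sketch of the explicit commit, assuming SQLAlchemy 2.0-style usage (the table and values are placeholders):

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite:///:memory:")

    with engine.connect() as conn:
        conn.execute(text("CREATE TABLE t (x INTEGER)"))
        conn.execute(text("INSERT INTO t (x) VALUES (:x)"), {"x": 1})
        conn.commit()  # nothing is persisted until this explicit commit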

What is Session flush in SQLAlchemy?

session.flush() communicates a series of operations to the database (insert, update, delete). The database maintains them as pending operations in a transaction.
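For example, a rough sketch reusing the Post model and db.session from the question (the id column is assumed to be an autoincrement primary key):

    post = Post(title='title', content='content', topic=topic)
    db.session.add(post)
    db.session.flush()   # INSERT is emitted, but still inside the open transaction
    print(post.id)       # server-generated primary key is now populated
    db.session.commit()  # makes the pending INSERT permanent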

Is SQLAlchemy Engine thread safe?

Every pool implementation in SQLAlchemy is thread safe, including the default QueuePool. This means that two threads requesting a connection simultaneously will check out two different connections. By extension, an engine will also be thread-safe.


1 Answer

First answer - on "preventing automatic closing".

SQLAlchemy runs the DBAPI execute() or executemany() for an insert and does not issue any select queries, so the exception you got is expected behavior. The ResultProxy object returned after an insert wraps a DB-API cursor that does not allow .fetchall() on it. Once .fetchall() fails, ResultProxy raises the exception you saw.

The only information you can get after an insert/update/delete operation is the number of affected rows or the auto-incremented primary key value (depending on the database and the database driver).

If your goal is to receive this kind of information, consider checking ResultProxy methods and attributes such as the ones below (a short sketch follows the list):

  • .inserted_primary_key
  • .last_inserted_params()
  • .lastrowid
  • etc
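A rough illustration for a single-row insert, reusing the session and Post table from the question (which attributes are populated depends on the database and driver):

    result = db.session.execute(
        Post.__table__.insert(),
        {'title': 'dfghdfg', 'content': 'sdfgsdf', 'topic': topic},  # one row, not a list
    )
    print(result.rowcount)               # number of affected rows
    print(result.inserted_primary_key)   # primary key of the single inserted row
    # with an executemany() (a list of dicts) only rowcount is reliably available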

Second answer - on "how to do bulk insert/update and get resulting rows".

There is no way to load the inserted rows from a single insert query through the DBAPI. The SQLAlchemy SQL Expression API you are using for bulk inserts/updates doesn't provide such functionality either: SQLAlchemy runs a DBAPI executemany() call and relies on the driver implementation. See this section of the documentation for details.

A solution would be to design your table so that every record has a natural key (a combination of column values that identifies a record uniquely), so that insert/update/select queries can target a single record. Then you can do the bulk insert/update first and follow it with a select query by natural key, which means you won't need to know the auto-incremented primary key values. A sketch of this approach follows.
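For example, assuming title serves as the natural key of your Post table (adjust to whatever combination of columns is actually unique in your schema):

    rows = [{'title': 'title-%d' % i, 'content': 'content', 'topic': topic}
            for i in range(2)]

    # bulk INSERT - executed via executemany(), no rows come back
    db.session.execute(Post.__table__.insert(), rows)

    # fetch the inserted rows back by their natural key - this query does return rows
    posts = db.session.execute(
        Post.__table__.select().where(
            Post.__table__.c.title.in_([r['title'] for r in rows])
        )
    ).fetchall()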

Another option: you may be able to use the SQLAlchemy Object Relational API for creating the objects - then SQLAlchemy may optimize the inserts into one executemany() query for you. It worked for me while using Oracle DB. There won't be any such optimization for updates out of the box. Check this SO question for efficient bulk update ideas.
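A minimal sketch of the ORM approach with the Post model from the question (whether the inserts are actually batched into one executemany() depends on the SQLAlchemy version and driver):

    posts = [Post(title='title', content='content', topic=topic) for _ in range(2)]
    db.session.add_all(posts)
    db.session.commit()          # the unit of work emits the INSERTs here
    # afterwards the ORM objects are usable directly, e.g. posts[0].id
    # (assuming an autoincrement `id` primary key column)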

vvladymyrov answered Nov 01 '22 22:11