I'm trying to do some operations after an insert occurs on a particular table.
```python
user = ModelFactory.create_user(username, email, password)
db.session.add(user)
db.session.commit()
```
So I have created a method which is invoked automatically on `after_insert`:

```python
def notify_user(user):
    print user.id
    book = Book(author=user.id, name='foo')
    db.session.add(book)
    db.session.commit()  # note: commit() takes no arguments

@event.listens_for(User, 'after_insert')
def receive_after_insert(mapper, connection, target):
    print target
    notify_user(target)
```
But this code produces a warning:

```
SAWarning: Usage of the 'Session.add()' operation is not currently supported within the execution stage of the flush process. Results may not be consistent. Consider using alternative event listeners or connection-level operations instead.
  % method)
/home/avinash/.virtualenvs/s2s/local/lib/python2.7/site-packages/sqlalchemy/util/langhelpers.py:68: SAWarning: An exception has occurred during handling of a previous exception. The previous exception is:
<class 'sqlalchemy.exc.ResourceClosedError'> This transaction is closed
```
This post shows that the work has to be done in `before_flush`. I tried moving the logic into a `before_flush` listener, but there `user.id` is `None`. That's expected, since the row hasn't been flushed to the database yet, so the primary key hasn't been assigned. Likewise, I tried the `after_flush` event, but I still get the same warning.
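The warning itself points at one escape hatch: perform the follow-up INSERT at the connection level inside `after_insert`, instead of touching the `Session` mid-flush. Below is a minimal, self-contained sketch of that approach (not from the question; the `User`/`Book` models and the in-memory SQLite engine are illustrative stand-ins mirroring the question's code, and SQLAlchemy 1.4+ is assumed):

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine, event
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    username = Column(String)

class Book(Base):
    __tablename__ = "books"
    id = Column(Integer, primary_key=True)
    author = Column(Integer, ForeignKey("users.id"))
    name = Column(String)

@event.listens_for(User, "after_insert")
def receive_after_insert(mapper, connection, target):
    # target.id is already populated here, and `connection` is the very
    # connection the flush is running on, so we never re-enter the Session.
    connection.execute(
        Book.__table__.insert().values(author=target.id, name="foo")
    )

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add(User(username="avinash"))
session.commit()
print(session.query(Book).count())  # → 1
```

Because the statement runs on the flush's own connection, it joins the surrounding transaction and sidesteps the mid-flush `Session.add()` problem entirely.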
The `after_flush` event does work, as long as the listener is registered from inside `after_insert` so that it closes over `user`:

```python
@event.listens_for(User, "after_insert")
def receive_after_insert(mapper, connection, user):
    print(user.id)

    @event.listens_for(Session, "after_flush", once=True)
    def receive_after_flush(session, context):
        session.add(Book(author=user.id, name="foo"))
```
But why wouldn't you want to make `Book.author` a relationship instead?
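For completeness, here is a sketch of what that relationship-based version could look like (model and attribute names are illustrative, SQLAlchemy 1.4+ is assumed). With a `relationship`, the foreign key is filled in at flush time, so no event listener is needed at all:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    username = Column(String)
    books = relationship("Book", back_populates="author")

class Book(Base):
    __tablename__ = "books"
    id = Column(Integer, primary_key=True)
    name = Column(String)
    author_id = Column(Integer, ForeignKey("users.id"))
    author = relationship("User", back_populates="books")

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

# Associate the objects in Python; the Book is cascaded into the session
# along with the User, and author_id is assigned during the flush.
user = User(username="avinash", books=[Book(name="foo")])
session.add(user)
session.commit()
print(user.books[0].author_id == user.id)  # → True
```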
I think the accepted solution has a memory leak. According to the documentation for the `event.listens_for` function:

> The once argument does not imply automatic de-registration of the listener function after it has been invoked a first time; a listener entry will remain associated with the target object. Associating an arbitrarily high number of listeners without explicitly removing them will cause memory to grow unbounded even if once=True is specified.

I did some testing, and sure enough, every time the `receive_after_insert` event fired in this scenario, it added a new entry to the event registry. I fixed this issue by adding an `event.remove` call to the `receive_after_flush` function, like this:
```python
@event.listens_for(User, "after_insert")
def receive_after_insert(mapper, connection, user):
    print(user.id)

    @event.listens_for(Session, "after_flush", once=True)
    def receive_after_flush(session, context):
        session.add(Book(author=user.id, name="foo"))
        event.remove(Session, "after_flush", receive_after_flush)
```