All, I've got an issue with Django signals.

In an effort to speed up the responsiveness of page loads, I'm offloading some intensive processing that must be done, via a call to a second localhost webserver we're running; both servers use the same database. I'm seeing behavior where the calling process can retrieve the object, but the called process can't. Both port 80 and port [port] point to Django processes running off the same database.

I have a model in models.py:
from urllib.request import urlopen

from django.db import models
from django.db.models.signals import post_save
from django.urls import reverse

class A(models.Model):
    stuff...

def trigger_on_post_save(sender, instance, created, raw, **kwargs):
    # This line works
    A.objects.get(pk=instance.pk)
    # then we call this
    urlopen(r'http://127.0.0.1:[port]' +
            reverse('some_view_url', args=(instance.pk,))).read()

post_save.connect(trigger_on_post_save, sender=A)
In views.py:

def some_view_function(request, a_pk):
    # This line raises A.DoesNotExist
    A.objects.get(pk=a_pk)
Furthermore, after the urlopen call raises an exception, the object does not exist in the database. It was my understanding that post_save fires after the object has been saved and written to the database. Is this incorrect?
Django includes a “signal dispatcher” which helps decoupled applications get notified when actions occur elsewhere in the framework. In a nutshell, signals allow certain senders to notify a set of receivers that some action has taken place.
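As a minimal illustration of that dispatch mechanism (the signal name and arguments below are invented for the example):

from django.dispatch import Signal, receiver

# A custom signal; senders may pass arbitrary keyword arguments
order_shipped = Signal()

@receiver(order_shipped)
def notify_warehouse(sender, **kwargs):
    print('order shipped:', kwargs.get('order_id'))

# Any code can now notify every connected receiver:
order_shipped.send(sender=None, order_id=42)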
We ran into a similar issue and ended up using the on_commit callback (note: this is only possible with Django >= 1.9). So you could possibly do something like:
from django.db import transaction

class A(models.Model):
    stuff...

def trigger_on_post_save(sender, instance, created, raw, **kwargs):
    def on_commit():
        urlopen(r'http://127.0.0.1:[port]' +
                reverse('some_view_url', args=(instance.pk,))).read()
    # Deferred until the surrounding transaction has committed
    transaction.on_commit(on_commit)

post_save.connect(trigger_on_post_save, sender=A)
The idea here is that you will be calling your endpoint after the transaction has been committed, so the instance involved in the transaction will already have been saved ;).
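One caveat if you unit-test this: inside Django's TestCase every test runs in a transaction that is rolled back, so on_commit callbacks never fire on their own. Django 3.2+ offers captureOnCommitCallbacks for exactly this; a rough sketch, with an illustrative test body:

from django.test import TestCase

class TriggerOnPostSaveTest(TestCase):
    def test_endpoint_called_after_commit(self):
        # execute=True runs the captured callbacks as though the
        # surrounding transaction had committed
        with self.captureOnCommitCallbacks(execute=True) as callbacks:
            A.objects.create()
        self.assertEqual(len(callbacks), 1)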
I believe post_save fires after the save occurs, but before the transaction is committed to the database. By default, Django only commits changes to the database after the request has been completed.
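You can observe this from inside the handler. A minimal diagnostic sketch, assuming the model A from the question (whether the handler runs inside an open transaction depends on your Django version and transaction settings, e.g. ATOMIC_REQUESTS):

from django.db import connection
from django.db.models.signals import post_save

def debug_post_save(sender, instance, created, **kwargs):
    # True means the row is not yet visible to other connections,
    # so a second process querying right now will miss it
    print('inside open transaction:', connection.in_atomic_block)

post_save.connect(debug_post_save, sender=A)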
Two possible solutions to your problem: manage your transactions manually and fire a custom signal after you commit, or have your second process wait a little while for the transaction to go through.
To be honest though, your whole setup seems a little bit nasty. You should probably look into Celery for asynchronous task queuing.
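For reference, a hedged sketch of the Celery route (the task name and wiring are illustrative); note that it still pairs best with on_commit, so the worker only starts once the row is visible:

from celery import shared_task
from django.db import transaction
from django.db.models.signals import post_save

@shared_task
def process_a(pk):
    # Runs in a separate worker process, after the web transaction
    # has committed
    a = A.objects.get(pk=pk)
    ...  # the intensive processing goes here

def trigger_on_post_save(sender, instance, created, **kwargs):
    # Enqueue only once the row is actually visible to other connections
    transaction.on_commit(lambda: process_a.delay(instance.pk))

post_save.connect(trigger_on_post_save, sender=A)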
It's a nice place to use decorators. Here is a slightly extended version of yoanis-gil's answer:
import functools

from django.db import transaction
from django.db.models.signals import post_save
from django.dispatch import receiver

def on_transaction_commit(func):
    @functools.wraps(func)
    def inner(*args, **kwargs):
        transaction.on_commit(lambda: func(*args, **kwargs))
    return inner

@receiver(post_save, sender=A)
@on_transaction_commit
def trigger_on_post_save(sender, **kwargs):
    # Do things here
    ...
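One detail worth noting here: the decorator order matters. @receiver must be the outermost decorator so that the on_commit-deferring wrapper, rather than the raw function, is what actually gets connected to the signal.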
I had the same issue when creating a new model instance from the Django admin. Overriding the ModelAdmin.save_model method to manage the transaction manually worked:
from django.contrib import admin
from django.db import transaction

class AAdmin(admin.ModelAdmin):
    def save_model(self, request, obj, form, change):
        # transaction.atomic() replaces commit_on_success(), which was
        # removed in Django 1.8
        with transaction.atomic():
            super().save_model(request, obj, form, change)
            # write your code here