I'm having trouble wrapping my head around this. Right now I have some models that look kind of like this:
    class Review(models.Model):
        # ...fields...
        overall_score = models.FloatField(blank=True)

    class Score(models.Model):
        review = models.ForeignKey(Review, on_delete=models.CASCADE)
        question = models.TextField()
        grade = models.IntegerField()
A Review has several Scores, and overall_score is the average of those scores. When a review or a score is saved, I need to recalculate the overall_score average. Right now I'm using an overridden save method. Would there be any benefits to using Django's signal dispatcher?
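The overridden save looks something like this (a sketch; computing the average with Avg is just one way to do it):

    from django.db.models import Avg

    class Score(models.Model):
        # ...fields as above...

        def save(self, *args, **kwargs):
            super().save(*args, **kwargs)  # save the score first
            # Then refresh the parent review's cached average.
            self.review.overall_score = (
                self.review.score_set.aggregate(Avg("grade"))["grade__avg"]
            )
            self.review.save()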
Only use signals to avoid introducing circular dependencies. If you have two apps, and one app wants to trigger behaviour in an app it already knows about, don't use signals. The app should just import the function it needs and call it directly.
pre_save is sent just before the model's save() method runs; put another way, save() proceeds only once every pre_save handler has finished without raising an error.
We override the save() method but still need the parent class's version to run, so we call it via super(). slugify is a Django utility that converts any string into a slug, so here we are converting the title into a slug. Let us try to create an instance titled “Gfg is the best website”.
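Put together, a minimal sketch of that pattern (the GfgArticle model name and its fields are assumptions for illustration):

    from django.db import models
    from django.utils.text import slugify

    class GfgArticle(models.Model):  # hypothetical model for illustration
        title = models.CharField(max_length=200)
        slug = models.SlugField(blank=True)

        def save(self, *args, **kwargs):
            # "Gfg is the best website" -> "gfg-is-the-best-website"
            self.slug = slugify(self.title)
            super().save(*args, **kwargs)  # run the parent's save logic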
To answer directly: no. Signal dispatch is synchronous, so a handler runs inline with save(), just like code in an overridden method.
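For example, a signal-based version of the question's recalculation (a sketch) behaves exactly like the save() override:

    from django.db.models import Avg
    from django.db.models.signals import post_save
    from django.dispatch import receiver

    @receiver(post_save, sender=Score)
    def update_overall_score(sender, instance, **kwargs):
        # Runs inline: Score.save() does not return until this finishes.
        review = instance.review
        review.overall_score = (
            review.score_set.aggregate(Avg("grade"))["grade__avg"]
        )
        review.save()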
Save/delete signals are generally favourable in situations where you need to make changes which aren't completely specific to the model in question, or could be applied to models which have something in common, or could be configured for use across models.
One common task in overridden save methods is automated generation of slugs from some text field in a model. That's an example of something which, if you needed to implement it for a number of models, would benefit from using a pre_save signal, where the signal handler could take the name of the slug field and the name of the field to generate the slug from. Once you have something like that in place, any enhanced functionality you put in place will also apply to all models - e.g. looking up the slug you're about to add for the type of model in question, to ensure uniqueness.
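A sketch of such a generic handler (make_slug_handler is a hypothetical helper, connected here to the hypothetical GfgArticle model from the earlier sketch):

    from django.db.models.signals import pre_save
    from django.utils.text import slugify

    def make_slug_handler(slug_field, from_field):
        """Return a pre_save handler that fills slug_field from from_field."""
        def handler(sender, instance, **kwargs):
            if not getattr(instance, slug_field):
                setattr(instance, slug_field,
                        slugify(getattr(instance, from_field)))
        return handler

    # The same logic can be connected to any number of models;
    # weak=False keeps the closure from being garbage-collected.
    pre_save.connect(make_slug_handler("slug", "title"),
                     sender=GfgArticle, weak=False)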
Reusable applications often benefit from the use of signals - if the functionality they provide can be applied to any model, they generally (unless it's unavoidable) won't want users to have to directly modify their models in order to benefit from it.
With django-mptt, for example, I used the pre_save signal to manage a set of fields which describe a tree structure for the model about to be created or updated, and the pre_delete signal to remove tree structure details for the object being deleted and its entire sub-tree of objects before they are deleted. Because signals are used, users don't have to add or modify save or delete methods on their models to have this management done for them; they just have to let django-mptt know which models they want it to manage.
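The registration pattern looks roughly like this (a sketch; the handler names are placeholders, not django-mptt's actual internals):

    from django.db.models.signals import pre_save, pre_delete

    def set_tree_fields(sender, instance, **kwargs):
        ...  # compute and assign the tree-structure fields (placeholder)

    def clear_tree_fields(sender, instance, **kwargs):
        ...  # detach the instance's sub-tree before deletion (placeholder)

    def register(model):
        # Manage tree fields for `model` without the user touching
        # its save() or delete() methods.
        pre_save.connect(set_tree_fields, sender=model)
        pre_delete.connect(clear_tree_fields, sender=model)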
You asked:
Would there be any benefits to using Django's signal dispatcher?
I found this in the Django docs:
Overridden model methods are not called on bulk operations
Note that the delete() method for an object is not necessarily called when deleting objects in bulk using a QuerySet or as a result of a cascading delete. To ensure customized delete logic gets executed, you can use pre_delete and/or post_delete signals.
Unfortunately, there isn’t a workaround when creating or updating objects in bulk, since none of save(), pre_save, and post_save are called.
From: Overriding predefined model methods
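To make that concrete with the models from the question (a sketch; assume review is a saved Review instance):

    # QuerySet.delete() skips each object's delete() method, but it
    # still sends pre_delete/post_delete for every deleted row:
    Score.objects.filter(review=review).delete()

    # Bulk creates and updates skip save(), pre_save, and post_save
    # entirely, so a cached overall_score silently goes stale here:
    Score.objects.filter(review=review).update(grade=5)
    Score.objects.bulk_create([Score(review=review, question="Q1", grade=4)])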