I see I can override or define pre_save, save, post_save to do what I want when a model instance gets saved.
Which one is preferred in which situation and why?
I shall try my best to explain it with an example:
pre_save and post_save are signals that are sent around the model's save. In simpler words: actions to take before or after the model's save() is called.
A save triggers the following steps:

1. Emit a pre-save signal.
2. Pre-process the data (most fields do no pre-processing; the field data is kept as-is).
3. Prepare the data for the database.
4. Insert the data into the database.
5. Emit a post-save signal.
Django provides a way to hook into these signals by registering receiver functions.
Now,
A pre_save receiver can be registered to do some processing before the actual save to the database happens. Example: (I don't know a good example of where pre_save would be ideal off the top of my head)
Let's say you have a ModelA which stores references to all the objects of ModelB which have not been edited yet. For this, you can register a pre_save receiver to notify ModelA right before ModelB's save method gets called (nothing stops you from registering a post_save receiver here too).
Now the model's save method is called (save itself is not a signal). By default, every model has a save method, but you can override it:
class ModelB(models.Model):
    def save(self, *args, **kwargs):
        # do some custom processing here, e.g. convert an image's resolution to a normalized value
        super(ModelB, self).save(*args, **kwargs)
Then, you can register a post_save receiver (this is used more often than pre_save).
A common usecase is UserProfile object creation when User object is created in the system.
You can register a post_save signal which creates a UserProfile object that corresponds to every User in the system.
Signals are a way to keep things modular and explicit (explicitly notify ModelA if I save or change something in ModelB).
I shall think of more concrete real-world examples in an attempt to answer this question better. In the meantime, I hope this helps you.
pre_save
It is sent just before the instance is saved.
post_save
It is sent just after the instance is saved.
You can use pre_save, for example, if you have a FileField or an ImageField and want to check that the file or the image really exists.
You can use post_save when you have a UserProfile and you want to create a new one the moment a new User is created.
Don't forget about the risk of recursion.
If your post_save handler calls instance.save() (instead of the .update() method), you should disconnect your post_save signal first:
Signal.disconnect(receiver=None, sender=None, dispatch_uid=None)
To disconnect a receiver from a signal, call Signal.disconnect(). The arguments are as described in Signal.connect(). The method returns True if a receiver was disconnected and False if not.
The receiver argument indicates the registered receiver to disconnect.
It may be None if dispatch_uid is used to identify the receiver.
... and connect it again afterwards.
Keep in mind that the update() method doesn't send the pre_save and post_save signals.
Related
Is there any way to pass additional parameters to an instance I'm saving to the DB, so I can access them after the instance is saved?
Brief example of my case:
I'm using Django's signals as triggers for events, like sending a confirmation email, executed by other processes, like workers.
I want to specify which instances should trigger an event, when, and which should not: sometimes I want created/updated records to trigger a series of events, and sometimes I want them to be processed silently or to trigger some other actions.
One solution is saving the desired behaviour for a specific instance in a model field such as a JSONField and recovering that behaviour in post_save, but this seems like a very ugly way of handling such a problem.
I'm using the post_save signal as verification that the instance was correctly saved to the DB, because I don't want to trigger an event and then a moment later have something go wrong while saving the instance.
Instances are saved through Django forms, backend routines and REST framework serializers.
One solution is to use an arbitrary model instance attribute (not a field) to store the desired state. For example:
from django.db.models.signals import post_save
from django.dispatch import receiver

def my_view(request):
    ...
    instance._send_message = True if ... else False
    instance.save()

@receiver(post_save, sender=MyModel)
def my_handler(sender, instance, **kwargs):
    if instance._send_message:
        ...
When I tried to connect a handler to a model's post_save signal, I found that the model's ManyToMany field is empty at that moment. I googled and found a solution here: ManyToManyField is empty in post_save() function
The solution was to connect to m2m_changed signal of the model.
However I still have some questions.
How can I precisely detect that a model instance was created, not updated?
In the answer there was a condition:
if action == 'post_add' and not reverse:
But it seems not to work when I am editing the instance in the admin interface (it looks like the m2m field is touched when I click the "Save" button in the admin).
I have discovered one way to do it: assigning an instance attribute in the post_save handler and checking for it in the m2m_changed handler.
def on_m2m_changed(sender, instance, action, reverse, *args, **kwargs):
    if action == "post_add" and not reverse and instance.just_created:
        pass  # do stuff

def on_save(sender, instance, created, *args, **kwargs):
    instance.just_created = created
But for me it looks bad and I am not sure that it is the correct way to do that. Is there another way to do it?
What to do if we have multiple m2m fields in the model?
Is the order in which the m2m fields of the model are updated well-defined, and can we rely on it? Or should we connect to each m2m_changed handler and manipulate flags/counters on the instance? By the way, can we rely on the fact that m2m_changed is executed after post_save?
Maybe there is another way to handle the complete save of an instance together with all its m2m fields?
I have this problem too. Apparently this was a bug (7 years old) and was fixed 3 months ago:
https://code.djangoproject.com/ticket/6707
This might also interest you: in this ticket one of the core developers says it works as intended and won't be fixed:
https://code.djangoproject.com/ticket/13022
I'm working on a Django application connected to an LDAP server. Here's the trick I'm trying to pull off.
I have a model called System containing some information about computers. When I add a new system, the model generates a unique UUID, like an AutoField. The point is that this value is generated on save, and only the first time.
Once it is saved, I need a function to take that UUID and create a new object in my LDAP.
Since I didn't know much about signals, I tried overriding the model's save function this way:
def save(self, *args, **kwargs):
    # import needed modules
    import ldap
    import ldap.modlist as modlist
    [--OPERATIONS ON THE LDAP--]
    super(System, self).save(*args, **kwargs)
In this way, if I modify an existing system everything works as it should, because its UUID has already been generated. But if I try adding a new system I get the error that UUID is None, and I can't work with an empty variable in the LDAP (it would also be useless, don't you think?).
It seems I need to call the function that works on the LDAP after the system is saved, and therefore after a UUID has been generated. I tried to understand how to create a post_save function but I couldn't get it.
How can I do that?
Thanks
As you stated on your own, you do need signals; they will allow your code to stay cleaner and separate the logic between parts.
The usual approach is to place signals just at the end of your models file:
# Signals
from django.dispatch import receiver

@receiver(models.signals.post_save, sender=YourModel)
def do_something(sender, instance, created, **kwargs):
    ...
In the example above we connect the post_save signal with the do_something function; this is done through the @receiver decorator, whose sender argument points to your model class.
Inside your function you have instance, which holds the current instance of the model, and the created flag, which lets you determine whether this is a new record or an existing one (i.e. whether the model is being updated).
Signals would be excellent for something like this, but moving the line super(System, self).save() to the top of the save method might work as well. That means you first save the instance, before passing the saved object to the LDAP.
I am trying to add points to a User's profile after they submit a comment, using the Django comments framework. I think I need to use post_save but am not sure, to be perfectly honest.
Here is what I have as a method in my models.py:
def add_points(request, Comment):
    if Comment.post_save():
        request.user.get_profile().points += 2
        request.user.get_profile().save()
From the examples of post_save I've found, this is far from what is shown - so I think I am way off the mark.
Thank you for your help.
Unfortunately this makes no sense at all.
Firstly, this can't be a method, as it doesn't have self as the first parameter.
Secondly, it seems to be taking the class, not an instance. You can't save the class itself, only an instance of it.
Thirdly, post_save is not a method of the model (unless you've defined one yourself). It's a signal, and you don't call a signal, you attach a signal handler to it and do logic there. You can't return data from a signal to a method, either.
And finally, the profile instance that you add 2 to will not necessarily be the same as the one you save in the second line, because Django model instances don't have identity. Get it once and put it into a variable, then save that.
The Comments framework defines its own signals that you can use instead of the generic post_save. So, what you actually need is to register a signal handler on comment_was_posted. Inside that handler, you'll need to get the user's profile, and update that.
from django.contrib.comments.signals import comment_was_posted
from django.contrib.comments.models import Comment

def comment_handler(sender, comment, request, **kwargs):
    profile = request.user.get_profile()
    profile.points += 2
    profile.save()

comment_was_posted.connect(comment_handler, sender=Comment)
I'm having trouble wrapping my head around this. Right now I have some models that looks kind of like this:
class Review(models.Model):
    # ...fields...
    overall_score = models.FloatField(blank=True)

class Score(models.Model):
    review = models.ForeignKey(Review)
    question = models.TextField()
    grade = models.IntegerField()
A Review has several "scores"; the overall_score is the average of those scores. When a review or a score is saved, I need to recalculate the overall_score average. Right now I'm using an overridden save method. Would there be any benefits to using Django's signal dispatcher?
Save/delete signals are generally favourable in situations where you need to make changes which aren't completely specific to the model in question, or could be applied to models which have something in common, or could be configured for use across models.
One common task in overridden save methods is automated generation of slugs from some text field in a model. That's an example of something which, if you needed to implement it for a number of models, would benefit from using a pre_save signal, where the signal handler could take the name of the slug field and the name of the field to generate the slug from. Once you have something like that in place, any enhanced functionality you put in place will also apply to all models - e.g. looking up the slug you're about to add for the type of model in question, to ensure uniqueness.
Reusable applications often benefit from the use of signals - if the functionality they provide can be applied to any model, they generally (unless it's unavoidable) won't want users to have to directly modify their models in order to benefit from it.
With django-mptt, for example, I used the pre_save signal to manage a set of fields which describe a tree structure for the model which is about to be created or updated and the pre_delete signal to remove tree structure details for the object being deleted and its entire sub-tree of objects before it and they are deleted. Due to the use of signals, users don't have to add or modify save or delete methods on their models to have this management done for them, they just have to let django-mptt know which models they want it to manage.
You asked:
Would there be any benefits to using Django's signal dispatcher?
I found this in the django docs:
Overridden model methods are not called on bulk operations
Note that the delete() method for an object is not necessarily called
when deleting objects in bulk using a QuerySet or as a result of a
cascading delete. To ensure customized delete logic gets executed, you
can use pre_delete and/or post_delete signals.
Unfortunately, there isn’t a workaround when creating or updating
objects in bulk, since none of save(), pre_save, and post_save are
called.
From: Overriding predefined model methods
Small addition from Django docs about bulk delete (.delete() method on QuerySet objects):
Keep in mind that this will, whenever possible, be executed purely in
SQL, and so the delete() methods of individual object instances will
not necessarily be called during the process. If you’ve provided a
custom delete() method on a model class and want to ensure that it is
called, you will need to “manually” delete instances of that model
(e.g., by iterating over a QuerySet and calling delete() on each
object individually) rather than using the bulk delete() method of a
QuerySet.
https://docs.djangoproject.com/en/1.11/topics/db/queries/#deleting-objects
And bulk update (.update() method on QuerySet objects):
Finally, realize that update() does an update at the SQL level and,
thus, does not call any save() methods on your models, nor does it
emit the pre_save or post_save signals (which are a consequence of
calling Model.save()). If you want to update a bunch of records for a
model that has a custom save() method, loop over them and call save()
https://docs.djangoproject.com/en/2.1/ref/models/querysets/#update
If you use signals you'll be able to update the Review score each time a related Score gets saved. But if you don't need such functionality, I don't see any reason to put this into a signal; that's pretty model-related stuff.
It is a kind of denormalisation. Look at this pretty solution: an in-place composition field definition.