I am trying to add points to a User's profile after they submit a comment, using the Django comment framework. I think I need to use a post_save signal, but to be perfectly honest I am not sure.
Here is what I have as a method in my models.py:
def add_points(request, Comment):
    if Comment.post_save():
        request.user.get_profile().points += 2
        request.user.get_profile().save()
From the examples of post_save I've found, this is far from what is shown - so I think I am way off the mark.
Thank you for your help.
Unfortunately this makes no sense at all.
Firstly, this can't be a method, as it doesn't have self as the first parameter.
Secondly, it seems to be taking the class, not an instance. You can't save the class itself, only an instance of it.
Thirdly, post_save is not a method of the model (unless you've defined one yourself). It's a signal, and you don't call a signal, you attach a signal handler to it and do logic there. You can't return data from a signal to a method, either.
And finally, the profile instance that you add 2 to will not necessarily be the same as the one you save in the second line, because Django model instances don't have identity. Get it once and put it into a variable, then save that.
The Comments framework defines its own signals that you can use instead of the generic post_save. So, what you actually need is to register a signal handler on comment_was_posted. Inside that handler, you'll need to get the user's profile, and update that.
from django.contrib.comments.models import Comment
from django.contrib.comments.signals import comment_was_posted

def comment_handler(sender, comment, request, **kwargs):
    profile = request.user.get_profile()
    profile.points += 2
    profile.save()

comment_was_posted.connect(comment_handler, sender=Comment)
I am working on a little Django application where the model contains a TextField and a PositiveIntegerField.
I need the PositiveIntegerField to be populated with the number of words in the TextField.
This could be done via custom JavaScript code that counts the words in the TextField's text area and puts the value into the word-count text box before submitting the form, but I do not want to play with custom JavaScript in the admin.
How to assign a value to a PositiveIntegerField programmatically?
This can be achieved using a pre_save Signal.
Create a signal handler like this:
def calculate_wordcount(sender, instance, **kwargs):
    # count the words (assuming the TextField is named `text`)
    instance.word_count = len(instance.text.split())
Then attach this function to your model. The preferred way since Django 1.7 is the application configuration (see the docs).
You can connect your function in the AppConfig ready() method:
def ready(self):
    pre_save.connect(calculate_wordcount,
                     sender=YourModel,  # your model
                     weak=False,
                     dispatch_uid='models.your_model.wordcount')
I'll leave all necessary imports up to you; please comment if you need further direction!
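For context, a minimal sketch of where that ready() method could live, assuming a hypothetical app your_app with a model YourModel and the handler kept in a signals.py module (all placeholder names):

# your_app/apps.py
from django.apps import AppConfig
from django.db.models.signals import pre_save

class YourAppConfig(AppConfig):
    name = 'your_app'  # placeholder app label

    def ready(self):
        # import here so models are loaded only after the app registry is ready
        from .models import YourModel            # placeholder model
        from .signals import calculate_wordcount  # assuming the handler lives in signals.py
        pre_save.connect(calculate_wordcount,
                         sender=YourModel,
                         weak=False,
                         dispatch_uid='models.your_model.wordcount')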
While in general I think save signals are a reasonable idea, in this particular case you can also override .save() yourself:
class MyClass(models.Model):
    # [stuff]

    def save(self, *args, **kwargs):
        # something that calculates wordcount, e.g. assuming a TextField named `text`:
        self.wordcount = len(self.text.split())
        super(MyClass, self).save(*args, **kwargs)  # call Django's save!
(As a rule of thumb, I'll override save() if I'm just doing things to one model, but use a signal if I'm using one model to affect another. The idea is to keep all things affecting a model near it in the code. But this gets into personal philosophy.)
Also WARNING: no matter which approach you use, the signal or overriding save(), bulk operations such as bulk_create() and QuerySet.update() will not send these signals or call your custom save() method. See the documentation here: https://docs.djangoproject.com/en/1.8/topics/db/models/#overriding-predefined-model-methods
If I do the following:
obj = Model.objects.get(pk=2)
obj.field = 'new value'
obj.save()
It runs the custom save method that I have written in django.
However, if I do a normal update statement:
Model.objects.filter(pk=2).update(field='new value')
It does not use the custom save method. My question here is two-fold:
1) Why was that decision made in django -- why doesn't every 'save' implement the custom save method.
2) Is there a codebase-wide way to make sure that no update statements are ever made? Basically, I just want to ensure that the custom save method is always run whenever doing a save within the django ORM. How would this be possible?
I'm not a Django developer, but I dabble from time to time and no one else has answered yet.
Why was that decision made in django -- why doesn't every 'save' implement the custom save method.
I'm going to guess here that this is done as a speed optimization for the common case of just performing a bulk update. update works on the SQL level so it is potentially much faster than calling save on lots of objects, each one being its own database transaction.
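As a rough illustration of the difference (using the Person model from the snippet further down), a bulk update issues a single SQL statement, while looping and saving runs save() once per object:

# One SQL UPDATE; save() and the pre_save/post_save signals are bypassed:
Person.objects.filter(last_name='Smith').update(first_name='John')

# One query per object, but save() and its signals run for each:
for person in Person.objects.filter(last_name='Smith'):
    person.first_name = 'John'
    person.save()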
Is there a codebase-wide way to make sure that no update statements are ever made? Basically, I just want to ensure that the custom save method is always run whenever doing a save within the django ORM. How would this be possible?
You can use a custom manager with a custom QuerySet that raises some Exception whenever update is called. Of course, you can always loop over the Queryset and call save on each object if you need the custom behavior.
Forbidding Update on a Model
from django.db import models

class NoUpdateQuerySet(models.QuerySet):
    """Don't let people call update! Muahaha"""

    def update(self, **kwargs):
        # you should raise a more specific Exception.
        raise Exception('You shall not update; use save instead.')

class Person(models.Model):
    first_name = models.CharField(max_length=50)
    last_name = models.CharField(max_length=50)

    # setting the custom manager keeps people from calling update.
    objects = NoUpdateQuerySet.as_manager()
You would just need to set the NoUpdateQuerySet as a manager for each model you don't want to update. I don't really think it's necessary to set a custom QuerySet though; if it were me I would just not call update, and loop through the objects that need to be saved whenever I need to. You may find a time when you want to call update, and this would end up being very annoying.
Forbidding Update Project-Wide
If you really really decide you hate update, you can just monkey-patch the update method. Then you can be completely certain it's not being called. You can monkey-patch it in your project's settings.py, since you know that module will be imported:
def no_update(self, **kwargs):
    # probably want a more specific Exception
    raise Exception('NO UPDATING HERE')

from django.db.models.query import QuerySet
QuerySet.update = no_update
Note that the traceback will actually be pretty confusing, since it will point to a function in settings.py. I'm not sure how much, if ever, update is used by other apps; this could have unintended consequences.
When I tried to connect a handler to a model's post_save signal, I found that the model's ManyToMany field is empty at that moment. I googled and found a solution here: ManyToManyField is empty in post_save() function
The solution was to connect to m2m_changed signal of the model.
However I still have some questions.
How do I precisely detect that a model instance was created and not updated?
In the answer there was a condition:
if action == 'post_add' and not reverse:
But it does not seem to work when I am editing the instance in the admin interface (it looks like the m2m field is touched when I click the "Save" button in the admin).
I have discovered one way to do it: assign an instance attribute in the post_save handler and check for it in the m2m_changed handler.
def on_m2m_changed(sender, instance, action, reverse, *args, **kwargs):
    if action == "post_add" and not reverse and instance.just_created:
        pass  # do stuff

def on_save(sender, instance, created, *args, **kwargs):
    instance.just_created = created
But for me it looks bad and I am not sure that it is the correct way to do that. Is there another way to do it?
What to do if we have multiple m2m fields in the model?
Is the order of updating the model's m2m fields well-defined, and can we rely on it? Or should we connect to each m2m_changed signal and manipulate flags/counters on the instance? By the way, can we rely on the fact that m2m_changed is executed after post_save?
Maybe there is another way to handle the complete save of the instance with all its m2m fields?
I have this problem too. Apparently this was a bug (7 years old) and was fixed 3 months ago:
https://code.djangoproject.com/ticket/6707
This might also interest you; in this ticket one of the core developers says this works as intended and won't fix it:
https://code.djangoproject.com/ticket/13022
I'm working on a Django application connected to an LDAP server. Here's the trick I'm trying to do.
I have a model called System containing some information about computers. When I add a new System, the model generates a unique UUID, like an AutoField. The point is that this parameter is generated when saving, and only the first time.
After saving, I need a function to take that UUID and create a new object on my LDAP server.
Since I didn't know a lot about signals, I tried overriding the model's save function in this way:
def save(self):
    # import needed modules
    import ldap
    import ldap.modlist as modlist

    # [--OPERATIONS ON THE LDAP--]

    super(System, self).save()
In this way, if I modify an existing System everything works as it should, because its UUID has already been generated. But if I try adding a new System I find that the UUID is None, and I can't work with an empty variable on the LDAP (it would also be useless, don't you think so?).
It seems I need to call the function that works on the LDAP after the System is saved, and so after a UUID has been generated. I tried to understand how to create a post_save function but I couldn't get it.
How can i do that?
Thanks
As you stated yourself, you do need signals; they will allow your code to stay cleaner and separate logic between parts.
The usual approach is to place signals just at the end of your models file:
# Signals
from django.dispatch import receiver

@receiver(models.signals.post_save, sender=YourModel)
def do_something(sender, instance, created, **kwargs):
    ...
In the above example we connect the post_save signal with the do_something function; this is done through the @receiver decorator, whose sender argument points to your model class.
Inside your function you have instance, which holds the current instance of the model, and the created flag, which allows you to determine whether this is a new record or an old one (i.e. the model is being updated).
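For the LDAP case in the question, a minimal sketch of such a handler could look like this, assuming the model is called System and exposes its generated UUID as instance.uuid (the LDAP operations themselves stay a placeholder):

from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=System)  # System is the question's model
def create_ldap_entry(sender, instance, created, **kwargs):
    if created:
        # the row is already saved, so the UUID has been generated by now
        # [--OPERATIONS ON THE LDAP, using instance.uuid--]
        pass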
Signals would be excellent for something like this, but moving the line super(System, self).save() to the top of the save method might work as well. That means you first save the instance, before passing the saved object to the LDAP.
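A minimal sketch of that reordering, assuming the UUID is filled in by the parent save() and stored on self.uuid:

def save(self, *args, **kwargs):
    super(System, self).save(*args, **kwargs)  # save first, so the UUID exists
    import ldap
    import ldap.modlist as modlist
    # [--OPERATIONS ON THE LDAP, now that self.uuid is set--]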
I see I can override or define pre_save, save, post_save to do what I want when a model instance gets saved.
Which one is preferred in which situation and why?
I shall try my best to explain it with an example:
pre_save and post_save are signals that are sent by the model. In simpler words, actions to take before or after the model's save is called.
A save triggers the following steps:
1) Emit a pre-save signal.
2) Pre-process the data. Most fields do no pre-processing; the field data is kept as-is.
3) Prepare the data for the database.
4) Insert the data into the database.
5) Emit a post-save signal.
Django provides a way to hook into these signals.
Now,
The pre_save signal can be connected for some processing before the actual save into the database happens. Example: (I don't know of a good example where pre_save would be ideal off the top of my head.)
Let's say you have a ModelA which stores references to all the objects of ModelB which have not been edited yet. For this, you can register a pre_save signal to notify ModelA right before ModelB's save method gets called (nothing stops you from registering a post_save signal here too).
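A rough sketch of that bookkeeping (ModelA, ModelB and the unedited_b field are all hypothetical names):

from django.db.models.signals import pre_save
from django.dispatch import receiver

@receiver(pre_save, sender=ModelB)
def notify_model_a(sender, instance, **kwargs):
    # the object is about to be edited, so drop it from ModelA's
    # "not edited yet" references (purely illustrative logic)
    if instance.pk:
        ModelA.objects.filter(unedited_b=instance).delete()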
Now, the model's save method (it is not a signal) is called. By default, every model has a save method, but you can override it:
class ModelB(models.Model):
    def save(self, *args, **kwargs):
        # do some custom processing here.
        # Example: convert image resolution to a normalized value
        super(ModelB, self).save(*args, **kwargs)
Then, you can register the post_save signal (this is used more often than pre_save).
A common use case is UserProfile object creation when a User object is created in the system.
You can register a post_save signal which creates a UserProfile object that corresponds to every User in the system.
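A minimal sketch of that handler, assuming a UserProfile model with a OneToOneField to User named user:

from django.contrib.auth.models import User
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=User)
def create_user_profile(sender, instance, created, **kwargs):
    if created:
        # UserProfile is assumed to have a OneToOneField to User named `user`
        UserProfile.objects.create(user=instance)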
Signals are a way to keep things modular and explicit. (Explicitly notify ModelA if I save or change something in ModelB.)
I shall think of more concrete real-world examples in an attempt to answer this question better. In the meantime, I hope this helps you.
pre_save
It's sent just before the save happens.
post_save
It's sent just after the save happens.
You can use pre_save, for example, if you have a FileField or an ImageField and want to check whether the file or the image really exists.
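For example, a sketch of such a check (Document and its FileField named file are hypothetical names):

from django.db.models.signals import pre_save
from django.dispatch import receiver

@receiver(pre_save, sender=Document)  # Document is a hypothetical model
def check_file_exists(sender, instance, **kwargs):
    # refuse to save if the referenced file is missing from storage
    if instance.file and not instance.file.storage.exists(instance.file.name):
        raise ValueError('The referenced file does not exist.')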
You can use post_save when you have a UserProfile and you want to create a new one the moment a new User is created.
Don't forget about the risk of recursion.
If your post_save handler itself calls instance.save() (instead of the .update() method), you should disconnect your post_save signal first:
Signal.disconnect(receiver=None, sender=None, dispatch_uid=None)

To disconnect a receiver from a signal, call Signal.disconnect(). The arguments are as described in Signal.connect(). The method returns True if a receiver was disconnected and False if not.

The receiver argument indicates the registered receiver to disconnect. It may be None if dispatch_uid is used to identify the receiver.
... and connect it again after.
The update() method doesn't send pre_ and post_ signals; keep that in mind.
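A minimal sketch of that disconnect/reconnect pattern (MyModel, my_handler and compute_total are placeholder names):

from django.db.models.signals import post_save

def my_handler(sender, instance, created, **kwargs):
    instance.total = compute_total(instance)  # placeholder recalculation
    post_save.disconnect(my_handler, sender=MyModel)  # avoid infinite recursion
    try:
        instance.save()
    finally:
        post_save.connect(my_handler, sender=MyModel)  # connect it again after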