I am new to Django and I'm not able to understand how to work with Django signals. Can anyone please explain "Django signals" with simple examples?
Thanks in advance.
You can find plenty of good material about Django signals on the internet with a little searching.
Here is a brief explanation of Django signals.
What are Django signals?
Signals allow certain senders to notify a set of receivers that some action has taken place.
Actions and the signals they send:
A model's save() method is called: django.db.models.signals.pre_save | post_save
A model's delete() method is called: django.db.models.signals.pre_delete | post_delete
A ManyToManyField on a model is changed: django.db.models.signals.m2m_changed
Django starts or finishes an HTTP request: django.core.signals.request_started | request_finished
All signals are django.dispatch.Signal instances.
A very basic example:
models.py
from django.db import models
from django.db.models import signals
def create_customer(sender, instance, created, **kwargs):
    print("Save is called")

class Customer(models.Model):
    name = models.CharField(max_length=16)
    description = models.CharField(max_length=32)

signals.post_save.connect(receiver=create_customer, sender=Customer)
Shell
In [1]: obj = Customer(name='foo', description='foo in detail')
In [2]: obj.save()
Save is called
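The same pattern works for non-model signals as well. As a further illustration (a minimal sketch, not part of the original example), the request_finished signal can be connected in the same way:
from django.core.signals import request_finished
from django.dispatch import receiver

@receiver(request_finished)
def on_request_finished(sender, **kwargs):
    # Runs once Django has finished processing any HTTP request.
    print("Request finished")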
Apart from the explanation given by Prashant, you can also use the receiver decorator from the django.dispatch module.
e.g.
from django.db import models
from django.db.models import signals
from django.dispatch import receiver
class Customer(models.Model):
    name = models.CharField(max_length=16)
    description = models.CharField(max_length=32)

@receiver(signals.post_save, sender=Customer)
def create_customer(sender, instance, created, **kwargs):
    print("customer created")
For more information, refer to this link.
In signals.post_save.connect(receiver=create_customer, sender=Customer), the sender is the model class whose saves you want to listen to; it can just as well be the User model.
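For example, here is a minimal sketch (the receiver name is mine; settings.AUTH_USER_MODEL is used as the sender, as recommended for the user model) of hooking the same kind of receiver to the user model:
from django.conf import settings
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=settings.AUTH_USER_MODEL)
def greet_new_user(sender, instance, created, **kwargs):
    # sender is the user model class, instance is the saved user object
    if created:
        print("A new user was created:", instance)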
Signals are utilities that connect events to actions: you write a function that runs whenever a given signal is sent. In other words, signals let you perform some action when a particular entry in the database is created or modified. For example, you might want to create a Profile instance as soon as a new User instance is created in the database.
Three commonly used pairs of model signals:
pre_save/post_save: sent before/after the save() method runs.
pre_delete/post_delete: sent before/after a model instance is deleted (the delete() method).
pre_init/post_init: sent before/after a model is instantiated (its __init__() method).
For example, we might want to create a profile for a user as soon as the user is created, using the post_save signal (see the sketch after the link below).
For a code example, the GeeksforGeeks article below explains it in a very simple and easy-to-understand way.
https://www.geeksforgeeks.org/how-to-create-and-use-signals-in-django/
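As a rough sketch of that idea (the Profile model and its fields here are assumptions for illustration, not code from the linked article), a profile can be created automatically whenever a new user is saved:
from django.conf import settings
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver

class Profile(models.Model):
    user = models.OneToOneField(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    bio = models.TextField(blank=True)

@receiver(post_save, sender=settings.AUTH_USER_MODEL)
def create_profile(sender, instance, created, **kwargs):
    # Only create the profile the first time the user instance is saved
    if created:
        Profile.objects.create(user=instance)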
You can add signals to your models.py file.
Here is an example of adding an automatic slug, if you use a SlugField.
These are the imports you need:
from django.utils.text import slugify
from django.dispatch import receiver
from django.db.models.signals import post_save, pre_save
Add the @receiver-decorated function below your class, including the def.
If you add the def __str__(self): under the receiver, you will get an error.
class Store(models.Model):
    name = models.CharField(max_length=100)
    slug = models.SlugField(unique=False, blank=True, null=True)

    def __str__(self):
        return self.name

@receiver(pre_save, sender=Store)
def store_pre_save(sender, instance, *args, **kwargs):
    if not instance.slug:
        instance.slug = slugify(instance.name)
Or you can use post_save:
class Store(models.Model):
    name = models.CharField(max_length=100)
    slug = models.SlugField(unique=False, blank=True, null=True)

    def __str__(self):
        return self.name

@receiver(post_save, sender=Store)
def store_post_save(sender, instance, created, *args, **kwargs):
    if not instance.slug:
        instance.slug = slugify(instance.name)
        instance.save()  # re-sends post_save, but the slug check above prevents endless recursion
I found this example from this tutorial
class Trade(models.Model):
    pips = models.FloatField(default=0)
    direction = models.CharField(max_length=30)
    new_balance = models.FloatField(default=0.0)
    ...

class Summary(models.Model):
    winning_trades = models.IntegerField(default=0)
    account_balance = models.FloatField(default=0.0)
    ...
When a user posts a request, it populates the Trade model; this should update the Summary model and send the new summary data back to the user. How can I do this in an elegant way?
You're most likely looking for Django Signals. You'd want your Trade model's create event to trigger a post_save signal that a listener will receive and process.
Assuming you have saved your models in a file models.py, create a file signals.py with the following:
from django.db.models.signals import post_save
from django.dispatch import receiver
from .models import Trade, Summary
@receiver(post_save, sender=Trade)
def update_summary(sender, instance, created, **kwargs):
    if created:
        pass  # query to update Summary as needed
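For illustration only, the body of that receiver might look roughly like the sketch below; the single-Summary-row lookup and the field updates are assumptions, since the real aggregation logic depends on your business rules:
from django.db.models import F

@receiver(post_save, sender=Trade)
def update_summary(sender, instance, created, **kwargs):
    if created:
        # Assumed: one Summary row tracks the whole account
        summary, _ = Summary.objects.get_or_create(pk=1)
        if instance.pips > 0:
            summary.winning_trades = F('winning_trades') + 1
        summary.account_balance = instance.new_balance
        summary.save()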
You'll have to add the signals to your app config. In the apps.py of the relevant app, add the following:
from django.apps import AppConfig

class AppnameConfig(AppConfig):
    name = 'appname'

    def ready(self):
        import appname.signals
First, I would encourage you to wrap these two actions in a transaction: if the second one fails, your database will remain consistent.
Then, Django allows you to override the model methods such as save. You should try that with something like the following:
from django.db import transaction

class Trade(models.Model):
    ...
    def save(self, *args, **kwargs):
        with transaction.atomic():
            super().save(*args, **kwargs)
            update_summary(self)
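The update_summary() helper is left undefined above; a minimal sketch of what it could do (the field names and the single-summary assumption are mine, not from the original answer):
from django.db.models import F

def update_summary(trade):
    # Assumed: one Summary row per account; adjust the lookup to your schema.
    # Also assumes Summary is imported from your models module.
    summary, _ = Summary.objects.get_or_create(pk=1)
    if trade.pips > 0:
        summary.winning_trades = F('winning_trades') + 1
    summary.account_balance = trade.new_balance
    summary.save()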
Then, in the view, you could query for the recently updated Summary and return it.
class TradeViewSet():
    def create(self, request, *args, **kwargs):
        user = request.user
        trade = Trade.objects.create(**request.data)
        updated_summary = get_summary(user.id)  # your own helper that fetches the user's Summary
        response = SummarySerializer(updated_summary)
        return Response(response.data)
After a user is saved, I need to make sure that its instance is associated with a group by default.
I have found two ways to achieve that:
Overriding the model's save() method
models.py:
from django.contrib.auth.models import AbstractUser, Group

class Person(AbstractUser):
    def save(self, *args, **kwargs):
        super().save(*args, **kwargs)
        to_add = Group.objects.get(id=1)  # get_or_create is a better option
        self.groups.add(to_add)
Capturing a post_save signal:
signals.py:
from django.conf import settings
from django.contrib.auth.models import Group
from django.db.models.signals import post_save
from django.dispatch import receiver
@receiver(
    post_save,
    sender=settings.AUTH_USER_MODEL,
)
def save_the_group(instance, raw, **kwargs):
    if not raw:
        to_add = Group.objects.get(id=1)  # get_or_create is a better option
        instance.groups.add(to_add)
Are these methods equal in achieving their goal?
Is there a better one in Django terms of "Good Practice"?
Update
Acquiring a better understanding of how Django works, I see that the confusion, and also the solution, lie in BaseModelForm.save():
...
if commit:
    # If committing, save the instance and the m2m data immediately.
    self.instance.save()
    self._save_m2m()
...
and in BaseModelForm._save_m2m():
...
if f.name in cleaned_data:
    f.save_form_data(self.instance, cleaned_data[f.name])
...
The instance is first saved to acquire a primary key (the post_save signal is emitted) and then all of its many-to-many relations are saved based on ModelForm.cleaned_data.
If any m2m relation has been added during the post_save signal or in the Model.save() method, it will be removed or overridden by BaseModelForm._save_m2m(), depending on the content of ModelForm.cleaned_data.
The transaction.on_commit() approach, which is discussed as a solution in this answer further down and in a few other SO answers that inspired me (and got me downvoted), will delay the changes made in the signal until BaseModelForm._save_m2m() has concluded its operations.
Although transaction.on_commit() is very useful in some special cases, here it is overkill, not only because it complicates the situation in an awkward manner (the most suitable signal is m2m_changed, as explained here) but because avoiding signals altogether is preferable.
Therefore, I will try to give a solution that covers both cases:
If the instance is saved from Django Admin (ModelForm)
If the instance is saved without using a ModelForm
models.py
from django.contrib.auth.models import AbstractUser, Group
class Person(AbstractUser):
    def save(self, *args, **kwargs):
        super().save(*args, **kwargs)
        if not getattr(self, 'from_modelform', False):  # This flag is created in the ModelForm
            <add - remove groups logic>
forms.py
from django import forms
from django.contrib.auth.forms import UserChangeForm
from django.contrib.auth.models import Group
from my_app.models import Person
class PersonChangeForm(UserChangeForm):
    def clean(self):
        cleaned_data = super().clean()
        if self.errors:
            return
        group = cleaned_data['groups']
        to_add = Group.objects.filter(id=1)
        to_remove = Group.objects.filter(id=2)
        cleaned_data['groups'] = group.union(to_add).difference(to_remove)
        self.instance.from_modelform = True
        return cleaned_data

    class Meta:
        model = Person
        fields = '__all__'
This will either work with:
>>> p = Person()
>>> p.username = 'username'
>>> p.password = 'password'
>>> p.save()
or with:
from django.contrib.auth.forms import UserCreationForm
from django.contrib.auth import get_user_model
from django.forms.models import modelform_factory
user_creationform_data = {
    'username': 'george',
    'password1': '123!trettb',
    'password2': '123!trettb',
    'email': 'email@yo.gr',
}

user_model_form = modelform_factory(
    get_user_model(),
    form=UserCreationForm,
)

user_creation_form = user_model_form(data=user_creationform_data)
new_user = user_creation_form.save()
Old answer
Based on either this or that SO question, along with an article titled "How to add ManytoMany model inside a post_save signal", the solution I turned to is on_commit(func, using=None):
The function you pass in will be called immediately after a
hypothetical database write made where on_commit() is called would be
successfully committed.
from django.conf import settings
from django.contrib.auth.models import Group
from django.db import transaction
from django.db.models.signals import post_save
from django.dispatch import receiver
def on_transaction_commit(func):
    ''' Create the decorator '''
    def inner(*args, **kwargs):
        transaction.on_commit(lambda: func(*args, **kwargs))
    return inner
@receiver(
    post_save,
    sender=settings.AUTH_USER_MODEL,
)
@on_transaction_commit
def group_delegation(instance, raw, **kwargs):
    to_add = Group.objects.get(id=1)
    instance.groups.add(to_add)
The above code does not take into account that every login causes a post_save signal (Django saves the user to update last_login).
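A minimal guard for that (a sketch, not part of the original answer; checking created and raw is the usual way to skip login-time saves and fixture loading):
@receiver(
    post_save,
    sender=settings.AUTH_USER_MODEL,
)
@on_transaction_commit
def group_delegation(instance, created, raw, **kwargs):
    # Skip fixture loading (raw) and non-creation saves such as the
    # last_login update that happens on every login.
    if raw or not created:
        return
    to_add = Group.objects.get(id=1)
    instance.groups.add(to_add)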
Digging Deeper
A crucial point made in the relevant Django ticket is that the above code will not work if a save() call is made inside an atomic transaction together with a validation that depends on the result of the group_delegation() function.
@transaction.atomic
def accept_group_invite(request, group_id):
    validate_and_add_to_group(request.user, group_id)
    # The below line would always fail in your case because the on_commit
    # receiver wouldn't be called until exiting this function.
    if request.user.has_perm('group_permission'):
        do_something()
    ...
The Django docs describe in more detail the constraints under which on_commit() works.
Testing
During testing, it is crucial to use TransactionTestCase, or the @pytest.mark.django_db(transaction=True) decorator when testing with pytest.
This is an example of how I tested this signal.
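For reference, a minimal sketch of such a test (the class name, group id and assertion are assumptions, not the original linked test):
from django.contrib.auth import get_user_model
from django.contrib.auth.models import Group
from django.test import TransactionTestCase

class GroupDelegationTest(TransactionTestCase):
    def test_new_user_gets_default_group(self):
        group = Group.objects.create(id=1, name='default')
        user = get_user_model().objects.create_user(
            username='alice', password='s3cret-pass'
        )
        # TransactionTestCase really commits, so the on_commit callback has run.
        self.assertIn(group, user.groups.all())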
I have a model that contains multiple many to many fields:
class Author(models.Model):
    name = models.CharField(max_length=30)

class Topic(models.Model):
    description = models.CharField(max_length=30)

class Article(models.Model):
    authors = models.ManyToManyField(Author, related_name='articles')
    topics = models.ManyToManyField(Topic, related_name='articles')
I need something pretty simple:
A method to be executed after the article is saved where I can access both authors and topics of that instance.
My first attempt was with a post_save signal, but that signal is fired when the model itself is saved, not after its relationships are saved, since those obviously go through a through model.
After some online reading I realized that I probably need to create my own signal and connect to it. The problem is that I have no idea what to overwrite and where to fire that signal.
Since I need this on multiple models, I thought I could create some M2MPostSaveModel class and have my models inherit from it so I can just catch the signals.
But where does Django send the signals? How can I override that? I honestly have no idea and I had no luck searching the docs, so apologies if it is already there and I didn't see it.
I finally did it, here's how:
I created a new signal
import django.dispatch
m2m_post_save = django.dispatch.Signal(providing_args=["instance"])  # providing_args was removed in Django 4.0; plain Signal() works there
Then I created a subclass of the ModelAdmin class that fires it after it finishes saving the related elements:
from django.contrib.admin import ModelAdmin

class M2MPostSaveModelAdmin(ModelAdmin):
    def save_related(self, request, form, formsets, change):
        super(M2MPostSaveModelAdmin, self).save_related(request, form, formsets, change)
        m2m_post_save.send(sender=self.__class__, instance=form.instance)
Now all I have to do is make my ModelAdmin classes inherit from M2MPostSaveModelAdmin and hook the signal up to my method.
def do_something_with_updated_instance(instance, **kwargs):
    # Here your instance has all the m2m relationships updated
    ...

m2m_post_save.connect(do_something_with_updated_instance, sender=YourModelAdminClass)
I used m2m_changed for separate processing.
from django.db.models.signals import m2m_changed

def topics_changed(sender, instance, **kwargs):
    # Do something
    ...

m2m_changed.connect(topics_changed, sender=Article.topics.through)
Do the same for authors.
I'm trying to write some code that sends an email every time one of the users modifies a model object. Currently, I'm working on having the one of the methods in models.py receive a post_save signal. I realize it's a well known fact that the post_save signal is usually sent twice, thus, the workaround is to utilize the dispatch_uid parameter. I have done this, but for some strange reason, I continue to receive two signals. Here's the code in my app's model.py file.
from django.db import models
from django.db.models.signals import post_save
def send_email(sender, **kwargs):
    print("Signal sent.")  # just a placeholder

post_save.connect(send_email, dispatch_uid="unique_identifier")
class Library_Associates(models.Model):
    first_name = models.CharField(max_length=200)
    last_name = models.CharField(max_length=200)
    department_choices = (
        ('ENG', 'Engineering'),
        ('ART', 'Arts and Sciences'),
        ('AFM', 'Accounting and Financial Management'),
        ('MAT', 'Mathematics'),
    )
    department = models.CharField(max_length=3, choices=department_choices, default='ENG')
    pub_date = models.DateTimeField('date published')

    def __str__(self):
        return self.first_name

    class Meta:
        verbose_name_plural = 'Library Associates'

class Info_Desk_Staff(models.Model):
    first_name = models.CharField(max_length=50)
    last_name = models.CharField(max_length=50)
    salary = models.IntegerField()
    hours_worked = models.IntegerField()

    def __str__(self):
        return self.first_name

    class Meta:
        verbose_name_plural = 'Info Desk Staff'
I already restarted the server several times and reset/deleted all the data for the app, and I still receive two signals. Is there something inherently wrong with my code? Any suggestions or insight would be greatly appreciated! Thanks!
Your problem comes from the fact that each time you modify an object via the admin interface, the admin app creates a django.contrib.admin.models.LogEntry instance that represents the changes made.
Because you are listening to post_save on all models, your listener is called twice: once for your model, and a second time for the LogEntry model.
Possible solutions include:
Register your listener separately for each of your models (e.g. select your models somehow and do it in a loop) using the sender argument of post_save.connect().
from django.apps import apps

for model in apps.get_models():
    post_save.connect(send_email, sender=model, dispatch_uid='unique_identifier')
Check whether the sender passed to the listener is the django.contrib.admin.models.LogEntry model:
from django.contrib.admin.models import LogEntry
...
def send_email(sender, **kwargs):
    if sender is LogEntry:
        return
Give your models a common superclass and test for it in the listener:
class MyModel(models.Model):
    class Meta:
        abstract = True  # no separate table; only used as a common base

class Library_Associates(MyModel):
    ...

class Info_Desk_Staff(MyModel):
    ...

def send_email(sender, **kwargs):
    if not issubclass(sender, MyModel):
        return
I am building a web-based application in which I need to send a notification by e-mail to registered users after some value in the tables is updated or modified.
I have a models.py:
class ProfileComment(models.Model):
    comment = models.TextField()
    user = models.CharField(max_length=30, null=False, blank=True)
    timestamp = models.DateTimeField(null=False, blank=True)
A method in views.py:
def send_email(request):
    ##
In my case, whenever the ProfileComment model is updated it should automatically call the send_email method, so that the user gets a notification about the change in the database. How should I proceed?
You can use Django signals!
https://docs.djangoproject.com/en/dev/ref/signals/
They allow you to register a signal listener. Perhaps the pre_save or post_save signal would best suit your needs? Whenever something is saved anywhere in your app your listener will be called, and you can make a decision based on the model or whatever other conditions you need.
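For instance, a minimal sketch using post_save (chosen here so the created flag is available) and django.core.mail.send_mail; the subject, message and addresses are placeholders:
from django.core.mail import send_mail
from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import ProfileComment  # assumes the signal code lives in the same app

@receiver(post_save, sender=ProfileComment)
def send_comment_email(sender, instance, created, **kwargs):
    action = 'created' if created else 'updated'
    send_mail(
        subject='Profile comment %s' % action,
        message='Comment by %s was %s.' % (instance.user, action),
        from_email='noreply@example.com',      # placeholder sender address
        recipient_list=['admin@example.com'],  # placeholder recipient
        fail_silently=True,
    )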
Or, if you only want to send the email for that one model, you could override the save method:
class ProfileComment(models.Model):
    def save(self, *args, **kwargs):
        # send email?
        super(ProfileComment, self).save(*args, **kwargs)  # make sure to call the parent!