In my app I want to keep track of all the questions that are being deleted, so I have created a class (table) like this in my models file:
class Deleted(models.Model):
    question = models.IntegerField(null=True, blank=True)  # id of question being deleted
    user = models.IntegerField(null=True, blank=True)  # id of user deleting the question
    dt = models.DateTimeField(null=True, blank=True)  # time question is deleted
When a user tries to delete a question, this delete function is called:
def delete_questions(request, user, questions):
    for q in questions:
        q.delete()
My question is: how can I use Django's pre_delete signal to populate the new table I have created?
~newbie trying hefty task~
Thanks in advance:)
You start off by defining the receiver you want to use:
def log_deleted_question(sender, instance, using, **kwargs):
    d = Deleted()
    d.question = instance.id
    d.dt = datetime.datetime.now()  # consider using auto_now=True in your Deleted definition
    # not sure how you'd get the user via a signal,
    # since it can happen from a number of places (like the command line)
    d.save()
Then define your receiver decorator:
from django.db.models.signals import pre_delete
from django.dispatch import receiver

@receiver(pre_delete, sender=Question, dispatch_uid='question_delete_log')
Putting it all together:
import datetime

from django.db.models.signals import pre_delete
from django.dispatch import receiver


@receiver(pre_delete, sender=Question, dispatch_uid='question_delete_signal')
def log_deleted_question(sender, instance, using, **kwargs):
    d = Deleted()
    d.question = instance.id
    d.dt = datetime.datetime.now()
    d.save()
You can put this function in your models.py file, as you know it'll be loaded and connected up correctly.
The problem, though, is that you don't get the user requesting the delete, since a delete can be triggered from the Django API (command line, shell, etc.), which doesn't have a request associated with it. For this reason, you might want to avoid using signals if it's absolutely critical that you store the user along with the delete.
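If the user really is required, a minimal sketch (using the Deleted model and delete_questions view from the question; the .models import path is an assumption) would be to log the deletion directly in the view, where the request is available, instead of in a signal:

import datetime

from .models import Deleted


def delete_questions(request, user, questions):
    for q in questions:
        # Record who deleted what, and when, before actually deleting it
        Deleted.objects.create(
            question=q.id,
            user=request.user.id,
            dt=datetime.datetime.now(),
        )
        q.delete()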
Related
I am using Django with Python. I am trying to update the model whenever a field is updated. In this case, because I have a Lambda function in the cloud that updates an instance of the model with a Postgres query, I want the age of the Contact model below to be updated during that update action:
data (Contact table):
id = 1
name = 'john'
age = 38
sql:
UPDATE contacts_contact SET name = 'jane' WHERE id = '1'; -- this works fine
Now I want to make sure that when the name is changed to jane as above, the age is updated automatically in Django via the overridden method.
django
class Contact(models.Model):
    ...
    name = models.CharField()
    age = models.IntegerField()

    def update(self, *args, **kwargs):
        if self.age:
            self.age = 25
        super().update(*args, **kwargs)  # i tried this
        super(Contact, self).update(*args, **kwargs)  # i tried this too
Both update methods I tried above fail to update the age of the person, even though the SQL query update worked.
Is there something that I am missing?
PS: I want to update that field specifically in Django, not in the SQL query.
"I am using django with python. I am trying to update the model whenever a field is updated" if this is the case django signals can help you, in-fact they are built for this purpose only.
follow the below to steps to enable the django-signals for your models.
Inside your_app/apps.py you should find a snippet like the one below. If not, paste it in and modify it with your app name. Here you're just importing the signals (the contents of your_app/signals.py):
from django.apps import AppConfig


class YourAppConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'your_app'
    verbose_name = 'Your App'

    def ready(self):
        import your_app.signals
Inside your_app/signals.py, paste the contents below and modify them with your own model name. For now I'll use your Contact model.
from django.db.models.signals import post_save
from django.dispatch import receiver

from your_app.models import Contact


@receiver(post_save, sender=Contact)
def create_user(sender, instance, created, **kwargs):
    print(sender)
    print(instance)
    print(kwargs)
    if created:
        print('Hurray its created', created)
Here you can use the post_save signal. Django includes a "signal dispatcher" which helps decoupled applications get notified when actions occur elsewhere in the framework. In a nutshell, signals allow certain senders to notify a set of receivers that some action has taken place.
The receiver runs whenever you call the .save() method of the Contact model (create or update). For example:
c = Contact(name='zzz', age=20)
c.save()
This will cause the post_save signal to notify your receiver about the changes; you can verify it through the print statements in your create_user() function. From there you can do whatever you want with your model.
You can read more about the post_save signal here: https://docs.djangoproject.com/en/4.0/ref/signals/#post-save
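Applied to the question above, a rough sketch of a receiver that rewrites age on every ORM update might look like this (the receiver name is illustrative; note that signals only fire for saves that go through the ORM, so an UPDATE issued as raw SQL outside Django will not trigger it):

from django.db.models.signals import post_save
from django.dispatch import receiver

from your_app.models import Contact


@receiver(post_save, sender=Contact)
def set_age_on_update(sender, instance, created, **kwargs):
    if not created and instance.age != 25:
        # Use a queryset update so we don't call save() again and re-enter post_save
        Contact.objects.filter(pk=instance.pk).update(age=25)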
I have a problem with duplicate signals. I have looked up the relevant part of the Django docs and a similar question on Stack Overflow, but I still can't get it working right - i.e. the action that I have planned (creation of an ActivityLog entry) is actually happening 4 times :/
I have added the dispatch_uid and the problem still persists, so I guess I'm doing something wrong. Can you please point me to the error?
Here's my code:
signals.py
from django.db.models.signals import pre_save
from django.dispatch import receiver

from patient.models import Patient
from .models import ActivityLog


@receiver(pre_save, sender=Patient)
def create_new_patient(sender, instance, **kwargs):
    ActivityLog.objects.create(
        user=instance.created_by_user,
        event='created a new patient',
        patient_id=instance.patient_id
    )
and this is its usage in the patient.apps module:
from django.apps import AppConfig
from django.db.models.signals import pre_save

app_name = "patient"


class PatientConfig(AppConfig):
    name = 'patient'
    verbose_name = "Patients"

    def ready(self):
        from activity_logger.signals import create_new_patient
        print('Patient APP is ready!')
        pre_save.connect(create_new_patient, sender='patient.Patient', dispatch_uid='patient')
The print Patient APP is ready! does appear twice, and the object gets created 4 times, despite setting the dispatch_uid. What have I misunderstood?
The @receiver(signal, ...) decorator is a shortcut for signal.connect(...), so you indeed register your create_new_patient handler twice (once through @receiver when importing your signals module, the second time with pre_save.connect()).
Solution: in your AppConfig.ready() method, you should just import your app's signals.py module. This will trigger the registration of the handlers decorated with @receiver.
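For reference, a sketch of the corrected patient/apps.py following that advice (keeping the names from the question):

from django.apps import AppConfig


class PatientConfig(AppConfig):
    name = 'patient'
    verbose_name = "Patients"

    def ready(self):
        # Importing the module is enough: @receiver registers create_new_patient.
        # No explicit pre_save.connect() call here, so the handler is only bound once.
        import activity_logger.signals
        print('Patient APP is ready!')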
I have to retrieve M objects from a list of Q objects and then map the M objects to a User. To achieve this, I need to run the code inside a transaction and roll back the DB in case one of the M objects is not created:
def polls(request, template_name='polls/polls.html'):
    question_list = random.sample(list(Question.objects.all()), 3)
    try:
        with transaction.atomic():
            for question in question_list:
                UserToQuestion.objects.create(
                    user=request.user.id,
                    question=question.id
                )
    except IntegrityError:
        handle_exception()
How can I achieve this? How should the exception be handled? The Django documentation doesn't show a real example.
Is it also possible to perform this task during user registration, overriding the save method so that each registered user is mapped to N questions?
You need to use Django signals while doing user registration. Also, you do not need the with transaction.atomic(): part; you need to use bulk_create().
First of all, we create a signal handler in order to map the related questions to a user when the user is registered. The handler is triggered by post_save, and the callback function retrieves N random questions and maps them to the user. The signal must be connected; to do that, we use the @receiver decorator. We also use a transaction because of the bulk_create() call.
signals.py
import random

from django.contrib.auth.models import User
from django.db import transaction
from django.db.models.signals import post_save
from django.dispatch import receiver

from polls.models import Question, UserToQuestion


@receiver(post_save, sender=User)
@transaction.atomic  # keep atomic innermost so the connected handler runs inside a transaction
def sig_user_registered(instance, created, **kwargs):
    if not created:
        return  # only map questions when the user is first registered
    question_list = random.sample(list(Question.objects.all()), 3)
    UserToQuestion.objects.bulk_create([
        UserToQuestion(user=instance, question=question)
        for question in question_list
    ])
The signals module must be imported. To do that, we use the ready() method.
apps.py
from django.apps import AppConfig


class AccountConfig(AppConfig):
    name = 'account'

    def ready(self):
        import account.signals
Finally, we load the application
settings.py
INSTALLED_APPS = [
# Django apps
# ...
# Third-party apps
# ...
# Your apps
'account.apps.AccountConfig',
# ...
]
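As a quick sanity check (illustrative values, and assuming at least three Question rows exist), creating a user should now produce the mapped questions automatically:

from django.contrib.auth.models import User

from polls.models import UserToQuestion

user = User.objects.create_user(username='alice', password='s3cret')
print(UserToQuestion.objects.filter(user=user).count())  # expected: 3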
I'd like to show a notification to the user with some stats (e.g. how many items have been sold since the last time they logged in).
@receiver(user_logged_in)
def notify_user_on_login(sender, request, user, **kwargs):
    items = Item.objects.filter(status=Item.HISTORY_STATUS, seller=user, when_trading__gte=user.last_login)
However, by the time this handler runs, last_login has already been updated.
According to the source in django.contrib.auth, Django also connects this signal to a function that updates last_login:
user_logged_in.connect(update_last_login)
Is it possible to call my function BEFORE last_login is updated? Or to get the same result without adding a custom field or doing some strange magic?
The last_login field is also updated by a handler of that signal, which is registered and executed before yours. You might be able to solve your issue by moving your app above django.contrib.auth in INSTALLED_APPS.
Relying on signal handler order doesn't seem like a good idea, though, so I would probably replace Django's handler with your own:
from django.contrib.auth.models import update_last_login
from django.contrib.auth.signals import user_logged_in
from django.dispatch import receiver


def notify_user_on_login(user):
    items = Item.objects.filter(status=Item.HISTORY_STATUS, seller=user, when_trading__gte=user.last_login)


@receiver(user_logged_in)
def after_user_logged_in(sender, user, **kwargs):
    notify_user_on_login(user)
    update_last_login(sender, user, **kwargs)


user_logged_in.disconnect(update_last_login)
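One way to make sure this module is loaded exactly once at startup (the app and module names here are assumptions, following the same apps.py pattern used in the other answers) is to import it from your AppConfig:

from django.apps import AppConfig


class ShopConfig(AppConfig):
    name = 'shop'  # hypothetical app containing Item and the signals module above

    def ready(self):
        # Importing the module registers after_user_logged_in and runs
        # user_logged_in.disconnect(update_last_login) once.
        import shop.signals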
I am working on Django signals to handle data in Redis whenever any change happens in the Postgres database. But I am unable to send custom parameters to the signal receiver. I have gone through a lot of questions, but I am not able to understand how to send extra custom parameters to a signal receiver.
Usually I do,
@receiver(post_save, sender=post_data)
def addToRedis(sender, instance, **kwargs):
But I want to do,
@receiver(post_save, sender=post_data)
def addToRedis(sender, instance, extra_param=extra_param_data, **kwargs):
    # Get `extra_param`
Here, I want to read extra_param to store the data in Redis.
I am using Django Rest Framework, and post_save is triggered directly after serializer.save().
It would be great if someone could help me out with this.
You can send any additional parameters in a signal as keyword arguments:
@receiver(post_save, sender=post_data)
def addToRedis(sender, instance, **kwargs):
    # kwargs['extra_param']
How to send:
my_signal.send(sender=self.__class__, extra_param='...')
If you have no access to the signal's send function (e.g. REST framework internals), you can always use a custom signal.
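As an example, a minimal sketch of such a custom signal (the signal, receiver, and parameter names are made up for illustration):

import django.dispatch
from django.dispatch import receiver

# Hypothetical custom signal; not part of Django or DRF.
redis_sync = django.dispatch.Signal()


@receiver(redis_sync)
def add_to_redis(sender, instance, **kwargs):
    extra_param = kwargs.get('extra_param')
    # ...store `instance` and `extra_param` in Redis here...


# Send it yourself wherever you control the flow, e.g. right after serializer.save():
# redis_sync.send(sender=MyModel, instance=obj, extra_param='some-value')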
This answer is with respect to Django REST Framework.
In your views.py:
my_obj = Mymodel()
my_obj.your_extra_param = "Param Value"  # your_extra_param field (not defined in the model)
my_obj.save()

post_save.connect(rcver_func, sender=Mymodel, weak=False)
and define a signals.py with the following:
from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import Mymodel


@receiver(post_save, sender=Mymodel)
def rcver_func(sender, instance, created, **kwargs):
    if created:
        received_param_value = instance.your_extra_param  # same field as declared in views.py