I have a class Event and a class Participant, which has a foreign key to Event.
In Event I have:
model_changed_stamp = models.DateTimeField()
Many participants take part in one event.
When any instance of Event changes, or a new one is created, I would like the value in model_changed_stamp to be updated. In fact I have many other classes like Building, which also have a foreign key to Event, and I would like to keep track of their changes too.
I came up with the idea of using an instance method on Event. I tried:

    def model_changed(self):
        value = getattr(self, 'model_changed_stamp')
        value = datetime.now()
        setattr(self, 'model_changed_stamp', value)

and then in save() of Participant or Building I would fire self.event.model_changed().
I would like to know how to do it right. Should I use signals?
UPDATE 0:
According to some reading (e.g. Two Scoops of Django), the use of signals is overkill for this case.
UPDATE 1: Following Daniel Roseman's suggestions, in the Participant class's save() method I try:

    def save(self, *args, **kwargs):
        if self.id is None:
            self.event.model_changed()
        super(Participant, self).save(*args, **kwargs)
In Event I defined model_changed as follows:

    def model_changed(self):
        self.model_changed_stamp = datetime.now()

And it is not working: the date is not updated when it should be, i.e. when a new Participant is created.
UPDATE 2: WORKING!!! ;-)
after adding self.save() as the last line of the model_changed method.
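For reference, the shape of the working pattern can be sketched without the ORM (plain-Python stand-ins for the question's models; save() is a no-op where Django would hit the database, and the hard-coded id is only an illustration):

```python
from datetime import datetime

class Event:
    def __init__(self):
        self.model_changed_stamp = None

    def model_changed(self):
        # touch the stamp, then persist it (the fix from UPDATE 2)
        self.model_changed_stamp = datetime.now()
        self.save()

    def save(self):
        pass  # Django's Model.save() would write to the database here

class Participant:
    def __init__(self, event):
        self.id = None  # no primary key yet, like an unsaved Django instance
        self.event = event

    def save(self):
        if self.id is None:
            # a brand-new Participant marks its Event as changed
            self.event.model_changed()
        self.id = 1  # Django would assign the real primary key here
```

Any other related model (Building, etc.) can hook into its own save() the same way.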
Why not just set it directly? Why all this mucking about with getattr and setattr?
    def model_changed(self):
        self.model_changed_stamp = datetime.datetime.now()
An even better solution is to define the field with auto_now=True, so it will be automatically updated with the current time whenever you save.
Yes, signals are a good tool for your task.

You can write:

    model_changed_stamp = models.DateTimeField(auto_now=True)

(check the docs for the auto_now feature), and then:

    from django.db.models.signals import post_save
    from django.dispatch import receiver

    @receiver(post_save)
    def post_save_event_handler(sender, instance, **kwargs):
        if sender not in [Building, Participant, ...]:
            return
        instance.event.save()  # updates model_changed_stamp
Let's imagine a simple Food model with a name and an expiration date, my goal is to auto delete the object after the expiration date is reached.
I want to delete objects from the database (PostgreSQL in my case) just after exp_date is reached, not filter by exp_date__gt=datetime.datetime.now() in my code and then have cron/celery periodically run a script that filters by exp_date__lt=datetime.datetime.now() and deletes.
    class Food(models.Model):
        name = models.CharField(max_length=200)
        exp_date = models.DateTimeField()
* I could do it with a vanilla view when the object is accessed via an endpoint, or even with DRF, like so:
    class GetFood(APIView):
        def check_date(self, food):
            """
            checking expiration date
            """
            if food.exp_date <= datetime.datetime.now():
                food.delete()
                return False

        def get(self, request, *args, **kwargs):
            id = self.kwargs["id"]
            if Food.objects.filter(pk=id).exists():
                food = Food.objects.get(pk=id)
                if self.check_date(food) == False:
                    return Response({"error": "not found"}, status.HTTP_404_NOT_FOUND)
                else:
                    name = food.name
                    return Response({"food": name}, status.HTTP_200_OK)
            else:
                return Response({"error": "not found"}, status.HTTP_404_NOT_FOUND)
But it would not delete the object if no one tried to access it via an endpoint.

* I could also set up a cronjob with a script that queries the database for every Food object whose expiration date is earlier than now and then deletes them, or even set up Celery. It would indeed only need to run once a day if I were using DateField, but as I am using DateTimeField it would need to run every minute (every second for the needs of my project).
* I've also thought of a fancy workaround with a post_save signal and a while loop, like:
    @receiver(post_save, sender=Food)
    def delete_after_exp_date(sender, instance, created, **kwargs):
        if created:
            while instance.exp_date > datetime.datetime.now():
                pass
            else:
                instance.delete()
I don't know if it'd work, but it seems very inefficient (if someone could please confirm).

Voila, thanks in advance if you know some ways or tools to achieve what I want to do, and thanks for reading!
I would advise not to delete the objects, or at least not eagerly. Scheduling tasks is cumbersome. Even if you manage to schedule this, the moment the items are actually removed will always be slightly off the moment you scheduled. It also means you will make an extra query per element, and not remove the items in bulk. Furthermore, scheduling is inherently more complicated: it means you need something to persist the schedule. If the expiration date of some food is later changed, extra logic is required to cancel the current schedule and create a new one. It also makes the system less reliable: besides the webserver, the scheduler daemon has to run. If the daemon fails for some reason, expired food will no longer be removed.
Therefore it might be better to combine filtering the records, such that you only retrieve food that has not expired, with removing expired Food at some regular interval. You can easily filter the objects with:
    from django.db.models.functions import Now

    Food.objects.filter(exp_date__gt=Now())
to retrieve Food that has not expired. To make this more efficient, you can add a database index on the exp_date field:
    class Food(models.Model):
        name = models.CharField(max_length=200)
        exp_date = models.DateTimeField(db_index=True)
If you need to filter often, you can even work with a Manager [Django-doc]:
    from django.db.models.functions import Now

    class FoodManager(models.Manager):
        def get_queryset(*args, **kwargs):
            return super().get_queryset(*args, **kwargs).filter(
                exp_date__gt=Now()
            )
    class Food(models.Model):
        name = models.CharField(max_length=200)
        exp_date = models.DateTimeField(db_index=True)

        objects = FoodManager()
Now if you work with Food.objects, you automatically filter out all Food that has expired.

Besides that, you can write a script that, for example, runs daily to remove the Food objects that have expired:
    from django.db.models.functions import Now

    Food._base_manager.filter(exp_date__lte=Now()).delete()
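The purge condition itself can be sanity-checked outside the database (a minimal sketch; the helper name is an assumption, mirroring the exp_date__lte=Now() lookup):

```python
from datetime import datetime, timedelta

def is_expired(exp_date, now=None):
    # mirrors the queryset lookup exp_date__lte=Now()
    return exp_date <= (now or datetime.now())

# the periodic job then reduces to a single bulk delete:
# Food._base_manager.filter(exp_date__lte=Now()).delete()
```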
Update to the accepted answer: you may run into super(): no arguments if you define the method without an explicit self. I found this answer helpful.

As per PEP 3135, which introduced the "new super":
The new syntax:

    super()

is equivalent to:

    super(__class__, <firstarg>)

where __class__ is the class that the method was defined in, and <firstarg> is the first parameter of the method (normally self for instance methods, and cls for class methods).
While super is not a reserved word, the parser recognizes the use of super in a method definition and only passes in the class cell when this is found. Thus, calling a global alias of super without arguments will not necessarily work.
As such, you will still need to include self:
    class FoodManager(models.Manager):
        def get_queryset(self, *args, **kwargs):
            return super().get_queryset(*args, **kwargs).filter(
                exp_date__gt=Now()
            )
Just something to keep in mind.
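A tiny framework-free demonstration of the point: zero-argument super() works when the method is defined inside the class body (where the compiler provides the __class__ cell) and takes self as its first parameter:

```python
class Base:
    def get_queryset(self):
        return ["a", "b"]

class Filtered(Base):
    def get_queryset(self, *args, **kwargs):
        # zero-argument super() is fine here: the method is defined in the
        # class body and self is an explicit first parameter
        return [x for x in super().get_queryset() if x != "b"]
```

Writing the signature as def get_queryset(*args, **kwargs), with no explicit self, is what triggers the RuntimeError: super(): no arguments mentioned above.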
Is it possible to add self, I mean the current object, to its ManyToManyField?
    class City(models.Model):
        name = models.CharField(max_length=80)
        country = models.ForeignKey('Country')
        close_cities = models.ManyToManyField('City', blank=True, related_name='close_cities_set')
If I create, let's say, x = City.objects.create(...), I want x to be part of close_cities by default.

I can't find anything related to this problem. I tried to override the create() method, but it did not work.

After some experimenting, I decided to create a signal which adds the city after an instance is created. Unfortunately this does not work, and I can't figure out why. The signal is being called, and the condition if created is True (checked).
    @receiver(post_save, sender=myapp_models.City)
    def add_self_into_many_to_many_field(sender, instance, created, **kwargs):
        if created:
            instance.close_cities.add(instance)
            instance.save()
Do you know where is the problem?
In this case a pre_save signal will be the better solution.

In your solution, city.save() calls add_self_into_many_to_many_field, which calls instance.save(), which calls add_self_into_many_to_many_field again, and so on...
    @receiver(pre_save, sender=myapp_models.City)
    def add_self_into_many_to_many_field(sender, instance, **kwargs):
        if instance.pk is None:
            instance.close_cities.add(instance)
I have a model:

    class EventArticle(models.Model):
        event = models.ForeignKey(Event, related_name='article_event_commentary')
        author = models.ForeignKey(Person)

and another model:

    class Event(models.Model):
        attendies = models.ManyToManyField(Person, related_name="people")
How do I restrict an author to only objects that are also attendies?
Typically, the ForeignKey limit_choices_to argument is your friend in situations like this. (See https://docs.djangoproject.com/en/1.8/ref/models/fields/#django.db.models.ForeignKey.limit_choices_to)
You could restrict the author to a list of users who have attended any event. However, even doing that is far from ideal:
    # don't try this at home, this will be excruciatingly slow...
    def author_options():
        all_attendee_ids = []
        for e in Event.objects.all():
            for a in e.attendies.all():
                all_attendee_ids.append(a.id)
        return {'id__in': set(all_attendee_ids)}
    # requires Django 1.7+
    class EventArticle(models.Model):
        event = models.ForeignKey(Event, related_name='article_event_commentary')
        author = models.ForeignKey(Person, limit_choices_to=author_options)
However, even though you didn't explicitly state it in your question, I suspect you want authors to be limited to a set of attendees from that particular event, i.e. the same event as specified in the EventArticle model's event field.
In which case, this brings about two problems which I don't believe can be solved cleanly:
you can't pass parameters (i.e. the event ID) when using limit_choices_to, and
until the EventArticle model has a value for event, you wouldn't know which event was in question.
As using limit_choices_to isn't going to work here, there probably isn't a way to fix this cleanly. You could add a method to your EventArticle model which gives you a list of potential authors, like so...
    class EventArticle(models.Model):
        event = models.ForeignKey(Event, related_name='article_event_commentary')
        author = models.ForeignKey(Person)

        def author_options(self):
            if self.event:
                return self.event.attendies.all()
            else:
                return []
...but you will still need to do some legwork to make those options available to the UI so the user can select them. Without knowing more about your setup, I'd be guessing if I tried to answer that part.
You can override save() of the EventArticle model to enforce it.
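A framework-free sketch of that save() override (class and exception names are plain-Python stand-ins for the Django ones, and attendies here is a plain list standing in for the ManyToMany manager; the field name is kept as in the question):

```python
class NotAnAttendee(Exception):
    """Stand-in for Django's ValidationError."""

class Event:
    def __init__(self, attendies):
        self.attendies = attendies  # in Django: a ManyToMany manager

class EventArticle:
    def __init__(self, event, author):
        self.event = event
        self.author = author

    def save(self):
        # enforce the invariant before persisting
        if self.author not in self.event.attendies:
            raise NotAnAttendee("author must be an attendee of the event")
        # Django's super().save(*args, **kwargs) would run here
```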
I have a Django model where a lot of fields are choices, so I had to write a lot of is_something properties on the class to check whether the instance value is equal to some choice value. Something along the lines of:
    class MyModel(models.Model):
        some_choicefield = models.IntegerField(choices=SOME_CHOICES)

        @property
        def is_some_value(self):
            return self.some_choicefield == SOME_CHOICES.SOME_CHOICE_VALUE

        # a lot of these...
In order to automate this and spare me a lot of redundant code, I thought about patching the instance at creation, with a function that adds a bunch of methods that do the checks.
The code became as follows (I'm assuming there's a "normalize" function that makes the label of the choice a usable function name):
    from functools import partial

    def dynamic_add_checks(instance, field):
        if hasattr(field, 'choices'):
            choices = getattr(field, 'choices')
            for (value, label) in choices:
                # bind value through a default argument to avoid the
                # late-binding closure pitfall
                def fun(instance, value=value):
                    return getattr(instance, field.name) == value
                normalized_func_name = "is_%s_%s" % (field.name, normalize(label))
                setattr(instance, normalized_func_name, partial(fun, instance))

    class MyModel(models.Model):
        some_choicefield = models.IntegerField(choices=SOME_CHOICES)

        def __init__(self, *args, **kwargs):
            super(MyModel, self).__init__(*args, **kwargs)
            dynamic_add_checks(self, self._meta.get_field('some_choicefield'))
Now, this works but I have the feeling there is a better way to do it. Perhaps at class creation time (with metaclasses or in the new method)? Do you have any thoughts/suggestions about that?
Well, I am not sure how to do this your way, but in such cases I think the way to go is to simply create a new model where you keep your choices, and change the field to a ForeignKey. This is simpler to code and manage.

You can find a lot of information at a basic level in the Django docs: Models: Relationships. In there, there are many links to follow, expanding on various topics. Beyond that, I believe it just needs a bit of imagination, and maybe some trial and error in the beginning.

I came across a similar problem where I needed to write a large number of properties at runtime to provide backward compatibility while changing model fields. There are two standard ways to handle this:

First, use a custom metaclass in your models which inherits from the models' default metaclass.

Second, use class decorators. Class decorators sometimes provide an easy alternative to metaclasses, unless you have to do something before the creation of the class, in which case you have to go with metaclasses.
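A minimal, framework-free sketch of the class-decorator route (the decorator name, choice labels, and Article class are all assumptions for illustration; the same idea ports to a Django model by reading field.choices):

```python
def add_choice_checks(field_name, choices):
    """Class decorator: add an is_<field>_<label> property per choice."""
    def decorate(cls):
        for value, label in choices:
            prop_name = "is_%s_%s" % (field_name, label.lower())
            # bind value through a default argument to dodge late binding
            setattr(cls, prop_name,
                    property(lambda self, v=value: getattr(self, field_name) == v))
        return cls
    return decorate

STATUS_CHOICES = [(1, "draft"), (2, "published")]

@add_choice_checks("status", STATUS_CHOICES)
class Article:
    def __init__(self, status):
        self.status = status
```

Because the properties are attached to the class, every instance gets them for free, unlike the per-instance patching in the question.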
I bet you know that Django fields with choices provided will automatically have a display function.
Say you have a field defined like this:
    category = models.SmallIntegerField(choices=CHOICES)
You can simply call a function called get_category_display() to access the display value. Here is the Django source code of this feature:
https://github.com/django/django/blob/baff4dd37dabfef1ff939513fa45124382b57bf8/django/db/models/base.py#L962
https://github.com/django/django/blob/baff4dd37dabfef1ff939513fa45124382b57bf8/django/db/models/fields/__init__.py#L704
So we can follow this approach to achieve our dynamically set property goal.
Here is my scenario, a little bit different from yours but down to the end it's the same:
I have two classes, Course and Lesson. Lesson has a ForeignKey field to Course, and I want to add a property named cached_course to Lesson which tries to get the Course from the cache first, falling back to the database on a cache miss.
Here is a typical solution:
    from django.db import models

    class Course(models.Model):
        # some fields
        pass

    class Lesson(models.Model):
        course = models.ForeignKey(Course)

        @property
        def cached_course(self):
            key = key_func()
            course = cache.get(key)
            if not course:
                course = get_model_from_db()
                cache.set(key, course)
            return course
It turns out I have many ForeignKey fields to cache, so here is the code, following the approach of Django's get_FIELD_display feature:
    from django.db import models
    from django.utils.functional import curry

    class CachedForeignKeyField(models.ForeignKey):
        def contribute_to_class(self, cls, name, **kwargs):
            super(CachedForeignKeyField, self).contribute_to_class(cls, name, **kwargs)
            setattr(cls, "cached_%s" % self.name,
                    property(curry(cls._cached_FIELD, field=self)))

    class BaseModel(models.Model):
        def _cached_FIELD(self, field):
            value = getattr(self, field.attname)
            Model = field.related_model
            return cache.get_model(Model, pk=value)

        class Meta:
            abstract = True

    class Course(BaseModel):
        # some fields
        pass

    class Lesson(BaseModel):
        course = CachedForeignKeyField(Course)
By customizing CachedForeignKeyField and overriding the contribute_to_class method, along with a BaseModel class that has a _cached_FIELD method, every CachedForeignKeyField automatically gets a matching cached_FIELD property.
Too good to be true, bravo!
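The core trick of the answer above, isolated in plain Python (functools.partial is the modern stand-in for the long-deprecated django.utils.functional.curry; class and helper names here are illustrative, and the cache lookup is replaced by a canned value):

```python
from functools import partial

class BaseModel:
    def _cached_FIELD(self, field):
        # the real code would consult the cache before hitting the database
        return "value-of-%s" % field

def attach_cached_property(cls, field_name):
    # mirrors contribute_to_class: attach a cached_<field> property whose
    # getter is the shared _cached_FIELD with the field pre-bound
    setattr(cls, "cached_%s" % field_name,
            property(partial(cls._cached_FIELD, field=field_name)))

class Lesson(BaseModel):
    pass

attach_cached_property(Lesson, "course")
```

Accessing lesson.cached_course then calls the partial with the instance as its first argument, exactly how property getters receive self.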
When using a Django signal like post_save you can prevent it from firing when an object is first created by doing something like:
    @receiver(post_save, sender=MyModel)
    def my_signal(sender, instance, created, **kwargs):
        if created:
            pass  # Do nothing, as the item is new.
        else:
            logger.info("The item changed - %s" % (instance))
However, ManyToMany relations are applied after an item is initially created, so no such argument is passed in, making the signal difficult to suppress in these cases.
    @receiver(m2m_changed, sender=MyModel.somerelation.through)
    def my_signal(sender, instance, action, **kwargs):
        if __something__:  # What goes here?
            pass  # Do nothing, as the item is new.
        else:
            logger.info("The item changed - %s" % (instance))
Is there an easy way to suppress an m2m_changed signal when its being done on an object that has just been created?
I think there is no easy way to do that.
As the Django docs say, you cannot associate an item with a relation until it has been saved. Example from the docs:
    >>> a1 = Article(headline='...')
    >>> a1.publications.add(p1)
    Traceback (most recent call last):
    ...
    ValueError: 'Article' instance needs to have a primary key value before a many-to-many relationship can be used.
    # should save Article first
    >>> a1.save()
    # the statement below cannot know whether it follows a creation or not
    >>> a1.publications.add(p1)
It is logically impossible for a relation record to know, without external information, whether it is being added to a just-created item or to an item that has existed for some time.
Some workarounds I came up with:
Solution 1: add a DateTimeField in MyModel to record the creation time. The m2m_changed handler then uses the creation time to check when the item was created. This works in practice in some cases, but cannot guarantee correctness.
Solution 2: add a created attribute in MyModel, either in a post_save handler or elsewhere in your code. Example:
    @receiver(post_save, sender=Pizza)
    def pizza_listener(sender, instance, created, **kwargs):
        instance.created = created

    @receiver(m2m_changed, sender=Pizza.toppings.through)
    def topping_listener(sender, instance, action, **kwargs):
        if action != 'post_add':
            # as an example, only handle the post_add action here
            return
        if getattr(instance, 'created', False):
            print('toppings added to freshly created Pizza')
        else:
            print('toppings added to modified Pizza')
        instance.created = False
Demo:

    p1 = Pizza.objects.create(name='Pizza1')
    p1.toppings.add(Topping.objects.create())
    >>> toppings added to freshly created Pizza
    p1.toppings.add(Topping.objects.create())
    >>> toppings added to modified Pizza

    p2 = Pizza.objects.create(name='Pizza2')
    p2.name = 'Pizza2-1'
    p2.save()
    p2.toppings.add(Topping.objects.create())
    >>> toppings added to modified Pizza
But be careful using this solution. Since the created attribute is assigned on the Python instance, not saved in the DB, things can go wrong, as in:
    p3 = Pizza.objects.create(name='Pizza3')
    p3_1 = Pizza.objects.get(name='Pizza3')
    p3_1.toppings.add(Topping.objects.create())
    >>> toppings added to modified Pizza
    p3.toppings.add(Topping.objects.create())
    >>> toppings added to freshly created Pizza
That's all for the answer. And, caught you here! I'm zhang-z from the GitHub django-notifications group :)
@ZZY's answer basically helped me realise that this wasn't possible without storing additional fields. Fortunately, I'm using django-model-utils, which includes a TimeStampedModel with a created field.

Provided the delta is small enough, it was relatively easy to check against the creation time when catching the signal.
    @receiver(m2m_changed, sender=MyModel.somerelation.through)
    def my_signal(sender, instance, action, **kwargs):
        if action in ['post_add', 'post_remove', 'post_clear']:
            created = instance.created >= timezone.now() - datetime.timedelta(seconds=5)
            if created:
                logger.info("The item changed - %s" % (instance))
For an easier and shorter way of checking whether the object was just created, use the _state.adding attribute:
    def m2m_change_method(sender, **kwargs):
        instance = kwargs.pop('instance', None)
        if instance:
            if instance._state.adding:  # created object
                pk_set = list(kwargs.pop('pk_set'))  # ids of objects added to the m2m relation
            else:
                # do something if the instance was not newly created or has changed;
                # to check whether the m2m objects are new, query pk_set with exists()
                pass

    m2m_changed.connect(m2m_change_method, sender=YourModel.many_to_many_field.through)