Is it possible to add self (I mean the current object) to its own ManyToManyField?
class City(models.Model):
    name = models.CharField(max_length=80)
    country = models.ForeignKey('Country')
    close_cities = models.ManyToManyField('City', blank=True, related_name='close_cities_set')
If I create, let's say, x = City.objects.create(...), I want x to be part of close_cities by default.
I can't find anything related to this problem. I tried to override the create() method but it did not work.
After more attempts, I decided to create a signal which adds the city after the instance is created. Unfortunately this does not work either, and I can't figure out why. The signal is being called, and the condition if created is True (checked).
@receiver(post_save, sender=myapp_models.City)
def add_self_into_many_to_many_field(sender, instance, created, **kwargs):
    if created:
        instance.close_cities.add(instance)
        instance.save()
Do you know where the problem is?
In this case a pre_save signal will be a better solution.
In your solution, city.save() calls add_self_into_many_to_many_field, instance.save() calls add_self_into_many_to_many_field again, and so on...
@receiver(pre_save, sender=myapp_models.City)
def add_self_into_many_to_many_field(sender, instance, **kwargs):
    if instance.pk is None:
        instance.close_cities.add(instance)
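For intuition, the flow can be sketched without Django at all. Below, plain Python stand-ins (a fake pk, a set standing in for the ManyToManyField's through table) show why the post_save route needs no extra save(): the m2m add only writes the relation, not the City row itself. All names here are illustrative, not Django APIs.

```python
# Toy stand-ins (not Django): a set plays the role of the m2m through table.
class City:
    def __init__(self, name):
        self.name = name
        self.pk = None
        self.close_cities = set()  # stands in for the ManyToManyField

    def save(self):
        created = self.pk is None
        if created:
            self.pk = id(self)     # fake primary-key assignment
        post_save_handler(self, created)

def post_save_handler(instance, created):
    if created:
        # adding to the m2m only records the relation; no second save() needed
        instance.close_cities.add(instance)

x = City("Prague")
x.save()
print(x in x.close_cities)  # True
```

Because the add runs only when created is True, a later x.save() does not repeat it.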
The problem is that after saving a model instance with a new m2m_field value, I want to automatically add some more related objects to it.
class MyModel(models.Model):
    m2m_field = models.ManyToManyField("app.RelatedModel")
@receiver(models.signals.m2m_changed, sender=MyModel.m2m_field.through)
def m2m_field_changed(sender, instance, **kwargs):
    instance.m2m_field.add(related_object_instance)
That obviously results in an infinite loop, because after adding the instance to the m2m_field, the receiver fires again, and so on. Is there a proper way to do it?
Thanks for any help.
You have to check first whether the related object has already been added or not:
@receiver(models.signals.m2m_changed, sender=MyModel.m2m_field.through)
def m2m_field_changed(sender, instance, **kwargs):
    if related_object_instance not in instance.m2m_field.all():
        instance.m2m_field.add(related_object_instance)
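The termination argument can be checked with a toy dispatcher (plain Python, no Django; the callbacks list stands in for m2m_changed): an unconditional add in the handler would recurse forever, while the membership guard stops after one extra add.

```python
# Toy signal: every add() notifies all registered callbacks, like m2m_changed.
callbacks = []

class M2MField:
    def __init__(self):
        self.items = set()

    def add(self, item):
        self.items.add(item)
        for cb in callbacks:          # fires after every add, like m2m_changed
            cb(self)

EXTRA = "extra-related-object"        # stands in for related_object_instance

def on_change(field):
    # Without this guard, add() would trigger on_change() again, forever.
    if EXTRA not in field.items:
        field.add(EXTRA)

callbacks.append(on_change)
field = M2MField()
field.add("first")
print(sorted(field.items))  # ['extra-related-object', 'first']
```

In real Django code the same effect is usually achieved by inspecting the signal's action keyword, but the guard above is the shape of the answer's fix.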
Suppose I already have a created instance of a Django model. I want to get another instance that is its duplicate in the database. Is there a universal way to do this without finding out which unique index is responsible for the duplication?
class MyModel(models.Model):
    ...

instance = MyModel(...)
print(instance.id)  # None
...
duplicate = get_duplicate(instance)  # what should I type here instead of get_duplicate?
print(duplicate.id)  # Some ID in DB
I want the function get_duplicate to not depend on the internal structure of the model. Also I don't want to modify the model.
For example, if I need to find out whether a duplicate exists, I can call instance.save(): in case of an IntegrityError there is a duplicate. But how do I find out which one?
When you instantiate a model as MyModel(...), you get an unsaved instance. To propagate it to the database, you have to call .save() on it, at which point instance.id will be set to something. You could also use MyModel.objects.create(...) as a shortcut.
Now, to answer the question: to duplicate a record you already have in the database, set its id to None and save it again.
instance = MyModel.objects.get(id=1)
instance.id = None
instance.save()
print(instance.id)  # 2
If I understand your question correctly, you want .save() to create two database rows instead of one? I don't understand why you'd want that, or how you'd make it useful, but you'd do it by overriding .save() on your model:
class MyModel(models.Model):
    ...

    def save(self, *args, **kwargs):
        super().save(*args, **kwargs)  # first row; note save() returns None
        self.pk = None                 # force an INSERT on the next save
        super().save(*args, **kwargs)  # second row
As for finding a duplicate instance, this is much more difficult. You're going to have to decide what makes an instance a "duplicate". For a user model, for instance, maybe it's a duplicate if only the email address is the same; but for a transaction instance, EVERY field has to be the same.
As this is inextricably linked to the model's type, you will want to put this on the model itself. I'll write a toy example below:
class MyModel(models.Model):
    a = models.CharField(unique=True, ...)
    b = models.CharField(unique=True, ...)
    c = models.CharField(...)

    def get_duplicates(self):
        return type(self).objects.filter(
            a=self.a,
            b=self.b,
        )
In this example, a and b must match, but c doesn't have to.
You've already defined what makes a model a "duplicate" with your unique and unique_together keys, so your .get_duplicates() method should be informed by those.
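The filtering idea is easy to see without a database. Here plain dicts stand in for model rows and UNIQUE_FIELDS for the columns covered by unique constraints (all names are illustrative, not part of any API):

```python
# Fields that define "duplicate" (stand-in for unique/unique_together columns).
UNIQUE_FIELDS = ("a", "b")

def get_duplicates(instance, rows):
    """Return every row matching `instance` on all unique fields."""
    return [
        row for row in rows
        if all(row[field] == instance[field] for field in UNIQUE_FIELDS)
    ]

rows = [
    {"a": "x", "b": "y", "c": 1},
    {"a": "x", "b": "y", "c": 2},  # same (a, b): a duplicate, despite c
    {"a": "x", "b": "z", "c": 1},  # different b: not a duplicate
]
print(len(get_duplicates({"a": "x", "b": "y", "c": 3}, rows)))  # 2
```

The ORM version does the same thing in one query: filter on the unique columns and ignore the rest.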
Here is the classic mixin used to know when a Django object is created or modified:
class TimeStampable(models.Model):
    created = models.DateTimeField(auto_now_add=True, editable=False)
    modified = models.DateTimeField(auto_now=True)

    class Meta:
        abstract = True
The problem (it's not really a problem for most of us, I guess) is that the created and modified fields are not equal at first creation (there is a tiny delta between them).
How would you improve this mixin to solve that specific issue?
I checked django-model-utils source code but found nothing.
I guess we would need to override the __init__ method?
If you want two timezone objects to be equal, they must be created at the exact same instant in time. That is virtually impossible, especially when you make the calls to timezone.now() in series.
Essentially what happens at TimeStampable object creation time is that:
created gets a timezone.now() value.
modified gets a timezone.now() value created a tiny fraction of a second after created's one.
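The delta is easy to demonstrate with the standard library alone (datetime.now() standing in for timezone.now()): two calls in series produce two distinct values, while one captured value reused for both fields is trivially equal to itself.

```python
from datetime import datetime

# Two separate calls, as auto_now_add and auto_now effectively make:
a = datetime.now()
b = datetime.now()
# a and b differ by a tiny delta

# One captured value, assigned to both fields:
now = datetime.now()
created = now
modified = now
print(created == modified)  # True
```

This single-capture pattern is exactly what the save() override below applies to the created and modified fields.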
We can override the model's .save() method to solve this problem:
We will use the model's _state.adding attribute (_state is an instance of ModelState) to determine whether an object is not yet saved (newly created).
If it is newly created, we need to take one (and only one) value of timezone.now() and assign it to both the created and modified fields.
If the object has just been modified, we must not forget to assign a fresh timezone.now() value to the modified field.
class MyTimeStampableModel(TimeStampable):
    ...

    def save(self, *args, **kwargs):
        timezone_now = timezone.now()
        if self._state.adding:
            self.created = timezone_now
        self.modified = timezone_now
        super(MyTimeStampableModel, self).save(*args, **kwargs)

(For the manual assignments to take effect, drop auto_now_add and auto_now from the mixin's field definitions; those options overwrite assigned values on every save.)
models.py:
class Car(models.Model):
    ...

class Pictures(models.Model):
    car = models.ForeignKey(Car, related_name='pictures')
    width = models.PositiveIntegerField(editable=False, default=780)
    height = models.PositiveIntegerField(editable=False, default=585)
    image = models.ImageField(upload_to=get_file_path, max_length=64, height_field='height', width_field='width')

    def __unicode__(self):
        return str(self.id)

    def delete(self, *args, **kwargs):
        storage, path = self.image.storage, self.image.path
        super(Pictures, self).delete(*args, **kwargs)
        storage.delete(path)
It works nicely: when I delete one picture from the admin panel, that picture is automatically deleted from disk.
But when I delete a Car object through the admin panel, its images are not removed from disk.
How to fix that?
Thanks!
I'm sure the problem here is that the ORM cascades the deletion of the related objects in bulk, meaning your delete method won't get called.
You could probably just apply the same technique you used here and do:
class Car(models.Model):
    ...

    def delete(self, *args, **kwargs):
        for picture in self.pictures.all():
            storage, path = picture.image.storage, picture.image.path
            storage.delete(path)
        super(Car, self).delete(*args, **kwargs)
However, you are better off using signals instead of overriding the delete methods: https://docs.djangoproject.com/en/dev/ref/signals/#post-delete
Note that the delete() method for an object is not necessarily called when deleting objects in bulk using a QuerySet. To ensure customized delete logic gets executed, you can use pre_delete and/or post_delete signals.
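The mechanics behind that warning can be sketched in plain Python (no Django): a list of receivers stands in for the post_delete signal, and the "bulk delete" never calls any per-object delete() method, yet every receiver still runs once per row, so the file cleanup still happens.

```python
import os
import tempfile

# Toy post_delete machinery: Django sends this signal for each collected row
# even when rows are deleted in bulk.
post_delete_receivers = []

class Picture:
    def __init__(self, path):
        self.path = path

def bulk_delete(pictures):
    # like QuerySet.delete(): no Picture.delete() call, but signals still fire
    for picture in pictures:
        for receiver in post_delete_receivers:
            receiver(picture)

def remove_file(picture):
    # the custom cleanup, moved out of Pictures.delete() into a receiver
    if os.path.exists(picture.path):
        os.remove(picture.path)

post_delete_receivers.append(remove_file)

tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.close()
picture = Picture(tmp.name)
bulk_delete([picture])
print(os.path.exists(picture.path))  # False
```

In real Django the equivalent is a post_delete receiver with sender=Pictures that deletes instance.image from its storage.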
I started looking into this and found some interesting things regarding the admin and deletion of objects.
When deleting an object from the admin, the following function is called,
django/contrib/admin/options.py -> delete_model()
which in turn calls obj.delete() with obj being the current object being deleted.
The delete method for an object then runs the following code,
collector = Collector(using=using)
collector.collect([self])
collector.delete()
The collector object now has an attribute 'data' which contains all of the related objects.
When collector.delete() runs, it executes query.delete_batch(pk_list, self.using), which does a bulk deletion using pk_list, a list of primary keys of the related objects being deleted. The bulk deletion in turn doesn't call each related object's delete method.
The good thing here is that the pre_delete and post_delete signals do get called for all related objects, so we could move your custom deletion code into either a pre_delete or post_delete signal for the Pictures model.
This should work, I'm thinking, but I haven't had a chance to test it.
I have a class Event and a class Participant, which has a foreign key to Event.
In Event I have:
model_changed_stamp = models.DateTimeField()
Many participants take part in one event.
When any instance of Event changes, or a new one is created, I would like the value in model_changed_stamp to be updated. In fact I have many other classes, like Building, which also have a foreign key to Event, and I would like to keep track of their changes too.
I came up with the idea of using an instance method on Event. I tried:
def model_changed(self):
    value = getattr(self, 'model_changed_stamp')
    value = datetime.now()
    setattr(self, 'model_changed_stamp', value)
and then in the save() of Participant or Building I would fire self.event.model_changed().
I would like to know how to do it RIGHT. Should I use signals?
UPDATE 0:
According to some reading (e.g. Two Scoops of Django), use of signals is overkill for this case.
UPDATE 1: Following the suggestions of Daniel Roseman, in the Participant class's save() method I try:

def save(self, *args, **kwargs):
    if self.id is None:
        self.event.model_changed()
    super(Participant, self).save(*args, **kwargs)
In Event I defined model_changed as follows:

def model_changed(self):
    self.model_changed_stamp = datetime.now()
    self.save()
And it was not working: the date was not updated when it should be, i.e. when a new Participant was created.
UPDATE 2: WORKING!!! ;-)
...after adding self.save() as the last line of the model_changed method.
Why not just set it directly? Why all this mucking about with getattr and setattr?
def model_changed(self):
    self.model_changed_stamp = datetime.datetime.now()
An even better solution is to define the fields with auto_now=True, so they will be automatically updated with the current time whenever you save.
Yeah, signals are a good tool for your task.
You can write:
model_changed_stamp = models.DateTimeField(auto_now=True)
Check the docs for the auto_now feature.
And then:
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save)
def post_save_event_handler(sender, instance, **kwargs):
    if sender not in [Building, Participant, ...]:
        return
    instance.event.save()  # updates model_changed_stamp
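The touch-the-parent pattern this receiver implements can be shown with stdlib stand-ins (datetime.now() in place of auto_now; all class names here are illustrative, not the real models):

```python
from datetime import datetime

class Event:
    def __init__(self):
        self.model_changed_stamp = None

    def save(self):
        # auto_now=True behaves like this: refresh the stamp on every save
        self.model_changed_stamp = datetime.now()

class Participant:
    def __init__(self, event):
        self.event = event

    def save(self):
        self.event.save()  # mirrors the handler's instance.event.save()

event = Event()
Participant(event).save()
print(event.model_changed_stamp is not None)  # True
```

Saving any child simply re-saves the parent, and auto_now does the timestamp bookkeeping.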