Is there any way to make classes always inherit from some base class - python

In my application I have a requirement to keep logs of all model changes and deletes.
So I have created a base class, Audit, and extended all my model classes from it.
I have overridden its save and delete methods so that the old values are also kept whenever an update happens.
I want to know whether there is a better way of doing this than extending all classes from a base class, or whether this approach is all right.

For this use case, you may be able to write a generic function that is hooked up with Django signals:
https://docs.djangoproject.com/en/dev/topics/signals/
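A minimal sketch of that idea (the AuditLog model and the concrete models being audited are assumptions, not from the question); the point is that one generic handler can be connected to any model, so no common base class is required:

from django.db import models
from django.db.models.signals import post_save, post_delete
from django.forms.models import model_to_dict

class AuditLog(models.Model):
    # Hypothetical audit table; adjust the fields to what you need to keep.
    model_name = models.CharField(max_length=100)
    action = models.CharField(max_length=10)  # "save" or "delete"
    snapshot = models.TextField()
    logged_at = models.DateTimeField(auto_now_add=True)

def audit_handler(sender, instance, signal, **kwargs):
    # Django passes the sending signal itself as the "signal" kwarg.
    action = "delete" if signal is post_delete else "save"
    AuditLog.objects.create(
        model_name=sender.__name__,
        action=action,
        snapshot=repr(model_to_dict(instance)),
    )

# Connect the same handler to every model you want audited
# (SomeModel and AnotherModel are placeholders for your models):
for audited in (SomeModel, AnotherModel):
    post_save.connect(audit_handler, sender=audited)
    post_delete.connect(audit_handler, sender=audited)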

Related

Django: Are Django models dataclasses?

Can we say that Django models are considered dataclasses? I don't see the @dataclass decorator on them or on their base class models.Model. However, we do treat them like dataclasses, because we don't write constructors for them and we can create new objects by naming their arguments, for example MyDjangoModel(arg1=..., arg2=...).
On the other hand, Django models also don't appear to have __init__ methods (constructors) or to inherit from NamedTuple.
What happens under the hood when I create new Django model objects?
A lot of the magic that happens with models, if not nearly all of it, comes from the metaclass of their base class.
It can be found in django.db.models.base.ModelBase, specifically in the __new__ method.
Whether or not an __init__ method is defined (and it actually is, as per Abdul's comment) doesn't mean a class can or should be considered a dataclass.
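You can check this for yourself in a Django shell; this little snippet (my illustration, not from the original answer) shows that Model does define __init__, is built by the ModelBase metaclass, and is not a dataclass:

import dataclasses
from django.db import models

print(models.Model.__init__)   # a hand-written method, not a generated one
print(type(models.Model))      # <class 'django.db.models.base.ModelBase'>
print(dataclasses.is_dataclass(models.Model))  # False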
As described very eloquently in this SO post by someone else:
What are data classes and how are they different from common classes?
Despite Django models quite clearly seeming to have some kind of data stored in them, the models are more like an easy-to-use (and reusable) set of functions that leverage a database backend, which is where the real state of an object is stored; the model just gives access to it.
It's also worth noting that the database, not the model, is the durable store of the data; a model instance merely retrieves and mirrors it.
Take for example this simple model:
class Person(models.Model):
    name = models.CharField(max_length=100)  # max_length is required
And then we did something like this in a shell:
person = Person.objects.get(...)
print(person.name)
The database query actually runs when Person.objects.get(...) executes; Django loads the row's values onto the instance at that point (deferred fields are the exception, where attribute access does trigger an extra query).
The instance only holds a copy of the row; the lasting state lives in the database, not on the model object itself.
With that in mind, inherently, django models ARE NOT dataclasses. They are plain old regular classes.
Django does not work with dataclasses. You could define a custom model field to bridge the two, but that will likely take some development work.

GAE: Convert model to subclass of polymodel

I have an existing GAE app with a reasonably small number of entities, and I would like to update the entities to use polymodel.
I currently have entities like this:
class Mammal(db.Model)
class Reptile(db.Model)
and I'd like to change it to this:
class Animal(polymodel.PolyModel)
class Mammal(Animal)
class Reptile(Animal)
My current plan is to do the following procedure:
Iterate over all of the existing entities to change the class names to some temporary class name. E.g., convert class Mammal(db.Model) to class MammalTmp(db.Model) and convert class Reptile(db.Model) to class ReptileTmp(db.Model). In doing this, I would copy all of the properties of the old class to the new class.
Delete all instances of class Mammal(db.Model) and class Reptile(db.Model).
Iterate over all of the temporary entities to change the class names to the desired class name and type. E.g., convert class MammalTmp(db.Model) to class Mammal(polymodel.PolyModel) and convert class ReptileTmp(db.Model) to class Reptile(polymodel.PolyModel). I would again copy all of the properties of the old class to the new class.
Delete all instances of class MammalTmp(db.Model) and class ReptileTmp(db.Model).
This is a laborious procedure! Is there an easier way to accomplish this?
With the way entities are built and then indexed, no, unfortunately there is no other way (as far as I know) to do this. I had to go through the same process when I first wanted to implement polymodels, and that's the way I did it. Luckily all of this can be done through code, so you don't really have to sit at your computer and make all of those changes manually.
It's lengthy for sure, but think of all the speed benefits the datastore offers; that's why you have to be careful about designing your models in the first place. I know it's not necessarily easy (as I said, I fell into the same hole as you and had to write code for those iterations and changes).
A very good way to do such a process programmatically would be to use MapReduce. A "mapper" could definitely do the trick and help you do it faster and more efficiently; a sketch of the idea follows, and looking into the sample projects might give you some more pointers.
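For illustration, here is what step 1 of the procedure might look like as a mapper with the appengine-mapreduce library (the op.db.Put/op.db.Delete operations come from that library; the property names are assumptions):

# A hedged sketch of a mapper for step 1 (Mammal -> MammalTmp).
# Property names are illustrative; copy whichever properties you have.
from google.appengine.ext import db
from mapreduce import operation as op

class MammalTmp(db.Model):
    name = db.StringProperty()

def convert_mammal(entity):
    # The framework calls this once per existing Mammal entity.
    tmp = MammalTmp(key_name=str(entity.key().id_or_name()),
                    name=entity.name)
    yield op.db.Put(tmp)        # write the temporary copy
    yield op.db.Delete(entity)  # remove the original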
I'm not familiar at all with GAE, but could you just redirect your model definitions through an intermediary? I suppose this wouldn't be any faster than just renaming the base class for all your models, though.
Create a redirect class to start with:
# redirect.py
from google.appengine.ext import db  # this is where db actually comes from

class Model(db.Model):
    pass
Then add this line to your model file:
import redirect as db
class Mammal(db.Model):
    pass
And since db is now the redirect version, you can change the redirect file:
# redirect.py, updated
from google.appengine.ext.db import polymodel

class Model(polymodel.PolyModel):
    pass
But now that I've written it, it sounds like just as much work as manually updating the files, and you lose access to everything else on db (properties, queries, and so on) unless the redirect module re-exports it. So maybe I should just downvote my own answer. :D

How do I modify a class of a library to get it to use my extension of another library class?

Short story:
I want to make slight changes to the behavior of a MainClass, and a HelperClass on which it depends, in a popular library. I can easily extend both by subclassing, but how do I tell the top-level class to use my extended version of the helper class?
The MainClass generates instances of HelperClass via simple instantiation (e.g., helperItem = HelperClass()) and from yield expressions. HelperClass is coded in the same module as MainClass.
Longer:
For a Django Form, I want to generate a nested dictionary holding the data specifying the HTML display of that form. Django Form objects generate HTML by wrapping Field objects in a BoundField class, which has methods to reach into the Field datastructures to generate the appropriate HTML strings.
I want to:
extend / modify Form to use my extended version of BoundField, and
extend Form to add a method that cycles through its fields, calling getHtmlSpec() on each.
(Here I'm glossing over important Django implementation details, like whether to extend Form or BaseForm, and whether to extend BoundField or / and Input widgets.)
Obviously I could do this by extending Form to reach in to 'fields' and generate this stuff, and that might be better design. But this seems more elegant, and I'm curious even if it isn't the best approach.
That's a pretty ugly way to design a class, and I guess there's an even uglier way to hack around it:
from django.forms import forms

class MyBoundField(forms.BoundField):
    # override whatever BoundField behavior you need here
    pass

# Monkey-patch: every form in the process now builds MyBoundField instances.
forms.BoundField = MyBoundField
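For completeness, a less invasive sketch (my illustration, not part of the original answer): override __getitem__ on your own Form subclass, so only your forms use the extended BoundField and the library stays untouched:

from django import forms

class MyBoundField(forms.BoundField):
    # put the extended rendering behavior here
    pass

class MyForm(forms.Form):
    def __getitem__(self, name):
        # Wrap each field in the extended BoundField instead of the default.
        return MyBoundField(self, self.fields[name], name)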

Django Abstract Models vs simple Python mixins vs Python ABCs

This is a question prompted by another question of mine.
Django provides abstract base class functionality (which is not the same as ABCs in Python?), so that one can make a Model (Django's models.Model) from which one can inherit, without that Model having an actual table in the database. One triggers this behavior by setting the 'abstract' attribute in the Model's inner Meta class.
Now the question: why does Django solve it this way? Why the need for this special kind of 'abstract base class' Model? Why not make a Model mixin by just inheriting from object and mixing that in with an existing Model? Or could this also be a task for Python ABCs? (Mind you, I'm not very familiar with ABCs in Python; my ignorance might show here.)
I'll try to be reasonably brief, since this can easily turn into a lengthy diatribe:
ABCs are out because they were only introduced in Python 2.6, and the Django developers have a set roadmap for Python version support (2.3 support was only dropped in 1.2).
As for object-inheriting mixins, they would be less Pythonic in more ways than just reduced readability. Django uses a ModelBase metaclass for Model objects, which actually analyses the defined model properties on initialisation and populates Model._meta with the fields, Meta options, and other properties. It makes sense to reuse that framework for both types of models. This also allows Django to prevent abstract model fields from being overridden by inheriting models.
There are plenty more reasons I can think of, all of them minor in themselves, but they add up to make the current implementation much more Pythonic. There's nothing inherently wrong with using object-inheriting mixins, though.
One of the reasons is the way fields are defined on a model.
Fields are specified declaratively, in a way that a normal class would treat as class attributes. Yet they need to become instance attributes when the class is actually instantiated, so that each instance can have its own value for each field. This is managed via the metaclass, and it wouldn't work with a normal abstract base class.
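To make that concrete, a minimal sketch of a Django abstract base model (the field names are illustrative):

from django.db import models

class Audit(models.Model):
    created = models.DateTimeField(auto_now_add=True)
    modified = models.DateTimeField(auto_now=True)

    class Meta:
        abstract = True  # Django creates no table for Audit itself

class Article(Audit):
    # Article's own table gets created/modified columns, because the
    # ModelBase metaclass copies the abstract fields into the subclass.
    title = models.CharField(max_length=200)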

Django signals vs. overriding save method

I'm having trouble wrapping my head around this. Right now I have some models that look kind of like this:
class Review(models.Model):
    # ...fields...
    overall_score = models.FloatField(blank=True)

class Score(models.Model):
    review = models.ForeignKey(Review)
    question = models.TextField()
    grade = models.IntegerField()
A Review has several "scores"; the overall_score is the average of those scores. When a review or a score is saved, I need to recalculate the overall_score average. Right now I'm using an overridden save method. Would there be any benefits to using Django's signal dispatcher?
Save/delete signals are generally favourable in situations where you need to make changes which aren't completely specific to the model in question, or could be applied to models which have something in common, or could be configured for use across models.
One common task in overridden save methods is automated generation of slugs from some text field in a model. That's an example of something which, if you needed to implement it for a number of models, would benefit from using a pre_save signal, where the signal handler could take the name of the slug field and the name of the field to generate the slug from. Once you have something like that in place, any enhanced functionality you put in place will also apply to all models - e.g. looking up the slug you're about to add for the type of model in question, to ensure uniqueness.
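A sketch of what such a generic handler might look like (my illustration; the field names and the Article model are assumptions):

from django.db.models.signals import pre_save
from django.utils.text import slugify

def make_slug_handler(slug_field, from_field):
    # Returns a pre_save handler configured with the two field names.
    def handler(sender, instance, **kwargs):
        if not getattr(instance, slug_field):
            setattr(instance, slug_field,
                    slugify(getattr(instance, from_field)))
    return handler

# weak=False keeps the locally created handler from being garbage-collected.
pre_save.connect(make_slug_handler("slug", "title"), sender=Article, weak=False)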
Reusable applications often benefit from the use of signals - if the functionality they provide can be applied to any model, they generally (unless it's unavoidable) won't want users to have to directly modify their models in order to benefit from it.
With django-mptt, for example, I used the pre_save signal to manage a set of fields which describe a tree structure for the model which is about to be created or updated, and the pre_delete signal to remove tree structure details for the object being deleted and its entire sub-tree of objects before they are deleted. Because signals are used, users don't have to add or modify save or delete methods on their models to have this management done for them; they just have to let django-mptt know which models they want it to manage.
You asked:
Would there be any benefits to using Django's signal dispatcher?
I found this in the django docs:
Overridden model methods are not called on bulk operations
Note that the delete() method for an object is not necessarily called when deleting objects in bulk using a QuerySet or as a result of a cascading delete. To ensure customized delete logic gets executed, you can use pre_delete and/or post_delete signals.
Unfortunately, there isn't a workaround when creating or updating objects in bulk, since none of save(), pre_save, and post_save are called.
From: Overriding predefined model methods
Small addition from Django docs about bulk delete (.delete() method on QuerySet objects):
Keep in mind that this will, whenever possible, be executed purely in SQL, and so the delete() methods of individual object instances will not necessarily be called during the process. If you've provided a custom delete() method on a model class and want to ensure that it is called, you will need to "manually" delete instances of that model (e.g., by iterating over a QuerySet and calling delete() on each object individually) rather than using the bulk delete() method of a QuerySet.
https://docs.djangoproject.com/en/1.11/topics/db/queries/#deleting-objects
And bulk update (.update() method on QuerySet objects):
Finally, realize that update() does an update at the SQL level and, thus, does not call any save() methods on your models, nor does it emit the pre_save or post_save signals (which are a consequence of calling Model.save()). If you want to update a bunch of records for a model that has a custom save() method, loop over them and call save().
https://docs.djangoproject.com/en/2.1/ref/models/querysets/#update
If you use signals, you'll be able to update the Review score each time a related Score model gets saved, as sketched below. But if you don't need that functionality, I don't see any reason to put this into a signal; it's pretty model-related stuff.
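A sketch of that signal-based version, using the Review and Score models from the question (an illustration, not a drop-in implementation):

from django.db.models import Avg
from django.db.models.signals import post_save, post_delete
from django.dispatch import receiver

@receiver(post_save, sender=Score)
@receiver(post_delete, sender=Score)
def update_overall_score(sender, instance, **kwargs):
    # Recalculate the average whenever a Score is saved or deleted.
    review = instance.review
    avg = review.score_set.aggregate(avg=Avg("grade"))["avg"]
    review.overall_score = avg if avg is not None else 0.0
    review.save()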
It is a kind of denormalisation. Look at this solution: an in-place composition field definition.
