Django models proxy inheritance - Python

I would like to add the methods of a 'tools' class to two of my Django model classes.
Each class will use the same methods on its own model, for example:
class mapA(models.Model):
    mInd = models.IntegerField()
    scId = models.IntegerField()

class mapB(models.Model):
    mInd = models.IntegerField()
    scId2 = models.IntegerField()
I would like to add methods like checkInput() to both of them, so I could run:

mapBInstance.checkInput()
mapAInstance.checkInput()

Each time, checkInput runs over the data in mapA or mapB.
I thought about creating a tools class and letting each model inherit from it.
This way the tools class holds the logic that is identical for both maps.
When I read the Django docs I didn't see an example for this case, only close solutions.
Is this the correct solution (to use the proxy class)?
class Tools():
    def __init__(self):
        ...  # init class

    def checkInput(self, queryset):
        ...  # make the checks

class MapA(Tools, models.Model):
    mInd = models.IntegerField()
    scId = models.IntegerField()

    def checkSelf(self):
        self.checkInput(MapA.objects.filter(....))

class MapB(Tools, models.Model):
    mIndB = models.IntegerField()
    scIdB = models.IntegerField()

    def checkSelf(self):
        self.checkInput(MapB.objects.filter(....))

A few things...
There is no this in Python; it's called self.
If you're in Python 2.x, tools should inherit from object. In Python 3 it's implicit, but it doesn't hurt:
class tools(object):
    ...
If you're overriding __init__ in your mixin class (tools), then map classes should probably inherit from it first:
class mapA(tools, models.Model):
    ...
Only override __init__ if you really need to; it can get complicated.
Class names are pretty much always in CamelCase. This is not required, but it is a convention. Also, it's a good idea to name mixin classes transparently:
class ToolsMixin(object):
    ...

class MapA(ToolsMixin, models.Model):
    ...
Other than all that, you can perfectly well add a method in a mixin and use it in your models. No need for Django proxy models.
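For concreteness, a minimal sketch of the mixin approach using the fields from the question; the body of checkInput is an assumption, since the original only hints at what the checks do:

from django.db import models

class ToolsMixin(object):
    def checkInput(self, queryset):
        # hypothetical check: the question only says "make the checks"
        return all(obj.mInd >= 0 for obj in queryset)

class MapA(ToolsMixin, models.Model):
    mInd = models.IntegerField()
    scId = models.IntegerField()

    def checkSelf(self):
        return self.checkInput(MapA.objects.filter(mInd=self.mInd))

class MapB(ToolsMixin, models.Model):
    mInd = models.IntegerField()
    scId2 = models.IntegerField()

    def checkSelf(self):
        return self.checkInput(MapB.objects.filter(mInd=self.mInd))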

If you want MapA and MapB (it would be really helpful if you followed PEP-8) to be distinct models, proxy models won't help you. A proxy model is a model that is different in Python, but in the database it is exactly the same as the model it inherits from. Creating a proxy model that doesn't directly inherit from a single concrete model (one that has a table in the database) is an error.
What you're looking for is an abstract base class:
class Tools(models.Model):
    ...

    class Meta:
        abstract = True

class MapA(Tools):
    ...

class MapB(Tools):
    ...
An abstract model does not create its own table in the database. Instead, it is as if everything defined in Tools has been defined in both MapA and MapB, but the Tools class is otherwise ignored. This allows you to specify all the methods and fields just once, but still have two separate tables in the database.
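Applied to the models in the question, a minimal sketch; pulling the shared mInd field up into the base is just one option, and the body of checkInput is an assumption:

from django.db import models

class Tools(models.Model):
    mInd = models.IntegerField()  # field shared by both maps

    class Meta:
        abstract = True

    def checkInput(self):
        # hypothetical check over this instance's data
        return self.mInd >= 0

class MapA(Tools):
    scId = models.IntegerField()

class MapB(Tools):
    scId2 = models.IntegerField()

MapA and MapB each get their own table (with mInd plus their own field), and both inherit checkInput.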

Related

Inheritance model update to its parent model

I need to extend a model from another model.
Case:
core/models.py
class Master(models.Model):
    code = models.CharField(max_length=30, unique=True)
    name = models.CharField(max_length=100, blank=False, null=False)

    class Meta:
        abstract = True

class City(Master):
    zipcode = models.IntegerField()
custom/models.py
from core.models import City

class City(City):
    newfield = models.CharField(max_length=20)
custom is an app.
I have tried with a proxy model but it is not what I need, since a proxy model adds a new table. https://docs.djangoproject.com/en/2.2/topics/db/models/#proxy-models
What I need is that when I migrate, the new field is added to City.
More info.
The table is created in core, and in custom you can add the new fields that the client needs. The idea is that core is maintained only as the standard.
Proxy models don't add new tables. From the docs link you mentioned:
The MyPerson class operates on the same database table as its parent Person class.
If you want one table called core_city, and another called custom_city, the second one having an extra field, you simply subclass it. Perhaps it would be easier to use an alias:
from core.models import City as CoreCity

class City(CoreCity):
    newfield = models.CharField(max_length=20)
custom_city will have all fields from core_city, plus a newfield. The description of how this works (and an example) is covered in the docs section Multi-table inheritance.
If what you want is to have one single database table, then you should use a proxy model; however, proxy models really don't allow you to create new fields. The field should be created in the parent model, or otherwise exist in the database and not be handled by Django migrations at all.
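For contrast, a minimal proxy-model sketch, under the assumption that core_city already has every column you need; a proxy can only add behaviour (methods, ordering, a default manager), never fields:

from core.models import City as CoreCity

class City(CoreCity):
    class Meta:
        proxy = True

    def label(self):
        # hypothetical extra behaviour; no new columns are created
        return "%s (%s)" % (self.name, self.zipcode)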
You are looking for abstract base classes:
Abstract base classes are useful when you want to put some common information into a number of other models. You write your base class and put abstract=True in the Meta class.
This is the base class:
# core/models.py
class City(Master):
    zipcode = models.IntegerField()

    class Meta:
        abstract = True  # <--- here the trick
Here is your model:
# custom/models.py
from core.models import City as CoreCity

class City(CoreCity):
    newfield = models.CharField(max_length=20)
For many uses, this type of model inheritance will be exactly what you want. It provides a way to factor out common information at the Python level, while still only creating one database table per child model at the database level.
You can update or create class attributes after the class is defined, like this:
from core.models import City

City.newfield = models.CharField(max_length=20)
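Note that assigning a Field object as a plain class attribute after the class body has run does not register it with the ORM, because contribute_to_class is never called. A hedged sketch using Django's (semi-internal) add_to_class hook instead:

from django.db import models
from core.models import City

# registers the field with City._meta, unlike a plain attribute assignment
City.add_to_class('newfield', models.CharField(max_length=20))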
You may want to use swappable models: with them you can define a City class and swap in whichever model you need later.
But that way you can't import and use the base City model directly; you may need to provide a method like get_city_model for that, as your public API.
class City(Master):
    zipcode = models.IntegerField()

    class Meta:
        swappable = 'CORE_CITY_MODEL'
You can then replace it later with some other model by setting CORE_CITY_MODEL to that model in the form 'app_label.ModelName'.
django.contrib.auth is a good example of this; consider checking the User model and the get_user_model method. Although I think you may face problems if you change your City model after you have already run migrate: it may not move your data to the new table, but I'm not sure about this.
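A hedged sketch of the get_city_model helper mentioned above, mirroring django.contrib.auth.get_user_model; CORE_CITY_MODEL here is the hypothetical setting from the Meta above, expected to hold a string such as 'custom.City':

from django.apps import apps
from django.conf import settings

def get_city_model():
    # resolve the currently configured City model from its 'app_label.ModelName' path
    return apps.get_model(settings.CORE_CITY_MODEL, require_ready=False)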

Django model subclassing approaches

I'm designing a new Django app and, given several possible approaches, I'm not sure which would be best, so I'd like opinions and hopefully to improve what I have so far.
This question comes close but not quite. This one touches the flat/nested subject which is helpful, while still not answering the question.
There are many others on the same subject, and yet none tell me what I want to know.
Background
The models each have unique properties plus some shared attributes, and I need to reference them in another model, ideally with a single entry point rather than a field for each possible model.
I want to be able to do complex Django ORM queries involving the base class and filter by subclass when needed, e.g. Event.objects.all() to return all events. I'm aware of django-model-utils' InheritanceManager and intend to use it if possible.
Also, I'll be using the Django admin to create and manage the objects, so easy integration is a must. I want to be able to create a new SubEvent directly, without first having to create an Event instance.
Example
To illustrate, let's say I have the following models for app A.
class Event(models.Model):
    commom_field = models.BooleanField()

    class Meta:
        abstract = True

class SubEventA(Event):
    email = models.EmailField(unique=True)

class SubEventB(Event):
    title = models.TextField()

class SubEventC(Event):
    number = models.IntegerField(default=10)

# and so on
And also an app B, where I want to be able to reference an event which can be of any type, like:
class OtherModel(models.Model):
    event = models.ForeignKey('A.Event')
    # This won't work, because `A.Event` is abstract.
Possible solutions
1. Use a GenericForeignKey
# B.models.py
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType

class OtherModel(models.Model):
    content_type = models.ForeignKey(ContentType)
    object_id = models.PositiveIntegerField()
    event = GenericForeignKey('content_type', 'object_id')
What I don't like about this is that I'll lose the querying capabilities the Django ORM has, and I might need to do additional fiddling to get it working in the admin. Not sure; I've never dealt with this before.
2. Flatten Event
I can bring it all up to the base class and have flags or checks outside the model definition, something like:
class Event(models.Model):
    commom_field = models.BooleanField()
    email = models.EmailField(blank=True)
    title = models.TextField(blank=True)
    number = models.IntegerField(default=10)
This might seem like the best idea at first, but of course there are other kinds of fields, and that forces me to allow nulls/blanks for most of them (like the email field), losing the DB-level integrity checks.
3. OneToOne relationships
Rather than keeping the base abstract as in 1 or flattening as in 2, it is possible to have a DB table for each, where the models would look like:
class Event(models.Model):
    commom_field = models.BooleanField()

class SubEventA(models.Model):
    event = models.OneToOneField(Event)
    email = models.EmailField(unique=True)

class SubEventB(models.Model):
    event = models.OneToOneField(Event)
    title = models.TextField(blank=True)

class SubEventC(models.Model):
    event = models.OneToOneField(Event)
    number = models.IntegerField(default=10)
So far this solves the two initial problems, but when I get to the admin interface, I'll have to customize each form to create the base Event before saving a SubEvent instance.
Questions
Is there a better approach?
Can any of the choices I present be improved in any direction (ORM query, DB constraints, admin interface)?
I've pondered both answers and came up with something based on those suggestions, so I'm adding this answer of my own.
I've chosen to use django-polymorphic, a quite nice tool suggested by @professorDante. Since this is multi-table inheritance, @albar's answer is also somewhat correct.
tl;dr
django-polymorphic meets the 3 main requirements:
Allows Django ORM-style querying
Keeps DB-level constraints through multi-table inheritance, with one table per subclass
Easy Django admin integration
Longer version
Django-polymorphic allows me to query all different event instances from the base class, like:
# assuming the objects were previously created
>>> Event.objects.all()
[<SubEventA object>, <SubEventB object>, <SubEventC object>]
It also has great Django admin integration, allowing seamless object creation and editing.
The models using django-polymorphic would look like:
# A.models.py
from polymorphic import PolymorphicModel

class Event(PolymorphicModel):
    commom_field = models.BooleanField()
    # no longer abstract

class SubEventA(Event):
    email = models.EmailField(unique=True)

class SubEventB(Event):
    title = models.TextField()

class SubEventC(Event):
    number = models.IntegerField(default=10)

# B.models.py
# it doesn't have to be polymorphic to reference polymorphic models
class OtherModel(models.Model):
    event = models.ForeignKey('A.Event')
Besides, I can reference only the base model from another class and I can assign any of the subclasses directly, such as:
>>> sub_event_b = SubEventB.objects.create(title='what a lovely day')
>>> other_model = OtherModel()
>>> other_model.event = sub_event_b
My 2c on this. Not sure about your design in #3. Each SubEvent subclasses Event, and has a one-to-one to Event? Isn't that the same thing?
Your proposal on the Generic Key is exactly what it is designed for.
Another possibility: polymorphism with mixins. Use something like django-polymorphic, so querying returns you the subclass you want. I use this all the time and it's super useful. Then make mixins for attributes that will be reused across many classes. As a simple example, here is an email mixin:
class EmailMixin(models.Model):
    email = models.EmailField(unique=True)

    class Meta:
        abstract = True
Then use it:

class MySubEvent(EmailMixin, models.Model):
    <do stuff>
This way you don't have redundant attributes on subclasses, as you would if they were all in the parent.
Why not multi-table inheritance?
class Event(models.Model):
    commom_field = models.BooleanField()

class SubEventA(Event):
    email = models.EmailField(unique=True)

class SubEventB(Event):
    title = models.TextField(blank=True)

class SubEventC(Event):
    number = models.IntegerField(default=10)
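Since the question mentions django-model-utils' InheritanceManager, here is a sketch of how it could plug into this multi-table version so that base-class queries hand back subclass instances (assuming the package is installed):

from django.db import models
from model_utils.managers import InheritanceManager

class Event(models.Model):
    commom_field = models.BooleanField()
    objects = InheritanceManager()

# Event.objects.select_subclasses() then yields SubEventA/SubEventB/SubEventC
# instances instead of plain Event rows.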

Python Django naming conventions

The naming convention for a ForeignKey is to name it with the lowercase version of the connected model; the following is taken from the docs:
class Car(models.Model):
    manufacturer = models.ForeignKey(Manufacturer)
    # ...

class Manufacturer(models.Model):
    # ...
    pass
but I have the following models:
class Work(models.Model):
    class = models.ForeignKey(Class)
    # ...

class Class(models.Model):
    # ...
    pass
As we know, this will raise an error because we cannot name a field class: it is a reserved keyword in Python.
Question:
Will not following the naming convention actually be a problem for the SQL database creation?
For example:
class Work(models.Model):
    cls = models.ForeignKey(Class)
    # ...

class Class(models.Model):
    # ...
    pass
Will having a different ForeignKey name mess up the SQL table and database creation?
You definitely want to stay away from reserved words such as Class, List, String, etc. Whether it will mess up your database or not, I'm not sure (try it and see!), but it's definitely a bad idea. If it's not an SQL problem, it will mess up something somewhere eventually.
I see 'klass' a lot, if you feel like the word class is absolutely necessary. 'Course', perhaps?
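For what it's worth, the field name itself only determines the default column name (a ForeignKey named cls becomes a cls_id column) and the default reverse accessor, so a non-conventional name does not break table creation. A hedged sketch, with db_column and related_name chosen here purely for illustration:

from django.db import models

class Work(models.Model):
    cls = models.ForeignKey(
        'Class',                   # string reference to the Class model above
        on_delete=models.CASCADE,  # required on ForeignKey in modern Django
        db_column='class_id',      # hypothetical explicit column name
        related_name='works',      # hypothetical reverse accessor
    )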

class definition inside a class?

class ItemForm(djangoforms.ModelForm):
    class Meta:
        model = Item
        exclude = ['added_by']
I cannot understand what this piece of code is doing. I understood that ItemForm inherits from ModelForm, but then a class definition inside a class?
The Item class is:
class Item(db.Model):
    name = db.StringProperty()
    quantity = db.IntegerProperty(default=1)
    target_price = db.FloatProperty()
    priority = db.StringProperty(default='Medium', choices=[
        'High', 'Medium', 'Low'])
    entry_time = db.DateTimeProperty(auto_now_add=True)
    added_by = db.UserProperty()
It's part of Django's magic. The metaclass for ModelForm (among other classes) looks for an inner Meta class and uses it to make various changes to the outer class. It's one of the deeper parts of Python that most people will never have to deal with first-hand.
In Python you can define classes within other classes as a way of encapsulating the inner class. The way Django is using this is actually quite excellent.
See this link for more info: http://www.geekinterview.com/question_details/64739
Meta is a special class definition.
In this example, it is a simple inheritance model. ModelForm creates a form based on a Model class, so by giving a class definition to the ModelForm class, it creates the form elements according to the related Model class definition.

Can't use an inheriting Django model's Meta class to configure a field defined in an inherited abstract model

I would like to use properties from an inheriting model's Meta class to configure a field defined in an abstract model higher up the inheritance tree:
class NamedModel(models.Model):
    class Meta:
        abstract = True
        verbose_name = 'object'

    name = models.CharField("Name",
        max_length=200,
        db_index=True,
        help_text="A meaningful name for this %s." % Meta.verbose_name
        # see what I'm trying to do here?
    )
    ...
class OwnedModel(NamedModel):
    class Meta(NamedModel.Meta):
        verbose_name = 'owned object'
I would like the help text on the name field of OwnedModel forms to say 'A meaningful name for this owned object'. But it does not: the word 'owned' is missing, which would suggest that the verbose_name from the NamedModel.Meta is used when the model is set up, not OwnedModel.Meta.
This isn't quite what I expect from an inheritance point of view: is there some way to get the field to be created whereby Meta.verbose_name refers to the value on the non-abstract model class, not the abstract one on which the field was defined?
Or am I being daft?
(This may seem like a trivial example, and it is: but it's just to illustrate the point of something more important and complex I am trying to do)
Many thanks in advance.
Why don't you try making a class:
class BaseNamedModelMeta:
    abstract = True
    verbose_name = "your text"
And then inherit and override whatever you want like this:
class OwnedModel(NamedModel):
    class Meta(BaseNamedModelMeta):
        verbose_name = 'owned object'
I think this happens because Meta.verbose_name is used and NamedModel.name is created when class NamedModel is parsed. So later, when class OwnedModel gets parsed, there is no chance to change anything.
Maybe you can set the help_text property on OwnedModel.name later on, but this may change NamedModel.name also.
In similar situations I have put the variable parts in a class attribute of the model (not Meta) and then used run-time methods/properties to generate the texts I need.
In fact I ended up doing the following. The base model is given a dynamic_field_definition() class method, which can be used to patch up the fields, with the cls argument being the correct (inheriting) class. That means that cls's Meta attributes are those of the correct child, not the original base.
I then wire up that method to get called on the class_prepared signal, so that you know everything's otherwise ready.
from django.db.models.signals import class_prepared

class NamedModel(models.Model):
    ...

    @classmethod
    def dynamic_field_definition(cls):
        pass

def dynamic_field_definition(sender, **kwargs):
    if issubclass(sender, NamedModel):
        sender.dynamic_field_definition()

class_prepared.connect(dynamic_field_definition)
Then the field properties that vary with model class are simply reconfigured by that class method (or more likely the method as overridden in derived classes).
It's a slightly hacky way to bring a last little bit of OO-ness to Django models, but it works fine for my purpose.
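For example, a subclass might use the hook to rebuild the help text from its own Meta; a sketch assuming the name field and verbose_name defined earlier (field access via cls._meta is standard Django API):

class OwnedModel(NamedModel):
    class Meta(NamedModel.Meta):
        verbose_name = 'owned object'

    @classmethod
    def dynamic_field_definition(cls):
        # patch the inherited field with this class's own verbose_name
        name_field = cls._meta.get_field('name')
        name_field.help_text = "A meaningful name for this %s." % cls._meta.verbose_name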
