Dynamically add properties to a Django model - python

I have a Django model where a lot of fields are choices. So I had to write a lot of "is_something" properties of the class to check whether the instance value is equal to some choice value. Something along the lines of:
class MyModel(models.Model):
    some_choicefield = models.IntegerField(choices=SOME_CHOICES)

    @property
    def is_some_value(self):
        return self.some_choicefield == SOME_CHOICES.SOME_CHOICE_VALUE

    # a lot of these...
In order to automate this and spare me a lot of redundant code, I thought about patching the instance at creation, with a function that adds a bunch of methods that do the checks.
The code became as follows (I'm assuming there's a "normalize" function that makes the label of the choice a usable function name):
def dynamic_add_checks(instance, field):
    if hasattr(field, 'choices'):
        choices = getattr(field, 'choices')
        for (value, label) in choices:
            def fun(instance):
                return getattr(instance, field.name) == value
            normalized_func_name = "is_%s_%s" % (field.name, normalize(label))
            setattr(instance, normalized_func_name, fun(instance))
class MyModel(models.Model):
    some_choicefield = models.IntegerField(choices=SOME_CHOICES)

    def __init__(self, *args, **kwargs):
        super(MyModel, self).__init__(*args, **kwargs)
        dynamic_add_checks(self, self._meta.get_field('some_choicefield'))
Now, this works, but I have the feeling there is a better way to do it. Perhaps at class creation time (with metaclasses or in the __new__ method)? Do you have any thoughts/suggestions about that?

Well, I am not sure how to do it your way, but in cases like this I think the way to go is to simply create a new model that holds your choices, and change the field to a ForeignKey. This is simpler to code and manage.
You can find a lot of information at a basic level in the Django docs: Models: Relationships. In there, there are many links to follow expanding on various topics. Beyond that, I believe it just takes a bit of imagination, and maybe some trial and error in the beginning.
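A minimal sketch of that idea (the SomeChoice model name is illustrative, not from the question):

class SomeChoice(models.Model):
    label = models.CharField(max_length=50)

class MyModel(models.Model):
    some_choicefield = models.ForeignKey(SomeChoice)

# The checks then become plain comparisons, with the choices kept as rows:
# instance.some_choicefield.label == 'some label'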

I came across a similar problem where I needed to write a large number of properties at runtime to provide backward compatibility while changing model fields. There are two standard ways to handle this:
First, use a custom metaclass in your models that inherits from the models' default metaclass.
Second, use class decorators. Class decorators sometimes provide an easy alternative to metaclasses, unless you have to do something before the creation of the class, in which case you have to go with metaclasses.
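For example, here is a minimal sketch of the decorator route applied to the question's model (it assumes the normalize helper and SOME_CHOICES from the question):

def add_choice_checks(*field_names):
    """Class decorator: adds an is_<field>_<label> property per choice."""
    def decorator(cls):
        for field_name in field_names:
            field = cls._meta.get_field(field_name)
            for value, label in field.choices:
                def checker(self, field_name=field_name, value=value):
                    # default arguments bind the loop variables per iteration
                    return getattr(self, field_name) == value
                setattr(cls, "is_%s_%s" % (field_name, normalize(label)),
                        property(checker))
        return cls
    return decorator

@add_choice_checks('some_choicefield')
class MyModel(models.Model):
    some_choicefield = models.IntegerField(choices=SOME_CHOICES)

Because the decorator runs once, right after the class is created, the properties live on the class itself instead of being patched onto every instance in __init__.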

I bet you know that Django fields defined with choices automatically get a display function.
Say you have a field defined like this:
category = models.SmallIntegerField(choices=CHOICES)
You can simply call get_category_display() to access the display value. Here is the Django source code for this feature:
https://github.com/django/django/blob/baff4dd37dabfef1ff939513fa45124382b57bf8/django/db/models/base.py#L962
https://github.com/django/django/blob/baff4dd37dabfef1ff939513fa45124382b57bf8/django/db/models/fields/__init__.py#L704
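For example, assuming a hypothetical Article model with CHOICES = ((1, 'draft'), (2, 'published')):

article = Article(category=1)
article.get_category_display()  # returns 'draft'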
So we can follow this approach to achieve our dynamically set property goal.
Here is my scenario; it's a little different from yours, but in the end it's the same:
I have two classes, Course and Lesson. Lesson has a ForeignKey field to Course, and I want to add a property named cached_course to Lesson that tries to get the Course from the cache first, falling back to the database on a cache miss.
Here is a typical solution:
from django.core.cache import cache
from django.db import models

class Course(models.Model):
    pass  # some fields

class Lesson(models.Model):
    course = models.ForeignKey(Course)

    @property
    def cached_course(self):
        key = key_func()  # builds the cache key for this course
        course = cache.get(key)
        if not course:
            course = get_model_from_db()  # fetches the Course from the database
            cache.set(key, course)
        return course
It turns out I have many ForeignKey fields to cache, so here is code that follows the same approach as Django's get_FIELD_display feature:
from django.db import models
from django.utils.functional import curry

class CachedForeignKeyField(models.ForeignKey):
    def contribute_to_class(self, cls, name, **kwargs):
        super(CachedForeignKeyField, self).contribute_to_class(cls, name, **kwargs)
        setattr(cls, "cached_%s" % self.name,
                property(curry(cls._cached_FIELD, field=self)))

class BaseModel(models.Model):
    def _cached_FIELD(self, field):
        value = getattr(self, field.attname)
        Model = field.related_model
        # cache.get_model is assumed to be a project-specific helper that
        # checks the cache first and falls back to the database
        return cache.get_model(Model, pk=value)

    class Meta:
        abstract = True

class Course(BaseModel):
    pass  # some fields

class Lesson(BaseModel):
    course = CachedForeignKeyField(Course)
By customizing CachedForeignKeyField and overriding its contribute_to_class method, together with a BaseModel class that provides the _cached_FIELD method, every CachedForeignKeyField automatically gets a corresponding cached_FIELD property.
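Usage then looks just like the hand-written property above (the pk is illustrative):

lesson = Lesson.objects.get(pk=1)
lesson.cached_course  # checks the cache first, then falls back to the database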
Too good to be true, bravo!


How to maintain a table of all proxy models in Django?

I have a model A and want to make subclasses of it.
class A(models.Model):
    type = models.ForeignKey(Type)
    data = models.JSONField()

    def compute(self):
        pass

class B(A):
    def compute(self):
        df = self.go_get_data()
        self.data = self.process(df)

class C(A):
    def compute(self):
        df = self.go_get_other_data()
        self.data = self.process_another_way(df)

# ... other subclasses of A
B and C should not have their own tables, so I decided to use the proxy attribute of Meta. However, I want there to be a table of all the implemented proxies.
In particular, I want to keep a record of the name and description of each subclass.
For example, for B, the name would be "B" and the description would be the docstring for B.
So I made another model:
class Type(models.Model):
    # The name of the class
    name = models.CharField(max_length=255)
    # The docstring of the class
    desc = models.TextField()
    # A unique identifier, different from the Django ID,
    # that allows for smoothly changing the name of the class
    identifier = models.IntegerField()
Now, I want it so when I create an A, I can only choose between the different subclasses of A.
Hence the Type table should always be up-to-date.
For example, if I want to unit-test the behavior of B, I'll need to use the corresponding Type instance to create an instance of B, so that Type instance already needs to be in the database.
Looking over on the Django website, I see two ways to achieve this: fixtures and data migrations.
Fixtures aren't dynamic enough for my use case, since the attributes literally come from the code. That leaves me with data migrations.
I tried writing one that goes something like this:
from django.db import migrations

def update_results(apps, schema_editor):
    A = apps.get_model("app", "A")
    Type = apps.get_model("app", "Type")
    subclasses = get_all_subclasses(A)
    for cls in subclasses:
        id = cls.get_identifier()
        Type.objects.update_or_create(
            identifier=id,
            defaults=dict(name=cls.__name__, desc=cls.__doc__),
        )

class Migration(migrations.Migration):
    operations = [
        migrations.RunPython(update_results),
    ]
    # ... other stuff
The problem is, I don't see how to store the identifier within the class, so that the Django Model instance can recover it.
So far, here is what I have tried:
I have tried using the fairly new __init_subclass__ construct of Python. So my code now looks like:
class A:
    def __init_subclass__(cls, identifier=None, **kwargs):
        super().__init_subclass__(**kwargs)
        if identifier is None:
            raise ValueError()
        cls.identifier = identifier
        Type.objects.update_or_create(
            identifier=identifier,
            defaults=dict(name=cls.__name__, desc=cls.__doc__),
        )
    # ... the rest of A

# The identifier should never change, so that even if the
# name of the class changes, we still know which subclass is referred to
class B(A, identifier=3):
    pass  # ... the rest of B
But this update_or_create fails when the database is new (e.g. during unit tests), because the Type table does not exist.
When I have this problem in development (we're still in early stages, so deleting the DB is still sensible), I have to comment out the update_or_create in __init_subclass__. I can then migrate and put it back in.
Of course, this solution is also not great because __init_subclass__ is run way more than necessary. Ideally this machinery would only happen at migration.
So there you have it! I hope the problem statement makes sense.
Thanks for reading this far and I look forward to hearing from you; even if you have other things to do, I wish you a good rest of your day :)
With a little help from Django-expert friends, I solved this with the post_migrate signal.
I removed the update_or_create in __init_subclass__, and in project/app/apps.py I added:
from django.apps import AppConfig
from django.db.models.signals import post_migrate

def get_all_subclasses(cls):
    """Get all subclasses of a class, recursively.

    Used to get a list of all the implemented As.
    """
    all_subclasses = []
    for subclass in cls.__subclasses__():
        all_subclasses.append(subclass)
        all_subclasses.extend(get_all_subclasses(subclass))
    return all_subclasses

def update_As(sender=None, **kwargs):
    """Get a list of all implemented As and write them to the database.

    More precisely, each model is used to instantiate a Type, which will be
    used to identify As.
    """
    from app.models import A, Type
    subclasses = get_all_subclasses(A)
    for cls in subclasses:
        id = cls.identifier
        Type.objects.update_or_create(
            identifier=id,
            defaults=dict(name=cls.__name__, desc=cls.__doc__),
        )

class MyAppConfig(AppConfig):
    default_auto_field = "django.db.models.BigAutoField"
    name = "app"

    def ready(self):
        post_migrate.connect(update_As, sender=self)
Hope this is helpful for future Django coders in need!

What's behind the mechanism of Django models and fields?

What is the Django/Python mechanism behind the Django model/field part of the framework?
To be exact, I am looking for a hint on how Django parses the class definition and then knows which fields are required.
from django.db import models

class Car(models.Model):
    name = models.CharField(max_length=255, null=True, blank=True)
    year_of_production = models.DateField(null=True)
    # the rest of fields...
I think the same mechanism is behind the Django Forms framework and DRF serializers. I checked the repos of these projects but I still can't find a reasonable starting point.
There's an architectural problem behind my question: I think I need to implement something similar to this mechanism:
class Field:
    def __init__(self, label: str, required: bool = True, **kwargs):
        self.label, self.required = label, required

class CharField(Field):
    def __init__(self, max_length: int, **kwargs):
        self.max_length = max_length
        super().__init__(**kwargs)

class DateField(Field):
    ...

class BooleanField(Field):
    ...

class Model:
    ...  # the mechanisms I do not understand

class MyModelInstance(Model):
    name = CharField(...)
    # etc.
What I need is a really simple solution that knows which fields are required. But as I stated before, I am not that advanced and I would really appreciate any hints.
Edit: I think I'm looking for something like the Django Forms mechanism, not models/fields.
Forms and Models follow the same basic idea, but forms are a little simpler, so let's take a tour there.
The DeclarativeFieldsMetaclass metaclass is used on Form.
It gathers up the fields at declaration time (with some MRO walking, but the basic idea is to just see if they're isinstance(x, Field)), removes them from the concrete class declaration and moves them into cls.base_fields (where cls is the class you're declaring).
When you instantiate this new Form of yours, this code over here deepcopies self.base_fields (which is on the class level, but that's beside the point) into self.fields, so you can safely modify self.fields within each form instance without affecting others across requests.
That's basically it, really.
Beyond that, if you wanted a thing that gathered required fields on a separate attribute, that'd just be something like:
cls.required_fields = {name for name, field in cls.base_fields.items() if field.required}
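If you want to see the mechanism in isolation, here is a minimal, self-contained sketch of the same declarative pattern (the names are illustrative; Django's real DeclarativeFieldsMetaclass handles more edge cases):

class Field:
    def __init__(self, required=True):
        self.required = required

class DeclarativeMeta(type):
    def __new__(mcs, name, bases, attrs):
        # pull Field instances out of the class body
        fields = {k: v for k, v in list(attrs.items()) if isinstance(v, Field)}
        for k in fields:
            attrs.pop(k)
        cls = super().__new__(mcs, name, bases, attrs)
        # walk the MRO so subclasses inherit fields from their bases
        base_fields = {}
        for base in reversed(cls.__mro__[1:]):
            base_fields.update(getattr(base, 'base_fields', {}))
        base_fields.update(fields)
        cls.base_fields = base_fields
        cls.required_fields = {k for k, f in base_fields.items() if f.required}
        return cls

class Model(metaclass=DeclarativeMeta):
    pass

class MyModel(Model):
    name = Field(required=True)
    nickname = Field(required=False)

print(MyModel.required_fields)  # {'name'}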

Django remove bulk-delete

This is a very simple question: is there any good way to disable bulk deletes (through querysets, of course) on all models in an entire Django project?
The reasoning for this is the premise that completely deleting data is almost always a poor choice, and an accidental bulk delete can be detrimental.
Like the comments say on your first post, you have to create a subclass for each of these elements:
The model manager
Queryset class
BaseModel
After some searching, a great example can be found here; all credit to Akshay Shah, the blog author. Before looking at the code, be aware of this caveat from the post:
However, it inevitably leads to data corruption. The problem is simple: using a Boolean to store deletion status makes it impossible to enforce uniqueness constraints in your database.
from django.db import models
from django.db.models.query import QuerySet

class SoftDeletionQuerySet(QuerySet):
    def delete(self):
        # Bulk delete bypasses individual objects' delete methods.
        return super(SoftDeletionQuerySet, self).update(alive=False)

    def hard_delete(self):
        return super(SoftDeletionQuerySet, self).delete()

    def alive(self):
        return self.filter(alive=True)

    def dead(self):
        return self.exclude(alive=True)

class SoftDeletionManager(models.Manager):
    def __init__(self, *args, **kwargs):
        self.alive_only = kwargs.pop('alive_only', True)
        super(SoftDeletionManager, self).__init__(*args, **kwargs)

    def get_queryset(self):
        if self.alive_only:
            return SoftDeletionQuerySet(self.model).filter(alive=True)
        return SoftDeletionQuerySet(self.model)

    def hard_delete(self):
        return self.get_queryset().hard_delete()

class SoftDeletionModel(models.Model):
    alive = models.BooleanField(default=True)

    objects = SoftDeletionManager()
    all_objects = SoftDeletionManager(alive_only=False)

    class Meta:
        abstract = True

    def delete(self):
        self.alive = False
        self.save()

    def hard_delete(self):
        super(SoftDeletionModel, self).delete()
Basically, it adds an alive field to track whether a row has been soft-deleted, and updates it when the delete() method is called.
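A quick usage sketch (MyModel here stands for any concrete model that subclasses SoftDeletionModel):

obj = MyModel.objects.create()
obj.delete()                   # soft delete: sets alive=False and saves
MyModel.objects.count()        # 0 - the default manager hides dead rows
MyModel.all_objects.count()    # 1 - the row is still in the table
MyModel.all_objects.first().hard_delete()  # actually removes the row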
Of course, this method only works on projects where you can manipulate the code base.
There are nice off-the-shelf applications that allow for restoring deleted models (if that is what you are interested in); here are the ones I have used:
django-softdelete: https://github.com/scoursen/django-softdelete (this is the one I used more)
django-reversion: https://github.com/etianen/django-reversion (this one is updated more often, and allows you to revert to any version of your model, not only after a delete but also after an update)
If you really want to forbid bulk deletes, I'd discourage you from this approach, as it will:
Break expectations about application behaviour. If I call MyModel.objects.all().delete(), I want the table to be empty afterwards.
Break existing applications.
If you want to do it anyway, please follow the advice from the comment:
I'm guessing this would involve subclassing QuerySet and changing the delete method to your liking, subclassing the default manager and have it use your custom query set, subclassing model - create an abstract model and have it use your custom manager and then finally have all your models subclass your custom abstract model.
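A minimal sketch of that chain (all names here are illustrative, and instance-level Model.delete() still works because it does not go through the queryset):

from django.db import models

class NoBulkDeleteQuerySet(models.QuerySet):
    def delete(self):
        # refuse queryset-level deletes
        raise NotImplementedError("Bulk delete is disabled on this model.")

class NoBulkDeleteModel(models.Model):
    objects = models.Manager.from_queryset(NoBulkDeleteQuerySet)()

    class Meta:
        abstract = True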

Problem with model inheritance and polymorphism

I came upon a new Django problem. The situation: I have a model class, UploadItem, which I subclass to create uploadable items, like videos, audio files, ...
class UploadItem(UserEntryModel):
    category = 'abstract item'
    file = models.FileField(upload_to=get_upload_directory)
i subclass it like this:
class Video(UploadItem):
    category = 'video'
I need to access the category attribute from a custom tag. The problem is that I am getting category = 'abstract item' even when the class is actually Video.
Any clue?
EDIT: I need to use a hierarchy because there are several types of items a user can upload (videos, audio files, PDF texts). I need to create a class for each type, but there is a lot in common between those classes (e.g. forms).
Yes. AFAIK it doesn't work the way you're hoping. Django models aren't trivially Python classes: they are built by a metaclass, which creates the actual class for you from your class definition. The expected model class exists, but it isn't quite what you think it is. That's why some static features of Python classes don't work as you'd expect in Django models.
You can't really make use of class-level attributes like this.
You might want to create an actual field with a default value instead.
class UploadItem(UserEntryModel):
    # max_length is an assumption here; CharField requires one
    category = models.CharField(max_length=50, default='abstract item')
    file = models.FileField(upload_to=get_upload_directory)
Even after the comments were added to the question, I'm still unclear on why this is being done. There do not seem to be any structural or behavioral differences; these all seem like a single class of objects, and the subclasses don't seem to define anything new.
Options:
Simply use the class name instead of this "category" item at the class level. Make the class names good enough that you don't need a separate "category".
Use a property:
class UploadItem(UserEntryModel):
    file = models.FileField(upload_to=get_upload_directory)

    @property
    def category(self):
        return self.__class__.__name__
You will need to create an additional field that acts as a descriptor for that type.
There is a good tutorial here explaining how to use inheritance in Django models.
Can you try overriding the __init__ method of the class to assign a category to each instance? For example:
class Video(UploadItem):
    def __init__(self, *args, **kwargs):
        super(Video, self).__init__(*args, **kwargs)
        self.category = 'video'

Django Manager Chaining

I was wondering whether it is possible (and, if so, how) to chain together multiple managers to produce a queryset that is affected by both of the individual managers. I'll explain the specific example I'm working on:
I have multiple abstract model classes that I use to provide small, specific pieces of functionality to other models. Two of these models are a DeleteMixin and a GlobalMixin.
The DeleteMixin is defined as such:
class DeleteMixin(models.Model):
    deleted = models.BooleanField(default=False)

    objects = DeleteManager()

    class Meta:
        abstract = True

    def delete(self):
        self.deleted = True
        self.save()
Basically it provides a pseudo-delete (the deleted flag) instead of actually deleting the object.
The GlobalMixin is defined as such:
class GlobalMixin(models.Model):
    is_global = models.BooleanField(default=True)

    objects = GlobalManager()

    class Meta:
        abstract = True
It allows any object to be defined as either a global object or a private object (such as a public/private blog post).
Both of these have their own managers that affect the queryset that is returned. My DeleteManager filters the queryset to return only results whose deleted flag is False, while the GlobalManager filters the queryset to return only results that are marked as global. Here are the declarations for both:
class DeleteManager(models.Manager):
    def get_query_set(self):
        return super(DeleteManager, self).get_query_set().filter(deleted=False)

class GlobalManager(models.Manager):
    def globals(self):
        return self.get_query_set().filter(is_global=1)
The desired functionality would be to have a model extend both of these abstract models and gain the ability to return only the results that are both non-deleted and global. I ran a test case on a model with four instances: one global and non-deleted, one global and deleted, one non-global and non-deleted, and one non-global and deleted.
If I try SomeModel.objects.all(), I get instances 1 and 3 (the two non-deleted ones - great!).
If I try SomeModel.objects.globals(), I get an error that DeleteManager doesn't have a globals method (this assumes my model declaration is SomeModel(DeleteMixin, GlobalMixin); if I reverse the order, I don't get the error, but it doesn't filter out the deleted ones).
If I change GlobalMixin to attach GlobalManager to globals instead of objects (so the new command would be SomeModel.globals.globals()), I get instances 1 and 2 (the two global ones), while my intended result would be to get only instance 1 (the global, non-deleted one).
I wasn't sure if anyone had run into any situation similar to this and had come to a result. Either a way to make it work in my current thinking or a re-work that provides the functionality I'm after would be very much appreciated. I know this post has been a little long-winded. If any more explanation is needed, I would be glad to provide it.
Edit:
I have posted the eventual solution I used to this specific problem below. It is based on the link to Simon's custom QuerySetManager.
See this snippet on Djangosnippets: http://djangosnippets.org/snippets/734/
Instead of putting your custom methods in a manager, you subclass the queryset itself. It's very easy and works perfectly. The only issue I've had is with model inheritance: you always have to define the manager in model subclasses (just objects = QuerySetManager() in the subclass), even though they will inherit the queryset. This will make more sense once you are using QuerySetManager.
Here is the specific solution to my problem using the custom QuerySetManager by Simon that Scott linked to.
from django.db import models
from django.contrib import admin
from django.db.models.query import QuerySet
from django.core.exceptions import FieldError

class MixinManager(models.Manager):
    def get_query_set(self):
        try:
            return self.model.MixinQuerySet(self.model).filter(deleted=False)
        except FieldError:
            return self.model.MixinQuerySet(self.model)

class BaseMixin(models.Model):
    admin = models.Manager()
    objects = MixinManager()

    class MixinQuerySet(QuerySet):
        def globals(self):
            try:
                return self.filter(is_global=True)
            except FieldError:
                return self.all()

    class Meta:
        abstract = True

class DeleteMixin(BaseMixin):
    deleted = models.BooleanField(default=False)

    class Meta:
        abstract = True

    def delete(self):
        self.deleted = True
        self.save()

class GlobalMixin(BaseMixin):
    is_global = models.BooleanField(default=True)

    class Meta:
        abstract = True
Any mixin that wants to add extra functionality to the queryset in the future simply needs to extend BaseMixin (or have it somewhere in its hierarchy). Any time I try to filter the queryset down, I wrap the filter in a try/except in case the field doesn't actually exist (i.e., the model doesn't extend that mixin). The global filter is invoked using globals(), while the delete filter is invoked automatically (if something is deleted, I never want it to show). Using this system allows for the following types of commands:
TemporaryModel.objects.all() # If extending DeleteMixin, no deleted instances are returned
TemporaryModel.objects.all().globals() # Filter out the private instances (non-global)
TemporaryModel.objects.filter(...) # Ditto about excluding deleteds
One thing to note is that the delete filter won't affect admin interfaces, because the default Manager is declared first (making it the default). I don't remember when they changed the admin to use Model._default_manager instead of Model.objects, but any deleted instances will still appear in the admin (in case you need to un-delete them).
I spent a while trying to come up with a way to build a nice factory to do this, but I ran into a lot of problems with that.
The best I can suggest is to chain your inheritance. It's not very generic, so I'm not sure how useful it is, but all you would have to do is:
class GlobalManager(DeleteManager):
    def globals(self):
        return self.get_query_set().filter(is_global=1)

class GlobalMixin(DeleteMixin):
    is_global = models.BooleanField(default=True)

    objects = GlobalManager()

    class Meta:
        abstract = True
If you want something more generic, the best I can come up with is to define a base Mixin and Manager that redefines get_query_set() (I'm assuming you only want to do this once; things get pretty complicated otherwise) and then pass a list of fields you'd want added via Mixins.
It would look something like this (not tested at all):
class DeleteMixin(models.Model):
    deleted = models.BooleanField(default=False)

    class Meta:
        abstract = True

def create_mixin(base_mixin, **kwargs):
    class wrapper(base_mixin):
        class Meta:
            abstract = True
    for k in kwargs.keys():
        # NB: plain setattr bypasses Django's contribute_to_class machinery,
        # so model fields attached this way may not be registered with _meta
        setattr(wrapper, k, kwargs[k])
    return wrapper

class DeleteManager(models.Manager):
    def get_query_set(self):
        return super(DeleteManager, self).get_query_set().filter(deleted=False)

def create_manager(base_manager, **kwargs):
    class wrapper(base_manager):
        pass
    for k in kwargs.keys():
        setattr(wrapper, k, kwargs[k])
    return wrapper
Ok, so this is ugly, but what does it get you? Essentially, it's the same solution, but much more dynamic, and a little more DRY, though more complex to read.
First you create your manager dynamically:
def globals(inst):
    return inst.get_query_set().filter(is_global=1)

GlobalDeleteManager = create_manager(DeleteManager, globals=globals)
This creates a new manager which is a subclass of DeleteManager and has a method called globals.
Next, you create your mixin model:
GlobalDeleteMixin = create_mixin(DeleteMixin,
                                 is_global=models.BooleanField(default=False),
                                 objects=GlobalDeleteManager())
Like I said, it's ugly. But it means you don't have to redefine globals(). If you want a different type of manager to have globals(), you just call create_manager again with a different base, and you can add as many new methods as you like. The same goes for the manager: you just keep adding new functions that return different querysets.
So, is this really practical? Maybe not. This answer is more an exercise in (ab)using Python's flexibility. I haven't tried using this, though I do use some of the underlying principles of dynamically extending classes to make things easier to access.
Let me know if anything is unclear and I'll update the answer.
Another option worth considering is the PassThroughManager:
https://django-model-utils.readthedocs.org/en/latest/managers.html#passthroughmanager
You should use a QuerySet instead of a Manager.
See the documentation here.
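To illustrate that suggestion with the modern QuerySet.as_manager() spelling (the field names follow the question; the model name is illustrative), the chaining the question asks for falls out naturally:

from django.db import models

class ItemQuerySet(models.QuerySet):
    def alive(self):
        return self.filter(deleted=False)

    def globals(self):
        return self.filter(is_global=True)

class Item(models.Model):
    deleted = models.BooleanField(default=False)
    is_global = models.BooleanField(default=True)

    objects = ItemQuerySet.as_manager()

# Methods defined on the QuerySet chain freely:
# Item.objects.alive().globals()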
