What is the best way to handle functional django model field defaults?

Sometimes a ForeignKey field needs a default. For example:
class ReleaseManager(BaseManager):
    def default(self):
        return self.filter(default=True).order_by('-modified').first()

class Release(BaseModel):
    default = models.BooleanField(default=False)
    ...

class Server(models.Model):
    ...
    release = models.ForeignKey(Release, null=True, default=Release.objects.default)
All is well and good with the above code until it comes time for a database migration, whereupon the functional default causes big problems because the default callable (a bound manager method) cannot be serialized. A manual migration can work around this, but on a large project where migrations are periodically squashed it leaves a time bomb for the unwary.
A common workaround is to move the default from the field to the save method of the model, but this causes confusion if the model is used by things like the REST framework or in form generation, where the default is expected on the field.

My current favourite workaround works with migrations, with the REST framework and with other form generation. It assumes the object manager supplies a default method and uses a specialized ForeignKey field to get at it:
class ForeignKeyWithObjectManagerDefault(models.ForeignKey):
    def __init__(self, to, **kwargs):
        super().__init__(to, **kwargs)
        self.to = to

    def get_default(self):
        return self.to.objects.default().pk

class Project(SOSAdminObject):
    primary = ForeignKeyWithObjectManagerDefault(Primary, related_name='projects')
    ...
Now migrations work as expected and we can use any functionality we like to supply a default object to a foreign key field.
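For what it's worth, a simpler alternative (a sketch, not part of the workaround above) is a module-level callable: Django's migration serializer can store a reference to a top-level function, so the serialization problem does not arise, at the cost of a free-standing helper.

def default_release_pk():
    # Module-level functions can be serialized by migrations,
    # unlike the bound manager method used above.
    release = Release.objects.default()
    return release.pk if release else None

class Server(models.Model):
    ...
    release = models.ForeignKey(Release, null=True, default=default_release_pk)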

Related

What's behind the mechanism of Django models and fields?

What is the Django/Python mechanism behind the model/field part of the framework?
To be exact, I am looking for a hint on how Django parses (?) the class definition and then knows which fields are required.
from django.db import models

class Car(models.Model):
    name = models.CharField(max_length=255, null=True, blank=True)
    year_of_production = models.DateField(null=True)
    # the rest of fields...
I think the same mechanism is behind the Django Forms framework and DRF serializers. I checked the repos of these projects but I still can't find a reasonable starting point.
There's an architectural problem underlying my question. I think I need to implement something similar to this mechanism:
class Field:
    def __init__(self, label: str, required: bool = True, **kwargs):
        self.label, self.required = label, required

class CharField(Field):
    def __init__(self, max_length: int, **kwargs):
        self.max_length = max_length
        super().__init__(**kwargs)

class DateField(Field):
    ...

class BooleanField(Field):
    ...

class Model:
    # the mechanisms I do not understand
    ...

class MyModelInstance(Model):
    name = CharField(...)
    # etc.
What I need is a really simple solution that knows whether a field is required. But as I stated before, I am not that advanced and I would really appreciate any hints.
Edit: I think I'm looking for something like the Django Forms mechanism, not models/fields.
Forms and Models follow the same basic idea, but forms are a little simpler, so let's take a tour there.
The DeclarativeFieldsMetaclass metaclass is used on Form.
It gathers up the fields at declaration time (with some MRO walking, but the basic idea is to check isinstance(x, Field)), removes them from the concrete class declaration and moves them into cls.base_fields (where cls is the class you're declaring).
When you instantiate this new Form of yours, this code over here deepcopies self.base_fields (which is on the class level, but that's beside the point) into self.fields, so you can safely modify self.fields within each form instance without affecting others across requests.
That's basically it, really.
Beyond that, if you wanted a thing that gathered required fields into a separate attribute, that'd just be something like
cls.required_fields = {name for name, field in cls.base_fields.items() if field.required}
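To make that concrete, here is a minimal sketch of the declarative mechanism being described (illustrative only: DeclarativeMeta is a made-up name, base_fields and required_fields simply mirror the names above, and real Django also walks the MRO so fields are inherited):

class Field:
    def __init__(self, label: str = '', required: bool = True, **kwargs):
        self.label, self.required = label, required

class CharField(Field):
    def __init__(self, max_length: int = 255, **kwargs):
        self.max_length = max_length
        super().__init__(**kwargs)

class DeclarativeMeta(type):
    def __new__(mcs, name, bases, attrs):
        # Pull Field instances out of the class body, as DeclarativeFieldsMetaclass does.
        fields = {key: attrs.pop(key)
                  for key, value in list(attrs.items())
                  if isinstance(value, Field)}
        cls = super().__new__(mcs, name, bases, attrs)
        cls.base_fields = fields
        cls.required_fields = {key for key, field in fields.items() if field.required}
        return cls

class Model(metaclass=DeclarativeMeta):
    pass

class MyModelInstance(Model):
    name = CharField(max_length=255)
    nickname = CharField(max_length=50, required=False)

# MyModelInstance.base_fields -> {'name': ..., 'nickname': ...}
# MyModelInstance.required_fields -> {'name'}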

django - dynamic manager

I have a model that has an owner field.
class MyModel(models.Model):
    owner = models.CharField(...)
I extended the Django User class and added an ownership field:
class AppUser(User):
    ownership = models.CharField(...)
I want to create a Manager for MyModel so it will retrieve only objects that correspond with ownership of the currently logged in user.
For example (using Django REST framework):
class MyModelAPI(APIView):
    def get(self, request, format=None):
        # This query will automatically add a filter of owner=request.user.ownership
        objs = MyModel.objects.all()
        # rest of code ...
All of the examples of managers use constant values in their queries, and I'm looking for something more dynamic. Is this even possible?
Thanks
This is not possible with a custom manager, because a model manager is instantiated at class-loading time. Hence it is stateless with regard to the HTTP request/response cycle and can only provide custom methods that you would have to pass the user to anyway. So why not just add a convenience method/property on your model (a manager seems unnecessary for this sole purpose):
class MyModel(models.Model):
    ...

    @classmethod
    def user_objects(cls, user):
        return cls.objects.filter(owner=user.ownership)
Then, in your view:
objs = MyModel.user_objects(request.user)
For a manager-based solution, look at this question. Another interesting solution is a custom middleware that makes the current user available via some function/module attribute, which can be accessed in a custom manager's get_queryset() method, as described here.
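Purely as an illustration of that last idea (CurrentUserMiddleware, get_current_user and OwnedManager are invented names, the middleware is written in the new-style form, and it assumes the logged-in user carries an ownership attribute), the thread-local approach might look like:

import threading

from django.db import models

_thread_locals = threading.local()

def get_current_user():
    return getattr(_thread_locals, 'user', None)

class CurrentUserMiddleware:
    """Stashes the request user in a thread-local so managers can read it."""
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        _thread_locals.user = getattr(request, 'user', None)
        return self.get_response(request)

class OwnedManager(models.Manager):
    def get_queryset(self):
        qs = super().get_queryset()
        user = get_current_user()
        if user is not None and getattr(user, 'ownership', None):
            qs = qs.filter(owner=user.ownership)
        return qs

# MyModel would then declare: objects = OwnedManager()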

Dynamically add properties to a django model

I have a Django model where a lot of fields are choices. So I had to write a lot of "is_something" properties on the class to check whether the instance value is equal to some choice value. Something along the lines of:
class MyModel(models.Model):
    some_choicefield = models.IntegerField(choices=SOME_CHOICES)

    @property
    def is_some_value(self):
        return self.some_choicefield == SOME_CHOICES.SOME_CHOICE_VALUE

    # a lot of these...
In order to automate this and spare me a lot of redundant code, I thought about patching the instance at creation, with a function that adds a bunch of methods that do the checks.
The code became as follows (I'm assuming there's a "normalize" function that makes the label of the choice a usable function name):
def dynamic_add_checks(instance, field):
    if hasattr(field, 'choices'):
        choices = getattr(field, 'choices')
        for (value, label) in choices:
            def fun(instance):
                return getattr(instance, field.name) == value
            normalized_func_name = "is_%s_%s" % (field.name, normalize(label))
            setattr(instance, normalized_func_name, fun(instance))

class MyModel(models.Model):
    def __init__(self, *args, **kwargs):
        super(MyModel, self).__init__(*args, **kwargs)
        dynamic_add_checks(self, self._meta.get_field('some_choicefield'))

    some_choicefield = models.IntegerField(choices=SOME_CHOICES)
Now, this works, but I have the feeling there is a better way to do it. Perhaps at class creation time (with metaclasses or in the __new__ method)? Do you have any thoughts/suggestions about that?
Well, I am not sure how to do this your way, but in such cases I think the way to go is to simply create a new model where you keep your choices, and change the field to a ForeignKey. This is simpler to code and manage.
You can find a lot of information at a basic level in the Django docs: Models: Relationships. In there, there are many links to follow expanding on various topics. Beyond that, I believe it just needs a bit of imagination, and maybe trial and error in the beginning.
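For illustration only (the model and field names here are invented), that suggestion amounts to something like:

from django.db import models

class Category(models.Model):
    # One row per former choice value.
    name = models.CharField(max_length=50, unique=True)

class MyModel(models.Model):
    # Replaces: some_choicefield = models.IntegerField(choices=SOME_CHOICES)
    category = models.ForeignKey(Category, on_delete=models.PROTECT)

# The "is it value X?" checks become ordinary comparisons:
# obj.category.name == 'some_choice'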
I came across a similar problem where I needed to write a large number of properties at runtime to provide backward compatibility while changing model fields. There are two standard ways to handle this:
First is to use a custom metaclass in your models, which inherits from the models' default metaclass.
Second is to use class decorators. Class decorators sometimes provide an easy alternative to metaclasses, unless you have to do something before the creation of the class, in which case you have to go with metaclasses.
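As a rough sketch of the decorator route (add_choice_checks is a made-up name, and normalize is the label-to-identifier helper the question assumes):

def add_choice_checks(*field_names):
    """Class decorator adding an is_<field>_<label> property per choice."""
    def decorator(cls):
        for field_name in field_names:
            field = cls._meta.get_field(field_name)
            for value, label in field.choices:
                # Bind value/field_name via default arguments to avoid the
                # usual late-binding closure pitfall.
                prop = property(
                    lambda self, f=field_name, v=value: getattr(self, f) == v
                )
                setattr(cls, "is_%s_%s" % (field_name, normalize(label)), prop)
        return cls
    return decorator

@add_choice_checks('some_choicefield')
class MyModel(models.Model):
    some_choicefield = models.IntegerField(choices=SOME_CHOICES)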
I bet you know Django fields with choices provided will automatically have a display function.
Say you have a field defined like this:
category = models.SmallIntegerField(choices=CHOICES)
You can simply call a function called get_category_display() to access the display value. Here is the Django source code of this feature:
https://github.com/django/django/blob/baff4dd37dabfef1ff939513fa45124382b57bf8/django/db/models/base.py#L962
https://github.com/django/django/blob/baff4dd37dabfef1ff939513fa45124382b57bf8/django/db/models/fields/__init__.py#L704
So we can follow this approach to achieve our dynamically set property goal.
Here is my scenario, a little bit different from yours but the same in the end:
I have two classes, Course and Lesson; Lesson has a ForeignKey field to Course, and I want to add a property named cached_course to Lesson which will try to get the Course from cache first, and fall back to the database on a cache miss:
Here is a typical solution:
from django.db import models
from django.core.cache import cache  # or whatever cache helper you use

class Course(models.Model):
    # some fields
    ...

class Lesson(models.Model):
    course = models.ForeignKey(Course)

    @property
    def cached_course(self):
        key = key_func()
        course = cache.get(key)
        if not course:
            course = get_model_from_db()
            cache.set(key, course)
        return course
Turns out I have so many ForeignKey fields to cache, so here is the code following the similar approach of Django get_FIELD_display feature:
from django.db import models
# django.utils.functional.curry was removed in newer Django; functools.partial is the equivalent.
from django.utils.functional import curry

class CachedForeignKeyField(models.ForeignKey):
    def contribute_to_class(self, cls, name, **kwargs):
        super(CachedForeignKeyField, self).contribute_to_class(cls, name, **kwargs)
        setattr(cls, "cached_%s" % self.name,
                property(curry(cls._cached_FIELD, field=self)))

class BaseModel(models.Model):
    def _cached_FIELD(self, field):
        value = getattr(self, field.attname)
        Model = field.related_model
        # cache is assumed to be a helper that can fetch a model instance by pk
        return cache.get_model(Model, pk=value)

    class Meta:
        abstract = True

class Course(BaseModel):
    # some fields
    ...

class Lesson(BaseModel):
    course = CachedForeignKeyField(Course)
By customizing CachedForeignKeyField and overriding the contribute_to_class method, along with a BaseModel class that provides a _cached_FIELD method, every CachedForeignKeyField automatically gets a cached_FIELD property accordingly.
Too good to be true, bravo!

Django immutable model query

I'm looking at a state-isolation / read-only situation in Django (1.6), and I'm looking for a method to make a query return objects that are immutable.
I'm looking to fit something like the following, wrapping the usual db atomicity api:
MyModel.objects.filter(foo="bar").all(read_only=True)
My current thinking is this will be a custom Manager, but I'd potentially like something that can be added at runtime, like:
read_only(MyModel.objects.filter(foo="bar").all())
Without too much voodoo or making them unmanaged (the option to throw an Exception on state change would be good).
The key thing is that the model supports both read-only and the default read-write query types, ideally with changes limited to the code that is required to be read-only.
My other option is something like:
with isolation(raise_exception=True):
    m = MyModel.objects.get(id=foo)
    m.do_unknown_thing_that_may_mutate()
Are there existing solutions I'm missing at a higher level than the database?
One possibility might be to define a proxy class which overrides save to be a no-op:
class MyReadOnlyModel(MyModel):
    def save(self, *args, **kwargs):
        pass

    class Meta:
        proxy = True
Then just query MyReadOnlyModel instead of MyModel.
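If silently ignoring writes is too quiet, a variant of the same proxy idea (a sketch, not part of the original answer) can raise on mutation instead, which matches the "throw an Exception on state change" wish:

class MyReadOnlyModel(MyModel):
    class Meta:
        proxy = True

    def save(self, *args, **kwargs):
        raise RuntimeError("read-only instance: save() is not allowed")

    def delete(self, *args, **kwargs):
        raise RuntimeError("read-only instance: delete() is not allowed")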

Django remove bulk-delete

This is a very simple question: Is there any good way to disable calling a bulk-delete (through querysets of course) on all models in an entire Django project?
The reasoning for this is under the premise that completely deleting data is almost always a poor choice, and an accidental bulk-delete can be detrimental.
As the comments on your first post say, you have to create a subclass for each of these elements:
The model manager
Queryset class
BaseModel
After some searching, a great example can be found here; all credit to Akshay Shah, the blog author. Before looking at the code, be aware of this caveat:
However, it inevitably leads to data corruption. The problem is simple: using a Boolean to store deletion status makes it impossible to enforce uniqueness constraints in your database.
from django.db import models
from django.db.models.query import QuerySet

class SoftDeletionQuerySet(QuerySet):
    def delete(self):
        # Bulk delete bypasses individual objects' delete methods.
        return super(SoftDeletionQuerySet, self).update(alive=False)

    def hard_delete(self):
        return super(SoftDeletionQuerySet, self).delete()

    def alive(self):
        return self.filter(alive=True)

    def dead(self):
        return self.exclude(alive=True)

class SoftDeletionManager(models.Manager):
    def __init__(self, *args, **kwargs):
        self.alive_only = kwargs.pop('alive_only', True)
        super(SoftDeletionManager, self).__init__(*args, **kwargs)

    def get_queryset(self):
        if self.alive_only:
            return SoftDeletionQuerySet(self.model).filter(alive=True)
        return SoftDeletionQuerySet(self.model)

    def hard_delete(self):
        return self.get_queryset().hard_delete()

class SoftDeletionModel(models.Model):
    alive = models.BooleanField(default=True)

    objects = SoftDeletionManager()
    all_objects = SoftDeletionManager(alive_only=False)

    class Meta:
        abstract = True

    def delete(self):
        self.alive = False
        self.save()

    def hard_delete(self):
        super(SoftDeletionModel, self).delete()
Basically, it adds an alive field to check whether the row has been deleted or not, and updates it when the delete() method is called.
Of course, this method only works on projects where you can manipulate the code base.
There are nice off-the-shelf applications that allow restoring deleted models (if that is what you are interested in); here are the ones I have used:
Django softdelete: https://github.com/scoursen/django-softdelete (the one I used more)
Django reversion: https://github.com/etianen/django-reversion (updated more often, and allows you to revert to any version of your model, not only after a delete but also after an update)
If you really want to forbid bulk deletes, I'd discourage you from this approach, as it will:
Break expectations about application behaviour. If I call MyModel.objects.all().delete() I want the table to be empty afterwards.
Break existing applications.
If you want to do it, please follow the advice from the comment:
I'm guessing this would involve subclassing QuerySet and changing the delete method to your liking, subclassing the default manager and have it use your custom query set, subclassing model - create an abstract model and have it use your custom manager and then finally have all your models subclass your custom abstract model.
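A bare-bones sketch of that comment (class names here are illustrative), for anyone who decides to go ahead anyway:

from django.db import models
from django.db.models.query import QuerySet

class NoBulkDeleteQuerySet(QuerySet):
    def delete(self):
        raise NotImplementedError("Bulk delete is disabled; delete objects one by one.")

class NoBulkDeleteManager(models.Manager):
    def get_queryset(self):
        return NoBulkDeleteQuerySet(self.model, using=self._db)

class NoBulkDeleteModel(models.Model):
    objects = NoBulkDeleteManager()

    class Meta:
        abstract = True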
