Django immutable model query - python

I'm looking at a state isolation / read-only situation in Django (1.6), and I'm looking for a way to make a query return objects that are immutable.
I'd like to fit something like the following around the usual db atomicity API:
MyModel.objects.filter(foo="bar").all(read_only=True)
My current thinking is that this will be a custom Manager, but I'd potentially like something that can be added at runtime, like:
read_only(MyModel.objects.filter(foo="bar").all())
Without too much voodoo or making them unmanaged (the option to throw an Exception on state change would be good).
The key thing is that the Model supports both read-only and the default read-write query type ideally with changes limited to code that is required to be read-only.
My other option is something like:
with isolation(raise_exception=True):
    m = MyModel.objects.get(id=foo)
    m.do_unknown_thing_that_may_mutate()
Are there existing solutions I'm missing at a higher level than the database?

One possibility might be to define a proxy class which overrides save to be a no-op:
class MyReadOnlyModel(MyModel):
    def save(self, *args, **kwargs):
        pass

    class Meta:
        proxy = True
Then just query MyReadOnlyModel instead of MyModel.
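If silently swallowing writes is too quiet, the same idea works as a runtime wrapper that raises instead, closer to the read_only(...) call in the question. This is a framework-independent sketch (ReadOnlyError and the wrapper are illustrative, not a Django API), and it only guards the two instance methods, not bulk queryset updates:

```python
class ReadOnlyError(Exception):
    """Raised when a write is attempted on a read-only instance."""


def read_only(instances):
    """Patch save()/delete() on each instance to raise instead of write.

    Works on any objects exposing save/delete methods; the patched
    attributes shadow the class methods on each instance.
    """
    def _guard(*args, **kwargs):
        raise ReadOnlyError("instance is read-only")

    for obj in instances:
        obj.save = _guard
        obj.delete = _guard
    return instances
```

Usage would then look like `for m in read_only(MyModel.objects.filter(foo="bar")): ...`, with any later `m.save()` raising ReadOnlyError.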


What is the best way to handle functional django model field defaults?

Sometimes a ForeignKey field needs a default. For example:
class ReleaseManager(BaseManager):
    def default(self):
        return self.filter(default=True).order_by('-modified').first()

class Release(BaseModel):
    default = models.BooleanField(default=False)
    ...

class Server(models.Model):
    ...
    release = models.ForeignKey(Release, null=True, default=Release.objects.default)
All is well and good with the above code until the time comes for db migration, whereupon the functional default causes big problems because the default function cannot be serialized. Manual migration can work around this, but on a large project where migrations are periodically squashed, it leaves a time bomb for the unwary.
A common workaround is to move the default from the field to the save method of the model, but this causes confusion if the model is used by things like the REST framework, or in creating forms, where the default is expected on the field.
My current favourite workaround works with migrations, with the REST framework, and with other form generation. It assumes the object manager supplies a default method, and uses a specialized ForeignKey field to get at it:
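The root cause is that the migration writer serializes a callable as an importable dotted path, which a module-level function has and a bound manager method does not. A plain-Python illustration of the difference (no Django required; default_release_pk is a hypothetical stand-in for a real default):

```python
import importlib


def default_release_pk():
    # Stand-in for a module-level default; this can be serialized as
    # "<module>.default_release_pk" and re-imported later.
    return 1


class ReleaseManager:
    def default(self):
        return 1


manager = ReleaseManager()

# A module-level function round-trips through its import path:
fn = default_release_pk
module = importlib.import_module(fn.__module__)
assert getattr(module, fn.__qualname__) is fn

# The bound method manager.default has a qualname ("ReleaseManager.default")
# but no import path that recovers the *instance* it is bound to, which
# is why the migration writer refuses to serialize it.
```

So one simple fix is to pass a plain module-level function as the field's default, rather than a manager method.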
class ForeignKeyWithObjectManagerDefault(models.ForeignKey):
    def __init__(self, to, **kwargs):
        super().__init__(to, **kwargs)
        self.to = to

    def get_default(self):
        return self.to.objects.default().pk

class Project(SOSAdminObject):
    primary = ForeignKeyWithObjectManagerDefault(Primary, related_name='projects')
    ...
Now migrations work as expected and we can use any functionality we like to supply a default object to a foreign key field.

Django remove bulk-delete

This is a very simple question: Is there any good way to disable calling a bulk-delete (through querysets of course) on all models in an entire Django project?
The reasoning for this is under the premise that completely deleting data is almost always a poor choice, and an accidental bulk-delete can be detrimental.
As the comments on your post say, you have to create a subclass for each of these:
The model manager
The QuerySet class
The base model
After some searching, a great example can be found here; all credit to Akshay Shah, the blog author. Before looking at the code, be aware of this caveat:
However, it inevitably leads to data corruption. The problem is simple: using a Boolean to store deletion status makes it impossible to enforce uniqueness constraints in your database.
from django.db import models
from django.db.models.query import QuerySet

class SoftDeletionQuerySet(QuerySet):
    def delete(self):
        # Bulk delete bypasses individual objects' delete methods.
        return super(SoftDeletionQuerySet, self).update(alive=False)

    def hard_delete(self):
        return super(SoftDeletionQuerySet, self).delete()

    def alive(self):
        return self.filter(alive=True)

    def dead(self):
        return self.exclude(alive=True)

class SoftDeletionManager(models.Manager):
    def __init__(self, *args, **kwargs):
        self.alive_only = kwargs.pop('alive_only', True)
        super(SoftDeletionManager, self).__init__(*args, **kwargs)

    def get_queryset(self):
        if self.alive_only:
            return SoftDeletionQuerySet(self.model).filter(alive=True)
        return SoftDeletionQuerySet(self.model)

    def hard_delete(self):
        return self.get_queryset().hard_delete()

class SoftDeletionModel(models.Model):
    alive = models.BooleanField(default=True)

    objects = SoftDeletionManager()
    all_objects = SoftDeletionManager(alive_only=False)

    class Meta:
        abstract = True

    def delete(self):
        self.alive = False
        self.save()

    def hard_delete(self):
        super(SoftDeletionModel, self).delete()
Basically, it adds an alive field to record whether the row has been deleted, and updates it when the delete() method is called.
Of course, this method only works on projects where you can modify the code base.
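The uniqueness caveat quoted above is concrete: with a Boolean deletion flag, the "dead" row still occupies any unique index. A small sqlite3 sketch (table and column names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT UNIQUE, alive INTEGER)")
conn.execute("INSERT INTO users VALUES ('a@example.com', 1)")

# "Soft delete" the row: the application no longer sees it...
conn.execute("UPDATE users SET alive = 0 WHERE email = 'a@example.com'")

# ...but it still blocks re-registration with the same email.
try:
    conn.execute("INSERT INTO users VALUES ('a@example.com', 1)")
    blocked = False
except sqlite3.IntegrityError:
    blocked = True
```

Working around this usually means a composite or partial unique index that includes the deletion state, which not every database supports equally well.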
There are nice off-the-shelf applications that allow restoring deleted models (if that is what you are interested in); here are ones I have used:
Django softdelete: https://github.com/scoursen/django-softdelete (I used this one more)
Django reversion: https://github.com/etianen/django-reversion (this one is updated more often, and allows you to revert to any version of your model, not only after a delete but also after an update)
If you really want to forbid bulk deletes, I'd discourage you from this approach as it will:
Break expectations about application behaviour. If I call MyModel.objects.all().delete(), I expect the table to be empty afterwards.
Break existing applications.
If you want to do it anyway, please follow the advice from the comments:
I'm guessing this would involve subclassing QuerySet and changing the delete method to your liking, subclassing the default manager and have it use your custom query set, subclassing model - create an abstract model and have it use your custom manager and then finally have all your models subclass your custom abstract model.
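The chain that comment describes can be sketched as follows. This is a framework-free sketch: the stand-in classes replace Django's real QuerySet/Manager/Model so the shape is visible without a database; in a real project you would subclass the Django classes instead, and the exception name is illustrative:

```python
class BulkDeleteForbidden(Exception):
    """Raised when a bulk queryset delete is attempted."""


class NoDeleteQuerySet:
    """Stand-in for a django QuerySet subclass overriding delete()."""

    def delete(self):
        raise BulkDeleteForbidden(
            "bulk delete is disabled for this project; "
            "delete instances individually if you really mean it"
        )


class NoDeleteManager:
    """Stand-in for a Manager whose get_queryset returns the subclass."""

    def get_queryset(self):
        return NoDeleteQuerySet()


class BaseModel:
    """Abstract base that every model in the project would inherit."""

    objects = NoDeleteManager()


class MyModel(BaseModel):
    pass
```

With the real Django classes, `MyModel.objects.all().delete()` would then raise instead of emptying the table, while single-instance `obj.delete()` still works.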

Google App Engine base and subclass gets

I want to have a base class called MBUser that has some predefined properties, ones that I don't want to be changed. If the client wants to add properties to MBUser, it is advised that MBUser be subclassed, and any additional properties be put in there.
The API code won't know if the client actually subclasses MBUser or not, but it shouldn't matter. The thinking went that we could just get MBUser by id. So I expected this to work:
def test_CreateNSUser_FetchMBUser(self):
    from nsuser import NSUser
    id = create_unique_id()
    user = NSUser(id=id)
    user.put()
    # changing MBUser.get_by_id to NSUser.get_by_id makes this test succeed
    get_user = MBUser.get_by_id(id)
    self.assertIsNotNone(get_user)
Here NSUser is a subclass of MBUser. The test fails.
Why can't I do this?
What's a work around?
Models are defined by their "kind", and a subclass is a different kind, even if it seems the same.
The point of subclassing is not to share values, but to share the "schema" you've created for a given "kind".
A kind map is created on base class ndb.Model (it seems like you're using ndb since you mentioned get_by_id) and each kind is looked up when you do queries like this.
For subclasses, the kind is just defined as the class name:
@classmethod
def _get_kind(cls):
    return cls.__name__
I just discovered GAE has a solution for this. It's called the PolyModel:
https://developers.google.com/appengine/docs/python/ndb/polymodelclass
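The kind-map behaviour can be seen in a small mimic of the datastore (pure Python; the _store dict and method names imitate ndb's behaviour rather than call it):

```python
class Model:
    _store = {}  # (kind, id) -> instance, imitating the datastore

    def __init__(self, id):
        self.id = id

    @classmethod
    def _get_kind(cls):
        return cls.__name__

    def put(self):
        # Entities are saved under the *subclass* kind...
        Model._store[(self._get_kind(), self.id)] = self

    @classmethod
    def get_by_id(cls, id):
        # ...and fetched under the kind of the class you call this on.
        return Model._store.get((cls._get_kind(), id))


class MBUser(Model):
    pass


class NSUser(MBUser):
    pass
```

Here `NSUser(1).put()` stores under kind "NSUser", so `MBUser.get_by_id(1)` finds nothing while `NSUser.get_by_id(1)` succeeds. PolyModel avoids this by storing the whole class hierarchy under the root kind, so queries on the base class also see subclass entities.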

In django, how to delete all related objects when deleting a certain type of instances?

I first tried to override the delete() method, but that doesn't work for QuerySet's bulk delete method. It should be related to the pre_delete signal, but I can't figure it out. My code is as follows:
def _pre_delete_problem(sender, instance, **kwargs):
    instance.context.delete()
    instance.stat.delete()
But this method seems to be called infinitely and the program runs into a dead loop.
Can someone please help me?
If the class has foreign keys (or related objects), they are deleted by default, like a DELETE CASCADE in SQL.
You can change the behavior using the on_delete argument when defining the ForeignKey in the class, but by default it is CASCADE.
You can check the docs here.
Now, the pre_delete signal does fire on a bulk delete, but the bulk delete does not call each object's delete() method, since it does not work on an object-by-object basis.
In your case, using the post_delete signal instead of pre_delete should fix the infinite loop. Because a ForeignKey's on_delete defaults to CASCADE, deleting instance.context in a pre_delete handler cascades back and deletes instance, which fires the handler again, which deletes instance.context, and so on.
Using this approach:
from django.db.models.signals import post_delete

def _post_delete_problem(sender, instance, **kwargs):
    instance.context.delete()
    instance.stat.delete()

post_delete.connect(_post_delete_problem, sender=Foo)
Can do the cleanup you want.
If you'd like a quick one-off to delete an instance and all of its related objects and those related objects' objects and so on without having to change the DB schema, you can do this -
def recursive_delete(to_del):
    """Recursively delete an object, all of its protected related
    instances, those instances' protected instances, and so on.
    """
    from django.db.models import ProtectedError
    while True:
        try:
            to_del_pk = to_del.pk
            if to_del_pk is None:
                return  # unsaved object
            to_del.delete()
            print(f"Deleted {to_del.__class__.__name__} with pk {to_del_pk}: {to_del}")
        except ProtectedError as e:
            for protected_ob in e.protected_objects:
                recursive_delete(protected_ob)
Be careful, though!
I'd only use this to help with debugging in one-off scripts (or on the shell) with test databases that I don't mind wiping. Relationships aren't always obvious and if something is protected, it's probably for a reason.

Django Admin interface with pickled set

I have a model that has a pickled set of strings. (It has to be pickled, because Django has no built-in set field, right?)
class Foo(models.Model):
    __bar = models.TextField(default=lambda: cPickle.dumps(set()), primary_key=True)

    def get_bar(self):
        return cPickle.loads(str(self.__bar))

    def set_bar(self, values):
        self.__bar = cPickle.dumps(values)

    bar = property(get_bar, set_bar)
I would like the set to be editable in the admin interface. Obviously the user won't be working with the pickled string directly. Also, the interface would need a widget for adding/removing strings from a set.
What is the best way to go about doing this? I'm not super familiar with Django's admin system. Do I need to build a custom admin widget or something?
Update: If I do need a custom widget, this looks helpful: http://www.fictitiousnonsense.com/archives/22
Update 2: Now I'm looking through different relational models to see if that will work. One idea I'm toying with:
class FooMember(models.Model):
    name = models.CharField(max_length=120)
    foo = models.ForeignKey('Foo')

class Foo(models.Model):
    def get_names(self):
        return FooMember.objects.filter(foo__exact=self)
Disadvantages of this include:
It feels excessive to make an entire model for one data field (name).
I would like the admin interface for Foo to allow the user to enter a list of strings. I'm not sure how to do that with this setup; making a custom form widget seems like less work.
Uhm. Django usually stores its data in an SQL database. Storing a set as a pickled string is definitely not the best way to use an SQL database. It's not immediately obvious what the right solution is in your case (that depends on what is in the set), but this is the wrong solution in any case.
You might want a new table for that set, or at least to save it as comma-separated values or something.
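If you take the comma-separated route as a stopgap, the round trip is small enough to sketch (plain Python; it assumes no member contains a comma, and that caveat is exactly why a separate table is safer):

```python
def dump_members(values):
    """Serialize a set of strings for a text column, deterministically.

    Sorting makes the stored string stable, so equal sets always
    serialize identically. Assumes members contain no commas.
    """
    return ",".join(sorted(values))


def load_members(text):
    """Inverse of dump_members; an empty column means an empty set."""
    return set(text.split(",")) if text else set()
```

Unlike a pickle, the stored value stays human-readable and queryable with LIKE, though real membership queries still want the relational FooMember approach from the question.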
