Why can't I use other model objects in my custom Manager?

I want to implement a custom django.db.models.manager.Manager (let's call it MyManager) for MyModel.
The methods in MyManager need to invoke filter methods on AnotherModel.
Is this possible? I'm getting an ImportError when I try.

In your MyModel, you need to add MyManager as an explicit manager.
class MyModel(models.Model):
    objects = MyManager()
You can keep the standard manager alongside your own by adding yours under another name.
class MyModel(models.Model):
    myobjects = MyManager()
If you are using the django-admin, there are nuances in which manager's objects get picked up. You can find those and many other details in the excellent Django documentation.

Related

Is there a way to declare a mock model in Django unit tests?

Title says it all. I'll illustrate the question by showing what I'm trying to do.
I have extended Django's ModelForm class to create a ResourceForm, which has some functionality built into its clean() method for working with Resources, the details of which are unimportant. The ResourceForm is basically a library class, and there are no models in the app where the ResourceForm class is defined, so I can't just use an existing model from the app (e.g., mock.Mock(spec=SomeModel) is not an option).
I am trying to unit test ResourceForm, but I can't figure out the right way to mock a Django Model, which is required since ResourceForm inherits from ModelForm. This is one of several efforts I have tried (not using mock in this case, but it serves to illustrate what is being attempted):
class ResourceFormTestCase(TestCase):
    class SampleModel(Model):
        sample_field = CharField()

    class SampleResourceForm(ResourceForm):
        class Meta:
            model = SampleModel
            fields = ['sample_field']

    def test_unsupported_field_raise_validation_error(self):
        print('Test validation error is raised when unsupported field is provided')
        form_data = {'sample_field': 'FooBar', 'unsupported_field': 'Baz'}
        form = self.SampleResourceForm(data=form_data)
But that raises:
RuntimeError: Model class customer.tests.tests_lib_restless_ext.SampleModel doesn't declare an explicit app_label and isn't in an application in INSTALLED_APPS.
I'm open to suggestions if I'm way off-base in how I'm trying to test this.
The simplest thing that might work is to use the user model that comes with Django.
If that's not acceptable, I have successfully patched enough of the Django plumbing to make it shut up and run unit tests without a database connection. Look in the django_mock_queries project to see if any of that is helpful.

What is the modern way to filter QuerySet inside custom QuerySet class in Django?

I have a custom QuerySet class which is later used to create an objects manager in the actual model, somewhat like this:
class FooQuerySet(models.QuerySet):
    ...  # some stuff in here

class Foo(models.Model):
    objects = FooQuerySet.as_manager()
Now, I'd like to apply some filter like filter(active=True) to all QuerySet results from FooQuerySet without creating a Manager class; however, this isn't described in the Django docs and I haven't been able to find what I need elsewhere on the Internet.

How to add a Model Manager to a Django-Model without deleting the objects that already exist?

The question is pretty self-explanatory, I believe. I want to add 2 new managers to a Django model. However, if I add these two managers, the objects I currently have in my database are deleted. Is there any way around this? Or do I need to simply recreate all the objects?
Don't worry about it. When you add a new manager, it's a class that extends models.Manager, so the new class already has all of the default manager's methods.
Remember, you can create a custom manager like this:
class MyManager(models.Manager):
    ...
And then add it to your model class:
class MyModel(models.Model):
    ...
    objects = MyManager()
As you can see, MyManager extends models.Manager. You can see the docs here.

Override .objects in Model Django outside models.py

I want to override classes in models.py to get call stacks at runtime.
I know that we can do the following in Django to override the manager and hence customize the QuerySet API -
So, in models.py
class A(models.Model):
    objects = SomeClass()
and in SomeClass
class B(Manager):
    def get_query_set(self):
        # override the way you want
        ...
But, to make things simpler, I am thinking of using a decorator to do the same override. So,
in models.py
@decoratorForOverriding
class A(models.Model):
    pass
in decorator.py
def decoratorForOverriding(cls):
    cls.objects = SomeClass()
The error I get is
AttributeError: 'NoneType' object has no attribute '_meta'
Any idea what is going on?
Should I make class A an abstract class? That did not do the trick either.
You mentioned that you were aware of how to override the Manager of a Model. I am having trouble imagining what the benefit of overriding the manager in a decorator vs. as a property would be. As bruno desthuilliers mentions in his comment, Django does a bunch of stuff with the objects property at instantiation, so the decorator probably will not work.
Do this the way Django core intended: https://docs.djangoproject.com/en/1.8/topics/db/managers/
Doing otherwise is going to add technical debt for no real benefit (at least as far as I can tell). If there is something that cannot be achieved through a Manager, let's tackle that problem.

Django signals vs. overriding save method

I'm having trouble wrapping my head around this. Right now I have some models that look kind of like this:
class Review(models.Model):
    # ...fields...
    overall_score = models.FloatField(blank=True)

class Score(models.Model):
    review = models.ForeignKey(Review)
    question = models.TextField()
    grade = models.IntegerField()
A Review has several "scores"; the overall_score is the average of those scores. When a review or a score is saved, I need to recalculate the overall_score average. Right now I'm using an overridden save method. Would there be any benefits to using Django's signal dispatcher?
Save/delete signals are generally favourable in situations where you need to make changes which aren't completely specific to the model in question, or could be applied to models which have something in common, or could be configured for use across models.
One common task in overridden save methods is automated generation of slugs from some text field in a model. That's an example of something which, if you needed to implement it for a number of models, would benefit from using a pre_save signal, where the signal handler could take the name of the slug field and the name of the field to generate the slug from. Once you have something like that in place, any enhanced functionality you put in place will also apply to all models - e.g. looking up the slug you're about to add for the type of model in question, to ensure uniqueness.
Reusable applications often benefit from the use of signals - if the functionality they provide can be applied to any model, they generally (unless it's unavoidable) won't want users to have to directly modify their models in order to benefit from it.
With django-mptt, for example, I used the pre_save signal to manage a set of fields which describe a tree structure for the model which is about to be created or updated and the pre_delete signal to remove tree structure details for the object being deleted and its entire sub-tree of objects before it and they are deleted. Due to the use of signals, users don't have to add or modify save or delete methods on their models to have this management done for them, they just have to let django-mptt know which models they want it to manage.
You asked:
Would there be any benefits to using Django's signal dispatcher?
I found this in the django docs:
Overridden model methods are not called on bulk operations
Note that the delete() method for an object is not necessarily called
when deleting objects in bulk using a QuerySet or as a result of a
cascading delete. To ensure customized delete logic gets executed, you
can use pre_delete and/or post_delete signals.
Unfortunately, there isn’t a workaround when creating or updating
objects in bulk, since none of save(), pre_save, and post_save are
called.
From: Overriding predefined model methods
Small addition from Django docs about bulk delete (.delete() method on QuerySet objects):
Keep in mind that this will, whenever possible, be executed purely in
SQL, and so the delete() methods of individual object instances will
not necessarily be called during the process. If you’ve provided a
custom delete() method on a model class and want to ensure that it is
called, you will need to “manually” delete instances of that model
(e.g., by iterating over a QuerySet and calling delete() on each
object individually) rather than using the bulk delete() method of a
QuerySet.
https://docs.djangoproject.com/en/1.11/topics/db/queries/#deleting-objects
And bulk update (.update() method on QuerySet objects):
Finally, realize that update() does an update at the SQL level and,
thus, does not call any save() methods on your models, nor does it
emit the pre_save or post_save signals (which are a consequence of
calling Model.save()). If you want to update a bunch of records for a
model that has a custom save() method, loop over them and call save()
https://docs.djangoproject.com/en/2.1/ref/models/querysets/#update
If you use signals, you'll be able to update the Review score each time a related Score model is saved. But if you don't need such functionality, I don't see any reason to put this into a signal; that's pretty model-related stuff.
It is a sort of denormalisation. Look at this pretty solution: an in-place composite field definition.
