I have a table to store view events; that is, if a user views an entity, a record is stored in that table. This table is represented by a model that has a generic relation, meaning it can be related to any other model.
I have defined a mixin ViewTracked that should be extended by any model that can be tracked (i.e. class SomeModel(ViewTracked)).
I want to add a custom method, say custom_method, to the queryset of the objects manager. I know that I can define a custom Manager and override the objects manager with it easily, but the problem is that a tracked model may already have a custom manager with its own custom queryset, so I can't simply override it and lose the custom queryset it already has.
Unfortunately, I couldn't find a proper way of doing this, so I tried to add a metaclass to override the manager's get_queryset and add my custom method to it, but for some reason, when I call SomeModel.objects it always returns None.
Here's what I tried:
import types

from django.db.models import Manager, Model
from django.db.models.base import ModelBase


# Meta class
class ViewTrackedMeta(ModelBase):

    def __new__(mcs, class_name, base_classes, attributes_dict):
        # let ModelBase do its magic
        new_class = super().__new__(mcs, class_name, base_classes, attributes_dict)

        if hasattr(new_class, 'objects'):
            objects_manager = new_class.objects
            if isinstance(objects_manager, Manager):
                queryset = objects_manager.get_queryset()

                def custom_method(queryset):
                    return queryset.filter(...)

                def get_extended_queryset(manager):
                    queryset.custom_method = types.MethodType(custom_method, queryset)

                objects_manager.get_queryset = types.MethodType(get_extended_queryset, objects_manager)

        return new_class
# Mixin
class ViewTracked(Model, metaclass=ViewTrackedMeta):
    class Meta:
        abstract = True
    ...


# Models
class SomeModel(ViewTracked):
    objects = CustomManager()


class SomeOtherModel(ViewTracked):
    ...  # default django objects manager


class YetAnotherModel(ViewTracked):
    objects = OtherCustomManager()
Is there any other way I can achieve what I want? Why is SomeModel.objects always returning None?
Rather than instantiating your manager classes directly, you should be using from_queryset. Here are the docs.
class CustomQuerySet(models.QuerySet):
    def manager_and_queryset_method(self):
        return


class MyModel(models.Model):
    objects = models.Manager.from_queryset(CustomQuerySet)()
Now you can do:
MyModel.objects.manager_and_queryset_method()
as well as
MyModel.objects.filter(something="else").manager_and_queryset_method()
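If the tracked model already has a custom manager of its own (the situation in the question), the same call can be made on that manager class instead of models.Manager, so its existing methods are kept while the queryset methods are added. A minimal sketch, assuming a hypothetical CustomManager/TrackedQuerySet pair (names are illustrative):
from django.db import models


class TrackedQuerySet(models.QuerySet):
    def custom_method(self):
        # placeholder; put the real view-tracking filter here
        return self.all()


class CustomManager(models.Manager):
    def some_existing_method(self):
        return self.get_queryset()


class SomeModel(ViewTracked):
    # CustomManager keeps its own methods; TrackedQuerySet's methods become
    # available both on the manager and on every queryset it returns.
    objects = CustomManager.from_queryset(TrackedQuerySet)()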
I've got a model with a field tool_class, whose verbose name is "class" and differs from the field name:
class Tool(models.Model):
    tool_class = jsonfield.JSONField(verbose_name="class")
The Serializer and ViewSet are just stock HyperlinkedModelSerializer and ModelViewSet.
So, when I POST or PUT data to the server with a key class, it is recognized fine:
'{"class": "..."}
but in the response data it is called tool_class again:
{"tool_class": "..."}
How do I make it always be called class?
I can't use the name "class" for the field name, as it is a reserved word in Python, but in the API it absolutely must be called "class", because the API conforms to a certain open standard which specifies this word.
Obviously, I cannot say:
class = CharField(source="tool_class")
in my ToolSerializer, because it's a SyntaxError: invalid syntax.
SIMPLE SOLUTION:
Guys in another thread suggested a great solution. You can use the vars() syntax to circumvent this problem. For instance, I use the following code:
class Tool(Document):
    vars()['class'] = mongoengine.fields.DictField(required=True)
The serializer creates the respective field automatically. Ain't we sneaky?
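If you would rather not touch the model, the same vars() trick should also work directly in a DRF serializer class body, since the serializer metaclass only collects Field instances from the class namespace. A rough sketch (untested; the Meta.fields list is an assumption):
from rest_framework import serializers


class ToolSerializer(serializers.ModelSerializer):
    # declare a field whose name is the reserved word "class"
    vars()['class'] = serializers.JSONField(source='tool_class')

    class Meta:
        model = Tool
        fields = ['class']  # plus whatever other fields you expose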
I tried to find a way to have a field called class on the serializer, using some tricks with setattr, but it was getting very intrusive and hacky. The field_name is collected from the field at the time of binding the field to the serializer, and there is no easy place to override the behaviour of the bind.
In the end I decided it would be better and simpler just to let DRF do its thing, and add a post-processing step on the serializer:
class ToolSerializer(ModelSerializer):
    class Meta:
        model = Tool

    def to_representation(self, instance):
        data = super(ToolSerializer, self).to_representation(instance)
        data['class'] = data.pop('tool_class')
        return data
Note that the data structure returned by to_representation is an OrderedDict, and this disturbs the ordering slightly: the renamed key is removed from wherever it was and pushed to the back.
That is unlikely to be an issue for most use cases, so don't bother addressing it unless you have to. If you do need to preserve ordering, rebuild the OrderedDict with a generator expression:
data = OrderedDict(
    ('class' if k == 'tool_class' else k, v) for (k, v) in data.items()
)
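Putting the two pieces together, the order-preserving version of the serializer might look like this (a sketch; fields = '__all__' is an assumption about the original Meta):
from collections import OrderedDict

from rest_framework.serializers import ModelSerializer


class ToolSerializer(ModelSerializer):
    class Meta:
        model = Tool
        fields = '__all__'

    def to_representation(self, instance):
        data = super(ToolSerializer, self).to_representation(instance)
        # rename tool_class -> class while keeping the original field order
        return OrderedDict(
            ('class' if k == 'tool_class' else k, v) for (k, v) in data.items()
        )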
You can do this by overriding the metaclass for Serializers. Here is an example of a serializers.py file.
The main magic is this section of the metaclass:
# Remap fields (to use class instead of class_)
fields_ = []
for name, field in fields:
    if name.endswith('_'):
        name = name.rstrip('_')
    fields_.append((name, field))
This takes any field you define in the serializer that ends in an underscore (i.e. field_) and removes the underscore from the name when it binds the fields and sets the _declared_fields attribute on the serializer.
from collections import OrderedDict

from rest_framework import serializers
from rest_framework.fields import Field

from snippets.models import Snippet, LANGUAGE_CHOICES, STYLE_CHOICES


class MyMeta(serializers.SerializerMetaclass):

    @classmethod
    def _get_declared_fields(cls, bases, attrs):
        fields = [(field_name, attrs.pop(field_name))
                  for field_name, obj in list(attrs.items())
                  if isinstance(obj, Field)]
        fields.sort(key=lambda x: x[1]._creation_counter)

        # If this class is subclassing another Serializer, add that Serializer's
        # fields. Note that we loop over the bases in *reverse*. This is necessary
        # in order to maintain the correct order of fields.
        for base in reversed(bases):
            if hasattr(base, '_declared_fields'):
                fields = list(base._declared_fields.items()) + fields

        # Remap fields (to use class instead of class_)
        fields_ = []
        for name, field in fields:
            if name.endswith('_'):
                name = name.rstrip('_')
            fields_.append((name, field))

        return OrderedDict(fields_)


class ToolSerializer(serializers.Serializer, metaclass=MyMeta):
    ...
    class_ = serializers.JSONField(source='tool_class', label='class')

    def create(self, validated_data):
        """
        Create and return a new `Snippet` instance, given the validated data.
        """
        return Snippet.objects.create(**validated_data)

    def update(self, instance, validated_data):
        """
        Update and return an existing `Snippet` instance, given the validated data.
        """
        ...
        instance.class_ = validated_data.get('class', instance.class_)
        instance.save()
        return instance
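With that in place, the declared class_ field is exposed under the key class. For example (illustrative only, assuming a Tool instance named tool):
serializer = ToolSerializer(tool)
# MyMeta stripped the trailing underscore, so the payload key is "class"
print(serializer.data['class'])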
I am generating a Django model based on an abstract model class AbstractAttr and a normal model (let's say Foo).
I want my foo/models.py to look like this:
from bar.models import Attrs

# ...

class Foo(models.Model):
    ...
    attrs = Attrs()
In the Attrs class, which mimics a field, I have a contribute_to_class method that generates the required model using type(). The generated model class is called FooAttr.
Everything works. If I migrate, I see FooAttr appear in the proper table.
EXCEPT FOR ONE THING.
I want to be able to do from foo.models import FooAttr. Somehow my generated FooAttr class is not bound to the models.py file in which it is generated.
If I change my models.py to this:
class Foo(models.Model):
    # ...

FooAttr = generate_foo_attr_class(...)
it works, but this is not what I want (for example, this forces the dev to guess the generated class name).
Is what I want possible: define the class somewhat like in the first example AND bind it to the specific models.py module?
The project (pre-Alpha) is here (in develop branch):
https://github.com/zostera/django-mav
Some relevant code:
def create_model_attribute_class(model_class, class_name=None, related_name=None, meta=None):
    """
    Generate a value class (derived from AbstractModelAttribute) for a given model class
    :param model_class: The model to create an AbstractModelAttribute class for
    :param class_name: The name of the AbstractModelAttribute class to generate
    :param related_name: The related name
    :return: A model derived from AbstractModelAttribute with an object pointing to model_class
    """
    if model_class._meta.abstract:
        # This can't be done, because `object = ForeignKey(model_class)` would fail.
        raise TypeError("Can't create attrs for abstract class {0}".format(model_class.__name__))

    # Define inner Meta class
    if not meta:
        meta = {}
    meta['app_label'] = model_class._meta.app_label
    meta['db_tablespace'] = model_class._meta.db_tablespace
    meta['managed'] = model_class._meta.managed
    meta['unique_together'] = list(meta.get('unique_together', [])) + [('attribute', 'object')]
    meta.setdefault('db_table', '{0}_attr'.format(model_class._meta.db_table))

    # The name of the class to generate
    if class_name is None:
        value_class_name = '{name}Attr'.format(name=model_class.__name__)
    else:
        value_class_name = class_name

    # The related name to set
    if related_name is None:
        model_class_related_name = 'attrs'
    else:
        model_class_related_name = related_name

    # Make a type for our class
    value_class = type(
        str(value_class_name),
        (AbstractModelAttribute,),
        dict(
            # Set to same module as model_class
            __module__=model_class.__module__,
            # Add a foreign key to model_class
            object=models.ForeignKey(
                model_class,
                related_name=model_class_related_name
            ),
            # Add Meta class
            Meta=type(
                str('Meta'),
                (object,),
                meta
            ),
        ))

    return value_class
class Attrs(object):
    def contribute_to_class(self, cls, name):
        # Called from django.db.models.base.ModelBase.__new__
        mav_class = create_model_attribute_class(model_class=cls, related_name=name)
        cls.ModelAttributeClass = mav_class
I see you create the model from within models.py, so I think you should be able to add it to the module's globals. How about this:
new_class = create_model_attribute_class(**kwargs)
globals()[new_class.__name__] = new_class
del new_class # no need to keep original around
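Since contribute_to_class already receives the model class, a variation on the same idea is to bind the generated class to whichever module defines that model (e.g. foo.models), via sys.modules, instead of the module where the helper lives. A sketch, not the project's actual code:
import sys


class Attrs(object):
    def contribute_to_class(self, cls, name):
        # Called from django.db.models.base.ModelBase.__new__
        mav_class = create_model_attribute_class(model_class=cls, related_name=name)
        cls.ModelAttributeClass = mav_class
        # Bind the generated class to the module that defines `cls`
        # (e.g. foo.models), so `from foo.models import FooAttr` works.
        setattr(sys.modules[cls.__module__], mav_class.__name__, mav_class)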
Thanks all for thinking about this. I have updated the source code of the project at GitHub and added more tests. See https://github.com/zostera/django-mav
Since the actual generation of the models is done outside of foo/models.py (it takes place in mav/models.py), it seems Pythonically impossible to bind the model to foo/models.py. Also, after rethinking this, it seems too automagical for Python (explicit is better, no magic).
So my new strategy is to use simple functions, a decorator to make it easy to add mav, and to link the generated models to mav/attrs.py, so I can universally do from mav.attrs import FooAttr. I also link the generated class to the Foo model as Foo._mav_class.
(In this comment, Foo is of course used as an example model that we want to add model-attribute-value to).
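A sketch of what that decorator-based strategy could look like (the add_mav name and the mav.attrs registration are illustrative, not the project's final API):
import mav.attrs


def add_mav(cls):
    """Class decorator: generate the attribute model for cls and register it."""
    mav_class = create_model_attribute_class(model_class=cls, related_name='attrs')
    cls._mav_class = mav_class
    # make `from mav.attrs import FooAttr` possible
    setattr(mav.attrs, mav_class.__name__, mav_class)
    return cls


@add_mav
class Foo(models.Model):
    name = models.CharField(max_length=100)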
I have a model, Parent, which contains a Django ContentType GenericForeignKey relationship to various child models (ChildA, ChildB) in Parent.child.
I'm trying to get ListCreateAPIView and other listing views working with this setup. Originally I handled the serialization of the child instance using a SerializerMethodField which looked something like:
class ParentSerializer(serializers.ModelSerializer):
    child = serializers.SerializerMethodField('get_child')
    # (other fields)

    def get_child(self, obj):
        if obj.content_type == "child_a":
            return ChildASerializer(obj.child).data
        ...
Now I want to take advantage of Django Rest Framework to its fullest (including deserialization/creation/validation), so I want to move away from my current approach and increase DRYness by doing:
class ParentSerializer(serializers.ModelSerializer):
    ...  # code for serializing parent fields except the 'child' attribute


class ChildASerializer(serializers.ModelSerializer):
    ...  # code for ChildA fields


class ParentTypeASerializer(ParentSerializer):
    child = ChildASerializer()
If I'm reading the docs right, this means POST/PUT will go through the serializer's process without me having to override the post methods in views and other ugliness. This is important as ChildA, ChildB, and ChildC come from plugins and the core Parent/ParentSerializer should be as unaware of them as possible.
My thinking was to override get_serializer() in the view, but when listing many objects, I don't see how I can provide ParentTypeASerializer, ParentTypeBSerializer etc in the view.
def get_serializer(self, instance=None, data=None, files=None, many=False, partial=True):
    serializer_class = None
    if instance and instance.content_type == "child_a":
        serializer_class = ParentTypeASerializer
    if instance and instance.content_type == "child_b":
        serializer_class = ParentTypeBSerializer
    ...
    # What about many=True ?!
    return serializer_class(instance, data=data, files=files, many=many, partial=partial, context=context)
Another idea I had was to write a PolymorphicField class extending WritableField that makes the decision. I'm unsure if this is the simplest approach:
class ParentSerializer(serializers.ModelSerializer):
    child = PolymorphicChildSerializerProxy()  # Passes through/wraps the right serializer
Question: is there any dynamic/runtime/per-object way to provide the right serializer for a generic/polymorphic nested object, either at the view level or in the parent serializer? Ideally something like the second example with an override in the view that works for the List/Create/Destroy generics, or like the first example except that I return a serializer class rather than serialized data.
I have a class with a custom pk field, and a classmethod to generate this special pk:
class Card(models.Model):
    custom_pk = models.BigIntegerField(primary_key=True)
    other_attr = ...

    @classmethod
    def gen_id(cls):
        # ...
        return the_id
Now, I suppose I can create an object (in a view, for example) by doing this:
Card.objects.create(custom_pk=Card.gen_id(), other_attr="foo")
But I'd like to have the same result using the classic way to do it:
Card.objects.create(other_attr="foo")
Thanks!
You can use the pre_save signal to supply your primary key when it is missing. This signal handler will be called before each call to Card.save(), therefore we need to make sure that we don't override custom_pk if it is already set:
from django.db.models.signals import pre_save
from django.dispatch import receiver

@receiver(pre_save, sender=Card)
def add_auto_pk(sender, instance, **kwargs):
    if not instance.custom_pk:
        instance.custom_pk = Card.gen_id()
See the Django signals documentation for more details.
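An alternative to the signal, if it fits your case, is a plain callable default on the field itself; Django calls it whenever no value is supplied. A sketch, with a placeholder module-level generator standing in for Card.gen_id:
import uuid

from django.db import models


def gen_card_id():
    # placeholder id generator; substitute the real gen_id logic
    return uuid.uuid4().int >> 96


class Card(models.Model):
    custom_pk = models.BigIntegerField(primary_key=True, default=gen_card_id)
    other_attr = models.CharField(max_length=100)  # stand-in for the real field

# Card.objects.create(other_attr="foo") now fills custom_pk automatically.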
Is it possible to use something similar to the inline relational items from the Django admin to represent embedded models in a ListField?
For example, I've got the following models:
class CartEntry(models.Model):
    product_name = models.CharField(max_length=20)
    quantity = models.IntegerField()


class Cart(models.Model):
    line_items = ListField(EmbeddedModelField('CartEntry'))
I've tried using the standard inlining, but I know it's not right:
class CartEntryInline(admin.StackedInline):
    model = CartEntry


class CartAdmin(admin.ModelAdmin):
    inlines = [CartEntryInline]
But obviously that doesn't work, since there's no foreign key relation. Is there any way to do this in django-nonrel?
This is not so easy to do out of the box. You will need to manage ListField and EmbeddedModelField type fields in Django's admin module and do some hacking to get it done. You'll have to implement two parts:
Use EmbeddedModelField in Django's admin
You need to define a class that handles EmbeddedModelField objects to make it work with Django's admin. Here is a link where you can find great sample code. Below are just code blocks for demonstration:
Add this class to your models.py file and use EmbedOverrideField instead of EmbeddedModelField in the Cart model:
class EmbedOverrideField(EmbeddedModelField):
    def formfield(self, **kwargs):
        return models.Field.formfield(self, ObjectListField, **kwargs)
Implement a class in forms.py that has two methods:
class ObjectListField(forms.CharField):
    def prepare_value(self, value):
        pass  # you should actually implement this method

    def to_python(self, value):
        pass  # Implement this method as well
Use ListFields in Django's admin
You also need to define a class that handles ListField objects to make it work with Django's admin. Here is a link where you can find great sample code. Below are just code blocks for demonstration:
Add this class to your models.py file and use ItemsField instead of ListField in the Cart model:
class ItemsField(ListField):
    def formfield(self, **kwargs):
        return models.Field.formfield(self, StringListField, **kwargs)
Implement a class in forms.py that has two methods:
class StringListField(forms.CharField):
    def prepare_value(self, value):
        pass  # you should actually implement this method

    def to_python(self, value):
        pass  # Implement this method as well
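For a list of plain strings, a minimal sketch of the prepare_value/to_python pair could serialize the list to and from JSON text (an illustration only, not drop-in code for the embedded CartEntry case):
import json

from django import forms


class StringListField(forms.CharField):
    def prepare_value(self, value):
        # model value (a Python list) -> text shown in the admin widget
        if value is None:
            return ''
        return json.dumps(value)

    def to_python(self, value):
        # text from the widget -> Python list stored on the model
        if not value:
            return []
        if isinstance(value, list):
            return value
        return json.loads(value)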