I need to define a class variable named "class". I want to do this directly in the class namespace, not in a class method. Obviously, I cannot directly say:
class MyClass(object):
    a = 1
    b = 2
    class = 3
So, I want to do something like:
class MyClass(object):
    a = 1
    b = 2
    self.__dict__["class"] = 3
Where "self" should be replaced with a reference to the class. So, how do I refer to a class from class namespace?
NOTE: This question might seem contrived, but it stems from a practical goal.
In fact, MyClass is a Django REST Framework serializer and I need a "class" field to be defined on it, because this REST endpoint has to follow a certain protocol.
There's a metaclass defined for Serializers, which calls __new__() upon class creation, and that __new__() aggregates all the fields defined on the class and populates a registry of fields with them. So, I have to define my variable class before the class is created. Also see: Django REST Framework: how to make verbose name of field differ from its field_name?
You could do:
class MyClass(object):
    a = 1
    b = 2
    vars()['class'] = 3
But since class is a reserved keyword, you have to access the attribute using getattr and setattr, so that class stays a string.
>>> m = MyClass()
>>> getattr(m, 'class')
3
You can create your class from type and add the attribute class to the class dictionary:
>>> MyClass = type('MyClass', (), {'class': 3, 'a':1, 'b':2})
>>> getattr(MyClass, 'class')
3
You can't directly access the name class with a dot reference; you'll need to use getattr:
>>> MyClass.class
File "<stdin>", line 1
MyClass.class
^
SyntaxError: invalid syntax
FWIW, you can define the class methods like you would do conventionally and then bind them to the class later on.
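For example, a small sketch of that, reusing the type()-built MyClass from above (the describe function is just an illustration, not part of the original answer):

MyClass = type('MyClass', (), {'class': 3, 'a': 1, 'b': 2})

# Define the method as a plain function...
def describe(self):
    return getattr(self, 'class')

# ...and bind it to the class afterwards.
MyClass.describe = describe

m = MyClass()
print(m.describe())  # -> 3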
Caveat: While this works, I wouldn't use this hack myself as the keyword class is too much of a keyword to tamper with.
You don't need to name the attribute class, which can lead to all kinds of problems. You can name the attribute class_, but still have it pull from a source attribute named class and render out to JSON as class.
You can do this by overriding the metaclass for Serializers. Here is an example of a serializers.py file (the models and classes are largely pulled straight from the tutorial).
The main magic is this section of the metaclass
# Remap fields (to use class instead of class_)
fields_ = []
for name, field in fields:
    if name.endswith('_'):
        name = name.rstrip('_')
    fields_.append((name, field))
This takes any field you define in the serializer that ends in an underscore (i.e. field_) and removes the underscore from the name when it binds the fields and sets the _declared_fields attribute on the serializer.
from collections import OrderedDict
from rest_framework import serializers
from rest_framework.fields import Field
from snippets.models import Snippet, LANGUAGE_CHOICES, STYLE_CHOICES
class MyMeta(serializers.SerializerMetaclass):

    @classmethod
    def _get_declared_fields(cls, bases, attrs):
        fields = [(field_name, attrs.pop(field_name))
                  for field_name, obj in list(attrs.items())
                  if isinstance(obj, Field)]
        fields.sort(key=lambda x: x[1]._creation_counter)

        # If this class is subclassing another Serializer, add that Serializer's
        # fields. Note that we loop over the bases in *reverse*. This is necessary
        # in order to maintain the correct order of fields.
        for base in reversed(bases):
            if hasattr(base, '_declared_fields'):
                fields = list(base._declared_fields.items()) + fields

        # Remap fields (to use class instead of class_)
        fields_ = []
        for name, field in fields:
            if name.endswith('_'):
                name = name.rstrip('_')
            fields_.append((name, field))

        return OrderedDict(fields_)
class SnippetSerializer(serializers.Serializer):
    __metaclass__ = MyMeta

    pk = serializers.IntegerField(read_only=True)
    title = serializers.CharField(required=False, allow_blank=True, max_length=100)
    class_ = serializers.CharField(source='klass', label='class', default='blah')

    def create(self, validated_data):
        """
        Create and return a new `Snippet` instance, given the validated data.
        """
        return Snippet.objects.create(**validated_data)

    def update(self, instance, validated_data):
        """
        Update and return an existing `Snippet` instance, given the validated data.
        """
        instance.title = validated_data.get('title', instance.title)
        instance.class_ = validated_data.get('class', instance.class_)
        instance.save()
        return instance
Here is the models.py file for reference (Django doesn't allow field names to end in an underscore):
from django.db import models
class Snippet(models.Model):
    title = models.CharField(max_length=100, blank=True, default='')
    klass = models.CharField(max_length=100, default='yo')
This is how it looks from the Django shell:
$ python manage.py shell
>>> from snippets.models import Snippet
>>> from snippets.serializers import SnippetSerializer
>>> from rest_framework.renderers import JSONRenderer
>>> from rest_framework.parsers import JSONParser
>>> snippet = Snippet(title='test')
>>> snippet.save()
>>> serializer = SnippetSerializer(snippet)
>>> serializer.data
{'title': u'test', 'pk': 6, 'class': u'yo'}
You cannot do it while creating the class: technically, that object does not exist yet.
You could consider:
class MyClass(object):
    a = 1
    b = 2

# class is already created
MyClass.__dict__["class"] = 3
But MyClass.__dict__ is not a dict but a dictproxy (mappingproxy in Python 3), and that object does not support item assignment, so a TypeError would be raised.
Use setattr to set a class attribute immediately after you finish the class definition. Outside the definition, of course. Pass MyClass as the first parameter, and it will create an attribute of your class.
The dict of members should not be used, especially for modifying an object. In fact, it is rarely needed; most of the things (though not all) people usually intend it to do are better done with setattr and getattr.
Finally, as one of the other answers noted, you do not really need a field named class, but that's another story, different from your original question.
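For instance, a minimal sketch of that setattr approach, mirroring the original example:

class MyClass(object):
    a = 1
    b = 2

# MyClass.__dict__["class"] = 3 would raise TypeError (the dict proxy is read-only),
# so set the attribute on the class object itself:
setattr(MyClass, 'class', 3)

print(getattr(MyClass, 'class'))  # -> 3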
Related
I'm a little new to tinkering with class inheritance in Python, particularly when it comes down to using class attributes. In this case I am using a class attribute to change an argument in pydantic's Field() function. This wouldn't be too hard to do if my class contained its own constructor; however, my class User1 inherits it from pydantic's BaseModel.
The idea is that I would like to be able to change the class attribute prior to creating the instance.
Please see some example code below:
from pydantic import BaseModel, Field

class User1(BaseModel):
    _set_ge = None  # create class attribute
    item: float = Field(..., ge=_set_ge)

    # avoid overriding BaseModel's __init__
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

User1._set_ge = 0  # setting the class attribute to a new value

instance = User1(item=-1)
print(instance)  # item=-1.0
When creating the instance using instance = User1(item=-1) I would expect a validation error to be thrown, but it instead passes validation and simply returns the item value.
If I had my own constructor there would be little issue in changing _set_ge, but as User1 inherits this constructor from BaseModel, things are a little more complicated.
The eventual aim is to add this class to a fastapi endpoint as follows:
from fastapi import FastAPI
from schemas import User1

app = FastAPI()

class NewUser1(User1):
    pass

NewUser1._set_ge = 0

@app.post("/")
def endpoint(request: NewUser1):
    return User1.item
To reduce code duplication, I aimed to use this method to easily change Field() arguments. If there is a better way, I'd be glad to consider that too.
This question is quite closely related to this unanswered one.
In the end, the @validator proposal by @hernán-alarcón is probably the best way to do this. For example:
from typing import ClassVar

from pydantic import BaseModel, Field, validator
from pydantic.errors import NumberNotGeError

class User(BaseModel):
    _set_ge: ClassVar[float]  # added the ClassVar typing to make clearer, but the underscore should be sufficient
    item: float = Field(...)

    @validator('item')
    def limits(cls, v):
        limit_number = cls._set_ge
        if v >= limit_number:
            return v
        else:
            raise NumberNotGeError(limit_value=limit_number)

class User1(User):
    _set_ge = 0  # setting the class attribute to a new value

instance = User1(item=-1)  # raises the error
I have a table to store view events, that is, if a user views an entity, a record will be stored into that table. This table is represented by a model that has a generic relation, that is, it can be related to any other model.
I have defined a mixin ViewTracked that should be extended by any model that can be tracked (i.e. class SomeModel(ViewTracked)).
I want to have a custom method, called custom_method for example, on the queryset of the objects manager. I know that I can define a custom Manager and override the objects manager with it easily, but the problem is that the tracked model can already have a custom manager with its own custom queryset, so I can't simply override it and lose the custom queryset that it has.
Unfortunately, I couldn't find a proper way of doing this, so I tried to add a metaclass to override the manager's get_queryset and add my custom method to it, but for some reason, when I call SomeModel.objects it always returns None.
Here's what I tried:
# Meta class
class ViewTrackedMeta(ModelBase):

    def __new__(mcs, class_name, base_classes, attributes_dict):
        # let ModelBase do its magic
        new_class = super().__new__(mcs, class_name, base_classes, attributes_dict)

        if hasattr(new_class, 'objects'):
            objects_manager = new_class.objects
            if isinstance(objects_manager, Manager):
                queryset = objects_manager.get_queryset()

                def custom_method(queryset):
                    return queryset.filter(...)

                def get_extended_queryset(manager):
                    queryset.custom_method = types.MethodType(custom_method, queryset)

                objects_manager.get_queryset = types.MethodType(get_extended_queryset, objects_manager)

        return new_class

# Mixin
class ViewTracked(Model, metaclass=ViewTrackedMeta):
    class Meta:
        abstract = True

    ...

# Models
class SomeModel(ViewTracked):
    objects = CustomManager()

class SomeOtherModel(ViewTracked):
    ...  # default django objects manager

class SomeOtherModel(ViewTracked):
    objects = OtherCustomManager()
Is there any other way I can achieve what I want? Why is SomeModel.objects always returning None?
Rather than instantiating your manager classes directly, you should be using from_queryset. Here are the docs.
class CustomQuerySet(models.QuerySet):
    def manager_and_queryset_method(self):
        return

class MyModel(models.Model):
    objects = models.Manager.from_queryset(CustomQuerySet)()
Now you can do:
MyModel.objects.manager_and_queryset_method()
as well as
MyModel.objects.filter(something="else").manager_and_queryset_method()
I've got a model with a field tool_class, whose verbose name is class and differs from name:
class Tool(models.Model):
    tool_class = jsonfield.JSONField(verbose_name="class")
The Serializer and ViewSet are just stock HyperlinkedModelSerializer and ModelViewSet.
So, when I POST or PUT data to the server with a key class, it is recognized fine:
'{"class": "..."}
but in the response data it is called tool_class again:
{"tool_class": "..."}
How to make it be called class always?
I can't use the name "class" for the field name, as it is a reserved word in python, but in API it absolutely must be called "class", because the API conforms to a certain open standard, which specifies this word.
Obviously, I cannot say:
class = CharField(source="tool_class")
in my ToolSerializer, because it's a SyntaxError: invalid syntax.
SIMPLE SOLUTION:
Guys in another thread suggested a great solution. You can use vars() syntax to circumvent this problem. For instance, I use the following code:
class Tool(Document):
    vars()['class'] = mongoengine.fields.DictField(required=True)
Serializer creates respective field automatically. Ain't we sneaky?
I tried to find a way to have a field called class on the serializer, using some tricks with setattr, but it was getting very intrusive and hacky. The field_name is collected from the field at the time of binding the field to the serializer, and there is no easy place to override the behaviour of the bind.
In the end I decided it would be better and simpler just to let DRF do its thing, and add a post-processing step on the serializer:
class ToolSerializer(ModelSerializer):
    class Meta:
        model = Tool

    def to_representation(self, instance):
        data = super(ToolSerializer, self).to_representation(instance)
        data['class'] = data.pop('tool_class')
        return data
Note that the data structure returned by to_representation is an OrderedDict, and this disturbs the ordering slightly: the renamed key in this mapping will be removed from wherever it was and pushed to the back.
That is unlikely to be an issue for most use-cases, so you shouldn't bother to address it if not necessary. If you do need to preserve ordering, rebuild a new OrderedDict using a comprehension:
data = OrderedDict(
    ('class' if k == 'tool_class' else k, v) for (k, v) in data.items()
)
You can do this by overriding the metaclass for Serializers. Here is an example of a serializers.py file.
The main magic is this section of the metaclass
# Remap fields (to use class instead of class_)
fields_ = []
for name, field in fields:
    if name.endswith('_'):
        name = name.rstrip('_')
    fields_.append((name, field))
This takes any field you define in the serializer that ends in an underscore (i.e. field_) and removes the underscore from the name when it binds the fields and sets the _declared_fields attribute on the serializer.
from collections import OrderedDict
from rest_framework import serializers
from rest_framework.fields import Field
from snippets.models import Snippet, LANGUAGE_CHOICES, STYLE_CHOICES
class MyMeta(serializers.SerializerMetaclass):

    @classmethod
    def _get_declared_fields(cls, bases, attrs):
        fields = [(field_name, attrs.pop(field_name))
                  for field_name, obj in list(attrs.items())
                  if isinstance(obj, Field)]
        fields.sort(key=lambda x: x[1]._creation_counter)

        # If this class is subclassing another Serializer, add that Serializer's
        # fields. Note that we loop over the bases in *reverse*. This is necessary
        # in order to maintain the correct order of fields.
        for base in reversed(bases):
            if hasattr(base, '_declared_fields'):
                fields = list(base._declared_fields.items()) + fields

        # Remap fields (to use class instead of class_)
        fields_ = []
        for name, field in fields:
            if name.endswith('_'):
                name = name.rstrip('_')
            fields_.append((name, field))

        return OrderedDict(fields_)
class ToolSerializer(serializers.Serializer):
    __metaclass__ = MyMeta

    ...
    class_ = serializers.JSONField(source='tool_class', label='class')

    def create(self, validated_data):
        """
        Create and return a new `Snippet` instance, given the validated data.
        """
        return Snippet.objects.create(**validated_data)

    def update(self, instance, validated_data):
        """
        Update and return an existing `Snippet` instance, given the validated data.
        """
        ...
        instance.class_ = validated_data.get('class', instance.class_)
        instance.save()
        return instance
I am generating a Django model based on an abstract model class AbstractAttr and a normal model (let's say Foo).
I want my foo/models.py to look like this:
from bar.models import Attrs
# ...
class Foo(models.Model):
    ....
    attrs = Attrs()
In the Attrs class, which mimics a field, I have a contribute_to_class method that generates the required model using type(). The generated model class is called FooAttr.
Everything works. If I migrate, I see FooAttr appear in the proper table.
EXCEPT FOR ONE THING.
I want to be able to from foo.models import FooAttr. Somehow my generated FooAttr class is not bound to the models.py file in which it is generated.
If I change my models.py to this:
class Foo(models.Model):
    # ...

FooAttr = generate_foo_attr_class(...)
it works, but this is not what I want (for example, this forces the dev to guess the generated class name).
Is what I want possible: define the class somewhat like in the first example AND bind it to the specific models.py module?
The project (pre-Alpha) is here (in develop branch):
https://github.com/zostera/django-mav
Some relevant code:
def create_model_attribute_class(model_class, class_name=None, related_name=None, meta=None):
    """
    Generate a value class (derived from AbstractModelAttribute) for a given model class

    :param model_class: The model to create a AbstractModelAttribute class for
    :param class_name: The name of the AbstractModelAttribute class to generate
    :param related_name: The related name
    :return: A model derived from AbstractModelAttribute with an object pointing to model_class
    """
    if model_class._meta.abstract:
        # This can't be done, because `object = ForeignKey(model_class)` would fail.
        raise TypeError("Can't create attrs for abstract class {0}".format(model_class.__name__))

    # Define inner Meta class
    if not meta:
        meta = {}
    meta['app_label'] = model_class._meta.app_label
    meta['db_tablespace'] = model_class._meta.db_tablespace
    meta['managed'] = model_class._meta.managed
    meta['unique_together'] = list(meta.get('unique_together', [])) + [('attribute', 'object')]
    meta.setdefault('db_table', '{0}_attr'.format(model_class._meta.db_table))

    # The name of the class to generate
    if class_name is None:
        value_class_name = '{name}Attr'.format(name=model_class.__name__)
    else:
        value_class_name = class_name

    # The related name to set
    if related_name is None:
        model_class_related_name = 'attrs'
    else:
        model_class_related_name = related_name

    # Make a type for our class
    value_class = type(
        str(value_class_name),
        (AbstractModelAttribute,),
        dict(
            # Set to same module as model_class
            __module__=model_class.__module__,
            # Add a foreign key to model_class
            object=models.ForeignKey(
                model_class,
                related_name=model_class_related_name
            ),
            # Add Meta class
            Meta=type(
                str('Meta'),
                (object,),
                meta
            ),
        ))

    return value_class


class Attrs(object):
    def contribute_to_class(self, cls, name):
        # Called from django.db.models.base.ModelBase.__new__
        mav_class = create_model_attribute_class(model_class=cls, related_name=name)
        cls.ModelAttributeClass = mav_class
I see you create the model from within models.py, so I think you should be able to add it to the module's globals. How about this:
new_class = create_model_attribute_class(**kwargs)
globals()[new_class.__name__] = new_class
del new_class # no need to keep original around
Thanks all for thinking about this. I have updated the source code of the project at GitHub and added more tests. See https://github.com/zostera/django-mav
Since the actual generation of the models is done outside of foo/models.py (it takes place in mav/models.py), it seems Pythonically impossible to link the model to foo/models.py. Also, after rethinking this, it seems too automagical for Python (explicit is better than implicit, no magic).
So my new strategy is to use simple functions, a decorator to make it easy to add mav, and link the generated models to mav/attrs.py, so I can universally from mav.attrs import FooAttr. I also link the generated class to the Foo model as Foo._mav_class.
(In this comment, Foo is of course used as an example model that we want to add model-attribute-value to).
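A rough sketch of what that decorator-based strategy could look like, reusing the create_model_attribute_class helper shown above; the add_mav name and the mav.attrs registry module are illustrative assumptions, not the project's actual API:

# mav/decorators.py (illustrative names, not the real project layout)
from mav import attrs  # assumed registry module for generated classes
from mav.models import create_model_attribute_class

def add_mav(cls):
    """Class decorator: generate the attribute model for cls and register it."""
    mav_class = create_model_attribute_class(model_class=cls, related_name='attrs')
    cls._mav_class = mav_class                      # link it to the decorated model
    setattr(attrs, mav_class.__name__, mav_class)   # expose it as mav.attrs.FooAttr
    return cls

# foo/models.py (illustrative usage)
@add_mav
class Foo(models.Model):
    name = models.CharField(max_length=100)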
I'm making a Django-like ORM for my study project; because we are not allowed to use existing ORMs (if you want to use one, you have to code it yourself), and just for educating myself, I thought that the same kind of ORM as in Django would be nice.
In the ORM I want to make model definitions in the same style as they are implemented in Django, i.e.:
class Person(models.Model):
    first_name = models.CharField(max_length=30)
    last_name = models.CharField(max_length=30)
Django uses metaclasses, and in my project I'm using them too, but I have a problem with the fact that metaclasses construct classes, not instances, so all attributes are class attributes and shared between all instances.
This is a generic example of what I tried, but because of what I said earlier, it won't work:
def getmethod(attrname):
    def _getmethod(self):
        return getattr(self, "__"+attrname).get()
    return _getmethod

def setmethod(attrname):
    def _setmethod(self, value):
        return getattr(self, "__"+attrname).set(value)
    return _setmethod

class Metaclass(type):
    def __new__(cls, name, based, attrs):
        ndict = {}
        for attr in attrs:
            if isinstance(attrs[attr], Field):
                ndict['__'+attr] = attrs[attr]
                ndict[attr] = property(getmethod(attr), setmethod(attr))
        return super(Metaclass, cls).__new__(cls, name, based, ndict)

class Field:
    def __init__(self):
        self.value = 0

    def set(self, value):
        self.value = value

    def get(self):
        return self.value

class Mainclass:
    __metaclass__ = Metaclass

class Childclass(Mainclass):
    attr1 = Field()
    attr2 = Field()

a = Childclass()
print "a, should be 0:", a.attr1
a.attr1 = "test"
print "a, should be test:", a.attr1

b = Childclass()
print "b, should be 0:", b.attr1
I tried to look this up in Django's source, but it is too complicated for me to understand, and the "magic" seems to be hidden somewhere.
The question is simple: how does Django do this, in a very simplified example?
The answer is quite simple really, once you check the right code. The metaclass used by Django adds all fields to <model>._meta.fields (well, kinda), but the field attribute is removed from the actual class. The only exception to this is a RelatedField subclass, in which case an object descriptor is added (similar to a property - in fact, a property is an object descriptor, just with a native implementation).
Then, in the __init__ method of the model, the code iterates over all fields and either sets the value provided in *args or **kwargs, or sets a default value on the instance.
In your example, this means that the class Person will never have attributes named first_name and last_name, but both fields are stored in Person._meta.fields. However, an instance of Person will always have attributes named first_name and last_name, even if they are not provided as arguments.
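A minimal sketch of that mechanism, using Python 3 metaclass syntax rather than the Python 2 __metaclass__ hook from the question (the _meta_fields name is only a stand-in for Django's _meta.fields; this is not Django's actual implementation):

class Field:
    def __init__(self, default=0):
        self.default = default


class ModelMeta(type):
    def __new__(mcs, name, bases, attrs):
        fields = {}
        # Pull Field instances out of the class namespace so they never
        # become shared class attributes.
        for key, value in list(attrs.items()):
            if isinstance(value, Field):
                fields[key] = attrs.pop(key)
        cls = super().__new__(mcs, name, bases, attrs)
        cls._meta_fields = fields  # roughly what Django keeps in _meta.fields
        return cls


class Model(metaclass=ModelMeta):
    def __init__(self, **kwargs):
        # Per-instance values: provided kwargs win, otherwise the field default.
        for name, field in self._meta_fields.items():
            setattr(self, name, kwargs.get(name, field.default))


class Person(Model):
    first_name = Field()
    last_name = Field()


a = Person(first_name="Ada")
b = Person()
print(a.first_name)  # Ada
print(b.first_name)  # 0 (the default); values are not shared between instances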