property() setter issues in metaclass - python

I'm trying to use property() in my metaclass to give a way to access/set an internal attribute. I'm using property() as opposed to @property because it's inside a metaclass and I need to apply the property to the class passed in the __new__ method.
This is a basic example of my code with relevant parts:
def __new__(mcls, name, bases, dct, **kwargs):
    def getabstract(obj):
        return getattr(obj, '_abstract', False)

    def setabstract(obj, value):
        if str(value) in ('True', 'False'):
            return setattr(obj, '_abstract', value)
        else:
            print(f'invalid abstract assignment {value}')
        return None

    cls = super().__new__(mcls, name, bases, dct)

    for name, value in cls.__dict__.items():
        if callable(value):
            # only applies to callable (method)
            value._abstract = getattr(value, '_abstract', False)
            # add an _abstract attribute to the method or return its
            # current value if it already has one

    for base in bases:
        base._abstract = getattr(base, '_abstract', False)
        # give each base class an _abstract attribute if not already given

    cls.abstract = property(getabstract(cls),
                            setabstract(cls, getattr(cls, '_abstract', False)))
    # make an abstract property for class

    for name, value in cls.__dict__.items():
        if callable(value):
            # make an abstract property for functions
            value.abstract = property(getabstract(value),
                                      setabstract(value, getattr(value, '_abstract', False)))
When I run this, no errors occur, but when I access the abstract attribute on a class created by this metaclass, e.g. Foo.abstract, it returns:
<property object at 0x.....>
Also, the setabstract() function used as the setter should only set the attribute if the value is either True or False, but when I do something like Foo.abstract = 'foo', it still sets the value to 'foo'.
Is there something I'm doing wrong here or something I've missed?

A property is not an instance attribute; it is a descriptor attribute of the class, which gets bound to the object that accesses it. Here is a simplified example:
class A:
    @property
    def x(self):
        return True

print(A.x)    # <property object at 0x0000021AE2944548>
print(A().x)  # True
When getting A.x, you obtain the unbound property object. When getting A().x, the property is bound to an instance and A.x.__get__ returns the value.
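To make the binding explicit, this small sketch calls the descriptor protocol by hand (using the A class from above):

a = A()
print(A.__dict__['x'])                # the raw property object
print(A.__dict__['x'].__get__(a, A))  # True -- this is what a.x does under the hood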
Back to your code
What this means is that a property must be a class attribute, not an instance attribute. Or, in your case, the property must be a metaclass attribute.
class Meta(type):
    @property
    def abstract(cls):
        return getattr(cls, '_abstract', False)

    @abstract.setter
    def abstract(cls, value):
        if value in (True, False):
            setattr(cls, '_abstract', value)
        else:
            raise TypeError(f'invalid abstract assignment {value}')

class Foo(metaclass=Meta):
    pass

print(Foo.abstract)          # False
Foo.abstract = 'Not a bool'  # TypeError: invalid abstract assignment Not a bool
Although, this only partially solves your issue, as you want every method of your classes to have an abstract property as well. To do so, you will need the methods' class to have that property.
Before we go any deeper, let me remind you of one key concept in Python: we're all consenting adults here. Maybe you should just trust your users not to set abstract to anything other than a bool and let it be a plain, non-property attribute. In particular, any user can change the _abstract attribute itself anyway.
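For instance, the "consenting adults" version could be as small as the following sketch (no validation at all; the Meta name is just illustrative):

class Meta(type):
    def __new__(mcls, name, bases, dct, **kwargs):
        cls = super().__new__(mcls, name, bases, dct)
        cls.abstract = dct.get('abstract', False)  # plain attribute, callers are trusted
        return cls

class Foo(metaclass=Meta):
    pass

print(Foo.abstract)  # False
Foo.abstract = True  # fine -- but nothing stops Foo.abstract = 'foo' either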
If that does not suit you and you want abstract to be a property, then one way to do that is to define a wrapper class for methods that hold that property.
class Abstract:
    @property
    def abstract(cls):
        return getattr(cls, '_abstract', False)

    @abstract.setter
    def abstract(cls, value):
        if value in (True, False):
            setattr(cls, '_abstract', value)
        else:
            raise TypeError(f'invalid abstract assignment {value}')

class AbstractCallable(Abstract):
    def __init__(self, f):
        self.callable = f

    def __call__(self, *args, **kwargs):
        return self.callable(*args, **kwargs)

class Meta(type, Abstract):
    def __new__(mcls, name, bases, dct, **kwargs):
        for name, value in dct.items():
            if callable(value):
                dct[name] = AbstractCallable(value)
        return super().__new__(mcls, name, bases, dct)

class Foo(metaclass=Meta):
    def bar(self):
        pass

print(Foo.abstract)       # False
print(Foo.bar.abstract)   # False
Foo.bar.abstract = 'baz'  # TypeError: invalid abstract assignment baz

Related

How to create a metaclass that can give a class an array of instances and provide a "voodoo" instance that acts on all class instances?

I'm wondering how to create a metaclass in Python that can create other classes that:
Store their instances in an array automatically
Have a special instance, NonMetaClass.all, whose properties:
When set, set all the class's instances with the same key to the same value (e.g., Foo.all.num = 3 makes all instances of Foo have a num of 3)
When accessed (get), returns an array of all of the class's instances's key values (e.g., Foo.all.num returns [5, 3, 2])
Cannot be deleted.
When called (if the attribute is a function), call that method on all the instances of a class.
In Python terms, I would like to turn a class that is like this:
class Foo(object):
    BAR = 23
    def __init__(self):
        self.a = 5
    def pointless():
        print 'pointless.'
    def change_a(self):
        self.a = 52
Into this:
class Foo(object):
    BAR = 23
    instances = []
    all = # Some black magic to create the special "all" instance
    def __init__(self):
        self.a = 5
        Foo.instances.append(self)
    def pointless(self):
        print 'pointless.'
    def change_a(self):
        self.a = 52
And be able to use it like this:
>>> Foo()
>>> Foo.instances[0]
<__main__.Foo instance at 0x102ff5758>
>>> Foo()
>>> len(Foo.instances)
2
>>> Foo.all.a = 78
78
>>> Foo.all.a
[78, 78]
>>> Foo.all.change_a()
>>> Foo.all.a
[52, 52]
>>>
The only thing a metaclass is actually needed for here is quite simple: creating the instances and all attributes.
All it has to do is insert those into the namespace. It will also have to wrap the class's __new__ method so that new instances are appended to the instances list.
The interesting part is the behavior wanted from all, and that can be implemented using the descriptor protocol and attribute access control, so we have to craft a couple of special classes that will return the appropriate objects when requested after the ".".
"All" is the class that will be instantiated as "all" - it just needs a __get__ method that returns another special object, from the AllAttr class, already bound to the owner class.
"AllAttr" is a special object that, on any attribute access, performs your requirements on the members of the owner class's "instances" attribute.
And "CallAllList" is a special list subclass that is callable, and calls all its members in turn. It is used by AllAttr if the requested attribute on the owner class is itself callable.
class CallAllList(list):
    def __call__(self, *args, **kwargs):
        return [instance(*args, **kwargs) for instance in self]

class AllAttr(object):
    def __init__(self, owner):
        self._owner = owner

    def __getattr__(self, attr):
        method = getattr(self._owner, attr, None)
        cls = CallAllList if callable(method) else list
        return cls(getattr(obj, attr) for obj in self._owner.instances)

    def __setattr__(self, attr, value):
        if attr == "_owner":
            return super(AllAttr, self).__setattr__(attr, value)
        for obj in self._owner.instances:
            setattr(obj, attr, value)

class All(object):
    def __get__(self, instance, owner):
        return AllAttr(owner)

    def __repr__(self):
        return "Representation of all instances of '{}'".format(self.__class__.__name__)

class MetaAll(type):
    def __new__(metacls, name, bases, namespace):
        namespace["all"] = All()
        namespace["instances"] = []
        cls = super(MetaAll, metacls).__new__(metacls, name, bases, namespace)
        original_new = getattr(cls, "__new__")

        def __new__(cls, *args, **kwargs):
            instance = original_new(cls, *args, **kwargs)
            cls.instances.append(instance)
            return instance

        cls.__new__ = __new__
        return cls

class Foo(metaclass=MetaAll):
    pass
The code above is written to be both Python 3 and Python 2 compatible, since you appear to still be using Python 2, given your "print" example.
The only thing that cannot be written in a form compatible with both versions is the declaration that uses the metaclass itself - just declare __metaclass__ = MetaAll inside the body of your Foo class if you are using Python 2. But you should not really be using Python 2; switch to Python 3 as soon as you can.
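For reference, here is how the usage from the question plays out with this metaclass (Python 3; the attribute names follow the question's example):

class Foo(metaclass=MetaAll):
    def __init__(self):
        self.a = 5
    def change_a(self):
        self.a = 52

Foo()
Foo()
print(len(Foo.instances))  # 2
Foo.all.a = 78
print(Foo.all.a)           # [78, 78]
Foo.all.change_a()
print(Foo.all.a)           # [52, 52]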
update
It turns out that Python 2 has the "unbound method" concept, and the special casing of __new__ does not work like in Python 3: you can't just assign a plain function named __new__ to the class. In order to get the correct __new__ method from the superclasses, the easiest way is to create a disposable class, so that its __mro__ can be searched linearly. Otherwise, one would have to reimplement the MRO algorithm to find the proper __new__ method.
So, for Python 2, the metaclass should be this:
class MetaAll(type):
    def __new__(metacls, name, bases, namespace):
        namespace["all"] = All()
        namespace["instances"] = []
        if "__new__" in namespace:
            original_new = namespace["__new__"]

            def __new__(cls, *args, **kwargs):
                instance = original_new(cls, *args, **kwargs)
                cls.instances.append(instance)
                return instance
        else:
            # We create a disposable class just to get the '__mro__'
            stub_cls = super(MetaAll, metacls).__new__(metacls, name, bases, {})
            for parent in stub_cls.__mro__[1:]:
                if "__new__" in parent.__dict__:
                    original_new = parent.__dict__["__new__"]
                    break

            def __new__(cls, *args, **kwargs):
                instance = original_new(cls, *args, **kwargs)
                cls.instances.append(instance)
                return instance

        namespace["__new__"] = __new__
        final_cls = super(MetaAll, metacls).__new__(metacls, name, bases, namespace)
        return final_cls

class Foo(object):
    __metaclass__ = MetaAll
(now, again, this thing is ancient. Just settle for Python 3.6)
Ok, I figured out how to do this for Python 2.7 on my own. This is what I believe to be the best solution, though it may not be the only one. It allows you to set, get, and call functions on attributes of Class.all. I've named the metaclass InstanceUnifier, but please comment if you can think of a better (shorter, more descriptive) name.
class InstanceUnifier(type):
    '''
    What we want: A metaclass that can give a class an array of instances and provide a static Class.all object, that, when a method is called on it, calls the same method on every instance of the class.
    '''
    def __new__(cls, name, base_classes, dct):
        dct['all'] = None
        dct['instances'] = []
        return type.__new__(cls, name, base_classes, dct)

    def __init__(cls, name, base_classes, dct):
        class Accessor(object):
            def __getattribute__(self, name):
                array = [getattr(inst, name) for inst in cls.instances]
                if all([callable(item) for item in array]):
                    def proxy_func(*args, **kwargs):
                        for i in range(len(cls.instances)):
                            this = cls.instances[i]
                            func = array[i]
                            func(*args, **kwargs)
                    return proxy_func
                elif all([not callable(item) for item in array]):
                    return array
                else:
                    raise RuntimeError('Some objects in class instance array for key "'+name+'" are callable, some are not.')

            def __setattr__(self, name, value):
                [setattr(inst, name, value) for inst in cls.instances]

            def __delattr__(self, name):
                [delattr(inst, name) for inst in cls.instances]

        cls.all = Accessor()
        return type.__init__(cls, name, base_classes, dct)

    def __call__(cls, *args, **kwargs):
        inst = type.__call__(cls, *args, **kwargs)
        cls.instances.append(inst)
        return inst
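A quick usage sketch (shown with the Python 3 metaclass spelling; under 2.7 you would set __metaclass__ = InstanceUnifier in the class body instead):

class Foo(metaclass=InstanceUnifier):
    def __init__(self):
        self.a = 5

Foo()
Foo()
Foo.all.a = 78
print(Foo.all.a)  # [78, 78]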

Using @classmethod with @property [duplicate]

In Python I can add a method to a class with the @classmethod decorator. Is there a similar decorator to add a property to a class? I can show what I'm talking about better with an example.
class Example(object):
    the_I = 10

    def __init__(self):
        self.an_i = 20

    @property
    def i(self):
        return self.an_i

    def inc_i(self):
        self.an_i += 1

    # is this even possible?
    @classproperty
    def I(cls):
        return cls.the_I

    @classmethod
    def inc_I(cls):
        cls.the_I += 1

e = Example()
assert e.i == 20
e.inc_i()
assert e.i == 21

assert Example.I == 10
Example.inc_I()
assert Example.I == 11
Is the syntax I've used above possible or would it require something more?
The reason I want class properties is so I can lazy load class attributes, which seems reasonable enough.
Here's how I would do this:
class ClassPropertyDescriptor(object):
    def __init__(self, fget, fset=None):
        self.fget = fget
        self.fset = fset

    def __get__(self, obj, klass=None):
        if klass is None:
            klass = type(obj)
        return self.fget.__get__(obj, klass)()

    def __set__(self, obj, value):
        if not self.fset:
            raise AttributeError("can't set attribute")
        type_ = type(obj)
        return self.fset.__get__(obj, type_)(value)

    def setter(self, func):
        if not isinstance(func, (classmethod, staticmethod)):
            func = classmethod(func)
        self.fset = func
        return self

def classproperty(func):
    if not isinstance(func, (classmethod, staticmethod)):
        func = classmethod(func)
    return ClassPropertyDescriptor(func)

class Bar(object):
    _bar = 1

    @classproperty
    def bar(cls):
        return cls._bar

    @bar.setter
    def bar(cls, value):
        cls._bar = value

# test instance instantiation
foo = Bar()
assert foo.bar == 1

baz = Bar()
assert baz.bar == 1

# test static variable
baz.bar = 5
assert foo.bar == 5

# test setting variable on the class
Bar.bar = 50
assert baz.bar == 50
assert foo.bar == 50
The setter didn't work when we assigned to Bar.bar, because that assignment calls TypeOfBar.bar.__set__, which is not Bar.bar.__set__.
Adding a metaclass definition solves this:
class ClassPropertyMetaClass(type):
    def __setattr__(self, key, value):
        if key in self.__dict__:
            obj = self.__dict__.get(key)
            if obj and type(obj) is ClassPropertyDescriptor:
                return obj.__set__(self, value)

        return super(ClassPropertyMetaClass, self).__setattr__(key, value)

# and update class define:
# class Bar(object):
#     __metaclass__ = ClassPropertyMetaClass
#     _bar = 1

# and update ClassPropertyDescriptor.__set__
#    def __set__(self, obj, value):
#        if not self.fset:
#            raise AttributeError("can't set attribute")
#        if inspect.isclass(obj):
#            type_ = obj
#            obj = None
#        else:
#            type_ = type(obj)
#        return self.fset.__get__(obj, type_)(value)
Now all will be fine.
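Putting those two updates together, a runnable Python 3 sketch of the metaclass variant could look like this (only __set__ and the metaclass differ from the code above):

import inspect

class ClassPropertyDescriptor(object):
    def __init__(self, fget, fset=None):
        self.fget = fget
        self.fset = fset
    def __get__(self, obj, klass=None):
        if klass is None:
            klass = type(obj)
        return self.fget.__get__(obj, klass)()
    def __set__(self, obj, value):
        if not self.fset:
            raise AttributeError("can't set attribute")
        if inspect.isclass(obj):        # class-level assignment routed in by the metaclass
            type_, obj = obj, None
        else:
            type_ = type(obj)
        return self.fset.__get__(obj, type_)(value)
    def setter(self, func):
        if not isinstance(func, (classmethod, staticmethod)):
            func = classmethod(func)
        self.fset = func
        return self

def classproperty(func):
    if not isinstance(func, (classmethod, staticmethod)):
        func = classmethod(func)
    return ClassPropertyDescriptor(func)

class ClassPropertyMetaClass(type):
    def __setattr__(cls, key, value):
        obj = cls.__dict__.get(key)
        if type(obj) is ClassPropertyDescriptor:
            return obj.__set__(cls, value)
        return super(ClassPropertyMetaClass, cls).__setattr__(key, value)

class Bar(metaclass=ClassPropertyMetaClass):
    _bar = 1

    @classproperty
    def bar(cls):
        return cls._bar

    @bar.setter
    def bar(cls, value):
        cls._bar = value

Bar.bar = 50           # now goes through the descriptor's __set__
assert Bar.bar == 50
assert Bar().bar == 50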
If you define classproperty as follows, then your example works exactly as you requested.
class classproperty(object):
    def __init__(self, f):
        self.f = f
    def __get__(self, obj, owner):
        return self.f(owner)
The caveat is that you can't use this for writable properties. While e.I = 20 won't go through the descriptor (it merely creates an instance attribute that shadows it), Example.I = 20 will overwrite the property object itself.
[answer written based on python 3.4; the metaclass syntax differs in 2 but I think the technique will still work]
You can do this with a metaclass...mostly. Dappawit's answer almost works, but I think it has a flaw:
class MetaFoo(type):
    @property
    def thingy(cls):
        return cls._thingy

class Foo(object, metaclass=MetaFoo):
    _thingy = 23
This gets you a classproperty on Foo, but there's a problem...
print("Foo.thingy is {}".format(Foo.thingy))
# Foo.thingy is 23
# Yay, the classmethod-property is working as intended!
foo = Foo()
if hasattr(foo, "thingy"):
print("Foo().thingy is {}".format(foo.thingy))
else:
print("Foo instance has no attribute 'thingy'")
# Foo instance has no attribute 'thingy'
# Wha....?
What the hell is going on here? Why can't I reach the class property from an instance?
I was beating my head on this for quite a while before finding what I believe is the answer. Python @property objects are a subset of descriptors, and, from the descriptor documentation (emphasis mine):
The default behavior for attribute access is to get, set, or delete the
attribute from an object’s dictionary. For instance, a.x has a lookup chain
starting with a.__dict__['x'], then type(a).__dict__['x'], and continuing
through the base classes of type(a) excluding metaclasses.
So the method resolution order doesn't include our class properties (or anything else defined in the metaclass). It is possible to make a subclass of the built-in property decorator that behaves differently, but (citation needed) I've gotten the impression googling that the developers had a good reason (which I do not understand) for doing it that way.
That doesn't mean we're out of luck; we can access the properties on the class itself just fine...and we can get the class from type(self) within the instance, which we can use to make @property dispatchers:
class Foo(object, metaclass=MetaFoo):
    _thingy = 23

    @property
    def thingy(self):
        return type(self).thingy
Now Foo().thingy works as intended for both the class and the instances! It will also continue to do the right thing if a derived class replaces its underlying _thingy (which is the use case that got me on this hunt originally).
This isn't 100% satisfying to me -- having to do setup in both the metaclass and object class feels like it violates the DRY principle. But the latter is just a one-line dispatcher; I'm mostly okay with it existing, and you could probably compact it down to a lambda or something if you really wanted.
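To illustrate, here is a short check using the definitions above (Baz is just a hypothetical subclass that overrides _thingy):

class Baz(Foo):
    _thingy = 42

print(Foo.thingy, Foo().thingy)  # 23 23
print(Baz.thingy, Baz().thingy)  # 42 42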
If you use Django, it has a built-in classproperty decorator.
from django.utils.decorators import classproperty
For Django 4, use:
from django.utils.functional import classproperty
I think you may be able to do this with the metaclass. Since the metaclass can be like a class for the class (if that makes sense). I know you can assign a __call__() method to the metaclass to override calling the class, MyClass(). I wonder if using the property decorator on the metaclass operates similarly.
Wow, it works:
class MetaClass(type):
    def getfoo(self):
        return self._foo
    foo = property(getfoo)

    @property
    def bar(self):
        return self._bar

class MyClass(object):
    __metaclass__ = MetaClass
    _foo = 'abc'
    _bar = 'def'

print MyClass.foo
print MyClass.bar
Note: This is in Python 2.7. Python 3+ uses a different technique to declare a metaclass. Use: class MyClass(metaclass=MetaClass):, remove __metaclass__, and the rest is the same.
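For completeness, the same example in Python 3 syntax would look roughly like this:

class MetaClass(type):
    @property
    def bar(cls):
        return cls._bar

class MyClass(metaclass=MetaClass):
    _bar = 'def'

print(MyClass.bar)  # def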
As far as I can tell, there is no way to write a setter for a class property without creating a new metaclass.
I have found that the following method works. Define a metaclass with all of the class properties and setters you want. I.e., I wanted a class with a title property with a setter. Here's what I wrote:
class TitleMeta(type):
    @property
    def title(self):
        return getattr(self, '_title', 'Default Title')

    @title.setter
    def title(self, title):
        self._title = title
        # Do whatever else you want when the title is set...
Now make the actual class you want as normal, except have it use the metaclass you created above.
# Python 2 style:
class ClassWithTitle(object):
    __metaclass__ = TitleMeta
    # The rest of your class definition...

# Python 3 style:
class ClassWithTitle(object, metaclass=TitleMeta):
    # Your class definition...
It's a bit weird to define this metaclass as we did above if we'll only ever use it on the single class. In that case, if you're using the Python 2 style, you can actually define the metaclass inside the class body. That way it's not defined in the module scope.
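Usage then looks something like this (a quick sketch, assuming a ClassWithTitle defined with the TitleMeta metaclass above):

class ClassWithTitle(metaclass=TitleMeta):
    pass

print(ClassWithTitle.title)        # Default Title
ClassWithTitle.title = 'My Title'  # goes through the metaclass setter
print(ClassWithTitle.title)        # My Title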
def _create_type(meta, name, attrs):
    type_name = f'{name}Type'
    type_attrs = {}
    for k, v in attrs.items():
        if type(v) is _ClassPropertyDescriptor:
            type_attrs[k] = v
    return type(type_name, (meta,), type_attrs)

class ClassPropertyType(type):
    def __new__(meta, name, bases, attrs):
        Type = _create_type(meta, name, attrs)
        cls = super().__new__(meta, name, bases, attrs)
        cls.__class__ = Type
        return cls

class _ClassPropertyDescriptor(object):
    def __init__(self, fget, fset=None):
        self.fget = fget
        self.fset = fset

    def __get__(self, obj, owner):
        if self in obj.__dict__.values():
            return self.fget(obj)
        return self.fget(owner)

    def __set__(self, obj, value):
        if not self.fset:
            raise AttributeError("can't set attribute")
        return self.fset(obj, value)

    def setter(self, func):
        self.fset = func
        return self

def classproperty(func):
    return _ClassPropertyDescriptor(func)

class Bar(metaclass=ClassPropertyType):
    __bar = 1

    @classproperty
    def bar(cls):
        return cls.__bar

    @bar.setter
    def bar(cls, value):
        cls.__bar = value
bar = Bar()
assert Bar.bar==1
Bar.bar=2
assert bar.bar==2
nbar = Bar()
assert nbar.bar==2
I happened to come up with a solution very similar to @Andrew's, only DRY:
import inspect

class MetaFoo(type):
    def __new__(mc1, name, bases, nmspc):
        nmspc.update({'thingy': MetaFoo.thingy})
        return super(MetaFoo, mc1).__new__(mc1, name, bases, nmspc)

    @property
    def thingy(cls):
        if not inspect.isclass(cls):
            cls = type(cls)
        return cls._thingy

    @thingy.setter
    def thingy(cls, value):
        if not inspect.isclass(cls):
            cls = type(cls)
        cls._thingy = value

class Foo(metaclass=MetaFoo):
    _thingy = 23

class Bar(Foo):
    _thingy = 12
This has the best of all answers:
The "metaproperty" is added to the class, so that it will still be a property of the instance
Don't need to redefine thingy in any of the classes
The property works as a "class property" for both the instance and the class
You have the flexibility to customize how _thingy is inherited
In my case, I actually customized _thingy to be different for every child, without defining it in each class (and without a default value) by:
def __new__(mc1, name, bases, nmspc):
    nmspc.update({'thingy': MetaFoo.thingy, '_thingy': None})
    return super(MetaFoo, mc1).__new__(mc1, name, bases, nmspc)
If you only need lazy loading, then you could just have a class initialisation method.
EXAMPLE_SET = False

class Example(object):
    @classmethod
    def initclass(cls):
        global EXAMPLE_SET
        if EXAMPLE_SET:
            return
        cls.the_I = 'ok'
        EXAMPLE_SET = True

    def __init__(self):
        Example.initclass()
        self.an_i = 20

try:
    print Example.the_I
except AttributeError:
    print 'ok class not "loaded"'
foo = Example()
print foo.the_I
print Example.the_I
But the metaclass approach seems cleaner, and with more predictable behavior.
Perhaps what you're looking for is the Singleton design pattern. There's a nice SO QA about implementing shared state in Python.

python nested attributes encapsulation

I have a question about encapsulating nested attributes in Python. Let's assume a few classes:
Here we have a main class (DataWrapper) that includes two other classes: InnerWrapper1 and InnerWrapper2. Both inner wrappers include two attributes.
class DataWrapper(object):
    @property
    def inner_wrapper1(self):
        return self.__inner_wrapper1

    @inner_wrapper1.setter
    def inner_wrapper1(self, value):
        self.__inner_wrapper1 = value

    @property
    def inner_wrapper2(self):
        return self.__inner_wrapper2

    @inner_wrapper2.setter
    def inner_wrapper2(self, value):
        self.__inner_wrapper2 = value

class InnerWrapper1(object):
    @property
    def property1(self):
        return self.__property1

    @property1.setter
    def property1(self, value):
        self.__property1 = value

    @property
    def property2(self):
        return self.__property2

    @property2.setter
    def property2(self, value):
        self.__property2 = value

class InnerWrapper2(object):
    @property
    def property3(self):
        return self.__property3

    @property3.setter
    def property3(self, value):
        self.__property3 = value

    @property
    def property4(self):
        return self.__property4

    @property4.setter
    def property4(self, value):
        self.__property4 = value
Is it possible to somehow override the getattr and setattr methods to make the encapsulation below possible? What I want to achieve is to have access to those nested attributes from the top class, DataWrapper.
data_wrapper = DataWrapper()
data_wrapper.property1 = "abc"
...
var = data_wrapper.property2
...
The first thing that came to my mind was to call hasattr inside getattr, but that gave a maximum recursion depth error...
Here's the complete code:
class DataWrapper(object):
    def __init__(self):
        self.inner_wrapper1 = InnerWrapper1()
        self.inner_wrapper2 = InnerWrapper2()

    @property
    def inner_wrapper1(self):
        return self.__inner_wrapper1

    @inner_wrapper1.setter
    def inner_wrapper1(self, value):
        self.__inner_wrapper1 = value

    @property
    def inner_wrapper2(self):
        return self.__inner_wrapper2

    @inner_wrapper2.setter
    def inner_wrapper2(self, value):
        self.__inner_wrapper2 = value

    def __setattr__(self, attribute, value):
        #if attribute in {'innerwrapper1', 'innerwrapper2'}:
        if attribute in ['inner_wrapper1', 'inner_wrapper2']:
            return super(DataWrapper, self).__setattr__(attribute, value)
        if hasattr(self.inner_wrapper1, attribute):
            return setattr(self.inner_wrapper1, attribute, value)
        elif hasattr(self.inner_wrapper2, attribute):
            return setattr(self.inner_wrapper2, attribute, value)

    def __getattr__(self, attribute):
        try:
            return getattr(self.inner_wrapper1, attribute)
        except AttributeError:
            pass
        try:
            return getattr(self.inner_wrapper2, attribute)
        except AttributeError:
            pass

class InnerWrapper1(object):
    @property
    def property1(self):
        return self.__property1

    @property1.setter
    def property1(self, value):
        self.__property1 = value

    @property
    def property2(self):
        return self.__property2

    @property2.setter
    def property2(self, value):
        self.__property2 = value

class InnerWrapper2(object):
    @property
    def property3(self):
        return self.__property3

    @property3.setter
    def property3(self, value):
        self.__property3 = value

    @property
    def property4(self):
        return self.__property4

    @property4.setter
    def property4(self, value):
        self.__property4 = value
def main():
    data_wrapper = DataWrapper()
    data_wrapper.property1 = "abc"

if __name__ == "__main__":
    main()
You get an infinite recursion error because you forgot to take into account setting the inner_wrapper1 and inner_wrapper2 attributes in your __init__ method.
When you do this:
self.inner_wrapper1 = InnerWrapper1()
Python will also use your __setattr__ method. This then tries to use self.inner_wrapper1 which doesn't yet exist so __getattr__ is called, which tries to use self.inner_wrapper1 which doesn't yet exist, and you enter into an infinite recursion loop.
In __setattr__ delegate attribute setting to the superclass:
def __setattr__(self, attribute, value):
    if attribute in {'innerwrapper1', 'innerwrapper2'}:
        return super(DataWrapper, self).__setattr__(attribute, value)
    if hasattr(self.inner_wrapper1, attribute):
        return setattr(self.inner_wrapper1, attribute, value)
    elif hasattr(self.inner_wrapper2, attribute):
        return setattr(self.inner_wrapper2, attribute, value)
If you used a single leading underscore for 'private' attributes (so _innerwrapper1 and _innerwrapper2) you could just test for that:
def __setattr__(self, attribute, value):
    if attribute[0] == '_':  # private attribute
        return super(DataWrapper, self).__setattr__(attribute, value)
so you don't have to hardcode a whole set of names.
Since your updated full script uses __inner_wrapper1 and __inner_wrapper2 as the actual attribute names, and you are using properties, you'll have to adjust your __setattr__ test to look for those names. Because you are using double-underscore names you need to adjust for the name mangling of such attributes:
def __setattr__(self, attribute, value):
    if attribute in {
            'inner_wrapper1', 'inner_wrapper2',
            '_DataWrapper__inner_wrapper1', '_DataWrapper__inner_wrapper2'}:
        return super(DataWrapper, self).__setattr__(attribute, value)
Unless you are going to subclass DataWrapper and must protect your attributes from accidental overriding, I'd avoid using double-underscored names altogether, however. In Pythonic code, you don't worry about other code accessing attributes, there is no concept of truly private attributes.
Using properties is also overkill here; properties don't buy you encapsulation, in Python you'd only use those to simplify the API (replacing a method call with attribute access).
Note that the hasattr() tests for the InnerWrapper* property* attributes will fail because you don't have default values:
>>> inner = InnerWrapper1()
>>> hasattr(inner, 'property1')
False
hasattr() doesn't test for properties, it simply tries to access an attribute and if any exception is raised it returns False:
>>> inner = InnerWrapper1()
>>> hasattr(inner, 'property1')
False
>>> inner.property1
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 43, in property1
AttributeError: 'InnerWrapper1' object has no attribute '_InnerWrapper1__property1'
>>> inner.property1 = 'foo'
>>> inner.property1
'foo'
>>> hasattr(inner, 'property1')
True
By removing all the @property objects you can simplify this greatly:
class DataWrapper(object):
    def __init__(self):
        self._inner_wrapper1 = InnerWrapper1()
        self._inner_wrapper2 = InnerWrapper2()

    def __setattr__(self, attribute, value):
        if attribute[0] == '_':
            return super(DataWrapper, self).__setattr__(attribute, value)
        if hasattr(self._inner_wrapper1, attribute):
            return setattr(self._inner_wrapper1, attribute, value)
        elif hasattr(self._inner_wrapper2, attribute):
            return setattr(self._inner_wrapper2, attribute, value)

    def __getattr__(self, attribute):
        try:
            return getattr(self._inner_wrapper1, attribute)
        except AttributeError:
            pass
        return getattr(self._inner_wrapper2, attribute)

class InnerWrapper1(object):
    property1 = None
    property2 = None

class InnerWrapper2(object):
    property3 = None
    property4 = None
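With the simplified version, the usage from the question works as expected (a quick check):

data_wrapper = DataWrapper()
data_wrapper.property1 = "abc"                  # forwarded to the first inner wrapper
print(data_wrapper.property1)                   # abc
print(data_wrapper._inner_wrapper1.property1)   # abc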

Using both __setattr__ and descriptors for a python class

I'm writing a python class that uses __setattr__ and __getattr__ to provide custom attribute access.
However, some attributes can't be handled in a generic way, so I was hoping to use descriptors for those.
A problem arises in that, for a descriptor, the descriptor's __get__ will be invoked in favour of the instance's __getattr__, but when assigning to an attribute, __setattr__ will be invoked in favour of the descriptor's __set__.
An example:
class MyDesc(object):
    def __init__(self):
        self.val = None

    def __get__(self, instance, owner):
        print "MyDesc.__get__"
        return self.val

    def __set__(self, instance, value):
        print "MyDesc.__set__"
        self.val = value

class MyObj(object):
    foo = MyDesc()

    def __init__(self, bar):
        object.__setattr__(self, 'names', dict(
            bar=bar,
        ))
        object.__setattr__(self, 'new_names', dict())

    def __setattr__(self, name, value):
        print "MyObj.__setattr__ for %s" % name
        self.new_names[name] = value

    def __getattr__(self, name):
        print "MyObj.__getattr__ for %s" % name
        if name in self.new_names:
            return self.new_names[name]
        if name in self.names:
            return self.names[name]
        raise AttributeError(name)

if __name__ == "__main__":
    o = MyObj('bar-init')
    o.bar = 'baz'
    print o.bar
    o.foo = 'quux'
    print o.foo
prints:
MyObj.__setattr__ for bar
MyObj.__getattr__ for bar
baz
MyObj.__setattr__ for foo
MyDesc.__get__
None
The descriptor's __set__ is never called.
Since the __setattr__ definition isn't just overriding behaviour for a limited set of names, there's no clear place that it can defer to object.__setattr__
Is there a recommended way to have assigning to attributes use the descriptor, if available, and __setattr__ otherwise?
I think I'd approach this by having a mechanism to automatically mark which attributes are descriptors in each class, and wrapping __setattr__ so that it falls back to object's normal behavior for those names.
This can easily be achieved with a metaclass (and a decorator for __setattr__):
def setattr_deco(setattr_func):
    def setattr_wrapper(self, attr, value):
        if attr in self._descriptors:
            return object.__setattr__(self, attr, value)
        return setattr_func(self, attr, value)
    return setattr_wrapper

class MiscSetattr(type):
    def __new__(metacls, name, bases, dct):
        descriptors = set()
        for key, obj in dct.items():
            if key == "__setattr__":
                dct[key] = setattr_deco(obj)
            elif hasattr(obj, "__get__"):
                descriptors.add(key)
        dct["_descriptors"] = descriptors
        return type.__new__(metacls, name, bases, dct)
# and use MiscSetattr as metaclass for your classes
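For example, wiring it into the question's classes might look like this (Python 3 spelling; MyDesc is the descriptor from the question, with its print statements ported to Python 3):

class MyObj(object, metaclass=MiscSetattr):
    foo = MyDesc()

    def __init__(self, bar):
        object.__setattr__(self, 'names', dict(bar=bar))
        object.__setattr__(self, 'new_names', dict())

    def __setattr__(self, name, value):
        self.new_names[name] = value

    def __getattr__(self, name):
        if name in self.new_names:
            return self.new_names[name]
        if name in self.names:
            return self.names[name]
        raise AttributeError(name)

o = MyObj('bar-init')
o.foo = 'quux'       # now routed through MyDesc.__set__
o.bar = 'baz'        # still handled by MyObj.__setattr__
print(o.foo, o.bar)  # quux baz (plus the MyDesc.__get__ trace)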
One possible way:
def __setattr__(self, name, value):
    print "MyObj.__setattr__ for %s" % name
    for cls in self.__class__.__mro__ + (self, ):
        if name in cls.__dict__:
            return object.__setattr__(self, name, value)
    print 'New name', name, value
    self.new_names[name] = value
It checks whether the name is already defined on the class, its base classes, or the instance, and then calls object.__setattr__, which will execute the descriptor's __set__.
Another way:
def __setattr__(self, name, value):
    print "MyObj.__setattr__ for %s" % name
    try:
        object.__getattribute__(self, name)
    except AttributeError:
        print 'New name', name, value
        self.new_names[name] = value
    else:
        object.__setattr__(self, name, value)
But it will call descriptor's __get__.
P.S.
I'm not sure about the need to check all __mro__ members, since MyObj will contain inherited class members in __dict__.
Maybe for cls in (self.__class__, self):... will be enough.

Can I get a reference to the 'owner' class during the __init__ method of a descriptor?

Is it possible to access the 'owner' class inside a descriptor during the __init__ function of that descriptor, without passing it in manually as in this example?
class FooDescriptor(object):
    def __init__(self, owner):
        # do things to owner here
        setattr(owner, 'bar_attribute', 'bar_value')

class BarClass(object):
    foo_attribute = FooDescriptor(owner=BarClass)
One way to do something like that is with a metaclass. Just make sure it's really what you want, and don't just copy blindly if you don't understand how it works.
class Descriptor(object):
    pass

class Meta(type):
    def __new__(cls, name, bases, attrs):
        obj = type.__new__(cls, name, bases, attrs)
        # obj is now a type instance

        # this loop looks for Descriptor subclasses
        # and instantiates them, passing the type as the first argument
        for name, attr in attrs.iteritems():
            if isinstance(attr, type) and issubclass(attr, Descriptor):
                setattr(obj, name, attr(obj))

        return obj

class FooDescriptor(Descriptor):
    def __init__(self, owner):
        owner.foo = 42

class BarClass(object):
    __metaclass__ = Meta
    foo_attribute = FooDescriptor  # will be instantiated by the metaclass

print BarClass.foo
If you need to pass additional arguments, you could use e.g. a tuple of (class, args) in the place of the class, or make FooDescriptor a decorator that would return a class that takes only one argument in the ctor.
Since Python 3.6, you can use the __set_name__ special method:
class FooDescriptor(object):
    def __set_name__(self, owner, name):
        owner.foo = 42

class BarClass(object):
    foo_attribute = FooDescriptor()
    # foo_attribute.__set_name__(BarClass, "foo_attribute") is called after the class definition
__set_name__ is automatically called on all descriptors in a class immediately after the class is created.
See PEP 487 for more details.
