Perfectly mimic inheritance with composition (Python)

I am attempting to wrap a class from a third-party package in such a way that my new class looks exactly like a subclass of the third-party class. The third-party class does not support inheritance, and it has nontrivial features, such as functions that have a __getitem__ method. I can wrap almost every attribute and method using a solution based on Wrapping a class whose methods return instances of that class and How can I intercept calls to python's "magic" methods in new style classes?. However, I still need to override the __init__ method of the third-party class. How can I do that? Note: I am using new-style classes.
Code so far:
import copy

class WrapperMetaclass(type):
    """
    Works with the `Wrapper` class to create proxies for the wrapped object's magic methods.
    """
    def __init__(cls, name, bases, dct):
        def make_proxy(name):
            def proxy(self, *args):
                return getattr(self._obj, name)
            return proxy
        type.__init__(cls, name, bases, dct)
        if cls.__wraps__:
            ignore = set("__%s__" % n for n in cls.__ignore__.split())
            for name in dir(cls.__wraps__):
                if name.startswith("__"):
                    if name not in ignore and name not in dct:
                        setattr(cls, name, property(make_proxy(name)))

class Wrapper(object):
    """
    Used to provide a (nearly) seamless inheritance-like interface for classes that do not support direct inheritance.
    """
    __metaclass__ = WrapperMetaclass
    __wraps__ = None
    # note that the __init__ method will be ignored by WrapperMetaclass
    __ignore__ = "class mro new init setattr getattr getattribute dict"

    def __init__(self, obj):
        if self.__wraps__ is None:
            raise TypeError("base class Wrapper may not be instantiated")
        elif isinstance(obj, self.__wraps__):
            self._obj = obj
        else:
            raise ValueError("wrapped object must be of %s" % self.__wraps__)

    def __getattr__(self, name):
        orig_attr = self._obj.__getattribute__(name)
        if callable(orig_attr) and not hasattr(orig_attr, '__getitem__'):
            def hooked(*args, **kwargs):
                result = orig_attr(*args, **kwargs)
                if result is self._obj:
                    return self
                elif isinstance(result, self.__wraps__):
                    return self.__class__(result)
                else:
                    return result
            return hooked
        else:
            return orig_attr

    def __setattr__(self, attr, val):
        object.__setattr__(self, attr, val)
        if getattr(self._obj, attr, self._obj) is not self._obj:  # update _obj's member if it exists
            setattr(self._obj, attr, getattr(self, attr))

class ClassToWrap(object):
    def __init__(self, data):
        self.data = data
    def theirfun(self):
        new_obj = copy.deepcopy(self)
        new_obj.data += 1
        return new_obj
    def __str__(self):
        return str(self.data)

class Wrapped(Wrapper):
    __wraps__ = ClassToWrap
    def myfun(self):
        new_obj = copy.deepcopy(self)
        new_obj.data += 1
        return new_obj

# can't instantiate Wrapped directly! This is the problem!
obj = ClassToWrap(0)
wr0 = Wrapped(obj)
print wr0
>> 0
print wr0.theirfun()
>> 1
This works, but for truly seamless inheritance-like behavior, I need to instantiate Wrapped directly, e.g.
wr0 = Wrapped(0)
which currently throws
ValueError: wrapped object must be of <class '__main__.ClassToWrap'>
I attempted to override __init__ by defining a new proxy for it in WrapperMetaclass, but rapidly ran into infinite recursions.
My codebase is complex with users at different skill levels, so I can't afford to use monkey-patching or solutions that modify the definition of the example classes ClassToWrap or Wrapped. I am really hoping for an extension to the code above that overrides Wrapped.__init__.
Please note that this question is not simply a duplicate of e.g. Can I exactly mimic inheritance behavior with delegation by composition in Python?. That post does not have any answer that is nearly as detailed as what I'm already providing here.

It sounds like you just want the Wrapper.__init__ method to work differently than it currently does. Rather than taking an already existing instance of the __wraps__ class, it should take the arguments that the other class expects in its constructor and build the instance for you. Try something like this:
def __init__(self, *args, **kwargs):
    if self.__wraps__ is None:
        raise TypeError("base class Wrapper may not be instantiated")
    else:
        self._obj = self.__wraps__(*args, **kwargs)
If you want Wrapper to remain the same for some reason, you could put the logic in a new Wrapped.__init__ method instead:
def __init__(self, data):  # I'm explicitly naming the argument here, but you could use *args
    super(Wrapped, self).__init__(self.__wraps__(data))  # and **kwargs to make it extensible
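Putting the pieces together, here is a minimal sketch of the constructor-forwarding approach (deliberately stripped down: it omits the magic-method proxy metaclass and only delegates regular attributes, so the focus stays on the __init__ change):

```python
import copy

class ClassToWrap(object):
    def __init__(self, data):
        self.data = data
    def theirfun(self):
        new_obj = copy.deepcopy(self)
        new_obj.data += 1
        return new_obj

class Wrapper(object):
    __wraps__ = None
    def __init__(self, *args, **kwargs):
        if self.__wraps__ is None:
            raise TypeError("base class Wrapper may not be instantiated")
        # build the wrapped instance from the constructor arguments
        self._obj = self.__wraps__(*args, **kwargs)
    def __getattr__(self, name):
        # delegate everything else to the wrapped object
        return getattr(self._obj, name)

class Wrapped(Wrapper):
    __wraps__ = ClassToWrap

wr0 = Wrapped(0)              # now instantiable directly, no pre-built obj needed
print(wr0.data)               # -> 0
print(wr0.theirfun().data)    # -> 1
```

With this change, Wrapped(0) forwards its arguments straight to ClassToWrap, which is exactly the seamless construction the question asks for.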

Related

How to create a metaclass that can give a class an array of instances and provide a "voodoo" instance that acts on all class instances?

I'm wondering how to create a metaclass in Python that can create other classes that:
Store their instances in an array automatically
Have a special instance, NonMetaClass.all, whose properties:
When set, set all the class's instances with the same key to the same value (e.g., Foo.all.num = 3 makes all instances of Foo have a num of 3)
When accessed (get), returns an array of all of the class's instances's key values (e.g., Foo.all.num returns [5, 3, 2])
Cannot be deleted.
When called (if the attribute is a function), call that method on all the instances of a class.
In Python terms, I would like to turn a class that is like this:
class Foo(object):
    BAR = 23
    def __init__(self):
        self.a = 5
    def pointless(self):
        print 'pointless.'
    def change_a(self):
        self.a = 52
Into this:
class Foo(object):
    BAR = 23
    instances = []
    all = # Some black magic to create the special "all" instance
    def __init__(self):
        self.a = 5
        Foo.instances.append(self)
    def pointless(self):
        print 'pointless.'
    def change_a(self):
        self.a = 52
And be able to use it like this:
>>> Foo()
>>> Foo.instances[0]
<__main__.Foo instance at 0x102ff5758>
>>> Foo()
>>> len(Foo.instances)
2
>>> Foo.all.a = 78
78
>>> Foo.all.a
[78, 78]
>>> Foo.all.change_a()
>>> Foo.all.a
[52, 52]
>>>
The only things a metaclass is actually needed for here are quite easy:
creating the instances and all attributes.
All it has to do is insert those into the class namespace. Ah, it will also have to wrap the class __new__ method so new instances are inserted into the instances list.
The interesting part is the behavior wanted from all, and that can be implemented using the descriptor protocol and attribute access control, so we have to craft a couple of special classes that will return the appropriate objects when requested after the ".".
"All" is the class that will be instantiated as "all" - it just needs a __get__ method to return another special object, from the AllAttr class, already bound to the owner class.
"AllAttr" is a special object that, on any attribute access, performs your requirements on the members of the owner class's "instances" attribute.
And "CallAllList" is a special list subclass that is callable, and calls all its members in turn. It is used by AllAttr when the requested attribute of the owner class is itself callable.
class CallAllList(list):
    def __call__(self, *args, **kwargs):
        return [instance(*args, **kwargs) for instance in self]

class AllAttr(object):
    def __init__(self, owner):
        self._owner = owner
    def __getattr__(self, attr):
        method = getattr(self._owner, attr, None)
        cls = CallAllList if callable(method) else list
        return cls(getattr(obj, attr) for obj in self._owner.instances)
    def __setattr__(self, attr, value):
        if attr == "_owner":
            return super(AllAttr, self).__setattr__(attr, value)
        for obj in self._owner.instances:
            setattr(obj, attr, value)

class All(object):
    def __get__(self, instance, owner):
        return AllAttr(owner)
    def __repr__(self):
        return "Representation of all instances of '{}'".format(self.__class__.__name__)

class MetaAll(type):
    def __new__(metacls, name, bases, namespace):
        namespace["all"] = All()
        namespace["instances"] = []
        cls = super(MetaAll, metacls).__new__(metacls, name, bases, namespace)
        original_new = getattr(cls, "__new__")
        def __new__(cls, *args, **kwargs):
            instance = original_new(cls, *args, **kwargs)
            cls.instances.append(instance)
            return instance
        cls.__new__ = __new__
        return cls

class Foo(metaclass=MetaAll):
    pass
The code above is written to be both Python 3 and Python 2 compatible, since you appear to still be using Python 2, given your print example.
The only thing that cannot be written compatibly with both versions is the metaclass declaration itself - just declare __metaclass__ = MetaAll inside the body of your Foo class if you are using Python 2. But you should not really be using Python 2; switch to Python 3 as soon as you can.
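As a quick check of the Python 3 version, here is the whole thing condensed into one runnable snippet (the only deviation, flagged here as my own: `original_new` is called without extra arguments, since `object.__new__` accepts only the class):

```python
class CallAllList(list):
    def __call__(self, *args, **kwargs):
        return [instance(*args, **kwargs) for instance in self]

class AllAttr(object):
    def __init__(self, owner):
        self._owner = owner
    def __getattr__(self, attr):
        method = getattr(self._owner, attr, None)
        cls = CallAllList if callable(method) else list
        return cls(getattr(obj, attr) for obj in self._owner.instances)
    def __setattr__(self, attr, value):
        if attr == "_owner":
            return super().__setattr__(attr, value)
        for obj in self._owner.instances:
            setattr(obj, attr, value)

class All(object):
    def __get__(self, instance, owner):
        return AllAttr(owner)

class MetaAll(type):
    def __new__(metacls, name, bases, namespace):
        namespace["all"] = All()
        namespace["instances"] = []
        cls = super().__new__(metacls, name, bases, namespace)
        original_new = cls.__new__
        def __new__(cls, *args, **kwargs):
            instance = original_new(cls)   # object.__new__ takes no extra arguments
            cls.instances.append(instance)
            return instance
        cls.__new__ = __new__
        return cls

class Foo(metaclass=MetaAll):
    def __init__(self):
        self.a = 5
    def change_a(self):
        self.a = 52

Foo(); Foo()
Foo.all.a = 78
print(Foo.all.a)    # [78, 78]
Foo.all.change_a()
print(Foo.all.a)    # [52, 52]
```

This reproduces the behavior the question's interpreter session sketches: broadcast sets, list-returning gets, and broadcast method calls.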
update
It happens that Python 2 has the "unbound method" concept, and the special-casing of __new__ does not work like in Python 3: you can't just assign a function named __new__ to the class. In order to get the correct __new__ method from the superclasses, the easiest way is to create a disposable class so its __mro__ can be searched linearly. Otherwise, one would have to reimplement the MRO algorithm to find the proper __new__ method.
So, for Python 2, the metaclass should be this:
class MetaAll(type):
    def __new__(metacls, name, bases, namespace):
        namespace["all"] = All()
        namespace["instances"] = []
        if "__new__" in namespace:
            original_new = namespace["__new__"]
            def __new__(cls, *args, **kwargs):
                instance = original_new(cls, *args, **kwargs)
                cls.instances.append(instance)
                return instance
        else:
            # We create a disposable class just to get the '__mro__'
            stub_cls = super(MetaAll, metacls).__new__(metacls, name, bases, {})
            for parent in stub_cls.__mro__[1:]:
                if "__new__" in parent.__dict__:
                    original_new = parent.__dict__["__new__"]
                    break
            def __new__(cls, *args, **kwargs):
                instance = original_new(cls, *args, **kwargs)
                cls.instances.append(instance)
                return instance
        namespace["__new__"] = __new__
        final_cls = super(MetaAll, metacls).__new__(metacls, name, bases, namespace)
        return final_cls

class Foo(object):
    __metaclass__ = MetaAll
(now, again, this thing is ancient. Just settle for Python 3.6)
Ok, I figured out how to do this for Python 2.7 on my own. This is what I believe to be the best solution, though it may not be the only one. It allows you to set, get, and call functions on attributes of Class.all. I've named the metaclass InstanceUnifier, but please comment if you can think of a better (shorter, more descriptive) name.
class InstanceUnifier(type):
    '''
    What we want: a metaclass that can give a class an array of instances and provide a static Class.all object that, when a method is called on it, calls the same method on every instance of the class.
    '''
    def __new__(cls, name, base_classes, dct):
        dct['all'] = None
        dct['instances'] = []
        return type.__new__(cls, name, base_classes, dct)

    def __init__(cls, name, base_classes, dct):
        class Accessor(object):
            def __getattribute__(self, name):
                array = [getattr(inst, name) for inst in cls.instances]
                if all(callable(item) for item in array):
                    def proxy_func(*args, **kwargs):
                        for func in array:
                            func(*args, **kwargs)
                    return proxy_func
                elif all(not callable(item) for item in array):
                    return array
                else:
                    raise RuntimeError('Some objects in class instance array for key "' + name + '" are callable, some are not.')
            def __setattr__(self, name, value):
                for inst in cls.instances:
                    setattr(inst, name, value)
            def __delattr__(self, name):
                for inst in cls.instances:
                    delattr(inst, name)
        cls.all = Accessor()
        return type.__init__(cls, name, base_classes, dct)

    def __call__(cls, *args, **kwargs):
        inst = type.__call__(cls, *args, **kwargs)
        cls.instances.append(inst)
        return inst
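For reference, a condensed, runnable version of the same metaclass with a small demo attached (Python 3 class syntax; the mixed-callable RuntimeError branch is folded away here for brevity):

```python
class InstanceUnifier(type):
    """Metaclass: gives each class an `instances` list and an `all` accessor
    that broadcasts gets, sets, and method calls to every instance."""
    def __new__(cls, name, bases, dct):
        dct['all'] = None
        dct['instances'] = []
        return type.__new__(cls, name, bases, dct)

    def __init__(cls, name, bases, dct):
        class Accessor(object):
            def __getattribute__(self, name):
                array = [getattr(inst, name) for inst in cls.instances]
                if array and all(callable(item) for item in array):
                    def proxy_func(*args, **kwargs):
                        for func in array:      # call the attribute on every instance
                            func(*args, **kwargs)
                    return proxy_func
                return array
            def __setattr__(self, name, value):
                for inst in cls.instances:      # broadcast the assignment
                    setattr(inst, name, value)
        cls.all = Accessor()
        type.__init__(cls, name, bases, dct)

    def __call__(cls, *args, **kwargs):
        inst = type.__call__(cls, *args, **kwargs)
        cls.instances.append(inst)
        return inst


class Foo(metaclass=InstanceUnifier):
    def __init__(self):
        self.a = 5
    def change_a(self):
        self.a = 52


Foo(); Foo()
Foo.all.a = 78
print(Foo.all.a)      # [78, 78]
Foo.all.change_a()
print(Foo.all.a)      # [52, 52]
```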

How to make a "switch" between functions, depending on attribute access type (using class or instance)?

Disclaimer:
This article is more a recipe than a question, but I found the subject quite interesting, with almost no references on the Web.
If there is any better place on StackOverflow to publish this kind of article, please let me know.
Subject:
How can I force Python to invoke a different function depending on the type of attribute access (via the class or via an instance) - e.g. force Python to invoke a different method for MyClass.my_method() and MyClass().my_method()?
Usecase:
Let's say we have a custom Enum implementation (based on the Python 3.6 Enum, but with some customization). As a user of this Enum, we want to create a CustomEnum that inherits not just from Enum, but also from str: class MyEnum(str, Enum). We also want to add an encoding and decoding feature. Our idea is to use MyEnum.encode to encode any object that includes our enum members, but leave the original str.encode in force for instances of our enum class.
In short: MyEnum.encode invokes our custom encoding function, which makes perfect sense from this point of view. A MyEnum member is a string, so MyEnum().encode should invoke the encode function inherited from the str class.
Solution:
Write a descriptor, which will work as a switch.
Full answer in my first post.
Solution:
As far as I know, descriptors are the only objects that can distinguish whether they are invoked for a class or an instance, because of the __get__ function signature: __get__(self, instance, instance_type). This property allows us to build a switch on top of it.
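The core of that trick can be seen in a minimal sketch before the full implementation (all names here are made up for illustration): the descriptor checks whether `instance` is None and binds one of two functions accordingly.

```python
import types

class classinstanceswitch(object):
    """Minimal sketch: dispatch to one function for class access,
    another for instance access."""
    def __init__(self, cls_func, inst_func):
        self._cls_func = cls_func
        self._inst_func = inst_func

    def __get__(self, instance, owner):
        if instance is None:                          # accessed on the class
            return types.MethodType(self._cls_func, owner)
        return types.MethodType(self._inst_func, instance)

def _cls_version(cls):
    return "class access"

def _inst_version(self):
    return "instance access"

class Demo(object):
    encode = classinstanceswitch(_cls_version, _inst_version)

print(Demo.encode())     # class access
print(Demo().encode())   # instance access
```

The full boundmethod class that follows builds on exactly this `instance is None` check, adding MRO lookup and decorator-style constructors.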
from types import FunctionType

def _is_descriptor(obj):
    # helper equivalent to the one in the stdlib enum module
    return (hasattr(obj, '__get__') or hasattr(obj, '__set__') or hasattr(obj, '__delete__'))

class boundmethod(object):
    def __init__(self, cls_method=None, instance_method=None, doc=None):
        self._cls_method = cls_method
        self._instance_method = instance_method
        if cls_method:
            self._method_name = cls_method.__name__
        elif instance_method:
            self._method_name = instance_method.__name__
        if doc is None and cls_method is not None:
            doc = cls_method.__doc__
        self.__doc__ = doc
        self._method = None
        self._object = None

    def _find_method(self, instance, instance_type, method_name):
        for base in instance_type.mro()[1:]:
            method = getattr(base, method_name, None)
            if _is_descriptor(method):
                method = method.__get__(instance, base)
            if method and method is not self:
                try:
                    return method.__func__
                except AttributeError:
                    return method

    def __get__(self, instance, instance_type):
        if instance is None:
            self._method = self._cls_method or self._find_method(instance, instance_type, self._method_name)
            self._object = instance_type
        else:
            self._method = self._instance_method or self._find_method(instance, instance_type, self._method_name)
            self._object = instance
        return self

    @staticmethod
    def cls_method(obj=None):
        def constructor(cls_method):
            if obj is None:
                return boundmethod(cls_method, None, cls_method.__doc__)
            else:
                return type(obj)(cls_method, obj._instance_method, obj.__doc__)
        if isinstance(obj, FunctionType):
            return boundmethod(obj, None, obj.__doc__)
        else:
            return constructor

    @staticmethod
    def instance_method(obj=None):
        def constructor(instance_method):
            if obj is None:
                return boundmethod(None, instance_method, instance_method.__doc__)
            else:
                return type(obj)(obj._cls_method, instance_method, obj.__doc__)
        if isinstance(obj, FunctionType):
            return boundmethod(None, obj, obj.__doc__)
        else:
            return constructor

    def __call__(self, *args, **kwargs):
        if self._method:
            try:
                return self._method(self._object, *args, **kwargs)
            except TypeError:
                return self._method(*args, **kwargs)
        return None
Example:
>>> class Walkmen(object):
... #boundmethod.cls_method
... def start(self):
... return 'Walkmen start class bound method'
... #boundmethod.instance_method(start)
... def start(self):
... return 'Walkmen start instance bound method'
>>> print Walkmen.start()
Walkmen start class bound method
>>> print Walkmen().start()
Walkmen start instance bound method
I hope it will help some of you guys.
Best.
I actually just asked this question (Python descriptors and inheritance) - I hadn't seen this one. My solution uses descriptors and a metaclass for inheritance.
From my answer:
import types

class dynamicmethod:
    '''
    Descriptor to allow dynamic dispatch on calls to class.Method vs obj.Method.
    Fragile when used with inheritance: to inherit and then overwrite or extend
    a dynamicmethod, a class must have dynamicmethod_meta as its metaclass.
    '''
    def __init__(self, f=None, m=None):
        self.f = f
        self.m = m
    def __get__(self, obj, objtype=None):
        if obj is not None and self.f is not None:
            return types.MethodType(self.f, obj)
        elif objtype is not None and self.m is not None:
            return types.MethodType(self.m, objtype)
        else:
            raise AttributeError('No associated method')
    def method(self, f):
        return type(self)(f, self.m)
    def classmethod(self, m):
        return type(self)(self.f, m)

def make_dynamicmethod_meta(meta):
    class _dynamicmethod_meta(meta):
        def __prepare__(name, bases, **kwargs):
            d = meta.__prepare__(name, bases, **kwargs)
            for base in bases:
                for k, v in base.__dict__.items():
                    if isinstance(v, dynamicmethod):
                        if k in d:
                            raise ValueError('Multiple base classes define the same dynamicmethod')
                        d[k] = v
            return d
    return _dynamicmethod_meta

dynamicmethod_meta = make_dynamicmethod_meta(type)

class A(metaclass=dynamicmethod_meta):
    @dynamicmethod
    def a(self):
        print('Called from obj {} defined in A'.format(self))
    @a.classmethod
    def a(cls):
        print('Called from class {} defined in A'.format(cls))

class B(A):
    @a.method
    def a(self):
        print('Called from obj {} defined in B'.format(self))

A.a()
A().a()
B.a()
B().a()
results in:
Called from class <class 'A'> defined in A
Called from obj <A object at ...> defined in A
Called from class <class 'B'> defined in A
Called from obj <B object at ...> defined in B

Writing a setting method for

In a class I am writing, one of the member properties is a list:
@property
def address_list(self):
    return self._address_list

@address_list.setter
def address_list(self, addr_list):
    if type(addr_list) is not list:
        return
    self._address_list = addr_list
I want to be able to write a property so if someone wanted to append something onto the list, it would call something like another setter function, but for adding onto the list:
Object.address_list.append(value)
would call something like
@property.appender  # I made this line up
def address_list.append(self, value):
    if value >= 0 and value <= 127:
        self._address_list.append(value)
so I could safely append values to my private list. Is something like this possible without having to create a new type of list object?
EDIT: address_list returns a standard Python list.
You would need to create a new AddressList class to handle this, something like:
class Wrapper(object):
    """Wrapper class that provides proxy access to an instance of some
    internal instance."""
    __wraps__ = None
    __ignore__ = "class mro new init setattr getattr getattribute"

    def __init__(self, obj):
        if self.__wraps__ is None:
            raise TypeError("base class Wrapper may not be instantiated")
        elif isinstance(obj, self.__wraps__):
            self._obj = obj
        else:
            raise ValueError("wrapped object must be of %s" % self.__wraps__)

    # provide proxy access to regular attributes of wrapped object
    def __getattr__(self, name):
        return getattr(self._obj, name)

    # create proxies for wrapped object's double-underscore attributes
    class __metaclass__(type):
        def __init__(cls, name, bases, dct):
            def make_proxy(name):
                def proxy(self, *args):
                    return getattr(self._obj, name)
                return proxy
            type.__init__(cls, name, bases, dct)
            if cls.__wraps__:
                ignore = set("__%s__" % n for n in cls.__ignore__.split())
                for name in dir(cls.__wraps__):
                    if name.startswith("__"):
                        if name not in ignore and name not in dct:
                            setattr(cls, name, property(make_proxy(name)))

class AddressList(Wrapper):
    __wraps__ = list
    def append(self, x):
        if 0 <= x <= 127:
            self._obj.append(x)
        else:
            raise ValueError("Cannot append %r" % x)

class MyContainer:
    def __init__(self):
        self.address_list = AddressList([])

x = MyContainer()
x.address_list.append(1)
x.address_list.append(7)
x.address_list.append(-1)  # raises ValueError
print x.address_list
(Note that this answer borrows heavily from https://stackoverflow.com/a/9059858/541038.)

Writing a function to define class properties

I was recently writing a definition for a pretty basic data class in Python and I came up with the following:
class A:
    def __init__(self, **kwargs):
        self.__a1 = kwargs.get('some_value', -1)

    @property
    def a1(self):
        return self.__a1

    @a1.setter
    def a1(self, new_a1):
        self.__a1 = new_a1
And it goes on. In this case, the value -1 could be replaced with a variety of "null" values: -1, "", [], etc., and some_value comes from an Enum I defined earlier.
Because the class definition contains several of these property definitions, and they're all very "same-y", I'd like to write a function to do this for me. I'm pretty sure it's possible in Python but I've never tried it so I was hoping for some pointers.
Assuming you want to simplify the repetitive property definitions, you can use a generic descriptor to simplify this significantly:
class ProtectedAttribute(object):
    """Basic descriptor functionality for a protected attribute.

    Args:
        name (str): The name of the attribute to back the descriptor
            (usually the name the descriptor is assigned to with a single
            additional leading underscore).
    """
    def __init__(self, name, **kwargs):
        self.name = name

    def __get__(self, obj, typ):
        return getattr(obj, self.name)

    def __set__(self, obj, value):
        setattr(obj, self.name, value)

    def __delete__(self, obj):
        delattr(obj, self.name)
Now you can just do:
class A(object):
    a1 = ProtectedAttribute('__a1')

    def __init__(self, **kwargs):
        self.a1 = kwargs.get("some_value", -1)
Note also the use of dict.get to simplify __init__.
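For instance, a small check of the descriptor pattern above (self-contained; I've chosen `_a1` as the backing name here, matching the single-leading-underscore convention the docstring describes):

```python
class ProtectedAttribute(object):
    """Descriptor backing a public attribute with a private name."""
    def __init__(self, name):
        self.name = name
    def __get__(self, obj, typ):
        return getattr(obj, self.name)
    def __set__(self, obj, value):
        setattr(obj, self.name, value)

class A(object):
    a1 = ProtectedAttribute('_a1')
    def __init__(self, **kwargs):
        self.a1 = kwargs.get('some_value', -1)

a = A()
print(a.a1)    # -1
a.a1 = 5
print(a._a1)   # 5  (the descriptor stored it under the backing name)
```

One descriptor class now replaces every repetitive property/setter pair, which is exactly the "same-y" boilerplate the question wanted to eliminate.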

python: defining registry in base class

I'm implementing enumeration using a base class that defines a variety of methods. The actual enumerations are subclasses of that, with no additional methods or attributes. (Each subclass is populated with its own values using the constructor defined in the base class).
I use a registry (a class attribute that stores all the instances of that class). Ideally, I'd like to avoid defining it in each subclass. Unfortunately, if I define it in the base class, all the subclasses will end up sharing the same registry.
What's a good approach here?
Below is the implementation in case it helps (it's based on @jchl's comment in "python enumeration class for ORM purposes").
class IterRegistry(type):
    def __iter__(cls):
        return iter(cls._registry.values())

class EnumType(metaclass=IterRegistry):
    _registry = {}
    _frozen = False

    def __init__(self, token):
        if hasattr(self, 'token'):
            return
        self.token = token
        self.id = len(type(self)._registry)
        type(self)._registry[token] = self

    def __new__(cls, token):
        if token in cls._registry:
            return cls._registry[token]
        else:
            if cls._frozen:
                raise TypeError('No more instances allowed')
            else:
                return object.__new__(cls)

    @classmethod
    def freeze(cls):
        cls._frozen = True

    def __repr__(self):
        return self.token

    @classmethod
    def instance(cls, token):
        return cls._registry[token]

class Enum1(EnumType): pass

Enum1('a')
Enum1('b')
for i in Enum1:
    print(i)

# not going to work properly because _registry is shared
class Enum2(EnumType): pass
As you already have a metaclass, you might as well use it to add a separate _registry attribute to each subclass automatically.
class IterRegistry(type):
    def __new__(cls, name, bases, attr):
        attr['_registry'] = {}  # now every class has its own _registry
        return type.__new__(cls, name, bases, attr)
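A quick demonstration that the metaclass-injected dict really is per subclass (condensed registry logic, Python 3 syntax):

```python
class IterRegistry(type):
    def __new__(cls, name, bases, attrs):
        attrs['_registry'] = {}   # each class body gets its own dict
        return type.__new__(cls, name, bases, attrs)
    def __iter__(cls):
        return iter(cls._registry.values())

class EnumType(metaclass=IterRegistry):
    def __init__(self, token):
        self.token = token
        type(self)._registry[token] = self   # lands in the subclass's own dict

class Enum1(EnumType): pass
class Enum2(EnumType): pass

Enum1('a'); Enum1('b'); Enum2('c')
print([e.token for e in Enum1])   # ['a', 'b']
print([e.token for e in Enum2])   # ['c']
```

Because __new__ of the metaclass runs once per class statement, Enum1 and Enum2 each receive a fresh _registry instead of sharing the one defined on EnumType.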
Marty Alchin has a very nice pattern for this: see his blog entry.
What if you share the same registry, but with sub-registries per class, i.e.
if cls.__name__ not in self._registry:
    self._registry[cls.__name__] = {}
self._registry[cls.__name__][token] = cls
You actually don't even need cls.__name__, you should be able to use cls itself as key.
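A sketch of that variant, using the class object itself as the key into one shared dict (the `instances` helper name is made up for this demo):

```python
class EnumType(object):
    _registry = {}   # one dict shared by the whole hierarchy

    def __init__(self, token):
        # sub-registry per class, keyed by the class object itself
        self._registry.setdefault(type(self), {})[token] = self
        self.token = token

    @classmethod
    def instances(cls):
        # hypothetical accessor for this sketch
        return list(cls._registry.get(cls, {}).values())

class Enum1(EnumType): pass
class Enum2(EnumType): pass

Enum1('a'); Enum1('b'); Enum2('c')
print([e.token for e in Enum1.instances()])  # ['a', 'b']
print([e.token for e in Enum2.instances()])  # ['c']
```

Using the class as the key avoids collisions that `cls.__name__` would allow between same-named classes in different modules.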
