Wrapper class for method chaining in Python & handling dunder methods

I'm hoping to write a wrapper class that would make method calls of a given object return self instead of None to allow for method chaining.
TLDR: I have figured out the above goal for regular methods but can't figure out how to handle dunder methods (e.g., __add__, etc) since __getattr__ doesn't get called with dunder methods.
Given that __getattr__ only receives the attribute name, not the arguments of a method call, I've so far made a class called Chain that allows method chaining for regular methods by returning a lambda:
class Chain:
    def __init__(self, obj):
        self.obj = obj

    def __getattr__(self, attr):
        if hasattr(self.obj, attr):
            obj_attr = getattr(self.obj, attr)
            if callable(obj_attr):
                return lambda *args, **kwargs: \
                    self.newMethod(obj_attr, *args, **kwargs)
            else:
                return obj_attr
        else:
            raise AttributeError(
                f"Chain: Object of type {type(self.obj)} "
                f"has no attribute {attr}"
            )

    def newMethod(self, method, *args, **kwargs):
        output = method(*args, **kwargs)
        if output is None:
            return self
        else:
            return output
A trivial example of how this could work:
class A:
    def __init__(self):
        self.num = 0

    def incrementNum(self, num):
        self.num += num

# original code
obj = A()
obj.incrementNum(1)
obj.incrementNum(2)
obj.incrementNum(3)
print("without Chain:", obj.num)

# with wrapper
new_obj = Chain(A())
new_obj.incrementNum(1) \
       .incrementNum(2) \
       .incrementNum(3)
print("with Chain:", new_obj.num)
Obviously, for user-defined classes, the Chain class is superfluous, since you can just change the class definition to return self where applicable. I'm making Chain for use with tkinter classes like ttk.Menu etc., since I don't want to mess with the tkinter source code.
Note that in the Chain class, it's not just the object that's returned, but instead it's the instance of Chain that wraps said object, so it can be method-chained more than once.
However, say that I'm hoping to Chain-wrap a class that implements __add__ or some other class-defined operators. This is where I'm having issues, since I cannot intercept the output of __add__ using __getattr__. Yes, I could of course implement __add__ in Chain, but if I wanted to be really thorough, I'd also have to implement that for every other operator that could show up for classes. I thought that maybe __getattribute__ would be the way to do it, but it doesn't catch dunder methods either. Any ideas how I can work around this in this case?
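For concreteness, a minimal demonstration of the problem (Num is a made-up example class): implicit special-method lookup happens on the type of the wrapper, so Chain.__getattr__ never sees it, while an explicit attribute access does.

class Num:
    def __init__(self, n):
        self.n = n

    def __add__(self, other):
        return Num(self.n + other.n)

wrapped = Chain(Num(1))
wrapped.__add__(Num(2))   # works: explicit lookup goes through Chain.__getattr__
wrapped + Num(2)          # TypeError: Python looks up __add__ on type(wrapped), i.e. Chain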
Yes, I know I'm doing a lot of acrobatics just to implement method-chaining. Might not be worth it, but it would be cool to augment a given class so that it behaves exactly the same as before except it returns itself with its methods. However, I suppose it's also possible __getattr__ doesn't catch other dunder methods for a reason I don't know about.
Side comment: I should probably also implement __setattr__ and __delattr__ in Chain so that its instances can be treated just like the objects they're wrapping. I've left that out here for brevity.
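For reference, a minimal sketch of what that forwarding could look like (the name ChainWithSetattr is illustrative; the guard on Chain's own obj attribute is needed to avoid infinite recursion):

class ChainWithSetattr(Chain):
    def __setattr__(self, attr, value):
        if attr == "obj":
            super().__setattr__(attr, value)   # Chain's own attribute
        else:
            setattr(self.obj, attr, value)     # forward to the wrapped object

    def __delattr__(self, attr):
        delattr(self.obj, attr)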

Related

Is there any design pattern for implementing decoration of methods of child classes of an abstract class?

The case is such that I have an abstract class and a few child classes implementing it.
from abc import ABCMeta, abstractmethod

class Parent(metaclass=ABCMeta):
    @abstractmethod
    def first_method(self, *args, **kwargs):
        raise NotImplementedError()

    @abstractmethod
    def second_method(self, *args, **kwargs):
        raise NotImplementedError()

class Child(Parent):
    def first_method(self, *args, **kwargs):
        print('First method of the child class called!')

    def second_method(self, *args, **kwargs):
        print('Second method of the child class called!')
My goal is to make some kind of decorator which can be used on methods of any child of the Parent class. I need this because every method does some kind of preparation before actually doing anything, and this preparation is exactly the same in all methods of all children of the Parent class. Like:
class Child(Parent):
    def first_method(self, *args, **kwargs):
        print('Preparation!')
        print('First method of the child class called!')

    def second_method(self, *args, **kwargs):
        print('Preparation!')
        print('Second method of the child class called!')
The first thing that came to my mind is to use the Parent class's method implementation: just remove raise NotImplementedError(), put the preparation there, and then call super().first_method(*args, **kwargs) at the beginning of each child method. That works, but I also want to return some data from the parent method, and it would look weird if the parent method and the child method returned different things. Not to mention that I would probably want to do some post-processing work after the method, in which case I would need two different functions: one for the beginning and one for after the work is done.
The next thing I came up with is a metaclass.
Just implement all the decoration of methods in the new metaclass while the class is being created, and pass the newly generated data used by the child methods to them in kwargs.
This is the closest solution to my goal, but it still feels wrong, because it is not explicit that some kwargs will be passed to the child methods; anyone new to this code would need to do some research to understand how it works. I feel like I'm overengineering.
So the question: is there any pattern or something along these lines to implement this functionality?
Perhaps you can advise something better for my case?
Thank you a lot in advance!
So, existing patterns apart: I don't know if this has a specific name, but what you need, what would be a "pattern", is the use of "slots": that is, you document specially named methods that will be called as part of the execution of another method. That other method performs its setup code, checks whether the slotted method (usually identifiable by name) exists, and calls it with a plain method call, which runs the most specialized version of it, even if the method that calls the slots is in the base class of a big class-inheritance hierarchy.
One plain example of this pattern is the way Python instantiates objects: what one actually invokes when calling the class with function-call syntax (MyClass()) is the __call__ method of that class's class (its metaclass), usually type.__call__. In Python's code for type.__call__, the class's __new__ method is called, then the class's __init__ method is called, and finally the value returned by the first call, to __new__, is returned. A custom metaclass can modify __call__ to run whatever code it wants before, between, or after these two calls.
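As a minimal sketch of that mechanism (Meta and Widget are illustrative names, not from the question):

class Meta(type):
    def __call__(cls, *args, **kwargs):
        print('before __new__')
        instance = super().__call__(*args, **kwargs)  # type.__call__ runs __new__, then __init__
        print('after __init__')
        return instance

class Widget(metaclass=Meta):
    def __init__(self, x):
        self.x = x

w = Widget(1)   # the two prints bracket normal construction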
So, if this were not Python, all you'd need is to spec this out and document that these methods should not be called directly, but rather through an "entry point" method, which could simply carry an "ep_" prefix. These would have to be fixed and hardcoded on a base class, and you'd need one for each of the methods you want to prefix/postfix code to.
from abc import ABC, abstractmethod

class Base(ABC):
    def ep_first_method(self, *args, **kw):
        # prefix code...
        ret_val = self.first_method(*args, **kw)
        # postfix code...
        return ret_val

    @abstractmethod
    def first_method(self, *args, **kw):
        pass

class Child(Base):
    def first_method(self, *args, **kw):
        ...
This being Python, it is easier to add some more magic to avoid code repetition and keep things concise.
One possible approach is to have a special class that, upon detecting a method in a child class that should be called as a slot of a wrapper method like the one above, automatically renames that method: this way the entry-point methods can have the same name as the child methods. Better yet, a simple decorator can mark the methods that are meant to be "entry points", and inheritance even works for them.
Basically, when building a new class, we check all methods: if any of them has a counterpart higher up in the hierarchy which is marked as an entry point, the renaming takes place.
It is more practical if each entry-point method takes as its second parameter (the first being self) a reference to the slotted method to be called.
After some fiddling: the good news is that a custom metaclass is not needed; the __init_subclass__ special method in a base class is enough to enable the decorator.
The bad news: due to re-entry into the entry point triggered by potential calls to super() in the final methods, a somewhat intricate heuristic is needed to call the original method in the intermediate classes. I also took care to add some multi-threading protections, although this is not 100% bullet-proof.
import sys
import threading
from functools import wraps

def entrypoint(func):
    name = func.__name__
    slotted_name = f"_slotted_{name}"
    recursion_control = threading.local()
    recursion_control.depth = 0
    lock = threading.Lock()

    @wraps(func)
    def wrapper(self, *args, **kw):
        slotted_method = getattr(self, slotted_name, None)
        if slotted_method is None:
            # this check stands in for abstractmethod's error; it is only
            # raised when the method is called, though
            raise TypeError(
                f"Child class {type(self).__name__} did not implement "
                f"mandatory method {func.__name__}"
            )
        # recursion control logic: also handles the case where the slotted
        # method calls "super()", not just straightforward recursion
        with lock:
            recursion_control.depth += 1
            normal_course = recursion_control.depth == 1
        try:
            if normal_course:
                # runs through the entry point
                result = func(self, slotted_method, *args, **kw)
            else:
                # we are within a "super()" call - the only way to get the
                # renamed method in the correct subclass is to recreate the
                # callee's super, by fetching its implicit "__class__" variable
                try:
                    callee_super = super(sys._getframe(1).f_locals["__class__"], self)
                except KeyError:
                    # callee did not make a "super()" call; it likely is a
                    # recursive function "for real"
                    callee_super = type(self)
                slotted_method = getattr(callee_super, slotted_name)
                result = slotted_method(*args, **kw)
        finally:
            recursion_control.depth -= 1
        return result

    wrapper.__entrypoint__ = True
    return wrapper
class SlottedBase:
    def __init_subclass__(cls, *args, **kw):
        super().__init_subclass__(*args, **kw)
        for name, child_method in tuple(cls.__dict__.items()):
            if not callable(child_method) or getattr(child_method, "__entrypoint__", None):
                continue
            for ancestor_cls in cls.__mro__[1:]:
                parent_method = getattr(ancestor_cls, name, None)
                if parent_method is None:
                    break
                if not getattr(parent_method, "__entrypoint__", False):
                    continue
                # if the code reaches here, this is a method that at some
                # point up the hierarchy has been marked as having an
                # entry-point method: we rename it
                delattr(cls, name)
                setattr(cls, f"_slotted_{name}", child_method)
                break
        # the changes above are in place; no need to return anything
class Parent(SlottedBase):
    @entrypoint
    def meth1(self, slotted, a, b):
        print(f"at meth 1 entry, with {a=} and {b=}")
        result = slotted(a, b)
        print("exiting meth1\n")
        return result

class Child(Parent):
    def meth1(self, a, b):
        print(f"at meth 1 on Child, with {a=} and {b=}")

class GrandChild(Child):
    def meth1(self, a, b):
        print(f"at meth 1 on grandchild, with {a=} and {b=}")
        super().meth1(a, b)

class GrandGrandChild(GrandChild):
    def meth1(self, a, b):
        print(f"at meth 1 on grandgrandchild, with {a=} and {b=}")
        super().meth1(a, b)

c = Child()
c.meth1(2, 3)
d = GrandChild()
d.meth1(2, 3)
e = GrandGrandChild()
e.meth1(2, 3)
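If I traced the control flow correctly, the three calls above should print roughly the following; note how each super().meth1 call lands on the renamed _slotted_meth1 of the next class up, while the entry-point prefix/postfix code runs only once per outer call:

at meth 1 entry, with a=2 and b=3
at meth 1 on Child, with a=2 and b=3
exiting meth1

at meth 1 entry, with a=2 and b=3
at meth 1 on grandchild, with a=2 and b=3
at meth 1 on Child, with a=2 and b=3
exiting meth1

at meth 1 entry, with a=2 and b=3
at meth 1 on grandgrandchild, with a=2 and b=3
at meth 1 on grandchild, with a=2 and b=3
at meth 1 on Child, with a=2 and b=3
exiting meth1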

Parametric classes in Python

I want to define a pair of classes that are almost identical, except that the class methods are decorated in two different ways. Currently, I just have a factory function that takes the decorator as an argument, constructs the class using that decorator, and returns the class. Greatly simplified, something like this works:
# Defined in mymodule.py
def class_factory(decorator):
    class C:
        @decorator
        def fancy_func(self, x):
            # some fanciness
            return x
    return C

C1 = class_factory(decorator1)
C2 = class_factory(decorator2)
And I can use these as usual:
import mymodule
c1 = mymodule.C1()
c2 = mymodule.C2()
I'm not entirely comfortable with this, for a number of reasons. First, a purely aesthetic reason: the types of both objects display as mymodule.class_factory.<locals>.C. They're not actually identical, but they look like it, and it causes problems with the documentation. Second, my class is pretty complicated. I'd actually like to use inheritance and mixins and so on, but in any case, those other classes also need access to the decorators. So currently, I make several factories, and call the parent class factories inside the child class factory, and the child inherits from the parents created in this way. But this means I can't really use the resulting parents as classes outside the factory.
So my questions are
Is there a better design pattern for this sort of thing? It would be really convenient if there were some way to use inheritance, where the decorators are actually methods in a class, and I inherit in two different ways.
Is there anything wrong with changing the <locals> part of the class name by just altering C.__qualname__ before returning?
To be a bit more specific: I want one version of the class to work extremely quickly with numpy arrays, and I want another version of the class to work with arbitrary Python objects, especially sympy expressions. So for the first, I decorate with @numba.guvectorize (and relatives). This means I actually need to pass numba some signatures, so I can't just rely on numba falling back to object mode for the second case. But for simplicity, I think we can ignore the issue of signatures here. For the second case, I basically make a no-op decorator that ignores signatures and does nothing to the function.
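For illustration, such a no-op decorator might look like this (a sketch; noop_guvectorize is a made-up name): it accepts the same signature arguments that numba.guvectorize would, then returns the function unchanged.

def noop_guvectorize(*signatures, **options):
    def wrapper(func):
        return func   # ignore the signatures; leave the function untouched
    return wrapper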
Here's an approach using __init_subclass__. I use keyword arguments here, but you could easily change it so the decorators are defined as methods on C1 and C2 and are applied in __init_subclass__.
def passthru(f):
    return f

class BaseC:
    def __init_subclass__(cls, /, decorator=passthru, **kwargs):
        super().__init_subclass__(**kwargs)
        # if you also have class attributes or methods you don't want to
        # decorate, you might need to maintain an explicit list of
        # decoratable methods
        for attr in dir(cls):
            if not attr.startswith('__'):
                setattr(cls, attr, decorator(getattr(cls, attr)))

    def fancy_func(self, x):
        # some fanciness
        return x

def two(f):
    return lambda self, x: "surprise"

class C1(BaseC):
    pass

class C2(BaseC, decorator=two):
    pass

print(C1().fancy_func(42))
print(C2().fancy_func(42))

# further subclassing
class C3(C2):
    pass

print(C3().fancy_func(42))
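If I follow __init_subclass__ correctly, the three prints give 42, surprise, and surprise: C3 inherits C2's already-decorated method, and only the default passthru is re-applied to it.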
I took @Jasmijn's suggestion of using __init_subclass__. But since I really need multiple decorators (jit, guvectorize, and sometimes neither, even when using numba with other methods), I tweaked it a little. Rather than jitting every public method, I use decorators to flag methods with attributes explaining how to compile them.
I decorate the individual methods much like I would have originally, indicating whether to jit or whatnot. But these decorators don't actually do any compilation; they just add hidden attributes to the functions indicating whether and how to apply the actual decorators. Then, when a subclass is created, __init_subclass__ loops through, looking for these attributes on all the subclass's methods, and applying any requested compilation.
I turn this into a pretty general class, named Jitter below. Any class that wants the option of jitting in multiple ways can just inherit from this class and decorate methods with Jitter.jit or Jitter.guvectorize. By default, nothing much happens to those functions, so the first child class of Jitter can be used with sympy, for example. But I can also inherit from such a class while adding the relevant keyword(s) to the class definition, enabling jitting in the subclass. Here's the Jitter class:
class Jitter:
    def jit(f):
        f._jit = True
        return f

    def guvectorize(*args, **kwargs):
        def wrapper(f):
            f._guvectorize = (args, kwargs)
            return f
        return wrapper

    def __init_subclass__(cls, /, jit=None, guvectorize=None, **kwargs):
        super().__init_subclass__(**kwargs)
        for attr_name in dir(cls):
            attr = getattr(cls, attr_name)
            if jit is not None and hasattr(attr, '_jit'):
                setattr(cls, attr_name, jit(attr))
            elif guvectorize is not None and hasattr(attr, '_guvectorize'):
                args, kwargs = attr._guvectorize
                setattr(cls, attr_name, guvectorize(*args, **kwargs)(attr))
Now, I can inherit from this class very conveniently:
import numba as nb

class Adder(Jitter):
    @Jitter.jit
    def add(x, y):
        return x + y

class NumbaAdder(Adder, jit=nb.njit):
    pass
Here, Adder.add is a regular python function that just happens to have a _jit attribute, but NumbaAdder.add is a numba jit function. For more realistic code, I would use the same Jitter class and the same NumbaAdder class, but would put all the complexity into the Adder class.
Note that we could decorate with Adder.jit, but this would be precisely the same as decorating with Jitter.jit, because Adder.jit doesn't get changed (if at all) until after the decorators in the class definition have already been applied, so we still need to loop through and apply the jit functions with __init_subclass__.
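A quick check of the behavior described above, assuming numba is installed (an untested sketch):

print(Adder.add(2, 3))           # plain Python function: prints 5
print(NumbaAdder.add(2.0, 3.0))  # compiled by nb.njit on first call: prints 5.0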

Dynamic inheritance in Python through a decorator

I found this post where a function is used to inherit from a class:
def get_my_code(base):
    class MyCode(base):
        def initialize(self):
            ...
    return MyCode

my_code = get_my_code(ParentA)
I would like to do something similar, but with a decorator, something like:
@decorator(base)
class MyClass(base):
    ...
Is this possible?
UPDATE
Say you have a class Analysis that is used throughout your code. Then you realize that you want a wrapper class Transient that is just a time loop on top of the analysis class. If in the code I simply replace the analysis class with Transient(Analysis), everything breaks, because an analysis class, and thus all its attributes, is expected. The problem is that I can't just define class Transient(Analysis) once and for all, because there are plenty of analysis classes. I thought the best way to handle this would be some sort of dynamic inheritance. Right now I use aggregation to redirect the functionality to the analysis object inside Transient.
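For context, a minimal sketch of that aggregation workaround (Analysis's interface and the run method are illustrative, not from my real code):

class Transient:
    def __init__(self, analysis):
        self.analysis = analysis

    def __getattr__(self, name):
        # redirect everything else to the wrapped analysis object
        return getattr(self.analysis, name)

    def run(self, steps):
        for _ in range(steps):        # the "time loop" on top of the analysis
            self.analysis.run()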
A class decorator actually receives the class already built and instantiated (as a class object). It can perform changes on its dict, and even wrap its methods with other decorators.
However, that means the class already has its bases set, and these can't ordinarily be changed. That implies you have to somehow rebuild the class inside the decorator code.
Moreover, if the class's methods make use of parameterless super or the __class__ cell variable, those are already set in the member functions (which in Python 3 are the same as unbound methods), so you can't just create a new class and set those methods as members on the new one.
So, there might be a way, but it will be non-trivial. And as I pointed out in the comment above, I'd like to understand what you want to achieve with this, since one could just put the base class on the class declaration itself, instead of passing it in the decorator configuration.
I've crafted a function that, as described above, creates a new class, "cloning" the original, and can rebuild all methods that use __class__ or super: it returns a new class which is functionally identical to the original one, but with the bases exchanged. If used in a decorator as requested (decorator code included), it will simply change the class's bases. It can't handle decorated methods (other than classmethod and staticmethod), and it doesn't take care of naming details such as qualnames or reprs for the methods.
from types import FunctionType

def change_bases(cls, bases, metaclass=type):
    class Changeling(*bases, metaclass=metaclass):
        def breeder(self):
            __class__  # noQA: forces creation of a __class__ cell we can reuse

    cell = Changeling.breeder.__closure__
    del Changeling.breeder
    Changeling.__name__ = cls.__name__
    for attr_name, attr_value in cls.__dict__.items():
        if attr_name in ('__dict__', '__weakref__'):
            # automatically created descriptors that can't be reassigned
            continue
        if isinstance(attr_value, (FunctionType, classmethod, staticmethod)):
            if isinstance(attr_value, staticmethod):
                func = getattr(cls, attr_name)
            elif isinstance(attr_value, classmethod):
                func = attr_value.__func__
            else:
                func = attr_value
            # TODO: check if func is wrapped in decorators and recreate the
            # inner function. Although reapplying arbitrary decorators is not
            # actually possible, one could have a "prepare_for_changeling"
            # innermost decorator which could be made to point to the new
            # function.
            if func.__closure__ and func.__closure__[0].cell_contents is cls:
                franken_func = FunctionType(
                    func.__code__,
                    func.__globals__,
                    func.__name__,
                    func.__defaults__,
                    cell
                )
                if isinstance(attr_value, staticmethod):
                    func = staticmethod(franken_func)
                elif isinstance(attr_value, classmethod):
                    func = classmethod(franken_func)
                else:
                    func = franken_func
            setattr(Changeling, attr_name, func)
            continue
        setattr(Changeling, attr_name, attr_value)
    return Changeling

def decorator(bases):
    if not isinstance(bases, tuple):
        bases = (bases,)
    def stage2(cls):
        return change_bases(cls, bases)
    return stage2
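A hypothetical usage of the decorator above (ParentA and the method names are mine, purely for illustration):

class ParentA:
    def greet(self):
        return "hello from ParentA"

@decorator(ParentA)
class MyClass:
    def shout(self):
        # parameterless super() still works because change_bases rebuilt
        # the __class__ cell to point at the new class
        return super().greet().upper()

print(MyClass().shout())   # HELLO FROM PARENTA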

Is it possible to invert inheritance using a Python metaclass?

Out of curiosity, I'm interested in whether it's possible to write a metaclass that causes methods of parent classes to have preference over methods of subclasses. I'd like to play around with it for a while. It would no longer be possible to override methods; the base class would have to call the subclass method explicitly, for example using a reference to a base instance.
class Base(metaclass=InvertedInheritance):
    def apply(self, param):
        print('Validate parameter')
        result = self.subclass.apply(param)
        print('Validate result')
        return result

class Child(Base):
    def apply(self, param):
        print('Compute')
        result = 42 * param
        return result

child = Child()
child.apply(2)
With the output:
Validate parameter
Compute
Validate result
If you only care about making lookups on instances go in reverse order (not classes), you don't even need a metaclass. You can just override __getattribute__:
class ReverseLookup:
    def __getattribute__(self, attr):
        if attr.startswith('__'):
            return super().__getattribute__(attr)
        cls = self.__class__
        if attr in self.__dict__:
            return self.__dict__[attr]
        # Using [-3::-1] skips the topmost two base classes, which will be
        # ReverseLookup and object
        for base in cls.__mro__[-3::-1]:
            if attr in base.__dict__:
                value = base.__dict__[attr]
                # handle descriptors
                if hasattr(value, '__get__'):
                    return value.__get__(self, cls)
                else:
                    return value
        raise AttributeError("Attribute {} not found".format(attr))

class Base(ReverseLookup):
    def apply(self, param):
        print('Validate parameter')
        result = self.__class__.apply(self, param)
        print('Validate result')
        return result

class Child(Base):
    def apply(self, param):
        print('Compute')
        result = 42 * param
        return result
>>> Child().apply(2)
Validate parameter
Compute
Validate result
84
This mechanism is relatively simple because lookups on the class aren't in reverse:
>>> Child.apply
<function Child.apply at 0x0000000002E06048>
This makes it easy to get a "normal" lookup just by doing it on a class instead of an instance. However, it could cause confusion in other cases, for example if a base class method tries to access a different method on the subclass but that method doesn't actually exist on that subclass; in this case lookup proceeds in the normal direction and may find the method on a higher class. In other words, when doing this you have to be sure that you don't look any methods up on a class unless you're sure they're defined on that specific class.
There may well be other corner cases where this approach doesn't work. In particular you can see that I jury-rigged descriptor handling; I wouldn't be surprised if it does something weird for descriptors with a __set__, or for more complicated descriptors that make more intense use of the class/object parameters passed to __get__. Also, this implementation falls back on the default behavior for any attributes beginning with two underscores; changing this would require careful thought about how it's going to work with magic methods like __init__ and __repr__.

'super' object not calling __getattr__

I have one object wrapped inside another.
The "Wrapper" accesses the attributes from the "Wrapped" object by overriding __getattr__.
This works well until I need to override an attribute in a subclass and then access the attribute from the base class using super().
I can still access the attribute directly from __getattr__, but why does super() not work?
class Wrapped(object):
    def __init__(self, value):
        self.value = value

    def hello_world(self):
        print('hello world', self.value)

class Wrapper(object):
    def __init__(self, obj):
        self.wrapped_obj = obj

    def __getattr__(self, name):
        if name in self.__dict__:
            return getattr(self, name)
        else:
            return getattr(self.wrapped_obj, name)

class Subclass(Wrapper):
    def __init__(self, obj):
        super(Subclass, self).__init__(obj)

    def hello_world(self):
        # this works
        func = super(Subclass, self).__getattr__('hello_world')()
        # this doesn't
        super(Subclass, self).hello_world()

a = Wrapped(2)
b = Subclass(a)
b.hello_world()
According to this, super does not allow implicit calls of "hook" functions such as __getattr__. I'm not sure why it is implemented this way (there's probably a good reason and things are already confusing enough since the super object has custom __getattribute__ and __get__ methods as it is), but it seems like it's just the way things are.
Edit: This post appears to clear things up a little. It looks like the problem is that the extra layer of indirection caused by __getattribute__ is ignored when calling functions implicitly. Doing foo.x is equivalent to
foo.__getattr__('x')
(assuming no __getattribute__ method is defined and x is not in foo.__dict__). However, it is NOT equivalent to
foo.__getattribute__('__getattr__')('x')
Since super returns a proxy object, it has an extra layer of indirection, which causes things to fail.
P.S. The self.__dict__ check in your __getattr__ function is completely unnecessary: __getattr__ is only called if the attribute doesn't already exist in your dict. (Use __getattribute__ if you want it to always be called, but then you have to be very careful, because even something simple like if name in self.__dict__ will cause infinite recursion.)
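A small illustration of that last point, with a made-up class:

class Demo:
    x = 1
    def __getattr__(self, name):
        print('__getattr__ called for', name)
        return 42

d = Demo()
d.x        # found normally; __getattr__ is NOT called
d.missing  # prints "__getattr__ called for missing" and evaluates to 42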
