Getting rid of explicit super - python

I'd like to implement something like this:

def after(f):
    def inner(*args, **kwargs):
        super().f(*args, **kwargs)
        f(*args, **kwargs)
    return inner

class A:
    def f(self):
        print('hello')

class B(A):
    @after
    def f(self):
        print('world')

b = B()
b.f()
That is, I would like to get rid of the explicit super() call in some of my classes and replace it with a @before / @after decorator (maybe with parameters). In this example, I would like "hello world" to be printed.
The idea is to increase the readability of the code: since I often use multiple inheritance, I frequently override methods and have to call super().
I think I could use inspect to determine the class instance that calls the decorator (although I'm not sure about the performance impact if I have many class instances).
Is there a way to do this without sacrificing performance?

You can make your decorator work; you just need to make it a descriptor class rather than a function. Implement the __set_name__ method to get a reference to the class the decorator has been added to. With that class reference, you can make a two-argument super() call:
import functools

class after:
    def __init__(self, method):
        self.method = method

    def __set_name__(self, owner, name):
        self.owner = owner
        self.name = name  # using self.method.__name__ might be better?

    def __get__(self, instance, owner):
        if instance is None:
            return self
        return functools.partial(self, instance)

    def __call__(self, instance, *args, **kwargs):
        assert self.owner is not None and self.name is not None
        getattr(super(self.owner, instance), self.name)(*args, **kwargs)
        return self.method(instance, *args, **kwargs)
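For reference, applying it to the classes from the question (a quick check of my own, not part of the original answer) prints hello and then world:

class A:
    def f(self):
        print('hello')

class B(A):
    @after          # the descriptor-based decorator defined above
    def f(self):
        print('world')

B().f()   # prints "hello" then "world"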
You could do a before too, which would be nearly the same, just with the last two lines in the reverse order (and some fiddling to handle the return value).
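For instance, a minimal sketch of before, reusing the descriptor machinery above; one arbitrary choice made here is to keep the decorated method's return value and discard the result of the super call:

class before(after):
    def __call__(self, instance, *args, **kwargs):
        # run the decorated body first, then the overridden method
        result = self.method(instance, *args, **kwargs)
        getattr(super(self.owner, instance), self.name)(*args, **kwargs)
        return result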
I'd note that this decorator is quite a bit less generally useful than calling super the normal way, since you can't usefully interact with the value returned by the overridden method or change the arguments passed in to it. No before- or after-decorated method can replicate these classes:
class Foo:
    def foo(self, x, y):
        return x + y

class Bar(Foo):
    def foo(self, x, y, z):
        return super().foo(x//2, y+1) * z

Related

Correct way of returning new class object (which could also be extended)

I am trying to find a good way of returning a (new) class object from a class method, one that can be extended as well.
I have a class (classA) which has, among other methods, a method that returns a new classA object after some processing:
class classA:
    def __init__(self, params): ...

    def methodX(self, **kwargs):
        # process data
        return classA(new_params)
Now, I am extending this class to another classB. I need methodX to do the same, but return classB this time, instead of classA:

class classB(classA):
    def __init__(self, params):
        super().__init__(params)
        self.newParams = XYZ

    def methodX(self, **kwargs):
        ???
This may be something trivial, but I simply cannot figure it out. In the end, I don't want to rewrite methodX each time the class gets extended.
Thank you for your time.
Use the __class__ attribute like this:
class A:
    def __init__(self, **kwargs):
        self.kwargs = kwargs

    def methodX(self, **kwargs):
        # do stuff with kwargs
        return self.__class__(**kwargs)

    def __repr__(self):
        return f'{self.__class__}({self.kwargs})'

class B(A):
    pass

a = A(foo='bar')
ax = a.methodX(gee='whiz')
b = B(yee='haw')
bx = b.methodX(cool='beans')
print(a)
print(ax)
print(b)
print(bx)
class classA:
    def __init__(self, x):
        self.x = x

    def createNew(self, y):
        t = type(self)
        return t(y)

class classB(classA):
    def __init__(self, params):
        super().__init__(params)

a = classA(1)
newA = a.createNew(2)
b = classB(1)
newB = b.createNew(2)
print(type(newB))
# <class '__main__.classB'>
I want to propose what I think is the cleanest approach, albeit similar to existing answers. The problem feels like a good fit for a class method:
class A:
    @classmethod
    def method_x(cls, **kwargs):
        return cls(<init params>)
Using the @classmethod decorator ensures that the first input (traditionally named cls) will refer to the class to which the method belongs, rather than to an instance.
(Usually we call the first method input self, and it refers to the instance on which the method is called.)
Because cls refers to A, rather than an instance of A, we can call cls() as we would call A().
However, in a class that inherits from A, cls will instead refer to the child class, as required:
class A:
    def __init__(self, x):
        self.x = x

    @classmethod
    def make_new(cls, **kwargs):
        y = kwargs["y"]
        return cls(y)  # returns A(y) here

class B(A):
    def __init__(self, x):
        super().__init__(x)
        self.z = 3 * x

inst = B(1).make_new(y=7)
print(inst.x, inst.z)
And now you can expect that print statement to produce 7 21.
That inst.z exists should confirm for you that the make_new call (which was only defined on A and inherited unaltered by B) has indeed made an instance of B.
However, there's something I must point out. Inheriting the unaltered make_new method only works because the __init__ method on B has the same call signature as the method on A. If this weren't the case then the call to cls might have had to be altered.
This can be circumvented by allowing **kwargs on the __init__ method and passing generic **kwargs into cls() in the parent class:
class A:
    def __init__(self, **kwargs):
        self.x = kwargs["x"]

    @classmethod
    def make_new(cls, **kwargs):
        return cls(**kwargs)

class B(A):
    def __init__(self, x, w):
        super().__init__(x=x)
        self.w = w

inst = B(1, 2).make_new(x="spam", w="spam")
print(inst.x, inst.w)
Here we were able to give B a different (more restrictive!) signature.
This illustrates a general principle, which is that parent classes will typically be more abstract/less specific than their children.
It follows that, if you want two classes that substantially share behaviour but which do quite specific different things, it will be better to create three classes: one rather abstract one that defines the behaviour-in-common, and two children that give you the specific behaviours you want.
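As a loose illustration of that last point (the class names here are hypothetical, not from the question):

class Shape:                      # abstract-ish parent: the behaviour in common
    def __init__(self, size):
        self.size = size

    @classmethod
    def make_new(cls, size):
        return cls(size)          # cls is whichever child called it

class Circle(Shape):              # specific child 1
    def area(self):
        return 3.14159 * self.size ** 2

class Square(Shape):              # specific child 2
    def area(self):
        return self.size ** 2

print(type(Circle(2).make_new(3)))   # <class '__main__.Circle'>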

Python inheritance, method overloading, and decorating

I have a class B that inherits from A:

class A():
    def do_something(self, x):
        """Prints x."""
        print(x)

class B(A):
    def something_else(self, x):
        print("This isn't the same.")
I'd like to achieve a few things:
I'd like for B.do_something to inherit the docstring from A.do_something. I think functools.wraps is the recommended solution: is that right?
Let's say there are some methods of A that return an instance of A. If I call those methods from B, I'd like them to return an instance of B. So far, I'm overloading each function manually:

    def method_of_A(self, *args, **kwargs):
        return A(super(self.__class__, self).method_of_A(*args, **kwargs))

There's likely a better way, especially given that I have to do this for a large number of classes. Is there some way to check if a function is defined within B and, if not but available in A, have it decorated/wrapped to return an instance of B? EDIT: I can't make changes to A's codebase.
Are there solutions that are Py2 and Py3 compatible ?
Thanks very much for any suggestions.
Yes, you can use functools.wraps to copy the function name and docstring. You can return an instance of the current class using self.__class__:

import functools

class A(object):
    def func(self):
        return self.__class__()

class B(A):
    @functools.wraps(A.func)
    def func(self):
        return super(B, self).func()

>>> b = B()
>>> obj = b.func()
>>> print type(obj)
<class '__main__.B'>
Is there some way to check if a function is defined within B and, if not but available in A, have it decorated/wrapped to return an instance of B?
You may be able to do this using metaclasses, assuming A isn't already using a custom metaclass that you're not able to inherit from (for instance, if it's defined only in C and hasn't been exposed to Python). The way you use metaclasses is slightly different in Python 2 and 3.
import functools
import types

class MetaB(type):
    def __new__(cls, name, bases, attrs):
        if A in bases:
            for attr, value in A.__dict__.items():
                if isinstance(value, types.FunctionType) and attr not in attrs:
                    new_func = MetaB.make_wrapper_func(value)
                    attrs[attr] = new_func
        return super(MetaB, cls).__new__(cls, name, bases, attrs)

    @staticmethod
    def make_wrapper_func(func):
        @functools.wraps(func)
        def _func(self, *args, **kwargs):
            value = func(self, *args, **kwargs)
            if isinstance(value, A):
                value = self.__class__(value)
            return value
        return _func

class B(A):
    __metaclass__ = MetaB
    ...
In Python 3, metaclasses are declared a little differently:

class B(A, metaclass=MetaB):
    ...
This assumes you can create an object of the B() type just by passing an instance of A() to its constructor (i.e. return self.__class__(value)). That was just a guess; I'd have to know a little more about your objects to know how to translate an A into a B, but the general method would be the same. This solution also only works on regular instance methods; it won't work on other things such as classmethods, staticmethods, or other kinds of descriptor objects. You certainly could make it work for all of those, but your metaclass would need to be a little more complex.
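Putting it together, here is a self-contained Python 3 sketch of the metaclass approach; the toy A, its make method, and B's converting constructor are assumptions made purely for illustration, while the metaclass itself is the one from the answer:

import functools
import types

class A:
    def __init__(self, value=0):
        self.value = value

    def make(self):
        return A(self.value + 1)   # always returns a plain A

class MetaB(type):
    def __new__(cls, name, bases, attrs):
        if A in bases:
            for attr, value in A.__dict__.items():
                if isinstance(value, types.FunctionType) and attr not in attrs:
                    attrs[attr] = MetaB.make_wrapper_func(value)
        return super().__new__(cls, name, bases, attrs)

    @staticmethod
    def make_wrapper_func(func):
        @functools.wraps(func)
        def _func(self, *args, **kwargs):
            value = func(self, *args, **kwargs)
            if isinstance(value, A):
                value = self.__class__(value)   # re-wrap as the caller's class
            return value
        return _func

class B(A, metaclass=MetaB):
    def __init__(self, a_or_value):
        # the answer assumes B can be built from an A instance
        value = a_or_value.value if isinstance(a_or_value, A) else a_or_value
        super().__init__(value)

print(type(B(1).make()))   # <class '__main__.B'>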
Let's say there are some methods of A that return an instance of A. If I call those methods from B, I'd like them to return an instance of B. So far, I'm overloading each function manually.
Use a classmethod.
class A(object):
    @classmethod
    def f(cls):
        return cls
When b (an instance of B) calls f, it will return B.
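A quick demonstration of that (not in the original answer):

class A(object):
    @classmethod
    def f(cls):
        return cls

class B(A):
    pass

b = B()
print(b.f())   # <class '__main__.B'>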

Using decorators as class attributes instead of instance attributes

I have the following classes.
Validator is a decorator that receives a class which defines validation criteria for a decorated function. ValidateKeys is the validation criteria for this example. Node2D is a class using validation criteria.
class Validator(object):
    def __init__(self, TheValidator, *args, **kwargs):
        self.TheValidator = TheValidator(*args, **kwargs)

    def __call__(self, f):
        def wrapped_f(instance, *args, **kwargs):
            self.TheValidator(instance, *args, **kwargs)
            return f(instance, *args, **kwargs)
        return wrapped_f

class ValidateKeys(object):
    def __init__(self, *keysIterable):
        self.validkeys = keysIterable

    def __call__(self, instance, **kwargs):
        for a in kwargs:
            if not a in self.validkeys:
                raise Exception()
        instance.__dict__.update(kwargs)

class Node2D(object):
    @property
    def coords(self):
        return self.__dict__

    @coords.setter
    def coords(self, Coords):
        self.set_coords(**Coords)

    @Validator(ValidateKeys, 'x', 'y')
    def set_coords(self, **Coords):
        pass
From what I understand, as things are written here, every instance of Node2D will produce a duplicate Validator (as will any other class decorated with Validator) and ValidateKeys.
EDIT: THIS IS WRONG! See answer below.
Note that this is primarily a learning exercise for me and although I would be interested in hearing criticisms/suggestions for improving my over all approach, my primary goal is to learn more about how to use decorators effectively.
Also note that I normally would not use capitalization for a decorator class but am using it here since it makes it easier to read on SO.
My assumption was incorrect.
As things are written, only one instance of Validator and ValidateKeys is created per class. I did not realize that the line @Validator(ValidateKeys, 'x', 'y') only runs once (at the time of class definition) and not at instance creation.
I should have realized this, since decorator expressions appear at the same level of hierarchy as class attributes, e.g.:
class MyClass():
    class_attribute = None  # only one class_attribute is created

    @decorator  # only one decorator (i.e., decorated method) is created
    def method():
        pass
Test:
class Validator(object):
    def __init__(self, TheValidator, *args, **kwargs):
        print("New Validator Object")
        self.TheValidator = TheValidator(*args, **kwargs)

    def __call__(self, f):
        def wrapped_f(instance, *args, **kwargs):
            self.TheValidator(instance, *args, **kwargs)
            return f(instance, *args, **kwargs)
        return wrapped_f

class ValidateKeys(object):
    def __init__(self, *keysIterable):
        print("New ValidateKeys Object")
        self.validkeys = keysIterable

    def __call__(self, instance, **kwargs):
        for a in kwargs:
            if not a in self.validkeys:
                raise Exception()
        instance.__dict__.update(kwargs)

class Node2D(object):
    @property
    def coords(self):
        return self.__dict__

    @coords.setter
    def coords(self, Coords):
        self.set_coords(**Coords)

    @Validator(ValidateKeys, 'x', 'y')
    def set_coords(self, **Coords):
        pass

n1 = Node2D()
n2 = Node2D()
n1.set_coords(x=1, y=2)
print(n1.coords)
Output:
New Validator Object      <-- seen only once, when the module is loaded (class defined)
New ValidateKeys Object   <-- seen only once, when the module is loaded (class defined)
{'x': 1, 'y': 2}
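As a further check of the validator itself (my own addition): passing a key that is not in validkeys raises, as expected:

try:
    n2.set_coords(x=1, z=3)   # 'z' is not a valid key
except Exception:
    print("invalid key rejected")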
I do not have the problem I thought I had. Thanks to all for the help.

Decorators on abstract methods

In python, is there a way to make a decorator on an abstract method carry through to the derived implementation(s)?
For example, in
import abc

class Foo(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    @some_decorator
    def my_method(self, x):
        pass

class SubFoo(Foo):
    def my_method(self, x):
        print x
SubFoo's my_method won't get decorated with some_decorator as far as I can tell. Is there some way I can make this happen without having to individually decorate each derived class of Foo?
I would code it as two different methods, just like in the standard factory method pattern description:
https://www.oodesign.com/factory-method-pattern.html
class Foo(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    @some_decorator
    def my_method(self, x):
        self.child_method(x)

class SubFoo(Foo):
    def child_method(self, x):
        print x
This is, of course, possible. There is very little that can't be done in Python haha! I'll leave whether it's a good idea up to you...
class MyClass:
    def myfunc(self):
        raise NotImplementedError()

    def __getattribute__(self, name):
        if name == "myfunc":
            func = getattr(type(self), "myfunc")
            # wrap the (possibly overridden) function and bind it to this instance
            return mydecorator(func).__get__(self, type(self))
        return object.__getattribute__(self, name)
(Not tested for syntax yet, but should give you the idea)
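A fuller sketch of the idea, with a hypothetical mydecorator and a subclass, showing that the wrapper is also applied to the derived implementation (all names here are illustrative):

def mydecorator(func):
    def wrapper(self, *args, **kwargs):
        print('decorated call')
        return func(self, *args, **kwargs)
    return wrapper

class MyClass:
    def myfunc(self):
        raise NotImplementedError()

    def __getattribute__(self, name):
        if name == "myfunc":
            # look the method up on the actual class, so a subclass
            # override is found, then wrap and bind it
            func = getattr(type(self), "myfunc")
            return mydecorator(func).__get__(self, type(self))
        return object.__getattribute__(self, name)

class Child(MyClass):
    def myfunc(self):
        return 42

print(Child().myfunc())   # prints "decorated call", then 42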
As far as I know, this is not possible and not a good strategy in Python. Here's more explanation.
According to the abc documentation:
When abstractmethod() is applied in combination with other method descriptors, it should be applied as the innermost decorator, as shown in the following usage examples: ...
In other words, we could write your class like this (Python 3 style):
from abc import ABCMeta, abstractmethod

class AbstractClass(metaclass=ABCMeta):
    @property
    @abstractmethod
    def info(self):
        pass
But then what? If you derive from AbstractClass and try to override the info property without specifying the @property decorator, that would create a great deal of confusion. Remember that properties (and this is only an example) usually reuse the same name for their getter and setter methods, for concision's sake:
class Concrete(AbstractClass):
    @property
    def info(self):
        return self._info

    @info.setter
    def info(self, new_info):
        self._info = new_info
In this context, if you didn't repeat the @property and @info.setter decorators, that would create confusion. In Python terms, it wouldn't work either, since properties are placed on the class itself, not on the instance. In other words, I guess it could be done, but in the end it would create confusing code that's not nearly as easy to read as repeating a few decorator lines, in my opinion.
My solution would be to extend the superclass's method, delegating to it instead of replacing its behaviour:
import abc

class Foo(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    @some_decorator
    def my_method(self, x):
        pass

class SubFoo(Foo):
    def my_method(self, x):
        super(SubFoo, self).my_method(x)  # delegate the call to the superclass
        print x
Jinksy's answer did not work for me, but with a small modification it did (I use different names but the idea should be clear):
from abc import ABC, abstractmethod

def my_decorator(func):
    def wrapped(self, x, y):
        print('start')
        result = func(self, x, y)
        print('end')
        return result
    return wrapped

class A(ABC):
    @abstractmethod
    def f(self, x, y):
        pass

    @my_decorator
    def f_decorated(self, x, y):
        return self.f(x, y)

class B(A):
    def f(self, x, y):
        return x + y

B().f_decorated(1, 3)
[Out:]
start
end
4
Notice that the important difference between this and what Jinksy wrote is that the abstract method is f, and when calling B().f_decorated it is the inherited, non-abstract method that gets called.
As I understand it, f_decorated can be properly defined because the abstractmethod decorator is not interfering with the decorator my_decorator.

python singleton class decorator

I have come across this singleton implementation here: http://blog.amir.rachum.com/post/21850841339/implementing-the-singleton-pattern-in-python in the first reply.
def singleton(cls):
    return cls()

@singleton
class Foo(object):
    def bar(self):
        pass

if __name__ == '__main__':
    print id(Foo)
    print id(Foo)
But I don't understand the inner workings: the decorator returns a class instance, but why is it the same instance every time?
You can rewrite that code to:

class Foo(object):
    pass

Foo = singleton(Foo)
# which is
Foo = Foo()
So here the name of the class is replaced by an instance of it. A bit cheesy in my opinion, especially since you can still create new objects of the same class via Foo.__class__, and you are messing with the naming scheme.
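A quick illustration of that caveat (my own check, continuing from the question's Foo):

another = Foo.__class__()    # bypasses the "singleton" and builds a fresh instance
print(another is Foo)        # False
print(id(Foo) == id(Foo))    # True: the name Foo is simply bound to one instance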
A proper singleton decorator does that by holding internal state. The state here would be an instance of the class. The decorator itself can be something arbitrary.
Have a look at this:
http://hairysun.com/downloads/DecoratorHandout.pdf
import functools

class Decorator(object):
    # in __init__ set up state
    def __call__(self, function):
        @functools.wraps(function)
        def wrapper(*args, **kw):               # 1.
            print "before func"
            result = function(*args, **kw)      # 2.
            print "after func"
            return result
        return wrapper                          # 3.

>>> decorator2 = Decorator()
>>> @decorator2
... def nothing(): pass
The decorator is essentially a function that:
1. defines a function
2. that calls the function you passed in, and
3. returns the newly 'wrapped' function to be called later.
The surrounding class (here: the decorator) could do something like this:
class Singleton(object):
    def __init__(self):
        self.instance = None

    def __call__(self, function):
        @functools.wraps(function)
        def wrapper(*args, **kw):
            if self.instance is None:
                self.instance = function(*args, **kw)
            return self.instance
        return wrapper
I did not run the code, but I assume this is how it works in general: if no instance is available, create one; if one is available, don't create a new one and return the single existing one instead. One would probably want to check some other properties of the callable before using this in production.
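To confirm that behaviour, a small usage sketch of my own, building on the Singleton class above:

@Singleton()              # note: an *instance* of Singleton is the decorator
class Foo(object):
    pass

a = Foo()                 # first call: creates and caches the real Foo instance
b = Foo()                 # second call: returns the cached instance
print(a is b)             # True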
