In Python, is there a way to make a decorator on an abstract method carry through to the derived implementation(s)?
For example, in
import abc

class Foo(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    @some_decorator
    def my_method(self, x):
        pass

class SubFoo(Foo):
    def my_method(self, x):
        print x
SubFoo's my_method won't get decorated with some_decorator as far as I can tell. Is there some way I can make this happen without having to individually decorate each derived class of Foo?
I would code it as two different methods, just as in the standard factory method pattern description.
https://www.oodesign.com/factory-method-pattern.html
class Foo(object):
    __metaclass__ = abc.ABCMeta

    @some_decorator
    def my_method(self, x):
        return self.child_method(x)

    @abc.abstractmethod
    def child_method(self, x):
        pass

class SubFoo(Foo):
    def child_method(self, x):
        print x
This is, of course, possible. There is very little that can't be done in Python haha! I'll leave whether it's a good idea up to you...
class MyClass:
    def myfunc(self):
        raise NotImplementedError()

    def __getattribute__(self, name):
        if name == "myfunc":
            func = getattr(type(self), "myfunc")
            return mydecorator(func)
        return object.__getattribute__(self, name)
(Not tested for syntax yet, but should give you the idea)
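A quick runnable sketch of the same idea (log_call here is just a hypothetical stand-in for whatever decorator you want applied; the key extra step is binding the wrapped function to the instance before returning it):

import functools

def log_call(func):
    # hypothetical stand-in for some_decorator
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print("calling", func.__name__)
        return func(*args, **kwargs)
    return wrapper

class MyClass:
    def myfunc(self):
        raise NotImplementedError()

    def __getattribute__(self, name):
        if name == "myfunc":
            # look the function up on the class so derived overrides are found too
            func = getattr(type(self), name)
            # wrap it, then bind it to this instance
            return log_call(func).__get__(self, type(self))
        return object.__getattribute__(self, name)

class MySubClass(MyClass):
    def myfunc(self):
        return 42

print(MySubClass().myfunc())  # prints "calling myfunc", then 42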
As far as I know, this is not possible and not a good strategy in Python. Here's more explanation.
According to the abc documentation:
When abstractmethod() is applied in combination with other method descriptors, it should be applied as the innermost decorator, as shown in the following usage examples: ...
In other words, we could write your class like this (Python 3 style):
from abc import ABCMeta, abstractmethod

class AbstractClass(metaclass=ABCMeta):
    @property
    @abstractmethod
    def info(self):
        pass
But then what? If you derive from AbstractClass and try to override the info property without specifying the @property decorator, that would create a great deal of confusion. Remember that properties (and this is only an example) usually reuse the same name for their getter and setter methods, for concision's sake:
class Concrete(AbstractClass):
    @property
    def info(self):
        return self._info

    @info.setter
    def info(self, new_info):
        self._info = new_info
In this context, if you didn't repeat the @property and @info.setter decorators, that would create confusion. In Python terms, it wouldn't work either, since properties are placed on the class itself, not on the instance. In other words, I guess it could be done, but in the end it would create confusing code that's not nearly as easy to read as repeating a few decorator lines, in my opinion.
My solution would be extending the superclass' method without overriding it.
import abc

class Foo(abc.ABC):
    @abc.abstractmethod
    @some_decorator
    def my_method(self, x):
        pass

class SubFoo(Foo):
    def my_method(self, x):
        super().my_method(x)  # delegating the call to the superclass
        print(x)
Jinksy's answer did not work for me, but with a small modification it did (I use different names but the idea should be clear):
from abc import ABC, abstractmethod

def my_decorator(func):
    def wrapped(self, x, y):
        print('start')
        result = func(self, x, y)
        print('end')
        return result
    return wrapped

class A(ABC):
    @abstractmethod
    def f(self, x, y):
        pass

    @my_decorator
    def f_decorated(self, x, y):
        return self.f(x, y)

class B(A):
    def f(self, x, y):
        return x + y

B().f_decorated(1, 3)
[Out:]
start
end
4
Notice that the important difference between this and what Jinksy wrote is that the abstract method is f, and when calling B().f_decorated it is the inherited, non-abstract method that gets called.
As I understand it, f_decorated can be properly defined because the abstractmethod decorator is not interfering with the decorator my_decorator.
Related
I am trying to find a good way of returning a (new) class object from a class method, in a way that can be extended as well.
I have a class (classA) which has, among other methods, a method that returns a new classA object after some processing:
class classA:
    def __init__(self, params): ...

    def methodX(self, **kwargs):
        # process data
        return classA(new_params)
Now I am extending this class with another class, classB. I need methodX to do the same, but this time return classB instead of classA:
class classB(classA):
    def __init__(self, params):
        super().__init__(params)
        self.newParams = XYZ

    def methodX(self, **kwargs):
        ???
This may be something trivial but I simply cannot figure it out. In the end, I don't want to rewrite methodX each time the class gets extended.
Thank you for your time.
Use the __class__ attribute like this:
class A:
    def __init__(self, **kwargs):
        self.kwargs = kwargs

    def methodX(self, **kwargs):
        # do stuff with kwargs
        return self.__class__(**kwargs)

    def __repr__(self):
        return f'{self.__class__}({self.kwargs})'

class B(A):
    pass

a = A(foo='bar')
ax = a.methodX(gee='whiz')
b = B(yee='haw')
bx = b.methodX(cool='beans')

print(a)
print(ax)
print(b)
print(bx)
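Run as a script, that should print something roughly like this (note that bx comes back as a B, not an A):

<class '__main__.A'>({'foo': 'bar'})
<class '__main__.A'>({'gee': 'whiz'})
<class '__main__.B'>({'yee': 'haw'})
<class '__main__.B'>({'cool': 'beans'})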
class classA:
    def __init__(self, x):
        self.x = x

    def createNew(self, y):
        t = type(self)
        return t(y)

class classB(classA):
    def __init__(self, params):
        super().__init__(params)

a = classA(1)
newA = a.createNew(2)
b = classB(1)
newB = b.createNew(2)
print(type(newB))
# <class '__main__.classB'>
I want to propose what I think is the cleanest approach, albeit similar to existing answers. The problem feels like a good fit for a class method:
class A:
    @classmethod
    def method_x(cls, **kwargs):
        return cls(<init params>)
Using the @classmethod decorator ensures that the first input (traditionally named cls) will refer to the class to which the method belongs, rather than to an instance.
(Usually we call the first method input self, and it refers to the instance on which the method is called.)
Because cls refers to A, rather than an instance of A, we can call cls() as we would call A().
However, in a class that inherits from A, cls will instead refer to the child class, as required:
class A:
    def __init__(self, x):
        self.x = x

    @classmethod
    def make_new(cls, **kwargs):
        y = kwargs["y"]
        return cls(y)  # returns A(y) here

class B(A):
    def __init__(self, x):
        super().__init__(x)
        self.z = 3 * x

inst = B(1).make_new(y=7)
print(inst.x, inst.z)
And now you can expect that print statement to produce 7 21.
That inst.z exists should confirm for you that the make_new call (which was only defined on A and inherited unaltered by B) has indeed made an instance of B.
However, there's something I must point out. Inheriting the unaltered make_new method only works because the __init__ method on B has the same call signature as the method on A. If this weren't the case then the call to cls might have had to be altered.
This can be circumvented by allowing **kwargs on the __init__ method and passing generic **kwargs into cls() in the parent class:
class A:
    def __init__(self, **kwargs):
        self.x = kwargs["x"]

    @classmethod
    def make_new(cls, **kwargs):
        return cls(**kwargs)

class B(A):
    def __init__(self, x, w):
        super().__init__(x=x)
        self.w = w

inst = B(1, 2).make_new(x="spam", w="spam")
print(inst.x, inst.w)
Here we were able to give B a different (more restrictive!) signature.
This illustrates a general principle, which is that parent classes will typically be more abstract/less specific than their children.
It follows that, if you want two classes that substantially share behaviour but which do quite specific different things, it will be better to create three classes: one rather abstract one that defines the behaviour-in-common, and two children that give you the specific behaviours you want.
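A minimal sketch of that three-class shape (the class names here are purely illustrative):

class Shape:
    # the shared, more abstract behaviour lives in the parent
    def __init__(self, size):
        self.size = size

    def describe(self):
        return f"{type(self).__name__} of size {self.size}"

class Circle(Shape):
    def area(self):
        return 3.14159 * self.size ** 2

class Square(Shape):
    def area(self):
        return self.size ** 2

print(Circle(2).describe(), Circle(2).area())  # Circle of size 2 12.56636
print(Square(3).describe(), Square(3).area())  # Square of size 3 9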
An upstream interface was given to me with all of its functions defined as non-abstract, when in reality they should be decorated with @abstractmethod. I want to receive an error if I fail to implement one of its functions and it gets called. To do this, I would create a wrapper class and manually go through each of its defined functions and do something like this:
from abc import ABC, abstractmethod

class Foo(object):
    def foo(self):
        print("Foo")

class AbstractFoo(Foo, ABC):
    @abstractmethod
    def foo(self):
        return super().foo()

class ConcreteFoo(AbstractFoo):
    def foo(self):
        print("Concrete Foo")
        super().foo()

f = ConcreteFoo()
f.foo()
Which outputs:
Concrete Foo
Foo
I would like some way of just doing this to all functions defined by Foo. Obviously, inherited magic functions like __str__ and __repr__ should be forwarded appropriately.
Does anyone know a nice, pythonic way of doing this?
def validate_base_class_implementation(cls):
    base_cls_funcs = []
    for attr in cls.__bases__[0].__dict__:
        if callable(getattr(cls, attr)):
            base_cls_funcs.append(attr)
    cls_funcs = []
    for attr in cls.__dict__:
        if callable(getattr(cls, attr)):
            cls_funcs.append(attr)
    missing_funcs = [x for x in base_cls_funcs if x not in cls_funcs]
    if len(missing_funcs) > 0:
        print("Not implemented functions are: {}".format(','.join(missing_funcs)))
        raise Exception("Not implemented function exception!")
    return cls

class Foo(object):
    def foo(self):
        print("Foo")

    def boo(self):
        print("Wow")

@validate_base_class_implementation
class ConcreteFoo(Foo):
    def foo(self):
        print("Concrete Foo")
        super().foo()

    def boo(self):  # boo is implemented as well so that validation passes
        print("Concrete Wow")
        super().boo()

f = ConcreteFoo()
f.foo()
Not sure if this is 100% what you meant.
This decorator checks that the decorated class implements all of the base class's functions (in your case, they are not decorated as abstract). If there is a function that your decorated class does not implement, it raises an exception.
You can modify the original class Foo and turn all its methods into abstract methods and then define a blank subclass of Foo with metaclass=ABCMeta in order to handle the checks:
from abc import ABCMeta, abstractmethod
from types import FunctionType

class AbstractFoo(Foo, metaclass=ABCMeta):
    pass

names = set()
for k, v in vars(Foo).items():
    if k.startswith('__') and k.endswith('__'):
        continue
    elif isinstance(v, FunctionType):
        names.add(k)
        v.__isabstractmethod__ = True

AbstractFoo.__abstractmethods__ = frozenset(names)
Side note: This approach relies on dunder attributes being used by abc and as such can break without deprecation.
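To illustrate the effect (assuming the Foo from the question above), a subclass that overrides foo can be instantiated, while one that doesn't is rejected:

class GoodFoo(AbstractFoo):
    def foo(self):
        print("Good Foo")

class BadFoo(AbstractFoo):
    pass

GoodFoo().foo()  # prints "Good Foo"
BadFoo()         # TypeError: Can't instantiate abstract class BadFoo ...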
I'd like to implement something like this
def after(f):
    def inner(*args, **kwargs):
        super().f(*args, **kwargs)
        f(*args, **kwargs)
    return inner

class A:
    def f(self):
        print('hello')

class B(A):
    @after
    def f(self):
        print('world')

b = B()
b.f()
That is, I would like to get rid of explicit super() calls in some of my classes and replace them with a @before / @after decorator (maybe with parameters).
In this example, I would like hello world to be printed.
The idea is to increase the readability of the code: I often use multiple inheritance in some classes, so I often override methods and have to call super().
I think I could use inspect to determine the class of the instance that calls the decorator (although I'm not sure about the performance if I have many class instances).
Is there a way to do this without sacrificing performance?
You can make your decorator work, you just need to make it a descriptor class rather than a function. You need to implement the __set_name__ method to get a reference to the class you've been added to. With the class reference, you can make a two-argument super call:
import functools

class after:
    def __init__(self, method):
        self.method = method
        self.owner = None
        self.name = None

    def __set_name__(self, owner, name):
        self.owner = owner
        self.name = name  # using self.method.__name__ might be better?

    def __get__(self, instance, owner):
        if instance is None:
            return self
        return functools.partial(self, instance)

    def __call__(self, instance, *args, **kwargs):
        assert self.owner is not None and self.name is not None
        getattr(super(self.owner, instance), self.name)(*args, **kwargs)
        return self.method(instance, *args, **kwargs)
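A quick usage sketch with the A and B from the question, just to show the descriptor in action:

class A:
    def f(self):
        print('hello')

class B(A):
    @after
    def f(self):
        print('world')

B().f()  # prints "hello", then "world"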
You could do a before too, which would be nearly the same, just with the last two lines of __call__ in the reverse order (and some fiddling to handle the return value).
I'd note that this decorator is quite a bit less generally useful than calling super the normal way since you can't usefully interact with the value returned by the overridden method, or change the arguments being passed in to it. There's no before or after decorated method that can replicate these classes:
class Foo:
    def foo(self, x, y):
        return x + y

class Bar(Foo):
    def foo(self, x, y, z):
        return super().foo(x//2, y+1) * z
Let's say I have the following two classes
class A:
    def own_method(self):
        pass

    def descendant_method(self):
        pass

class B(A):
    pass
and I want descendant_method to be callable from instances of B, but not of A, and own_method to be callable from everywhere.
I can think of several solutions, all unsatisfactory:
Check some field and manually raise NotImplementedError:
class A:
    def __init__(self):
        self.some_field = None

    def own_method(self):
        pass

    def descendant_method(self):
        if self.some_field is None:
            raise NotImplementedError

class B(A):
    def __init__(self):
        super(B, self).__init__()
        self.some_field = 'B'
But this is modifying the method's runtime behaviour, which I don't want to do
Use a mixin:
class A:
    def own_method(self):
        pass

class AA:
    def descendant_method(self):
        pass

class B(AA, A):
    pass
This is nice as long as descendant_method doesn't use much from A; otherwise we'd have to make AA inherit from A, and that defies the whole point.
Make the method private in A and redefine it via a metaclass:
class A:
    def own_method(self):
        pass

    def __descendant_method(self):
        pass

class AMeta(type):
    def __new__(mcs, name, parents, dct):
        par = parents[0]
        desc_method_private_name = '_{}__descendant_method'.format(par.__name__)
        if desc_method_private_name in par.__dict__:
            dct['descendant_method'] = par.__dict__[desc_method_private_name]
        return super(AMeta, mcs).__new__(mcs, name, parents, dct)

class B(A, metaclass=AMeta):
    def __init__(self):
        super(B, self).__init__()
This works, but obviously looks dirty, just like writing self.descendant_method = self._A__descendant_method in B itself.
What would be the right "pythonic" way of achieving this behaviour?
UPD: putting the method directly in B would work, of course, but I expect that A will have many descendants that will use this method and do not want to define it in every subclass.
What is so bad about making AA inherit from A? It's basically an abstract base class that adds additional functionality that isn't meant to be available in A. If you really don't want AA to ever be instantiated then the pythonic answer is not to worry about it, and just document that the user isn't meant to do that. Though if you're really insistent you can define __new__ to throw an error if the user tries to instantiate AA.
class A:
    def f(self):
        pass

class AA(A):
    def g(self):
        pass

    def __new__(cls, *args, **kwargs):
        if cls is AA:
            raise TypeError("AA is not meant to be instantiated")
        return super().__new__(cls)

class B(AA):
    pass
Another alternative might be to make AA an Abstract Base Class. For this to work you will need to define at least one method as being abstract -- __init__ could do if there are no other methods you want to say are abstract.
from abc import ABCMeta, abstractmethod

class A:
    def __init__(self, val):
        self.val = val

    def f(self):
        pass

class AA(A, metaclass=ABCMeta):
    @abstractmethod
    def __init__(self, val):
        super().__init__(val)

    def g(self):
        pass

class B(AA):
    def __init__(self, val):
        super().__init__(val)
Finally, what's so bad about having the descendant method available on A but simply not using it? You are writing the code for A, so just don't use the method... You could even document that the method isn't meant to be used by A directly, but rather to be available to child classes. That way future developers will know your intentions.
As far as I can tell, this may be the most Pythonic way of accomplishing what you want:
class A:
    def own_method(self):
        pass

    def descendant_method(self):
        raise NotImplementedError

class B(A):
    def descendant_method(self):
        ...
Another option could be the following:
class A:
    def own_method(self):
        pass

    def _descendant_method(self):
        pass

class B(A):
    def descendant_method(self):
        return self._descendant_method()
They're both Pythonic because they are explicit, readable, clear and concise.
They're explicit because they don't do any unnecessary magic.
They're readable because one can tell precisely what you're doing, and what your intention was, at first glance.
They're clear because the leading single underscore is a widely used convention in the Python community for private (non-magic) methods; any developer who sees it should know to tread with caution.
Choosing between these approaches will depend on your use case. A more concrete example in your question would be helpful.
Try checking the class name using __class__.__name__:
class A(object):
    def descendant_method(self):
        if self.__class__.__name__ == A.__name__:
            raise NotImplementedError
        print 'From descendant'

class B(A):
    pass

b = B()
b.descendant_method()

a = A()
a.descendant_method()  # raises NotImplementedError
I need to refactor existing code by collapsing a method that's copy-and-pasted between various classes that inherit from one another into a single method.
So I produced the following code:
class A(object):
    def rec(self):
        return 1

class B(A):
    def rec(self):
        return self.rec_gen(B)

    def rec_gen(self, rec_class):
        return super(rec_class, self).rec() + 1

class C(B):
    def rec(self):
        return self.rec_gen(C)

if __name__ == '__main__':
    b = B(); c = C()
    print c.rec()
    print b.rec()
And the output:
3
2
What still bothers me is that in the 'rec' method I need to tell 'rec_gen' the context of the class in which it's running. Is there a way for 'rec_gen' to figure it out by itself in runtime?
This capability has been added to Python 3 - see PEP 3135. In a nutshell:
class B(A):
    def rec(self):
        return super().rec() + 1
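For completeness, here's a quick Python 3 sketch of the whole hierarchy from the question using the no-argument super():

class A:
    def rec(self):
        return 1

class B(A):
    def rec(self):
        return super().rec() + 1

class C(B):
    def rec(self):
        return super().rec() + 1

print(C().rec())  # 3
print(B().rec())  # 2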
I think you've created the convoluted rec()/rec_gen() setup because you couldn't automatically find the class, but in case you want that anyway the following should work:
class A(object):
    def rec(self):
        return 1

class B(A):
    def rec(self):
        # __class__ is a cell that is only created if super() is in the method
        super()
        return self.rec_gen(__class__)

    def rec_gen(self, rec_class):
        return super(rec_class, self).rec() + 1

class C(B):
    def rec(self):
        # __class__ is a cell that is only created if super() is in the method
        super()
        return self.rec_gen(__class__)
The simplest solution in Python 2 is to use a private member to hold the super object:
class B(A):
    def __init__(self):
        self.__super = super(B, self)

    def rec(self):
        return self.__super.rec() + 1
But that still suffers from the need to specify the actual class in one place, and if you happen to have two identically-named classes in the class hierarchy (e.g. from different modules) this method will break.
There were a couple of us who made recipes for automatic resolution for Python 2 prior to the existence of PEP 3135 - my method is at self.super on ActiveState. Basically, it allows the following:
class B(A, autosuper):
    def rec(self):
        return self.super().rec() + 1
or in the case that you're calling a parent method with the same name (the most common case):
class B(A, autosuper):
    def rec(self):
        return self.super() + 1
Caveats to this method:
It's quite slow. I have a version sitting around somewhere that does bytecode manipulation to improve the speed a lot.
It's not consistent with PEP 3135 (although it was a proposal for the Python 3 super at one stage).
It's quite complex.
It's a mix-in base class.
I don't know if the above would enable you to meet your requirements. With a small change to the recipe though you could find out what class you're in and pass that to rec_gen() - basically extract the class-finding code out of _getSuper() into its own method.
An alternative solution for python 2.x would be to use a metaclass to automatically define the rec method in all your subclasses:
class RecGen(type):
    def __new__(cls, name, bases, dct):
        new_cls = super(RecGen, cls).__new__(cls, name, bases, dct)
        if bases != (object,):
            def rec(self):
                return super(new_cls, self).rec() + 1
            new_cls.rec = rec
        return new_cls

class A(object):
    __metaclass__ = RecGen

    def rec(self):
        return 1

class B(A):
    pass

class C(B):
    pass
Note that if you're just trying to get something like the number of parent classes, it would be easier to use self.__class__.__mro__ directly:
class A(object):
    def rec(self):
        return len(self.__class__.__mro__) - 1

class B(A):
    pass

class C(B):
    pass
I'm not sure exactly what you're trying to achieve, but if it is just to have a method that returns a different constant value for each class then use class attributes to store the value. It isn't clear at all from your example that you need to go anywhere near super().
class A(object):
    REC = 1

    def rec(self):
        return self.REC

class B(A):
    REC = 2

class C(B):
    REC = 3

if __name__ == '__main__':
    b = B(); c = C()
    print c.rec()
    print b.rec()