I have a class B that inherits from A:

class A():
    def do_something(self, x):
        """Prints x."""
        print(x)

class B(A):
    def something_else(self, x):
        print("This isn't the same.")
I'd like to achieve a few things:
I'd like for B.do_something to inherit the docstring from A.do_something. I think functools.wraps is the recommended solution: is that right?
Let's say there are some methods of A that return an instance of A. If I call those methods from B, I'd like them to return an instance of B. So far, I'm overloading each method manually:

def method_of_A(self, *args, **kwargs):
    return A(super(self.__class__, self).method_of_A(*args, **kwargs))

There's likely a better way, especially given that I have to do this for a large number of classes. Is there some way to check if a function is defined within B and, if not but available in A, have it decorated/wrapped to return an instance of B? EDIT: I can't make changes to A's codebase.
Are there solutions that are Py2 and Py3 compatible?
Thanks very much for any suggestions.
Yes, you can use functools.wraps to copy the function name and docstring. You can return an instance of the current class using self.__class__:

import functools

class A(object):
    def func(self):
        return self.__class__()

class B(A):
    @functools.wraps(A.func)
    def func(self):
        return super(B, self).func()

>>> b = B()
>>> obj = b.func()
>>> print(type(obj))
<class '__main__.B'>
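For the docstring part specifically, here is a minimal sketch using the question's own A; it assumes B's override simply delegates to A:

```python
import functools

class A(object):
    def do_something(self, x):
        """Prints x."""
        print(x)

class B(A):
    @functools.wraps(A.do_something)
    def do_something(self, x):
        # wraps copies __name__ and __doc__ from A.do_something onto the override
        return super(B, self).do_something(x)

print(B.do_something.__doc__)  # Prints x.
```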
Is there same way to check if a function is defined within B and, if not but available in A, have it decorated / wrapped to return an instance of B?
You may be able to do this using metaclasses, assuming A isn't already using a custom metaclass that you're not able to inherit from (like if it is only defined in C and hasn't been exposed to python). The way you use metaclasses is slightly different in python 2 and 3.
import functools
import types

class MetaB(type):
    def __new__(cls, name, bases, attrs):
        if A in bases:
            for attr, value in A.__dict__.items():
                if isinstance(value, types.FunctionType) and attr not in attrs:
                    new_func = MetaB.make_wrapper_func(value)
                    attrs[attr] = new_func
        return super(MetaB, cls).__new__(cls, name, bases, attrs)

    @staticmethod
    def make_wrapper_func(func):
        @functools.wraps(func)
        def _func(self, *args, **kwargs):
            value = func(self, *args, **kwargs)
            if isinstance(value, A):
                value = self.__class__(value)
            return value
        return _func
class B(A):
    __metaclass__ = MetaB
    ...

In Python 3, metaclasses are used a little differently:

class B(A, metaclass=MetaB):
    ...
This assumes you can create an object of the B() type just by passing an instance of A() to the constructor (i.e. return self.__class__(value)). That was just a guess; I'd have to know a little more about your object to know how to translate an A object to a B object, but the general method would be the same. This solution also only works on regular instance methods. It's not going to work on other stuff like classmethods, staticmethods, or other types of descriptor objects. You certainly could make it work for all those; your metaclass would just need to be a little more complex.
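Putting the pieces together, here's a runnable sketch. The toy A, its value attribute, and the make method are invented for illustration, and the conversion self.__class__(result.value) stands in for whatever A-to-B translation your real classes need:

```python
import functools
import types

class A(object):
    """Stand-in for the third-party class you can't modify."""
    def __init__(self, value=0):
        self.value = value
    def make(self):
        # a method of A that returns a new A instance
        return A(self.value + 1)

class MetaB(type):
    def __new__(cls, name, bases, attrs):
        if A in bases:
            for attr, value in A.__dict__.items():
                # wrap only plain functions that the subclass doesn't override
                if isinstance(value, types.FunctionType) and attr not in attrs:
                    attrs[attr] = cls.make_wrapper_func(value)
        return super(MetaB, cls).__new__(cls, name, bases, attrs)

    @staticmethod
    def make_wrapper_func(func):
        @functools.wraps(func)
        def _func(self, *args, **kwargs):
            result = func(self, *args, **kwargs)
            if isinstance(result, A):
                # hypothetical conversion: rebuild as the caller's class
                result = self.__class__(result.value)
            return result
        return _func

class B(A, metaclass=MetaB):
    pass

b = B(3)
print(type(b.make()).__name__)  # B, not A
```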
Let's say there are some methods of A that return an instance of A. If I call those methods from B, I'd like them to return an instance of B. So far, I'm overloading each function manually.
Use a classmethod:

class A(object):
    @classmethod
    def f(cls):
        return cls

When b (an instance of B) calls f, it will return B.
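A sketch of how this addresses the original problem; the factory-method name fresh is made up for illustration:

```python
class A(object):
    @classmethod
    def fresh(cls):
        # cls is whichever class the method was invoked on,
        # so subclasses automatically get instances of themselves
        return cls()

class B(A):
    pass

print(type(A.fresh()).__name__)  # A
print(type(B.fresh()).__name__)  # B
```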
I'd like to implement something like this:

def after(f):
    def inner(*args, **kwargs):
        super().f(*args, **kwargs)
        f(*args, **kwargs)
    return inner

class A:
    def f(self):
        print('hello')

class B(A):
    @after
    def f(self):
        print('world')

b = B()
b.f()
That is, I would like to get rid of the explicit super in some of my classes and replace it with a @before / @after decorator (maybe with parameters). In this example, I would like "hello world" to be printed.
The idea is to increase the readability of the code, as in some classes I often use multiple inheritance, so I often override methods and often have to use super().
I think I could use inspect to determine the class instance that calls the decorator (although I'm not sure about performance if I have many class instances). Is there a way to do this without sacrificing performance?
You can make your decorator work; you just need to make it a descriptor class rather than a function. You need to implement the __set_name__ method to get a reference to the class you've been added to. With the class reference, you can make a two-argument super call:
import functools

class after:
    def __init__(self, method):
        self.method = method

    def __set_name__(self, owner, name):
        self.owner = owner
        self.name = name  # using self.method.__name__ might be better?

    def __get__(self, instance, owner):
        if instance is None:
            return self
        return functools.partial(self, instance)

    def __call__(self, instance, *args, **kwargs):
        assert self.owner is not None and self.name is not None
        getattr(super(self.owner, instance), self.name)(*args, **kwargs)
        return self.method(instance, *args, **kwargs)
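Applied to the classes from the question, a self-contained sketch (requires Python 3.6+, where __set_name__ is called at class-creation time):

```python
import functools

class after:
    def __init__(self, method):
        self.method = method

    def __set_name__(self, owner, name):
        # called when the owning class is created; records class and attribute name
        self.owner = owner
        self.name = name

    def __get__(self, instance, owner):
        if instance is None:
            return self
        return functools.partial(self, instance)

    def __call__(self, instance, *args, **kwargs):
        # run the overridden parent method first, then the decorated one
        getattr(super(self.owner, instance), self.name)(*args, **kwargs)
        return self.method(instance, *args, **kwargs)

class A:
    def f(self):
        print('hello')

class B(A):
    @after
    def f(self):
        print('world')

B().f()  # prints "hello" then "world"
```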
You could do a before too, which would be nearly the same, just with the last two lines in the reverse order (and some fiddling to handle the return value).
I'd note that this decorator is quite a bit less generally useful than calling super the normal way, since you can't usefully interact with the value returned by the overridden method or change the arguments being passed in to it. There's no before- or after-decorated method that can replicate these classes:
class Foo:
    def foo(self, x, y):
        return x + y

class Bar(Foo):
    def foo(self, x, y, z):
        return super().foo(x//2, y+1) * z
I have the following class structure:

class Base:
    def z(self):
        raise NotImplementedError()

class A(Base):
    def z(self):
        self._x()
        return self._z()
    def _x(self):
        pass  # do stuff
    def _z(self):
        raise NotImplementedError()

class B(Base):
    def z(self):
        self._x()
        return self._z()
    def _x(self):
        pass  # do stuff
    def _z(self):
        raise NotImplementedError()

class C(A):
    def _z(self):
        print(5)

class D(B):
    def _z(self):
        print(5)
The implementations of C(A) and D(B) are exactly the same and do not really care which class they inherit from. The conceptual difference is only in A and B (and these need to be kept as separate classes). Instead of writing separate definitions for C and D, I want to be able to dynamically inherit from A or B based on an argument provided at the time of creating an instance of C/D (ultimately C and D should share the same name).
It seems that metaclasses might work, but I am not sure how to pass an __init__ argument to the metaclass __new__ (and whether this will actually work). I would really prefer a solution which resolves the problem inside the class.
Have you considered using composition instead of inheritance? It seems like it is much more suitable for this use case. See the bottom of the answer for details.
Anyway, writing class C(A): ... followed by class C(B): ... is not valid for this purpose; the second definition simply replaces the first, so only class C(B) ends up defined.
I'm not sure a metaclass will be able to help you here. I believe the best way would be to use type but I'd love to be corrected.
A solution using type (and probably misusing locals() but that's not the point here)
class A:
    def __init__(self):
        print('Inherited from A')

class B:
    def __init__(self):
        print('Inherited from B')

class_to_inherit = input()  # 'A' or 'B'
C = type('C', (locals()[class_to_inherit],), {})
C()

Example runs:

'A' or 'B'
>> A
Inherited from A

'A' or 'B'
>> B
Inherited from B
Composition
Tracking back to the question in the beginning of my answer, you state yourself that the implementation of both "C(A)" and "C(B)" is identical and they don't actually care about A or B. It seems more correct to me to use composition. Then you can do something along the lines of:
class A: pass
class B: pass

class C:
    def __init__(self, obj):  # obj is either an A or B instance, or A or B themselves
        self.obj = obj  # or self.obj = obj() if obj is A or B themselves

c = C(A())  # or c = C(A)
In case C should expose the same API as A or B, C can overwrite __getattr__:
class A:
    def foo(self):
        print('foo')

class C:
    def __init__(self, obj):
        self.obj = obj
    def __getattr__(self, item):
        return getattr(self.obj, item)

C(A()).foo()
# foo
Let's say I have this class:

class Test(object):
    def __init__(self, a):
        self.a = a
    def test(self, b):
        if isinstance(self, Test):
            return self.a + b
        else:
            return self + b
This would ideally in my world do this:
>>> Test.test(1,2)
3
>>> Test(1).test(2)
3
Now this doesn't work because you get this error:
TypeError: unbound method test() must be called with Test instance as first argument (got int instance instead)
In Python 3 this works fine, and I have the sneaking suspicion this is possible with a decorator in Python 2, but my Python-fu isn't strong enough to get that to work.
Plot Twist: So what happens when I need something on self when it's not called statically.
If you want something that will actually receive self if called on an instance, but can also be called on the class, writing your own descriptor type may be advisable:
class ClassOrInstanceMethod(object):
    def __init__(self, wrapped):
        self.wrapped = wrapped
    def __get__(self, instance, owner):
        if instance is None:
            instance = owner
        return self.wrapped.__get__(instance, owner)

class demo(object):
    @ClassOrInstanceMethod
    def foo(self):
        # self will be the class if this is called on the class
        print(self)
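A usage sketch of the descriptor (Python 3 shown; foo returns self here so the binding is easy to check):

```python
class ClassOrInstanceMethod(object):
    def __init__(self, wrapped):
        self.wrapped = wrapped
    def __get__(self, instance, owner):
        if instance is None:
            instance = owner  # no instance: bind the method to the class itself
        return self.wrapped.__get__(instance, owner)

class demo(object):
    @ClassOrInstanceMethod
    def foo(self):
        return self

print(demo.foo())    # self is the class demo
print(demo().foo())  # self is a demo instance
```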
For the original version of your question, you could just write it like any other static method, with #staticmethod. Calling a static method on an instance works the same as calling it on the class:
class Test(object):
    @staticmethod
    def test(a, b):
        return a + b
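For that state-free version, calling on the class or on an instance behaves identically:

```python
class Test(object):
    def __init__(self, a):
        self.a = a

    @staticmethod
    def test(a, b):
        # a plain static method: the instance, if any, is simply ignored
        return a + b

print(Test.test(1, 2))     # 3
print(Test(1).test(1, 2))  # 3, same result via an instance
```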
In Python 3, if __new__ returns any value that is not an instance of cls, the __init__ method is never called. So I can, for example, do this:
class Foo:
    @staticmethod
    def bar(n):
        return n * 5
    def __new__(cls, n):
        return Foo.bar(n)

print(Foo(3))  # => 15
I was under the impression that the order was __call__ (if it's an instance) -> __new__ -> __init__.
However, in Python 2, this seems to raise a TypeError: this constructor takes no arguments due to the lack of an __init__. I can fix that by inheriting from object. So, running this:
class Foo:
    def __new__(cls, *args, **kwargs):
        print("new called")
    def __init__(self, *args, **kwargs):
        print("init called")

Foo()

"""
Python 2: "init called"
Python 3: "new called"
"""
In Python 2, I even messed around with metaclasses.
Meta = type("Meta", (type,), dict(__call__=lambda self, x: x * 5))

class Foo(object):
    __metaclass__ = Meta

print(Foo(4))  # => 20
But this does not work in Python 3, because the init/new methods seem to be reversed there.
Is there any Python 2/3 compatible way of doing this?
Solution:
This is the way I did it. I don't like it, but it works:
class Foo(object):
    @staticmethod
    def __call__(i):
        return i * 5
    def __new__(cls, i):
        return Foo.__call__(i)
Surely there is a more pythonic way of doing this.
In Python 2, you need to use new-style classes to make classes work properly. That means you need to define your class as class Foo(object). Then your first example will work in both Python 2 and Python 3.
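Concretely, a sketch of the original example that runs the same under both versions once the class derives from object:

```python
class Foo(object):  # new-style class, so __new__ is honored in Python 2 as well
    @staticmethod
    def bar(n):
        return n * 5

    def __new__(cls, n):
        # returning a value that is not a Foo instance means __init__ is skipped
        return cls.bar(n)

print(Foo(3))  # 15 on both Python 2 and Python 3
```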
I need to refactor existing code by collapsing a method that's copy-and-pasted between various classes that inherit from one another into a single method.
So I produced the following code:

class A(object):
    def rec(self):
        return 1

class B(A):
    def rec(self):
        return self.rec_gen(B)
    def rec_gen(self, rec_class):
        return super(rec_class, self).rec() + 1

class C(B):
    def rec(self):
        return self.rec_gen(C)

if __name__ == '__main__':
    b = B(); c = C()
    print c.rec()
    print b.rec()
And the output:
3
2
What still bothers me is that in the 'rec' method I need to tell 'rec_gen' which class it's running in. Is there a way for 'rec_gen' to figure that out by itself at runtime?
This capability has been added to Python 3 - see PEP 3135. In a nutshell:
class B(A):
    def rec(self):
        return super().rec() + 1
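The whole hierarchy from the question then collapses to (Python 3):

```python
class A:
    def rec(self):
        return 1

class B(A):
    def rec(self):
        # zero-argument super() finds the class automatically
        return super().rec() + 1

class C(B):
    def rec(self):
        return super().rec() + 1

print(C().rec())  # 3
print(B().rec())  # 2
```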
I think you've created the convoluted rec()/rec_gen() setup because you couldn't automatically find the class, but in case you want that anyway the following should work:
class A(object):
    def rec(self):
        return 1

class B(A):
    def rec(self):
        # __class__ is a cell that is only created if super() is in the method
        super()
        return self.rec_gen(__class__)
    def rec_gen(self, rec_class):
        return super(rec_class, self).rec() + 1

class C(B):
    def rec(self):
        # __class__ is a cell that is only created if super() is in the method
        super()
        return self.rec_gen(__class__)
The simplest solution in Python 2 is to use a private class attribute to hold an unbound super object. It has to be a class attribute set after the class exists, so that the descriptor protocol binds it to the instance on access:

class B(A):
    def rec(self):
        return self.__super.rec() + 1
B._B__super = super(B)  # name-mangled form of __super, set once B is defined
But that still suffers from the need to specify the actual class in one place, and if you happen to have two identically-named classes in the class hierarchy (e.g. from different modules) this method will break.
There were a couple of us who made recipes for automatic resolution for Python 2 prior to the existence of PEP 3135 - my method is at self.super on ActiveState. Basically, it allows the following:
class B(A, autosuper):
    def rec(self):
        return self.super().rec() + 1

or in the case that you're calling a parent method with the same name (the most common case):

class B(A, autosuper):
    def rec(self):
        return self.super() + 1
Caveats to this method:
It's quite slow. I have a version sitting around somewhere that does bytecode manipulation to improve the speed a lot.
It's not consistent with PEP 3135 (although it was a proposal for the Python 3 super at one stage).
It's quite complex.
It's a mix-in base class.
I don't know if the above would enable you to meet your requirements. With a small change to the recipe though you could find out what class you're in and pass that to rec_gen() - basically extract the class-finding code out of _getSuper() into its own method.
An alternative solution for python 2.x would be to use a metaclass to automatically define the rec method in all your subclasses:
class RecGen(type):
    def __new__(cls, name, bases, dct):
        new_cls = super(RecGen, cls).__new__(cls, name, bases, dct)
        if bases != (object,):
            def rec(self):
                return super(new_cls, self).rec() + 1
            new_cls.rec = rec
        return new_cls

class A(object):
    __metaclass__ = RecGen
    def rec(self):
        return 1

class B(A):
    pass

class C(B):
    pass
Note that if you're just trying to get something like the number of parent classes, it would be easier to use self.__class__.__mro__ directly:
class A(object):
    def rec(self):
        return len(self.__class__.__mro__) - 1

class B(A):
    pass

class C(B):
    pass
I'm not sure exactly what you're trying to achieve, but if it is just to have a method that returns a different constant value for each class then use class attributes to store the value. It isn't clear at all from your example that you need to go anywhere near super().
class A(object):
    REC = 1
    def rec(self):
        return self.REC

class B(A):
    REC = 2

class C(B):
    REC = 3

if __name__ == '__main__':
    b = B(); c = C()
    print c.rec()
    print b.rec()