Inheritance in Python Such That All Base Functions Are Called

Basically, what I want is to do this:
class B:
    def fn(self):
        print 'B'

class A:
    def fn(self):
        print 'A'

@extendInherit
class C(A, B):
    pass

c = C()
c.fn()
And have the output be
A
B
How would I implement the extendInherit decorator?

This is not a job for decorators. You want to completely change the normal behaviour of a class, so this is actually a job for a metaclass.
import types

class CallAll(type):
    """ MetaClass that adds methods to call all superclass implementations """
    def __new__(meta, clsname, bases, attrs):
        ## collect a list of functions defined on superclasses
        funcs = {}
        for base in bases:
            for name, val in vars(base).iteritems():
                if type(val) is types.FunctionType:
                    if name in funcs:
                        funcs[name].append(val)
                    else:
                        funcs[name] = [val]
        ## now we have all methods, so wrap each of them; a factory function is
        ## used so each wrapper binds its own list of functions (a plain closure
        ## over `name` would late-bind and dispatch every method to the last name)
        def make_caller(funclist):
            def caller(self, *args, **kwargs):
                """ calls all baseclass implementations """
                for func in funclist:
                    func(self, *args, **kwargs)
            return caller
        for name in funcs:
            attrs[name] = make_caller(funcs[name])
        return type.__new__(meta, clsname, bases, attrs)
class B:
    def fn(self):
        print 'B'

class A:
    def fn(self):
        print 'A'

class C(A, B, object):
    __metaclass__ = CallAll

c = C()
c.fn()
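With this metaclass in place, c.fn() should print A and then B, since the bases are scanned in the order they appear in the class statement.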

A metaclass is a possible solution, but somewhat complex. super can do it very simply (with new style classes of course: there's no reason to use legacy classes in new code!):
class B(object):
    def fn(self):
        print 'B'
        try: super(B, self).fn()
        except AttributeError: pass

class A(object):
    def fn(self):
        print 'A'
        try: super(A, self).fn()
        except AttributeError: pass

class C(A, B): pass

c = C()
c.fn()
You need the try/except to support any order of single or multiple inheritance: at some point there will be no further base along the method resolution order (MRO) defining a method named fn, so you need to catch and ignore the resulting AttributeError. But as you see, differently from what you appear to think based on your comment to a different answer, you don't necessarily need to override fn in your leafmost class unless you need to do something specific to that class in such an override: super works fine on purely inherited (not overridden) methods, too!
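As a sanity check, the chaining order here is just C's method resolution order; a rough sketch of what you would see (reprs abbreviated):

print C.__mro__
# (C, A, B, object) -- the actual reprs include the module name
c.fn()
# A
# B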

I personally wouldn't try doing this with a decorator, since with new-style classes and super() the following can be achieved:
>>> class A(object):
...     def __init__(self):
...         super(A, self).__init__()
...         print "A"
...
>>> class B(object):
...     def __init__(self):
...         super(B, self).__init__()
...         print "B"
...
>>> class C(A, B):
...     def __init__(self):
...         super(C, self).__init__()
...
>>> foo = C()
B
A
I'd imagine method invocations would work the same way.
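A minimal sketch suggesting that ordinary methods chain the same way (the Root class and the method name fn are illustrative, added only because object has no fn to terminate the chain):

class Root(object):
    def fn(self):
        pass  # end of the chain

class A(Root):
    def fn(self):
        super(A, self).fn()
        print "A"

class B(Root):
    def fn(self):
        super(B, self).fn()
        print "B"

class C(A, B):
    pass

C().fn()
# B
# A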

Related

How to dynamically inherit at initialization time?

I have the following class structure:
class Base:
    def z(self):
        raise NotImplementedError()

class A(Base):
    def z(self):
        self._x()
        return self._z()
    def _x(self):
        pass  # do stuff
    def _a(self):
        raise NotImplementedError()

class B(Base):
    def z(self):
        self._x()
        return self._z()
    def _x(self):
        pass  # do stuff
    def _z(self):
        raise NotImplementedError()

class C(A):
    def _z(self):
        print(5)

class D(B):
    def _z(self):
        print(5)
The implementation of C(A) and D(B) is exactly the same and does not really care which class it inherits from. The conceptual difference is only in A and B (and these need to be kept as separate classes). Instead of writing separate definitions for C and D, I want to be able to dynamically inherit from A or B based on an argument provided at the time of creating an instance of C/D (eventually C and D must be the same name).
It seems that metaclasses might work, but I am not sure how to pass an __init__ argument to the metaclass __new__ (and whether this will actually work). I would really prefer a solution which resolves the problem inside the class.
Have you considered using composition instead of inheritance? It seems like it is much more suitable for this use case. See the bottom of the answer for details.
Anyway, writing class C(A): ... and then class C(B): ... is not what you want; the second class statement simply rebinds the name, so only class C(B) ends up defined.
I'm not sure a metaclass will be able to help you here. I believe the best way would be to use type but I'd love to be corrected.
A solution using type (and probably misusing locals() but that's not the point here)
class A:
    def __init__(self):
        print('Inherited from A')

class B:
    def __init__(self):
        print('Inherited from B')

class_to_inherit = input()  # 'A' or 'B'
C = type('C', (locals()[class_to_inherit],), {})
C()
'A' or 'B'
>> A
Inherited from A
'A' or 'B'
>> B
Inherited from B
Composition
Coming back to the question at the beginning of my answer: you state yourself that the implementation of both "C(A)" and "C(B)" is identical and that they don't actually care about A or B. It seems more correct to me to use composition. Then you can do something along the lines of:
class A: pass
class B: pass

class C:
    def __init__(self, obj):  # obj is either an A or B instance, or the class A or B itself
        self.obj = obj        # or self.obj = obj() if obj is the class itself

c = C(A())  # or c = C(A)
In case C should expose the same API as A or B, C can overwrite __getattr__:
class A:
    def foo(self):
        print('foo')

class C:
    def __init__(self, obj):
        self.obj = obj

    def __getattr__(self, item):
        return getattr(self.obj, item)

C(A()).foo()
# foo

Method accessible only from class descendants in python

Let's say I have the following two classes
class A:
    def own_method(self):
        pass
    def descendant_method(self):
        pass

class B(A):
    pass
and I want descendant_method to be callable from instances of B, but not of A, and own_method to be callable from everywhere.
I can think of several solutions, all unsatisfactory:
Check some field and manually raise NotImplementedError:
class A:
    def __init__(self):
        self.some_field = None
    def own_method(self):
        pass
    def descendant_method(self):
        if self.some_field is None:
            raise NotImplementedError

class B(A):
    def __init__(self):
        super(B, self).__init__()
        self.some_field = 'B'
But this is modifying the method's runtime behaviour, which I don't want to do
Use a mixin:
class A:
    def own_method(self):
        pass

class AA:
    def descendant_method(self):
        pass

class B(AA, A):
    pass
This is nice as long as descendant_method doesn't use much from A; otherwise we'd have to make AA inherit from A, which defeats the whole point.
Make the method private in A and re-expose it via a metaclass:
class A:
    def own_method(self):
        pass
    def __descendant_method(self):
        pass

class AMeta(type):
    def __new__(mcs, name, parents, dct):
        par = parents[0]
        desc_method_private_name = '_{}__descendant_method'.format(par.__name__)
        if desc_method_private_name in par.__dict__:
            dct['descendant_method'] = par.__dict__[desc_method_private_name]
        return super(AMeta, mcs).__new__(mcs, name, parents, dct)

class B(A, metaclass=AMeta):
    def __init__(self):
        super(B, self).__init__()
This works, but obviously looks dirty, just like writing self.descendant_method = self._A__descendant_method in B itself.
What would be the right "pythonic" way of achieving this behaviour?
UPD: putting the method directly in B would work, of course, but I expect that A will have many descendants that will use this method and do not want to define it in every subclass.
What is so bad about making AA inherit from A? It's basically an abstract base class that adds additional functionality that isn't meant to be available in A. If you really don't want AA to ever be instantiated then the pythonic answer is not to worry about it, and just document that the user isn't meant to do that. Though if you're really insistent you can define __new__ to throw an error if the user tries to instantiate AA.
class A:
    def f(self):
        pass

class AA(A):
    def g(self):
        pass

    def __new__(cls, *args, **kwargs):
        if cls is AA:
            raise TypeError("AA is not meant to be instantiated")
        return super().__new__(cls)

class B(AA):
    pass
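For example, a rough sketch of the resulting behaviour:

b = B()   # fine: cls is B, not AA
a = AA()  # raises TypeError: AA is not meant to be instantiated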
Another alternative might be to make AA an Abstract Base Class. For this to work you will need to define at least one method as being abstract -- __init__ could do if there are no other methods you want to say are abstract.
from abc import ABCMeta, abstractmethod

class A:
    def __init__(self, val):
        self.val = val
    def f(self):
        pass

class AA(A, metaclass=ABCMeta):
    @abstractmethod
    def __init__(self, val):
        super().__init__(val)
    def g(self):
        pass

class B(AA):
    def __init__(self, val):
        super().__init__(val)
Finally, what's so bad about having the descendant method available on A but just not using it? You are writing the code for A, so simply don't use the method. You could even document that the method isn't meant to be used by A directly, but is rather meant to be available to child classes. That way future developers will know your intentions.
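A minimal sketch of that documentation-only approach (the docstring wording is illustrative):

class A:
    def own_method(self):
        pass

    def descendant_method(self):
        """Intended for subclasses of A; A itself should not call this."""
        pass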
As far as I can tell, this may be the most Pythonic way of accomplishing what you want:
class A:
    def own_method(self):
        pass
    def descendant_method(self):
        raise NotImplementedError

class B(A):
    def descendant_method(self):
        ...
Another option could be the following:
class A:
    def own_method(self):
        pass
    def _descendant_method(self):
        pass

class B(A):
    def descendant_method(self):
        return self._descendant_method()  # note: no extra self argument
They're both Pythonic because they're explicit, readable, clear and concise.
They're explicit because they don't do any unnecessary magic.
They're readable because you can tell precisely what is being done, and what the intention was, at first glance.
They're clear because the leading single underscore is a widely used convention in the Python community for private (non-magic) methods: any developer who sees it should know to tread with caution.
Choosing between these approaches will depend on your use case. A more concrete example in your question would be helpful.
Try checking the class name using __class__.__name__:
class A(object):
    def descendant_method(self):
        if self.__class__.__name__ == A.__name__:
            raise NotImplementedError
        print 'From descendant'

class B(A):
    pass

b = B()
b.descendant_method()

a = A()
a.descendant_method()
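Running the snippet above should behave roughly like this (traceback abbreviated):

From descendant
Traceback (most recent call last):
  ...
NotImplementedError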

Force implementation of a method in all inheriting classes

I have a situation in which I want to enforce each and every class inheriting from a certain (abstract) class to implement a method. This is something I would normally achieve using @abstractmethod. However, considering this situation of multiple inheritance:
from abc import ABCMeta, abstractmethod

class A(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def very_specific_method(self):
        pass

class B(A):
    def very_specific_method(self):
        print 'doing something in B'

class C(B):
    pass
I want to enforce C to implement the method as well. I want each and every class that inherits A either directly or indirectly to be forced to implement the method. Is this possible?
Clarification: I want this to apply to a specific method, not to all abstract methods. Abstract methods should continue to work the same, but perhaps a new decorator signaling a different kind of method should be created.
Side note: I used abc in the question because this seems like the most related to the issue. I understand how abstract methods usually work and use them regularly. This is a different situation, and I don't mind if it's not done via abc.
A modified version of ABCMeta should do the trick.
Here, instead of checking for methods with __isabstractmethod__ set to True only in the base classes, we check the whole MRO of the class: if such a method is found in any class in the MRO and is not overridden in the current class, it is added to the set of abstracts.
from abc import ABCMeta, abstractmethod
from _weakrefset import WeakSet

class EditedABCMeta(ABCMeta):
    def __new__(mcls, name, bases, namespace):
        cls = type.__new__(mcls, name, bases, namespace)
        # Compute set of abstract method names
        abstracts = set(name
                        for name, value in namespace.items()
                        if getattr(value, "__isabstractmethod__", False))
        for base in cls.__mro__:
            for name, value in base.__dict__.items():
                if getattr(value, "__isabstractmethod__", False) and name not in cls.__dict__:
                    abstracts.add(name)
        cls.__abstractmethods__ = frozenset(abstracts)
        # Set up inheritance registry
        cls._abc_registry = WeakSet()
        cls._abc_cache = WeakSet()
        cls._abc_negative_cache = WeakSet()
        cls._abc_negative_cache_version = ABCMeta._abc_invalidation_counter
        return cls
class A(object):
    __metaclass__ = EditedABCMeta

    @abstractmethod
    def veryspecificmethod(self):
        pass

class B(A):
    def veryspecificmethod(self):
        print 'doing something in B'

    @abstractmethod
    def foo(self):
        print 'foo from B'

class C(B):
    def foo(self):
        pass

class D(C, B):
    pass
if __name__ == '__main__':
    for cls in (C, D):
        try:
            cls().veryspecificmethod
        except TypeError as e:
            print e.message
    print '-'*20
    for cls in (C, D):
        try:
            cls().foo
        except TypeError as e:
            print e.message
Output:
Can't instantiate abstract class C with abstract methods veryspecificmethod
Can't instantiate abstract class D with abstract methods foo, veryspecificmethod
--------------------
Can't instantiate abstract class C with abstract methods veryspecificmethod
Can't instantiate abstract class D with abstract methods foo, veryspecificmethod
EDIT:
Adding a special decorator @enforcedmethod that can meet your requirements without affecting @abstractmethod:
from abc import ABCMeta, abstractmethod

def enforcedmethod(func):
    func.__enforcedmethod__ = True
    return func

class EditedABCMeta(ABCMeta):
    def __call__(cls, *args, **kwargs):
        enforcedmethods = set()
        for base in cls.__mro__:
            for name, value in base.__dict__.items():
                if getattr(value, "__enforcedmethod__", False) and name not in cls.__dict__:
                    enforcedmethods.add(name)
        if enforcedmethods:
            raise TypeError("Can't instantiate abstract class {} "
                            "with enforced methods {}".format(
                                cls.__name__, ', '.join(enforcedmethods)))
        else:
            return super(EditedABCMeta, cls).__call__(*args, **kwargs)

class A(object):
    __metaclass__ = EditedABCMeta

    @enforcedmethod
    def veryspecificmethod(self):
        pass

    @abstractmethod
    def simplemethod(self):
        pass

class B(A):
    def veryspecificmethod(self):
        print 'doing something in B'

    def simplemethod(self):
        pass

class C(B):
    pass

class D(C):
    def veryspecificmethod(self):
        print 'doing something in D'
Output:
>>> D().veryspecificmethod()
doing something in D
>>> C().veryspecificmethod()
Traceback (most recent call last):
  File "<pyshell#23>", line 1, in <module>
    C().veryspecificmethod()
  File "C:\Python27\so.py", line 19, in __call__
    cls.__name__, ', '.join(enforcedmethods)))
TypeError: Can't instantiate abstract class C with enforced methods veryspecificmethod
I'm pretty sure that this isn't a great idea, but I think that you can do this. Checking out the ABCMeta implementation for inspiration:
from abc import ABCMeta

def always_override(func):
    func._always_override = True
    return func

class always_override_property(property):
    _always_override = True

class CrazyABCMeta(ABCMeta):
    def __new__(mcls, name, bases, namespace):
        cls = super(ABCMeta, mcls).__new__(mcls, name, bases, namespace)
        abstracts = set()
        # first, get all abstracts from the base classes
        for base in bases:
            abstracts.update(getattr(base, "_all_always_override", set()))
        all_abstracts = abstracts.copy()
        # Now add abstracts from this class and remove abstracts that this class defines
        for name, value in namespace.items():
            always_override = getattr(value, '_always_override', False)
            if always_override:
                abstracts.add(name)
                all_abstracts.add(name)
            elif name in abstracts:
                abstracts.remove(name)
        cls._all_always_override = frozenset(all_abstracts)
        cls._always_override = frozenset(abstracts)
        return cls

    def __call__(cls, *args, **kwargs):
        if cls._always_override:
            raise TypeError(
                'The following methods/properties must '
                'be overridden {}'.format(cls._all_always_override))
        return super(CrazyABCMeta, cls).__call__(*args, **kwargs)
# # # # # # # # # # #
# TESTS!
# # # # # # # # # # #

class A(object):
    __metaclass__ = CrazyABCMeta

    @always_override
    def foo(self):
        pass

    @always_override_property
    def bar(self):
        pass

class B(A):
    def foo(self):
        pass
    bar = 1

class C(B):
    pass

class D(C):
    pass

class E(D):
    def foo(self):
        pass

    @property
    def bar(self):
        return 6

for cls in (B, E):
    cls()
    print ("Pass {}".format(cls.__name__))

for cls in (C, D):
    try:
        print cls()
    except TypeError:
        print ("Pass {}".format(cls.__name__))

getattr and multiple classes

Hi everyone, I have two classes, A and B, and I want to grab a method from B to use in A.
My code is along the lines of:
class A(object):
    def __init__(self, args):
        'blah'
    def func2(self, args):
        pass
        # method = B.func1(args)
        # method = getattr(B, 'func1')

class B(object):
    def __init__(self):
        'do stuff'
    def func1(self, args):
        'Do stuff here'
        return
Is there a way to get func1 into A without removing the self argument from func1?
Neither of the two method calls is working for me, and I keep getting a TypeError:
TypeError: unbound method func1 must be called with B instance as
first argument (got NoneType instance instead)
edit: found my solution
I found the solution to my question. When I passed values from B to A I needed to pass my instance of B as well. So in my __init__ for A:
class A(object):
    def __init__(self, args, B_arg):
        pass  # store args and the B instance here
And in class B
class B(object):
    def passattributes(self):
        c = A(args, self)
I'm going to guess that, given what you're trying to do, the appropriate thing in your case is to define B.func1 as a classmethod, because you don't expect it to require an instance of class B.
@classmethod
def func1(cls, args):
    'blah'
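A hypothetical usage sketch (reusing the question's names) of how A could then call it without ever creating a B instance:

class A(object):
    def func2(self, args):
        return B.func1(args)  # no B instance needed once func1 is a classmethod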
You could create an instance of B in the A class:
class A(object):
    def __init__(self):
        'blah'
        self.bInst = B()
    def func2(self, args):
        method = self.bInst.func1('func1')

class B(object):
    def __init__(self):
        'do stuff'
    def func1(self, args):
        print 'Do stuff here'
        return

aInst = A()
aInst.func2('some arg')
Result:
Do stuff here

Implicitly invoking parent class initializer

class A(object):
    def __init__(self, a, b, c):
        # super(A, self).__init__()
        super(self.__class__, self).__init__()

class B(A):
    def __init__(self, b, c):
        print super(B, self)
        print super(self.__class__, self)
        # super(B, self).__init__(1, b, c)
        super(self.__class__, self).__init__(1, b, c)

class C(B):
    def __init__(self, c):
        # super(C, self).__init__(2, c)
        super(self.__class__, self).__init__(2, c)

C(3)
In the above code, the commented-out __init__ calls appear to be the commonly accepted "smart" way to do superclass initialization. However, in the event that the class hierarchy is likely to change, I had been using the uncommented form, until recently.
It appears that in the call to the super constructor for B in the above hierarchy, B.__init__ is called again: self.__class__ is actually C, not B as I had always assumed.
Is there some way in Python 2.x that I can maintain the proper MRO (with respect to initializing all parent classes in the correct order) when calling super constructors, while not naming the current class (the B in super(B, self).__init__(1, b, c))?
Short answer: no, there's no way to implicitly invoke the right __init__ with the right arguments of the right parent class in Python 2.x.
Incidentally, the code as shown here is incorrect: if you use super().__init__, then all classes in your hierarchy must have the same signature in their __init__ methods. Otherwise your code can stop working if you introduce a new subclass that uses multiple inheritance.
See http://fuhm.net/super-harmful/ for a longer description of the issue (with pictures).
Your code has nothing to do with method resolution order. Method resolution matters in the case of multiple inheritance, which is not the case in your example. Your code is simply wrong because you assume that self.__class__ is the same as the class where the method is defined, and that is not so:
>>> class A(object):
...     def __init__(self):
...         print self.__class__
...
>>> class B(A):
...     def __init__(self):
...         A.__init__(self)
...
>>> B()
<class '__main__.B'>
<__main__.B object at 0x1bcfed0>
>>> A()
<class '__main__.A'>
<__main__.A object at 0x1bcff90>
>>>
so where you meant to call:
super(B, self).__init__(1, b, c)
you are actually calling:
# super(self.__class__, self).__init__(1, b, c)
super(C, self).__init__(1, b, c)
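A minimal sketch of the failure mode (class names are illustrative): when a subclass instance is being constructed, self.__class__ is the leaf class in every method along the chain, so the super(self.__class__, ...) call in a middle class resolves back to that middle class and recurses.

class Base(object):
    def __init__(self):
        pass

class Mid(Base):
    def __init__(self):
        # constructing a Leaf: self.__class__ is Leaf, so super(Leaf, self)
        # resolves to Mid again and this method calls itself forever
        super(self.__class__, self).__init__()

class Leaf(Mid):
    pass

Leaf()  # RuntimeError: maximum recursion depth exceeded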
EDIT: trying to better answer the question.
class A(object):
    def __init__(self, a):
        for cls in self.__class__.mro():
            if cls is not object:
                cls._init(self, a)

    def _init(self, a):
        print 'A._init'
        self.a = a

class B(A):
    def _init(self, a):
        print 'B._init'

class C(A):
    def _init(self, a):
        print 'C._init'

class D(B, C):
    def _init(self, a):
        print 'D._init'

d = D(3)
print d.a
prints:
D._init
B._init
C._init
A._init
3
(A modified version of template pattern).
Now the parents' methods really are called implicitly, but I have to agree with the Zen of Python that explicit is better than implicit: the code is less readable and the gain is small. Also beware that all the _init methods take the same parameters; you cannot completely forget about the parents, and I don't suggest doing so.
For single inheritance, a better approach is to call the parent's method explicitly, without invoking super. That way you don't have to name the current class, but you still must know which class the parent is.
Good reads are: how-does-pythons-super-do-the-right-thing, the links suggested in that question, and in particular Python's Super is nifty, but you can't use it.
A class hierarchy that is likely to change is a symptom of bad design; it has consequences for all the code using those classes, and should not be encouraged.
EDIT 2
Another example comes to mind, but it uses metaclasses. The Urwid library uses a metaclass to store an attribute, __super, on each class, so that you only need to access that attribute.
Ex:
>>> class MetaSuper(type):
...     """adding .__super"""
...     def __init__(cls, name, bases, d):
...         super(MetaSuper, cls).__init__(name, bases, d)
...         if hasattr(cls, "_%s__super" % name):
...             raise AttributeError, "Class has same name as one of its super classes"
...         setattr(cls, "_%s__super" % name, super(cls))
...
>>> class A:
...     __metaclass__ = MetaSuper
...     def __init__(self, a):
...         self.a = a
...         print 'A.__init__'
...
>>> class B(A):
...     def __init__(self, a):
...         print 'B.__init__'
...         self.__super.__init__(a)
...
>>> b = B(42)
B.__init__
A.__init__
>>> b.a
42
>>>
Perhaps what you are looking for is metaclasses?
class metawrap(type):
    def __new__(mcs, name, bases, dict):
        dict['bases'] = bases
        return type.__new__(mcs, name, bases, dict)

class A(object):
    def __init__(self):
        pass
    def test(self):
        print "I am class A"

class B(A):
    __metaclass__ = metawrap
    def __init__(self):
        pass
    def test(self):
        par = super(self.bases[0], self)
        par.__thisclass__.test(self)

foo = B()
foo.test()
Prints "I am class A"
What the metaclass does is override the initial creation of the B class (not the instance) and make sure that the class dictionary of B now contains a bases attribute where you can find all the base classes of B.
To my knowledge, the following isn't commonly done. But it does seem to work.
Methods in a given class definition always mangle double-underscore attributes to include the name of the class they're defined in. So, if you stash a reference to the class in name-mangled form where the instances can see it, you can use that in the call to super.
An example stashing the references on the object itself, by implementing __new__ on the baseclass:
def mangle(cls, name):
    if not name.startswith('__'):
        raise ValueError('name must start with double underscore')
    return '_%s%s' % (cls.__name__, name)

class ClassStasher(object):
    def __new__(cls, *args, **kwargs):
        obj = object.__new__(cls)
        for c in cls.mro():
            setattr(obj, mangle(c, '__class'), c)
        return obj

class A(ClassStasher):
    def __init__(self):
        print 'init in A', self.__class
        super(self.__class, self).__init__()

class B(A):
    def __init__(self):
        print 'init in B', self.__class
        super(self.__class, self).__init__()

class C(A):
    def __init__(self):
        print 'init in C', self.__class
        super(self.__class, self).__init__()

class D(B, C):
    def __init__(self):
        print 'init in D', self.__class
        super(self.__class, self).__init__()

d = D()
print d
And, doing a similar thing, but using a meta-class and stashing the __class references on the class objects themselves:
class ClassStasherType(type):
    def __init__(cls, name, bases, attributes):
        setattr(cls, mangle(cls, '__class'), cls)

class ClassStasher(object):
    __metaclass__ = ClassStasherType

class A_meta(ClassStasher):
    def __init__(self):
        print 'init in A_meta', self.__class
        super(self.__class, self).__init__()

class B_meta(A_meta):
    def __init__(self):
        print 'init in B_meta', self.__class
        super(self.__class, self).__init__()

class C_meta(A_meta):
    def __init__(self):
        print 'init in C_meta', self.__class
        super(self.__class, self).__init__()

class D_meta(B_meta, C_meta):
    def __init__(self):
        print 'init in D_meta', self.__class
        super(self.__class, self).__init__()

d = D_meta()
print d
Running this all together, as one source file:
% python /tmp/junk.py
init in D <class '__main__.D'>
init in B <class '__main__.B'>
init in C <class '__main__.C'>
init in A <class '__main__.A'>
<__main__.D object at 0x1004a4a50>
init in D_meta <class '__main__.D_meta'>
init in B_meta <class '__main__.B_meta'>
init in C_meta <class '__main__.C_meta'>
init in A_meta <class '__main__.A_meta'>
<__main__.D_meta object at 0x1004a4bd0>
