class A(object):
    def __init__(self, a, b, c):
        #super(A, self).__init__()
        super(self.__class__, self).__init__()

class B(A):
    def __init__(self, b, c):
        print super(B, self)
        print super(self.__class__, self)
        #super(B, self).__init__(1, b, c)
        super(self.__class__, self).__init__(1, b, c)

class C(B):
    def __init__(self, c):
        #super(C, self).__init__(2, c)
        super(self.__class__, self).__init__(2, c)

C(3)
In the above code, the commented-out __init__ calls appear to be the commonly accepted "smart" way to do superclass initialization. However, because the class hierarchy is likely to change, I had been using the uncommented form until recently.
It appears that in the call to the super constructor for B in the above hierarchy, B.__init__ is called again: self.__class__ is actually C, not B as I had always assumed.
Is there some way in Python 2.x that I can maintain the proper MRO (with respect to initializing all parent classes in the correct order) when calling super constructors, while not naming the current class (the B in super(B, self).__init__(1, b, c))?
Short answer: no, there's no way to implicitly invoke the right __init__ with the right arguments of the right parent class in Python 2.x.
Incidentally, the code as shown here is incorrect: if you use super().__init__, then all classes in your hierarchy must have the same signature in their __init__ methods. Otherwise your code can stop working if you introduce a new subclass that uses multiple inheritance.
See http://fuhm.net/super-harmful/ for a longer description of the issue (with pictures).
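For example, here is a minimal sketch (with made-up class names, not from the question) of how mismatched __init__ signatures surface once multiple inheritance enters the picture:

# X and Y each work on their own, but their __init__ signatures differ,
# so combining them breaks the super() chain.
class X(object):
    def __init__(self, x):
        super(X, self).__init__()    # fine while the next class is object
        self.x = x

class Y(object):
    def __init__(self, y):
        super(Y, self).__init__()
        self.y = y

class XY(X, Y):
    def __init__(self, x, y):
        super(XY, self).__init__(x)  # X's super() call now lands on Y

XY(1, 2)   # TypeError: __init__() takes exactly 2 arguments (1 given)

The usual workaround is to give every __init__ in the hierarchy a compatible signature (for example by forwarding **kwargs), as shown later in this thread.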
Your code has nothing to do with method resolution order. Method resolution order matters in the case of multiple inheritance, which your example does not use. Your code is simply wrong because it assumes that self.__class__ is the class in which the method is defined, and that is not true:
>>> class A(object):
...     def __init__(self):
...         print self.__class__
...
>>>
>>> class B(A):
...     def __init__(self):
...         A.__init__(self)
...
>>> B()
<class '__main__.B'>
<__main__.B object at 0x1bcfed0>
>>> A()
<class '__main__.A'>
<__main__.A object at 0x1bcff90>
>>>
So where you intend to call:
super(B, self).__init__(1, b, c)
you are actually calling:
# super(self.__class__, self).__init__(1, b, c)
super(C, self).__init__(1, b, c)
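To make the consequence concrete, here is a minimal sketch (illustrative names) of the infinite recursion this produces:

# With an instance of C, every super(self.__class__, ...) call is really
# super(C, ...), so B.__init__ keeps re-entering itself.
class A(object):
    def __init__(self):
        pass

class B(A):
    def __init__(self):
        super(self.__class__, self).__init__()   # for a C instance: super(C, self)

class C(B):
    def __init__(self):
        super(self.__class__, self).__init__()   # calls B.__init__

C()   # RuntimeError: maximum recursion depth exceeded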
EDIT: trying to better answer the question.
class A(object):
    def __init__(self, a):
        for cls in self.__class__.mro():
            if cls is not object:
                cls._init(self, a)

    def _init(self, a):
        print 'A._init'
        self.a = a

class B(A):
    def _init(self, a):
        print 'B._init'

class C(A):
    def _init(self, a):
        print 'C._init'

class D(B, C):
    def _init(self, a):
        print 'D._init'

d = D(3)
print d.a
prints:
D._init
B._init
C._init
A._init
3
(A modified version of the template method pattern.)
Now the parents' methods really are called implicitly, but I have to agree with the Zen of Python that explicit is better than implicit: the code is less readable and the gain is small. Also beware that all the _init methods must have the same parameters; you cannot completely forget about the parent classes, and I don't suggest trying to.
For single inheritance, a better approach is to call the parent's method explicitly, without invoking super. That way you don't have to name the current class, but you still have to care about which class is the parent.
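For example, a small sketch (illustrative names) of that explicit style:

# Single inheritance: call the parent class directly; the current class
# is never named, and self.__class__ is never consulted.
class A(object):
    def __init__(self, a):
        self.a = a

class B(A):
    def __init__(self, b, c):
        A.__init__(self, 1)   # explicit call on the parent, not via super()
        self.b = b
        self.c = c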
Good reads are: how-does-pythons-super-do-the-right-thing, the links suggested in that question, and in particular Python's Super is nifty, but you can't use it.
A class hierarchy that is likely to change is a symptom of bad design, and it has consequences for all the code that uses those classes; it should not be encouraged.
EDIT 2
Another example that comes to mind uses metaclasses. The Urwid library uses a metaclass to store an attribute, __super, on each class, so that you only need to access that attribute.
Ex:
>>> class MetaSuper(type):
...     """adding .__super"""
...     def __init__(cls, name, bases, d):
...         super(MetaSuper, cls).__init__(name, bases, d)
...         if hasattr(cls, "_%s__super" % name):
...             raise AttributeError, "Class has same name as one of its super classes"
...         setattr(cls, "_%s__super" % name, super(cls))
...
>>> class A:
...     __metaclass__ = MetaSuper
...     def __init__(self, a):
...         self.a = a
...         print 'A.__init__'
...
>>> class B(A):
...     def __init__(self, a):
...         print 'B.__init__'
...         self.__super.__init__(a)
...
>>> b = B(42)
B.__init__
A.__init__
>>> b.a
42
>>>
Perhaps what you are looking for is metaclasses?
class metawrap(type):
    def __new__(mcs, name, bases, dict):
        dict['bases'] = bases
        return type.__new__(mcs, name, bases, dict)

class A(object):
    def __init__(self):
        pass
    def test(self):
        print "I am class A"

class B(A):
    __metaclass__ = metawrap
    def __init__(self):
        pass
    def test(self):
        par = super(self.bases[0], self)
        par.__thisclass__.test(self)

foo = B()
foo.test()
Prints "I am class A"
What the metaclass does is override the creation of the class B itself (not its instances) and make sure that B's class dictionary contains a bases attribute (a tuple) where you can find all the base classes of B.
To my knowledge, the following isn't commonly done. But it does seem to work.
Methods in a given class definition always mangle double-underscore attributes to include the name of the class they're defined in. So, if you stash a reference to the class in name-mangled form where the instances can see it, you can use that in the call to super.
An example stashing the references on the object itself, by implementing __new__ on the baseclass:
def mangle(cls, name):
    if not name.startswith('__'):
        raise ValueError('name must start with double underscore')
    return '_%s%s' % (cls.__name__, name)

class ClassStasher(object):
    def __new__(cls, *args, **kwargs):
        obj = object.__new__(cls)
        for c in cls.mro():
            setattr(obj, mangle(c, '__class'), c)
        return obj

class A(ClassStasher):
    def __init__(self):
        print 'init in A', self.__class
        super(self.__class, self).__init__()

class B(A):
    def __init__(self):
        print 'init in B', self.__class
        super(self.__class, self).__init__()

class C(A):
    def __init__(self):
        print 'init in C', self.__class
        super(self.__class, self).__init__()

class D(B, C):
    def __init__(self):
        print 'init in D', self.__class
        super(self.__class, self).__init__()

d = D()
print d
And, doing a similar thing, but using a meta-class and stashing the __class references on the class objects themselves:
class ClassStasherType(type):
    def __init__(cls, name, bases, attributes):
        setattr(cls, mangle(cls, '__class'), cls)

class ClassStasher(object):
    __metaclass__ = ClassStasherType

class A_meta(ClassStasher):
    def __init__(self):
        print 'init in A_meta', self.__class
        super(self.__class, self).__init__()

class B_meta(A_meta):
    def __init__(self):
        print 'init in B_meta', self.__class
        super(self.__class, self).__init__()

class C_meta(A_meta):
    def __init__(self):
        print 'init in C_meta', self.__class
        super(self.__class, self).__init__()

class D_meta(B_meta, C_meta):
    def __init__(self):
        print 'init in D_meta', self.__class
        super(self.__class, self).__init__()

d = D_meta()
print d
Running this all together, as one source file:
% python /tmp/junk.py
init in D <class '__main__.D'>
init in B <class '__main__.B'>
init in C <class '__main__.C'>
init in A <class '__main__.A'>
<__main__.D object at 0x1004a4a50>
init in D_meta <class '__main__.D_meta'>
init in B_meta <class '__main__.B_meta'>
init in C_meta <class '__main__.C_meta'>
init in A_meta <class '__main__.A_meta'>
<__main__.D_meta object at 0x1004a4bd0>
Related
I have a class A which I want to inherit from; this class has a class method that can initialize a new instance from some data. I don't have access to the code for from_data and can't change the implementation of A.
I want to initialize new instances of class B using the same data I would pass to A's from_data method. In the solution I came up with, I create a new instance of A in __new__(...) and change its __class__ to B. __init__(...) can then further initialize the "new instance of B" as normal. It seems to work, but I'm not sure whether it has some sort of side effects.
So will this work reliably? Is there a proper way of achieving this?
class A:
    def __init__(self, alpha, beta):
        self.alpha = alpha
        self.beta = beta

    @classmethod
    def from_data(cls, data):
        obj = cls(*data)
        return obj

class B(A):
    def __new__(cls, data):
        a = A.from_data(data)
        a.__class__ = cls
        return a

    def __init__(self, data):
        pass

b = B((5, 3))
print(b.alpha, b.beta)
print(type(b))
print(isinstance(b, B))
Output:
5 3
<class '__main__.B'>
True
It could be that your use case is more abstract than I am understanding, but testing in a REPL, it seems that calling the parent class A constructor via super(), like this:
class A:
    ...  # as defined in the question

class B(A):
    def __init__(self, data):
        super().__init__(*data)

b = B((5, 3))
print(b.alpha, b.beta)
print(type(b))
print(isinstance(b, B))
also results in
5 3
<class '__main__.B'>
True
Is there a reason you don't want to call super() to instantiate a new instance of your child class?
Edit:
So, in case you need to use the from_data constructor... you could do something like
# ... class A as above

class B(A):
    def __init__(self, data):
        a_obj = A.from_data(data)
        for attr in a_obj.__dict__:
            setattr(self, attr, getattr(a_obj, attr))
That is really hacky though, and not guaranteed to copy every attribute of the A object, especially if __dict__ or the attribute access machinery has been overridden.
class A:
    def __init__(self, name):
        self.name = name

class B(A):
    def __init__(self, name, add):
        super().__init__(name)
        self.add = add

class C(A):
    def __init__(self, name, tel):
        super().__init__(name)
        self.tel = tel

class D(B, C):
    def __init__(self, name, add, tel, company):
        super().__init__(name, add)
        super().__init__(name, tel)
        self.company = company

d = D('Hank', 'ctm', 55514, 'google')
(The original post included a screenshot of the resulting error traceback.)
A resolution to this multiple inheritance is to cooperatively design the classes, see Raymond's article Python’s super() considered super!:
class A:
    def __init__(self, name, **kwargs):
        self.name = name

class B(A):
    def __init__(self, add, **kwargs):
        super().__init__(**kwargs)
        self.add = add

class C(A):
    def __init__(self, tel, **kwargs):
        super().__init__(**kwargs)
        self.tel = tel

class D(B, C):
    def __init__(self, company, **kwargs):
        super().__init__(**kwargs)
        self.company = company

d = D(name='Hank', add='ctm', tel=55514, company='google')
As pointed out by others, this will follow the MRO, i.e. D -> B -> C -> A.
The error is caused by the call to super().__init__(name) in the class B. Unlike single-inheritance languages such as Java, super() in Python doesn't necessarily give you the superclass; it gives you the next class in the method resolution order. The MRO for class D is this:
>>> D.__mro__
(<class '__main__.D'>,
<class '__main__.B'>,
<class '__main__.C'>,
<class '__main__.A'>,
<class 'object'>)
As you can see, the next class in the MRO after B is C, so the call to super().__init__(name) invokes C.__init__ with a single argument. However, the C.__init__ method expects two arguments, so calling it with just one argument results in the error you've seen.
I override a method, but would like to get hold of the parent's version of that method from within the parent.
>>> class a:
...     def __init__(self):
...         print(self.f)
...     def f(self):
...         pass
...
>>> class b(a):
...     def __init__(self):
...         super(b, self).__init__()
...     def f(self):
...         pass
...
>>> b()
<bound method b.f of <__main__.b object at 0x000002E297A96160>>
I'd like the printout to say a.f.
You could use name mangling to make self.__f refer to A.__f from within A's class definition.
Name mangling is helpful for letting subclasses override methods without breaking intraclass method calls.
class A:
    def __init__(self):
        self.__f()
    def f(self):
        print('A.f')
    __f = f   # Private copy of A's `f` method

class B(A):
    def __init__(self):
        super(B, self).__init__()
    def f(self):
        print('B.f')

b = B()
b.f()
prints
A.f
B.f
class a(object):
    def __init__(self):
        pass
    def f(self):
        print 'Parent Method...'

class b(a):
    def __init__(self):
        super(b, self).__init__()
        a.f(self)  # reference the parent class rather than the child class, because the child overrides the parent method
        self.f()
    def f(self):
        print "Childs Method..."

b()
Hi everyone. I have two classes, A and B, and I want to grab a method from B to use in A.
My code is along the lines of:
class A(object):
    def __init__(self, args):
        'blah'
    def func2(self, args):
        method = B.func1(args)
        # method = getattr(B, 'func1')   # the other call I tried

class B(object):
    def __init__(self):
        'do stuff'
    def func1(self, args):
        'Do stuff here'
        return
Is there a way to get func1 into A without removing the self parameter from func1?
Neither of the two method calls is working for me, and I keep getting a TypeError:
TypeError: unbound method func1 must be called with B instance as
first argument (got NoneType instance instead)
edit: found my solution
I found the solution to my question. When I passed values from B to A, I needed to pass my instance of B as well. So in my __init__ for A:
class A(object):
    def __init__(self, args, B_arg):
        ...

And in class B:

class B(object):
    def passattributes(self):
        c = A(args, self)
I'm going to guess that since you're trying to do what you're trying to do, the appropriate thing to do in your case is to define B.func1 as a classmethod, because you don't expect it to require an instance of class B.
@classmethod
def func1(cls, args):
    'blah'
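For example, roughly like this (a sketch with hypothetical method bodies; only the @classmethod idea comes from the suggestion above):

class B(object):
    @classmethod
    def func1(cls, args):
        # no B instance is required here
        return 'B.func1 got %r' % (args,)

class A(object):
    def func2(self, args):
        method = B.func1(args)   # works without creating a B()
        return method

print A().func2('some arg')   # prints: B.func1 got 'some arg'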
You could create an instance of B in the A class:
class A(object):
    def __init__(self):
        'blah'
        self.bInst = B()
    def func2(self, args):
        method = self.bInst.func1('func1')

class B(object):
    def __init__(self):
        'do stuff'
    def func1(self, args):
        print 'Do stuff here'
        return

aInst = A()
aInst.func2('some arg')
Result:
Do stuff here
Basically, what I want is to do this:
class B:
    def fn(self):
        print 'B'

class A:
    def fn(self):
        print 'A'

@extendInherit
class C(A, B):
    pass

c = C()
c.fn()
And have the output be
A
B
How would I implement the extendInherit decorator?
This is not a job for decorators. You want to completely change the normal behaviour of a class, so this is actually a job for a metaclass.
import types

class CallAll(type):
    """ MetaClass that adds methods to call all superclass implementations """
    def __new__(meta, clsname, bases, attrs):
        ## collect a list of functions defined on superclasses
        funcs = {}
        for base in bases:
            for name, val in vars(base).iteritems():
                if type(val) is types.FunctionType:
                    if name in funcs:
                        funcs[name].append(val)
                    else:
                        funcs[name] = [val]
        ## now we have all methods, so wrap each of them;
        ## a factory function is used so that each wrapper binds its own
        ## list of functions instead of closing over the loop variable
        def make_caller(funclist):
            def caller(self, *args, **kwargs):
                """ calls all baseclass implementations """
                for func in funclist:
                    func(self, *args, **kwargs)
            return caller
        for name in funcs:
            attrs[name] = make_caller(funcs[name])
        return type.__new__(meta, clsname, bases, attrs)
class B:
    def fn(self):
        print 'B'

class A:
    def fn(self):
        print 'A'

class C(A, B, object):
    __metaclass__ = CallAll

c = C()
c.fn()
A metaclass is a possible solution, but somewhat complex. super can do it very simply (with new style classes of course: there's no reason to use legacy classes in new code!):
class B(object):
    def fn(self):
        print 'B'
        try: super(B, self).fn()
        except AttributeError: pass

class A(object):
    def fn(self):
        print 'A'
        try: super(A, self).fn()
        except AttributeError: pass

class C(A, B): pass

c = C()
c.fn()
You need the try/except to support any order of single or multiple inheritance (since at some point there will be no further base along the method-resolution-order, MRO, defining a method named fn, you need to catch and ignore the resulting AttributeError). But as you see, differently from what you appear to think based on your comment to a different answer, you don't necessarily need to override fn in your leafmost class unless you need to do something specific to that class in such an override -- super works fine on purely inherited (not overridden) methods, too!
I personally wouldn't try doing this with a decorator since using new-style classes and super(), the following can be achieved:
>>> class A(object):
...     def __init__(self):
...         super(A, self).__init__()
...         print "A"
...
>>> class B(object):
...     def __init__(self):
...         super(B, self).__init__()
...         print "B"
...
>>> class C(A, B):
...     def __init__(self):
...         super(C, self).__init__()
...
>>> foo = C()
B
A
I'd imagine method invocations would work the same way.