Can I get common parameters from parent constructor? - python

I would like to know if I can use the parent constructor for passing parameters which will be needed for every subclass. So for instance:
class A:
    def __init__(self, a, b):
        ...do some stuff...

class B(A):
    def __init__(self, c, d):
        ...do some stuff needing a and b...

class C(A):
    def __init__(self, e, f, g):
        ...do some stuff needing a and b...
Basically there are some parameters each of my subclasses is going to want and some others which are specific. I don't want to have to add a,b to the definition of every subclass of A. Is there some way I can do this in python?
What I'd like to see is the ability to call:
b = B(a=1, b=2, c=3, d=4)
without having to include a and b in the subclass definition.
Many thanks!

# Python 3, but the idea is the same in 2
class A:
    def __init__(self, a, b):
        ...  # do the common setup with a and b

class B(A):
    def __init__(self, c, d, *args, **kwargs):
        super().__init__(*args, **kwargs)

class C(A):
    def __init__(self, e, f, g, *args, **kwargs):
        super().__init__(*args, **kwargs)
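With that in place, the call from the question works as written, assuming the elided bodies simply store their arguments (a quick illustrative check):

b = B(a=1, b=2, c=3, d=4)  # c and d bind by keyword; a and b land in **kwargs and reach A.__init__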

Correct way of returning new class object (which could also be extended)

I am trying to find a good way for returning a (new) class object in class method that can be extended as well.
I have a class (classA) which has among other methods, a method that returns a new classA object after some processing
class classA:
    def __init__(self): ...

    def methodX(self, **kwargs):
        # process data
        return classA(new params)
Now, I am extending this class to another classB. I need methodX to do the same, but return classB this time, instead of classA
class classB(classA):
    def __init__(self, params):
        super().__init__(params)
        self.newParams = XYZ

    def methodX(self, **kwargs):
        ???
This may be something trivial but I simply cannot figure it out. In the end I dont want to rewrite the methodX each time the class gets extended.
Thank you for your time.
Use the __class__ attribute like this:
class A:
    def __init__(self, **kwargs):
        self.kwargs = kwargs

    def methodX(self, **kwargs):
        # do stuff with kwargs
        return self.__class__(**kwargs)

    def __repr__(self):
        return f'{self.__class__}({self.kwargs})'

class B(A):
    pass

a = A(foo='bar')
ax = a.methodX(gee='whiz')
b = B(yee='haw')
bx = b.methodX(cool='beans')
print(a)
print(ax)
print(b)
print(bx)
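For reference, the output looks roughly like this (the module name in the repr depends on where the script is run):

# <class '__main__.A'>({'foo': 'bar'})
# <class '__main__.A'>({'gee': 'whiz'})
# <class '__main__.B'>({'yee': 'haw'})
# <class '__main__.B'>({'cool': 'beans'})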
class classA:
    def __init__(self, x):
        self.x = x

    def createNew(self, y):
        t = type(self)
        return t(y)

class classB(classA):
    def __init__(self, params):
        super().__init__(params)

a = classA(1)
newA = a.createNew(2)
b = classB(1)
newB = b.createNew(2)
print(type(newB))
# <class '__main__.classB'>
I want to propose what I think is the cleanest approach, albeit similar to existing answers. The problem feels like a good fit for a class method:
class A:
    @classmethod
    def method_x(cls, **kwargs):
        return cls(<init params>)
Using the @classmethod decorator ensures that the first input (traditionally named cls) refers to the class to which the method belongs, rather than to the instance.
(Usually we call the first method input self, and it refers to the instance to which the method belongs.)
Because cls refers to A, rather than an instance of A, we can call cls() as we would call A().
However, in a class that inherits from A, cls will instead refer to the child class, as required:
class A:
    def __init__(self, x):
        self.x = x

    @classmethod
    def make_new(cls, **kwargs):
        y = kwargs["y"]
        return cls(y)  # returns A(y) here

class B(A):
    def __init__(self, x):
        super().__init__(x)
        self.z = 3 * x

inst = B(1).make_new(y=7)
print(inst.x, inst.z)
And now you can expect that print statement to produce 7 21.
That inst.z exists should confirm for you that the make_new call (which was only defined on A and inherited unaltered by B) has indeed made an instance of B.
However, there's something I must point out. Inheriting the unaltered make_new method only works because the __init__ method on B has the same call signature as the method on A. If this weren't the case then the call to cls might have had to be altered.
This can be circumvented by allowing **kwargs on the __init__ method and passing generic **kwargs into cls() in the parent class:
class A:
    def __init__(self, **kwargs):
        self.x = kwargs["x"]

    @classmethod
    def make_new(cls, **kwargs):
        return cls(**kwargs)

class B(A):
    def __init__(self, x, w):
        super().__init__(x=x)
        self.w = w

inst = B(1, 2).make_new(x="spam", w="spam")
print(inst.x, inst.w)
Here we were able to give B a different (more restrictive!) signature.
This illustrates a general principle, which is that parent classes will typically be more abstract/less specific than their children.
It follows that, if you want two classes that substantially share behaviour but do specific, different things, it is often better to create three classes: one fairly abstract class that defines the shared behaviour, and two children that provide the specific behaviours you want.
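As a rough sketch of that layout (the class names here are made up for illustration):

class Shape:  # the abstract-ish parent: behaviour in common
    def __init__(self, size):
        self.size = size

    def describe(self):
        return f"{type(self).__name__} of size {self.size}"

class Square(Shape):  # one specific child
    def area(self):
        return self.size ** 2

class Circle(Shape):  # another specific child
    def area(self):
        return 3.14159 * self.size ** 2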

When calling super().__init__ how can I tell if the super init is object.__init__ in python3?

Given an arbitrary class inheritance
how do I find out if super().__init__ == object.__init__?
Description + Example
I have code that I'm not allowed to touch. It defines the classes A, B, C, CombinedCba, and CombinedAc, and each class has an unusual __init__ that validates instance properties.
When the __init__ calls propagate through all of the base classes, we get the error:
TypeError: object.__init__() takes exactly one argument (the instance to initialize)
To prevent that error, the chain should stop calling the super __init__ once it is object.__init__.
I am able to edit the super_init function. How do I detect when the super __init__ is object.__init__? If I can detect that, I can skip the next __init__ call and eliminate the error.
# code that I can edit
def super_init(self, super_instance, *args, **kwargs):
    # checking super_instance.__init__ == object.__init__ OR
    # super_instance.__init__ is object.__init__ doesn't work
    # pseudo-code
    if super_instance.__init__ is not object.__init__:
        super_instance.__init__(*args, **kwargs)
# auto-generated code from here on down
class A:
    def __init__(self, *args, **kwargs):
        self.a = kwargs['a']
        assert self.a == 'a'
        super_init(self, super(), *args, **kwargs)

class B:
    def __init__(self, *args, **kwargs):
        self.b = kwargs['b']
        self.some_num = kwargs['some_num']
        assert self.some_num <= 30
        super_init(self, super(), *args, **kwargs)

class C:
    def __init__(self, *args, **kwargs):
        self.some_num = kwargs['some_num']
        assert self.some_num >= 10
        super_init(self, super(), *args, **kwargs)

class CombinedCba(C, B, A):
    pass

combo_cba = CombinedCba(a='a', b='b', some_num=25)

class CombinedAc(A, C):
    pass

combo_ac = CombinedAc(a='a', some_num=15)
The only way I was able to get this to work was to build a new temporary class containing the remaining superclasses, then check the __init__ method of that class.
def super_init(self, super_instance, *args, **kwargs):
    classes_in_order = self.__class__.__mro__
    for i, cls in enumerate(classes_in_order):
        if cls == super_instance.__thisclass__:
            remainder_cls = type('_unused', classes_in_order[i+1:], {})
            super_init_is_object_init = remainder_cls.__init__ == object.__init__
            if not super_init_is_object_init:
                super_instance.__init__(*args, **kwargs)
These attempts didn't work:
checking super().__init__ == object.__init__ did not work
checking super().__init__ is object.__init__ did not work
checking super(super().__thisclass__, self) vs object.__init__ did not work
introspecting the function signatures with inspect.signature did not work
First, define A, B, and C to use super correctly:
class A:
    def __init__(self, a, **kwargs):
        super().__init__(**kwargs)
        assert a == 'a'
        self.a = a

class B:
    def __init__(self, b, some_num, *args, **kwargs):
        super().__init__(**kwargs)
        self.b = b
        self.some_num = some_num
        assert self.some_num <= 30

class C:
    def __init__(self, some_num, **kwargs):
        super().__init__(**kwargs)
        self.some_num = some_num
        assert self.some_num >= 10
In particular, note that both B and C claim "ownership" of some_num, without worrying yet that another class might make use of it.
Next, define a mix-in class that does nothing but ensure that some_num is used to set the some_num attribute.
class SomeNumAdaptor:
    def __init__(self, some_num, **kwargs):
        self.some_num = some_num
        super().__init__(**kwargs)
Third, define wrappers for B and C that get the value of some_num from self in order to add it back as a keyword argument (which SomeNumAdaptor stripped):
class CWrapper(C):
    def __init__(self, **kwargs):
        super().__init__(some_num=self.some_num, **kwargs)

class BWrapper(B):
    def __init__(self, **kwargs):
        super().__init__(some_num=self.some_num, **kwargs)
This means that both B and C will "reset" the value of self.some_num.
(A wrapper isn't necessary if you can also modify B and C to make some_num optional and check for the existence of self.some_num.)
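A minimal sketch of that alternative, assuming C may be modified (the hasattr check takes the place of the wrapper):

class C:
    def __init__(self, some_num=None, **kwargs):
        super().__init__(**kwargs)
        if some_num is None and hasattr(self, 'some_num'):
            some_num = self.some_num  # fall back to the value set earlier in the MRO
        self.some_num = some_num
        assert self.some_num >= 10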
Finally, define your combination classes in terms of SomeNumAdaptor and the wrapper classes. You must inherit from SomeNumAdaptor first, to ensure that BWrapper and CWrapper find some_num as an attribute, regardless of their relative ordering.
class CombinedAbc(SomeNumAdaptor, A, BWrapper, CWrapper):
    pass

class CombinedCba(SomeNumAdaptor, CWrapper, BWrapper, A):
    pass

combo_cba = CombinedCba(a='a', b='b', some_num=25)
combo_abc = CombinedAbc(a='a', b='b', some_num=15)
All of the above assumes that neither B nor C stores a modified value of its some_num argument in the attribute. If one of them does, you'll need more complicated wrappers to handle it, likely doing more than simply passing the received value to __init__. Note that this could indicate a more fundamental problem with inheriting from both B and C at the same time.

Python inheritance structure and arguments

I am trying to design a class structure that allows the user to define their own class that overloads predefined methods in other classes. In this case the user would create the C class to overload the "function" method in D. The user created C class has common logic for other user created classes A and B so they inherit from C to overload "function" but also inherit from D to use D's other methods. The issue I am having is how to pass "value" from A and B to D and ignore passing it to C. What I currently have written will produce an error as C does not have "value" as an argument.
I know that I can add "value" (or *args) to C's init method and the super call but I don't want to have to know what inputs other classes need in order to add new classes to A and B. Also, if I swap the order of C and D I won't get an error but then I don't use C's overloaded "function". Is there an obvious way around this?
class D(SomethingElse):
    def __init__(self, value, **kwargs):
        super(D, self).__init__(**kwargs)
        self.value = value

    def function(self):
        return self.value

    def other_method(self):
        pass

class C(object):
    def __init__(self):
        super(C, self).__init__()

    def function(self):
        return self.value*2

class B(C, D):
    def __init__(self, value, **kwargs):
        super(B, self).__init__(value, **kwargs)

class A(C, D):
    def __init__(self, value, **kwargs):
        super(A, self).__init__(value, **kwargs)

a = A(3)
print(a.function())
>>> 6
Essentially, there are two things you need to do to make your __init__ methods play nice with multiple inheritance in Python:
Always take a **kwargs parameter, and always call super().__init__(**kwargs), even if you think you are the base class. Just because your superclass is object doesn't mean you are last (before object) in the method resolution order.
Don't pass your parent class's __init__ arguments explicitly; only pass them via **kwargs. Your parent class isn't necessarily the next one after you in the method resolution order, so positional arguments might be passed to the wrong other __init__ method.
This is called "co-operative subclassing". Let's try with your example code:
class D:
    def __init__(self, value, **kwargs):
        self.value = value
        super().__init__(**kwargs)

    def function(self):
        return self.value

class C:
    # add **kwargs parameter
    def __init__(self, **kwargs):
        # pass kwargs to super().__init__
        super().__init__(**kwargs)

    def function(self):
        return self.value * 2

class B(C, D):
    # don't take parent class's value arg explicitly
    def __init__(self, **kwargs):
        # pass value arg via kwargs
        super().__init__(**kwargs)

class A(C, D):
    # don't take parent class's value arg explicitly
    def __init__(self, **kwargs):
        # pass value arg via kwargs
        super().__init__(**kwargs)
Demo:
>>> a = A(value=3)
>>> a.value
3
>>> a.function()
6
Note that value must be passed to the A constructor as a keyword argument, not as a positional argument. It's also recommended to set self.value = value before calling super().__init__.
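A quick illustration of that point (the exact error text varies slightly between Python versions):

>>> A(3)
TypeError: __init__() takes 1 positional argument but 2 were given
>>> A(value=3).function()
6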
I've also simplified class C(object): to class C:, and super(C, self) to just super() since these are equivalent in Python 3.
So I'm trying to understand the point of A AND B. I'm guessing that maybe you want to mix in the superclass behavior and sometimes have local behavior. So suppose A is just mixing together behaviors, and B has some local behavior and state.
If you don't need your own state, you probably don't need an __init__. So for A and C just omit __init__.
class SomethingElse(object):
    def __init__(self, *args, **kwargs):
        self.args = args
        self.kwargs = kwargs

class D(SomethingElse):
    def __init__(self, value, *args, **kwargs):
        super(D, self).__init__(*args, **kwargs)
        self.value = value

    def function(self):
        return self.value

    def other_method(self):
        return self.__dict__

class C(object):
    #def __init__(self):
    #    super(C, self).__init__()

    def function(self):
        return self.value*2

class B(C, D):
    def __init__(self, value, bstate, *args, **kwargs):
        super(B, self).__init__(value, *args, **kwargs)
        self.bstate = bstate

    def __repr__(self):
        return (self.__class__.__name__ + ' ' +
                self.bstate + ' ' + str(self.other_method()))

class A(C, D):
    pass
>>> a = A(3)
>>> b = B(21, 'extra')
>>> a.function()
6
>>> b.function()
42
>>> repr(a)
'<xx.A object at 0x107cf5e10>'
>>> repr(b)
"B extra {'args': (), 'bstate': 'extra', 'value': 21, 'kwargs': {}}"
I've kept Python 2 syntax in case you're still using it, but as another answer points out, Python 3 simplifies the super() syntax, and you really should be using Python 3 by now.
If you swap C and D you are changing the python method resolution order, and that will indeed change the method to which a call to A.function resolves.
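If in doubt, you can print the order Python will actually use; for the classes above it comes out roughly as follows:

print(A.__mro__)
# roughly (A, C, D, SomethingElse, object), so C.function shadows D.function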

Maintaining readability when using super() for direct multiple inheritance

For the case of the most basic multiple inheritance:
class A:
    def __init__(self, a):
        self.a = a

class B:
    def __init__(self, b):
        self.b = b

class C(A, B):
    def __init__(self, a, b):
        A.__init__(self, a)
        B.__init__(self, b)
I do not see why super() should be used here. I suppose you could implement it with kwargs, but that is surely less readable than the above method. I have yet to find any answers on Stack Overflow in favour of this method, yet surely for this case it is the most satisfactory?
There are a lot of questions marked as duplicate on this topic, but no satisfactory answers for this exact case. This question addresses multiple inheritance and the use of super() for a diamond inheritance. In this case there is no diamond inheritance and neither parent class have any knowledge of each other, so they shouldn't need to call super() like this suggests.
This answer deals with the use of super in this scenario but without passing arguments to __init__ like is done here, and this answer deals with passing arguments but is again a diamond inheritance.
One correct way to use super here would be
class A:
    def __init__(self, a, **kwargs):
        super().__init__(**kwargs)
        self.a = a

class B:
    def __init__(self, b, **kwargs):
        super().__init__(**kwargs)
        self.b = b

class C1(A, B):
    pass

class C2(A, B):
    def __init__(self, a, b, **kwargs):
        super().__init__(a=a, b=b, **kwargs)

c1 = C1(a="foo", b="bar")
c2 = C2(a="foo", b="bar")
The method resolution order for C1 (and C2) is [C1, A, B, object]. Each time super() is called, it returns a proxy for the next class in the MRO, based on where super() is called at the time.
You have two options when defining C, depending on whether you want to give C.__init__ a signature that mentions the two arguments A and B require for initialization. With C1, C1.__init__ is not defined, so A.__init__ will be called instead. With C2, you need to explicitly call the next __init__ method in the chain.
C, knowing that it is a subclass of A and B, has to at least provide the expected arguments for the known upstream __init__ methods.
A.__init__ will pass on everything except a to the next class's __init__ method.
B.__init__ will pass on everything it receives except b.
object.__init__ will finally be called, and assuming all previous classes correctly removed the keyword arguments they introduced, will receive no additional arguments.
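One way to watch that hand-off is to add a print to each __init__ (an illustrative variant of the classes above):

class A:
    def __init__(self, a, **kwargs):
        print("A consumed a, forwarding", kwargs)
        super().__init__(**kwargs)
        self.a = a

class B:
    def __init__(self, b, **kwargs):
        print("B consumed b, forwarding", kwargs)
        super().__init__(**kwargs)
        self.b = b

class C1(A, B):
    pass

C1(a="foo", b="bar")
# A consumed a, forwarding {'b': 'bar'}
# B consumed b, forwarding {}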
Changing the order in which the various __init__s are called means changing the MRO, which means altering the order of the base classes. If you want more control than that, then cooperative multiple inheritance is not for you.
class A(object):
    def __init__(self, *args, **kwargs):
        super(A, self).__init__(*args, **kwargs)
        self.a = kwargs['a']

class B(object):
    def __init__(self, *args, **kwargs):
        super(B, self).__init__()
        self.b = kwargs['b']

class C(A, B):
    def __init__(self, *args, **kwargs):
        super(C, self).__init__(*args, **kwargs)

z = C(a=1, b=2)
z.b  # 2

Should I be using abstract methods in this Python scenario?

I'm not sure my approach is good design and I'm hoping I can get a tip. I'm thinking somewhere along the lines of an abstract method, but in this case I want the method to be optional. This is how I'm doing it now...
from pymel.core import *

class A(object):
    def __init__(self, *args, **kwargs):
        if callable(self.createDrivers):
            self._drivers = self.createDrivers(*args, **kwargs)
            select(self._drivers)

class B(A):
    def createDrivers(self, *args, **kwargs):
        c1 = circle(sweep=270)[0]
        c2 = circle(sweep=180)[0]
        return c1, c2

b = B()
In the above example I'm just creating two circle arcs in PyMEL for Maya, but I fully intend to create more subclasses that may or may not have a createDrivers method at all! So I want it to be optional, and I'm wondering whether my approach could be improved.
You still have a problem: when you write another subclass of A that doesn't implement createDrivers, A.__init__ will reach the line callable(self.createDrivers) and raise an AttributeError, because createDrivers doesn't exist. If I were you, I would do it like this:
class A(object):
    def __init__(self, *args, **kwargs):
        try:
            self._drivers = self.createDrivers(*args, **kwargs)
            select(self._drivers)
        except NotImplementedError:
            pass

    def createDrivers(self, *args, **kwargs):
        raise NotImplementedError("This class wasn't implemented")

class B(A):
    def createDrivers(self, *args, **kwargs):
        c1 = circle(sweep=270)[0]
        c2 = circle(sweep=180)[0]
        return c1, c2

class C(A):
    pass
Another way is to replace callable(self.createDrivers) by hasattr(self, 'createDrivers').
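A minimal sketch of that variant, keeping everything else from the question as-is (select() comes from pymel.core, as in the original code):

class A(object):
    def __init__(self, *args, **kwargs):
        if hasattr(self, 'createDrivers'):
            self._drivers = self.createDrivers(*args, **kwargs)
            select(self._drivers)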
I would do this:
class A(object):
    def __init__(self, *args, **kwargs):
        self.createDrivers(*args, **kwargs)

    def createDrivers(self, *args, **kwargs):
        "Override"
        pass

class B(A):
    def createDrivers(self, *args, **kwargs):
        self._drivers = blabla
If you want createDrivers to be optional but still always present, the best choice is not an abstract method; instead, implement it in the base class as a no-op.
class A(object):
    def __init__(self, *args, **kwargs):
        self._drivers = self.createDrivers(*args, **kwargs)
        select(self._drivers)

    def createDrivers(self, *args, **kwargs):
        """This should be overridden by subclasses if they need custom drivers"""
        pass
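Illustrative subclasses against that no-op base (circle() is PyMEL's, as in the question; if createDrivers can return None, the select() call in __init__ may need a guard):

class B(A):
    def createDrivers(self, *args, **kwargs):
        c1 = circle(sweep=270)[0]
        c2 = circle(sweep=180)[0]
        return c1, c2

class C(A):
    pass  # inherits the no-op; no createDrivers needed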
