PyLint W0231 super-init-not-called with __init__ of grandparent - python

I have a Child class that uses methods from the Parent class which are not present in GrandParent. But in Child.__init__() I don't want some of the side effects of Parent.__init__(), so instead I call GrandParent.__init__() directly:
class GrandParent():
    ...

class Parent(GrandParent):
    def __init__(self):
        GrandParent.__init__(self)
    ...

class Child(Parent):
    def __init__(self):
        GrandParent.__init__(self)
    ...
With this code I get the following warnings from pylint:
W0233: __init__ method from a non direct base class 'GrandParent' is called (non-parent-init-called)
W0231: __init__ method from base class 'Parent' is not called (super-init-not-called)
What is the correct way to fix these warnings?
(Or is disabling them for Child the proper solution?)
class Child(Parent):
    def __init__(self):  # pylint: disable=W0233
        GrandParent.__init__(self)  # pylint: disable=W0231
    ...

You can try multiple ways such as:
GrandParent.__init__(self)
super().super().__init__(self)
super(GrandParent, self).__init__()
but the warning will still be there, either as super-init-not-called, or as bad-super-call (for super(GrandParent, self)...), or even as a useless super call to the immediate parent.
I suppose it's meant as a warning against forgetting to initialize the immediate parent class in the __mro__ by mistake when working with inheritance, and you can turn it off with a pylint disable:
# pylint: disable=super-init-not-called
but such calls still mess with the __mro__ chain.
What you can do instead is let the call cascade upward through the items of Child.__mro__, i.e. Child -> Parent -> GrandParent (-> object), while skipping Parent's own initialization by checking the class reference in its __init__:
class Parent(GrandParent):
    def __init__(self):
        super().__init__()
        if self.__class__ == Child:
            return
        # Parent-specific initialization continues here
This does not trigger the PyLint warnings and achieves the desired behavior explicitly, without skipping any required items in the __mro__.
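For illustration, a minimal runnable sketch of that approach, reusing the class names from the question (the parent_only attribute is a made-up stand-in for whatever Parent-specific setup the Child wants to avoid):

class GrandParent:
    def __init__(self):
        print("> GrandParent __init__()")

class Parent(GrandParent):
    def __init__(self):
        super().__init__()
        if self.__class__ == Child:
            return                  # Child instances skip the Parent-only setup below
        self.parent_only = True     # hypothetical Parent-specific side effect

class Child(Parent):
    def __init__(self):
        super().__init__()          # goes through Parent, which bails out early for Child

Parent()  # > GrandParent __init__()   (parent_only gets set)
Child()   # > GrandParent __init__()   (parent_only is never set)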
IMO it's much cleaner to preserve the __mro__, which seems to be PyLint's intent with the super-init-not-called warning as well. On the other hand, it's kind of tricky and hides the logic in a completely different class, which will make debugging annoying in the future if some properties from Parent turn out to be missing and no comment or disabled warning is present in the Child class.
If you only want to pull the methods out of the GrandParent class, simply go with "mixin" classes, which is what Django REST Framework uses, and don't write any __init__ for them (or if you do, don't forget super().__init__() to continue the __mro__ chain!).
class GrandParent:
    def __init__(self):
        print("> GrandParent __init__()")

class Mixin:
    def mymethod(self):
        print("Hey")

class Parent(Mixin, GrandParent):
    def __init__(self):
        print("> Parent __init__()")
        super().__init__()

class Child(Mixin, GrandParent):
    def __init__(self):
        print("> Child __init__()")
        super().__init__()

Child().mymethod()
# > Child __init__()
# > GrandParent __init__()
# Hey

Parent().mymethod()
# > Parent __init__()
# > GrandParent __init__()
# Hey
This is basically the same situation you have with Child -> Parent -> GrandParent, but Mixin itself doesn't define (any useful) __init__, so only your code from GrandParent executes. It achieves visually the same thing as if self.__class__ == Child: return, but without any hacks and with increased readability.

Related

how to define a "middleman" class's init to successfully passthrough args from the grandchild to the parent in python 3

I have semi-abstract Parent and Middle classes and some fully implemented Grandchild classes that inherit from each other. While Parent and Grandchild need some __init__ args, the middle one exists only for shared implementation code:
from abc import ABC

class Parent:
    def __init__(self, some_arg):
        ...

class Middle(Parent, ABC):
    def do_something_any_middle_can_do(self):
        ...

class Grandchild(Middle):
    def __init__(self, some_arg):
        super().__init__(some_arg)
        ...
As you can see, the super().__init__(some_arg) in the Grandchild would call the default __init__ in the Middle, and not send some_arg to the Parent.
So far I have thought to use **kwargs, but that requires the authors of any new Grandchild to explicitly name the args in super().__init__(some_arg=some_arg), and if they don't, unexpected things might happen without a good error message:
class Parent:
    def __init__(self, some_arg):
        ...

class Middle(Parent, ABC):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        ...

    def do_something_any_middle_can_do(self):
        ...

class Grandchild(Middle):
    def __init__(self, some_arg):
        super().__init__(some_arg=some_arg)
        ...
"As you can see, the super().__init__(some_arg) in the Grandchild would call the default __init__ in the Middle, and not send some_arg to the Parent."
That's not correct. There is no "default" __init__ method in Middle.
super().__init__ refers to the __init__ attribute of the next class in self's method resolution order that has a defined __init__ method. Since Middle.__init__ is not defined, that means Parent.__init__ is called immediately.
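For instance, with the classes from the question, the lookup order can be inspected directly:

print([c.__name__ for c in Grandchild.__mro__])
# ['Grandchild', 'Middle', 'Parent', 'ABC', 'object']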
The advice given in Python's super considered super! is to use keyword arguments to avoid conflict between which classes define which positional parameters. (It's the responsibility of the class designer to know which keyword arguments each ancestor expects, and to resolve any conflicts between ancestors that use the same name for different purposes.)
from abc import ABC

class Parent:
    def __init__(self, *, some_arg, **kwargs):
        super().__init__(**kwargs)
        ...

class Middle(Parent, ABC):
    def do_something_any_middle_can_do(self):
        ...

class Grandchild(Middle):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        ...

g = Grandchild(some_arg=3)
Grandchild.__init__ doesn't need to "advertise" some_arg; it accepts it as an arbitrary keyword argument and passes it up the chain (via super().__init__). Eventually, some class (Parent, in this case) has a defined parameter to accept it; otherwise, the argument will be passed to object.__init__ where an exception will be raised.
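For example, a keyword argument that no class in the chain accepts ends up at object.__init__, which rejects it (the exact wording of the error varies a little between Python versions):

Grandchild(some_arg=3, bogus=1)
# TypeError: object.__init__() takes exactly one argument (the instance to initialize)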

Python Multiple Inheritance child(parent1, parent2) access parent2 __init__

Let's say I have this class:
class child(parent1, parent2):
    pass
Would it be possible to access parent2.__init__ if parent1 also has a defined __init__?
Here is my complete code.
class parent1:
    def __init__(self):
        self.color = "Blue"

class parent2:
    def __init__(self):
        self.figure = "Triangle"

class child(parent1, parent2):
    pass

juan = child()

try:
    print(juan.color)
except Exception as e:
    print(e)

try:
    print(juan.figure)
except Exception as e:
    print(e)

print(juan.__dict__)
I've tried with
class child(parent1, parent2):
    def __init__(self):
        super(parent2).__init__()
but maybe I am missing something?
Thanks.
Regards.
parent1 and parent2, if they are expected to be used in a cooperative multiple inheritance setting, should each call super():
class parent1:
    def __init__(self):
        super().__init__()
        self.color = "Blue"

class parent2:
    def __init__(self):
        super().__init__()
        self.figure = "Triangle"
When you define child, its method resolution order determines which __init__ gets called first, as well as which class super() refers to each time it gets called. In your actual example,
class child(parent1, parent2):
    pass
parent1.__init__ is called first (since child does not override __init__), and its use of super() refers to parent2. If you had instead defined
class child2(parent2, parent1):
    pass
then parent2.__init__ would be called first, and its use of super() would refer to parent1.
super() is used not to ensure that object.__init__ (which doesn't do anything) is called, but rather object.__init__ exists so that it can be called once a chain of super() calls reaches the end. (object.__init__ itself does not use super, as it is guaranteed to be the last class in the method resolution order of any other class.)
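Putting it together with the question's code, a quick sketch of the expected result once both parents call super():

juan = child()
print(juan.color)     # Blue
print(juan.figure)    # Triangle
print(juan.__dict__)  # {'figure': 'Triangle', 'color': 'Blue'}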

Assert child attribute is property from parent class

I would like for child classes to have certain properties, but since you can't force child methods to inherit the property decorator, I want to at least assert that these attributes of the child are properties of the class.
Something like this:
class Parent:
    def __init__(self):
        assert isinstance(self.foo, property)

    def foo(self):
        raise NotImplementedError

class Child(Parent):
    def __init__(self):
        super().__init__()

    @property
    def foo(self):
        return 'bar'
But of course by the time Parent.__init__() is run, self.foo has already become 'bar' and there is an AssertionError. Is there a way to accomplish what I'm going for using metaclasses? If so, can it be the Parent class that carries the metaclass, not the Child?
I found a solution. Instead of testing whether self.foo is a property, I tested whether it was not a bound method:
from inspect import ismethod

class Parent:
    def __init__(self):
        assert not ismethod(self.foo)

    def foo(self):
        raise NotImplementedError
This will work for most cases, but it will fail if the value returned by the Child.foo property is itself a bound method. Still open to more complete answers.
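As a side note (a sketch of my own, not part of the original answer): looking the attribute up on the class rather than the instance avoids both problems, because the property object itself is found and its getter never runs:

class Parent:
    def __init__(self):
        # look foo up on the class, so the descriptor is inspected instead of its value
        assert isinstance(type(self).foo, property), "foo must be overridden as a property"

    def foo(self):
        raise NotImplementedError

class Child(Parent):
    @property
    def foo(self):
        return 'bar'

Child()  # passes; a subclass that leaves foo as a plain method would fail the assert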

AttributeError not generated in case of passing method reference

I have one simple question about Python 2.7:
I have created an abstract base class and a child class:
from abc import ABCMeta, abstractmethod

class Base:
    """
    Abstract base class for all entities.
    """
    __metaclass__ = ABCMeta

    def __init__(self, name):
        self.name = name

    def send_data(self):
        self.send_data()

class Child(Base):
    def __init__(self, name):
        super(Child, self).__init__(name=name)
When an object of the child class is created and the send_data method is called, I get the following error, which is the expected behavior:
sample = Child('test')
sample.send_data()
# …
# RuntimeError: maximum recursion depth exceeded
But when send_data only references the method in the base class instead of calling it, and the call is made to send_data on a child class object, I think the expected behavior is to receive an AttributeError, but I am surprised to see that no error is generated. Please explain.
from abc import ABCMeta, abstractmethod

class Base:
    """
    Abstract base class for all entities.
    """
    __metaclass__ = ABCMeta

    def __init__(self, name, parent):
        self.name = name
        self.parent = parent

    def send_data(self):
        self.send_data

sample = Child('test')
sample.send_data()
# No error
In your first example you simply created a recursive function:
def send_data(self):
    self.send_data()
This calls itself, without end, and that's why you end up with a recursion depth exception.
Your second example doesn't actually call the method:
def send_data(self):
    self.send_data
The only difference here is that you forgot to use ().
None of this has anything to do with abstract base classes or inheritance. You didn't mark the send_data function as abstract, and even if you did, all that using abstractmethod does is make it impossible to create an instance of a class without a concrete implementation to replace it.
You won't get an AttributeError just because you defined a method on a class that uses ABCMeta. And note that methods are just attributes on a class; they don't live in a separate namespace. self.send_data references the bound method, not some other, separate attribute. Referencing the method without calling it otherwise does nothing.
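For illustration, a small sketch of what marking a method abstract actually buys you (Python 3 syntax for brevity, not taken from the question):

from abc import ABC, abstractmethod

class Base(ABC):
    @abstractmethod
    def send_data(self):
        ...

class Child(Base):
    pass

Child()  # TypeError: Can't instantiate abstract class Child ... (send_data not implemented)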

How can I add to the initial definition of a python class inheriting from another class?

I'm trying to define self.data inside a class that inherits from another class:
class Object():
    def __init__(self):
        self.data = "1234"

class New_Object(Object):
    # Code changing self.data here
But I ran into an issue.
class Object():
    def __init__(self):
        self.data = "1234"
So I have the beginning class here, which is imported from elsewhere, and let's say that the class is a universal one so I can't modify the original at all.
In the original, the instance is referred to as self inside the class, and self.data is defined inside its __init__ definition.
class New_Object(Object):
    # Code changing self.data here
So if I wanted to inherit from the class Object but define self.data inside New_Object, I thought I would have to define __init__ in New_Object, but this overrides the __init__ from Object.
Is there any way I could do this without copypasting the __init__ from Object?
You use super to call the original implementation.
class New_Object(Object):
    def __init__(self):
        super(New_Object, self).__init__()
        self.info = 'whatever'
That's what super is for:
class NewObject(Object):
    def __init__(self):
        super(NewObject, self).__init__()
        # self.data exists now, and you can modify it if necessary
You can use super().__init__() to call Object.__init__() from New_Object.__init__().
What you would do:
class Object:
    def __init__(self):
        print("Object init")
        self.data = "1234"

class New_Object(Object):
    def __init__(self):
        print("calling super")
        super().__init__()
        print("data is now", self.data)
        self.data = self.data.split("3")

o = New_Object()
# calling super
# Object init
# data is now 1234
Note that you do not have to give any arguments to super(), as long as you are using Python 3.
The answer is that you call the superclass's __init__ explicitly during the subclass's __init__. This can be done in either of two ways:
Object.__init__(self) # requires you to name the superclass explicitly
or
super(NewObject, self).__init__() # requires you to name the subclass explicitly
The latter also requires you to ensure that you're using "new-style" classes: in Python 3 that's always the case, but in Python 2 you must be sure to inherit from the builtin object class. In Python 3 it can actually be expressed even more simply:
super().__init__()
Personally, in most of my code the "disadvantage" of having to name the superclass explicitly is no disadvantage at all, and Object.__init__() lends transparency since it makes it absolutely clear what is being called. This is because most of my code is single-inheritance only. The super route comes into its own when you have multiple inheritance. See What does 'super' do in Python?
Python 2 example:
class Object(object):
    def __init__(self):
        self.data = "1234"

class NewObject(Object):
    def __init__(self):
        # subclass-specific stuff
        super(NewObject, self).__init__()
        # more subclass-specific stuff
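As a rough illustration of the earlier point about super() and multiple inheritance (a minimal diamond-inheritance sketch of my own, Python 3 syntax): with cooperative super() calls, each __init__ in the hierarchy runs exactly once, which is easy to get wrong with explicit ClassName.__init__(self) calls.

class Base:
    def __init__(self):
        print("Base init")

class Left(Base):
    def __init__(self):
        super().__init__()
        print("Left init")

class Right(Base):
    def __init__(self):
        super().__init__()
        print("Right init")

class Both(Left, Right):
    pass

Both()
# Base init
# Right init
# Left init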
