Python constructors in an ABC inheritance chain for interfaces

I've attempted to create a Python interface class hierarchy that looks something like:
from abc import ABCMeta, abstractmethod

class Axis(object, metaclass=ABCMeta):
    def __init__(self):
        # Do stuff...
        pass

class LinearAxis(Axis, metaclass=ABCMeta):
    @abstractmethod
    def move_linear(self, move_um):
        pass

    def __init__(self):
        # Do stuff...
        Axis.__init__(self)

class RotationalAxis(Axis, metaclass=ABCMeta):
    @abstractmethod
    def move_rotate(self, move_degree):
        pass

    def __init__(self):
        # Do stuff...
        Axis.__init__(self)

class XAxis(LinearAxis, metaclass=ABCMeta):
    def __init__(self):
        # Do stuff...
        LinearAxis.__init__(self)
So basically it's an interface sort of like that, with a bunch more functions everywhere and stuff in the constructors, etc.
Then I go to derive off my interface:
class AnAxis(Axis):
    def __init__(self):
        # Do stuff...
        Axis.__init__(self)

class AnLinearAxis(AnAxis, LinearAxis):
    def move_linear(self, move_um):
        pass

    def __init__(self):
        # Do stuff...
        AnAxis.__init__(self)
        LinearAxis.__init__(self)

class AnRotationalAxis(AnAxis, RotationalAxis):
    def move_rotate(self, move_degree):
        pass

    def __init__(self):
        # Do stuff...
        AnAxis.__init__(self)
        RotationalAxis.__init__(self)

class AnXAxis(AnLinearAxis, XAxis):
    def __init__(self):
        # Do stuff...
        AnLinearAxis.__init__(self)
        XAxis.__init__(self)
I'm trying to work out how to call the constructors properly. The way I have it, I'm pretty sure I call the interface constructors many times... So it's wrong... Is there a preferred way to do it? (Perhaps I don't call constructors in the interface classes, or I only call the interface constructor at the end of my implementation class.)
Also, I've never coded in this style and am open to better ways to code this.

You're probably looking for the super() function.
Calling super().something() calls the method something() of the next class in the inheritance chain. Because the lookup follows the class's __mro__, each parent class's method is called only once.
i.e. your code will look like this:
class AnLinearAxis(AnAxis, LinearAxis):
    def move_linear(self, move_um):
        pass

    def __init__(self):
        # Do stuff...
        super().__init__()
Keep in mind you do not need to pass self or the metaclass. The metaclass is passed on by inheritance. Also, you do not need to call super() more than once: as long as every class in the chain calls super().__init__(), each parent's __init__ runs exactly once, in MRO order.
Regarding the interface, it looks good, but there's no need to pass metaclass=ABCMeta if the class you're inheriting from already has it; the metaclass is inherited.
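For concreteness, here is a minimal sketch (not the full hierarchy from the question) rewritten cooperatively; the print calls just stand in for the real constructor work:
from abc import ABCMeta, abstractmethod

class Axis(metaclass=ABCMeta):
    def __init__(self):
        print("Axis.__init__")
        super().__init__()

class LinearAxis(Axis):
    @abstractmethod
    def move_linear(self, move_um):
        pass

    def __init__(self):
        print("LinearAxis.__init__")
        super().__init__()

class AnAxis(Axis):
    def __init__(self):
        print("AnAxis.__init__")
        super().__init__()

class AnLinearAxis(AnAxis, LinearAxis):
    def move_linear(self, move_um):
        pass

    def __init__(self):
        print("AnLinearAxis.__init__")
        super().__init__()

print([c.__name__ for c in AnLinearAxis.__mro__])
# ['AnLinearAxis', 'AnAxis', 'LinearAxis', 'Axis', 'object']
AnLinearAxis()
# AnLinearAxis.__init__
# AnAxis.__init__
# LinearAxis.__init__
# Axis.__init__
Each __init__ runs exactly once, in MRO order, even though Axis appears twice in the inheritance graph.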

Related

Multilevel abstraction with interface and inheritance in Python

I'm not exactly sure how to phrase this question, hence the strange title. I also have not been able to find any information on this after searching, so hopefully this isn't a duplicate and I'm just searching for the wrong words. Anyhow, here is the situation: I have an abstract base class with some methods in it, which is inherited by a class. I don't want to define one of the methods in this intermediate class, as it is meant to be inherited by other classes to provide the common functionality they all share. Something like:
import abc

class A(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def fun1(self):
        pass

    @abc.abstractmethod
    def fun2(self):
        pass

class B(A):
    def fun1(self):
        # do work here
        pass

    @abc.abstractmethod
    def fun2(self):  # Intent to have the final classes define this
        pass

class C(B):
    def fun2(self):
        # do work here
        pass

class D(B):
    def fun2(self):
        # do work here
        pass
I would like to keep the function as an abstract method to force implementation on the final children, but because there can be multiple types of class B in this case, all inheriting from the interface, I want to keep the initial virtualization of the method at this root class, but have a way for class B to enforce that its sub-classes must implement this. The code works just fine if I don't add the abstract method to class B, but that is awkward, since subclasses must implement the method and shouldn't have to look all the way up to the interface to figure out everything they need to implement. As written, it will error out because class B cannot declare the method as abstract. If I don't declare it as abstract, there is no way to enforce that the child class has to implement the method.
I hope my convoluted way of writing this makes sense to someone out there...
Thanks!
You probably should not redefine fun2 as an abstract method in the concrete class B. You are creating a set of rules for your interface, but immediately violating them when you do that.
Instead, either define a mix-in class or an additional ABC that C and D can inherit.
import abc

class A(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def fun1(self):
        pass

class A2(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def fun2(self):
        pass

class B(A):
    def fun1(self):
        print('hello')

class B2(A2):
    def fun2(self):
        print('world')

class C(B, B2):
    pass

class D(B, B2):
    pass
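For what it's worth, a quick sketch of how the enforcement then plays out (class E is a hypothetical subclass that forgets to implement fun2):
C().fun1()  # hello
C().fun2()  # world

class E(B, A2):  # inherits fun2 only as an abstract method
    pass

E()  # TypeError: Can't instantiate abstract class E ...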

Calling base class method after child class __init__ from base class __init__?

This is a feature I miss in several languages and wonder if anyone has any idea how it can be done in Python.
The idea is that I have a base class:
class Base(object):
    def __init__(self):
        self.my_data = 0

    def my_rebind_function(self):
        pass
and a derived class:
class Child(Base):
    def __init__(self):
        super().__init__()
        # Do some stuff here
        self.my_rebind_function()  # <==== This is the line I want to get rid of

    def my_rebind_function(self):
        # Do stuff with self.my_data
        pass
As can be seen above, I have a rebind function which I want called after Child.__init__ has done its job. And I want this done for all inherited classes, so it would be great if it were performed by the base class, so I do not have to retype that line in every child class.
It would be nice if the language had a function like __finally__, operating similarly to how finally works with exceptions. That is, it should run after all the __init__ functions (of all derived classes) have run. So the call order would be something like:
Base1.__init__()
...
BaseN.__init__()
LeafChild.__init__()
LeafChild.__finally__()
BaseN.__finally__()
...
Base1.__finally__()
And then object construction is finished. This is also kind of similar to unit testing with setup, run and teardown functions.
You can do this with a metaclass like this:
class Meta(type):
    def __call__(cls, *args, **kwargs):
        print("start Meta.__call__")
        instance = super().__call__(*args, **kwargs)
        instance.my_rebind_function()
        print("end Meta.__call__\n")
        return instance

class Base(metaclass=Meta):
    def __init__(self):
        print("Base.__init__()")
        self.my_data = 0

    def my_rebind_function(self):
        pass

class Child(Base):
    def __init__(self):
        super().__init__()
        print("Child.__init__()")

    def my_rebind_function(self):
        print("Child.my_rebind_function")
        # Do stuff with self.my_data
        self.my_data = 999

if __name__ == '__main__':
    c = Child()
    print(c.my_data)
By overriding Meta.__call__ you can hook in after all __init__ (and __new__) methods of the class tree have run and before the instance is returned. This is the place to call your rebind function. To understand the call order I added some print statements. The output will look like this:
start Meta.__call__
Base.__init__()
Child.__init__()
Child.my_rebind_function
end Meta.__call__

999
If you want to read on and get deeper into the details, I can recommend the following great article: https://blog.ionelmc.ro/2015/02/09/understanding-python-metaclasses/
I may still not fully understand, but this seems to do what (I think) you want:
class Base(object):
    def __init__(self):
        print("Base.__init__() called")
        self.my_data = 0
        self.other_stuff()
        self.my_rebind_function()

    def other_stuff(self):
        """ empty """

    def my_rebind_function(self):
        """ empty """

class Child(Base):
    def __init__(self):
        super(Child, self).__init__()

    def other_stuff(self):
        print("In Child.other_stuff() doing other stuff I want done in Child class")

    def my_rebind_function(self):
        print("In Child.my_rebind_function() doing stuff with self.my_data")

child = Child()
Output:
Base.__init__() called
In Child.other_stuff() doing other stuff I want done in Child class
In Child.my_rebind_function() doing stuff with self.my_data
If you want a "rebind" function to be invoked after each instance of a type which inherits from Base is instantiated, then I would say this "rebind" function can live outside the Base class (or any class inheriting from it).
You can have a factory function that gives you the object you need when you invoke it (for example, give_me_a_processed_child_object()). This factory function basically instantiates an object and does something to it before it returns it to you.
Putting logic in __init__ is not a good idea because it obscures logic and intention. When you write kid = Child(), you don't expect many things to happen in the background, especially things that act on the instance of Child that you just created. What you expect is a fresh instance of Child.
A factory function, however, transparently does something to an object and returns it to you. This way you know you're getting an already processed instance.
Finally, you wanted to avoid adding "rebind" methods to your Child classes, which you now can, since all that logic can be placed in your factory function.
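A minimal sketch of that factory-function approach, using the factory name mentioned above and assuming Child itself no longer calls my_rebind_function() in __init__:
def give_me_a_processed_child_object():
    kid = Child()             # plain construction, nothing hidden
    kid.my_rebind_function()  # the post-construction step, done transparently
    return kid

kid = give_me_a_processed_child_object()  # an already processed instance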

Why does @abstractmethod need to be used in a class whose metaclass is derived from ABCMeta?

PEP 3119 states that:
The @abstractmethod decorator should only be used inside a class body, and only for classes whose metaclass is (derived from) ABCMeta. Dynamically adding abstract methods to a class, or attempting to modify the abstraction status of a method or class once it is created, are not supported.
I cannot find, however, an explanation of why that is. Specifically, I do not notice a difference in behavior when using only @abstractmethod in a class whose metaclass is not derived from ABCMeta. In the following simple example, if I understand correctly, the proper way of doing things would be:
import six
from abc import ABCMeta
from abc import abstractmethod

class Base(six.with_metaclass(ABCMeta)):
    def __init__(self):
        print('Init abstract base')

    @abstractmethod
    def do_something(self):
        pass

class Subclass(Base):
    def __init__(self):
        super(Subclass, self).__init__()

    def do_something(self):
        print('Done.')

sub = Subclass()
sub.do_something()
However, if I let the Base class inherit simply from object, and only use the decorator when needed, I notice no change in behavior.
from abc import abstractmethod

class Base(object):
    def __init__(self):
        print('Init abstract base')

    @abstractmethod
    def do_something(self):
        pass

class Subclass(Base):
    def __init__(self):
        super(Subclass, self).__init__()

    def do_something(self):
        print('Done.')

sub = Subclass()
sub.do_something()
I have found this to be the case even on more complex architectures, so I wonder: when does the latter method fail?
You don't see any difference because your first subclass does implement the do_something abstract method.
Comment out the definition of do_something in the subclasses in both versions and you'll find out that in the first case you get a TypeError when trying to instantiate the subclass - you'd also get one trying to instantiate the first version's Base class itself, FWIW. With the second version, you can instantiate both classes (which shouldn't be possible since they are abstract) and call the abstract do_something method - which kind of defeats one of the main points of ABCs.
You'll also miss quite a few other interesting features of ABCs FWIW...
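A minimal self-contained demonstration of the difference, using plain Python 3 syntax without six:
from abc import ABC, abstractmethod

class AbcBase(ABC):          # metaclass is ABCMeta
    @abstractmethod
    def do_something(self):
        pass

class PlainBase(object):     # ordinary metaclass
    @abstractmethod          # the decorator alone changes nothing here
    def do_something(self):
        pass

class AbcSub(AbcBase):       # forgets to implement do_something
    pass

class PlainSub(PlainBase):   # forgets to implement do_something
    pass

PlainSub().do_something()    # silently "works": the abstract method just runs
AbcSub()                     # TypeError: Can't instantiate abstract class AbcSub ...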

How to make sure the parent method is always called when a child overrides it, using a decorator?

So I have a parent class:
class Parent(object):
    def function(self):
        do_something()
And many child classes:
class Child1(Parent):
    def function(self):
        do_something_else_1()

class Child2(Parent):
    def function(self):
        do_something_else_2()

...
I would like to ensure that the parent function() is always called before the children's function(), so that every call to function() also calls do_something() no matter the class. Now, I know I can do something like:
class Child1(Parent):
    def function(self):
        super(Child1, self).function()
        do_something_else_1()

class Child2(Parent):
    def function(self):
        super(Child2, self).function()
        do_something_else_2()

...
but I would rather not do that for every child class, because these child classes are being generated on the fly, and because these child classes themselves are being extended further. Instead, I would like to do something that looks like
class Child1(Parent):
    @call_parent
    def function(self):
        do_something_else_1()

class Child2(Parent):
    @call_parent
    def function(self):
        do_something_else_2()

...
And write a decorator to accomplish the same task.
I have two questions:
Is this even a good idea? Am I using decorators and function overriding in their intended way?
How would I go about writing this decorator?
Is this even a good idea? Am I using decorators and function overriding in their intended way?
This question is complex to answer without knowing the details about your system.
Just from the abstract example it looks OK, but replacing the explicit and clear super() call with something like @call_parent is not a good idea.
Everyone knows, or can easily find out, what super() does, and a decorator will only cause confusion.
How would I go about writing this decorator?
Don't write the decorator; instead, you can use the template method pattern:
class Parent(object):
    def function(self):
        do_something()
        self.do_something_in_child()

    def do_something_in_child(self):
        pass
Now in the child classes you only override do_something_in_child; function() stays only in Parent, so you are sure your do_something() is always called.
class Child1(Parent):
    def do_something_in_child(self):
        do_something_else_1()

class Child2(Parent):
    def do_something_in_child(self):
        do_something_else_2()

class Child3(Parent):
    # no override here, function() will do the same as it does in Parent
    pass
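For illustration, a quick self-contained sketch of the ordering guarantee, with stand-in implementations for the do_something* functions:
def do_something():
    print("parent work")

def do_something_else_1():
    print("child 1 work")

class Parent(object):
    def function(self):
        do_something()                # always runs first
        self.do_something_in_child()  # then the child-specific part

    def do_something_in_child(self):
        pass

class Child1(Parent):
    def do_something_in_child(self):
        do_something_else_1()

Child1().function()  # prints "parent work", then "child 1 work"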
I'm not well versed in Python, but you could do something like:
# Function in children. Overrides the parent one.
def function(self):
    # child code
    super().function()  # however it is used
    # more child code
If that is not plausible, take a look at the template method design pattern.
# Function in parent. Do not override this one.
def function(self):
    # your parent code
    self.function_do_something()
    # more code if you need it

# Function in parent. Children override this one.
def function_do_something(self):
    ...
And you can always let function_do_something() be a no-op, in order to only execute the parent code.

Robot Framework keywords and Inheritance

I have a library of keywords. I have a few classes and subclasses, but I'm having an issue with inheritance and keywords being double-defined. For example:
MyLib.py
class Class1:
    def __init__(self):
        pass

    def do_something_generic(self):
        # do stuff that is generic to all subclasses
        pass

class Subclass1(Class1):
    def __init__(self):
        pass

    def do_something_specific_to_subclass1(self):
        # something specific
        pass

class Subclass2(Class1):
    def __init__(self):
        pass

    def do_something_specific_to_subclass2(self):
        # something specific
        pass
The specific keywords work fine, but when I try to call Do Something Generic I get Multiple keywords with name 'Do Something Generic' found. I can fully qualify the library name with MyLib.Class1.Do Something Generic, but is there any way to define Do Something Generic to always refer to the superclass, since the method is only defined there and is simply inherited by the subclasses?
From the Robot Framework User Guide:
When the static library API is used, Robot Framework uses reflection to find out what public methods the library class or module implements. It will exclude all methods starting with an underscore, and with Java libraries also methods that are implemented only in java.lang.Object are ignored. All the methods that are not ignored are considered keywords.
Have you considered adding a helper base class with a _do_something_generic function? You can exclude it from the __all__ list, then use inheritance to expose keywords from the base class in Class1.
MyLibrary.py:
__all__ = ['Class1', 'Subclass1', 'Subclass2']

class BaseClass:
    def _do_something_generic(self):
        pass

class Class1(BaseClass):
    def __init__(self):
        pass

    def do_something_generic(self):
        return self._do_something_generic()

class Subclass1(BaseClass):
    def __init__(self):
        pass

    def do_something_specific_to_subclass1(self):
        a = self._do_something_generic()
        return (a, 3)

class Subclass2(BaseClass):
    def __init__(self):
        pass

    def do_something_specific_to_subclass2(self):
        # something specific
        pass
I think the best solution is to simply move do_something_generic to a separate class, so that your base class only has helper functions and no public keywords:
class Class1:
    def __init__(self):
        pass

class Subclass0(Class1):
    def do_something_generic(self):
        # do stuff that is generic to all subclasses
        pass
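The concrete subclasses would then presumably keep inheriting from Class1, which no longer has public keywords, so Do Something Generic resolves only through Subclass0; a hypothetical continuation:
class Subclass1(Class1):
    def do_something_specific_to_subclass1(self):
        # something specific; Do Something Generic now lives only in Subclass0
        pass

class Subclass2(Class1):
    def do_something_specific_to_subclass2(self):
        # something specific
        pass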
While it might be possible to solve this with something exotic like using __slots__ or __getattr__, or modifying self.__dict__, it's probably not worth the trouble.
