How to pass arguments to overridden functions with different interfaces? - python

The situation is: An abstract base class defines a set of common actions, while leaving a bunch of specialized actions to subclasses.
from abc import ABC, abstractmethod

class Common(ABC):
    def __init__(self, ord_obj):
        self.ord_obj = ord_obj
        self.abs_obj = self.generate_abs_obj(?)

    @abstractmethod
    def generate_abs_obj(self):
        # Will be implemented in subclasses
        pass

    def common_operation(self):
        return self.ord_obj + self.abs_obj

class specialize_1(Common):
    def __init__(self):
        super().__init__()

    def generate_abs_obj(self, param_1):
        # do something
        return 25

class specialize_2(Common):
    def __init__(self):
        super().__init__()

    def generate_abs_obj(self, param_1, param_2):
        # do something
        return param_1 + param_2
As shown above, abs_obj is generated by an abstract method and initialized in the base class's __init__, and it is later used by an ordinary method of the base class. However, the generate_abs_obj implementations in the subclasses have different signatures. How do I call super().__init__() with different parameters?

The overall approach you are using is not Pythonic. I highly recommend watching Raymond Hettinger explain super() and Pythonic inheritance in this talk.
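One common refactoring (a sketch, not part of the original answer; the renamed classes and parameters are illustrative) is to give generate_abs_obj a uniform, argument-free signature and have each subclass store its own parameters on self before delegating to the base initializer:

from abc import ABC, abstractmethod

class Common(ABC):
    def __init__(self, ord_obj):
        self.ord_obj = ord_obj
        self.abs_obj = self.generate_abs_obj()  # uniform, argument-free call

    @abstractmethod
    def generate_abs_obj(self):
        ...

    def common_operation(self):
        return self.ord_obj + self.abs_obj

class Specialize1(Common):
    def __init__(self, ord_obj, param_1):
        self.param_1 = param_1     # store subclass specifics first...
        super().__init__(ord_obj)  # ...then let the base class run

    def generate_abs_obj(self):
        return 25

class Specialize2(Common):
    def __init__(self, ord_obj, param_1, param_2):
        self.param_1 = param_1
        self.param_2 = param_2
        super().__init__(ord_obj)

    def generate_abs_obj(self):
        return self.param_1 + self.param_2

Because each subclass sets its attributes before calling super().__init__(), the base class can call generate_abs_obj() without knowing what data any particular subclass needs.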

Related

Why does @abstractmethod need to be used in a class whose metaclass is derived from ABCMeta?

PEP 3119 states that:
The @abstractmethod decorator should only be used inside a class body, and only for classes whose metaclass is (derived from) ABCMeta. Dynamically adding abstract methods to a class, or attempting to modify the abstraction status of a method or class once it is created, are not supported.
I cannot find, however, an explanation of why that is. Specifically, I do not notice a difference in behavior when using only @abstractmethod in a class whose metaclass is not (derived from) ABCMeta. In the following simple example, if I understand correctly, the proper way of doing things would be:
import six
from abc import ABCMeta
from abc import abstractmethod

class Base(six.with_metaclass(ABCMeta)):
    def __init__(self):
        print('Init abstract base')

    @abstractmethod
    def do_something(self):
        pass

class Subclass(Base):
    def __init__(self):
        super(Subclass, self).__init__()

    def do_something(self):
        print('Done.')

sub = Subclass()
sub.do_something()
However, if I let the Base class inherit simply from object, and only use the decorator when needed, I notice no change in behavior.
from abc import abstractmethod

class Base(object):
    def __init__(self):
        print('Init abstract base')

    @abstractmethod
    def do_something(self):
        pass

class Subclass(Base):
    def __init__(self):
        super(Subclass, self).__init__()

    def do_something(self):
        print('Done.')

sub = Subclass()
sub.do_something()
I have found this to be the case even on more complex architectures, so I wonder: when does the latter method fail?
You don't see any difference because your first subclass does implement the do_something abstract method.
Comment out the definition of do_something in the subclass in both versions and you'll find that in the first case you get a TypeError when trying to instantiate the subclass - you'd also get one trying to instantiate the first version's Base class itself, FWIW. With the second version, you can instantiate both classes (which shouldn't be possible since they are abstract) and call the abstract do_something method - which defeats one of the main points of ABCs.
You'll also miss quite a few other interesting features of ABCs, FWIW...
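To see the difference concretely, here is a minimal sketch (plain Python 3 syntax rather than the six helper used above) with the override removed from both versions:

from abc import ABCMeta, abstractmethod

# Version 1: metaclass is ABCMeta, so instantiation is blocked.
class Base1(metaclass=ABCMeta):
    @abstractmethod
    def do_something(self):
        pass

class Sub1(Base1):
    pass  # abstract method deliberately not overridden

try:
    Sub1()
except TypeError as exc:
    print(exc)  # Can't instantiate abstract class Sub1 ...

# Version 2: plain object, so the decorator is inert; the "abstract"
# class instantiates fine and its stub method runs silently.
class Base2(object):
    @abstractmethod
    def do_something(self):
        pass

class Sub2(Base2):
    pass

Sub2().do_something()  # no error, silently returns None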

Python constructors in ABC inheritance chain for interfaces

I've attempted to create a Python interface class hierarchy that looks something like:
class Axis(object, metaclass=ABCMeta):
    def __init__(self):
        # Do stuff...
        pass

class LinearAxis(Axis, metaclass=ABCMeta):
    @abstractmethod
    def move_linear(self, move_um):
        pass

    def __init__(self):
        # Do stuff...
        Axis.__init__(self)

class RotationalAxis(Axis, metaclass=ABCMeta):
    @abstractmethod
    def move_rotate(self, move_degree):
        pass

    def __init__(self):
        # Do stuff...
        Axis.__init__(self)

class XAxis(LinearAxis, metaclass=ABCMeta):
    def __init__(self):
        # Do stuff...
        LinearAxis.__init__(self)
So, basically, an interface sort of like that, with a bunch more functions everywhere and stuff in the constructors, etc.
Then I go to derive from my interface:
class AnAxis(Axis):
    def __init__(self):
        # Do stuff...
        Axis.__init__(self)

class AnLinearAxis(AnAxis, LinearAxis):
    def move_linear(self, move_um):
        pass

    def __init__(self):
        # Do stuff...
        AnAxis.__init__(self)
        LinearAxis.__init__(self)

class AnRotationalAxis(AnAxis, RotationalAxis):
    def move_rotate(self, move_degree):
        pass

    def __init__(self):
        # Do stuff...
        AnAxis.__init__(self)
        RotationalAxis.__init__(self)

class AnXAxis(AnLinearAxis, XAxis):
    def __init__(self):
        # Do stuff...
        AnLinearAxis.__init__(self)
        XAxis.__init__(self)
I'm trying to work out how to call the constructors properly. The way I have it, I'm pretty sure the interface constructors get called multiple times, so it's wrong. Is there a preferred way to do it? (Perhaps I shouldn't call constructors in the interface classes, or I should only call the interface constructor at the end of my implementation class.)
Also, I've never coded in this style and am open to better ways to code this.
You're probably looking for the super() function.
Calling super().something() calls the method something() of the next class in the method resolution order (__mro__). In a cooperative hierarchy, that ensures each parent class's method is called only once.
i.e. your code will look like this:
class AnLinearAxis(AnAxis, LinearAxis):
    def move_linear(self, move_um):
        pass

    def __init__(self):
        # Do stuff...
        super().__init__()
Keep in mind you do not need to pass self or the metaclass: the metaclass is inherited. Also, you do not need to call super() more than once; as long as every class in the hierarchy does the same, a single call reaches all of the parent classes' methods automatically.
Regarding the interface, it looks good, but there's no need to pass metaclass=ABCMeta if the class you're inheriting from already has it. The metaclass is passed on by inheritance.
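Putting it together, a cooperative version of one slice of the hierarchy might look like this (a sketch assuming every class in the chain calls super().__init__() exactly once; the print calls are only there to show the order):

from abc import ABCMeta, abstractmethod

class Axis(metaclass=ABCMeta):
    def __init__(self):
        print('Axis init')
        super().__init__()

class LinearAxis(Axis):
    @abstractmethod
    def move_linear(self, move_um):
        pass

    def __init__(self):
        print('LinearAxis init')
        super().__init__()

class AnAxis(Axis):
    def __init__(self):
        print('AnAxis init')
        super().__init__()

class AnLinearAxis(AnAxis, LinearAxis):
    def move_linear(self, move_um):
        pass

    def __init__(self):
        print('AnLinearAxis init')
        super().__init__()

AnLinearAxis()
# AnLinearAxis init / AnAxis init / LinearAxis init / Axis init

The MRO is AnLinearAxis, AnAxis, LinearAxis, Axis, object, so each constructor runs exactly once even though AnAxis and LinearAxis share the Axis base.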

Class Inheritance Naming in Python

There is a BaseClient
class BaseClient(object):
that is later inherited by a lot of classes
class Account(BaseClient):
    def create(self, **params):
        pass
and a few others.
class MainClass(Account, User):
    pass
Several of these classes define the same create function:
def create(self, **params):
    pass
How do I add a unique class label, so that the call looks like
MainClass.Account.create()
Right now it works as
MainClass.create()
Update:
There are a lot of duplicate functions like create() that override the ones they inherit from. I would like to qualify the call with the class name, such as Account, so that when I call
MainClass.Account.create()
MainClass.User.create()
they act as two different functions.
In other words, you have multiple inheritance, with:
class Base1(object):
    def create(self): ...

class Base2(object):
    def create(self): ...

class C(Base1, Base2):
    def create(self): ...
In class C, you can choose whether to call the implementation from the parent classes or not.
Option 1: do not implement create in class C
If you don't implement method create in C, then Base1.create is going to be used.
Note that this situation, where C inherits from Base1 and Base2, is treated as if C inherits from Base1 and Base1 inherits from Base2.
You can see that if you print C.__mro__
See also this thread about MRO: Method Resolution Order (MRO) in new style Python classes
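For instance, with the three classes above, printing the MRO shows the linearization that makes Base1.create win:

print(C.__mro__)
# roughly: (C, Base1, Base2, object)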
Option 2: do not call the base implementation
class C(Base1, Base2):
    def create(self):
        pass
Now Base1.create is no longer going to be called.
Option 3: call only one of the bases
class C(Base1, Base2):
    def create(self):
        Base2.create(self)
Now Base1.create is not going to be called, but Base2.create is.
Option 4: call each of the base implementations
class C(Base1, Base2):
    def create(self):
        Base1.create(self)
        Base2.create(self)
Both Base1.create and Base2.create will be called.
Option 5: use super to call all base implementations
Although option 4 may seem like a very nice solution here, in some configurations, such as diamond inheritance, it could cause a method to be called multiple times. So an alternative approach is to use super(), which follows the MRO (see Option 1) to determine which base implementation to call next, and thereby avoids the diamond-inheritance problem of duplicate calls. However, it has to be used systematically in all the classes, and even then it has its caveats.
class CommonBase(object):
    def create(self):
        pass

class Base1(CommonBase):
    def create(self):
        super(Base1, self).create()

class Base2(CommonBase):
    def create(self):
        super(Base2, self).create()

class C(Base1, Base2):
    def create(self):
        super(C, self).create()
Here, C().create() will call all four create methods, each once.
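A quick way to convince yourself (a minimal sketch, with print calls added to the classes above purely for illustration):

class CommonBase(object):
    def create(self):
        print('CommonBase.create')

class Base1(CommonBase):
    def create(self):
        print('Base1.create')
        super(Base1, self).create()

class Base2(CommonBase):
    def create(self):
        print('Base2.create')
        super(Base2, self).create()

class C(Base1, Base2):
    def create(self):
        print('C.create')
        super(C, self).create()

C().create()
# C.create / Base1.create / Base2.create / CommonBase.create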
You can't control it as a client of the class from outside; it can only be controlled inside a class, in your case inside MainClass, by explicitly choosing which base class's method to call: Account's or User's.
class MainClass(Account, User):
    # your own convention: by default, create() uses Account's implementation
    def create(self, **params):
        Account.create(self, **params)

    def create2(self, **params):
        User.create(self, **params)
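To check the dispatch, here is a small self-contained sketch (the print calls and example parameters are hypothetical):

class BaseClient(object):
    pass

class Account(BaseClient):
    def create(self, **params):
        print('Account.create', params)

class User(BaseClient):
    def create(self, **params):
        print('User.create', params)

class MainClass(Account, User):
    def create(self, **params):
        Account.create(self, **params)

    def create2(self, **params):
        User.create(self, **params)

m = MainClass()
m.create(name='a')   # Account.create {'name': 'a'}
m.create2(name='b')  # User.create {'name': 'b'}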

Robot Framework keywords and Inheritance

I have a library of keywords. I have a few classes and subclasses, but I'm having an issue with inheritance and keywords being double-defined. For example:
MyLib.py
class Class1:
    def __init__(self):
        pass

    def do_something_generic(self):
        # do stuff that is generic to all subclasses
        pass

class Subclass1(Class1):
    def __init__(self):
        pass

    def do_something_specific_to_subclass1(self):
        # something specific
        pass

class Subclass2(Class1):
    def __init__(self):
        pass

    def do_something_specific_to_subclass2(self):
        # something specific
        pass
The specific keywords work fine, but when I try to call Do Something Generic I get Multiple keywords with name 'Do Something Generic' found. I can fully qualify the keyword with the library name, as in MyLib.Class1.Do Something Generic, but is there any way to define Do Something Generic to always refer to the superclass, since the method is only defined there and is simply inherited by the subclasses?
From Robot Framework User Guide
When the static library API is used, Robot Framework uses reflection to find out what public methods the library class or module implements. It excludes all methods starting with an underscore, and with Java libraries it also ignores methods that are implemented only in java.lang.Object. All the methods that are not ignored are considered keywords.
Have you considered adding a helper base class with a _do_something_generic function? You can exclude the base class from the __all__ list, then use inheritance to expose the keyword from the base class in Class1:
MyLibrary.py:
__all__ = ['Class1', 'Subclass1', 'Subclass2']

class BaseClass:
    def _do_something_generic(self):
        pass

class Class1(BaseClass):
    def __init__(self):
        pass

    def do_something_generic(self):
        return self._do_something_generic()

class Subclass1(BaseClass):
    def __init__(self):
        pass

    def do_something_specific_to_subclass1(self):
        a = self._do_something_generic()
        return (a, 3)

class Subclass2(BaseClass):
    def __init__(self):
        pass

    def do_something_specific_to_subclass2(self):
        # something specific
        pass
I think the best solution is to simply move do_something_generic to a separate class, so that your base class only has helper functions and no public keywords:
class Class1:
    def __init__(self):
        pass

class Subclass0(Class1):
    def do_something_generic(self):
        # do stuff that is generic to all subclasses
        pass
While it might be possible to solve this with something exotic like using __slots__ or __getattr__, or modifying self.__dict__, it's probably not worth the trouble.

python: calling super().__init__ too early in the __init__ method?

I have a class hierarchy where __init__ in class Base performs some pre-initialization and then calls method calculate. The calculate method is defined in class Base, but it's expected to be redefined in derived classes. The redefined calculate will use some of the attributes that are only available in class Derived:
class Base:
    def __init__(self, args):
        # perform some pre-initialization
        ...
        # now call method "calculate"
        self.calculate()

class Derived(Base):
    def __init__(self, args, additional_attr):
        super().__init__(args)
        # do some work and create new instance attributes
        ...
        self.additional_attr = additional_attr
This is not going to work because the calculate method in class Derived will be invoked before self.additional_attr is assigned.
I can't move super().__init__(args) call to the end of the __init__ method because some of the work it does has to happen before processing additional_attr.
What to do?
Perhaps you shouldn't have the calculate() call in your constructor then. If you can't construct a derived object by allowing the base constructor to complete first, then you must be doing something wrong IMHO. A sensible approach would be to move that call out of the constructor and perhaps create a factory method to make that call automatically. Then use that method if you need precalculated instances.
class Base(object):
    def __init__(self, args):
        # perform some initialization
        pass

    def calculate(self):
        # do stuff
        pass

    @classmethod
    def precalculated(cls, args):
        # construct first
        newBase = cls(args)
        # now call method "calculate"
        newBase.calculate()
        return newBase

class Derived(Base):
    def __init__(self, args, additional_attr):
        super(Derived, self).__init__(args)
        # do some work and create new instance attributes
        self.additional_attr = additional_attr

    @classmethod
    def precalculated(cls, args, additional_attr):  # also, if you want
        newDerived = cls(args, additional_attr)
        newDerived.calculate()
        return newDerived

newBase = Base('foo')
precalculatedBase = Base.precalculated('foo')
newDerived = Derived('foo', 'bar')
precalculatedDerived = Derived.precalculated('foo', 'bar')
This is bad design, IMHO, and you're abusing the object system of Python. Consider that in other OO languages like C++, you don't even have control over the creation of base classes: the derived class's constructor calls the base constructor before your own code runs. Such behavior is almost always expected of well-behaved class hierarchies, and changing it can only lead to problems.
Sure, you can do some patching (such as assigning self.additional_attr before the call to super's constructor, or other tricks), but the better way would be to change your design so that it won't require such hacks. Since you've presented an abstract example here, it's hard to give more comprehensive design advice.
In order for something like this to work, you need to design a protocol that allows the base and derived class(es) to cooperate with each other to accomplish the object initialization task:
class Base:
    def __init__(self, args, *additional_args):
        # perform some pre-initialization
        # ...
        # perform any further initialization needed by derived classes
        self.subclass_setup(*additional_args)
        # now call method "calculate"
        self.calculate()

    def subclass_setup(self, *args):
        pass

class Derived(Base):
    def __init__(self, args, additional_attr):
        super().__init__(args, additional_attr)

    def subclass_setup(self, additional_attr):
        # do some work and create new instance attributes
        # ...
        self.additional_attr = additional_attr
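A quick sanity check of the protocol (a hypothetical calculate override, added here only because the sketch above never defines one):

class CalculatingDerived(Derived):
    def calculate(self):
        # by the time Base.__init__ calls this, subclass_setup has
        # already run, so additional_attr exists
        print('calculating with', self.additional_attr)

d = CalculatingDerived('foo', 42)  # prints: calculating with 42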
Can you pass additional_attr as a parameter to the __init__ method of the base class and propagate it from there to the calculate method?
Say something like:
class Base(object):
    def __init__(self, args, additional_attr):
        print('Args for base class:%s' % (args,))
        self.calculate(additional_attr)

class Derived(Base):
    def __init__(self, args, additional_attr):
        super(Derived, self).__init__(args, additional_attr)

    def calculate(self, val):
        print('Arg for calculate:%s' % (val,))
        self.additional_attr = val

>>> d = Derived(['test', 'name'], 100)
Args for base class:['test', 'name']
Arg for calculate:100
This is a roundabout way, but with no information about what the pre-initialization steps are, it is hard to say whether the above approach would help you.
