can I call base class method in my setup function - python

I am writing a Python script for the first time. Here is a basic question:
class TestLolSupv(TestSetTxFreqPL4App, TestCaseAppForceReset):
    def setUp(self):
        super().setUp()
        super().test_TX_SET_FREQ_PL4P2_A001()
Can I call a base class method directly in this way in my setUp function?
Also, if I am inheriting from more than one base class, does super().setUp() execute both parents' setUp methods? If so, which one runs first?

super().setup() will search for a setup method in the parent classes, going from left to right, until it finds one, and it executes only that first method found.
For example:
class A(object):
    def setup(self):
        print("A")

class B(object):
    def setup(self):
        print("B")

class C(A, B):
    def setup(self):
        super().setup()

c = C()
c.setup()
This will print "A".
If the parent classes themselves inherit from other classes, those are searched in the same order. For example:
class A(object):
    def setup(self):
        pass

class B(object):
    def setup(self):
        pass

class C(A):
    pass

class D(B):
    pass

class E(C, D):
    def setup(self):
        super().setup()

class F(D, C):
    def setup(self):
        super().setup()
Now for E, the setup in A will be executed, and for F, the setup in B will be executed.
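If in doubt, you can verify the order by inspecting the MRO directly (a quick check, assuming the classes from the example above are defined):
print(E.mro())  # [E, C, A, D, B, object] -> super().setup() in E resolves to A.setup
print(F.mro())  # [F, D, B, C, A, object] -> super().setup() in F resolves to B.setup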

Yes, doing super().setUp() is the right way of calling the parent class method.
Now, regarding multiple inheritance and method calling: Python will actually call only one class's method. With multiple inheritance, Python builds a Method Resolution Order (MRO), which is the order in which it checks the parent classes for the method. Once it finds the method in a parent, it stops looking (the inheritance hierarchy may be big).
In your example, your resolution order would be:
[<class '__main__.TestLolSupv'>, <class '__main__.TestSetTxFreqPL4App'>, <class '__main__.TestCaseAppForceReset'>, <class 'object'>]
You can check it by running TestLolSupv.mro().
If you want control over the order in which the methods are called, I would suggest using delegation instead. Also, if you are using inheritance only to get access to a parent class method, delegation would be the better fit. Inheritance is normally worth it when the parent class calls a child class method, because that means you are building an abstraction.
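As a rough illustration of the delegation idea, here is a minimal sketch; FreqSetup and ResetHelper are hypothetical helpers standing in for the question's base classes, not real classes from the test suite:
class FreqSetup:
    def prepare(self):
        print("configure TX frequency")

class ResetHelper:
    def prepare(self):
        print("force reset")

class TestLolSupv:
    def setUp(self):
        # Explicit, ordered calls instead of relying on the MRO.
        self._freq = FreqSetup()
        self._reset = ResetHelper()
        self._freq.prepare()
        self._reset.prepare()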
Here is some info about multiple inheritance in python.

Related

Multilevel abstraction with interface and inheritance in Python

I'm not exactly sure how to phrase this question, hence the strange title. I also have not been able to find any information on this after searching, so hopefully this isn't a duplicate and I'm just searching for the wrong words. Anyhow, here is the situation: I have an abstract base class with some methods in it, which is inherited by a class. I don't want to define one of the methods in this class, as it is meant to be inherited by other classes to provide the common functionality they all share. Something like:
import abc

class A(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def fun1(self):
        pass

    @abc.abstractmethod
    def fun2(self):
        pass

class B(A):
    def fun1(self):
        # do work here
        pass

    @abc.abstractmethod
    def fun2(self):  # Intent to have the final classes define this
        pass

class C(B):
    def fun2(self):
        # do work here
        pass

class D(B):
    def fun2(self):
        # do work here
        pass
I would like to keep the function as an abstract method to force implementation on the final children, but because there can be multiple types of class B in this case, all inheriting from the interface, I want to keep the initial virtualization of the method at this root class, yet have a way for class B to enforce that its sub-classes must implement this. The code works just fine if I don't add the abstract method to class B, but that is awkward, since subclasses must implement the method and shouldn't have to look all the way up to the interface to figure out everything they need to implement. As written, it will error out because class B cannot declare the method as an abc.abstractmethod. If I don't declare it as abstract, there is no way to enforce that the child class has to implement the method.
I hope my convoluted way of writing this makes sense to someone out there...
Thanks!
You probably should not redefine fun2 as an abstract method in the concrete class B. You are creating a set of rules for your interface, but immediately violating them when you do that.
Instead, either define a mix-in class or an additional ABC that C and D can inherit.
import abc

class A(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def fun1(self):
        pass

class A2(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def fun2(self):
        pass

class B(A):
    def fun1(self):
        print('hello')

class B2(A2):
    def fun2(self):
        print('world')

class C(B, B2):
    pass

class D(B, B2):
    pass
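A quick check of how this behaves (the Incomplete class below is a hypothetical example added only for illustration):
c = C()
c.fun1()   # hello
c.fun2()   # world

# A subclass that leaves one abstract method unimplemented still cannot be
# instantiated, which is exactly the enforcement the question asks for.
class Incomplete(B, A2):   # gets a concrete fun1 but never defines fun2
    pass

try:
    Incomplete()
except TypeError as exc:
    print(exc)   # Can't instantiate abstract class Incomplete ...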

Why does @abstractmethod need to be used in a class whose metaclass is derived from ABCMeta?

PEP 3119 states that:
The @abstractmethod decorator should only be used inside a class body, and only for classes whose metaclass is (derived from) ABCMeta. Dynamically adding abstract methods to a class, or attempting to modify the abstraction status of a method or class once it is created, are not supported.
I cannot find, however, an explanation of why that is. Specifically, I do not notice a difference in behavior when using only @abstractmethod in a class that does not explicitly inherit from ABCMeta. In the following simple example, if I understand correctly, the proper way of doing things would be:
import six
from abc import ABCMeta
from abc import abstractmethod

class Base(six.with_metaclass(ABCMeta)):
    def __init__(self):
        print('Init abstract base')

    @abstractmethod
    def do_something(self):
        pass

class Subclass(Base):
    def __init__(self):
        super(Subclass, self).__init__()

    def do_something(self):
        print('Done.')

sub = Subclass()
sub.do_something()
However, if I let the Base class inherit simply from object, and only use the decorator when needed, I notice no change in behavior.
from abc import abstractmethod

class Base(object):
    def __init__(self):
        print('Init abstract base')

    @abstractmethod
    def do_something(self):
        pass

class Subclass(Base):
    def __init__(self):
        super(Subclass, self).__init__()

    def do_something(self):
        print('Done.')

sub = Subclass()
sub.do_something()
I have found this to be the case even on more complex architectures, so I wonder: when does the latter method fail?
You don't see any difference because your subclass does implement the do_something abstract method in both versions.
Comment out the definition of do_something in the subclass in both versions and you'll find that in the first case you get a TypeError when trying to instantiate the subclass - you'd also get one trying to instantiate the first version's Base class itself, FWIW. With the second version, you can instantiate both classes (which shouldn't be possible since they are abstract) and call the abstract do_something method - which defeats one of the main points of ABCs.
You'll also miss quite a few other interesting features of ABCs, FWIW...
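A minimal side-by-side sketch of that difference (Python 3 syntax; the class names are made up for illustration):
from abc import ABCMeta, abstractmethod

class WithMeta(metaclass=ABCMeta):
    @abstractmethod
    def do_something(self):
        pass

class WithoutMeta(object):        # decorator used without ABCMeta
    @abstractmethod
    def do_something(self):
        pass

try:
    WithMeta()
except TypeError as exc:
    print(exc)        # Can't instantiate abstract class WithMeta ...

w = WithoutMeta()     # no error: nothing enforces the abstractness
w.do_something()      # runs and silently does nothing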

Python Parent/Child class method call

Python 2.7.6 on Linux.
I'm using a test class that inherits from a parent. The parent class holds a number of fields that are common to many child classes, and I need to call the parent setUp method to initialize the fields. Is calling ParentClass.setUp(self) the correct way to do this? Here's a simple example:
class RESTTest(unittest.TestCase):
    def setUp(self):
        self.host = host
        self.port = port
        self.protocol = protocol
        self.context = context

class HistoryTest(RESTTest):
    def setUp(self):
        RESTTest.setUp(self)
        self.endpoint = history_endpoint
        self.url = "%s://%s:%s/%s/%s" % (self.protocol, self.host, self.port, self.context, self.endpoint)

    def testMe(self):
        self.assertTrue(True)

if __name__ == '__main__':
    unittest.main()
Is this correct? It seems to work.
You would use super for that.
super(ChildClass, self).method(args)
class HistoryTest(RESTTest):
    def setUp(self):
        super(HistoryTest, self).setUp()
        ...
In Python 3 you may write:
class HistoryTest(RESTTest):
    def setUp(self):
        super().setUp()
        ...
which is simpler.
See this answer:
super() lets you avoid referring to the base class explicitly, which can be nice. But the main advantage comes with multiple inheritance, where all sorts of fun stuff can happen. See the standard docs on super if you haven't already.
Multiple inheritance
To (try to) answer the question in your comment:
How do you specify which super method you want to call?
From what I understand of the philosophy of multiple inheritance (in Python), you don't. I mean, super, along with the Method Resolution Order (MRO) should do things right and select the appropriate methods. (Yes methods is a plural, see below.)
There are a lot of blog posts / SO answers about this you can find with keywords "multiple inheritance", "diamond", "MRO", "super", etc. This article provides a Python 3 example I found surprising and didn't find in other sources:
class A:
    def m(self):
        print("m of A called")

class B(A):
    def m(self):
        print("m of B called")
        super().m()

class C(A):
    def m(self):
        print("m of C called")
        super().m()

class D(B, C):
    def m(self):
        print("m of D called")
        super().m()

D().m()

This prints:

m of D called
m of B called
m of C called
m of A called
See? Both B.m() and C.m() are called thanks to super, which seems like the right thing to do considering D inherits from both B and C.
I suggest you play with this example like I just did. Adding a few prints, you'll see that, when calling D().m(), the super().m() statement in class B itself calls C.m(). Whereas, of course, if you call B().m() (B instance, not D instance), only A.m() is called. In other words, super().m() in B is aware of the class of the instance it is dealing with and behaves accordingly.
Using super everywhere sounds like the silver bullet, but you need to make sure all classes in the inheritance schema are cooperative (another keyword to dig for) and don't break the chain, for instance when expecting additional parameters in child classes.
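For instance, a common cooperative pattern is to accept **kwargs and forward whatever you don't consume, so every class in the chain receives its parameters. This is a generic sketch (Engine, Radio and Car are made-up names, not taken from the linked article):
class Base:
    def __init__(self, **kwargs):
        super().__init__(**kwargs)   # keep the chain intact; it ends at object

class Engine(Base):
    def __init__(self, power=0, **kwargs):
        self.power = power
        super().__init__(**kwargs)

class Radio(Base):
    def __init__(self, volume=0, **kwargs):
        self.volume = volume
        super().__init__(**kwargs)

class Car(Engine, Radio):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)

car = Car(power=90, volume=11)   # Engine and Radio each pick up their own argument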

Python constructors in ABC inheritance chain for interfaces

I've attempted to create a Python interface class hierarchy that looks something like:
from abc import ABCMeta, abstractmethod

class Axis(object, metaclass=ABCMeta):
    def __init__(self):
        # Do stuff...
        pass

class LinearAxis(Axis, metaclass=ABCMeta):
    @abstractmethod
    def move_linear(self, move_um):
        pass

    def __init__(self):
        # Do stuff...
        Axis.__init__(self)

class RotationalAxis(Axis, metaclass=ABCMeta):
    @abstractmethod
    def move_rotate(self, move_degree):
        pass

    def __init__(self):
        # Do stuff...
        Axis.__init__(self)

class XAxis(LinearAxis, metaclass=ABCMeta):
    def __init__(self):
        # Do stuff...
        LinearAxis.__init__(self)
So basically an interface sort of like that with a bunch more functions everywhere and stuff in the constructors etc...
Then I go to derive off my interface:
class AnAxis(Axis):
    def __init__(self):
        # Do stuff...
        Axis.__init__(self)

class AnLinearAxis(AnAxis, LinearAxis):
    def move_linear(self, move_um):
        pass

    def __init__(self):
        # Do stuff...
        AnAxis.__init__(self)
        LinearAxis.__init__(self)

class AnRotationalAxis(AnAxis, RotationalAxis):
    def move_rotate(self, move_degree):
        pass

    def __init__(self):
        # Do stuff...
        AnAxis.__init__(self)
        RotationalAxis.__init__(self)

class AnXAxis(AnLinearAxis, XAxis):
    def __init__(self):
        # Do stuff...
        AnLinearAxis.__init__(self)
        XAxis.__init__(self)
I'm trying to work out how to call the constructors properly. The way I have it, I'm pretty sure I call the interface constructors many times... So it's wrong... Is there a preferred way to do it? (Perhaps I don't call constructors in the interface classes, or I only call the interface constructor at the end up my implementation class.)
Also, I've never coded in this style and am open to better ways to code this.
You're probably looking for the super() function.
Calling super().something() calls the method something() of the next class in the inheritance chain. Together with the MRO (__mro__), it makes sure each parent class's method is called only once.
i.e. your code will look like this:
class AnLinearAxis(AnAxis, LinearAxis):
    def move_linear(self, move_um):
        pass

    def __init__(self):
        # Do stuff...
        super().__init__()
Keep in mind you do not need to pass self or the metaclass; the metaclass is passed on by inheritance. Also, you do not need to call super more than once: as long as each class in the chain calls super().__init__(), every parent's __init__ runs exactly once.
Regarding the interface, it looks good but there's no need to pass metaclass=ABCMeta if the class you're inheriting from already has it. The metaclass is passed on by inheritance.
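Putting it together, here is a trimmed sketch of how the question's hierarchy could look with cooperative super() calls; the prints are only there to show that each constructor runs exactly once (illustrative, not the asker's real code):
from abc import ABCMeta, abstractmethod

class Axis(metaclass=ABCMeta):
    def __init__(self):
        print("Axis init")
        super().__init__()

class LinearAxis(Axis):
    @abstractmethod
    def move_linear(self, move_um):
        pass

    def __init__(self):
        print("LinearAxis init")
        super().__init__()

class AnAxis(Axis):
    def __init__(self):
        print("AnAxis init")
        super().__init__()

class AnLinearAxis(AnAxis, LinearAxis):
    def move_linear(self, move_um):
        pass

    def __init__(self):
        print("AnLinearAxis init")
        super().__init__()

AnLinearAxis()   # AnLinearAxis init, AnAxis init, LinearAxis init, Axis init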

Class Inheritance Naming in Python

There is a BaseClient
class BaseClient(object):
that later gets inherited by a lot of classes:
class Account(BaseClient):
    def create(self, **params):
        pass
and a few others.
class MainClass(Account, User):
    pass
There are a few classes that define the same create function:
def create(self, **params):
    pass
How can I add a unique class label, so that I can call
MainClass.Account.create()
Right now it works as
MainClass.create()
Update:
There are a lot of duplicate functions like create() that are going to override the ones they inherit from. I would like to qualify the call with the class name, like Account, so that when I call
MainClass.Account.create()
MainClass.User.create()
they act as two different functions.
In other words, you have multiple inheritance, with:
class Base1(object):
    def create(self): ...

class Base2(object):
    def create(self): ...

class C(Base1, Base2):
    def create(self): ...
In class C, you can choose whether to call the implementation from the parent classes or not.
Option 1: do not implement create in class C
If you don't implement method create in C, then Base1.create is going to be used.
Note that this situation, where C inherits from Base1 and Base2, is treated as if C inherits from Base1 and Base1 inherits from Base2.
You can see that if you print C.__mro__.
See also this thread about MRO: Method Resolution Order (MRO) in new style Python classes
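For example, leaving create out of C and checking the MRO (an illustrative snippet that reuses the Base1 and Base2 classes above):
class C(Base1, Base2):   # Option 1: no create() defined here
    pass

print(C.__mro__)   # -> C, Base1, Base2, object
C().create()       # resolves to Base1.create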
Option 2: do not call the base implementation
class C(Base1, Base2):
    def create(self):
        pass
Now Base1.create is no longer going to be called.
Option 3: call only one of the bases
class C(Base1, Base2):
    def create(self):
        Base2.create(self)
Option 4: call each of the base implementations
class C(Base1, Base2):
    def create(self):
        Base1.create(self)
        Base2.create(self)
Both Base1.create and Base2.create will be called.
Option 5: use super to call all base implementations
Although option 4 may seem like a very nice solution here, in some configurations, like diamond inheritance, it could cause a method to be called multiple times. So, an alternative approach is to use super, which uses the MRO (see Option 1) to determine which base implementation to use. By using the MRO, it avoids diamond inheritance problems. However, it has to be used systematically in all classes, and even then it has its caveats.
class CommonBase(object):
    def create(self):
        pass

class Base1(CommonBase):
    def create(self):
        super(Base1, self).create()

class Base2(CommonBase):
    def create(self):
        super(Base2, self).create()

class C(Base1, Base2):
    def create(self):
        super(C, self).create()
Here, C().create() will call all four create methods, each once.
You can't control it from outside the class as a client; it can only be controlled inside the class, in your case inside MainClass, by calling super to dispatch to the method of one base class or the other: Account or User.
class MainClass(Account, User):
    # your own convention that by default create() calls Account.create
    def create(self, **params):
        super(MainClass, self).create(**params)   # the next class after MainClass in the MRO is Account

    def create2(self, **params):
        super(Account, self).create(**params)     # the next class after Account in the MRO is User
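A quick usage sketch of that convention, with hypothetical print bodies added so the two paths can be told apart:
class BaseClient(object):
    pass

class Account(BaseClient):
    def create(self, **params):
        print("Account.create", params)

class User(BaseClient):
    def create(self, **params):
        print("User.create", params)

class MainClass(Account, User):
    def create(self, **params):
        super(MainClass, self).create(**params)   # -> Account.create

    def create2(self, **params):
        super(Account, self).create(**params)     # -> User.create

m = MainClass()
m.create(name="a")    # Account.create {'name': 'a'}
m.create2(name="u")   # User.create {'name': 'u'}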
