Let's assume that we have a Python class that makes use of the abc module to define an abstract attribute:
import abc

class A(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractproperty
    def test_attribute(self):
        raise NotImplementedError
Now let's define B, which subclasses A and adds a new method (test_method()), and C, which subclasses B and implements the abstract attribute originally declared in A:
class B(A):
    def test_method(self):
        pass

class C(B):
    def test_attribute(self):
        # Implement the abstract attribute
        pass
Assuming that I would like to keep B abstract (non-instantiable), shall I redefine the abstract property (test_attribute) and the metaclass assignment also in B? Or is it enough to inherit them from A (as in the above code)?
I know that Python allows me to not redefine the abstract methods and thus inherit them from the parent class. Is this correct from a theoretical software engineering perspective?
I'm asking because, if I'm not wrong, other languages (such as Java) do not allow inheritance of abstract methods without redeclaring them as abstract...
You've pretty much got all the code there, so you can always test it and see whether it works... but as a spoiler, your design is fine as long as C.test_attribute gets decorated with property.
If you try to make an instance of B you'll have problems, since the whole abstract interface hasn't been implemented, but it is fine to use B as a base class for C (and presumably other classes later...)
e.g.:
import abc

class A(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractproperty
    def foo(self):
        pass

class B(A):
    def bar(self):
        return "bar"

class C(B):
    @property
    def foo(self):
        return "foo"

print C().foo    # foo
print C().bar()  # bar
print B().foo    # TypeError
Related
There is a class A (in another package) that defines an abstract method my_abstract_method(), using the old way of defining an abstract class: setting __metaclass__ = ABCMeta.
My class B is a subclass of A, and I would like B to inherit A's my_abstract_method() and require all subclasses of B to implement my_abstract_method().
How would I go about doing this?
Example:
from abc import abstractmethod, ABC, ABCMeta

class A(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def my_abstract_method(self):
        return

class B(A, metaclass=ABCMeta):
    pass

B()
When executed, B() is created successfully.
How can I get B() to fail because the abstract method of A is not defined?
EDIT:
It's hard to grasp what the issue here is, apart from the obvious, so the solution below might or might not be relevant for the actual implementation.
The below solution introduces another layer of abstraction, which breaks the principle "the solution to inheritance is not more inheritance", but this might be a special case where it might be useful for your scenario.
from abc import abstractmethod, ABC, ABCMeta

class A(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def my_abstract_method(self):
        return

class C(A, ABC):
    @abstractmethod
    def new_abstract_method(self):
        pass

class B(C):
    pass

B()
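If I run this under Python 3, B() should now raise a TypeError, but note that it complains about new_abstract_method rather than my_abstract_method: the __metaclass__ assignment in A is ignored by Python 3, so A never becomes a real ABC, and only the abstract method declared in C is actually enforced.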
I'm not exactly sure how to phrase this question, hence the strange title. I also have not been able to find any information on this after searching, so hopefully this isn't a duplicate and I'm just searching for the wrong words. Anyhow, here is the situation: I have an abstract base class with some methods in it, which is inherited by another class. I don't want to implement one of the abstract methods in this intermediate class, since it is itself meant to be inherited by further classes and only provides the functionality they all share. Something like:
import abc

class A(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def fun1(self):
        pass

    @abc.abstractmethod
    def fun2(self):
        pass

class B(A):
    def fun1(self):
        # do work here
        pass

    @abc.abstractmethod
    def fun2(self):  # Intent to have the final classes define this
        pass

class C(B):
    def fun2(self):
        # do work here
        pass

class D(B):
    def fun2(self):
        # do work here
        pass
I would like to keep the function as an abc.abstractmethod to force implementation on the final children, but because there can be multiple types of class B in this case, all inheriting from the interface, I want to keep the initial virtualization of the method at this root class, yet have a way for class B to enforce that its subclasses must implement it. The code works just fine if I don't add the abstract method to class B, but that is awkward, since subclasses must implement the method and shouldn't have to look all the way up to the interface to figure out everything they need to implement. As written, it will error out because class B cannot declare the method as an abc.abstractmethod. If I don't declare it as abstract, there is no way to enforce that the child class has to implement the method.
I hope my convoluted way of writing this makes sense to someone out there...
Thanks!
You probably should not redefine fun2 as an abstract method in the concrete class B. You are creating a set of rules for your interface, but immediately violating them when you do that.
Instead, either define a mix-in class or an additional ABC that C and D can inherit.
import abc

class A(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def fun1(self):
        pass

class A2(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def fun2(self):
        pass

class B(A):
    def fun1(self):
        print('hello')

class B2(A2):
    def fun2(self):
        print('world')

class C(B, B2):
    pass

class D(B, B2):
    pass
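A quick sketch of how this behaves (E is a hypothetical class I'm adding only to show the failure case; it is not part of the answer above):

c = C()
c.fun1()  # hello
c.fun2()  # world

# A subclass that inherits A2's interface but never implements fun2
# is still abstract and cannot be instantiated:
class E(B, A2):
    pass

E()  # raises TypeError because fun2 is still abstract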
Using abc, I can create abstract classes using the following:
from abc import ABC, abstractmethod

class A(ABC):
    @abstractmethod
    def foo(self):
        print('foo')

class B(A):
    pass

obj = B()
This will fail because B has not defined the method foo.
This mimics the abstract method functionality in Java.
I wanted to know whether Python also provides abstract class functionality where instantiation of a class is prevented without it having any abstract methods.
The conventional way to create an abstract class in Python is to raise the built-in exception NotImplementedError.
class A(object):
    def __init__(self):
        raise NotImplementedError('abstract base class')

class B(A):
    def __init__(self):
        # don't call A.__init__ here
        pass

b = B()
# a = A()  # This will fail.
Yes. You can.
If you want to not enforce method implementation:
Simply inherit from ABC but don't declare any method abstract, so none of them need to be implemented in its subclasses.
If you want the abstract class to enforce the implementation of all methods:
Decorate all of its methods with @abstractmethod.
If you want to enforce the implementation of a method that does not belong to an ABC:
Raise NotImplementedError in the method. This won't prevent instantiation, only usage. However, if you want to prevent instantiation, you should rather use ABCs.
You can also declare __init__ an abstractmethod, but generally this does not look very useful to me.
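A minimal sketch of that last option, assuming Python 3 and the standard abc module (class names are made up for illustration):

from abc import ABC, abstractmethod

class A(ABC):
    # Making __init__ abstract blocks instantiation of A even though
    # no other method is abstract.
    @abstractmethod
    def __init__(self):
        pass

class B(A):
    def __init__(self):
        # A concrete __init__ makes B instantiable again.
        pass

# A()  # raises TypeError: can't instantiate abstract class A
b = B()  # works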
PEP 3119 states that:
The @abstractmethod decorator should only be used inside a class body, and only for classes whose metaclass is (derived from) ABCMeta. Dynamically adding abstract methods to a class, or attempting to modify the abstraction status of a method or class once it is created, are not supported.
I cannot find, however, an explanation of why that is. Specifically, I do not notice a difference in behavior when using only @abstractmethod in a class that does not explicitly inherit from ABCMeta. In the following simple example, if I understand correctly, the proper way of doing things would be:
import six
from abc import ABCMeta
from abc import abstractmethod

class Base(six.with_metaclass(ABCMeta)):
    def __init__(self):
        print('Init abstract base')

    @abstractmethod
    def do_something(self):
        pass

class Subclass(Base):
    def __init__(self):
        super(Subclass, self).__init__()

    def do_something(self):
        print('Done.')

sub = Subclass()
sub.do_something()
However, if I let the Base class inherit simply from object, and only use the decorator when needed, I notice no change in behavior.
from abc import abstractmethod

class Base(object):
    def __init__(self):
        print('Init abstract base')

    @abstractmethod
    def do_something(self):
        pass

class Subclass(Base):
    def __init__(self):
        super(Subclass, self).__init__()

    def do_something(self):
        print('Done.')

sub = Subclass()
sub.do_something()
I have found this to be the case even on more complex architectures, so I wonder: when does the latter method fail?
You don't see any difference because in both versions your Subclass does implement the do_something abstractmethod.
Comment out the definition of do_something in the subclasses in both versions and you'll find that in the first case you get a TypeError when trying to instantiate the subclass (you'd also get one trying to instantiate the first version's Base class itself, FWIW). With the second version, you can instantiate both classes (which shouldn't be possible, since they are abstract) and call the abstract do_something method, which defeats one of the main points of ABCs.
You'll also miss quite a few other interesting features of ABCs FWIW...
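To illustrate, here is a sketch of the second (plain object) version with do_something left unimplemented in the subclass, as suggested above:

from abc import abstractmethod

class Base(object):  # no ABCMeta metaclass
    @abstractmethod
    def do_something(self):
        pass

class Subclass(Base):
    pass  # do_something deliberately not implemented

# Both calls succeed even though the classes are "abstract" on paper:
b = Base()
s = Subclass()
s.do_something()  # silently runs the empty body instead of raising TypeError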
So, I think the code probably explains what I'm trying to do better than I can in words, so here goes:
import abc

class foo(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def bar(self):
        pass

class bar_for_foo_mixin(object):
    def bar(self):
        print "This should satisfy the abstract method requirement"

class myfoo(foo, bar_for_foo_mixin):
    def __init__(self):
        print "myfoo __init__ called"
        self.bar()

obj = myfoo()
The result:
TypeError: Can't instantiate abstract class myfoo with abstract methods bar
I'm trying to get the mixin class to satisfy the requirements of the abstract/interface class. What am I missing?
Shouldn't the inheritance be the other way round? In the MRO, foo currently comes before bar_for_foo_mixin, so the abstract bar is found first and Python rightfully complains. With class myfoo(bar_for_foo_mixin, foo) it should work.
And I am not sure if your class design is the right way to do it. Since you use a mixin for implementing bar it might be better not to derive from foo and just register it with the 'foo' class (i.e. foo.register(myfoo)). But this is just my gut feeling.
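A minimal sketch of that register() idea, in the same Python 2 style as the question (note that registration only makes myfoo a virtual subclass, so isinstance checks pass but nothing is enforced by ABCMeta):

import abc

class foo(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def bar(self):
        pass

class bar_for_foo_mixin(object):
    def bar(self):
        print "This should satisfy the abstract method requirement"

class myfoo(bar_for_foo_mixin):
    def __init__(self):
        print "myfoo __init__ called"
        self.bar()

foo.register(myfoo)

obj = myfoo()               # works: no abstract machinery in myfoo's bases
print isinstance(obj, foo)  # True, thanks to the registration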
For completeness, here is the documentation for ABCs.
I think (tested in a similar case) that reversing the base classes works:
class myfoo(bar_for_foo_mixin, foo):
    def __init__(self):
        print "myfoo __init__ called"
        self.bar()
So in the MRO it finds a concrete version of bar() before it finds the abstract one. No idea if this is actually what happens in the background, though.
Cheers, Lars
PS: the code that worked in python 2.7 (python 3 has a different way to set metaclasses) was:
import abc

class A(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def do(self):
        pass

class B(object):
    def do(self):
        print "do"

class C(B, A):
    pass

c = C()
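For reference, a Python 3 sketch of the same idea would pass the metaclass in the class header instead of setting __metaclass__:

import abc

class A(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def do(self):
        pass

class B(object):
    def do(self):
        print("do")

class C(B, A):
    pass

c = C()   # fine: the concrete B.do comes before A.do in the MRO
c.do()    # prints "do"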