Using abc, I can create abstract classes using the following:
from abc import ABC, abstractmethod

class A(ABC):
    @abstractmethod
    def foo(self):
        print('foo')

class B(A):
    pass

obj = B()
This will fail because B has not defined the method foo.
This mimics the abstract method functionality in Java.
I wanted to know whether Python also has abstract class functionality that prevents instantiation of a class even when it does not define any abstract methods.
The conventional way to create an abstract class in Python is to raise the built-in exception NotImplementedError.
class A(object):
    def __init__(self):
        raise NotImplementedError('abstract base class')

class B(A):
    def __init__(self):
        # don't call A.__init__ here.
        pass

b = B()
# a = A()  # This will fail.
Yes. You can.
If you don't want to enforce method implementation:
Simply inherit from ABC but don't declare any method abstract, so nothing needs to be implemented in subclasses.
If you want the abstract class to enforce the implementation of all methods:
Decorate all of them with abstractmethod.
If you want to enforce the implementation of a method that does not belong to an ABC:
Raise NotImplementedError in the method. This won't prevent instantiation, only usage. If you want to prevent instantiation itself, you should use ABCs instead.
You can also declare __init__ an abstractmethod, but generally that does not look very useful to me.
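A minimal sketch of the difference (the class names here are just illustrative):

from abc import ABC

class WithoutEnforcement(ABC):
    # Inherits from ABC but declares no abstract methods, so it (and its
    # subclasses) can be instantiated freely.
    def helper(self):
        return 42

class EnforcedAtCallTime:
    # Not an ABC: instantiation succeeds, but calling the method fails.
    def must_override(self):
        raise NotImplementedError('subclasses must implement must_override')

WithoutEnforcement()        # works: no abstract methods declared
obj = EnforcedAtCallTime()  # works: instantiation is not prevented
# obj.must_override()       # would raise NotImplementedError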
Related
In Python, how can I differentiate between a concrete subclass and a subclass which is still abstract (i.e. not all abstract methods have been implemented)?
Consider the following:
import abc

class A(abc.ABC):
    @abc.abstractmethod
    def do_something(self):
        pass

class B(A):
    def do_something(self):
        print('I am doing')

class C(A):
    pass

for subclass in A.__subclasses__():
    if is_concrete_class(subclass):
        subclass().do_something()
    else:
        print('subclass is still abstract')
What is the implementation of is_concrete_class?
I could attempt to instantiate each subclass given by __subclasses__() and catch TypeError: Can't instantiate abstract class <class_name> with abstract methods <...>, but TypeError seems too broad of an exception to catch.
I didn't find anything useful in Python's ABC Library.
There's a function for that in the inspect module: inspect.isabstract(some_object) will tell you whether an object is an abstract class.
It returns True for an abstract class and False for anything else, including abstract methods, classes that inherit from abc.ABC but are still concrete because they don't have abstract methods, and objects that have nothing to do with abstract classes, like 3.
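For example, a sketch of is_concrete_class built on it and plugged into the loop from the question (reusing the A/B/C classes above):

import abc
import inspect

class A(abc.ABC):
    @abc.abstractmethod
    def do_something(self):
        pass

class B(A):
    def do_something(self):
        print('I am doing')

class C(A):
    pass

def is_concrete_class(cls):
    # A concrete class is simply one that is not abstract.
    return not inspect.isabstract(cls)

for subclass in A.__subclasses__():
    if is_concrete_class(subclass):
        subclass().do_something()   # only B reaches this branch
    else:
        print(subclass.__name__ + ' is still abstract')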
There is a class A (in another package) that defines an abstract method my_abstract_method() using the old way of defining an abstract class, by setting __metaclass__ = ABCMeta.
My class B is a subclass of A. I would like B to inherit A's my_abstract_method() and require all subclasses of B to implement it.
How would I go about doing this?
Example:
from abc import abstractmethod, ABC, ABCMeta

class A(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def my_abstract_method(self):
        return

class B(A, metaclass=ABCMeta):
    pass

B()
When executed, B() is created successfully.
How can I get B() to fail because the abstract method of A is not defined?
EDIT:
It's hard to grasp what the exact issue here is, apart from the obvious, so the solution below might or might not be relevant for the actual implementation.
The solution below introduces another layer of abstraction, which breaks the principle "the solution to inheritance is not more inheritance", but this might be a special case where it is useful for your scenario.
from abc import abstractmethod, ABC, ABCMeta

class A(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def my_abstract_method(self):
        return

class C(A, ABC):
    @abstractmethod
    def new_abstract_method(self):
        pass

class B(C):
    pass

B()
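If you want the script to report the failure rather than ending in a traceback on the bare B() call, you could replace that last line with a small check (illustrative only):

try:
    B()
except TypeError as exc:
    # e.g. "Can't instantiate abstract class B with abstract method new_abstract_method"
    print(exc)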
If I have a parent class with two methods:
class Parent():
    @abstractmethod
    @staticmethod
    def functionA():
        pass

    def functionB():
        return __class__.functionA() + 1
And I implement a child class:
class Child(Parent):
    def functionA():  # this function is different for each kind of child
        return 3
In the end, the purpose of the child classes would be to call functionB() only.
Does this work? Of course, I could place functionB() in each child class and make it work that way, but since functionB() is the same for every kind of child class, I don't want to repeat that code in each one.
Also, is my use of __class__ appropriate here?
First, functionB itself should be a class method.
@classmethod
def functionB(cls):
    return cls.functionA() + 1
Second, you still have to decorate functionA as a static method in each child class; otherwise, you are replacing the inherited static method with an instance method.
class Child(Parent):
    @staticmethod
    def functionA():
        return 3
Here is a working version:
from abc import ABC, abstractmethod

class Parent(ABC):
    @staticmethod  # not needed, but is documentation
    @abstractmethod
    def a(): pass

    @classmethod
    def b(cls): return cls.a() + 1

class Child(Parent):
    @staticmethod
    def a(): return 3
We can test it:
>>> c = Child()
>>> c.b()
4
Noteworthy things:
We need to use the ABC base class (or the ABCMeta metaclass) in order to have access to the abstractmethod decorator, as explained in the abc module documentation.
__class__ is not a keyword; it is an attribute of objects. We cannot just do things like __class__.a() because there is nothing to get the __class__ from. We could address this using the self parameter of an ordinary method, but what we are really trying to do here is to get behaviour that doesn't require an instance, yet depends on which derived class we are using. And that is what classmethod is for, and why there are separate classmethod and staticmethod decorators.
When you use classmethod, the class will be passed as a parameter, like how the instance is passed when you use a normal method. By convention, we name it cls. For more information about classmethod, please see this excellent talk by Raymond Hettinger (on the Python dev team).
Our implementations of the abstract method must also be decorated in the child classes, because they are still not ordinary methods - so Python needs to know not to pass an instance.
The staticmethod decoration on the abstract method needs to be listed first, before the abstractmethod decoration, for technical reasons. It effectively does nothing here; the abstractmethod would already behave like a staticmethod when we call e.g. Parent.a() (no instance; we can't create one anyway, since it's an abstract class).
We could also use classmethod instead of staticmethod for the a methods. This would allow children of Child to inherit the Child behaviour without explicitly writing their own a implementations. In this case, we would want an explicit classmethod decoration on the base abstract method, rather than staticmethod; and of course we would need to add cls parameters to each a implementation.
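A sketch of that classmethod variant, keeping the same names as above and reworking them under that assumption:

from abc import ABC, abstractmethod

class Parent(ABC):
    @classmethod
    @abstractmethod
    def a(cls):
        pass

    @classmethod
    def b(cls):
        return cls.a() + 1

class Child(Parent):
    @classmethod
    def a(cls):
        return 3

class GrandChild(Child):
    # Inherits Child's a() without redefining it.
    pass

print(Child.b())       # 4
print(GrandChild.b())  # 4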
PEP 3119 states that:
The @abstractmethod decorator should only be used inside a class body, and only for classes whose metaclass is (derived from) ABCMeta. Dynamically adding abstract methods to a class, or attempting to modify the abstraction status of a method or class once it is created, are not supported.
I cannot find, however, an explanation of why that is. Specifically, I do not notice any difference in behavior when using @abstractmethod alone in a class whose metaclass is not derived from ABCMeta. In the following simple example, if I understand correctly, the proper way of doing things would be:
import six
from abc import ABCMeta
from abc import abstractmethod

class Base(six.with_metaclass(ABCMeta)):
    def __init__(self):
        print('Init abstract base')

    @abstractmethod
    def do_something(self):
        pass

class Subclass(Base):
    def __init__(self):
        super(Subclass, self).__init__()

    def do_something(self):
        print('Done.')

sub = Subclass()
sub.do_something()
However, if I let the Base class inherit simply from object, and only use the decorator when needed, I notice no change in behavior.
from abc import abstractmethod

class Base(object):
    def __init__(self):
        print('Init abstract base')

    @abstractmethod
    def do_something(self):
        pass

class Subclass(Base):
    def __init__(self):
        super(Subclass, self).__init__()

    def do_something(self):
        print('Done.')

sub = Subclass()
sub.do_something()
I have found this to be the case even on more complex architectures, so I wonder: when does the latter method fail?
You don't see any difference because your Subclass does implement the do_something abstract method in both versions.
Comment out the definition of do_something in the subclasses of both versions and you'll find that in the first case you get a TypeError when trying to instantiate the subclass; you'd also get one trying to instantiate the first version's Base class itself, FWIW. With the second version, you can instantiate both classes (which shouldn't be possible since they are abstract) and call the abstract do_something method, which defeats one of the main points of ABCs.
You'll also miss quite a few other interesting features of ABCs FWIW...
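A minimal sketch of that difference, with do_something left unimplemented in the subclass (the class names here are mine):

from abc import ABC, abstractmethod

class WithMeta(ABC):
    @abstractmethod
    def do_something(self):
        pass

class PlainBase(object):
    @abstractmethod
    def do_something(self):
        pass

class SubWithMeta(WithMeta):
    pass

class SubPlain(PlainBase):
    pass

try:
    SubWithMeta()              # blocked: do_something is still abstract
except TypeError as exc:
    print(exc)

SubPlain().do_something()      # no error: without ABCMeta the decorator is inert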
Let's assume that we have a Python class that makes use of the abc module to define an abstract attribute:
import abc

class A(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractproperty
    def test_attribute(self):
        raise NotImplementedError
Let's now define B, which subclasses A and adds a new method (test_method()), and C, which subclasses B and implements the abstract attribute originally declared in A:
class B(A):
    def test_method(self):
        pass

class C(B):
    def test_attribute(self):
        # Implement abstract attribute
        pass
Assuming that I would like to keep B abstract (non-instantiable), shall I redefine the abstract property (test_attribute) and the metaclass assignment also in B? Or is it enough to inherit them from A (as in the above code)?
I know that Python allows me to not redefine the abstract methods and thus inherit them from the parent class. Is this correct from a theoretical software engineering perspective?
I'm asking because, if I'm not wrong, other languages (such as Java) do not allow inheritance of abstract methods without reimplementing them as abstract...
You've pretty much got all the code there, so you can always test it and see if it works... but as a spoiler, your design is fine so long as C.test_attribute gets decorated with property.
If you try to make an instance of B, you'll have problems since the whole abstract interface hasn't been implemented, but it is fine to use B as a base class for C (and presumably other classes later...).
e.g.:
import abc

class A(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractproperty
    def foo(self):
        pass

class B(A):
    def bar(self):
        return "bar"

class C(B):
    @property
    def foo(self):
        return "foo"

print C().foo    # foo
print C().bar()  # bar
print B().foo    # TypeError
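Note that the __metaclass__ assignment and print statements above are the Python 2 spelling; a roughly equivalent Python 3 sketch (keeping the same class names) might look like this:

import abc

class A(abc.ABC):
    @property
    @abc.abstractmethod
    def foo(self):
        pass

class B(A):
    def bar(self):
        return "bar"

class C(B):
    @property
    def foo(self):
        return "foo"

print(C().foo)    # foo
print(C().bar())  # bar
B()               # TypeError: can't instantiate abstract class B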