In Python, can I define an interface (abstract class) by inheriting from another abstract class?
If I try:
import abc

ABC = abc.ABCMeta('ABC', (object,), {})

class interface(ABC):
    @abc.abstractmethod
    def method(self, message):
        return

class InterfaceExtended(ABC, interface):
    @abc.abstractmethod
    def NewMethod(self, message):
        return
I get an error on the "InterfaceExtended" class :
TypeError: Error when calling the metaclass bases
Cannot create a consistent method resolution
order (MRO) for bases ABC, Interface
Don't inherit from ABC in your second class. The interface class it derives from already inherits from ABC.
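A fixed version of the hierarchy (using the abc.ABC convenience base available since Python 3.4 instead of building it by hand):

```python
import abc

class Interface(abc.ABC):
    @abc.abstractmethod
    def method(self, message):
        return

class InterfaceExtended(Interface):  # Interface already brings in ABC
    @abc.abstractmethod
    def NewMethod(self, message):
        return
```

Both classes remain abstract, so neither can be instantiated until a concrete subclass implements method and NewMethod.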
I'm creating a module with a function that works as a factory for the implemented classes. It works by registering the classes with the help of a metaclass (a pattern I have just copied from here).
_registry = {}

def register_class(cls):
    _registry[cls.__name__] = cls

class Meta(type):
    def __new__(meta, name, bases, class_dict):
        cls = type.__new__(meta, name, bases, class_dict)
        register_class(cls)
        return cls

def factory(name):
    return _registry[name]()
This works so far.
Now, a particularity I have is that I'm implementing classes that share a lot of functionality, so I define an abstract base class that implements most of the shared logic and a large number of derived classes that refine certain particularities. The problem is that this leads to a metaclass conflict, as the metaclass of the derived classes would be both ABCMeta and Meta:
from abc import ABC, abstractmethod

_registry = {}

def register_class(cls):
    _registry[cls.__name__] = cls

class Meta(type):
    def __new__(meta, name, bases, class_dict):
        cls = type.__new__(meta, name, bases, class_dict)
        register_class(cls)
        return cls

def factory(name):
    return _registry[name]()

class Base(ABC):
    pass

class Derived1(Base, metaclass=Meta):
    pass
TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases
How can I work around this conflict?
Simply combine the metaclasses you need into a suitable derived metaclass, and use that one as your metaclass. In this case, just derive your metaclass from ABCMeta instead of type:
from abc import ABCMeta

class Meta(ABCMeta):
    def __new__(meta, name, bases, class_dict):
        cls = super().__new__(meta, name, bases, class_dict)
        register_class(cls)
        return cls
Note the importance of using super().__new__ instead of type.__new__ - that is what makes our metaclass combinable with other metaclasses (as long as none of them interfere directly with the same attributes/logic your own metaclass works on).
So, if some of your classes need ABCMeta and others should use your metaclass alone, you can, just by replacing the call to type.__new__ with super().__new__, use your metaclass as a mixin and combine it with ABCMeta as needed:
from abc import ABCMeta

class Meta(type):
    ...

class MetaWithAbc(Meta, ABCMeta):
    pass

class Base(metaclass=MetaWithAbc):
    pass
...
Also, since Python 3.6, the need for metaclasses has been reduced considerably by the introduction of the special __init_subclass__ method.
To simply add a class to a registry, as in this case, there is no need for a custom metaclass if you have a common base class: __init_subclass__ is called once for each subclass, at the time it is created:
from abc import ABC, abstractmethod

_registry = {}

def register_class(cls):
    _registry[cls.__name__] = cls

def factory(name):
    return _registry[name]()

class Base(ABC):
    def __init_subclass__(cls, **kwargs):
        # always make it collaborative:
        super().__init_subclass__(**kwargs)
        register_class(cls)

class Derived1(Base):
    pass
I want to create a class that has a nested class defining some contract in Python. A concrete example is a typed config object. My attempt at this is below:
from typing import Mapping
from abc import ABCMeta, abstractmethod

class BaseClass(metaclass=ABCMeta):
    # If you want to implement BaseClass, you must also implement BaseConfig
    class BaseConfig(metaclass=ABCMeta):
        @abstractmethod
        def to_dict(self) -> Mapping:
            """Converts the config to a dictionary"""
But unfortunately I can instantiate a subclass of BaseClass without implementing BaseConfig:
class Foo(BaseClass):
    pass

if __name__ == "__main__":
    foo = Foo()
Is there some way to enforce that a subclass must implement an inner class, too?
It doesn't seem like this is currently possible. The closest thing is to create two abstract classes (one for the outer class and one for the inner class) and to make the implementer provide the concrete inner class as a class attribute; e.g.:
from abc import ABCMeta, abstractmethod

class Inner(metaclass=ABCMeta):
    @abstractmethod
    def __str__(self):
        pass

class Outer(metaclass=ABCMeta):
    inner_cls = Inner

    def shout(self, *args, **kwargs):
        inner = self.inner_cls(*args, **kwargs)
        print(f"My inner is {inner}!!!")

class FooInner(Inner):
    def __str__(self):
        return "FooInner"

class FooOuter(Outer):
    inner_cls = FooInner
This requires Inner to have at least one abstract method; otherwise it could be instantiated and would serve as a default inner_cls implementation.
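To see the enforcement in action, here is the same sketch with shout returning its string (rather than printing it) so the result is easy to check; LazyOuter is a hypothetical subclass that forgets to override inner_cls:

```python
from abc import ABCMeta, abstractmethod

class Inner(metaclass=ABCMeta):
    @abstractmethod
    def __str__(self):
        pass

class Outer(metaclass=ABCMeta):
    inner_cls = Inner

    def shout(self, *args, **kwargs):
        inner = self.inner_cls(*args, **kwargs)
        return f"My inner is {inner}!!!"

class FooInner(Inner):
    def __str__(self):
        return "FooInner"

class FooOuter(Outer):
    inner_cls = FooInner

class LazyOuter(Outer):  # forgot to override inner_cls
    pass
```

Calling LazyOuter().shout() raises TypeError, because the abstract Inner cannot be instantiated; the check happens only at call time, not at class-definition time.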
I have one base class with an abstract method and a subclass which implements this method. How can I determine at runtime whether a method of an object is still abstract (without checking the type of the object or calling the method)?
class Base:
    @abc.abstractmethod
    def someAbstractMethod(self):
        raise NotImplementedError("Not implemented yet.")

class Subclass(Base):
    def someAbstractMethod(self):
        some_operations

objects = [Base(), Subclass(), Base(), Subclass(), Subclass(), ...]

for object in objects:
    # here I want to check if the method is still abstract or not
Python prevents the creation of instances for classes with abstract methods. So just the fact that you have an instance means you have no abstract methods.
You do, however, have to use the ABCMeta metaclass to properly trigger this behaviour:
class Base(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def someAbstractMethod(self):
        raise NotImplementedError("Not implemented yet.")
You can also inherit from abc.ABC to get the same metaclass via a base class.
If you want to see which methods of a class are still abstract, use the __abstractmethods__ attribute; it is a frozenset of all names that are still abstract:
>>> import abc
>>> class Base(metaclass=abc.ABCMeta):
...     @abc.abstractmethod
...     def someAbstractMethod(self):
...         raise NotImplementedError("Not implemented yet.")
...
>>> class Subclass(Base):
...     def someAbstractMethod(self):
...         some_operations
...
>>> Base.__abstractmethods__
frozenset({'someAbstractMethod'})
>>> Subclass.__abstractmethods__
frozenset()
All that the @abc.abstractmethod decorator does is set a __isabstractmethod__ attribute on the function object; it is the metaclass that then uses those attributes.
So if you dig in too deep, or are using a third-party library that has forgotten to use the ABCMeta metaclass, you can test for that attribute directly:
>>> class Base:  # note, no metaclass!
...     @abc.abstractmethod
...     def someAbstractMethod(self):
...         raise NotImplementedError("Not implemented yet.")
...
>>> getattr(Base().someAbstractMethod, '__isabstractmethod__', False)
True
If you need to 'repair' such a broken abstract base class, you'd need to subclass it with ABCMeta mixed in as the metaclass, and add a __abstractmethods__ frozenset to the class you are inheriting from:
BaseClass.__abstractmethods__ = frozenset(
    name for name, attr in vars(BaseClass).items()
    if getattr(attr, '__isabstractmethod__', False))

class DerivedClass(BaseClass, metaclass=abc.ABCMeta):
    # ...
Now DerivedClass is a proper ABC-derived class where any abstract methods are properly tracked and acted on.
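A self-contained sketch of this repair (the BrokenBase class and its must_override method are made up for the demo):

```python
import abc

class BrokenBase:  # e.g. a third-party class that forgot the ABCMeta metaclass
    @abc.abstractmethod
    def must_override(self):
        ...

# Collect the methods still marked abstract and attach the frozenset
BrokenBase.__abstractmethods__ = frozenset(
    name for name, attr in vars(BrokenBase).items()
    if getattr(attr, '__isabstractmethod__', False))

class DerivedClass(BrokenBase, metaclass=abc.ABCMeta):
    pass

class Fixed(DerivedClass):
    def must_override(self):
        return 42
```

DerivedClass now refuses instantiation until must_override is implemented, as it is in Fixed.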
You're not using abc correctly here (well, not as it's intended to be used, at least). Your Base class should use abc.ABCMeta as its metaclass, which would prevent instantiation of child classes that don't implement all abstract methods. Otherwise, just using the abc.abstractmethod decorator is mostly useless.
Python 2.7:

class Base(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def someAbstractMethod(self):
        # no need to raise an exception here
        pass

Python 3.x:

class Base(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def someAbstractMethod(self):
        # no need to raise an exception here
        pass
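With the metaclass in place, instantiating a class that still has abstract methods fails immediately, while a complete subclass works; a quick check (the Subclass implementation is illustrative):

```python
import abc

class Base(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def someAbstractMethod(self):
        pass

class Subclass(Base):
    def someAbstractMethod(self):
        return "implemented"
```

Base() raises TypeError, while Subclass() works normally.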
from abc import abstractmethod, ABCMeta

class AbstractBase(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def must_implement_this_method(self):
        raise NotImplementedError()

class ConcreteClass(AbstractBase):
    def extra_function(self):
        print('hello')

    # def must_implement_this_method(self):
    #     print("Concrete implementation")

d = ConcreteClass()  # no error
d.extra_function()
I'm on Python 3.4. I want to define an abstract base class that defines some functions that need to be implemented by its subclasses. But Python doesn't raise a NotImplementedError when the subclass does not implement the function...
The syntax for the declaration of metaclasses has changed in Python 3. Instead of the __metaclass__ field, Python 3 uses a keyword argument in the base-class list:
import abc

class AbstractBase(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def must_implement_this_method(self):
        raise NotImplementedError()
Calling d = ConcreteClass() will raise an exception now, because a class whose metaclass derives from ABCMeta cannot be instantiated unless all of its abstract methods and properties are overridden (for more information see abc.abstractmethod):
TypeError: Can't instantiate abstract class ConcreteClass with abstract methods
must_implement_this_method
Hope this helps :)
I have a Python system consisting of around 9-10 classes all implementing most of a fat duck-typed interface and used interchangeably by a large collection of modules. I'm trying to refactor the classes into a core, explicit (i.e. ABC) interface and peripheral functionality, following separation of responsibility, but in order to do that I need to be able to tell when the consumer modules are calling methods outside the core interface.
Suppose I have an ABC with abstract methods:
from abc import ABCMeta, abstractmethod

class MyABC:
    __metaclass__ = ABCMeta

    @abstractmethod
    def foo(self):
        pass
I also have a class implementing those abstract methods as well as other methods:
class MyClass(MyABC):
    def foo(self):
        pass

    def bar(self):
        pass

instance = MyClass()

>>> isinstance(instance, MyABC)
True
How can I ensure that when I pass instance to a method do_something it only uses the methods that are part of MyABC (in this case foo) and not any other methods (bar)? In a static-typed language (e.g. C++) I could pass do_something a pointer of the ABC type; is there some sort of wrapper available in Python that will restrict method access similarly?
Short and simple answer: no.
There is no enforced concept of private methods/variables in Python, as described here in detail.
In Python this is handled by convention.
And if you really want to go into the deep internals, check out this thread.
This is what I came up with:
class ABCGuard(object):
    def __init__(self, obj, *abcs):
        if any(not isinstance(obj, abc) for abc in abcs):
            raise ValueError("{0} must implement {1}"
                             .format(obj.__class__.__name__,
                                     ', '.join(abc.__name__ for abc in abcs
                                               if not isinstance(obj, abc))))
        self.__obj = obj
        self.__abcs = abcs
        classname = '{0}{{{1}}}'.format(obj.__class__.__name__,
                                        ', '.join(abc.__name__ for abc in abcs))
        self.__class__ = type(classname, (ABCGuard, ) + abcs, {})

    def __getattribute__(self, name):
        if name.startswith('_ABCGuard__') or (name.startswith('__') and
                                              name.endswith('__')):
            return super(ABCGuard, self).__getattribute__(name)
        elif any(name in abc.__abstractmethods__ for abc in self.__abcs):
            return getattr(self.__obj, name)
        else:
            raise AttributeError("%r object has no attribute %r" %
                                 (self.__class__.__name__, name))

    def __dir__(self):
        return [x for abc in self.__abcs for x in abc.__abstractmethods__]
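For completeness, here is how the guard is used. The demo below restates the guard class together with minimal MyABC/MyClass definitions (using the Python 3 metaclass syntax, since the guard relies on __abstractmethods__ being populated) so it can run on its own:

```python
from abc import ABCMeta, abstractmethod

class ABCGuard(object):
    def __init__(self, obj, *abcs):
        if any(not isinstance(obj, abc) for abc in abcs):
            raise ValueError("{0} must implement {1}"
                             .format(obj.__class__.__name__,
                                     ', '.join(abc.__name__ for abc in abcs
                                               if not isinstance(obj, abc))))
        self.__obj = obj
        self.__abcs = abcs
        classname = '{0}{{{1}}}'.format(obj.__class__.__name__,
                                        ', '.join(abc.__name__ for abc in abcs))
        self.__class__ = type(classname, (ABCGuard, ) + abcs, {})

    def __getattribute__(self, name):
        if name.startswith('_ABCGuard__') or (name.startswith('__') and
                                              name.endswith('__')):
            return super(ABCGuard, self).__getattribute__(name)
        elif any(name in abc.__abstractmethods__ for abc in self.__abcs):
            # only names that are abstract in one of the ABCs pass through
            return getattr(self.__obj, name)
        raise AttributeError("%r object has no attribute %r" %
                             (self.__class__.__name__, name))

class MyABC(metaclass=ABCMeta):
    @abstractmethod
    def foo(self):
        pass

class MyClass(MyABC):
    def foo(self):
        return "foo"

    def bar(self):
        return "bar"

guarded = ABCGuard(MyClass(), MyABC)
```

guarded.foo() is forwarded to the wrapped instance, isinstance(guarded, MyABC) still holds, and guarded.bar raises AttributeError even though MyClass defines it.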