Is it a good practice to define properties in an interface like this?
class MyInterface(object):
    def required_method(self):
        raise NotImplementedError

    @property
    def required_property(self):
        raise NotImplementedError
I'd use an ABC for that, but yes; you can even use @abstractproperty for that very use case.
from abc import ABCMeta, abstractproperty, abstractmethod

class MyInterface(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def required_method(self):
        pass

    @abstractproperty
    def required_property(self):
        pass
Subclasses of the ABC are still free to implement required_property as an attribute instead; the ABC will only verify the existence of required_property, not what type it is.
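In modern Python 3 syntax, this point is easy to demonstrate: a plain class attribute named required_property is enough to satisfy the ABC, because the machinery only checks that the name is no longer abstract. A minimal sketch:

```python
from abc import ABC, abstractmethod

class MyInterface(ABC):
    @property
    @abstractmethod
    def required_property(self):
        ...

class Concrete(MyInterface):
    # a plain class attribute overrides the abstract property;
    # the ABC only checks that the name is no longer abstract
    required_property = 42

c = Concrete()
print(c.required_property)  # 42
```

Instantiating MyInterface itself still fails with a TypeError, as expected.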
Related
I would like to create an abstract method in parent class which would be overridden in subclasses. This method would print all methods in the given subclass which start with 'on_'.
from abc import ABC, abstractmethod

class abstract_class(ABC):
    @abstractmethod
    def get_all_on_methods(self):
        pass

class sub(abstract_class):
    an_object = sub()

    def get_all_on_methods(self):
        for attribute in dir(self):
            if attribute.startswith("on_"):
                print(attribute)

    def nothin(self):
        print("nothin")

    def on_fallback(self):
        raise NotImplementedError()

    def on_no(self):
        raise NotImplementedError()
sub.get_all_on_methods()
I have two problems. First, I have:
Unresolved reference 'sub'
Second, I don't know whether my approach is actually all that good.
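For what it's worth, a minimal working version of the idea (a sketch of the apparent intent, not the original code) moves the instantiation out of the class body, since `sub` does not exist until its class statement finishes, and calls the method on an instance:

```python
from abc import ABC, abstractmethod

class AbstractClass(ABC):
    @abstractmethod
    def get_all_on_methods(self):
        ...

class Sub(AbstractClass):
    def get_all_on_methods(self):
        # return the names instead of printing, so callers can reuse them
        return [name for name in dir(self) if name.startswith("on_")]

    def on_fallback(self):
        raise NotImplementedError()

    def on_no(self):
        raise NotImplementedError()

# instantiate only after the class body is complete
an_object = Sub()
print(an_object.get_all_on_methods())  # ['on_fallback', 'on_no']
```

Since the `dir()`-scanning loop is identical for every subclass, it could also live as a concrete method on the base class instead of being abstract.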
I have a class with the following implementation:
class Device(AbstractDevice):
    @property
    def value(self):
        return 1
it uses this ABC:
from abc import ABC
from abc import abstractmethod

class AbstractDevice(ABC):
    @property
    @abstractmethod
    def value(self):
        pass

    @value.setter
    @abstractmethod
    def value(self, new_value):
        pass
The problem is, Python doesn't seem to enforce the setter part. Am I missing some decorator, or is that by design?
What I would like to happen is if a user implements a class using my AbstractDevice the way I've done here, they'll get an error saying they need to implement the value.setter. Is that at all possible?
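ABCMeta only records names in `__abstractmethods__`, so any override of the name `value` clears the whole requirement, setter included. One workaround (a sketch, not a standard `abc` feature) is an `__init_subclass__` hook that additionally checks that the override is a property with a setter:

```python
from abc import ABC, abstractmethod

class AbstractDevice(ABC):
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # extra check beyond what ABCMeta does: any override of `value`
        # must be a property that also defines a setter
        attr = cls.__dict__.get("value")
        if attr is not None and (not isinstance(attr, property) or attr.fset is None):
            raise TypeError(
                f"{cls.__name__} must implement value as a property with a setter"
            )

    @property
    @abstractmethod
    def value(self):
        ...

    @value.setter
    @abstractmethod
    def value(self, new_value):
        ...

class GoodDevice(AbstractDevice):
    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new_value):
        self._value = new_value

d = GoodDevice()
d.value = 1
print(d.value)  # 1
```

A subclass that defines only the getter now fails with a TypeError at class definition time rather than silently dropping the setter requirement.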
I have 2 classes inheriting from ABC, and a third class inheriting from both, each in a different file. I tried to provide ABCMeta as the metaclass of the last class to resolve the conflict of metaclasses, but it fails with the same
"TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases"
Why does Python ignore the metaclass directive in this case, and how do I resolve it?
file A:
from abc import ABC, abstractmethod

class A(ABC):
    @abstractmethod
    def method1(self):
        pass
file B:
from abc import ABC, abstractmethod

class B(ABC):
    @abstractmethod
    def method2(self):
        pass
file C:
import A
import B

class C(A, B, metaclass=ABCMeta):
    def method1(self):
        pass

    def method2(self):
        pass
The problem stems from a wrong import: `import A` binds the *module* A, not the class A inside it, so `class C(A, B, ...)` tries to inherit from two module objects.
file C should be:
from A import A
from B import B

class C(A, B):
    def method1(self):
        pass

    def method2(self):
        pass
Credit should go to @Matthias and @Giacomo Alzetta, who pointed out that the MCVE works for them.
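Collapsed into a single file for illustration, the corrected structure needs no explicit `metaclass=` argument at all: both bases already share ABCMeta (inherited from ABC), so there is no conflict to resolve.

```python
from abc import ABC, abstractmethod

class A(ABC):
    @abstractmethod
    def method1(self):
        ...

class B(ABC):
    @abstractmethod
    def method2(self):
        ...

# both bases share ABCMeta, so no metaclass= argument is needed
class C(A, B):
    def method1(self):
        return "method1"

    def method2(self):
        return "method2"

c = C()
print(c.method1(), c.method2())
```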
I have an interface class which defines the requirements for an active "in-use" class:
from abc import ABC, abstractmethod

class Portfolio(ABC):
    @abstractmethod
    def update_portfolio(self):
        raise NotImplementedError

    @abstractmethod
    def update_from_fill(self):
        raise NotImplementedError

    @abstractmethod
    def check_signal(self, signal_event):
        raise NotImplementedError
The methods update_portfolio and update_from_fill will be the same in 99% of cases; only check_signal will vary. Therefore, to avoid writing the same code again and again, I have defined a base class with default implementations of update_portfolio and update_from_fill:
class BaseBacktestPortfolio(Portfolio):
    def __init__(self, ...):
        ...

    def update_portfolio(self, ...):
        ...

    def update_from_fill(self, ...):
        ...
Then, finally, I have a class inheriting from BaseBacktestPortfolio which provides the concrete implementation of the check_signal method:
class USBacktestPortfolio(BaseBacktestPortfolio):
    def check_signal(self, ...):
        ...
Now, the problem is that my editor complains about the BaseBacktestPortfolio class not having all the required abstract methods. I could ignore this, of course, but the perfect scenario would be if I could make sure that it is not possible to instantiate an object from the BaseBacktestPortfolio class.
Is this possible? And/or is there a more correct way to implement a structure like this?
I could ignore this, of course, but the perfect scenario would be if I could make sure that it is not possible to instantiate an object from the BaseBacktestPortfolio class.
That is the case in your example already:
>>> BaseBacktestPortfolio.mro()
[__main__.BaseBacktestPortfolio, __main__.Portfolio, abc.ABC, object]
>>> BaseBacktestPortfolio()
TypeError: Can't instantiate abstract class BaseBacktestPortfolio with abstract methods check_signal
Since ABC and ABCMeta are just regular types, their features are inherited. This includes their guards against instantiating incomplete classes. Your BaseBacktestPortfolio already is an abstract class.
The warning from your IDE/linter/... exists specifically to warn you that instantiating BaseBacktestPortfolio is not possible.
You can also make BaseBacktestPortfolio an abstract class explicitly.
from abc import ABC, abstractmethod

class Portfolio(ABC):
    @abstractmethod
    def update_portfolio(self):
        pass

    @abstractmethod
    def update_from_fill(self):
        pass

    @abstractmethod
    def check_signal(self, signal_event):
        pass

class BaseBacktestPortfolio(Portfolio, ABC):
    def update_portfolio(self):
        print("updated portfolio")

    def update_from_fill(self):
        print("update from fill")

    @abstractmethod
    def check_signal(self):
        pass

class USBacktestPortfolio(BaseBacktestPortfolio):
    def check_signal(self):
        print("checked signal")
Also notice that you don't need raise NotImplementedError inside an abstract method; you can just use pass. It's more Pythonic. :)
from abc import abstractmethod, ABCMeta

class AbstractBase(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def must_implement_this_method(self):
        raise NotImplementedError()

class ConcreteClass(AbstractBase):
    def extra_function(self):
        print('hello')

    # def must_implement_this_method(self):
    #     print("Concrete implementation")

d = ConcreteClass()  # no error
d.extra_function()
I'm on Python 3.4. I want to define an abstract base class that defines some functions that need to be implemented by its subclasses. But Python doesn't raise a NotImplementedError when a subclass does not implement the function...
The syntax for the declaration of metaclasses has changed in Python 3. Instead of the __metaclass__ field, Python 3 uses a keyword argument in the base-class list:
import abc

class AbstractBase(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def must_implement_this_method(self):
        raise NotImplementedError()
Calling d = ConcreteClass() will raise an exception now, because a class whose metaclass is derived from ABCMeta cannot be instantiated unless all of its abstract methods and properties are overridden (for more information, see the documentation for abc.abstractmethod):
TypeError: Can't instantiate abstract class ConcreteClass with abstract methods
must_implement_this_method
Hope this helps :)
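Putting it together with the abc.ABC shorthand (also available since Python 3.4), the missing override is now caught at instantiation time:

```python
import abc

class AbstractBase(abc.ABC):  # equivalent to metaclass=abc.ABCMeta
    @abc.abstractmethod
    def must_implement_this_method(self):
        raise NotImplementedError()

class ConcreteClass(AbstractBase):
    def extra_function(self):
        print('hello')

try:
    d = ConcreteClass()
except TypeError as exc:
    # Can't instantiate abstract class ConcreteClass ...
    print(exc)
```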