Python 3 ignores metaclass directive? - python

I have two classes inheriting from ABC, and a third class inheriting from both, each in a different file. I tried to give the last class a metaclass of ABCMeta to resolve the conflict of metaclasses, but it still fails with the same
"TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases"
Why does Python ignore the metaclass directive in this case, and how can I resolve it?
file A:
from abc import ABC, abstractmethod

class A(ABC):
    @abstractmethod
    def method1(self):
        pass

file B:
from abc import ABC, abstractmethod

class B(ABC):
    @abstractmethod
    def method2(self):
        pass

file C:
from abc import ABCMeta
import A
import B

class C(A, B, metaclass=ABCMeta):
    def method1(self):
        pass
    def method2(self):
        pass

The problem stems from wrong imports: import A binds the module, not the class, so C ends up inheriting from two module objects rather than from A and B.
file C should be:
from A import A
from B import B

class C(A, B):
    def method1(self):
        pass
    def method2(self):
        pass

Credit should go to @Matthias and @Giacomo Alzetta, who pointed out that the MCVE works for them.
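A minimal single-file sketch of the corrected setup (in the question, A and B live in separate files; merging them here is only for illustration). With the classes, not the modules, as bases, both already share ABCMeta, so no explicit metaclass argument is needed:

```python
from abc import ABC, abstractmethod

class A(ABC):
    @abstractmethod
    def method1(self):
        pass

class B(ABC):
    @abstractmethod
    def method2(self):
        pass

# Both bases use ABCMeta, so there is no metaclass conflict.
class C(A, B):
    def method1(self):
        return "method1"

    def method2(self):
        return "method2"

c = C()  # instantiates fine: both abstract methods are implemented
print(type(C).__name__)  # ABCMeta
```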

Related

OOP in Python beginner issues, finding all methods of the same starting name

I would like to create an abstract method in parent class which would be overridden in subclasses. This method would print all methods in the given subclass which start with 'on_'.
from abc import ABC, abstractmethod

class abstract_class(ABC):
    @abstractmethod
    def get_all_on_methods(self):
        pass

class sub(abstract_class):
    an_object = sub()
    def get_all_on_methods(self):
        for attribute in dir(self):
            if attribute.startswith("on_"):
                print(attribute)
    def nothin(self):
        print("nothin")
    def on_fallback(self):
        raise NotImplementedError()
    def on_no(self):
        raise NotImplementedError()

sub.get_all_on_methods()
I have two problems. First, I get:
Unresolved reference 'sub'
Second, I don't know whether my approach is actually all that good.
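A sketch of one possible fix (returning the names instead of printing them is an assumption, made so the result can be inspected): a class cannot reference itself inside its own body, so create the instance after the class statement finishes and call the method on that instance.

```python
from abc import ABC, abstractmethod

class abstract_class(ABC):
    @abstractmethod
    def get_all_on_methods(self):
        pass

class sub(abstract_class):
    def get_all_on_methods(self):
        # dir() lists every attribute, inherited ones included
        return [a for a in dir(self) if a.startswith("on_")]

    def nothin(self):
        print("nothin")

    def on_fallback(self):
        raise NotImplementedError()

    def on_no(self):
        raise NotImplementedError()

# instantiate only after the class body has finished executing
an_object = sub()
print(an_object.get_all_on_methods())  # ['on_fallback', 'on_no']
```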

Proper way to implement ABC SubClass

I have an Interface class which defines the requirements to an active "in-use" class:
from abc import ABC, abstractmethod

class Portfolio(ABC):
    @abstractmethod
    def update_portfolio(self):
        raise NotImplementedError
    @abstractmethod
    def update_from_fill(self):
        raise NotImplementedError
    @abstractmethod
    def check_signal(self, signal_event):
        raise NotImplementedError
The methods update_portfolio and update_from_fill will be the same in 99% of the required cases; only the check_signal method will vary. Therefore, to avoid writing the same code again and again, I have defined a base class with default implementations of update_portfolio and update_from_fill:
class BaseBacktestPortfolio(Portfolio):
    def __init__(self, ...):
        ...
    def update_portfolio(self, ...):
        ...
    def update_from_fill(self, ...):
        ...
Then, finally, I have a class inheriting from BaseBacktestPortfolio which provides the concrete implementation of the check_signal method:
class USBacktestPortfolio(BaseBacktestPortfolio):
    def check_signal(self, ...):
        ...
Now, the problem is that my editor complains about BaseBacktestPortfolio not having all the required abstract methods. I could ignore this, of course, but the perfect scenario would be if I could make sure that it is not possible to instantiate an object from the BaseBacktestPortfolio class.
Is this possible? And/or is there a more correct way to implement a structure like this?
I could ignore this, of course, but the perfect scenario would be if I could make sure that it is not possible to instantiate an object from the BaseBacktestPortfolio class.
That is the case in your example already:
>>> BaseBacktestPortfolio.mro()
[__main__.BaseBacktestPortfolio, __main__.Portfolio, abc.ABC, object]
>>> BaseBacktestPortfolio()
TypeError: Can't instantiate abstract class BaseBacktestPortfolio with abstract methods check_signal
Since ABC and ABCMeta are just regular types, their features are inherited. This includes their guards against instantiating incomplete classes. Your BaseBacktestPortfolio already is an abstract class.
The warning from your IDE/linter/... exists specifically to warn you that instantiating BaseBacktestPortfolio is not possible.
You can also make BaseBacktestPortfolio an abstract class:
from abc import ABC, abstractmethod

class Portfolio(ABC):
    @abstractmethod
    def update_portfolio(self):
        pass
    @abstractmethod
    def update_from_fill(self):
        pass
    @abstractmethod
    def check_signal(self, signal_event):
        pass

class BaseBacktestPortfolio(Portfolio, ABC):
    def update_portfolio(self):
        print("updated portfolio")
    def update_from_fill(self):
        print("update from fill")
    @abstractmethod
    def check_signal(self):
        pass

class USBacktestPortfolio(BaseBacktestPortfolio):
    def check_signal(self):
        print("checked signal")
Also notice that you don't need raise NotImplementedError inside an abstract method; you can just pass. It's more Pythonic :)
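A quick sketch confirming the guard, trimmed to the one method that matters (check_signal returning a string instead of printing is an assumption, so the behaviour can be checked):

```python
from abc import ABC, abstractmethod

class Portfolio(ABC):
    @abstractmethod
    def check_signal(self, signal_event):
        pass

class BaseBacktestPortfolio(Portfolio, ABC):
    def update_portfolio(self):
        print("updated portfolio")
    # check_signal is still abstract here

class USBacktestPortfolio(BaseBacktestPortfolio):
    def check_signal(self, signal_event):
        return "checked signal"

try:
    BaseBacktestPortfolio()  # still abstract -> refuses to instantiate
except TypeError as exc:
    print(exc)

print(USBacktestPortfolio().check_signal(None))  # checked signal
```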

Abstract base class is not enforcing function implementation

from abc import abstractmethod, ABCMeta

class AbstractBase(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def must_implement_this_method(self):
        raise NotImplementedError()

class ConcreteClass(AbstractBase):
    def extra_function(self):
        print('hello')
    # def must_implement_this_method(self):
    #     print("Concrete implementation")

d = ConcreteClass()  # no error
d.extra_function()
I'm on Python 3.4. I want to define an abstract base class that defines some functions that need to be implemented by its subclasses. But Python doesn't raise a NotImplementedError when the subclass does not implement the function...
The syntax for the declaration of metaclasses has changed in Python 3. Instead of the __metaclass__ field, Python 3 uses a keyword argument in the base-class list:
import abc

class AbstractBase(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def must_implement_this_method(self):
        raise NotImplementedError()
Calling d = ConcreteClass() will now raise an exception, because a class whose metaclass is derived from ABCMeta cannot be instantiated unless all of its abstract methods and properties are overridden (for more information see abc.abstractmethod):
TypeError: Can't instantiate abstract class ConcreteClass with abstract methods must_implement_this_method
Hope this helps :)

Error "__init__ method from base class is not called" for an abstract class

I have
class A(object):
    def __init__(self):
        raise NotImplementedError("A")

class B(A):
    def __init__(self):
        ....
and pylint says
__init__ method from base class 'A' is not called
Obviously, I do not want to do
super(B, self).__init__()
so what do I do?
(I tried abc and got
Undefined variable 'abstractmethod'
from pylint, so that is not an option either).
Ignore pylint. It's just a program that doesn't take abstract classes into account. Be confident you are smarter than it is. Pylint is a knee brace, not a crutch.
Using abc works for me:
import abc

class A(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def __init__(self):
        pass

class B(A):
    def __init__(self):
        super(B, self).__init__()
I get warnings, but nothing related to abc or the parent's __init__ not being called:
C: 1, 0: Missing module docstring (missing-docstring)
C: 3, 0: Invalid class name "A" (invalid-name)
C: 3, 0: Missing class docstring (missing-docstring)
R: 3, 0: Too few public methods (0/2) (too-few-public-methods)
C: 9, 0: Invalid class name "B" (invalid-name)
C: 9, 0: Missing class docstring (missing-docstring)
R: 9, 0: Too few public methods (0/2) (too-few-public-methods)
R: 3, 0: Abstract class is only referenced 1 times (abstract-class-little-used)
For what it's worth, I'm with @holdenweb on this one. Sometimes you know better than pylint.
Defining an empty __init__ as an abstract method is not very useful.
Instead, make A inherit from ABC (Abstract Base Class), and use the @abstractmethod decorator on the methods that really need to be implemented by the user.
from abc import ABC, abstractmethod

class A(ABC):
    @abstractmethod
    def cool_method(self):
        raise NotImplementedError
In that way you effectively cannot instantiate A, and at the same time you avoid the warning. Also consider that the default __init__ method for A, when not implemented, is something along the lines of:
def __init__(self, *args, **kwargs):
    super().__init__(*args, **kwargs)
So, additionally, it has the advantage that it respects the MRO: if you later want to use A in a multiple-inheritance setup, you will not run into problems.
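A small sketch of that point, with a hypothetical Mixin that does define __init__: because A leaves __init__ alone, the cooperative super() chain passes through it cleanly.

```python
from abc import ABC, abstractmethod

class A(ABC):
    @abstractmethod
    def cool_method(self):
        raise NotImplementedError

class Mixin:
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)  # continue along the MRO
        self.initialized = True

class B(Mixin, A):
    def cool_method(self):
        return "cool"

b = B()  # Mixin.__init__ runs; A contributes no __init__ of its own
print(b.initialized, b.cool_method())  # True cool
```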

Is it a good idea to define properties in Python interfaces?

Is it a good practice to define properties in an interface like this?
class MyInterface(object):
    def required_method(self):
        raise NotImplementedError

    @property
    def required_property(self):
        raise NotImplementedError
I'd use an ABC for that, but yes; you can even use @abstractproperty for that very use case.
from abc import ABCMeta, abstractproperty, abstractmethod

class MyInterface(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def required_method(self):
        pass

    @abstractproperty
    def required_property(self):
        pass
Subclasses of the ABC are still free to implement required_property as an attribute instead; the ABC will only verify the existence of required_property, not what type it is.
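On Python 3.3+, abstractproperty is deprecated in favour of stacking @property on top of @abstractmethod; a sketch of the modern spelling (the Impl class and its values are made up for illustration):

```python
from abc import ABC, abstractmethod

class MyInterface(ABC):
    @abstractmethod
    def required_method(self):
        pass

    @property
    @abstractmethod
    def required_property(self):
        pass

class Impl(MyInterface):
    def required_method(self):
        return "ok"

    # a plain class attribute also satisfies the abstract property:
    # ABCMeta only checks that the name is no longer abstract
    required_property = 42

print(Impl().required_property)  # 42
```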
