Abstract method allowing for additional arguments - python

Is it possible to have an abstract class require some specific arguments for a method of a Python class, while still leaving the possibility for concrete classes to add more arguments?
from abc import ABC, abstractmethod

class FooBase(ABC):
    @abstractmethod
    def __init__(self, required: str, also_required: int):
        pass
And then the concrete class would go, conceptually:
class Foo(FooBase):
    def __init__(self, required: str, also_required: int, something_else: float):
        do_stuff()
Context
This is in the context of a package/library intended to be imported by clients. I'd like to provide the FooBase abstract class, which has a particular contract regarding other parts of the library. Clients would be free to implement their concrete Foo, but in this use case the __init__ arguments are to be considered a "mandatory minimum", not an exhaustive list.
Solution: using **kwargs?
One "solution" could be to use **kwargs in both abstract and concrete methods to accept any other keyword argument, but the issue is precisely that: the class now accepts any other keyword argument, instead of enforcing specific ones, which brings at least two issues:
argument name validation can now only happen at runtime (e.g. no mypy)
you can't tell which argument names to use from the method signature alone
class UglyBase(ABC):
    @abstractmethod
    def __init__(self, required: str, **kwargs):
        pass

class Ugly(UglyBase):
    def __init__(self, required: str, **kwargs):
        # yurk
        self.something_else = kwargs.get("something_else", "default")

# a small typo in the last argument's name goes undetected;
# neither the IDE nor mypy can catch it statically
ugly_object = Ugly(required="hello", someting_else="blabla")

# the mistyped argument was silently ignored
ugly_object.something_else
>>> "default"

I think this will give you a lead on how to (possibly) solve your issue. I don't know what to think about inspect, but it exists. So why not use it?
from abc import ABC, abstractmethod
import inspect

class FooBase(ABC):
    def __new__(cls, *args, **kwargs):
        # args == ('test', 1, 2.0)
        # cls == Foo
        # inspect.signature(cls.__init__) == <Signature (self, required: str, also_required: int, something_else: float)>
        # You could compare the signature of cls.__init__ with the signature of FooBase.__init__,
        # for example:
        # set(inspect.signature(FooBase.__init__).parameters.keys()).issubset(set(inspect.signature(cls.__init__).parameters.keys()))
        # This is a very long one-liner, so you might want to use intermediate variables.
        return super(FooBase, cls).__new__(cls)

    @abstractmethod
    def __init__(self, required: str, also_required: int):
        print("init")

class Foo(FooBase):
    def __init__(self, required: str, also_required: int, something_else: float):
        pass

f = Foo("test", 1, 2.0)
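To make the comparison concrete, the one-liner from the comment could be expanded into an explicit check with intermediate variables (a sketch of my own, not part of the original answer; it raises a TypeError when a subclass drops one of the required parameter names):

from abc import ABC, abstractmethod
import inspect

class FooBase(ABC):
    def __new__(cls, *args, **kwargs):
        # Parameter names declared on the abstract __init__ and on the subclass __init__.
        base_params = set(inspect.signature(FooBase.__init__).parameters)
        sub_params = set(inspect.signature(cls.__init__).parameters)
        # Every "mandatory minimum" parameter must reappear in the subclass signature.
        missing = base_params - sub_params
        if missing:
            raise TypeError(
                f"{cls.__name__}.__init__ is missing required parameters: {sorted(missing)}"
            )
        return super().__new__(cls)

    @abstractmethod
    def __init__(self, required: str, also_required: int):
        pass

class Foo(FooBase):
    def __init__(self, required: str, also_required: int, something_else: float):
        pass

f = Foo("test", 1, 2.0)  # passes; a subclass that drops 'also_required' would raise on instantiation

Note that this check only runs when an instance is created; it does not give you the static (mypy/IDE) guarantees the question asks about.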

Related

Mypy error on __init_subclass__ example from python documentation

From the official Python 3 documentation for __init_subclass__:
class Philosopher:
    def __init_subclass__(cls, /, default_name: str, **kwargs):
        super().__init_subclass__(**kwargs)
        cls.default_name = default_name

class AustralianPhilosopher(Philosopher, default_name="Bruce"):
    pass
The problem is, mypy raises "Type[Philosopher]" has no attribute "default_name". What is the solution for this? How can I make mypy accept these values?
Just as in other statically-typed languages, you simply declare the variable as an attribute in the class body:
from typing import ClassVar

class Philosopher:
    default_name: ClassVar[str]

    def __init_subclass__(cls, /, default_name: str, **kwargs: object) -> None:
        super().__init_subclass__(**kwargs)
        cls.default_name = default_name
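With that declaration in place, the documentation example should now pass; continuing the snippet above (an illustrative usage, not part of the original answer):

class AustralianPhilosopher(Philosopher, default_name="Bruce"):
    pass

print(AustralianPhilosopher.default_name)  # "Bruce"; mypy now knows default_name exists and is a str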

Type hinting a class decorator that returns a subclass

I have a set of unrelated classes (some imported) which all have a common attribute (or property) a of type dict[str, Any].
Within a there should be another dict under the key "b", which I would like to expose on any of these classes as an attribute b to simplify inst.a.get("b", {})[some_key] to inst.b[some_key].
I have made the following subclass factory to work as a class decorator for local classes and a function for imported classes.
But so far I'm failing to type hint its cls argument and return value correctly.
from functools import wraps

def access_b(cls):
    @wraps(cls, updated=())
    class Wrapper(cls):
        @property
        def b(self) -> dict[str, bool]:
            return self.a.get("b", {})
    return Wrapper
MRE of my latest typing attempt (with mypy 0.971 errors):
from functools import wraps
from typing import Any, Protocol, TypeVar

class AProtocol(Protocol):
    a: dict[str, Any]

class BProtocol(AProtocol, Protocol):
    b: dict[str, bool]

T_a = TypeVar("T_a", bound=AProtocol)
T_b = TypeVar("T_b", bound=BProtocol)

def access_b(cls: type[T_a]) -> type[T_b]:
    @wraps(cls, updated=())
    class Wrapper(cls):  # Variable "cls" is not valid as a type & Invalid base class "cls"
        @property
        def b(self) -> dict[str, bool]:
            return self.a.get("b", {})
    return Wrapper

@access_b
class Demo1:
    """Local class."""
    def __init__(self, a: dict[str, Any]):
        self.a = a.copy()

demo1 = Demo1({"b": {"allow_X": True}})
demo1.b["allow_X"]  # "Demo1" has no attribute "b"

class Demo2:
    """Consider me an imported class."""
    def __init__(self, a: dict[str, Any]):
        self.a = a.copy()

demo2 = access_b(Demo2)({"b": {"allow_X": True}})  # Cannot instantiate type "Type[<nothing>]"
demo2.b["allow_X"]
I do not understand why cls is not valid as a type, even after reading https://mypy.readthedocs.io/en/stable/common_issues.html#variables-vs-type-aliases.
I understand I should probably not return a Protocol (I suspect that is the source of Type[<nothing>]), but I don't see how I could specify "returns the original type with an extension".
PS1. I have also tried with a decorator which adds b dynamically, still failed to type it...
PS2. ...and with a decorator which uses a mixin as per @DaniilFajnberg's answer, still failing.
References:
functools.wraps(cls, updated=()) from https://stackoverflow.com/a/65470430/17676984
(Type) Variables as base classes?
This is actually a really interesting question and I am curious about what solutions other people come up with.
I read up a little on these two errors:
Variable "cls" is not valid as a type / Invalid base class "cls"
There seems to be an issue here with mypy that has been open for a long time now. There doesn't seem to be a workaround yet.
The problem, as I understand it, is that no matter how you annotate it, the function argument cls will always be a type variable and that is considered invalid as a base class. The reasoning is apparently that there is no way to make sure that the value of that variable isn't overwritten somewhere.
I honestly don't understand the intricacies well enough, but it is really strange to me that mypy seems to treat a class A defined via class A: ... differently from a variable of Type[A], since the former should essentially just be syntactic sugar for this:
A = type('A', (object,), {})
There was also a related discussion in the mypy issue tracker. Again, hoping someone can shine some light onto this.
Adding a convenience property
In any case, from your example I gather that you are not dealing with foreign classes, but that you define them yourself. If that is the case, a Mix-in would be the simplest solution:
from typing import Any, Protocol

class AProtocol(Protocol):
    a: dict[str, Any]

class MixinAccessB:
    @property
    def b(self: AProtocol) -> dict[str, bool]:
        return self.a.get("b", {})

class SomeBase:
    ...

class OwnClass(MixinAccessB, SomeBase):
    def __init__(self, a: dict[str, Any]):
        self.a = a.copy()

demo1 = OwnClass({"b": {"allow_X": True}})
print(demo1.b["allow_X"])
Output: True
No mypy issues in --strict mode.
Mixin with a foreign class
If you are dealing with foreign classes, you could still use the Mix-in and then use functools.update_wrapper like this:
from functools import update_wrapper
from typing import Any, Protocol

class AProtocol(Protocol):
    a: dict[str, Any]

class MixinAccessB:
    """My mixin"""
    @property
    def b(self: AProtocol) -> dict[str, bool]:
        return self.a.get("b", {})

class Foreign:
    """Foreign class documentation"""
    def __init__(self, a: dict[str, Any]):
        self.a = a.copy()

class MixedForeign(MixinAccessB, Foreign):
    """foo"""
    pass

update_wrapper(MixedForeign, Foreign, updated=())

demo2 = MixedForeign({"b": {"allow_X": True}})
print(demo2.b["allow_X"])
print(f'{MixedForeign.__name__=} {MixedForeign.__doc__=}')
Output:
True
MixedForeign.__name__='Foreign' MixedForeign.__doc__='Foreign class documentation'
Also no mypy issues in --strict mode.
Note that you still need the AProtocol to make it clear that whatever self will be in that property follows that protocol, i.e. has an attribute a with the type dict[str, Any].
I hope I understood your requirements correctly and this at least provides a solution for your particular situation, even though I could not enlighten you on the type variable issue.

How to typehint that an object of a class is also adhering to a Protocol in Python?

I have a set of classes; let's call them Foo and Bar, where both inherit from a base class Father that is defined outside of the current scope (not by me). I have defined a protocol class DummyProtocol that has a function do_something.
class DummyProtocol(Protocol):
    def do_something(self):
        ...

class Foo(Father):
    def do_something(self):
        pass

class Bar(Father):
    def do_something(self):
        pass
I have a function create_instance.
def create_dummy_and_father_instance(cls, *args, **kwargs):
    return cls(*args, **kwargs)
I want to type hint it in a way that cls accepts a class that is of type Father and also implements the DummyProtocol.
So I changed the function to this, to indicate that cls is a type that inherits from both Father and DummyProtocol:
def create_dummy_and_father_instance(
    cls: Type[tuple[Father, DummyProtocol]], *args, **kwargs
):
    return cls(*args, **kwargs)
But I get this error in mypy:
Cannot instantiate type "Type[Tuple[Father, DummyProtocol]]"
I came across the same issue and found this discussion on proposed Intersection types which seem to be exactly what is needed (e.g. see this comment).
Unfortunately this feature is not yet supported by the Python typing system, but there's a PEP in the making.
You can define a second Father class which inherits from both Father and the protocol (see also mypy: how to verify a type has multiple super classes):
class DummyProtocol(Protocol):
    def do_something(self):
        ...

class Father:
    pass

class Father2(Father, DummyProtocol):
    pass

class Foo(Father2):
    def do_something(self):
        pass

class Bar(Father2):
    def do_something(self):
        pass

class FooNot(Father):
    pass

def create_dummy_and_father_instance(
    cls: Type[Father2]
):
    return cls()

create_dummy_and_father_instance(Foo)
create_dummy_and_father_instance(Bar)
create_dummy_and_father_instance(FooNot)  # mypy error ok

Pylance: "property" is incompatible with "int"

from typing_extensions import Protocol

class IFoo(Protocol):
    value: int

class Foo(IFoo):
    @property
    def value(self) -> int:
        return 2

    _value: int

    @value.setter
    def value(self, value: int):
        self._value = value
Pylance in strict mode (basic mode doesn't) gives an error at the getter and the setter saying that:
"value" overrides symbol of the same name in class "IFoo"
"property" is incompatible with "int".
I could make this work by changing the Protocol to:
class IFoo(Protocol):
    @property
    def value(self) -> int:
        raise NotImplementedError
But this now makes this invalid:
class Foo(IFoo):
    value: int
This doesn't make sense: Foo would still have the property value that is an int, so why should being a getter make it different (in TypeScript this doesn't make a difference)?
How can I fix this?
Reading the Defining a protocol section of the relevant PEP (PEP 544), the example implementation (in their case, class Resource) does not directly inherit from the protocol; their class SupportsClose functions as a reference type for type-hinting validators.
Your example is also reminiscent of the long-established zope.interface package, which this PEP also referenced. Note that the PEP cites the following example usage (irrelevant lines trimmed):
from zope.interface import Interface, implementer

class IEmployee(Interface):
    ...

@implementer(IEmployee)
class Employee:
    ...
The Employee class does not directly subclass IEmployee (a common mistake for newbie Zope/Plone developers back in the day); it's simply decorated with the zope.interface.implementer(IEmployee) class decorator to denote that the class Employee implements the interface IEmployee.
Likewise, reading further down under the section Protocol members, we have an example of the template and concrete classes (again, example trimmed):
from typing import Protocol

class Template(Protocol):
    name: str       # This is a protocol member
    value: int = 0  # This one too (with default)

class Concrete:
    def __init__(self, name: str, value: int) -> None:
        self.name = name
        self.value = value

var: Template = Concrete('value', 42)  # OK

Again, note that the Concrete implementation does not inherit from the Template, yet, the variable var is denoted to have an expected type of Template. Note that the Concrete instance can be assigned to it as it matches the expected protocol as defined for Template.
So all that being said, given your example, you may wish to define class Foo: as is rather than having it inherit from IFoo as you had originally, and fill in the type information so that things expecting IFoo are type hinted appropriately in the relevant context (e.g. some_foo: IFoo = Foo(...) or def some_func(foo: IFoo):).
As an addendum, you may wish to define Foo as such:
class Foo:
    _value: int

    @property
    def value(self) -> int:
        return 2

    @value.setter
    def value(self, value: int):
        self._value = value
Having the _value definition in between the property and its setter seems to confuse mypy due to this issue.
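For illustration, and going by the annotations the answer itself suggests (some_foo: IFoo = Foo(...) / def some_func(foo: IFoo):), usage could then look like the sketch below; the helper read_value is hypothetical, and Foo never inherits from IFoo, it just matches the protocol structurally:

def read_value(foo: IFoo) -> int:
    # Any object exposing an int 'value' member satisfies IFoo here.
    return foo.value

some_foo: IFoo = Foo()  # accepted structurally, no inheritance needed
some_foo.value = 5      # goes through the setter, so the mutable 'value' member stays satisfied
print(read_value(some_foo))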

How can I allow one or more arguments for initializing a class?

I want to create a class where each instance has to have at least one argument but could have more. If it has none, it should raise an exception. How can I achieve this?
You could write your class like this:
class MyClass(object):
    def __init__(self, first, *rest):
        # do something with the args
        ...
This accepts the first argument as first and any additional arguments as a tuple, rest.
The solution proposed by kindall is the most pythonic way to do it:
class MyClass(object):
    def __init__(self, first, *rest):
        # do something with the args
        ...
It will raise a TypeError whenever you forget about the first argument. Here is how you would go about it if you wished to raise a custom exception:
class MyClass(object):
    def __init__(self, *args):
        if not args:
            raise MyCustomException()
        first_arg = args[0]
        ...
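As a quick usage sketch (assuming MyCustomException is defined elsewhere), both variants reject a call with no arguments and differ only in which exception you get:

MyClass("first")          # fine
MyClass("first", 2, 3.0)  # fine, extra arguments are captured by the star parameter
MyClass()                 # kindall's version raises TypeError, this version raises MyCustomException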
