I have a series of classes that looks like this:
from abc import ABC, abstractmethod
from typing import TypeVar

T = TypeVar("T", bound="A")
U = TypeVar("U", bound="ThirdPartyClass")

class ThirdPartyClass:
    """
    This is from a third-party library and I don't control the implementation.
    """

    @classmethod
    def create(cls: type[U]) -> U:
        return cls()

class A(ABC):
    @classmethod
    @abstractmethod
    def f(cls: type[T]) -> T:
        pass

class B(ThirdPartyClass, A):
    @classmethod
    def f(cls) -> T:
        return cls.create()
When I run mypy on this module, I get two errors for the last two lines:
error: A function returning TypeVar should receive at least one argument containing the same TypeVar
error: Incompatible return value type (got "B", expected "T")
In my mind, neither of these is valid.
For the first one, B.f does receive an argument containing the TypeVar: it receives a type[B], and since B inherits from A and T is bound by A, type[B] is valid here.
Similarly for the second one, the return type of B should be fine because B inherits from A, and A is the bound for T.
What types should I be using here to prevent mypy from failing?
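One possible fix (a sketch, not the only option) is to drop the module-level T from f entirely and use typing.Self, which is designed for exactly this "returns an instance of the current class" pattern. Guarding the import with TYPE_CHECKING plus `from __future__ import annotations` keeps it runnable on interpreters older than 3.11, where typing.Self does not exist:

```python
from __future__ import annotations

from abc import ABC, abstractmethod
from typing import TYPE_CHECKING, TypeVar

if TYPE_CHECKING:
    # typing.Self needs Python 3.11+, but only the type checker sees this
    from typing import Self

U = TypeVar("U", bound="ThirdPartyClass")

class ThirdPartyClass:
    @classmethod
    def create(cls: type[U]) -> U:
        return cls()

class A(ABC):
    @classmethod
    @abstractmethod
    def f(cls) -> Self:
        ...

class B(ThirdPartyClass, A):
    @classmethod
    def f(cls) -> Self:
        # cls is type[Self] here, so cls.create() is inferred as Self
        return cls.create()

print(type(B.f()).__name__)
```

With Self, the return type tracks whatever subclass f is called on, so no free TypeVar has to appear in the signature at all.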
I have a decorator meant to wrap a classmethod like this:
class Class(object):
    @register_classmethod
    @classmethod
    def my_class_method(cls):
        ...
My decorator gets a classmethod object. When I attempt to call it, it raises TypeError: 'classmethod' object is not callable.
Here is a sample, with an overly-simplified decorator implementation:
from typing import Callable

all_methods: list[Callable[[type], None]] = []

def register_classmethod(classmeth: Callable[[type], None]) -> Callable[[type], None]:
    all_methods.append(classmeth)
    return classmeth

class Class(object):
    @register_classmethod
    @classmethod
    def my_class_method(cls) -> None:
        print(f"Hello from {cls}.my_class_method")

    @classmethod
    def run_registered_classmethods(cls) -> None:
        for classmeth in all_methods:
            classmeth(cls)

Class.run_registered_classmethods()
While mypy --strict is perfectly happy with the typing, at execution I get:
$ python3 testscripts/test-classmethod-call.py
Traceback (most recent call last):
  File ".../test-classmethod-call.py", line 20, in <module>
    Class.run_registered_classmethods()
  File ".../test-classmethod-call.py", line 18, in run_registered_classmethods
    classmeth(cls)
TypeError: 'classmethod' object is not callable
Now, I am indeed refactoring code that did not have that explicit @classmethod on my_class_method, and that code did run fine:
$ python3 testscripts/test-classmethod-call.py
Hello from <class '__main__.Class'>.my_class_method
However, with the above type annotations, mypy dutifully points out that we're trying to register an instance method here:
testscripts/test-classmethod-call.py:10: error: Argument 1 to "register_classmethod" has incompatible type "Callable[[Class], None]"; expected "Callable[[type], None]" [arg-type]
Note: I think this is also the problem faced in python how to invoke classmethod if I have only it's object, but its original formulation was probably not on-point enough.
Interpretation and start of a solution
It looks like what we get in this context is the descriptor object underlying the class method. I'd think that we would need to bind it in our wrapper descriptor, eg. using MethodType as shown here as of 3.11:
from types import MethodType

class ClassMethod:
    "Emulate PyClassMethod_Type() in Objects/funcobject.c"

    def __init__(self, f):
        self.f = f

    def __get__(self, obj, cls=None):
        if cls is None:
            cls = type(obj)
        if hasattr(type(self.f), '__get__'):
            # This code path was added in Python 3.9
            # and was deprecated in Python 3.11.
            return self.f.__get__(cls, cls)
        return MethodType(self.f, cls)
But we cannot pass the classmethod object to MethodType, and have to dig it up in its (undocumented AFAICT) __func__ member.
Now this does the job:
@classmethod
def run_registered_classmethods(cls) -> None:
    for classmeth in all_methods:
        bound_method = types.MethodType(classmeth.__func__, cls)
        bound_method()
This however brings us back to a new typing problem: the classmethod-decorated method has type classmethod, but is annotated as a Callable so user programs can make sense of it, which causes mypy to complain:
testscripts/test-classmethod-call.py:19: error: "Callable[[type], None]" has no attribute "__func__" [attr-defined]
We can teach it the real type by way of assert isinstance(...), and finally have working, well-typed code with:
@classmethod
def run_registered_classmethods(cls) -> None:
    for classmeth in all_methods:
        assert isinstance(classmeth, classmethod)
        bound_method = types.MethodType(classmeth.__func__, cls)
        bound_method()
This works, but assert does have a runtime cost, so we may want to give the hint in a better way, e.g. using typing.cast():
@classmethod
def run_registered_classmethods(cls) -> None:
    for classmeth in all_methods:
        bound_method = types.MethodType(cast(classmethod, classmeth).__func__, cls)
        bound_method()
But while mypy is happy with this on the surface, its --strict option shows our typing is not as precise as it could be:
testscripts/test-classmethod-call.py:20: error: Missing type parameters for generic type "classmethod" [type-arg]
So classmethod is a generic type? I'm pretty sure I did not find any hint of this in the docs. Luckily, reveal_type() and a bit of intuition suggest that the generic type parameter is the return type of the class method:
testscripts/test-classmethod-call.py:21: note: Revealed type is "def [_R_co] (def (*Any, **Any) -> _R_co`1) -> builtins.classmethod[_R_co`1]"
(yes, ouch!)
But while cast(classmethod[None], classmeth) reads OK to mypy, Python itself is less than happy: TypeError: 'type' object is not subscriptable.
So we also have to let the interpreter and the type checker see different code, using typing.TYPE_CHECKING, which brings us to the following:
import types
from typing import Callable, cast, TYPE_CHECKING

all_methods: list[Callable[[type], None]] = []

def register_classmethod(classmeth: Callable[[type], None]) -> Callable[[type], None]:
    all_methods.append(classmeth)
    return classmeth

class Class(object):
    @register_classmethod
    @classmethod
    def my_class_method(cls) -> None:
        print(f"Hello from {cls}.my_class_method")

    @classmethod
    def run_registered_classmethods(cls) -> None:
        for classmeth in all_methods:
            if TYPE_CHECKING:
                realclassmethod = cast(classmethod[None], classmeth)
            else:
                realclassmethod = classmeth
            bound_method = types.MethodType(realclassmethod.__func__, cls)
            bound_method()

Class.run_registered_classmethods()
... which passes as:
$ mypy --strict testscripts/test-classmethod-call.py
Success: no issues found in 1 source file
$ python3 testscripts/test-classmethod-call.py
Hello from <class '__main__.Class'>.my_class_method
This seems overly complicated for something we'd like to be simple and readable. Possibly a generic helper like the following could be provided to make all of this more usable; I'm not really happy with it, even though it passes all the above tests:
_ClassType = TypeVar("_ClassType")
_RType = TypeVar("_RType")

def bound_class_method(classmeth: Callable[[type[_ClassType]], _RType],
                       cls: type[_ClassType]) -> Callable[[], _RType]:
    if TYPE_CHECKING:
        realclassmethod = cast(classmethod[None], classmeth)
    else:
        realclassmethod = classmeth
    return types.MethodType(realclassmethod.__func__, cls)
It does not handle arbitrary arguments to the class method, which we could likely address using ParamSpec. But this still relies on several implementation details of classmethod (__func__ and the generic type parameter): the docs say nothing about them.
Shouldn't there be a simple way to do that?
Is there any better way?
Edit: curated summary of answers so far
There are tons of info in those answers, thanks :)
What I find most useful in those:
we cannot today write an annotation that would cause a method decorated with something other than @classmethod to get its first parameter cls typed as type[Class] instead of Class (@droooze)
as a consequence we have several families of options, none of which is perfect:
live with the fact that the method to be decorated does not have a class method signature; let the decorator register the method before wrapping it with classmethod, avoiding any dealing with the latter's internals, and then return the wrapped version (@chepner).
Note that when we need to use cls as a type in our classmethod-that-is-not-one-from-the-inside-for-the-typechecker, we can do something like:
@register_classmethod
def my_class_method(cls) -> None:
    klass = cast(type[Class], cls)
    print(f"Hello from {klass}.my_class_method")
It is sad that the class name has to be hardcoded, and the type annotation for the argument to register_classmethod has to be massaged further if we want to do better than Callable[[Any], None]
live with the explicit addition of @classmethod and with us making use of its internals, which can also be annoying, as forcing use of that additional decorator causes an API change (@droooze)
tell the type checker that our decorator is special like classmethod, and that its first parameter is a type[Class] where it would otherwise have been annotated as Class. The drawback (aside from the cost of writing and maintaining a plugin) is that this requires a separate plugin for each static checker.
The built-in decorators @property, @classmethod, and @staticmethod are likely to be special-cased by each of the 4 major type-checker implementations, which means that interactions with other decorators may not make any sense, even if you theoretically have the type annotations correct.
For mypy, @classmethod is special-cased such that
it is transformed into a collections.abc.Callable even though classmethod doesn't even have a __call__ method;
the callable it decorates has its first parameter transformed into type[<owning class>]. In fact, the only way you can even get a type[<owning class>] object for the first parameter is if you decorate it with builtins.classmethod; no other custom implementation of any typing construct will work, not even a direct subclass of classmethod with no implementation in the body.
As you've found, this is the reason for your runtime error.
If you are specifically using mypy, in the example you gave, I would tweak it like this:
from __future__ import annotations

import collections.abc as cx
import typing as t

clsT = t.TypeVar("clsT", bound=type)
P = t.ParamSpec("P")
R_co = t.TypeVar("R_co", covariant=True)

all_methods: list[classmethod[t.Any]] = []

def register_classmethod(classmeth: cx.Callable[[clsT], R_co]) -> classmethod[R_co]:
    # The assertion performs type-narrowing; see
    # https://mypy.readthedocs.io/en/stable/type_narrowing.html
    assert isinstance(classmeth, classmethod)
    all_methods.append(classmeth)  # type: ignore[unreachable]
    return classmeth

class Class(object):
    @register_classmethod
    @classmethod
    def my_class_method(cls) -> None:
        print(f"Hello from {cls}.my_class_method")

    # Not a callable!
    my_class_method()  # mypy: "classmethod[None]" not callable [operator]

    @classmethod
    def run_registered_classmethods(cls) -> None:
        for classmeth in all_methods:
            classmeth.__func__(cls)

    # Too many arguments
    @register_classmethod  # mypy: Argument 1 to "register_classmethod" has incompatible type "Callable[[Type[Class], int], None]"; expected "Callable[[type], None]" [arg-type]
    @classmethod
    def bad_too_many_args(cls, a: int) -> None:
        return

Class.run_registered_classmethods()
If you're doing anything more with classmethods and require proper type-checking in all scopes, I would re-implement the typing for classmethod, as follows:
from __future__ import annotations

import collections.abc as cx
import typing as t

# This is strictly unnecessary, but demonstrates a more accurately implemented
# `classmethod`. Accessing this from inside the class body, from an instance, or from
# a class works as expected.
# Unfortunately, you cannot use `ClassMethod` as a decorator and expect
# the first parameter to be typed correctly (see explanation 2.)
if t.TYPE_CHECKING:
    import sys

    clsT = t.TypeVar("clsT", bound=type)
    P = t.ParamSpec("P")
    R_co = t.TypeVar("R_co", covariant=True)

    class ClassMethod(t.Generic[clsT, P, R_co]):
        # Largely re-implemented from typeshed stubs; see
        # https://github.com/python/typeshed/blob/d2d706f9d8b1a568ff9ba1acf81ef8f6a6b99b12/stdlib/builtins.pyi#L128-L139
        @property
        def __func__(self) -> cx.Callable[t.Concatenate[clsT, P], R_co]: ...
        @property
        def __isabstractmethod__(self) -> bool: ...
        def __new__(cls, __f: cx.Callable[t.Concatenate[clsT, P], R_co]) -> ClassMethod[clsT, P, R_co]: ...
        def __get__(self, __obj: t.Any, __type: type) -> cx.Callable[P, R_co]: ...

        if sys.version_info >= (3, 10):
            __name__: str
            __qualname__: str
            @property
            def __wrapped__(self) -> cx.Callable[t.Concatenate[clsT, P], R_co]: ...  # Same as `__func__`
else:
    ClassMethod = classmethod

all_methods: list[ClassMethod[type, [], t.Any]] = []

def register_classmethod(
    classmeth: cx.Callable[[clsT], R_co]
) -> ClassMethod[clsT, [], R_co]:
    # The assertion performs type-narrowing; see
    # https://mypy.readthedocs.io/en/stable/type_narrowing.html
    assert isinstance(classmeth, ClassMethod)
    all_methods.append(classmeth)  # type: ignore[unreachable]
    return classmeth

class Class(object):
    @register_classmethod
    @classmethod
    def my_class_method(cls) -> None:
        print(f"Hello from {cls}.my_class_method")

    # Not a callable! Fixes problem given in explanation 1.
    my_class_method()  # mypy: "ClassMethod[Type[Class], [], None]" not callable [operator]

    @classmethod
    def run_registered_classmethods(cls) -> None:
        for classmeth in all_methods:
            classmeth.__func__(cls)
            # Not enough arguments
            classmeth.__func__()  # mypy: Too few arguments [call-arg]

    # Too many arguments - `typing.ParamSpec` is working correctly
    @register_classmethod  # mypy: Argument 1 to "register_classmethod" has incompatible type "Callable[[Type[Class], int], None]"; expected "Callable[[type], None]" [arg-type]
    @classmethod
    def bad_too_many_args(cls, a: int) -> None:
        return

Class.run_registered_classmethods()

# `__get__` working correctly - on the descriptor protocol.
# These two error out both for static type checking and runtime.
Class.my_class_method(type)  # mypy: Too few arguments [call-arg]
Class.my_class_method.__func__  # mypy: "Callable[[], None]" has no attribute "__func__" [attr-defined]
Class methods aren't callable; they define a __get__ method that returns a callable method instance that will pass the class to the underlying function as the first argument.
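You can see that binding behaviour directly (a small illustration; Demo is a made-up name):

```python
class Demo:
    @classmethod
    def who(cls) -> str:
        return cls.__name__

raw = Demo.__dict__["who"]       # the raw classmethod object, bypassing __get__
print(callable(raw))             # False: classmethod objects define no __call__
bound = raw.__get__(None, Demo)  # the descriptor protocol binds it to the class
print(bound())                   # Demo
```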
I might let register_classmethod both store the function and return a class method:
all_methods: list[classmethod] = []

def register_classmethod(classmeth: classmethod) -> classmethod:
    all_methods.append(classmeth)
    return classmethod(classmeth)

class Class(object):
    @register_classmethod
    def my_class_method(cls) -> None:
        print(f"Hello from {cls}.my_class_method")

    @classmethod
    def run_registered_classmethods(cls) -> None:
        for classmeth in all_methods:
            classmeth(cls)
This way, run_registered_classmethods doesn't need to worry about the descriptor protocol: it's just running the underlying function directly.
Problem
Suppose I want to implement a class decorator that adds some attributes and functions to an existing class.
In particular, let's say I have a protocol called HasNumber, and I need a decorator can_add that adds the missing methods to convert a HasNumber class into a CanAdd.
from typing import Protocol

class HasNumber(Protocol):
    num: int

class CanAdd(HasNumber):
    def add(self, num: int) -> int: ...
Implementation
I implement the decorator as follows:
from typing import Type, TypeVar, cast

_HasNumberT = TypeVar("_HasNumberT", bound=HasNumber)

def can_add(cls: Type[_HasNumberT]) -> Type[CanAdd]:
    def add(self: _HasNumberT, num: int) -> int:
        return self.num + num

    setattr(cls, "add", add)
    return cast(Type[CanAdd], cls)

@can_add
class Foo:
    num: int = 12
Error
The code works just fine when I run it, but mypy is unhappy about it for some reason.
It gives the error "Foo" has no attribute "add" [attr-defined], as if it doesn't take the return value (annotated as Type[CanAdd]) of the can_add decorator into account.
foo = Foo()
print(foo.add(4)) # "Foo" has no attribute "add" [attr-defined]
reveal_type(foo) # note: Revealed type is "test.Foo"
Question
In this issue, someone demonstrated a way of annotating this with Intersection. However, is there a way to achieve it without Intersection? (Supposing that I don't care about other attributes in Foo except the ones defined in the protocols)
Or, is it a limitation of mypy itself?
Related posts that don't solve my problem:
Mypy annotation on a class decorator
Class Decorator Compatible for Mypy
cast tells mypy that cls (with or without an add attribute) is safe to use as the return value for can_add. It does not guarantee that the protocol holds.
As a result, mypy cannot tell whether Foo has been given an add attribute, only that it's OK to use the can_add decorator. The fact that can_add has a side effect of defining the add attribute isn't visible to mypy.
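As a reminder, typing.cast does nothing at runtime; it only asserts a type to the checker:

```python
from typing import cast

x: object = "hello"
n = cast(int, x)  # no conversion, no check: n is still the same string object
print(n)          # hello
```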
You can, however, replace the decorator with direct inheritance, something like
from typing import Protocol, TypeVar

class HasNumber(Protocol):
    num: int

_HasNumberT = TypeVar("_HasNumberT", bound=HasNumber)

class Adder(HasNumber):
    def add(self, num: int) -> int:
        return self.num + num

class Foo(Adder):
    num: int = 12

foo = Foo()
print(foo.add(4))
Consider the case where I have different classes implementing the same method but returning different types.
class A:
    def method(self) -> float:
        return 3.14

class B:
    def method(self) -> str:
        return 'a string'

def do_method(x):
    return x.method()

r = do_method(A())
reveal_type(r)  # Revealed type is 'Any'
Mypy is not able to infer the exact return type of do_method(), which depends on its argument x. How can I help mypy achieve this?
Note: the number of such classes that I want to use with do_method() is large, so I don't want to change them all.
You could use a generic protocol to do what you need. But keep in mind that mypy requires the return type of a protocol method to be covariant when it is a TypeVar, so we must state this explicitly with covariant=True; otherwise the variable is considered invariant by default.
A covariant return type of a method is one that can be replaced by a "narrower" type when the method is overridden in a subclass.
from typing import TypeVar, Protocol

T = TypeVar('T', covariant=True)

class Proto(Protocol[T]):
    def method(self) -> T: ...

class A:
    def method(self) -> float:
        return 3.14

class B:
    def method(self) -> str:
        return 'a string'

def do_method(x: Proto[T]) -> T:
    return x.method()

r1 = do_method(A())
reveal_type(r1)  # Revealed type is 'builtins.float*'

r2 = do_method(B())
reveal_type(r2)  # Revealed type is 'builtins.str*'
Let's say I have the following classes.
class A:
    @staticmethod
    def foo():
        pass

class B(A):
    pass
And I have some kind of function that constructs an object based on its type, as well as calling a method on it.
def create(cls: Type[A]) -> A:
    cls.foo()
    return cls()
Now I can make the following calls to that function. And because B inherits from A, it's all good.
instance_a: A = create(A)
instance_b: B = create(B)
Except with the latter, type checking will start complaining, because according to the annotations create returns an instance of A.
This could be solved with TypeVar as follows.
from typing import Type, TypeVar

T = TypeVar('T')

def create(cls: Type[T]) -> T:
    cls.foo()
    return cls()
Except now type checking doesn't do its original job of guaranteeing that cls has a method called foo. Is there a way to constrain a generic to a certain type?
You can supply a bound:
T = TypeVar('T', bound=A)
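Putting it together, a runnable sketch of the bounded version:

```python
from typing import Type, TypeVar

class A:
    @staticmethod
    def foo() -> None:
        pass

class B(A):
    pass

# The bound restricts T to subtypes of A, so the checker
# knows every valid cls has a foo method.
T = TypeVar('T', bound=A)

def create(cls: Type[T]) -> T:
    cls.foo()
    return cls()

instance_b: B = create(B)  # inferred as B, no complaint from the checker
print(type(instance_b).__name__)  # B
```

Passing a class that does not derive from A (e.g. `create(int)`) would now be flagged by the type checker instead of slipping through.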