Given the following example:
from typing import Any, Iterable, Callable, TypeVar, Tuple
from itertools import islice

T = TypeVar('T')

def take1(n: int, iterable: Iterable[T]) -> Tuple[T, ...]:
    return tuple(islice(iterable, n))

def take2(n: int, iterable: Iterable[Any], container: Callable[[Iterable[Any]], T] = tuple) -> T:
    return container(islice(iterable, n))
as is, I get:
test.py:12: error: Incompatible default for argument "container" (default has type "Type[Tuple[Any, ...]]", argument has type "Callable[[Iterable[Any]], T]")
Found 1 error in 1 file (checked 1 source file)
As you can see, take2 is a more general version of take1: by default it behaves like take1, but if the user wants to put the result into something else, they can provide the desired container directly instead of going through a throwaway tuple in the middle. So, for instance, "".join(take1(3, "abcdefg")) is equivalent to take2(3, "abcdefg", "".join), just without the throwaway tuple that take1 would make. This can be relevant in, for example, sum(take1(10**10, itertools.count())) vs take2(10**10, itertools.count(), sum): take1 will fail with a MemoryError here, while take2 will eventually succeed...
For that I think the hint I put there is perfectly adequate, but mypy doesn't like it.
So, how can I type hint take2 so that it passes the mypy check (besides # type: ignore, I suppose) and still get useful info when calling help on it?
After some trial and error, I arrived at a solution that makes mypy happy:
change the signature to
def take2(n: int, iterable: Iterable[T], container: Callable[[Iterable[T]], Any] = tuple) -> Any:
    return container(islice(iterable, n))
But that was actually the last solution I found. The first was to make a stub file (and make the code its own module, if it wasn't already):
from typing import Any, Callable, Iterable, Tuple, TypeVar, overload

T = TypeVar("T")

@overload
def take2(n: int, iterable: Iterable[T]) -> Tuple[T, ...]: ...
@overload
def take2(n: int, iterable: Iterable[Any], container: Callable[[Iterable[Any]], T]) -> T: ...
but now with
@overload
def take2(n: int, iterable: Iterable[T]) -> Tuple[T, ...]: ...
@overload
def take2(n: int, iterable: Iterable[T], container: Callable[[Iterable[T]], Any]) -> Any: ...
(it can also be in the main file)
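Putting the pieces together, a complete module sketch with the accepted overloads plus an untyped implementation (which mypy allows for overloaded functions) could look like this:

```python
from itertools import islice
from typing import Any, Callable, Iterable, Tuple, TypeVar, overload

T = TypeVar("T")

@overload
def take2(n: int, iterable: Iterable[T]) -> Tuple[T, ...]: ...
@overload
def take2(n: int, iterable: Iterable[T],
          container: Callable[[Iterable[T]], Any]) -> Any: ...

def take2(n, iterable, container=tuple):
    # islice is lazy, so no intermediate tuple is built when a
    # streaming consumer such as sum or "".join is passed.
    return container(islice(iterable, n))
```

With this, take2(3, "abcdefg") is typed as Tuple[str, ...], while take2(3, "abcdefg", "".join) goes through the second overload.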
mypy sure is hard to please...
I'm trying to understand typing.overload and have applied it in a simple case where I want a function that takes input x: Literal["foo", "bar"] and returns the list [x].
I'd like mypy to type the resulting list as list[Literal["foo"]] or list[Literal["bar"]] depending on the value of x.
I know I could achieve this with a TypeVar, but I'd still like to understand why the code below fails with the following error:
test.py:14: error: Overloaded function implementation cannot produce return type of signature 1
test.py:14: error: Overloaded function implementation cannot produce return type of signature 2
from typing import Literal, overload

@overload
def f(x: Literal["foo"]) -> list[Literal["foo"]]:
    ...
@overload
def f(x: Literal["bar"]) -> list[Literal["bar"]]:
    ...
def f(x: Literal["foo", "bar"]) -> list[Literal["foo", "bar"]]:
    return [x]
Lists in Python are invariant. That means that, even if B is a subtype of A, there is no relation between the types list[A] and list[B].
If list[B] were allowed to be a subtype of list[A], then someone could come along and do this.
class A: ...
class B(A): ...  # B is a subtype of A

my_b_list: list[B] = []
my_a_list: list[A] = my_b_list  # suppose this were allowed
my_a_list.append(A())
print(my_b_list)  # Oh no, a list[B] contains an A value!
If you plan to modify the returned list, then what you're doing isn't safe. End of story. If you plan to treat the list as immutable, then consider what operations you actually need, and you may be able to find a covariant supertype of list in typing.
For example, Sequence is a popular choice. It supports iteration, random access, and length access, while explicitly not allowing mutation.
from typing import Literal, overload, Sequence

@overload
def f(x: Literal["foo"]) -> Sequence[Literal["foo"]]:
    ...
@overload
def f(x: Literal["bar"]) -> Sequence[Literal["bar"]]:
    ...
def f(x: Literal["foo", "bar"]) -> Sequence[Literal["foo", "bar"]]:
    return [x]
(Note: typing.Sequence is deprecated in Python 3.9; if you only plan to support 3.9+, you might use collections.abc.Sequence instead)
As far as I understand your question, your actual implementation needs to accept a single type (str), not multiple Literals.
The following works correctly according to pyright, and seems to provide the feature you are looking for (only allowing lists of either "foo" or "bar", rejecting everything else).
from typing import Literal, overload

@overload
def f(x: Literal["foo"]) -> list[Literal["foo"]]:
    ...
@overload
def f(x: Literal["bar"]) -> list[Literal["bar"]]:
    ...
def f(x: str) -> list[str]:
    return [x]

f("foo")  # valid
f("bar")  # valid
f("baz")  # error
which causes the following error:
a.py:20:3 - error: Argument of type "Literal['baz']" cannot be assigned to parameter "x" of type "Literal['bar']" in function "f"
"Literal['baz']" cannot be assigned to type "Literal['bar']"
The following code:
from typing import Union

def process(actions: Union[list[str], list[int]]) -> None:
    for pos, action in enumerate(actions):
        act(action)

def act(action: Union[str, int]) -> None:
    print(action)
generates a mypy error: Argument 1 to "act" has incompatible type "object"; expected "Union[str, int]"
However, when the enumerate call is removed, the typing is fine:
from typing import Union

def process(actions: Union[list[str], list[int]]) -> None:
    for action in actions:
        act(action)

def act(action: Union[str, int]) -> None:
    print(action)
Does anyone know what the enumerate function is doing to effect the types?
This is python 3.9 and mypy 0.921
enumerate.__next__ needs more context than is available to have a return type more specific than Tuple[int, Any], so I believe mypy itself would need to be modified to make the inference that enumerate(actions) produces Tuple[int, Union[str, int]] values.
Until that happens, you can explicitly cast the value of action before passing it to act.
from typing import Union, cast

StrOrInt = Union[str, int]

def process(actions: Union[list[str], list[int]]) -> None:
    for pos, action in enumerate(actions):
        act(cast(StrOrInt, action))

def act(action: Union[str, int]) -> None:
    print(action)
You can also make process generic (which now that I've thought of it, is probably a better idea, as it avoids the overhead of calling cast at runtime).
from typing import Iterable, TypeVar

T = TypeVar("T", str, int)

def process(actions: Iterable[T]) -> None:
    for pos, action in enumerate(actions):
        act(action)

def act(action: T) -> None:
    print(action)
Here, T is not a union of types, but a single concrete type whose identity is fixed by the call to process. Iterable[T] is either Iterable[str] or Iterable[int], depending on which type you pass to process. That fixes T for the rest of the call to process, so every call to act must take the same type of argument. An Iterable[str] or an Iterable[int] is a valid argument, binding T to str or int in the process. Now enumerate.__next__ can have the specific return type Tuple[int, T].
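To illustrate how T gets bound per call, here is a small usage sketch of the generic version above:

```python
from typing import Iterable, TypeVar

T = TypeVar("T", str, int)

def act(action: T) -> None:
    print(action)

def process(actions: Iterable[T]) -> None:
    # T is fixed for this call, so enumerate(actions) yields Tuple[int, T]
    for pos, action in enumerate(actions):
        act(action)

process(["a", "b"])  # T bound to str
process([1, 2])      # T bound to int
```

A call like process([1.0]) would be rejected, since float is not one of T's value restrictions.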
I don't know how it's affecting the types. I do know that using len() can work the same way; it is slower, but if it solves the problem it might be worth it. Sorry that it's not much help.
Seems like mypy isn't able to infer the type and generalizes to object. It might be worth opening an issue on their side. As a workaround, you could annotate action; that would remove the error. Does it work if you import the (legacy) List from typing?
There is such a function for currying. The problem is that I don't know how to make this function return a decorated function with the correct types. Help, I have not found a solution anywhere.
import functools
import typing as ty
from typing import TypeVar, Callable, Any, Optional

F = TypeVar("F", bound=Callable[..., Any])

def curry(func: F, max_argc: Optional[int] = None):
    if max_argc is None:
        max_argc = func.__code__.co_argcount

    @functools.wraps(func)
    def wrapped(*args):
        argc = len(args)
        if argc < max_argc:
            return curry(functools.partial(func, *args), max_argc - argc)
        else:
            return func(*args)

    return ty.cast(F, wrapped)

@curry
def foo(x: int, y: int) -> int:
    return x + y

foo("df")(5)
# mypy error: Too few arguments for "foo"
# mypy error: "int" not callable
# mypy error: Argument 1 to "foo" has incompatible type "str"; expected "int"  # True
How can I fix the first two mypy errors?
Interestingly enough, I tried the exact same thing: writing a decorator which would return a curried version of any function, including generic higher-order ones.
I tried to build a curry that would allow any partitioning of the input arguments.
Denial
However, AFAIK it is not possible due to some constraints of the python type system.
I'm struggling to find a generic typesafe approach right now. I mean Haskell lives it, C++ adopted it, Typescript manages to do it as well, so why should python not be able to?
I reached out to mypy alternatives such as pyright, which has AWESOME maintainers, BUT is still bound by PEPs, which state that some things are just not possible.
Anger
When submitting an issue with pyright for the last missing piece in my curry-chain, I boiled the issue down to the following (as can be seen in this issue: https://github.com/microsoft/pyright/issues/1720):
from typing import TypeVar
from typing_extensions import Protocol

R = TypeVar("R", covariant=True)
S = TypeVar("S", contravariant=True)

class ArityOne(Protocol[S, R]):
    def __call__(self, __s: S) -> R:
        ...

def id_f(x: ArityOne[S, R]) -> ArityOne[S, R]:
    return x

X = TypeVar("X")

def identity(x: X) -> X:
    return x

i: int = id_f(identity)(4)  # Does not type check, expected type `X`, got type `Literal[4]`
Mind the gap, this is a minimal reproducible example of the missing link.
What I initially attempted to do was the following (skipping the actual curry implementation, which, in comparison, resembles a walk in the park):
Write a curry decorator (without types).
Define Unary, Binary and Ternary (etc.) Protocols, which are the more modern version of the function type Callable. Conveniently, Protocols can specify @overloads for their __call__ methods, which brings me to the next point.
Define CurriedBinary, CurriedTernary (etc.) using Protocols with @overloaded __call__ methods.
Define @overloads for the curry function, e.g. Binary -> CurriedBinary, Ternary -> CurriedTernary.
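For illustration, a hypothetical CurriedBinary protocol along those lines might look like this (the names and shape are my sketch, not the answer's actual attempt):

```python
from typing import Callable, Protocol, TypeVar, overload

T1 = TypeVar("T1", contravariant=True)
T2 = TypeVar("T2", contravariant=True)
R = TypeVar("R", covariant=True)

class CurriedBinary(Protocol[T1, T2, R]):
    # Callable cannot express "callable with one argument OR two",
    # but an overloaded __call__ on a Protocol can.
    @overload
    def __call__(self, a: T1) -> Callable[[T2], R]: ...
    @overload
    def __call__(self, a: T1, b: T2) -> R: ...
    def __call__(self, a, b=None): ...
```

A curry overload typed Callable[[T1, T2], R] -> CurriedBinary[T1, T2, R] then works for fixed-type binary functions.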
With this, everything was in place, and it works awesome for fixed-type functions i.e. int -> int or str -> int -> bool.
I don't have my attempted implementation on hand right now, though.
However, when currying functions such as map, or filter it fails to match the curried, generic version of the function with the actual types.
Bargaining
This happens due to how scoping of type variables works. For more details you can take a look at the GitHub issue. It is explained in greater detail there.
Essentially what happens is that the type variable of the generic function-to-be-curried cannot be influenced by the actual type of the data-to-be-passed partially, because there is a class Protocol in between, which defines its own scope of type variables.
Trying to wrap or reorganize stuff did not yield fruitful results.
Depression
I used Protocols there to be able to represent the type of a curried function, which is not possible with e.g. Callable. And although pyright displays the type of an overloaded function as Overload[contract1, contract2, ...], there is no such symbol; there is only @overload.
So either way there's something that prevents you from expressing the type you want.
It is currently not possible to represent a fully generic typesafe curry function due to limitations of the python type system.
Acceptance
However, it is possible to compromise on certain features, like generics, or arbitrary partitioning of input arguments.
The following implementation of curry works in pyright 1.1.128.
from typing import TypeVar, Callable, List, Optional, Union

R = TypeVar("R", covariant=True)
S = TypeVar("S", contravariant=True)
T = TypeVar("T", contravariant=True)

def curry(f: Callable[[T, S], R]) -> Callable[[T], Callable[[S], R]]:
    raise Exception()

X = TypeVar("X")
Y = TypeVar("Y")

def function(x: X, y: X) -> X:
    raise Exception()

def to_optional(x: X) -> Optional[X]:
    raise Exception()

def map(f: Callable[[X], Y], xs: List[X]) -> List[Y]:
    raise Exception()

i: int = curry(function)(4)(5)
s: List[Optional[Union[str, int]]] = curry(map)(to_optional)(["dennis", 4])
First things first, I wouldn't make it a decorator; I'd wrap it as curry(foo). I find it confusing to look at an API where the decorated function signature is different from its initial definition.
On the subject of types, I would be very impressed if the general case is possible with Python type hints. I'm not sure how I'd even do it in Scala. You can do a limited number of cases, using overload for functions of two parameters as
T1 = TypeVar("T1")
T2 = TypeVar("T2")
U = TypeVar("U")

@overload
def curry(
    func: Callable[[T1, T2], U],
    max_argc: Optional[int]
) -> Callable[[T1], Callable[[T2], U]]:
    ...
adding versions for one, three, four parameters, etc. Functions with lots of parameters are code smells anyway, with the exception of varargs, which I'm not sure it even makes sense to curry.
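For the two-parameter case without max_argc, a complete runnable sketch might look like this (curry2 and add are illustrative names, not from the question):

```python
from typing import Callable, TypeVar

T1 = TypeVar("T1")
T2 = TypeVar("T2")
U = TypeVar("U")

# Fixed-arity variant: a binary function becomes a chain of
# two unary functions, fully expressible with nested Callables.
def curry2(func: Callable[[T1, T2], U]) -> Callable[[T1], Callable[[T2], U]]:
    def outer(a: T1) -> Callable[[T2], U]:
        def inner(b: T2) -> U:
            return func(a, b)
        return inner
    return outer

def add(x: int, y: int) -> int:
    return x + y

add5 = curry2(add)(5)
print(add5(3))  # → 8
```

Here mypy infers add5 as Callable[[int], int], and curry2(add)("x") is rejected.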
For example, I have a piece of code like the following:
from typing import Type, TypeVar, cast

class SuperClass:
    pass

T = TypeVar('T', bound=SuperClass)

def cast_to(obj: SuperClass, cast_to: Type[T] = SuperClass) -> T:
    return cast(cast_to, obj)
And I saved it in type_check.py. If I run mypy on it, I got the following error messages:
type_check.py:10: error: Incompatible default for argument "cast_to" (default has type "Type[SuperClass]", argument has type "Type[T]")
type_check.py:11: error: Invalid type "cast_to"
From my understanding of bound in TypeVar, as long as T is a subclass of SuperClass, it should be fine. But then why is mypy throwing an error here? Thanks!
There are two problems with your code: first the signature of your cast_to function should be:
def cast_to(obj: SuperClass, cast_to: Type[T] = Type[SuperClass]) -> T:
Then, in your cast statement, I'm not sure mypy will allow you to use cast_to as a first argument of cast. Instead you can try:
def cast_to(obj: SuperClass, cast_to: Type[T]) -> T:
    return cast(T, obj)
Of course, with this definition you won't be able to call cast_to with only one argument.
I'm now going to ask: why do you feel you need to do this? are you sure your design is good? cast should be used in very particular cases; the documentation states:
Casts are used to silence spurious type checker warnings and give the type checker a little help when it can’t quite understand what is going on.
So you should seriously question your design here! Give us a little more information about what you're trying to achieve; maybe there are better and cleaner designs than what you're trying to do.
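If the goal is just a downcast that the checker accepts and that fails loudly on a mismatch, a runtime-checked variant is one cleaner design (a sketch; checked_cast is a hypothetical name, not the asker's code):

```python
from typing import Type, TypeVar

class SuperClass:
    pass

T = TypeVar("T", bound=SuperClass)

def checked_cast(obj: SuperClass, target: Type[T]) -> T:
    # isinstance narrows obj to T for the type checker, and a wrong
    # target raises instead of silently lying the way cast() would.
    if not isinstance(obj, target):
        raise TypeError(f"expected {target.__name__}, got {type(obj).__name__}")
    return obj

class Child(SuperClass):
    pass

c = checked_cast(Child(), Child)  # accepted by mypy, typed as Child
```

Unlike cast, this costs an isinstance check at runtime, but it cannot produce a value whose static type is wrong.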
There's not a lot of detailed information online about making type annotations work with __round__. I have implemented this but I still get an error on line 16 (calling round without an ndigits argument) when I run mypy:
error: Incompatible types in assignment (expression has type "int", variable has type "MyClass")
The test passes, i.e. in both calls to round I get back an object of type MyClass. But the MyPy check fails only when I call round without an argument.
Version numbers: Python 3.6.5, mypy 0.641.
from typing import Any, SupportsRound, overload

class MyClass(SupportsRound['MyClass']):
    def __round__(self: 'MyClass', ndigits: int = 0) -> 'MyClass':
        return self

def test_tmp() -> None:
    x = MyClass()
    result: MyClass
    result = round(x, 0)
    assert type(result) == MyClass
    result = round(x)
    assert type(result) == MyClass
I believe the problem here has less to do with your use of SupportsRound and more to do with the definition of the round function. The round function is defined in typeshed, the repository of type hints for the standard library, to have the following signature:
@overload
def round(number: float) -> int: ...
@overload
def round(number: float, ndigits: None) -> int: ...
@overload
def round(number: float, ndigits: int) -> float: ...
@overload
def round(number: SupportsRound[_T]) -> int: ...
@overload
def round(number: SupportsRound[_T], ndigits: None) -> int: ...  # type: ignore
@overload
def round(number: SupportsRound[_T], ndigits: int) -> _T: ...
Note that when only one argument is provided or ndigits is None, the output is always int. This is consistent with the documented behavior of the round function in the standard library: https://docs.python.org/3/library/functions.html#round
Unfortunately, I don't see a really clean way of working around this: I don't think the implementation of SupportsRound is really consistent with this behavior.
Specifically, SupportsRound probably ought to have been defined to be something like so:
@runtime
class SupportsRound(Protocol[_T_co]):
    @abstractmethod
    @overload
    def __round__(self, ndigits: None = None) -> int: ...
    @abstractmethod
    @overload
    def __round__(self, ndigits: int) -> _T_co: ...
Basically, force the user to handle these two cases.
Actually changing the definition would probably be complicated though: there isn't really a clean way of updating any older versions of Python that come bundled with older versions of the typing module.
I would recommend filing an issue about this on the typeshed issue tracker. I personally think you've discovered a genuine inconsistency/bug here, but there's possibly some nuance here that I'm missing, so I think it would be good to escalate this.
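In the meantime, one pragmatic workaround (my own suggestion, not something from the typeshed discussion) is to call __round__ directly, since it is only the builtin round's overloads that force the int return type:

```python
from typing import SupportsRound

class MyClass(SupportsRound['MyClass']):
    def __round__(self, ndigits: int = 0) -> 'MyClass':
        return self

x = MyClass()
# round(x) resolves to the int-returning overload in typeshed,
# but the dunder itself is annotated to return MyClass:
result = x.__round__()
```

This sidesteps the typeshed signature at the cost of bypassing the builtin, so it is a workaround rather than a fix.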