Assume a function that takes an object as a parameter. There could be various ways to express the creation of that parameter object, some of which are more expressive and easier to use.
To give a simple example, we have a function that takes a DateTime. We also want to accept string representations of DateTime where possible (for example '20220606').
# version 1, strict: must send a DateTime
def UsefulFunc(startdate: DateTime) -> None:
    pass

# version 2, allow more types, but loose on type hints
def UsefulFunc(startdate: (DateTime, str)) -> None:
    # check if the type is str; convert to DateTime if so
    pass

# version 3, multiple signatures that validate and call the base function
def UsefulFuncString(startdatestr: str) -> None:
    # convert startdatestr to DateTime
    UsefulFunc(startdate)

# … …
What approach is recommended in Python (I come from a C# background)? If there's no clear indication, or the decision depends on the situation, what are the considerations?
If you want to type hint your function, you can use typing.Union:
from datetime import datetime
from typing import Union

def UsefulFunc(startdate: Union[str, datetime]) -> None:
    ...

or in Python 3.10+:

def UsefulFunc(startdate: str | datetime) -> None:
    ...
But type hints are just decoration in Python; if you want to do something based on those types, you need to check them inside your function and act accordingly:

def UsefulFunc(startdate: str | datetime) -> None:
    if isinstance(startdate, str):
        ...
    elif isinstance(startdate, datetime):
        ...
    else:
        raise ValueError("Invalid type")
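To make that pattern concrete, here is a minimal runnable sketch; the function name, the 'YYYYMMDD' format, and the conversion logic are assumptions based on the question's '20220606' example:

```python
from datetime import datetime
from typing import Union


def useful_func(startdate: Union[str, datetime]) -> datetime:
    # Hypothetical conversion: accept 'YYYYMMDD' strings, as in the
    # question's '20220606' example, and normalize to datetime.
    if isinstance(startdate, str):
        startdate = datetime.strptime(startdate, "%Y%m%d")
    elif not isinstance(startdate, datetime):
        raise ValueError(f"Invalid type: {type(startdate).__name__}")
    return startdate
```

Both useful_func("20220606") and useful_func(datetime(2022, 6, 6)) then resolve to the same datetime value.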
There is also functools.singledispatch, which can do the above dispatching for you:

from functools import singledispatch

@singledispatch
def UsefulFunc(startdate):
    ...  # else case

@UsefulFunc.register
def _(startdate: datetime):
    ...  # datetime case

@UsefulFunc.register
def _(startdate: str):
    ...  # str case
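As a runnable illustration of the same idea, here is a hypothetical parse_date built on singledispatch; the name and the 'YYYYMMDD' format are assumptions taken from the question:

```python
from datetime import datetime
from functools import singledispatch


@singledispatch
def parse_date(startdate) -> datetime:
    # Fallback for any type with no registered handler.
    raise TypeError(f"Unsupported type: {type(startdate).__name__}")


@parse_date.register
def _(startdate: datetime) -> datetime:
    # datetime passes through unchanged.
    return startdate


@parse_date.register
def _(startdate: str) -> datetime:
    # 'YYYYMMDD' format, matching the question's '20220606' example.
    return datetime.strptime(startdate, "%Y%m%d")
```

Registration by type annotation (rather than parse_date.register(str)) works in Python 3.7+.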
After some research, and taking inspiration from @Copperfield's answer, I found an elegant solution to the problem.
Let's first rephrase the problem: we have a function that takes an object. We want to provide some overloads, which do the validations/conversions etc. and call the base function that accepts the object. We also need to reject any call that doesn't match an implemented signature.
The library that I found very useful was multipledispatch. An easy example:
from multipledispatch import dispatch

@dispatch(int, int)
def add_nums(num1: int, num2: int) -> int:
    return num1 + num2

@dispatch(str, str)
def add_nums(num1: str, num2: str) -> int:
    # do some useful validations/object transformations
    # implement any intermediate logic before calling the base func;
    # this lets the base function do its intended work rather than
    # implementing overloads
    return add_nums(int(num1), int(num2))
If we call add_nums(40, 15), we get 55, as the (int, int) version gets called. add_nums('10', '15') gets us 25 as expected, since the (str, str) version gets called.
It becomes very interesting when we call add_nums(10, 10.0), as this fails with NotImplementedError: Could not find signature for add_nums: <int, float>. Essentially, any call not matching the (int, int) or (str, str) signatures fails with a NotImplementedError exception.
This is by far the closest behaviour to function overloading in statically typed languages.
The only concern I have is that this library was last updated on Aug 9, 2018.
Related
How can I specify the type hint of a variable as a function type? There is no typing.Function, and I could not find anything in the relevant PEP, PEP 483.
As @jonrsharpe noted in a comment, this can be done with typing.Callable:

from typing import Callable

def my_function(func: Callable):
    ...
Note: Callable on its own is equivalent to Callable[..., Any].
Such a Callable takes any number and type of arguments (...) and returns a value of any type (Any). If this is too unconstrained, one may also specify the types of the input argument list and return type.
For example, given:
def sum(a: int, b: int) -> int: return a+b
The corresponding annotation is:
Callable[[int, int], int]
That is, the parameter types are subscripted as a list in the first element of the outer subscription, and the return type is the second element. In general:
Callable[[ParamType1, ParamType2, .., ParamTypeN], ReturnType]
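For example, a parameterized Callable can be used like any other annotation; this small sketch (names are illustrative) uses Callable[[int], int] for a function taking one int and returning an int:

```python
from typing import Callable


def apply_twice(func: Callable[[int], int], value: int) -> int:
    # The annotation constrains func to take one int and return an int,
    # so composing it with itself is well typed.
    return func(func(value))


def double(n: int) -> int:
    return n * 2


result = apply_twice(double, 3)  # double(double(3)) == 12
```

A type checker would flag apply_twice(len, 3) here, since len does not match Callable[[int], int].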
Another interesting point to note is that you can use the built in function type() to get the type of a built in function and use that.
So you could have
def f(my_function: type(abs)) -> int:
    return my_function(100)
Or something of that form
In Python 3 it works without importing typing:

def my_function(other_function: callable):
    pass
My specific use case for wanting this functionality was to enable rich code completion in PyCharm. Using Callable didn't cause PyCharm to suggest that the object had a .__code__ attribute, which is what I wanted, in this case.
I stumbled across the types module and..
from types import FunctionType
allowed me to annotate an object with FunctionType and, voilà, PyCharm now suggests my object has a .__code__ attribute.
The OP wasn't clear on why this type hint was useful to them. Callable certainly works for anything that implements .__call__() but for further interface clarification, I submit the types module.
Bummer that Python needed two very similar modules.
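To illustrate the distinction the two modules capture, here is a small sketch (names are illustrative) showing that FunctionType matches only plain def-functions, while callable() — and typing.Callable — accept any object with __call__:

```python
from types import FunctionType


def plain_function() -> None:
    pass


class CallableObject:
    def __call__(self) -> None:
        pass


# A def-function is a FunctionType; a callable instance is not,
# even though both satisfy callable().
is_plain_match = isinstance(plain_function, FunctionType)       # True
is_instance_match = isinstance(CallableObject(), FunctionType)  # False
is_instance_callable = callable(CallableObject())               # True
```

So FunctionType is the stricter annotation: it promises function-only attributes such as .__code__, which arbitrary callables need not have.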
An easy and fancy solution is:

def f(my_function: type(lambda x: None)):
    return my_function()
This can be proved in the following way:
def poww(num1, num2):
    return num1**num2

print(type(lambda x: None) == type(poww))
and the output will be:
True
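For what it's worth, the type being compared here is exactly types.FunctionType from the earlier answer, which can be verified directly:

```python
from types import FunctionType

# type(lambda ...) is exactly types.FunctionType, so the lambda trick
# is just an indirect way of spelling the FunctionType annotation.
lambda_type = type(lambda x: None)
```

lambda_type is FunctionType evaluates to True, so the two approaches are interchangeable.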
I'm trying to understand typing.overload and have applied it in a simple case where I want a function that takes input x: Literal["foo", "bar"] and returns the list [x].
I'd like mypy to type the resulting list as list[Literal["foo"]] or list[Literal["bar"]] depending on the value of x.
I know I could achieve this with a TypeVar, but I'd still like to understand why the code below fails with the following error:
test.py:14: error: Overloaded function implementation cannot produce return type of signature 1
test.py:14: error: Overloaded function implementation cannot produce return type of signature 2
from typing import Literal, overload

@overload
def f(x: Literal["foo"]) -> list[Literal["foo"]]:
    ...

@overload
def f(x: Literal["bar"]) -> list[Literal["bar"]]:
    ...

def f(x: Literal["foo", "bar"]) -> list[Literal["foo", "bar"]]:
    return [x]
Lists in Python are invariant. That means that, even if B is a subtype of A, there is no relation between the types list[A] and list[B].
If list[B] were allowed to be a subtype of list[A], then someone could come along and do this.
my_b_list: list[B] = []
my_a_list: list[A] = my_b_list
my_a_list.append(A())
print(my_b_list) # Oh no, a list[B] contains an A value!
If you plan to modify the returned list, then what you're doing isn't safe. End of story. If you plan to treat the list as immutable, then consider what operations you actually need, and you may be able to find a covariant supertype of list in typing.
For example, Sequence is a popular choice. It supports iteration, random access, and length access, while explicitly not allowing mutation.
from typing import Literal, overload, Sequence

@overload
def f(x: Literal["foo"]) -> Sequence[Literal["foo"]]:
    ...

@overload
def f(x: Literal["bar"]) -> Sequence[Literal["bar"]]:
    ...

def f(x: Literal["foo", "bar"]) -> Sequence[Literal["foo", "bar"]]:
    return [x]
(Note: typing.Sequence is deprecated in Python 3.9; if you only plan to support 3.9+, you might use collections.abc.Sequence instead)
AFAIU your question, your actual implementation needs to accept a single type (str), not a union of multiple Literals.
The following works correctly according to pyright, and seems to provide the feature you are looking for (only allowing lists of either "foo" or "bar", rejecting everything else).
from typing import Literal, overload

@overload
def f(x: Literal["foo"]) -> list[Literal["foo"]]:
    ...

@overload
def f(x: Literal["bar"]) -> list[Literal["bar"]]:
    ...

def f(x: str) -> list[str]:
    return [x]

f("foo")  # valid
f("bar")  # valid
f("baz")  # error
which causes the following error:
a.py:20:3 - error: Argument of type "Literal['baz']" cannot be assigned to parameter "x" of type "Literal['bar']" in function "f"
"Literal['baz']" cannot be assigned to type "Literal['bar']"
There is such a function for currying. The problem is that I don’t know how to make this function return a decorated function with the correct types. Help, I have not found a solution anywhere.
import functools
import typing as ty
from typing import TypeVar, Callable, Any, Optional

F = TypeVar("F", bound=Callable[..., Any])

def curry(func: F, max_argc: Optional[int] = None):
    if max_argc is None:
        max_argc = func.__code__.co_argcount

    @functools.wraps(func)
    def wrapped(*args):
        argc = len(args)
        if argc < max_argc:
            return curry(functools.partial(func, *args), max_argc - argc)
        else:
            return func(*args)

    return ty.cast(F, wrapped)
@curry
def foo(x: int, y: int) -> int:
    return x + y

foo("df")(5)  # mypy error: Too few arguments for "foo"
              # mypy error: "int" not callable
              # mypy error: Argument 1 to "foo" has incompatible type "str"; expected "int"  # True
How can I fix the first two mypy errors?
Interestingly enough, I tried the exact same thing: writing a decorator which would return a curried version of any function, including generic higher-order ones.
I tried to build a curry that would allow any partitioning of the input arguments.
Denial
However, AFAIK it is not possible due to some constraints of the python type system.
I'm struggling to find a generic typesafe approach right now. I mean Haskell lives it, C++ adopted it, Typescript manages to do it as well, so why should python not be able to?
I reached out to mypy alternatives such as pyright, which has AWESOME maintainers, BUT is still bound by PEPs, which state that some things are just not possible.
Anger
When submitting an issue with pyright for the last missing piece in my curry-chain, I boiled the issue down to the following (as can be seen in this issue: https://github.com/microsoft/pyright/issues/1720):
from typing import TypeVar
from typing_extensions import Protocol

R = TypeVar("R", covariant=True)
S = TypeVar("S", contravariant=True)

class ArityOne(Protocol[S, R]):
    def __call__(self, __s: S) -> R:
        ...

def id_f(x: ArityOne[S, R]) -> ArityOne[S, R]:
    return x

X = TypeVar("X")

def identity(x: X) -> X:
    return x

i: int = id_f(identity)(4)  # Does not type check, expected type `X`, got type `Literal[4]`
Mind the gap, this is a minimal reproducible example of the missing link.
What I initially attempted to do was the following (skipping the actual curry implementation, which, in comparison, resembles a walk in the park):
Write a curry decorator (without types)
Define Unary, Binary and Ternary (etc.) Protocols, which are the more modern version of the function type Callable. Coincidentally, Protocols can specify @overloads for their __call__ methods, which brings me to the next point.
Define CurriedBinary, CurriedTernary (etc.) using Protocols with @overloaded __call__ methods.
Define @overloads for the curry function, e.g. Binary -> CurriedBinary, Ternary -> CurriedTernary.
With this, everything was in place, and it works great for fixed-type functions, i.e. int -> int or str -> int -> bool.
I don't have my attempted implementation on hand right now, though.
However, when currying functions such as map or filter, it fails to match the curried, generic version of the function with the actual types.
Bargaining
This happens because of how scoping of type variables works; the GitHub issue linked above explains it in greater detail.
Essentially what happens is that the type variable of the generic function-to-be-curried cannot be influenced by the actual type of the data-to-be-passed partially, because there is a class Protocol in between, which defines its own scope of type variables.
Trying to wrap or reorganize stuff did not yield fruitful results.
Depression
I used Protocols there to be able to represent the type of a curried function, which is not possible with e.g. Callable, and although pyright displays the type of an overloaded function as Overload[contract1, contract2, ...], there is no such symbol, only @overload.
So either way there's something that prevents you from expressing the type you want.
It is currently not possible to represent a fully generic typesafe curry function due to limitations of the python type system.
Acceptance
However, it is possible to compromise on certain features, like generics, or arbitrary partitioning of input arguments.
The following implementation of curry works in pyright 1.1.128.
from typing import TypeVar, Callable, List, Optional, Union

R = TypeVar("R", covariant=True)
S = TypeVar("S", contravariant=True)
T = TypeVar("T", contravariant=True)

def curry(f: Callable[[T, S], R]) -> Callable[[T], Callable[[S], R]]:
    raise Exception()

X = TypeVar("X")
Y = TypeVar("Y")

def function(x: X, y: X) -> X:
    raise Exception()

def to_optional(x: X) -> Optional[X]:
    raise Exception()

def map(f: Callable[[X], Y], xs: List[X]) -> List[Y]:
    raise Exception()

i: int = curry(function)(4)(5)
s: List[Optional[Union[str, int]]] = curry(map)(to_optional)(["dennis", 4])
First things first, I wouldn't make it a decorator; I'd wrap it as curry(foo). I find it confusing to look at an API where the decorated function's signature is different from its initial definition.
On the subject of types, I would be very impressed if the general case is possible with Python type hints. I'm not sure how I'd even do it in Scala. You can handle a limited number of cases, using overload for functions of two parameters as:
T1 = TypeVar("T1")
T2 = TypeVar("T2")
U = TypeVar("U")

@overload
def curry(
    func: Callable[[T1, T2], U],
    max_argc: Optional[int]
) -> Callable[[T1], Callable[[T2], U]]:
    ...
adding versions for one, three, four parameters, etc. Functions with lots of parameters are code smells anyway, with the exception of varargs, which I'm not sure even makes sense to curry.
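A runtime implementation matching the two-parameter overload might look like the following sketch (curry2 and the helper names are hypothetical; it deliberately supports only one-argument-at-a-time application):

```python
from typing import Callable, TypeVar

T1 = TypeVar("T1")
T2 = TypeVar("T2")
U = TypeVar("U")


def curry2(func: Callable[[T1, T2], U]) -> Callable[[T1], Callable[[T2], U]]:
    # Fixed two-parameter currying: each call supplies exactly one argument,
    # so the return type can be spelled out without higher-kinded machinery.
    def outer(a: T1) -> Callable[[T2], U]:
        def inner(b: T2) -> U:
            return func(a, b)
        return inner
    return outer


def add(a: int, b: int) -> int:
    return a + b


add_one = curry2(add)(1)  # a partially applied adder
```

curry2(add)(1)(2) evaluates to 3, and a type checker infers add_one as Callable[[int], int].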
I want to statically enforce that a method of a class returns a value wrapped in some abstract type, that I know nothing about:
E.g. given the abstract class
F = ???

class ThingF(Generic[F]):
    @abstractmethod
    def action(self) -> F[Foo]:
        ...
I want to be able to statically check that this is invalid:

class ThingI(ThingF[List]):
    def action(self) -> Foo:
        ...
because action does not return List[Foo].
However the above declaration for ThingF does not even run, because Generic expects its arguments to be type variables and I cannot find a way to make F a type variable "with a hole".
Both
F = TypeVar('F')
and
T = TypeVar('T')
F = Generic[T]
do not work, because either TypeVar is not subscriptable or Generic[~T] cannot be used as a type variable.
Basically what I want is a "higher kinded type variable", an abstraction of a type constructor, if you will. I.e. something that says "F can be any type that takes another type to produce a concrete type".
Is there any way to express this with Python's type annotations and have it statically checked with mypy?
Unfortunately, the type system (as described in PEP 484) does not support higher-kinded types -- there's some relevant discussion here: https://github.com/python/typing/issues/548.
It's possible that mypy and other type checking tools will gain support for them at some point in the future, but I wouldn't hold my breath. It would require some pretty complicated implementation work to pull off.
You can use Higher Kinded Types with dry-python/returns.
We ship both primitives and a custom mypy plugin to make it work.
Here's an example with Mappable aka Functor:
from typing import Callable, TypeVar

from returns.interfaces.mappable import MappableN
from returns.primitives.hkt import Kinded, KindN, kinded

_FirstType = TypeVar('_FirstType')
_SecondType = TypeVar('_SecondType')
_ThirdType = TypeVar('_ThirdType')
_UpdatedType = TypeVar('_UpdatedType')

_MappableKind = TypeVar('_MappableKind', bound=MappableN)

@kinded
def map_(
    container: KindN[_MappableKind, _FirstType, _SecondType, _ThirdType],
    function: Callable[[_FirstType], _UpdatedType],
) -> KindN[_MappableKind, _UpdatedType, _SecondType, _ThirdType]:
    return container.map(function)
It will work for any Mappable, examples:
from returns.maybe import Maybe

def test(arg: float) -> int:
    ...

reveal_type(map_(Maybe.from_value(1.5), test))  # N: Revealed type is 'returns.maybe.Maybe[builtins.int]'
And:
from returns.result import Result

def test(arg: float) -> int:
    ...

x: Result[float, str]
reveal_type(map_(x, test))  # N: Revealed type is 'returns.result.Result[builtins.int, builtins.str]'
It surely has some limitations: it only works with direct Kind subtypes, and we need separate aliases Kind1, Kind2, Kind3, etc., because at the moment mypy does not support variadic generics.
Source: https://github.com/dry-python/returns/blob/master/returns/primitives/hkt.py
Plugin: https://github.com/dry-python/returns/blob/master/returns/contrib/mypy/_features/kind.py
Docs: https://returns.readthedocs.io/en/latest/pages/hkt.html
Announcement post: https://sobolevn.me/2020/10/higher-kinded-types-in-python