I have the following function for currying. The problem is that I don't know how to make this function return a decorated function with the correct types. Help, I have not found a solution anywhere.
import functools
import typing as ty
from typing import TypeVar, Callable, Any, Optional
F = TypeVar("F", bound=Callable[..., Any])
def curry(func: F, max_argc: Optional[int] = None):
    if max_argc is None:
        max_argc = func.__code__.co_argcount

    @functools.wraps(func)
    def wrapped(*args):
        argc = len(args)
        if argc < max_argc:
            return curry(functools.partial(func, *args), max_argc - argc)
        else:
            return func(*args)

    return ty.cast(F, wrapped)
@curry
def foo(x: int, y: int) -> int:
    return x + y
foo("df")(5) # mypy error: Too few arguments for "foo"
# mypy error: "int" not callable
# mypy error: Argument 1 to "foo" has incompatible type "str"; expected "int" # True
How do I fix mypy errors 1 and 2?
Interestingly enough, I tried the exact same thing: writing a decorator that would return a curried version of any function, including generic higher-order ones.
I tried to build a curry that would allow any partitioning of the input arguments.
Denial
However, AFAIK it is not possible due to some constraints of the python type system.
I'm struggling to find a generic typesafe approach right now. I mean, Haskell has it, C++ adopted it, and TypeScript manages to do it as well, so why should Python not be able to?
I reached out to mypy alternatives such as pyright, which has AWESOME maintainers, BUT is still bound by the PEPs, which state that some things are just not possible.
Anger
When submitting an issue with pyright for the last missing piece in my curry-chain, I boiled the issue down to the following (as can be seen in this issue: https://github.com/microsoft/pyright/issues/1720)
from typing import TypeVar
from typing_extensions import Protocol
R = TypeVar("R", covariant=True)
S = TypeVar("S", contravariant=True)
class ArityOne(Protocol[S, R]):
    def __call__(self, __s: S) -> R:
        ...

def id_f(x: ArityOne[S, R]) -> ArityOne[S, R]:
    return x

X = TypeVar("X")

def identity(x: X) -> X:
    return x
i: int = id_f(identity)(4) # Does not type check, expected type `X`, got type `Literal[4]`
Mind the gap, this is a minimal reproducible example of the missing link.
What I initially attempted to do was the following (skipping the actual curry implementation, which, in comparison, resembles a walk in the park):
Write a curry decorator (without types)
Define Unary, Binary and Ternary (etc.) Protocols, which are the more modern version of the function type Callable. Coincidentally, Protocols can specify @overload-ed signatures for their __call__ methods, which brings me to the next point.
Define CurriedBinary, CurriedTernary (etc.) using Protocols with @overload-ed __call__ methods.
Define @overloads for the curry function, e.g. Binary -> CurriedBinary, Ternary -> CurriedTernary
With this, everything was in place, and it worked great for fixed-type functions, i.e. int -> int or str -> int -> bool.
I don't have my attempted implementation on hand right now, though.
However, when currying functions such as map or filter, it fails to match the curried, generic version of the function with the actual types.
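A minimal sketch of that Protocol-based approach for the binary case (names like Binary and CurriedBinary are illustrative; the runtime body is a simplified stand-in, since the original implementation is not at hand):

```python
from typing import Callable, Protocol, TypeVar, cast, overload

A = TypeVar("A", contravariant=True)
B = TypeVar("B", contravariant=True)
R = TypeVar("R", covariant=True)

class Binary(Protocol[A, B, R]):
    def __call__(self, a: A, b: B) -> R: ...

class CurriedBinary(Protocol[A, B, R]):
    # @overload on __call__ lets one Protocol describe both call shapes:
    # partial application with one argument, or a full call with two
    @overload
    def __call__(self, a: A) -> Callable[[B], R]: ...
    @overload
    def __call__(self, a: A, b: B) -> R: ...

def curry(f: Binary[A, B, R]) -> CurriedBinary[A, B, R]:
    # runtime part: accept one argument (returning a continuation) or two
    def curried(a, *rest):
        if rest:
            return f(a, *rest)
        return lambda b: f(a, b)
    return cast(CurriedBinary[A, B, R], curried)
```

This type-checks for fixed-type binaries, but as described below it breaks down for generic functions, because the Protocol introduces its own type-variable scope.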
Bargaining
This happens due to how scoping of type variables works; it is explained in greater detail in the GitHub issue linked above.
Essentially what happens is that the type variable of the generic function-to-be-curried cannot be influenced by the actual type of the data-to-be-passed partially, because there is a class Protocol in between, which defines its own scope of type variables.
Trying to wrap or reorganize stuff did not yield fruitful results.
Depression
I used Protocols there to be able to represent the type of a curried function,
which is not possible with e.g. Callable, and although pyright displays the type of an overloaded function as Overload[contract1, contract2, ...], there is no such symbol, only @overload.
So either way there's something that prevents you from expressing the type you want.
It is currently not possible to represent a fully generic typesafe curry function due to limitations of the python type system.
Acceptance
However, it is possible to compromise on certain features, like generics, or arbitrary partitioning of input arguments.
The following implementation of curry works in pyright 1.1.128.
from typing import TypeVar, Callable, List, Optional, Union

R = TypeVar("R", covariant=True)
S = TypeVar("S", contravariant=True)
T = TypeVar("T", contravariant=True)

def curry(f: Callable[[T, S], R]) -> Callable[[T], Callable[[S], R]]:
    raise Exception()

X = TypeVar("X")
Y = TypeVar("Y")

def function(x: X, y: X) -> X:
    raise Exception()

def to_optional(x: X) -> Optional[X]:
    raise Exception()

def map(f: Callable[[X], Y], xs: List[X]) -> List[Y]:
    raise Exception()

i: int = curry(function)(4)(5)
s: List[Optional[Union[str, int]]] = curry(map)(to_optional)(["dennis", 4])
First things first, I wouldn't make it a decorator; I'd wrap it as curry(foo). I find it confusing to look at an API where the decorated function signature is different from its initial definition.
On the subject of types, I would be very impressed if the general case is possible with Python type hints. I'm not sure how I'd even do it in Scala. You can do a limited number of cases, using overload for functions of two parameters as
T1 = TypeVar("T1")
T2 = TypeVar("T2")
U = TypeVar("U")
@overload
def curry(
    func: Callable[[T1, T2], U],
    max_argc: Optional[int]
) -> Callable[[T1], Callable[[T2], U]]:
    ...
adding versions for one, three, four parameters etc. Functions with lots of parameters are code smells anyway, with the exception of varargs, which I'm not sure if it even makes sense to curry.
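A sketch of how those stacked overloads could fit together with a single untyped implementation (the helper _curry_n is invented here for illustration, and only fixed-arity positional functions are handled):

```python
import functools
from typing import Any, Callable, TypeVar, overload

T1 = TypeVar("T1")
T2 = TypeVar("T2")
T3 = TypeVar("T3")
U = TypeVar("U")

@overload
def curry(func: Callable[[T1], U]) -> Callable[[T1], U]: ...
@overload
def curry(func: Callable[[T1, T2], U]) -> Callable[[T1], Callable[[T2], U]]: ...
@overload
def curry(
    func: Callable[[T1, T2, T3], U]
) -> Callable[[T1], Callable[[T2], Callable[[T3], U]]]: ...

def curry(func: Callable[..., Any]) -> Callable[..., Any]:
    # read the arity once up front; functools.partial has no __code__,
    # so the remaining count is threaded through explicitly
    return _curry_n(func, func.__code__.co_argcount)

def _curry_n(func: Callable[..., Any], argc: int) -> Callable[..., Any]:
    if argc <= 1:
        return func
    return lambda x: _curry_n(functools.partial(func, x), argc - 1)
```

Each overload maps an n-ary Callable to its fully curried shape, so the checker sees curry(f)(a)(b) with precise types up to the arities you bothered to spell out.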
Related
Assume a function that takes an object as a parameter. There could be various ways to express the creation of that parameter object, some of which are more expressive and likely easier to use.
To give a simple example, we have a function which takes a DateTime. We also want to accept string representations of a DateTime where possible (for example '20220606').
# version 1, strict: must send a DateTime
def UsefulFunc(startdate: DateTime) -> None:
    pass

# version 2, allow more types, but loose on type hints
def UsefulFunc(startdate: (DateTime, str)) -> None:
    # check if type is str, convert to DateTime if yes
    pass

# version 3, multiple signatures that convert and call the base function
def UsefulFuncString(startdatestr: str) -> None:
    startdate = ...  # convert startdatestr to a DateTime
    UsefulFunc(startdate)

# … …
What approach is recommended in Python (I come from C# background)? If there's no clear indication/ or decision is based on situation, what are the considerations?
If you want to type-hint your function, you can use typing.Union:

from datetime import datetime
from typing import Union

def UsefulFunc(startdate: Union[str, datetime]) -> None:
    ...

or in py3.10+:

def UsefulFunc(startdate: str | datetime) -> None:
    ...

But type hints are just annotations in Python; if you want to do something based on those types, you need to check inside your function and act accordingly:

def UsefulFunc(startdate: str | datetime) -> None:
    if isinstance(startdate, str):
        ...
    elif isinstance(startdate, datetime):
        ...
    else:
        raise ValueError("Invalid type")
There is also functools.singledispatch, which does the above dispatch for you:

from functools import singledispatch

@singledispatch
def UsefulFunc(startdate):
    ...  # else case

@UsefulFunc.register
def _(startdate: datetime):
    ...  # datetime case

@UsefulFunc.register
def _(startdate: str):
    ...  # str case
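For instance, a hedged sketch of how the dispatch could handle the '20220606' example from the question (the '%Y%m%d' format string is an assumption about the string representation; registering by annotation requires Python 3.7+):

```python
from datetime import datetime
from functools import singledispatch

@singledispatch
def UsefulFunc(startdate):
    # fallback for unsupported types
    raise TypeError(f"Invalid type: {type(startdate).__name__}")

@UsefulFunc.register
def _(startdate: datetime):
    return startdate  # already a datetime, use as-is

@UsefulFunc.register
def _(startdate: str):
    # assumed string format, e.g. '20220606'
    return UsefulFunc(datetime.strptime(startdate, "%Y%m%d"))
```

UsefulFunc("20220606") dispatches to the str branch and converts, UsefulFunc(datetime.now()) dispatches to the datetime branch, and anything else hits the fallback.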
After some research, and taking inspiration from @Copperfield's answer, I found an elegant solution to the problem.
Let's first rephrase the problem: we have a function that takes an object. We want to provide some overloads, which will do the validations/conversions etc. and call the base function which accepts the object. We also need to reject any call that does not match an implemented signature.
The library that I found very useful was multipledispatch. An easy example:
from multipledispatch import dispatch

@dispatch(int, int)
def add_nums(num1: int, num2: int) -> int:
    return num1 + num2

@dispatch(str, str)
def add_nums(num1: str, num2: str) -> int:
    # do some useful validations / object transformations,
    # implement any intermediate logic before calling the base func;
    # this lets the base function do its intended job rather than
    # implementing overloads
    return add_nums(int(num1), int(num2))
If we call add_nums(40, 15), we get 55, as the (int, int) version gets called. add_nums('10', '15') gets us 25, as expected, since the (str, str) version gets called.
It becomes very interesting when we call add_nums(10, 10.0), as this fails saying NotImplementedError: Could not find signature for add_nums: <int, float>. Essentially, any call not matching the (int, int) or (str, str) format fails with a NotImplementedError exception.
This is by far the closest behaviour to function overloading in typed languages.
The only concern I have - this library was last updated on Aug 9, 2018.
Say that I've got a polymorphic function that repeats any object passed into it as an argument (similar to the itertools.repeat from the Python Standard Library):
def repeat(i):
    while True:
        yield i
How do I write function annotation that tells that this is a polymorphic function?
Let me be clear, I understand that one possibility is to write:
from typing import Any, Iterable
def repeat(i: Any) -> Iterable[Any]:
    while True:
        yield i
This solution is, however, ambiguous, because it holds for both of the following cases:
repeat(i: Apples) -> Iterator[Apples]:
or
repeat(i: Apples) -> Iterator[Oranges]:
I would like to have a solution that truly reflects the fact that the function accepts any type but gives back an iterator which produces the same type it was called with.
As an example from Haskell, this would be solved using type variables; the function type in Haskell would simply be:
repeat :: a -> [a]
where a is a type variable. How would I get the same result in Python?
There is a very similar example in the Python docs using TypeVar:
def repeat(x: T, n: int) -> Sequence[T]:
    """Return a list containing n references to x."""
    return [x] * n

where T = TypeVar('T') (T can be anything) is used. So you can adapt this:
from typing import Iterable, TypeVar

T = TypeVar('T')

def repeat(i: T) -> Iterable[T]:
    while True:
        yield i
I want to statically enforce that a method of a class returns a value wrapped in some abstract type, that I know nothing about:
E.g. given the abstract class
F = ???

class ThingF(Generic[F]):
    @abstractmethod
    def action(self) -> F[Foo]:
        ...
I want to be able to statically check that this is invalid:
class ThingI(ThingF[List]):
    def action(self) -> Foo:
        ...
because action does not return List[Foo].
However the above declaration for ThingF does not even run, because Generic expects its arguments to be type variables and I cannot find a way to make F a type variable "with a hole".
Both
F = TypeVar('F')
and
T = TypeVar('T')
F = Generic[T]
do not work, because either TypeVar is not subscriptable or Generic[~T] cannot be used as a type variable.
Basically what I want is a "higher kinded type variable", an abstraction of a type constructor, if you will. I.e. something that says "F can be any type that takes another type to produce a concrete type".
Is there any way to express this with Python's type annotations and have it statically checked with mypy?
Unfortunately, the type system (as described in PEP 484) does not support higher-kinded types -- there's some relevant discussion here: https://github.com/python/typing/issues/548.
It's possible that mypy and other type checking tools will gain support for them at some point in the future, but I wouldn't hold my breath. It would require some pretty complicated implementation work to pull off.
You can use Higher Kinded Types with dry-python/returns.
We ship both primitives and a custom mypy plugin to make it work.
Here's an example with Mappable aka Functor:
from typing import Callable, TypeVar
from returns.interfaces.mappable import MappableN
from returns.primitives.hkt import Kinded, KindN, kinded
_FirstType = TypeVar('_FirstType')
_SecondType = TypeVar('_SecondType')
_ThirdType = TypeVar('_ThirdType')
_UpdatedType = TypeVar('_UpdatedType')
_MappableKind = TypeVar('_MappableKind', bound=MappableN)
@kinded
def map_(
    container: KindN[_MappableKind, _FirstType, _SecondType, _ThirdType],
    function: Callable[[_FirstType], _UpdatedType],
) -> KindN[_MappableKind, _UpdatedType, _SecondType, _ThirdType]:
    return container.map(function)
It will work for any Mappable, examples:
from returns.maybe import Maybe
def test(arg: float) -> int:
    ...
reveal_type(map_(Maybe.from_value(1.5), test)) # N: Revealed type is 'returns.maybe.Maybe[builtins.int]'
And:
from returns.result import Result
def test(arg: float) -> int:
    ...
x: Result[float, str]
reveal_type(map_(x, test)) # N: Revealed type is 'returns.result.Result[builtins.int, builtins.str]'
It surely has some limitations, like: it only works with direct Kind subtypes, and we need separate aliases Kind1, Kind2, Kind3, etc., because at the time of writing mypy does not support variadic generics.
Source: https://github.com/dry-python/returns/blob/master/returns/primitives/hkt.py
Plugin: https://github.com/dry-python/returns/blob/master/returns/contrib/mypy/_features/kind.py
Docs: https://returns.readthedocs.io/en/latest/pages/hkt.html
Announcement post: https://sobolevn.me/2020/10/higher-kinded-types-in-python
Is there a "standard" way to add simple dynamic type checking in Python, doing something like:
def add(a, b):
    # Argument type check
    check(a, int)
    check(b, int)

    # Calculate
    res = a + b

    # Result type check and return
    check(res, int)
    return res
An exception could then be raised by check in case of a type mismatch.
I could of course cook something myself, doing isinstance(..., ...) or type(...) == ..., but I wonder if there is some "standard" module for this kind of type checking.
It would be nice if more complex type checking could also be done, like checking if an argument is either str or int, or for example a list of str.
I am aware that this somewhat defies Python's principle of duck typing, but I just spent several hours debugging an argument of the wrong type, and it was a large program, so the symptom showed up many nested calls away from the cause.
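The check helper described above could look like the sketch below; the one-element-list convention for "list of T" is invented here for illustration, and isinstance already accepts a tuple of types for the either/or case:

```python
def check(value, expected):
    """Raise TypeError unless value matches expected.

    expected may be a type, a tuple of types (either/or),
    or a one-element list like [str] meaning 'list of str'.
    """
    if isinstance(expected, list):  # e.g. check(xs, [str])
        (elem_type,) = expected
        if not isinstance(value, list) or not all(
            isinstance(v, elem_type) for v in value
        ):
            raise TypeError(f"expected list of {elem_type.__name__}, got {value!r}")
    elif not isinstance(value, expected):  # a type or a tuple of types
        raise TypeError(f"expected {expected}, got {type(value).__name__}")
```

So check(x, (str, int)) accepts either a str or an int, and check(xs, [str]) enforces a list whose elements are all str.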
You could use a decorator function. Something like this:
def typecheck(*types):
    def __f(f):
        def _f(*args):
            for a, t in zip(args, types):
                if not isinstance(a, t):
                    print(f"WARNING: Expected {t}, got {a!r}")
            return f(*args)
        return _f
    return __f

@typecheck(int, str, int)
def foo(a, b, c):
    pass

foo(1, "bar", 5)
foo(4, None, "string")

Output (for the second call) is

WARNING: Expected <class 'str'>, got None
WARNING: Expected <class 'int'>, got 'string'
As it stands, this does not work for keyword parameters, though.
Edit: After some googling, I found some much more complex type checking decorators (1) (2) also supporting keyword parameters and return types.
There is mypy, which is being considered for entry into Python proper, but in general there isn't any way to do what you want.
Your code should not depend on concrete types but on general interfaces (e.g. not whether two things are integers but whether they are "addable"). This allows you to take advantage of dynamic typing and write generic functions. If a type does not support the interface you want, it will throw an exception which you can catch. So your add would be better done like so:
def add(a, b):
    try:
        return a + b
    except TypeError as t:
        # some logging code here for your debugging ease
        raise t
If you are on Python 3, there is optional type annotation for functions. This means that the following code is valid Python 3.
def add(a: int, b: int):
    return a + b
I don't know if there are any tools that take advantage of the hints to give you actual compile-time checking, though.
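The hints can, however, be enforced at runtime with a small decorator. The sketch below (enforce_hints is a made-up name, and it only handles plain classes, not generic types like List[int]) reads the annotations with typing.get_type_hints and checks each bound argument:

```python
import inspect
from typing import get_type_hints

def enforce_hints(func):
    # resolve the function's annotations and signature once, up front
    hints = get_type_hints(func)
    sig = inspect.signature(func)

    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            expected = hints.get(name)
            # only check plain classes; skip generics and unannotated params
            if isinstance(expected, type) and not isinstance(value, expected):
                raise TypeError(
                    f"{name}: expected {expected.__name__}, "
                    f"got {type(value).__name__}"
                )
        return func(*args, **kwargs)

    return wrapper

@enforce_hints
def add(a: int, b: int):
    return a + b
```

add(1, 2) passes, while add(1, "x") raises a TypeError at the call site instead of deep inside the function.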
I know it's not Pythonic to write functions that care about the type of the arguments, but there are cases when it's simply impossible to ignore types because they are handled differently.
Having a bunch of isinstance checks in your function is just ugly; is there any function decorator available that enables function overloads? Something like this:
@overload(str)
def func(val):
    print('This is a string')

@overload(int)
def func(val):
    print('This is an int')
Update:
Here's some comments I left on David Zaslavsky's answer:
With a few modification[s], this will suit my purposes pretty well. One other limitation I noticed in your implementation, since you use func.__name__ as the dictionary key, you are prone to name collisions between modules, which is not always desirable. [cont'd]
[cont.] For example, if I have one module that overloads func, and another completely unrelated module that also overloads func, these overloads will collide because the function dispatch dict is global. That dict should be made local to the module, somehow. And not only that, it should also support some kind of 'inheritance'. [cont'd]
[cont.] By 'inheritance' I mean this: say I have a module first with some overloads. Then two more modules that are unrelated but each import first; both of these modules add new overloads to the already existing ones that they just imported. These two modules should be able to use the overloads in first, but the new ones that they just added should not collide with each other between modules. (This is actually pretty hard to do right, now that I think about it.)
Some of these problems could possibly be solved by changing the decorator syntax a little bit:
first.py

@overload(str, str)
def concatenate(a, b):
    return a + b

@concatenate.overload(int, int)
def concatenate(a, b):
    return str(a) + str(b)

second.py

from first import concatenate

@concatenate.overload(float, str)
def concatenate(a, b):
    return str(a) + b
Since Python 3.4 the functools module supports a @singledispatch decorator. It works like this:
from functools import singledispatch

@singledispatch
def func(val):
    raise NotImplementedError

@func.register
def _(val: str):
    print('This is a string')

@func.register
def _(val: int):
    print('This is an int')
Usage
func("test") --> "This is a string"
func(1) --> "This is an int"
func(None) --> NotImplementedError
Quick answer: there is an overload package on PyPI which implements this more robustly than what I describe below, although using a slightly different syntax. It's declared to work only with Python 3 but it looks like only slight modifications (if any, I haven't tried) would be needed to make it work with Python 2.
Long answer: In languages where you can overload functions, the name of a function is (either literally or effectively) augmented by information about its type signature, both when the function is defined and when it is called. When a compiler or interpreter looks up the function definition, then, it uses both the declared name and the types of the parameters to resolve which function to access. So the logical way to implement overloading in Python is to implement a wrapper that uses both the declared name and the parameter types to resolve the function.
Here's a simple implementation:
from collections import defaultdict

def determine_types(args, kwargs):
    return tuple([type(a) for a in args]), \
           tuple([(k, type(v)) for k, v in kwargs.items()])

function_table = defaultdict(dict)

def overload(arg_types=(), kwarg_types=()):
    def wrap(func):
        named_func = function_table[func.__name__]
        named_func[arg_types, kwarg_types] = func
        def call_function_by_signature(*args, **kwargs):
            return named_func[determine_types(args, kwargs)](*args, **kwargs)
        return call_function_by_signature
    return wrap
overload should be called with two optional arguments, a tuple representing the types of all positional arguments and a tuple of tuples representing the name-type mappings of all keyword arguments. Here's a usage example:
>>> @overload((str, int))
... def f(a, b):
...     return a * b
>>> @overload((int, int))
... def f(a, b):
...     return a + b
>>> print(f('a', 2))
aa
>>> print(f(4, 2))
6
>>> @overload((str,), (('foo', int), ('bar', float)))
... def g(a, foo, bar):
...     return foo * a + str(bar)
>>> @overload((str,), (('foo', float), ('bar', float)))
... def g(a, foo, bar):
...     return a + str(foo * bar)
>>> print(g('a', foo=7, bar=4.4))
aaaaaaa4.4
>>> print(g('b', foo=7., bar=4.4))
b30.800000000000004
Shortcomings of this include
It doesn't actually check that the function the decorator is applied to is even compatible with the arguments given to the decorator. You could write
@overload((str, int))
def h():
    return 0
and you'd get an error when the function was called.
It doesn't gracefully handle the case where no overloaded version exists corresponding to the types of the arguments passed (it would help to raise a more descriptive error)
It distinguishes between named and positional arguments, so something like
g('a', 7, bar=4.4)
doesn't work.
There are a lot of nested parentheses involved in using this, as in the definitions for g.
As mentioned in the comments, this doesn't deal with functions having the same name in different modules.
All of these could be remedied with enough fiddling, I think. In particular, the issue of name collisions is easily resolved by storing the dispatch table as an attribute of the function returned from the decorator. But as I said, this is just a simple example to demonstrate the basics of how to do it.
This doesn't directly answer your question, but if you really want to have something that behaves like an overloaded function for different types and (quite rightly) don't want to use isinstance then I'd suggest something like:
def func(int_val=None, str_val=None):
    if sum(x is not None for x in (int_val, str_val)) != 1:
        raise TypeError("exactly one value should be passed in")
    if int_val is not None:
        print('This is an int')
    if str_val is not None:
        print('This is a string')
In use the intent is obvious, and it doesn't even require the different options to have different types:
func(int_val=3)
func(str_val="squirrel")
Yes, there is an overload decorator in the typing library that can be used to help make complex type hints easier.
from collections.abc import Sequence
from typing import overload
@overload
def double(input_: int) -> int:
    ...

@overload
def double(input_: Sequence[int]) -> list[int]:
    ...

def double(input_: int | Sequence[int]) -> int | list[int]:
    if isinstance(input_, Sequence):
        return [i * 2 for i in input_]
    return input_ * 2
Check this link for more details.