ParamSpec for a pre-defined function, without using generic Callable[P]

I want to write a wrapper function for a known function, like this:

def wrapper(*args, **kwargs):
    foo()
    return known_function(*args, **kwargs)

How can I add type annotations to wrapper, such that it exactly follows the type annotations of known_function?
I have looked at ParamSpec, but it appears to only work when the wrapper-function is generic and takes the inner function as argument.
P = ParamSpec("P")
T = TypeVar("T")

def wrapper(func_arg_that_i_dont_want: Callable[P, T], *args: P.args, **kwargs: P.kwargs) -> T:
    foo()
    return known_function(*args, **kwargs)
Can I force the P to only be valid for known_function, without linking it to a Callable argument?

PEP 612 as well as the documentation of ParamSpec.args and ParamSpec.kwargs are pretty clear on this:
These “properties” can only be used as the annotated types for *args and **kwargs, accessed from a ParamSpec already in scope.
- Source: PEP 612 ("The components of a ParamSpec" -> "Valid use locations")
Both attributes require the annotated parameter to be in scope.
- Source: typing module documentation (class typing.ParamSpec -> args/kwargs)
They [parameter specifications] are only valid when used in Concatenate, or as the first argument to Callable, or as parameters for user-defined Generics.
- Source: typing module documentation (class typing.ParamSpec, second paragraph)
So no, you cannot use parameter specification args/kwargs without binding the ParamSpec to a concrete Callable in the scope you want to use them in.
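To see the rule in action, here is a minimal sketch (mine, not from the question) that a type checker rejects, because nothing in the signature binds P:

from typing import ParamSpec

P = ParamSpec("P")

def known_function(x: int, y: str) -> bool:
    return str(x) == y

# Rejected: P appears in *args/**kwargs, but no Callable[P, ...] parameter
# or enclosing generic binds it, so checkers flag P as unbound here.
def wrapper(*args: P.args, **kwargs: P.kwargs) -> bool:
    return known_function(*args, **kwargs)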
I question why you would even want that. If you know that wrapper will always call known_function and you want it to (as you said) have the exact same arguments, then you just annotate it with the same arguments. Example:
def known_function(x: int, y: str) -> bool:
    return str(x) == y

def wrapper(x: int, y: str) -> bool:
    # other things...
    return known_function(x, y)
If you do want wrapper to accept additional arguments aside from those passed on to known_function, then you just include those as well:
def known_function(x: int, y: str) -> bool:
    return str(x) == y

def wrapper(a: float, x: int, y: str) -> bool:
    print(a ** 2)
    return known_function(x, y)
If your argument is that you don't want to repeat yourself because known_function has 42 distinct and complexly typed parameters, then (with all due respect) the design of known_function should be covered in copious amounts of gasoline and set ablaze.
If you insist on dynamically associating the parameter specifications (or are curious about possible workarounds for academic reasons), the following is the best thing I can think of.
You write a protected decorator that is only intended to be used on known_function. (You could even raise an exception if it is called with anything else, to protect your own sanity.) You define your wrapper inside that decorator (and add any additional arguments, if you want any). Thus, you'll be able to annotate its *args/**kwargs with the ParamSpecArgs/ParamSpecKwargs of the decorated function. In this case you probably don't want to use functools.wraps, because the function you receive out of that decorator is probably intended not to replace known_function, but to stand alongside it.
Here is a full working example:
from collections.abc import Callable
from typing import Concatenate, ParamSpec, TypeVar

P = ParamSpec("P")
T = TypeVar("T")

def known_function(x: int, y: str) -> bool:
    """Does thing XY"""
    return str(x) == y

def _decorate(f: Callable[P, T]) -> Callable[Concatenate[float, P], T]:
    if f is not known_function:  # type: ignore[comparison-overlap]
        raise RuntimeError("This is an exclusive decorator.")

    def _wrapper(a: float, /, *args: P.args, **kwargs: P.kwargs) -> T:
        """Also does thing XY, but first does something else."""
        print(a ** 2)
        return f(*args, **kwargs)

    return _wrapper

wrapper = _decorate(known_function)

if __name__ == "__main__":
    print(known_function(1, "2"))
    print(wrapper(3.14, 10, "10"))
Output as expected:
False
9.8596
True
Adding reveal_type(wrapper) to the script and running mypy gives the following:
Revealed type is "def (builtins.float, x: builtins.int, y: builtins.str) -> builtins.bool"
PyCharm also gives the relevant suggestions regarding the function signature, which it infers from having known_function passed into _decorate.
But again, just to be clear, I don't think this is good design. If your "wrapper" is not generic, but instead always calls the same function, you should explicitly annotate it, so that its parameters correspond to that function. After all:
Explicit is better than implicit.
- Zen of Python, line 2

Related

Type Hint `Callable[[int, ...], None]` using `ParamSpec`?

Similar to Python type hint Callable with one known positional type and then *args and **kwargs, I want to type hint a Callable for which the following is known:

1. It must have at least 1 positional input.
2. The first positional input must be int.
3. It must return None.

Apart from that, any signature is valid. I tried the following, but it doesn't work. So, is it possible to do this in Python 3.10/3.11 at all?
from typing import TypeAlias, ParamSpec, Concatenate, Callable

P = ParamSpec("P")
intfun: TypeAlias = Callable[Concatenate[int, P], None]

def foo(i: int) -> None:
    pass

a: intfun = foo  # ✘ Incompatible types in assignment
# (expression has type "Callable[[int], None]",
#  variable has type "Callable[[int, VarArg(Any), KwArg(Any)], None]")
https://mypy-play.net/?mypy=latest&python=3.11&gist=f4c26907bfc0ae0118b90c1fa5a79fe8
I am using mypy==1.0.0.
Context: I want to type hint a dict that holds key-value pairs where the value could be any Callable satisfying properties 1, 2, 3.
The somewhat cryptic notation used by mypy for the type you annotated a with indicates that mypy does not implicitly specialize the generic callable you defined to intfun[...] (literal ellipsis here) when you omit type arguments. As pointed out here, there is a case to be made for mypy to simply assume that, like it assumes an Any argument for "normal" generics.
But that is beside the point here, because when you are dealing with generics you should treat them as such. Essentially, P is the type variable here, so whenever you want to use intfun as an annotation, you should either specify it or bind another type variable, depending on the context.
In this case, you can just do a: intfun[[]] = foo and the code will pass the type check. That annotation specializes intfun to callables that have no other parameters (besides the mandatory int). Thus the following will cause an error in the last line:
from typing import TypeAlias, ParamSpec, Concatenate, Callable

P = ParamSpec("P")
intfun: TypeAlias = Callable[Concatenate[int, P], None]

def foo(i: int) -> None:
    pass

def bar(i: int, s: str) -> None:
    pass

a: intfun[[]] = foo  # passes
a = bar  # error
Depending on the context in which you want to use that type alias, you may want to provide a different type argument. Here is another example, where intfun is used in its generic form:
from typing import TypeAlias, ParamSpec, Concatenate, Callable

P = ParamSpec("P")
intfun: TypeAlias = Callable[Concatenate[int, P], None]

def foo(i: int) -> None:
    pass

def baz(f: intfun[P]) -> Callable[Concatenate[int, P], int]:
    def wrapper(i: int, /, *args: P.args, **kwargs: P.kwargs) -> int:
        f(i, *args, **kwargs)
        return i
    return wrapper

fooz = baz(foo)  # passes

How do I correctly type a wrapper function?

Assume the following declarations:

from typing import Callable, TypeVar

T = TypeVar('T')

def wrapper(fn: Callable[..., T]) -> Callable[..., T]:
    ...

def identity(a: T) -> T:
    ...

@wrapper
def w_wrapped(a: T) -> T:
    ...

@identity
def i_wrapped(a: T) -> T:
    ...
The two annotated functions can be used like this:
def apply(fn: Callable[[str], int]) -> int:
    # types fine:
    val1 = fn(i_wrapped(''))
    # mypy complains: error: Argument 1 has incompatible type "T"; expected "str"
    val2 = fn(w_wrapped(''))
    return val1 + val2
What's wrong with the Callable type? I can use Callable[..., Any] instead of Callable[..., T] in the wrapper declaration, but I feel like this partially defeats the purpose: I would like to declare that when you use the wrapper with str, the result will be str, not just anything. There may be other workarounds too, but is this a mypy limitation or my misunderstanding?
I think there may be two things going on here:
Firstly, a mypy bug; see this and this.
Secondly, Callable[..., T] is too loose. Specifically, there's no connection between its arguments and its return value. As a result, with @wrapper, w_wrapped becomes a Callable[..., T] with no constraint on its argument, and the output of w_wrapped('') is an unbound T, which can't be passed to fn, which expects a str.
You have a number of options depending on your use case, including:
- def wrapper(fn: Callable[[U], T]) -> Callable[[U], T]: for U = TypeVar('U'), though I believe the mypy bugs stop this from working. U could also be T.
- def wrapper(fn: C) -> C: for C = TypeVar('C', bound=Callable). This doesn't have the problem of Any, because bounding on Callable retains the full type signature. However, it will limit your implementation of wrapper, short of type: ignore; see the sketch below.
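For illustration, a minimal sketch of the second option (mine, under the assumption that a cast inside wrapper is acceptable):

from typing import Any, Callable, TypeVar, cast

C = TypeVar("C", bound=Callable[..., Any])

def wrapper(fn: C) -> C:
    def inner(*args: Any, **kwargs: Any) -> Any:
        return fn(*args, **kwargs)
    # The bound keeps fn's full signature for callers; the cast is the
    # price paid inside the implementation (or use `type: ignore`).
    return cast(C, inner)

@wrapper
def w_wrapped(a: str) -> str:
    return a

def apply(fn: Callable[[str], int]) -> int:
    # Now type-checks: w_wrapped('') is known to be a str.
    return fn(w_wrapped(''))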
When you write Callable[..., T], I believe the type checker is not able to correlate ... with T.
apply's fn is expecting a str, and w_wrapped is returning a str, but this happens inside wrapper's inner logic. Since ... is not relatable to T, the checker is dealing with two different T's, like T_0 and T_1. Maybe that's what's causing the type mismatch.

Regarding Python methods in same class [duplicate]

I am trying to implement method overloading in Python:

class A:
    def stackoverflow(self):
        print 'first method'
    def stackoverflow(self, i):
        print 'second method', i

ob = A()
ob.stackoverflow(2)
but the output is second method 2; similarly:
class A:
    def stackoverflow(self):
        print 'first method'
    def stackoverflow(self, i):
        print 'second method', i

ob = A()
ob.stackoverflow()
gives
Traceback (most recent call last):
  File "my.py", line 9, in <module>
    ob.stackoverflow()
TypeError: stackoverflow() takes exactly 2 arguments (1 given)
How do I make this work?
It's method overloading, not method overriding. And in Python, you historically do it all in one function:
class A:
    def stackoverflow(self, i='some_default_value'):
        print('only method')

ob = A()
ob.stackoverflow(2)
ob.stackoverflow()
See the Default Argument Values section of the Python tutorial. See "Least Astonishment" and the Mutable Default Argument for a common mistake to avoid.
See PEP 443 for information about the single dispatch generic functions added in Python 3.4:
>>> from functools import singledispatch
>>> @singledispatch
... def fun(arg, verbose=False):
...     if verbose:
...         print("Let me just say,", end=" ")
...     print(arg)
...
>>> @fun.register(int)
... def _(arg, verbose=False):
...     if verbose:
...         print("Strength in numbers, eh?", end=" ")
...     print(arg)
...
>>> @fun.register(list)
... def _(arg, verbose=False):
...     if verbose:
...         print("Enumerate this:")
...     for i, elem in enumerate(arg):
...         print(i, elem)
...
You can also use pythonlangutil:

from pythonlangutil.overload import Overload, signature

class A:
    @Overload
    @signature()
    def stackoverflow(self):
        print('first method')

    @stackoverflow.overload
    @signature("int")
    def stackoverflow(self, i):
        print('second method', i)
While agf was right with the answer in the past, pre-3.4, we now have our syntactic sugar in the form of typing.overload (PEP 484).
See the typing documentation for details on the @overload decorator, but note that this is really just syntactic sugar, and IMHO this is all people have been arguing about ever since.
Personally, I agree that having multiple functions with different signatures makes the code more readable than having a single function with 20+ arguments all set to a default value (None most of the time) and then having to fiddle around with endless if, elif, else chains to find out what the caller actually wants our function to do with the provided set of arguments. This was long overdue, following the Zen of Python:
Beautiful is better than ugly.
and arguably also
Simple is better than complex.
Straight from the official Python documentation linked above:

from typing import Tuple, overload

@overload
def process(response: None) -> None:
    ...
@overload
def process(response: int) -> Tuple[int, str]:
    ...
@overload
def process(response: bytes) -> str:
    ...
def process(response):
    <actual implementation>
EDIT: For anyone wondering why this example is not working as you'd expect coming from other languages, I'd suggest taking a look at this discussion. The @overload-decorated functions are not supposed to have any actual implementation. This is not obvious from the example in the Python documentation.
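To make that concrete, here is a minimal runnable sketch (names are mine): the @overload stubs carry only type information, and only the final undecorated definition ever executes.

from typing import Union, overload

@overload
def double(value: int) -> int: ...
@overload
def double(value: str) -> str: ...

# The undecorated implementation must come last; it is what actually runs.
def double(value: Union[int, str]) -> Union[int, str]:
    return value * 2

print(double(21))    # 42
print(double("ab"))  # abab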
In Python, you don't do things that way. When people do that in languages like Java, they generally want a default value (if they don't, they generally want a method with a different name). So, in Python, you can have default values.
class A(object):  # Remember the ``object`` bit when working in Python 2.x
    def stackoverflow(self, i=None):
        if i is None:
            print 'first form'
        else:
            print 'second form'
As you can see, you can use this to trigger separate behaviour rather than merely having a default value.
>>> ob = A()
>>> ob.stackoverflow()
first form
>>> ob.stackoverflow(2)
second form
You can't, never need to and don't really want to.
In Python, everything is an object. Classes are things, so they are objects. So are methods.
There is an object called A which is a class. It has an attribute called stackoverflow. It can only have one such attribute.
When you write def stackoverflow(...): ..., what happens is that you create an object which is the method, and assign it to the stackoverflow attribute of A. If you write two definitions, the second one replaces the first, the same way that assignment always behaves.
You furthermore do not want to write code that does the wilder of the sorts of things that overloading is sometimes used for. That's not how the language works.
Instead of trying to define a separate function for each type of thing you could be given (which makes little sense since you don't specify types for function parameters anyway), stop worrying about what things are and start thinking about what they can do.
You not only can't write a separate one to handle a tuple vs. a list, but also don't want or need to.
All you do is take advantage of the fact that they are both, for example, iterable (i.e. you can write for element in container:). (The fact that they aren't directly related by inheritance is irrelevant.)
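For instance, one function written against the "iterable" behaviour serves lists, tuples, and generators alike (a minimal illustration of the point above):

def print_elements(container):
    # Relies only on iteration, not on the concrete container type.
    for element in container:
        print(element)

print_elements([1, 2, 3])   # a list works
print_elements((4, 5, 6))   # so does a tuple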
I write my answer in Python 3.2.1.
def overload(*functions):
    return lambda *args, **kwargs: functions[len(args)](*args, **kwargs)
How it works:
overload takes any number of callables, stores them in the tuple functions, and returns a lambda.
The lambda takes any number of arguments, then returns the result of calling the function stored at functions[number_of_unnamed_args_passed], called with the arguments passed to the lambda.
Usage:

class A:
    stackoverflow = overload(
        None,  # there is always a self argument, so this should never get called
        lambda self: print('First method'),
        lambda self, i: print('Second method', i),
    )
I think the word you're looking for is "overloading". There isn't any method overloading in Python. You can, however, use default arguments, as follows.

def stackoverflow(self, i=None):
    if i is not None:
        print 'second method', i
    else:
        print 'first method'
When you pass it an argument, it will follow the logic of the first condition and execute the first print statement. When you pass it no arguments, it will go into the else condition and execute the second print statement.
I write my answer in Python 2.7:
In Python, method overloading is not possible; if you really want to access the same function with different features, I suggest you go for method overriding.
class Base():  # Base class
    '''def add(self, a, b):
        s = a + b
        print s'''
    def add(self, a, b, c):
        self.a = a
        self.b = b
        self.c = c
        sum = a + b + c
        print sum

class Derived(Base):  # Derived class
    def add(self, a, b):  # overriding method
        sum = a + b
        print sum

add_fun_1 = Base()      # instance creation for Base class
add_fun_2 = Derived()   # instance creation for Derived class
add_fun_1.add(4, 2, 5)  # function with 3 arguments
add_fun_2.add(4, 2)     # function with 2 arguments
In Python, overloading is not an applied concept. However, if you are trying to create a case where, for instance, you want one initializer to be performed if passed an argument of type foo, and another initializer for an argument of type bar, then, since everything in Python is handled as an object, you can check the name of the passed object's class type and write conditional handling based on that.
class A:
    def __init__(self, arg):
        # Get the argument's class type as a string
        argClass = arg.__class__.__name__
        if argClass == 'foo':
            print 'Arg is of type "foo"'
            ...
        elif argClass == 'bar':
            print 'Arg is of type "bar"'
            ...
        else:
            print 'Arg is of a different type'
            ...
This concept can be applied to multiple different scenarios through different methods as needed.
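As an aside, the same dispatch is usually written with isinstance (my sketch; Foo and Bar are placeholder classes), which also matches subclasses, unlike comparing class names:

class Foo(object):
    pass

class Bar(object):
    pass

class A:
    def __init__(self, arg):
        # isinstance also covers subclasses of Foo and Bar.
        if isinstance(arg, Foo):
            print('Arg is of type "Foo"')
        elif isinstance(arg, Bar):
            print('Arg is of type "Bar"')
        else:
            print('Arg is of a different type')

A(Foo())  # Arg is of type "Foo"
A(42)     # Arg is of a different type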
In Python, you'd do this with a default argument.
class A:
    def stackoverflow(self, i=None):
        if i is None:
            print 'first method'
        else:
            print 'second method', i
Python does not support method overloading like Java or C++. We may overload the methods, but we can only use the latest defined method.
# First sum method.
# Takes two arguments and prints their sum.
def sum(a, b):
    s = a + b
    print(s)

# Second sum method.
# Takes three arguments and prints their sum.
def sum(a, b, c):
    s = a + b + c
    print(s)

# Uncommenting the line below shows an error.
# sum(4, 5)

# This line will call the second sum method.
sum(4, 5, 5)
We need to provide optional arguments or *args in order to provide a different number of arguments on calling.
Courtesy Python | Method Overloading
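A sketch of the *args variant (my example, not from the linked article):

def add(*args):
    # One definition accepts any number of arguments.
    total = 0
    for value in args:
        total += value
    print(total)

add(4, 5)     # 9
add(4, 5, 5)  # 14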
I just came across overloading.py (function overloading for Python 3) for anybody who may be interested.
From the linked repository's README file:
overloading is a module that provides function dispatching based on
the types and number of runtime arguments.
When an overloaded function is invoked, the dispatcher compares the
supplied arguments to available function signatures and calls the
implementation that provides the most accurate match.
Features

- Function validation upon registration and detailed resolution rules guarantee a unique, well-defined outcome at runtime.
- Implements function resolution caching for great performance.
- Supports optional parameters (default values) in function signatures.
- Evaluates both positional and keyword arguments when resolving the best match.
- Supports fallback functions and execution of shared code.
- Supports argument polymorphism.
- Supports classes and inheritance, including classmethods and staticmethods.
Python 3.x includes the standard typing library, which allows for method overloading with the use of the @overload decorator. Unfortunately, this is only to make the code more readable, as the @overload-decorated methods must be followed by a non-decorated method that handles different arguments.
More can be found here, but for your example:
from typing import overload
from typing import Any, Optional

class A(object):
    @overload
    def stackoverflow(self) -> None:
        ...
    @overload
    def stackoverflow(self, i: Any) -> None:
        ...
    def stackoverflow(self, i: Optional[Any] = None) -> None:
        if i is None:
            print('first method')
        else:
            print('second method', i)

ob = A()
ob.stackoverflow(2)
PEP 3124 proposed an @overload decorator to provide syntactic sugar for overloading via type inspection, instead of just working with overwriting.
Code example on overloading via @overload from PEP 3124:
from overloading import overload
from collections import Iterable

def flatten(ob):
    """Flatten an object to its component iterables"""
    yield ob

@overload
def flatten(ob: Iterable):
    for o in ob:
        for ob in flatten(o):
            yield ob

@overload
def flatten(ob: basestring):
    yield ob
is transformed by the @overload decorator to:
def flatten(ob):
    if isinstance(ob, basestring) or not isinstance(ob, Iterable):
        yield ob
    else:
        for o in ob:
            for ob in flatten(o):
                yield ob
In the MathMethod.py file:

from multipledispatch import dispatch

@dispatch(int, int)
def Add(a, b):
    return a + b

@dispatch(int, int, int)
def Add(a, b, c):
    return a + b + c

@dispatch(int, int, int, int)
def Add(a, b, c, d):
    return a + b + c + d

In the Main.py file:

import MathMethod as MM
print(MM.Add(200, 1000, 1000, 200))
We can overload the method by using multipledispatch.
There are some libraries that make this easy:
- functools: if you only need to dispatch on the first argument, use @singledispatch.
- plum-dispatch: feature-rich method/function overloading.
- multipledispatch: an alternative to plum with fewer features, but lightweight.
Python 3.5 added the typing module, which includes an overload decorator. This decorator's intended purpose is to help type checkers; functionally it's just duck typing.
from typing import Optional, overload

@overload
def foo(index: int) -> str:
    ...
@overload
def foo(name: str) -> str:
    ...
@overload
def foo(name: str, index: int) -> str:
    ...
def foo(name: Optional[str] = None, index: Optional[int] = None) -> str:
    return f"name: {name}, index: {index}"

foo(1)
foo("bar", 1)
foo("bar", None)
This leads to the overloaded signatures being shown as type information in VS Code.
And while this can help, note that it adds lots of "weird" new syntax. Its purpose (purely type hints) is not immediately obvious.
Going with a Union of types is usually a better option.
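For comparison, a Union-based version of the same function could look like this (my sketch; the parameter names are illustrative):

from typing import Optional, Union

def foo(name_or_index: Union[str, int], index: Optional[int] = None) -> str:
    # One plain signature instead of three overload stubs.
    return f"name_or_index: {name_or_index}, index: {index}"

foo(1)
foo("bar", 1)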

Type annotation for Callable that takes **kwargs

There is a function f which consumes a function g that takes a known first set of arguments plus any number of keyword arguments **kwargs. Is there a way to include the **kwargs in the type signature of g as described in f?
For example:
from typing import Callable, Any
from functools import wraps
import math

def comparator(f: Callable[[Any, Any], bool]) -> Callable[[str], bool]:
    @wraps(f)
    def wrapper(input_string: str, **kwargs) -> bool:
        a, b, *_ = input_string.split(" ")
        return f(eval(a), eval(b), **kwargs)
    return wrapper

@comparator
def equal(a, b):
    return a == b

@comparator
def equal_within(a, b, rel_tol=1e-09, abs_tol=0.0):
    return math.isclose(a, b, rel_tol=rel_tol, abs_tol=abs_tol)

# All following statements should print `True`
print(equal("1 1") == True)
print(equal("1 2") == False)
print(equal_within("5.0 4.99998", rel_tol=1e-5) == True)
print(equal_within("5.0 4.99998") == False)
The function comparator wraps its argument f with wrapper, which consumes the input for f as a string, parses it, and evaluates it using f. In this case, PyCharm gives a warning that return f(eval(a), eval(b), **kwargs) calls f with the unexpected argument **kwargs, which doesn't match the expected signature.
This post on Reddit suggests adding either Any or ... to the type signature of f like
f: Callable[[Any, Any, ...], bool]
f: Callable[[Any, Any, Any], bool]
The former causes a TypeError [1], while the latter seems misleading, since f accepts at least 2 arguments, rather than exactly 3.
Another workaround is to leave the Callable args definition open with ... like f: Callable[..., bool], but I'm wondering if there is a more appropriate solution.
[1] TypeError: Callable[[arg, ...], result]: each arg must be a type. Got Ellipsis.
tl;dr: Protocol may be the closest feature that's implemented, but it's still not sufficient for what you need. See this issue for details.
Full answer:
I think the closest feature to what you're asking for is Protocol, which was introduced in Python 3.8 (and backported to older Pythons via typing_extensions). It allows you to define a Protocol subclass that describes the behaviors of the type, pretty much like an "interface" or "trait" in other languages. For functions, a similar syntax is supported:
from typing import Any, Protocol
# from typing_extensions import Protocol  # if you're using Python 3.6

class MyFunction(Protocol):
    def __call__(self, a: Any, b: Any, **kwargs) -> bool: ...

def decorator(func: MyFunction):
    ...

@decorator  # this type-checks
def my_function(a, b, **kwargs) -> bool:
    return a == b
In this case, any function that has a matching signature can match the MyFunction type.
However, this is not sufficient for your requirements. In order for the function signatures to match, the function must be able to accept an arbitrary number of keyword arguments (i.e., have a **kwargs argument). To this point, there's still no way of specifying that the function may (optionally) take any keyword arguments. This GitHub issue discusses some possible (albeit verbose or complicated) solutions under the current restrictions.
For now, I would suggest just using Callable[..., bool] as the type annotation for f. It is possible, though, to use Protocol to refine the return type of the wrapper:

class ReturnFunc(Protocol):
    def __call__(self, s: str, **kwargs) -> bool: ...

def comparator(f: Callable[..., bool]) -> ReturnFunc:
    ...
This gets rid of the "unexpected keyword argument" error at equal_within("5.0 4.99998", rel_tol=1e-5).
With PEP 612 in Python 3.10, you may try the following solution:

from typing import Callable, Any, ParamSpec, Concatenate
from functools import wraps

P = ParamSpec("P")

def comparator(f: Callable[Concatenate[Any, Any, P], bool]) -> Callable[Concatenate[str, P], bool]:
    @wraps(f)
    def wrapper(input_string: str, *args: P.args, **kwargs: P.kwargs) -> bool:
        a, b, *_ = input_string.split(" ")
        return f(eval(a), eval(b), *args, **kwargs)
    return wrapper
However it seems that you cannot get rid of *args: P.args (which you actually don't need) as PEP 612 requires P.args and P.kwargs to be used together. If you can make sure that your decorated functions (e.g. equal and equal_within) do not take extra positional arguments, any attempts to call the functions with extra positional arguments should be rejected by the type checker.
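Under that annotation, my reading is that a type checker distinguishes the two decorated functions from the question roughly like this (not verified against every checker):

equal_within("5.0 4.99998", rel_tol=1e-5)  # accepted: rel_tol is captured by P
equal("1 1", rel_tol=1e-5)                 # rejected: equal's P is empty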

How to get Mypy to realize that the default value won't be used in certain cases

I have the following function:
#!/usr/bin/env python3
from typing import Union
def foo(b: Union[str, int]) -> int:
def bar(a: int = b) -> int: # Incompatible default for argument "a" (default has type "Union[str, int]", argument has type "int")
return a + 1
if isinstance(b, str):
return bar(0)
else:
return bar()
print(foo(3)) # 4
print(foo("hello")) # 1
On the line where I define bar, Mypy says that setting b as the default won't work.
However, due to how the program works, the only way that the default b will be used is if b is an integer. So this should work fine.
But Mypy doesn't realize that.
How can I
1. get Mypy to realize that int is the correct type for a, or
2. fix this in some way that doesn't cause too much code duplication?
(For example, I know I could write two foo functions with different signatures, but that would be too much code duplication.)
TL;DR below is just my real-life use case, since at least one answer relied on how simple my MCVE above was.
It's a function that takes a dictionary. The function returns a decorator that, when used, will add the decorated function (the decorated function is a TypeChecker) to the dictionary. The decorator allows for a parameter that specifies the name/key that the decorated function (the TypeChecker) is placed under in the dictionary. If a name is not specified, then it will use a different function (StringHelper.str_function) to figure out a name from the properties of the function itself.
Due to how decorator parameters work, the decorator creator needs to take in either the name (or nothing) or the function. If it takes just the function, then no name was specified, and it should grab a name from the function. If it takes just the name, then it will be called again on the function, and the name should be used. If it takes nothing, then it will be called again on the function, and it should grab a name from the function.
def get_type_checker_decorator(type_checkers: Dict[str, TypeChecker]) -> Callable[[Union[Optional[str], TypeChecker]], Union[Callable[[TypeChecker], TypeChecker], TypeChecker]]:
    @overload
    def type_checker_decorator(name: Optional[str]) -> Callable[[TypeChecker], TypeChecker]:
        pass

    @overload
    def type_checker_decorator(name: TypeChecker) -> TypeChecker:
        pass

    def type_checker_decorator(name: Union[Optional[str], TypeChecker] = None) -> Union[Callable[[TypeChecker], TypeChecker], TypeChecker]:
        # if name is a function, then the default will never be used
        def inner_decorator(function: TypeChecker, name: Optional[str] = name) -> TypeChecker:  # this gives the Mypy error
            if name is None:
                name = StringHelper.str_function(function)
            type_checkers[name] = function

            def wrapper(string: str) -> bool:
                return function(string)

            return wrapper

        if callable(name):
            # they just gave us the function right away without a name
            function = name
            name = None
            return inner_decorator(function, name)
        else:
            assert isinstance(name, str) or name is None
            # should be called with just the function as a parameter
            # the name will default to the given name (which may be None)
            return inner_decorator

    return type_checker_decorator
It feels awkward to force a type signature if that's not really what the function is expecting. Your bar function clearly expects an int, and forcing a Union on the type hint just to later assert that you actually only accept ints shouldn't be necessary in order to silence mypy.
Since you are accepting b as a default in bar, you should take care of the str case inside of bar, because the type signature of b has already been specified in foo. Two alternative solutions that I would've considered more appropriate to the issue at hand:
def foo(b: Union[str, int]) -> int:
    # bar takes care of the str case. Type of b already documented
    def bar(a=b) -> int:
        if isinstance(b, str):
            return bar(0)
        return a + 1
    return bar()
Defining a default value before defining bar:
Defining a default value before defining bar:

def foo(b: Union[str, int]) -> int:
    x: int = 0 if isinstance(b, str) else b
    # bar does not take a default type it won't use.
    def bar(a: int = x) -> int:
        return a + 1
    return bar()
This default value shouldn't even exist. The safe way to write this code would be
def foo(b: Union[str, int]) -> int:
    def bar(a) -> int:
        return a + 1
    if isinstance(b, str):
        return bar(0)
    else:
        return bar(b)
I don't know what situation motivated you to ask this question, but whatever program you really want to write, you probably shouldn't have a default argument value either.
Your code is very similar to trying to do
def f(x: Union[int, str]) -> int:
    y: int = x
    if isinstance(x, str):
        return 1
    else:
        return y + 1
Written like that, it seems obvious that y is wrong. We shouldn't be assigning something to a variable of static type int unless we actually know at the point of assignment that it's an int. It would be unreasonable to expect the type checker to examine all code paths that could lead to y's use to determine that this is safe. Default argument value type checking follows a similar principle; it's checked based on information available when the default value is set, not on the code paths where it could be used.
For an even more extreme example, consider
def f(x: Union[int, str]) -> int:
    def g(y: int = x):
        pass
    return 4
y will never be used. The type checker still reports the type mismatch, and there would be bug reports about it if the type checker didn't report it.
While writing this question, I think I answered it myself, so I might as well share.
The solution is slightly awkward, but it's the best I could think of, and it seems to work. First, type a as being either an integer or a string. Then, as the first line in the function, assert that a is an int.
Together, it looks like this:
#!/usr/bin/env python3
from typing import Union

def foo(b: Union[str, int]) -> int:
    def bar(a: Union[int, str] = b) -> int:  # no more Mypy error
        assert isinstance(a, int)
        return a + 1
    if isinstance(b, str):
        return bar(0)
    else:
        return bar()

print(foo(3))        # 4
print(foo("hello"))  # 1
This solution isn't perfect, though, since if you do follow the function signature and pass it a string, it will fail (due to the assert). So it requires some introspection of the function itself to determine what can actually get passed.
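Another workaround with a similar caveat is typing.cast (my addition, not from the answers above); it keeps bar's parameter annotated as int while silencing the default-value check:

from typing import Union, cast

def foo(b: Union[str, int]) -> int:
    # cast() tells Mypy to trust that b is an int here; bar() is only
    # called without arguments on the path where b really is an int.
    def bar(a: int = cast(int, b)) -> int:
        return a + 1
    if isinstance(b, str):
        return bar(0)
    return bar()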
