I'm trying to write a Python function that constructs a list with intercepted methods in a reasonably type-safe way. It intercepts the methods by subclassing the list class that's passed in.
from typing import Type, TypeVar, List

V = TypeVar("V")
T = TypeVar("T", bound=List[V])

def build_interceptor(list_cls: Type[T]) -> T:
    class LImpl(list_cls):
        def append(self, v: V) -> None:
            print(v)
            super().append(v)
    return LImpl()

l: List[int] = build_interceptor(List[int])
l.append(10)
MyPy isn't happy with this, but the code does work.
main.py:4: error: Type variable "__main__.V" is unbound
main.py:4: note: (Hint: Use "Generic[V]" or "Protocol[V]" base class to bind "V" inside a class)
main.py:4: note: (Hint: Use "V" in function signature to bind "V" inside a function)
main.py:8: error: Variable "list_cls" is not valid as a type
main.py:8: note: See https://mypy.readthedocs.io/en/stable/common_issues.html#variables-vs-type-aliases
main.py:8: error: Invalid base class "list_cls"
I'm not sure what the fixes are. Yes, V is unbound, but I don't really care what it is beyond getting the right return type. I also think there's an issue with making both the list and its contents generic, but I'm not sure how to express that.
I think the problem with 'V' is that it can't be bound in the context of the TypeVar declaration, but it can be used when defining your new class:
from typing import List, Type, TypeVar

T = TypeVar("T", bound="List")
V = TypeVar("V")

def build_interceptor(list_cls: Type[T]) -> T:
    class LImpl(list_cls[V]):  # type: ignore
        def append(self, v: V) -> None:
            print(v)
            super().append(v)
    return LImpl()

l: List[int] = build_interceptor(List[int])
l.append(10)
This still produces 'Variable "list_cls" is not valid as a type', which is a mypy limitation: it can't analyze a variable used as a base class. It seems to work after adding the type: ignore comment.
Suppose I've got a map-like function:
def generate(data, per_element):
    for element in data:
        per_element(element)
How can I add type-hints so that if I call generate(some_data, some_function) where some_data: List[SomeClass], I get a warning if SomeClass is missing a field used by some_function?
As an example - with the following code:
def some_function(x):
    print(x.value)

some_data: List[int] = [1, 2, 3]
generate(some_data, some_function)
I would like to get a warning that int does not have the attribute value.
Use a type variable to make generate generic in the type of object that data contains and that per_element expects as an argument.
from typing import Any, Callable, List, TypeVar

T = TypeVar('T')

def generate(data: List[T], per_element: Callable[[T], Any]):
    for element in data:
        per_element(element)

class Foo:
    def __init__(self):
        self.value = 3

def foo(x: Foo):
    print(x.value)

def bar(x: int):
    pass

generate([Foo(), Foo()], foo)  # OK

# Argument 2 to "generate" has incompatible type "Callable[[Foo], Any]"; expected "Callable[[int], Any]"
generate([1, 2, 3], foo)
Whatever T is, it has to be the same type for both the list and the function, to ensure that per_element can, in fact, be called on every value in data. The error produced by the second call to generate isn't exactly what you asked for, but it essentially catches the same problem: the list establishes what type T is bound to, and the function doesn't accept the correct type.
If you specifically want to require that T be a type whose instances have a value attribute, that's the use case for Protocol: a protocol can declare instance attributes as annotated members, not just methods, as sketched below.
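A minimal self-contained sketch, assuming typing.Protocol is available (Python 3.8+, or typing_extensions on older versions); HasValue is a name I've made up for the protocol:

from typing import Any, Callable, List, Protocol, TypeVar

class HasValue(Protocol):
    value: int  # an annotated instance attribute is allowed in a protocol

T = TypeVar('T')

def generate(data: List[T], per_element: Callable[[T], Any]) -> None:
    for element in data:
        per_element(element)

def some_function(x: HasValue) -> None:
    print(x.value)

some_data: List[int] = [1, 2, 3]
# mypy rejects this call: T is bound to int by the list, and int has no "value"
# attribute, so some_function is not a valid Callable[[int], Any]
generate(some_data, some_function)

With these definitions, generate([Foo(), Foo()], some_function) should be accepted, since Foo's value attribute lets it match HasValue structurally.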
Seems like you're searching for:
def generate(data: List[AClass], per_element):
    for element in data:
        per_element(element)
So that AClass is a class that provides the attribute or method you need.
Your class needs the value attribute:
class SomeClass:
    value: Any  # I used Any, but use whatever type hint is appropriate
Then use typing.Callable in your function signature, along with the builtin types: from Python 3.7 you can write list[SomeClass] in annotations via from __future__ import annotations, from Python 3.9 the builtins are subscriptable natively (PEP 585), and from Python 3.10 (or typing_extensions) you can use parameter specifications (PEP 612):
from typing import ParamSpec, TypeVar, Callable

P = ParamSpec("P")
R = TypeVar("R")

def generate(data: list[SomeClass], per_element: Callable[P, R]) -> None:
    for element in data:
        per_element(element)
Then annotate some_function with the class as its parameter type hint and None as its return type:
def some_function(x: SomeClass) -> None:
    print(x.value)
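For illustration, a rough usage sketch under the definitions above (it assumes SomeClass instances get their value attribute set somewhere, e.g. in an __init__):

some_data: list[SomeClass] = [SomeClass(), SomeClass()]

generate(some_data, some_function)  # OK: the elements match some_function's parameter
generate([1, 2, 3], some_function)  # flagged by the checker: list[int] is not list[SomeClass]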
I have a generic lookup function that mostly returns TypeA but can sometimes return TypeB:
Types = Union[TypeA, TypeB]

def get_hashed_value(
    key: str, table: Dict[str, Types]
) -> Types:
    return table.get(key)
and I use it in two less-generic functions:
def get_valueA(key: str) -> TypeA:
    return get_hashed_value(key, A_dict)  # A_dict: Dict[str, TypeA]
and
def get_valueB(key: str) -> TypeB:
    return get_hashed_value(key, B_dict)  # B_dict: Dict[str, TypeB]
what is the best way to handle typing on this?
since get_hashed_value can return either TypeA or TypeB, the return statement in the get_* functions triggers a typing error (during my linting)
there's more logic in these methods, and I need the separate get_* functions, so I can't just collapse all the usages
it would be really nice to have explicit return types on the get_* functions
it feels like a bad practice to duplicate get_hashed_value just to get around the typing issue
it feels bad to just type-ignore everywhere get_hashed_value is called
Thanks for your help! Also I am sure this has been asked before, but I had trouble finding the answer. :\
Interestingly, this doesn't produce a type warning for me (in PyCharm). I'm not sure why it isn't warning on what's comparable to a "downcast", but PyCharm isn't perfect.
Regardless, this seems like a job that's more suited for a TypeVar than a Union:
from typing import TypeVar, Dict

T = TypeVar("T", TypeA, TypeB)  # A generic type that can only be a TypeA or TypeB
# And the T stays consistent from the input to the output
def get_hashed_value(key: str, table: Dict[str, T]) -> T:
    return table.get(key)

# Which means that if you feed it a Dict[str, TypeA], it will expect a TypeA return
def get_valueA(key: str) -> TypeA:
    return get_hashed_value(key, A_dict)

# And if you feed it a Dict[str, TypeB], it will expect a TypeB return
def get_valueB(key: str) -> TypeB:
    return get_hashed_value(key, B_dict)
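One caveat worth noting: Dict.get returns Optional[T], so strict mypy may still complain about the return statement in get_hashed_value. A sketch of one way to keep the declared return type honest (TypeA and TypeB as in the question) is to index the dict and deal with missing keys explicitly:

from typing import Dict, TypeVar

T = TypeVar("T", TypeA, TypeB)

def get_hashed_value(key: str, table: Dict[str, T]) -> T:
    # table[key] is typed as T, whereas table.get(key) would be Optional[T]
    try:
        return table[key]
    except KeyError:
        raise KeyError(f"no hashed value stored for {key!r}")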
Suppose I have a function that takes a type as an argument and returns an instance of that type:
def fun(t):
    return t(42)
Then I can call it and get objects of provided types:
fun(int) # 42
fun(float) # 42.0
fun(complex) # (42+0j)
fun(str) # "42"
fun(MyCustomType) # something
That list is not exhaustive; I'd like to be able to use any type with an appropriate constructor.
Then, I'd like to add type hints for that function. What should be the type hint for the return value of that function?
I've tried using simply t, as t is a type:
def fun(t: type) -> t:
    return t(42)
but that doesn't work:
main.py:1: error: Name 't' is not defined
This answer suggests using a TypeVar:
from typing import TypeVar

T = TypeVar("T")

def fun(t: T) -> T:
    return t(42)
But that doesn't seem to be right, as T denotes a type, so it suggests that type itself is returned, not its instance. Mypy rejects it:
main.py:6: error: "object" not callable
Using Any obviously works, but I feel it's too vague; it doesn't convey the intent:
from typing import Any
def fun(t: type) -> Any:
return t(42)
TLDR: You need a TypeVar for the return type of calling t:
def fun(t: Callable[[int], R]) -> R:
    ...
Constraining on a type is too restrictive here. The function accepts any Callable that takes an integer, and the return type of the function is that of the Callable. This can be specified using a TypeVar for the return type:
from typing import Callable, TypeVar

R = TypeVar('R')  # the variable return type

def fun(t: Callable[[int], R]) -> R:
    return t(42)

fun(int)    # Revealed type is 'builtins.int*'
fun(float)  # Revealed type is 'builtins.float*'
reveal_type(fun(lambda x: str(x)))  # Revealed type is 'builtins.str*'
This works for types as well, because type instantiation is a call.
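For example, reusing fun from above with a made-up class whose constructor takes an int:

class MyCustomType:
    def __init__(self, x: int) -> None:
        self.x = x

reveal_type(fun(MyCustomType))  # mypy infers R, and therefore the result, as MyCustomType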
If a more complex signature, e.g. with keyword arguments, is needed, use Protocol (from typing or typing_extensions).
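A sketch of what that could look like; the protocol name and the keyword argument are made up for illustration:

from typing import Protocol, TypeVar

R_co = TypeVar("R_co", covariant=True)
R = TypeVar("R")

class BuildsFrom(Protocol[R_co]):
    # a callback protocol: anything callable with an int "value" keyword argument
    def __call__(self, value: int = ...) -> R_co: ...

def fun(t: BuildsFrom[R]) -> R:
    return t(value=42)

def make_tag(value: int = 0) -> str:
    return f"#{value}"

fun(make_tag)  # mypy infers R as str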
Note that if one explicitly wants to pass only 42 to the Callable, Literal (from typing or typing_extensions) can be used to specify that.
R = TypeVar('R')

def fun(t: Callable[[Literal[42]], R]) -> R:
    return t(42)
Note that any function of the type Callable[[int], R] also satisfies Callable[[Literal[42]], R].
You are looking for typing.Type, so something to the effect of:
from typing import TypeVar, Type

T = TypeVar("T", str, complex, float, int)

def fun(t: Type[T]) -> T:
    return t(42)
fun(int)
fun(float)
fun(complex)
fun(str)
Note, your type variable needs to be constrained, because not all Type objects accept arguments, but you can constrain it to a few that do, as in your example.
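For instance, the constraint set can be widened to include your own class, as long as its constructor accepts the argument (MyCustomType here is a made-up example):

from typing import Type, TypeVar

class MyCustomType:
    def __init__(self, value: int) -> None:
        self.value = value

U = TypeVar("U", str, complex, float, int, MyCustomType)

def fun(t: Type[U]) -> U:
    return t(42)

fun(MyCustomType)  # mypy infers the return type as MyCustomType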
I am trying to wrap my head around generic type hints. Reading over this section in PEP 483, I got the impression that in
SENSOR_TYPE = TypeVar("SENSOR_TYPE")

EXP_A = Tuple[SENSOR_TYPE, float]

class EXP_B(Tuple[SENSOR_TYPE, float]):
    ...
EXP_A and EXP_B should identify the same type. In PyCharm #PC-181.4203.547, however, only EXP_B works as expected. Upon investigation, I noticed that EXP_B features a __dict__ member while EXP_A doesn't.
That got me to wonder, are both kinds of type definition actually meant to be synonymous?
Edit: My initial goal was to design a generic class EXP of 2-tuples where the second element is always a float and the first element type is variable. I want to use instances of this generic class as follows
from typing import TypeVar, Tuple, Generic

T = TypeVar("T")

class EXP_A(Tuple[T, float]):
    ...

EXP_B = Tuple[T, float]

V = TypeVar("V")

class MyClass(Generic[V]):
    def get_value_a(self, t: EXP_A[V]) -> V:
        return t[0]
    def get_value_b(self, t: EXP_B[V]) -> V:
        return t[0]

class StrClass(MyClass[str]):
    pass

instance = "a", .5
sc = StrClass()
a: str = sc.get_value_a(instance)
b: str = sc.get_value_b(instance)
(The section on user defined generic types in PEP 484 describes this definition of EXP as equivalent to EXP_B in my original code example.)
The problem is that PyCharm complains about the type of instance as a parameter:
Expected type 'EXP' (matched generic type 'EXP[V]'), got Tuple[str, float] instead. With EXP = Tuple[T, float] instead, it says: Expected type 'Tuple[Any]' (matched generic type 'Tuple[V]'), got Tuple[str, float] instead.
I followed #Michael0c2a's advice, headed over to the python typing gitter chat, and asked the question there. The answer was that the example is correct.
From this, I follow that
EXP_A and EXP_B are indeed defining the same kind of types
PyCharm as of build #PC-182.4323.49 just doesn't deal with generic type annotations very well.
Consider the following code sample:
from typing import Dict, Union

def count_chars(string) -> Dict[str, Union[str, bool, int]]:
    result = {}  # type: Dict[str, Union[str, bool, int]]
    if isinstance(string, str) is False:
        result["success"] = False
        result["message"] = "Invalid argument"
    else:
        result["success"] = True
        result["result"] = len(string)
    return result
def get_square(integer: int) -> int:
    return integer * integer

def validate_str(string: str) -> bool:
    check_count = count_chars(string)
    if check_count["success"] is False:
        print(check_count["message"])
        return False
    str_len_square = get_square(check_count["result"])
    return bool(str_len_square > 42)
result = validate_str("Lorem ipsum")
When running mypy against this code, the following error is returned:
error: Argument 1 to "get_square" has incompatible type "Union[str, bool, int]"; expected "int"
and I'm not sure how I could avoid this error without using Dict[str, Any] as the return type in the first function or installing the 'TypedDict' mypy extension. Is mypy actually 'right' and my code isn't type safe, or should this be considered a mypy bug?
Mypy is correct here -- if the values in your dict can be strs, ints, or bools, then strictly speaking we can't assume check_count["result"] will always evaluate to exactly an int.
You have a few ways of resolving this. The first way is to actually just check the type of check_count["result"] to see if it's an int. You can do this using an assert:
assert isinstance(check_count["result"], int)
str_len_square = get_square(check_count["result"])
...or perhaps an if statement:
if isinstance(check_count["result"], int):
    str_len_square = get_square(check_count["result"])
else:
    # Throw some kind of exception here?
    raise TypeError("expected an int result")
Mypy understands type checks of this form in asserts and if statements (to a limited extent).
However, it can get tedious scattering these checks throughout your code. So, it might be best to actually just give up on using dicts and switch to using classes.
That is, define a class:
class Result:
    def __init__(self, success: bool, message: str) -> None:
        self.success = success
        self.message = message
...and return an instance of that instead.
This is slightly more inconvenient in that if your goal is to ultimately return/manipulate json, you now need to write code to convert this class from/to json, but it does let you avoid type-related errors.
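A rough sketch of what that could look like for the original functions (the optional result field and its handling are my additions; get_square is as defined in the question):

from typing import Optional

class Result:
    def __init__(self, success: bool, message: str = "", result: Optional[int] = None) -> None:
        self.success = success
        self.message = message
        self.result = result

def count_chars(string) -> Result:
    if not isinstance(string, str):
        return Result(False, message="Invalid argument")
    return Result(True, result=len(string))

def validate_str(string: str) -> bool:
    check_count = count_chars(string)
    result = check_count.result
    if not check_count.success or result is None:
        print(check_count.message)
        return False
    # result is narrowed to int by the None check above
    return get_square(result) > 42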
Defining a custom class can get slightly tedious, so you can try using the NamedTuple type instead:
from typing import NamedTuple
Result = NamedTuple('Result', [('success', bool), ('message', str)])
# Use Result as a regular class
You still need to write the tuple -> json code, and if I recall correctly namedtuples (both the regular version from the collections module and this typed variant) are less performant than classes, but perhaps that doesn't matter for your use case.
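On Python 3.6.1+ the class-based syntax may read a little more cleanly (the same Result, just written differently):

from typing import NamedTuple

class Result(NamedTuple):
    success: bool
    message: str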