I'm trying to use Python function annotations (PEP 3107) as type hints for PyCharm, but I have failed to do so. The problem is probably related to my use of ABCMeta:
import abc

class base(object, metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def test(self):
        pass

class deriv1(base):
    def test(self):
        return "deriv1"

class deriv2(base):
    def test(self):
        return "deriv2"
my_list = []

def append_to_list(el: base) -> list(base):
    # def append_to_list(el):
    #     """
    #     :param el: item to add
    #     :type: base
    #     :return: items so far
    #     :rtype: list[base]
    #     """
    my_list.append(el)
    return my_list
append_to_list(deriv1())
a = append_to_list(deriv2())

for o in a:
    print(o.test())
This code does not run. Instead, I get a TypeError: 'ABCMeta' object is not iterable on the def append_to_list line.
When I use the alternative function with docstring type hints (the commented lines in the code above), everything works great.
Is it possible to use annotations for this kind of type hinting?
It's not related to abc; it's because you told Python to literally evaluate
list(base)
which is impossible, because base is not iterable. That's what the error message is telling you.
You need to change it to square brackets and wrap it in quotes (before Python 3.9, the builtin list type is not subscriptable):
def append_to_list(el: base) -> 'list[base]':
or use typing.List which is subscriptable:
from typing import List
def append_to_list(el: base) -> List[base]:
To indicate it's a list containing base objects.
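Putting the corrected annotation into a minimal runnable sketch (keeping the question's lowercase class names, and typing `my_list` as well):

```python
import abc
from typing import List


class base(abc.ABC):
    @abc.abstractmethod
    def test(self) -> str:
        ...


class deriv1(base):
    def test(self) -> str:
        return "deriv1"


my_list: List[base] = []


def append_to_list(el: base) -> List[base]:
    # checkers such as PyCharm now see the return value as a list of base
    my_list.append(el)
    return my_list
```

With this, `append_to_list(deriv1())` runs and the checker knows each element of the result has a `test()` method.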
Suppose I've got a map like function:
def generate(data, per_element):
    for element in data:
        per_element(element)
How can I add type-hints so that if I call generate(some_data, some_function) where some_data: List[SomeClass], I get a warning if SomeClass is missing a field used by some_function?
As an example - with the following code:
def some_function(x):
    print(x.value)

some_data: List[int] = [1, 2, 3]

generate(some_data, some_function)
I would like to get a warning that int does not have the attribute value.
Use a type variable to make generate generic in the type of object that data contains and that per_element expects as an argument.
from typing import Any, Callable, List, TypeVar

T = TypeVar('T')

def generate(data: List[T], per_element: Callable[[T], Any]):
    for element in data:
        per_element(element)
class Foo:
    def __init__(self):
        self.value = 3

def foo(x: Foo):
    print(x.value)

def bar(x: int):
    pass

generate([Foo(), Foo()], foo)  # OK

# Argument 2 to "generate" has incompatible type "Callable[[Foo], Any]"; expected "Callable[[int], Any]"
generate([1, 2, 3], foo)
Whatever T is, it has to be the same type for both the list and the function, to ensure that per_element can, in fact, be called on every value in data. The error produced by the second call to generate isn't exactly what you asked for, but it essentially catches the same problem: the list establishes what type T is bound to, and the function doesn't accept the correct type.
If you specifically want to require that T be a type whose instances have a value attribute, it's a bit trickier. It's similar to the use case for Protocol, but that only supports methods (or class attributes in general?), not instance attributes, as far as I know. Perhaps someone else can provide a better answer.
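For what it's worth, Protocol members can in fact be declared as instance attributes via class-level annotations, so a sketch like the following (an extension beyond the answer above; `HasValue` is a name I made up) bounds T to types exposing a value attribute:

```python
from typing import Any, Callable, List, Protocol, TypeVar


class HasValue(Protocol):
    value: int  # instance attributes may be declared in a Protocol body


T = TypeVar('T', bound=HasValue)


def generate(data: List[T], per_element: Callable[[T], Any]) -> None:
    for element in data:
        per_element(element)


class Foo:
    def __init__(self) -> None:
        self.value = 3


seen: List[int] = []


def collect(x: HasValue) -> None:
    seen.append(x.value)
```

Under this sketch, `generate([1, 2, 3], collect)` is rejected statically because int has no value attribute, while `generate([Foo(), Foo()], collect)` type-checks and runs.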
Seems like you're searching for:
def generate(data: List[AClass], per_element):
    for element in data:
        per_element(element)
So that AClass implements the method you need.
Your class needs the value attribute:
class SomeClass:
    value: Any  # I used Any, but use whatever type hint is appropriate
Then use typing.Callable in your function, along with the builtin types: starting with Python 3.7 (via from __future__ import annotations) and fully implemented in Python 3.9 (PEP 585), you can subscript the builtins themselves, and from Python 3.10 (PEP 612, or typing_extensions on earlier versions) you can also use parameter specifications:
from typing import Callable, ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")

def generate(data: list[SomeClass], per_element: Callable[P, R]) -> None:
    for element in data:
        per_element(element)
Then annotate some_function with the class type hint and a None return type:
def some_function(x: SomeClass) -> None:
    print(x.value)
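Put together, a runnable sketch of this answer might look like the following (SomeClass is a stand-in for whatever class you actually use, and I collect values into a list rather than printing so the behaviour is visible):

```python
from typing import Any, Callable, List


class SomeClass:
    value: Any

    def __init__(self, value: Any) -> None:
        self.value = value


def generate(data: List[SomeClass], per_element: Callable[[SomeClass], None]) -> None:
    for element in data:
        per_element(element)


collected: List[Any] = []


def some_function(x: SomeClass) -> None:
    collected.append(x.value)
```

Calling `generate([SomeClass(1), SomeClass(2)], some_function)` fills `collected` with the values, and passing a list of ints instead is flagged by the checker.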
I'm trying to work out how to add a type annotation for a function argument that should be a class implementing a generic protocol.
As an example, assume I have a protocol for a set that could look something like this:
from typing import (
    Protocol, TypeVar, Iterable
)

T = TypeVar('T', contravariant=True)

class Set(Protocol[T]):
    """A set of elements of type T."""

    def __init__(self, init: Iterable[T]) -> None:
        """Initialise set with init."""
        ...

    def __contains__(self, x: T) -> bool:
        """Test if x is in set."""
        ...

    def add(self, x: T) -> None:
        """Add x to the set."""
        ...

    def remove(self, x: T) -> None:
        """Remove x from the set."""
        ...
and I have an algorithm that uses sets of various types, that I want to parameterise with the set implementation. For simplicity I'll just create a list in this function to use as an example:
from typing import Type

def foo(set_type: Type[Set]) -> None:
    """Do clever stuff."""
    x = list(range(10))
    s = set_type(x)
    ...
Here, mypy tells me that Set is missing a type parameter, which I suppose is correct, but I don't want to give it one, as I plan to use set_type with different types.
If I give Set a TypeVar instead
def foo(set_type: Type[Set[T]]) -> None:
    """Do clever stuff."""
    x = list(range(10))
    s = set_type(x)
    ...
I instead get a warning that set_type() is given an incompatible type, List[int] instead of Iterable[T], which again is correct, but doesn't help me much.
Is there a way to specify that my function argument can be used as a generic constructor for sets of different types?
Protocol says nothing about the signature of __init__, even if it's defined on the Protocol. Type does a similar thing - even if Set isn't a Protocol, Type[Set] says nothing about how the type is called.
I initially suggested using Callable[[Iterable[T]], Set[T]]. However, this is problematic, and only works because I omitted the generic parameter, essentially making it Any, as discussed in this Github issue. You can instead use a (rather verbose) protocol.
class MkSet(Protocol):
    def __call__(self, it: Iterable[T]) -> Set[T]:
        ...

def foo(set_type: MkSet) -> None:
    ...
I want to make an Entity Component System (ECS) in Python.
I made an Entity class:
from typing import Optional, TypeVar, Type

T = TypeVar('T')

class Entity:
    def __init__(self):
        self.components = []

    def add_component(self, c):
        self.components.append(c)

    def get_first_component(self, Type: Type[T]) -> Optional[T]:
        for c in self.components:
            if isinstance(c, Type):
                return c

    def get_first_components(self, *Types):
        res = []
        for Type in Types:
            res.append(self.get_first_component(Type))
        return res
Type hinting for get_first_component was easy, but I don't understand how to do type hinting for the get_first_components function. This function takes a list of types and returns a list of objects of those types.
Example:
e.get_first_components(Position, Health) # returns [Position(2, 2), Health(10, 10)]
I see it like:
A = TypeVar('A')
B = TypeVar('B')

def f(Types: [Type[A], Type[B], ...]) -> [A, B, ...]:
    # some code ...
Sorry my english is bad :(
I need it for type hinting in systems:
class MoveSystem(System):
    def __init__(self) -> None:
        pass

    def run_for_entity(self, e: Entity):
        pos, m2 = e.get_first_components(Pos2, Move2)
        if m2.active:  # <- no type hinting after typing "m2."
            pos.x += m2.dx
            pos.y += m2.dy
            m2.active = False
Python's type hinting system doesn't have the ability to describe your function in the way you want. You need to be able to describe a sequence of different types of arbitrary length, and then in parallel, describe objects of those types. Unfortunately, that's not currently possible.
About the best you can do is:
def get_first_components(self, *Types: Type[T]) -> List[Optional[T]]:
But this probably won't do what you want. The T will match a common base class of the types you pass in to the function, which might be object if your classes don't have any other common base class. That means that when you unpack the returned list into separate variables, they'll all be identified by the type checker as being instance of the base class, not each having the specific type you passed it in the corresponding position.
You can make your calling code work though, by using the other method that does have workable type hints:
pos = e.get_first_component(Pos2) # will be identified as Optional[Pos2]
m2 = e.get_first_component(Move2) # will be identified as Optional[Move2]
As a side note, since the values you're getting are Optional, you probably need a check for them being None. If you get the type hints working, you'll get warned if you do something like m2.active without checking that first, since None doesn't have an active attribute.
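A minimal sketch of that None check, with a stripped-down Entity and a hypothetical Move2 component so the snippet stands alone:

```python
from typing import List, Optional, Type, TypeVar

T = TypeVar('T')


class Entity:
    def __init__(self) -> None:
        self.components: List[object] = []

    def add_component(self, c: object) -> None:
        self.components.append(c)

    def get_first_component(self, cls: Type[T]) -> Optional[T]:
        for c in self.components:
            if isinstance(c, cls):
                return c
        return None


class Move2:
    def __init__(self, dx: int, dy: int) -> None:
        self.dx, self.dy, self.active = dx, dy, True


e = Entity()
e.add_component(Move2(1, 2))

m2 = e.get_first_component(Move2)  # the checker infers Optional[Move2]
if m2 is not None:  # narrows Optional[Move2] to Move2, so m2.active is safe
    m2.active = False
```

With the None check in place, the checker offers completions on m2 and won't warn about attribute access on a possible None.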
I'm aware there's this new typing construct Annotated, where you can attach metadata to the entry variables of a function. From the docs, you can specify the maximum length of an incoming list, such as:
Annotated can be used with nested and generic aliases:
T = TypeVar('T')
Vec = Annotated[list[tuple[T, T]], MaxLen(10)]
V = Vec[int]
V == Annotated[list[tuple[int, int]], MaxLen(10)]
But I cannot quite comprehend what MaxLen is. Are you supposed to import it from somewhere else? I've tried importing typing.MaxLen, but it doesn't seem to exist (I'm using Python 3.9.6, where I'd expect it to be, if anywhere).
Example code of what I imagined it should have worked:
from typing import List, Annotated, MaxLen

def function(foo: Annotated[List[int], MaxLen(10)]):
    # ...
    return True
Where can one find MaxLen?
EDIT:
It seems like MaxLen is some sort of class you have to create yourself. The problem is that I cannot see how you are supposed to do that. Are there public examples? How would one implement this?
As stated by AntiNeutronicPlasma, MaxLen is just an example, so you'll need to create it yourself.
Here's an example for how to create and parse a custom annotation such as MaxLen to get you started.
First, we define the annotation class itself. It's a very simple class, we only need to store the relevant metadata, in this case, the max value:
class MaxLen:
    def __init__(self, value):
        self.value = value
Now, we can define a function that uses this annotation, such as the following:
def sum_nums(nums: Annotated[List[int], MaxLen(10)]):
    return sum(nums)
But it's going to be of little use if nobody checks for it. So, one option could be to implement a decorator that checks your custom annotations at runtime. The functions get_type_hints, get_origin and get_args from the typing module are going to be your best friends. Below is an example of such a decorator, which parses and enforces the MaxLen annotation on list types:
from functools import wraps
from typing import get_type_hints, get_origin, get_args, Annotated

def check_annotations(func):
    @wraps(func)
    def wrapped(**kwargs):
        # perform runtime annotation checking
        # first, get type hints from function
        type_hints = get_type_hints(func, include_extras=True)
        for param, hint in type_hints.items():
            # only process annotated types
            if get_origin(hint) is not Annotated:
                continue
            # get base type and additional arguments
            hint_type, *hint_args = get_args(hint)
            # if a list type is detected, process the args
            if hint_type is list or get_origin(hint_type) is list:
                for arg in hint_args:
                    # if MaxLen arg is detected, process it
                    if isinstance(arg, MaxLen):
                        max_len = arg.value
                        actual_len = len(kwargs[param])
                        if actual_len > max_len:
                            raise ValueError(f"Parameter '{param}' cannot have a length "
                                             f"larger than {max_len} (got length {actual_len}).")
        # execute function once all checks passed
        return func(**kwargs)
    return wrapped
(Note that this particular example only works with keyword arguments, but you could probably find a way to make it work for normal arguments too).
Now, you can apply this decorator to any function, and your custom annotation will get parsed:
from typing import Annotated, List

@check_annotations
def sum_nums_strict(nums: Annotated[List[int], MaxLen(10)]):
    return sum(nums)
Below is an example of the code in action:
>>> sum_nums(nums=list(range(5)))
10
>>> sum_nums(nums=list(range(15)))
105
>>> sum_nums_strict(nums=list(range(5)))
10
>>> sum_nums_strict(nums=list(range(15)))
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "annotated_test.py", line 29, in wrapped
raise ValueError(f"Parameter '{param}' cannot have a length "
ValueError: Parameter 'nums' cannot have a length larger than 10 (got length 15).
In the following example, how can I properly annotate the return type of the sum_two function?
from typing import Any, TypeVar

T = TypeVar('T')
S = TypeVar('S')

def sum_two(first: T, second: S):
    return first + second
Assuming the __add__ operator is properly annotated for all possible arguments that will be passed to this function, is there some way to express the return type as the return type of calling __add__ on objects of type T and S?
I would like to avoid using typing's overload decorator to identify all possible cases as there could be dozens of cases.
You can theoretically accomplish part of this by making first a generic protocol, which lets you "capture" the return type of __add__. For example:
# If you are using Python 3.7 or earlier, you'll need to pip-install
# the typing_extensions module and import Protocol from there.
from typing import TypeVar, Protocol, Generic

TOther = TypeVar('TOther', contravariant=True)
TSum = TypeVar('TSum', covariant=True)

class SupportsAdd(Protocol, Generic[TOther, TSum]):
    def __add__(self, other: TOther) -> TSum: ...
Then, you could do the following:
S = TypeVar('S')
R = TypeVar('R')

# Due to how we defined the protocol, R will correspond to the
# return type of `__add__`.
def sum_two(first: SupportsAdd[S, R], second: S) -> R:
    return first + second

# Type checks
reveal_type(sum_two("foo", "bar"))  # Revealed type is str
reveal_type(sum_two(1, 2))          # Revealed type is int
reveal_type(sum_two(1.0, 2))        # Revealed type is float

# Does not type check, since float's __radd__ is ignored
sum_two(1, 2.0)

class Custom:
    def __add__(self, x: int) -> int:
        return x

# Type checks
reveal_type(sum_two(Custom(), 3))  # Revealed type is int

# Does not type check
reveal_type(sum_two(Custom(), "bad"))
This approach does have a few limitations, however:
It does not handle cases where first has no matching __add__ but second does have a matching __radd__.
You might get some weird results if you modify Custom so __add__ is an overload. I think at least mypy currently has a bug where it doesn't know how to handle complicated cases involving subtypes and overloads properly.