I have a frozen dataclass MyData that holds data.
I would like a distinguished subclass MySpecialData that can only hold data of length 1.
Here is a working implementation.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class MyData:
    id: int = field()
    data: list[float] = field()

    def __len__(self) -> int:
        return len(self.data)

@dataclass(frozen=True)
class MySpecialData(MyData):
    def __post_init__(self):
        assert len(self) == 1

# correctly throws exception
special_data = MySpecialData(id=1, data=[2, 3])
I spent some time messing with __new__ and __init__, but couldn't reach a working solution.
The code works, but I am a novice and am soliciting the opinion of someone experienced as to whether this is the "right" way to accomplish this.
Any critiques or suggestions on how to do this better or more correctly would be appreciated.
For examples not using dataclasses, I imagine the correct way would be overriding __new__ in the subclass.
I suspect my attempts at overriding __new__ fail here because of the special way dataclasses works.
Would you agree?
Thank you for your opinion.
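For reference, here is a minimal non-dataclass sketch of the __new__-override idea mentioned above; Data and SpecialData are hypothetical stand-ins, only to illustrate the pattern:

class Data:
    def __init__(self, id: int, data: list[float]) -> None:
        self.id = id
        self.data = data

    def __len__(self) -> int:
        return len(self.data)

class SpecialData(Data):
    def __new__(cls, id: int, data: list[float]) -> "SpecialData":
        # validate before the instance is created
        if len(data) != 1:
            raise ValueError(f"expected exactly one item, got {len(data)}")
        return super().__new__(cls)

SpecialData(id=1, data=[2.0])        # passes the length check
SpecialData(id=1, data=[2.0, 3.0])   # raises ValueError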
Don't use assert. Use

if len(self) != 1:
    raise ValueError

assert can be turned off with the -O switch, i.e., if you run your script like

python -O my_script.py

it will no longer raise an error.
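A minimal sketch of the subclass from the question with the assert replaced by that check:

from dataclasses import dataclass, field

@dataclass(frozen=True)
class MyData:
    id: int = field()
    data: list[float] = field()

    def __len__(self) -> int:
        return len(self.data)

@dataclass(frozen=True)
class MySpecialData(MyData):
    def __post_init__(self) -> None:
        # survives python -O, unlike assert
        if len(self) != 1:
            raise ValueError(f"expected exactly one data item, got {len(self)}")

MySpecialData(id=1, data=[2.0])       # ok
MySpecialData(id=1, data=[2.0, 3.0])  # raises ValueError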
Another option is to use a custom user-defined list subclass, which checks the len of the list upon instantiation.
from dataclasses import dataclass, field
from typing import Sequence, TypeVar, Generic

T = TypeVar('T')

class ConstrainedList(list, Generic[T]):
    def __init__(self, seq: Sequence[T] = (), desired_len: int = 1):
        super().__init__(seq)
        if len(self) != desired_len:
            raise ValueError(f'expected length {desired_len}, got {len(self)}. items={self}')

@dataclass(frozen=True)
class MyData:
    id: int = field()
    data: ConstrainedList[float] = field(default_factory=ConstrainedList)

@dataclass(frozen=True)
class MySpecialData(MyData):
    ...

# correctly throws exception
special_data = MySpecialData(id=1, data=ConstrainedList([2, 3]))
I have a set of unrelated classes (some imported) which all have a common attribute (or property) a of type dict[str, Any].
Within a there should be another dict under the key "b", which I would like to expose on any of these classes as an attribute b to simplify inst.a.get("b", {})[some_key] to inst.b[some_key].
I have made the following subclass factory to work as a class decorator for local classes and a function for imported classes.
But so far I'm failing to type hint its cls argument and return value correctly.
from functools import wraps

def access_b(cls):
    @wraps(cls, updated=())
    class Wrapper(cls):
        @property
        def b(self) -> dict[str, bool]:
            return self.a.get("b", {})
    return Wrapper
MRE of my latest typing attempt (with mypy 0.971 errors):
from functools import wraps
from typing import Any, Protocol, TypeVar

class AProtocol(Protocol):
    a: dict[str, Any]

class BProtocol(AProtocol, Protocol):
    b: dict[str, bool]

T_a = TypeVar("T_a", bound=AProtocol)
T_b = TypeVar("T_b", bound=BProtocol)

def access_b(cls: type[T_a]) -> type[T_b]:
    @wraps(cls, updated=())
    class Wrapper(cls):  # Variable "cls" is not valid as a type & Invalid base class "cls"
        @property
        def b(self) -> dict[str, bool]:
            return self.a.get("b", {})
    return Wrapper

@access_b
class Demo1:
    """Local class."""
    def __init__(self, a: dict[str, Any]):
        self.a = a.copy()

demo1 = Demo1({"b": {"allow_X": True}})
demo1.b["allow_X"]  # "Demo1" has no attribute "b"

class Demo2:
    """Consider me an imported class."""
    def __init__(self, a: dict[str, Any]):
        self.a = a.copy()

demo2 = access_b(Demo2)({"b": {"allow_X": True}})  # Cannot instantiate type "Type[<nothing>]"
demo2.b["allow_X"]
I do not understand why cls is not valid as a type, even after reading https://mypy.readthedocs.io/en/stable/common_issues.html#variables-vs-type-aliases.
I understand I should probably not return a Protocol (I suspect that is the source of Type[<nothing>]), but I don't see how I could specify "returns the original type with an extension".
PS1. I have also tried with a decorator which adds b dynamically, still failed to type it...
PS2. ...and with a decorator which uses a mixin as per @DaniilFajnberg's answer, still failing.
References:
functools.wraps(cls, updated=()) from https://stackoverflow.com/a/65470430/17676984
(Type) Variables as base classes?
This is actually a really interesting question and I am curious about what solutions other people come up with.
I read up a little on these two errors:
Variable "cls" is not valid as a type / Invalid base class "cls"
There seems to be an issue here with mypy that has been open for a long time now. There doesn't seem to be a workaround yet.
The problem, as I understand it, is that no matter how you annotate it, the function argument cls will always be a type variable and that is considered invalid as a base class. The reasoning is apparently that there is no way to make sure that the value of that variable isn't overwritten somewhere.
I honestly don't understand the intricacies well enough, but it is really strange to me that mypy seems to treat a class A defined via class A: ... differently from a variable of type Type[A], since the former should essentially just be syntactic sugar for this:
A = type('A', (object,), {})
There was also a related discussion in the mypy issue tracker. Again, hoping someone can shine some light onto this.
Adding a convenience property
In any case, from your example I gather that you are not dealing with foreign classes, but that you define them yourself. If that is the case, a Mix-in would be the simplest solution:
from typing import Any, Protocol

class AProtocol(Protocol):
    a: dict[str, Any]

class MixinAccessB:
    @property
    def b(self: AProtocol) -> dict[str, bool]:
        return self.a.get("b", {})

class SomeBase:
    ...

class OwnClass(MixinAccessB, SomeBase):
    def __init__(self, a: dict[str, Any]):
        self.a = a.copy()

demo1 = OwnClass({"b": {"allow_X": True}})
print(demo1.b["allow_X"])
Output: True
No mypy issues in --strict mode.
Mixin with a foreign class
If you are dealing with foreign classes, you could still use the Mix-in and then use functools.update_wrapper like this:
from functools import update_wrapper
from typing import Any, Protocol

class AProtocol(Protocol):
    a: dict[str, Any]

class MixinAccessB:
    """My mixin"""
    @property
    def b(self: AProtocol) -> dict[str, bool]:
        return self.a.get("b", {})

class Foreign:
    """Foreign class documentation"""
    def __init__(self, a: dict[str, Any]):
        self.a = a.copy()

class MixedForeign(MixinAccessB, Foreign):
    """foo"""
    pass

update_wrapper(MixedForeign, Foreign, updated=())

demo2 = MixedForeign({"b": {"allow_X": True}})
print(demo2.b["allow_X"])
print(f'{MixedForeign.__name__=} {MixedForeign.__doc__=}')
Output:
True
MixedForeign.__name__='Foreign' MixedForeign.__doc__='Foreign class documentation'
Also no mypy issues in --strict mode.
Note that you still need the AProtocol to make it clear that whatever self will be in that property follows that protocol, i.e. has an attribute a with the type dict[str, Any].
I hope I understood your requirements correctly and this at least provides a solution for your particular situation, even though I could not enlighten you on the type variable issue.
I have two pydantic models, the second inheriting from the first:
class RandomBaseModel(pydantic.BaseModel):
    foo: typing.Any

class RandomSpecializedModel(RandomBaseModel):
    foo: str
Then I have a function that accepts some data and a model to use for instantiating a response:
def do_something(
    data: typing.Any,
    response_model: typing.Type[RandomBaseModel] = RandomBaseModel
) -> RandomBaseModel:
    response = response_model(foo=data)
    print(f"---{type(response)}---")
    return response
Finally, the result of this function is stored into typed variables:
value_1: RandomBaseModel = do_something(42)
value_2: RandomSpecializedModel = do_something("42", response_model=RandomSpecializedModel)
This executes without any problem and works as expected: the do_something function instantiates a RandomBaseModel when response_model is omitted and a RandomSpecializedModel when it is instructed to use it. Here is the output:
---<class '__main__.RandomBaseModel'>---
---<class '__main__.RandomSpecializedModel'>---
BUT this does not please mypy which complains with this message on line value_2: RandomSpecializedModel = do_something("42", response_model=RandomSpecializedModel):
error: Incompatible types in assignment (expression has type "RandomBaseModel", variable has type "RandomSpecializedModel") [assignment]
How could I inform mypy that this function returns an instance of the pydantic model passed as the response_model argument?
To be clear: I am looking for a way to instruct mypy that this function could return a RandomBaseModel instance, a RandomSpecializedModel or an instance of any RandomBaseModel's subclass.
I found some similar question whose answers suggested to use a TypeVar, so I tried to change the do_something function for this:
AnyRandomBaseModel = typing.TypeVar("AnyRandomBaseModel", bound=RandomBaseModel)

def do_something(
    data: typing.Any,
    response_model: typing.Type[AnyRandomBaseModel] = RandomBaseModel
) -> AnyRandomBaseModel:
    response = response_model(foo=data)
    print(f"---{type(response)}---")
    return response
Although it still executes as expected, mypy now complains with:
error: Incompatible default for argument "response_model" (default has type "Type[RandomBaseModel]", argument has type "Type[AnyRandomBaseModel]")
You should probably make the base model generic instead.
from typing import TypeVar, Generic

T = TypeVar('T')

class RandomBaseModel(pydantic.BaseModel, Generic[T]):
    foo: T

class RandomSpecializedModel(RandomBaseModel[str]):
    foo: str  # you might not need this line

def do_something(
    data: T,
    response_model: typing.Type[RandomBaseModel[T]] = RandomBaseModel
) -> RandomBaseModel[T]:
    response = response_model(foo=data)
    print(f"---{type(response)}---")
    return response
Try with a Union of types:
from typing import Union

def do_something(
    data: typing.Any,
    response_model: typing.Type[RandomBaseModel] = RandomBaseModel
) -> Union[RandomBaseModel, RandomSpecializedModel]:
    ...
I know it's weird, as one of your classes inherits from the other, but I think you do not have a choice.
Just for information, since I am not very happy with it, I solved my problem with typing.cast:
value_2: RandomSpecializedModel = typing.cast(
    RandomSpecializedModel,
    do_something("42", response_model=RandomSpecializedModel)
)
This is not very satisfying to me to make it so verbose in order to just please Mypy, but at least it works without muting Mypy with a # type: ignore[].
So this is good enough... :/
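A possible alternative sketch, assuming the same RandomBaseModel and RandomSpecializedModel as above: keep the TypeVar but split the call signatures with typing.overload, typing the implementation loosely so mypy accepts both overloads and no cast or # type: ignore is needed at the call sites.

import typing

import pydantic

class RandomBaseModel(pydantic.BaseModel):
    foo: typing.Any

class RandomSpecializedModel(RandomBaseModel):
    foo: str

AnyRandomBaseModel = typing.TypeVar("AnyRandomBaseModel", bound=RandomBaseModel)

@typing.overload
def do_something(data: typing.Any) -> RandomBaseModel: ...

@typing.overload
def do_something(
    data: typing.Any, response_model: typing.Type[AnyRandomBaseModel]
) -> AnyRandomBaseModel: ...

def do_something(
    data: typing.Any,
    response_model: typing.Type[RandomBaseModel] = RandomBaseModel,
) -> typing.Any:
    # implementation typed loosely; the overloads above give callers precise types
    response = response_model(foo=data)
    print(f"---{type(response)}---")
    return response

value_1: RandomBaseModel = do_something(42)
value_2: RandomSpecializedModel = do_something("42", response_model=RandomSpecializedModel)

The first overload covers the call without response_model and returns the base model; the second binds the return type to whichever subclass is passed in.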
I'm a little new to tinkering with class inheritance in Python, particularly when it comes to using class attributes. In this case I am using a class attribute to change an argument in pydantic's Field() function. This wouldn't be too hard to do if my class contained its own constructor; however, my class User1 inherits its constructor from pydantic's BaseModel.
The idea is that I would like to be able to change the class attribute prior to creating the instance.
Please see some example code below:
from pydantic import BaseModel, Field

class User1(BaseModel):
    _set_ge = None  # create class attribute
    item: float = Field(..., ge=_set_ge)

    # avoid overriding BaseModel's __init__
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

User1._set_ge = 0  # setting the class attribute to a new value

instance = User1(item=-1)
print(instance)  # item=-1.0
When creating the instance using instance = User1(item=-1) I would expect a validation error to be thrown, but it instead passes validation and simply returns the item value.
If I had my own constructor there would be little issue in changing _set_ge, but as User1 inherits its constructor from BaseModel, things are a little more complicated.
The eventual aim is to add this class to a fastapi endpoint as follows:
from fastapi import FastAPI
from schemas import User1

app = FastAPI()

class NewUser1(User1):
    pass

NewUser1._set_ge = 0

@app.post("/")
def endpoint(request: NewUser1):
    return User1.item
To reduce code duplication, I aimed to use this method to easily change Field() arguments. If there is a better way, I'd be glad to consider that too.
This question is quite closely related to this unanswered one.
In the end, the @validator proposal by @hernán-alarcón is probably the best way to do this. For example:
from typing import ClassVar

from pydantic import BaseModel, Field, validator
from pydantic.errors import NumberNotGeError

class User(BaseModel):
    _set_ge: ClassVar[float]  # added the ClassVar typing to make it clearer, but the underscore should be sufficient
    item: float = Field(...)

    @validator('item')
    def limits(cls, v):
        limit_number = cls._set_ge
        if v >= limit_number:
            return v
        else:
            raise NumberNotGeError(limit_value=limit_number)

class User1(User):
    _set_ge = 0  # setting the class attribute to a new value

instance = User1(item=-1)  # raises the error
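As a usage sketch, the same validator can be reused with a different bound by overriding the class attribute in another subclass (User10 is a hypothetical name):

class User10(User):
    _set_ge = 10  # a different lower bound, reusing the validator from User

User10(item=12)  # passes validation
User10(item=5)   # raises a validation error (5 < 10)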
I have something like this:
from typing import TypeVar, Generic, Tuple

T = TypeVar('T')
S = TypeVar('S')

class Foo(Generic[T, S]):
    def get_type_vars(self) -> Tuple[TypeVar]:
        return  # TODO how do I return T and S here?

assert Foo().get_type_vars() == (T, S)
Is there any way to get this behavior? I need a way to find out that S and T are the TypeVars of the generic class Foo. Any ideas?
I should mention that I write some class decorators and the method get_type_vars() will be added to the class by a decorator. So all I have is the instance self in the method:
def get_type_vars(self) -> Tuple[TypeVar]:
    return  # TODO how do I return T and S here in case that self is an instance of Foo?
You can use get_args combined with __orig_bases__ to inspect generic types of the base class:
from typing import get_args

class Foo(Generic[T, S]):
    def get_type_vars(self) -> Tuple[TypeVar]:
        return get_args(type(self).__orig_bases__[0])
This would get a bit more complicated for more complex inheritance chains though.
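For illustration, a small runnable sketch of that approach; Bar is a hypothetical subclass added here only to show what happens with a parametrized base:

from typing import Generic, Tuple, TypeVar, get_args

T = TypeVar('T')
S = TypeVar('S')

class Foo(Generic[T, S]):
    def get_type_vars(self) -> Tuple[TypeVar, ...]:
        # __orig_bases__ keeps the parametrized base, e.g. Generic[T, S] or Foo[int, str]
        return get_args(type(self).__orig_bases__[0])

class Bar(Foo[int, str]):
    pass

print(Foo().get_type_vars())  # (~T, ~S)
print(Bar().get_type_vars())  # (<class 'int'>, <class 'str'>)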
I'm using class decorators in Python and cannot figure out which type annotation to give to my class to make mypy happy.
My code is the following:
from typing import Type
from pprint import pformat

def betterrepr(cls: Type[object]):
    """Improve representation of a class"""
    class improved_class(cls):  # line 12
        def __repr__(self) -> str:
            return f"Instance of {cls.__name__}, vars = {pformat(vars(self))}"
    return improved_class
I'm currently having the 2 following errors:
myprog.py:12: error: Invalid type "cls"
myprog.py:12: error: Invalid base class
What shall I use for the type of cls (and by the way, is it Pythonic to use this name for a class passed as an argument?)?
Thanks
Using function arguments as base classes is currently not supported by mypy. Your only option is to silence the error, either with a type: ignore comment or a dummy alias like base: Any = cls.
Even without annotating cls, mypy will correctly infer the type of a class decorated with betterrepr. To document that your decorator returns a class similar to the decorated class, use a TypeVar.
from typing import Type, TypeVar
from pprint import pformat

T = TypeVar('T')

def betterrepr(cls: Type[T]) -> Type[T]:
    """Improve representation of a class"""
    class IClass(cls):  # type: ignore
        def __repr__(self) -> str:
            return f"Instance of {cls.__name__}, vars = {pformat(vars(self))}"
    return IClass
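For illustration, a usage sketch with a hypothetical Point class, showing what the decorated class produces:

@betterrepr
class Point:
    def __init__(self, x: int, y: int) -> None:
        self.x = x
        self.y = y

print(Point(1, 2))  # Instance of Point, vars = {'x': 1, 'y': 2}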