Coming from a C# background and knowing its approach to generic types, I'm now trying to implement something similar in Python. I need to serialize and deserialize classes in a special string format, so I created the following two base classes: the first for single-entity serialization and the second for list serialization of that entity type.
from typing import Any, TypeVar, List, cast, Type, Generic, NewType
import re

T = TypeVar('T')

class Serializable(Generic[T]):
    def to_str(self) -> str:
        raise NotImplementedError

    @classmethod
    def from_str(cls, str: str):
        raise NotImplementedError

class SerializableList(List[Serializable[T]]):
    def __init__(self):
        self.separator: str = "\n"

    @classmethod
    def from_str(cls, str: str):
        list = cls()
        for match in re.finditer(list.separator, str):
            list.append(T().from_str(match))  # <-- PROBLEM: HOW TO INIT A GENERIC ENTITY ???
            # list.append(Serializable[T].from_str(match))  # <-- Uses base class (NotImplemented) instead of derived class
        return list

    def to_str(self) -> str:
        str = ""
        for e in self:
            str = str + f"{e.to_str()}{self.separator}"
        return str
I can then derive from those classes and implement to_str and from_str. Please see the "<-- PROBLEM" marker: I have no idea how to create a new entity of the type currently used by the list. How do we do this the Python way?
As @user2357112supportsMonica says in the comments, typing.Generic is pretty much only there for static analysis, and has essentially no effect at runtime under nearly all circumstances. From the look of your code, it looks like what you're doing might be better suited to Abstract Base Classes (documentation here, tutorial here), which can be easily combined with Generic.
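For instance, here is a minimal sketch (with a hypothetical Box class, not part of your code) of why the T().from_str(...) line cannot work: at runtime, T is still just a TypeVar object, not the concrete element class.

from typing import Generic, TypeVar

T = TypeVar('T')

class Box(Generic[T]):
    def make(self):
        # T is a TypeVar object here, not a class you can instantiate
        return T

print(Box[int]().make())  # prints ~T, not <class 'int'>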
A class that has ABCMeta as its metaclass is marked as an Abstract Base Class (ABC). A subclass of an ABC cannot be instantiated unless all methods in the ABC marked with the @abstractmethod decorator have been overridden. In my suggested code below, I've explicitly added the ABCMeta metaclass to your Serializable class, and implicitly added it to your SerializableList class by having it inherit from collections.UserList instead of typing.List. (collections.UserList already has ABCMeta as its metaclass.)
Using ABCs, you could define some interfaces like this (you won't be able to instantiate these because of the abstract methods):
### ABSTRACT INTERFACES ###
from abc import ABCMeta, abstractmethod
from typing import Any, TypeVar, Type, Generic
from collections import UserList

T = TypeVar('T')

class AbstractSerializable(metaclass=ABCMeta):
    @abstractmethod
    def to_str(self) -> str: ...

    @classmethod
    @abstractmethod
    def from_str(cls: Type[T], string: str) -> T: ...

S = TypeVar('S', bound=AbstractSerializable)

class AbstractSerializableList(UserList[S]):
    separator = '\n'

    @classmethod
    @property
    @abstractmethod
    def element_cls(cls) -> Type[S]: ...

    @classmethod
    def from_str(cls, string: str):
        new_list = cls()
        # split on the separator and parse each piece with the element class
        for part in string.split(cls.separator):
            new_list.append(cls.element_cls.from_str(part))
        return new_list

    def to_str(self) -> str:
        return self.separator.join(e.to_str() for e in self)
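As a quick sanity check, instantiating the abstract class directly fails, as expected:

try:
    AbstractSerializable()
except TypeError as exc:
    print(exc)  # "Can't instantiate abstract class AbstractSerializable ..." (wording varies by Python version)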
You could then provide some concrete implementations like this:
class ConcreteSerializable(AbstractSerializable):
    def to_str(self) -> str:
        # put your actual implementation here
        ...

    @classmethod
    def from_str(cls: Type[T], string: str) -> T:
        # put your actual implementation here
        ...

class ConcreteSerializableList(AbstractSerializableList[ConcreteSerializable]):
    # this overrides the abstract classmethod-property in the base class
    element_cls = ConcreteSerializable
(By the way — I changed several of your variable names — str, list, etc — as they were shadowing builtin types and/or functions. This can often lead to annoying bugs, and even if it doesn't, is quite confusing for other people reading your code! I also cleaned up your to_str method, which can be simplified to a one-liner, and moved your separator variable to be a class variable, since it appears to be the same for all class instances and does not appear to ever be altered.)
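To make that concrete, here is a minimal usage sketch with a hypothetical Point entity (illustrative names, not part of the original question), where the subclass simply overrides element_cls with a plain class attribute:

class Point(AbstractSerializable):
    def __init__(self, x: int, y: int):
        self.x, self.y = x, y

    def to_str(self) -> str:
        return f"{self.x},{self.y}"

    @classmethod
    def from_str(cls, string: str) -> "Point":
        x, y = string.split(",")
        return cls(int(x), int(y))

class PointList(AbstractSerializableList[Point]):
    element_cls = Point

points = PointList([Point(1, 2), Point(3, 4)])
text = points.to_str()               # "1,2\n3,4"
restored = PointList.from_str(text)  # parses each line back into a Point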
For now I found a dirty solution, which is to add a Type (constructor) parameter for the list entries, like so:
class SerializableList(List[Serializable[T]]):
    #                                          This one
    #                                              |
    #                                              v
    def __init__(self, separator: str = "\n", entity_class: Type = None):
        self.separator = separator
        self.entity_class = entity_class

    @classmethod
    def from_str(cls, str: str):
        list = cls()
        for match in re.finditer(list.separator, str):
            list.append(list.entity_class.from_str(match))
        return list
I wonder if there is a cleaner way to get the correct [T] type constructor from List[T] since it is already provided there?
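One possible avenue, sketched below with hypothetical names, is to read the type argument back off the instance via __orig_class__ (the same trick used in an answer further down). Note that this only works on an instance created from a parametrized alias, so it does not help inside a classmethod like from_str, where no instance exists yet.

from typing import List, Type, TypeVar, get_args

T = TypeVar('T')

class TypedList(List[T]):
    @property
    def entity_class(self) -> Type[T]:
        # __orig_class__ is only set when the instance was created from a
        # parametrized alias such as TypedList[int]()
        return get_args(self.__orig_class__)[0]

items = TypedList[int]()
print(items.entity_class)  # <class 'int'>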
Related
I have come across an issue while setting up Python classes. I have these classes:
from abc import ABC
from dataclasses import dataclass, field

class ModifierType(ABC):
    """Represents a generic modifier type."""
    ...

class DefaultModifierType(ModifierType):
    """Represents a default modifier type."""

    @staticmethod
    def check_modifier(modifier_id: str, item_name: str, default_modifiers: tuple[str]) -> None:
        ...

@dataclass
class RequiredModifierType(ModifierType):
    """Represents a required modifier type."""
    default_quantity: int = None
    action_codes: list[str] = field(default_factory=list)

    def check_modifier(self, modifier_id: str, item_name: str) -> None:
        ...

@dataclass
class Modifier:
    """Represents a generic modifier of a group."""
    modifier_id: str
    modifier_type: ModifierType
And I also have an outer function that runs code roughly like this:
if isinstance(modifier.modifier_type, RequiredModifierType):
    modifier.modifier_type.check_modifier(
        modifier_id=modifier.modifier_id,
        item_name=item_name
    )
elif isinstance(modifier.modifier_type, DefaultModifierType):
    modifier.modifier_type.check_modifier(
        modifier_id=modifier.modifier_id,
        item_name=item_name,
        default_modifiers=default_modifiers
    )
The issue: as far as I can tell, I cannot create an abstract method on the ModifierType class because it takes different parameters in DefaultModifierType and RequiredModifierType. Is there any way to create this abstract method? If not, would it be better to move the checks into the Modifier class and check the instance of modifier_type there?
You could add some further arguments, give them a default value, and ignore them where they are irrelevant or don't make sense.
class ModifierType:
    @staticmethod
    def check_modifier(modifier_id: str, item_name: str, default_modifiers: tuple[str] = None) -> None:
        ...

class DefaultModifierType(ModifierType):
    """Represents a default modifier type."""

    @staticmethod
    def check_modifier(modifier_id: str, item_name: str, default_modifiers: tuple[str]) -> None:
        assert default_modifiers is not None
        ...
However, if they really do have quite different signatures, it is worth considering that the two methods are not conceptually the same, and should be different methods that the calling side needs to explicitly distinguish.
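For example, a rough sketch of that alternative, with illustrative method names:

class ModifierType:
    """Base type; no common check signature is forced onto subclasses."""

class DefaultModifierType(ModifierType):
    @staticmethod
    def check_default_modifier(modifier_id: str, item_name: str,
                               default_modifiers: tuple[str, ...]) -> None:
        ...

class RequiredModifierType(ModifierType):
    def check_required_modifier(self, modifier_id: str, item_name: str) -> None:
        ...

The calling code then dispatches explicitly (much like the isinstance checks in the question), and each method keeps only the parameters it actually needs.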
I am creating a dynamic class from an abstract base class. Here is a simplified example.
from abc import ABC
from typing import List, Type

class IParentClass(ABC):
    _children: List['IParentClass'] = []

    @property
    def children(self) -> List['IParentClass']:
        return self._children

    def add_child(self) -> None:
        self._children.append(self._create_child()())

    @classmethod
    def _create_child(cls: Type['IParentClass']) -> Type['IParentClass']:
        class DynamicChild(cls):
            pass
        return DynamicChild

class RealChild(IParentClass):
    pass

rc = RealChild()
rc.add_child()
rc.children[0].add_child()
rc.children[0].children[0]
The code works, but Mypy gives me two errors (Variable "cls" is not valid as a type, and Invalid base class "cls") on cls in _create_child.
I could ignore the error, but I was wondering if there is a better way to manage this.
We have a number of dataclasses representing various results, with a common ancestor Result. Each result then provides its data using its own subclass of ResultData, but we are having trouble annotating this case properly.
We came up with the following solution:
from dataclasses import dataclass
from typing import ClassVar, Generic, Optional, Sequence, Type, TypeVar

class ResultData:
    ...

T = TypeVar('T', bound=ResultData)

@dataclass
class Result(Generic[T]):
    _data_cls: ClassVar[Type[T]]
    data: Sequence[T]

    @classmethod
    def parse(cls, ...) -> T:
        self = cls()
        self.data = [self._data_cls.parse(...)]
        return self

class FooResultData(ResultData):
    ...

class FooResult(Result):
    _data_cls = FooResultData
but it stopped working lately with the mypy error ClassVar cannot contain type variables [misc]. It is also against PEP 526 (see https://www.python.org/dev/peps/pep-0526/#class-and-instance-variable-annotations), which we missed earlier.
Is there a way to annotate this case properly?
As hinted in the comments, the _data_cls attribute could be removed, assuming that it's being used for type hinting purposes. The correct way to annotate a generic class defined like class MyClass(Generic[T]) is to use MyClass[MyType] in the type annotations.
For example, hopefully the below works in mypy. I only tested in PyCharm, and it seems to infer the type well enough at least.
from dataclasses import dataclass
from functools import cached_property
from typing import Generic, Sequence, TypeVar, Any, Type

T = TypeVar('T', bound='ResultData')

class ResultData:
    ...

@dataclass
class Result(Generic[T]):
    data: Sequence[T]

    @cached_property
    def data_cls(self) -> Type[T]:
        """Get the generic type arg to Generic[T] using the `__orig_class__` attribute."""
        # noinspection PyUnresolvedReferences
        return self.__orig_class__.__args__[0]

    def parse(self):
        print(self.data_cls)

@dataclass
class FooResultData(ResultData):
    # can be removed
    this_is_a_test: Any = 'testing'

class AnotherResultData(ResultData): ...

# indicates `data` is a list of `FooResultData` objects
FooResult = Result[FooResultData]
# indicates `data` is a list of `AnotherResultData` objects
AnotherResult = Result[AnotherResultData]

f: FooResult = FooResult([FooResultData()])
f.parse()
_ = f.data[0].this_is_a_test  # no warnings

f: AnotherResult = AnotherResult([AnotherResultData()])
f.parse()
Output:
<class '__main__.FooResultData'>
<class '__main__.AnotherResultData'>
In the end I just replaced the type variable in the _data_cls annotation with the base class and fixed the annotation of the subclasses, as noted by @rv.kvetch in his answer.
The downside is the need to name the data class twice in every subclass, but in my opinion it is more legible than extracting the class in a property.
The complete solution:
from dataclasses import dataclass
from typing import ClassVar, Generic, Optional, Sequence, Type, TypeVar

class ResultData:
    ...

T = TypeVar('T', bound=ResultData)

@dataclass
class Result(Generic[T]):
    _data_cls: ClassVar[Type[ResultData]]  # Fixed annotation here
    data: Sequence[T]

    @classmethod
    def parse(cls, ...) -> T:
        self = cls()
        self.data = [self._data_cls.parse(...)]
        return self

class FooResultData(ResultData):
    ...

class FooResult(Result[FooResultData]):  # Fixed annotation here
    _data_cls = FooResultData
I'm trying to define a couple of dataclasses and an abstract class that manipulates those classes. Eventually, the my_class_handler types could be dealing with, say, JSON, XML, or SQLite files as concrete instance types.
Can someone please explain to me what this message means?
<bound method my_class_handler.class_name of <__main__.my_class_handler object at 0x000001A55FB96580>>
Here's the source code that produces the message for me.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List

@dataclass
class column:
    name: str
    heading: str

@dataclass
class my_class:
    class_name: str
    class_description: str
    columns: List[column]

class iclass_handler(ABC):
    @abstractmethod
    def class_name(self) -> str:
        pass

    @abstractmethod
    def class_name(self, value: str):
        pass

class my_class_handler(iclass_handler):
    obj: my_class

    def __init__(self):
        self.obj = my_class("test-class", "", None)

    def class_name(self) -> str:
        return self.obj.class_names

    def class_name(self, value: str):
        if (value != self.obj.class_name):
            self.obj.class_name = value

if __name__ == '__main__':
    handler = my_class_handler()
    print(handler.class_name)
If this is not the proper way of doing this, please point me in the direction where I might learn the proper way.
Thanks for your time,
Python does not allow method overloading like Java does; a later def with the same name simply replaces the earlier one, so remove the methods that overlap.
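If you do want both read and write access under one name, here is a minimal sketch of the @property approach (simplified to drop the my_class dataclass from the question):

from abc import ABC, abstractmethod

class iclass_handler(ABC):
    # declare the name once, as an abstract property
    @property
    @abstractmethod
    def class_name(self) -> str: ...

class my_class_handler(iclass_handler):
    def __init__(self):
        self._name = "test-class"

    @property
    def class_name(self) -> str:
        return self._name

    @class_name.setter
    def class_name(self, value: str) -> None:
        if value != self._name:
            self._name = value

handler = my_class_handler()
print(handler.class_name)   # prints "test-class", not a bound-method repr
handler.class_name = "other"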
@khelwood pointed out the answer to the original question. Thanks.
As for the @property approach, I tried that and was having nothing but problems, and I couldn't find any useful examples of inherited properties, so I just rewrote the function to take an additional parameter:
# I'm working from memory here but I believe this is the gist...
def class_name(self, new_value: str = None) -> str:
    if new_value is None:
        return self.obj.class_name
    if isinstance(new_value, str):
        if new_value != self.obj.class_name:
            self.obj.class_name = new_value
    return None
Anyhow, I have since refactored and have completely removed the whole class_name() method as a result of a redesign that dropped the whole concept of data-handlers.
Thanks again for the comments.
Let's say I have the following classes.
class A:
    @staticmethod
    def foo():
        pass

class B(A):
    pass
And I have a function that constructs an object based on its type and also calls one of its functions.
from typing import Type

def create(cls: Type[A]) -> A:
    cls.foo()
    return cls()
Now I can make the following calls to that function, and because B inherits from A, it's all good.
instance_a: A = create(A)
instance_b: B = create(B)
Except with the latter, type checking will start complaining, because according to the annotations create returns an instance of A.
This could be solved with TypeVar as follows.
from typing import Type, TypeVar

T = TypeVar('T')

def create(cls: Type[T]) -> T:
    cls.foo()
    return cls()
Except now the type checking doesn't do its original job of guaranteeing that cls has a method called foo. Is there a way to constrain a generic to a certain type?
You can supply a bound:
T = TypeVar('T', bound=A)
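Putting it together with the create function from above, roughly:

from typing import Type, TypeVar

class A:
    @staticmethod
    def foo():
        pass

class B(A):
    pass

T = TypeVar('T', bound=A)

def create(cls: Type[T]) -> T:
    cls.foo()        # fine: every T is known to be a subclass of A
    return cls()

instance_a: A = create(A)
instance_b: B = create(B)  # typed as returning B, so no complaint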