Problem
Suppose I want to implement a class decorator that adds some attributes and functions to an existing class.
In particular, let's say I have a protocol called HasNumber, and I need a decorator can_add that adds the missing methods to convert a HasNumber class into a CanAdd.
class HasNumber(Protocol):
    num: int

class CanAdd(HasNumber):
    def add(self, num: int) -> int: ...
Implementation
I implement the decorator as follows:
_HasNumberT = TypeVar("_HasNumberT", bound=HasNumber)

def can_add(cls: Type[_HasNumberT]) -> Type[CanAdd]:
    def add(self: _HasNumberT, num: int) -> int:
        return self.num + num

    setattr(cls, "add", add)
    return cast(Type[CanAdd], cls)

@can_add
class Foo:
    num: int = 12
Error
The code works just fine when I run it, but mypy is unhappy about it for some reason.
It gives the error "Foo" has no attribute "add" [attr-defined], as if it doesn't take the return value (annotated as Type[CanAdd]) of the can_add decorator into account.
foo = Foo()
print(foo.add(4)) # "Foo" has no attribute "add" [attr-defined]
reveal_type(foo) # note: Revealed type is "test.Foo"
Question
In this issue, someone demonstrated a way of annotating this with Intersection. However, is there a way to achieve it without Intersection? (Supposing that I don't care about other attributes in Foo except the ones defined in the protocols)
Or, is it a limitation of mypy itself?
Related posts that don't solve my problem:
Mypy annotation on a class decorator
Class Decorator Compatible for Mypy
cast tells mypy that cls (with or without an add attribute) is safe to use as the return value for can_add. It does not guarantee that the protocol holds.
As a result, mypy cannot tell whether Foo has been given an add attribute, only that it's OK to use the can_add decorator. The fact that can_add has a side effect of defining the add attribute isn't visible to mypy.
You can, however, replace the decorator with direct inheritance, something like
class HasNumber(Protocol):
    num: int

_HasNumberT = TypeVar("_HasNumberT", bound=HasNumber)

class Adder(HasNumber):
    def add(self, num: int) -> int:
        return self.num + num

class Foo(Adder):
    num: int = 12

foo = Foo()
print(foo.add(4))
Related
I would like to type-hint a function such that it only accepts the type of a subclass of Foo and returns an instance of that same subclass, rather than a different subclass:
class Foo:
    pass

class Bar(Foo):
    pass

class Baz(Foo):
    pass

class Spam:
    pass

def func(t):
    return t()

x: Bar = func(Bar)
y: Baz = func(Bar)  # disallowed
func(Spam)  # disallowed
The closest attempt I have is:
T = typing.TypeVar("T", bound=Foo)

def func(t: typing.Type[T]) -> T:
    return t()
However, let's say the function is more complicated and ends up as:

def func(t: typing.Type[T]) -> T:
    return Baz()

I would like the above to error, but it doesn't.
Just to basically sum up and expand a little on the comments to your question:
That proposed annotation of yours does exactly what you described:
from typing import Type, TypeVar

T = TypeVar("T", bound="Foo")

...

def func(t: Type[T]) -> T:
    ...
I.e. t must be the class Foo or any subclass of Foo and the output will be an instance of the class provided as an argument for t.
This means that the following should be picked up by a static type checker as wrong, regardless of what kind of class Baz is:
...

def func(t: Type[T]) -> T:
    return Baz()
Returning a Baz object without consideration for t breaks the aforementioned contract.
And indeed, mypy states the following referring to the return statement:
error: Incompatible return value type (got "Baz", expected "T") [return-value]
Since this all works as expected, it is not clear what you mean, when you say this:
I would like the above to error, but it doesn't
Now, this is all relevant to static type checkers only. Python, being what it is (i.e. dynamically typed), could not care less and happily allows you to return whatever object you desire from func as well as call that function with any argument you want.
If you want errors to be raised by the Python interpreter, you will have to add some runtime type checking logic to your code yourself. For example:
...

def func(t: Type[T]) -> T:
    if not isinstance(t, type) or not issubclass(t, Foo):
        raise TypeError(f"{t} is not a subclass of `Foo`")
    return t()
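Putting the pieces together, here is a self-contained demo of that runtime guard (class names taken from the question, func's body reduced to the check plus instantiation):

```python
from typing import Type, TypeVar

class Foo: ...
class Bar(Foo): ...
class Spam: ...

T = TypeVar("T", bound=Foo)

def func(t: Type[T]) -> T:
    # Runtime guard mirroring the static bound on T.
    if not isinstance(t, type) or not issubclass(t, Foo):
        raise TypeError(f"{t} is not a subclass of `Foo`")
    return t()

print(type(func(Bar)).__name__)  # Bar
try:
    func(Spam)
except TypeError:
    print("TypeError raised")
```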
I'm solving a funny problem that requires defining a class that can be called like this:

Chain(2)(3)(4)
And it should print out the multiplication of all arguments.
I ended up a solution like this:
class Chain():
    calc = 1

    def __new__(cls, a=None):
        if a:
            cls.calc = cls.calc * a
            return cls
        else:
            return cls.calc
This works fine and cls.calc is equal to 24, but I get the wrong representation: <class '__main__.Chain'>.
Is there any way to have a representation of the multiplication instead of the class name, like what we have in __repr__ for objects?
Note: the number of chained calls is unlimited and may differ on each call.
First of all, to answer your direct question from the title:
Like everything in Python, classes are objects too. And just as classes define how instances are created (what attributes and methods they will have), metaclasses define how classes are created. So let's create a metaclass:
class Meta(type):
    def __repr__(self):
        return str(self.calc)

class Chain(metaclass=Meta):
    calc = 1

    def __new__(cls, a=None):
        if a:
            cls.calc = cls.calc * a
            return cls
        else:
            return cls.calc

print(Chain(2)(3)(4))
This will print, as expected, 24.
A few notes:
Currently, Meta blindly accesses a calc attribute. A check that it actually exists could be added, but the code above is just to make the point.
The way your class is implemented, you can just do Chain(2)(3)(4)() and you will get the same result (that's based on the else part of your __new__).
That's a weird way to implement such behavior - you are returning the class itself (or an int...) from the __new__ method which should return a new object of this class. This is problematic design. A classic way to do what you want is by making the objects callable:
class Chain():
    def __init__(self, a=1):
        self.calc = a

    def __call__(self, a=None):
        if a:
            return self.__class__(self.calc * a)
        else:
            return self.calc

    def __repr__(self):
        return str(self.calc)

print(Chain(2)(3)(4))
This removes the need for the metaclass trick altogether: you just implement the class's __repr__, because each call in the chain now returns a new object rather than the class itself.
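To make that last point concrete (same Chain class as above), each call builds a fresh instance and never mutates the one it was called on:

```python
class Chain:
    def __init__(self, a=1):
        self.calc = a

    def __call__(self, a=None):
        if a:
            # Build a new Chain instead of mutating shared class state.
            return self.__class__(self.calc * a)
        return self.calc

    def __repr__(self):
        return str(self.calc)

c = Chain(2)
d = c(3)
print(d is c)          # False -- each call returns a fresh Chain
print(c.calc)          # 2 -- the original instance is untouched
print(Chain(2)(3)(4))  # 24
```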
I'm trying to define a couple of dataclasses and an abstract class that manipulates those classes. Eventually, the my_class_handler types could be dealing with, say, json, xml or sqlite files as concrete instance types.
Can someone please explain to me what this message means?
<bound method my_class_handler.class_name of <__main__.my_class_handler object at 0x000001A55FB96580>>
Here's the source code that generates the error for me.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List

@dataclass
class column:
    name: str
    heading: str

@dataclass
class my_class:
    class_name: str
    class_description: str
    columns: List[column]

class iclass_handler(ABC):
    @abstractmethod
    def class_name(self) -> str:
        pass

    @abstractmethod
    def class_name(self, value: str):
        pass

class my_class_handler(iclass_handler):
    obj: my_class

    def __init__(self):
        self.obj = my_class("test-class", "", None)

    def class_name(self) -> str:
        return self.obj.class_names

    def class_name(self, value: str):
        if value != self.obj.class_name:
            self.obj.class_name = value

if __name__ == '__main__':
    handler = my_class_handler()
    print(handler.class_name)
If this is not the proper way of doing this, please point me in the direction where I might learn the proper way.
Thanks for your time,
Python does not allow overloading like Java, so remove methods that overlap.
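To illustrate (hypothetical names): a class body is executed top to bottom, so a second def with the same name simply rebinds it; there is no overload resolution, and only the last definition survives:

```python
class Example:
    def method(self):
        return "first"

    def method(self, value="x"):  # same name: silently replaces the def above
        return "second"

# Only the second definition is reachable.
print(Example().method())  # second
```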
@khelwood pointed out the answer to the original question. Thanks.
As for the @property approach, I tried that and was having nothing but problems, and couldn't find any useful examples of inherited properties, so I just rewrote the function to take an additional parameter:
# I'm working from memory here, but I believe this is the gist...
def class_name(self, new_value: str = None) -> str:
    if new_value is None:
        return self.obj.class_name
    if isinstance(new_value, str):
        if new_value != self.obj.class_name:
            self.obj.class_name = new_value
    return None
Anyhow, I have since refactored and have completely removed the whole class_name() method as a result of a redesign that dropped the whole concept of data-handlers.
Thanks again for the comments.
The goal is to have the following pseudocode valid in Python 3.7+ and have static analysis tools understand it.
class VariadicType(MaybeASpecialBaseClass, metaclass=MaybeASpecialMetaClass):
    @classmethod
    def method(cls) -> Union[???]:
        pass  # some irrelevant code

assert(VariadicType[Type1, Type2, Type3, Type4].method.__annotations__["return"] == Union[Type1, Type2, Type3, Type4])
assert(VariadicType[Type1, Type2, Type3, Type4, Type5].method.__annotations__["return"] == Union[Type1, Type2, Type3, Type4, Type5])
Is it possible to support some kind of class VariadicType(Generic[...]) but then get all the passed generic types?
I was considering a C# approach of having
class VariadicType(Generic[T1]):
    ...

class VariadicType(Generic[T1, T2]):
    ...

class VariadicType(Generic[T1, T2, T3]):
    ...

class VariadicType(Generic[T1, T2, T3, T4]):
    ...

class VariadicType(Generic[T1, T2, T3, T4, T5]):
    ...

but that is not valid code - VariadicType should only be defined once.
PS: the "irrelevant" part of the code would check __annotations__["return"] and return results accordingly. It is applying mixins. If the return type is not a union of all applied mixins, then static analysis complains about missing fields and methods. Having non-hinted code where the types are given as method arguments but the return type is Any is the last resort.
I already faced this problem, so maybe I can shed some light on it.
The problem
Suppose we have the following class definition:

T = TypeVar('T')
S = TypeVar('S')

class VaradicType(Generic[T, S]):
    pass
The issue is that VaradicType[T, S] invokes VaradicType.__class_getitem__((T, S)), which returns an object of the class _GenericAlias.
Then, if you do cls = VaradicType[int, float], you can introspect the arguments used as indices with cls.__args__.
However, if you instantiate an object like obj = cls(), you cannot do obj.__class__.__args__.
This is because _GenericAlias implements __call__ to return an object of VaradicType directly, which doesn't have any class in its MRO that carries information about the arguments supplied.
class VaradicType(Generic[T, S]):
    pass

cls = VaradicType[int, float]().__class__
print(hasattr(cls, '__args__'))  # False
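Worth noting (CPython 3.7+ behavior, separate from the metaclass approach below): _GenericAlias.__call__ does stash the alias itself on the new instance as __orig_class__, so the type arguments are often recoverable from the instance without any extra machinery:

```python
from typing import Generic, TypeVar

T = TypeVar('T')
S = TypeVar('S')

class VaradicType(Generic[T, S]):
    pass

obj = VaradicType[int, float]()
# typing._GenericAlias.__call__ records the alias on the instance:
print(obj.__orig_class__.__args__)  # (<class 'int'>, <class 'float'>)
```

This only works on instances created through the subscripted alias, and __orig_class__ is set after __init__ runs, so it cannot be read from inside the constructor.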
One solution
One possible approach to solve this issue could be adding information about the generic arguments to the objects of the class VaradicType when they are instantiated.
First (following the previous code snippets), we will add a metaclass to VaradicType:
class VaradicType(Generic[T, S], metaclass=GenericMixin):
    pass
We can use the fact that __getitem__, if defined on the metaclass, takes priority over __class_getitem__, in order to bypass Generic.__class_getitem__:
class GenericMixin(type):
    def __getitem__(cls, items):
        return GenericAliasWrapper(cls.__class_getitem__(items))
Now, VaradicType[int, float] is equivalent to GenericMixin.__getitem__(VaradicType, (int, float)) and it will return an object of the class GenericAliasWrapper (it is used to "wrap" typing._GenericAlias instances):
class GenericAliasWrapper:
    def __init__(self, x):
        self.wrapped = x

    def __call__(self, *args, **kwargs):
        obj = self.wrapped.__call__(*args, **kwargs)
        obj.__dict__['__args__'] = self.wrapped.__args__
        return obj
Now, if you have cls = VaradicType[int, float], the call cls() will be equivalent to GenericAliasWrapper(VaradicType.__class_getitem__((int, float))).__call__(), which creates a new instance of the class VaradicType and also adds the attribute __args__ to its dictionary.
e.g.:

VaradicType[int, float]().__args__  # (<class 'int'>, <class 'float'>)
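Assembled into one runnable snippet (same classes as above, with GenericMixin defined before it is used):

```python
from typing import Generic, TypeVar

T = TypeVar('T')
S = TypeVar('S')

class GenericAliasWrapper:
    """Wraps a typing._GenericAlias so instantiation records __args__."""
    def __init__(self, x):
        self.wrapped = x

    def __call__(self, *args, **kwargs):
        obj = self.wrapped.__call__(*args, **kwargs)
        # Copy the alias's type arguments onto the new instance.
        obj.__dict__['__args__'] = self.wrapped.__args__
        return obj

class GenericMixin(type):
    # Metaclass __getitem__ takes priority over __class_getitem__.
    def __getitem__(cls, items):
        return GenericAliasWrapper(cls.__class_getitem__(items))

class VaradicType(Generic[T, S], metaclass=GenericMixin):
    pass

print(VaradicType[int, float]().__args__)  # (<class 'int'>, <class 'float'>)
```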
The problem:
I have implemented a class with rather complex internal behavior which pretends to be an int type for all intents and purposes. Then, as a cherry on top, I really wanted my class to pass isinstance() and issubclass() checks for int. I have failed so far.
Here's a small demo class that I'm using to test the concept. I have tried inheriting from both object and int, and while inheriting from int makes it pass the checks, it also breaks some of its behavior:
# class DemoClass(int):
class DemoClass(object):
    _value = 0

    def __init__(self, value = 0):
        print 'init() called'
        self._value = value

    def __int__(self):
        print 'int() called'
        return self._value + 2

    def __index__(self):
        print 'index() called'
        return self._value + 2

    def __str__(self):
        print 'str() called'
        return str(self._value + 2)

    def __repr__(self):
        print 'repr() called'
        return '%s(%d)' % (type(self).__name__, self._value)

    # overrides for other magic methods skipped as irrelevant

a = DemoClass(3)
print a           # uses __str__() in both cases
print int(a)      # uses __int__() in both cases
print '%d' % a    # __int__() is only called when inheriting from object
rng = range(10)
print rng[a]      # __index__() is only called when inheriting from object
print isinstance(a, int)
print issubclass(DemoClass, int)
Essentially, inheriting from an immutable class results in an immutable class, and Python will often use base class raw value instead of my carefully-designed magic methods. Not good.
I have looked at abstract base classes, but they seem to do something entirely opposite: instead of making my class look like a subclass of a built-in type, they make a class pretend to be a superclass of one.
Using __new__(cls, ...) doesn't seem like a solution either. It's fine if all you want is to modify the object's starting value before actually creating it, but I want to evade the immutability curse. An attempt to use object.__new__() did not bear fruit either, as Python simply complained that it's not safe to use object.__new__ to create an int object.
An attempt to inherit my class from (int, dict) and use dict.__new__() was not very successful either, as Python apparently doesn't allow combining them in a single class.
I suspect the solution might be found with metaclasses, but so far haven't been successful with them either, mostly because my brains simply aren't bent enough to comprehend them properly. I'm still trying but it doesn't look like I'll be getting results soon.
So, the question: is it possible at all to inherit or imitate inheritance from immutable type even though my class is very much mutable? Class inheritance structure doesn't really matter for me for as long as a solution is found (assuming it exists at all).
The problem here is not immutability, but simply inheritance. If DemoClass is a subclass of int, a true int is constructed for each object of type DemoClass and will be used directly without calling __int__ wherever a int could be used, just try a + 2.
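A minimal Python 3 sketch of that behavior (hypothetical __int__ value chosen to make the difference obvious): arithmetic on an int subclass uses the stored raw value directly and never consults __int__, while an explicit int() conversion does:

```python
class DemoClass(int):
    def __int__(self):
        # Arithmetic never consults this; the raw stored value is used.
        return 999

a = DemoClass(3)
print(a + 2)   # 5 -- int.__add__ uses the underlying value, not __int__
print(int(a))  # 999 -- an explicit conversion does call __int__
```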
I would rather try to simply cheat isinstance here. I would just make DemoClass a subclass of object and hide the built-in isinstance behind a custom function:
class DemoClass(object):
    ...

def isinstance(obj, cls):
    if __builtins__.isinstance(obj, DemoClass) and issubclass(int, cls):
        return True
    else:
        return __builtins__.isinstance(obj, cls)
I can then do:
>>> a = DemoClass(3)
init() called
>>> isinstance("abc", str)
True
>>> isinstance(a, DemoClass)
True
>>> isinstance(a, int)
True
>>> issubclass(DemoClass, int)
False
So, if I understand correctly, you have:

def i_want_int(int_):
    ...  # can't read the code; it uses isinstance(int_, int)
And you want to call i_want_int(DemoClass()), where DemoClass is convertible to int via the __int__ method.
If you want to subclass int, instances' values are determined at creation time.
If you don't want to write the conversion to int everywhere (like i_want_int(int(DemoClass()))), the simplest approach I can think of is defining a wrapper for i_want_int that does the conversion:
def i_want_something_intlike(intlike):
    return i_want_int(int(intlike))
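A minimal sketch of that wrapper idea, with DemoClass trimmed to just __int__ and i_want_int as a hypothetical stand-in for the unreadable function:

```python
class DemoClass:
    def __init__(self, value=0):
        self._value = value

    def __int__(self):
        return self._value + 2

def i_want_int(int_):
    # Stand-in for the function whose code can't be changed;
    # assume it only works with real ints.
    assert isinstance(int_, int)
    return int_ * 2

def i_want_something_intlike(intlike):
    # Convert once, at the boundary, instead of at every call site.
    return i_want_int(int(intlike))

print(i_want_something_intlike(DemoClass(3)))  # 10
```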
So far, no alternative solutions have been suggested, so here's the solution that I'm using in the end (loosely based on Serge Ballesta's answer):
def forge_inheritances(disguise_heir = {}, disguise_type = {}, disguise_tree = {},
                       isinstance = None, issubclass = None, type = None):
    """
    Monkey patch the isinstance(), issubclass() and type() built-in functions to create fake inheritances.

    :param disguise_heir: dict of desired subclass:superclass pairs; type(subclass()) will return subclass
    :param disguise_type: dict of desired subclass:superclass pairs; type(subclass()) will return superclass
    :param disguise_tree: dict of desired subclass:superclass pairs; type(subclass()) will return superclass for subclass and all its heirs
    :param isinstance: optional callable; if provided, it is used instead of __builtin__.isinstance as Python's real isinstance() function.
    :param issubclass: optional callable; if provided, it is used instead of __builtin__.issubclass as Python's real issubclass() function.
    :param type: optional callable; if provided, it is used instead of __builtin__.type as Python's real type() function.
    """
    if not (disguise_heir or disguise_type or disguise_tree): return
    import __builtin__
    from itertools import chain

    python_isinstance = __builtin__.isinstance if isinstance is None else isinstance
    python_issubclass = __builtin__.issubclass if issubclass is None else issubclass
    python_type = __builtin__.type if type is None else type

    def disguised_isinstance(obj, cls, honest = False):
        if cls == disguised_type: cls = python_type
        if honest:
            if python_isinstance.__name__ == 'disguised_isinstance':
                return python_isinstance(obj, cls, True)
            return python_isinstance(obj, cls)
        if python_type(cls) == tuple:
            return any(map(lambda subcls: disguised_isinstance(obj, subcls), cls))
        for subclass, superclass in chain(disguise_heir.iteritems(),
                                          disguise_type.iteritems(),
                                          disguise_tree.iteritems()):
            if python_isinstance(obj, subclass) and python_issubclass(superclass, cls):
                return True
        return python_isinstance(obj, cls)
    __builtin__.isinstance = disguised_isinstance

    def disguised_issubclass(qcls, cls, honest = False):
        if cls == disguised_type: cls = python_type
        if honest:
            if python_issubclass.__name__ == 'disguised_issubclass':
                return python_issubclass(qcls, cls, True)
            return python_issubclass(qcls, cls)
        if python_type(cls) == tuple:
            return any(map(lambda subcls: disguised_issubclass(qcls, subcls), cls))
        for subclass, superclass in chain(disguise_heir.iteritems(),
                                          disguise_type.iteritems(),
                                          disguise_tree.iteritems()):
            if python_issubclass(qcls, subclass) and python_issubclass(superclass, cls):
                return True
        return python_issubclass(qcls, cls)
    __builtin__.issubclass = disguised_issubclass

    if not (disguise_type or disguise_tree): return  # no need to patch type() if these are empty

    def disguised_type(obj, honest = False, extra = None):
        if extra is not None:
            # this is a call to create a type instance, we must not touch it
            return python_type(obj, honest, extra)
        if honest:
            if python_type.__name__ == 'disguised_type':
                return python_type(obj, True)
            return python_type(obj)
        for subclass, superclass in disguise_type.iteritems():
            if obj == subclass:
                return superclass
        for subclass, superclass in disguise_tree.iteritems():
            if python_isinstance(obj, subclass):
                return superclass
        return python_type(obj)
    __builtin__.type = disguised_type

if __name__ == '__main__':
    class A(object): pass
    class B(object): pass
    class C(object): pass

    forge_inheritances(disguise_type = { C: B, B: A })
    print issubclass(B, A)  # prints True
    print issubclass(C, B)  # prints True
    print issubclass(C, A)  # prints False - cannot link two fake inheritances without stacking
It is possible to ignore the faked inheritance by providing optional honest parameter to isinstance(), issubclass() and type() calls.
Usage examples.
Make class B a fake heir of class A:
class A(object): pass
class B(object): pass

forge_inheritances(disguise_heir = { B: A })
b = B()
print isinstance(b, A)                 # prints True
print isinstance(b, A, honest = True)  # prints False
Make class B pretend to be class A:
class A(object): pass
class B(object): pass

forge_inheritances(disguise_type = { B: A })
b = B()
print type(b)                 # prints "<class '__main__.A'>"
print type(b, honest = True)  # prints "<class '__main__.B'>"
Make class B and all its heirs pretend to be class A:

class A(object): pass
class B(object): pass
class D(B): pass

forge_inheritances(disguise_tree = { B: A })
d = D()
print type(d)  # prints "<class '__main__.A'>"
Multiple layers of fake inheritances can be achieved by stacking calls to forge_inheritances():
class A(object): pass
class B(object): pass
class C(object): pass

forge_inheritances(disguise_heir = { B: A })
forge_inheritances(disguise_heir = { C: B })
c = C()
print isinstance(c, A)  # prints True
Obviously, this hack will not affect super() calls or attribute/method inheritance in any way; the primary intent here is just to cheat isinstance() and type(inst) == class checks in situations where you have no way to fix them directly.