Usually one initializes class members inside the class body (not in the constructor) and then references them via MyClass.mymember instead of self.mymember.
However, I noticed that I can actually reference such a member later via self.mymember. Further, I can even overwrite it, which has the curious effect that the class member becomes an instance member.
This effect can be seen in the following code with property_cache of MyClass.
It is really helpful in this context, because the object property_cache (not the class PropertyCache!) can be used as a decorator, and it ends up being an instance member of myclass (not a class member of MyClass!).
However, the effect that a class member is turned into an instance member after the declaration of MyClass is very unexpected. I did not even consider that such a thing was possible, and I don't quite understand the mechanism.
Can someone shed some light on it and possibly reference some literature on this? I did not find a description in the Python documentation.
class PropertyCache:
    def __init__(self):
        self.cache = {}

    def reset(self):
        self.cache = {}

    def __call__(this, function):  # 'this' avoids clashing with the wrapped method's 'self'
        def wrapper(self, *args, **kwargs):
            if function.__qualname__ not in self.property_cache.cache:
                self.property_cache.cache[function.__qualname__] = function(self, *args, **kwargs)
            return self.property_cache.cache[function.__qualname__]
        return wrapper
class MyClass:
    property_cache = PropertyCache()

    def __init__(self, parameter):
        self.parameter = parameter
        self.property_cache = PropertyCache()

    def fit(self, data):
        print('fit')
        self.intermediate_data_ = data + self.parameter + 2
        self.property_cache.reset()
        return self

    @property
    @property_cache
    def trans1(self):
        print('trans 1')
        return self.intermediate_data_ / self.parameter / 2

    @property
    @property_cache
    def trans2(self):
        print('trans 2')
        return self.intermediate_data_ / self.parameter / 5
myclass = MyClass(2)
myclass.fit(10)
myclass.trans1
myclass.trans1
myclass.trans2
myclass.fit(15)
myclass.trans1
myclass.trans1
myclass.trans2
myclass2 = MyClass(3)
myclass2.fit(15)
myclass2.trans2
myclass2.trans2
myclass.trans2
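For reference, the mechanism can be demonstrated in isolation (a minimal sketch with illustrative names): attribute lookup checks the instance's __dict__ before falling back to the class, so assigning self.property_cache in __init__ creates a new instance attribute that shadows the class attribute of the same name. This lookup order is described in the Python language reference under the data model's attribute access rules.

class Demo:
    member = 'class value'              # class attribute

    def shadow(self):
        self.member = 'instance value'  # creates an instance attribute of the same name

d = Demo()
print(d.member)                # 'class value'    - lookup falls through to the class
d.shadow()
print(d.member)                # 'instance value' - instance __dict__ now shadows the class
print(Demo.member)             # 'class value'    - the class attribute itself is untouched
print('member' in d.__dict__)  # True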
I am trying to find a good way of returning a (new) instance of a class from one of its methods, in a way that still works when the class is extended.
I have a class (classA) which has, among other methods, a method that returns a new classA object after some processing:
class classA:
    def __init__(self, params): ...

    def methodX(self, **kwargs):
        # process data
        return classA(new_params)  # pseudocode: build a new classA from the results
Now, I am extending this class into another class, classB. I need methodX to do the same, but return a classB instance this time, instead of classA:
class classB(classA):
    def __init__(self, params):
        super().__init__(params)
        self.newParams = XYZ

    def methodX(self, **kwargs):
        ???
This may be something trivial, but I simply cannot figure it out. In the end I don't want to rewrite methodX every time the class gets extended.
Thank you for your time.
Use the __class__ attribute like this:
class A:
    def __init__(self, **kwargs):
        self.kwargs = kwargs

    def methodX(self, **kwargs):
        # do stuff with kwargs
        return self.__class__(**kwargs)

    def __repr__(self):
        return f'{self.__class__}({self.kwargs})'

class B(A):
    pass
a = A(foo='bar')
ax = a.methodX(gee='whiz')
b = B(yee='haw')
bx = b.methodX(cool='beans')
print(a)
print(ax)
print(b)
print(bx)
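Running this should print something along these lines (the module prefix depends on how the script is run):

<class '__main__.A'>({'foo': 'bar'})
<class '__main__.A'>({'gee': 'whiz'})
<class '__main__.B'>({'yee': 'haw'})
<class '__main__.B'>({'cool': 'beans'})

The point is that methodX on the B instance returned a B, even though the method is only defined on A.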
class classA:
    def __init__(self, x):
        self.x = x

    def createNew(self, y):
        t = type(self)  # the runtime class of self, so subclasses get themselves back
        return t(y)

class classB(classA):
    def __init__(self, params):
        super().__init__(params)

a = classA(1)
newA = a.createNew(2)
b = classB(1)
newB = b.createNew(2)
print(type(newB))
# <class '__main__.classB'>
I want to propose what I think is the cleanest approach, albeit similar to existing answers. The problem feels like a good fit for a class method:
class A:
    @classmethod
    def method_x(cls, **kwargs):
        return cls(<init params>)
Using the @classmethod decorator ensures that the first input (traditionally named cls) will refer to the class to which the method belongs, rather than the instance.
(Usually we call the first method input self, and it refers to the instance on which the method was called.)
Because cls refers to A, rather than an instance of A, we can call cls() as we would call A().
However, in a class that inherits from A, cls will instead refer to the child class, as required:
class A:
    def __init__(self, x):
        self.x = x

    @classmethod
    def make_new(cls, **kwargs):
        y = kwargs["y"]
        return cls(y)  # returns A(y) here

class B(A):
    def __init__(self, x):
        super().__init__(x)
        self.z = 3 * x

inst = B(1).make_new(y=7)
print(inst.x, inst.z)
And now you can expect that print statement to produce 7 21.
The fact that inst.z exists should confirm for you that the make_new call (which was defined only on A and inherited unaltered by B) has indeed made an instance of B.
However, there's something I must point out. Inheriting the unaltered make_new method only works because the __init__ method on B has the same call signature as the method on A. If this weren't the case then the call to cls might have had to be altered.
This can be circumvented by allowing **kwargs on the __init__ method and passing generic **kwargs into cls() in the parent class:
class A:
    def __init__(self, **kwargs):
        self.x = kwargs["x"]

    @classmethod
    def make_new(cls, **kwargs):
        return cls(**kwargs)

class B(A):
    def __init__(self, x, w):
        super().__init__(x=x)
        self.w = w

inst = B(1, 2).make_new(x="spam", w="spam")
print(inst.x, inst.w)
Here we were able to give B a different (more restrictive!) signature.
This illustrates a general principle, which is that parent classes will typically be more abstract/less specific than their children.
It follows that, if you want two classes that substantially share behaviour but do quite different specific things, it is often better to create three classes: one rather abstract class that defines the behaviour in common, and two children that give you the specific behaviours you want.
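A minimal sketch of that three-class layout (all names here are illustrative, not from the question):

class Shape:                    # abstract-ish parent: behaviour in common
    def __init__(self, size):
        self.size = size

    @classmethod
    def scaled(cls, other, factor):
        return cls(other.size * factor)  # cls keeps the subclass type

class Square(Shape):
    def area(self):
        return self.size ** 2

class Circle(Shape):
    def area(self):
        return 3.14159 * self.size ** 2

big = Square.scaled(Square(2), 3)
print(type(big).__name__, big.area())  # Square 36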
Sorry for the confusing question.
Say there are two instances of two different classes (e.g. big_instance and little_instance).
little_instance is defined as an attribute of big_instance.
How would a method in the little class access an attribute of the big instance?
An example is below.
The line return parent.attribute1 is basically pseudocode. How would this line be written properly?
class BigClass:
    def __init__(self, att):
        self.attribute1 = att
        self.little_instance = LittleClass()

class LittleClass:
    def parents_att(self):
        return parent.attribute1  # pseudocode: 'parent' is what I'm asking about

big_instance = BigClass(1)
print(big_instance.little_instance.parents_att())
Ah yes, I got it. Read the comments for the explanation.
The test code at the end shows that it works even after attribute1 changes :)
class BigClass:
    def __init__(self, att):
        self.attribute1 = att
        # pass in self
        self.little_instance = LittleClass(self)

class LittleClass:
    def __init__(self, the_big_class):
        # the big class is held in an instance var
        self.the_big_class = the_big_class

    def parents_att(self):
        # the instance var is used to reference the attribute
        return self.the_big_class.attribute1

big_instance = BigClass(1)
print(big_instance.little_instance.parents_att())
big_instance.attribute1 = 2
print(big_instance.little_instance.parents_att())
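One design note (not part of the original answer): a child holding a strong back-reference to its parent creates a reference cycle, which is usually harmless thanks to Python's cycle collector but can delay cleanup. If that matters, a weakref is a common alternative; a minimal sketch:

import weakref

class LittleClass:
    def __init__(self, the_big_class):
        self._parent_ref = weakref.ref(the_big_class)  # weak back-reference

    def parents_att(self):
        parent = self._parent_ref()  # resolves to the object, or None if it was collected
        return parent.attribute1 if parent is not None else None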
You can do the following if you want to access an attribute from BigClass inside LittleClass.
class BigClass:
    def __init__(self, att):
        self.attribute1 = att

class LittleClass(BigClass):
    def __init__(self, att):
        BigClass.__init__(self, att)

    def parent_att(self):
        return self.attribute1

small_instance = LittleClass(1)
print(small_instance.parent_att())  # note the parentheses: call the method, don't print it
When I try to assign different functions to different class members, the last assigned value overrides the earlier one.
class Object:
    def __init__(self, *args, **kwargs):
        pass

class Service:
    @staticmethod
    def get_a_value():
        return 1

    @staticmethod
    def get_b_value(*args, **kwargs):
        return 2

    def __init__(self, *args, **kwargs):
        self.a = Object
        self.b = Object
        self.a.execute = self.get_a_value
        self.b.execute = self.get_b_value

if __name__ == '__main__':
    obj = Service()
    print(obj.a().execute())
    print(obj.b().execute())
The expected output is 1 and 2, but I'm getting 2 for both. Not sure what I'm missing here. How can I make sure I can assign different functions to a.execute and b.execute? Any help would be much appreciated.
What about creating specific objects for your a and b members, with a class deriving from Object:
class Object:
    def __init__(self, *args, **kwargs):
        pass

class Service:
    @staticmethod
    def get_a_value():
        return 1

    @staticmethod
    def get_b_value(*args, **kwargs):
        return 2

    def __init__(self, *args, **kwargs):
        class ObjectA(Object):
            @staticmethod
            def execute():
                return Service.get_a_value()

        class ObjectB(Object):
            @staticmethod
            def execute():
                return Service.get_b_value()

        self.a = ObjectA
        self.b = ObjectB

if __name__ == '__main__':
    obj = Service()
    print(obj.a().execute())
    print(obj.b().execute())
This prints
1
2
Of course this becomes slightly more complex if the methods aren't static, but it can easily be adapted, as in the example below where all methods are regular (non-static) methods:
class Service:
    def get_a_value(self):
        return self.__a_value

    def get_b_value(self):
        return self.__b_value

    def __init__(self, *args, **kwargs):
        self.__a_value = 1
        self.__b_value = 2

        class ObjectA(Object):
            def execute(myself):
                return self.get_a_value()

        class ObjectB(Object):
            def execute(myself):
                return self.get_b_value()

        self.a = ObjectA
        self.b = ObjectB
You'll notice that the self used in the execute methods refers to the instance of Service (hence the myself first argument in the child classes). It works as well, even though no method is static, and it can access values of the instance.
With that, you can create full-fledged object interfaces.
It is because obj.a and obj.b both contain a reference to the same class.
So everything you modify on obj.a will be reflected in obj.b.
What you probably wanted to do is:
def __init__(self, *args, **kwargs):
    self.a = Object()  # an instance, not the class itself
    self.b = Object()
    self.a.execute = self.get_a_value
    self.b.execute = self.get_b_value
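The underlying issue is easy to demonstrate in isolation (a minimal sketch): assigning an attribute through any name bound to a class mutates the one shared class object.

class Object:
    pass

a = Object             # both names are bound to the same class object
b = Object
a.execute = lambda: 1  # sets an attribute on the class...
b.execute = lambda: 2  # ...and this second assignment overwrites it
print(a is b)          # True
print(a.execute())     # 2 - the last assignment won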
I'm trying to add an extra decorator to a magic method (__get__) in a descriptor class.
I'm able to do it when I use @property, but not when I use a descriptor class.
I check the range because my object sets registers on the bus, and some registers can only take a specific range of values:
import functools

def check_range(min, max):
    def decorator(f):
        @functools.wraps(f)
        def wrap(self, value):
            if value not in range(min, max + 1):
                return
            return f(self, value)
        return wrap
    return decorator
This works:
class Foo:
    def __init__(self):
        self.device.init_smth('my_object')

    @property
    def my_object(self):
        return self.device.get_value('my_object')

    @my_object.setter
    @check_range(0, 1)
    def my_object(self, value):
        self.device.set_value('my_object', value)

a = Foo()
print(a.my_object)
a.my_object = 1
print(a.my_object)
a.my_object = -1
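If check_range works, that last assignment is silently ignored: the wrapped setter returns early for values outside range(0, 2), so my_object should still read back as 1.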
And in this example everything works the same, but check_range is not invoked:
class Register:
    def __init__(self, name, device):
        self.name = name
        device.init_smth(name)

    def __get__(self, instance, owner):
        return instance.device.get_value(self.name)

    @check_range(0, 1)
    def __set__(self, instance, value):
        instance.device.set_value(self.name, value)

class Foo:
    def __init__(self):
        self.my_object = Register('my_object', self.device)

a = Foo()
print(a.my_object)
a.my_object = 1
print(a.my_object)
a.my_object = -1
I may be wrong, but most probably your descriptor is not invoked at all; the decorator is not the problem. Descriptors are meant to be used like:
class Foo2:
    my_object = Register('my_object', 'init_value')
— you define the descriptor as a class attribute, and Python will execute all the __get__/__set__/__delete__ machinery if the class attribute supports it (i.e. if it is a descriptor).
This is why there is an instance argument in the descriptor methods: you define the descriptor as a class variable, but e.g. the __set__ method will receive the actual instance of your class, so you can manage per-instance data, like your device.
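A minimal runnable sketch of that fix. Two things here are assumptions, not from the question: DummyDevice stands in for the asker's bus device (which isn't shown), and check_range is generalized to *args because __set__ receives (self, instance, value) while the original wrapper only accepted (self, value).

import functools

def check_range(min, max):
    def decorator(f):
        @functools.wraps(f)
        def wrap(*args):         # __set__ gets (self, instance, value),
            value = args[-1]     # so take the value as the last argument
            if value not in range(min, max + 1):
                return
            return f(*args)
        return wrap
    return decorator

class DummyDevice:               # hypothetical stand-in for the real bus device
    def __init__(self):
        self.values = {}
    def init_smth(self, name):
        self.values[name] = 0
    def get_value(self, name):
        return self.values[name]
    def set_value(self, name, value):
        self.values[name] = value

class Register:
    def __init__(self, name, device):
        self.name = name
        self.device = device
        device.init_smth(name)

    def __get__(self, instance, owner):
        return self.device.get_value(self.name)

    @check_range(0, 1)
    def __set__(self, instance, value):
        self.device.set_value(self.name, value)

class Foo:
    my_object = Register('my_object', DummyDevice())  # class attribute: the protocol now fires

a = Foo()
a.my_object = 1
print(a.my_object)  # 1
a.my_object = -1    # outside range(0, 2): silently ignored by check_range
print(a.my_object)  # still 1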
I have given up on memoization of a class as a bag of worms that I didn't want to explore, and here is one example of why. The question I ask is "how does one extend or inherit from a memoized class?", but it's very possible I have made a mistake. The memoize class below is a cut-down version of the one by brandizzi in How can I memoize a class instantiation in Python?, and googling the subject turns up more involved versions of such classes.
class memoize(object):
    def __init__(self, cls):
        self.cls = cls
        # I didn't understand why this was needed
        self.__dict__.update(cls.__dict__)
        # bit about static methods not needed

    def __call__(self, *args):
        try:
            self.cls.instances
        except:
            self.cls.instances = {}
        key = '//'.join(map(str, args))
        if key not in self.cls.instances:
            self.cls.instances[key] = self.cls(*args)
        return self.cls.instances[key]
class Foo():
    def __init__(self, val):
        self.val = val

    def __repr__(self):
        return "{}<{},{}>".format(self.__class__.__name__, self.val, id(self))

class Bar(Foo):
    def __init__(self, val):
        super().__init__(val)

f1, f2, f3 = [Foo(i) for i in (0, 0, 1)]
print([f1, f2, f3])
b1, b2, b3 = [Bar(i) for i in (0, 0, 1)]
print([b1, b2, b3])
# produces exactly what I expect
# [Foo<0,3071981964>, Foo<0,3071982092>, Foo<1,3071982316>]
# [Bar<0,3071983340>, Bar<0,3071983404>, Bar<1,3071983436>]

Foo = memoize(Foo)
f1, f2, f3 = [Foo(i) for i in (0, 0, 1)]
print([f1, f2, f3])
b1, b2, b3 = [Bar(i) for i in (0, 0, 1)]
print([b1, b2, b3])
# and now Foo has been memoized so Foo(0) always produces the same object
# [Foo<0,3071725804>, Foo<0,3071725804>, Foo<1,3071726060>]
# [Bar<0,3071711916>, Bar<0,3071711660>, Bar<1,3071725644>]

# this produces a compilation error that I don't understand
class Baz(Foo):
    def __init__(self, val):
        super().__init__(val)

# Traceback (most recent call last):
#   File "/tmp/foo.py", line 49, in <module>
#     class Baz(Foo):
# TypeError: __init__() takes 2 positional arguments but 4 were given
This "recipe" is indeed a very bad idea - once you rebind Foo to memoize(Foo), Foo is a memoize instance and not class Foo anymore. This breaks all expectations wrt/ python's type and the whole object model. In this case, it about how the class statement works. Actually, this:
class Titi():
    x = 42

    def toto(self):
        print(self.x)

is syntactic sugar for:

def toto(self):
    print(self.x)

Titi = type("Titi", (object,), {"x": 42, "toto": toto})
del toto
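Both forms behave the same for this example; a quick check:

Titi().toto()      # 42
print(type(Titi))  # <class 'type'> - the class itself is an instance of type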
Note that this happens at runtime (like everything in Python except parsing / bytecode compilation), and that type is itself a class, so calling type creates a new class which is a type instance (this is what a 'metaclass' is: the class of a class, and type is the default metaclass).
So with Foo now being a memoize instance instead of a type instance, and since memoize is not a proper metaclass (its __init__ method's signature is incompatible), the whole thing just cannot work.
To get this to work, you'd have to make memoize a proper metaclass (this is a simplified example assuming a single arg named param, but it can be generalized if you want to):
class FooType(type):
    def __new__(meta, name, bases, attrs):
        if "_instances" not in attrs:
            attrs["_instances"] = dict()
        return type.__new__(meta, name, bases, attrs)

    def __call__(cls, param):
        if param not in cls._instances:
            cls._instances[param] = super(FooType, cls).__call__(param)
        return cls._instances[param]

class Foo(metaclass=FooType):
    def __init__(self, param):
        self._param = param
        print("%s init(%s)" % (self, param))

    def __repr__(self):
        return "{}<{},{}>".format(self.__class__.__name__, self._param, id(self))

class Bar(Foo):
    pass

f1, f2, f3 = [Foo(i) for i in (0, 0, 1)]
print([f1, f2, f3])
b1, b2, b3 = [Bar(i) for i in (0, 0, 1)]
print([b1, b2, b3])
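To convince yourself the cache behaves, these checks should all pass: each class body gets its own _instances dict (FooType.__new__ adds one whenever the body doesn't define it), so Foo and Bar cache separately, and subclassing the memoized Foo no longer raises.

assert Foo(0) is Foo(0)      # repeated construction returns the cached instance
assert Bar(0) is Bar(0)
assert Foo(0) is not Bar(0)  # per-class caches

class Baz(Foo):              # subclassing works now: Foo is still a real class
    pass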