I currently have a TestClass that is callable. The call runs a function that raises an exception if any attribute is None. The point of making it callable is that when a TestClass instance is passed to another function or copied, it first checks that all attributes are set, and raises an exception otherwise.
The line below that exhibits this logic is UsesTestClass(testClass()).
Ideally I want to perform the same check without having to "call" the class instance, for example UsesTestClass(testClass). Is there a magic method or some other way to configure the class so that it executes a function before being passed as an argument?
class TestClass:
    def __init__(self):
        self.name = None

    def run(self):
        if self.name is None:
            raise Exception("'name' attribute is 'None'")

    def __call__(self):
        self.run()
        return self

def UsesTestClass(testClass):
    print(testClass.name)

testClass = TestClass()
testClass.name = "Hello"
UsesTestClass(testClass())
You can do this with the types module from the standard library:
import types

class TestClass:
    def __init__(self):
        self.name = None

    def __getattribute__(self, attr):
        value = object.__getattribute__(self, attr)
        if not value:
            raise Exception("Attribute %s not implemented" % attr)
        if isinstance(value, types.MethodType) and attr != 'run':
            # Run the check whenever a method is looked up.  The attr != 'run'
            # guard prevents infinite recursion, because calling run() would
            # otherwise re-enter __getattribute__ for 'run'.
            object.__getattribute__(self, 'run')()
        return value

    def run(self):
        if self.name is None:
            raise Exception("'name' attribute is 'None'")

    def __call__(self):
        self.run()
        return self

def UsesTestClass(testClass):
    print(testClass.name)

testClass = TestClass()
testClass.name = "Hello"
UsesTestClass(testClass)
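With this in place there is no need to call the instance first: the check fires as soon as an unset (falsy) attribute or any method is looked up on it, for example inside UsesTestClass. A quick check against the class above:

broken = TestClass()      # 'name' is still None
UsesTestClass(broken)     # raises: Attribute name not implemented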
I'm trying to implement privacy modifiers in Python using decorators.
My problem is that when I decorate a method that takes self as an argument, calling that method with dot notation does not pass self in automatically.
Public decorator class:
class Public:
    def __init__(self, method, *args):
        if type(method).__name__ == 'function':
            self.method = method

    def __call__(self, *args, **kwargs):
        return self.method(*args, **kwargs)
Example code:
class Test:
    @Public
    def test(self):
        return "Hello"

class Test1(Test):
    def __init__(self):
        super().__init__()
        print(self.test())

x = Test1()
How do I pass self into Public.__call__?
I tried passing in self explicitly:
class Test:
    @Public
    def test(self):
        return "Hello"

class Test1(Test):
    def __init__(self):
        super().__init__()
        print(self.test(self))

x = Test1()
which works but I would much rather not have to do that every time I need to call a method.
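For reference, the usual way to make a decorator class cooperate with dot notation is to implement the descriptor protocol on it, so that attribute lookup binds the instance automatically. A minimal sketch of that approach (this is not the solution the answer below settles on):

import functools

class Public:
    def __init__(self, method):
        self.method = method
        functools.update_wrapper(self, method)

    def __get__(self, instance, owner):
        # Attribute lookup (self.test) lands here; bind the instance so the
        # wrapped function receives it automatically, like a normal method.
        if instance is None:
            return self
        return functools.partial(self.method, instance)

    def __call__(self, *args, **kwargs):
        return self.method(*args, **kwargs)

class Test:
    @Public
    def test(self):
        return "Hello"

print(Test().test())   # prints "Hello"; self is bound automatically

Because Test.test is then a descriptor, self.test() behaves like an ordinary method call.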
I found an answer.
Here's how, in case anyone else is doing something similar.
First I had to create a decorator function that changes the __getattribute__ method of the class:
def modifiable(cls):
    if isinstance(cls, type):
        original_getattr = cls.__getattribute__

        def getattr(_self, name):
            attr = original_getattr(_self, name)
            if isinstance(attr, Modifier):
                def wrapper(*args, **kwargs):
                    return attr(_self, *args, **kwargs)
                return wrapper
            return attr

        cls.__getattribute__ = getattr
    return cls
I also created an empty Modifier class that all the privacy modifiers inherit from, to make it easier to check whether a method is modified (a rough sketch of these classes follows the example and its output below).
example code:
@modifiable
class Test:
    @Protected
    def test(self):
        return "Hello"

@modifiable
class Test1(Test):
    x = 1

    def __init__(self):
        super().__init__()
        print(self.test())

test = Test1()
print(test.test())
and output:
Hello
Traceback (most recent call last):
File "C:\Users\tyson\OneDrive\Documents\GitHub\better-decorators\src\test.py", line 21, in <module>
print(test.test())
privacy.AccessError: test is a protected method
(privacy.AccessError is a custom error)
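The Modifier, Protected and AccessError pieces referenced above are not spelled out; roughly, they might look like the sketch below. The caller check in Protected is an illustrative assumption (it looks two frames up, past the wrapper added by modifiable(), and only allows calls made from a method of the same instance), not the exact implementation:

import inspect

class AccessError(Exception):
    pass

class Modifier:
    """Empty marker base class so modifiable() can recognise wrapped methods."""

class Public(Modifier):
    def __init__(self, method):
        self.method = method

    def __call__(self, instance, *args, **kwargs):
        return self.method(instance, *args, **kwargs)

class Protected(Public):
    def __call__(self, instance, *args, **kwargs):
        # Rough access check: look at the frame that called the wrapper and
        # allow the call only if it came from a method bound to the same
        # instance; otherwise refuse access.
        caller = inspect.stack()[2].frame.f_locals.get('self')
        if caller is not instance:
            raise AccessError("%s is a protected method" % self.method.__name__)
        return self.method(instance, *args, **kwargs)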
I'm trying to add an extra decorator to a magic method (__get__) in a descriptor class.
I'm able to do it when I use @property, but not when I use a descriptor class.
I check the range because my object sets registers on the bus, and some registers can only take a specific range of values:
import functools

def check_range(min, max):
    def decorator(f):
        @functools.wraps(f)
        def wrap(self, value):
            if value not in range(min, max+1):
                return
            return f(self, value)
        return wrap
    return decorator
This works:
class Foo:
    def __init__(self):
        self.device.init_smth('my_object')

    @property
    def my_object(self):
        return self.device.get_value('my_object')

    @my_object.setter
    @check_range(0,1)
    def my_object(self, value):
        self.device.set_value('my_object', value)

a = Foo()
print(a.my_object)
a.my_object = 1
print(a.my_object)
a.my_object = -1
And in this example everything else works the same, but check_range is not invoked:
class Register:
    def __init__(self, name, device):
        self.name = name
        device.init_smth(name)

    def __get__(self, instance, owner):
        return instance.device.get_value(self.name)

    @check_range(0,1)
    def __set__(self, instance, value):
        instance.device.set_value(self.name, value)

class Foo:
    def __init__(self):
        self.my_object = Register('my_object', self.device)

a = Foo()
print(a.my_object)
a.my_object = 1
print(a.my_object)
a.my_object = -1
I may be wrong, but most probably your descriptor is not invoked at all; the decorator is not the problem. Descriptors are meant to be used like this:
class Foo2:
    my_object = Register('my_object', 'init_value')
That is, you define it as a class attribute. Python will only execute the __get__/__set__/__delete__ machinery if the class attribute supports it (i.e. it is a descriptor).
This is why there is an "instance" argument in the descriptor methods: you define the descriptor as a class variable, but the __set__ method, for example, receives the actual instance of your class, so you can still manage per-instance data, like your device.
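To illustrate, here is a runnable sketch along those lines. The FakeDevice stand-in and the slightly generalized check_range wrapper (it takes the value as the last positional argument so it fits both a property setter and __set__) are assumptions for demonstration, since the real device object is not shown in the question:

import functools

def check_range(min_val, max_val):
    def decorator(f):
        @functools.wraps(f)
        def wrap(*args):
            value = args[-1]                       # the value is the last positional argument
            if value not in range(min_val, max_val + 1):
                return                             # silently drop out-of-range writes
            return f(*args)
        return wrap
    return decorator

class FakeDevice:
    """Stand-in for the real bus device."""
    def __init__(self):
        self.registers = {}

    def init_smth(self, name):
        self.registers[name] = 0

    def get_value(self, name):
        return self.registers[name]

    def set_value(self, name, value):
        self.registers[name] = value

class Register:
    def __init__(self, name):
        self.name = name

    def __get__(self, instance, owner):
        if instance is None:
            return self
        return instance.device.get_value(self.name)

    @check_range(0, 1)
    def __set__(self, instance, value):
        instance.device.set_value(self.name, value)

class Foo:
    my_object = Register('my_object')              # class attribute, so __get__/__set__ run

    def __init__(self):
        self.device = FakeDevice()
        self.device.init_smth('my_object')

a = Foo()
a.my_object = 1
print(a.my_object)                                 # 1
a.my_object = -1                                   # rejected by check_range
print(a.my_object)                                 # still 1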
I have the following classes implementing a "Delegation Design Pattern" with an additional DelegatorParent class:
class DelegatorParent():
    def __init__(self):
        self.a = 'whatever'

class ConcreteDelegatee():
    def myMethod(self):
        return 'myMethod'

class Delegator(DelegatorParent):
    def __init__(self):
        self.delegatee = ConcreteDelegatee()
        DelegatorParent.__init__(self)

    def __getattr__(self, attrname):
        return getattr(self.delegatee, attrname)

a = Delegator()
result = a.myMethod()
Everything looks fine.
Now I would like to put an abstract method in DelegatorParent, to ensure that "myMethod" is always defined.
from abc import ABCMeta, abstractmethod

class DelegatorParent():
    __metaclass__ = ABCMeta

    @abstractmethod
    def myMethod(self):
        pass

    def __init__(self):
        self.a = 'whatever'

class ConcreteDelegatee():
    def myMethod(self):
        return 'myMethod'

class Delegator(DelegatorParent):
    def __init__(self):
        self.delegatee = ConcreteDelegatee()
        DelegatorParent.__init__(self)

    def __getattr__(self, attrname):
        return getattr(self.delegatee, attrname)

    # This method seems unnecessary, but if I erase it an exception is
    # raised because the abstract method's restriction is violated
    def myMethod(self):
        return self.delegatee.myMethod()

a = Delegator()
result = a.myMethod()
Can you help me find an "elegant" way to remove "myMethod" from "Delegator"? Intuition tells me it is somehow redundant, considering that a custom __getattr__ method is defined.
More importantly, notice that with this implementation, if I forget to define myMethod in ConcreteDelegatee the program runs, but it may crash at runtime when Delegator.myMethod() is called, which is exactly what I wanted to avoid by using abstract methods in DelegatorParent.
Obviously a simple solution would be to move @abstractmethod to the Delegator class, but I want to avoid that because in my program DelegatorParent is a very important class (and Delegator is just an auxiliary class).
You can decide to automatically implement the abstract methods that are delegated to ConcreteDelegatee.
For each abstract method, check whether its name exists in the ConcreteDelegatee class, and if so implement the method as a delegate to that class's method.
from abc import ABCMeta, abstractmethod

class DelegatorParent(object):
    __metaclass__ = ABCMeta

    def __init__(self):
        self.a = 'whatever'

    @abstractmethod
    def myMethod(self):
        pass

class Delegatee(object):
    pass

class ConcreteDelegatee(Delegatee):
    def myMethod(self):
        return 'myMethod'

    def myMethod2(self):
        return 'myMethod2'

class Delegator(DelegatorParent):
    def __new__(cls, *args, **kwargs):
        implemented = set()
        for name in cls.__abstractmethods__:
            if hasattr(ConcreteDelegatee, name):
                # Use a factory so each generated method captures its own
                # 'name' instead of the last value of the loop variable.
                def make_delegated(name):
                    def delegated(this, *a, **kw):
                        meth = getattr(this.delegatee, name)
                        return meth(*a, **kw)
                    return delegated
                setattr(cls, name, make_delegated(name))
                implemented.add(name)
        cls.__abstractmethods__ = frozenset(cls.__abstractmethods__ - implemented)
        obj = super(Delegator, cls).__new__(cls, *args, **kwargs)
        obj.delegatee = ConcreteDelegatee()
        return obj

    def __getattr__(self, attrname):
        # Called only for attributes not defined by this class (or its bases).
        # Retrieve attribute from current behavior delegate class instance.
        return getattr(self.delegatee, attrname)

# All abstract methods are delegated to ConcreteDelegatee
a = Delegator()
print(a.myMethod())   # correctly prints 'myMethod'
print(a.myMethod2())  # correctly prints 'myMethod2'
This solves the main problem (it prevents you from forgetting to define myMethod in ConcreteDelegatee). The other abstract methods are still checked if you forget to implement them.
The __new__ method is in charge of the delegation, which frees your __init__ from doing it.
Since you use ABCMeta, you must define the abstract methods. One could remove the method from the __abstractmethods__ set, but it is a frozenset. In any case, it involves listing all abstract methods.
So, instead of playing with __getattr__, you can use a simple descriptor.
For instance:
class Delegated(object):
    def __init__(self, attrname=None):
        self.attrname = attrname

    def __get__(self, instance, owner):
        if instance is None:
            return self
        delegatee = instance.delegatee
        return getattr(delegatee, self.attrname)

class Delegator(DelegatorParent):
    def __init__(self):
        self.delegatee = ConcreteDelegatee()
        DelegatorParent.__init__(self)

    myMethod = Delegated('myMethod')
An advantage here: the developer has the explicit information that "myMethod" is delegated.
If you try:
a = Delegator()
result = a.myMethod()
It works! But if you forget to implement myMethod in the Delegator class, you get the classic error:
Traceback (most recent call last):
File "script.py", line 40, in <module>
a = Delegator()
TypeError: Can't instantiate abstract class Delegator with abstract methods myMethod
Edit
This implementation can be generalized as follows:
class DelegatorParent():
    __metaclass__ = ABCMeta

    @abstractmethod
    def myMethod1(self):
        pass

    @abstractmethod
    def myMethod2(self):
        pass

    def __init__(self):
        self.a = 'whatever'

class ConcreteDelegatee1():
    def myMethod1(self):
        return 'myMethod1'

class ConcreteDelegatee2():
    def myMethod2(self):
        return 'myMethod2'

class DelegatedTo(object):
    def __init__(self, attrname):
        self.delegatee_name, self.attrname = attrname.split('.')

    def __get__(self, instance, owner):
        if instance is None:
            return self
        delegatee = getattr(instance, self.delegatee_name)
        return getattr(delegatee, self.attrname)

class Delegator(DelegatorParent):
    def __init__(self):
        self.delegatee1 = ConcreteDelegatee1()
        self.delegatee2 = ConcreteDelegatee2()
        DelegatorParent.__init__(self)

    myMethod1 = DelegatedTo('delegatee1.myMethod1')
    myMethod2 = DelegatedTo('delegatee2.myMethod2')

a = Delegator()
result = a.myMethod2()
Here, we can specify the delegatee name and delegatee method.
Here is my current solution. It solves the main problem (it prevents me from forgetting to define myMethod in ConcreteDelegatee), but I'm still not convinced, because I still need to define myMethod inside Delegator, which seems redundant.
from abc import ABCMeta, abstractmethod

class DelegatorParent(object):
    __metaclass__ = ABCMeta

    def __init__(self):
        self.a = 'whatever'

    @abstractmethod
    def myMethod(self):
        pass

class Delegatee(object):
    def checkExistence(self, attrname):
        if not callable(getattr(self, attrname, None)):
            error_msg = "Can't instantiate " + str(self.__class__.__name__) + " without abstract method " + attrname
            raise NotImplementedError(error_msg)

class ConcreteDelegatee(Delegatee):
    def myMethod(self):
        return 'myMethod'

    def myMethod2(self):
        return 'myMethod2'

class Delegator(DelegatorParent):
    def __init__(self):
        self.delegatee = ConcreteDelegatee()
        DelegatorParent.__init__(self)
        for method in DelegatorParent.__abstractmethods__:
            self.delegatee.checkExistence(method)

    def myMethod(self, *args, **kw):
        return self.delegatee.myMethod(*args, **kw)

    def __getattr__(self, attrname):
        # Called only for attributes not defined by this class (or its bases).
        # Retrieve attribute from current behavior delegate class instance.
        return getattr(self.delegatee, attrname)

# if I forget to implement myMethod inside ConcreteDelegatee,
# the following line will correctly raise an exception saying
# that 'myMethod' is missing inside 'ConcreteDelegatee'.
a = Delegator()
print a.myMethod()   # correctly prints 'myMethod'
print a.myMethod2()  # correctly prints 'myMethod2'
I'm trying to create a sample object to test the __new__ and __init__ methods.
Here is my sample code. When I run it, I see the "Its been Created" message but I don't see the "Initialized" and "Deleted" messages.
class Test(object):
    def __new__(self):
        print 'Its been Created'

    def __init__(self):
        print 'Its been Initialzed'

    def __del__(self):
        print 'Its been Deleted'

T = Test()
__new__ needs to return an instance of the class (see the docs). What you're effectively doing here is returning None, because a function with no explicit return statement returns None. Since the returned object is not an instance of Test, Python never calls __init__ on it, which is why you don't see the "Initialized" message, and because no Test instance is ever created, __del__ never runs either. The simplest way to fix this would be something like:
class Test(object):
    def __new__(cls):
        print 'Creating'
        return super(Test, cls).__new__(cls)

    def __init__(self):
        print 'Initializing'

    def __del__(self):
        print 'Deleting'
This will cause Test.__new__() to return the result of Test's superclass' (object in this case) __new__ method as the newly created instance.
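A quick check of the fixed version (assuming CPython, where the object is deleted as soon as the last reference goes away):

T = Test()   # prints 'Creating' then 'Initializing'
del T        # prints 'Deleting'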
It may help you to understand what's going on if you try the following:
class A(object):
    def __new__(cls):
        print 'A.__new__'
        return super(A, cls).__new__(cls)

    def __init__(self):
        print 'A.__init__'

    def __del__(self):
        print 'A.__del__'

class FakeA(object):
    def __new__(cls):
        print 'FakeA.__new__'
        # Return an instance of A, not of FakeA
        return A.__new__(A)

    def __init__(self):
        print 'FakeA.__init__'

    def __del__(self):
        print 'FakeA.__del__'

a = A()
fa = FakeA()
del a
del fa
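Under CPython this should print roughly the following. FakeA.__init__ never runs because FakeA.__new__ returns an object that is not a FakeA instance, and both deletions print A.__del__ because fa is really an A:

A.__new__
A.__init__
FakeA.__new__
A.__new__
A.__del__
A.__del__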
However, it is important to note that __del__ is not guaranteed to be called on every instance every time.
I'm having a hard time understanding what happens when I try to nest descriptors/decorators. I'm using python 2.7.
For example, let's take the following simplified versions of property and classmethod:
class MyProperty(object):
    def __init__(self, fget):
        self.fget = fget

    def __get__(self, obj, objtype=None):
        print 'IN MyProperty.__get__'
        return self.fget(obj)

class MyClassMethod(object):
    def __init__(self, f):
        self.f = f

    def __get__(self, obj, objtype=None):
        print 'IN MyClassMethod.__get__'
        def f(*args, **kwargs):
            return self.f(objtype, *args, **kwargs)
        return f
Trying to nest them:
class A(object):
    # doesn't work:
    @MyProperty
    @MyClassMethod
    def klsproperty(cls):
        return 555

    # works:
    @MyProperty
    def prop(self):
        return 111

    # works:
    @MyClassMethod
    def klsmethod(cls, x):
        return x**2
% print A.klsproperty
IN MyProperty.__get__
...
TypeError: 'MyClassMethod' object is not callable
The __get__ method of the inner descriptor MyClassMethod is not getting called.
Failing to figure out why, I tried throwing in (what I think is) a no-op descriptor:
class NoopDescriptor(object):
    def __init__(self, f):
        self.f = f

    def __get__(self, obj, objtype=None):
        print 'IN NoopDescriptor.__get__'
        return self.f.__get__(obj, objtype=objtype)
Trying to use the no-op descriptor/decorator in nesting:
class B(object):
    # works:
    @NoopDescriptor
    @MyProperty
    def prop1(self):
        return 888

    # doesn't work:
    @MyProperty
    @NoopDescriptor
    def prop2(self):
        return 999
% print B().prop1
IN NoopDescriptor.__get__
IN MyProperty.__get__
888
% print B().prop2
IN MyProperty.__get__
...
TypeError: 'NoopDescriptor' object is not callable
I don't understand why B().prop1 works and B().prop2 does not.
Questions:
What am I doing wrong? Why am I getting an "object is not callable" error?
What's the right way? E.g. what is the best way to define MyClassProperty while re-using MyClassMethod and MyProperty (or classmethod and property)?
In this case, when the decorators are used without parameters, a decorator is called with the function it decorates as its parameter. The decorator's return value is used instead of the decorated function. So:
@MyProperty
def prop(self):
    ...
is equivalent to:
def prop(self):
    ...
prop = MyProperty(prop)
Since MyProperty implements the descriptor protocol, accessing A.prop will actually call A.prop.__get__(), and you've defined __get__ to call the object which was decorated (in this case, the original function/method), so everything works fine.
Now, in the nested case:
@MyProperty
@MyClassMethod
def prop(self):
    ...
The equivalent is:
def prop(self):
    ...
prop = MyClassMethod(prop)  # prop is now instance of MyClassMethod
prop = MyProperty(prop)     # prop is now instance of MyProperty
                            # (with fget == MyClassMethod instance)
Now, as before, accessing A.prop will actually call A.prop.__get__() (in MyProperty) which then tries to call the instance of MyClassMethod (the object which was decorated and stored in the fget attribute).
But the MyClassMethod does not have a __call__ method defined, so you get the error MyClassMethod is not callable.
And to address your second question: A property is already a class attribute - in your example, accessing A.prop will return the value of the property in the class object and A().prop will return the value of the property in an instance object (which can be the same as the class object if the instance did not override it).
You can make your code work if you make MyProperty apply the descriptor protocol to its wrapped object:
class MyProperty(object):
    def __init__(self, fget):
        self.fget = fget

    def __get__(self, obj, objtype=None):
        print('IN MyProperty.__get__')
        try:
            return self.fget.__get__(obj, objtype)()
        except AttributeError:  # self.fget has no __get__ method
            return self.fget(obj)
Now your example code works:
class A(object):
    @MyProperty
    @MyClassMethod
    def klsproperty(cls):
        return 555
print(A.klsproperty)
The output is:
IN MyProperty.__get__
IN MyClassMethod.__get__
555
I found the definitive answer to my own old question in Graham Dumpleton's fascinating blog.
In short, the decorators I wrote do not honour the descriptor protocol: they try to call the wrapped function/object directly, instead of first giving it a chance to perform its "descriptor magic" by calling its __get__() method.
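A minimal sketch of that idea (an illustration, not the blog's code): before calling the wrapped object, let it run its own descriptor protocol if it defines one, and only fall back to using it directly otherwise.

def bind(wrapped, obj, objtype):
    # Give the wrapped object a chance to run its descriptor magic first.
    getter = getattr(type(wrapped), '__get__', None)
    if getter is not None:
        return getter(wrapped, obj, objtype)   # e.g. a plain function returns a bound method
    return wrapped                             # non-descriptor callables are used as-is

class MyProperty(object):
    def __init__(self, fget):
        self.fget = fget

    def __get__(self, obj, objtype=None):
        # Bind the wrapped object first, then call the bound result.
        return bind(self.fget, obj, objtype)()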