I have a master class for a planet:
class Planet:
    def __init__(self, name):
        self.name = name
        (...)

    def destroy(self):
        (...)
I also have a few classes that inherit from Planet and I want to make one of them unable to be destroyed (not to inherit the destroy function)
Example:
class Undestroyable(Planet):
    def __init__(self, name):
        super().__init__(name)
        (...)
    # Now it shouldn't have the destroy(self) function
So when this is run,
Undestroyable('This Planet').destroy()
it should produce an error like:
AttributeError: Undestroyable has no attribute 'destroy'
The mixin approach in the other answers is nice, and probably better for most cases. Nevertheless, it spoils part of the fun - it may oblige you to keep separate planet hierarchies, e.g. having to live with two abstract classes, one the ancestor of "destroyable" planets and the other of "non-destroyable" ones.
First approach: descriptor decorator
But Python has a powerful mechanism, called the "descriptor protocol", which is used to retrieve any attribute from a class or instance - it is even what ordinarily retrieves methods from instances - so it is possible to customize method retrieval so that it checks whether the method "should belong" to that class, and raises AttributeError otherwise.
The descriptor protocol mandates that whenever you try to get any attribute from an instance object in Python, Python will check if the attribute exists in that object's class, and if so, if the attribute itself has a method named __get__. If it has, __get__ is called (with the instance and class where it is defined as parameters) - and whatever it returns is the attribute. Python uses this to implement methods: functions in Python 3 have a __get__ method that when called, will return another callable object that, in turn, when called will insert the self parameter in a call to the original function.
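As a quick illustration of that mechanism (a minimal standalone sketch; the greet function and Dummy class are made up for the example, not part of the Planet code):

def greet(self):
    return "hello from " + type(self).__name__


class Dummy:
    pass


d = Dummy()
# Functions are descriptors: calling __get__ by hand produces a bound method,
# which is what ordinary attribute access would do if greet were a class attribute.
bound = greet.__get__(d, Dummy)
print(bound())   # hello from Dummy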
So, it is possible to create a class whose __get__ method decides whether to return the function as a bound method or not, depending on whether the owner class is marked as such - for example, by checking a specific flag (undestroyable in the code below). This can be done with a decorator that wraps the method with this descriptor functionality:
class Muteable:
    def __init__(self, flag_attr):
        self.flag_attr = flag_attr

    def __call__(self, func):
        """Called when the decorator is applied"""
        self.func = func
        return self

    def __get__(self, instance, owner):
        if instance and getattr(instance, self.flag_attr, False):
            raise AttributeError('Objects of type {0} have no {1} method'.format(
                instance.__class__.__name__, self.func.__name__))
        return self.func.__get__(instance, owner)
class Planet:
    def __init__(self, name=""):
        pass

    @Muteable("undestroyable")
    def destroy(self):
        print("Destroyed")


class BorgWorld(Planet):
    undestroyable = True
And on the interactive prompt:
In [110]: Planet().destroy()
Destroyed
In [111]: BorgWorld().destroy()
...
AttributeError: Objects of type BorgWorld have no destroy method
In [112]: BorgWorld().destroy
AttributeError: Objects of type BorgWorld have no destroy method
Notice that, unlike simply overriding the method, this approach raises the error when the attribute is retrieved - and it even makes hasattr work:
In [113]: hasattr(BorgWorld(), "destroy")
Out[113]: False
It won't work, though, if one tries to retrieve the method directly from the class instead of from an instance - in that case the instance parameter to __get__ is None, and we can't tell which class it was retrieved from - only the owner class, where it was declared.
In [114]: BorgWorld.destroy
Out[114]: <function __main__.Planet.destroy>
Second approach: __delattr__ on the metaclass:
While writing the above, it occurred to me that Python does have the __delattr__ special method. If the Planet class itself implemented __delattr__, it would not help us delete the destroy method on specific derived classes: __delattr__ guards attribute deletion on instances - and if you tried to del the "destroy" method on an instance, it would fail anyway, since the method lives in the class.
However, in Python, the class itself is an instance - of its "metaclass", which is usually type. A proper __delattr__ on the metaclass of Planet can make the "disinheritance" of the destroy method possible by issuing del UndestructiblePlanet.destroy after class creation.
Again, we use the descriptor protocol to have a proper "deleted method on the subclass":
class Deleted:
    def __init__(self, cls, name):
        self.cls = cls.__name__
        self.name = name

    def __get__(self, instance, owner):
        raise AttributeError("Objects of type '{0}' have no '{1}' method".format(self.cls, self.name))


class Deletable(type):
    def __delattr__(cls, attr):
        print("deleting from", cls)
        setattr(cls, attr, Deleted(cls, attr))


class Planet(metaclass=Deletable):
    def __init__(self, name=""):
        pass

    def destroy(self):
        print("Destroyed")


class BorgWorld(Planet):
    pass


del BorgWorld.destroy
And with this method, even trying to retrieve or check for the method's existence on the class itself will work:
In [129]: BorgWorld.destroy
...
AttributeError: Objects of type 'BorgWorld' have no 'destroy' method
In [130]: hasattr(BorgWorld, "destroy")
Out[130]: False
Third approach: metaclass with a custom __prepare__ method
Since metaclasses allow one to customize the object that contains the class namespace, it is possible to have an object that responds to a del statement within the class body, adding a Deleted descriptor.
For the user (programmer) of this metaclass it is almost the same thing, except that the del statement is now allowed inside the class body itself:
class Deleted:
    def __init__(self, name):
        self.name = name

    def __get__(self, instance, owner):
        raise AttributeError("No '{0}' method on class '{1}'".format(self.name, owner.__name__))


class Deletable(type):
    @classmethod
    def __prepare__(mcls, name, bases, **kwargs):
        class D(dict):
            def __delitem__(self, attr):
                # instead of removing the key, record the deletion with a descriptor
                self[attr] = Deleted(attr)
        return D()


class Planet(metaclass=Deletable):
    def destroy(self):
        print("destroyed")


class BorgPlanet(Planet):
    del destroy
(The Deleted descriptor remains the correct way to mark a method as 'deleted' - with this approach, though, it cannot know the class name at class creation time.)
As a class decorator:
And given the Deleted descriptor, one could simply pass the methods to be removed to a class decorator - there is no need for a metaclass in this case:
class Deleted:
    def __init__(self, cls, name):
        self.cls = cls.__name__
        self.name = name

    def __get__(self, instance, owner):
        raise AttributeError("Objects of type '{0}' have no '{1}' method".format(self.cls, self.name))


def mute(*methods):
    def decorator(cls):
        for method in methods:
            setattr(cls, method, Deleted(cls, method))
        return cls
    return decorator


class Planet:
    def destroy(self):
        print("destroyed")


@mute('destroy')
class BorgPlanet(Planet):
    pass
Modifying the __getattribute__ mechanism:
For the sake of completeness - what really makes Python reach methods and attributes on the superclass is what happens inside the __getattribute__ call. The object version of __getattribute__ is where the algorithm with the priorities for "data descriptor, instance, class, chain of base classes, ..." attribute retrieval is encoded.
So, changing that for the class is a single, easy point at which to get a "legitimate" AttributeError, without the need for the "non-existent" descriptor used in the previous approaches.
The problem is that object's __getattribute__ does not make use of type's version to search for the attribute in the class - if it did, just implementing __getattribute__ on the metaclass would suffice. One has to override it on the instance to block instance lookup of the method, and on the metaclass to block class-level lookup. A metaclass can, of course, inject the needed code:
def blocker_getattribute(target, attr, attr_base):
    try:
        muted = attr_base.__getattribute__(target, '__muted__')
    except AttributeError:
        muted = []
    if attr in muted:
        raise AttributeError("object {} has no attribute '{}'".format(target, attr))
    return attr_base.__getattribute__(target, attr)


def instance_getattribute(self, attr):
    return blocker_getattribute(self, attr, object)


class M(type):
    def __init__(cls, name, bases, namespace):
        cls.__getattribute__ = instance_getattribute

    def __getattribute__(cls, attr):
        return blocker_getattribute(cls, attr, type)


class Planet(metaclass=M):
    def destroy(self):
        print("destroyed")


class BorgPlanet(Planet):
    __muted__ = ['destroy']  # or use a decorator to set this! :-)
If Undestroyable is a unique (or at least unusual) case, it's probably easiest to just redefine destroy():
class Undestroyable(Planet):
    # ...
    def destroy(self):
        cls_name = self.__class__.__name__
        raise AttributeError("%s has no attribute 'destroy'" % cls_name)
From the point of view of the user of the class, this will behave as though Undestroyable.destroy() doesn't exist … unless they go poking around with hasattr(Undestroyable, 'destroy'), which is always a possibility.
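For example, with the Planet base class from the question and the Undestroyable override above, a quick check shows the caveat:

u = Undestroyable('This Planet')
print(hasattr(Undestroyable, 'destroy'))   # True - the attribute is still defined
print(hasattr(u, 'destroy'))               # True - hasattr only looks the attribute up
u.destroy()                                # AttributeError: Undestroyable has no attribute 'destroy'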
If it happens more often that you want subclasses to inherit some properties and not others, the mixin approach in chepner's answer is likely to be more maintainable. You can improve it further by making Destructible an abstract base class:
from abc import abstractmethod, ABCMeta


class Destructible(metaclass=ABCMeta):
    @abstractmethod
    def destroy(self):
        pass


class BasePlanet:
    # ...
    pass


class Planet(BasePlanet, Destructible):
    def destroy(self):
        # ...
        pass


class IndestructiblePlanet(BasePlanet):
    # ...
    pass
This has the advantage that if you try to instantiate the abstract class Destructible, you'll get an error pointing you at the problem:
>>> Destructible()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class Destructible with abstract methods destroy
… similarly if you inherit from Destructible but forget to define destroy():
class InscrutablePlanet(BasePlanet, Destructible):
    pass
>>> InscrutablePlanet()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class InscrutablePlanet with abstract methods destroy
Rather than remove an attribute that is inherited, only inherit destroy in the subclasses where it is applicable, via a mix-in class. This preserves the correct "is-a" semantics of inheritance.
class Destructible(object):
    def destroy(self):
        pass


class BasePlanet(object):
    ...


class Planet(BasePlanet, Destructible):
    ...


class IndestructiblePlanet(BasePlanet):  # Does *not* inherit from Destructible
    ...
You can provide suitable definitions for destroy in any of Destructible, Planet, or any class that inherits from Planet.
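For example, with the layout above (a quick check; the variable names are only illustrative):

earth = Planet()
rock = IndestructiblePlanet()
earth.destroy()                    # fine - Planet inherits destroy from Destructible
print(hasattr(rock, 'destroy'))    # False - IndestructiblePlanet never had it
rock.destroy()                     # AttributeError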
Metaclasses and descriptor protocols are fun, but perhaps overkill. Sometimes, for raw functionality, you can't beat good ole' __slots__.
class Planet(object):
    def __init__(self, name):
        self.name = name

    def destroy(self):
        print("Boom! %s is toast!\n" % self.name)


class Undestroyable(Planet):
    __slots__ = ['destroy']

    def __init__(self, name):
        super().__init__(name)


print()
x = Planet('Pluto')            # Small, easy to destroy
y = Undestroyable('Jupiter')   # Too big to fail

x.destroy()
y.destroy()
Boom! Pluto is toast!
Traceback (most recent call last):
File "planets.py", line 95, in <module>
y.destroy()
AttributeError: destroy
You cannot inherit only a portion of a class. It's all or nothing.
What you can do is put the destroy function in a second level of the class hierarchy: you have the Planet class without the destroy function, and then a DestroyablePlanet class where you add the destroy function, which all the destroyable planets use.
Or you can set a flag in the constructor of the Planet class which determines whether the destroy function will succeed or not, and check that flag inside the destroy function, as sketched below.
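A minimal sketch of that flag idea (the destructible attribute and the exact message are only illustrative):

class Planet:
    def __init__(self, name, destructible=True):
        self.name = name
        self.destructible = destructible

    def destroy(self):
        if not self.destructible:
            raise AttributeError("%s has no attribute 'destroy'" % self.__class__.__name__)
        print("Boom! %s is gone." % self.name)


class Undestroyable(Planet):
    def __init__(self, name):
        super().__init__(name, destructible=False)


Planet('Pluto').destroy()            # Boom! Pluto is gone.
Undestroyable('Jupiter').destroy()   # raises AttributeError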
I have a large Python 2.3 based installation with 200k LOC. As part of a migration project I need to intercept all attribute lookups of all old-style classes.
Old legacy code:
class Foo(Bar):
    ...
My idea is to inject a common mixin class like
class Foo(Bar, Mixin):
    ...

class Mixin:
    def __getattr__(self, k):
        print repr(self), k
        return Foo.__getattr__(self, k)
However, I always run into a recursion because Foo.__getattr__ resolves to Mixin.__getattr__.
Is there any way to fix the code for Python 2.3 old-style classes?
If you are already injecting mixins, why not add object as a parent to make them new-style classes:
class Foo(Mixin, Bar, object):
    ...
And then use super
class Mixin(object):
    def __getattr__(self, k):
        print repr(self), k
        return super(Mixin, self).__getattr__(k)
Assuming that none of the classes in your code base implement __setattr__ or __getattr__ themselves, one approach is to intercept __setattr__ in your Mixin, write the value into a reserved attribute, and then read it back in __getattr__:
class Mixin:
    def __setattr__(self, attr, value):
        # write the value into some special reserved space
        namespace = self.__dict__.setdefault("_namespace", {})
        namespace[attr] = value

    def __getattr__(self, attr):
        # reject special methods so e.g. __repr__ can't recurse
        if attr.startswith("__") and attr.endswith("__"):
            raise AttributeError
        # do whatever you wish to do here ...
        print repr(self), attr
        # read the value from the reserved space
        namespace = self.__dict__.get("_namespace", {})
        return namespace[attr]
Example:
class Foo(Mixin):
    def __init__(self):
        self.x = 1
Then
>>> Foo().x
<__main__.Foo instance at 0x10c4dad88> x
Clearly this won't work if any of your Foo classes implement __setattr__ or __getattr__ themselves.
So I have a .py file containing a class where its subclasses can be accessed as properties. All these subclasses are defined beforehand. I also need all the subclasses to have the same ability (having their own subclasses be accessible as properties). The biggest problem I've been facing is that I don't know how to access the current class within my implementation of __getattr__(), so that'd be a good place to start.
Here's some Python+Pseudocode with what I've tried so far. I'm pretty sure it won't work since __getattr__() seems to be only working with instances of a class. If that is case, sorry, I am not as familiar with OOP in Python as I would like.
class A(object):
    def __getattr__(self, name):
        subclasses = [c.__name__ for c in current_class.__subclasses__()]
        if name in subclasses:
            return name
        raise AttributeError
If I've understood your question properly, you can do what you want by using a custom metaclass that adds a classmethod to its instances. Here's an example:
class SubclassAttributes(type):
    def __getattr__(cls, name):  # classmethod of instances
        for subclass in cls.__subclasses__():
            if subclass.__name__ == name:
                return subclass
        else:
            raise TypeError('Class {!r} has no subclass '
                            'named {!r}'.format(cls.__name__, name))


class Base(object):
    __metaclass__ = SubclassAttributes  # Python 2 metaclass syntax

# class Base(object, metaclass=SubclassAttributes):  # Python 3 metaclass syntax
#     """ nothing to see here """
class Derived1(Base): pass
class Derived2(Base): pass
print(Base.Derived1) # -> <class '__main__.Derived1'>
print(Base.Derived2) # -> <class '__main__.Derived2'>
print(Base.Derived3) # -> TypeError: Class 'Base' has no subclass named 'Derived3'
For something that works in both Python 2 and 3, define the class as shown below. It derives Base from a class that has SubclassAttributes as its metaclass. This is similar to what the six module's with_metaclass() function does:
class Base(type.__new__(type('TemporaryMeta', (SubclassAttributes,), {}),
                        'TemporaryClass', (), {})): pass
class A(object):
    def __getattr__(self, key):
        for subclass in self.__class__.__subclasses__():
            if subclass.__name__ == key:
                return subclass
        raise AttributeError, key
Out of curiosity, what is this designed to be used for?
>>> class A(object):
...     pass
...
>>> foo = A()
>>> foo.__class__
<class '__main__.A'>
I'm writing a decorator for methods that must inspect the parent methods (the methods of the same name in the parents of the class in which I'm decorating).
Example (from the fourth example of PEP 318):
def returns(rtype):
    def check_returns(f):
        def new_f(*args, **kwds):
            result = f(*args, **kwds)
            assert isinstance(result, rtype), \
                "return value %r does not match %s" % (result, rtype)
            return result
        new_f.func_name = f.func_name
        # here I want to reach the class owning the decorated method f,
        # it should give me the class A
        return new_f
    return check_returns


class A(object):
    @returns(int)
    def compute(self, value):
        return value * 3
So I'm looking for the code to type in place of # here I want...
Thanks.
As bobince said, you can't access the surrounding class, because at the time the decorator is invoked, the class does not exist yet. If you need access to the full dictionary of the class and the bases, you should consider a metaclass:
__metaclass__
This variable can be any callable accepting arguments for name, bases, and dict. Upon class creation, the callable is used instead of the built-in type().
Basically, we convert the returns decorator into something that just tells the metaclass to do some magic on class construction:
class CheckedReturnType(object):
    def __init__(self, meth, rtype):
        self.meth = meth
        self.rtype = rtype


def returns(rtype):
    def _inner(f):
        return CheckedReturnType(f, rtype)
    return _inner


class BaseInspector(type):
    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.iteritems():
            if isinstance(obj, CheckedReturnType):
                # do your wrapping & checking here, base classes are in bases
                # reassign to dct
                pass
        return type.__new__(mcs, name, bases, dct)


class A(object):
    __metaclass__ = BaseInspector

    @returns(int)
    def compute(self, value):
        return value * 3
Mind that I have not tested this code, please leave comments if I should update this.
There are some articles on metaclasses by the highly recommendable David Mertz, which you might find interesting in this context.
here I want to reach the class owning the decorated method f
You can't because at the point of decoration, no class owns the method f.
class A(object):
    @returns(int)
    def compute(self, value):
        return value * 3
Is the same as saying:
class A(object):
    pass

@returns(int)
def compute(self, value):
    return value*3

A.compute = compute
Clearly, the returns() decorator is built before the function is assigned to an owner class.
Now when you assign a function to a class (either inline, or explicitly like this) it becomes an unbound method object. Now it has a reference to its owner class, which you can get by saying:
>>> A.compute.im_class
<class '__main__.A'>
So you can read f.im_class inside ‘new_f’, which is executed after the assignment, but not in the decorator itself.
(And even then it's a bit ugly relying on a CPython implementation detail if you don't need to. I'm not quite sure what you're trying to do, but things involving “get the owner class” are often doable using metaclasses.)
I am trying to write a decorator to do logging:
def logger(myFunc):
    def new(*args, **keyargs):
        print 'Entering %s.%s' % (myFunc.im_class.__name__, myFunc.__name__)
        return myFunc(*args, **keyargs)
    return new


class C(object):
    @logger
    def f(self):
        pass


C().f()
I would like this to print:
Entering C.f
but instead I get this error message:
AttributeError: 'function' object has no attribute 'im_class'
Presumably this is something to do with the scope of 'myFunc' inside 'logger', but I've no idea what.
Claudiu's answer is correct, but you can also cheat by getting the class name off of the self argument. This will give misleading log statements in cases of inheritance, but will tell you the class of the object whose method is being called. For example:
from functools import wraps  # use this to preserve function signatures and docstrings


def logger(func):
    @wraps(func)
    def with_logging(*args, **kwargs):
        print "Entering %s.%s" % (args[0].__class__.__name__, func.__name__)
        return func(*args, **kwargs)
    return with_logging


class C(object):
    @logger
    def f(self):
        pass


C().f()
As I said, this won't work properly in cases where you've inherited a function from a parent class; in this case you might say
class B(C):
    pass

b = B()
b.f()
and get the message Entering B.f where you actually want to get the message Entering C.f since that's the correct class. On the other hand, this might be acceptable, in which case I'd recommend this approach over Claudiu's suggestion.
Functions only become methods at runtime. That is, when you get C.f you get a method object (and C.f.im_class is C). At the time your function is defined it is just a plain function, not bound to any class. This unbound and disassociated function is what is decorated by logger.
self.__class__.__name__ will give you the name of the class, but you can also use descriptors to accomplish this in a somewhat more general way. This pattern is described in a blog post on Decorators and Descriptors, and an implementation of your logger decorator in particular would look like:
class logger(object):
    def __init__(self, func):
        self.func = func

    def __get__(self, obj, type=None):
        return self.__class__(self.func.__get__(obj, type))

    def __call__(self, *args, **kw):
        print 'Entering %s' % self.func
        return self.func(*args, **kw)


class C(object):
    @logger
    def f(self, x, y):
        return x+y


C().f(1, 2)
# => Entering <bound method C.f of <__main__.C object at 0x...>>
Obviously the output can be improved (by using, for example, getattr(self.func, 'im_class', None)), but this general pattern will work for both methods and functions. However it will not work for old-style classes (but just don't use those ;)
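For instance, one way to spell that improvement (a Python 2 sketch, since it relies on im_class; the verbose_logger name is mine):

class verbose_logger(logger):
    def __call__(self, *args, **kw):
        owner = getattr(self.func, 'im_class', None)   # None for plain functions
        name = '%s.%s' % (owner.__name__, self.func.__name__) if owner else self.func.__name__
        print 'Entering %s' % name
        return self.func(*args, **kw)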
Ideas proposed here are excellent, but have some disadvantages:
inspect.getouterframes and args[0].__class__.__name__ are not suitable for plain functions and static methods.
__get__ must live in a class, and such a class-based decorator is rejected by @wraps.
@wraps itself should hide decoration traces better.
So, I've combined some ideas from this page, links, docs and my own head,
and finally found a solution, that lacks all three disadvantages above.
As a result, method_decorator:
Knows the class the decorated method is bound to.
Hides decorator traces by answering to system attributes more correctly than functools.wraps() does.
Is covered with unit-tests for bound and unbound instance methods, class methods, static methods, and plain functions.
Usage:
pip install method_decorator
from method_decorator import method_decorator
class my_decorator(method_decorator):
    # ...
See full unit-tests for usage details.
And here is just the code of the method_decorator class:
class method_decorator(object):

    def __init__(self, func, obj=None, cls=None, method_type='function'):
        # These defaults are OK for plain functions
        # and will be changed by __get__() for methods once a method is dot-referenced.
        self.func, self.obj, self.cls, self.method_type = func, obj, cls, method_type

    def __get__(self, obj=None, cls=None):
        # It is executed when the decorated func is referenced as a method: cls.func or obj.func.
        if self.obj == obj and self.cls == cls:
            # Use the same instance that was already processed by a previous call to this __get__().
            return self

        method_type = (
            'staticmethod' if isinstance(self.func, staticmethod) else
            'classmethod' if isinstance(self.func, classmethod) else
            'instancemethod'
            # No branch for plain functions - the correct method_type for them is already set in __init__() defaults.
        )

        # Use a specialized method_decorator (or descendant) instance; don't change the
        # current instance's attributes - that leads to conflicts.
        # Use the bound or unbound method with this underlying func.
        return object.__getattribute__(self, '__class__')(
            self.func.__get__(obj, cls), obj, cls, method_type)

    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)

    def __getattribute__(self, attr_name):  # Hiding traces of decoration.
        if attr_name in ('__init__', '__get__', '__call__', '__getattribute__',
                         'func', 'obj', 'cls', 'method_type'):  # Our known names.
            # '__class__' is not included because it is used only with explicit object.__getattribute__().
            return object.__getattribute__(self, attr_name)  # Stopping recursion.
        # All other attr_names, including those auto-defined by the system on self, are searched
        # in the decorated self.func, e.g.: __module__, __class__, __name__, __doc__, im_*, func_*, etc.
        # Raises the correct AttributeError if the name is not found in the decorated self.func.
        return getattr(self.func, attr_name)

    def __repr__(self):  # Special case: __repr__ ignores __getattribute__.
        return self.func.__repr__()
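As an illustration of how a concrete decorator might build on it (a sketch only; the logged class below is mine, not part of the package):

class logged(method_decorator):
    def __call__(self, *args, **kwargs):
        # self.cls is filled in by __get__ once the method is dot-referenced;
        # it stays None for plain functions.
        owner = self.cls.__name__ + '.' if self.cls else ''
        print('Entering %s%s' % (owner, self.func.__name__))
        return method_decorator.__call__(self, *args, **kwargs)


class C(object):
    @logged
    def f(self):
        pass


C().f()   # Entering C.f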
It seems that while the class is being created, Python creates regular function objects. They only get turned into unbound method objects afterwards. Knowing that, this is the only way I could find to do what you want:
def logger(myFunc):
    def new(*args, **keyargs):
        print 'Entering %s.%s' % (myFunc.im_class.__name__, myFunc.__name__)
        return myFunc(*args, **keyargs)
    return new


class C(object):
    def f(self):
        pass

C.f = logger(C.f)

C().f()
This outputs the desired result.
If you want to wrap all the methods in a class, then you probably want to create a wrapClass function, which you could then use like this:
C = wrapClass(C)
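A minimal sketch of such a wrapClass helper, under the same Python 2 assumptions as the logger above (the implementation details are only illustrative):

import types

def wrapClass(cls):
    # wrap every plain function defined directly on the class with logger()
    for name, attr in cls.__dict__.items():
        if isinstance(attr, types.FunctionType):
            # getattr() returns the unbound method, which carries im_class
            setattr(cls, name, logger(getattr(cls, name)))
    return cls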
Class functions should always take self as their first argument, so you can use that instead of im_class.
def logger(myFunc):
    def new(self, *args, **keyargs):
        print 'Entering %s.%s' % (self.__class__.__name__, myFunc.__name__)
        return myFunc(self, *args, **keyargs)
    return new


class C(object):
    @logger
    def f(self):
        pass


C().f()
At first I wanted to use self.__name__, but that doesn't work because the instance has no name. You must use self.__class__.__name__ to get the name of the class.
I found another solution to a very similar problem using the inspect library. When the decorator is called, even though the function is not yet bound to the class, you can inspect the stack and discover which class is calling the decorator. You can at least get the string name of the class, if that is all you need (probably can't reference it yet since it is being created). Then you do not need to call anything after the class has been created.
import inspect


def logger(myFunc):
    classname = inspect.getouterframes(inspect.currentframe())[1][3]
    def new(*args, **keyargs):
        print 'Entering %s.%s' % (classname, myFunc.__name__)
        return myFunc(*args, **keyargs)
    return new


class C(object):
    @logger
    def f(self):
        pass


C().f()
While this is not necessarily better than the others, it is the only way I can figure out to discover the class name of the future method during the call to the decorator. Make note of not keeping references to frames around in the inspect library documentation.
As shown in Asa Ayers' answer, you don't need to access the class object. It may be worth knowing that since Python 3.3, you can also use __qualname__, which gives you the fully qualified name:
>>> def logger(myFunc):
...     def new(*args, **keyargs):
...         print('Entering %s' % myFunc.__qualname__)
...         return myFunc(*args, **keyargs)
...
...     return new
...
>>> class C(object):
...     @logger
...     def f(self):
...         pass
...
>>> C().f()
Entering C.f
This has the added advantage of working also in the case of nested classes, as shown in this example taken from PEP 3155:
>>> class C:
...     def f(): pass
...     class D:
...         def g(): pass
...
>>> C.__qualname__
'C'
>>> C.f.__qualname__
'C.f'
>>> C.D.__qualname__
'C.D'
>>> C.D.g.__qualname__
'C.D.g'
Notice also that in Python 3 the im_class attribute is gone, so if you really wish to access the class in a decorator, you need another method. The approach I currently use involves object.__set_name__ and is detailed in my answer to "Can a Python decorator of an instance method access the class?"
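A minimal sketch of that __set_name__ idea (Python 3.6+; the logged class here is illustrative, not taken from the linked answer):

class logged:
    def __init__(self, func):
        self.func = func

    def __set_name__(self, owner, name):
        # called automatically at class creation time, so the owner class is known here
        print('Decorating %s.%s' % (owner.__qualname__, name))
        self.owner = owner

    def __get__(self, obj, objtype=None):
        # delegate to the wrapped function's normal binding behaviour
        return self.func.__get__(obj, objtype)


class C:
    @logged
    def f(self):
        pass


C().f()   # 'Decorating C.f' was already printed while the class body was executed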
You can also use new.instancemethod() to create an instance method (either bound or unbound) from a function.
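For example (a Python 2 sketch; the names are only illustrative):

import new

def f(self):
    print 'hello from', self.__class__.__name__

class C(object):
    pass

# new.instancemethod(function, instance, class) - instance=None gives an unbound method
C.f = new.instancemethod(f, None, C)
print C.f.im_class     # <class '__main__.C'>
C().f()                # hello from C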
Instead of injecting decorating code at definition time, when the function doesn't know its class, delay running this code until the function is accessed/called. A descriptor object makes it easy to inject your own code late, at access/call time:
class decorated(object):
    def __init__(self, func, type_=None):
        self.func = func
        self.type = type_

    def __get__(self, obj, type_=None):
        return self.__class__(self.func.__get__(obj, type_), type_)

    def __call__(self, *args, **kwargs):
        name = '%s.%s' % (self.type.__name__, self.func.__name__)
        print('called %s with args=%s kwargs=%s' % (name, args, kwargs))
        return self.func(*args, **kwargs)


class Foo(object):
    @decorated
    def foo(self, a, b):
        pass
Now we can inspect class both at access time (__get__) and at call time (__call__). This mechanism works for plain methods as well as static|class methods:
>>> Foo().foo(1, b=2)
called Foo.foo with args=(1,) kwargs={'b': 2}
Full example at: https://github.com/aurzenligl/study/blob/master/python-robotwrap/Example4.py