Discover decorated class instance methods in Python

I have a Python class, for example:

class Book(models.Model):
    enabled = models.BooleanField(default=False)
    full_title = models.CharField(max_length=256)
    alias = models.CharField(max_length=64)
    author = models.CharField(max_length=64)
    status = models.CharField(max_length=64)

    @serializable
    def pretty_status(self):
        return [b for a, b in BOOK_STATUS_CHOICES if a == self.status][0]
The method pretty_status is decorated with @serializable.
What is the simplest and most efficient way to discover the methods in a class that have a certain decorator (in the above example, giving: pretty_status)?
Edit:
Please also note that the decorator in question is custom/modifiable.

If you have no control over what the decorator does, then in general you cannot identify decorated methods.
However, since you can modify serializable, you could add an attribute to the wrapped function and later use that attribute to identify serialized methods:
import inspect

def serializable(func):
    def wrapper(self):
        pass
    wrapper.serialized = True
    return wrapper

class Book:
    @serializable
    def pretty_status(self):
        pass

    def foo(self):
        pass

# In Python 3, methods looked up on the class are plain functions,
# so filter with inspect.isfunction rather than inspect.ismethod.
for name, member in inspect.getmembers(Book, inspect.isfunction):
    if getattr(member, 'serialized', False):
        print(name, member)
yields:

pretty_status <function serializable.<locals>.wrapper at 0x...>

Generally speaking, you can't. A decorator is just syntactic sugar for applying a callable. In your case the decorator syntax translates to:
def pretty_status(self):
    return [b for a, b in BOOK_STATUS_CHOICES if a == self.status][0]
pretty_status = serializable(pretty_status)
That is, pretty_status is replaced by whatever serializable() returns. What it returns could be anything.
Now, if what serializable returns has itself been decorated with functools.wraps() and you are using Python 3.2 or newer, then you can see if there is a .__wrapped__ attribute on the new .pretty_status method; it's a reference to the original wrapped function.
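For example, a minimal sketch of serializable written with functools.wraps (assuming, per the question's edit, that you control the decorator):

import functools

def serializable(func):
    @functools.wraps(func)  # on Python 3.2+ this sets wrapper.__wrapped__ = func
    def wrapper(*args, **kw):
        return func(*args, **kw)
    return wrapper

# afterwards: hasattr(Book.pretty_status, '__wrapped__') is True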
On earlier versions of Python, you can easily do this yourself too:
def serializable(func):
    def wrapper(*args, **kw):
        # ...
        return func(*args, **kw)
    wrapper.__wrapped__ = func
    return wrapper
You can add any number of attributes to that wrapper function, including custom attributes of your own choosing:
def serializable(func):
    def wrapper(*args, **kw):
        # ...
        return func(*args, **kw)
    wrapper._serializable = True
    return wrapper
and then test for that attribute:
if getattr(method, '_serializable', False):
    print("Method decorated with the @serializable decorator")
One last thing you can do is test for that wrapper function; it'll have a .__name__ attribute that you can test against. That name might not be unique, but it is a start.
In the above sample decorator, the wrapper function is called wrapper, so pretty_status.__name__ == 'wrapper' will be True.

You can't discover them directly, but you can mark decorated methods with a flag.
import functools

def serializable(func):
    @functools.wraps(func)
    def wrapper(*args, **kw):
        # ...
        return func(*args, **kw)
    wrapper._serializable = True
    return wrapper
Then you can, for example, write a metaclass that analyses the presence or absence of the _serializable attribute.
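A minimal sketch of such a metaclass (SerializableMeta and _serialized_methods are illustrative names, not from any library):

class SerializableMeta(type):
    def __new__(mcs, name, bases, namespace):
        cls = super().__new__(mcs, name, bases, namespace)
        # collect the names of methods carrying the flag set by the decorator
        cls._serialized_methods = [
            attr for attr, value in namespace.items()
            if getattr(value, '_serializable', False)
        ]
        return cls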
Or you can collect all wrapped methods in the decorator itself:
import functools

DECORATED = {}

def serializable(func):
    @functools.wraps(func)
    def wrapper(*args, **kw):
        # ...
        return func(*args, **kw)
    DECORATED[func.__name__] = wrapper
    return wrapper
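With this variant the registry fills in as the class body executes, so after the class is defined you can simply inspect DECORATED; a rough usage sketch:

class Book:
    @serializable
    def pretty_status(self):
        pass

print(DECORATED)  # {'pretty_status': <function Book.pretty_status at 0x...>}

Note that keying the registry by func.__name__ means same-named methods from different classes overwrite each other; keying by func.__qualname__ keeps them distinct.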


Dynamic Wrapper in Python

I'm looking to create a dynamic wrapper class that exposes the API calls from a provided object using data in the object.
Statically it looks like this:
class Concrete:
    def __init__(self, data):
        self.data = data

    def print_data(self):
        print(self.data)

class Wrapper:
    '''
    One day this will wrap a variety of objects. But today
    it can only handle Concrete objects.
    '''
    def wrap_it(self, concrete):
        self.cco = concrete  # concrete object = cco

    def print_data(self):
        self.cco.print_data()

cco = Concrete(5)
wcco = Wrapper()
wcco.wrap_it(cco)
wcco.print_data()
Produces
5
I'd like to figure out how to do the same thing but make
wrap_it dynamic. It should search the concrete object,
find its functions, and create functions of the same name
that call the corresponding function on the concrete object.
I imagine that the solution involves inspect.signature or
at least some use of *args and **kwargs, but I've not seen
an example on how to put all this together.
You can use the __getattr__ magic method to hook getting undefined attributes, and forward them to the concrete object:
class DynamicWrapper():
    def wrap_it(self, concrete):
        self.cco = concrete

    def __getattr__(self, k):
        def wrapper(*args, **kwargs):
            print(f'DynamicWrapper calling {k} with args {args} {kwargs}')
            return getattr(self.cco, k)(*args, **kwargs)
        if hasattr(self.cco, k):
            return wrapper
        else:
            raise AttributeError(f'No such field/method: {k}')

cco = Concrete(5)
dwcco = DynamicWrapper()
dwcco.wrap_it(cco)
dwcco.print_data()
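Running this, the hook prints the forwarded call before Concrete does its own printing, so the output should look like:

DynamicWrapper calling print_data with args () {}
5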
Use the dir() function to get the attributes of the given object, check if they are callable and assign them to your wrapper, like this:
class Wrapper:
    def wrap_it(self, objToWrap):
        for attr in dir(objToWrap):
            if not attr.startswith('__') and callable(getattr(objToWrap, attr)):
                exec('self.%s = objToWrap.%s' % (attr, attr))
And now, for testing.
>>> cco = Concrete(5)
>>> wcco = Wrapper()
>>> wcco.wrap_it(cco)
>>> wcco.print_data()
5
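As an aside, the exec call works, but the same assignment can be written without building source strings; an equivalent one-liner for the loop body would be:

setattr(self, attr, getattr(objToWrap, attr))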

Decorate any python function inside context manager

I would like to create a Python context manager which would allow the following (with reverse_decorator calling the decorated function with its first argument reversed if it is a string):

print('hi')
with MyFunctionDecorator('print', reverse_decorator):
    print('hello')
print('bye')
to result in:
hi
olleh
bye
The point is not the print function itself, but writing this kind of context manager that could decorate any function: local, global, builtin, from whatever module. Is this even possible in Python? How should I do it?
EDIT: To clarify a bit, the point was not to have to change the code inside the with context.
This is my approach:
from contextlib import contextmanager
from importlib import import_module

@contextmanager
def MyFunctionDecorator(func, decorator):
    if hasattr(func, '__self__'):
        owner = func.__self__
    elif hasattr(func, '__objclass__'):
        owner = func.__objclass__
    else:
        owner = import_module(func.__module__)
    qname = func.__qualname__
    while '.' in qname:
        parent, qname = qname.split('.', 1)
        owner = getattr(owner, parent)
    setattr(owner, func.__name__, decorator(func))
    yield
    setattr(owner, func.__name__, func)
# Example decorator: reverse all str arguments
def reverse_decorator(f):
    def wrapper(*args, **kwargs):
        newargs = []
        for arg in args:
            newargs.append(arg[::-1] if isinstance(arg, str) else arg)
        newkwargs = {}
        for karg, varg in kwargs.items():
            newkwargs[karg] = varg[::-1] if isinstance(varg, str) else varg
        return f(*newargs, **newkwargs)
    return wrapper
# Free functions
print('hi')
with MyFunctionDecorator(print, reverse_decorator):
    print('hello')
print('bye')

# Class for testing methods (does not work with builtins)
class MyClass(object):
    def __init__(self, objId):
        self.objId = objId

    def print(self, arg):
        print('Printing from object', self.objId, arg)

# Class level (only affects instances created within the managed context)
# Note for the decorator: the first argument of the decorated function is self here
with MyFunctionDecorator(MyClass.print, reverse_decorator):
    myObj = MyClass(1)
    myObj.print('hello')

# Instance level (only affects one instance)
myObj = MyClass(2)
myObj.print('hi')
with MyFunctionDecorator(myObj.print, reverse_decorator):
    myObj.print('hello')
myObj.print('bye')
Output:
hi
olleh
bye
Printing from object 1 olleh
Printing from object 2 hi
Printing from object 2 olleh
Printing from object 2 bye
This should work across functions and other modules and so on, since it modifies the attributes of the module or class. Class methods are complicated, because once you create an instance of a class its attributes point to the functions defined in the class at the time the object was created, so you have to choose between modifying the behavior of a particular instance or modifying the behavior of new instances within the managed context, as in the example. Also, trying to decorate methods of builtin classes like list or dict does not work.
It is possible if you modify your code a bit:
print('hi')
with MyFunctionDecorator(print, reverse_decorator) as print:
    print('hello')
print('bye')
Here is a definition that works for this example*:
def reverse_decorator(func):
    def wrapper(*args, **kwargs):
        if len(args) == 1 and not kwargs and isinstance(args[0], str):
            return func(args[0][::-1])
        return func(*args, **kwargs)
    return wrapper

class MyFunctionDecorator:
    def __init__(self, func, decorator):
        self.func = func
        self.decorator = decorator

    def __enter__(self):
        """Return the decorated function"""
        return self.decorator(self.func)

    def __exit__(self, *args):
        """Reset the function in the global namespace"""
        globals()[self.func.__name__] = self.func
But it's probably easier to just do it explicitly, following the Zen of Python:
print('hi')
print('hello'[::-1])
print('bye')
*This code does not work under many circumstances, as @AranFey noted in the comments:

- Inside functions
- If the function you want to decorate is imported like from y import x as z
- If you care that afterwards you have a print function defined in globals(), instead of print directly being a built-in

Since this is more a proof of concept that, yes, one can write a decorator that works in this example, I will not try to fix these shortcomings. Just use the approach given above, or use only the decorator:
print('hi')
reverse_decorator(print)('hello')
print('bye')

How can I delay the __init__ call until an attribute is accessed?

I have a test framework that requires test cases to be defined using the following class patterns:
class TestBase:
    def __init__(self, params):
        self.name = str(self.__class__)
        print('initializing test: {} with params: {}'.format(self.name, params))

class TestCase1(TestBase):
    def run(self):
        print('running test: ' + self.name)
When I create and run a test, I get the following:
>>> test1 = TestCase1('test 1 params')
initializing test: <class '__main__.TestCase1'> with params: test 1 params
>>> test1.run()
running test: <class '__main__.TestCase1'>
The test framework searches for and loads all TestCase classes it can find, instantiates each one, then calls the run method for each test.
load_test(TestCase1(test_params1))
load_test(TestCase2(test_params2))
...
load_test(TestCaseN(test_params3))
...
for test in loaded_tests:
    test.run()
However, I now have some test cases for which I don't want the __init__ method called until the time that the run method is called, but I have little control over the framework structure or methods. How can I delay the call to __init__ without redefining the __init__ or run methods?
Update
The speculations that this originated as an XY problem are correct. A coworker asked me this question a while back when I was maintaining said test framework. I inquired further about what he was really trying to achieve and we figured out a simpler workaround that didn't involve changing the framework or introducing metaclasses, etc.
However, I still think this is a question worth investigating: if I wanted to create new objects with "lazy" initialization ("lazy" as in the lazy evaluation of generators such as range, etc.), what would be the best way of accomplishing it? My best attempt so far is listed below; I'm interested in knowing if there's anything simpler or less verbose.
First solution: use property, the elegant way of doing setters/getters in Python.
class Bars(object):
    def __init__(self):
        self._foo = None

    @property
    def foo(self):
        if not self._foo:
            print("lazy initialization")
            self._foo = [1, 2, 3]
        return self._foo

if __name__ == "__main__":
    f = Bars()
    print(f.foo)
    print(f.foo)
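Running this should print the initialization message only once; the second access returns the cached list directly:

lazy initialization
[1, 2, 3]
[1, 2, 3]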
Second solution: the proxy solution, usually implemented with a decorator.
In short, a proxy is a wrapper that wraps the object you need. A proxy can provide additional functionality to the object it wraps without changing the object's code; it's a surrogate that provides the ability to control access to an object. The code below comes from user Cyclone:
class LazyProperty:
    def __init__(self, method):
        self.method = method
        self.method_name = method.__name__

    def __get__(self, obj, cls):
        if not obj:
            return None
        value = self.method(obj)
        print('value {}'.format(value))
        setattr(obj, self.method_name, value)
        return value

class test:
    def __init__(self):
        self._resource = None

    @LazyProperty
    def resource(self):
        print("lazy")
        self._resource = tuple(range(5))
        return self._resource

if __name__ == '__main__':
    t = test()
    print(t.resource)
    print(t.resource)
    print(t.resource)
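Expected output: the method body runs only on the first access; once setattr has placed the value in the instance dictionary, later lookups bypass the (non-data) descriptor entirely:

lazy
value (0, 1, 2, 3, 4)
(0, 1, 2, 3, 4)
(0, 1, 2, 3, 4)
(0, 1, 2, 3, 4)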
This is to be used for true one-time calculated lazy properties. I like it because it avoids sticking extra attributes on objects, and once activated it does not waste time checking for attribute presence.
Metaclass option
You can intercept the call to __init__ using a metaclass. Create the object with __new__ and overwrite the __getattribute__ method to check if __init__ has been called or not and call it if it hasn't.
class DelayInit(type):
    def __call__(cls, *args, **kwargs):
        def init_before_get(obj, attr):
            if not object.__getattribute__(obj, '_initialized'):
                # Mark as initialized first, so the attribute lookups
                # triggered below don't re-enter this branch recursively.
                obj._initialized = True
                obj.__init__(*args, **kwargs)
            return object.__getattribute__(obj, attr)
        cls.__getattribute__ = init_before_get
        new_obj = cls.__new__(cls, *args, **kwargs)
        new_obj._initialized = False
        return new_obj

class TestDelayed(TestCase1, metaclass=DelayInit):
    pass
In the example below, you'll see that the init print won't occur until the run method is executed.
>>> new_test = TestDelayed('delayed test params')
>>> new_test.run()
initializing test: <class '__main__.TestDelayed'> with params: delayed test params
running test: <class '__main__.TestDelayed'>
Decorator option
You could also use a decorator that has a similar pattern to the metaclass above:
def delayinit(cls):
    def init_before_get(obj, attr):
        if not object.__getattribute__(obj, '_initialized'):
            # Mark first, so reading obj._init_args below doesn't recurse.
            obj._initialized = True
            obj.__init__(*obj._init_args, **obj._init_kwargs)
        return object.__getattribute__(obj, attr)
    cls.__getattribute__ = init_before_get

    def construct(*args, **kwargs):
        obj = cls.__new__(cls, *args, **kwargs)
        obj._init_args = args
        obj._init_kwargs = kwargs
        obj._initialized = False
        return obj
    return construct

@delayinit
class TestDelayed(TestCase1):
    pass
This will behave identically to the example above.
In Python, under the default metaclass there is no way to avoid calling __init__ when you instantiate a class cls: if calling cls(args) returns an instance of cls, the language guarantees that cls.__init__ will have been called.
So the only way to achieve something similar to what you are asking is to introduce another class that will postpone the calling of __init__ in the original class until an attribute of the instantiated class is being accessed.
Here is one way:
def delay_init(cls):
    class Delay(cls):
        def __init__(self, *arg, **kwarg):
            self._arg = arg
            self._kwarg = kwarg

        def __getattribute__(self, name):
            self.__class__ = cls
            arg = self._arg
            kwarg = self._kwarg
            del self._arg
            del self._kwarg
            self.__init__(*arg, **kwarg)
            return getattr(self, name)
    return Delay
This wrapper function works by catching any attempt to access an attribute of the instantiated class. When such an attempt is made, it changes the instance's __class__ to the original class, calls the original __init__ method with the arguments that were used when the instance was created, and then returns the proper attribute. This function can be used as decorator for your TestCase1 class:
class TestBase:
    def __init__(self, params):
        self.name = str(self.__class__)
        print('initializing test: {} with params: {}'.format(self.name, params))

class TestCase1(TestBase):
    def run(self):
        print('running test: ' + self.name)
>>> t1 = TestCase1("No delay")
initializing test: <class '__main__.TestCase1'> with params: No delay
>>> t2 = delay_init(TestCase1)("Delayed init")
>>> t1.run()
running test: <class '__main__.TestCase1'>
>>> t2.run()
initializing test: <class '__main__.TestCase1'> with params: Delayed init
running test: <class '__main__.TestCase1'>
>>>
Be careful where you apply this function though. If you decorate TestBase with delay_init, it will not work, because it will turn the TestCase1 instances into TestBase instances.
In my answer I'd like to focus on cases when one wants to instantiate a class whose initialiser (dunder init) has side effects. For instance, pysftp.Connection creates an SSH connection, which may be undesired until it's actually used.
In a great blog series about conceiving the wrapt package (a nit-picky decorator implementation), the author describes the transparent object proxy. This code can be customised for the subject in question.
class LazyObject:
    _factory = None
    '''Callable responsible for creation of target object'''

    _object = None
    '''Target object created lazily'''

    def __init__(self, factory):
        self._factory = factory

    def __getattr__(self, name):
        if not self._object:
            self._object = self._factory()
        return getattr(self._object, name)
Then it can be used as:
obj = LazyObject(lambda: dict(foo='bar'))
obj.keys()  # dict_keys(['foo'])
But len(obj), obj['foo'] and other language constructs which invoke Python object protocols (dunder methods, like __len__ and __getitem__) will not work. However, for many cases, which are limited to regular methods, this is a solution.
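For example, with the LazyObject above:

obj = LazyObject(lambda: dict(foo='bar'))
obj.keys()  # works: resolved through __getattr__
len(obj)    # TypeError: object of type 'LazyObject' has no len()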
To proxy object protocol implementations, it's possible to use neither __getattr__, nor __getattribute__ (to do it in a generic way). The latter's documentation notes:
This method may still be bypassed when looking up special methods as the result of implicit invocation via language syntax or built-in functions. See Special method lookup.
Where a complete solution is demanded, there are examples of manual implementations, like werkzeug's LocalProxy and django's SimpleLazyObject. However, a clever workaround is possible.
Luckily there's a dedicated package (based on wrapt) for the exact use case, lazy-object-proxy which is described in this blog post.
from lazy_object_proxy import Proxy

obj = Proxy(lambda: dict(foo='bar'))
obj.keys()  # dict_keys(['foo'])
len(obj)    # 1
obj['foo']  # 'bar'
One alternative would be to write a wrapper that takes a class as input and returns a class with delayed initialization until any member is accessed. This could for example be done as this:
def lazy_init(cls):
    class LazyInit(cls):
        def __init__(self, *args, **kwargs):
            self.args = args
            self.kwargs = kwargs
            self._initialized = False

        def __getattr__(self, attr):
            if not self.__dict__['_initialized']:
                cls.__init__(self,
                             *self.__dict__['args'], **self.__dict__['kwargs'])
                self._initialized = True
            return self.__dict__[attr]
    return LazyInit
This could then be used as such
load_test(lazy_init(TestCase1)(test_params1))
load_test(lazy_init(TestCase2)(test_params2))
...
load_test(lazy_init(TestCaseN)(test_params3))
...
for test in loaded_tests:
    test.run()
Answering your original question (and the problem I think you are actually trying to solve), "How can I delay the init call until an attribute is accessed?": don't call init until you access the attribute.
Said another way: you can make the class initialization simultaneous with the attribute call. What you seem to actually want is 1) create a collection of TestCase# classes along with their associated parameters; 2) run each test case.
Probably your original problem came from thinking you had to initialize all your TestCase classes in order to create a list of them that you could iterate over. But in fact you can store class objects in lists, dicts etc. That means you can do whatever method you have for finding all TestCase classes and store those class objects in a dict with their relevant parameters. Then just iterate that dict and call each class with its run() method.
It might look like:
tests = {TestCase1: 'test 1 params', TestCase2: 'test 2 params', TestCase3: 'test 3 params'}
for test_case, param in tests.items():
    test_case(param).run()
Overriding __new__
You could do this by overriding the __new__ method and replacing the __init__ method with a custom function.
def init(cls, real_init):
    def wrapped(self, *args, **kwargs):
        # This will run during the first call to `__init__`
        # made after `__new__`. Here we re-assign the original
        # __init__ back to the class and assign a custom function
        # to `instance.__init__`.
        cls.__init__ = real_init
        def new_init():
            if new_init.called is False:
                real_init(self, *args, **kwargs)
                new_init.called = True
        new_init.called = False
        self.__init__ = new_init
    return wrapped

class DelayInitMixin(object):
    def __new__(cls, *args, **kwargs):
        cls.__init__ = init(cls, cls.__init__)
        return object.__new__(cls)

class A(DelayInitMixin):
    def __init__(self, a, b):
        print('inside __init__')
        self.a = sum(a)
        self.b = sum(b)

    def __getattribute__(self, attr):
        init = object.__getattribute__(self, '__init__')
        if not init.called:
            init()
        return object.__getattribute__(self, attr)

    def run(self):
        pass

    def fun(self):
        pass
Demo:
>>> a = A(range(1000), range(10000))
>>> a.run()
inside __init__
>>> a.a, a.b
(499500, 49995000)
>>> a.run(), a.__init__()
(None, None)
>>> b = A(range(100), range(10000))
>>> b.a, b.b
inside __init__
(4950, 49995000)
>>> b.run(), b.__init__()
(None, None)
Using cached properties
The idea is to do the heavy calculation only once by caching results. This approach will lead to much more readable code if the whole point of delaying initialization is improving performance.
Django comes with a nice decorator called @cached_property. I tend to use it a lot in both code and unit tests for caching the results of heavy properties.
A cached_property is a non-data descriptor. Hence, once the key is set in the instance's dictionary, access to the property will always get the value from there.
class cached_property(object):
    """
    Decorator that converts a method with a single self argument into a
    property cached on the instance.

    Optional ``name`` argument allows you to make cached properties of other
    methods. (e.g. url = cached_property(get_absolute_url, name='url'))
    """
    def __init__(self, func, name=None):
        self.func = func
        self.__doc__ = getattr(func, '__doc__')
        self.name = name or func.__name__

    def __get__(self, instance, cls=None):
        if instance is None:
            return self
        res = instance.__dict__[self.name] = self.func(instance)
        return res
Usage:
class A:
    @cached_property
    def a(self):
        print('calculating a')
        return sum(range(1000))

    @cached_property
    def b(self):
        print('calculating b')
        return sum(range(10000))
Demo:
>>> a = A()
>>> a.a
calculating a
499500
>>> a.b
calculating b
49995000
>>> a.a, a.b
(499500, 49995000)
I think you can use a wrapper class to hold the real class you want to instantiate, and call __init__ yourself in your code, like this (Python 3 code):
class Wrapper:
    def __init__(self, cls):
        self.cls = cls
        self.instance = None

    def your_method(self, *args, **kwargs):
        if not self.instance:
            self.instance = self.cls()
        return self.instance.your_method(*args, **kwargs)

class YourClass:
    def __init__(self):
        print("calling __init__")
It's a crude approach, but it works without any tricks.

How to replace __str__ for a function

I want to change the string representation of a python function to be just the function name.
Eg for some function
def blah(x):
...
str(blah) currently gives
<function blah at 0x10127b578>
So I replace __str__ like this:
blah.__str__ = lambda: 'blah'
but it doesn't get called by str(blah).
Is it possible to change __str__ for a function?
Replacing __str__ like that never actually works on any type of object. The builtin function type isn't special:
>>> class Foo(object):
...     pass
...
>>> f = Foo()
>>> f.__str__ = lambda: 'f'
>>> str(f)
'<__main__.Foo object at 0x0000000002D13F28>'
>>> f.__str__()
'f'
The str builtin doesn't look up __str__ via the usual lookup protocol; it skips straight to looking at the class of its argument. So you can't "override" the __str__ method on an individual object; you have to set it on a class to apply to all objects of that class.[1]
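Setting it on the class, by contrast, does work; continuing the snippet above:

>>> Foo.__str__ = lambda self: 'f'
>>> str(f)
'f'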
But then again, the builtin function type isn't special. You can make a class that behaves mostly like a function, and use that instead:
class Function(object):
    def __init__(self, raw_function):
        self.raw_function = raw_function

    def __call__(self, *args, **kwargs):
        return self.raw_function(*args, **kwargs)

    def __str__(self):
        return self.raw_function.__name__
The __call__ method means you can call objects of this class exactly as if they were functions. This one just passes whatever arguments it receives straight through to the underlying raw function and returns whatever it returns (or throws whatever exceptions it throws). And since this class takes a single function as an argument, it also fits the pattern for a decorator:
@Function
def blah(x):
    return x + 1
With that:
>>> blah
<__main__.Function object at 0x0000000002D13EB8>
>>> str(blah)
'blah'
>>> blah(33)
34
[1] The builtin function type is special in some regards (I lied), one of which being that you can't assign to its attributes (that is, the attributes of the class itself, not the attributes of any particular function object). So before you think it might be great to monkeypatch the builtin function type so that str worked the way you want on all functions: that doesn't work. It's probably for the best; modifying things as global and fundamental as the builtin function type would be a great way to invent some really bizarre bugs.
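A quick demonstration of that restriction (the exact error message varies across Python versions):

>>> g = lambda: None
>>> type(g).__str__ = lambda self: 'g'
Traceback (most recent call last):
  ...
TypeError: can't set attributes of built-in/extension type 'function'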
As Joran said:
class NamedFunction:
    def __init__(self, name, f):
        self.f = f
        self.name = name

    def __call__(self, *args, **kwargs):
        return self.f(*args, **kwargs)

    def __str__(self):
        return self.name

f = NamedFunction("lambda: 'blah'", lambda: 'blah')
print(f())
print(f)
class FNMagic:
    def __init__(self, fn, fn_name):
        self.fn = fn
        self.fn_name = fn_name

    def __call__(self, *args, **kwargs):
        return self.fn(*args, **kwargs)

    def __str__(self):
        return self.fn_name

def blah(x):
    return x

blah = FNMagic(blah, "blah!")
print(blah)
You could make a simple decorator:

class NamedFn:
    def __init__(self, name):
        self.fn_name = name

    def __call__(self, fn):
        return FNMagic(fn, self.fn_name)

@NamedFn("some_name!!!")
def blah2(x, y):
    return x * y

print(blah2)
As noted already, the answer is no. If you just want to get the name of a function as a string, you can use blah.__name__.
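For example:

>>> def blah(x):
...     pass
...
>>> blah.__name__
'blah'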

Writing decorator for pytest test method

Assuming the following structure:
class SetupTestParam(object):
    def setup_method(self, method):
        self.foo = bar()

    @pytest.fixture
    def some_fixture(self):
        self.baz = 'foobar'
I use SetupTestParam as a parent class for test classes.
class TestSomething(SetupTestParam):
    def test_a_lot(self, some_fixture):
        with self.baz as magic:
            with magic.fooz as more_magic:
                blah = more_magic.much_more_magic()  # repetitive bleh
                ...  # not repetitive code here
                assert spam == 'something cool'
Now, writing tests gets repetitive (with statement usage) and I would like to write a decorator to reduce the number of code lines. But there is a problem with pytest and the function signature.
I found the decorator library, which should be helpful, but I can't manage to get it to work.
I made a classmethod in my SetupTestParam class.
@classmethod
@decorator.decorator
def this_is_decorator(cls, f):
    def wrapper(self, *args, **kw):
        with self.baz as magic:
            with magic.fooz as more_magic:
                blah = more_magic.much_more_magic()  # repetitive bleh
                return f(self, *args)
    return wrapper
After I decorate the test_a_lot method, I receive the error TypeError: transaction_decorator() takes exactly 1 argument (2 given)
Can someone please explain what I am doing wrong? (I assume there is a problem with self from the test method?)
Chaining decorators is not the simplest thing to do. One solution might be to separate the two decorators. Keep the classmethod but move decorator.decorator to the end:
@classmethod
def this_is_decorator(cls, f):
    def wrapper(self, *args, **kw):
        with self.baz as magic:
            with magic.fooz as more_magic:
                blah = more_magic.much_more_magic()  # repetitive bleh
                return f(self, *args)
    return decorator.decorator(wrapper, f)
Maybe this works for you.
After some tweaking, and after realizing that I needed to pass a parameter to the decorator, I chose to write it as a class:
class ThisIsDecorator(object):
    def __init__(self, param):
        self.param = param  # parameter may vary with the function being decorated

    def __call__(self, fn):
        @wraps(fn)  # [1]
        def wrapper(fn, fn_self, *args):  # [2] fn_self refers to the original self param from function fn (test_a_lot)
            with fn_self.baz as fn_self.magic:  # pass magic via fn_self to make it accessible in function fn (test_a_lot)
                with fn_self.magic.fooz as more_magic:
                    blah = self.param.much_more_magic()  # repetitive bleh
                    return fn(fn_self, *args)
        return decorator.decorator(wrapper, fn)
[1] I use wraps to keep the original fn's __name__, __module__ and __doc__.
[2] The params passed to wrapper were self = <function test_a_lot at 0x24820c8>, args = (<TestSomething object at 0x29c77d0>, None, None, None, None), kw = {}, so I took args[0] as fn_self.
Original version (without passing a parameter):
@classmethod
def this_is_decorator(cls, fn):
    @wraps(fn)
    def wrapper(fn, fn_self, *args):
        with fn_self.baz as fn_self.magic:
            with fn_self.magic.fooz as more_magic:
                blah = more_magic.much_more_magic()  # repetitive bleh
                return fn(fn_self, *args)
    return decorator.decorator(wrapper, fn)
Thanks go to Mike Muller for pointing out the right direction.
Here's what happens, in time order, as this method is defined:
1. this_is_decorator is created (not called).
2. decorator.decorator(this_is_decorator) is called. This returns a new function which becomes this_is_decorator and has the same usage.
3. classmethod(this_is_decorator) is called, and the result of that is a classmethod that accepts (cls, f) and returns wrapper.
4. Later at runtime, a call to this_is_decorator will return wrapper.
But considering that this_is_decorator is a class method, it's not clear to me that this is what you want. I'm guessing that you may want something more like this:
from decorator import decorator

@decorator
def mydecorator(f):
    def wrapper(cls, *args, **kw):
        # ... logging, reporting, caching, whatever
        return f(*args, **kw)
    return wrapper

class MyClass(object):
    @classmethod
    @mydecorator
    def myclsmethod(a, b, c):
        # no cls or self value accepted here; this is a function, not a method
        # ...
        pass
Here your decorator is defined outside your class, because it's changing an ordinary function into a classmethod (and because you may want to use it in other places). The order of execution here is:
1. mydecorator is defined, not called.
2. decorator(mydecorator) is called, and the result becomes mydecorator.
3. Creation of MyClass starts.
4. myclsmethod is created. It is an ordinary function, not a method. (There is a difference within the VM, so that you do not have to explicitly supply cls or self arguments to methods.)
5. myclsmethod is passed to mydecorator (which has itself been decorated before) and the result (wrapper) is still a function, not a method.
6. The result of mydecorator is passed to classmethod, which returns an actual class method that is bound to MyClass.myclsmethod.
7. Definition of MyClass finishes.
Later when MyClass.myclsmethod(a, b, c) is called, wrapper executes, which then calls the original myclsmethod(a, b, c) function (which it knows as f) without supplying the cls argument.
Since you additionally need to preserve the argument list exactly, so that even the argument names are preserved in the decorated function (except for an extra initial cls argument), you could implement mydecorator this way:
from decorator import decorator
from inspect import getfullargspec

@decorator
def mydecorator(func):
    result = [None]  # necessary so exec can "return" objects
    namespace = {'f': func, 'result': result}
    source = []
    add = lambda indent, line: source.append(' ' * indent + line)  # shorthand
    arglist = ', '.join(getfullargspec(func).args)  # this does not cover keyword or default args
    add(0, 'def wrapper(cls, %s):' % (arglist,))
    add(2, 'return f(%s)' % (arglist,))
    add(0, 'result[0] = wrapper')  # this is how to "return" something from exec
    exec('\n'.join(source), namespace)
    return result[0]  # this is wrapper
It's kinda ugly, but this is the only way I know to dynamically set the argument list of a function based on data. If returning a lambda is OK, you could use eval instead of exec, which eliminates the need for an array that gets written into but is otherwise about the same.
