I'm trying to find a way to dynamically add methods to a class through a decorator. The decorator I have looks like:

from functools import wraps

def deco(target):
    def decorator(function):
        @wraps(function)
        def wrapper(self, *args, **kwargs):
            return function(*args, id=self.id, **kwargs)
        setattr(target, function.__name__, wrapper)
        return function
    return decorator
class A:
    pass

# in another module
@deco(A)
def compute(id: str):
    return do_compute(id)

# in another module
@deco(A)
def compute2(id: str):
    return do_compute2(id)

# in another module
a = A()
a.compute()   # this should work
a.compute2()  # this should work
My hope is that the decorator adds the compute() function to class A, so that any object of A has the compute() method. However, in my test, this only works if I explicitly import compute into the module where the object of A is created. I think I'm missing something obvious, but I don't know how to fix it. I'd appreciate any help!
I think this will be simpler using a decorator implemented as a class:

class deco:
    def __init__(self, cls):
        self.cls = cls
    def __call__(self, f):
        setattr(self.cls, f.__name__, f)
        return self.cls

class A:
    def __init__(self, val):
        self.val = val

@deco(A)
def compute(a_instance):
    print(a_instance.val)

A(1).compute()
A(2).compute()
outputs
1
2
But just because you can do it does not mean you should. This can become a debugging nightmare, and it will probably give any static code analyser or linter a hard time (PyCharm, for example, complains with Unresolved attribute reference 'compute' for class 'A').
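One way to quiet that linter warning, sketched here as my own illustrative variant rather than part of the answer, is to declare the attribute on the class with a bare annotation so static analysers know it will exist:

```python
from typing import Callable

class deco:
    def __init__(self, cls):
        self.cls = cls
    def __call__(self, f):
        setattr(self.cls, f.__name__, f)
        return self.cls

class A:
    compute: Callable  # declared so static analysers know the attribute will exist

    def __init__(self, val):
        self.val = val

@deco(A)
def compute(a_instance):
    print(a_instance.val)

A(7).compute()  # prints 7
```

The annotation adds no runtime attribute (it only populates `A.__annotations__`), so the decorator's `setattr` still does the real work.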
Why doesn't it work out of the box when we split it into different modules (more specifically, when compute is defined in another module)?
Assume the following:
a.py

print('importing deco and A')

class deco:
    def __init__(self, cls):
        self.cls = cls
    def __call__(self, f):
        setattr(self.cls, f.__name__, f)
        return self.cls

class A:
    def __init__(self, val):
        self.val = val
b.py

print('defining compute')

from a import A, deco

@deco(A)
def compute(a_instance):
    print(a_instance.val)
main.py
from a import A
print('running main')
A(1).compute()
A(2).compute()
If we execute main.py we get the following:
importing deco and A
running main
Traceback (most recent call last):
A(1).compute()
AttributeError: 'A' object has no attribute 'compute'
Something is missing: defining compute was never printed. Worse, compute is never defined, let alone bound to A.
Why? Because nothing triggered the execution of b.py. Just because the file sits there does not mean it gets executed.
We can force its execution by importing it. It feels kind of abusive to me, but it works because importing a file has a side effect: it executes every piece of code that is not guarded by if __name__ == '__main__', much like importing a package executes its __init__.py file.
main.py
from a import A
import b
print('running main')
A(1).compute()
A(2).compute()
outputs
importing deco and A
defining compute
running main
1
2
I'm trying to figure out how to decorate a test function in a way that makes the information from the decorator available to setUp. The code looks something like this:
import unittest

class MyTest(unittest.TestCase):
    def setUp(self):
        stopService()
        eraseAllPreferences()
        setTestPreferences()
        startService()

    @setPreference("abc", 5)
    def testPreference1(self):
        pass

    @setPreference("xyz", 5)
    def testPreference2(self):
        pass
The goal is for setUp to understand that it's running testPreference1 and to know that it needs to set preference "abc" to 5 before starting the service (and similarly for "xyz" and testPreference2).
I can of course just use a conditional on the test name (if self._testMethodName == "testPreference1"), but that doesn't feel as maintainable as the number of tests grows (and refactoring is more error-prone). I'm hoping to solve this in setUp rather than overriding the run implementation.
I'm running Python 3.6, although if there are creative solutions that depend on newer Python features I'm happy to learn about those too.
Decorators work well here, but there's no real "official" way to get the underlying test method, so I just did what the unittest source does: method = getattr(self, self._testMethodName)
import functools
import unittest

def setFoo(value):
    def inner(func):
        print(f"Changing foo for function {func}")
        func.foo = value
        @functools.wraps(func)
        def wrapper(self, *args, **kwargs):
            return func(self, *args, **kwargs)
        return wrapper
    return inner

class Foo(unittest.TestCase):
    def setUp(self):
        method = getattr(self, self._testMethodName)
        print(f"Foo = {method.foo}")

    @setFoo("abc")
    def testFoo(self):
        self.assertEqual(self.testFoo.foo, "abc")

    @setFoo("xyz")
    def testBar(self):
        self.assertEqual(self.testBar.foo, "xyz")

if __name__ == "__main__":
    unittest.main()
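Since `functools.wraps` copies `func.__dict__` onto the wrapper, the pass-through wrapper adds nothing here. A minimal sketch of the same idea without it (my simplification, not from the answer) attaches the value directly and returns the function unchanged:

```python
import unittest

def setFoo(value):
    def inner(func):
        func.foo = value  # stash the metadata on the test method itself
        return func       # no wrapper needed
    return inner

class Foo(unittest.TestCase):
    def setUp(self):
        # Look up the test method about to run, as unittest itself does.
        method = getattr(self, self._testMethodName)
        self.current_foo = method.foo

    @setFoo("abc")
    def testFoo(self):
        self.assertEqual(self.current_foo, "abc")

    @setFoo("xyz")
    def testBar(self):
        self.assertEqual(self.current_foo, "xyz")
```

Attribute lookup on a bound method falls through to the underlying function, so `method.foo` works in setUp.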
How can I execute code using a class with a "with" statement, but not execute that code until a certain function is run?
For example if I have the class:
class Foo:
    def __init__(self):
        self.bar = "baz"
    def __enter__(self):
        return self
    def __exit__(self, x, y, z):
        return None
    def run(self):
        # Start running code inserted into __enter__?
        pass
And I used the class like:
bar = Foo()
with bar as f:
    print(f.bar)
    # Don't execute this code until run() is called

bar.run()
# Now execute the code that was inserted into the with statement above
Is this possible?
I don't think you can do that the way your code is currently written. However, you effectively want bar.run() to be called from Foo.__enter__:
class Foo:
    def __init__(self):
        self.bar = "baz"
    def __enter__(self):
        self.run()
        return self
    def __exit__(self, x, y, z):
        return None
    def run(self):
        # The work you wanted to defer now happens on __enter__.
        pass

bar = Foo()
with bar as f:
    print(f.bar)
Nope: the code inside the with block is executed only as part of the with statement itself. It is not, for example, passed as a closure to the context manager.
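That execution order is easy to confirm with a throwaway context manager (a minimal illustration, not part of the answer):

```python
from contextlib import contextmanager

@contextmanager
def traced():
    print("enter")   # runs at the start of the with statement
    yield
    print("exit")    # runs when the with block finishes

with traced():
    print("body")    # the block body runs inline, between enter and exit
# prints: enter, body, exit
```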
If you want that behaviour, you could use a decorator and a local function definition instead of a with block.
class Foo:
    def __init__(self, callback):
        self.callback = callback
        self.bar = 'baz'
    def run(self):
        # Setup goes here.
        try:
            self.callback(self)
        finally:
            # Cleanup goes here.
            pass

@Foo
def bar(f):
    print(f.bar)

bar.run()
That's as close to your example as I can get, names included. Applying the constructor directly as a decorator is a bit ungainly; you might prefer a separate decorator function that builds and returns a class instance, but you get the general idea.
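That "separate decorator function" variant could look like the sketch below; the name run_with_foo is hypothetical, and the Foo class is the same as in the answer:

```python
class Foo:
    def __init__(self, callback):
        self.callback = callback
        self.bar = 'baz'

    def run(self):
        # Setup would go here.
        try:
            self.callback(self)
        finally:
            pass  # Cleanup would go here.

def run_with_foo(f):
    """Wrap f in a Foo and execute it immediately."""
    foo = Foo(f)
    foo.run()
    return foo  # the decorated name now refers to the Foo instance

@run_with_foo
def bar(f):
    print(f.bar)
# prints: baz
```

This runs the "block" once at definition time, and `bar.run()` can re-run it later.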
I am writing a class that sends Slack messages to users when processes have finished. I thought it would be useful to provide a Jupyter magic so that users can be notified when the cell has executed.
The class already provides a decorator, so I figured I'd just wrap a cell execution in a decorated function.
from IPython.core.magic import register_cell_magic
from IPython import get_ipython
import functools

class MyClass(object):
    def decorate(self, f):
        @functools.wraps(f)
        def wrapped(*args, **kwargs):
            r = f(*args, **kwargs)
            print('Send a message here!')
            return r
        return wrapped

    @register_cell_magic
    def magic(self, line, cell):
        ip = get_ipython()
        @self.decorate
        def f():
            return ip.run_cell(cell)
        return f()
So then I'd do:
obj = MyClass()
# ----- NEW CELL
%%obj.magic
'''do some stuff'''
But I get
>> UsageError: Cell magic `%%obj.magic` not found.
I found out that the magic is registered under its name (above, magic), so %%magic works. But then the arguments are all messed up because there is no self in the mix.
I want the magic to be an instance method so that config (set in __init__) can be used. Is there any way to do this?
Here are a couple hacky solutions I don't want to implement unless I really have to:
Register a regular function with the instance as an argument. I don't want to add that line of code to the notebook, I want to use an instance method.
Register a regular function that constructs an instance on the fly.
This is the best I can come up with, and it's #1 on the list of the things I didn't want to do.
from IPython.core.magic import register_cell_magic
from IPython import get_ipython
import functools

class MyClass(object):
    def decorate(self, f):
        @functools.wraps(f)
        def wrapped(*args, **kwargs):
            r = f(*args, **kwargs)
            print('Send a message here!')
            return r
        return wrapped

    def register_magic(self):
        @register_cell_magic
        def magic(line, cell):
            ip = get_ipython()
            @self.decorate
            def f():
                return ip.run_cell(cell)
            return f()
Then
obj = MyClass()
obj.register_magic()
# ------
%%magic
...
Suppose you want to call a method foo on object bar, but while typing the invocation you intuitively treated foo as a property and typed bar.foo instead of bar.foo() (with parentheses). Both are syntactically correct, so no error is raised, but they are semantically very different. This has happened to me several times already (my experience in Ruby makes it even worse) and has cost me dearly in long and confusing debugging sessions.
Is there a way to make Python interpreter print a warning in such cases - whenever you access an attribute which is callable, but you haven't actually called it?
For the record - I thought about overriding __getattribute__ but it's messy and ultimately won't achieve the goal since function invocation via () happens after __getattribute__ has returned.
This can't be done in all cases because sometimes you don't want to call the method, e.g. you might want to store it as a callable to be used later, like callback = object.method.
But you can use static analysis tools such as pylint or PyCharm (my recommendation) that warn you if you write a statement that looks pointless, e.g. object.method without any assignment.
Furthermore if you write x = obj.get_x but meant get_x(), then later when you try to use x a static analysis tool may be able to warn you (if you're lucky) that x is a method but an instance of X is expected.
It was quite challenging, but I think I got it done! My code isn't very complicated, but you need to be well aware of metaclasses.
Metaclass and wrapper (WarnIfNotCalled.py):
class Notifier:
    def __init__(self, name, obj, callback):
        self.callback = callback
        self.name = name
        self.obj = obj
        self.called = False
    def __call__(self, *args, **kwargs):
        self.called = True
        return self.callback(self.obj, *args, **kwargs)
    def __del__(self):
        if not self.called:
            print("Warning! {} function hasn't been called!".format(self.name))

class WarnIfNotCalled(type):
    def __new__(cls, name, bases, dct):
        dct_func = {}
        # Note: the loop variable must not shadow the class name argument.
        for attr_name, val in dct.copy().items():
            if attr_name.startswith('__') or not callable(val):
                continue
            dct_func[attr_name] = val
            del dct[attr_name]
        def __getattr__(self, attr_name):
            if attr_name in dct_func:
                return Notifier(attr_name, self, dct_func[attr_name])
            raise AttributeError(attr_name)
        dct['__getattr__'] = __getattr__
        return super(WarnIfNotCalled, cls).__new__(cls, name, bases, dct)
It's very easy to use - just specify the metaclass:

from WarnIfNotCalled import WarnIfNotCalled

class A(metaclass=WarnIfNotCalled):
    def foo(self):
        print("foo has been called")
    def bar(self, x):
        print("bar has been called and x =", x)
If you didn't forget to call these functions, everything works as usual
a = A()
a.foo()
a.bar(5)
Output:
foo has been called
bar has been called and x = 5
But if you DID forget:
a = A()
a.foo
a.bar
You see the following
Warning! foo function hasn't been called!
Warning! bar function hasn't been called!
Happy debugging!
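The same idea can be sketched without a metaclass by overriding `__getattr__` directly. This variant is my own illustration, not from the answer; it relies on CPython's prompt reference counting to fire `__del__`, and exposes underscore-prefixed methods through the warning wrapper:

```python
class Notifier:
    """Wraps a method; complains on destruction if it was never invoked."""
    def __init__(self, name, func):
        self.name = name
        self.func = func
        self.called = False

    def __call__(self, *args, **kwargs):
        self.called = True
        return self.func(*args, **kwargs)

    def __del__(self):
        if not self.called:
            print(f"Warning! {self.name} was accessed but never called!")

class A:
    def _foo(self):
        return "foo result"

    def __getattr__(self, name):
        # Expose underscore-prefixed methods through warning wrappers.
        target = '_' + name
        if hasattr(type(self), target):
            return Notifier(name, getattr(self, target))
        raise AttributeError(name)

a = A()
print(a.foo())  # invoked normally: no warning
a.foo           # never invoked: warning printed when the Notifier is collected
```

On other Python implementations (PyPy, for example) `__del__` may run much later, so the warning would not appear at the offending line.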
Let's consider this piece of code, where I would like to create bar dynamically with a decorator:
def foo():
    def bar():
        print("I am bar from foo")
    bar()

def baz():
    def bar():
        print("I am bar from baz")
    bar()
I thought I could create bar from the outside with a decorator:
def bar2():
    print("I am super bar from foo")

setattr(foo, 'bar', bar2)

But the result is not what I was expecting (I would like to get I am super bar from foo):
>>> foo()
I am bar from foo
Is it possible to override a sub-function on an existing function with a decorator?
The actual use case
I am writing a wrapper for a library and to avoid boilerplate code I would like to simplify my work.
Each library function has a prefix lib_ and returns an error code. I would like to add the prefix to the current function and treat the error code. This could be as simple as this:
def call(*args):
    fname = __libprefix__ + inspect.stack()[1][3]
    return_code = getattr(__lib__, fname)(*args)
    if return_code < 0: raise LibError(fname, return_code)

def foo():
    call()
The problem is that call might act differently in certain cases. Some library functions do not return an error_code, so it would be easier to write it like this:
def foo():
    call(check_status=True)
Or much better in my opinion (this is the point where I started thinking about decorators):
@LibFunc(check_status=True)
def foo():
    call()
In this last example I should declare call inside foo as a sub-function created dynamically by the decorator itself.
The idea was to use something like this:
class LibFunc(object):
    def __init__(self, **kwargs):
        self.kwargs = kwargs
    def __call__(self, original_func):
        decorator_self = self
        def wrappee(*args, **kwargs):
            def call(*args):
                fname = __libprefix__ + original_func.__name__
                return_code = getattr(__lib__, fname)(*args)
                if return_code < 0: raise LibError(fname, return_code)
            print(original_func)
            print(call)
            # <<<< The part that does not work
            setattr(original_func, 'call', call)
            # <<<<
            original_func(*args, **kwargs)
        return wrappee
Initially I was tempted to invoke call inside the decorator itself to minimize the boilerplate:
@LibFunc()
def foo(): pass
Unfortunately, this is not an option, since other things should sometimes be done before and after the call:
@LibFunc()
def foo(a, b):
    value = c_float()
    call(a, pointer(value), b)
    return value.value
Another option that I thought about was to use SWIG, but again this is not an option because I will need to rebuild the existing library with the SWIG wrapping functions.
And last but not least, I may get inspiration from SWIG typemaps and declare my wrapper as this:
@LibFunc(check_exit=True, map=('<a', '>c_float', '<c_int(b)'))
def foo(a, b): pass
This looks like the best solution to me, but this is another topic and another question...
Are you married to the idea of a decorator? Because if your goal is a bunch of module-level functions, each of which wraps somelib.lib_somefunctionname, I don't see why you need one.
Those module-level names don't have to be functions, they just have to be callable. They could be a bunch of class instances, as long as they have a __call__ method.
I used two different subclasses to determine how to treat the return value:
#!/usr/bin/env python3

import libtowrap  # Replace with the real library name.

class Wrapper(object):
    '''
    Parent class for all wrapped functions in libtowrap.
    '''
    def __init__(self, name):
        self.__name__ = str(name)
        self.wrapped_name = 'lib_' + self.__name__
        self.wrapped_func = getattr(libtowrap, self.wrapped_name)
        self.__doc__ = self.wrapped_func.__doc__
        return

class CheckedWrapper(Wrapper):
    '''
    Wraps functions in libtowrap that return an error code that must
    be checked. Negative return values indicate an error, and will
    raise a LibError. Successful calls return None.
    '''
    def __call__(self, *args, **kwargs):
        error_code = self.wrapped_func(*args, **kwargs)
        if error_code < 0:
            raise LibError(self.__name__, error_code)
        return

class UncheckedWrapper(Wrapper):
    '''
    Wraps functions in libtowrap that return a useful value, as
    opposed to an error code.
    '''
    def __call__(self, *args, **kwargs):
        return self.wrapped_func(*args, **kwargs)

strict = CheckedWrapper('strict')
negative_means_failure = CheckedWrapper('negative_means_failure')
whatever = UncheckedWrapper('whatever')
negative_is_ok = UncheckedWrapper('negative_is_ok')
Note that the wrapper "functions" are assigned while the module is being imported. They are in the top-level module namespace, and not hidden by any if __name__ == '__main__' test.
They will behave like functions for most purposes, but there will be minor differences. For example, I gave each instance a __name__ that matches the name they're assigned to, not the lib_-prefixed name used in libtowrap... but I copied the original __doc__, which might refer to a prefixed name like lib_some_other_function. Also, testing them with isinstance will probably surprise people.
For more about decorators, and for many more annoying little discrepancies like the ones mentioned above, see Graham Dumpleton's half-hour lecture "Advanced Methods for Creating Decorators" (PyCon US 2014; slides). He is the author of the wrapt module (Python Package Index; GitHub; Read the Docs), which corrects all(?) of the usual decorator inconsistencies. It might solve your problem entirely (except for the old lib_-style names showing up in __doc__).