I'm new to Python and I have a bunch of functions that perform various tasks on some hardware. Each function takes a different number of parameters and returns different values.
I want to make a kind of generic "retry" wrapper function that will catch an exception from any of my functions and do some error handling (such as retrying the task).
From what I understand I should be able to use a decorator function as a generic wrapper for each of my functions. That seems to work, but I don't seem to be able to actually get any of the exceptions from the function being called from within my decorator function.
I've looked at various examples and come up with this:
def retry(function):
    def _retry(*args, **kwargs):
        try:
            reply = function(*args, **kwargs)
            print "reply: ", reply
            return reply
        except PDError as msg:
            print "_retry", msg
        except:
            print "_retry: another error"
    return _retry
Then I call it using the name of one of my functions:
value = retry(pd.command_get_parameter(0x00))
It seems to call my function and return correctly, but the exceptions are never caught within my retry function. So I can't handle an error and do a retry.
I've also tried this:
from functools import wraps

def retry(function):
    @wraps(function)
    def _retry(*args, **kwargs):
        .....
I'm not sure what I'm doing wrong, or if this is even the best way to be doing this. Does anyone have a suggestion on how to do this? I don't really want to have to make separate "retry" functions for each of my main functions.
Converting my comment to an answer:
You should be using it like this:
def retry(function):
    @wraps(function)
    def _retry(*args, **kwargs):
        try:
            reply = function(*args, **kwargs)
            print "reply: ", reply
            return reply
        except PDError as msg:
            print "_retry", msg
        except:
            print "_retry: another error"
    return _retry

class SomeClass(object):
    @retry
    def command_get_parameter(..):
        return <some value>

s = SomeClass()
result = s.command_get_parameter(..)  # retry(..) actually invokes this function.
Decorators take in a function and return a decorated function. The decorated function can do something before the original function is invoked, do something after it, catch its exceptions, and so on. If you use the @retry syntax, the interpreter calls retry(..), passes in the function object (command_get_parameter), and replaces the function with the one returned by retry(command_get_parameter).
What's going on is somewhat similar to the following steps (pseudocode):

new_command_get_parameter = retry(command_get_parameter)  # @retry has this effect
result = new_command_get_parameter(your_input)

The difference is that the above two steps are done for you by the interpreter, keeping the code cleaner and more readable.
Currently you are invoking the function and passing its result to retry(..), which is wrong. Furthermore, it won't catch exceptions the way you want it to.
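Since the original goal was to retry the hardware call, the wrapper itself can loop a few times before giving up. A minimal sketch, assuming the PDError exception from the question and an arbitrary limit of three attempts:

from functools import wraps

def retry(function):
    @wraps(function)
    def _retry(*args, **kwargs):
        last_error = None
        for attempt in range(3):  # assumed retry limit
            try:
                return function(*args, **kwargs)
            except PDError as err:  # PDError comes from the question's code
                print "attempt", attempt + 1, "failed:", err
                last_error = err
        raise last_error  # all attempts failed; re-raise the last error
    return _retry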
Update: If you want the decorator to access instance variables, all you have to do is let _retry take its first parameter as self. Something like:
def retry(func):
    def _retry(self, *args, **kwargs):
        print "Decorator printing a:", self.a
        print "Decorator printing b:", self.b
        try:
            return func(*args, **kwargs)
        except Exception as e:
            print "Caught exception"
            return "Grr.."
    return _retry
class Temp(object):
    def __init__(self, a, b):
        self.a = a
        self.b = b

    @retry
    def command(self, *args, **kwargs):
        print "In command."
        print "Args:", args
        print "KWargs:", kwargs
        raise Exception("DIE!")

t = Temp(3, 5)
print t.command(3, 4, 5, a=4, b=8)
Output:
Decorator printing a: 3
Decorator printing b: 5
In command.
Args: (4, 5)
KWargs: {'a': 4, 'b': 8}
Caught exception
Grr..
Related
I'm quite new to decorators and I'm trying to build a decorator with an argument that should work both as a decorator and a stand-alone function. The basic idea is to raise an error if some condition is not satisfied. Example:
ensure_condition("fail")  # exception should be raised
ensure_condition("pass")  # nothing should happen

@ensure_condition("fail")  # check condition before every `func` call
def f1():
    return 1
I thought about doing this:
def ensure_condition(arg: str):
    if not _validate(arg):
        raise Exception("failed")

    def ensure_condition_decorator(f=lambda *_: None):
        def wrapper(*args, **kwargs):
            return f(*args, **kwargs)
        return wrapper

    return ensure_condition_decorator
But with the above, the _validate function is also called when the f1 function is declared, not only when it is executed.
Any other ideas?
Thanks!
Say I have a function F that may return boolean false. If I have a caller named main() that will call F in multiple places, can I attach a decorator to F that will propagate the return value and cause its parent (main) to also exit early?
No function can 'return' to a context higher than its caller; this (to my knowledge) is true in most programming languages. You could probably hack it by inspecting the Python state and call stack, but a much better and more appropriate solution is to wrap main in a try/except block that catches a custom exception, which you raise inside the decorator depending on the output of F():
import random
from functools import wraps

class ShortCircuit(Exception):
    pass

def short_circuit(f):
    @wraps(f)
    def wrapped(*args, **kwargs):
        res = f(*args, **kwargs)
        if not res:
            raise ShortCircuit()
        else:
            return res
    return wrapped

@short_circuit
def F():
    return random.choice([True, False])

def main():
    print(F())
    print(F())

if __name__ == "__main__":
    try:
        main()
    except ShortCircuit:
        print("short circuited")
I have a class with plenty of static methods decorated with Tornado's coroutine decorator, and I want to add another decorator to catch exceptions and write them to a file:
# my decorator
def lifesaver(func):
    def silenceit(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as ex:
            # collect info and format it
            res = ' ... '
            # writelog(res)
            print(res)
            return None
    return silenceit
However, it doesn't work with gen.coroutine decorator:
class SomeClass:
    # This doesn't work!
    # I tried to pass decorators in different orders,
    # but got no result.
    @staticmethod
    @lifesaver
    @gen.coroutine
    @lifesaver
    def dosomething1():
        raise Exception("Test error!")

    # My decorator works well
    # if it is used without gen.coroutine.
    @staticmethod
    @gen.coroutine
    def dosomething2():
        SomeClass.dosomething3()

    @staticmethod
    @lifesaver
    def dosomething3():
        raise Exception("Test error!")
I understand that Tornado uses the raise Return(...) approach, which is probably based on exceptions, and maybe it somehow interferes with the try/except blocks of other decorators... So, how can I use my decorator to handle exceptions with Tornado coroutines?
The answer
Thanks to Martijn Pieters, I got this code working:
def lifesaver(func):
    def silenceit(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except (gen.Return, StopIteration):
            raise
        except Exception as ex:
            # collect info and format it
            res = ' ... '
            # writelog(res)
            print(res)
            raise gen.Return(b"")
    return silenceit
So, I only needed to pass through Tornado's Return. I tried to add the @gen.coroutine decorator to the silenceit function and use yield in it, but that leads to Future objects of Future objects and some other strange, unpredictable behaviour.
You are decorating the output of gen.coroutine, because decorators are applied from bottom to top (as they are nested inside one another from top to bottom).
Rather than decorate the coroutine, decorate your function and apply the gen.coroutine decorator to that result:
@gen.coroutine
@lifesaver
def dosomething1():
    raise Exception("Test error!")
Your decorator can't really handle the output that a @gen.coroutine-decorated function produces. Tornado relies on exceptions to communicate results (because in Python 2, generators can't use return to return results). You need to make sure you pass through the exceptions Tornado relies on. You should also re-wrap your wrapper function:
from tornado import gen

def lifesaver(func):
    @gen.coroutine
    def silenceit(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except (gen.Return, StopIteration):
            raise
        except Exception as ex:
            # collect info and format it
            res = ' ... '
            # writelog(res)
            print(res)
            raise gen.Return(b"")
    return silenceit
On exception, an empty Return() object is raised; adjust this as needed.
Do yourself a favour and don't use a class just to put staticmethod functions in it. Put those functions at the top level of the module instead. Classes are there to combine methods and shared state, not to create a namespace; use modules to create namespaces instead.
I was wondering, is there a simple magic method in Python that allows customization of the behaviour of an exception-derived object when it is raised? I'm looking for something like __raise__, if that exists. If no such magic method exists, is there any way I could do something like the following (it's just an example to prove my point)?
class SpecialException(Exception):
    def __raise__(self):
        print('Error!')

raise SpecialException()  # this is the part of the code that must stay
Is it possible?
I don't know about such a magic method, but even if it existed it would just be some piece of code that gets executed before actually raising the exception object. Assuming that it's good practice to raise exception objects that are instantiated in place, you can put such code into the __init__ of the exception. Another workaround: instead of raising your exception directly, call an error-handling method/function that executes the special code and then finally raises the exception.
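A minimal sketch of the __init__ workaround described above; the print is just a stand-in for whatever side effect is wanted:

class SpecialException(Exception):
    def __init__(self, *args):
        # Runs when the exception object is created, which for
        # `raise SpecialException()` is immediately before it is raised.
        print('Error!')
        super(SpecialException, self).__init__(*args)

raise SpecialException()  # the raise site stays unchanged

The second workaround is just an ordinary helper function that does the logging or printing and then raises, so it needs no special support from the exception class.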
import time
from functools import wraps

def capture_exception(callback=None, *c_args, **c_kwargs):
    """Run a callback after an exception is caught."""
    assert callable(callback), "callback must be callable"
    def _out(func):
        @wraps(func)
        def _inner(*args, **kwargs):
            try:
                res = func(*args, **kwargs)
                return res
            except Exception as e:
                callback(*c_args, **c_kwargs)
                raise e
        return _inner
    return _out

def send_warning():
    print("warning message..............")

class A(object):
    @capture_exception(callback=send_warning)
    def run(self):
        print('run')
        raise SystemError("testing the exception-capture callback")
        time.sleep(0.2)

if __name__ == '__main__':
    a = A()
    a.run()
I'm writing a program in Python, and nearly every method in my class is written like this:
def someMethod(self):
    try:
        #...
    except someException:
        # in case of exception, do something here,
        # e.g. display a dialog box to inform the user
        # that he has done something wrong
As the class grows, it is a little bit annoying to write the same try-except block over and over. Is it possible to create some sort of 'global' exception for the whole class? What's the recommended way in Python to deal with this?
Write one or more exception-handler functions that, given a function and the exception raised in it, do what you want to do (e.g. display an alert). If you need more than one, write them.
def message(func, e):
    print "Exception", type(e).__name__, "in", func.__name__
    print str(e)
Now write a decorator that applies a given handler to a called function:
import functools

def handle_with(handler, *exceptions):
    try:
        handler, cleanup = handler
    except TypeError:
        cleanup = lambda f, e: None
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except exceptions or Exception as e:
                return handler(func, e)
            else:
                e = None
            finally:
                cleanup(func, e)
        return wrapper
    return decorator
This only captures the exceptions you specify. If you don't specify any, Exception is caught. Additionally, the first argument can be a tuple (or other sequence) of two handler functions; the second handler, if given, is called in a finally clause. The value returned from the primary handler is returned as the value of the function call.
Now, given the above, you can write:
@handle_with(message, TypeError, ValueError)
def add(x, y):
    return x + y
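For example (a sketch of the expected behaviour, not part of the original answer), a call that raises one of the listed exceptions is routed to message, and the decorated call returns whatever message returns (None here):

add(1, 2)    # returns 3; no handler involved
add("1", 2)  # raises TypeError inside add; message() prints the details and the call returns None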
You could also do this with a context manager:
from contextlib import contextmanager
@contextmanager
def handler(handler, *exceptions):
    try:
        handler, cleanup = handler
    except TypeError:
        cleanup = lambda e: None
    try:
        yield
    except exceptions or Exception as e:
        handler(e)
    else:
        e = None
    finally:
        cleanup(e)
Now you can write:
def message(e):
    print "Exception", type(e).__name__
    print str(e)

def add(x, y):
    with handler(message, TypeError, ValueError):
        return x + y
Note that the context manager doesn't know what function it's in (you can find this out, sorta, using inspect, though this is "magic" so I didn't do it) so it gives you a little less useful information. Also, the context manager doesn't give you the opportunity to return anything in your handler.
I can think of two options:

1. Write a decorator that can wrap each method in the try block.
2. Write a "dispatcher" method that calls the appropriate method inside a try block, then call that method instead of the individual ones. That is, instead of calling obj.someMethod() or obj.otherMethod(), you call obj.dispatch('someMethod') or obj.dispatch('otherMethod'), where dispatch is a wrapper that contains the try block (a sketch follows below).
Your approach seems like a bit of a strange design, though. It might make more sense to have the dialog-box stuff in some other part of the code, some higher-level event loop that catches errors and displays messages about them.