I want to use a decorator (a composer) that receives n decorators as parameters; these decorators will be used to decorate a function. I also want to pass in parameters from two origins: a parameter named "SKIP" given to the composer, and a parameter named "parameter" sent by the parameter_sender decorator. Here's what I tried:
def compose(*decorators, SKIP=None):
    def something(func):
        @wraps(func)
        def func_wrap(parameter=None, **kwargs):
            try:
                if SKIP:
                    print("I'm here")
                    return func(parameter=parameter, **kwargs)
                else:
                    for decorator in reversed(decorators):
                        func = decorator(func, parameter=parameter, **kwargs)  # --------- This line is providing the error ------------------
                    return func
                raise exception
            except Exception as e:
                print(e)
                raise exception
        return func_wrap
    return something
And here is an example of where I want to use it. In this example I want to SKIP the composing of all the decorators if the variable SKIP is true.
@application.route("/function/<id_something>", methods=['GET'])
@parameter_sender
@compose(decorator_1, decorator_2, SKIP=True)
def function(id_something, **kwargs):
    try:
        # TODO:
        return jsonify("ok")
    except Exception as e:
        print(e)
But I've got an error that says this:
>> I'm here
>> local variable 'func' referenced before assignment
even though the if statement is working. PS: it works without the line indicated in the composer.
The following code should do the trick.
You were trying to assign to a variable from the outer scope (func). Because func is assigned inside func_wrap, Python treats it as a local variable everywhere in func_wrap, which is why even the SKIP branch fails. In my example I use a separate temporary variable, composition, instead.
from functools import wraps

def compose(*decorators, SKIP=None):
    def something(func):
        @wraps(func)
        def func_wrap(*args, **kwargs):
            try:
                if SKIP:
                    print("I'm here")
                    return func(*args, **kwargs)
                else:
                    composition = func
                    for decorator in reversed(decorators):
                        composition = decorator(composition)
                    return composition(*args, **kwargs)
            except Exception as e:
                print(e)
                raise
        return func_wrap
    return something
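For illustration, here is a minimal usage sketch: decorator_1 and decorator_2 below are made-up stand-ins, not the decorators from the question. With SKIP=False both of them run; with SKIP=True they are bypassed and the function is called directly:

def decorator_1(func):                    # hypothetical decorator for the example
    def wrapper(*args, **kwargs):
        print("decorator_1 running")
        return func(*args, **kwargs)
    return wrapper

def decorator_2(func):                    # hypothetical decorator for the example
    def wrapper(*args, **kwargs):
        print("decorator_2 running")
        return func(*args, **kwargs)
    return wrapper

@compose(decorator_1, decorator_2, SKIP=False)
def greet(name):
    return "hello " + name

print(greet("world"))   # decorator_1 running, decorator_2 running, hello world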
I have a function "main_func", from which I am calling another function, and so on:
class Error(Exception):
    """Base class for other exceptions"""
    pass

def main_func():
    return sub_func()

def sub_func():
    return sub_sub_func()

def sub_sub_func():
    return sub_sub_sub_func()

def sub_sub_sub_func():
    try:
        x = len(10)
        res = 'b'
    except:
        raise Error
    return res

main_func()
As you can see, in sub_sub_sub_func() I have added a line x = len(10), which will cause an exception.
What I want is: if this happens, I should jump directly back to main_func() and return a flag (str) of 'fail'.
I looked into defining custom exceptions, but it didn't help me.
I want to return after I raise.
len(10) will raise a TypeError. You can catch this specific exception in your main_func and do whatever needs to happen there.
Note that it is good practice to raise an instance of your error class: Error().
class Error(Exception):
    """Base class for other exceptions"""
    pass

def main_func():
    try:
        return sub_func()
    except Error as e:
        # The raised error will be caught here.
        # Do the stuff that needs to happen here.
        return 'fail'

def sub_func():
    return sub_sub_func()

def sub_sub_func():
    return sub_sub_sub_func()

def sub_sub_sub_func():
    try:
        x = len(10)  # Will raise a `TypeError`
        res = 'b'
    except:
        # The `TypeError` that is raised ends up here
        raise Error()
    return res

main_func()
Note: your custom Error hides a lot of information that can come in handy later: what happened, and what raised this error. It is best to attach the original TypeError to Error as an inner exception:

try:
    x = len(10)
except Exception as e:
    raise Error(e)

In theory, in your code even a potential out-of-memory error would be converted to your Error without any record of what actually happened.
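In Python 3 you can make this chaining explicit with raise ... from, which keeps the original traceback attached to the custom error as __cause__; a small sketch reusing the Error class from above:

def sub_sub_sub_func():
    try:
        x = len(10)              # raises TypeError
        return 'b'
    except TypeError as e:
        # `from e` stores the TypeError as __cause__ of the new Error
        raise Error('sub_sub_sub_func failed') from e

try:
    sub_sub_sub_func()
except Error as e:
    print(repr(e))               # Error('sub_sub_sub_func failed')
    print(repr(e.__cause__))     # the original TypeError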
I'm interacting with a lot of deeply nested JSON I didn't write, and would like to make my Python script more 'forgiving' of invalid input. I find myself writing involved try-except blocks, and would rather just wrap the dubious function up.
I understand it's bad policy to swallow exceptions, but I'd rather have them printed and analysed later than actually stop execution. In my use case it's more valuable to continue executing over the loop than to get all the keys.
Here's what I'm doing now:
try:
    item['a'] = myobject.get('key').METHOD_THAT_DOESNT_EXIST()
except:
    item['a'] = ''

try:
    item['b'] = OBJECT_THAT_DOESNT_EXIST.get('key2')
except:
    item['b'] = ''

try:
    item['c'] = func1(ARGUMENT_THAT_DOESNT_EXIST)
except:
    item['c'] = ''

...

try:
    item['z'] = FUNCTION_THAT_DOESNT_EXIST(myobject.method())
except:
    item['z'] = ''
Here's what I'd like, (1):
item['a'] = f(myobject.get('key').get('subkey'))
item['b'] = f(myobject.get('key2'))
item['c'] = f(func1(myobject))
...
or (2):
@f
def get_stuff():
    item = {}
    item['a'] = myobject.get('key').get('subkey')
    item['b'] = myobject.get('key2')
    item['c'] = func1(myobject)
    ...
    return(item)
...where I can wrap either the single data item (1), or a master function (2), in some function that turns execution-halting exceptions into empty fields, printed to stdout. The former would be sort of an item-wise skip - where that key isn't available, it logs blank and moves on - the latter is a row-skip, where if any of the fields don't work, the entire record is skipped.
My understanding is that some kind of wrapper should be able to fix this. Here's what I tried, with a wrapper:
def f(func):
    def silenceit():
        try:
            func(*args, **kwargs)
        except:
            print('Error')
        return(silenceit)
Here's why it doesn't work. If I call it on a function that doesn't exist, it doesn't try-catch it away:
>>> f(meow())
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'meow' is not defined
Before I even add a blank return value, I'd like to get it to try-catch correctly. If the wrapper had worked, this would have printed "Error", right?
Is a wrapper function the correct approach here?
UPDATE
I've had a lot of really useful, helpful answers below, and thank you for them, but I've edited the examples I used above to illustrate that I'm trying to catch more than nested key errors, and that I'm looking specifically for a function that wraps a try-catch for...
When a method doesn't exist.
When an object doesn't exist, and is getting a method called on it.
When an object that does not exist is being called as an argument to a function.
Any combination of any of these things.
Bonus, when a function doesn't exist.
There are lots of good answers here, but I didn't see any that address the question of whether you can accomplish this via decorators.
The short answer is "no," at least not without structural changes to your code. Decorators operate at the function level, not on individual statements. Therefore, in order to use decorators, you would need to move each of the statements to be decorated into its own function.
But note that you can't just put the assignment itself inside the decorated function. You need to return the rhs expression (the value to be assigned) from the decorated function, then do the assignment outside.
To put this in terms of your example code, one might write code with the following pattern:
@return_on_failure('')
def computeA():
    return myobject.get('key').METHOD_THAT_DOESNT_EXIST()

item["a"] = computeA()
return_on_failure could be something like:
def return_on_failure(value):
    def decorate(f):
        def applicator(*args, **kwargs):
            try:
                return f(*args, **kwargs)
            except:
                print('Error')
                return value
        return applicator
    return decorate
You could use a defaultdict and the context manager approach as outlined in Raymond Hettinger's PyCon 2013 presentation
from collections import defaultdict
from contextlib import contextmanager

@contextmanager
def ignored(*exceptions):
    try:
        yield
    except exceptions:
        pass

item = defaultdict(str)
obj = dict()

with ignored(Exception):
    item['a'] = obj.get(2).get(3)
print item['a']

obj[2] = dict()
obj[2][3] = 4

with ignored(Exception):
    item['a'] = obj.get(2).get(3)
print item['a']
It's very easy to achieve using a configurable decorator.
def get_decorator(errors=(Exception, ), default_value=''):
    def decorator(func):
        def new_func(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except errors, e:
                print "Got error! ", repr(e)
                return default_value
        return new_func
    return decorator

f = get_decorator((KeyError, NameError), default_value='default')
a = {}

@f
def example1(a):
    return a['b']

@f
def example2(a):
    return doesnt_exist()

print example1(a)
print example2(a)
Just pass get_decorator a tuple of the error types you want to silence and the default value to return.
The output will be:
Got error! KeyError('b',)
default
Got error! NameError("global name 'doesnt_exist' is not defined",)
default
Edit: Thanks to martineau, I changed the default value of errors to a tuple containing the basic Exception, to prevent errors.
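The snippet above uses Python 2 syntax; a minimal Python 3 sketch of the same idea (the behaviour is unchanged, only the except clause and print calls differ):

def get_decorator(errors=(Exception,), default_value=''):
    def decorator(func):
        def new_func(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except errors as e:
                print("Got error!", repr(e))
                return default_value
        return new_func
    return decorator

f = get_decorator((KeyError, NameError), default_value='default')

@f
def example1(a):
    return a['b']

print(example1({}))   # prints the KeyError, then 'default'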
It depends on what exceptions you expect.
If your only use case is get(), you could do
item['b'] = myobject.get('key2', '')
For the other cases, your decorator approach might be useful, but not in the way you do it.
I'll try to show you:
def f(func):
    def silenceit(*args, **kwargs):       # takes all kinds of arguments
        try:
            return func(*args, **kwargs)  # returns func's result
        except Exception as e:
            print('Error:', e)
            return e  # not the best way, maybe we'd better return None
                      # or a wrapper object containing e.
    return silenceit  # on the correct level
Nevertheless, f(some_undefined_function()) won't work, because
a) f() isn't yet active at execution time, and
b) it is used wrongly. The right way would be to wrap the function and then call it: f(function_to_wrap)().
A "layer of lambda" would help here:
wrapped_f = f(lambda: my_function())
wraps a lambda function which in turn calls a non-existing function. Calling wrapped_f() leads to calling the wrapper which calls the lambda which tries to call my_function(). If this doesn't exist, the lambda raises an exception which is caught by the wrapper.
This works because the name my_function is not executed at the time the lambda is defined, but when it is executed. And this execution is protected and wrapped by the function f() then. So the exception occurs inside the lambda and is propagated to the wrapping function provided by the decorator, which handles it gracefully.
This trick of moving the call inside the lambda doesn't work if you try to replace the lambda function with a wrapper like
g = lambda function: lambda *a, **k: function(*a, **k)
followed by a
f(g(my_function))(arguments)
because here the name resolution is "back at the surface": my_function cannot be resolved and this happens before g() or even f() are called. So it doesn't work.
And if you try to do something like
g(print)(x.get('fail'))
it won't work either if there is no x, because g() protects print, not x.
If you want to protect x here, you'll have to do
value = f(lambda: x.get('fail'))
because the wrapper provided by f() calls that lambda function which raises an exception which is then silenced.
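Putting the pieces together, a small self-contained sketch of the lambda-wrapping idea (x and its keys are made up for the example):

def f(func):
    def silenceit(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            print('Error:', e)
            return ''                     # fall back to an empty value
    return silenceit

x = {'ok': {'sub': 42}}

value1 = f(lambda: x['ok']['sub'])()      # 42
value2 = f(lambda: x['fail']['sub'])()    # prints the KeyError, returns ''
print(value1, value2)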
Extending @iruvar's answer: starting with Python 3.4 there is a ready-made context manager for this in the Python standard library: https://docs.python.org/3/library/contextlib.html#contextlib.suppress
import os
from contextlib import suppress

with suppress(FileNotFoundError):
    os.remove('somefile.tmp')

with suppress(FileNotFoundError):
    os.remove('someotherfile.tmp')
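Applied to the dictionary case from the question, it could look something like this (myobject and the keys are hypothetical stand-ins; the default assignment comes first, so the key still exists when the lookup fails):

from contextlib import suppress

myobject = {}                 # stand-in for the real nested data
item = {}

item['a'] = ''
with suppress(AttributeError, KeyError, TypeError):
    item['a'] = myobject.get('key').get('subkey')
print(item['a'])              # '' because myobject.get('key') is None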
In your case you first evaluate the value of the meow call (which doesn't exist) and only then wrap the result in the decorator; it doesn't work that way.
First, the exception is raised before anything was wrapped; second, the wrapper is wrongly indented (silenceit should not return itself). You might want to do something like:
def hardfail():
    return meow()  # meow doesn't exist

def f(func):
    def wrapper():
        try:
            func()
        except:
            print 'error'
    return wrapper

softfail = f(hardfail)
output:
>>> softfail()
error
>>> hardfail()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 2, in hardfail
NameError: global name 'meow' is not defined
Anyway, in your case I don't understand why you don't use a simple helper such as
def get_subkey(obj, key, subkey):
    try:
        return obj.get(key).get(subkey, '')
    except AttributeError:
        return ''
and in the code:
item['a'] = get_subkey(myobject, 'key', 'subkey')
Edited:
In case you want something that will work at any depth, you can do something like:
def get_from_object(obj, *keys):
    try:
        value = obj
        for k in keys:
            value = value.get(k)
        return value
    except AttributeError:
        return ''
That you'd call:
>>> d = {1:{2:{3:{4:5}}}}
>>> get_from_object(d, 1, 2, 3, 4)
5
>>> get_from_object(d, 1, 2, 7)
''
>>> get_from_object(d, 1, 2, 3, 4, 5, 6, 7)
''
>>> get_from_object(d, 1, 2, 3)
{4: 5}
And using your code
item['a'] = get_from_object(obj, 2, 3)
By the way, from a personal point of view I also like @cravoori's solution using contextmanager. But this would mean having three lines of code each time:
item['a'] = ''
with ignored(AttributeError):
    item['a'] = obj.get(2).get(3)
Why not just use a loop?
for dst_key, src_key in (('a', 'key'), ('b', 'key2')):
    try:
        item[dst_key] = myobject.get(src_key).get('subkey')
    except Exception:  # or KeyError?
        item[dst_key] = ''
Or, if you wish, write a little helper:
def get_value(obj, key):
    try:
        return obj.get(key).get('subkey')
    except Exception:
        return ''
You can also combine both solutions if you have several places where you need to get a value and a helper function would be more reasonable.
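Combined, that could look something like the sketch below (myobject and the key pairs are placeholders in the spirit of the question):

def get_value(obj, key, subkey='subkey'):
    try:
        return obj.get(key).get(subkey)
    except Exception:
        return ''

myobject = {'key': {'subkey': 1}}     # stand-in data
item = {}
for dst_key, src_key in (('a', 'key'), ('b', 'key2')):
    item[dst_key] = get_value(myobject, src_key)
print(item)                           # {'a': 1, 'b': ''}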
Not sure that you actually need a decorator for your problem.
Since you're dealing with lots of broken code, it may be excusable to use eval in this case.
def my_eval(code):
    try:
        return eval(code)
    except:  # Can catch more specific exceptions here.
        return ''
Then wrap all your potentially broken statements:
item['a'] = my_eval("""myobject.get('key').get('subkey')""")
item['b'] = my_eval("""myobject.get('key2')""")
item['c'] = my_eval("""func1(myobject)""")
How about something like this:
def exception_handler(func):
    def inner_function(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:  # catch Exception so the ZeroDivisionError below is handled too
            print(f"{func.__name__} error")
    return inner_function
then

@exception_handler
def doSomethingExceptional():
    a = 2 / 0
All credits go to: https://medium.com/swlh/handling-exceptions-in-python-a-cleaner-way-using-decorators-fae22aa0abec
A try/except decorator for sync and async functions.
Note: logger.error can be replaced with print.
The latest version can be found here.
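Since the linked code isn't reproduced here, here is a minimal sketch of such a decorator; the name try_except, the '' fallback value, and the use of the standard logging and asyncio modules are assumptions for illustration, not the linked implementation:

import asyncio
import functools
import logging

logger = logging.getLogger(__name__)

def try_except(func):
    """Return '' instead of raising, for both sync and async callables."""
    if asyncio.iscoroutinefunction(func):
        @functools.wraps(func)
        async def async_wrapper(*args, **kwargs):
            try:
                return await func(*args, **kwargs)
            except Exception as e:
                logger.error("%s failed: %r", func.__name__, e)   # or print(...)
                return ''
        return async_wrapper

    @functools.wraps(func)
    def sync_wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            logger.error("%s failed: %r", func.__name__, e)       # or print(...)
            return ''
    return sync_wrapper

@try_except
def broken():
    return {}['missing']

@try_except
async def broken_async():
    return {}['missing']

print(broken())                       # logs the KeyError, returns ''
print(asyncio.run(broken_async()))    # same, via the event loop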
The Python documentation for unittest implies that the assertRaises() method can be used as a context manager. The code below gives a simple example of the unittest usage from the Python docs. The assertRaises() call in the testsample() method works fine.
Now I'd like to access the exception when it is raised, but if I comment that line out and instead uncomment the next block, in which I attempt to use a context manager, I get an AttributeError: __exit__ when I execute the code. This happens for both Python 2.7.2 and 3.2.2. I could catch the exception in a try...except block and access it that way, but the documentation for unittest seems to imply the context manager would do this as well.
Is there something else I'm doing wrong here?
class TestSequenceFunctions(unittest.TestCase):

    def setUp(self):
        self.seq = [x for x in range(10)]

    def testshuffle(self):
        # make sure the shuffled sequence does not lose any elements
        random.shuffle(self.seq)
        self.seq.sort()
        self.assertEqual(self.seq, [x for x in range(10)])

    def testchoice(self):
        element = random.choice(self.seq)
        self.assert_(element in self.seq)

    def testsample(self):
        self.assertRaises(ValueError, random.sample, self.seq, 20)
        # with self.assertRaises(ValueError, random.sample, self.seq, 20):
        #     print("Inside cm")
        for element in random.sample(self.seq, 5):
            self.assert_(element in self.seq)

if __name__ == '__main__':
    unittest.main()
It seems no-one has yet suggested:
import unittest
# For python < 2.7, do import unittest2 as unittest

class Class(object):
    def should_raise(self):
        raise ValueError('expected arg')

class test_Class(unittest.TestCase):
    def test_something(self):
        DUT = Class()
        with self.assertRaises(ValueError) as exception_context_manager:
            DUT.should_raise()
        exception = exception_context_manager.exception
        self.assertEqual(exception.args, ('expected arg', ))
I usually use e_cm as short for exception_context_manager.
The source code for unittest doesn't show an exception hook for assertRaises:
class _AssertRaisesContext(object):
    """A context manager used to implement TestCase.assertRaises* methods."""

    def __init__(self, expected, test_case, expected_regexp=None):
        self.expected = expected
        self.failureException = test_case.failureException
        self.expected_regexp = expected_regexp

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, tb):
        if exc_type is None:
            try:
                exc_name = self.expected.__name__
            except AttributeError:
                exc_name = str(self.expected)
            raise self.failureException(
                "{0} not raised".format(exc_name))
        if not issubclass(exc_type, self.expected):
            # let unexpected exceptions pass through
            return False
        self.exception = exc_value  # store for later retrieval
        if self.expected_regexp is None:
            return True

        expected_regexp = self.expected_regexp
        if isinstance(expected_regexp, basestring):
            expected_regexp = re.compile(expected_regexp)
        if not expected_regexp.search(str(exc_value)):
            raise self.failureException('"%s" does not match "%s"' %
                     (expected_regexp.pattern, str(exc_value)))
        return True
So, as you suspected, forming your own try/except block is the way to go if you want to intercept the exception while still keeping the assertRaises test:
def testsample(self):
    with self.assertRaises(ValueError):
        try:
            random.sample(self.seq, 20)
        except ValueError as e:
            # do some action with e
            self.assertEqual(e.args,
                             ('sample larger than population',))
            # now let the context manager do its work
            raise
According to the documentation:
If called with callableObj omitted or None, will return a context object
So that code should be:
with self.assertRaises(ValueError):
    random.sample(self.seq, 20)
Given this was asked six years ago I imagine this is something which works now but didn't work then. The docs state this appeared in 2.7 but not which micro version.
import unittest

class TestIntParser(unittest.TestCase):
    def test_failure(self):
        failure_message = 'invalid literal for int() with base 10'
        with self.assertRaises(ValueError) as cm:
            int('forty two')
        self.assertIn(failure_message, cm.exception.message)

if __name__ == '__main__':
    unittest.main()
Can you suggest a way to code a drop-in replacement for the "with" statement that will work in Python 2.4?
It would be a hack, but it would allow me to port my project to Python 2.4 more nicely.
EDIT:
Removed irrelevant metaclass sketch
Just use try-finally.
Really, this may be nice as a mental exercise, but if you actually do it in code you care about you will end up with ugly, hard to maintain code.
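For reference, a minimal sketch of what "just use try-finally" means here, i.e. the Python 2.4 equivalent of a with block, written out by hand:

# Python 2.5+:  with open("/etc/motd") as motd_file: ...
motd_file = open("/etc/motd")
try:
    for line in motd_file:
        print line,
finally:
    motd_file.close()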
You could (ab)use decorators to do this, I think. The following works, eg:
import sys

def execute_with_context_manager(man):
    def decorator(f):
        target = man.__enter__()
        exc = True
        try:
            try:
                f(target)
            except:
                exc = False
                if not man.__exit__(*sys.exc_info()):
                    raise
        finally:
            if exc:
                man.__exit__(None, None, None)
        return None
    return decorator

@execute_with_context_manager(open("/etc/motd"))
def inside(motd_file):
    for line in motd_file:
        print line,
(Well, in Python 2.4 file objects don't have __enter__ and __exit__ methods, but otherwise it works)
The idea is you're replacing the with line in:
with bar() as foo:
    do_something_with(foo)
    do_something_else_with(foo)
    # etc...
with the decorated function "declaration" in:
@execute_with_context_manager( bar() )
def dummyname( foo ):
    do_something_with(foo)
    do_something_else_with(foo)
    # etc...
but getting the same behaviour (the do_something_... code executed). Note the decorator changes the function declaration into an immediate invocation which is more than a little evil.
Since you need to exit the context manager both during errors and not errors, I don't think it's possible to do a generic usecase with metaclasses, or in fact at all. You are going to need try/finally blocks for that.
But maybe it's possible to do something else in your case. That depends on what you use the context manager for.
Using __del__ can help in some cases, like deallocating resources, but since you can't be sure it gets called, it can only be used if you need to release resources that will be released when the program exits. That also won't work if you are handling exceptions in the __exit__ method.
I guess the cleanest method is to wrap the whole context management in a sort of context managing call, and extract the code block into a method. Something like this (untested code, but mostly stolen from PEP 343):
import sys

def call_as_context_manager(mgr, function):
    exit = mgr.__exit__
    value = mgr.__enter__()
    exc = True
    try:
        try:
            function(value)
        except:
            exc = False
            if not exit(*sys.exc_info()):
                raise
    finally:
        if exc:
            exit(None, None, None)
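Usage would then look something like this; since 2.4 file objects lack __enter__/__exit__, the sketch uses a tiny hand-rolled context-manager-like class:

class Opened(object):
    def __init__(self, path):
        self.path = path
    def __enter__(self):
        self.f = open(self.path)
        return self.f
    def __exit__(self, exc_type, exc_value, tb):
        self.f.close()
        return False            # do not swallow exceptions

def print_motd(motd_file):
    for line in motd_file:
        print line,

call_as_context_manager(Opened("/etc/motd"), print_motd)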
How about this?
import sys

def improvize_context_manager(*args, **kwargs):

    assert (len(args) + len(kwargs)) == 1

    if args:
        context_manager = args[0]
        as_ = None
    else:  # It's in kwargs
        (as_, context_manager) = kwargs.items()[0]

    def decorator(f):
        exit_ = context_manager.__exit__  # Not calling it yet
        enter_ = context_manager.__enter__()
        exc = True
        try:
            try:
                if as_:
                    f(**{as_: enter_})
                else:
                    f()
            except:
                exc = False
                if not exit_(*sys.exc_info()):
                    raise
        finally:
            if exc:
                exit_(None, None, None)
        return None

    return decorator
Usage:
@improvize_context_manager(lock)
def null():
    do(stuff)
Which parallels the with keyword without as.
Or:
@improvize_context_manager(my_lock=lock)
def null(my_lock):
    do(stuff_with, my_lock)
Which parallels the with keyword with the as.
If you are OK with using def just to get a block, and decorators that immediately execute, you could use the function signature to get something more natural for the named case.
import sys

def with(func):
    def decorated(body = func):
        contexts = body.func_defaults
        try:
            exc = None, None, None
            try:
                for context in contexts:
                    context.__enter__()
                body()
            except:
                exc = sys.exc_info()
                raise
        finally:
            for context in reversed(contexts):
                context.__exit__(*exc)
    decorated()
class Context(object):
    def __enter__(self):
        print "Enter %s" % self
    def __exit__(self, *args):
        print "Exit %s(%s)" % (self, args)

x = Context()

@with
def _(it = x):
    print "Body %s" % it

@with
def _(it = x):
    print "Body before %s" % it
    raise "Nothing"
    print "Body after %s" % it