Repetitive wrapper functions for logging - python

I am writing a script that needs to use a class from an external library, do some
operations on instances of that class, and then repeat with some more instances.
Something like this:
import some_library

work_queue = get_items()
for item in work_queue:
    some_object = some_library.SomeClass(item)
    operation_1(some_object)
    # ...
    operation_N(some_object)
However, each of the operations in the loop body can raise some different exceptions.
When these happen I need to log them and skip to the next item. If they raise
some unexpected exception I need to log that before crashing.
I could catch all the exceptions in the main loop, but that would obscure what it does.
So I find myself writing a bunch of wrapper functions that all look kind of similar:
def wrapper_op1(some_object):
    try:
        some_object.method_1()
    except (some_library.SomeOtherError, ValueError) as error_message:
        logger.error("Op1 error on {}".format(some_object.friendly_name))
        return False
    except Exception as error_message:
        logger.error("Unknown error during op1 on {} - crashing: {}".format(some_object.friendly_name, error_message))
        raise
    else:
        return True
# Notice there is a different tuple of anticipated exceptions
# and the message formatting is different
def wrapper_opN(some_object):
    try:
        some_function(some_object.some_attr)
    except (RuntimeError, AttributeError) as error_message:
        logger.error("OpN error on {} with {}: {}".format(some_object.friendly_name, some_object.some_attr, error_message))
        return False
    except Exception as error_message:
        logger.error("Unknown error during opN on {} with {} - crashing: {}".format(some_object.friendly_name, some_object.some_attr, error_message))
        raise
    else:
        return True
And modifying my main loop to be:
for item in work_queue:
    some_object = some_library.SomeClass(item)
    if not wrapper_op1(some_object):
        continue
    # ...
    if not wrapper_opN(some_object):
        continue
This does the job, but it feels like a lot of copy and paste programming with
the wrappers. What would be great is to write a decorator function that could
do all that try...except...else stuff so I could do:
@logged_call(known_exception, known_error_message, unknown_error_message)
def wrapper_op1(some_object):
    some_object.method_1()
The wrapper would return True if the operation succeeds, catch the known exceptions
and log with a specified format, and catch any unknown exceptions for logging before re-raising.
However, I can't seem to fathom how to make the error messages work - I can do it with fixed strings:
from functools import wraps

def logged_call(known_exceptions, s_err, s_fatal):
    def decorate(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            try:
                f(*args, **kwargs)
                # How to get parameters from args into the message?
            except known_exceptions as error:
                print(s_err.format(error))
                return False
            except Exception as error:
                print(s_fatal.format(error))
                raise
            else:
                return True
        return wrapper
    return decorate
However, my error messages need attributes belonging to the decorated function's arguments.
Is there some Pythonic way to make this work? Or a different pattern to be using
when dealing with might-fail-in-known-ways functions?

The below tryblock function can be used for this purpose, to implement your logged_call concept and reduce the total amount of code, assuming you have enough checks to overcome the decorator implementation. Folks not used to functional programming in Python may actually find this more difficult to understand than simply writing out the try blocks as you have done. Simplicity, like so many things, is in the eye of the beholder.
Python 2.7 using no imports. This uses the exec statement to create a customized try block that works in a functional pattern.
def tryblock(tryf, *catchclauses, **otherclauses):
    u'return a general try-catch-else-finally block as a function'
    elsef = otherclauses.get('elsef', None)
    finallyf = otherclauses.get('finallyf', None)
    namespace = {'tryf': tryf, 'elsef': elsef, 'finallyf': finallyf, 'func': []}
    for pair in enumerate(catchclauses):
        namespace['e%s' % (pair[0],)] = pair[1][0]
        namespace['f%s' % (pair[0],)] = pair[1][1]
    source = []
    add = lambda indent, line: source.append(' ' * indent + line)
    add(0, 'def block(*args, **kwargs):')
    add(1, "u'generated function executing a try block'")
    add(1, 'try:')
    add(2, '%stryf(*args, **kwargs)' % ('return ' if otherclauses.get('returnbody', elsef is None) else '',))
    for index in xrange(len(catchclauses)):
        add(1, 'except e%s as ex:' % (index,))
        add(2, 'return f%s(ex, *args, **kwargs)' % (index,))
    if elsef is not None:
        add(1, 'else:')
        add(2, 'return elsef(*args, **kwargs)')
    if finallyf is not None:
        add(1, 'finally:')
        add(2, '%sfinallyf(*args, **kwargs)' % ('return ' if otherclauses.get('returnfinally', False) else '',))
    add(0, 'func.append(block)')
    exec '\n'.join(source) in namespace
    return namespace['func'][0]
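The exec statement above is Python 2 only (exec is a function in Python 3). For comparison, here is a closure-based Python 3 sketch of the same idea, with the returnfinally option omitted for brevity; this is an assumed equivalent for the common cases, not part of the original answer:

```python
def tryblock3(tryf, *catchclauses, elsef=None, finallyf=None, returnbody=None):
    """Closure-based try/except/else/finally block (Python 3 sketch)."""
    if returnbody is None:
        returnbody = elsef is None  # mirror the exec version's default

    def block(*args, **kwargs):
        try:
            result = tryf(*args, **kwargs)
        except Exception as ex:
            # First matching catch clause wins, like the generated excepts.
            for exctypes, handler in catchclauses:
                if isinstance(ex, exctypes):
                    return handler(ex, *args, **kwargs)
            raise
        else:
            if returnbody:
                return result
            if elsef is not None:
                return elsef(*args, **kwargs)
        finally:
            if finallyf is not None:
                finallyf(*args, **kwargs)

    return block
```

Because no source code is generated, the same behaviour falls out of ordinary closures, at the cost of a per-call loop over the catch clauses.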
This tryblock function is general enough to go into a common library, as it is not specific to the logic of your checks. Add to it your logged_call decorator, implemented as (one import here):
import functools

resultof = lambda func: func()  # @ token must be followed by an identifier

@resultof
def logged_call():
    truism = lambda *args, **kwargs: True
    def raisef(ex, *args, **kwargs):
        raise ex
    def impl(exlist, err, fatal):
        return lambda func: \
            functools.wraps(func)(tryblock(func,
                (exlist, lambda ex, *args, **kwargs: err(ex, *args, **kwargs) and False),
                (Exception, lambda ex, *args, **kwargs: fatal(ex, *args, **kwargs) and raisef(ex)),
                elsef=truism))
    return impl  # impl therefore becomes logged_call
Using logged_call as implemented, your two sanity checks look like:
op1check = logged_call((some_library.SomeOtherError, ValueError),
                       lambda _, obj: logger.error("Op1 error on {}".format(obj.friendly_name)),
                       lambda ex, obj: logger.error("Unknown error during op1 on {} - crashing: {}".format(obj.friendly_name, ex.message)))
opNcheck = logged_call((RuntimeError, AttributeError),
                       lambda ex, obj: logger.error("OpN error on {} with {}: {}".format(obj.friendly_name, obj.some_attr, ex.message)),
                       lambda ex, obj: logger.error("Unknown error during opN on {} with {} - crashing: {}".format(obj.friendly_name, obj.some_attr, ex.message)))
@op1check
def wrapper_op1(obj):
    return obj.method_1()

@opNcheck
def wrapper_opN(obj):
    return some_function(obj.some_attr)
Neglecting blank lines, this is more compact than your original code by 10 lines, though at the sunk cost of the tryblock and logged_call implementations; whether it is now more readable is a matter of opinion.
You also have the option to define logged_call itself and all distinct decorators derived from it in a separate module, if that's sensible for your code; and therefore to use each derived decorator multiple times.
You may also find more of the logic structure that you can factor into logged_call by tweaking the actual checks.
But in the worst case, where each check has logic that no other does, you may find that it's more readable to just write out each one like you have already. It really depends.
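For comparison, the core idea can also be written as a small Python 3 decorator that takes message callables instead of fixed strings, which directly answers the original "how do I get the arguments into the message" problem. This is a sketch, not the answer's code; logger and the example operation are illustrative:

```python
import functools
import logging

logger = logging.getLogger(__name__)

def logged_call(known_exceptions, err_msg, fatal_msg):
    """err_msg and fatal_msg are callables: (error, *args, **kwargs) -> str."""
    def decorate(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                func(*args, **kwargs)
            except known_exceptions as error:
                logger.error(err_msg(error, *args, **kwargs))
                return False
            except Exception as error:
                logger.error(fatal_msg(error, *args, **kwargs))
                raise
            else:
                return True
        return wrapper
    return decorate

@logged_call((ValueError,),
             err_msg=lambda e, obj: "Op1 error on {}".format(obj),
             fatal_msg=lambda e, obj: "Unknown error on {} - crashing: {}".format(obj, e))
def wrapper_op1(obj):
    int(obj)  # stand-in for some_object.method_1()
```

Because the message builders receive the same arguments as the wrapped function, they can reach attributes such as friendly_name without any code generation.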
For completeness, here's a unit test for the tryblock function:
import examplemodule as ex
from unittest import TestCase

class TestTryblock(TestCase):
    def test_tryblock(self):
        def tryf(a, b):
            if a % 2 == 0:
                raise ValueError
            return a + b
        def proc_ve(ex, a, b):
            self.assertIsInstance(ex, ValueError)
            if a % 3 == 0:
                raise ValueError
            return a + b + 10
        def elsef(a, b):
            return a + b + 20
        def finallyf(a, b):
            return a + b + 30
        block = ex.tryblock(tryf, (ValueError, proc_ve))
        self.assertRaises(ValueError, block, 0, 4)
        self.assertRaises(ValueError, block, 6, 4)
        self.assertEqual([5, 16, 7, 18, 9], map(lambda v: block(v, 4), xrange(1, 6)))
        block = ex.tryblock(tryf, (ValueError, proc_ve), elsef=elsef)
        self.assertEqual([25, 16, 27, 18, 29], map(lambda v: block(v, 4), xrange(1, 6)))
        block = ex.tryblock(tryf, (ValueError, proc_ve), elsef=elsef, returnbody=True)
        self.assertEqual([5, 16, 7, 18, 9], map(lambda v: block(v, 4), xrange(1, 6)))
        block = ex.tryblock(tryf, (ValueError, proc_ve), finallyf=finallyf)
        self.assertEqual([5, 16, 7, 18, 9], map(lambda v: block(v, 4), xrange(1, 6)))
        block = ex.tryblock(tryf, (ValueError, proc_ve), finallyf=finallyf, returnfinally=True)
        self.assertEqual([35, 36, 37, 38, 39], map(lambda v: block(v, 4), xrange(1, 6)))

Related

Wrapping a Python function that uses with

Say I have a Python function foo() that uses some resource and is meant to be called as follows:
with foo(x, y, z) as f:
    doSomething(f)
So far so good. Now let's say foo takes a complex set of arguments based on a variety of factors, and I'd like to define a wrapper function to make things simpler. Something like:
def simple_foo():
    if x:
        return foo(a, b, c)
    else:
        return foo(d, e, f)
Now, I'd like to use simple_foo in place of foo, like:
with simple_foo() as f:
    doSomething(f)
However, unsurprisingly, this does not work. How can I write simple_foo() to get this behavior?
Decorate function foo() with contextmanager (doc):
from contextlib import contextmanager

@contextmanager
def foo(a, b, c):
    try:
        yield a + b + c
    finally:
        pass

def simple_foo(x):
    if x:
        return foo(1, 2, 3)
    return foo(4, 5, 6)

with simple_foo(True) as v:
    print(v)
with simple_foo(False) as v:
    print(v)
Prints:
6
15
You can do by writing a custom context manager that internally calls that function, try code given below:
class SimpleFoo:
    def __init__(self, x, y, z, option):
        self.x = x
        self.y = y
        self.z = z
        self.option = option

    def __enter__(self):
        if self.option:
            return foo(self.x, self.y, self.z)
        else:
            return foo(self.y, self.z, self.x)

    def __exit__(self, type, value, traceback):
        if type is not None:
            print("Error in SimpleFoo")
            print("Error Type :", type)
            print("Error Value :", value)
            print("Error Traceback :", traceback)
            self.status = value
Now if you want to use this, use it as below:
with SimpleFoo(1, 2, 3, True) as foo:
    doSomething(foo)
I hope this helps.
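Alternatively (an assumption on my part, not part of the answer above), the wrapper itself can be a @contextmanager that delegates to foo, which keeps with-compatibility without writing a class; foo here is an illustrative stand-in for the real resource function:

```python
from contextlib import contextmanager

@contextmanager
def foo(a, b, c):
    # Stand-in for the real resource-producing function.
    yield a + b + c

@contextmanager
def simple_foo(x):
    # Choose foo's arguments, then delegate enter/exit to it.
    cm = foo(1, 2, 3) if x else foo(4, 5, 6)
    with cm as f:
        yield f
```

Delegating with a nested with statement means foo's cleanup still runs even if the body using simple_foo raises.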

better python pattern for exception handler in a loop?

I found myself using the following pattern in my tests quite often:
def test(params):
    e_list = []
    for p in params:
        try:
            run_test(p)  # or a block of code that can continue or break
        except Exception as e:
            e_list.append(e)
    assert isEmpty(e_list), 'error encountered: {}'.format(e_list)
I find myself rewriting this pattern quite often, especially with long code blocks for the loop that have some flow control with continue and break. I am wondering if there is a Pythonic wrapping for this pattern.
I have thought about a wrapper function like this:
def assert_all_tests(test_list):
    e_list = []
    for t in test_list:
        try:
            t()
        except Exception as e:
            e_list.append(e)
    assert isEmpty(e_list), 'error encountered: {}'.format(e_list)

def test(params):
    assert_all_tests([functools.partial(run_test, p) for p in params])
But I dislike this approach because it wrapped away the loop. There is no way for callable t to do flow control of the loop with continue or break (there is no loop any more, only a list comprehension).
Another approach is to use a context class like this:
def test(params):
    ErrorHandler.clearErrorList()
    for p in params:
        with ErrorHandler():
            run_test(p)  # or code block that can continue or break
    ErrorHandler.assertEmptyErrorList()
where ErrorHandler would be a class with appropriate __enter__ and __exit__ methods that keeps an error list in a class variable. But I feel that at the test-function level this is not any simpler than the original pattern: since there is no way for an ErrorHandler instance to know when a loop has begun and ended, I still have to write the pre- and post-loop fixtures.
I'd like to hear idea of approaches to wrap around this pattern. Thanks.
EDIT
Thank you all for your comments.
New approach inspired by @paul-cornelius's answer:
class ResultCollector(object):
    def __init__(self, raise_on_error=True):
        self.result_list = []
        self.raise_on_error = raise_on_error

    def do(self, func, *args, **kwds):
        '''do can only deal with code block that can be wrapped into a function'''
        try:
            return func(*args, **kwds)
        except Exception as e:
            if not isinstance(e, AssertionError) and self.raise_on_error:
                raise
            self.result_list.append(e.message or e)
        else:
            self.result_list.append(None)

    def assertClean(self):
        assert not [x for x in self.result_list if x is not None], 'test results: {}'.format(self.result_list)

    def __enter__(self):
        self.result_list = []
        return self

    def __exit__(self, exc_t, exc_i, exc_tb):
        if exc_t:
            return None
        self.assertClean()
        return True

def test():
    def can_be_refactored_into_func(p):
        assert p % 3, 'failed {}'.format(p)

    def condition_for_skip(p):
        return p % 2

    def condition_for_break(p):
        return p > 5

    with ResultCollector() as rc:
        for p in range(10):
            if condition_for_skip(p):
                rc.result_list.append('skipped {}'.format(p))
                continue
            if condition_for_break(p):
                rc.result_list.append('ended {}'.format(p))
                break
            rc.do(can_be_refactored_into_func, p)
It works pretty well when the code in the loop body can be divided up into functions like the above.
How about a little class that only does the one thing you find yourself doing over and over:
class TestTracker:
    def __init__(self):
        self.error_list = []

    def do_test(self, f, p):
        try:
            f(p)
        except Exception as e:
            self.error_list.append(e)

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        if exc_value is not None:
            self.error_list.append(exc_value)
        return True

def test(params):
    tt = TestTracker()
    for p in params:
        tt.do_test(run_test, p)
    assert isEmpty(tt.error_list), 'error encountered: {}'.format(tt.error_list)

def test2(params):
    tt = TestTracker()
    for p in params:
        with tt:
            # a block of code with loop control statements
            pass
    assert isEmpty(tt.error_list), 'error encountered: {}'.format(tt.error_list)
I modified this answer to make the class a context manager. The test2 shows how that can be used with loop control statements. If no exception is raised within the context, the arguments to __exit__ will be None.
You could even mix with statements and calls to do_test.
Python can do anything!
Edits :
Add some convenience to TestTracker
class TestTracker:
    def __init__(self):
        self.error_list = []

    def do_test(self, f, p):
        try:
            f(p)
        except Exception as e:
            self.error_list.append(e)

    def __bool__(self):
        return len(self.error_list) == 0

    def __str__(self):
        return 'error encountered: {}'.format(self.error_list)

def test(params):
    tt = TestTracker()
    for p in params:
        tt.do_test(run_test, p)
    assert tt, str(tt)
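To make the convenience methods concrete, here is a self-contained run of the class above, where run_test is a hypothetical stand-in that fails on even parameters:

```python
class TestTracker:
    def __init__(self):
        self.error_list = []

    def do_test(self, f, p):
        try:
            f(p)
        except Exception as e:
            self.error_list.append(e)

    def __bool__(self):
        # Truthy only when no errors were collected.
        return len(self.error_list) == 0

    def __str__(self):
        return 'error encountered: {}'.format(self.error_list)

def run_test(p):
    # Hypothetical test body: fails for even parameters.
    if p % 2 == 0:
        raise ValueError('even parameter {}'.format(p))

tt = TestTracker()
for p in [1, 2, 3, 4]:
    tt.do_test(run_test, p)
```

After the loop, bool(tt) is False and str(tt) lists the two ValueError instances, so `assert tt, str(tt)` fails with a useful message.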

Use decorators to wrap all functions with "if func returned false, return false"

I'm writing a very basic Python script based on a main function that sequentially calls other functions.
What I would like to do is to wrap all of the functions that are called from main with something like:
result = func(*some_args)
if result != True:
    return result
For example, for this code:
def func1(arg1, arg2):
    if some_cond:
        return False  # or some err_val
    return True

def func2(arg1):
    if some_cond:
        return False
    return True

def main():
    func1(val1, val2)
    func2(val3)
    return True
if __name__ == "__main__":
    import sys
    result = main()
    if result == err_val1:
        # Do something. Maybe print. Maybe call some function.
        sys.exit(1)
I want that if one of the functions fails main would break and return its error. Can I do this using decorators?
I maintain that the best solution would be the use of exceptions. If that's absolutely not what you want, you can do some short-circuiting:
return func1() and func2()
To extend this to more functions without a ton of ands:
from functools import partial

def main():
    funcs = (partial(func1, arg1, arg2),
             partial(func2, arg1))
    if any(not f() for f in funcs):
        return False
Though this doesn't return "its error" (the failed function's error), it just returns False. If you want to differentiate more between different kinds of errors... well, you're back to exceptions.
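A self-contained run of the short-circuit idea above; func1 and func2 here are placeholder checks, not the question's functions:

```python
from functools import partial

def func1(a, b):
    # Placeholder check that succeeds when a < b.
    return a < b

def func2(a):
    # Placeholder check that succeeds for non-zero a.
    return a != 0

def main():
    funcs = (partial(func1, 1, 2),
             partial(func2, 3))
    if any(not f() for f in funcs):
        return False
    return True

def main_failing():
    funcs = (partial(func1, 1, 2),
             partial(func2, 0))  # func2 fails here
    if any(not f() for f in funcs):
        return False
    return True
```

Because any() is lazy, the checks after the first failure are never run, matching the "break out of main on the first error" requirement.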
This is precisely what exceptions are built for in Python.
# imports belong at the *top* of the file
import sys

class SomeDescriptiveError(Exception): pass
class SomeOtherSpecialError(Exception): pass

def func1(arg1, arg2):
    if some_cond:
        raise SomeDescriptiveError('Cannot frobnosticate the fizzbuzz')
    return arg1 + arg2
    # or skip the return statement altogether

def func2(arg1):
    if some_cond:
        raise SomeOtherSpecialError('The frontobulator is no longer cromulent')
    return ''.join(reversed(arg1))

def main():
    print(func1(val1, val2))
    print(func2(val3))

if __name__ == "__main__":
    try:
        result = main()
    except SomeDescriptiveError as e:
        print('Oh dear')
        sys.exit(e.args[0])
    except SomeOtherSpecialError as e:
        print('Oh no')
        sys.exit(e.args[0])
    else:
        print('All systems are fully operational')
    finally:
        print('I really should clean up all these bits.')
Since you do actually want the program to die when one of these errors occurs you might as well raise SystemExit. Here's a way to do it with a decorator.
flag = 2

def die_on_not_True(func):
    def wrapper(*args):
        rc = func(*args)
        if rc is not True:
            fmt = 'Function {} failed with return value {!r}'
            print(fmt.format(func.__name__, rc))
            raise SystemExit(1)
        return True
    return wrapper

@die_on_not_True
def func1(arg1, arg2):
    if arg1 == flag:
        return 'error 1'
    return True

@die_on_not_True
def func2(arg1):
    if arg1 == flag:
        return 'error 2'
    return True

def main():
    val1, val2, val3 = 1, 2, 3
    print(func1(val1, val2))
    print('one')
    print(func2(val3))
    print('two')

if __name__ == '__main__':
    main()
output
True
one
True
two
If we set flag = 1, the output becomes
Function func1 failed with return value 'error 1'
If we set flag = 3, the output becomes
True
one
Function func2 failed with return value 'error 2'
When flag equals 2, the exit status of 0 is returned to the shell, when flag equals 1 or 3, the exit status of 1 is returned.
If you want to do further processing after printing the error message, then raise a custom exception instead of SystemExit and catch it by wrapping your main call in a try...except.
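The custom-exception variant mentioned above could look like the following sketch; OperationFailed is a made-up name, and the flag/func1 setup mirrors the earlier example:

```python
class OperationFailed(Exception):
    """Raised when a wrapped function returns something other than True."""

def die_on_not_True(func):
    def wrapper(*args):
        rc = func(*args)
        if rc is not True:
            raise OperationFailed(
                'Function {} failed with return value {!r}'.format(func.__name__, rc))
        return True
    return wrapper

@die_on_not_True
def func1(arg1):
    if arg1 == 1:
        return 'error 1'
    return True

def main():
    try:
        func1(1)
    except OperationFailed as e:
        # Further processing happens here instead of exiting immediately.
        return str(e)
    return 'ok'
```

Unlike SystemExit, an ordinary exception can be caught, logged, and acted on before the program decides whether to exit.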
I guess what you really want is a universal exception catcher that would catch and return the exception of any wrapped function. You can easily do it this way:
def return_exception(func):
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            return e
    return wrapper
Example
In [3]: @return_exception
   ...: def div(a, b):
   ...:     return a / b
   ...:

In [4]: div(1, 0)
Out[4]: ZeroDivisionError('division by zero')
So then you can process the return exception object the way you want, though it's pretty hard to say why you need that.
Update: As others have noted, it's generally good to catch only particular exceptions. You can modify the decorator slightly:
def return_exception(*exception_types):
    def build_wrapper(func):
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except exception_types as e:
                return e
        return wrapper
    return build_wrapper
Example:
In [6]: @return_exception(ZeroDivisionError)
   ...: def div(a, b):
   ...:     return a / b
   ...:

In [7]: div(0, 1)
Out[7]: 0.0

In [8]: div(1, 0)
Out[8]: ZeroDivisionError('division by zero')

In [9]: div(1, "a")
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
...
TypeError: unsupported operand type(s) for /: 'int' and 'str'

In [10]: @return_exception(ZeroDivisionError, TypeError)
    ...: def div(a, b):
    ...:     return a / b
    ...:

In [11]: div(1, 0)
Out[11]: ZeroDivisionError('division by zero')

In [12]: div(1, "a")
Out[12]: TypeError("unsupported operand type(s) for /: 'int' and 'str'")
As you see, you only catch the specified exceptions (you can still specify the universal Exception class, though).

General decorator to wrap try except in python?

I'm interacting with a lot of deeply nested JSON I didn't write, and would like to make my Python script more 'forgiving' to invalid input. I find myself writing involved try-except blocks, and would rather just wrap the dubious function up.
I understand it's bad policy to swallow exceptions, but I'd rather they be printed and analysed later than actually stop execution. In my use case it's more valuable to continue executing over the loop than to get all keys.
Here's what I'm doing now:
try:
    item['a'] = myobject.get('key').METHOD_THAT_DOESNT_EXIST()
except:
    item['a'] = ''
try:
    item['b'] = OBJECT_THAT_DOESNT_EXIST.get('key2')
except:
    item['b'] = ''
try:
    item['c'] = func1(ARGUMENT_THAT_DOESNT_EXIST)
except:
    item['c'] = ''
...
try:
    item['z'] = FUNCTION_THAT_DOESNT_EXIST(myobject.method())
except:
    item['z'] = ''
Here's what I'd like, (1):
item['a'] = f(myobject.get('key').get('subkey'))
item['b'] = f(myobject.get('key2'))
item['c'] = f(func1(myobject))
...
or (2):
@f
def get_stuff():
    item = {}
    item['a'] = myobject.get('key').get('subkey')
    item['b'] = myobject.get('key2')
    item['c'] = func1(myobject)
    ...
    return item
...where I can wrap either the single data item (1), or a master function (2), in some function that turns execution-halting exceptions into empty fields, printed to stdout. The former would be sort of an item-wise skip - where that key isn't available, it logs blank and moves on - the latter is a row-skip, where if any of the fields don't work, the entire record is skipped.
My understanding is that some kind of wrapper should be able to fix this. Here's what I tried, with a wrapper:
def f(func):
    def silenceit():
        try:
            func(*args, **kwargs)
        except:
            print('Error')
        return(silenceit)
Here's why it doesn't work. Call a function that doesn't exist, it doesn't try-catch it away:
>>> f(meow())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'meow' is not defined
Before I even add a blank return value, I'd like to get it to try-catch correctly. If the function had worked, this would have printed "Error", right?
Is a wrapper function the correct approach here?
UPDATE
I've had a lot of really useful, helpful answers below, and thank you for them---but I've edited the examples I used above to illustrate that I'm trying to catch more than nested key errors, that I'm looking specifically for a function that wraps a try-catch for...
When a method doesn't exist.
When an object doesn't exist, and is getting a method called on it.
When an object that does not exist is being called as an argument to a function.
Any combination of any of these things.
Bonus, when a function doesn't exist.
There are lots of good answers here, but I didn't see any that address the question of whether you can accomplish this via decorators.
The short answer is "no," at least not without structural changes to your code. Decorators operate at the function level, not on individual statements. Therefore, in order to use decorators, you would need to move each of the statements to be decorated into its own function.
But note that you can't just put the assignment itself inside the decorated function. You need to return the rhs expression (the value to be assigned) from the decorated function, then do the assignment outside.
To put this in terms of your example code, one might write code with the following pattern:
@return_on_failure('')
def computeA():
    return myobject.get('key').METHOD_THAT_DOESNT_EXIST()

item["a"] = computeA()
return_on_failure could be something like:
def return_on_failure(value):
    def decorate(f):
        def applicator(*args, **kwargs):
            try:
                return f(*args, **kwargs)
            except:
                print('Error')
                return value
        return applicator
    return decorate
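A self-contained run of the pattern above, with the bare except narrowed to Exception for the demo and myobject a plain dict standing in for the real data:

```python
def return_on_failure(value):
    def decorate(f):
        def applicator(*args, **kwargs):
            try:
                return f(*args, **kwargs)
            except Exception as e:
                print('Error:', e)
                return value
        return applicator
    return decorate

myobject = {'key': {'subkey': 42}}
item = {}

@return_on_failure('')
def computeA():
    # dicts have no such method, so this raises AttributeError.
    return myobject.get('key').METHOD_THAT_DOESNT_EXIST()

@return_on_failure('')
def computeB():
    return myobject['key']['subkey']

item['a'] = computeA()  # the failure is logged and '' is stored
item['b'] = computeB()  # succeeds normally
```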
You could use a defaultdict and the context manager approach as outlined in Raymond Hettinger's PyCon 2013 presentation
from collections import defaultdict
from contextlib import contextmanager

@contextmanager
def ignored(*exceptions):
    try:
        yield
    except exceptions:
        pass

item = defaultdict(str)
obj = dict()
with ignored(Exception):
    item['a'] = obj.get(2).get(3)
print item['a']
obj[2] = dict()
obj[2][3] = 4
with ignored(Exception):
    item['a'] = obj.get(2).get(3)
print item['a']
It's very easy to achieve using a configurable decorator.
def get_decorator(errors=(Exception, ), default_value=''):
    def decorator(func):
        def new_func(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except errors, e:
                print "Got error! ", repr(e)
                return default_value
        return new_func
    return decorator
f = get_decorator((KeyError, NameError), default_value='default')
a = {}

@f
def example1(a):
    return a['b']

@f
def example2(a):
    return doesnt_exist()

print example1(a)
print example2(a)
Just pass get_decorator a tuple of the error types you want to silence, and the default value to return.
Output will be
Got error! KeyError('b',)
default
Got error! NameError("global name 'doesnt_exist' is not defined",)
default
Edit: Thanks to martineau, I changed the default value of errors to a tuple containing the basic Exception, to prevent errors.
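The answer above is Python 2; under Python 3 the same decorator would use except ... as e and print() (a straightforward port, assumed rather than taken from the answer):

```python
def get_decorator(errors=(Exception,), default_value=''):
    def decorator(func):
        def new_func(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except errors as e:
                print("Got error!", repr(e))
                return default_value
        return new_func
    return decorator

f = get_decorator((KeyError, NameError), default_value='default')

@f
def example1(a):
    return a['b']

@f
def example2(a):
    return doesnt_exist()
```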
It depends on what exceptions you expect.
If your only use case is get(), you could do
item['b'] = myobject.get('key2', '')
For the other cases, your decorator approach might be useful, but not in the way you do it.
I'll try to show you:
def f(func):
    def silenceit(*args, **kwargs):  # takes all kinds of arguments
        try:
            return func(*args, **kwargs)  # returns func's result
        except Exception, e:
            print('Error:', e)
            return e  # not the best way; maybe we'd better return None
                      # or a wrapper object containing e.
    return silenceit  # on the correct level
Nevertheless, f(some_undefined_function()) won't work, because
a) f() isn't yet active at execution time, and
b) it is used wrongly. The right way would be to wrap the function and then call it: f(function_to_wrap)().
A "layer of lambda" would help here:
wrapped_f = f(lambda: my_function())
wraps a lambda function which in turn calls a non-existing function. Calling wrapped_f() leads to calling the wrapper which calls the lambda which tries to call my_function(). If this doesn't exist, the lambda raises an exception which is caught by the wrapper.
This works because the name my_function is not executed at the time the lambda is defined, but when it is executed. And this execution is protected and wrapped by the function f() then. So the exception occurs inside the lambda and is propagated to the wrapping function provided by the decorator, which handles it gracefully.
This move towards inside the lambda function doesn't work if you try to replace the lambda function with a wrapper like
g = lambda function: lambda *a, **k: function(*a, **k)
followed by a
f(g(my_function))(arguments)
because here the name resolution is "back at the surface": my_function cannot be resolved and this happens before g() or even f() are called. So it doesn't work.
And if you try to do something like
g(print)(x.get('fail'))
it cannot work as well if you have no x, because g() protects print, not x.
If you want to protect x here, you'll have to do
value = f(lambda: x.get('fail'))
because the wrapper provided by f() calls that lambda function which raises an exception which is then silenced.
Extending @iruvar's answer: starting with Python 3.4 there is an existing context manager for this in the Python standard library: https://docs.python.org/3/library/contextlib.html#contextlib.suppress
import os
from contextlib import suppress

with suppress(FileNotFoundError):
    os.remove('somefile.tmp')

with suppress(FileNotFoundError):
    os.remove('someotherfile.tmp')
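The same manager fits the question's nested-dictionary case; the obj/item values below are illustrative:

```python
from contextlib import suppress

obj = {'key': {'subkey': 'value'}}
item = {'a': '', 'b': ''}

with suppress(AttributeError):
    item['a'] = obj.get('key').get('subkey')

with suppress(AttributeError):
    # obj.get('missing') returns None; None.get(...) raises
    # AttributeError, which suppress swallows, so item['b'] stays ''.
    item['b'] = obj.get('missing').get('subkey')
```

This keeps the "skip the field and move on" behaviour without writing any decorator at all.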
In your case you first evaluate the meow() call (which doesn't exist) and then wrap the result in the decorator; it doesn't work that way.
First, the exception is raised before anything was wrapped; second, the wrapper is wrongly indented (silenceit should not return itself). You might want to do something like:
def hardfail():
    return meow()  # meow doesn't exist

def f(func):
    def wrapper():
        try:
            func()
        except:
            print 'error'
    return wrapper

softfail = f(hardfail)
output:
>>> softfail()
error
>>> hardfail()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 2, in hardfail
NameError: global name 'meow' is not defined
Anyway, in your case I don't understand why you don't use a simple helper such as:
def get_subkey(obj, key, subkey):
    try:
        return obj.get(key).get(subkey, '')
    except AttributeError:
        return ''
and in the code:
item['a'] = get_subkey(myobject, 'key', 'subkey')
Edited:
In case you want something that will work at any depth, you can do something like:
def get_from_object(obj, *keys):
    try:
        value = obj
        for k in keys:
            value = value.get(k)
        return value
    except AttributeError:
        return ''
That you'd call:
>>> d = {1:{2:{3:{4:5}}}}
>>> get_from_object(d, 1, 2, 3, 4)
5
>>> get_from_object(d, 1, 2, 7)
''
>>> get_from_object(d, 1, 2, 3, 4, 5, 6, 7)
''
>>> get_from_object(d, 1, 2, 3)
{4: 5}
And using your code
item['a'] = get_from_object(obj, 2, 3)
By the way, on a personal point of view I also like @cravoori's solution using contextmanager. But this would mean having three lines of code each time:
item['a'] = ''
with ignored(AttributeError):
    item['a'] = obj.get(2).get(3)
Why not just use a loop?
for dst_key, src_key in (('a', 'key'), ('b', 'key2')):
    try:
        item[dst_key] = myobject.get(src_key).get('subkey')
    except Exception:  # or KeyError?
        item[dst_key] = ''
Or if you wish write a little helper:
def get_value(obj, key):
    try:
        return obj.get(key).get('subkey')
    except Exception:
        return ''
Also you can combine both solutions if you have a few places where you need to get value and helper function would be more reasonable.
Not sure that you actually need a decorator for your problem.
Since you're dealing with lots of broken code, it may be excusable to use eval in this case.
def my_eval(code):
    try:
        return eval(code)
    except:  # can catch more specific exceptions here
        return ''
Then wrap all your potentially broken statements:
item['a'] = my_eval("""myobject.get('key').get('subkey')""")
item['b'] = my_eval("""myobject.get('key2')""")
item['c'] = my_eval("""func1(myobject)""")
How about something like this:
def exception_handler(func):
    def inner_function(*args, **kwargs):
        try:
            func(*args, **kwargs)
        except Exception:
            print(f"{func.__name__} error")
    return inner_function
then
@exception_handler
def doSomethingExceptional():
    a = 2 / 0
All credits go to: https://medium.com/swlh/handling-exceptions-in-python-a-cleaner-way-using-decorators-fae22aa0abec
Try Except Decorator for sync and async functions
Note: logger.error can be replaced with print
Latest version can be found here.

python re-usable user error handling

I have been playing around with using error handling. In particular with user defined errors.
However, I am not sure if the following approach is a bad idea / recommended / plain weird.
import operator
from functools import partial

class GenericError(Exception):
    def __init__(self, value):
        self.value = value
    def __str__(self):
        return repr(self.value)

def errorhandle(error, func):
    print(func.__name__, "says: ", error)
    # or perhaps error_dictionary[error]

def f_test_bool(x, bo, y, et, m, ef):
    """generic boolean test: if boolean_operator(x, y) is True --> raise the passed-in error"""
    try:
        if bo(x, y):
            raise et(m)
        else:
            return x
    except et as err:
        ef(err, f_test_bool)

partial_ne = partial(f_test_bool,
                     bo=operator.__ne__,
                     et=GenericError,
                     ef=errorhandle)

partial_ne(x=5,
           y=6,
           m="oops, 5 is not equal to 6")

>>> imp.reload(errorhandling)
f_test_bool says: 'oops, 5 is not equal to 6'
My thought was that this way I could have a simple module that I can re-use, and pipe values through without having to add user-defined errors to any new functions that I write. I thought this would keep things cleaner.
You're adding a lot of overhead for something that should be simple and obvious. Don't forget the zen:
Simple is better than complex.
Why not simply:
if 5 != 6:
    raise ValueError('oops, 5 is not equal to 6')
