How to perform error handling for 200 functions? - python

Following is a sample function
def temp(data):
    a = sample_function_1(data)
    a = a[0][0]
    b = sample_function_2(data, 'no', 0, 30)
    b = b[-1]
    c = sample_function_3(data, 'thank', 0, None, 15, 5)
    c = c[-2]
    print(a, b, c)
    return a, b, c
Error handling is already done inside sample_function_1, sample_function_2, and sample_function_3, so the values returned by these functions cause no problems, but the calculations done on the variables a, b, and c can throw errors.
There are more than 200 functions like this and I need to manage the errors effectively. I know that try/except is an option, but are there other approaches that can be taken to solve this problem?

With 200 functions to change this is going to be some hard work, but you could try to use a decorator. You will only be able to intercept one error per function, but it may be a good compromise for your case.
def exception_logger(func):
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except IndexError:
            ...doSomething...
            raise
    return wrapper

@exception_logger
def temp(data):
    ...doSomething...
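For concreteness, here is one way that sketch could be filled in and applied to the temp function from the question (the logging setup is just one possible "doSomething"; sample_function_1 and friends are the question's placeholders):

import logging

logger = logging.getLogger(__name__)

def exception_logger(func):
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except IndexError:
            # logger.exception records the message plus the full traceback
            logger.exception("Index error in %s", func.__name__)
            raise
    return wrapper

@exception_logger
def temp(data):
    a = sample_function_1(data)[0][0]
    b = sample_function_2(data, 'no', 0, 30)[-1]
    c = sample_function_3(data, 'thank', 0, None, 15, 5)[-2]
    return a, b, c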

I think you can just define a function and call the exception_logger function from the except block of any of your functions.
import traceback
from datetime import datetime

def exception_logger():
    exception_log = traceback.format_exc().splitlines()
    exception_log.append(str(datetime.now()))
    print(exception_log)
    return exception_log

def hello():
    try:
        print("Hello World!")
    except Exception as e:
        exception_logger()


Stop running a chain of functions in Python if an exception is triggered in one of the earlier functions

I have a set of functions which are called one after another depending on user input. I'm using this code to run the functions:
# Run Multiple Functions
def MultipleFunctions(*funcs):
    def MultipleFunctions(*args, **kwargs):
        for f in funcs:
            f(*args, **kwargs)
    return MultipleFunctions
I am calling the functions with command lines tied to tkinter objects. For instance:
Op1 = OptionMenu(MainForm, Type, *Options, command=MultipleFunctions(Func1, Func2, Func3, Func4))
Func1 to 4 run various simple checks mostly to catch illegal option selections. For example Func1 checks the input of an edit box:
# Check SideA Box Length is Valid
def Func1(event):
    try:
        float(SideA.get())
    except ValueError:
        messagebox.showerror(message="This box only accepts numbers between 0 and 100")
        SideA.focus()
    if float(SideA.get()) > 100:
        messagebox.showerror(message="This box only accepts numbers between 0 and 100")
        SideA.focus()
What I want to do is break out of the Multiple Functions routine if an exception is triggered in say Func1. At the moment it runs all four functions regardless.
Can somebody point me in the right direction please?
Thank you
Your code is handling the exception inside the except block.
There are a few ways to stop execution when an exception pops up:
You can choose not to handle the exception, and it will interrupt the program by itself.
You can re-raise the exception after handling it in the except block.
You can return a bool value as a success indicator from each function and make MultipleFunctions either continue or stop its loop based on that return value (sketched below).
You can add a try/except around the function call and use a break statement to quit the loop.
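For illustration, a minimal sketch of that success-flag approach (SideA and messagebox are the tkinter objects from the question; this is an illustration, not the asker's code):

def MultipleFunctions(*funcs):
    def runner(*args, **kwargs):
        for f in funcs:
            if not f(*args, **kwargs):  # stop the chain on the first failed check
                break
    return runner

def Func1(event):
    try:
        value = float(SideA.get())
    except ValueError:
        messagebox.showerror(message="This box only accepts numbers between 0 and 100")
        SideA.focus()
        return False
    if value > 100:
        messagebox.showerror(message="This box only accepts numbers between 0 and 100")
        SideA.focus()
        return False
    return True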
Note: in the decorator, don't give the inner wrapper the same name as the outer function.
def MultipleFunctions(*funcs):
    def wrapper(*args, **kwargs):
        for f in funcs:
            try:
                f(*args, **kwargs)
            except Exception as e:
                print(e)
                break  # <- quit the loop
    return wrapper

def a(): print('a()')
def b(): print('b()')
def c(): print('c()')

# here without exceptions
MultipleFunctions(a, b, c)()
# a()
# b()
# c()

# here with an exception: none of the functions accept arguments!
MultipleFunctions(a, b, c)(2)
# a() takes 0 positional arguments but 1 was given
Further, you need to reimplement the body of the function to raise exceptions:
def Func1(x):  # similar to the original Func1!
    if isinstance(x, float) or isinstance(x, int):
        if x > 100:
            raise Exception('Error, >100!')
        print('good')
    else:
        raise Exception('Error, not a number')

# to make them compatible with my example they need to be modified like this
def a(*args): print('a()')
def b(*args): print('b()')
def c(*args): print('c()')

MultipleFunctions(a, b, Func1, c)(1113)
# a()
# b()
# Error, >100!

MultipleFunctions(a, b, Func1, c)('smt')
# a()
# b()
# Error, not a number

MultipleFunctions(a, b, Func1, c)(10)
# a()
# b()
# good
# c()
EDIT (from the comments): this is how Func1 should look (I have no experience with tkinter):
def Func1(event):
    x = SideA.get()
    if isinstance(x, float) or isinstance(x, int):
        if x > 100:
            SideA.focus()
            messagebox.showerror(message="This box only accepts numbers between 0 and 100")
            raise Exception('Error, >100!')
        print('good')
    else:
        SideA.focus()
        messagebox.showerror(message="This box only accepts numbers between 0 and 100")
        raise Exception('Error, not a number')

Repetitive wrapper functions for logging

I am writing a script that needs to use a class from an external library, do some
operations on instances of that class, and then repeat with some more instances.
Something like this:
import some_library

work_queue = get_items()

for item in work_queue:
    some_object = some_library.SomeClass(item)
    operation_1(some_object)
    # ...
    operation_N(some_object)
However, each of the operations in the loop body can raise some different exceptions.
When these happen I need to log them and skip to the next item. If they raise
some unexpected exception I need to log that before crashing.
I could catch all the exceptions in the main loop, but that would obscure what it does.
So I find myself writing a bunch of wrapper functions that all look kind of similar:
def wrapper_op1(some_object):
    try:
        some_object.method_1()
    except (some_library.SomeOtherError, ValueError) as error_message:
        logger.error("Op1 error on {}".format(some_object.friendly_name))
        return False
    except Exception as error_message:
        logger.error("Unknown error during op1 on {} - crashing: {}".format(some_object.friendly_name, error_message))
        raise
    else:
        return True
# Notice there is a different tuple of anticipated exceptions
# and the message formatting is different
def wrapper_opN(some_object):
    try:
        some_function(some_object.some_attr)
    except (RuntimeError, AttributeError) as error_message:
        logger.error("OpN error on {} with {}".format(some_object.friendly_name, some_object.some_attr, error_message))
        return False
    except Exception as error_message:
        logger.error("Unknown error during opN on {} with {} - crashing: {}".format(some_object.friendly_name, some_object.some_attr, error_message))
        raise
    else:
        return True
And modifying my main loop to be:
for item in work_queue:
some_object = some_library.SomeClass(item)
if not wrapper_op1(some_object):
continue
# ...
if not wrapper_opN(some_object):
continue
This does the job, but it feels like a lot of copy and paste programming with
the wrappers. What would be great is to write a decorator function that could
do all that try...except...else stuff so I could do:
@logged_call(known_exception, known_error_message, unknown_error_message)
def wrapper_op1(some_object):
    some_object.method_1()
The wrapper would return True if the operation succeeds, catch the known exceptions
and log with a specified format, and catch any unknown exceptions for logging before re-raising.
However, I can't seem to fathom how to make the error messages work - I can do it with fixed strings:
from functools import wraps

def logged_call(known_exceptions, s_err, s_fatal):
    def decorate(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            try:
                f(*args, **kwargs)
            # How to get parameters from args in the message?
            except known_exceptions as error:
                print(s_err.format(error))
                return False
            except Exception as error:
                print(s_fatal.format(error))
                raise
            else:
                return True
        return wrapper
    return decorate
However, my error messages need to get attributes that belong to the decorated function.
Is there some Pythonic way to make this work? Or a different pattern to be using
when dealing with might-fail-in-known-ways functions?
The tryblock function below can be used for this purpose, to implement your logged_call concept and reduce the total amount of code, assuming you have enough checks to justify the cost of the decorator implementation. Folks not used to functional programming in Python may actually find this harder to understand than simply writing out the try blocks as you have done. Simplicity, like so many things, is in the eye of the beholder.
Python 2.7 using no imports. This uses the exec statement to create a customized try block that works in a functional pattern.
def tryblock(tryf, *catchclauses, **otherclauses):
    u'return a general try-catch-else-finally block as a function'
    elsef = otherclauses.get('elsef', None)
    finallyf = otherclauses.get('finallyf', None)
    namespace = {'tryf': tryf, 'elsef': elsef, 'finallyf': finallyf, 'func': []}
    for pair in enumerate(catchclauses):
        namespace['e%s' % (pair[0],)] = pair[1][0]
        namespace['f%s' % (pair[0],)] = pair[1][1]
    source = []
    add = lambda indent, line: source.append(' ' * indent + line)
    add(0, 'def block(*args, **kwargs):')
    add(1, "u'generated function executing a try block'")
    add(1, 'try:')
    add(2, '%stryf(*args, **kwargs)' % ('return ' if otherclauses.get('returnbody', elsef is None) else '',))
    for index in xrange(len(catchclauses)):
        add(1, 'except e%s as ex:' % (index,))
        add(2, 'return f%s(ex, *args, **kwargs)' % (index,))
    if elsef is not None:
        add(1, 'else:')
        add(2, 'return elsef(*args, **kwargs)')
    if finallyf is not None:
        add(1, 'finally:')
        add(2, '%sfinallyf(*args, **kwargs)' % ('return ' if otherclauses.get('returnfinally', False) else '',))
    add(0, 'func.append(block)')
    exec '\n'.join(source) in namespace
    return namespace['func'][0]
This tryblock function is general enough to go into a common library, as it is not specific to the logic of your checks. Add to it your logged_call decorator, implemented as (one import here):
import functools

resultof = lambda func: func()  # @ token must be followed by an identifier

@resultof
def logged_call():
    truism = lambda *args, **kwargs: True
    def raisef(ex, *args, **kwargs):
        raise ex
    def impl(exlist, err, fatal):
        return lambda func: \
            functools.wraps(func)(tryblock(func,
                                           (exlist, lambda ex, *args, **kwargs: err(ex, *args, **kwargs) and False),
                                           (Exception, lambda ex, *args, **kwargs: fatal(ex, *args, **kwargs) and raisef(ex)),
                                           elsef=truism))
    return impl  # impl therefore becomes logged_call
Using logged_call as implemented, your two sanity checks look like:
op1check = logged_call((some_library.SomeOtherError, ValueError),
                       lambda _, obj: logger.error("Op1 error on {}".format(obj.friendly_name)),
                       lambda ex, obj: logger.error("Unknown error during op1 on {} - crashing: {}".format(obj.friendly_name, ex.message)))

opNcheck = logged_call((RuntimeError, AttributeError),
                       lambda ex, obj: logger.error("OpN error on {} with {}".format(obj.friendly_name, obj.some_attr, ex.message)),
                       lambda ex, obj: logger.error("Unknown error during opN on {} with {} - crashing: {}".format(obj.friendly_name, obj.some_attr, ex.message)))

@op1check
def wrapper_op1(obj):
    return obj.method_1()

@opNcheck
def wrapper_opN(obj):
    return some_function(obj.some_attr)
Neglecting blank lines, this is more compact than your original code by 10 lines, though at the sunk cost of the tryblock and logged_call implementations; whether it is now more readable is a matter of opinion.
You also have the option to define logged_call itself and all distinct decorators derived from it in a separate module, if that's sensible for your code; and therefore to use each derived decorator multiple times.
You may also find more of the logic structure that you can factor into logged_call by tweaking the actual checks.
But in the worst case, where each check has logic that no other does, you may find that it's more readable to just write out each one like you have already. It really depends.
For completeness, here's a unit test for the tryblock function:
import examplemodule as ex
from unittest import TestCase

class TestTryblock(TestCase):
    def test_tryblock(self):
        def tryf(a, b):
            if a % 2 == 0:
                raise ValueError
            return a + b
        def proc_ve(ex, a, b):
            self.assertIsInstance(ex, ValueError)
            if a % 3 == 0:
                raise ValueError
            return a + b + 10
        def elsef(a, b):
            return a + b + 20
        def finallyf(a, b):
            return a + b + 30
        block = ex.tryblock(tryf, (ValueError, proc_ve))
        self.assertRaises(ValueError, block, 0, 4)
        self.assertRaises(ValueError, block, 6, 4)
        self.assertEqual([5, 16, 7, 18, 9], map(lambda v: block(v, 4), xrange(1, 6)))
        block = ex.tryblock(tryf, (ValueError, proc_ve), elsef=elsef)
        self.assertEqual([25, 16, 27, 18, 29], map(lambda v: block(v, 4), xrange(1, 6)))
        block = ex.tryblock(tryf, (ValueError, proc_ve), elsef=elsef, returnbody=True)
        self.assertEqual([5, 16, 7, 18, 9], map(lambda v: block(v, 4), xrange(1, 6)))
        block = ex.tryblock(tryf, (ValueError, proc_ve), finallyf=finallyf)
        self.assertEqual([5, 16, 7, 18, 9], map(lambda v: block(v, 4), xrange(1, 6)))
        block = ex.tryblock(tryf, (ValueError, proc_ve), finallyf=finallyf, returnfinally=True)
        self.assertEqual([35, 36, 37, 38, 39], map(lambda v: block(v, 4), xrange(1, 6)))

Multiple try codes in one block

I have a problem with my code in the try block.
To make it easy this is my code:
try:
    code a
    code b  # if b fails, it should ignore, and go to c.
    code c  # if c fails, go to d
    code d
except:
    pass
Is something like this possible?
You'll have to make these separate try blocks:
try:
    code a
except ExplicitException:
    pass

try:
    code b
except ExplicitException:
    try:
        code c
    except ExplicitException:
        try:
            code d
        except ExplicitException:
            pass
This assumes you want to run code c only if code b failed.
If you need to run code c regardless, you need to put the try blocks one after the other:
try:
    code a
except ExplicitException:
    pass

try:
    code b
except ExplicitException:
    pass

try:
    code c
except ExplicitException:
    pass

try:
    code d
except ExplicitException:
    pass
I'm using except ExplicitException here because it is never a good practice to blindly ignore all exceptions. You'll be ignoring MemoryError, KeyboardInterrupt and SystemExit as well otherwise, which you normally do not want to ignore or intercept without some kind of re-raise or conscious reason for handling those.
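To illustrate that last point: in Python 3, KeyboardInterrupt and SystemExit derive from BaseException rather than Exception, so a bare except swallows Ctrl+C while except Exception lets it through (a small sketch, not part of the original answer):

import time

try:
    time.sleep(10)      # press Ctrl+C while this sleeps
except:                 # bare except catches BaseException, including KeyboardInterrupt
    print("Ctrl+C was silently swallowed")

try:
    time.sleep(10)      # press Ctrl+C while this sleeps
except Exception:       # KeyboardInterrupt is not an Exception, so it propagates
    print("this branch never runs for Ctrl+C")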
You can use the fuckit module.
Wrap your code in a function with the @fuckit decorator:
import fuckit

@fuckit
def func():
    code a
    code b  # if b fails, it should ignore, and go to c.
    code c  # if c fails, go to d
    code d
Extract (refactor) your statements. And use the magic of and and or to decide when to short-circuit.
def a():
    try: pass  # a code
    except: pass  # or raise
    else: return True

def b():
    try: pass  # b code
    except: pass  # or raise
    else: return True

def c():
    try: pass  # c code
    except: pass  # or raise
    else: return True

def d():
    try: pass  # d code
    except: pass  # or raise
    else: return True

def main():
    try:
        a() and b() or c() or d()
    except:
        pass
If you don't want to chain (a huge number of) try-except clauses, you may try your codes in a loop and break upon 1st success.
Example with codes which can be put into functions:
for code in (
    lambda: a / b,
    lambda: a / (b + 1),
    lambda: a / (b + 2),
):
    try:
        print(code())
    except Exception as exc:
        ev = exc  # keep a reference: the name bound by 'as' is cleared after the except block
        continue
    break
else:
    print("it failed: %s" % ev)
Example with arbitrary code (statements) directly in the current scope:
for i in 2, 1, 0:
    try:
        if i == 2: print(a / b)
        elif i == 1: print(a / (b + 1))
        elif i == 0: print(a / (b + 2))
        break
    except Exception as ev:
        if i:
            continue
        print("it failed: %s" % ev)
Let's say each code block is a function that's already written; then the following can be used to iterate through your list of functions and exit the for loop with break when a function executes without error.
def a(): code a
def b(): code b
def c(): code c
def d(): code d

for func in [a, b, c, d]:  # change list order to change execution order.
    try:
        func()
        break
    except Exception as err:
        print(err)
        continue
I used "Exception " here so you can see any error printed. Turn-off the print if you know what to expect and you're not caring (e.g. in case the code returns two or three list items (i,j = msg.split('.')).
You could try a for loop
for func, args, kwargs in zip([a, b, c, d],
                              [args_a, args_b, args_c, args_d],
                              [kw_a, kw_b, kw_c, kw_d]):
    try:
        func(*args, **kwargs)
        break
    except:
        pass
This way you can loop over as many functions as you want without making the code look ugly.
I use a different way, with a new variable:
continue_execution = True
try:
    command1
    continue_execution = False
except:
    pass

if continue_execution:
    try:
        command2
    except:
        command3
to add more commands you just have to add more expressions like this:
try:
    commandn
    continue_execution = False
except:
    pass
I ran into this problem, but in my case it was happening inside a loop, which turned it into a simple case of issuing a continue when a step succeeded. I think one could reuse that technique outside a loop too, at least in some cases:
while True:
    try:
        code_a
        break
    except:
        pass

    try:
        code_b
        break
    except:
        pass

    # etc.

    raise NothingSuccessfulError
Like Elazar suggested:
"I think a decorator would fit here."
# decorator
def test(func):
    def inner(*args, **kwargs):
        try:
            func(*args, **kwargs)
        except: pass
    return inner

# code blocks as functions
@test
def code_a(x):
    print(1/x)

@test
def code_b(x):
    print(1/x)

@test
def code_c(x):
    print(1/x)

@test
def code_d(x):
    print(1/x)

# call functions
code_a(0)
code_b(1)
code_c(0)
code_c(4)
output:
1.0
0.25

General decorator to wrap try except in python?

I'm interacting with a lot of deeply nested JSON I didn't write, and would like to make my Python script more 'forgiving' of invalid input. I find myself writing involved try-except blocks, and would rather just wrap the dubious function up.
I understand it's a bad policy to swallow exceptions, but I'd rather they be printed and analysed later than actually stop execution. In my use case it's more valuable to continue executing over the loop than to get all the keys.
Here's what I'm doing now:
try:
    item['a'] = myobject.get('key').METHOD_THAT_DOESNT_EXIST()
except:
    item['a'] = ''

try:
    item['b'] = OBJECT_THAT_DOESNT_EXIST.get('key2')
except:
    item['b'] = ''

try:
    item['c'] = func1(ARGUMENT_THAT_DOESNT_EXIST)
except:
    item['c'] = ''

...

try:
    item['z'] = FUNCTION_THAT_DOESNT_EXIST(myobject.method())
except:
    item['z'] = ''
Here's what I'd like, (1):
item['a'] = f(myobject.get('key').get('subkey'))
item['b'] = f(myobject.get('key2'))
item['c'] = f(func1(myobject))
...
or (2):
@f
def get_stuff():
    item = {}
    item['a'] = myobject.get('key').get('subkey')
    item['b'] = myobject.get('key2')
    item['c'] = func1(myobject)
    ...
    return item
...where I can wrap either the single data item (1), or a master function (2), in some function that turns execution-halting exceptions into empty fields, printed to stdout. The former would be sort of an item-wise skip - where that key isn't available, it logs blank and moves on - the latter is a row-skip, where if any of the fields don't work, the entire record is skipped.
My understanding is that some kind of wrapper should be able to fix this. Here's what I tried, with a wrapper:
def f(func):
    def silenceit():
        try:
            func(*args, **kwargs)
        except:
            print('Error')
    return(silenceit)
Here's why it doesn't work. Call a function that doesn't exist, it doesn't try-catch it away:
>>> f(meow())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'meow' is not defined
Before I even add a blank return value, I'd like to get it to try-catch correctly. If the function had worked, this would have printed "Error", right?
Is a wrapper function the correct approach here?
UPDATE
I've had a lot of really useful, helpful answers below, and thank you for them---but I've edited the examples I used above to illustrate that I'm trying to catch more than nested key errors, that I'm looking specifically for a function that wraps a try-catch for...
When a method doesn't exist.
When an object doesn't exist, and is getting a method called on it.
When an object that does not exist is being called as an argument to a function.
Any combination of any of these things.
Bonus, when a function doesn't exist.
There are lots of good answers here, but I didn't see any that address the question of whether you can accomplish this via decorators.
The short answer is "no," at least not without structural changes to your code. Decorators operate at the function level, not on individual statements. Therefore, in order to use decorators, you would need to move each of the statements to be decorated into its own function.
But note that you can't just put the assignment itself inside the decorated function. You need to return the rhs expression (the value to be assigned) from the decorated function, then do the assignment outside.
To put this in terms of your example code, one might write code with the following pattern:
@return_on_failure('')
def computeA():
    return myobject.get('key').METHOD_THAT_DOESNT_EXIST()

item["a"] = computeA()
return_on_failure could be something like:
def return_on_failure(value):
    def decorate(f):
        def applicator(*args, **kwargs):
            try:
                return f(*args, **kwargs)
            except:
                print('Error')
                return value
        return applicator
    return decorate
You could use a defaultdict and the context manager approach as outlined in Raymond Hettinger's PyCon 2013 presentation
from collections import defaultdict
from contextlib import contextmanager

@contextmanager
def ignored(*exceptions):
    try:
        yield
    except exceptions:
        pass

item = defaultdict(str)
obj = dict()

with ignored(Exception):
    item['a'] = obj.get(2).get(3)
print item['a']

obj[2] = dict()
obj[2][3] = 4

with ignored(Exception):
    item['a'] = obj.get(2).get(3)
print item['a']
It's very easy to achieve using a configurable decorator.
def get_decorator(errors=(Exception, ), default_value=''):
    def decorator(func):
        def new_func(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except errors, e:
                print "Got error! ", repr(e)
                return default_value
        return new_func
    return decorator

f = get_decorator((KeyError, NameError), default_value='default')
a = {}

@f
def example1(a):
    return a['b']

@f
def example2(a):
    return doesnt_exist()

print example1(a)
print example2(a)
Just pass get_decorator a tuple of the error types you want to silence and the default value to return.
Output will be
Got error! KeyError('b',)
default
Got error! NameError("global name 'doesnt_exist' is not defined",)
default
Edit: thanks to martineau, I changed the default value of errors to a tuple containing the basic Exception, to prevent errors.
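For readers on Python 3, a direct translation of the same decorator (this version is mine, not from the original answer) would be:

def get_decorator(errors=(Exception,), default_value=''):
    def decorator(func):
        def new_func(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except errors as e:
                # report the silenced error and fall back to the default
                print("Got error!", repr(e))
                return default_value
        return new_func
    return decorator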
It depends on what exceptions you expect.
If your only use case is get(), you could do
item['b'] = myobject.get('key2', '')
For the other cases, your decorator approach might be useful, but not in the way you do it.
I'll try to show you:
def f(func):
    def silenceit(*args, **kwargs):  # takes all kinds of arguments
        try:
            return func(*args, **kwargs)  # returns func's result
        except Exception, e:
            print('Error:', e)
            return e  # not the best way, maybe we'd better return None
                      # or a wrapper object containing e.
    return silenceit  # on the correct level
Nevertheless, f(some_undefined_function()) won't work, because
a) f() isn't yet active at the execution time and
b) it is used wrong. The right way would be to wrap the function and then call it: f(function_to_wrap)().
A "layer of lambda" would help here:
wrapped_f = f(lambda: my_function())
wraps a lambda function which in turn calls a non-existing function. Calling wrapped_f() leads to calling the wrapper which calls the lambda which tries to call my_function(). If this doesn't exist, the lambda raises an exception which is caught by the wrapper.
This works because the name my_function is not executed at the time the lambda is defined, but when it is executed. And this execution is protected and wrapped by the function f() then. So the exception occurs inside the lambda and is propagated to the wrapping function provided by the decorator, which handles it gracefully.
This move towards inside the lambda function doesn't work if you try to replace the lambda function with a wrapper like
g = lambda function: lambda *a, **k: function(*a, **k)
followed by a
f(g(my_function))(arguments)
because here the name resolution is "back at the surface": my_function cannot be resolved and this happens before g() or even f() are called. So it doesn't work.
And if you try to do something like
g(print)(x.get('fail'))
it cannot work as well if you have no x, because g() protects print, not x.
If you want to protect x here, you'll have to do
value = f(lambda: x.get('fail'))
because the wrapper provided by f() calls that lambda function which raises an exception which is then silenced.
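To make the whole pattern concrete, here is a small self-contained sketch of the idea (Python 3 syntax; my_function is deliberately left undefined):

def f(func):
    def silenceit(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            print('Error:', e)
    return silenceit

# The name my_function is only looked up when the lambda body runs,
# which happens inside silenceit's try block, so the NameError is caught.
wrapped_f = f(lambda: my_function())
wrapped_f()   # prints: Error: name 'my_function' is not defined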
Extending @iruvar's answer - starting with Python 3.4 there is an existing context manager for this in the Python standard lib: https://docs.python.org/3/library/contextlib.html#contextlib.suppress
import os
from contextlib import suppress

with suppress(FileNotFoundError):
    os.remove('somefile.tmp')

with suppress(FileNotFoundError):
    os.remove('someotherfile.tmp')
In your case you first evaluate the value of the meow call (which doesn't exist) and then wrap it in the decorator. It doesn't work that way.
First, the exception is raised before anything is wrapped; second, the wrapper is wrongly indented (silenceit should not return itself). You might want to do something like:
def hardfail():
    return meow()  # meow doesn't exist

def f(func):
    def wrapper():
        try:
            func()
        except:
            print 'error'
    return wrapper

softfail = f(hardfail)
output:
>>> softfail()
error
>>> hardfail()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 2, in hardfail
NameError: global name 'meow' is not defined
Anyway, in your case I don't understand why you don't use a simple method such as
def get_subkey(obj, key, subkey):
    try:
        return obj.get(key).get(subkey, '')
    except AttributeError:
        return ''
and in the code:
item['a'] = get_subkey(myobject, 'key', 'subkey')
Edited:
In case you want something that will work at any depth, you can do something like:
def get_from_object(obj, *keys):
    try:
        value = obj
        for k in keys:
            value = value.get(k)
        return value
    except AttributeError:
        return ''
That you'd call:
>>> d = {1:{2:{3:{4:5}}}}
>>> get_from_object(d, 1, 2, 3, 4)
5
>>> get_from_object(d, 1, 2, 7)
''
>>> get_from_object(d, 1, 2, 3, 4, 5, 6, 7)
''
>>> get_from_object(d, 1, 2, 3)
{4: 5}
And using your code
item['a'] = get_from_object(obj, 2, 3)
By the way, from a personal point of view I also like @cravoori's solution using contextmanager. But this would mean having three lines of code each time:
item['a'] = ''
with ignored(AttributeError):
    item['a'] = obj.get(2).get(3)
Why not just use a loop?
for dst_key, src_key in (('a', 'key'), ('b', 'key2')):
    try:
        item[dst_key] = myobject.get(src_key).get('subkey')
    except Exception:  # or KeyError?
        item[dst_key] = ''
Or, if you wish, write a little helper:
def get_value(obj, key):
    try:
        return obj.get(key).get('subkey')
    except Exception:
        return ''
You can also combine both solutions if you have more than a few places where you need to get a value; in that case the helper function is the more reasonable choice.
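For example, a sketch combining the loop and the helper from above (reusing the same names):

def get_value(obj, key):
    try:
        return obj.get(key).get('subkey')
    except Exception:
        return ''

# one line per field, no repeated try/except blocks
for dst_key, src_key in (('a', 'key'), ('b', 'key2')):
    item[dst_key] = get_value(myobject, src_key)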
Not sure that you actually need a decorator for your problem.
Since you're dealing with lots of broken code, it may be excusable to use eval in this case.
def my_eval(code):
    try:
        return eval(code)
    except:  # Can catch more specific exceptions here.
        return ''
Then wrap all your potentially broken statements:
item['a'] = my_eval("""myobject.get('key').get('subkey')""")
item['b'] = my_eval("""myobject.get('key2')""")
item['c'] = my_eval("""func1(myobject)""")
How about something like this:
def exception_handler(func):
    def inner_function(*args, **kwargs):
        try:
            func(*args, **kwargs)
        except TypeError:
            print(f"{func.__name__} error")
    return inner_function
then
@exception_handler
def doSomethingExceptional():
    a = 2/0
All credit goes to: https://medium.com/swlh/handling-exceptions-in-python-a-cleaner-way-using-decorators-fae22aa0abec
Try/except decorator for sync and async functions.
Note: logger.error can be replaced with print.
The latest version can be found here.
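Since the linked version is not included here, a minimal sketch of such a decorator might look like this (assuming a standard logging logger; this is an illustration, not the linked code):

import asyncio
import functools
import logging

logger = logging.getLogger(__name__)

def catch_errors(func):
    # Works for both plain and async functions: log the error and swallow it.
    if asyncio.iscoroutinefunction(func):
        @functools.wraps(func)
        async def async_wrapper(*args, **kwargs):
            try:
                return await func(*args, **kwargs)
            except Exception as e:
                logger.error("%s failed: %r", func.__name__, e)
        return async_wrapper

    @functools.wraps(func)
    def sync_wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            logger.error("%s failed: %r", func.__name__, e)
    return sync_wrapper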

Generic Exception Handling in Python the "Right Way"

Sometimes I find myself in the situation where I want to execute several sequential commands like this:
try:
    foo(a, b)
except Exception, e:
    baz(e)

try:
    bar(c, d)
except Exception, e:
    baz(e)
...
This same pattern occurs when exceptions simply need to be ignored.
This feels redundant and the excessive syntax causes it to be surprisingly difficult to follow when reading code.
In C, I would have solved this type of problem easily with a macro, but unfortunately, this cannot be done in straight python.
Question: How can I best reduce the code footprint and increase code readability when coming across this pattern?
You could use the with statement if you have python 2.5 or above:
from __future__ import with_statement
import contextlib

@contextlib.contextmanager
def handler():
    try:
        yield
    except Exception, e:
        baz(e)
Your example now becomes:
with handler():
    foo(a, b)

with handler():
    bar(c, d)
If this is always, always the behaviour you want when a particular function raises an exception, you could use a decorator:
def handle_exception(handler):
    def decorate(func):
        def call_function(*args, **kwargs):
            try:
                func(*args, **kwargs)
            except Exception, e:
                handler(e)
        return call_function
    return decorate

def baz(e):
    print(e)

@handle_exception(baz)
def foo(a, b):
    return a + b

@handle_exception(baz)
def bar(c, d):
    return c.index(d)
Usage:
>>> foo(1, '2')
unsupported operand type(s) for +: 'int' and 'str'
>>> bar('steve', 'cheese')
substring not found
If they're simple one-line commands, you can wrap them in lambdas:
for cmd in [
    (lambda: foo(a, b)),
    (lambda: bar(c, d)),
]:
    try:
        cmd()
    except StandardError, e:
        baz(e)
You could wrap that whole thing up in a function, so it looked like this:
ignore_errors(baz, [
    (lambda: foo(a, b)),
    (lambda: bar(c, d)),
])
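A sketch of what that ignore_errors helper could be (the name and the Python 2 style match the snippet above; this is not part of the original answer):

def ignore_errors(handler, commands):
    # Run each zero-argument callable; pass any StandardError to the handler
    # instead of letting it stop the remaining commands.
    for cmd in commands:
        try:
            cmd()
        except StandardError, e:
            handler(e)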
The best approach I have found is to define a function like this:
def handle_exception(function, reaction, *args, **kwargs):
    try:
        result = function(*args, **kwargs)
    except Exception, e:
        result = reaction(e)
    return result
But that just doesn't feel or look right in practice:
handle_exception(foo, baz, a, b)
handle_exception(bar, baz, c, d)
You could try something like this. This is vaguely C macro-like.
class TryOrBaz(object):
    def __init__(self, that):
        self.that = that
    def __call__(self, *args):
        try:
            return self.that(*args)
        except Exception, e:
            baz(e)

TryOrBaz(foo)(a, b)
TryOrBaz(bar)(c, d)
In your specific case, you can do this:
try:
    foo(a, b)
    bar(c, d)
except Exception, e:
    baz(e)
Or, you can catch the exception one step above:
try:
    foo_bar()  # This function can throw at several places
except Exception, e:
    baz(e)
