Exception handling of a function in Python

Suppose I have a function definition:
def test():
    print 'hi'
I get a TypeError whenever I give it an argument.
Now, I want to put the def statement in a try. How do I do this?

try:
    test()
except TypeError:
    print "error"

In [1]: def test():
   ...:     print 'hi'
   ...:
In [2]: try:
   ...:     test(1)
   ...: except:
   ...:     print 'exception'
   ...:
exception
Here is the relevant section in the tutorial
By the way, to fix this error, you should not wrap the function call in a try/except. Instead, call it with the right number of arguments!

You said
Now, I want to put the def statement
in try. How to do this.
The def statement is correct; it is not raising any exceptions, so putting it in a try won't do anything.
What raises the exception is the actual call to the function. So that should be put in the try instead:
try:
    test()
except TypeError:
    print "error"

If you want to throw the error at call-time, which it sounds like you might want, you could try this approach:
def test(*args):
    if args:
        raise TypeError('test() takes no arguments')
    print 'hi'
This shifts the error from the calling location into the function. It accepts any number of positional arguments via the *args tuple. Not that I know why you'd want to do that.

A better way to handle a variable number of arguments in Python is as follows:
def foo(*args, **kwargs):
    # args will hold the positional arguments
    print args
    # kwargs will hold the named arguments
    print kwargs
# Now, all of these work
foo(1)
foo(1,2)
foo(1,2,third=3)
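A minimal Python 3 sketch of the same idea (with print as a function), showing where each kind of argument lands:

```python
# Positional arguments collect into a tuple, keyword
# arguments into a dict (Python 3 syntax).
def foo(*args, **kwargs):
    print(args)    # the positional arguments, as a tuple
    print(kwargs)  # the named arguments, as a dict
    return args, kwargs

foo(1, 2, third=3)  # prints (1, 2) then {'third': 3}
```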

This is valid:
try:
    def test():
        print 'hi'
except:
    print 'error'
test()
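For completeness, a def statement itself can actually raise, e.g. when a default-argument expression fails at definition time, so a try around it is not always a no-op. A Python 3 sketch:

```python
# Default-argument expressions are evaluated when the def
# statement executes, so this try really catches something.
try:
    def test(x=1 / 0):  # fails at definition time
        print('hi')
except ZeroDivisionError:
    print('error while defining test')
```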

Related

Run function when error occurs without try

I am trying to run a function whenever an error occurs. I don't want to use try or except because my code is very large and there are so many chances for an error to occur that I can't use try...except everywhere. This is what I am expecting:
>>> if ValueError: # doesn't work, just assuming
...     print("Hey! You are done.")
>>> int("abc")
Hey! You are done.
>>> int("1b")
Hey! You are done.
>>>
Is there any way to do this?
If your code is one big block, I recommend splitting it into functions. From there, you can wrap each function in a decorator that takes an error type and a function to run on error:
def handle_error(error, error_func):
    def decorator(func):
        def wrapper(*args, **kwargs):
            r = None
            try:
                r = func(*args, **kwargs)
            except error:
                r = error_func()
            finally:
                return r
        return wrapper
    return decorator
Then use it on functions like so:
def bad_value():
    print('bad value given!')

@handle_error(ValueError, bad_value)
def multiply(a, b):
    return a * b
Of course, you can be more 'broad' and catch all exceptions...
@handle_error(Exception, error_func)
def func():
    # ...
A ValueError is triggered by a particular piece of code and is raised immediately at the offending statement. int("abc") will raise a ValueError on its own, and program execution will stop before it ever reaches any if ValueError statement.
You need a try/except block to allow Python to catch the error and continue executing. I can't see any way to achieve what you want without one.
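A runnable Python 3 sketch of the decorator approach above (parse and the lambda fallback are hypothetical names, not from the question):

```python
# On the named error, the wrapper returns the fallback
# function's result instead of letting the exception escape.
def handle_error(error, error_func):
    def decorator(func):
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except error:
                return error_func()
        return wrapper
    return decorator

@handle_error(ValueError, lambda: 'bad value given!')
def parse(s):
    return int(s)

print(parse('42'))   # 42
print(parse('abc'))  # bad value given!
```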

Mocking method in production code, not in tests

I have a method in my code like this:
def save(items):
    try:
        for item in items:
            do_something(item)
    except Exception:
        my_logging.warning("error happened")

def do_something(item):
    pass
I would like to invoke this method from another location in the code; however, I would like to call a different method instead of do_something(item):
@transaction.atomic
def do_with_transaction(item):
    delete(item)
    do_something(item)
Is it OK to use mock with side_effect in production code? That way I could mock do_something() to use do_with_transaction(item).
It looks to me like a clean solution.
If what you want is to reuse save(items) but call another function (instead of do_something()) in the for loop, just pass the desired function as an argument:
def save(items, callback=do_something):
    for item in items:
        try:
            callback(item)
        except Exception as e:
            my_logging.exception("error %s happened on item %s", e, item)
and then:
save(items, do_something_with_transaction)
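A self-contained Python 3 sketch of the callback idea (the two item handlers here are hypothetical stand-ins):

```python
import logging

def do_something(item):
    print('plain:', item)

def do_with_transaction(item):
    print('transactional:', item)

# The loop body is fixed; the per-item behavior is injected.
def save(items, callback=do_something):
    for item in items:
        try:
            callback(item)
        except Exception:
            logging.exception('error on item %s', item)

save([1, 2])                       # uses the default handler
save([1, 2], do_with_transaction)  # swaps in the alternative
```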

Detect whether an exception was already handled in nested with statements in Python 2.7

Consider the following piece of code:
class Test(object):
    def __enter__(self):
        pass
    def __exit__(self, type, value, trace):
        if type:
            print "Error occurred: " + str(value.args)
            # if I changed the next line to 'return True', the
            # 'print "should not happen"' statements would be executed, but the
            # error information would only be printed once (what I want)
            return False
        return True

with Test():
    with Test():
        with Test():
            raise Exception('Foo', 'Bar')
        print "should not happen"
    print "should not happen"
Output of example:
Error occurred: ('Foo', 'Bar')
Error occurred: ('Foo', 'Bar')
Error occurred: ('Foo', 'Bar')
I have several nested with statements and want to handle the case where an exception is raised somewhere in the code. What I want to achieve is that execution stops (no "should not happen" output in the example above), but the error information is printed only once. I therefore need to somehow know whether the same error was already handled.
Do you have any idea how this could be achieved?
You can't really do this cleanly: either the context manager swallows the exception or it doesn't.
If you're OK with the exception propagating out of the manager (which you should be, if you're handling arbitrary exceptions here), you can monkey-patch the exception instance:
def __exit__(self, type, value, trace):
    if type:
        if not getattr(value, '_printed_it_already', False):
            print "error occurred: " + str(value.args)
            value._printed_it_already = True
        return False
    return True
Note that this sort of monkey-patching is frowned upon in Python... I think it's worth asking what you're actually trying to do. Usually when an unhandled exception prints its stack trace, you get a pretty good idea of what the exception's args were to begin with...
You can add an error_handled attribute to the exception and test for it:
class Test(object):
    def __enter__(self):
        pass
    def __exit__(self, type, value, trace):
        if type:
            if not getattr(value, 'error_handled', False):
                value.error_handled = True
                print "Error occurred: " + str(value.args)

with Test():
    with Test():
        with Test():
            raise Exception('Foo', 'Bar')
        print "should not happen"
    print "should not happen"
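A Python 3 sketch of the same flag-attribute technique: every __exit__ lets the exception propagate (so execution stops), but only the first manager to see it prints it.

```python
class Test:
    def __enter__(self):
        return self

    def __exit__(self, exc_type, value, trace):
        # Only the first __exit__ to see the exception prints it;
        # the flag on the instance marks it as already handled.
        if exc_type and not getattr(value, 'error_handled', False):
            value.error_handled = True
            print('Error occurred:', value.args)
        return False  # keep propagating, so outer blocks stop too

try:
    with Test(), Test(), Test():
        raise Exception('Foo', 'Bar')
except Exception:
    pass  # printed exactly once, by the innermost __exit__
```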

A cleaner way to approach try/except in Python

So, let's say I have three different calls, called something, something1 and something2,
and right now I'm calling them like this:
try:
    something
    something1
    something2
except KeyError as e:
    print e
Note that in the above code, if something fails, something1 and something2 will not get executed, and so on.
The wanted outcome is
try:
    something
except KeyError as e:
    print e
try:
    something1
except KeyError as e:
    print e
try:
    something2
except KeyError as e:
    print e
How can I achieve the above outcome without so many try/except blocks?
EDIT:
So, the answer I chose as correct worked, but some of the others worked as well. I chose it because it was the simplest, and I modified it a little.
Here is my solution based on the answer.
runs = [something, something1, something2]
for func in runs:
    try:
        func()
    except KeyError as e:
        print e
You could try this, assuming you wrap things in functions:
for func in (something, something1, something2):
    try:
        func()
    except KeyError as e:
        print e
Here's a little context manager I've used for similar situations:
from contextlib import contextmanager

@contextmanager
def ignoring(*exceptions):
    try:
        yield
    except exceptions or Exception as e:
        print e

with ignoring(KeyError):
    something()
# you can also put it on the same line if it's just one statement
with ignoring(KeyError): something1()
with ignoring(KeyError): something2()
A Python 3 version could let you parameterize what to do when an exception occurs (the keyword-only argument is needed here):
from contextlib import contextmanager

@contextmanager
def ignoring(*exceptions, action=print):
    try:
        yield
    except exceptions or Exception as e:
        callable(action) and action(e)
Then you could pass in some function other than print (such as a logger, assumed to be a function named log) or if you don't want anything, pass in None (since it checks to see if the action is callable):
with ignoring(KeyError, action=log): something()
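Worth noting: since Python 3.4 the standard library ships contextlib.suppress, which is the silent version of this pattern out of the box:

```python
from contextlib import suppress

# Any KeyError raised inside the block is swallowed,
# and execution continues after the with statement.
d = {}
with suppress(KeyError):
    d['missing']
print('still running')
```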
I would go with something like this:
def safe_do(*statements):
    for statement, args, kwargs in statements:
        try:
            statement(*args, **kwargs)
        except KeyError as e:
            print e

# usage:
safe_do(
    (something1, [], {}),
    (something2, [], {}),
)
But if you are expecting only one element to be missing in statements, then why don't you just use an if?
if some_key1 in some_dict1:
    something1
if some_key2 in some_dict2:
    something2
Much more readable, and without any magic.
Another possibility:
def mydec(func):
    def dec():
        try:
            func()
        except KeyError as e:
            print(e)
    return dec

@mydec
def f1():
    print('a')

@mydec
def f2():
    print('b')
    raise KeyError('Test')

f1()
f2()
This greatly depends on whether or not you're doing similar tasks, or very different tasks. For example, if your something lines are all very similar you could do the following:
def something(my_dict):
    try:
        d = my_dict['important_key']  # Just an example, since we
        return d                      # don't know what you're really doing
    except KeyError as e:
        print e
something(dict1)
something(dict2)
something(dict3)
However, if your tasks are wildly different, this approach may not be applicable. To a certain degree you're asking "How do I write efficient code", and the answer to that depends on what code you're writing.
In Python 3, if you want to pass in a function together with its args and kwargs, you can use the code below:
def safe_do(**statement):
    try:
        statement['func'](*statement['args'], **statement['kwargs'])
    except Exception as e:
        print(e)
        print(statement['func'])
        print(statement['args'])
        print(statement['kwargs'])

def divide(a, b):
    print(a / b)

safe_do(func=divide, args=[1, 0], kwargs={})
I presented this in my Colab notebook.

Determining the number of return values in a Python function

I am creating a decorator that catches a raised error in its target function, and allows the user to continue executing the script (bypassing the function) or to drop out of the script.
def catch_error(func):
    """
    This decorator is used to make sure that if a decorated function breaks
    in the execution of a script, the script doesn't automatically crash.
    Instead, it gives you the choice to continue or gracefully exit.
    """
    def caught(*args):
        try:
            return func(*args)
        except Exception as err:
            question = '\n{0} failed. Continue? (yes/no): '.format(func.func_name)
            answer = raw_input(question)
            if answer.lower() in ['yes', 'y']:
                pass
            else:
                print " Aborting! Error that caused failure:\n"
                raise err
            return None
    return caught
Notice that, if the user chooses to bypass the error-raising function and continue executing the script, the decorator returns None. This works well for functions that return a single value, but it crashes when the caller tries to unpack multiple return values. For instance,
# Both function and decorator return single value, so works fine
one_val = decorator_works_for_this_func()
# Function nominally returns two values, but decorator only one, so this breaks script
one_val, two_val = decorator_doesnt_work_for_this_func()
Is there a way that I can determine the number of values my target function is supposed to return? For instance, something like:
def better_catch_error(func):
    def caught(*args):
        try:
            return func(*args)
        except Exception as err:
            ...
            num_rvals = determine_num_rvals(func)
            if num_rvals > 1:
                return [None for count in range(num_rvals)]
            else:
                return None
    return caught
As always, if there is a better way to do this sort of thing, please let me know. Thanks!
UPDATE:
Thanks for all the suggestions. I decided to narrow the scope of catch_error to a single class of functions, which only return one string value. I just split all the functions returning more than one value into separate functions that return a single value to make them compatible. I had been hoping to make catch_error more generic (and there were several helpful suggestions on how to do that), but for my application it was a little overkill. Thanks again.
Martijn Pieters' answer is correct; this is a specific case of the Halting Problem.
However, you might get around it by passing an error return value to the decorator. Something like this:
def catch_error(err_val):
    def wrapper(func):
        def caught(*args):
            try:
                return func(*args)
            except Exception as err:
                question = '\n{0} failed. Continue? (yes/no): '.format(func.func_name)
                answer = raw_input(question)
                if answer.lower() in ['yes', 'y']:
                    pass
                else:
                    print " Aborting! Error that caused failure:\n"
                    raise err
                return err_val
        return caught
    return wrapper
Then you could decorate using:
@catch_error({})
def returns_a_dict(*args, **kwargs):
    return {'a': 'foo', 'b': 'bar'}
Also as a point of style, if you are grabbing *args in your wrapped function, you should probably also grab **kwargs so that you can properly wrap functions that take keyword arguments. Otherwise your wrapped function will fail if you call wrapped_function(something=value)
Finally, as another point of style, it is confusing to see code that does if a: pass with an else. Try using if not a in these cases. So the final code:
def catch_error(err_val):
    def wrapper(func):
        def caught(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except Exception as err:
                question = '\n{0} failed. Continue? (yes/no): '.format(func.func_name)
                answer = raw_input(question)
                if answer.lower() not in ['yes', 'y']:
                    print " Aborting! Error that caused failure:\n"
                    raise err
                return err_val
        return caught
    return wrapper
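For reference, a non-interactive Python 3 adaptation of that final pattern (the returns_a_dict example is hypothetical; on failure the wrapper simply returns the supplied fallback instead of prompting):

```python
import functools

def catch_error(err_val):
    def wrapper(func):
        @functools.wraps(func)
        def caught(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except Exception:
                return err_val  # fallback value instead of prompting
        return caught
    return wrapper

@catch_error({})
def returns_a_dict(fail=False):
    if fail:
        raise ValueError('boom')
    return {'a': 'foo', 'b': 'bar'}

print(returns_a_dict())           # {'a': 'foo', 'b': 'bar'}
print(returns_a_dict(fail=True))  # {}
```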
No, there is no way you can determine that.
Python is a dynamic language, and a given function can return an arbitrary sequence of any size or no sequence at all, based on the inputs and other factors.
For example:
import random

def foo():
    if random.random() < 0.5:
        return ('spam', 'eggs')
    return None
This function will return a tuple half of the time, but None the other half, so Python has no way of telling you what foo() will return.
There are many more ways your decorator can fail, btw, not just when the function returns a sequence that the caller then unpacks. What if the caller expected a dictionary and starts trying to access keys, or a list and wants to know the length?
Your decorator can't predict what your function is going to return, but nothing prevents you from telling the decorator what return signature to simulate:
@catch_error([None, None])
def tuple_returner(n):
    raise Exception
    return [2, 3]
Instead of returning None, your decorator will return its argument ([None, None]).
Writing an argument-taking decorator is just slightly tricky: The expression catch_error([None, None]) will be evaluated, and must return the actual decorator that will be applied to the decorated function. It looks like this:
def catch_error(signature=None):
    def _decorator(func):
        def caught(*args):
            try:
                return func(*args)
            except Exception as err:
                # Interactive code suppressed
                return signature
        return caught
    return _decorator
Note that even if you just want it to return None, you still need to call it once:
@catch_error()
def some_function(x):
    ...
