I am trying to run a function whenever an error occurs. I don't want to use try or except because my code is very large and there are so many places an error could occur that I can't put try...except everywhere. This is what I am expecting:
>>> if ValueError:  # doesn't work, just to show what I mean
...     print("Hey! You are done.")
>>> int("abc")
Hey! You are done.
>>> int("1b")
Hey! You are done.
>>>
Is there any way to do this?
If your code is a whole block, I recommend splitting it into functions. From here, you can wrap each function in a decorator that takes an error and a function to be run on error:
def handle_error(error, error_func):
    def decorator(func):
        def wrapper(*args, **kwargs):
            r = None
            try:
                r = func(*args, **kwargs)
            except error as e:
                r = error_func()
            finally:
                return r
        return wrapper
    return decorator
Then use it on functions like so:
def bad_value():
    print('bad value given!')

@handle_error(ValueError, bad_value)
def multiply(a, b):
    return a * b
Of course, you can be more 'broad', and catch all exceptions...
@handle_error(Exception, error_func)
def func():
    # ...
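To make the behaviour concrete, here is a small usage sketch; parse_int is a made-up example name, not something from the answer above:

def bad_value():
    print('bad value given!')   # runs whenever a ValueError is caught

@handle_error(ValueError, bad_value)
def parse_int(s):
    return int(s)               # int('abc') raises ValueError

parse_int('42')    # returns 42
parse_int('abc')   # prints 'bad value given!' and returns None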
A ValueError is triggered by a particular piece of code and is raised immediately at the offending statement. int("abc") will raise a ValueError on its own, and program execution will stop before it ever reaches an if ValueError statement.
You need a try/except block to allow Python to catch the error and continue executing. I can't see any way to achieve what you want without one.
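For reference, a minimal sketch of that try/except for the example in the question; notify is just a stand-in name for whatever should run on error:

def notify():
    print("Hey! You are done.")

try:
    int("abc")          # raises ValueError
except ValueError:
    notify()            # run the handler, then execution continues normally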
Hi, I'm currently writing a program like this.
class MyError(Exception):
    def __init__(self, text="Correct"):
        self.text = text

    def __str__(self):
        return self.text

class Atom(object):
    .
    .
    .
    try:
        function()
    else:
        raise MyError("Incorrect use of function")
def main():
    try:
        a = Atom()
    except:
        # Here i want to print the error that was raised
What I think I understand is that the error is raised in an object created in Atom().
But I want to send it up to my main program and print the MyError message there.
Is it possible to do this, and how should I write it so that the correct exception text is printed? I will have several different error messages.
If I reach the except statement, I want the message "Incorrect use of function" to be printed.
It seems that you're pretty close:
class MyError(Exception):
    def __init__(self, text="Correct"):
        self.text = text

    def __str__(self):
        return self.text

class Atom(object):
    .
    .
    .
    try:
        function()
    except:  # try: ... else: raise ... seems a bit funky to me.
        raise MyError("Incorrect use of function")

def main():
    try:
        a = Atom()
    except Exception as err:  # Possibly `except MyError as err` to be more specific
        print err
The trick is that when you catch the error, you want to bind the exception instance to a name using the as clause. Then you can print it, look at its attributes, re-raise it, or pretty much do anything you choose with it.
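For example, a small sketch of that pattern (Python 2 style, to match the rest of this answer):

try:
    a = Atom()
except MyError as err:
    print err        # uses MyError.__str__
    print err.text   # or look at its attributes directly
    # raise          # or re-raise it if you can't handle it here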
Please note that this code still isn't "clean". Generally, you want to limit exception handling as much as possible -- only catch exceptions that you expect to see and that you know how to handle. Otherwise, you can mask hard-to-find bugs in your code. Because of this:
try:
    do_something()
except:
    ...
is discouraged (it catches all sorts of things like KeyboardInterrupt and SystemExit) ... Instead:
try:
    do_something()
except ExceptionIKnowHowToHandle:
    ...
is advised.
Firstly, never use a bare except. That will catch all errors, including things like KeyboardInterrupt, so you won't be able to Ctrl-C out of your program. Here you should just catch MyError.
The except clause also allows you to assign the actual exception to a variable, which you can then print or do anything else with. So you can do:
try:
    ...
except MyError as e:
    print e.text
I am writing in Python 2.7 and have run into the following situation. I would like to try calling a function up to three times. If all three calls raise errors, I will re-raise the last error I got. If any call succeeds, I stop trying and continue immediately.
Here is what I have right now:
output = None
error = None
for _e in range(3):
    error = None
    try:
        print 'trial %d!' % (_e + 1)
        output = trial_function()
    except Exception as e:
        error = e
    if error is None:
        break
if error is not None:
    raise error
Is there a better snippet that achieves the same use case?
Use a decorator:
from functools import wraps

def retry(times):
    def wrapper_fn(f):
        @wraps(f)
        def new_wrapper(*args, **kwargs):
            for i in range(times):
                try:
                    print 'try %s' % (i + 1)
                    return f(*args, **kwargs)
                except Exception as e:
                    error = e
            raise error
        return new_wrapper
    return wrapper_fn
@retry(3)
def foo():
    return 1 / 0

print foo()
Here is one possible approach:
def attempt(func, times=3):
    for _ in range(times):
        try:
            return func()
        except Exception as err:
            pass
    raise err
A demo with a print statement in:
>>> attempt(lambda: 1/0)
Attempt 1
Attempt 2
Attempt 3
Traceback (most recent call last):
  File "<pyshell#18>", line 1, in <module>
    attempt(lambda: 1/0)
  File "<pyshell#17>", line 8, in attempt
    raise err
ZeroDivisionError: integer division or modulo by zero
If you're using Python 3.x and get an UnboundLocalError, you can adapt as follows:
def attempt(func, times=3):
    to_raise = None
    for _ in range(times):
        try:
            return func()
        except Exception as err:
            to_raise = err
    raise to_raise
This is because err is cleared at the end of the except clause; per the docs:
When an exception has been assigned using as target, it is cleared
at the end of the except clause.
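A tiny sketch of that Python 3 behaviour, mirroring the attempt() function above:

def demo():
    for _ in range(3):
        try:
            int('abc')
        except ValueError as err:
            pass
    raise err   # UnboundLocalError in Python 3: err was deleted at the end of the except block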
Ignoring the debug output and the ancient Python dialect, this looks good. The only thing I would change is to put it into a function; you could then simply return the result of trial_function(). The error = None also becomes unnecessary, along with the associated checks: if the loop runs to completion without a successful call, error must have been set, so you can just raise it. If you don't want a function, use else on the for loop and break after the first successful result.
for i in range(3):
    try:
        result = foo()
        break
    except Exception as error:
        pass
else:
    raise error
use_somehow(result)
Of course, the suggestion to use a decorator for the function still holds. You can also apply it locally; the decorator syntax is only syntactic sugar, after all:
# retry from powerfj's answer below
rfoo = retry(3)(foo)
result = rfoo()
I came across a clean way of doing retries. There is a module called retry.
First install the module using
pip install retry
Then import the module in the code.
from retry import retry
Use the @retry decorator above the method. We can pass parameters to the decorator, such as tries, delay, and the exception type.
Example
from retry import retry

@retry(AssertionError, tries=3, delay=2)
def retryfunc():
    try:
        ret = False
        assert ret, "Failed"
    except Exception as ex:
        print(ex)
        raise ex
The above code asserts and fails every time, but the retry decorator retries it 3 times with a delay of 2 seconds between retries. Also, this only retries on assertion failures, since we specified AssertionError as the exception type; on any other error the function won't retry.
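For contrast, here is a rough sketch of a call that succeeds on a later attempt; flaky_call and attempt_count are made-up names for the illustration:

from retry import retry

attempt_count = {'n': 0}

@retry(Exception, tries=3, delay=1)
def flaky_call():
    attempt_count['n'] += 1
    if attempt_count['n'] < 3:
        raise Exception('not ready yet')
    return 'succeeded on attempt %d' % attempt_count['n']

print(flaky_call())   # fails twice, is retried, then prints the success message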
I am creating a decorator that catches a raised error in its target function, and allows the user to continue executing the script (bypassing the function) or drop out of the script.
def catch_error(func):
    """
    This decorator is used to make sure that if a decorated function breaks
    in the execution of a script, the script doesn't automatically crash.
    Instead, it gives you the choice to continue or gracefully exit.
    """
    def caught(*args):
        try:
            return func(*args)
        except Exception as err:
            question = '\n{0} failed. Continue? (yes/no): '.format(func.func_name)
            answer = raw_input(question)
            if answer.lower() in ['yes', 'y']:
                pass
            else:
                print " Aborting! Error that caused failure:\n"
                raise err
            return None
    return caught
Notice that, if the user chooses to bypass the error-returning function and continue executing the script, the decorator returns None. This works well for functions that only return a single value, but it crashes when the caller tries to unpack multiple return values. For instance,
# Both function and decorator return single value, so works fine
one_val = decorator_works_for_this_func()
# Function nominally returns two values, but decorator only one, so this breaks script
one_val, two_val = decorator_doesnt_work_for_this_func()
Is there a way that I can determine the number of values my target function is supposed to return? For instance, something like:
def better_catch_error(func):
    def caught(*args):
        try:
            return func(*args)
        except Exception as err:
            ...
            num_rvals = determine_num_rvals(func)
            if num_rvals > 1:
                return [None for count in range(num_rvals)]
            else:
                return None
    return caught
As always, if there is a better way to do this sort of thing, please let me know. Thanks!
UPDATE:
Thanks for all the suggestions. I decided to narrow the scope of catch_error to a single class of functions, which only return one string value. I just split all the functions returning more than one value into separate functions that return a single value to make them compatible. I had been hoping to make catch_error more generic (and there were several helpful suggestions on how to do that), but for my application it was a little overkill. Thanks again.
Martijn Pieters' answer is correct: this is a specific case of the Halting Problem.
However, you might get around it by passing an error return value to the decorator. Something like this:
def catch_error(err_val):
    def wrapper(func):
        def caught(*args):
            try:
                return func(*args)
            except Exception as err:
                question = '\n{0} failed. Continue? (yes/no): '.format(func.func_name)
                answer = raw_input(question)
                if answer.lower() in ['yes', 'y']:
                    pass
                else:
                    print " Aborting! Error that caused failure:\n"
                    raise err
                return err_val
        return caught
    return wrapper
Then you could decorate using:
@catch_error({})
def returns_a_dict(*args, **kwargs):
    return {'a': 'foo', 'b': 'bar'}
Also as a point of style, if you are grabbing *args in your wrapped function, you should probably also grab **kwargs so that you can properly wrap functions that take keyword arguments. Otherwise your wrapped function will fail if you call wrapped_function(something=value)
Finally, as another point of style, it is confusing to see code that does if a: pass with an else. Try inverting the condition with not in these cases. So the final code:
def catch_error(err_val):
    def wrapper(func):
        def caught(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except Exception as err:
                question = '\n{0} failed. Continue? (yes/no): '.format(func.func_name)
                answer = raw_input(question)
                if answer.lower() not in ['yes', 'y']:
                    print " Aborting! Error that caused failure:\n"
                    raise err
                return err_val
        return caught
    return wrapper
No, there is no way you can determine that.
Python is a dynamic language, and a given function can return an arbitrary sequence of any size or no sequence at all, based on the inputs and other factors.
For example:
import random

def foo():
    if random.random() < 0.5:
        return ('spam', 'eggs')
    return None
This function will return a tuple half of the time, but None the other half, so Python has no way of telling you what foo() will return.
There are many more ways your decorator can fail, btw, not just when the function returns a sequence that the caller then unpacks. What if the caller expected a dictionary and starts trying to access keys, or a list and wants to know the length?
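For instance, a sketch of what the caller sees when the decorator has substituted None for the real return value:

result = None           # what the decorator returned instead of a dict or list

value = result['key']   # raises TypeError: None is not subscriptable
count = len(result)     # raises TypeError: None has no length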
Your decorator can't predict what your function is going to return, but nothing prevents you from telling the decorator what return signature to simulate:
@catch_error([None, None])
def tuple_returner(n):
    raise Exception
    return [2, 3]
Instead of returning None, your decorator will return its argument ([None, None]).
Writing an argument-taking decorator is just slightly tricky: The expression catch_error([None, None]) will be evaluated, and must return the actual decorator that will be applied to the decorated function. It looks like this:
def catch_error(signature=None):
    def _decorator(func):
        def caught(*args):
            try:
                return func(*args)
            except Exception as err:
                # Interactive code suppressed
                return signature
        return caught
    return _decorator
Note that even if you just want it to return None, you need to execute it once:
@catch_error()
def some_function(x):
    ...
Sometimes I need the following pattern within a for loop. At times more than once in the same loop:
try:
    # attempt to do something that may diversely fail
except Exception as e:
    logging.error(e)
    continue
Now I don't see a nice way to wrap this in a function, since it cannot return continue:
def attempt(x):
    try:
        raise random.choice((ValueError, IndexError, TypeError))
    except Exception as e:
        logging.error(e)
        # continue  # syntax error: continue not properly in loop
        # return continue  # invalid syntax
        return None  # this sort of works
If I return None, then I could:
a = attempt('to do something that may diversely fail')
if not a:
    continue
But I don't feel that does it justice. I want to tell the for loop to continue (or fake it) from within the attempt function.
Python already has a very nice construct for doing just this and it doesn't use continue:
for i in range(10):
    try:
        r = 1.0 / (i % 2)
    except Exception as e:
        print(e)
    else:
        print(r)
I wouldn't nest any more than this, though, or your code will soon get very ugly.
In your case I would probably do something more like this as it is far easier to unit test the individual functions and flat is better than nested:
#!/usr/bin/env python

def something_that_may_raise(i):
    return 1.0 / (i % 2)

def handle(e):
    print("Exception: " + str(e))

def do_something_with(result):
    print("No exception: " + str(result))

def wrap_process(i):
    try:
        result = something_that_may_raise(i)
    except ZeroDivisionError as e:
        handle(e)
    except OverflowError as e:
        handle(e)  # Realistically, this will be a different handler...
    else:
        do_something_with(result)

for i in range(10):
    wrap_process(i)
Remember to always catch specific exceptions. If you were not expecting a specific exception to be thrown, it is probably not safe to continue with your processing loop.
Edit following comments:
If you really don't want to handle the exceptions, which I still think is a bad idea, then catch all exceptions (except:) and instead of handle(e), just pass. At this point wrap_process() will end, skipping the else:-block where the real work is done, and you'll go to the next iteration of your for-loop.
Bear in mind, Errors should never pass silently.
The whole idea of exceptions is that they work across multiple levels of indirection, i.e., if you have an error (or any other exceptional state) deep inside your call hierarchy, you can still catch it on a higher level and handle it properly.
In your case, say you have a function attempt() which calls the functions attempt2() and attempt3() down the call hierarchy, and attempt3() may encounter an exceptional state which should cause the main loop to skip to its next iteration:
class JustContinueException(Exception):
    pass

for i in range(0, 99):
    try:
        var = attempt()  # calls attempt2() and attempt3() in turn
    except JustContinueException:
        continue  # we don't need to log anything here
    except Exception as e:
        log(e)
        continue
    foo(bar)

def attempt3():
    try:
        # do something
        pass
    except Exception as e:
        # do something with e, if needed
        raise  # re-raise the exception, so we catch it downstream
You can even throw a dummy exception yourself that just causes the current iteration to be skipped and wouldn't even be logged:
def attempt3():
    raise JustContinueException()
Apart from the context, I just want to answer the question briefly. No, a function cannot continue a loop it may be called in, because it has no information about that context. It would also raise a whole new class of questions, like what should happen if that function is called without a surrounding loop to handle that continue?
BUT a function can signal by various means that it wants the caller to continue whatever loop it is currently performing. One means, of course, is the return value: return False or None to signal this, for example (a sketch of that variant follows the exception-based one below). Another way of signalling it is to raise a special exception:
class ContinuePlease(Exception):
    pass

def f():
    raise ContinuePlease()

for i in range(10):
    try:
        f()
    except ContinuePlease:
        continue
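And a minimal sketch of the return-value variant mentioned above; do_work, items, foo and bar are placeholder names:

import logging

def attempt(x):
    try:
        do_work(x)           # placeholder for the code that may fail
        return True
    except Exception as e:
        logging.error(e)
        return False

for item in items:
    if not attempt(item):
        continue             # the caller decides to skip this iteration
    foo(bar)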
Maybe you want to do continuations? You could go and look at how Eric Lippert explains them (if you are ready to have your mind blown), but in Python it could look a bit like this:
def attempt(operation, continuation):
    try:
        operation()
    except:
        log('operation failed!')
    else:
        continuation()
Inside your loop you could do:
attempt(attempt_something, lambda: foo(bar)) # attempt_something is a function
You could use this:
for l in loop:
    attempt() and foo(bar)
but you should make sure attempt() returns True or False.
Really, though, Johnsyweb's answer is probably better.
Think of it as mapping foo over all items where attempt worked. attempted is a filter, and it's easy to write it as a generator:
def attempted(items):
    for item in items:
        try:
            yield attempt(item)
        except Exception as e:
            log(e)

print [foo(bar) for bar in attempted(items)]
I wouldn't normally post a second answer, but this is an alternative approach if you really don't like my first answer.
Remember that a function can return a tuple.
#!/usr/bin/env python

def something_that_may_fail(i):
    failed = False
    result = None
    try:
        result = 1.0 / (i % 4)
    except:
        failed = True  # But we don't care
    return failed, result

for i in range(20):
    failed, result = something_that_may_fail(i)
    if failed:
        continue
    for rah in ['rah'] * 3:
        print(rah)
    print(result)
I maintain that try ... except ... else is the way to go, and you shouldn't silently ignore errors though. Caveat emptor and all that.
Try keeping the for loop outside the try/except block, i.e. put the try/except inside the loop body, as sketched below.
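A brief sketch of that arrangement; items, process, foo and bar are placeholder names:

import logging

for item in items:
    try:
        process(item)        # the work that may fail
    except Exception as e:
        logging.error(e)
        continue             # only this iteration is skipped
    foo(bar)                 # reached only when process(item) succeeded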
This answer had Python 3.4 in mind; there are better ways in newer versions. Here is my suggestion:
import sys

if '3.4' in sys.version:
    from termcolor import colored

def list_attributes(module_name):
    '''Import the module before calling this func on it.'''
    for index, method in enumerate(dir(module_name)):
        try:
            method = str(method)
            module = 'email'
            expression = module + '.' + method
            print('*' * len(expression), '\n')
            print(str(index).upper() + '. ', colored(expression.upper(), 'red'),
                  ' ', eval(expression).__dir__(), '...', '\n' * 2)
            print('' * len(expression), '\n')
            print(eval(expression + '.__doc__'), '\n' * 4,
                  'END OF DESCRIPTION FOR: ' + expression.upper(), '\n' * 4)
        except (AttributeError, NameError):
            continue
        else:
            pass
        finally:
            pass
Edit: Removed all that stupidity I said...
The final answer was to rewrite the whole thing, so that I don't need to code like that.
Suppose I have a function definition:

def test():
    print 'hi'

I get a TypeError whenever I give it an argument.
Now, I want to put the def statement in try. How do I do this?
try:
    test()
except TypeError:
    print "error"
In [1]: def test():
   ...:     print 'hi'
   ...:

In [2]: try:
   ...:     test(1)
   ...: except:
   ...:     print 'exception'
   ...:
exception
Here is the relevant section in the tutorial
By the way, to fix this error you should not wrap the function call in a try/except. Instead, call it with the right number of arguments!
You said:
    Now, I want to put the def statement in try. How to do this.
The def statement is correct, it is not raising any exceptions. So putting it in a try won't do anything.
What raises the exception is the actual call to the function. So that should be put in the try instead:
try:
    test()
except TypeError:
    print "error"
If you want to throw the error at call time, which it sounds like you might want, you could try this approach:
def test(*args):
    if args:
        raise TypeError('test() takes no arguments')  # raise an explicit error when arguments are passed
    print 'hi'
This will shift the error from the calling location to the function. It accepts any number of parameters via the *args list. Not that I know why you'd want to do that.
A better way to handle a variable number of arguments in Python is as follows:
def foo(*args, **kwargs):
    # args will hold the positional arguments
    print args
    # kwargs will hold the named arguments
    print kwargs

# Now, all of these work
foo(1)
foo(1, 2)
foo(1, 2, third=3)
This is valid:
try:
    def test():
        print 'hi'
except:
    print 'error'

test()