How to prevent the wrong StopIteration from being caught? - python

Is there a way to prevent StopIteration exceptions from being thrown from unrelated code (without having to catch them manually)?
Example: loop_all wants to loop through the myiter iterator and simply move on when this one has finished. This works unless some_dangerous_method or any other code in myiter raises a StopIteration.
def loop_all():
    it = myiter()
    try:
        while True:
            next(it)  # <- I want exactly the StopIteration from this next() call
    except StopIteration:
        pass

def myiter():
    some_dangerous_method()  # what if this also raises a StopIteration?
    for i in some_other_iter():
        # here may be more code
        yield
Is there a way to make it clear to which StopIteration the code should react to?

If a function you are calling is invoking next(iter), and isn't dealing with StopIteration, then that function has a bug. Fix it.
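For instance, one way of "dealing with it" is the two-argument form of next(), which returns a default instead of raising. A minimal sketch (the sentinel and this particular function body are made up for illustration, not the asker's code):
_SENTINEL = object()

def some_dangerous_method(it):
    # next() with a default never raises StopIteration; it returns the sentinel instead.
    value = next(it, _SENTINEL)
    if value is _SENTINEL:
        return None  # the iterator is exhausted; decide explicitly what that means here
    return value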

Perhaps I'm missing something but why not simply this:
def myiter():
    try:
        some_dangerous_method()
    except StopIteration:
        pass  # or raise a different exception
    for i in some_other_iter():
        # here may be more code
        yield
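Worth noting: since Python 3.7 (PEP 479), a StopIteration that leaks out of a generator's body is automatically converted into a RuntimeError, so on current versions the "wrong" StopIteration can no longer be silently swallowed by the outer loop; it surfaces as a loud error instead:
def myiter():
    some_dangerous_method()  # if this raises StopIteration on Python 3.7+,
                             # the generator machinery converts it to RuntimeError
    for i in some_other_iter():
        yield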

Related

How to catch StopIteration from subgenerator

I'd like to write a generator which can accept a limited number of inputs via yield and then gracefully handle further inputs. What's the best way of catching StopIteration?
I've tried wrapping my inner generator with an outer generator using a yield from expression inside a try-except block, but StopIteration gets raised anyway...
def limited_writer(max_writes):
    for i in range(max_writes):
        x = yield
        print(x)

def graceful_writer(l):
    try:
        yield from l
    except StopIteration:
        # Ideally will have additional handling logic here
        raise Exception("Tried to write too much")

l_w = limited_writer(4)
g_w = graceful_writer(l_w)
g_w.send(None)
for i in range(5):
    g_w.send(i)
I'd like the above to raise Exception (but more generally to provide a nice way of handling the case where too much data is sent), but in fact it still raises StopIteration. What's the best solution?
If you want graceful_writer to keep accepting data that is sent to it via its .send() method, it needs to keep on yielding indefinitely. The try/except block you currently have doesn't actually do anything; the yield from statement already absorbs the StopIteration from limited_writer. The one you are seeing at the top level comes from graceful_writer itself, when it reaches the end of its code.
To avoid that, try using an infinite loop, like this:
def graceful_writer(gen):
    yield from gen  # send values to wrapped generator for as long as it will take them
    while True:
        yield  # then loop forever, discarding any additional values sent in
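If, instead of silently discarding the extras, you want the wrapper to complain the way your original code intended, a small variation (a sketch, reusing limited_writer from the question) is to raise once the delegation has finished:
def graceful_writer(gen):
    yield from gen  # delegate sends until the wrapped generator is exhausted
    yield           # hand control back once more, so the *next* send hits the raise below
    raise Exception("Tried to write too much")

l_w = limited_writer(4)
g_w = graceful_writer(l_w)
g_w.send(None)   # prime the generator
for i in range(5):
    g_w.send(i)  # the fifth value raises "Tried to write too much"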

try/except and decision making

I am working on a function which takes different kinds of date_format as an argument and dispatches it to a function which is in charge of parsing that format.
In other words:
def parse_format(date_format):
    # decision making happens here
    try:
        from_timestamp(date_format)
    except:
        pass
    try:
        from_dateformat(date_format)
    except:
        pass

def from_timestamp(format):
    ...  # raise if not in charge

def from_dateformat(format):
    ...  # raise if not in charge

def from_custom_format(format):
    ...  # raise if not in charge
Currently, parse_format has multiple try/except blocks. Is this the way to go, or is there a more obvious way to do it? Furthermore, how do I handle the case where every function call fails?
I would do something like this:
class UnrecognizedFormatError(Exception):
    pass

def parse_format(date_format):
    methods = (from_timestamp, from_dateformat)
    for method in methods:
        try:
            return method(date_format)
        except:
            pass
    raise UnrecognizedFormatError
But also some key points:
except without a specific exception is bad, because an exception can be thrown from unexpected places, such as running out of memory or a keyboard interrupt in a script. So please use the except SomeException as e form, with a specific exception type.
If every function fails, this code will throw a UnrecognizedFormatError, allowing the function's user to respond appropriately.
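Putting those two points together, a hedged sketch (assuming each parser raises ValueError when it is not in charge of the format) might look like this:
class UnrecognizedFormatError(Exception):
    pass

def parse_format(date_format):
    methods = (from_timestamp, from_dateformat, from_custom_format)
    for method in methods:
        try:
            return method(date_format)
        except ValueError:  # only the "not my format" signal, nothing else
            pass
    raise UnrecognizedFormatError(date_format)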
Well, I would look at this as a good place for try/except/else/finally: the except clauses try the other parsers in turn, the else suite runs only if the try suite raised no exception, and the finally suite runs whatever happens in your try/except statements. If your exceptions are appropriately chosen, then it will pick the right function for you.
Also, I'm guessing that this is a learning exercise, as the activity you're describing would be better done with datetime.strptime()
def from_timestamp(format):
    ...  # raise if not in charge

def from_dateformat(format):
    ...  # raise if not in charge

def from_custom_format(format):
    ...  # raise if not in charge

def parse_format(date_format):
    # decision making happens here
    try:
        from_timestamp(date_format)
    except FirstException:
        from_dateformat(date_format)
    except SecondException:
        from_custom_format(date_format)
    else:
        thing_to_do_only_if_nothing_went_wrong()
    finally:
        thing_that_happens_regardless()

What is the elegant/Pythonic way to keep variables in scope, yet also catch exceptions?

I've been using Python for a few months now, and I love it so far. I also like how there is usually a single, "Pythonic" way to handle common programming problems.
In code I've been writing, as I make calls to network functions and have to handle exceptions, I keep running into this template of sorts that I end up writing:
proxyTest = None
try:
    proxyTest = isProxyWorking(proxy)
except TimeoutError:
    break
if proxyTest:
    ...
This is my way of declaring proxyTest so that it is in scope when I need to use it, yet also calling the function that will return a value for it inside of the proper exception handling structure. If I declare proxyTest inside of the try block, it will be out of scope for the rest of my program.
I feel like there has to be a more elegant way to handle this, but I'm not sure. Any suggestions?
You have a couple of better options. One is to continue your flow control in the else block:
try:
    proxyTest = isProxyWorking(proxy)
except TimeoutError:
    break
else:
    # proxyTest is guaranteed to be bound here
    ...
Or handle the failure case in the except block.
try:
    proxyTest = isProxyWorking(proxy)
except TimeoutError:
    proxyTest = None
# proxyTest is guaranteed to be bound here
Whichever is better depends on context, I think.
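Putting the first option in context (a sketch; the surrounding loop, proxies, and use_proxy are assumptions, since the question only shows the fragment containing break):
for proxy in proxies:
    try:
        proxyTest = isProxyWorking(proxy)
    except TimeoutError:
        break                 # give up on the remaining proxies
    else:
        if proxyTest:         # proxyTest is guaranteed to be bound here
            use_proxy(proxy)  # hypothetical follow-up work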
The obvious alternative would be to do the 'false' initialization in the except block:
try:
    proxyTest = isProxyWorking(proxy)
except TimeoutError:
    proxyTest = None
Whether this is easier/more appropriate than your constructions depends on how complicated the logic is in the middle, though.
I would put the code after
if proxyTest:
in the try block, just after binding proxyTest.

Use case of try-except-else statement

What is the point of using an else clause if there is a return instruction in the except clause?
def foo():
    try:
        ...  # Some code
    except:
        ...  # Some code
        return
    else:
        ...  # Some code
I'm asking this question because the Django documentation does it at some point, in the vote() function. Considering that the return instruction in the except clause will anyway stop the execution of the function, why did they use an else clause to isolate the code that should only be executed if no exception was raised? They could have just omitted the else clause entirely.
If there is no exception in the try: suite, then the else: suite is executed. In other words, only if there is an actual exception is the except: suite reached and the return statement used.
In my view, the return statement is what is redundant here; a pass would have sufficed. I'd use an else: suite to a try when there is additional code that should only be executed if no exception is raised, but could raise exceptions itself that should not be caught.
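For example, a sketch with made-up names: keeping process() in the else suite means a KeyError raised inside process() is not mistaken for the missing key that the except clause is meant to handle:
def handle(data):
    try:
        value = data["key"]
    except KeyError:
        return None            # only a missing "key" in data is handled here
    else:
        return process(value)  # may raise KeyError itself; we do NOT want to catch that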
You are right that a return in the except clause makes using an else: for that section of code somewhat redundant. The whole suite could be de-dented and the else: line removed:
def foo():
    try:
        ...  # Some code
    except:
        ...  # Some code
        return
    ...  # Some code
From the docs:
The use of the else clause is better than adding additional code to the try clause because it avoids accidentally catching an exception that wasn’t raised by the code being protected by the try ... except statement.
http://docs.python.org/2/tutorial/errors.html#handling-exceptions

Multiple exception handlers for the same Exception

I have code for a function which is called inside another function (the result of refactoring).
So in the called function I have a huge block of try/except statements like this:
def Called():
    try:
        ...  # All statements for the function in the try block.
    except A:
        ...  # Exception handler.
    except B:
        ...  # Exception handler.
    except A:
        ...  # Exception handler.
The problem I have is that I need to catch two exceptions of the same type (at different locations in the Called function), which are then handled by the Calling function.
One way would be to define two try-except blocks within the Called function, but I don't understand how the Calling function can handle two exceptions of the same type differently.
This won't work as advertised; only the first except A clause will ever get executed. What you need is either some logic inside the clause to further inspect the exception, or (if the code inside the try block permits) several try-except blocks.
Example of the former approach:
try:
    something_that_might_fail()
except A as e:
    if e.is_harmless():
        pass
    elif e.is_something_we_can_handle():
        handle_it()
    else:
        raise  # re-raise in the hope it gets handled further up the stack
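And a sketch of the latter approach (the wrapper exception names and the *_risky_step helpers are made up): wrap each risky section in its own try block and re-raise A as a distinct exception, so the calling function can tell the two occurrences apart:
class FirstStepFailed(Exception):
    pass

class SecondStepFailed(Exception):
    pass

def Called():
    try:
        first_risky_step()
    except A as e:
        raise FirstStepFailed from e   # first occurrence of A
    try:
        second_risky_step()
    except A as e:
        raise SecondStepFailed from e  # second occurrence of A

def Calling():
    try:
        Called()
    except FirstStepFailed:
        ...  # handle the first occurrence
    except SecondStepFailed:
        ...  # handle the second occurrence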
I think this will work:
def Called():
    try:
        ...  # All statements for the function in the try block.
    except A:
        try:
            do_something()
        except B:
            try:
                do_something_else()
            except A:
                ...  # Exception handler.
