I have a with statement whose body I'd like to skip if some <condition> is satisfied. That is, I write:
with MyContext() as mc:
    do_something()
and
class MyContext(object):
    ...
    def __enter__(self, ...):
        if <condition>:
            JumpToExit()

    def __exit__(self, ...):
        print('goodbye')
I would like do_something() to be executed only under certain conditions; otherwise I'd like JumpToExit() to skip the body entirely and just finish the block.
Thanks.
This is impossible. The __enter__ method of a context manager cannot skip the block. However, you could do something like this:
def __enter__(self):
    return condition
Then, when you use the context:
with context as condition:
    if condition:
        ....
If you want to return something else as well, you can return a tuple, like return condition, self, and then unpack it in the with statement: with MyContext() as (condition, context):.
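A minimal sketch of that tuple-returning idea, assuming the condition is passed in as a skip flag (the names here are illustrative, not from the question):
class MyContext(object):
    def __init__(self, skip=False):
        self.skip = skip

    def __enter__(self):
        # Return the flag and the manager itself as a tuple.
        return not self.skip, self

    def __exit__(self, exc_type, exc, tb):
        print('goodbye')
        return False  # do not suppress exceptions

def do_something():
    print('doing something')

with MyContext(skip=True) as (run_body, mc):
    if run_body:
        do_something()  # skipped here because skip=True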
I am not sure that is possible. __enter__ is executed outside of the try block introduced by the with statement, so I don't see a way of jumping directly from __enter__ into __exit__.
with basically (simplified) turns this:
with context as x:
    do_something()
Into
x = context.__enter__()
try:
    do_something()
finally:
    context.__exit__()
You need to throw an exception in do_something() or successfully complete it to get into __exit__. If you do throw an exception from do_something() you can do tricky stuff like suppressing it in the exit function (using some of the parameters passed to it), so you don't actually see it. But it has to be the code inside the with block which somehow causes the jump into __exit__.
Maybe if you can somehow ensure that do_something immediately throws an exception by setting some value in __enter__ you can make it work. But that does not sound like a very good idea to me.
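For what it's worth, here is a sketch of that trick: __enter__ hands out the manager, the first line of the block raises a custom exception when the skip flag is set, and __exit__ suppresses that exception. The SkipBlock class and the skip_if_needed helper are made up for this example, not part of any standard API.
class SkipBlock(Exception):
    """Raised at the top of the block to jump straight to __exit__."""

class MyContext(object):
    def __init__(self, skip=False):
        self.skip = skip

    def __enter__(self):
        return self  # hand the manager to the block

    def skip_if_needed(self):
        # The block calls this as its first statement.
        if self.skip:
            raise SkipBlock()

    def __exit__(self, exc_type, exc, tb):
        print('goodbye')
        # Suppress only our own marker exception; let real errors propagate.
        return exc_type is SkipBlock

with MyContext(skip=True) as mc:
    mc.skip_if_needed()
    print('body ran')  # not reached when skip=True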
You could put the with statement in a function and use return:
def foo():
    with MyContext() as mc:
        if <condition>:
            return None
        do_something()
Related
I want to implement a way to repeat a section of code as many times as it's needed using a context manager only, because of its pretty syntax. Like this:
with try_until_success(attempts=10):
    command1()
    command2()
    command3()
The commands must be executed once if no error happens, and executed again if an error occurs, until 10 attempts have passed, at which point the error must be raised. This can be useful, for example, to reconnect to a database. The syntax I showed is literal; I do not want to modify it (so do not suggest replacing it with some kind of for or while statement).
Is there a way to implement try_until_success in Python to do what I want?
What I tried is:
from contextlib import contextmanager

@contextmanager
def try_until_success(attempts=None):
    counter = 0
    while True:
        try:
            yield
        except Exception as exc:
            pass
        else:
            break
        counter += 1
        if attempts is not None and counter >= attempts:
            raise exc
And this gives me the error:
RuntimeError: generator didn't stop after throw()
I know there are many ways to achieve what I need using a loop instead of a with statement, or with the help of a decorator. But both have syntactic disadvantages: with a loop I have to insert a try-except block, and with a decorator I have to define a new function.
I have already looked at the questions:
How do I make a contextmanager with a loop inside?
Conditionally skipping the body of Python With statement
They did not answer my question.
The problem is that the body of the with statement does not run within the call to try_until_success. That function returns an object with an __enter__ method; that __enter__ method is called and returns, and then the body of the with statement is executed. There is no provision for wrapping the body in any kind of loop that would allow it to be repeated once the end of the with statement is reached.
This goes against how context managers were designed to work; you'd likely have to resort to non-standard tricks like patching the bytecode to do this.
See the official docs on the with statement and the original PEP 343 for how they are expanded. It might help you understand why this isn't going to be officially supported, and maybe why other commenters are generally saying this is a bad thing to try and do.
As an example of something that might work, maybe try:
class try_until_success:
    def __init__(self, attempts):
        self.attempts = attempts
        self.attempt = 0
        self.done = False
        self.failures = []

    def __iter__(self):
        while not self.done and self.attempt < self.attempts:
            i = self.attempt
            yield self
            assert i != self.attempt, "attempt not attempted"
        if self.done:
            return
        if self.failures:
            raise Exception("failures occurred", self.failures)

    def __enter__(self):
        self.attempt += 1

    def __exit__(self, _ext, exc, _tb):
        if exc:
            self.failures.append(exc)
            return True
        self.done = True
for attempt in try_until_success(attempts=10):
    with attempt:
        command1()
        command2()
        command3()
You'd probably want to separate the context manager from the iterator (to help prevent incorrect usage), but it does something similar to what you were after.
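A rough sketch of that separation, assuming the same usage pattern (the Attempt class is illustrative, and command1() through command3() are the question's placeholders):
class Attempt:
    """Context manager for one attempt; records a failure instead of raising."""
    def __init__(self, failures):
        self.failures = failures
        self.succeeded = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        if exc:
            self.failures.append(exc)
            return True  # swallow the error so the loop can retry
        self.succeeded = True

def try_until_success(attempts):
    failures = []
    for _ in range(attempts):
        attempt = Attempt(failures)
        yield attempt
        if attempt.succeeded:
            return
    raise Exception("all attempts failed", failures)

for attempt in try_until_success(attempts=10):
    with attempt:
        command1()
        command2()
        command3()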
Is there a way to implement try_until_success in Python to do what I want?
Yes. You don't need to make it a context manager. Just make it a function accepting a function:
def try_until_success(command, attempts=1):
    for _ in range(attempts):
        try:
            return command()
        except Exception as exc:
            err = exc
    raise err
And then the syntax is still pretty clear, no for or while statements - not even with:
attempts = 10
try_until_success(command1, attempts)
try_until_success(command2, attempts)
try_until_success(command3, attempts)
Is it pythonic to call one function after the other? I have two functions, and one depends on the result of the other:
function1() # if something goes wrong, will raise an error; if not, will return None
function2()
And I was thinking about using:
function1() is None and function2()
Is this pythonic?
You shouldn't think of the return value of None as indicating success, but rather the absence of an exception. Use a try statement to make it explicit that you are aware of the possibility of an exception, but are intentionally letting it pass up the call chain should one be raised:
try:
    function1()
except:
    raise
else:
    function2()
If you want, you can be explicit about which exceptions you let through:
try:
    function1()
except Exception:
    raise
else:
    function2()
I would be tempted to use a try...except block instead of two separate functions:
MyFunction()
try:
    <your first action goes here>
except:
    <what you want to happen if an error occurs goes here>
You might need more than one except clause if you want to handle different kinds of errors differently (note that a None return value is not an exception, so it would have to be checked separately). There is plenty of useful information in the documentation: https://wiki.python.org/moin/HandlingExceptions
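For instance, a small self-contained sketch of separate except clauses for different error types (this example is illustrative and not taken from the script above):
def safe_ratio(a, b):
    try:
        return a / b
    except ZeroDivisionError:
        print("cannot divide by zero")
        return None
    except TypeError:
        print("both arguments must be numbers")
        return None

safe_ratio(10, 2)    # -> 5.0
safe_ratio(10, 0)    # handled by the ZeroDivisionError clause
safe_ratio(10, "x")  # handled by the TypeError clause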
In Ruby, I can pass a block of code to a method.
For example, I can pass different code blocks to get_schedules_with_retries method.
And invoke the block by calling block.call.
I'd like to know how I could implement that logic in Python, because I have lots of code blocks that need a retry pattern, and I don't want to copy and paste the retry logic into many places.
Example:
def get_schedules_with_retries(&block)
  max_retry_count = 3
  retry_count = 0
  while (retry_count < max_retry_count)
    begin
      schedules = get_more_raw_schedules
      block.call(schedules)
    rescue Exception => e
      print_error(e)
    end
    if schedules.count > 0
      break
    else
      retry_count += 1
    end
  end
  return schedules
end
get_schedules_with_retries do |schedules|
  # do something here
end

get_schedules_with_retries do |schedules|
  # do another thing here
end
In Python, a block is a syntactic feature (an indentation under block opening statements like if or def) and not an object. The feature you expect may be a closure (which can access variables outside of the block), which you can achieve using inner functions, but any callable could be used. Because of how lambda works in Python, the inline function definition you've shown with do |arg| is limited to a single expression.
Here's a rough rewrite of your sample code in Python.
import traceback

def get_schedules_with_retries(callable, max_retry_count=3):
    retry_count = 0
    while retry_count < max_retry_count:
        schedules = get_more_raw_schedules()
        try:
            callable(schedules)
        except:  # Note: could filter types, bind name etc.
            traceback.print_exc()
        if len(schedules) > 0:  # Ruby's .count becomes len() in Python
            break
        else:
            retry_count += 1
    return schedules
get_schedules_with_retries(lambda schedules: single_expression)

def more_complex_function(schedules):
    pass  # do another thing here

get_schedules_with_retries(more_complex_function)
One variant uses a for loop to make it clear the loop is finite:
def call_with_retries(callable, args=(), tries=3):
    for attempt in range(tries):
        try:
            result = callable(*args)
            break
        except Exception as exc:
            error = exc  # keep a reference so we can re-raise after the loop
            traceback.print_exc()
            continue
    else:  # break never reached, so the function failed every time
        raise error  # re-raise the last exception we printed above
    return result
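Hypothetical usage, where fetch_schedules is a made-up name standing in for whatever flaky call you want to retry:
def fetch_schedules():
    ...  # stand-in for a network call that may fail intermittently

schedules = call_with_retries(fetch_schedules, tries=3)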
Frequently when passing callables like this, you'll already have the function you want available somewhere and won't need to redefine it. For instance, methods on objects (bound methods) are perfectly valid callables.
You could do it like this:
def codeBlock(parameter1, parameter2):
    print("I'm a code block")

def passMeABlock(block, *args):
    block(*args)

# pass the block like this
passMeABlock(codeBlock, 1, 2)
You do so by defining a function, either by using the def statement or a lambda expression.
There are other techniques however, that may apply here. If you need to apply common logic to the input or output of a function, write a decorator. If you need to handle exceptions in a block of code, perhaps creating a context manager is applicable.
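As a rough sketch of the decorator route (the names and the retry policy here are illustrative, not a standard API):
import functools
import traceback

def with_retries(tries=3):
    """Decorator that retries the wrapped function up to `tries` times."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            last_exc = None
            for _ in range(tries):
                try:
                    return func(*args, **kwargs)
                except Exception as exc:
                    last_exc = exc
                    traceback.print_exc()
            raise last_exc
        return wrapper
    return decorator

@with_retries(tries=3)
def get_schedules():
    ...  # fetch and return schedules; may fail intermittently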
Here are the relevant pieces of code:
import contextlib
import shutil
import tempfile

@contextlib.contextmanager
def make_temp_dir():
    temp_dir = tempfile.mkdtemp()
    yield temp_dir
    shutil.rmtree(temp_dir)
with make_temp_dir(listing_id) as tmpdir:
    pass
    # Sometimes something in here throws an exception that gets caught higher up
Ok, so writing this all out, I understand now what's happening. The exit method in the contextmanager I'm creating with the decorator is running but that doesn't, of course, return flow to my generator.
So how should I be doing this?
What happens here is the following:
On __enter__(), the generator is started. Whatever it yields is taken as the return value of __enter__().
On __exit__(), the generator is resumed, either in a normal way or by injecting an exception. The relevant code is in $PYTHONROOT/contextlib.py where you can see that either next() or throw() is called on the generator.
If throw() is called on a generator, the exception is raised inside it exactly where we left it the last time, i.e. the yield expression raises the exception then.
Thus, you will have to enclose the yield in a try: statement; only then will you be able to do something with the exception.
If you don't, your generator will simply propagate the exception without running any code after the yield.
You probably want
@contextlib.contextmanager
def make_temp_dir():
    temp_dir = tempfile.mkdtemp()
    try:
        yield temp_dir
    finally:
        shutil.rmtree(temp_dir)
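If you actually want to react to the exception (rather than just guaranteeing cleanup), a sketch with try/except around the yield looks like this (the print is just an example of "doing something" with the error):
import contextlib
import shutil
import tempfile

@contextlib.contextmanager
def make_temp_dir():
    temp_dir = tempfile.mkdtemp()
    try:
        yield temp_dir
    except Exception:
        # The exception from the with block is raised here, at the yield.
        print('cleaning up after an error in', temp_dir)
        raise  # re-raise so callers higher up still see it
    finally:
        shutil.rmtree(temp_dir)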
Based on the answers to this question, the pass keyword in Python does absolutely nothing. It simply indicates that "nothing will be done here."
Given this, I don't understand the use of it. For example, I'm looking at the following block of code while trying to debug a script someone else wrote:
def dcCount(server):
    ssh_cmd = 'ssh user@subserver.st%s' % (server)
    cmd = '%s "%s"' % (ssh_cmd, sub_cmd)
    output = Popen(cmd, shell=True, stdout=PIPE)
    result = output.wait()
    queryResult = ""
    if result == 0:
        queryResult = output.communicate()[0].strip()
    else:
        pass
    takeData(server, "DC", queryResult)
Is there any purpose at all to having else: pass here? Does this in any way change the way the function runs? It seems like this if/else block could be rewritten like the following with absolutely no change:
if result == 0:
    queryResult = output.communicate()[0].strip()
takeData(server, "DC", queryResult)
... or am I missing something? And, if I'm not missing something, why would I ever want to use the pass keyword at all?
It is indeed useless in your example.
It is sometimes helpful if you want a block to be empty, something not otherwise allowed by Python. For instance, when defining your own exception subclass:
class MyException(Exception):
    pass
Or maybe you want to loop over some iterator for its side effects, but do nothing with the results:
for _ in iterator:
    pass
But most of the time, you won't need it.
Remember that if you can add something else that isn't a comment, you may not need pass. An empty function, for instance, can take a docstring and that will work as a block:
def nop():
    """This function does nothing, intentionally."""
pass is used when you want to do nothing but are syntactically required to have something. I most commonly use it with try...except blocks when I want to simply skip over lines which raise an error, as shown below:
>>> try:
... 1/0
...
File "<stdin>", line 3
^
SyntaxError: invalid syntax
>>> try:
... 1/0
... except:
...
File "<stdin>", line 4
^
IndentationError: expected an indented block
>>> try:
... 1/0
... except:
... pass
...
>>>
Empty classes
Empty functions
or, like the other guy said - empty clauses like "except"
There is no point to the use of pass you have shown; it can be removed along with the else. Still, I have found pass useful occasionally, such as when commenting out the body of a conditional, or when implementing a class which has no members:
class dummy:
    pass
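And a quick sketch of the "commenting out the body of a conditional" case (the flag and the disabled line are made up for illustration):
verbose = True

if verbose:
    pass
    # print("detailed progress output")  # temporarily disabled while debugging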