For a long time I thought you couldn't put a return in front of a yield statement. But actually you can:
def gen():
    return (yield 42)
which is similar to
def gen():
    yield 42
    return
And the only usage I can think of is to attach the sent value to StopIteration, per pep-0380:
return expr in a generator causes StopIteration(expr) to be raised
upon exit from the generator.
def gen():
    return (yield 42)

g = gen()
print(next(g))  # 42
try:
    g.send('AAAA')
except StopIteration as e:
    print(e.value)  # 'AAAA'
But this can be done using an extra variable too, which is more explicit:
def gen():
    a = yield 42
    return a

g = gen()
print(next(g))
try:
    g.send('AAAA')
except StopIteration as e:
    print(e.value)  # 'AAAA'
So it seems return (yield xxx) is merely syntactic sugar. Am I missing something?
Inside a generator the expression (yield 42) will yield the value 42, but it also evaluates to a value: None if you use next(generator), or the given value if you use generator.send(value).
So as you say, you could use an intermediate variable to get the same behavior, not because this is syntactic sugar, but because the yield expression literally evaluates to the value you send it.
You could equally do something like
def my_generator():
    return (yield (yield 42) + 10)
If we call this, using the sequence of calls:
g = my_generator()
print(next(g))
try:
    print('first response:', g.send(1))
    print('second response:', g.send(22))
    print('third response:', g.send(3))
except StopIteration as e:
    print('stopped at', e.value)
First we get the output 42, and the generator is essentially paused in a state you could describe as: return (yield <input will go here> + 10).
If we then call g.send(1), we get the output 11, and the generator is now in the state:
return <input will go here>. Sending g.send(22) will then raise StopIteration(22), because of the way return is handled in generators.
So you never get to the third send because of the exception.
I hope this example makes it a bit more apparent how yield works in generators and why the syntax return (yield something) is nothing special or exotic, and works exactly how you'd expect it to.
As for the literal question, when would you do this? Whenever you want to yield something and then later raise a StopIteration echoing the input the user sent to the generator, because that is literally what the code states. I expect that such behavior is very rarely wanted.
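That said, here is a minimal sketch (my own addition, with made-up names echo and driver) of the one situation where the echoed value is genuinely useful: a delegating generator written with yield from receives it directly, per PEP 380:

def echo():
    return (yield 'ready')       # whatever is sent in comes back as the return value

def driver():
    result = yield from echo()   # yield from captures echo()'s return value
    print('echoed:', result)

d = driver()
print(next(d))         # ready
try:
    d.send('hello')    # prints: echoed: hello
except StopIteration:
    pass               # driver is exhausted once echo() returns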
Why does
    yield [cand]
    return
lead to different output/behavior than
    return [[cand]]
Minimal viable example
uses recursion
the output of the version using yield [1]; return is different from the output of the version using return [[1]]
def foo(i):
    if i != 1:
        yield [1]
        return
    yield from foo(i-1)

def bar(i):
    if i != 1:
        return [[1]]
    yield from bar(i-1)

print(list(foo(1)))  # [[1]]
print(list(bar(1)))  # []
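As a hedged probe (my own addition, not part of the original question): the [[1]] from the recursive bar is not lost; it rides on the StopIteration that yield from absorbs as a return value rather than yielding it as an item:

def bar(i):
    if i != 1:
        return [[1]]
    yield from bar(i-1)

try:
    next(bar(0))        # bar(0) hits `return [[1]]` immediately
except StopIteration as e:
    print(e.value)      # [[1]], attached to the exception, never yielded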
Min viable counter example
does not use recursion
the output of the version using yield [1]; return is the same as the output of the version using return [[1]]
def foo():
    yield [1]
    return

def foofoo():
    yield from foo()

def bar():
    return [[1]]

def barbar():
    yield from bar()

print(list(foofoo()))  # [[1]]
print(list(barbar()))  # [[1]]
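One note (my own observation, not part of the original question): the counter example only comes out the same because this bar contains no yield at all, so it is a plain function, and yield from bar() simply iterates over the returned list. inspect makes the difference visible; bar_plain and bar_gen below are hypothetical names for the two versions of bar above:

import inspect

def bar_plain():
    return [[1]]

def bar_gen(i):
    if i != 1:
        return [[1]]
    yield from bar_gen(i-1)

print(inspect.isgeneratorfunction(bar_plain))  # False: a plain function
print(inspect.isgeneratorfunction(bar_gen))    # True: the yield from makes it a generator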
Full context
I'm solving Leetcode #39: Combination Sum and was wondering why one solution works, but not the other:
Working solution
from functools import cache  # requires Python 3.9+
from typing import List

class Solution:
    def combinationSum(self, candidates: List[int], target: int) -> List[List[int]]:
        @cache
        def helper(targ, i=0):
            if i == N or targ < (cand := candidates[i]):
                return
            if targ == cand:
                yield [cand]
                return
            for comb in helper(targ - cand, i):
                yield comb + [cand]
            yield from helper(targ, i+1)
        N = len(candidates)
        candidates.sort()
        yield from helper(target)
Non-working solution
from functools import cache  # requires Python 3.9+
from typing import List

class Solution:
    def combinationSum(self, candidates: List[int], target: int) -> List[List[int]]:
        @cache
        def helper(targ, i=0):
            if i == N or targ < (cand := candidates[i]):
                return
            if targ == cand:
                return [[cand]]
            for comb in helper(targ - cand, i):
                yield comb + [cand]
            yield from helper(targ, i+1)
        N = len(candidates)
        candidates.sort()
        yield from helper(target)
Output
On the following input
candidates = [2,3,6,7]
target = 7
print(list(Solution().combinationSum(candidates, target)))
the working solution correctly prints
[[3,2,2],[7]]
while the non-working solution prints
[]
I'm wondering why yield [cand]; return works, but return [[cand]] doesn't.
In a generator function, return just defines the value associated with the StopIteration exception implicitly raised to indicate an iterator is exhausted. It's not produced during iteration, and most iterating constructs (e.g. for loops) intentionally ignore the StopIteration exception (it means the loop is over, you don't care if someone attached random garbage to a message that just means "we're done").
For example, try:
>>> def foo():
... yield 'onlyvalue' # Existence of yield keyword makes this a generator
... return 'returnvalue'
...
>>> f = foo() # Makes a generator object, stores it in f
>>> next(f) # Pull one value from generator
'onlyvalue'
>>> next(f) # There is no other yielded value, so this hits the return; iteration over
--------------------------------------------------------------------------
StopIteration Traceback (most recent call last)
...
StopIteration: 'returnvalue'
As you can see, your return value does get "returned" in a sense (it's not completely discarded), but it's never seen by anything iterating normally, so it's largely useless. Outside of rare cases involving using generators as coroutines (where you're using .send() and .throw() on instances of the generator and manually advancing it with next(genobj)), the return value of a generator won't be seen.
In short, you have to pick one:
Use yield anywhere in a function, and it's a generator (whether or not the code path of a particular call ever reaches a yield) and return just ends generation (while maybe hiding some data in the StopIteration exception). No matter what you do, calling the generator function "returns" a new generator object (which you can loop over until exhausted), it can never return a raw value computed inside the generator function (which doesn't even begin running until you loop over it at least once).
Don't use yield, and return works as expected (because it's not a generator function).
As an example to explain what happens to the return value in normal looping constructs, this is what for x in gen(): effectively expands to a C optimized version of:
__unnamed_iterator = iter(gen())
while True:
    try:
        x = next(__unnamed_iterator)
    except StopIteration:  # StopIteration caught here without inspecting it
        break              # loop ends; the exception is cleaned even from sys.exc_info() to avoid possible reference cycles
    # body of loop goes here
# outside of loop, there is no StopIteration object left
As you can see, the expanded form of the for loop has to look for a StopIteration to indicate the loop is over, but it doesn't use it. And for anything that's not a generator, the StopIteration never has any associated values; the for loop has no way to report them even if it did (it has to end the loop when it's told iteration is over, and the arguments to StopIteration are explicitly not part of the values iterated anyway).
Anything else that consumes the generator (e.g. calling list on it) does roughly the same thing as the for loop, ignoring the StopIteration in the same way; nothing except code that specifically expects generators (as opposed to more generalized iterables and iterators) will ever bother to inspect the StopIteration object. At the C layer there are optimizations such that most iterators don't even produce StopIteration objects; they return NULL without setting an exception, which everything using the iterator protocol knows to treat as equivalent to raising StopIteration. So for anything but a generator, there often isn't even an exception to inspect.
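If you do want that value without yield from, the only route is to drive the generator by hand and catch StopIteration yourself. A minimal sketch (mine, reusing the foo() from the interactive example above):

g = foo()
try:
    while True:
        next(g)        # advance manually; a for loop would hide the exception
except StopIteration as e:
    print(e.value)     # 'returnvalue', visible only because we caught it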
According to PEP 8 we should be consistent in our function declarations and ensure that they all have the same return-pattern, i.e. all should return an expression or all should not. However, I am not sure how to apply this to generators.
A generator will yield values as long as the code reaches them, unless a return statement is encountered, in which case it will stop the iteration. However, I don't see any use-case in which returning a value from a generator function can happen. In that spirit, I don't see why it is useful, from a PEP 8 perspective, to end such a function with an explicit return None. In other words, why ought we to write out a return statement for generators if the return expression is only reached when the yield'ing is over?
Example: in the following code, I don't see how hello() can be used to assign 100 to a variable (thus using the return statement). So why does PEP 8 expect us to write a return statement (be it 100 or None).
def hello():
    for i in range(5):
        yield i
    return 100
h = [x for x in hello()]
g = hello()
print(h)
# [0, 1, 2, 3, 4]
print(g)
# <generator object hello at 0x7fd2f285a7d8>
# can we ever get 100?
You have misread PEP8. PEP8 states:
Be consistent in return statements. Either all return statements in a function should return an expression, or none of them should.
(bold emphasis mine)
You should be consistent with how you use return within a single function, not across your whole project.
Use return, it's the only return statement in the function.
However, I don't see any use-case in which returning a value from a generator function can happen.
The return value of a generator is attached to the StopIteration exception raised:
>>> def gen():
... if False: yield
... return 'Return value'
...
>>> try:
... next(gen())
... except StopIteration as ex:
... print(ex.value)
...
Return value
And this is also the mechanism by which yield from produces a value; the return value of yield from is the value attribute on the StopIteration exception. A generator can thus return a result to code using result = yield from generator by using return result:
>>> def bar():
... result = yield from gen()
... print('gen() returned', result)
...
>>> next(bar(), None)
gen() returned Return value
This feature is used in the Python standard library; e.g. in the asyncio library the value of StopIteration is used to pass along Task results, and the @coroutine decorator uses res = yield from ... to run a wrapped generator or awaitable and pass through the return value.
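As a rough sketch of that asyncio-era pattern (my own illustration, not actual standard-library code): return values propagate through arbitrarily deep chains of yield from, so each layer can pass a result along:

def inner():
    if False:
        yield                     # makes this a generator without yielding anything
    return 'result'

def middle():
    return (yield from inner())   # pass inner()'s return value through

def outer():
    print('got', (yield from middle()))

next(outer(), None)
# prints: got result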
So, from a PEP 8 point of view, for generators there are two possibilities:
You are using return to exit the generator early, say in a loop with if. Use return, no need to add None:
def foo():
    while bar:
        yield ham
        if spam:
            return
You are using return <something> to exit and set StopIteration.value. Use return <something> consistently throughout your generator, even when returning None:
def foo():
    for bar in baz:
        yield bar
        if spam:
            return 'The bar bazzed the spam'
    return None
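As a hedged follow-up (my own example, not from the answer): the string returned in the second pattern is only ever seen by a delegating generator, e.g. via yield from:

def foo():
    yield 'ham'
    return 'The bar bazzed the spam'

def consumer():
    result = yield from foo()    # receives foo()'s return value
    print(result)

print(list(consumer()))
# prints: The bar bazzed the spam
# then:   ['ham']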
In Python 2 it was an error to use return with a value together with yield in a function definition. But for this code in Python 3.3:
def f():
    return 3
    yield 2

x = f()
print(x.__next__())
there is no error that return is used in a function with yield. However, when __next__ is called, the StopIteration exception is raised. Why is the value 3 not simply returned? Is this return somehow ignored?
This is a new feature in Python 3.3. Much like return in a generator has long been equivalent to raise StopIteration(), return <something> in a generator is now equivalent to raise StopIteration(<something>). For that reason, the exception you're seeing should be printed as StopIteration: 3, and the value is accessible through the attribute value on the exception object. If the generator is delegated to using the (also new) yield from syntax, it is the result. See PEP 380 for details.
def f():
    return 1
    yield 2

def g():
    x = yield from f()
    print(x)

# g is still a generator so we need to iterate to run it:
for _ in g():
    pass
This prints 1, but not 2.
The return value is not ignored, but generators only yield values; a return just ends the generator, in this case early. Advancing the generator never reaches the yield statement in that case.
Whenever an iterator reaches the 'end' of the values to yield, StopIteration must be raised. Generators are no exception. As of Python 3.3 however, any return expression becomes the value of the exception:
>>> def gen():
... return 3
... yield 2
...
>>> try:
... next(gen())
... except StopIteration as ex:
... e = ex
...
>>> e
StopIteration(3,)
>>> e.value
3
Use the next() function to advance iterators, instead of calling .__next__() directly:
print(next(x))
I'm curious as to what's happening here. Can someone who knows generators and coroutines well explain this code.
def b():
    for i in range(5):
        yield i
        x = (yield)
        print(x)

def a():
    g = b()
    next(g)
    for i in range(4):
        g.send(5)
        print(next(g))

a()
output
None
1
None
2
None
3
None
4
but when I switch around lines 3 and 4: the lines yield i and x = (yield), I get the following.
5
None
5
None
5
None
5
None
I suspect the problem might stem from trying to use the yield statement to both receive and send values in the same function. Is this not possible in Python?
I have successfully written a couple of programs that use coroutines, so I am familiar with the way they work, but I am confused as to the way this snippet of code is behaving. Any insights into this would be appreciated.
Thanks
Edit: Thanks BrenBarn and unutbu for your answers. What's happening here makes more sense when you expand the problem out as such.
def b():
    for i in range(5):
        yield i
        x = yield None

def a():
    g = b()
    print('* got', g.send(None))
    for i in range(4):
        print('+ got', g.send(5))
        print('- got', g.send(None))

a()
I don't quite get what you're asking, but basically: when you use send, it causes the most-recently-reached yield expression in the generator to evaluate to the value you send. Note also that send advances the generator to the next yield. One thing that may be confusing you is that you are printing the value of x inside the generator, and you are printing the value of next(g) inside a, but the generator is also yielding values at g.send(5), and you aren't printing those.
In your first case, your first send causes the yield i statement to evaluate to 5 inside b, but you don't use this value inside b (you don't assign yield i to anything), so it does nothing. Also, when you do send(5), the generator is yielding None (from the x = (yield) line), but you don't print it so you don't know this. Then you advance the generator again with next(g). The most-recently-reached yield is the x = yield, but next(g) passes no value, so x gets set to None.
In the second case, the parity of the calls is reversed. Now your first send does send to the x = yield line, so x gets set to 5. This send also yields the loop value in b, but you ignore this value in a and don't print it. You then print next(g), which is None. On each subsequent send, b prints the value of x, which is always 5 because that's what you always send, and then a prints the next yielded value, which is always None (because that's what x = yield yields).
I don't quite get what you mean by "using the yield statement to both receive and send values in the same function". You can certainly do this, but you have to realize that: a) a value (None) is still sent even when you call next(g); and b) a value is still yielded when you call g.send(5).
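To make both directions visible at once, here is a minimal sketch (my own, not from the answer) that prints what each next()/send() returns to the caller as well as what arrives inside the generator:

def b():
    for i in range(2):
        got = yield i            # the caller's sent value lands here
        print('b received:', got)

g = b()
print('a got:', next(g))    # prints: a got: 0
print('a got:', g.send(5))  # prints: b received: 5, then: a got: 1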
Using traceit to step through the program line-by-line:
import sys
import linecache

class SetTrace(object):
    '''
    with SetTrace(monitor):
        ...
    '''
    def __init__(self, func):
        self.func = func
    def __enter__(self):
        sys.settrace(self.func)
        return self
    def passit(self, frame, event, arg):
        return self.passit
    def __exit__(self, ext_type, exc_value, traceback):
        sys.settrace(self.passit)

def traceit(frame, event, arg):
    '''
    http://www.dalkescientific.com/writings/diary/archive/2005/04/20/tracing_python_code.html
    '''
    if event == "line":
        lineno = frame.f_lineno
        filename = frame.f_globals["__file__"]
        if (filename.endswith(".pyc") or
                filename.endswith(".pyo")):
            filename = filename[:-1]
        name = frame.f_globals["__name__"]
        line = linecache.getline(filename, lineno)
        print("%s # %s:%s" % (line.rstrip(), name, lineno))
    return traceit

def b():
    for i in range(5):
        yield i
        x = (yield)
        print(x)

def a():
    g = b()
    next(g)
    for i in range(4):
        g.send(5)
        print(next(g))

with SetTrace(traceit):
    a()
we obtain
g = b()                    # __main__:44
next(g)                    # __main__:45  # runs b until you get to a yield
    for i in range(5):     # __main__:38
        yield i            # __main__:39  # stop before the yield; resume a
        ^
for i in range(4):         # __main__:46
    g.send(5)              # __main__:47  # resume b; (yield i) expression evals to 5, then thrown away
        x = (yield)        # __main__:40  # stop before yield; resume a
            ^
    print(next(g))         # __main__:48  # next(g) called; resume b; print not called yet
        print(x)           # __main__:41  # next(g) causes (yield) to evaluate to None
None
    for i in range(5):     # __main__:38
        yield i            # __main__:39  # yield 1; resume a; print(next(g)) prints 1
1
for i in range(4):         # __main__:46
    g.send(5)              # __main__:47  # resume b; (yield i) expression evals to 5, then thrown away
The other scenario, where x = (yield) and yield i are reversed, can be analyzed similarly.
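For concreteness, here is the swapped variant written out (a sketch of my own, following the same method as the trace above):

def b():
    for i in range(5):
        x = (yield)          # send() now lands here first
        print(x)
        yield i              # the value this yields to send() is discarded

def a():
    g = b()
    next(g)                  # run b up to the first (yield); nothing printed yet
    for i in range(2):
        g.send(5)            # x = 5; b prints 5; b pauses at yield i
        print(next(g))       # resume past yield i; the next (yield) produces None

a()
# prints: 5, None, 5, None

which matches the second output in the question.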
Suppose I have some manager object. This object's API has a main_hook function that gets another function f as its argument and runs the given f in a loop, doing some stuff in between each iteration:
def main_hook(self, f):
    while self.shouldContinue():
        # do some preparations
        f(self)
        # do some tear down
Now, I also have (more accurately, would like to have) a function stop_and_do_stuff that, once called, stops main_hook dead in its tracks, returns control to whichever function called main_hook, and after that function has finished what it's doing, gives control back to main_hook to continue. Basically the result will be the same as doing:
def main_hook(self, f):
    while self.shouldContinue():
        # do some preparations
        yield
        # do some tear down
except that instead of yield I want to have a call to f(), while giving f the option to call self.stop_and_do_stuff().
I can't work around this by making f also a generator, for two reasons:
1. f isn't part of my API; it's given to me by a user who uses my lib.
2. Even if I could ask him to use yield, the place in the code where he will need to call stop_and_do_stuff won't be directly inside f, but rather somewhere in the function stack inside f(), though not directly in it, e.g.
def h(manager):
    # do stuff
    if should_stop:
        manager.stop_and_do_stuff()
    # do more stuff

def g(manager):
    # some stuff
    if should_stop:
        manager.stop_and_do_stuff()
    # more stuff
    if should_stop_again:
        manager.stop_and_do_stuff()
    if should_call_h:
        h(manager)

def f(manager):
    g(manager)
so if I choose to make f a generator, I also need to make g a generator and also h, otherwise this trick won't work.
Is there any solution to all of this? Maybe I'm trying to solve it the wrong way?
(I know this question is long and ugly - it's the best I could do. If something isn't clear please tell me and I'll clarify it)
EDIT
Maybe PEP 342 is the solution?
My previous answer describes how to do this in Python 2, which is very ugly. But now I ran across PEP 380: Syntax for Delegating to a Subgenerator. That does exactly what you ask. The only problem is that it requires Python 3. But that shouldn't really be a problem.
Here's how it works:
def worker():
    yield 1
    yield 2
    return 3

def main():
    yield 0
    value = yield from worker()
    print('returned %d' % value)
    yield 4

for m in main():
    print('generator yields %d' % m)
The result of this is:
generator yields 0
generator yields 1
generator yields 2
returned 3
generator yields 4
Exceptions are passed through the way you would expect.
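A quick hedged check of that last claim (my own example): an exception thrown into the outer generator is delivered inside the subgenerator:

def worker():
    try:
        yield 1
    except ValueError:
        print('worker caught it')

def main():
    yield from worker()

m = main()
next(m)                  # advance to worker's first yield
try:
    m.throw(ValueError)  # delivered at the yield inside worker()
except StopIteration:
    pass                 # worker handled it and finished, so main finished too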
I believe I should also add an answer from the other point of view, i.e. not trying to explain how you could achieve what we can understand of what you are trying to do, but why yield definitely couldn't possibly work.
When a function contains the yield keyword it is deeply modified. It is still a callable, but not a normal function any more: it becomes a factory that returns an iterator.
From the caller's point of view there is no difference between the three implementations below (except that the yield one is so much simpler).
##########################################
print "Function iterator using yield",

def gen():
    for x in range(0, 10):
        yield x

f = gen()
try:
    while True:
        print f.next(),
except StopIteration:
    pass

for x in gen():
    print x,
print

#########################################
print "Class iterator defining __iter__ and next",

class gen2(object):
    def __init__(self):
        self.index = 0
        self.limit = 10
    def __iter__(self):
        return self
    def next(self):
        if self.index >= self.limit:
            raise StopIteration
        self.index += 1
        return self.index - 1

f = gen2()
try:
    while True:
        print f.next(),
except StopIteration:
    pass

for x in gen2():
    print x,
print

#########################################
print "Function iterator using iter() and sentinel",

def gen3():
    def g3():
        if g3.index is None:
            g3.index = 0
        g3.index += 1
        return g3.index - 1
    g3.index = None
    return iter(g3, 10)

f = gen3()
try:
    while True:
        print f.next(),
except StopIteration:
    pass

for x in gen3():
    print x,
print
Then you should understand that yield is not much about control flow, but about keeping the call context inside variables. Once that is understood, you have to decide if the API of main_loop really wants to provide an iterator to its caller. Then if so, if f may loop, it must also be an iterator (and there should be a loop around calls to f(), like below).
def main_hook(self, f):
    while self.shouldContinue():
        # do some preparations
        for v in f(self):
            yield v
        # do some tear down
But you should not care whether f() has to call inner functions g(), etc. That is completely irrelevant. You provide a lib, and it is your user's problem to call it with an appropriate iterable. If you believe your lib's users won't be able to, you will have to change the overall design.
Hope it helps.
I don't understand the whole thing either (what does the main_hook caller look like?), but I would say: throw a StopNow exception when you should stop, just like you would raise StopIteration when your generator is finished.
Here is how I understood the thing, as well as what I would do.
class StopNow(Exception):
    pass

def main_hook(self, f):
    got_stop_now_exc = False
    while not got_stop_now_exc and self.shouldContinue():
        # do some preparations
        try:
            f(self)
        except StopNow:
            got_stop_now_exc = True
        # do some compulsory tear down, exception or not

def stop_and_do_stuff():
    raise StopNow()

def my_f():
    if needed:
        stop_and_do_stuff()

def the_main_hook_caller():
    while i_should:
        managerthingie.main_hook(my_f)
        do_stuff()
The behavior you describe looks exactly like a simple function call. Like below.
def f(manager):
    print("Entering f")
    manager.stop_and_do_stuff()
    print("Exiting f")

class Manager(object):
    def shouldContinue(self):
        return True
    def stop_and_do_stuff(self):
        print("Manager stop and do stuff")
    def main_hook(self, f):
        while self.shouldContinue():
            print("Manager Setup")
            f(self)
            print("Manager Tear Down")
No problem if f() is provided by another user, or if stop_and_do_stuff is called from some inner function. If you also want the manager to be able to unwind the stack from stop_and_do_stuff and really exit in some cases, no problem: just raise some exception from it and catch it in main_hook or in upper code.
You should be able to do from inside stop_and_do_stuff() whatever you want to do from the caller of main_hook. If not, you should explain why.
What is unclear in the question is what's happening on the caller side of main_hook() and why you would want to be able to exit the main_hook loop, but not really. Either the main_loop caller expects a generator or it does not. You need to explain that part if you want a sensible answer (some context information would also be nice; if you really explain what you are trying to do and your real restrictions, there are probably well-known usual solutions. You said f is provided by some other user and main_hook is in a lib, but what about main_hook's caller?).
I am not quite sure what exactly you are trying to achieve, so maybe if you can explain the problem more, instead of giving a solution, that would be better.
From my partial understanding, why don't you do something like this:
def main_hook(self, f):
    while self.shouldContinue():
        # do some preparations
        stop_and_do_stuff = f(self)
        if stop_and_do_stuff:
            yield
        # do some tear down
So basically f returns a flag indicating whether to stop or not, and if it says stop, we yield to the function which called main_hook; that function can then continue after doing some stuff.
e.g.
class A(object):
    def main_hook(self, f):
        while self.shouldContinue():
            # do some preparations
            stop = f(self)
            if stop:
                yield
            # do some tear down
    def shouldContinue(self):
        return True

def f(a):
    return True

a = A()
for x in a.main_hook(f):
    print(x)