For most applications, `yield from` just yields everything from the iterable on its right, in order:
    def iterable1():
        yield 1
        yield 2

    def iterable2():
        yield from iterable1()
        yield 3

    assert list(iterable2()) == [1, 2, 3]
For 90% of the users who see this post, I'm guessing that this will be explanation enough: `yield from` simply delegates to the iterable on the right-hand side.
Coroutines
However, there are some more esoteric generator circumstances that also matter here. A lesser-known fact about generators is that they can be used as coroutines. This isn't super common, but you can send data to a generator if you want:
    def coroutine():
        x = yield None
        yield 'You sent: %s' % x

    c = coroutine()
    next(c)
    print(c.send('Hello world'))
Aside: You might be wondering what the use case is for this (and you're not alone). One example is the `contextlib.contextmanager` decorator. Coroutines can also be used to parallelize certain tasks. I don't know of too many places where this is taken advantage of, but Google App Engine's `ndb` datastore API uses it for asynchronous operations in a pretty nifty way.
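As a quick illustration of the `contextlib.contextmanager` case, here's a minimal sketch (the `opened` helper and the file name are mine, just for illustration). The decorator drives the generator for you: it advances to the `yield`, hands the yielded value to the `with` block, and resumes (or throws into) the generator afterwards:

    from contextlib import contextmanager

    @contextmanager
    def opened(path):
        f = open(path)
        try:
            yield f  # control passes to the body of the with-block here
        finally:
            f.close()

    # Usage:
    # with opened('somefile.txt') as f:
    #     print(f.read())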
Now, let's assume you `send` data to a generator that is yielding data from another generator... How does the original generator get notified? The answer is that it doesn't in Python 2.x, where you need to wrap the generator yourself:
    def python2_generator_wrapper():
        for item in some_wrapped_generator():
            yield item
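To see why this wrapper falls short, here's a quick sketch (the `inner` and `naive_wrapper` names are mine): a value sent to the wrapper is received by the wrapper's own `yield` expression and silently discarded, so the inner generator never sees it:

    def inner():
        x = yield 'ready'
        yield 'inner saw: %s' % x

    def naive_wrapper():
        for item in inner():
            yield item  # sent values land here and are thrown away

    w = naive_wrapper()
    print(next(w))          # 'ready'
    print(w.send('Hello'))  # 'inner saw: None' -- 'Hello' never reached inner()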
You can try to forward sent values yourself, but not without a whole lot of pain:
    def python2_coroutine_wrapper():
        """This doesn't work. Somebody smarter than me needs to fix it...
        Pain. Misery. Death lurks here :-("""
        # See https://www.python.org/dev/peps/pep-0380/#formal-semantics
        # for an actual working implementation :-)
        g = some_wrapped_generator()
        for item in g:
            try:
                val = yield item
            except Exception as forward_exception:  # What exceptions should I not catch again?
                g.throw(forward_exception)
            else:
                if val is not None:
                    g.send(val)  # Oops, we just consumed another cycle of g ... How do we handle that properly?
This all becomes trivial with `yield from`:

    def coroutine_wrapper():
        yield from coroutine()

Because `yield from` truly delegates (everything!) to the underlying generator.
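To make "everything" concrete, here's a minimal sketch reusing the `coroutine` defined above: a value sent to the wrapper flows straight through to the inner generator, with no bookkeeping on our part:

    w = coroutine_wrapper()
    next(w)                       # advances to the first yield inside coroutine()
    print(w.send('Hello world'))  # prints 'You sent: Hello world'

Calls to `throw()` and `close()` are forwarded to the inner generator in the same transparent way.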
Return semantics
Note that the PEP in question also changes the return semantics. While not directly part of the OP's question, it's worth a quick digression if you are up for it. In Python 2.x, you can't do the following:
    def iterable():
        yield 'foo'
        return 'done'
It's a `SyntaxError`. With the update to `yield`, the above function is now legal. Again, the primary use case is with coroutines (see above). You can send data to the generator, and it can do its work magically (maybe using threads?) while the rest of the program does other things. When flow control passes back to the generator, `StopIteration` will be raised (as is normal for the end of a generator), but now the `StopIteration` will have a data payload. It is the same thing as if a programmer instead wrote:
    raise StopIteration('done')
Now the caller can catch that exception and do something with the data payload to benefit the rest of humanity.
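For a concrete sketch of what that looks like in Python 3 (the `caller` name is mine), `yield from` captures the payload as the value of the expression, and you can also dig it out of `StopIteration.value` by hand:

    def caller():
        result = yield from iterable()  # result == 'done'
        yield 'iterable returned: %s' % result

    assert list(caller()) == ['foo', 'iterable returned: done']

    # Or catch the exception manually:
    it = iterable()
    assert next(it) == 'foo'
    try:
        next(it)
    except StopIteration as e:
        assert e.value == 'done'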