8 Coroutines

Generators Revisited

Review: https://notes.jpt.sh/dir-python/generators/

def powers_of_two():
    val = 2
    while True:
        yield val
        val *= 2

def gen_range(start, end, skip=1):
    x = start
    while x < end:
        yield x
        x += skip
yield from

Useful for flattening/delegating to other generators:

yield from gen
# same as
for x in gen:
    yield x

def yield_from_example():
    yield from [1, 2, 3]
    yield from "word"
    yield from range(9, 12)

for c in yield_from_example():
    print(c)

1
2
3
w
o
r
d
9
10
11
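Beyond flattening, yield from also forwards sent values into the subgenerator and captures its return value. A minimal sketch (the names summer and delegator are illustrative, not from the notes):

```python
def summer():
    # Accumulate sent values until None is sent, then return the total.
    total = 0
    while True:
        value = yield
        if value is None:
            return total
        total += value

def delegator(results):
    # yield from forwards each send() to summer() and evaluates to
    # summer()'s return value once it finishes.
    while True:
        results.append((yield from summer()))

results = []
d = delegator(results)
next(d)  # prime the delegator
for v in [1, 2, 3, None, 10, 20, None]:
    d.send(v)
print(results)  # [6, 30]
```

Sending None ends one summer() and the delegator immediately starts a fresh one, so a single outer generator can collect many sub-results.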
def flatten(seq: list):
    for item in seq:
        if isinstance(item, list):
            yield from flatten(item)
        else:
            yield item

for i in flatten([[1, 2], [3, [4]], [5, 6, 7, [[8]]]]):
    print(i)

1
2
3
4
5
6
7
8
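The isinstance(item, list) check only handles nested lists. One possible generalization (an assumption, not from the notes) accepts any iterable while guarding against strings, which are iterable but usually treated as atoms:

```python
from collections.abc import Iterable

def flatten_any(seq):
    # Recursively flatten any nested iterables, but treat strings
    # and bytes as atoms so we don't explode them into characters.
    for item in seq:
        if isinstance(item, Iterable) and not isinstance(item, (str, bytes)):
            yield from flatten_any(item)
        else:
            yield item

print(list(flatten_any([1, (2, 3), [4, [5]], "ab"])))  # [1, 2, 3, 4, 5, 'ab']
```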
Coroutines
Generators are a form of coroutine: a method of allowing cooperative multitasking, where two or more functions execute in an interwoven manner.
With yield the calling function can react to the output of the called function:
def caller():
    for n in powers_of_two():
        if n > 100:
            print(n)
        if n > 1000:
            break

caller()

128
256
512
1024
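The same consume-and-react pattern can also be written with itertools, which avoids the manual break (a sketch, not from the notes):

```python
from itertools import dropwhile, takewhile

def powers_of_two():
    val = 2
    while True:
        yield val
        val *= 2

# keep only the powers between 100 and 1024, no explicit loop/break
wanted = takewhile(lambda n: n <= 1024,
                   dropwhile(lambda n: n <= 100, powers_of_two()))
print(list(wanted))  # [128, 256, 512, 1024]
```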
But our interaction with the inner function is limited to reacting: we can stop iterating, or iterate more than once, but that’s about it.
Let’s take a closer look at what is returned: the generator object.

def gen_range(start, end, skip=1):
    x = start
    while x < end:
        yield x
        x += skip

g = gen_range(0, 10, 2)
print(g)

<generator object gen_range at 0x7fb144ba4ad0>
__next__

The returned generator object is an iterator; it advances each time __next__ is called:

g.__next__()

0

# more idiomatically written as
next(g)

2

# but we can also send values back in (here the value is ignored)
g.send(999)

4
send

When send is called, the function resumes execution (it must already have been started with next or send(None)):

def skip_range(start, end):
    x = start
    while x < end:
        # value from send(skip)
        skip = yield x
        x += skip

g = skip_range(0, 20)
print(next(g))
print(g.send(3))
print(g.send(4))
print(g.send(5))

0
3
7
12
Note: send(None) is equivalent to __next__()
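The priming requirement matters in practice: calling send() with a non-None value on a generator that has not started yet raises a TypeError. A quick sketch, reusing skip_range from above:

```python
def skip_range(start, end):
    x = start
    while x < end:
        skip = yield x
        x += skip

g = skip_range(0, 20)
try:
    g.send(3)  # not started yet: there is no paused yield to deliver 3 to
except TypeError as e:
    print(e)   # can't send non-None value to a just-started generator
g.send(None)   # equivalent to next(g); runs to the first yield
print(g.send(3))  # 3
```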
def running_average():
    total, count = 0, 0
    while True:
        value = yield (total / count if count else None)
        total += value
        count += 1

ra1 = running_average()
ra2 = running_average()

# same thing
ra1.send(None)
next(ra2)

print(ra1.send(5))
print(ra1.send(4))
print(ra1.send(3))

5.0
4.5
4.0
print(ra2.send(100))
print(ra2.send(10))
print(ra2.send(1))
print(ra2.send(1))
print(ra2.send(1))
print(ra2.send(1))

100.0
55.0
37.0
28.0
22.6
19.0
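Having to call next() (or send(None)) before the first real send() is easy to forget. A common convenience pattern (not from these notes) is a decorator that primes the coroutine at creation time:

```python
from functools import wraps

def primed(func):
    # Wrap a generator function so the generator it returns is already
    # advanced to its first yield and ready for send().
    @wraps(func)
    def wrapper(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)
        return gen
    return wrapper

@primed
def running_average():
    total, count = 0, 0
    while True:
        value = yield (total / count if count else None)
        total += value
        count += 1

ra = running_average()  # no explicit next() needed
print(ra.send(5))       # 5.0
print(ra.send(4))       # 4.5
```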
throw
If an exception is raised within a generator, it propagates to the calling function and the generator is exhausted.
It is also possible to send an exception into the coroutine:
def resilient(max):
    n = 0
    while n < max:
        try:
            yield n
            n += 1
        except OSError as e:
            print(f"exiting from {type(e)}")
            return
        except Exception as e:
            print(f"ignoring exception {type(e)}")

g = resilient(10)
print(next(g))
print(next(g))
print(next(g))
print(next(g))
g.throw(ValueError)
print(next(g))
try:
    g.throw(OSError)
except StopIteration as e:
    pass

0
1
2
3
ignoring exception <class 'ValueError'>
4
exiting from <class 'OSError'>
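A subtlety in the run above: throw() itself resumes the generator and returns the next value it yields, which is why g.throw(ValueError) silently produced a value that was never printed (and why 3 appears only once in the output: the exception skipped n += 1). A small sketch of the same behavior:

```python
def resilient(max):
    n = 0
    while n < max:
        try:
            yield n
            n += 1
        except Exception as e:
            print(f"ignoring exception {type(e)}")

g = resilient(10)
print(next(g))              # 0
# throw() delivers the exception at the paused yield, the handler runs,
# and the loop re-yields n (still 0, since n += 1 was skipped):
print(g.throw(ValueError))  # 0
```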
close
StopIteration is an exception that typically passes silently when iterating. Calling close() raises GeneratorExit inside the generator to force it to stop:
def until_stopped():
    n = 0
    while True:
        yield n
        n += 1

g = until_stopped()
print(next(g))
print(next(g))
print(next(g))
print(next(g))
g.close()

try:
    print(next(g))
except StopIteration as e:
    print("was stopped")

0
1
2
3
was stopped
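Because close() raises GeneratorExit at the paused yield, a generator can run cleanup code with try/finally before it stops. A sketch (with_cleanup is an assumed name):

```python
def with_cleanup(log):
    # The finally block runs when close() raises GeneratorExit
    # at the paused yield (and also on normal exhaustion).
    n = 0
    try:
        while True:
            yield n
            n += 1
    finally:
        log.append("cleaned up")

log = []
g = with_cleanup(log)
print(next(g))  # 0
print(next(g))  # 1
g.close()       # triggers the finally block
print(log)      # ['cleaned up']
```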
Cooperative, Not Parallel
At any given time, a single function is executing; control passes back and forth using yield and send.
If we want to have more than one function executing in parallel, we will need to turn to threads, processes, and eventually async coroutines.
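For a glimpse of where this leads (not covered in these notes), here is a minimal asyncio sketch in which two coroutines interleave under an event loop rather than via manual send calls; worker and main are illustrative names:

```python
import asyncio

async def worker(name, out):
    for i in range(3):
        out.append(f"{name}:{i}")
        await asyncio.sleep(0)  # cooperatively yield control to the event loop

async def main():
    out = []
    # run both workers under the event loop; they take turns at each await
    await asyncio.gather(worker("a", out), worker("b", out))
    return out

print(asyncio.run(main()))  # e.g. ['a:0', 'b:0', 'a:1', 'b:1', 'a:2', 'b:2']
```

Even here the multitasking is still cooperative: each coroutine runs until its next await, just as a generator runs until its next yield.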