How to create a generator function that calls generator functions - python

Consider the following (non-working) example code:
class MyGenerator:
    def test_gen(self):
        for i in range(1, 5):
            if i % 2:
                self.foo(i)
            else:
                self.bar(i)

    def foo(self, i):
        yield i

    def bar(self, i):
        yield i**2

g = MyGenerator()
for i in g.test_gen():
    print(i)
This will not work, because test_gen has no yield and is no longer a generator function. In this small example I could just return the values from foo and bar and put the yield into test_gen, however I have a case where that's not possible. How can I turn test_gen into a generator function again?

You need to loop over the results of the delegated generators and yield those:
def test_gen(self):
    for i in range(1, 5):
        if i % 2:
            for res in self.foo(i):
                yield res
        else:
            for res in self.bar(i):
                yield res
If you are using Python 3.3 or up, you'd use the yield from expression to do proper generator delegation:
def test_gen(self):
    for i in range(1, 5):
        if i % 2:
            yield from self.foo(i)
        else:
            yield from self.bar(i)
Both re-introduce yield into the function, once again making it a generator function.
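With either version, the delegation works as expected:
g = MyGenerator()
for i in g.test_gen():
    print(i)
This prints 1, 4, 3 and 16: odd values of i pass through foo() unchanged, and even values come back squared from bar().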

Why not just:
class MyGenerator:
    def test_gen(self):
        for i in range(1, 5):
            if i % 2:
                yield next(self.foo(i))
            else:
                yield next(self.bar(i))

    def foo(self, i):
        yield i

    def bar(self, i):
        yield i**2
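Note this only works because foo() and bar() each yield exactly one value. next() pulls a single item from a brand-new generator, so if a delegated generator yielded several values, everything after the first would be silently dropped. A hypothetical multi-value foo() shows the difference:
def foo(self, i):
    yield i
    yield i + 100  # reached by `yield from self.foo(i)`, lost by `yield next(self.foo(i))`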

Related

Implementing the Fibonacci series using a greedy approach?

I have implemented the Fibonacci series using recursion:
def fibonacci(n):
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fibonacci(n-1) + fibonacci(n-2)
I have also implemented it using dynamic programming:
def fibonacci(n):
    result = [0, 1]
    if n > 1:
        for i in range(2, n+1):
            result.append(result[i-1] + result[i-2])
    return result[n]
I want to implement it using a greedy approach, but I am unable to think of it in greedy terms. Please provide a greedy approach for this problem.
I'm not sure what you mean by "greedy" here; Fibonacci has no natural greedy formulation. But here are several ways to compute it:
Example 1: Using a loop
def fib(n):
    a, b = 1, 1
    for i in range(n - 1):
        a, b = b, a + b
    return a

print(fib(5))
Example 2: Using recursion
def fibR(n):
    if n == 1 or n == 2:
        return 1
    return fibR(n-1) + fibR(n-2)

print(fibR(5))
Example 3: Using generators
a, b = 0, 1
def fibI():
    global a, b
    while True:
        a, b = b, a + b
        yield a

f = fibI()
next(f)
next(f)
next(f)
next(f)
print(next(f))
Example 4: Using memoization
def memoize(fn, arg):
memo = {}
if arg not in memo:
memo[arg] = fn(arg)
return memo[arg]
fib() as written in example 1.
fibm = memoize(fib,5)
print fibm
Example 5: Using memoization as a decorator
class Memoize:
    def __init__(self, fn):
        self.fn = fn
        self.memo = {}

    def __call__(self, arg):
        if arg not in self.memo:
            self.memo[arg] = self.fn(arg)
        return self.memo[arg]

@Memoize
def fib(n):
    a, b = 1, 1
    for i in range(n - 1):
        a, b = b, a + b
    return a

print(fib(5))
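As a side note, the standard library already ships memoization; a minimal sketch using functools.lru_cache on the recursive version:
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Same recursion as the original, but each fib(k) is computed only once.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(5))   # 5
print(fib(50))  # 12586269025, instant thanks to the cache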

How to create a next method for a class when the next method is supposed to return the values of two generators?

I have the following example in which the __next__ method of a class is supposed to return the values from two generators:
class Test():
    def __next__(self):
        g1, g2 = self._gen1(), self._gen2()
        return next(g1), next(g2)

    def _gen1(self):
        i = 0
        while True:
            yield i
            i += 2

    def _gen2(self):
        i = 1
        while True:
            yield i
            i += 2
However, when I call next() on an instance of this class, the values are not incremented:
>>> t = Test()
>>> next(t)
(0, 1)
>>> next(t)
(0, 1)
What is wrong? Is there a more elegant way to write this class?
Although I have no idea what you are trying to accomplish, here is a cleaned-up version which (I think) does what you want:
class Test():
    def __init__(self):
        self.g1 = self._gen1()
        self.g2 = self._gen2()

    def __next__(self):
        return next(self.g1), next(self.g2)

    def _gen1(self):
        i = 0
        while True:
            yield i
            i += 2

    def _gen2(self):
        i = 1
        while True:
            yield i
            i += 2

t = Test()
print(next(t))
print(next(t))
print(next(t))
Your code doesn't work because it creates brand-new generators every time __next__() is called, which effectively resets them to their initial state before their next values are returned:
def __next__(self):
    g1, g2 = self._gen1(), self._gen2()  # Don't do this here.
    return next(g1), next(g2)
You can fix that by adding an __init__() method and initializing them in it:
class Test:
    def __init__(self):
        self.g1, self.g2 = self._gen1(), self._gen2()  # Initialize here.

    def __next__(self):
        return next(self.g1), next(self.g2)
    ...
A more elegant and slightly more concise way to do it, which likewise avoids the problem, is to use the built-in zip() function to create a single iterator that returns a pair of next values, one from each generator, every time it's advanced. Another advantage is that it's very easy to extend to handle even more generators by simply changing the __init__() method.
Here's what I mean:
class Test:
    def __init__(self):
        self.generators = zip(self._gen1(), self._gen2())

    def __next__(self):
        return next(self.generators)

    def _gen1(self):
        i = 0
        while True:
            yield i
            i += 2

    def _gen2(self):
        i = 1
        while True:
            yield i
            i += 2

t = Test()
for _ in range(3):
    print(next(t))
Output:
(0, 1)
(2, 3)
(4, 5)
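As mentioned, scaling up only touches __init__(). For instance, zipping in a hypothetical third generator as well:
def __init__(self):
    # zip() happily pairs up any number of iterators; __next__ then yields 3-tuples.
    self.generators = zip(self._gen1(), self._gen2(), self._gen3())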

Parallel random distribution

I have two iterators in Python, and both should produce the same "random" sequence (both should run in parallel). For instance:
class Iter1(object):
    def __iter__(self):
        for i in random_generator():
            yield i

class Iter2(object):
    def __iter__(self):
        for i in random_generator():
            yield i

for el1, el2 in zip(Iter1(), Iter2()):
    print('{} {}'.format(el1, el2))
The output should be something like:
0.53534 0.53534
0.12312 0.12312
0.19238 0.19238
How can I define random_generator() so that it produces the same sequence of random numbers for both iterators?
Note:
They should run in parallel
I can't generate the sequence in advance (it's streaming data, so I don't know the size of the sequence)
Thanks.
Specify the same seed for each call of random_generator():
import random

def random_generator(l, seed=None):
    r = random.Random(seed)
    for i in range(l):
        yield r.random()

class Iter1(object):
    def __init__(self, seed):
        self.seed = seed

    def __iter__(self):
        for i in random_generator(10, self.seed):
            yield i

class Iter2(object):
    def __init__(self, seed):
        self.seed = seed

    def __iter__(self):
        for i in random_generator(10, self.seed):
            yield i

# The seed can be any hashable object, but don't use None; that
# tells random.Random() to seed from the current time, so the two
# iterators could end up with different sequences.
common_seed = object()
for el1, el2 in zip(Iter1(common_seed), Iter2(common_seed)):
    print('{} {}'.format(el1, el2))
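Since the question mentions streaming data of unknown length, the generator doesn't actually need a length parameter; a minimal unbounded variant of the same seeding idea:
import random

def random_generator(seed):
    # Two generators built with the same seed yield identical infinite streams.
    r = random.Random(seed)
    while True:
        yield r.random()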
There is no way to control the random number generation like this without seeding; if you want that, you'd have to write your own random function. But a simpler, more Pythonic alternative is to create one iterator and duplicate it with itertools.tee, so that both copies produce the same random sequence:
In [27]: import random

In [28]: class Iter1(object):
   ....:     def __init__(self, number):
   ....:         self.number = number
   ....:     def __iter__(self):
   ....:         for _ in range(self.number):
   ....:             yield random.random()
   ....:

In [29]: num = Iter1(5)

In [30]: from itertools import tee

In [31]: num, num2 = tee(num)

In [32]: list(zip(num, num2))
Out[32]:
[(0.485400998727448, 0.485400998727448),
 (0.8801649381536764, 0.8801649381536764),
 (0.9684025615967844, 0.9684025615967844),
 (0.9980073706742334, 0.9980073706742334),
 (0.1963579685642387, 0.1963579685642387)]
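tee() also copes with the streaming case, since it duplicates lazily; a sketch with an unbounded source, capped with itertools.islice just for the demo:
import random
from itertools import tee, islice

def stream():
    while True:
        yield random.random()

a, b = tee(stream())
for x, y in islice(zip(a, b), 3):
    print(x, y)  # x == y on every line
One caveat: tee() buffers every value one copy has seen but the other hasn't, so the copies should be consumed roughly in lockstep; zipping them guarantees that.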

decorating a function that yields

Is it possible, and if so, advisable, and if so, what would be the recommended method for decorating a function that yields a value?
For example, consider this imaginary example I made up:
import sys

def foobar_creator(func):
    def wrapped(**kwargs):
        res = func(**kwargs)
        flag = True
        for k, v in kwargs.items():
            if res % v == 0:
                flag = False
                yield k
        if flag:
            yield res
    return wrapped

@foobar_creator
def generic_yielder(**kwargs):
    for i in range(sys.maxsize):
        yield i

for i in generic_yielder(foo=3, bar=5, foobar=15):
    print(i)
A generator function, when called, returns an iterator object. If your decorator is itself a generator too, you'll need to loop over the wrapped result:
def foobar_creator(func):
    def wrapped(**kwargs):
        gen = func(**kwargs)
        res = next(gen)  # pull one value from the iterator to test against
        flag = True
        for k, v in kwargs.items():
            if res % v == 0:
                flag = False
                yield k
        if flag:
            yield res
            for res in gen:
                yield res
    return wrapped
If you are using Python 3.3 or up, you can use delegation to hand control to the wrapped generator, using yield from:
if flag:
    yield res
    yield from gen
Instead of yielding every potential return value, why not yield only those that actually exist? Something like:
def wrap(f, arg):
    for x in f(arg):
        yield x
(Actual decorator syntax, handling of positional and keyword arguments, etc. is omitted for clarity.)
For the case in comment42684128, the solution is as simple as:
(v for v in f(<args>) if filter_condition(v))
As a decorator:
def yfilter(filter_condition):
    def yfilter_p(f):
        def wrapped(*args, **kwargs):
            return (v for v in f(*args, **kwargs) if filter_condition(v))
        return wrapped
    return yfilter_p
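For example, keeping only the even values from a generator (the names here are purely illustrative):
@yfilter(lambda v: v % 2 == 0)
def numbers(n):
    yield from range(n)

print(list(numbers(10)))  # [0, 2, 4, 6, 8]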
The existing answers don't handle generators that yield and then return a value. For that, you need to return (yield from f()):
def dec(f):
    def g():
        return (yield from f())
    return g

@dec
def f():
    yield 'val'
    return 'done'
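A generator's return value travels in StopIteration.value, and return (yield from ...) is what propagates it through the wrapper; a quick check:
g = f()
print(next(g))        # val
try:
    next(g)
except StopIteration as e:
    print(e.value)    # done -- preserved by the decorator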

Return from an iterator and then throw StopIteration

What would be a nice way to return something from an iterator one last time when it's exhausted? I'm using a flag, but this is rather ugly:
class Example():
    def __iter__(self):
        self.lst = [1, 2, 3]
        self.stop = False  # <-- ugly
        return self

    def __next__(self):
        if self.stop:  # <-- ugly
            raise StopIteration
        if len(self.lst) == 0:
            self.stop = True
            return "one last time"
        return self.lst.pop()
Background: I'm fetching an unknown number of strings from an external source and sending them further down to the caller. When the process is over, I want to emit a string "x records processed". I have no control over the calling code, so this must be done inside my iterator.
You could just yield from __iter__(), which turns it into a generator function (alternatively, write a plain generator function as suggested by Dan). One warning: this might be misleading to people who call the next method directly.
class Example():
    def __iter__(self):
        lst = [1, 2, 3]
        for i in reversed(lst):
            yield i
        yield "one last time"
Maybe you can use a generator function instead:
def example():
    lst = [1, 2, 3]
    while lst:
        yield lst.pop()
    yield 'one last time'
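Either way, iteration produces the list contents (popped from the end) followed by the extra string:
for s in example():
    print(s)
# 3
# 2
# 1
# one last time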
Don't return one extra thing. That's a bad idea because it doesn't extend well. What if you want to do sum as well as count? Or hash as well as count? An iterator is a stateful object. Make use of that.
from collections.abc import Iterator

class Example(Iterator):
    def __iter__(self):
        self.lst = [1, 2, 3]
        self.count = 0
        return self

    def __next__(self):
        if self.lst:
            self.count += 1
            return self.lst.pop()
        raise StopIteration()
Use it like this:
my_iter = iter(Example())
for item in my_iter:
    print(item)
print(my_iter.count)
You could do something like this:
class Example():
    def __iter__(self):
        self.lst = [1, 2, 3]
        return self

    def __next__(self):
        try:
            return self.lst.pop()
        except IndexError:
            print("done iterating")
            raise StopIteration

>>> for i in Example():
...     print(i)
...
3
2
1
done iterating
In your actual code you will probably need to change the exception type that you are catching, but this format should still be applicable.
