Nest multiple yield functions without eval - python

I have the following structure (which might need a rework but to me this feels natural):
from types import GeneratorType

def get(baseVar):
    if type(baseVar) == GeneratorType:
        yield from baseVar
    else:
        yield baseVar

def multiply(baseVar):
    if type(baseVar) == GeneratorType:
        for item in baseVar:
            yield item*2
    else:
        yield baseVar*2

funcs = {'get': get, 'multiply': multiply}
result = 10
for f in funcs:
    result = funcs[f](result)
print(list(result))
Another approach, which isn't dynamic at all but performance-wise works like I want it to, would be the following, where a generator object is passed into each function, thus (theoretically) gaining more momentum out of the functions:
for result in multiply(get(10)):
...
How can I nest multiple yield functions in a row and pass the generator object on without hard-coding the function names (getattr)?

I'm not sure what you want to do. If you have different functions that work on single elements, use map:
def get(x):
    return x

def multiply(x):
    return x*2

print(list(map(multiply, map(get, [10]))))
How would you like to get the names of your functions? From an external source? Then your dict is the correct way. From inside the code? Then you can use the functions directly:
funcs = (get, multiply)
result = [10]
for f in funcs:
    result = map(f, result)
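One way to chain the question's generator functions dynamically, without hard-coding names, is to fold them with functools.reduce. This is a sketch using the question's own get and multiply; each step wraps the previous generator, so items stream through lazily:

```python
from functools import reduce
from types import GeneratorType

def get(baseVar):
    if isinstance(baseVar, GeneratorType):
        yield from baseVar
    else:
        yield baseVar

def multiply(baseVar):
    if isinstance(baseVar, GeneratorType):
        for item in baseVar:
            yield item * 2
    else:
        yield baseVar * 2

# Fold the start value through the pipeline: reduce applies each
# function to the generator produced by the previous one.
funcs = (get, multiply)
pipeline = reduce(lambda gen, f: f(gen), funcs, 10)
print(list(pipeline))  # [20]
```

Nothing is computed until the final `list()` call drains the pipeline, which is the streaming behaviour the question is after.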

Related

iterating through a function's sub functions python

The goal is to try and access any function's sub functions. I've looked around and I'm not too sure there is a way to do it. I've tried using
functions = [name for name, obj in inspect.getmembers(sys.modules[__name__], inspect.isfunction)]
which returns the functions in some module (in the above, __name__ == '__main__'), but it doesn't return any sub functions. However, I'd like to access sub functions that look something like
def f(x):
    def y(x):
        return x + 3
    def z(x):
        return x**2 - 1
    x += y(x)
    x += z(x)
    return x
So it seems to me like there should be some way to access them with a magic method of f or some attribute of f. I have a hard time believing that those sub functions aren't stored as some attribute of f, but I have no idea.
In the end, what I need to do is to iterate through the sub functions of some function, so I thought the solution would look something like
for subfunc in f.__method_that_returns_subfuncs__():
    if 'my_string' == subfunc.__name__:
        out = subfunc(args)
I just need to be able to compare a string to a subfunction name then call that subfunction.
Thanks
There's no implicit list of functions to iterate over. You need to define it yourself. Simple functions can be assigned directly to a list by defining them with lambda expressions; more complex functions will need to be defined first, then added. Examples of each:
def f(x):
    funcs = []
    def y(x):
        return x + 3
    funcs.append(y)
    funcs.append(lambda x: x**2 - 1)
    for func in funcs:
        x = func(x)
    return x
If you care about the name, you can access it via the function object's __name__ attribute.
for func in funcs:
    if func.__name__ == "some_func":
        x = func(x)
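For completeness: in CPython, nested def statements are compiled into code objects stored in f.__code__.co_consts, so closure-free sub functions can be rebuilt with types.FunctionType. This is an implementation detail rather than a public API, and only works when the inner functions don't close over the outer function's variables; a sketch using the question's f:

```python
import inspect
import types

def f(x):
    def y(x):
        return x + 3
    def z(x):
        return x ** 2 - 1
    x += y(x)
    x += z(x)
    return x

# CPython stores the nested defs as code objects among f's constants.
sub_codes = [c for c in f.__code__.co_consts if inspect.iscode(c)]
# Rebuild callable functions from the code objects (works only because
# y and z have no free variables, i.e. no closure cells are needed).
subfuncs = {c.co_name: types.FunctionType(c, globals()) for c in sub_codes}
print(subfuncs['y'](10))  # 13
print(subfuncs['z'](3))   # 8
```

This lets you compare a string to a sub function name and call it, as the question asks, but the explicit-list approach above is the portable one.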

Expressive way to compose generators in Python

I really like Python generators. In particular, I find that they are just the right tool for connecting to Rest endpoints - my client code only has to iterate on the generator that is connected to the endpoint. However, I am finding one area where Python's generators are not as expressive as I would like. Typically, I need to filter the data I get out of the endpoint. In my current code, I pass a predicate function to the generator; it applies the predicate to the data it is handling and only yields data if the predicate is True.
I would like to move toward composition of generators - like data_filter(datasource( )). Here is some demonstration code that shows what I have tried. It is pretty clear why it does not work, what I am trying to figure out is what is the most expressive way of arriving at the solution:
# Mock of Rest Endpoint: In actual code, generator is
# connected to a Rest endpoint which returns dictionary(from JSON).
def mock_datasource():
    mock_data = ["sanctuary", "movement", "liberty", "seminar",
                 "formula", "short-circuit", "generate", "comedy"]
    for d in mock_data:
        yield d

# Mock of a filter: simplification, in reality I am filtering on some
# aspect of the data, like data['type'] == "external"
def data_filter(d):
    if len(d) < 8:
        yield d

# First try:
# for w in data_filter(mock_datasource()):
#     print(w)
# >> TypeError: object of type 'generator' has no len()

# Second try:
# for w in (data_filter(d) for d in mock_datasource()):
#     print(w)
# I don't get words out, rather
# <generator object data_filter at 0x101106a40>

# Using a predicate to filter works, but is not the expressive
# composition I am after
for w in (d for d in mock_datasource() if len(d) < 8):
    print(w)
data_filter should apply len on the elements of d not on d itself, like this:
def data_filter(d):
    for x in d:
        if len(x) < 8:
            yield x
now your code:
for w in data_filter(mock_datasource()):
    print(w)
prints
liberty
seminar
formula
comedy
More concisely, you can do this with a generator expression directly:
def length_filter(d, minlen=0, maxlen=8):
    return (x for x in d if minlen <= len(x) < maxlen)
Apply the filter to your generator just like a regular function:
for element in length_filter(endpoint_data()):
    ...
If your predicate is really simple, the built-in function filter may also meet your needs.
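For instance, with the question's mock data, the built-in filter composes the same way (a minimal sketch):

```python
# Built-in filter with a plain predicate over the question's sample words
words = ["sanctuary", "movement", "liberty", "seminar",
         "formula", "short-circuit", "generate", "comedy"]
short_words = filter(lambda w: len(w) < 8, words)
print(list(short_words))  # ['liberty', 'seminar', 'formula', 'comedy']
```

In Python 3, filter is itself lazy, so it slots into a generator pipeline without materializing intermediate lists.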
You could pass a filter function that you apply for each item:
def mock_datasource(filter_function):
    mock_data = ["sanctuary", "movement", "liberty", "seminar",
                 "formula", "short-circuit", "generate", "comedy"]
    for d in mock_data:
        yield filter_function(d)

def filter_function(d):
    # filter
    return filtered_data
What I would do is define filter(data_filter) to receive a generator as input and return a generator with values filtered by data_filter predicate (regular predicate, not aware of generator interface).
The code is:
def filter(pred):
    """Filter, for composition with generators that take coll as an argument."""
    def generator(coll):
        for x in coll:
            if pred(x):
                yield x
    return generator

def mock_datasource():
    mock_data = ["sanctuary", "movement", "liberty", "seminar",
                 "formula", "short-circuit", "generate", "comedy"]
    for d in mock_data:
        yield d

def data_filter(d):
    if len(d) < 8:
        return True

gen1 = mock_datasource()
filtering = filter(data_filter)
gen2 = filtering(gen1)  # or filter(data_filter)(mock_datasource())
print(list(gen2))
If you want to improve further, you may use compose, which I think was the whole intent:
from functools import reduce

def compose(*fns):
    """Compose functions left to right - allows generators to compose with the
    same order as Clojure-style transducers in the first argument to transduce."""
    return reduce(lambda f, g: lambda *x, **kw: g(f(*x, **kw)), fns)

gen_factory = compose(mock_datasource,
                      filter(data_filter))
gen = gen_factory()
print(list(gen))
PS: I used some code found here, where the Clojure guys expressed composition of generators inspired by the way they do composition generically with transducers.
PS2: filter may be written in a more pythonic way:
def filter(pred):
    """Filter, for composition with generators that take coll as an argument."""
    return lambda coll: (x for x in coll if pred(x))
Here is a function I have been using to compose generators together.
from functools import reduce

def compose(*funcs):
    """Compose generators together to make a pipeline.
    e.g.
    pipe = compose(func1, func2, func3)
    result = pipe(range(0, 5))
    """
    return lambda x: reduce(lambda f, g: g(f), list(funcs), x)
Where funcs is a sequence of generator functions. Note that pipe is itself a callable, so you have to call it with the initial data; since mock_datasource takes no arguments, seed the pipeline with its output (using the generator-aware data_filter from the first answer):
pipe = compose(data_filter)
print(list(pipe(mock_datasource())))
This is not original

Optional yield or return in python3. How to?

I would like to have a function that can, optionally, return or yield the result.
Here is an example.
def f(option=True):
    ...
    for ...:
        if option:
            yield result
        else:
            results.append(result)
    if not option:
        return results
Of course, this doesn't work; I have tried with Python 3 and I always get a generator no matter what value I set option to.
As far as I have understood, Python checks the body of the function, and if a yield is present, the result will be a generator.
Is there any way to get around this and make a function that can return or yield at will?
You can't. Any use of yield makes the function a generator.
You could wrap your function with one that uses list() to store all values the generator produces in a list object and returns that:
def f_wrapper(option=True):
    gen = f()
    if option:
        return gen        # return the generator unchanged
    return list(gen)      # return all values of the generator as a list
However, generally speaking, this is bad design. Don't have your functions alter behaviour like this; stick to one return type (a generator or an object) and don't have it switch between the two.
Consider splitting this into two functions instead:
def f():
    yield result

def f_as_list():
    return list(f())
and use f() if you need the generator, or f_as_list() if you want a list instead.
Since list() (and next(), to access just one value of a generator) are built-in functions, you rarely need to use a wrapper. Just call those functions directly:
# access elements one by one
gen = f()
one_value = next(gen)

# convert the generator to a list
all_values = list(f())
What about this?
def make_f_or_generator(option):
    def f():
        return "I am a function."
    def g():
        yield "I am a generator."
    if option:
        return f
    else:
        return g
This gives you at least the choice to create a function or a generator.
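Restating that factory as a runnable sketch, usage would look something like this:

```python
def make_f_or_generator(option):
    def f():
        return "I am a function."
    def g():
        yield "I am a generator."
    # Return one callable or the other depending on the flag
    if option:
        return f
    else:
        return g

print(make_f_or_generator(True)())         # I am a function.
print(next(make_f_or_generator(False)()))  # I am a generator.
```

The caller still has to know which kind of callable it received, which is why the answers above recommend keeping the two behaviours in separate functions.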
class based approach
class FunctionAndGenerator:
    def __init__(self):
        self.counter = 0

    def __iter__(self):
        return self

    # You need a variable to indicate whether __next__ should return the
    # string or raise StopIteration. Raising StopIteration stops the loop
    # from iterating more, so __next__ must raise it at some point.
    def __next__(self):
        self.counter += 1
        if self.counter > 1:
            raise StopIteration
        return f"I'm a generator and I've generated {self.counter} times"

    def __call__(self):
        return "I'm a function"

x = FunctionAndGenerator()
print(x())
for i in x:
    print(i)
I'm a function
I'm a generator and I've generated 1 times

Inversing a list recursively

I created a function that inverses a list recursively, but it uses a global list in which it puts the elements.
Can this be rewritten so that it won't use an outside variable/list to achieve the same result?
Here is the code:
invs = []

def inv_list(list_, elem):
    global invs
    if elem is not None:
        invs.append(elem)
    if not list_:
        return invs
    else:
        try:
            el = list_.pop()
            inv_list(list_, el)
        except Exception:
            pass
What about:
def inv_list(lst):
    if not lst:
        return []
    return inv_list(lst[1:]) + lst[:1]
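For example, restating that function with a sample call:

```python
def inv_list(lst):
    # Reverse the rest of the list, then append the first element at the end
    if not lst:
        return []
    return inv_list(lst[1:]) + lst[:1]

print(inv_list([1, 2, 3, 4]))  # [4, 3, 2, 1]
```

Unlike the pop-based variants below, this one leaves the input list untouched, since slicing builds new lists at each step.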
It looks like you are doing a whole lot more work than you need to:
def reverse_recurse(a_list):
    if not a_list:
        return []
    return [a_list.pop()] + reverse_recurse(a_list)
While your implementation could be improved in various ways, what I do when I want to build something recursive without using globals and without making the interface feel dirty is create a nested helper function:
def inv_list(list_):
    invs = []
    def helper(elem):
        if elem is not None:
            invs.append(elem)
        if not list_:
            return invs
        else:
            try:
                el = list_.pop()
                return helper(el)
            except Exception:
                pass
    return helper(None)
That way, you can have values that are at the scope of the outer function.
The problematic way to do it is simple: just use default arguments.
def rec_reverse(input=[], output=[]):
    if len(input) == 0:
        return
    else:
        output.append(input.pop())
        rec_reverse(input, output)
        return output

x = list(range(10))
y = list(range(20))
print(rec_reverse(x, []))
print(rec_reverse(y, []))
Just remember to pass a new list to the output, so that you can call it again without getting old values.
Nevertheless, you can use the safe approach without using default arguments:
def rec_reverse(input):
    if not input:
        return input
    else:
        return [input.pop()] + rec_reverse(input)
And you can also use its recursive equivalent as a lambda expression:
rec_reverse = lambda input=[]: [] if not input else [input.pop(), ] + rec_reverse(input)
Keep in mind though, that there's an even simpler solution without using recursion at all:
x = list(range(10))
rec_reverse = lambda input: input[::-1]
print(rec_reverse(x))
Since in Python, you can reverse any list using extended slice notation.
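For example:

```python
# Extended slice with step -1 walks the list backwards
x = list(range(10))
print(x[::-1])  # [9, 8, 7, 6, 5, 4, 3, 2, 1, 0]
```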
Also, you can just use reverse() and spare you the trouble.
def reverse(input):
    input.reverse()
    return input
Building on Rederick Deathwill, here is a simplified version of your function:
def inv_list(list_):
    def inner(list_, invs):
        if not list_:
            return invs
        else:
            invs.append(list_.pop())
            return inner(list_, invs)
    return inner(list_, [])
It uses an inner helper that takes invs as an explicit accumulator argument, getting rid of the need for a global variable to hold the inverted list. With each subsequent invocation, invs is passed along so that the next call can build on it.
Once the bottom of the call stack is reached, the function returns the reversed list. A nice addition to the original is the return inner(list_, invs) line, which allows the caller to capture the new list as the return value.
This is not the shortest, but I think it is at least readable.

What is a good way to decorate an iterator to alter the value before next is called in python?

I am working on a problem that involves validating a format from within a unified diff patch.
The variables within the inner format can span multiple lines at a time, so I wrote a generator that pulls each line and yields the variable when it is complete.
To avoid having to rewrite this function when reading from a unified diff file, I created a generator that strips the unified diff characters from the line before passing it to the inner format validator. However, I am getting stuck in an infinite loop (both in the code and in my head). I have abstracted the problem to the following code. I'm sure there is a better way to do this; I just don't know what it is.
from collections import Iterable

def inner_format_validator(inner_item):
    # Do some validation to inner items
    return inner_item[0] != '+'

def inner_gen(iterable):
    for inner_item in iterable:
        # Operates only on inner_info type data
        yield inner_format_validator(inner_item)

def outer_gen(iterable):
    class DecoratedGenerator(Iterable):
        def __iter__(self):
            return self
        def next(self):
            # Using iterable from closure
            for outer_item in iterable:
                self.outer_info = outer_item[0]
                inner_item = outer_item[1:]
                return inner_item

    decorated_gen = DecoratedGenerator()
    for inner_item in inner_gen(decorated_gen):
        yield inner_item, decorated_gen.outer_info
if __name__ == '__main__':
    def wrap(string):
        # The point here is that I don't know what the first character will be
        pseudo_rand = len(string)
        if pseudo_rand * pseudo_rand % 2 == 0:
            return '+' + string
        else:
            return '-' + string

    inner_items = ["whatever"] * 3
    # wrap screws up inner_format_validator
    outer_items = [wrap("whatever")] * 3

    # I need to be able to
    # iterate over inner_items
    for inner_info in inner_gen(inner_items):
        print(inner_info)

    # and iterate over outer_items
    for outer_info, inner_info in outer_gen(outer_items):
        # This is an infinite loop
        print(outer_info)
        print(inner_info)
Any ideas as to a better, more pythonic way to do this?
I would do something simpler, like this:
def outer_gen(iterable):
    iterable = iter(iterable)
    first_item = next(iterable)
    info = first_item[0]
    yield info, first_item[1:]
    for item in iterable:
        yield info, item
This will execute the first four lines only once, then enter the loop and yield what you want.
You probably want to add some try/except to catch IndexError here and there.
If you want to take values while they start with something or the contrary, remember you can use a lot of stuff from the itertools toolbox, and in particular dropwhile, takewhile and chain:
>>> import itertools
>>> l = ['+foo', '-bar', '+foo']
>>> list(itertools.takewhile(lambda x: x.startswith('+'), l))
['+foo']
>>> list(itertools.dropwhile(lambda x: x.startswith('+'), l))
['-bar', '+foo']
>>> a = itertools.takewhile(lambda x: x.startswith('+'), l)
>>> b = itertools.dropwhile(lambda x: x.startswith('+'), l)
>>> list(itertools.chain(a, b))
['+foo', '-bar', '+foo']
And remember that you can create generators like list comprehensions, store them in variables, and chain them, just like you would pipe Linux commands:
import random

def create_item():
    return random.choice(('+', '-')) + random.choice(('foo', 'bar'))

random_items = (create_item() for s in xrange(10))
added_items = ((i[0], i[1:]) for i in random_items if i.startswith('+'))
valid_items = ((prefix, line) for prefix, line in added_items if 'foo' in line)
print list(valid_items)
With all this, you should be able to find some pythonic way to solve your problem :-)
I still don't like this very much, but at least it's shorter and a tad more pythonic:
from itertools import imap, izip
from functools import partial

def inner_format_validator(inner_item):
    return not inner_item.startswith('+')

inner_gen = partial(imap, inner_format_validator)

def split(astr):
    return astr[0], astr[1:]

def outer_gen(iterable):
    outer_stuff, inner_stuff = izip(*imap(split, iterable))
    return izip(inner_gen(inner_stuff), outer_stuff)
[EDIT] inner_gen() and outer_gen() without imap and partial:
def inner_gen(iterable):
    for each in iterable:
        yield inner_format_validator(each)

def outer_gen(iterable):
    outer_stuff, inner_stuff = izip(*(split(each) for each in iterable))
    return izip(inner_gen(inner_stuff), outer_stuff)
Maybe this is a better, though different, solution:
def transmogrify(iter_of_iters, *transmogrifiers):
    for iters in iter_of_iters:
        yield (
            trans(each) if trans else each
            for trans, each in izip(transmogrifiers, iters)
        )

for outer, inner in transmogrify(imap(split, stuff), inner_format_validator, None):
    print inner, outer
I think it will do what you intended if you change the definition of DecoratedGenerator to this:
class DecoratedGenerator(Iterable):
    def __iter__(self):
        # Using iterable from closure
        for outer_item in iterable:
            self.outer_info = outer_item[0]
            inner_item = outer_item[1:]
            yield inner_item
Your original version never terminated because its next() method was stateless and would return the same value every time it was called. You didn't need to have a next() method at all, though--you can implement __iter__() yourself (as I did), and then it all works fine.
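For reference, here is a minimal Python 3 sketch of that fix, with inner_gen left out for brevity (the sample input lines are made up):

```python
from collections.abc import Iterable  # Python 3 location of Iterable

def outer_gen(iterable):
    class DecoratedGenerator(Iterable):
        def __iter__(self):
            # Generator-style __iter__: remember the diff marker on self,
            # then yield the rest of the line. State lives in the loop,
            # so iteration terminates when the input is exhausted.
            for outer_item in iterable:
                self.outer_info = outer_item[0]
                yield outer_item[1:]

    decorated_gen = DecoratedGenerator()
    for inner_item in decorated_gen:
        yield inner_item, decorated_gen.outer_info

print(list(outer_gen(['+foo', '-bar'])))  # [('foo', '+'), ('bar', '-')]
```

Because __iter__ is itself a generator, the loop position is preserved between next() calls, which is exactly the statefulness the original next() method was missing.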
