I am implementing this requirement:
As part of a data processing pipeline, complete the implementation of the pipeline method:
The method should accept a variable number of functions, and it
should return a new function that accepts one parameter arg.
The returned function should call the first function in the pipeline
with the parameter arg, and call the second function with the result
of the first function.
The returned function should continue calling each function in the
pipeline in order, following the same pattern, and return the value
from the last function.
For example, with pipeline(lambda x: x * 3, lambda x: x + 1, lambda x: x / 2), calling the returned function with 3 should return 5.0.
My code
def pipeline(*funcs):
    def helper(arg):
        argCount = len(funcs)
        if argCount > 0:
            # Iterate over all the arguments and call each lambda function
            res = []
            for elem in funcs:
                if len(res) > 0:
                    helper = elem(res.pop())
                else:
                    helper = elem(arg)
                res.append(helper)
            helper = res.pop()
        else:
            return helper
        print('before returning, helper value is: ', helper)
    return helper
fun = pipeline(lambda x: x * 3, lambda x: x + 1, lambda x: x / 2)
print('final result: ', fun(3)) #should print 5.0
Question
None is returned. Why?
before returning, helper value is: 5.0
final result: None
The problem is that you don't execute a return right after the print. You do have a return in the else branch just before it, but not in the if block. Also, the return helper further below belongs to the pipeline function, not to the def helper block, so you need one more return helper inside helper. I would in fact omit the else block and always do the return, like this:
def pipeline(*funcs):
    def helper(arg):
        argCount = len(funcs)
        if argCount > 0:
            # Iterate over all the arguments and call each lambda function
            res = []
            for elem in funcs:
                if len(res) > 0:
                    helper = elem(res.pop())
                else:
                    helper = elem(arg)
                res.append(helper)
            helper = res.pop()
        print('before returning, helper value is: ', helper)
        return helper  # <-------
    return helper
It is not really clear why you have a list res, since there is only one value to pass from one function to the next; you could just use arg for this purpose. Furthermore, you use helper in two different senses (function and value), which is quite confusing. The code can be simplified to this:
def pipeline(*funcs):
    def helper(arg):
        for elem in funcs:
            arg = elem(arg)
        return arg
    return helper
Do not reinvent what is already available in Python:
from functools import reduce
pipeline = [lambda x: x * 3, lambda x: x + 1, lambda x: x / 2]
val = reduce(lambda x, f: f(x), pipeline, 3)
print(val) # 5.0
You do: print('before returning, helper value is: ', helper)... and then do not actually return anything from helper, so it implicitly returns None.
def pipeline(*args):
    def helper(num):
        for i in args:
            total = i(num)
            num = total
        return total
    return helper

fun = pipeline(lambda x: x * 3, lambda x: x + 1, lambda x: x / 2)
print(fun(3))  # should print 5.0
Related
I have a pipeline function that takes an arbitrary number of functions as arguments. It returns a single function helper, which takes one argument and in turn calls each function in the pipeline iteratively, passing along the result. Here is the code:
def pipeline(*funcs):
    def helper(arg):
        for func in funcs:
            result = func(arg)
            arg = result
        return result
    return helper
And here is a test case:

fun = pipeline(lambda x: x * 3, lambda x: x + 1, lambda x: x / 2)
print(fun(3))  # should print 5.0
I was reading a different question about generators and how they can be used to remember previous state, and was wondering if I could reframe my pipeline function to use a generator instead. I need to remember the arg for every func call, and currently I do that by storing it in a variable; since generators can remember their last state, I was wondering if I could use one instead.
You could (but absolutely shouldn't) create a side-effect based list-comprehension that uses the walrus-operator:
funcs = [lambda x: x * 3, lambda x: x + 1, lambda x: x / 2]
x = 3
[x := f(x) for f in funcs]
print(x)
What makes this even worse than it already is: if you use a plain generator expression instead of the list comprehension, the result is different, because the generator is not fully executed before the print.
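A minimal sketch of that difference, reusing the same funcs list: the generator version leaves x untouched until the generator is actually consumed.

```python
funcs = [lambda x: x * 3, lambda x: x + 1, lambda x: x / 2]

x = 3
g = (x := f(x) for f in funcs)  # a generator: nothing has run yet
print(x)   # still 3 - the walrus side effects are deferred
list(g)    # consuming the generator forces the assignments
print(x)   # now 5.0
```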
Or you can even force it all into one line like this:
print([x := f(x if i>0 else 3) for i, f in enumerate(funcs)][-1])
I think generally a prettier approach would be to just wrap it into a normal for loop:
x = 3
for f in funcs:
    x = f(x)
print(x)
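To the generator part of the question: one possible sketch (pipeline_gen is a hypothetical name, not from the original code) keeps arg as generator state and yields every intermediate result, so the last yielded value is the pipeline output:

```python
def pipeline_gen(funcs, arg):
    # the generator's own frame remembers arg between steps
    for func in funcs:
        arg = func(arg)
        yield arg

funcs = [lambda x: x * 3, lambda x: x + 1, lambda x: x / 2]
steps = list(pipeline_gen(funcs, 3))
print(steps)      # [9, 10, 5.0]
print(steps[-1])  # 5.0
```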
I have a function that takes a lambda reference as a parameter and returns that reference, so that when the result is called it returns the value of that lambda function (a boolean).
Is there a way to have it return the opposite boolean value?
def returns_diff(lambda_function):
    return lambda_function

f = returns_diff(lambda x: x > 2)
f(0)
# I know I can do it this way but I want to do it inside the function.
# print(not(f(0)))
f(0) should return True: the lambda itself returns False, since 0 is not bigger than 2, and I want the opposite of that value. I know I can just do not(f(0)) when calling it, but I want to do it inside the function, not at the call site.
If you want to generate a function that returns the boolean opposite of a given function, you can do it like this:
def returns_diff(func):
    return lambda x: not func(x)

f = returns_diff(lambda x: x > 2)
f(0)  # returns True
That's assuming the functions take one argument, as in your question. You can also make a version that works for functions with any number of positional or keyword arguments:
def returns_diff(func):
    return lambda *args, **kwargs: not func(*args, **kwargs)
Can I use classes, or does it need to be just plain functions? With classes I would do:
class diff:
    def __init__(self, lambda_func):
        self.lambda_func = lambda_func

    def __call__(self, x):
        return not self.lambda_func(x)

f = diff(lambda x: x > 2)
f(0)  # True
Hello, I have 3 functions f1(), f2() and f3(). The output of the previous one is the input of the next, meaning output = f3(f2(f1(data))).
Instead of writing
def outp(data):
    o1 = f1(data)
    o2 = f2(o1)
    o3 = f3(o2)
    return o3

output = outp(data)
is there a way to do this by simply providing a list of functions to some other general function and letting it handle the chaining?
You could simply run a for loop with an assignment:
>>> f1 = int
>>> f2 = float
>>> f3 = lambda x: x * 2
>>> i = '3'
>>> for func in (f1, f2, f3):
...     i = func(i)
...     print(i)
...
3
3.0
6.0
I stumbled across your question and I found it really interesting.
Here is my approach on a pipeline of functions:
def call_pipeline(transition_list, *args):
    """
    transition_list = [func1, func2, func3, ...]
    - func1's output is the input for func2, etc.
    """
    result = args
    for f in transition_list:
        result = f(*result)
    print(*result)

def f1(x, y):
    return [x + y]

def f2(z):
    return [z**2, z]

def f3(w, r):
    return [w * r]

def f4(t):
    return ["Final", t]

def test_pipeline():
    transition_list = [f1, f2, f3, f4]
    call_pipeline(transition_list, *[1, 2])
As long as you make sure that every function inside the pipeline returns a list and the next function can properly process the output of the previous one, this should work fine.
It is fairly easy to define a compose function that handles simple single-argument functions:
def compose(*args):
    if not args:
        return lambda x: x  # Identity function
    else:
        return lambda x: args[0](compose(*args[1:])(x))

outp = compose(f3, f2, f1)  # Note the order of the arguments
You can also use the reduce function (functools.reduce in Python 3):
outp = reduce(lambda f, g: lambda x: f(g(x)), [f3, f2, f1], lambda x:x)
You can omit the third argument if you are certain the list of functions won't be empty.
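For illustration, a small sketch of that reduce pattern and of what happens on an empty list (the compose name here is local to this example, not from the answer above):

```python
from functools import reduce

def compose(fns):
    # no third argument: reduce raises TypeError on an empty sequence
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

# rightmost function is applied first, as in the answer above
outp = compose([lambda x: x / 2, lambda x: x + 1, lambda x: x * 3])
print(outp(3))  # 5.0

try:
    compose([])
except TypeError:
    print('an empty list of functions is rejected')
```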
You can define a composition function that operates on a sequence of functions:
from functools import reduce

def compose(a, b):
    def fcn(*args, **kwargs):
        return a(b(*args, **kwargs))
    return fcn

def compose_all(*fcns):
    return reduce(compose, fcns)

def compose_all_1(*fcns):
    return reduce(compose, fcns, lambda x: x)
The function compose is the basic building block that takes two functions and returns their composition. With this elementary idiom you can extend this to an arbitrary sequence of functions in compose_all. The variant compose_all_1 works even on a 1- or 0-element input sequence.
You can define a pipeline builder that you can reuse to build pipelines.
Each pipeline is a function that can be invoked with one argument or none (rarely useful).
When a pipeline is called, it runs every function in sequence, passing each function's return value as the argument to the next.
from functools import reduce
import warnings

# pipeline function builder
def pipe(*funcs):
    """
    Usage:
        pipe(
            lambda x: x + 1,
            lambda x: x + 20
        )(50)
    Order of execution of functions is FIFO.
    FIFO = First In First Out
    """
    # if no arguments, return an identity function and warn the developer
    if not funcs:
        warnings.warn("""pipe() is useless when called without arguments.
            Please provide functions as arguments or remove call to pipe().
            """)
        return lambda x=None: x  # identity function

    def runner(value=None):
        return reduce(lambda acc, curr: curr(acc), funcs, value)
    return runner

# test 1
pipeline = pipe()
print(pipeline())
# None

# test 2
pipeline = pipe()
print(pipeline(8))
# 8

# test 3
pipeline = pipe(
    lambda unused: 50
)
print(pipeline())
# 50

# test 4 -> real usage
pipeline = pipe(
    lambda age: age >= 18,
    lambda is_adult: 'Adult' if is_adult else 'Minor'
)
print(pipeline(12))
# Minor
print(pipeline(20))
# Adult
Is there any possibility to specify how many arguments a lambda as a function argument can take?
For example:
def func(k=lambda x: x**2):
    return k

Could I specify, if k is not the standard lambda, that k is supposed to take exactly one argument?
Max
You could do something like this using inspect.getfullargspec (the older inspect.getargspec is deprecated and was removed in Python 3.11):

import inspect

def func(k=lambda x: x ** 2):
    if not callable(k) or len(inspect.getfullargspec(k).args) != 1:
        raise TypeError('k must take exactly one argument.')
    # Do whatever you want
Note that the above will fail on something like the following, even though it shouldn't:
func (lambda x, y = 8: x + y)
...so you will need something a bit more complicated if you want to handle this case:
import inspect

def func(k=lambda x: x ** 2):
    if not callable(k):
        raise TypeError('k must be callable.')
    argspec = inspect.getfullargspec(k)
    nargs = len(argspec.args)
    ndeft = 0 if argspec.defaults is None else len(argspec.defaults)
    if nargs != ndeft + 1:
        raise TypeError('k must be callable with one argument.')
    # Do whatever you want
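A quick sketch of how this check behaves. It repeats the validation self-contained, with a hypothetical return k(3) added (not in the original answer) so the effect is visible:

```python
import inspect

def func(k=lambda x: x ** 2):
    if not callable(k):
        raise TypeError('k must be callable.')
    argspec = inspect.getfullargspec(k)
    nargs = len(argspec.args)
    ndeft = 0 if argspec.defaults is None else len(argspec.defaults)
    if nargs != ndeft + 1:
        raise TypeError('k must be callable with one argument.')
    return k(3)  # illustration only: call the validated function

print(func())                      # 9
print(func(lambda x, y=8: x + y))  # 11 - the default makes it a one-arg callable
try:
    func(lambda x, y: x + y)       # two required arguments -> rejected
except TypeError as e:
    print(e)
```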
I'm trying to set up a "processing pipeline" for data that I'm reading in from a data source, and applying a sequence of operators (using generators) to each item as it is read.
Here is some sample code that demonstrates the issue:
def reader():
    yield 1
    yield 2
    yield 3

def add_1(val):
    return val + 1

def add_5(val):
    return val + 5

def add_10(val):
    return val + 10

operators = [add_1, add_5, add_10]

def main():
    vals = reader()
    for op in operators:
        vals = (op(val) for val in vals)
    return vals

print(list(main()))
Desired : [17, 18, 19]
Actual: [31, 32, 33]
Python seems to not be saving the value of op each time through the for loop, so it instead applies the third function each time. Is there a way to "bind" the actual operator function to the generator expression each time through the for loop?
I could get around this trivially by changing the generator expression in the for loop to a list comprehension, but since the actual data is much larger, I don't want to be storing it all in memory at any one point.
You can force a variable to be bound by creating the generator in a new function. eg.
def map_operator(operator, iterable):
    # the closure value of operator is now separate for each generator created
    return (operator(item) for item in iterable)

def main():
    vals = reader()
    for op in operators:
        vals = map_operator(op, vals)
    return vals
However, map_operator is pretty much identical to the map builtin (in Python 3.x), so just use that instead.
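For example, the question's pipeline rewritten with map (reader and operators restated here so the snippet is self-contained):

```python
def reader():
    yield 1
    yield 2
    yield 3

operators = [lambda v: v + 1, lambda v: v + 5, lambda v: v + 10]

def main():
    vals = reader()
    for op in operators:
        # map receives op as an argument, so each stage keeps its own function
        vals = map(op, vals)
    return vals

print(list(main()))  # [17, 18, 19]
```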
You can define a little helper which composes the functions but in reverse order:
import functools

def compose(*fns):
    return functools.reduce(lambda f, g: lambda x: g(f(x)), fns)
I.e. you can use compose(f, g, h) to generate a lambda expression equivalent to lambda x: h(g(f(x))). This argument order is uncommon, but it ensures that your functions are applied left-to-right, which is probably what you expect.
Using this, your main becomes just
def main():
    vals = reader()
    f = compose(add_1, add_5, add_10)
    return (f(v) for v in vals)
This may be what you want - create a composite function:
import functools

def compose(functions):
    return functools.reduce(lambda f, g: lambda x: g(f(x)), functions, lambda x: x)

def reader():
    yield 1
    yield 2
    yield 3

def add_1(val):
    return val + 1

def add_5(val):
    return val + 5

def add_10(val):
    return val + 10

operators = [add_1, add_5, add_10]

def main():
    vals = map(compose(operators), reader())
    return vals

print(list(main()))
The reason for this problem is that you are creating a deeply nested generator of generators and evaluate the whole thing after the loop, when op has been bound to the last element in the list -- similar to the quite common "lambda in a loop" problem.
In a sense, your code is roughly equivalent to this:
for op in operators:
    pass

print(list(op(val) for val in (op(val) for val in (op(val) for val in (x for x in [1, 2, 3])))))
One (not very pretty) way to fix this would be to zip the values with another generator, repeating the same operation:
import itertools

def add(n):
    def add_n(val):
        return val + n
    return add_n

operators = [add(n) for n in [1, 5, 10]]

def main():
    vals = (x for x in [1, 2, 3])
    for op in operators:
        vals = (op(val) for (val, op) in zip(vals, itertools.repeat(op)))
    return vals

print(list(main()))