I know there are several ways to do partial function application in Python. However, none of them seems to preserve the original function's docstring.
Take functools.partial as an example:
from functools import partial
def foo(a, b, c=1):
"""Return (a+b)*c."""
return (a+b)*c
bar10_p = partial(foo, b=10)
print bar10_p.__doc__
partial(func, *args, **keywords) - new function with partial application
of the given arguments and keywords.
Let's try fn.py:
from fn import F
def foo(a, b, c=1):
"""Return (a+b)*c."""
return (a+b)*c
bar10_F = F(foo, b=10)
print bar10_F.__doc__
Provide simple syntax for functions composition
(through << and >> operators) and partial function
application (through simple tuple syntax).
Usage example:
>>> func = F() << (_ + 10) << (_ + 5)
>>> print(func(10))
25
>>> func = F() >> (filter, _ < 6) >> sum
>>> print(func(range(10)))
15
Is there any Python package/module providing partial application with preserved docstring?
UPDATE
As @Kevin and @Martijn Pieters mentioned, the function signature changes, so it is not advisable to keep the original function's docstring unchanged. I realized that what I'm really looking for is an updated docstring along the lines of foo() with a default b value of 10 (thanks to Kevin for the simple but direct example).
__doc__ is writable, on partial objects as well as on functions; simply copy it over:
bar10_p = partial(foo, b=10)
bar10_p.__doc__ = foo.__doc__
or use the functools.update_wrapper() function to do the copying for you; it'll copy a few other pieces of metadata for you too:
from functools import update_wrapper
bar10_p = partial(foo, b=10)
update_wrapper(bar10_p, foo)
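A quick check (assuming the foo and bar10_p from above) shows that the docstring is now copied over; on Python 3, update_wrapper also records the original function:
print(bar10_p.__doc__)             # Return (a+b)*c.
print(bar10_p.__wrapped__ is foo)  # True (the __wrapped__ attribute is set on Python 3.2+)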
Just write a new __doc__.
bar10_p = partial(foo, b=10)
bar10_p.__doc__ = """foo() with a default b value of 10.
See foo().
"""
Your function has a different interface from the original, so it should not copy the docstring exactly.
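If you need this kind of docstring often, a small helper can write it for you; a minimal sketch (documented_partial is a hypothetical name, not a library function):
from functools import partial

def documented_partial(func, **fixed):
    # Build the partial and describe the fixed keyword arguments in a fresh docstring
    p = partial(func, **fixed)
    fixed_desc = ", ".join("{}={!r}".format(k, v) for k, v in fixed.items())
    p.__doc__ = "{name}() with {fixed} fixed by default.\n\nSee {name}().".format(
        name=func.__name__, fixed=fixed_desc)
    return p

bar10_p = documented_partial(foo, b=10)
print(bar10_p.__doc__)  # foo() with b=10 fixed by default. ...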
A partial object keeps the original function in its func attribute, so through that you still have access to the original function's docstring.
Try this:
from math import cos
from functools import partial
cos_partial = partial(cos, 0.5)
print(cos_partial.func.__doc__)
With makefun you can do it:
from makefun import partial
def foo(a, b, c=1):
"""Return (a+b)*c."""
return (a + b) * c
bar10_p = partial(foo, b=10)
assert bar10_p(0) == 10
assert bar10_p(0, c=2) == 20
help(bar10_p)
It yields:
Help on function foo in module makefun.tests.test_so:
foo(a, c=1)
<This function is equivalent to 'foo(a, c=1, b=10)', see original 'foo' doc below.>
Return (a+b)*c.
Note that if you have any comments on how the docstring should be updated, do not hesitate to open an issue on the Git repo!
(I'm the author by the way)
Related
Is there an equivalent to R's do.call in Python?
do.call(what = 'sum', args = list(1:10)) #[1] 55
do.call(what = 'mean', args = list(1:10)) #[1] 5.5
?do.call
# Description
# do.call constructs and executes a function call from a name or a function and a list of arguments to be passed to it.
There is no built-in for this, but it is easy enough to construct an equivalent.
You can look up any object from the built-ins namespace using the __builtin__ (Python 2) or builtins (Python 3) module, then apply arbitrary arguments to it with the *args and **kwargs syntax:
try:
# Python 2
import __builtin__ as builtins
except ImportError:
# Python 3
import builtins
def do_call(what, *args, **kwargs):
return getattr(builtins, what)(*args, **kwargs)
do_call('sum', range(1, 11))
Generally speaking, we don't do this in Python. If you must translate strings into function objects, it is generally preferred to build a custom dictionary:
functions = {
'sum': sum,
'mean': lambda v: sum(v) / len(v),
}
then look up functions from that dictionary instead:
functions['sum'](range(1, 11))
This lets you strictly control what names are available to dynamic code, preventing a user from making a nuisance of themselves by calling built-ins for their destructive or disruptive effects.
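Keyword arguments pass straight through as well; a couple of quick examples with the two variants above (results assume Python 3):
do_call('sorted', [3, 1, 2], reverse=True)  # [3, 2, 1]
functions['mean'](range(1, 11))             # 5.5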
do.call is pretty much the equivalent of the splat operator in Python:
def mysum(a, b, c):
return sum([a, b, c])
# normal call:
mysum(1, 2, 3)
# with a list of arguments:
mysum(*[1, 2, 3])
Note that I’ve had to define my own sum function since Python’s sum already expects an iterable as its argument, so your original code would just be
sum(range(1, 11))
R has another peculiarity: do.call internally performs a function lookup of its first argument. This means that it finds the function even if it’s a character string rather than an actual function. The Python equivalent above doesn’t do this — see Martijn’s answer for a solution to this.
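If you do want the lookup-by-name behaviour as well, a minimal sketch (using an explicit lookup table, and reusing the mysum defined above) could look like this:
def do_call(what, *args, **kwargs):
    # Accept either a callable or a name; the lookup table here is just an example
    table = {'sum': sum, 'mysum': mysum}
    func = table[what] if isinstance(what, str) else what
    return func(*args, **kwargs)

do_call('sum', range(1, 11))  # 55
do_call(mysum, 1, 2, 3)       # 6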
This goes in a similar direction as the previous answer, but why make it so complicated?
def do_call(what, args=[], kwargs={}):
return what(*args, **kwargs)
(Which is more elegant than my previously posted definition:)
def do_call(which, args=None, kwargs=None):
if args is None and kwargs is not None:
return which(**kwargs)
elif args is not None and kwargs is None:
return which(*args)
else:
return which(*args, **kwargs)
Python's sum is different from R's sum (it expects a single iterable argument, whereas R's sum takes arbitrarily many arguments). So we define our own sum (mysum) which behaves similarly to R's sum. In the same way we define mymean.
def mysum(*args):
return sum(args)
def mymean(*args):
return sum(args)/len(args)
Now we can recreate your example in Python - as a reasonable 1:1 translation of the R function call.
do_call(what = mymean, args=[1, 2, 3])
## 2.0
do_call(what = mysum, args=[1, 2, 3])
## 6
For functions with argument names, we use a dict for kwargs, where the parameter
names are keys of the dictionary (as strings) and their values the values.
def myfunc(a, b, c):
return a + b + c
do_call(what = myfunc, kwargs={"a": 1, "b": 2, "c": 3})
## 6
# we can even mix named and unnamed parts
do_call(what = myfunc, args = [1, 2], kwargs={"c": 3})
## 6
I recently started coding in Python and I was wondering if it's possible to return a function that specializes another function.
For example, in Haskell you can create a function that adds 5 to any given number like this:
sumFive = (+5)
Is it somehow possible in Python?
I think the other answers are misunderstanding the question. I believe the OP is asking about partial application of a function; in their example the function is (+).
If the goal isn't partial application, the solution is as simple as:
def sumFive(x): return x + 5
For partial application in Python, we can use functools.partial (https://docs.python.org/2/library/functools.html#functools.partial), whose documented pure-Python equivalent is:
def partial(func, *args, **keywords):
def newfunc(*fargs, **fkeywords):
newkeywords = keywords.copy()
newkeywords.update(fkeywords)
return func(*(args + fargs), **newkeywords)
newfunc.func = func
newfunc.args = args
newfunc.keywords = keywords
return newfunc
Then, we must turn the + operator into a function (I don't believe there's a lightweight syntax to do so like in Haskell):
def plus(x, y): return x + y
Finally:
sumFive = partial(plus, 5)
Not nearly as nice as in Haskell, but it works:
>>> sumFive(7)
12
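Worth noting: the standard library already provides + as a function, so you can skip defining plus yourself:
from functools import partial
from operator import add   # add(x, y) is the function form of x + y

sumFive = partial(add, 5)
sumFive(7)  # 12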
Python's design does not naturally support the evaluation of a multi-variable function into a sequence of single-variable functions (currying). As other answers point out, the related (but distinct) concept of partial application is more straightforward to do using partial from the functools module.
However, the PyMonad library supplies you with the tools to make currying possible in Python, providing a "collection of classes for programming with functors, applicative functors and monads."
Use the curry decorator to decorate a function that accepts any number of arguments:
from pymonad import curry
@curry
def add(x, y):
return x + y
It is then very easy to curry add. The syntax is not too dissimilar to Haskell's:
>>> add5 = add(5)
>>> add5(12)
17
Note that here the add and add5 functions are instances of PyMonad's Reader monad class, not normal Python function objects:
>>> add
<pymonad.Reader.Reader at 0x7f7024ccf908>
This allows, for example, the possibility of using simpler syntax to compose functions (easy to do in Haskell, normally much less so in Python).
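Plain Python has no composition operator, but a tiny helper (a sketch, not part of PyMonad) shows the idea:
def compose(f, g):
    # compose(f, g)(x) == f(g(x))
    return lambda x: f(g(x))

add5 = lambda x: x + 5
double = lambda x: 2 * x
compose(double, add5)(10)  # 30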
Finally, it's worth noting that the infix operator + is not a Python function: + calls into the left-hand operand's __add__ method, or the right-hand operand's __radd__ method and returns the result. You'll need to decorate these class methods for the objects you're working with if you want to curry using + (disclaimer: I've not tried to do this yet).
Yup. Python supports lambda expressions:
sumFive = lambda x: x + 5
for i in range(5):
print sumFive(i),
# Output: 5 6 7 8 9
Python functions can return functions, allowing you to create higher-order functions. For example, here is a higher-order function which can specialize a function of two variables:
def specialize(f,a,i):
def g(x):
if i == 0:
return f(a,x)
else:
return f(x,a)
return g
Used like this:
>>> def subtract(x,y): return x - y
>>> f = specialize(subtract,5,0)
>>> g = specialize(subtract,5,1)
>>> f(7)
-2
>>> g(7)
2
But -- there is really no need to reinvent the wheel, the module functools has a number of useful higher-order functions that any Haskell programmer would find useful, including partial for partial function application, which is what you are asking about.
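For example, the first-argument case of specialize above is just partial; fixing the second argument needs a keyword instead (a quick sketch, reusing the subtract function from above):
from functools import partial

f = partial(subtract, 5)    # f(7) == subtract(5, 7) == -2, like specialize(subtract, 5, 0)
g = partial(subtract, y=5)  # g(7) == subtract(7, 5) == 2,  like specialize(subtract, 5, 1)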
As was pointed out, Python does have lambda functions, so the following solves the problem:
# Haskell: sumFive = (+5)
sumFive = lambda x : x + 5
I think this becomes more useful combined with the fact that Python has first-class functions (1, 2):
def summation(n, term):
total, k = 0, 1
while k <= n:
total, k = total + term(k), k + 1
return total
def identity(x):
return x
def sum_naturals(n):
return summation(n, identity)
sum_naturals(10) # Returns 55
# Now for something a bit more complex
def pi_term(x):
return 8 / ((4*x-3) * (4*x-1))
def pi_sum(n):
return summation(n, pi_term)
pi_sum(1e6) # returns: 3.141592153589902
You can find more on functional programming and Python here.
For the most generic Haskell style currying, look at partial from the functools module.
Is it possible to assign a function to a variable with modified default arguments?
To make it more concrete, I'll give an example.
The following obviously doesn't work in the current form and is only meant to show what I need:
def power(a, pow=2):
ret = 1
for _ in range(pow):
ret *= a
return ret
cube = power(pow=3)
And the result of cube(5) should be 125.
functools.partial to the rescue:
Return a new partial object which when called will behave like func called with the positional arguments args and keyword arguments keywords. If more arguments are supplied to the call, they are appended to args. If additional keyword arguments are supplied, they extend and override keywords.
from functools import partial
cube = partial(power, pow=3)
Demo:
>>> from functools import partial
>>>
>>> def power(a, pow=2):
... ret = 1
... for _ in range(pow):
... ret *= a
... return ret
...
>>> cube = partial(power, pow=3)
>>>
>>> cube(5)
125
The answer using partial is good, using the standard library, but I think it's worth mentioning that the following approach is equivalent:
def cube(a):
return power(a, pow=3)
Even though this doesn't seem like assignment because there isn't a =, it is doing much the same thing (binding a name to a function object). I think this is often more legible.
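Both versions behave the same for the question's example; the main visible difference is introspection:
cube(5)     # 125 with either the partial-based or the def-based cube
help(cube)  # the def version shows cube(a); the partial version shows functools.partial's generic help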
For this specific case there's already a built-in exponentiation operator:
>>> 2**3
8
But I also solved it with a lambda function, which works a bit like a nicer function pointer.
# cube = power(pow=3) # original
cube = lambda x: power(x,3)
Let me first acknowledge that what I want to do may be considered anything from silly to evil, but I want to find out if I can do it in Python anyway.
Let's say I have a function decorator that takes keyword arguments defining variables, and I want to access those variables in the wrapped function. I might do something like this:
from functools import wraps
def more_vars(**extras):
def wrapper(f):
@wraps(f)
def wrapped(*args, **kwargs):
return f(extras, *args, **kwargs)
return wrapped
return wrapper
Now I can do something like:
@more_vars(a='hello', b='world')
def test(deco_vars, x, y):
print(deco_vars['a'], deco_vars['b'])
print(x, y)
test(1, 2)
# Output:
# hello world
# 1 2
The thing I don't like about this is that when you use this decorator, you have to change the call signature of the function, adding the extra variable in addition to slapping on the decorator. Also, if you look at the help for the function, you see an extra variable that you're not expected to use when calling the function:
help(test)
# Output:
# Help on function test in module __main__:
#
# test(deco_vars, x, y)
This makes it look like the user is expected to call the function with 3 parameters, but obviously that won't work. So you'd have to also add a message to the docstring indicating that the first parameter isn't part of the interface, it's just an implementation detail and should be ignored. That's kind of crappy, though. Is there any way to do this without hanging these variables on something in the global scope? Ideally, I'd like it to look like the following:
#more_vars(a='hello', b='world')
def test(x, y):
print(a, b)
print(x, y)
test(1, 2)
# Output:
# hello world
# 1 2
help(test)
# Output:
# Help on function test in module __main__:
#
# test(x, y)
I am content with a Python 3 only solution if one exists.
You could do this with some trickery that inserts the variables passed to the decorator into the globals the function sees when it runs:
import sys
from functools import wraps
from types import FunctionType
def is_python3():
return sys.version_info >= (3, 0)
def more_vars(**extras):
def wrapper(f):
@wraps(f)
def wrapped(*args, **kwargs):
fn_globals = {}
fn_globals.update(globals())
fn_globals.update(extras)
if is_python3():
func_code = '__code__'
else:
func_code = 'func_code'
call_fn = FunctionType(getattr(f, func_code), fn_globals)
return call_fn(*args, **kwargs)
return wrapped
return wrapper
@more_vars(a="hello", b="world")
def test(x, y):
print("locals: {}".format(locals()))
print("x: {}".format(x))
print("y: {}".format(y))
print("a: {}".format(a))
print("b: {}".format(b))
if __name__ == "__main__":
test(1, 2)
Can you do this? Sure! Should you do this? Probably not!
(Code available here.)
EDIT: answer edited for readability. Latest answer is on top, original follows.
If I understand correctly:
- you want the new arguments to be defined as keywords in the @more_vars decorator
- you want to use them in the decorated function
- and you want them to be hidden from normal users (the exposed signature should still be the normal signature)
Have a look at the @with_partial decorator in my library makefun. It provides this functionality out of the box:
from makefun import with_partial
@with_partial(a='hello', b='world')
def test(a, b, x, y):
"""Here is a doc"""
print(a, b)
print(x, y)
It yields the expected output and the docstring is modified accordingly:
test(1, 2)
help(test)
yields
hello world
1 2
Help on function test in module <...>:
test(x, y)
<This function is equivalent to 'test(x, y, a=hello, b=world)', see original 'test' doc below.>
Here is a doc
To answer the question in your comment, the function creation strategy in makefun is exactly the same as the one in the famous decorator library: compile + exec. No magic here, but decorator has been using this trick for years in real-world applications, so it is quite solid. See def _make in the source code.
Note that the makefun library also provides a partial(f, *args, **kwargs) function if you want to create the decorator yourself for some reason (see below for inspiration).
If you wish to do this manually, here is a solution that should work as you expect; it relies on the wraps function provided by makefun to modify the exposed signature.
from inspect import signature
from makefun import wraps, remove_signature_parameters
def more_vars(**extras):
def wrapper(f):
# (1) capture the signature of the function to wrap and remove the invisible
func_sig = signature(f)
new_sig = remove_signature_parameters(func_sig, 'invisible_args')
# (2) create a wrapper with the new signature
@wraps(f, new_sig=new_sig)
def wrapped(*args, **kwargs):
# inject the invisible args again
kwargs['invisible_args'] = extras
return f(*args, **kwargs)
return wrapped
return wrapper
You can test that it works:
#more_vars(a='hello', b='world')
def test(x, y, invisible_args):
a = invisible_args['a']
b = invisible_args['b']
print(a, b)
print(x, y)
test(1, 2)
help(test)
You can even make the decorator definition more compact if you use decopatch to remove the useless level of nesting:
from inspect import signature
from decopatch import function_decorator, DECORATED
from makefun import wraps, remove_signature_parameters
@function_decorator
def more_vars(f=DECORATED, **extras):
# (1) capture the signature of the function to wrap and remove the invisible
func_sig = signature(f)
new_sig = remove_signature_parameters(func_sig, 'invisible_args')
# (2) create a wrapper with the new signature
@wraps(f, new_sig=new_sig)
def wrapped(*args, **kwargs):
kwargs['invisible_args'] = extras
return f(*args, **kwargs)
return wrapped
Finally, if you would rather not depend on any external library, the most Pythonic way to do it is to create a function factory (but then you cannot use it as a decorator):
def make_test(a, b, name=None):
def test(x, y):
print(a, b)
print(x, y)
if name is not None:
test.__name__ = name
return test
test = make_test(a='hello', b='world')
test2 = make_test(a='hello', b='there', name='test2')
I'm the author of makefun and decopatch by the way ;)
It sounds like your only problem is that help is showing the signature of the raw test as the signature of the wrapped function, and you don't want it to.
The only reason that's happening is that wraps (or, rather, update_wrapper, which wraps calls) explicitly copies this from the wrappee to the wrapper.
You can decide exactly what you do and don't want to copy. If what you want to do differently is simple enough, it's just a matter of filtering stuff out of the default WRAPPER_ASSIGNMENTS and WRAPPER_UPDATES. If you want to change other stuff, you may need to fork update_wrapper and use your own version—but functools is one of those modules that has a link to the source right at the top of the docs, because it's meant to be used as readable sample code.
In your case, it may just be a matter of wraps(f, updated=[]), or you may want to do something fancy, like use inspect.signature to get the signature of f, and modify it to remove the first parameter, and build a wrapper explicitly around that to fool even the inspect module.
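A minimal sketch of that last idea (Python 3; help() and inspect both honour a __signature__ attribute set on the wrapper):
import inspect
from functools import wraps

def more_vars(**extras):
    def wrapper(f):
        @wraps(f)
        def wrapped(*args, **kwargs):
            return f(extras, *args, **kwargs)
        # Advertise f's signature minus its first parameter (deco_vars)
        sig = inspect.signature(f)
        wrapped.__signature__ = sig.replace(
            parameters=list(sig.parameters.values())[1:])
        return wrapped
    return wrapper

With this, help(test) on the decorated function from the question shows test(x, y), while the wrapped function still receives deco_vars first.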
I've found a solution to this problem, although the solution is by most standards almost certainly worse than the problem itself. With some clever rewriting of the decorated function's bytecode, you can redirect all references to variables of a given name to a new closure you can dynamically create for the function. This solution only works for the standard CPython, and I have only tested it with 3.7.
import inspect
from dis import opmap, Bytecode
from types import FunctionType, CodeType
def more_vars(**vars):
'''Decorator to inject more variables into a function.'''
def wrapper(f):
code = f.__code__
new_freevars = code.co_freevars + tuple(vars.keys())
new_globals = [var for var in code.co_names if var not in vars.keys()]
new_locals = [var for var in code.co_varnames if var not in vars.keys()]
payload = b''.join(
filtered_bytecode(f, new_freevars, new_globals, new_locals))
new_code = CodeType(code.co_argcount,
code.co_kwonlyargcount,
len(new_locals),
code.co_stacksize,
code.co_flags & ~inspect.CO_NOFREE,
payload,
code.co_consts,
tuple(new_globals),
tuple(new_locals),
code.co_filename,
code.co_name,
code.co_firstlineno,
code.co_lnotab,
code.co_freevars + tuple(vars.keys()),
code.co_cellvars)
closure = tuple(get_cell(v) for (k, v) in vars.items())
return FunctionType(new_code, f.__globals__, f.__name__, f.__defaults__,
(f.__closure__ or ()) + closure)
return wrapper
def get_cell(val=None):
'''Create a closure cell object with initial value.'''
# If you know a better way to do this, I'd like to hear it.
x = val
def closure():
return x # pragma: no cover
return closure.__closure__[0]
def filtered_bytecode(func, freevars, globals, locals):
'''Get the bytecode for a function with adjusted closed variables
Any references to globals or locals in the bytecode which exist in the
freevars are modified to reference the freevars instead.
'''
opcode_map = {
opmap['LOAD_FAST']: opmap['LOAD_DEREF'],
opmap['STORE_FAST']: opmap['STORE_DEREF'],
opmap['LOAD_GLOBAL']: opmap['LOAD_DEREF'],
opmap['STORE_GLOBAL']: opmap['STORE_DEREF']
}
freevars_map = {var: idx for (idx, var) in enumerate(freevars)}
globals_map = {var: idx for (idx, var) in enumerate(globals)}
locals_map = {var: idx for (idx, var) in enumerate(locals)}
for instruction in Bytecode(func):
if instruction.opcode not in opcode_map:
yield bytes([instruction.opcode, instruction.arg or 0])
elif instruction.argval in freevars_map:
yield bytes([opcode_map[instruction.opcode],
freevars_map[instruction.argval]])
elif 'GLOBAL' in instruction.opname:
yield bytes([instruction.opcode,
globals_map[instruction.argval]])
elif 'FAST' in instruction.opname:
yield bytes([instruction.opcode,
locals_map[instruction.argval]])
This behaves exactly as I wanted:
In [1]: @more_vars(a='hello', b='world')
...: def test(x, y):
...: print(a, b)
...: print(x, y)
...:
In [2]: test(1, 2)
hello world
1 2
In [3]: help(test)
Help on function test in module __main__:
test(x, y)
This is almost certainly not ready for production use. I would be surprised if there weren't edge cases that behave unexpectedly, and possibly even segfault. I'd probably file this under the "educational curiosity" heading.
I have been working with Python and I set up the following code situation:
import timeit
setting = """
import functools
def f(a,b,c):
pass
g = functools.partial(f,c=3)
h = functools.partial(f,b=5,c=3)
i = functools.partial(f,a=4,b=5,c=3)
"""
print timeit.timeit('f(4,5,3)', setup = setting, number=100000)
print timeit.timeit('g(4,5)', setup = setting, number=100000)
print timeit.timeit('h(4)', setup = setting, number=100000)
print timeit.timeit('i()', setup = setting, number=100000)
I get the following as a result:
f: 0.181384086609
g: 0.39066195488
h: 0.425783157349
i: 0.391901016235
Why do the calls to the partial functions take longer? Is the partial function just forwarding the parameters to the original function or is it mapping the static arguments throughout? And also, is there a function in Python to return the body of a function filled in given that all the parameters are predefined, like with function i?
Why do the calls to the partial functions take longer?
The code with partial takes about two times longer because of the additional function call. Function calls are expensive:
Function call overhead in Python is relatively high, especially compared with the execution speed of a builtin function.
-
Is the partial function just forwarding the parameters to the original function or is it mapping the static arguments throughout?
As far as I know, yes, it just forwards the arguments to the original function.
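You can see that forwarding directly, since the partial object keeps the original function and the pre-bound arguments as attributes:
import functools

def f(a, b, c):
    pass

g = functools.partial(f, c=3)
g.func is f   # True
g.args        # ()
g.keywords    # {'c': 3}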
-
And also, is there a function in Python to return the body of a function filled in given that all the parameters are predefined, like with function i?
No, I am not aware of such a built-in function in Python. But I think it's possible to do what you want, as functions are objects which can be copied and modified.
Here is a prototype:
import timeit
import types
# http://stackoverflow.com/questions/6527633/how-can-i-make-a-deepcopy-of-a-function-in-python
def copy_func(f, name=None):
return types.FunctionType(f.func_code, f.func_globals, name or f.func_name,
f.func_defaults, f.func_closure)
def f(a, b, c):
return a + b + c
i = copy_func(f, 'i')
i.func_defaults = (4, 5, 3)
print timeit.timeit('f(4,5,3)', setup = 'from __main__ import f', number=100000)
print timeit.timeit('i()', setup = 'from __main__ import i', number=100000)
which gives:
0.0257439613342
0.0221881866455
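(The prototype uses the Python 2 attribute names; under Python 3 the same idea would be spelled with the dunder names, roughly:)
import types

def copy_func(f, name=None):
    # Python 3 spelling of the same shallow function copy
    return types.FunctionType(f.__code__, f.__globals__, name or f.__name__,
                              f.__defaults__, f.__closure__)

i = copy_func(f, 'i')
i.__defaults__ = (4, 5, 3)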
Calls to a function with partially applied arguments are more expensive because you double the number of function calls. The effect of functools.partial() is similar to this example:
def apply_one_of_two(f, a):
def g(b):
return f(a, b)
return g
That means that apply_one_of_two() returns a function, and calling that function results in an additional call of the original function f.
Since Python usually doesn't optimize this away, it translates directly into additional runtime cost.
But this isn't the only factor to consider in your microbenchmark. You also switch from positional to keyword arguments in your partial invocations, which introduces additional overhead.
When you reverse the argument ordering in your original function you don't need keyword arguments in the partial calls and then the runtime difference somewhat decreases, e.g.:
import timeit
setting = """
import functools
def f(a,b,c):
pass
g = functools.partial(f, 4)
h = functools.partial(f, 4, 5)
i = functools.partial(f, 4, 5, 3)
"""
print(timeit.timeit('f(4, 5, 3)', setup = setting, number=100000))
print(timeit.timeit('g(5, 3)', setup = setting, number=100000))
print(timeit.timeit('h(3)', setup = setting, number=100000))
print(timeit.timeit('i()', setup = setting, number=100000))
Output (on an Intel Skylake i7 under Fedora 27/Python 3.6):
0.010069019044749439
0.01681053702486679
0.018060395028442144
0.011366961000021547