When a function f calls another well-known function in Python (e.g. a matplotlib function), what is the most Pythonic/efficient/elegant way to define some default values while still letting the user of f fully customize the called function (typically via **kwargs), including overriding the default keyword arguments defined in f?
import numpy as np
import matplotlib.pyplot as plt
v = np.linspace(-10.,10.,100)
x,y = np.meshgrid(v, v)
z = -np.hypot(x, y)
def f(ax, n=12, **kwargs):
ax.contourf(x, y, z, n, cmap=plt.cm.autumn, **kwargs)
fig, ((ax0, ax1), (ax2, ax3)) = plt.subplots(2, 2)
f(ax0) # OK
f(ax1, n=100) # OK
f(ax2, n=100, **{'vmax': -2, 'alpha': 0.2}) # OK
# f(ax3, n=100, **{'cmap': plt.cm.cool}) # ERROR
plt.show()
Here, the last call to f throws:
TypeError: contourf() got multiple values for keyword argument 'cmap'
In your wrapper, you can simply adjust kwargs before passing it to the wrapped function:
def f(ax, n=12, **kwargs):
kwargs.setdefault('cmap', plt.cm.autumn)
ax.contourf(x, y, z, n, **kwargs)
setdefault will avoid changing the argument if it was passed to your wrapper, but you could just as easily always clobber it if you wanted.
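With that change, the call that previously failed goes through, because the user-supplied cmap simply replaces the default before contourf ever sees it. A quick check, reusing the x, y, z grid defined in the question:

fig, ((ax0, ax1), (ax2, ax3)) = plt.subplots(2, 2)
f(ax0)                            # uses the autumn default
f(ax3, n=100, cmap=plt.cm.cool)   # the caller's cmap wins, no TypeError this time
plt.show()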
The following minimal example illustrates the options you have when coding this from scratch. You can also define a default argument when defining the inner function and NOT specify it when calling that function from the wrapper. Note that the default variable a is removed from kwargs inside the inner function whenever it is declared there as a named parameter with a default. Only in run_test_working3 (where the default is set purely via setdefault in the wrapper and the inner function test2 has no named parameter a) does a stay inside kwargs. This matters if you want to pass kwargs on to further inner functions within the test function.
kwargs = {"a": 1,
"b": 2,
"c": 3}
kwargs2 = {"b": 2,
"c": 3}
def test(required_arg, a="default", **kwargs):
print(required_arg)
print(a)
print(kwargs)
def test2(required_arg, **kwargs):
print(required_arg)
print(kwargs)
#Set default for a in definition of test but not when calling it in wrapper
# a is removed from kwargs
def run_test_working1(required_arg, **kwargs):
test(required_arg, **kwargs)
#Set default for a different from definition of test in wrapper
# a is removed from kwargs
def run_test_working2(required_arg, **kwargs):
kwargs.setdefault("a", "default2")
test(required_arg, **kwargs)
#Set default value only via setdefault in wrapper
#a is not removed from kwargs
def run_test_working3(required_arg, **kwargs):
kwargs.setdefault("a", "default2")
test2(required_arg, **kwargs)
#Provoke TypeError: test() got multiple values for keyword argument 'a'
def run_test_not_working(required_arg, **kwargs):
test(required_arg, a="default", **kwargs)
print("Demo run_test_working1\n")
run_test_working1("required_arg",**kwargs)
print("\n")
run_test_working1("required_arg",**kwargs2)
print("\n")
print("Demo run_test_working2\n")
run_test_working2("required_arg",**kwargs)
print("\n")
run_test_working2("required_arg",**kwargs2)
print("\n")
print("Demo run_test_working3\n")
run_test_working3("required_arg",**kwargs)
print("\n")
run_test_working3("required_arg",**kwargs2)
print("\n")
print("Demo run_test_not_working\n")
run_test_not_working("required_arg",**kwargs)
print("\n")
test("required_arg")
Related
As per the manual, functools.partial() is 'used for partial function application which “freezes” some portion of a function’s arguments and/or keywords resulting in a new object with a simplified signature.'
What's the best way to specify the positions of the arguments that one wishes to evaluate?
EDIT
Note as per comments, the function to be partially evaluated may contain named and unnamed arguments (these functions should be completely arbitrary and may be preexisting)
END EDIT
For example, consider:
def f(x,y,z):
return x + 2*y + 3*z
Then, using
from functools import partial
both
partial(f,4)(5,6)
and
partial(f,4,5)(6)
give 32.
But what if one wants to evaluate, say the third argument z or the first and third arguments x, and z?
Is there a convenient way to pass the position information to partial, using a decorator or a dict whose keys are the desired arg positions and the respective values are the arg values? E.g. to pass the x and z positions, something like this:
partial_dict(f,{0:4,2:6})(5)
No, partial is not designed to freeze positional arguments at non-sequential positions.
To achieve the desired behavior outlined in your question, you would have to come up with a wrapper function of your own like this:
def partial_positionals(func, positionals, **keywords):
def wrapper(*args, **kwargs):
arg = iter(args)
return func(*(positionals[i] if i in positionals else next(arg)
for i in range(len(args) + len(positionals))), **{**keywords, **kwargs})
return wrapper
so that:
def f(x, y, z):
return x + 2 * y + 3 * z
print(partial_positionals(f, {0: 4, 2: 6})(5))
outputs:
32
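The same wrapper can freeze any subset of positions; for instance, fixing only the middle argument of the f defined above:

print(partial_positionals(f, {1: 5})(4, 6))  # y is frozen to 5, so this computes f(4, 5, 6) -> 32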
Simply use keyword arguments. Using your definition of f above,
>>> g = partial(f, z=10)
>>> g(2, 4)
40
>>> h = partial(f, y=4, z=10)
>>> h(2)
40
Note that once you use a keyword argument for a given parameter, you must use keyword arguments for all remaining arguments. For example, the following would not be valid:
>>> j = partial(f, x=2, z=10)
>>> j(4)
TypeError: f() got multiple values for argument 'x'
But continuing to use keyword arguments is:
>>> j = partial(f, x=2, z=10)
>>> j(y=4)
40
When you use functools.partial, you store the values of *args and **kwargs for later use. When you later call the "partially applied" function, the implementation of functools.partial effectively adds the previously provided *args and **kwargs to the argument list at the front and end, respectively, as though you had inserted these argument-unpackings yourself. I.e., calling
h = partial(f, 1, z=10)
h(4)
is roughly equivalent to writing
args = [1]
kwargs = {'z': 10}
f(*args, 4, **kwargs)
As such, the semantics of how you provide arguments to functools.partial are the same as how you would need to store arguments in the args and kwargs variables above such that the final call to f is sensible. For more information, take a look at the pseudo-implementation of functools.partial given in the functools module documentation.
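Roughly, that pseudo-implementation boils down to something like this (a simplified sketch of what the functools documentation shows, not the real C implementation):

def partial(func, *args, **keywords):
    def newfunc(*fargs, **fkeywords):
        # stored positionals go first; call-time keywords override stored ones
        return func(*args, *fargs, **{**keywords, **fkeywords})
    newfunc.func = func
    newfunc.args = args
    newfunc.keywords = keywords
    return newfunc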
For easier usage, you can create a sentinel object specifically to mark a positional argument that is to be skipped when you sequentially list the values to be frozen with partial:
SKIP = object()
def partial_positionals(func, *positionals, **keywords):
def wrapper(*args, **kwargs):
arg = iter(args)
return func(*(*(next(arg) if i is SKIP else i for i in positionals), *arg),
**{**keywords, **kwargs})
return wrapper
so that:
def f(x, y, z):
return x + 2 * y + 3 * z
print(partial_positionals(f, 4, SKIP, 6)(5))
outputs:
32
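The sentinel version reads the other way around: list the frozen values in order and mark the free slots with SKIP. For instance, freezing only y:

print(partial_positionals(f, SKIP, 5)(4, 6))  # y is frozen to 5; x=4 and z=6 come from the call -> 32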
I run a decorator demo below.
def logger(func):
def inner(*args, **kwargs):
print(args)
print(kwargs)
return func(*args, **kwargs)
return inner
@logger
def foo1(a, b, c, x=2, y=1):
print(x * y)
foo1(6,7,8)
output is:
(6, 7, 8)
{}
2
Why is the dict empty? I think it should be {'x':2, 'y':1}
That's because no kwargs were provided in the function call, and the logger decorator knows nothing about the defaults of the function it wraps. It is just a kind of "proxy" between the kwargs provided at the call site and the real call.
See examples below:
# kwargs are not provided (not redefined), function `foo1` will use default.
>>> foo1(6, 7, 8)
(6, 7, 8)
{}
2
# new kwargs are provided and passed to decorator too
>>> foo1(6, 7, 8, x=9, y=10)
(6, 7, 8)
{'x': 9, 'y': 10}
90
This is something similar to:
def foo1(a, b, c, x=2, y=1):
print(x * y)
def logger(func):
def inner(*args, **kwargs):
print(args)
print(kwargs)
return func(*args, **kwargs)
return inner
wrapped_foo1 = logger(foo1)
wrapped_foo1(6,7,8)
Or even simplified to the following, when you can clearly see the problem:
def foo1_decorated(*args, **kwargs):
print(args) # <-- here it has no chance to know that `x=2, y=1`
print(kwargs)
return foo1(*args, **kwargs)
foo1_decorated(6, 7, 8)
The problem is that the default values for arguments are filled in by the wrapped function object when you call it, because only the wrapped function knows them (they are stored in __defaults__ and __kwdefaults__).
If you want your decorator to know about them too, you have to mimic what the wrapped function object would do.
For this task you can use the inspect module:
from inspect import signature
def logger(func):
sig = signature(func)
def inner(*args, **kwargs):
arguments = sig.bind(*args, **kwargs) # these 2 steps are normally handled by func
arguments.apply_defaults()
print(func, "was called with", arguments)
return func(*args, **kwargs)
return inner
@logger
def foo1(a, b, c, x=2, y=1):
print(x * y)
foo1(6,7,8)
Output:
<function foo1 at 0x7f5811a18048> was called with <BoundArguments (a=6, b=7, c=8, x=2, y=1)>
2
If you want to access the arguments, read more about it in the docs.
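For example, the bound arguments are exposed as a mapping of parameter name to value, so a wrapper (or any other code) can look up individual values after apply_defaults() has filled in the missing ones. A standalone sketch:

from inspect import signature

def foo1(a, b, c, x=2, y=1):
    print(x * y)

sig = signature(foo1)
bound = sig.bind(6, 7, 8)
bound.apply_defaults()
print(bound.arguments)        # mapping of parameter name -> value, defaults filled in
print(bound.arguments['x'])   # 2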
That dictionary is empty because you have not passed any keyword arguments to foo1.
To get x and y instead of an empty dictionary, call
foo1(6, 7, 8, x=2, y=3)  # x and y are printed while printing kwargs
instead of
foo1(6, 7, 8)  # no x and y values are passed, so an empty dict is printed for kwargs
Note that only keyword names that appear in foo1's signature are accepted; any other keyword argument will raise a TypeError.
The process that is happening is exactly this:
1. A call to the foo1 function is attempted.
2. Due to the presence of @logger, the logger function is called first.
3. The foo1 function is passed to the logger function.
4. The inner function accepts both kinds of arguments of the foo1 function.
5. *args collects the positional arguments, i.e. those passed without key=value.
6. **kwargs collects only the key=value arguments.
7. Since you have passed 6, 7, 8, they are all treated as *args.
8. To pass something as **kwargs, you have to pass it as key=value in the foo1 call.
9. The *args and **kwargs values are printed.
10. The foo1 function is called.
11. The code inside foo1 executes.
So I have a bunch of processing functions, and all of them use a (for lack of a better word) 'master' function. This master function is basically a big AND operation that returns the relevant rows of a pandas data frame according to the values of a bunch of boolean or string columns (btw, the data are about rodent behavior).
def trial_selector(session,
init='NSWE', odor='ABCD', action='LRFB',
action_choice='LRFB', goal='NSWE', qNa='both',
cl=False, invalid=False):
    trials = load_trials(session)  # Wrapper load func that is somewhere else
    # input checks transform str into lists... ugly but needed for now
if type(init) == str:
init = [init] if len(init) == 1 else [x for x in init]
if type(odor) == str:
odor = [odor] if len(odor) == 1 else [x for x in odor]
mapping = {'A': 1, 'B': 2, 'C': 3, 'D': 4}
odor = [mapping[x] for x in odor]
if type(action) == str:
action = [action] if len(action) == 1 else [x for x in action]
if type(action_choice) == str:
action_choice = [action_choice] if len(action_choice) == 1 else [x for x in action_choice]
if type(goal) == str:
goal = [goal] if len(goal) == 1 else [x for x in goal]
# init odor action action_choice goal selection
tr = trials[trials.init.isin(init) & trials.valve_number.isin(odor)
& trials.action.isin(action)
& trials.action_choice.isin(action_choice)
& trials.goal_choice.isin(goal)]
# TODO: Invalid, correction loop and trial type (not complete)
if not invalid:
tr = tr[tr.valid]
if not cl:
tr = tr[~tr.correction_loop]
if qNa == 'both':
tr = tr
elif qNa == 'q':
tr = tr[~tr.solution]
elif qNa == 'a':
tr = tr[tr.solution]
return tr
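As an aside, the repeated str-to-list checks could be factored into one small helper. A sketch (assuming each argument is either a string of single-character codes or already a list, as in the code above; as_list is a hypothetical name):

def as_list(value, mapping=None):
    """Turn a code string like 'NSWE' into a list, applying an optional code mapping."""
    if isinstance(value, str):
        value = [mapping[x] for x in value] if mapping else list(value)
    return value

# e.g. inside trial_selector:
# init = as_list(init)
# odor = as_list(odor, {'A': 1, 'B': 2, 'C': 3, 'D': 4})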
The processor functions prepare the data to be plotted by a corresponding plotting function, e.g., tperformance returns (x, y, yerr) and is used by tplot_performance.
# tsargs are the arguments of the trial_selector function
def tperformance_uni(sessionName, **tsargs):
trials = trial_selector(sessionName, **tsargs)
x = trials.correct.dropna().index
y = trials.correct.dropna().values
return (x, y)
@check_session
def tperformance(sessionList, smooth=False, **tsargs):
temp = []
out = pd.DataFrame()
for session in sessionList:
x, y = tperformance_uni(session, **tsargs)
temp.append(pd.Series(y, index=x, name=session))
out = pd.concat(temp, axis=1, )
x = out.mean(axis=1).index
y = out.mean(axis=1).values
yerr = out.std(axis=1)/np.sqrt(len(out.columns))
yerr[yerr.isnull()] = 0
if not smooth or type(smooth) is bool:
win = win_size(x)
else:
win = win_size(x, default=smooth)
ysmooth = sm(y, win)
yerrsmooth = sm(yerr, win)
if len(x) != len(ysmooth):
ysmooth = ysmooth[1:]
if len(x) != len(yerrsmooth):
yerrsmooth = yerrsmooth[1:]
return (x, y, yerr) if not smooth else (x, ysmooth, yerrsmooth)
And the plotting function is, e.g.:
def tplot_performance(sessionName, ax=False, decor=False, err=False,
c='b', ls='-', m='',
smooth=False,
**tsargs):
"""
Plots correct across trials
"""
if not ax:
ax = plt.subplot2grid((1, 1), (0, 0), rowspan=1, colspan=1)
# ---
x, y, yerr = tperformance(sessionName, smooth=smooth, **tsargs)
# ---
ax.plot(x, y, ls=ls, marker=m, color=c, linewidth=2)
if err:
ax.fill_between(x, y-yerr, y+yerr, color='gray', alpha=0.25)
if decor:
tplot_performance_template(sessionName, ax=ax)
return (x, y, yerr)
I managed to successfully implement an argument check, @check_session, using decorators; it basically ensures that session is a list of strings.
def check_session(func):
"""Ensures session is of type list, if string will make list of one value
"""
def wrapper(session, **kwargs):
session = [session] if type(session) is str else session
return func(session, **kwargs)
return wrapper
So far so good.
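A small optional refinement of the same decorator (a sketch, behavior unchanged): functools.wraps keeps the wrapped function's name and docstring visible, and isinstance also accepts str subclasses:

from functools import wraps

def check_session(func):
    """Ensures session is a list; a bare string becomes a one-element list."""
    @wraps(func)                      # keep func's name and docstring on the wrapper
    def wrapper(session, **kwargs):
        session = [session] if isinstance(session, str) else session
        return func(session, **kwargs)
    return wrapper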
Now I wanted to add the default values of the trial_selector function without being completely explicit (i.e., exposing init, odor, action, ... in every function) and without being completely generic (i.e., the way it's implemented now, using **tsargs).
Basically I would like to use a decorator like @tsargs_defaults so that I can use the default values in the processing function to do stuff. I could have groups of input arguments that would allow me to declare something like this:
@default_bla
@tsargs_defaults
def example_func(*args, **kwargs):
    if init == 'N':
        do something
    if var_in_default_bla == someVal:
        do something else
The decorators should add groups of variables that are declared in the inner scope locals() of the func.
What I tried so far:
def tsargs_defaults(func):
"""Adds trial_selector arguments and their defaults to function
tsargs = init='NSWE', odor='ABCD', action='LRFB', action_choice='LRFB',
goal='NSWE', qNa='both', cl=False, invalid=False,
"""
def wrapper(*args, **kwargs):
defaults = {'init': 'NSWE',
'odor': 'ABCD',
'action': 'LRFB',
'action_choice': 'LRFB',
'goal': 'NSWE',
'qNa': 'both',
'cl': False,
'invalid': False}
for k in kwargs:
if k in defaults:
defaults[k] = kwargs[k]
elif k not in defaults:
defaults.update({k: kwargs[k]})
return func(*args, **defaults)
return wrapper
However, this adds what I want not to the local scope but to a kwargs dict (**defaults in the example). This means that I have to use kwargs['init'] == 'N' instead of init == 'N' in the inner scope of the function.
I understand that this is a huge explanation for a non-problem, as the code kind of works as it is; however, I have a bunch of processing and plotting functions that use exposed default arguments to do different things, and I would like to avoid refactoring all of them.
Maybe there is no way, or my question is ill-posed; keep in mind it's my first attempt at using Python decorators. In any case, I would like to understand a bit more.
Any help is appreciated!
Thanks
btw: I'm using python 3.4
TL;DR
# some_kwargs adds {'one': 1, 'two': 2}
# some_other_kwargs adds {'three': 3, 'four': 4}
@some_other_kwargs
@some_kwargs
def example_func(*args, **kwargs):
    print(one, two, three, four)  # does not work
    print(kwargs['one'], kwargs['two'], kwargs['three'], kwargs['four'])  # works
update
The original answer, suggesting the use of functools.partial, is below. In the meantime, years after answering this question, I needed the functionality described here:
A decorator to add new specific keyword args to the decorated function, and have these keywords show up in the function signature itself, without the wrapped function ever needing to know about them.
I created a "metadecorator" called conbine_sginatures which can imbue any ordinary decorator with this "power" - I had not, yet, refactored it to a more specific Python package from whre it can more easily be reused - it remains in the context where it was created, in the "utils" package of my unicode-art Terminedia project:
Example usage:
In [13]: from terminedia.utils import combine_signatures
In [14]: def add_parameter_b(func):
...:     @combine_signatures(func)
...: def wrapper(*args, b, **kwargs):
...: print("parameter b", b)
...: return func(*args, **kwargs)
...: return wrapper
...:
In [15]: @add_parameter_b
...: def a(a):
...: print("parameter a", a)
...:
In [16]: a(42, b=1138)
parameter b 1138
parameter a 42
The resulting code was subject of a talk on PyCon Sweden in 2020: https://www.youtube.com/watch?v=eva0s7up5Oc&t=1262s&ab_channel=PyConSweden
(This answer got an upvote today (2021-04-25), which is why I revisited it. By coincidence, I had just today added an extra feature to the combine_signatures decorator: it now works with coroutine functions as well.)
original answer
If all you want are Python functions that are different versions of the same function with different default parameters, you can use functools.partial to create them easily, with no need for decorators.
so, if you have def trial_selector(par1=..., par2=..., ...):
and want callables with different default sets for the various parameters, you can declare them like this:
from functools import partial
search1 = partial(trial_selector, par1="ABCD", par2="EFGH")
search2 = partial(trial_selector, par1=None, par3="XZY", ...)
And just call the searchN functions, only having to care about new parameters or about parameters you want to override.
Now, if you need decorators for functionality beyond that, there are some extra remarks about your code:
What you are likely not aware of is that if you make use of **kwargs to call a function, there is no need for the called function's own signature to use **kwargs.
So, for your inner function, instead of having just def example_func(*args, **kwargs): as a signature, you can have a full list of explicit parameters, like in your first listing:
def trial_selector(session,
init='NSWE', odor='ABCD', action='LRFB',
action_choice='LRFB', goal='NSWE', qNa='both',
cl=False, invalid=False):
and still wrap it with a decorator that passes kwargs to it, just as you did in your "What I tried so far" code. Moreover, in your example decorator you have re-created the dictionary update method in a rather complicated way; it can be written simply as:
def tsargs_defaults(func):
"""Adds trial_selector arguments and their defaults to function
tsargs = init='NSWE', odor='ABCD', action='LRFB', action_choice='LRFB',
goal='NSWE', qNa='both', cl=False, invalid=False,
"""
def wrapper(*args, **kwargs):
defaults = {'init': 'NSWE',
'odor': 'ABCD',
...
'invalid': False}
defaults.update(kwargs)
return func(*args, **defaults)
return wrapper
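Used that way, the decorated function still receives everything through **kwargs, with the decorator's defaults filled in unless the caller overrides them. For instance (with the elided defaults filled in as in the earlier listing; example_func here is just an illustration):

@tsargs_defaults
def example_func(*args, **kwargs):
    print(kwargs['init'], kwargs['odor'], kwargs['qNa'])

example_func('session01')             # NSWE ABCD both  (all defaults)
example_func('session01', odor='AB')  # NSWE AB both    (caller override wins)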
If that is all you want, that is all you will need. Moreover, the decorator syntax is meant to help - in this case, it looks like you can make use of several of these "default args" decorators without using the decorator syntax - you can write it just like:
def full_search_function(all, default, parameters, ...):
...
def decorator_for_type1_search(...):
...
type1_search = decorator_for_type1_search(full_search_function)
And at this point you have type1_search as a function with the parameters added in decorator_for_type1_search - and you can just create as many of those as you want.
Consider this example:
def decorator(func):
def wrapper(*args, **kwargs):
print(args, kwargs)
func(*args, **kwargs)
return wrapper
@decorator
def foo(x, y, z=0):
pass
foo(5, 5)
Output:
(5, 5) {}
Why not (5, 5) {'z': 0}? How can I pass all default values of the function foo to *args or **kwargs using only a decorator (for functions) or a metaclass (for class methods, e.g. __init__)?
The wrapper is just a normal function. It does not have "access" to the internals of the wrapped function.
You would have to use introspection to get them. See a related question:
How to find out the default values of a particular function's argument in another function in Python?
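To make the introspection route concrete (this is not part of the original answer, just an illustrative sketch): inspect.signature can bind the actual call and then fill in whatever the caller omitted via apply_defaults():

import inspect

def decorator(func):
    sig = inspect.signature(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        bound.apply_defaults()           # this is where z=0 appears
        print(bound.args, bound.kwargs)  # (5, 5, 0) {} for foo(5, 5)
        return func(*args, **kwargs)
    return wrapper

@decorator
def foo(x, y, z=0):
    pass

foo(5, 5)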
If I have to wrap an existing method, let us say wrapee(), from a new method, say wrapper(), and wrapee() provides default values for some arguments, how do I preserve its semantics without introducing unnecessary dependencies and maintenance? Say the goal is to be able to use wrapper() in place of wrapee() without having to change the client code. E.g., if wrapee() is defined as:
def wrapee(param1, param2="Some Value"):
# Do something
Then, one way to define wrapper() is:
def wrapper(param1, param2="Some Value"):
# Do something
wrapee(param1, param2)
# Do something else.
However, wrapper() has to make assumptions about the default value of param2, which I don't like. If I had control over wrapee(), I would define it like this:
def wrapee(param1, param2=None):
param2 = param2 or "Some Value"
# Do something
Then, wrapper() would change to:
def wrapper(param1, param2=None):
# Do something
wrapee(param1, param2)
# Do something else.
If I don't have control over how wrapee() is defined, how best to define wrapper()? One option that comes to mind is to create a dict of the non-None arguments and pass it as keyword arguments, but that seems unnecessarily tedious.
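For reference, that pass-only-what-was-given option would look roughly like this (a sketch using a sentinel instead of None, so that None itself can still be forwarded; _MISSING is just an illustrative name):

_MISSING = object()

def wrapper(param1, param2=_MISSING):
    extra = {} if param2 is _MISSING else {'param2': param2}
    # Do something
    wrapee(param1, **extra)  # wrapee applies its own default when param2 was omitted
    # Do something else.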
Update:
The solution is to use both the list and dictionary arguments like this:
def wrapper(param1, *args, **argv):
# Do something
wrapee(param1, *args, **argv)
# Do something else.
All the following calls are then valid:
wrapper('test1')
wrapper('test1', 'test2')
wrapper('test1', param2='test2')
wrapper(param2='test2', param1='test1')
Check out argument lists in the Python docs.
>>> def wrapper(param1, *stuff, **kargs):
...     print(param1)
...     print(stuff)
...     print(kargs)
...
>>> wrapper(3, 4, 5, foo=2)
3
(4, 5)
{'foo': 2}
Then to pass the args along:
wrapee(param1, *stuff, **kargs)
The *stuff collects a variable number of positional (non-keyword) arguments, and the **kargs collects a variable number of keyword arguments.
I'd hardly say that it isn't tedious, but the only approach that I can think of is to introspect the function that you are wrapping to determine if any of its parameters have default values. You can get the list of parameters and then determine which one is the first that has default values:
from inspect import getargspec
method_signature = getargspec(method)
param_names = method_signature[0]
default_values = method_signature[3]
params = []
# If any of method's parameters has default values, we need
# to know the index of the first one that does.
param_with_default_loc = -1
if default_values is not None and len(default_values) > 0:
param_slice_index = len(default_values) * -1
param_with_default = param_names[param_slice_index:][0]
param_with_default_loc = param_names.index(param_with_default)
At that point, you can iterate over param_names, copying into the dict that is passed to wrapee(). Once your index >= param_with_default_loc, you can obtain the default value by looking in the default_values list at index your_index - param_with_default_loc.
Does that make any sense?
Of course, to make this generic, you would have to define it as a wrapper function, adding yet another layer of wrapping.
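Note that inspect.getargspec has since been deprecated (and removed in Python 3.11); the same information is available through inspect.signature. A small sketch, not part of the original answer, that collects every parameter's default directly:

from inspect import signature, Parameter

def defaults_of(func):
    """Map parameter name -> default value for every parameter that has one."""
    return {name: p.default
            for name, p in signature(func).parameters.items()
            if p.default is not Parameter.empty}

def wrapee(param1, param2="Some Value"):
    pass

print(defaults_of(wrapee))  # {'param2': 'Some Value'}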
def wrapper(param1, param2=None):
if param2:
wrapee(param1, param2)
else:
wrapee(param1)
Is this what you want?
#!/usr/bin/python
from functools import wraps
def my_decorator(f):
    @wraps(f)
    def wrapper(*args, **kwds):
        print('Calling decorated function')
        return f(*args, **kwds)
    return wrapper

def f1(x, y):
    print(x, y)

def f2(x, y="ok"):
    print(x, y)
my_decorator(f1)(1,2)
my_decorator(f2)(1,2)
my_decorator(f2)(1)
adapted from http://koala/doc/python2.6-doc/html/library/functools.html#module-functools