How can I write a decorator so that decorated functions can accept (and ignore) arbitrary arguments?
I have some functions like this:
def foo(x):
    return x

def foo2(x, y):
    if bar(y):
        return x
    else:
        return x + 1

def foo3(x, y, z):
    ...
foo() can calculate the return value for a given x just based on x, but foo2() needs another parameter, and foo3() needs a third parameter. I have a method elsewhere that, among other things, calls either foo(), foo2(), etc depending on a user-specified argument.
Right now, the method just looks up the appropriate function by the user-specified name (with getattr) and calls it with all of x, y, z. To avoid a TypeError from passing the wrong number of arguments, the foo functions are all defined with *args, like this:
def foo(x, *args):
    return x
But I'd like to just have a decorator to avoid including *args in every single foo function definition. Is there a way to do that? Or can you suggest a better way to organize this code?
One way would be to write the method like this:
if user_arg == 'foo':
    foo(x)
elif user_arg == 'foo2':
    foo2(x, y)
elif user_arg == 'foo3':
    foo3(x, y, z)
But I'd really like to avoid this, because there are a lot of foo functions, and because I'd like to make it possible to add new foo functions without also having to add another branch to the method.
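One way to drop that if/elif chain is a dispatch table populated by a registering decorator, so adding a new foo function needs no change to the calling method. This is only a sketch; the register/DISPATCH/run names are made up for illustration:

```python
# Hypothetical dispatch-table sketch: each foo-style function registers
# itself under its own name, replacing the if/elif chain with a dict lookup.
DISPATCH = {}

def register(func):
    DISPATCH[func.__name__] = func
    return func

@register
def foo(x, *args):
    return x

@register
def foo2(x, y, *args):
    return x if y else x + 1

def run(user_arg, *values):
    # look the function up by the user-specified name and call it
    return DISPATCH[user_arg](*values)

print(run('foo', 1, 2, 3))   # -> 1
print(run('foo2', 1, 0, 3))  # -> 2
```

This still relies on *args in each definition; the decorator answer below removes that need.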
Edit: To clarify, it's not the case that I need to call a different foo function based on the number of arguments. It's arbitrary (user-specified) which of the foo functions is called.
def foo3(x, y, z):
    return x + y + z

def foo4(x, y, z):
    return x + y - z
You can use inspect.getargspec (deprecated in Python 3 in favour of inspect.getfullargspec and inspect.signature) to determine what arguments the decorated function takes:
import functools
import inspect

def ignore_extra_arguments(func):
    args, varargs, kwvarargs, defaults = inspect.getargspec(func)

    @functools.wraps(func)
    def wrapper_func(*wrapper_args, **wrapper_kwargs):
        if varargs is None:
            # remove extra positional arguments
            wrapper_args = wrapper_args[:len(args)]
        if kwvarargs is None:
            # remove extra keyword arguments
            wrapper_kwargs = {k: v for k, v in wrapper_kwargs.items() if k in args}
        return func(*wrapper_args, **wrapper_kwargs)

    return wrapper_func
This can be optimized a little by lifting the if checks out of wrapper_func, at the expense of writing more code.
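For illustration, here is a sketch of that optimization on Python 3, using inspect.signature (the modern replacement for getargspec): the introspection runs once at decoration time, and the wrapper only does the slicing and filtering.

```python
import functools
import inspect

def ignore_extra_arguments(func):
    # Inspect the signature once, when the function is decorated.
    params = inspect.signature(func).parameters.values()
    kinds = [p.kind for p in params]
    has_varargs = inspect.Parameter.VAR_POSITIONAL in kinds
    has_varkw = inspect.Parameter.VAR_KEYWORD in kinds
    names = [p.name for p in params
             if p.kind is inspect.Parameter.POSITIONAL_OR_KEYWORD]

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        if not has_varargs:
            args = args[:len(names)]  # drop extra positional arguments
        if not has_varkw:
            kwargs = {k: v for k, v in kwargs.items() if k in names}
        return func(*args, **kwargs)

    return wrapper

@ignore_extra_arguments
def foo(x):
    return x

print(foo(1, 2, 3))  # -> 1, the extra arguments are silently dropped
```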
You can use named arguments and always call foo() (the first function) in order to dispatch to the "right" function, as follows:
def foo(a, b=None, c=None, d=None):
    if b is not None:
        return foo2(a, b, c, d)
    else:
        ...  # do your thing

def foo2(a, b, c=None, d=None):
    if c is not None:
        return foo3(a, b, c, d)
    else:
        ...  # do your thing

def foo3(a, b, c, d=None):
    ...
Another option:
def foo(a, *b):
    if b:
        return foo2(a, *b)
    else:
        print "in foo"
        # do your thing

def foo2(a, b, *c):
    if c:
        return foo3(a, b, *c)
    else:
        print "in foo2"
        # do your thing

def foo3(a, b, c, *d):
    print "in foo3"

foo(1, 2, 3)
Mock version of the problem
For a function
def f(a, b, c):
    return a + b + c
The function
def fix(func, **kwargs):
    fa = kwargs.get('a')
    fb = kwargs.get('b')
    if fa is not None and fb is not None:
        def f(*args):
            return func(a=fa, b=fb, c=args[0])
    elif fa is not None:
        def f(*args):
            return func(a=fa, b=args[0], c=args[1])
    elif fb is not None:
        def f(*args):
            return func(a=args[0], b=fb, c=args[1])
    else:
        def f(*args):
            return func(*args)
    return f
lets us obtain a new function by fixing some of the parameters of func.
For example: fix(g, b=3) would give us a function like
def fixed_b_in_g(a, c):
    return g(a, 3, c)
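For this kind of parameter fixing, the standard library already provides functools.partial, which fix essentially reimplements. A quick sketch (g here is just the three-argument sum from above):

```python
import functools

def g(a, b, c):
    return a + b + c

# functools.partial fixes b=3, like fix(g, b=3) above
fixed_b_in_g = functools.partial(g, b=3)
print(fixed_b_in_g(a=1, c=2))  # -> 6
```

What partial cannot do, however, is tie two parameters to each other, which is exactly what the question asks for next.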
Question: I would like to see if there is some trick to use fix in such a way that produces a function like
def fix_a_equal_b_in_g(a, c):
    return g(a, a, c)
Concrete problem
The function scipy.stats.rv_continuous.fit fits the parameters of a distribution to an input sample. It accepts keyword arguments (much like fix above does) telling it to hold some of the parameters fixed at user-supplied values. Internally, scipy.stats.rv_continuous.fit has a function, scipy.stats.rv_continuous._reduce_func, that does more or less what fix does (better implemented than my fix, for example).
In my case, rather than fixing some parameters to values, I would like the fit to keep two parameters (say a and b) equal to each other, but still free during the fitting.
We can use this function to copy a keyword argument whose name is base_kwarg_name to added_kwarg_name:
def with_copied_kwargs(func, added_kwarg_names_by_base):
    def fixed_func(*args, **base_kwargs):
        added_kwargs = {
            added_kwarg_name: base_kwargs[base_kwarg_name]
            for base_kwarg_name, added_kwarg_name in added_kwarg_names_by_base.items()
        }
        return func(*args, **base_kwargs, **added_kwargs)
    return fixed_func
Given:
def add(*, a, b, c):
    return a + b + c
then modified_add = with_copied_kwargs(add, {"b": "c"}) is equivalent to:
def modified_add(*, a, b):
    return add(a=a, b=b, c=b)
with_copied_kwargs can then be used along with functools.partial to both copy keyword arguments and provide values incrementally. modified_add = functools.partial(with_copied_kwargs(add, {"b": "c"}), a=1) is equivalent to:
def modified_add(*, b):
    return add(a=1, b=b, c=b)
Note that I add * (see PEP 3102) before all parameters of the functions I then apply with_copied_kwargs to, because the minute people start using positional arguments, things get messy. It is safer to restrict these functions to keyword-only arguments.
I am trying to implement a callback system in Python that is similar to how JavaScript can have different numbers of parameters in its callbacks. Ideally, I want to achieve this without using *args or **kwargs in the parameters of my callbacks.
My Goal
What I want is something that looks roughly like this:
def callback1(val):
    print(val)

def callback2(x, y):
    print(x, y)

def callback3(a, b, c):
    print(a, b, c)

def foo(callback):
    callback(1, 2, 3)  # Always has 3 arguments to pass

foo(callback1)  # Fails. Should print "1"
foo(callback2)  # Fails. Should print "1 2"
foo(callback3)  # Ok. Prints "1 2 3"
Perhaps a more verbose way of putting it would be:
# The num_params() function isn't real (that I know of), but this is an
# inelegant example of how the callbacks might work if it were
def foo2(callback):
    if num_params(callback) == 1:
        callback(1)
    elif num_params(callback) == 2:
        callback(1, 2)
    elif num_params(callback) == 3:
        callback(1, 2, 3)
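As it happens, the hypothetical num_params can be written with inspect.signature, which makes the dispatch generic rather than an if/elif chain. A sketch (num_params here is my own helper, not a standard function):

```python
import inspect

def num_params(callback):
    # count the parameters the callback declares
    return len(inspect.signature(callback).parameters)

def foo(callback):
    args = (1, 2, 3)
    # pass only as many of the available arguments as the callback accepts
    return callback(*args[:num_params(callback)])

def callback1(val):
    return val

def callback2(x, y):
    return (x, y)

print(foo(callback1))  # -> 1
print(foo(callback2))  # -> (1, 2)
```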
What I Don't Want
I don't want to use *args or **kwargs in each callback (unless this isn't possible any other way) like the following:
# This is just SO ugly
def callback1(*args):
    print(args[0])

def callback2(*args):
    print(args[0], args[1])

def callback3(*args):
    print(args[0], args[1], args[2])
JavaScript Equivalent
This is relatively common in JavaScript. For example, one can supply the callback of a .forEach() function with 1, 2, or 3 arguments:
let myArray = [1, 2, 3, 4]

// Valid
myArray.forEach((element) => {
    // Do stuff with the element only
});

// Valid
myArray.forEach((element, index) => {
    // Do stuff with the element AND the index
});

// Valid
myArray.forEach((element, index, array) => {
    // Do stuff with the element, index and the whole array
});
However, despite my best efforts in Google searching, I have no idea how to implement this in Python (or even in JavaScript for that matter, but that's beside the point; I hope this doesn't come back to bite me).
I would very much like to know if this is possible in Python and/or what the proper term is for this coding technique.
What's wrong with *args and **kwargs? They are the Pythonic way to do this; Python is not JavaScript. If you do not like accessing arguments by index (args[0], args[1], etc.), you can define the arguments you need as usual and collect the rest (the unused ones) in *args:
def callback1(a, *args):
    print(a)

def callback2(a, b, *args):
    print(a, b)

def callback3(a, b, c):
    print(a, b, c)
Also you can unpack them in the function:
def callback1(*args):
    a, *rest = args
    print(a)
This makes the body more verbose, but every callback gets the same definition.
It is also common to name variables you are not going to use _ (underscore) instead of args, rest, etc.:
def callback1(a, *_):
    print(a)

def callback1(*args):
    a, *_ = args
    print(a)
You can define all your callback functions using the same number of arguments, i.e.:
def callback1(val, b=None, c=None):
    print(val)

def callback2(x, y, c=None):
    print(x, y)

def callback3(a, b, c):
    print(a, b, c)
Alternatively you can unpack *args within functions:
def callback1(*args):
    val, _, _ = args
    print(val)

def callback2(*args):
    x, y, _ = args
    print(x, y)

def callback3(*args):
    a, b, c = args
    print(a, b, c)
Finally, you can get creative using functools.partial.
Python and Excel handle defaults differently. Python can pass optional arguments by keyword, while Excel passes arguments positionally only, even for defaults. As a result, an argument left unset in Excel is in reality passed, as None. Take as an example the scipy function brentq:
brentq(f, a, b, xtol=1e-12, rtol=4.4408920985006262e-16, maxiter=100, full_output=False, disp=True, *args)
calling it from Excel with xtol and rtol unset:
brentq(f,a,f,,,50)
will in reality be seen as
brentq(f,a,f,None,None,50)
and of course Python will not like None for xtol,rtol.
As a workaround, until now I have had a function that checks for default values:
import inspect

def checkvals(f, args):
    a = inspect.getargspec(f)
    defs = zip(a.args[-len(a.defaults):], a.defaults)
    for x in defs:
        key = x[0]
        if args[key] is None:
            args[key] = x[1]
    return args
and I wrap brentq as follows:
def brentq(f, a, b, xtol=1e-12, rtol=4.4408920985006262e-16, maxiter=100, full_output=False, disp=True, *args):
    x = checkvals(brentq, locals())
    return scipy.optimize.brentq(f, a, b, args=args, xtol=x['xtol'], rtol=x['rtol'],
                                 maxiter=x['maxiter'], full_output=x['full_output'],
                                 disp=x['disp'])
It works, meaning that x['xtol'] and x['rtol'] are restored to their defaults. However, I was wondering if there is a better way to do it.
In other words: is it possible to modify locals() inside a function, and force the function to use the modified values?
If Excel passes None when you skip an argument, then put that in your Python function signature as the default argument. In fact, this is the Pythonic way to do it (for various reasons including default-argument mutability that aren't worth getting into here).
def brentq(f, a, b, xtol=None, rtol=None, maxiter=100, full_output=None, disp=None, *args):
    if xtol is None: xtol = 1e-12
    if rtol is None: rtol = 4.4408920985006262e-16
    if full_output is None: full_output = False
    if disp is None: disp = True
    # ... rest of function
I'm satisfied with Kyle's final suggestion. Following it, the Excel-Python interface would look like this (example tried in Spyder):
import scipy.optimize

def f(x):
    return -2*x**4 + 2*x**3 + -16*x**2 + -60*x + 100

# this is the wrapper for Excel
def brentqWrapper(f, a, b, xtol=None, rtol=None, maxiter=None, full_output=None, disp=None, *args):
    return scipy.optimize.brentq(**{k: v for k, v in locals().items() if v is not None})

# this simulates the Excel call with unset inner optional parameters:
print brentqWrapper(f, 0, 2, None, None, 50)
>> 1.24078711375
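As to the side question: in CPython, assigning into the dict returned by locals() inside a function does not change the function's actual local variables, so the checkvals pattern of returning the fixed dict (rather than trying to mutate locals in place) is the reliable one. A minimal demonstration:

```python
def demo():
    a = 1
    # In CPython, locals() inside a function returns a snapshot;
    # writing to it is not reflected back into the real local variable.
    locals()['a'] = 2
    return a

print(demo())  # -> 1, not 2
```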
I'm trying to execute a list of functions with the output of some functions being used as inputs to other ones.
A simplified example of what I'm trying to do:
def a():
    return 25

def b():
    return 50

def c(x, y):
    return x + y
Normally, I'd do: print c(a(), b())
But if I have a list of functions: l = [a,b,c]
How can I iterate over this list and get the right arguments passed into method c?
Thanks.
My previous answer was not generic enough, so here's another bash at it. This uses inspect.getargspec() to determine the arity of each function prior to calling it. With that knowledge the required number of arguments can be sliced out of a "results list" which contains the results of all previous function calls.
There are some limitations:
inspect.getargspec() doesn't work for built-in functions, only user-defined ones.
varargs and keyword args are ignored because this results in possibly infinite arguments which just complicates things.
Code:
import inspect

def a():
    return 25

def b():
    return 50

def c(x, y):
    return x + y

def d(x, y, z):
    return x + y + z

def e(x=100):
    return x * x

def f():
    pass

def execute_function_list(funcs):
    results = []
    for i, fn in enumerate(funcs):
        args_required = len(inspect.getargspec(fn).args)
        if args_required > 0:
            args = results[i-args_required:i]
        else:
            args = ()
        print("calling function %r with args %r" % (fn, args))
        results.append(fn(*args))
        print("after function returned results = %r" % results)
    return results[-1] if len(results) else None

if __name__ == '__main__':
    funcs = [e, a, b, c, d, e]
    print("final result is %r" % execute_function_list(funcs))
Assuming that the last function in the list is to be called with the results of all of the preceding functions in the list:
def c(*args):
    "Return the sum of all function arguments"
    return sum(args)
>>> l = [a, b, a, b, b, b, c]
>>> args=(f() for f in l[:-1])
>>> args
<generator object <genexpr> at 0x7fd441aabe60>
>>> l[-1](*args)
250
or in one line,
>>> l[-1](*(f() for f in l[:-1]))
250
You can try the built-in function apply (Python 2 only; it was removed in Python 3, where you would write l[-1](*[func() for func in l[:-1]]) instead):
apply(l[-1], [func() for func in l[:-1]])
I have two definitions say,
file x.py:
class x:
    def p(self, a, b):
        # ...
file y.py:
class y:
    def p(self, a, b, c):
        # [...]
Now I am calling these functions from another file. A command-line input naming one of the classes (in this case x or y) is used to set the cust variable.
file z.py:
from x import *
from y import *

class z:
    cust.p(a, b, c)
x (or y) is being passed as the command line inputs to file z.py.
cust is the class variable of class z.
Now when cust is an x object I get an error: takes exactly 2 arguments (3 given). But when cust is a y object it works fine.
How do i eliminate this error?
I do not want to modify the function definition in x.py and y.py files as there are many files.
How do i modify the code so that both the functions are called from the same function call without modifying the function definition?
Change your methods to make the third parameter optional in both...
class x:
    def p(self, a, b, c=None):
        """ c will be ignored. """
        print a, b

class y:
    def p(self, a, b, c=1234):
        """ c will be used, with a default of 1234. """
        print a, b, c

x().p(1, 2)
x().p(1, 2, 123)
y().p(1, 2)
y().p(1, 2, 123)
Monkeypatch.
_orig_p = x.p
x.p = lambda self, a, b, c=None: _orig_p(self, a, b)
Based on my understanding of the code and your specifications (which, honestly, are not very clear), you want to call the functions from both files at the same time, and you have this variable called cust through which you call them as methods.
Until you specify what cust is, there is not much I can do to help. I also am unsure of how it is both of classes x and y simultaneously. This might be what you want:
# file x.py
class x:
    def p(self, a, b):
        ...

# file y.py
class y:
    def p(self, a, b, c):
        ...

# file z.py
from x import *
from y import *

my_x = x()
my_y = y()

a = ...
b = ...
c = ...

my_x.p(a, b)
my_y.p(a, b, c)
I would like you to be more specific as to what you want so I can help.
Quick and dirty answer is:
try:
    cust.p(a, b, c)
except TypeError:
    cust.p(a, b)
Disclaimer: there are numerous problems with how the original problem has been defined, so the approach doesn't seem to make anything worse.
If you can't change the function being called to use *args or **kwargs, then you have to add special cases to see what class the object is:
if isinstance(cust, x):
    cust.p(a, b)
elif isinstance(cust, y):
    cust.p(a, b, c)
See the documentation of the isinstance function for more information.
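If adding isinstance branches for every new class is undesirable, another option (a sketch; call_p is a hypothetical helper, and X/Y stand in for the classes from x.py and y.py) is to trim the argument list to whatever the method accepts, much as the decorator answers above do:

```python
import inspect

class X:
    def p(self, a, b):
        return (a, b)

class Y:
    def p(self, a, b, c):
        return (a, b, c)

def call_p(cust, *args):
    # cust.p is a bound method, so self is already supplied;
    # signature() reports only the remaining parameters.
    n = len(inspect.signature(cust.p).parameters)
    return cust.p(*args[:n])

print(call_p(X(), 1, 2, 3))  # -> (1, 2)
print(call_p(Y(), 1, 2, 3))  # -> (1, 2, 3)
```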