How do you pass arguments from one function to another? - python

Sorry for the newbie question guys, but I'm relatively new to Python. I want to write a function that passes its keyword arguments (both names and values) on to another function:
e.g.
def function_that_passes_arguments(arguments):
    some_other_function(arguments)
so when I call the first function they are passed into the second... e.g.
function_that_passes_arguments(arg1=1, arg2=2)
is effectively
some_other_function(arg1=1, arg2=2)
The argument names will change, so it is important that I pass both the keyword and its value from one function to the other.

Accept *args, **kwargs and pass those to the called function:
def function_that_passes_arguments(*args, **kwargs):
    some_other_function(*args, **kwargs)
In both places you can also use regular arguments; the only requirement is that *args comes after any regular parameters and **kwargs comes last.
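For instance, with the names from the question (some_other_function here is a stand-in that just returns what it receives):

```python
def some_other_function(arg1=None, arg2=None):
    return (arg1, arg2)

def function_that_passes_arguments(*args, **kwargs):
    # *args/**kwargs capture whatever was passed; the same syntax
    # re-expands them in the inner call
    return some_other_function(*args, **kwargs)

print(function_that_passes_arguments(arg1=1, arg2=2))  # (1, 2)
```

The keyword names and their values arrive in the inner function unchanged, whatever they happen to be called.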

Related

How can you override the argspec of a Python function, e.g. to make the result of the help() function more useful?

I have a python function that only takes in keyword arguments:
def my_func(**kwargs):
I am splitting the keyword arguments among two separate functions, which have their keyword arguments defined explicitly:
def my_subfunc_1(a=None,b=None):
def my_subfunc_2(c=None,d=None):
When I issue help(my_func) I only get the description for my_func(**kwargs). However, ideally I would like the result of this to be my_func(a=None,b=None,c=None,d=None).
I can fetch the arguments of my_subfunc_1 and my_subfunc_2 with inspect.getfullargspec(). However, I am not sure how to use this information to override the part of my_func that the help() function reads from to fetch the displayed **kwargs.
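One possible approach (a sketch, not from the original thread): help() consults inspect.signature(), which honors a function's __signature__ attribute, so you can build a combined signature from the sub-functions' parameters. The bodies below are made-up illustrations:

```python
import inspect

def my_subfunc_1(a=None, b=None):
    return (a, b)

def my_subfunc_2(c=None, d=None):
    return (c, d)

def my_func(**kwargs):
    """Split the keyword arguments between the two sub-functions."""
    r1 = my_subfunc_1(**{k: v for k, v in kwargs.items() if k in ('a', 'b')})
    r2 = my_subfunc_2(**{k: v for k, v in kwargs.items() if k in ('c', 'd')})
    return r1, r2

# Collect the sub-functions' parameters into one combined signature
params = [p for f in (my_subfunc_1, my_subfunc_2)
          for p in inspect.signature(f).parameters.values()]
my_func.__signature__ = inspect.Signature(params)

print(inspect.signature(my_func))  # (a=None, b=None, c=None, d=None)
```

After this, help(my_func) displays the combined parameter list instead of the bare **kwargs.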

Add a function as a parameter in Python [duplicate]

This question already has answers here:
Passing functions with arguments to another function in Python? [duplicate]
(9 answers)
Closed 1 year ago.
Imagine that I want to create a function called "execute()". This function takes the name of another function and its input as parameters and outputs whatever it returns.
Here is an example:
execute(print, "Hello, World!") # "Hello, World!"
execute(str, 68) # "68"
Of course, this function wouldn't be of any use, but I want to grasp the main idea of putting another function in as a parameter.
How could I do this?
Functions can easily be passed into functions. To pass a variable-length argument list, capture it with *args in the function definition, and use the same * syntax when calling the function to expand the arguments back into separate parameters.
def execute(fn, *args):
    return fn(*args)
Note: we are not passing the name of a function to execute(), we are passing the function itself.
You can just do this:
def execute(func, argv):
    return func(argv)
execute(print, 'test')
prints test
execute(str, 65)
returns '65'
I believe this should work:
def execute(fn, *args, **kwargs):
    return fn(*args, **kwargs)
Here:
args = the positional arguments (a tuple)
kwargs = the keyword arguments (a dict)
If you want to do more, then you can look for Decorators in Python.
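As a taste of where this leads, a minimal decorator (a sketch with made-up names) relies on exactly this *args/**kwargs forwarding:

```python
import functools

def logged(fn):
    """Decorator that announces each call, forwarding all arguments."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        print(f"calling {fn.__name__}")
        return fn(*args, **kwargs)
    return wrapper

@logged
def add(a, b=0):
    return a + b

print(add(2, b=3))  # prints "calling add", then 5
```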
Yes, you can pass functions as parameters into another function.
Functions that can accept other functions as arguments are also called higher-order functions.
I hope the following example helps:
def shout(text):
    return text.upper()

def greet(func):
    greeting = func("Hi, I am created by a function passed as an argument.")
    print(greeting)

greet(shout)
The output of the code will be:
HI, I AM CREATED BY A FUNCTION PASSED AS AN ARGUMENT.

In Python 3.x how do I create a simple proxy function for another function that is a pure pass through?

In Python 3.x, I want to create a proxy function _proxy for a specific known function proxiedFunc and guarantee that all arguments passed are "forwarded" exactly as if they were passed directly to the proxiedFunc.
# Pseudo-Python code
def _proxy(???generic_parameters???):
    return proxiedFunc(???arguments???)
What I mean by "pure pass through": the implementation of _proxy should not be affected by (non-)compatible changes to proxiedFunc, assuming the name of the function doesn't change (SOLID principles). Obviously, callers of _proxy would need to be modified if proxiedFunc is changed incompatibly (i.e. I'm not intending for _proxy to be an adapter, but that would be a possibility).
The generic way of taking "anything" in a function definition is using *args, **kwargs.
The same syntax is used for passing those args when calling another function.
def _proxy(*args, **kwargs):
    return proxiedFunc(*args, **kwargs)
The single * (e.g. *args) captures the positional arguments, and the double ** (e.g. **kwargs) captures the keyword arguments.
args and kwargs are the names you give to those argument-containers. By convention, the name of the "any-positional-args" argument is args or a (its type is tuple), and the name of the "any-keyword-args" argument is kwargs or kw (its type is dict).
I, too, wanted to find a way to do that, so I wrote a function for that. I posted it to github: https://github.com/make-itrain/function-proxy
Basically, what it does is:
def a(p1, p2, p3=3, *args, p4=228, p18=11, **kwargs):
    # any function, doesn't matter
    pass

def b(p4, p2, *args):
    # some overlapping arguments
    # it too doesn't matter, what's inside here
    print(p4, p2, args)

args, kwargs = proxy_function(b, a, {"args": ("replaced args",)}, 322, "some_random_arg", 1337,
                              "arguments", ('anything',), 1, abc=338, cbd="here too?")
b(*args, **kwargs)
Prints 228 some_random_arg ('replaced args',). Cool, isn't it?

forcing value of lambda inner scope variable to outer variable - python

I have already found various answers to this question (e.g. lambda function accessing outside variable) and they all point to the same hack, namely (e.g.) lambda n=i: n*2, with i a variable in the scope enclosing the lambda (hoping I'm not misusing the term scope). However, this is not working for me, and given that the answers I found are generally a couple of years old, I thought that maybe this had been deprecated and only worked with older versions of Python. Does anybody have an idea or suggestion on how to solve this?
SORRY, forgot the MWE
from inspect import getargspec

params = ['a','b']

def test(*args):
    return args[0]*args[1]

func = lambda p=params: test(p)
I expected the signature of func to be ['a','b'], but if I try
func(3,2)
I get a TypeError (TypeError: <lambda>() takes at most 1 argument (2 given)),
and its true signature (from getargspec(func)[0]) is ['p'].
In my real code the thing is more complicated. Shortly:
def fit(self, **kwargs):
    settings = self.synch()
    freepars = self.loglike.get_args()
    func = lambda p=freepars: self.loglike(p)
    minuit = Minuit(func, **settings)
I need lambda because it's the only way I could think to create inplace a function object depending on a non-hardcoded list of variables (extracted via a method get_params() of the instance self.loglike). So func has to have the correct signature, to match the info inside the dict settings
The inspector gives ['p'] as argument of func, not the list of parameters which should go in loglike. Hope you can easily spot my mistake. Thank you
There's no way to do exactly what you want. The syntax you're trying to use to set the signature of the function you're creating doesn't do what you want. It instead sets a default value for the argument you've defined. Python's function syntax allows you to define a function that accepts an arbitrary number of arguments, but it doesn't let you define a function with argument names in a variable.
What you can do is accept *args (or **kwargs) and then do some processing on the value to match it up with a list of argument names. Here's an example where I turn positional arguments in a specific order into keyword arguments to be passed on to another function:
arg_names = ['a', 'b']

def foo(*args):
    if len(args) != len(arg_names):
        raise ValueError("wrong number of arguments passed to foo")
    args_by_name = dict(zip(arg_names, args))
    some_other_function(**args_by_name)
This example isn't terribly useful, but you could do more sophisticated processing on the args_by_name dict (e.g. combining it with another dict), which might be relevant to your actual use case.
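For example, one such refinement is to merge the positional arguments with a dict of defaults before forwarding them (target and the default values here are made-up illustrations):

```python
arg_names = ['a', 'b']
defaults = {'b': 10}

def target(a, b):
    return a + b

def foo(*args):
    if len(args) > len(arg_names):
        raise ValueError("too many arguments passed to foo")
    # Pair positional values with names; missing ones fall back to defaults
    args_by_name = {**defaults, **dict(zip(arg_names, args))}
    return target(**args_by_name)

print(foo(1))     # 11 (b falls back to its default)
print(foo(1, 2))  # 3
```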

why doesn't ** unpack kwargs in function calls?

This is something that's bugged me for awhile now:
def test(*args, **kwargs):
    print target

test(foo='bar', target='baz')
I would presume that target='baz' in the test call at the bottom would end up in kwargs (and it does), and I would also presume that ** would unpack kwargs in the function call, so target would exist as a keyword argument inside of test. It doesn't. I know that it comes in as a dict, but I need to have that dict unpack in the argument list. Is this possible? In short, is there any way to have *args and **kwargs disappear and have the actual args and kwargs go into the call?
Edit: I threw together a case where unpacking of *args and **kwargs might help:
Let's say I have a function that prints a list:
def printList(inputList=None):
    print inputList
I want to be able to pass no list and have a default list supplied:
def ensureList(listFunc):
    def wrapper(inputList=None):
        listFunc(inputList=inputList or ['a','default','list'])
    return wrapper

@ensureList
def printList(inputList=None):
    print inputList
Now I want to get a bit more complicated with a list repeater:
@ensureList
def repeatList(inputList=None):
    print inputList*2
That works fine. But now I want variable repeating:
@ensureList
def repeatList(times, inputList=None):
    print inputList*times
Now you would be able to say:
repeatList(5)
It would generate the default list and repeat it 5 times.
This fails, of course, because wrapper can't handle the times argument. I could of course do this:
@ensureList
def repeatList(inputList=None, times=1):
But then I always have to do this:
repeatList(times=5)
And maybe in some cases I want to enforce supplying a value, so a non-keyword arg makes sense.
When I first encountered problems like this last year, I thought a simple solution would be to remove the requirements on the wrapper:
def ensureList(listFunc):
    "info here re: operating on/requiring an inputList keyword arg"
    def wrapper(*args, **kwargs):
        listFunc(inputList=inputList or ['a','default','list'])
    return wrapper
That doesn't work, though. This is why I'd like to have args and kwargs actually expand, or I'd like to have a way to do the expansion. Then whatever args and kwargs I supply, they actually fill in the arguments, not a list and a dict. The documentation in the wrapper would explain the requirements. If you pass in inputList, it would actually go in, and inputList in the call back to repeatList from the wrapper would be valid. If you didn't pass in inputList, the wrapper would create it in the call back to repeatList with a default list. If your function didn't care, but used **kwargs, it would just gracefully accept it without issue.
Apologies if any of the above is wrong (beyond the general concept). I typed it out in here, untested, and it's very late.
The answer to "why doesn't ** unpack kwargs in function calls?" is: because it would be a bad idea. The person who develops a function does not want local variables to just appear depending on the call's arguments.
So this is not how it works, and you surely do not want Python to behave like that.
To access the target variable in the function, you can either use:
def test(target='<default-value>', *args, **kwargs):
    print target
or
def test(*args, **kwargs):
    target = kwargs.get('target', '<default-value>')
    print target
However, if you want a hack (educational usage only, and Python 2 only: in Python 3, exec cannot create new local variables inside a function) to unpack **kwargs, you can try that:
def test(*args, **kwargs):
    for i in kwargs:
        exec('%s = %s' % (i, repr(kwargs[i])))
    print target
The obvious way for this particular case is
def test(foo=None, target=None):
    print target

test(foo='bar', target='baz')
If you want to access a parameter inside a function by name, name it explicitly in the argument list.
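To make the resolution concrete: ** does unpack a dict, but only at the call site; inside the definition, keywords are collected back into kwargs. A small Python 3 sketch (using return instead of print so the value is visible):

```python
def test(foo=None, target=None):
    return target

kwargs = {'foo': 'bar', 'target': 'baz'}
# ** unpacks the dict at the call site, so target is bound by name:
print(test(**kwargs))  # baz
```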
