I am trying to call a Python function defined by someone else. That function explicitly requires 3 arguments:
def function_name(argument1, argument2, argument3):
However, inside the function only argument1 and argument3 are used; argument2 is completely ignored. If I cannot modify the function definition and need to call this function, how should I skip providing argument2?
like
function_name(value1, *, value3)
or
function_name(value1, whatever_fake_value, value3)
I know the latter option is definitely going to work, but can I explicitly show (to minimize future confusion) that an argument has been skipped in this function call?
Create a wrapper function that calls the old function you can't change, passing through the arguments you care about and a default in place of the "dead" argument, with a clear, concise comment explaining the exact situation for posterity (and so your future self is happy with you):
def new_wrapper_function(arg1, arg2):
    # wrapper that calls old_function with a placeholder in position 2,
    # because that argument is unused inside old_function
    return old_function(arg1, default_dead_arg, arg2)
You would probably have the wrapper pass None as the "dead" argument, for example:
def new_wrapper_function(arg1, arg2):
    # wrapper that calls old_function with None in position 2,
    # because that argument is unused inside old_function
    return old_function(arg1, None, arg2)
You can make argument2 an optional argument and give it a default value. For instance:
def function_name(arg1, arg3, arg2=None):
    pass
Then you can check whether arg2 is valid and otherwise ignore it.
Why can't you change the definition? If you absolutely have to use it as is, you can pass None or an empty list.
function_name(arg1, None, arg2)
Pass None as the second argument. No harm no foul.
Since all the arguments are required and none has a default, skipping the second one in any way, even by passing named arguments, would raise a TypeError.
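To make that concrete, here is a minimal sketch (the function body is invented to mirror the situation above) showing that a required positional argument cannot simply be skipped, while an explicit None placeholder works:

```python
def function_name(argument1, argument2, argument3):
    # argument2 is accepted but never used, mirroring the question
    return argument1 + argument3

# Skipping argument2 outright fails, even with keyword arguments:
try:
    function_name(1, argument3=3)
except TypeError as e:
    print(e)  # missing 1 required positional argument: 'argument2'

# An explicit placeholder documents the skip and works:
print(function_name(1, None, 3))  # -> 4
```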
Related
I have a python function that only takes in keyword arguments:
def my_func(**kwargs):
I am splitting the keyword arguments among two separate functions, which have their keyword arguments defined explicitly:
def my_subfunc_1(a=None,b=None):
def my_subfunc_2(c=None,d=None):
When I issue help(my_func), I only get the description my_func(**kwargs). Ideally, I would like the result to be my_func(a=None, b=None, c=None, d=None).
I can fetch the arguments of my_subfunc_1 and my_subfunc_2 with inspect.getfullargspec(). However, I am not sure how to use that information to override whatever help() reads from my_func to display **kwargs.
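One approach (a sketch, not the only way) is to build a combined inspect.Signature from the sub-functions and assign it to my_func.__signature__, which is what help() and inspect.signature() consult when it is present:

```python
import inspect

def my_subfunc_1(a=None, b=None):
    pass

def my_subfunc_2(c=None, d=None):
    pass

def my_func(**kwargs):
    pass

# Collect the parameters of both sub-functions, in order
params = []
for sub in (my_subfunc_1, my_subfunc_2):
    params.extend(inspect.signature(sub).parameters.values())

# help(my_func) and inspect.signature(my_func) now report the combined signature
my_func.__signature__ = inspect.Signature(params)
print(inspect.signature(my_func))  # (a=None, b=None, c=None, d=None)
```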
I'm currently learning curses in python, and I found this piece of code online that is confusing me.
import curses
def draw_menu(stdscr):
    # do stuff
    # if you want more code just let me know

def main():
    curses.wrapper(draw_menu)

if __name__ == "__main__":
    main()
When I run this, I don't get the expected "missing 1 required positional argument" error, even though no argument is passed to draw_menu in the curses.wrapper(draw_menu) line. Is this a curses thing? Any help is greatly appreciated.
A function is a datatype, just as much as strings, integers, and so on.
def my_function(txt):
    print(txt)
Here, type(my_function) returns <class 'function'>.
You invoke the code inside the function when you call it with parentheses: my_function('hello') prints hello.
Until then, you can perfectly well pass the function itself as an argument to another function.
And that second function can call the one you passed, giving it some parameters.
In your case, I'd guess that curses.wrapper() creates a screen interface that it passes as an argument to your draw_menu() function.
And you can probably use that screen object to build your curse app.
See this : Python function as a function argument?
There's a big difference between curses.wrapper(draw_menu) and curses.wrapper(draw_menu()). curses.wrapper(draw_menu) calls curses.wrapper and passes the function draw_menu into it as an argument. In contrast, curses.wrapper(draw_menu()) would call draw_menu and pass its return value into curses.wrapper.
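A tiny standalone example (unrelated to curses, with made-up names) makes the difference visible:

```python
def greet(name):
    return "hello " + name

def apply_to_world(func):
    # receives a function object and calls it itself
    return func("world")

print(apply_to_world(greet))    # passes the function -> 'hello world'
# apply_to_world(greet())       # would call greet() first and fail,
#                               # because greet() is missing its 'name' argument
```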
curses.wrapper will call the function you pass it. From that link:
Initialize curses and call another callable object, func, which should be the rest of your curses-using application.
E.g., it will call draw_menu when curses is completely initialized.
Here is the signature for curses.wrapper from here.
curses.wrapper(func, /, *args, **kwargs)
It says that curses.wrapper takes a function reference followed by zero or more positional and keyword arguments. Your code satisfies those requirements.
Python allows function signatures like this to enable developers a lot of flexibility regarding what can be passed in by the caller.
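Conceptually, curses.wrapper behaves roughly like the sketch below (a simplified stand-in, not the real implementation, which also initializes the terminal and restores it on exit): it creates the screen object itself and passes it to your function, which is why you never supply stdscr yourself.

```python
def fake_wrapper(func, /, *args, **kwargs):
    # simplified stand-in for curses.wrapper: set up a "screen",
    # hand it to func as the first argument, then clean up
    screen = object()  # the real wrapper builds a curses window here
    try:
        return func(screen, *args, **kwargs)
    finally:
        pass  # the real wrapper restores the terminal state here

def draw_menu(stdscr):
    # stdscr arrives courtesy of the wrapper, not the caller
    return stdscr is not None

print(fake_wrapper(draw_menu))  # True
```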
I have already found various answers to this question (e.g. lambda function accessing outside variable), and all point to the same hack, namely (e.g.) lambda n=i: n*2, with i a variable in the external scope of the lambda (hoping I'm not misusing the term scope). However, this is not working, and given that all the answers I found are a couple of years old, I thought maybe this has been deprecated and only worked with older versions of Python. Does anybody have an idea or suggestion on how to solve this?
SORRY, forgot the MWE
from inspect import getargspec
params = ['a','b']
def test(*args):
    return args[0]*args[1]

func = lambda p=params: test(p)
I expected the signature of func to be ['a','b'], but if I try
func(3,2)
I get a TypeError (TypeError: <lambda>() takes at most 1 argument (2 given)),
and its true signature (from getargspec(func)[0]) is ['p'].
In my real code the thing is more complicated. Shortly:
def fit(self, **kwargs):
    settings = self.synch()
    freepars = self.loglike.get_args()
    func = lambda p=freepars: self.loglike(p)
    minuit = Minuit(func, **settings)
I need the lambda because it's the only way I could think of to create, in place, a function object depending on a non-hardcoded list of variables (extracted via the get_args() method of the instance self.loglike). So func has to have the correct signature to match the info inside the settings dict.
The inspector gives ['p'] as the argument of func, not the list of parameters that should go into loglike. I hope you can easily spot my mistake. Thank you!
There's no way to do exactly what you want. The syntax you're trying to use to set the signature of the function you're creating doesn't do what you want. It instead sets a default value for the argument you've defined. Python's function syntax allows you to define a function that accepts an arbitrary number of arguments, but it doesn't let you define a function with argument names in a variable.
What you can do is accept *args (or **kwargs) and then do some processing on the value to match it up with a list of argument names. Here's an example where I turn positional arguments in a specific order into keyword arguments to be passed on to another function:
arg_names = ['a', 'b']
def foo(*args):
    if len(args) != len(arg_names):
        raise ValueError("wrong number of arguments passed to foo")
    args_by_name = dict(zip(arg_names, args))
    some_other_function(**args_by_name)
This example isn't terribly useful, but you could do more sophisticated processing on the args_by_name dict (e.g. combining it with another dict), which might be relevant to your actual use case.
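If the real requirement is only that introspection reports the right parameter names, another option, assuming the consuming library (a Minuit-style fitter, say) uses inspect.signature rather than the deprecated getargspec, is to attach an explicit __signature__ to the *args function:

```python
import inspect

arg_names = ['a', 'b']

def func(*args):
    # placeholder body; in the question this would call self.loglike(args)
    return args[0] * args[1]

# Advertise named parameters without hardcoding them in the def statement
func.__signature__ = inspect.Signature(
    [inspect.Parameter(name, inspect.Parameter.POSITIONAL_OR_KEYWORD)
     for name in arg_names]
)

print(inspect.signature(func))  # (a, b)
print(func(3, 2))               # 6
```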
(Python 3)
First of all, I feel my title isn't quite what it should be, so if you stick through the question and come up with a better title, please feel free to edit it.
I have recently learned about Python Decorators and Python Annotations, and so I wrote two little functions to test what I have recently learned.
One of them, called wraps, is supposed to mimic the behaviour of functools.wraps, while the other, called ensure_types, is supposed to check, for a given function and through its annotations, whether the arguments passed to that function have the correct types.
This is the code I have for those functions:
def wraps(original_func):
    """Update the decorated function with some important attributes from the
    one that was decorated so as not to lose good information"""
    def update_attrs(new_func):
        # Update the __annotations__
        for key, value in original_func.__annotations__.items():
            new_func.__annotations__[key] = value
        # Update the __dict__
        for key, value in original_func.__dict__.items():
            new_func.__dict__[key] = value
        # Copy the __name__
        new_func.__name__ = original_func.__name__
        # Copy the docstring (__doc__)
        new_func.__doc__ = original_func.__doc__
        return new_func
    return update_attrs  # return the decorator
def ensure_types(f):
    """Uses f.__annotations__ to check the expected types for the function's
    arguments. Raises a TypeError if there is no match.
    If an argument has no annotation, object is returned and so, regardless of
    the argument passed, isinstance(arg, object) evaluates to True"""
    @wraps(f)  # say that test_types is wrapping f
    def test_types(*args, **kwargs):
        # Loop through the positional args, get their name and check the type
        for i in range(len(args)):
            # f.__code__.co_varnames is a tuple with the names of the
            # arguments in the order they appear in the function def statement
            var_name = f.__code__.co_varnames[i]
            if not isinstance(args[i], f.__annotations__.get(var_name, object)):
                raise TypeError("Bad type for function argument named '{}'".format(var_name))
        # Loop through the named args, get their value and check the type
        for key in kwargs.keys():
            if not isinstance(kwargs[key], f.__annotations__.get(key, object)):
                raise TypeError("Bad type for function argument named '{}'".format(key))
        return f(*args, **kwargs)
    return test_types
Supposedly, everything is alright up to this point. Both wraps and ensure_types are meant to be used as decorators. The problem comes when I define a third decorator, debug_dec, that is supposed to print to the console whenever a function is called, along with its arguments. The function:
def debug_dec(f):
    """Does some annoying printing for debugging purposes"""
    @wraps(f)
    def profiler(*args, **kwargs):
        print("{} function called:".format(f.__name__))
        print("\tArgs: {}".format(args))
        print("\tKwargs: {}".format(kwargs))
        return f(*args, **kwargs)
    return profiler
That also works nicely. The problem comes when I try to use debug_dec and ensure_types at the same time:
@ensure_types
@debug_dec
def testing(x: str, y: str = "lol"):
    print(x)
    print(y)

testing("hahaha", 3)  # should raise a TypeError (y is not a str), but doesn't
But if I change the order with which the decorators are called, it works just fine.
Can someone please help me understand what is going wrong, and if is there any way of solving the problem besides swapping those two lines?
EDIT
If I add the lines:
print(testing.__annotations__)
print(testing.__code__.co_varnames)
The output is as follows:
#{'y': <class 'str'>, 'x': <class 'str'>}
#('args', 'kwargs', 'i', 'var_name', 'key')
Although wraps maintains the annotations, it doesn't maintain the function signature. You see this when you print out the co_varnames. Since ensure_types does its checking by comparing the names of the arguments with the names in the annotation dict, it fails to match them up, because the wrapped function has no arguments named x and y (it just accepts generic *args and **kwargs).
You could try using the decorator module, which lets you write decorators that act like functools.wrap but also preserve the function signature (including annotations).
There is probably also a way to make it work "manually", but it would be a bit of a pain. Basically, you would have wraps store the original function's argspec (the names of its arguments), then have ensure_types use this stored argspec instead of the wrapper's argspec when checking the types. Essentially, your decorators would pass the argspec along in parallel with the wrapped functions. However, using decorator is probably easier.
def _procedural_reloading(self, gen=[], *args):
    if len(gen):
        gen.pop().reload()
        Clock.schedule_interval(functools.partial(
            self._procedural_reloading, gen=gen), .5)
In the above code, _procedural_reloading() is a method of a class; it takes a list containing some images and tries to reload() them one by one.
Guess what, it doesn't work: it says that _procedural_reloading got multiple values for keyword argument gen!
The odd thing is that if I pass gen as a positional argument (not as a keyword argument), it works just fine:
def _procedural_reloading(self, gen=[], *args):
    if len(gen):
        gen.pop().reload()
        Clock.schedule_interval(functools.partial(
            self._procedural_reloading, gen), .5)
Why doesn't gen=gen work?
To elaborate: until now, I couldn't pass any keyword argument through Clock even once! I always have to arrange the arguments in order and pass them positionally... Is this a known issue, or have I done something wrong? I feel stupid!
Edit:
gen without a default value also doesn't work in my case:
def _procedural_reloading(self, gen, *args):
    if len(gen):
        gen.pop().reload()
        Clock.schedule_interval(functools.partial(
            self._procedural_reloading, gen=gen), .5)
When you create
functools.partial(self._procedural_reloading, gen=gen)
partial saves gen=gen into its stored kwargs.
Those kwargs look like {'gen': gen}; nothing has been stored for the positional arguments yet. Now look at your function definition:
def _procedural_reloading(self, gen=[], *args):
gen is the first positional argument after self. So when the caller of the partial function invokes it with a positional argument, that argument is also bound to gen, meaning you are setting gen twice (which isn't allowed). Defining a positional argument with a default before *args like that is weird and problematic (as can be seen); the solution is:
def _procedural_reloading(self, *args, **kwargs):
Now you are handling args separately from kwargs, and you can get gen with kwargs.get('gen', default).
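The clash is easy to reproduce without Kivy; the scheduler's extra positional argument (Clock's dt) collides with the keyword stored in the partial:

```python
import functools

def reload_next(gen=[], *args):
    return gen

cb = functools.partial(reload_next, gen=[1, 2])

# A scheduler like Kivy's Clock calls the callback with dt positionally:
try:
    cb(0.5)  # becomes reload_next(0.5, gen=[1, 2])
except TypeError as e:
    print(e)  # reload_next() got multiple values for argument 'gen'
```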
While I try to put @jamylak's method to work, here is my own solution to this problem:
As I had expected, the problem was due to Kivy's Clock: it passes its own parameter to the function first! Got it? No?
When Clock calls a function, it passes a parameter to it called dt.
So if you have a function and want to call it with Clock it should have at least one argument:
def clock_callback_function(dt):
    ...
In my case, I always give my functions *args so Kivy can do whatever it wants with them, but it seems Clock always overwrites the first argument of the callback function.
To wrap it up nicely, I should write my code like this:
def _procedural_reloading(self, dt=0, gen=[]):
    if len(gen):
        gen.pop().reload()
        Clock.schedule_interval(functools.partial(
            self._procedural_reloading, gen=gen), .5)
The above code works without exceptions, but the code below doesn't work, as we already know:
def _procedural_reloading(self, gen=[], dt=0):
    if len(gen):
        gen.pop().reload()
        Clock.schedule_interval(functools.partial(
            self._procedural_reloading, gen=gen), .5)
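A Kivy-free sketch confirms why the dt-first version works: the scheduler's positional dt binds to the first parameter, while gen stays safely in the partial's keywords (names mirror the snippet above; the Clock call is simulated):

```python
import functools

def procedural_reloading(dt=0, gen=[]):
    # dt absorbs the scheduler's positional argument; gen arrives by keyword
    return gen.pop() if gen else None

cb = functools.partial(procedural_reloading, gen=[1, 2])
print(cb(0.5))  # simulates Clock calling cb(dt) -> pops 2
print(cb(0.5))  # -> pops 1
```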