So, I have a function that takes at least one argument, but potentially a whole list of arguments. I know a couple of ways of defining such a method, but I'm trying to determine what alternatives are at my disposal and what the advantages of each approach are.
I'm trying to figure out what possible prototypes I can use to define this function. I've considered the following:
def func(arg_1, *args):
    arg_list = (arg_1,) + args
    # do stuff with arg_list
as well as
def func(*args):
    if len(args) == 0: raise Exception("Not enough arguments specified")
    # do stuff with args
Are there alternative ways to specify a function that takes an argument list with minimum one argument?
In Python 3 you can (keyword-only parameters have been available since 3.0, when PEP 3102 was introduced), with the restriction that the required arguments are defined without a default value and must be supplied in keyword form when the function is called.
This is done by adding * as a separator and the required argument(s) after it:
def foo(*, must_specify, **kwargs): pass
Calling foo() without supplying a keyword argument of the form must_specify='specified' will result in a TypeError hinting at the requirement for that specific argument:
foo()
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-45-624891b0d01a> in <module>()
----> 1 foo()
TypeError: foo() missing 1 required keyword-only argument: 'must_specify'
Ordinary positional parameters can appear before the *, but a catch-all parameter of the form *args cannot be combined with it, i.e.:
def foo(pos_arg, *, must_specify, **kwargs): pass
is fine, but:
def foo(pos_arg, *args, *, must_specify, **kwargs): pass
is a SyntaxError. That is because the bare * is essentially a 'throw-away' form of *args, so the two cannot both appear in the same signature.
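If you also need to accept extra positional arguments, note that a *args parameter by itself already makes every parameter declared after it keyword-only, so the bare * is simply not needed in that case. A minimal sketch (the names are illustrative):
def foo(pos_arg, *args, must_specify, **kwargs):
    # must_specify comes after *args, so it can only be passed by keyword
    print(pos_arg, args, must_specify, kwargs)

foo(1, 2, 3, must_specify='yes')   # prints: 1 (2, 3) yes {}
# foo(1, 2, 3)                     # TypeError: missing keyword-only argument 'must_specify'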
Either way, there is no other syntactic way (that I'm aware of) to express the restriction you want. If working with kwargs is too much of a hassle for you, or if you're working with Py2 and are reluctant to switch, solutions like those suggested in the comments are your best bet.
Related
If I have the function
def my_function(a,b,c):
and when the user calls the function, they omit the last argument
print(my_function(a,b))
what exception should I raise?
As others have mentioned, Python will raise a TypeError if a function is called with an incorrect number of statically declared arguments. It seems there is no practical reason to override this behavior to raise your own custom error message since Python's:
TypeError: f() takes 2 positional arguments but 3 were given
is quite telling.
However, if you want to do this, and perhaps optionally allow a second argument, you can use *args.
def my_function(a, *args):
    b = None
    if len(args) > 1:
        raise TypeError("More than 2 arguments not allowed.")
    elif args:
        b = args[0]
    # do something with a and possibly b.
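For illustration, a few hypothetical calls against the definition above and how they behave:
my_function(1)          # args is empty, so b stays None
my_function(1, 2)       # b becomes 2
# my_function(1, 2, 3)  # raises TypeError: More than 2 arguments not allowed.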
Edit: The other answer suggesting a default keyword argument is more appropriate given new additional details in OP’s comment.
After the discussion in the comments, it seems that what you want to do is catch an exception in order to supply a default value when an argument is missing.
First of all, Python will already raise a TypeError if an argument is missing.
But you do not need to catch it to have default arguments since Python already provides a way to do this.
def my_function(a, b, c=0):
    pass

my_function(1, 2, 3)  # This works fine
my_function(1, 2)     # This works as well and uses 0 as the default value for c
I have already found various answers to this question (e.g. "lambda function accessing outside variable"), and all point to the same hack, namely (e.g.) lambda n=i: n*2 with i a variable in the external scope of the lambda (hoping I'm not misusing the term scope). However, this is not working, and given that all the answers I found are generally from a couple of years ago, I thought that maybe this has been deprecated and only worked with older versions of Python. Does anybody have an idea or suggestion on how to solve this?
SORRY, forgot the MWE
from inspect import getargspec

params = ['a', 'b']

def test(*args):
    return args[0] * args[1]

func = lambda p=params: test(p)
I expected the signature of func to be ['a','b'] but if I try
func(3,2)
I get a TypeError (TypeError: <lambda>() takes at most 1 argument (2 given))
and its true signature (from getargspec(func)[0]) is ['p'].
In my real code the thing is more complicated. In short:
def fit(self, **kwargs):
    settings = self.synch()
    freepars = self.loglike.get_args()
    func = lambda p=freepars: self.loglike(p)
    minuit = Minuit(func, **settings)
I need a lambda because it's the only way I could think of to create, in place, a function object depending on a non-hardcoded list of variables (extracted via a method get_params() of the instance self.loglike). So func has to have the correct signature, to match the info inside the dict settings.
The inspector gives ['p'] as the argument of func, not the list of parameters which should go into loglike. Hope you can easily spot my mistake. Thank you.
There's no way to do exactly what you want. The syntax you're trying to use to set the signature of the function you're creating doesn't do what you want. It instead sets a default value for the argument you've defined. Python's function syntax allows you to define a function that accepts an arbitrary number of arguments, but it doesn't let you define a function with argument names in a variable.
What you can do is accept *args (or **kwargs) and then do some processing on the value to match it up with a list of argument names. Here's an example where I turn positional arguments in a specific order into keyword arguments to be passed on to another function:
arg_names = ['a', 'b']

def foo(*args):
    if len(args) != len(arg_names):
        raise ValueError("wrong number of arguments passed to foo")
    args_by_name = dict(zip(arg_names, args))
    some_other_function(**args_by_name)
This example isn't terribly useful, but you could do more sophisticated processing on the args_by_name dict (e.g. combining it with another dict), which might be relevant to your actual use case.
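As a quick check of how the foo above behaves (reusing the foo and arg_names defined there), here is a made-up some_other_function whose body is purely illustrative:
def some_other_function(a, b):
    # hypothetical target, defined only so the example runs
    print("called with a=%r, b=%r" % (a, b))

foo(3, 2)    # prints: called with a=3, b=2
# foo(3)     # would raise ValueError: wrong number of arguments passed to foo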
Sorry for the newbie question, guys, but I'm relatively new to Python. I want to write a function that passes keyword arguments and their values through to another function:
e.g.
def function_that_passes_arguments(arguments):
    some_other_function(arguments)
so when I call the first function they are passed into the second... e.g.
function_that_passes_arguments(arg1=1, arg2=2)
is effectively
some_other_function(arg1=1, arg2=2)
The argument names will change so it is important that I pass both keyword and value from one function to another.
Accept *args, **kwargs and pass those to the called function:
def function_that_passes_arguments(*args, **kwargs):
    some_other_function(*args, **kwargs)
In both places you can also use regular arguments - the only requirement is that the * and ** arguments are the last ones.
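For a concrete (made-up) receiving function, the forwarding looks like this:
def some_other_function(arg1, arg2):
    # hypothetical receiver, defined only for the example
    print("%s %s" % (arg1, arg2))

def function_that_passes_arguments(*args, **kwargs):
    some_other_function(*args, **kwargs)

function_that_passes_arguments(arg1=1, arg2=2)   # prints: 1 2
function_that_passes_arguments(1, arg2=2)        # mixing positional and keyword also works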
Is there any way to define a mandatory *args (arbitrary arguments) in a method of a class?
class foo():
    def __init__(self, *args):
        ....
Given this, you can create an object from the foo class without any arguments, x = foo(), i.e. the arguments are optional. Any ideas how to change it to a non-optional or "give me at least one argument" thing?
Another question concerns list unpacking with x = foo(*list): is there a way to recognize the list as a list and unpack it automatically, so that you don't have to use the * in a function/method call?
Thx
*args is meant to receive all arguments that are not already taken by normal arguments.
Usually if you need an argument to be mandatory, you just use a normal argument:
>>> def foo(arg, *rest):
...     print arg, rest
...
>>> foo()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: foo() takes at least 1 argument (0 given)
If you think it is more elegant to gather all arguments in a tuple, you have to handle the error case yourself:
>>> def foo(*args):
...     if len(args) < 1:
...         raise TypeError('foo() takes at least 1 argument (%i given)' % len(args))
...
>>> foo()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 3, in foo
TypeError: foo() takes at least 1 argument (0 given)
But as you can (or should) see, from the signature alone it is not clear to anyone using the function how many arguments are mandatory. You should either avoid this altogether or at least document it very well.
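One way to document it, sticking with the same example, is a docstring that states the requirement explicitly (a sketch, not the only option):
def foo(*args):
    """Do something with the given items.

    At least one positional argument is required; a TypeError is raised otherwise.
    """
    if len(args) < 1:
        raise TypeError('foo() takes at least 1 argument (%i given)' % len(args))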
There are other problems as well: if you give one argument to foo() that is iterable (like a string), you will not get the intended result.
Responding to your comment below, your first approach was the right one: take a list.
def scrape(urls):
    for url in urls:
        do_something(url)
The caller simply has to pass a list with only one element: scrape(['localhost']).
Even better would probably be to take only one URL and let the caller iterate over a list. In that case the caller could parallelize the operations if she ever wants to.
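A minimal sketch of that single-URL variant (do_something here is just a stand-in so the example runs):
def do_something(url):
    # stand-in for the real per-URL work
    print("scraping %s" % url)

def scrape(url):
    # handle exactly one URL; the caller decides how to loop or parallelize
    do_something(url)

for url in ['localhost', 'example.com']:
    scrape(url)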
As to your second question1: either your function takes a list as an argument or it doesn't. Either it makes sense in your program to pass around lists or it doesn't.
I guess I'm not entirely sure what you are asking there, but then again your whole question sounds like you found a shiny new tool and now you want to use it everywhere, regardless of whether it makes sense or not.
1 please don't ask more than one question at once!
Either test the length of the resultant tuple, or put one or more normal arguments before it.
No.
For "give me at least one argument," just check the len() of the tuple you receive and throw an exception if you don't get at least one. Here I am using the fact that empty tuples are "falsy" to do that implicitly:
def __init__(self, *args):
    if not args:
        raise TypeError("__init__ takes at least 2 arguments (1 given)")
For "auto-unpacking," you will also need to test for this and perform it yourself. One method might be:
if len(args) == 1 and not isinstance(args[0], basestring) and iter(args[0]):
    args = args[0]
The iter() will always be true, but if what you pass it is not iterable, it will raise an exception. If you want to provide a friendlier error message, you could catch it and raise something else.
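For instance, the iterability check could be wrapped so the error message can be customised; a sketch in the same Python 2 style as the snippet above (the wording of the message is just an example):
if len(args) == 1 and not isinstance(args[0], basestring):
    try:
        iter(args[0])
    except TypeError:
        raise TypeError("expected either one iterable or several separate arguments")
    args = args[0]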
Another alternative would be to write your method so that it recursively iterates over all elements of args and all subcontainers within it; then it doesn't matter.
Or, you know, just have the caller pass in an iterable to begin with, and don't mess with *args. If the caller wants to pass in a single item, there is simple syntax to turn it into a list: foo([item])
What is the best style for a Python method that requires the keyword argument 'required_arg':
def test_method(required_arg, *args, **kwargs):
def test_method(*args, **kwargs):
    required_arg = kwargs.pop('required_arg')
    if kwargs:
        raise ValueError('Unexpected keyword arguments: %s' % kwargs)
Or something else? I want to use this for all my methods in the future, so I'm kind of looking for the best-practice way to deal with required keyword arguments in Python methods.
The first method by far. Why duplicate something the language already provides for you?
Optional arguments in most cases should be known (only use *args and **kwargs when there is no possible way of knowing the arguments). Denote optional arguments by giving them their default value (def bar(foo = 0) or def bar(foo = None)). Watch out for the classic gotcha of def bar(foo = []) which doesn't do what you expect.
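To spell out that gotcha (a standard illustration, not tied to the question's code): the default list is created once, at function definition time, and shared across calls, so the usual fix is a None sentinel.
def bar(foo=[]):
    # the default list is created once and reused on every call
    foo.append(1)
    return foo

bar()   # [1]
bar()   # [1, 1] -- the same list again

def bar_fixed(foo=None):
    # create a fresh list per call instead
    if foo is None:
        foo = []
    foo.append(1)
    return foo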
The first method offers you the opportunity to give your required argument a meaningful name; using *args doesn't. Using *args is great when you need it, but why give up the opportunity for clearer expression of your intent?
If you don't want arbitrary keyword arguments, leave out the ** parameter. For the love of all that is holy, if you have something that is required, just make it a normal argument.
Instead of this:
def test_method(*args, **kwargs):
    required_arg = kwargs.pop('required_arg')
    if kwargs:
        raise ValueError('Unexpected keyword arguments: %s' % kwargs)
Do this:
def test_method(required_arg, *args):
    pass
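Calling that without the required argument then fails with Python's own error; the exact wording depends on the Python version, e.g.:
test_method()
# Python 3: TypeError: test_method() missing 1 required positional argument: 'required_arg'
# Python 2: TypeError: test_method() takes at least 1 argument (0 given)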