Python mandatory arbitrary argument list *args - python

Is there any way to define a mandatory *args (arbitrary argument list) in a method of a class?
class foo():
    def __init__(self, *args):
        ...
Given this, you can create an object from the foo class without any arguments, x=foo(), i.e. the arguments are optional. Any ideas how to change this into a non-optional or "give me at least one argument" requirement?
Another question concerns list unpacking with x=foo(*list): is there a way to recognize the argument as a list and unpack it automatically, so that you don't have to use the * in the function/method call?
Thx

*args is meant to receive all arguments that are not already taken by normal arguments.
Usually if you need an argument to be mandatory, you just use a normal argument:
>>> def foo(arg, *rest):
...     print arg, rest
...
>>> foo()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: foo() takes at least 1 argument (0 given)
If you think it is more elegant to gather all arguments in a tuple, you have to handle the error case yourself:
>>> def foo(*args):
...     if len(args) < 1:
...         raise TypeError('foo() takes at least 1 argument (%i given)' % len(args))
...
>>> foo()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 3, in foo
TypeError: foo() takes at least 1 argument (0 given)
But as you can (or should) see, the signature of that function does not tell anyone who uses it how many arguments are mandatory. You should either avoid this altogether or at least document it very well.
There are other problems as well: if you give one argument to foo() that is iterable (like a string), you will not get the intended result.
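If you do keep the bare *args signature, at minimum spell out the requirement in the docstring; a small sketch (the wording is just a suggestion):
def foo(*args):
    """Do something with the given values.

    At least one positional argument is required.
    """
    if not args:
        raise TypeError('foo() takes at least 1 argument (0 given)')
    # do stuff with args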
Responding to your comment below, your first approach was the right one: take a list.
def scrape(urls):
    for url in urls:
        do_something(url)
The caller simply has to pass a list with only one element: scrape(['localhost']).
Even better would probably be to take only one URL and let the caller iterate over a list. In that case the caller could parallelize the operations if she ever wants to.
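For illustration, a minimal sketch of that single-URL approach (scrape_one is a hypothetical name, and print stands in for the real per-URL work):
def scrape_one(url):
    # hypothetical per-URL work; print is a stand-in for do_something(url)
    print("scraping", url)

urls = ['localhost', 'example.com']
for url in urls:
    scrape_one(url)

# The caller could also parallelize, e.g. with a thread pool:
# from concurrent.futures import ThreadPoolExecutor
# with ThreadPoolExecutor() as pool:
#     list(pool.map(scrape_one, urls))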
As to your second question¹: either your function takes a list as an argument or it doesn't. Either it makes sense in your program to pass around lists or it doesn't.
I guess I'm not entirely sure what you are asking there, but then again your whole question sounds like you found a shiny new tool and now you want to use it everywhere, regardless of whether it makes sense or not.
¹ Please don't ask more than one question at once!

Either test the length of the resultant tuple, or put one or more normal arguments before it.
As for the automatic unpacking: no, there isn't.

For "give me at least one argument," just check the len() of the tuple you receive and throw an exception if you don't get at least one. Here I am using the fact that empty tuples are "falsy" to do that implicitly:
def __init__(self, *args):
    if not args:
        raise TypeError("__init__ takes at least 2 arguments (1 given)")
For "auto-unpacking," you will also need to test for this and perform it yourself. One method might be:
if len(args) == 1 and not isinstance(args[0], basestring) and iter(args[0]):
    args = args[0]
The iter() will always be true, but if what you pass it is not iterable, it will raise an exception. If you want to provide a friendlier error message, you could catch it and raise something else.
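For instance, a variant of the check above with a friendlier message (the exact wording is just a suggestion):
if len(args) == 1 and not isinstance(args[0], basestring):
    try:
        args = tuple(args[0])
    except TypeError:
        raise TypeError("expected either one iterable argument or several scalar arguments")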
Another alternative would be to write your method so that it recursively iterates over all elements of args and all subcontainers within it; then it doesn't matter.
Or, you know, just have the caller pass in an iterable to begin with, and don't mess with *args. If the caller wants to pass in a single item, there is simple syntax to turn it into a list: foo([item])

Related

How to pass argument to a python decorator [duplicate]

I am trying to write a function decorator that takes an argument and returns a decorator that can be used to control the type of the argument to a one-argument function. I expect an error to be raised if the passed argument is of the wrong type.
def typecontrol(num_type):
    def wrapper_function(num):
        if isinstance(num, num_type):
            return num
        else:
            raise TypeError
    return wrapper_function

@typecontrol(float)
def myfunc(num):
    print(num)
I expect, for example, myfunc(9.123) to print 9.123 and myfunc(9) to raise an error. But it always raises a TypeError.
typecontrol will be a function that returns the decorator, not the decorator itself. You need an extra nested function:
def typecontrol(num_type):
    def decorator(f):
        def wrapper_function(num):
            if isinstance(num, num_type):
                f(num)
            else:
                raise TypeError
        return wrapper_function
    return decorator

@typecontrol(float)
def myfunc(num):
    print(num)
The wrapper function will take care of calling the wrapped function if the typecheck passes, rather than returning the typechecked argument.
In your code as written, num is actually the function you are decorating, so you are checking whether a function is an instance of (in this case) float.
Some examples:
>>> myfunc(3.0)
3.0
>>> myfunc("foo")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "tmp.py", line 7, in wrapper_function
raise TypeError
TypeError
As others have commented, a good explanation on writing decorators with arguments is found here.
However, it seems like you want to enforce types (I've gone down a similar rabbit hole before in Python), so depending on your use case I would recommend one of two options:
If you want to make sure that your program is typed correctly prior to runtime, use the mypy static type checker.
If you need to parse/validate input values at runtime, I would highly suggest the pydantic package. It allows you to create "struct-like" objects (similar to a NamedTuple, or a dataclass) that will enforce runtime type checks and will appropriately coerce the input at runtime using Python 3.6+ type hints. Documentation and some helpful examples can be found here.
Depending on your use case, I hope one of these two helps!
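For the second option, a tiny sketch of what pydantic's runtime coercion looks like (the model and field names are made up, and the behavior assumes pydantic's default, non-strict settings):
from pydantic import BaseModel

class Point(BaseModel):
    x: float
    y: float

p = Point(x="1.5", y=2)   # the string "1.5" is coerced to the float 1.5
print(p.x, p.y)           # 1.5 2.0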

How to raise exception when function call is missing an argument in Python

If I have the function
def my_function(a,b,c):
and when the user calls the function, they omit the last argument
print(my_function(a,b))
what exception should I raise?
As others have mentioned, Python will raise a TypeError if a function is called with an incorrect number of statically declared arguments. It seems there is no practical reason to override this behavior to raise your own custom error message since Python's:
TypeError: f() takes 2 positional arguments but 3 were given
is quite telling.
However, if you want to do this, and perhaps optionally allow a second argument, you can use *args.
def my_function(a, *args):
    b = None
    if len(args) > 1:
        raise TypeError("More than 2 arguments not allowed.")
    elif args:
        b = args[0]
    # do something with a and possibly b.
Edit: The other answer suggesting a default keyword argument is more appropriate given new additional details in OP’s comment.
After discussion in the comment, it seems that what you want to do is catch an exception to pass a default argument if one was missing.
First of all, Python will already raise a TypeError if an argument is missing.
But you do not need to catch it to have default arguments since Python already provides a way to do this.
def my_function(a, b, c=0):
    pass

my_function(1, 2, 3)  # This works fine
my_function(1, 2)     # This works as well and uses 0 as the default value for c

Specifying list of arguments with minimum one argument in list

So, I have some function that takes a list of arguments with minimum one argument, but potentially an entire list of arguments. I know a couple ways of defining this method, but I'm trying to determine what the various alternatives are at my disposal and what the various advantages of each approach are.
I'm trying to figure out what possible prototypes I can use to define this argument. I've considered the following:
def func(arg_1, *args):
    arg_list = (arg_1,) + args
    # do stuff with arg_list
as well as
def func(*args):
    if len(args) == 0: raise Exception("Not enough arguments specified")
    # do stuff with args
Are there alternative ways to specify a function that takes an argument list with minimum one argument?
In Python 3 you can (keyword-only arguments have been available since 3.0, when PEP 3102 was implemented), but with the restriction that the extra arguments must be defined without a default value and must be supplied in keyword form when calling the function.
This is done by adding * as a separator and the required argument(s) after it:
def foo(*, must_specify, **kwargs): pass
Calling foo() without supplying a keyword argument of the form must_specify='specified' will result in a TypeError hinting at the requirement for that specific argument:
foo()
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-45-624891b0d01a> in <module>()
----> 1 foo()
TypeError: foo() missing 1 required keyword-only argument: 'must_specify'
Positional parameters on their own can appear before the bare *, but a positional grouping parameter of the form *args cannot be combined with it, i.e.:
def foo(pos_arg, *, must_specify, **kwargs): pass
is fine, but:
def foo(pos_arg, *args, *, must_specify, **kwargs): pass
is a SyntaxError. This is done because * is essentially the same as a 'throw-away' form of *args.
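Note that a required keyword-only argument can also follow *args directly, with no bare * separator; a small sketch:
def foo(pos_arg, *args, must_specify, **kwargs):
    return pos_arg, args, must_specify, kwargs

print(foo(1, 2, 3, must_specify='yes'))  # (1, (2, 3), 'yes', {})
# foo(1, 2, 3) -> TypeError: foo() missing 1 required keyword-only argument: 'must_specify'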
Either way, there is no other syntactic way (that I'm aware of) which can satisfy the restriction you want. If working with kwargs is too much of a hassle for you, or if you're working with Python 2 and are adamant about not switching, solutions such as those suggested in the comments are your best bet.

Should all dict params in Python be decorated by asterisks

def accept(**kwargs):
    pass
If I define accept like this and expect it to be called with a dict as the param, are the asterisks necessary for all dict params?
What if I do things like:
def accept(dict):
    pass

dict = {...}
accept(dict)
Specifically, I would like to implement an update method for a class which keeps a dict as a container. Just like the dict.update method, it takes a dict as a param and modifies the content of the container. In this specific case, should I use **kwargs or not?
** in a function's parameter list collects all keyword arguments into a dictionary.
>>> def accept(**kwargs):  # isinstance(kwargs, dict) == True
...     pass
...
Call using keyword arguments:
>>> accept(a=1, b=2)
Call using ** operator:
>>> d = {'a': 1, 'b': 2}
>>> accept(**d)
>>> accept(d)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: accept() takes exactly 0 arguments (1 given)
See Python tutorial - Keyword argument and Unpacking Argument Lists.
BTW, don't use dict as a variable name. It shadows the builtin type dict.
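A quick illustration of why that shadowing bites:
dict = {"a": 1}   # shadows the builtin dict
# dict(b=2)       # would now fail with: TypeError: 'dict' object is not callable
del dict          # remove the shadowing name to get the builtin back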
See f below. The function f has two parameters, a positional one called name and a keyword argument message. They are local variables in the frame of the function call.
When you do f("John", **{"foo": "123", "message": "Hello World"}), the ** operator unpacks the dictionary into keyword arguments: name and message get bound as usual, while an extra key like foo is only accepted if the function declares **kwargs, in which case it ends up inside the kwargs dict (otherwise the call fails).
The purpose of **kwargs (double asterisks) is to accept unknown keyword arguments.
Contrast this:
def f(name, message=None):
    if message:
        return name + message
    return name
Here I am telling the user: if you ever want to call f, you can pass a keyword argument message. This is the only kwarg I will ever accept, if there is one at all. If you try f("John", foo="Hello world") you will get a TypeError about an unexpected keyword argument.
**kwargs is useful if you don't know ahead of time what keyword arguments you want to receive (very common for dispatching to lower-level functions/methods), then you use it.
def f(name, message=None, **kwargs):
    if message:
        return name + message
    return name
In the second example, you can do f("John", **{"foo": "Hello Foo"}) while omitting message. You can also do f("John", **{"foo": "Hello Foo", "message": "Hello Message"}).
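That forwarding idea (dispatching unknown keyword arguments to a lower-level function, mentioned above) looks roughly like this sketch; greet and f2 are hypothetical names:
def greet(name, message=None, punctuation="!"):
    return (name + " " + (message or "")).strip() + punctuation

def f2(name, **kwargs):
    # pass any keyword arguments we don't handle ourselves straight through
    return greet(name, **kwargs)

print(f2("John", message="Hello", punctuation="?"))  # John Hello?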
Can I ignore it?
As you see, yes, you can ignore it. With f("John", **{"foo": "Hello Foo", "message": "Hello Message"}) I still only use message and ignore everything else.
Don't use **kwargs unless you genuinely need that kind of flexibility about the inputs.
What if my input is a dictionary?
If your function simply takes the dictionary and modifies the dictionary as a whole, NOT using individual keys, then just pass the dictionary. There is no reason to turn dictionary items into variables.
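For your specific update-method case, a minimal sketch (the class and attribute names are just placeholders):
class Container(object):
    def __init__(self):
        self._data = {}            # the internal dict container

    def update(self, new_items):
        # take a plain dict, just like dict.update does
        self._data.update(new_items)

c = Container()
c.update({"a": 1, "b": 2})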
But here are two main usages of **kwargs.
Suppose I have a class and I want to create attributes on the fly. I can use setattr to set attributes on the instance from the input.
class Foo(object):
    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            setattr(self, key, value)
If I do Foo(**{"a": 1, "b": 2}) I will get Foo.a and Foo.b at the end.
This is particularly useful when you have to deal with legacy code. However, there is a big security concern. Imagine you own a MongoDB instance, this class is a container for writing into the database, and this dict is a request form object from a user. The user can shovel ANYTHING in and you simply save it in the database like that? That's a security hole. Make sure you validate the keys (use a loop), as in the sketch below.
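A minimal sketch of that validation, assuming a hypothetical ALLOWED_FIELDS allowlist:
ALLOWED_FIELDS = {"a", "b"}        # hypothetical allowlist of accepted keys

class Foo(object):
    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            if key not in ALLOWED_FIELDS:
                raise ValueError("unexpected field: %r" % key)
            setattr(self, key, value)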
The second common usage of **kwargs is when you don't know the keyword arguments ahead of time, which I have already covered (it's actually sort of the same as the first common usage anyway).
If you want to pass a dictionary as input to a function, you can simply do it like this
def my_function1(input_dict):
    print input_dict

d = {"Month": 1, "Year": 2}
my_function1(d)  # {'Month': 1, 'Year': 2}
This is straightforward. Let's see the **kwargs method. kwargs stands for keyword arguments, so you need to actually pass the parameters as key-value pairs, like this:
def my_function2(**kwargs):
    print kwargs

my_function2(Month=1, Year=2)
But if you have a dictionary and you want to pass it as a parameter to my_function2, it can be done with unpacking, like this:
my_function2(**d)

What is the proper python way to write methods that only take a particular type?

I have a function that is supposed to take a string, append things to it where necessary, and return the result.
My natural inclination is to just return the result, which involves string concatenation, and if that fails, let the exception float up to the caller. However, this function has a default value, which I just return unmodified.
My question is: What if someone passed something unexpected to the method, and it returns something the user doesn't expect? The method should fail, but how to enforce that?
It's not necessary to do so, but if you want you can have your method raise a TypeError if you know that the object is of a type that you cannot handle. One reason to do this is to help people understand why the method call is failing and to give them some help fixing it, rather than giving them an obscure error from the internals of your function.
Some methods in the standard library do this:
>>> [] + 1
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: can only concatenate list (not "int") to list
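If you decide to check explicitly in your own function, a minimal sketch (the function name and message are just illustrative):
def append_stuff(text):
    if not isinstance(text, str):
        raise TypeError("append_stuff() expects a str, got %s" % type(text).__name__)
    return text + " some stuff"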
You can use decorators for this kind of thing, you can see an example here.
But forcing parameters to be of a specific type isn't very pythonic.
Python works under the assumption that we are all intelligent adults who read the documentation. If you still want to do it, you should not assert the actual type, but rather just catch the exception when the argument does not support the operations you need, like this:
def foo(arg):
    try:
        return arg + "asdf"
    except TypeError:
        return arg
What does the default value have to do with it? Are you saying you want to return the default value in the case where the caller doesn't pass a str? In that case:
def yourFunc( foo ):
    try:
        return foo + " some stuff"
    except TypeError:
        return "default stuff"
Space_C0wb0y has the right answer if you want to return the arg unmodified if it's not a string, and there's also the option of making an attempt to convert something to a string:
def yourFunc2( bar ):
    return str(bar) + " some stuff"
Which will work with a lot of different types.
