How to use *args and self in a Python method

I need a Python method to have access to self for instance variables and also be able to take any number of arguments. I basically want a method foo that can be called via
foo(a, b, c)
or
foo()
In the class, I think the constructor would be
def foo(self, *args):
Is this correct? Also, fyi, I am new to Python (if you can't tell).

You just have to add it after the self parameter:
class YourClass:
    def foo(self, *args):
        print(args)

    def bar(self, *args, **kwargs):
        print(args)
        print(kwargs)

    def baz(self, **kwargs):
        print(kwargs)
I have also added a method that takes both *args and **kwargs, and one that takes only **kwargs.
Examples
>>> o = YourClass()
>>> o.foo()
()
>>> o.foo(1)
(1,)
>>> o.foo(1, 2)
(1, 2)
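For completeness, here is a sketch of how bar and baz behave with the same instance (expected output shown):
>>> o.bar(1, 2, x=3)
(1, 2)
{'x': 3}
>>> o.baz(x=3, y=4)
{'x': 3, 'y': 4}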

def foo(self, *args):
Yes, that is correct.

You declared the method correctly. You can also use double asterisks to accept keyword arguments.
Reference: Expressions
A double asterisk ** denotes dictionary unpacking. Its operand must be a mapping. Each mapping item is added to the new dictionary. Later values replace values already set by earlier key/datum pairs and earlier dictionary unpackings.
....
An asterisk * denotes iterable unpacking. Its operand must be an iterable. The iterable is expanded into a sequence of items, which are included in the new tuple, list, or set, at the site of the unpacking.
args will be a tuple. To access the values you will have to iterate over it or index into it, e.g. args[0].
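A minimal sketch combining both ideas (the class and method names here are only illustrative):
class Greeter:
    def greet(self, *args, **kwargs):
        # args is a tuple of positional arguments; index or iterate over it
        if args:
            print("first positional argument:", args[0])
        for value in args:
            print(value)
        # kwargs is a dict of keyword arguments
        for name, value in kwargs.items():
            print(name, "=", value)

g = Greeter()
g.greet("hello", "world", tone="friendly")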

Related

Passing a dictionary to a function with unpacking argument

The snippet of code below gives me this error: TypeError: pop() argument after ** must be a mapping, not tuple.
class a():
    data = {'a': 'aaa', 'b': 'bbb', 'c': 'ccc'}
    def pop(self, key, *args):
        return self.data.pop(key, **args)

b = a()
print(b.pop('a', {'b': 'bbb'}))
But when I replace the double ** with a single *, this works fine. As per my understanding, if we are passing a dictionary, we should use a double **. In this case the second argument being passed is the dictionary {'b':'bbb'}. Then why does it throw an error in the first case but not in the second?
class a():
    data = {'a': 'aaa', 'b': 'bbb', 'c': 'ccc'}
    def pop(self, key, *args):
        return self.data.pop(key, *args)

b = a()
print(b.pop('a', {'b': 'bbb'}))
If you want a dictionary to be used as keyword arguments, you have to use the ** in the call as well:
print(b.pop('a',**{'b':'bbb'}))
But I don't think that's really what you wanted anyway.
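To see why the single * version works, here is a minimal sketch of what each call expands to (the variable names are only illustrative):
data = {'a': 'aaa', 'b': 'bbb', 'c': 'ccc'}
args = ({'b': 'bbb'},)   # what *args collects from b.pop('a', {'b': 'bbb'})

# With a single * the tuple is expanded back into one positional argument,
# so this is just data.pop('a', {'b': 'bbb'}): the dict only serves as the
# default value returned if the key were missing.
print(data.pop('a', *args))   # prints 'aaa'

# With ** Python tries to turn args into keyword arguments, but args is a
# tuple, not a mapping -- hence the TypeError from the question.
# data.pop('a', **args)       # TypeError: argument after ** must be a mapping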

Why Does Python Allow *args After Keyword Arguments?

Example:
def foo(a, b=2, *args, **kwargs): pass
Why does this not result in a SyntaxError? *args will not catch additional non-keyword arguments because it is illegal to pass them after keyword arguments.
For Python 3.x, the correct use of *args and **kwargs in this case looks like:
def foo(a, *args, b=2, **kwargs): pass
Thanks for any insights into this curious behavior.
Edit:
Thanks to Jab for pointing me to PEP 3102, which explains this behavior concisely. Check it out!
And also thanks to jsbueno for the additional excellent explanation, which I am updating as the best answer due to its thoroughness.
Given:
def foo(a, b=2, *args, **kwargs): pass
b is not a keyword-only parameter - it is just a parameter for which arguments can be positional or named, but which has a default value. With the signature you suggest, it is not possible to pass any values into args while omitting b, or to pass b out of positional order.
This signature makes sense and is quite unambiguous - you can pass from 0 to n positional arguments, but if you pass 2 or more, the second argument is assigned to "b", and end of story.
If you pass 0 positional arguments, you can still assign values to "a" or "b" as named arguments, but trying anything like foo(0, 1, 2, a=3, b=4) will fail, since it attempts to pass more than one value to each of those parameters.
Whereas in:
def foo(a, *args, b=2, **kwargs): pass
it is also an unambiguous situation: the first positional argument goes to "a", the others go to "args", and you can only pass a value to "b" as a named argument.
The new / syntax in signature definition coming with Python 3.8 gives more flexibility to this, allowing one to require that "a" and "b" are passed as positional-only arguments. Again, there is no ambiguity:
def foo(a, b=2, /, *args, **kwargs): pass
A curious thing on this new syntax: one is allowed to pass named arguments to "a" and "b", but the named arguments will come up as key/value pairs inside "kwargs" - while the local variables "a" and "b" will be assigned the positional only arguments:
def foo(a, b=2, /, *args, **kwargs):
    print(a, b, args, kwargs)

In [9]: foo(1, 2, a=3, b=4)
1 2 () {'a': 3, 'b': 4}
Whereas with the traditional syntax you ask about - def foo(a, b=2, *args, **kwargs): - one gets a TypeError if that is tried:
In [11]: foo(1,2, a=3, b=4)
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-11-d002c7717dba> in <module>
----> 1 foo(1,2, a=3, b=4)
TypeError: foo() got multiple values for argument 'a'
This was implemented in Python 3.x for multiple reasons. The best way I can answer this is to refer you to PEP 3102.
Also take a look at the New Syntax section in the Python 3.0.1 docs.
TL;DR: Named parameters occurring after *args in the parameter list must be specified using keyword syntax in the call. You can also use a bare * in the parameter list to indicate that you don't accept a variable-length argument list, but you do have keyword-only arguments.
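A short sketch of both points from that quote (the function names are made up for illustration):
def move(x, *args, speed=1, **kwargs):
    # speed is declared after *args, so it can only be passed by keyword
    print(x, args, speed, kwargs)

move(1, 2, 3, speed=5)          # prints: 1 (2, 3) 5 {}
# move(1, 2, 3, 5)              # here 5 would land in args, not in speed

def connect(host, *, port=80):
    # the bare * accepts no extra positional arguments,
    # but still makes port a keyword-only argument
    print(host, port)

connect("example.com", port=8080)
# connect("example.com", 8080)  # TypeError: too many positional arguments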

*args treated as single argument

I'm trying to make a function designed to call another function multiple times:
def iterator(iterations, function, *args):
    ...

# called as:
iterator(5, my_function, arg1, arg2, arg3)
Note that the number of arguments here is variable: it could be 1, could be 2, could be 10; they are filled in based on the function that is being called.
def iterator(iterations, function, *args):
    for i in range(iterations):
        temp = function(args)
    return temp
The problem here is:
TypeError: my_function() takes exactly 4 arguments (1 given)
And this is because (arg1, arg2, arg3, arg4) are being treated as a single argument.
How do I get around this?
By using the same syntax when applying the args sequence:
temp = function(*args)
The *args syntax here is closely related to the *args function parameter syntax; instead of capturing an arbitrary number of arguments, using *args in a call expands the sequence to separate arguments.
You may be interested to know that there is a **kwargs syntax too, to capture and apply keyword arguments:
def iterator(iterations, function, *args, **kwargs):
    for i in range(iterations):
        temp = function(*args, **kwargs)
    return temp
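For instance, with a hypothetical sample function, the corrected iterator behaves as expected:
def add(a, b, c):
    return a + b + c

print(iterator(5, add, 1, 2, 3))   # calls add(1, 2, 3) five times and prints 6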
Try this, unpacking the argument list (a.k.a. splatting it):
function(*args)
From the example in the documentation, you'll see that this is what you need:
range(3, 6) # ok
range([3, 6]) # won't work
range(*[3, 6]) # it works!

How to unpack a tuple while calling an external method in Python?

I call a method of an external library multiple times in my class like this:
class MyClass:
    const_a = "a"
    const_b = True
    const_c = 1

    def push(self, pushee):
        with ExternalLibrary.open(self.const_a, self.const_b, self.const_c) as el:
            el.push(pushee)

    def pop(self):
        with ExternalLibrary.open(self.const_a, self.const_b, self.const_c) as el:
            return el.pop()
The lines containing the with statement are bugging me, because they require passing the constants as arguments every time. I would like to store the arguments in a predefined data structure like a tuple and pass that to the external library.
You can do this:
args = (const_a, const_b, const_c)
ExternalLibrary.open(*args)
The * syntax unpacks an iterable (tuple, list, etc.) into individual arguments in a function call. There is also a ** syntax for unpacking a dictionary into keyword arguments:
kwargs = {'foo': 1, 'bar': 2}
func(**kwargs) # same as func(foo=1, bar=2)
You can also use both in the same call, like func(*args, **kwargs).
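Applied to the class from the question, a sketch might look like this (ExternalLibrary is still the external library from the question, and the attribute name open_args is made up):
class MyClass:
    open_args = ("a", True, 1)   # const_a, const_b and const_c bundled into one tuple

    def push(self, pushee):
        with ExternalLibrary.open(*self.open_args) as el:
            el.push(pushee)

    def pop(self):
        with ExternalLibrary.open(*self.open_args) as el:
            return el.pop()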

Preserving argument default values while method chaining

If I have to wrap an existing method, let us say wrapee(), from a new method, say wrapper(), and wrapee() provides default values for some arguments, how do I preserve its semantics without introducing unnecessary dependencies and maintenance? Let us say the goal is to be able to use wrapper() in place of wrapee() without having to change the client code. E.g., if wrapee() is defined as:
def wrapee(param1, param2="Some Value"):
# Do something
Then, one way to define wrapper() is:
def wrapper(param1, param2="Some Value"):
# Do something
wrapee(param1, param2)
# Do something else.
However, wrapper() has to make assumptions about the default value for param2, which I don't like. If I had control over wrapee(), I would define it like this:
def wrapee(param1, param2=None):
    param2 = param2 or "Some Value"
    # Do something
Then, wrapper() would change to:
def wrapper(param1, param2=None):
    # Do something
    wrapee(param1, param2)
    # Do something else.
If I don't have control over how wrapee() is defined, how best to define wrapper()? One option that comes to mind is to create a dict of the non-None arguments and pass it as keyword arguments, but that seems unnecessarily tedious.
Update:
The solution is to use both the list and dictionary arguments like this:
def wrapper(param1, *args, **argv):
    # Do something
    wrapee(param1, *args, **argv)
    # Do something else.
All the following calls are then valid:
wrapper('test1')
wrapper('test1', 'test2')
wrapper('test1', param2='test2')
wrapper(param2='test2', param1='test1')
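As a quick check, here is a sketch with minimal bodies filled in for wrapee() and wrapper() (the print lines are only illustrative):
def wrapee(param1, param2="Some Value"):
    print("wrapee:", param1, param2)

def wrapper(param1, *args, **argv):
    # Do something
    wrapee(param1, *args, **argv)
    # Do something else.

wrapper('test1')                          # wrapee: test1 Some Value
wrapper('test1', 'test2')                 # wrapee: test1 test2
wrapper('test1', param2='test2')          # wrapee: test1 test2
wrapper(param2='test2', param1='test1')   # wrapee: test1 test2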
Check out argument lists in the Python docs.
>>> def wrapper(param1, *stuff, **kargs):
...     print(param1)
...     print(stuff)
...     print(kargs)
...
>>> wrapper(3, 4, 5, foo=2)
3
(4, 5)
{'foo': 2}
Then to pass the args along:
wrapee(param1, *stuff, **kargs)
The *stuff is a variable number of non-named arguments, and the **kargs is a variable number of named arguments.
I'd hardly say that it isn't tedious, but the only approach that I can think of is to introspect the function that you are wrapping to determine if any of its parameters have default values. You can get the list of parameters and then determine which one is the first that has default values:
from inspect import getargspec

method_signature = getargspec(method)
param_names = method_signature[0]
default_values = method_signature[3]
params = []

# If any of method's parameters has default values, we need
# to know the index of the first one that does.
param_with_default_loc = -1
if default_values is not None and len(default_values) > 0:
    param_slice_index = len(default_values) * -1
    param_with_default = param_names[param_slice_index:][0]
    param_with_default_loc = param_names.index(param_with_default)
At that point, you can iterate over param_names, copying into the dict that is passed to wrapee. Once your index >= param_with_default_loc, you can obtain the default values by looking in the default_values list with an index of your index - param_with_default_loc.
Does that make any sense?
Of course, to make this generic, you would have to define it as a wrapper function, adding yet another layer of wrapping.
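On newer Python versions (where getargspec has been removed), inspect.signature offers a similar way to read the wrapped function's defaults; this is a minimal sketch, not part of the original answer:
import inspect

def wrapee(param1, param2="Some Value"):
    print(param1, param2)

signature = inspect.signature(wrapee)
defaults = {
    name: parameter.default
    for name, parameter in signature.parameters.items()
    if parameter.default is not inspect.Parameter.empty
}
print(defaults)   # {'param2': 'Some Value'}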
def wrapper(param1, param2=None):
    if param2:
        wrapee(param1, param2)
    else:
        wrapee(param1)
Is this what you want?
#!/usr/bin/python
from functools import wraps

def my_decorator(f):
    @wraps(f)
    def wrapper(*args, **kwds):
        print('Calling decorated function')
        return f(*args, **kwds)
    return wrapper

def f1(x, y):
    print(x, y)

def f2(x, y="ok"):
    print(x, y)

my_decorator(f1)(1, 2)
my_decorator(f2)(1, 2)
my_decorator(f2)(1)
Adapted from the functools documentation: https://docs.python.org/3/library/functools.html#module-functools
