My question is regarding the following code:
def foo(*args):
    return *args  # syntax error

def bar(*args):
    return 0, *args  # ok

def foobar(*args):
    if len(args) == 1:
        return args[0]
    return args
print(bar(1,2))
print(foobar(1,2))
print(foobar(1))
>>> (0,1,2)
>>> (1,2)
>>> 1
Is there a reason why foo, rather than being invalid Python code, does not have the same behaviour as foobar? I would also be willing to accept foo producing a singleton tuple, i.e. foo(1) == (1,). Any insight into this would be appreciated!
As mentioned in the comments, the usefulness of such a construct is rather dubious, especially since you end up returning a tuple of the unpacked values anyway; so why not return the tuple itself?
If you really wanted to make this work, you could, starting with Python 3.5, say:
return (*args,)
in line with PEP 448. This unpacks the items of args into a new tuple, which is then returned.
And starting with Python 3.8, you can drop the enclosing parentheses:
Generalized iterable unpacking in yield and return statements no longer requires enclosing parentheses...
return *args,
Your bar() essentially does the same thing, using the generalized unpacking behaviour described in the linked PEP, just with a leading item in the tuple.
As for the explanation, I believe the comments above have already addressed it.
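To make the equivalence concrete, here is a small sketch (the function names are just for illustration) showing that all three spellings return the same tuple under Python 3.8+:
def plain(*args):
    return args        # the tuple itself

def unpacked(*args):
    return (*args,)    # Python 3.5+: unpack into a fresh tuple

def unparenthesized(*args):
    return *args,      # Python 3.8+: the parentheses may be dropped

print(plain(1, 2))            # (1, 2)
print(unpacked(1, 2))         # (1, 2)
print(unparenthesized(1, 2))  # (1, 2)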
I wrote the following code and got it to return (1,). Alternatively, as the comments suggest, you could simply remove the * in front of args in your original return statement.
def foo(*args):
    return *args,

def bar(*args):
    return 0, *args  # ok

def foobar(*args):
    if len(args) == 1:
        return args[0]
    return args

print(bar(1, 2))
print(foobar(1, 2))
print(foobar(1))
print(foo(1))
I am using Python 3.8.
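For reference, the four print calls above produce the following output:
(0, 1, 2)
(1, 2)
1
(1,)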
It returns the singleton tuple, as you desire. Is this what you needed?
I need a Python method to have access to self for instance variables and also be able to take any number of arguments. I basically want a method foo that can be called via
foo(a, b, c)
or
foo()
In the class, I think the method definition would be
def foo(self, *args):
Is this correct? Also, fyi, I am new to Python (if you can't tell).
You just have to add it after the self parameter:
class YourClass:

    def foo(self, *args):
        print(args)

    def bar(self, *args, **kwargs):
        print(args)
        print(kwargs)

    def baz(self, **kwargs):
        print(kwargs)
I have also added a method that takes both *args and **kwargs, and one that takes only **kwargs.
Examples
>>> o = YourClass()
>>> o.foo()
()
>>> o.foo(1)
(1,)
>>> o.foo(1, 2)
(1, 2)
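Similarly, for the methods that accept keyword arguments:
>>> o.bar(1, 2, x=3)
(1, 2)
{'x': 3}
>>> o.baz(x=3)
{'x': 3}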
def foo(self, *args):
Yes, that is correct.
You declared the method correctly. You can also use double asterisks to accept keyword arguments.
Reference: Expressions
A double asterisk ** denotes dictionary unpacking. Its operand must be a mapping. Each mapping item is added to the new dictionary. Later values replace values already set by earlier key/datum pairs and earlier dictionary unpackings.
....
An asterisk * denotes iterable unpacking. Its operand must be an iterable. The iterable is expanded into a sequence of items, which are included in the new tuple, list, or set, at the site of the unpacking.
args will be a tuple. To access the values you will have to iterate over it or index into it, e.g. args[0].
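A minimal sketch of both access patterns (the class and method names here are just for illustration):
class Example:
    def foo(self, *args):
        if args:
            print("first:", args[0])  # index into the tuple
        for value in args:            # or iterate over it
            print(value)

Example().foo(1, 2, 3)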
Is it possible to assign a function to a variable with modified default arguments?
To make it more concrete, I'll give an example.
The following obviously doesn't work in the current form and is only meant to show what I need:
def power(a, pow=2):
    ret = 1
    for _ in range(pow):
        ret *= a
    return ret

cube = power(pow=3)
And the result of cube(5) should be 125.
functools.partial to the rescue:
Return a new partial object which when called will behave like func called with the positional arguments args and keyword arguments keywords. If more arguments are supplied to the call, they are appended to args. If additional keyword arguments are supplied, they extend and override keywords.
from functools import partial
cube = partial(power, pow=3)
Demo:
>>> from functools import partial
>>>
>>> def power(a, pow=2):
...     ret = 1
...     for _ in range(pow):
...         ret *= a
...     return ret
...
>>> cube = partial(power, pow=3)
>>>
>>> cube(5)
125
The answer using partial is good, and it uses the standard library, but I think it's worth mentioning that the following approach is equivalent:
def cube(a):
    return power(a, pow=3)
Even though this doesn't seem like assignment because there isn't an =, it is doing much the same thing (binding a name to a function object). I think this is often more legible.
As an aside, Python has a built-in operator for exponentiation:
>>> 2**3
8
But I also solved it with a lambda function, which works much like a function pointer:
# cube = power(pow=3) # original
cube = lambda x: power(x,3)
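Either way, a quick check (assuming power from above is defined) gives the expected result:
>>> cube(5)
125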
I am trying to create a set of functions in Python that will all do a similar operation on a set of inputs. All of the functions have one input parameter fixed, and half of them also need a second parameter. For the sake of simplicity, below is a toy example with only two functions.
Now, in my script, I want to run the appropriate function depending on the number the user entered. Here, the user is replaced by the random call (so the minimal example works). What I want to do is something like this:
import random

def function_1(*args):
    return args[0]

def function_2(*args):
    return args[0] * args[1]

x = 10
y = 20
i = random.randint(1, 2)
f = function_1 if i == 1 else function_2
return_value = f(x, y)
And it works, but it seems messy to me. I would rather have function_1 defined as
def function_1(x):
    return x
but that will not work as easily.
Another way would be to define
def function_1(x, y):
    return x
but that leaves me with a dangling y parameter.
Is my way the "proper" way of solving my problem, or is there a better way?
There are a couple of approaches here, all of them adding more boilerplate code.
There is also this PEP, which may be of interest to you.
But the 'pythonic' way of doing it is not as elegant as the usual function overloading, due to the fact that functions are just class attributes.
So you can either go with a function like this:
def foo(*args):
and then count how many args you've got, which will be very broad but also very flexible.
Another approach is default arguments:
def foo(first, second=None, third=None):
which is less flexible but easier to predict. Lastly, you can also use:
def foo(anything):
and detect the type of anything inside your function, acting accordingly (a rough sketch of these options follows below).
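A rough sketch of the first and last options (these function bodies are my own illustration, not part of the original answer):
def foo_by_count(*args):
    # branch on how many positional arguments were supplied
    if len(args) == 1:
        return args[0]
    if len(args) == 2:
        return args[0] * args[1]
    raise TypeError("expected 1 or 2 arguments")

def foo_by_type(anything):
    # branch on the type of the single argument
    if isinstance(anything, (list, tuple)):
        return sum(anything)
    return anything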
Your monkey-patching example can work too, but it becomes more complex if you use it with class methods, and does make introspection tricky.
EDIT: Also, for your case you may want to keep the functions separate and write a single 'dispatcher' function that calls the appropriate function for you depending on the arguments, which is probably the best solution considering the above.
EDIT2: Based on your comments, I believe the following approach may work for you:
def weigh_dispatcher(*args, **kwargs):
    # decide which function to call based on args
    if 'somethingspecial' in kwargs:
        return weight2(*args, **kwargs)

def weight_prep(arg):
    # common part here
    pass

def weight1(arg1, arg2):
    weight_prep(arg1)
    # rest of the func

def weight2(arg1, arg2, arg3):
    weight_prep(arg1)
    # rest of the func
Alternatively, you can move the common part into the dispatcher, as sketched below.
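For instance, a variant of the sketch above in which the dispatcher performs the common preparation itself (again just an illustration, using the same hypothetical names):
def weigh_dispatcher(*args, **kwargs):
    weight_prep(args[0])  # common part done once, up front
    if 'somethingspecial' in kwargs:
        return weight2(*args)
    return weight1(*args)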
You may also have a function with an optional second argument:
def function_1(x, y=None):
    if y is not None:
        return x + y
    else:
        return x
Here's the sample run:
>>> function_1(3)
3
>>> function_1(3, 4)
7
Or even optional multiple arguments! Check this out:
def function_2(x, *args):
    return x + sum(args)
And the sample run:
>>> function_2(3)
3
>>> function_2(3, 4)
7
>>> function_2(3, 4, 5, 6, 7)
25
Here you can treat args like a list (it is actually a tuple):
def function_3(x, *args):
    if len(args) < 1:
        return x
    else:
        return x + sum(args)
And the sample run:
>>> function_3(1,2,3,4,5)
15
I'm trying to make a function designed to call another function multiple times:
def iterator(iterations, function, *args):
    # called as:
    # iterator(5, my_function, arg1, arg2, arg3)
Note that the number of arguments here is variable: it could be 1, could be 2, could be 10. I fill them in based on the function that is being called.
def iterator(iterations, function, *args):
    for i in range(iterations):
        temp = function(args)
    return temp
The problem here is:
TypeError: my_function() takes exactly 4 arguments (1 given)
And this is because (arg1, arg2, arg3, arg4) are being treated as a single argument.
How do I get around this?
By using the same syntax when applying the args sequence:
temp = function(*args)
The *args syntax here is closely related to the *args function parameter syntax; instead of capturing an arbitrary number of arguments, using *args in a call expands the sequence to separate arguments.
You may be interested to know that there is a **kwargs syntax too, to capture and apply keyword arguments:
def iterator(iterations, function, *args, **kwargs):
    for i in range(iterations):
        temp = function(*args, **kwargs)
    return temp
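For example (my_function here is just a stand-in for whatever callable you pass in):
def my_function(a, b, c):
    return a + b + c

print(iterator(5, my_function, 1, 2, 3))  # calls my_function five times and prints 6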
Try this, unpacking the argument list (a.k.a. splatting it):
function(*args)
From the example in the documentation, you'll see that this is what you need:
range(3, 6) # ok
range([3, 6]) # won't work
range(*[3, 6]) # it works!
I call a method of an external library multiple times in my class like this:
class MyClass:
    const_a = "a"
    const_b = True
    const_c = 1

    def push(self, pushee):
        with ExternalLibrary.open(self.const_a, self.const_b, self.const_c) as el:
            el.push(pushee)

    def pop(self):
        with ExternalLibrary.open(self.const_a, self.const_b, self.const_c) as el:
            return el.pop()
The lines containing the with statement are bugging me, because they require passing the constants as arguments every time. I would like to store the arguments in a predefined data structure, like a tuple, and pass that to the external library.
You can do this:
args = (const_a, const_b, const_c)
ExternalLibrary.open(*args)
The * syntax unpacks an iterable (tuple, list, etc.) into individual arguments in a function call. There is also a ** syntax for unpacking a dictionary into keyword arguments:
kwargs = {'foo': 1, 'bar': 2}
func(**kwargs) # same as func(foo=1, bar=2)
You can also use both in the same call, like func(*args, **kwargs).
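Applied to the class from the question, that could look something like this (a sketch; open_args is just an illustrative attribute name, and ExternalLibrary is the same external dependency as before):
class MyClass:
    # store the constants once, in the order ExternalLibrary.open expects them
    open_args = ("a", True, 1)

    def push(self, pushee):
        with ExternalLibrary.open(*self.open_args) as el:
            el.push(pushee)

    def pop(self):
        with ExternalLibrary.open(*self.open_args) as el:
            return el.pop()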