Is there a way to define a function which knows how many variables to return based on how many outputs the user expects?
Let me illustrate the idea. Assume the following function:
def function():
    a = 1
    b = 2
    c = 3
    return a, b, c
Then, I would like my function to behave like:
>>> x = function()
>>> x
1
>>> x, y = function()
>>> x
1
>>> y
2
>>> x, y, z = function()
>>> x
1
>>> y
2
>>> z
3
Is there a function or concept in Python that can help me achieve that? Maybe decorators?
Any idea is welcome!
PS: My level of Python is still very basic.
EDIT:
I am currently moving from IDL to Python, so I am missing the nice IDL feature where you can choose to return as many variables as desired by doing something like:
FUNCTION function, a=a, b=b, c=c
a=1
b=2
c=3
RETURN, a
And then you can simply ask for what you want to get back:
IDL> x=function()
IDL> print, x
1
IDL> x=function(y=b)
IDL> print, x
1
IDL> print, y
2
IDL> x=function(y=b, z=c)
IDL> print, x
1
IDL> print, y
2
IDL> print, c
3
Instead of returning different values, you could call it differently:
x, *_ = function()
x, y, *_ = function()
x, y, z, *_ = function() # *_ is optional here if it only returns 3 things
Doing this assigns all unused returned values to the _ variable, so if you want them to get gc'd early, you have to del _.
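For instance, a minimal sketch with the function above (Python 3, since extended unpacking needs it):

def function():
    return 1, 2, 3

x, *_ = function()        # x == 1, _ == [2, 3]
x, y, *_ = function()     # x == 1, y == 2, _ == [3]
x, y, z, *_ = function()  # x == 1, y == 2, z == 3, _ == []
del _                     # drop the leftovers if you want them collected early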
You can only ever return a single object. Note that your function is actually returning one tuple. However, Python supports syntax that lets you unpack it flexibly:
x,_,_ = function()
Or
x,y,_ = function()
Or even with extended unpacking:
x, *_ = function()
Note, using _ as a throwaway variable is merely a convention.
Short of insane tricks like examining your caller’s bytecode to see how many values it’s expecting, this is impossible: the interpreter will not accept any mismatched number of values (it checks for and traps extras).
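For example, plain unpacking rejects any mismatch at runtime (the error messages below are from Python 3):

>>> x, y = (1, 2, 3)         # three values, two targets
ValueError: too many values to unpack (expected 2)
>>> x, y, z, w = (1, 2, 3)   # three values, four targets
ValueError: not enough values to unpack (expected 4, got 3)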
While not as Pythonic as unpacking, you can create a decorator and specify the number of returned values that are allowed:
def filter_results(val):
    def outer(f):
        def wrapper(*args, **kwargs):
            v = f(*args, **kwargs)[:val]
            return v[0] if len(v) == 1 else v
        return wrapper
    return outer

@filter_results(1)
def function():
    a = 1
    b = 2
    c = 3
    return a, b, c
r = function()
Output:
1
With two results allowed:
@filter_results(2)
def function():
    ...
x, y = function()
Lastly:
@filter_results(3)
def function():
    ...
x, y, z = function()
If I write this:
c = []
def cf(n):
    c = range(5)
    print c
    if any((i > 3) for i in c) is True:
        print 'hello'
cf(1)
print c
Then I get:
[0, 1, 2, 3, 4]
hello
[]
I'm really new to programming, so please explain it really simply, but how do I stop Python from forgetting what c is after the function has ended? I thought I could fix it by defining c before the function, but obviously that c is different to the one created just for the function loop.
In my example, I could obviously just write:
c = range(5)
def cf(n):
But the program I'm trying to write is more like this:
b = [blah]
c = []

def cf(n):
    c = [transformation of b]
    if (blah) is True:
        'loop' cf
    else:
        cf(1)

g = [transformation of c that produces errors if c is empty or if c = b]
So I can't define c outside the function.
In Python you can read global variables inside functions, but you can't assign to them by default. The reason is that whenever Python finds an assignment like c = ... inside a function, it creates a local variable. So to assign to the global one, you need to explicitly state that you are assigning to the global variable.
So this will work, e.g.:
c = [1,2,3]
def cf():
    print(c)  # it prints [1,2,3]; it reads the global c
However, this does not work as you would expect:
c = [1,2,3]
def cf():
    c = 1  # c is local here.
    print(c)  # it prints 1

cf()
print(c)  # it prints [1,2,3], as its value was not changed inside cf()
So to change the global c as well, you need:
c = [1,2,3]
def cf():
    global c
    c = 1  # c is global here. it overwrites [1,2,3]
    print(c)  # prints 1

cf()
print(c)  # prints 1. c's value was changed inside cf()
To summarise a few of these answers, you have 3 basic options (each is sketched briefly after the list):
Declare the variable as global at the top of your function
Return the local instance of the variable at the end of your function
Pass the variable as an argument to your function
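A minimal sketch of all three options, using a simplified cf that just builds a list (the names here are illustrative, not from the original code):

# Option 1: declare the name global inside the function
c = []
def cf_global(n):
    global c
    c = list(range(n))

# Option 2: return the local value and assign it at the call site
def cf_return(n):
    return list(range(n))

# Option 3: pass the list in and mutate it (no rebinding with =)
def cf_mutate(n, c):
    c.extend(range(n))

cf_global(5)      # the module-level c is now [0, 1, 2, 3, 4]
c = cf_return(5)  # the caller decides which name gets the result
cf_mutate(5, c)   # the same list object grows in place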
You can also pass the list c into the function after declaring it. Because the function receives a reference to the same list object, changes made inside the function are visible to the caller, as long as we mutate the list (e.g. with extend) rather than rebinding it with an = statement. This can be achieved like this:
def cf(n, c):
    c.extend(range(5))
    print c
    if any((i > 3) for i in c) is True:
        print 'hello'

if __name__ == '__main__':
    c = []
    cf(1, c)
    print c
This is preferable to introducing global variables into your code (which is generally considered bad practice).
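To make the "= statement" caveat concrete, here is a small sketch (the function names are made up for illustration):

def grow_in_place(c):
    c.extend(range(5))    # mutates the caller's list

def rebind_locally(c):
    c = list(range(5))    # rebinds only the local name; the caller's list is untouched

c = []
grow_in_place(c)
print(c)   # [0, 1, 2, 3, 4]

c = []
rebind_locally(c)
print(c)   # still []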
Try this
c = []
def cf(n):
    global c
    c = range(5)
    print c
    if any((i > 3) for i in c) is True:
        print 'hello'
cf(1)
print c
If you want your function to modify c then make it explicit, i.e. your function should return the new value of c. This way you avoid unwanted side effects:
def cf(n, b):
    """Given b, loops n times ...

    Returns
    -------
    c : the modified value
    """
    c = [transformation of b]
    ...
    return c  # <<<<------- This

c = cf(1, b)
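Filled in with a concrete (made-up) transformation so it actually runs:

def cf(n, b):
    """Return a new list derived from b (here: each element plus n)."""
    c = [x + n for x in b]
    return c

b = [1, 2, 3]
c = cf(1, b)
print(c)   # [2, 3, 4]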
I want to make a function that does the following:
def func(*args):
    for arg in args:
        arg += 1

a = 5
b = 6
c = 7
func(a, b, c)
print("%i,%i,%i" % (a, b, c))
I want it to return:
6,7,8
How would I do this?
You can’t! Ta-da.
Python does not support pass-by-reference in any form. Return values instead:
def func(*args):
    return [arg + 1 for arg in args]
a = 5
b = 6
c = 7
a, b, c = func(a, b, c)
print("%i,%i,%i" % (a, b, c))
You can't do this, because when you pass a variable in, the function receives only its value, not the variable itself.
Instead, return the value:
def func(*args):
    args = list(args)
    for i in range(len(args)):
        args[i] += 1
    return args
a = 5
b = 6
c = 7
a, b, c = func(a,b,c)
print("%i,%i,%i"%(a,b,c))
Which outputs:
>>> print("%i,%i,%i"%(a,b,c))
6,7,8
>>>
You can't, at least not with integer values. Integers are immutable, so you can't change their values, and a function doesn't have access to the namespace of its caller, so you can't rebind the variables (i.e., assign a new value to the variable a outside the function). See this question and various others about what you can and cannot do to affect variables in functions.
If your variables are mutable types like lists, you can achieve a similar effect by mutating the list's value:
def func(*args):
    for arg in args:
        arg[0] += 1
a = [5]
b = [6]
c = [7]
func(a,b,c)
print("%i,%i,%i"%(a,b,c))
However, you should think about why you want to do this. It may be better to simply return the values and assign them outside the function.
3 posts to tell you "You can't". But "Impossible n'est pas français" ("impossible isn't French", i.e. nothing is impossible).
Python is the lingua franca of programming languages.
So it's possible:
#!/usr/bin/env python
def func(args):
    for i in range(len(args)):
        args[i] += 1
abc = [5, 6, 7]
func(abc)
print("%i,%i,%i" % tuple(abc))
actually prints
6,7,8
You can't do it easily because Python doesn't pass immutable objects such as integers by reference. However if you pass the function the names of objects in the current scope, you can achieve your goal like this:
import sys
def func(*args):
    namespace = sys._getframe(1).f_globals  # caller's globals
    for arg in args:
        namespace[arg] += 1
a = 5
b = 6
c = 7
func('a','b','c') # note variable *names* passed to function
print("%i,%i,%i" % (a,b,c)) # -> 6,7,8
I have a function, and when it is called, I'd like to know what the return value is going to be assigned to - specifically when it is unpacked as a tuple. So:
a = func() # n = 1
vs.
a, b, c = func() # n = 3
I want to use the value of n in func. There must be some magic with inspect or _getframe that lets me do this. Any ideas?
Disclaimer (because this seems to be necessary nowadays): I know this is funky, and bad practice, and shouldn't be used in production code. It actually looks like something I'd expect in Perl. I'm not looking for a different way to solve my supposed "actual" problem, but I'm curious how to achieve what I asked for above. One cool usage of this trick would be:
ONE, TWO, THREE = count()
ONE, TWO, THREE, FOUR = count()
with
def count():
    n = get_return_count()
    if not n:
        return
    return range(n)
Adapted from http://code.activestate.com/recipes/284742-finding-out-the-number-of-values-the-caller-is-exp/:
import inspect
import dis

def expecting(offset=0):
    """Return how many values the caller is expecting."""
    # Note: this indexes co_code as a byte string, which matches CPython 2;
    # on Python 3 the ord() calls and the instruction offsets would differ.
    f = inspect.currentframe().f_back.f_back
    i = f.f_lasti + offset
    bytecode = f.f_code.co_code
    instruction = ord(bytecode[i])
    if instruction == dis.opmap['UNPACK_SEQUENCE']:
        return ord(bytecode[i + 1])
    elif instruction == dis.opmap['POP_TOP']:
        return 0
    else:
        return 1

def count():
    # offset = 3 bytecodes from the call op to the unpack op
    return range(expecting(offset=3))
Or as an object that can detect when it is unpacked:
class count(object):
    def __iter__(self):
        # offset = 0 because we are at the unpack op
        return iter(range(expecting(offset=0)))
There is little magic about how Python does this.
Simply put, if you use more than one target name on the left-hand side, the right-hand expression must return a sequence of matching length.
Functions that return more than one value really just return one tuple. That is a standard Python structure, a sequence of a certain length. You can measure that length:
retval = func()
print len(retval)
Assignment unpacking is determined at compile time, you cannot dynamically add more arguments on the left-hand side to suit the function you are calling.
Python 3 lets you use a splat syntax, a wildcard, for capturing the remainder of an unpacked assignment:
a, b, *c = func()
c will now be a list with any remaining values beyond the first 2:
>>> def func(*a): return a
...
>>> a, b, *c = func(1, 2)
>>> a, b, c
(1, 2, [])
>>> a, b, *c = func(1, 2, 3)
>>> a, b, c
(1, 2, [3])
>>> a, b, *c = func(1)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ValueError: need more than 1 value to unpack
So if I have a function which takes loads of named arguments:
def foo(a=1, b=2, c=3, d=4, e=5):  # etc...
    pass
and I'm calling it with all the arguments having exactly the same names as in the definition:
a = 0
b = 0
c = 0
d = 0
e = 0
is there a way to avoid doing this?
foo(e = e, b = b, d = d, a = a, c = c)
and just do this:
foo(e, b, d, a, c)
?
I guess I can do this:
foo(a, b, c, d, e)
but what if the arguments have complicated names and I can't remember the order of them by heart?
Well, you could do something like:
def foo(a, b, c, d):
    print a, b, c, d
d = 4
b = 2
c = 3
a = 1
import inspect
foo(*[locals().get(arg, None) for arg in inspect.getargspec(foo).args])
but I'm not sure I can recommend this... In practice I'd use a dictionary of arguments:
foo_params = {
    'd': 4,
    'b': 2,
    'c': 3,
    'a': 1,
}
foo(**foo_params)
or write a wrapper for foo which uses fewer arguments.
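A wrapper can be as small as functools.partial, pre-filling the arguments you rarely change (the values below are just examples, and foo here is a stand-in that prints its arguments):

from functools import partial

def foo(a=1, b=2, c=3, d=4, e=5):
    print(a, b, c, d, e)

# pre-fill the rarely changing arguments by keyword
foo_simple = partial(foo, a=10, d=40, e=50)

foo_simple(b=20, c=30)   # prints 10 20 30 40 50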
Python's argument passing mechanisms are extremely flexible. If they're not flexible enough, this seems like a design smell to me ...
possible smell: too many arguments to a function. Solutions: split into multiple functions, or pass some args together in a dictionary or object (see the sketch after this list).
possible smell: bad variable names. Solution: give variables more descriptive names.
Or just bite the bullet, figure out the correct order, and keep it simple.
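For example, a sketch of "pass some args together in an object" using a namedtuple (the names and fields are invented for illustration):

from collections import namedtuple

PlotStyle = namedtuple('PlotStyle', ['color', 'linewidth', 'marker'])

def draw(data, style):
    # one grouped argument instead of three loosely related ones
    print(data, style.color, style.linewidth, style.marker)

style = PlotStyle(color='red', linewidth=2, marker='o')
draw([1, 2, 3], style)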
If changing the function is not an option for you, but you have the liberty to change how you assign values to the parameters being passed, here is some example code that might be helpful. It builds a dict of keyword arguments, so the order in which you set them does not matter.
Given
>>> def foo(a = 1, b = 2, c = 3, d = 4, e = 5):
...     print "a={0},b={1},c={2},d={3},e={4}".format(a,b,c,d,e)
then you can do
>>> var=dict()
>>> var['c']=12
>>> var['a']=10
>>> var['b']=11
>>> var['e']=14
>>> foo(**var)
a=10,b=11,c=12,d=4,e=14
Note, this answer is similar to what was proposed by @thg435, but here you are:
Not using inspect to hack the arguments a function expects.
Not looking through the local/global dictionary.
Supporting missing arguments, which fall back to the function's default values.
And of course you do not have to remember the order.
And you don't even have to pass the variables as parameters. Just pass the dictionary.
You can do the following:
def func(a=1, b=2, c=3, **kw):
    print a, b, c
a = 11
b = 22
c = 33
func(**locals())
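The **kw catch-all is doing real work here: locals() hands every local name to func as a keyword argument, and any name func does not declare as a parameter has to land in **kw, otherwise the call raises TypeError. A sketch of that failure mode (caller and the extra local d are invented for illustration):

def strict(a=1, b=2, c=3):
    print(a, b, c)

def caller():
    a, b, c, d = 11, 22, 33, 44
    strict(**locals())   # TypeError: unexpected keyword argument 'd'

def caller_ok():
    a, b, c = 11, 22, 33
    strict(**locals())   # works: every local matches a parameter

caller_ok()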
Calling a 5-argument function with a completely different set of arguments each time is pretty rare. If, in practice, you're using the same a, c, and e args most of the time and calling with different b and d args (for example), you can create a class to help you with this:
class FooWrapper(object):
    def __init__(self, commonA, commonC, commonE):
        self.a = commonA
        self.c = commonC
        self.e = commonE

    def invokeFoo(self, _b, _d):
        foo(a=self.a, b=_b, c=self.c, d=_d, e=self.e)

w = FooWrapper(1, 2, 3)
w.invokeFoo(4, 5)  # calls foo(1, 4, 2, 5, 3)
w.invokeFoo(6, 7)  # calls foo(1, 6, 2, 7, 3)