I'm a newbie in Python, but this is the second time I've encountered this problem.
Problem:
Some library functions take another function as an argument, like this:
def somefun(fun):
    x = [1,2,3]
    z = fun(x)
    return z
I want to pass it some other function like this:
def func(x,y):
    return x*y
which takes more than one argument. I want to fix ("make static") one of the arguments, so that somefun accepts func as its argument.
Finally, I want to run some kind of loop in which the fixed argument changes.
Something like this:
for i in xrange(1,9):
    somefun(func(i,*))
Please don't suggest changing the functions themselves. They come from a library, and it isn't practical to modify them.
Thanks a lot!
You can use a lambda expression:
somefun(lambda x: func(i, x))
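One caveat worth noting if you build these lambdas in a loop and call them later (rather than immediately, as in your example): the lambda looks up i when it is called, not when it is defined. A common way to freeze the current value is a default argument:
for i in xrange(1, 9):
    somefun(lambda x, i=i: func(i, x))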
It sure sounds like you are looking for functools.partial. From the docs:
functools.partial(func, *args, **keywords)
Return a new partial object which when called will behave like func called with the positional arguments args and keyword arguments keywords.
In your example, you could pass partial(func, 10) as the argument to somefun. Or you could create the partial objects and use them in a loop:
for i in xrange(1,9):
    somefun(partial(func, i))
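For instance, a minimal self-contained sketch (somefun and func mirror the question's definitions and just stand in for the library function; multiplying a list by an int repeats the list, which is enough to see how the binding works; use range instead of xrange on Python 3):
from functools import partial

def somefun(fun):
    x = [1, 2, 3]
    return fun(x)

def func(x, y):
    return x * y

for i in range(1, 4):
    print(somefun(partial(func, i)))
# [1, 2, 3]
# [1, 2, 3, 1, 2, 3]
# [1, 2, 3, 1, 2, 3, 1, 2, 3]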
My solution with a decorator:
from functools import wraps
import numpy as np

def p_decorate(f):
    @wraps(f)
    def wrapped(*args):
        z = f(*args)
        return z
    return wrapped

@p_decorate
def myfunc(a, b):
    """My new function"""
    z = np.dot(a, b)
    return z

x = [1, 2, 3]
y = [4, 2, 0]
r = myfunc(x, y)
print(r)
print(myfunc.__name__)
print(myfunc.__doc__)
You can change myfunc as you wish. You can also insert more function layers. Without functools.wraps in this decorator factory, you would lose the name of myfunc and its docstring.
The goal is to access any function's sub-functions (functions defined inside it). I've looked around and I'm not sure there is a way to do it. I've tried using
functions = [name for name, obj in inspect.getmembers(sys.modules[__name__], inspect.isfunction)]
which returns the functions in some module (in the above, __name__ == '__main__'), but it doesn't return any sub-functions. However, I'd like to access sub-functions that look something like
def f(x):
    def y(x):
        return x + 3
    def z(x):
        return x**2 - 1
    x += y(x)
    x += z(x)
    return x
So it seems to me like there should be some way to access them with a magic method of f or some attribute of f. I have a hard time believing that those sub functions aren't stored as some attribute of f, but I have no idea.
In the end, what I need to do is to iterate through the sub functions of some function, so I thought the solution would look something like
for subfunc in f.__method_that_returns_subfuncs__():
    if 'my_string' == subfunc.__name__:
        out = subfunc(args)
I just need to be able to compare a string to a subfunction name then call that subfunction.
Thanks
There's no implicit list of functions to iterate over. You need to define it yourself. Simple functions can be added directly to a list by defining them with lambda expressions; more complex functions will need to be defined first, then added. Examples of each:
def f(x):
    funcs = []
    def y(x):
        return x + 3
    funcs.append(y)
    funcs.append(lambda x: x**2 - 1)
    for func in funcs:
        x = func(x)
    return x
If you care about the name, you can access it via the function object's __name__ attribute.
for func in funcs:
    if func.__name__ == "some_func":
        x = func(x)
The simpler version of my problem looks like the following:
def function_1(x, function):
    print(x)
    function()

def function_2(y):
    print(2*y)

def function_3(z, n):
    print(3*z)
    print(5*n)

function_1(5, function_2)
function_1(3, function_3)
My question is: when calling function_1, how can I also pass an argument that will be forwarded to the called function (function_2 or function_3)?
Beginner here. Thanks in advance.
You can use functools.partial:
from functools import partial
...
function_1(5, partial(function_2, 5))
function_1(3, partial(function_3, 2, 3))
partial lets you pass an object to function_1 that can be called with no further parameters.
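Put together with the functions from your example, a minimal runnable sketch (the bound values 5, 2 and 3 are arbitrary):
from functools import partial

def function_1(x, function):
    print(x)
    function()

def function_2(y):
    print(2 * y)

def function_3(z, n):
    print(3 * z)
    print(5 * n)

function_1(5, partial(function_2, 5))     # prints 5, then 10
function_1(3, partial(function_3, 2, 3))  # prints 3, then 6, then 15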
As Kacper suggested, Python offers the possibility of using arbitrary positional arguments lists (args) and keyword arguments dicts (kwargs):
def fun_x(x, a_callable, *args, **kwargs):
    print(x, a_callable, args, kwargs)
    a_callable(*args, **kwargs)

def fun_y(a, b, kwarg1):
    print(a, b, kwarg1)

fun_x(3, fun_y, "first", "second", kwarg1="third")
3 <function fun_y at 0x000001D1E0E02378> ('first', 'second') {'kwarg1': 'third'}
first second third
Do you mean passing x along to the called function, like this?
def function_1(x, function):
    print(x)
    function(x)
...
If you want to pass the same argument x that function_1 receives on to the function called inside it, you can use:
def function_1(x, function):
    print(x)
    function(x)
Or, if you want to pass a separate argument y to the function called inside, you can do it like this:
def function_1(x, y, function):
    print(x)
    function(y)
Is it possible to assign a function to a variable with modified default arguments?
To make it more concrete, I'll give an example.
The following obviously doesn't work in the current form and is only meant to show what I need:
def power(a, pow=2):
    ret = 1
    for _ in range(pow):
        ret *= a
    return ret

cube = power(pow=3)
And the result of cube(5) should be 125.
functools.partial to the rescue:
Return a new partial object which when called will behave like func called with the positional arguments args and keyword arguments keywords. If more arguments are supplied to the call, they are appended to args. If additional keyword arguments are supplied, they extend and override keywords.
from functools import partial
cube = partial(power, pow=3)
Demo:
>>> from functools import partial
>>>
>>> def power(a, pow=2):
...     ret = 1
...     for _ in range(pow):
...         ret *= a
...     return ret
...
>>> cube = partial(power, pow=3)
>>>
>>> cube(5)
125
The answer using partial is good, using the standard library, but I think it's worth mentioning that the following approach is equivalent:
def cube(a):
    return power(a, pow=3)
Even though this doesn't seem like assignment because there isn't a =, it is doing much the same thing (binding a name to a function object). I think this is often more legible.
Incidentally, there's a built-in operator for exponentiation:
>>> 2**3
8
But I also solved it with a lambda function, which works much like a function pointer.
# cube = power(pow=3) # original
cube = lambda x: power(x,3)
Let me first acknowledge that what I want to do may be considered anything from silly to evil, but I want to find out if I can do it in Python anyway.
Let's say I have a function decorator that takes keyword arguments defining variables, and I want to access those variables in the wrapped function. I might do something like this:
from functools import wraps

def more_vars(**extras):
    def wrapper(f):
        @wraps(f)
        def wrapped(*args, **kwargs):
            return f(extras, *args, **kwargs)
        return wrapped
    return wrapper
Now I can do something like:
@more_vars(a='hello', b='world')
def test(deco_vars, x, y):
    print(deco_vars['a'], deco_vars['b'])
    print(x, y)

test(1, 2)
# Output:
# hello world
# 1 2
The thing I don't like about this is that when you use this decorator, you have to change the call signature of the function, adding the extra variable in addition to slapping on the decorator. Also, if you look at the help for the function, you see an extra variable that you're not expected to use when calling the function:
help(test)
# Output:
# Help on function test in module __main__:
#
# test(deco_vars, x, y)
This makes it look like the user is expected to call the function with 3 parameters, but obviously that won't work. So you'd have to also add a message to the docstring indicating that the first parameter isn't part of the interface, it's just an implementation detail and should be ignored. That's kind of crappy, though. Is there any way to do this without hanging these variables on something in the global scope? Ideally, I'd like it to look like the following:
@more_vars(a='hello', b='world')
def test(x, y):
    print(a, b)
    print(x, y)

test(1, 2)
# Output:
# hello world
# 1 2
help(test)
# Output:
# Help on function test in module __main__:
#
# test(x, y)
I am content with a Python 3 only solution if one exists.
You could do this with some trickery that inserts the variables passed to the decorator into the function's local variables:
import sys
from functools import wraps
from types import FunctionType

def is_python3():
    return sys.version_info >= (3, 0)

def more_vars(**extras):
    def wrapper(f):
        @wraps(f)
        def wrapped(*args, **kwargs):
            fn_globals = {}
            fn_globals.update(globals())
            fn_globals.update(extras)
            if is_python3():
                func_code = '__code__'
            else:
                func_code = 'func_code'
            call_fn = FunctionType(getattr(f, func_code), fn_globals)
            return call_fn(*args, **kwargs)
        return wrapped
    return wrapper

@more_vars(a="hello", b="world")
def test(x, y):
    print("locals: {}".format(locals()))
    print("x: {}".format(x))
    print("y: {}".format(y))
    print("a: {}".format(a))
    print("b: {}".format(b))

if __name__ == "__main__":
    test(1, 2)
Can you do this? Sure! Should you do this? Probably not!
EDIT: answer edited for readability. Latest answer is on top, original follows.
If I understand correctly:
- you want the new arguments to be defined as keywords in the @more_vars decorator
- you want to use them in the decorated function
- and you want them to be hidden from normal users (the exposed signature should still be the normal signature)
Have a look at the @with_partial decorator in my library makefun. It provides this functionality out of the box:
from makefun import with_partial
from makefun import with_partial

@with_partial(a='hello', b='world')
def test(a, b, x, y):
    """Here is a doc"""
    print(a, b)
    print(x, y)
It yields the expected output and the docstring is modified accordingly:
test(1, 2)
help(test)
yields
hello world
1 2
Help on function test in module <...>:
test(x, y)
<This function is equivalent to 'test(x, y, a=hello, b=world)', see original 'test' doc below.>
Here is a doc
To answer the question in your comment, the function creation strategy in makefun is exactly the same as the one in the famous decorator library: compile + exec. No magic here, but decorator has been using this trick for years in real-world applications, so it is quite solid. See def _make in the source code.
Note that the makefun library also provides a partial(f, *args, **kwargs) function if you want to create the decorator yourself for some reason (see below for inspiration).
If you wish to do this manually, here is a solution that should work as you expect. It relies on the wraps function provided by makefun to modify the exposed signature.
from inspect import signature
from makefun import wraps, remove_signature_parameters

def more_vars(**extras):
    def wrapper(f):
        # (1) capture the signature of the function to wrap and remove the invisible parameter
        func_sig = signature(f)
        new_sig = remove_signature_parameters(func_sig, 'invisible_args')
        # (2) create a wrapper with the new signature
        @wraps(f, new_sig=new_sig)
        def wrapped(*args, **kwargs):
            # inject the invisible args again
            kwargs['invisible_args'] = extras
            return f(*args, **kwargs)
        return wrapped
    return wrapper
You can test that it works:
@more_vars(a='hello', b='world')
def test(x, y, invisible_args):
    a = invisible_args['a']
    b = invisible_args['b']
    print(a, b)
    print(x, y)

test(1, 2)
help(test)
You can even make the decorator definition more compact if you use decopatch to remove the useless level of nesting:
from inspect import signature
from decopatch import function_decorator, DECORATED
from makefun import wraps, remove_signature_parameters

@function_decorator
def more_vars(f=DECORATED, **extras):
    # (1) capture the signature of the function to wrap and remove the invisible parameter
    func_sig = signature(f)
    new_sig = remove_signature_parameters(func_sig, 'invisible_args')
    # (2) create a wrapper with the new signature
    @wraps(f, new_sig=new_sig)
    def wrapped(*args, **kwargs):
        kwargs['invisible_args'] = extras
        return f(*args, **kwargs)
    return wrapped
Finally, if you would rather not depend on any external library, the most Pythonic way is to create a function factory (but then you cannot use it as a decorator):
def make_test(a, b, name=None):
    def test(x, y):
        print(a, b)
        print(x, y)
    if name is not None:
        test.__name__ = name
    return test

test = make_test(a='hello', b='world')
test2 = make_test(a='hello', b='there', name='test2')
I'm the author of makefun and decopatch by the way ;)
It sounds like your only problem is that help is showing the signature of the raw test as the signature of the wrapped function, and you don't want it to.
The only reason that's happening is that wraps (or, rather, update_wrapper, which wraps calls) explicitly copies this from the wrappee to the wrapper.
You can decide exactly what you do and don't want to copy. If what you want to do differently is simple enough, it's just a matter of filtering stuff out of the default WRAPPER_ASSIGNMENTS and WRAPPER_UPDATES. If you want to change other stuff, you may need to fork update_wrapper and use your own version—but functools is one of those modules that has a link to the source right at the top of the docs, because it's meant to be used as readable sample code.
In your case, it may just be a matter of wraps(f, updated=[]), or you may want to do something fancy, like use inspect.signature to get the signature of f, and modify it to remove the first parameter, and build a wrapper explicitly around that to fool even the inspect module.
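For example, a rough sketch of the inspect.signature idea (assuming, as in your example, that the injected dict is the first parameter of the wrapped function; both help() and inspect.signature() honor __signature__):
import inspect
from functools import wraps

def more_vars(**extras):
    def wrapper(f):
        @wraps(f)
        def wrapped(*args, **kwargs):
            return f(extras, *args, **kwargs)
        # Advertise a signature without the first (injected) parameter.
        sig = inspect.signature(f)
        wrapped.__signature__ = sig.replace(
            parameters=list(sig.parameters.values())[1:])
        return wrapped
    return wrapper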
I've found a solution to this problem, although the solution is by most standards almost certainly worse than the problem itself. With some clever rewriting of the decorated function's bytecode, you can redirect all references to variables of a given name to a new closure you can dynamically create for the function. This solution only works for the standard CPython, and I have only tested it with 3.7.
import inspect
from dis import opmap, Bytecode
from types import FunctionType, CodeType

def more_vars(**vars):
    '''Decorator to inject more variables into a function.'''
    def wrapper(f):
        code = f.__code__
        new_freevars = code.co_freevars + tuple(vars.keys())
        new_globals = [var for var in code.co_names if var not in vars.keys()]
        new_locals = [var for var in code.co_varnames if var not in vars.keys()]
        payload = b''.join(
            filtered_bytecode(f, new_freevars, new_globals, new_locals))
        new_code = CodeType(code.co_argcount,
                            code.co_kwonlyargcount,
                            len(new_locals),
                            code.co_stacksize,
                            code.co_flags & ~inspect.CO_NOFREE,
                            payload,
                            code.co_consts,
                            tuple(new_globals),
                            tuple(new_locals),
                            code.co_filename,
                            code.co_name,
                            code.co_firstlineno,
                            code.co_lnotab,
                            code.co_freevars + tuple(vars.keys()),
                            code.co_cellvars)
        closure = tuple(get_cell(v) for (k, v) in vars.items())
        return FunctionType(new_code, f.__globals__, f.__name__, f.__defaults__,
                            (f.__closure__ or ()) + closure)
    return wrapper
def get_cell(val=None):
    '''Create a closure cell object with initial value.'''
    # If you know a better way to do this, I'd like to hear it.
    x = val
    def closure():
        return x  # pragma: no cover
    return closure.__closure__[0]
def filtered_bytecode(func, freevars, globals, locals):
    '''Get the bytecode for a function with adjusted closed-over variables.

    Any references to globals or locals in the bytecode which exist in the
    freevars are modified to reference the freevars instead.
    '''
    opcode_map = {
        opmap['LOAD_FAST']: opmap['LOAD_DEREF'],
        opmap['STORE_FAST']: opmap['STORE_DEREF'],
        opmap['LOAD_GLOBAL']: opmap['LOAD_DEREF'],
        opmap['STORE_GLOBAL']: opmap['STORE_DEREF'],
    }
    freevars_map = {var: idx for (idx, var) in enumerate(freevars)}
    globals_map = {var: idx for (idx, var) in enumerate(globals)}
    locals_map = {var: idx for (idx, var) in enumerate(locals)}
    for instruction in Bytecode(func):
        if instruction.opcode not in opcode_map:
            yield bytes([instruction.opcode, instruction.arg or 0])
        elif instruction.argval in freevars_map:
            yield bytes([opcode_map[instruction.opcode],
                         freevars_map[instruction.argval]])
        elif 'GLOBAL' in instruction.opname:
            yield bytes([instruction.opcode,
                         globals_map[instruction.argval]])
        elif 'FAST' in instruction.opname:
            yield bytes([instruction.opcode,
                         locals_map[instruction.argval]])
This behaves exactly as I wanted:
In [1]: @more_vars(a='hello', b='world')
   ...: def test(x, y):
   ...:     print(a, b)
   ...:     print(x, y)
   ...:

In [2]: test(1, 2)
hello world
1 2
In [3]: help(test)
Help on function test in module __main__:
test(x, y)
This is almost certainly not ready for production use. I would be surprised if there weren't edge cases that behave unexpectedly, and possibly even segfault. I'd probably file this under the "educational curiosity" heading.
If I have to wrap an existing method, say wrapee(), from a new method, say wrapper(), and wrapee() provides default values for some arguments, how do I preserve its semantics without introducing unnecessary dependencies and maintenance? Let us say the goal is to be able to use wrapper() in place of wrapee() without having to change the client code. E.g., if wrapee() is defined as:
def wrapee(param1, param2="Some Value"):
    # Do something
Then, one way to define wrapper() is:
def wrapper(param1, param2="Some Value"):
    # Do something
    wrapee(param1, param2)
    # Do something else.
However, wrapper() has to make assumptions on the default value for param2 which I don't like. If I have the control on wrapee(), I would define it like this:
def wrapee(param1, param2=None):
    param2 = param2 or "Some Value"
    # Do something
Then, wrapper() would change to:
def wrapper(param1, param2=None):
    # Do something
    wrapee(param1, param2)
    # Do something else.
If I don't have control over how wrapee() is defined, what is the best way to define wrapper()? One option that comes to mind is to create a dict of the non-None arguments and pass it as keyword arguments, but that seems unnecessarily tedious.
Update:
The solution is to use both the list and dictionary arguments like this:
def wrapper(param1, *args, **argv):
    # Do something
    wrapee(param1, *args, **argv)
    # Do something else.
All the following calls are then valid:
wrapper('test1')
wrapper('test1', 'test2')
wrapper('test1', param2='test2')
wrapper(param2='test2', param1='test1')
Check out argument lists in the Python docs.
>>> def wrapper(param1, *stuff, **kargs):
...     print(param1)
...     print(stuff)
...     print(kargs)
...
>>> wrapper(3, 4, 5, foo=2)
3
(4, 5)
{'foo': 2}
Then to pass the args along:
wrapee(param1, *stuff, **kargs)
The *stuff is a variable number of non-named arguments, and the **kargs is a variable number of named arguments.
I'd hardly say that it isn't tedious, but the only approach that I can think of is to introspect the function that you are wrapping to determine if any of its parameters have default values. You can get the list of parameters and then determine which one is the first that has default values:
from inspect import getargspec

method_signature = getargspec(method)
param_names = method_signature[0]
default_values = method_signature[3]
params = []

# If any of method's parameters has default values, we need
# to know the index of the first one that does.
param_with_default_loc = -1
if default_values is not None and len(default_values) > 0:
    param_slice_index = len(default_values) * -1
    param_with_default = param_names[param_slice_index:][0]
    param_with_default_loc = param_names.index(param_with_default)
At that point, you can iterate over param_names, copying into the dict that is passed to wrappee. Once your index >= param_with_default_loc, you can obtain the default values by looking in the default_values list with an index of your index - param_with_default_loc.
Does that make any sense?
Of course, to make this generic, you would to define it as a wrapper function, adding yet another layer of wrapping.
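For illustration, a rough sketch of that generic wrapper built on getargspec (make_wrapper and wrappee are placeholder names, and it assumes wrappee takes only plain named parameters):
from inspect import getargspec

def make_wrapper(wrappee):
    arg_names, _, _, defaults = getargspec(wrappee)
    defaults = defaults or ()
    first_default = len(arg_names) - len(defaults)

    def wrapper(*args, **kwargs):
        # Start from the arguments the caller actually supplied...
        call_kwargs = dict(zip(arg_names, args))
        call_kwargs.update(kwargs)
        # ...and fill in wrappee's own defaults for anything left out.
        for i, name in enumerate(arg_names):
            if name not in call_kwargs and i >= first_default:
                call_kwargs[name] = defaults[i - first_default]
        # Do something before.
        result = wrappee(**call_kwargs)
        # Do something else after.
        return result

    return wrapper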
def wrapper(param1, param2=None):
    if param2:
        wrapee(param1, param2)
    else:
        wrapee(param1)
Is this what you want?
#!/usr/bin/python
from functools import wraps

def my_decorator(f):
    @wraps(f)
    def wrapper(*args, **kwds):
        print 'Calling decorated function'
        return f(*args, **kwds)
    return wrapper

def f1(x, y):
    print x, y

def f2(x, y="ok"):
    print x, y

my_decorator(f1)(1, 2)
my_decorator(f2)(1, 2)
my_decorator(f2)(1)
adapted from http://koala/doc/python2.6-doc/html/library/functools.html#module-functools