Decorator that changes a certain argument of a function - python

While working on a new version of a library, I am changing the default value of an argument in several functions. So I'd like to add a temporary warning that is issued when the user calls a function without explicitly specifying that parameter (i.e. the function runs with its default).
This could easily be done by adding a warning function and calling it inside each of the base functions:
import sys

def warning(formatting):
    if formatting is None:
        sys.stderr.write("WARNING: The default format has changed to new_format\n")
        return 'new_format'
    return formatting

def my_function(arg1, arg2, formatting=None):
    formatting = warning(formatting)
    ...  # the rest of the function code
However, it would be more convenient (and more readable) to do this with a decorator, so I implemented something like this:
def check_default_format(fun):
    def warning(*a, **kw):
        if 'formatting' not in kw or kw['formatting'] is None:
            kw['formatting'] = 'new_format'
            sys.stderr.write("WARNING: The default format has changed to new_format\n")
        return fun(*a, **kw)
    return warning

@check_default_format
def my_function(arg1, arg2, formatting=None):
    ...  # the function code
This works as expected when I call my_function without the formatting parameter, and when formatting is passed as a keyword argument.
But how do I handle the case where my_function is called with only positional arguments? Calling the decorated my_function('arg1', 'arg2', 'some_format') raises a TypeError, because formatting is then received both positionally and as a keyword argument.
Note: I cannot assume that formatting is always the 3rd parameter, as I need to decorate several different functions. I also cannot change the parameter order, to preserve backward compatibility.
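For concreteness, here is a minimal reproduction of the failure described above (the function bodies are stubs):

```python
import sys

def check_default_format(fun):
    def warning(*a, **kw):
        # kw.get covers both "not passed" and "passed as None"
        if kw.get('formatting') is None:
            kw['formatting'] = 'new_format'
            sys.stderr.write("WARNING: The default format has changed to new_format\n")
        return fun(*a, **kw)
    return warning

@check_default_format
def my_function(arg1, arg2, formatting=None):
    return formatting  # stub body

print(my_function('arg1', 'arg2'))  # new_format
try:
    my_function('arg1', 'arg2', 'some_format')
except TypeError as exc:
    print(exc)  # ... got multiple values for argument 'formatting'
```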

In Python 3, you can use the inspect module's Signature.bind_partial to map the actual arguments, positional or keyword, to parameter names before deciding whether formatting was supplied:

import inspect
import sys
from functools import wraps

def check_default_format(fun):
    sig = inspect.signature(fun)  # computed once, outside the wrapper
    @wraps(fun)
    def wrapper(*a, **kw):
        bound = sig.bind_partial(*a, **kw)
        if bound.arguments.get('formatting') is None:
            bound.arguments['formatting'] = 'new_format'
            sys.stderr.write("WARNING: The default format has changed to new_format\n")
        return fun(*bound.args, **bound.kwargs)
    return wrapper

Because bind_partial resolves positional arguments to their parameter names, the wrapper works no matter how formatting is passed; and modifying bound.arguments (rather than kw directly) avoids passing the parameter twice.
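A self-contained variant of this idea (redeclared here with a stub my_function so the snippet runs on its own; modifying bound.arguments rather than kw sidesteps the duplicate-argument TypeError):

```python
import inspect
import sys
from functools import wraps

def check_default_format(fun):
    sig = inspect.signature(fun)  # computed once, outside the wrapper
    @wraps(fun)
    def wrapper(*a, **kw):
        # bind_partial maps positional and keyword arguments to parameter names
        bound = sig.bind_partial(*a, **kw)
        if bound.arguments.get('formatting') is None:
            bound.arguments['formatting'] = 'new_format'
            sys.stderr.write("WARNING: The default format has changed to new_format\n")
        return fun(*bound.args, **bound.kwargs)
    return wrapper

@check_default_format
def my_function(arg1, arg2, formatting=None):
    return formatting  # stub body; real code would use `formatting`

print(my_function('a', 'b'))                  # new_format (warns)
print(my_function('a', 'b', 'some_format'))   # some_format -- positional call works
print(my_function('a', 'b', formatting='x'))  # x
```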

Parameter 'from' in python3: formal parameter name expected [duplicate]

I am using the Click library, but I can't seem to find behaviour similar to dest from argparse.
For example, I have:

@click.option('--format', type=click.Choice(['t', 'j']))
def plug(format):
    pass

Notice that the --format flag gets translated into an argument named format, which shadows the Python built-in format. That is not ideal.
Is there a way to change the name of the argument that Click passes into the function for an option?
While Click doesn't have a dest equivalent of argparse, it has certain argument-naming behaviour which can be exploited. Specifically, when a parameter has multiple possible names, Click prefers non-dashed names over dashed ones, and as a secondary preference it prioritizes longer names over shorter ones.
URL: http://click.pocoo.org/dev/parameters/#parameter-names
So if you declare your option as...

@click.option('--format', 'not-format', type=click.Choice(['t', 'j']))

...then Click will prioritize the non-dashed variant ('not-format') and call your function with a not_format=... argument.
Of course, it also means that this alternative spelling can be used on the command line. If that is not desired, you could instead add a decorator that renames keyword arguments:

import functools

def rename_kwargs(**replacements):
    def actual_decorator(func):
        @functools.wraps(func)
        def decorated_func(*args, **kwargs):
            for internal_arg, external_arg in replacements.items():
                if external_arg in kwargs:
                    kwargs[internal_arg] = kwargs.pop(external_arg)
            return func(*args, **kwargs)
        return decorated_func
    return actual_decorator
Testing code:

if __name__ == '__main__':
    @rename_kwargs(different_arg='format')
    def tester(different_arg):
        print(different_arg)

    tester(format='test value')

Test output:

$ python test_decor.py
test value
In your case, it would look like:

@click.option('--format', type=click.Choice(['t', 'j']))
@rename_kwargs(not_format='format')
def plug(not_format):
    pass
Renaming an option to a differently named function argument is possible by declaring the option with an explicit name:

@click.option('--format', '-f', 'format_arg_name')
def plug(format_arg_name):
    print(format_arg_name)

Click will then map the option named --format to the format_arg_name parameter.
format_arg_name itself is not available as a command-line option; only --format and -f are.

In Python 3.x how do I create a simple proxy function for another function that is a pure pass through?

In Python 3.x, I want to create a proxy function _proxy for a specific known function proxiedFunc, and guarantee that all arguments passed are "forwarded" exactly as if they had been passed directly to proxiedFunc.

# Pseudo-Python code
def _proxy(???generic_parameters???):
    return proxiedFunc(???arguments???)

What I mean by "pure pass-through": the implementation of _proxy should not be affected by (non-)compatible changes to proxiedFunc, assuming the name of the function doesn't change (SOLID principles). Obviously, callers of _proxy would need to be modified if proxiedFunc is changed incompatibly (i.e. I'm not intending for _proxy to be an adapter, though that would be a possibility).
The generic way of accepting "anything" in a function definition is *args, **kwargs.
The same syntax is used to pass those arguments along when calling another function:

def _proxy(*args, **kwargs):
    return proxiedFunc(*args, **kwargs)

The single star (e.g. *args) captures the positional arguments, and the double star (e.g. **kwargs) captures the keyword arguments.
args and kwargs are just the names you give to those argument containers. By convention, the "any-positional-args" parameter is named args or a (its type is tuple), and the "any-keyword-args" parameter is named kwargs or kw (its type is dict).
I, too, wanted a way to do this, so I wrote a function for it and posted it on GitHub: https://github.com/make-itrain/function-proxy
Basically, what it does is:

def a(p1, p2, p3=3, *args, p4=228, p18=11, **kwargs):
    # any function, doesn't matter
    pass

def b(p4, p2, *args):
    # some overlapping arguments
    # what's inside doesn't matter either
    print(p4, p2, args)

args, kwargs = proxy_function(b, a, {"args": ("replaced args",)}, 322, "some_random_arg", 1337,
                              "arguments", ('anything',), 1, abc=338, cbd="here too?")
b(*args, **kwargs)

This prints 228 some_random_arg ('replaced args',). Cool, isn't it?

Python decorator variable access

I have a decorator function my_fun(I, k), and it is applied to a function add(x, y) like this:

@my_fun(4, 5)
def add(x, y): return x + y

I am new to Python and, as I write the my_fun function, I would like to know:
How can I access x and y (the arguments of add) inside my_fun?
How can I access the return value of add in the decorator function?
I am a little confused about the syntax and concepts; any explanation would help.
A decorator consists of the decorator function and a function wrapper (and, if you want additional arguments for the decorator, another outer layer of function around it):

# Takes the arguments for the decorator and makes them accessible inside
def my_fun(decorator_argument1, decorator_argument2):
    # Takes the function so that it can be wrapped.
    def wrapfunc(func):
        # Here we are actually going to wrap the function ... finally
        def wrapper(*args, **kwargs):
            # Call the function with the args and kwargs
            res = func(*args, **kwargs)
            # return this result
            return res
        # Replace the decorated function with the wrapper
        return wrapper
    # Return the decorator
    return wrapfunc
In your case, if you only want to use the decorator with this one function, you don't need to bother with *args, **kwargs and can replace the wrapper with:

def wrapper(x, y):
    # Here you can do stuff with x and y, e.g. print(x)
    # Call the function with x and y
    res = func(x, y)
    # Here you can do stuff with the result, e.g. res = res * decorator_argument1
    return res

I indicated the places where you can access x, y and the result.
If you want to predefine values for x and y, a custom decorator is not the best way. You could use default values:

def add(x=4, y=5): return x + y

add()       # returns 9
add(2)      # returns 7
add(5, 10)  # returns 15
or, if you want to fix an argument of an existing function, you should use functools.partial.
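For instance, functools.partial can pin either argument of add:

```python
from functools import partial

def add(x, y):
    return x + y

add5 = partial(add, 5)      # fixes the first positional argument: x = 5
print(add5(10))             # 15
add_y1 = partial(add, y=1)  # fixes y as a keyword argument
print(add_y1(4))            # 5
```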
If you're passing arguments to the decorator with @my_fun(4, 5), you need three levels of nested functions to implement the decorator in the simplest way. The outer level is the "decorator factory"; it returns the middle-level function, the decorator. The decorator gets called with the function it's decorating as its argument and returns the innermost nested function, the wrapper. The wrapper function is the one that actually gets called by the user.

def decorator_factory(deco_arg, deco_arg2): # name this whatever you want to use with @ syntax
    def decorator(func):
        def wrapper(func_arg, func_arg2):
            # This is a closure!
            # In here you can write code using the arguments from the enclosing scopes, e.g.:
            return func(func_arg*deco_arg, func_arg2*deco_arg2) # uses args from all levels
        return wrapper
    return decorator

The inner functions here are closures. They can see the variables in the scope surrounding the place they were defined, even after the functions those scopes belonged to have finished running.
(Note: if you want your decorator to be able to decorate many different functions, you may want the wrapper function to accept *args and **kwargs and pass them along to func. The example above only works for functions that accept exactly two arguments. A limitation like that may be perfectly reasonable for some uses, but not always.)
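For example, using the factory above to scale both arguments before calling the decorated function (restated so the snippet runs on its own):

```python
def decorator_factory(deco_arg, deco_arg2):
    def decorator(func):
        def wrapper(func_arg, func_arg2):
            # Closure: sees deco_arg/deco_arg2 and func from the enclosing scopes.
            return func(func_arg * deco_arg, func_arg2 * deco_arg2)
        return wrapper
    return decorator

@decorator_factory(2, 10)
def add(x, y):
    return x + y

print(add(3, 4))  # add(3*2, 4*10) -> 46
```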

Python function loses identity after being decorated

(Python 3)
First of all, I feel my title isn't quite what it should be, so if you stick through the question and come up with a better title, please feel free to edit it.
I have recently learned about Python decorators and annotations, and so I wrote two little functions to test what I had learned.
One of them, called wraps, is supposed to mimic the behaviour of functools.wraps, while the other, called ensure_types, is supposed to check, for a given function and through its annotations, whether the arguments passed to that function have the correct types.
This is the code I have for those functions:
def wraps(original_func):
    """Update the decorated function with some important attributes from the
    one that was decorated so as not to lose good information"""
    def update_attrs(new_func):
        # Update the __annotations__
        for key, value in original_func.__annotations__.items():
            new_func.__annotations__[key] = value
        # Update the __dict__
        for key, value in original_func.__dict__.items():
            new_func.__dict__[key] = value
        # Copy the __name__
        new_func.__name__ = original_func.__name__
        # Copy the docstring (__doc__)
        new_func.__doc__ = original_func.__doc__
        return new_func
    return update_attrs  # return the decorator
def ensure_types(f):
    """Uses f.__annotations__ to check the expected types for the function's
    arguments. Raises a TypeError if there is no match.
    If an argument has no annotation, object is used instead and so, regardless
    of the argument passed, isinstance(arg, object) evaluates to True"""
    @wraps(f)  # say that test_types is wrapping f
    def test_types(*args, **kwargs):
        # Loop through the positional args, get their name and check the type
        for i in range(len(args)):
            # function.__code__.co_varnames is a tuple with the names of the
            # arguments in the order they appear in the function def statement
            var_name = f.__code__.co_varnames[i]
            if not isinstance(args[i], f.__annotations__.get(var_name, object)):
                raise TypeError("Bad type for function argument named '{}'".format(var_name))
        # Loop through the named args, get their value and check the type
        for key in kwargs.keys():
            if not isinstance(kwargs[key], f.__annotations__.get(key, object)):
                raise TypeError("Bad type for function argument named '{}'".format(key))
        return f(*args, **kwargs)
    return test_types
Supposedly, everything is alright up to this point. Both wraps and ensure_types are meant to be used as decorators. The problem comes when I define a third decorator, debug_dec, that prints to the console whenever a function is called, along with its arguments. The function:
def debug_dec(f):
    """Does some annoying printing for debugging purposes"""
    @wraps(f)
    def profiler(*args, **kwargs):
        print("{} function called:".format(f.__name__))
        print("\tArgs: {}".format(args))
        print("\tKwargs: {}".format(kwargs))
        return f(*args, **kwargs)
    return profiler
That also works nicely on its own. The problem appears when I try to use debug_dec and ensure_types at the same time:

@ensure_types
@debug_dec
def testing(x: str, y: str = "lol"):
    print(x)
    print(y)

testing("hahaha", 3)  # no TypeError is raised, although one was expected

But if I swap the order of the two decorators, it works just fine.
Can someone please help me understand what is going wrong, and whether there is any way of solving the problem besides swapping those two lines?
EDIT
If I add the lines:

print(testing.__annotations__)
print(testing.__code__.co_varnames)

the output is as follows:

{'y': <class 'str'>, 'x': <class 'str'>}
('args', 'kwargs', 'i', 'var_name', 'key')
Although your wraps maintains the annotations, it doesn't maintain the function signature. You can see this when you print co_varnames. Since ensure_types does its checking by comparing the names of the arguments with the names in the annotations dict, it fails to match them up, because the wrapped function has no arguments named x and y (it just accepts generic *args and **kwargs).
You could try using the decorator module, which lets you write decorators that act like functools.wraps but also preserve the function signature (including annotations).
There is probably also a way to make it work "manually", but it would be a bit of a pain. Basically, you would have wraps store the original function's argspec (the names of its arguments), then have ensure_types use this stored argspec instead of the wrapper's argspec when checking the types. Essentially, your decorators would pass the argspec along in parallel with the wrapped functions. However, using decorator is probably easier.

Is it possible to generate a Python function with arguments in runtime?

Say I have a Python function like the following:

def ooxx(**kwargs):
    doSomething()
    for something in cool:
        yield something
I would like to provide another function with named arguments as hints, like the following:

def asdf(arg1, arg2, arg3=1):
    frame = inspect.currentframe()
    args, _, _, values = inspect.getargvalues(frame)
    kwargs = dict((key, values[key]) for key in args)  # convert the args list into dictionary form
    return list(ooxx(**kwargs))

Is there some way to generate the function asdf automatically? I have lots of dynamically generated ooxx functions, and I would like to have corresponding asdf functions with customized named arguments. Not sure if this is the correct requirement or the right way of coding :p
Your description doesn't make much sense to me: you wrote a really verbose function that does this:

def asdf(arg1, arg2, arg3=1):
    return list(ooxx(**locals()))

but you want to inspect ooxx and somehow make up appropriate names for asdf's arguments? That is impossible; there is no such information on ooxx.
If you actually have a signature and want to create a function from it, you have to resort to eval/exec, or generate function definitions in a Python file and import it.
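A minimal sketch of the "generate a definition and exec it" route (make_proxy and the signature string are illustrative, not a library API):

```python
def make_proxy(name, signature, target):
    # Build source for a wrapper with an explicit signature, then exec it.
    # Inside the generated function, locals() holds exactly the named parameters.
    src = "def {}({}):\n    return list(target(**locals()))\n".format(name, signature)
    namespace = {'target': target}
    exec(src, namespace)
    return namespace[name]

def ooxx(**kwargs):
    # Stand-in generator that just echoes its keyword arguments.
    for item in sorted(kwargs.items()):
        yield item

asdf = make_proxy('asdf', 'arg1, arg2, arg3=1', ooxx)
print(asdf(10, 20))  # [('arg1', 10), ('arg2', 20), ('arg3', 1)]
```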
There is also the decorator module. You can create a function with it like this:

import decorator

asdf = decorator.FunctionMaker.create(
    'asdf(arg1, arg2, arg3)',   # signature
    'return ooxx(**locals())',  # function body
    {'ooxx': ooxx},             # context for the function
    ('arg3', 1))                # default arguments
