I'm trying to understand the arguments that are passed to a pyramid view function.
The following example demonstrates a function wrapped with two different wrappers. The only difference between the two wrappers is the signature: in the first wrapper, the first positional argument (obj) is explicit; in the second, it is folded into *args.
import functools

from pyramid.config import Configurator
import webtest


def decorator_1(func):
    @functools.wraps(func)
    def wrapper(obj, *args, **kwargs):  # <- obj
        print('decorator_1')
        print(type(obj), obj)
        print(args)
        print(kwargs)
        return func(obj, *args, **kwargs)  # <- obj
    wrapper.__wrapped__ = func
    return wrapper


def decorator_2(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print('decorator_2')
        print(args)
        print(kwargs)
        return func(*args, **kwargs)
    wrapper.__wrapped__ = func
    return wrapper
@decorator_1
def func_1(request):
    return {'func': 'func_1'}


@decorator_2
def func_2(request):
    return {'func': 'func_2'}
I would expect both wrapped methods to behave the same.
In decorator_1, I expect obj to be a request object and indeed it is.
In decorator_2, I would expect args[0] to be the same request object but it is not. It appears an additional first positional argument is passed before the request object.
def add_route(config, route, view, renderer="json"):
    """Helper for adding a new route-view pair."""
    route_name = view.__name__
    config.add_route(route_name, route)
    config.add_view(view, route_name=route_name, renderer=renderer)


config = Configurator()
add_route(config, "/func_1", func_1)
add_route(config, "/func_2", func_2)

app = config.make_wsgi_app()
testapp = webtest.TestApp(app)
testapp.get("/func_1")
testapp.get("/func_2")
Output:
decorator_1
<class 'pyramid.request.Request'> GET /func_1 HTTP/1.0
Host: localhost:80
()
{}
decorator_2
(<pyramid.traversal.DefaultRootFactory object at 0x7f981da2ee48>, <Request at 0x7f981da2ea20 GET http://localhost/func_2>)
{}
Consequently, func_2 crashes because it receives a DefaultRootFactory object it does not expect.
I'd like to understand this discrepancy. How does the signature of the wrapper change what Pyramid passes to the wrapped function?
There is a mechanism at play here that I don't understand, and I suspect it is in Pyramid's logic.
I shared my findings in the webargs issue where this came up, but just in case anyone comes across this here:
Pyramid lets you write a view function with either of these signatures:
def view(request):
    ...

def view(context, request):
    ...
The second calling convention is the original one and the first is newer, so even though the (context, request) form is called an "alternate" in the Pyramid docs, it is effectively the underlying default.
Pyramid uses inspect.getfullargspec to see whether the view takes a single positional parameter; if so, it wraps the view to match the second convention. If the view doesn't match the first convention, it is assumed to already match the second one (which is not true in this case).
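The practical consequence here: decorator_1's wrapper declares exactly one named positional parameter (obj), so Pyramid treats it as a request-only view, while decorator_2's wrapper declares none (only *args), so Pyramid falls back to calling it with (context, request). A rough sketch of that kind of check (an illustration of the idea, not Pyramid's actual code):

import inspect

def looks_request_only(view):
    """Very rough approximation of the 'takes one positional arg' test."""
    spec = inspect.getfullargspec(view)
    # `def wrapper(obj, *args, **kwargs)` has spec.args == ['obj']
    #     -> treated as a request-only view.
    # `def wrapper(*args, **kwargs)` has spec.args == []
    #     -> assumed to take (context, request).
    # Note: getfullargspec does not follow __wrapped__, so functools.wraps
    # alone does not change what this inspection sees.
    return len(spec.args) == 1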
Background
I'm working on a PyQt application and have added the following debug decorator:
def debug_decorator(func):
#functools.wraps(func)
def wrapper_func(*args, **kwargs):
logging.debug(f"Calling {func.__name__}")
ret_val = func(*args, **kwargs)
logging.debug(f"{func.__name__} returned {ret_val!r}")
return ret_val
return wrapper_func
Among other things, I use this for click handler functions, for example:
self.share_qpb = QtWidgets.QPushButton("Share")
self.share_qpb.clicked.connect(self.on_share_clicked)

[...]

@debug_decorator
def on_share_clicked(self):
If I run this code, I get the following error when the share button is clicked:
TypeError: on_share_clicked() takes 1 positional argument but 2 were given
The reason for this is that the clicked signal in Qt/PyQt sends the checked state of the button (documentation).
This is only an issue when I decorate functions with my debug decorator, not otherwise (I guess the extra argument is discarded somewhere).
Question
How can I make my debug decorator work without having to add a parameter to every clicked signal handler?
(I'd like to find a solution that works both when the arguments match and when there is one extra argument.)
I'm running Python 3.8 and PyQt 5.15.
I managed to solve this by checking whether the number of arguments provided (len(args)) is greater than the number of arguments the decorated function accepts (func.__code__.co_argcount).
I also check whether the name of the decorated function contains the string "clicked", since I have made a habit of including that in the names of my click handler functions. This has the advantage that if another decorated function is called with too many arguments, we will (as usual) get an error.
def debug_decorator(func):
    @functools.wraps(func)
    def wrapper_func(*args, **kwargs):
        logging.debug(f"=== {func.__name__} was called")
        func_arg_count_int = func.__code__.co_argcount
        func_name_str = func.__name__
        if len(args) > func_arg_count_int and "clicked" in func_name_str:  # <----
            args = args[:-1]
        ret_val = func(*args, **kwargs)
        logging.debug(f"=== {func.__name__} returned {ret_val!r}")
        return ret_val
    return wrapper_func
Thank you to @ekhumoro for helping me out with this!
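For reference, a more general variant of the same idea (not tied to the handler's name) is to trim the positional arguments to what the wrapped function can accept. This is only a sketch: it assumes plain functions and methods without *args of their own, and it gives up the "fail loudly on too many arguments" behaviour mentioned above:

import functools
import logging

def debug_decorator(func):
    @functools.wraps(func)
    def wrapper_func(*args, **kwargs):
        logging.debug(f"Calling {func.__name__}")
        # co_argcount is the number of named positional parameters, so any
        # surplus positional arguments (e.g. the `checked` flag sent by the
        # clicked signal) are silently dropped before calling func.
        max_args = func.__code__.co_argcount
        ret_val = func(*args[:max_args], **kwargs)
        logging.debug(f"{func.__name__} returned {ret_val!r}")
        return ret_val
    return wrapper_func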
So, this is a two-part question.
Is there an idiomatic way in Python to inject a parameter into the function signature when using a decorator?
For example:
def _mydecorator(func):
    def wrapped(someval, *args, **kwargs):
        do_something(someval)
        return func(*args, **kwargs)
    return wrapped

@_mydecorator
def foo(thisval, thatval=None):
    do_stuff()
The reason behind this is SaltStack's runner modules: you define functions within a module, and you can call those functions via the 'salt-run' command. If the above example were in a Salt runner module called 'bar', I could then run:
salt-run bar.foo aval bval
salt-run imports the module and calls the function with the arguments you've given on the command line. Any function in the module that begins with an underscore, or that is inside a class, is ignored and cannot be run via salt-run.
So, I wanted to define something like a timeout decorator to control how long the function can run for.
I realize I could do something like:
@timeout(30)
def foo():
    ...
But I wanted to make that value configurable, so that I could run something like:
salt-run bar.foo 30 aval bval
salt-run bar.foo 60 aval bval
The above decorator works, but it feels like a hack, since it's changing the signature of the function and the user has no idea, unless they look at the decorator.
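For concreteness, the kind of timeout decorator being described could look roughly like this. This is a hedged sketch, not Salt code: it uses SIGALRM (so it is Unix-only), and the names are made up. The caller-supplied first argument is swallowed by the wrapper, which is exactly the signature change being discussed:

import signal
from functools import wraps

def _timeout_from_first_arg(func):
    """Hypothetical sketch: the first positional argument supplied by the
    caller is consumed as a timeout (in seconds) and never reaches func."""
    @wraps(func)
    def wrapped(timeout, *args, **kwargs):
        def _raise(signum, frame):
            raise TimeoutError(f"{func.__name__} exceeded {timeout}s")
        old_handler = signal.signal(signal.SIGALRM, _raise)
        signal.alarm(int(timeout))
        try:
            return func(*args, **kwargs)
        finally:
            signal.alarm(0)
            signal.signal(signal.SIGALRM, old_handler)
    return wrapped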
I have another case where I want to make a decorator for taking care of 'prechecks' before the functions in the Salt runner execute. However, the precheck needs a piece of information from the function it's decorating. Here's an example:
def _precheck(func):
    def wrapper(*args, **kwargs):
        ok = False
        if len(args) > 0:
            ok = run_prechecks(args[0])
        else:
            ok = run_prechecks(kwargs['record_id'])
        if ok:
            func(*args, **kwargs)
    return wrapper

@_precheck
def foo(record_id, this_val):
    do_stuff()
This also seems hackish, since it requires that the function being decorated a) has a parameter called 'record_id' and b) takes it as the first argument.
Now, because I'm writing all these functions, it's not a huge deal, but it seems like there's probably a better way of doing this (like not using decorators to solve it at all).
The way to dynamically define decorator arguments is to skip the syntactic sugar (@) and apply the decorator manually, like this:
func = dec(dec_arguments)(func_name)(func_arguments)
import json
import sys

foo = lambda record_id, thatval=None: do_stuff(record_id, thatval)

def do_stuff(*args, **kwargs):
    # python3
    print(*args, json.dumps(kwargs))

def _mydecorator(timeout):
    print('Timeout: %s' % timeout)
    def decorator(func):
        def wrapped(*args, **kwargs):
            return func(*args, **kwargs)
        return wrapped
    return decorator

if __name__ == '__main__':
    _dec_default = 30
    l_argv = len(sys.argv)
    if l_argv == 1:
        # no args sent
        sys.exit('Arguments missing')
    elif l_argv == 2:
        # assuming record_id is a required argument
        _dec_arg = _dec_default
        _args = 1
    else:
        # three or more args: filename 1 2 [...]
        # consider using an optional keyword argument `timeout` instead;
        # combined with another optional argument, the current logic is fragile:
        # if only two arguments are given, the first is still tested for "timeoutedness"
        try:
            _dec_arg = int(sys.argv[1])
            _args = 2
        except (ValueError, IndexError):
            _dec_arg = _dec_default
            _args = 1

    foo = _mydecorator(_dec_arg)(foo)(*sys.argv[_args:])
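Under those assumptions, invoking the script (hypothetical filename runner.py) with and without a leading timeout argument would look roughly like this:
python runner.py 60 some_record_id    # Timeout: 60, foo called with 'some_record_id'
python runner.py some_record_id       # falls back to the default timeout of 30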
There is no idiomatic way to do this in Python 3.7 as far as I know. Indeed, @functools.wraps only works when you do not modify the signature (see "What does functools.wraps do?").
However, there is a way to do the same as @functools.wraps (exposing the full signature, preserving __dict__ and the docstring): @makefun.wraps. With this drop-in replacement for @wraps, you can edit the exposed signature.
from makefun import wraps

def _mydecorator(func):
    @wraps(func, prepend_args="someval")
    def wrapped(someval, *args, **kwargs):
        print("wrapper executes with %r" % someval)
        return func(*args, **kwargs)
    return wrapped

@_mydecorator
def foo(thisval, thatval=None):
    """A foo function"""
    print("foo executes with thisval=%r thatval=%r" % (thisval, thatval))

# let's check the signature
help(foo)

# let's test it
foo(5, 1)
Yields:
Help on function foo in module __main__:
foo(someval, thisval, thatval=None)
A foo function
wrapper executes with 5
foo executes with thisval=1 thatval=None
You can see that the exposed signature contains the prepended argument.
The same mechanism applies to appending and removing arguments, as well as to editing signatures more deeply. See the makefun documentation for details (I'm the author, by the way ;) ).
I am using a decorator, deferred_set_context, to set the function module, name, and time as the context string (this context is needed for tracing in case there are multiple calls to register_request):
@deferred_set_context()
def register_request(self, machine_model):

def deferred_set_context(name=None):
    def __outer__(func):
        context_name = name
        if not context_name:
            context_name = func.__name__

        @defer.inlineCallbacks
        def __inner__(*args, **kwargs):
            r = None
            with base.this_context_if_no_context(
                    base.Context('%s.%s.%s' % (func.__module__, context_name,
                                               datetime.utcnow().strftime('%Y-%m-%dT%H:%M:%S.%f')))):
                r = yield func(*args, **kwargs)
            defer.returnValue(r)
        return __inner__
    return __outer__
Now I need to pass the machine_model name (which is an argument of register_request) to the deferred_set_context decorator. Something like this:
@deferred_set_context(name=machine_model)
def register_request(self, machine_model):
How shall I do it?
Note that the following two code snippets are equivalent:
# first
@decorator
def func():
    pass

# second
def func():
    pass
func = decorator(func)
You need to make your decorator take a function as an argument. If you want to pass name in as a variable, then it should be a parameter in the function returned by the decorator.
Assuming you don't want to change the behavior of deferred_set_context or register_request, you could make a second shim decorator that spies on the correct function argument and passes it along to your original decorator.
Something like this:
def new_shim_decorator(register_request):
    def wrapper(self, machine_model):
        return deferred_set_context(machine_model)(register_request)(self, machine_model)
    return wrapper

@new_shim_decorator
def register_request(self, machine_model):
    print("my_function called with {}".format(machine_model))
This is kind of a hacky solution in my opinion since this decorator that proxies to the original decorator would need to know the arguments passed to register_request and mirror them.
I would prefer to see a keyword-only argument that can optionally be fetched from the keyword-argument dictionary, in either a shim decorator or the original:
def deferred_set_context_by_kwarg(kwarg_name):
    def outer(func):
        def wrapper(*args, **kwargs):
            return deferred_set_context(name=kwargs.get(kwarg_name))(func)(*args, **kwargs)
        return wrapper
    return outer

@deferred_set_context_by_kwarg('machine_model')
def register_request(self, *, machine_model):
    print("my_function called with {}".format(machine_model))
Notice the *, machine_model. That * is a Python 3 feature that requires machine_model to be passed as a keyword argument. This allows our deferred_set_context_by_kwarg decorator to accept a string naming the keyword argument we want, retrieve its value on each call, and pass it to deferred_set_context.
If you are sure your function register_request always takes the same two arguments, then pick them up inside the decorator code.
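A minimal sketch of that idea, reusing defer, base, and datetime from the question's code and assuming the decorated method is always called as (self, machine_model):

def deferred_set_context_from_machine_model(func):
    # Assumption: the wrapped method always receives (self, machine_model),
    # so machine_model can be read directly inside the decorator.
    @defer.inlineCallbacks
    def __inner__(self, machine_model, *args, **kwargs):
        context_name = '%s.%s.%s' % (func.__module__, machine_model,
                                     datetime.utcnow().strftime('%Y-%m-%dT%H:%M:%S.%f'))
        with base.this_context_if_no_context(base.Context(context_name)):
            r = yield func(self, machine_model, *args, **kwargs)
        defer.returnValue(r)
    return __inner__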
I need a decorator (or any other neat design pattern) for functions that deal with files. The main purpose is to leave the file pointer at the same position it was at before the function acted on the file.
Here is my code, including some dummy tests. The problem is that the decorator doesn't work on instance methods, even if I pass the args and kwargs to it. I could not figure out how to design the code…
import unittest
from cStringIO import StringIO


def remain_file_pointer(file_obj):
    def wrap(f):
        def wrapped_f(*args, **kwargs):
            old_position = file_obj.tell()
            f(*args, **kwargs)
            file_obj.seek(old_position, 0)
        return wrapped_f
    return wrap


class TestRemainFilepointer(unittest.TestCase):

    def test_remains_filepointer(self):
        dummy_file = StringIO('abcdefg')

        @remain_file_pointer(dummy_file)
        def seek_into_file(dummy_file):
            dummy_file.seek(1, 0)

        self.assertEqual(0, dummy_file.tell())
        seek_into_file(dummy_file)
        self.assertEqual(0, dummy_file.tell())

    def test_remains_filepointer_in_class_method(self):

        class FileSeekerClass(object):
            def __init__(self):
                self.dummy_file = StringIO('abcdefg')

            @remain_file_pointer(self.dummy_file)
            def seek_into_file(self):
                self.dummy_file.seek(1, 0)

        fileseeker = FileSeekerClass()
        self.assertEqual(0, fileseeker.dummy_file.tell())
        fileseeker.seek_into_file()
        self.assertEqual(0, fileseeker.dummy_file.tell())
UPDATE:
Just to clarify the basic idea:
The decorator should take an argument, which is a file handler, and store the position before the actual function manipulates the file. Afterwards, the pointer should be set back to the old position. This should work both for standalone functions and for methods.
My answer below fixes the problem by assuming that the last argument in the function is the file handler.
I'll not mark this as the final answer, since I'm pretty sure there is a better design.
I came up with a solution that assumes that the last argument of the data-manipulating function is the file handler and accesses it via args[-1]:
def remain_file_pointer(f):
    """Restore the file pointer position after calling the decorated function.

    This decorator assumes that the last argument is the file handler.
    """
    def wrapper(*args, **kwargs):
        file_obj = args[-1]
        old_position = file_obj.tell()
        return_value = f(*args, **kwargs)
        file_obj.seek(old_position, 0)
        return return_value
    return wrapper
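A quick usage sketch, adapted from the standalone-function test in the question: the file handler is now found as the last argument instead of being given to the decorator (io.StringIO stands in for the Python 2 cStringIO):

from io import StringIO

dummy_file = StringIO('abcdefg')

@remain_file_pointer
def seek_into_file(dummy_file):
    dummy_file.seek(1, 0)

assert dummy_file.tell() == 0
seek_into_file(dummy_file)
assert dummy_file.tell() == 0  # pointer was restored by the decorator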
I am fairly new to Python and have been learning about decorators. After messing around with Flask, I am trying to write some code that simulates their route handler/decorators, just to understand how decorators (with arguments) work.
In the code below, the route decorator seems to call itself once the script runs. My question is: how is it possible that app.route() gets called when I run this script, and what is really happening here? Notice I don't call my index() function anywhere directly.
# test.py

class Flask(object):
    def __init__(self, name):
        self.scriptname = name

    def route(self, *rargs, **kargs):
        args = list(rargs)
        if kargs:
            print(kargs['methods'])
        def decorator(f):
            f(args[0])
        return decorator


app = Flask(__name__)

@app.route("/", methods = ["GET","PUT"])
def index(rt):
    print('route: ' + rt)
the above prints this in my terminal:
$ python test.py
['GET', 'PUT']
route: /
Any insight would be appreciated.
@app.route("/", methods = ["GET","PUT"]) is an executable statement: it calls the route() method of the app object. Since it's at module level, it will be executed when the script is imported.
Now, the result of calling app.route(...) is a function, and because you've used the @ to mark it as a decorator, that function will wrap index. Note that the syntax is just a shortcut for this:
index = app.route(...)(index)
in other words, Python will call the function returned by app.route() with index as a parameter, and store the result as the new index function.
However, you're missing a level here. A normal decorator, without params, is written like this:
@foo
def bar():
    pass

and when the module is imported, foo is called with bar and returns a function that wraps bar. But you're calling your route() function within the decorator line! So your function actually needs to return a decorator function that itself returns a function wrapping the original function... headscratching, to be sure.
Your route method should look more like this:
def route(self, *rargs, **kargs):
    args = list(rargs)
    if kargs:
        print(kargs['methods'])
    def decorator(f):
        def wrapped(index_args):
            f(args[0])
        return wrapped
    return decorator
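If that route replaces the method on the Flask class from the question, the module-level decoration still prints the methods list (because app.route(...) itself still runs at import time), but the body of index only runs when the wrapped view is actually called, e.g.:

app = Flask(__name__)

@app.route("/", methods=["GET", "PUT"])   # prints: ['GET', 'PUT']
def index(rt):
    print('route: ' + rt)

# nothing more is printed at import time; calling the wrapper triggers it
index(None)   # the argument is ignored by `wrapped`; prints: route: /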
Basically, @app.route("/", methods=["GET", "PUT"]) means index ends up bound to whatever app.route("/", methods=["GET", "PUT"]) returns when applied to index, and that is what gets called instead of the original index.
In your code, the decoration itself calls app.route("/", methods=["GET", "PUT"]). This starts by printing kargs['methods'], then creates the decorator function:
def decorator(f):
    f(args[0])
This decorator will call the decorated function (f) with one argument, args[0], which here is "/". This prints route: /.
The best explanation of decorators I've found is here: How to make a chain of function decorators?
If you don't want the self-firing, you can define your decorator this way:
def route(*rargs, **kargs):
    args = list(rargs)
    if kargs:
        print(kargs['methods'])
    def decorator(f):
        f(args[0])
    return decorator
@app.route("/", methods = ["GET","PUT"])
def index(rt):
    print('route: ' + rt)
However, whatever you pass for rt will never be used, because route always calls index with args[0], which is always /.