Python: More elegant way to add optional parameters to method call

This will seem trivial perhaps, but it is a condition that I run into fairly frequently and would like to find a more elegant way of writing this code. The method, while not terribly relevant to the question, takes a text value and an optional is_checked value to create a radio button (using dominate). In this case, I can't set 'checked' to None, or false - it either has to be there or not. It doesn't seem like I should have to write the 'input' line twice though, just to optionally add an argument.
def _get_radio_button(text: str, is_checked=False):
    with label(text, cls="radio-inline") as lbl:
        if is_checked:
            input(text, type="radio", name="optradio", checked='checked')
        else:
            input(text, type="radio", name="optradio")
    return lbl
This would be my second approach, but it is the same number of lines and less readable - though perhaps a tiny bit more DRY.
a = dict(type='radio', name='optradio')
if is_checked:
    a['checked'] = 'checked'
with label(text, cls="radio-inline") as lbl:
    input(text, **a)
Question: How can I handle this code case with the fewest lines possible without sacrificing readability?

Your code looks fine, except obviously for the naming of a, which could be input_opts or something like that.
Another possibility to make it a bit clearer is to use direct keyword arguments for the common stuff and just inject the optional ones using **. When only one is optional, this can be quite short, e.g.:
checked_arg = {'checked': 'checked'} if is_checked else {}
with label(text, cls="radio-inline") as lbl:
    input(text, type="radio", name="optradio", **checked_arg)
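If several attributes are optional, the same trick generalizes: collect the candidates in one dict and drop the ones that don't apply. A minimal sketch, where the is_disabled flag is hypothetical, just to show a second optional attribute:
optional = {'checked': 'checked' if is_checked else None,
            'disabled': 'disabled' if is_disabled else None}
# keep only the attributes that actually apply
optional = {k: v for k, v in optional.items() if v is not None}
with label(text, cls="radio-inline") as lbl:
    input(text, type="radio", name="optradio", **optional)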

Only as a concept :) You can decorate your own functions this way, or alien (library) functions. Even more, you can make the decorator a class (with a __call__ method that decorates the underlying function), parameterized with simple "morphisms" of the underlying function's arguments (for example, a list of transformation functions passed to the decorator class's constructor). You can also write a more declarative decorator and inspect the underlying function's arguments (for default values, for example) - you are limited only by your own imagination :) So:
from functools import wraps

def adapt_gui_args(callable):
    @wraps(callable)
    def w(*args, **kwargs):
        # translate is_checked=True into checked='checked', drop it otherwise
        if kwargs.pop('is_checked', False):
            kwargs['checked'] = 'checked'
        return callable(*args, **kwargs)
    return w

# may be decorated with @adapt_gui_args if it's your function
def input(*args, **kwargs):
    print("args: ", args)
    print("kwargs: ", kwargs)

# decorate the input function outside its source body
input = adapt_gui_args(input)

def test(is_checked=False):
    input(1, 2, type="radio", is_checked=is_checked)

test(False)
test(True)
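The class-based, parameterizable variant mentioned above could look like the following; a minimal sketch, where AdaptArgs and checked_morphism are invented names:
from functools import wraps

class AdaptArgs:
    # Parameterizable decorator: applies a list of keyword-argument
    # "morphisms" before calling the wrapped function.
    def __init__(self, *morphisms):
        self.morphisms = morphisms

    def __call__(self, func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for morph in self.morphisms:
                kwargs = morph(kwargs)
            return func(*args, **kwargs)
        return wrapper

def checked_morphism(kwargs):
    # translate is_checked=True into checked='checked'
    if kwargs.pop('is_checked', False):
        kwargs['checked'] = 'checked'
    return kwargs

@AdaptArgs(checked_morphism)
def gui_input(*args, **kwargs):
    print("args:", args, "kwargs:", kwargs)

gui_input(1, type="radio", is_checked=True)
# args: (1,) kwargs: {'type': 'radio', 'checked': 'checked'}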


How to make a proper function wrapper

I used a naive approach to write a wrapper: get all *args and **kwargs and pass them to the enclosed function. But something went wrong, so I simplified the example to its core to illustrate my troubles.
# simplest wrapper possible: just pass the args through
def wraps(f):
    def call(*argv, **kw):
        # add some meaningful manipulations later
        return f(*argv, **kw)
    return call
# check that the wrapper behaves identically
class M:
    def __init__(this, param):
        this.param = param

M.__new__ = M.__new__
m1 = M(1)

M.__new__ = wraps(M.__new__)
m2 = M(2)
m1 is instantiated normally, but m2 fails with the following error:
TypeError: object.__new__() takes exactly one argument (the type to instantiate)
The question is how to define the wraps and call functions properly, so they behave identically to the function being wrapped, regardless of which function is wrapped.
It is not the end objective obviously, since primitive lambda x: x would suffice. It is a starting point from which I could introduce further complications.
The short answer: it's impossible. You cannot define a perfect wrapper in Python (and in many other languages, too).
Slightly longer version: a Python function is a first-class object, and all manipulations acceptable for objects can be performed on a function too. So you cannot presume that some complex procedure will limit itself to only calling the function passed as an argument, and will not use the function object in other, unobvious ways.
Much more verbose speculation with examples
Functions defined on only part of their domain are pretty common:
def half(i):
    if i < 0:
        raise ValueError
    if i & 1:
        raise ValueError
    return i / 2
Pretty straightforward. Now we can get a little more confusing:
class Veggy:
    def __init__(this, kind):
        this.kind = kind
    def pr(this):
        print(this.kind)

def assess(v):
    if v.kind in ['tomato', 'carrot']:
        raise ValueError
    v.pr()
Here Veggy is used as a function proxy, but it also has a public property kind, which the assess function checks before executing.
The same thing can be done with a function object, since it also has additional properties besides being callable:
def test(x):
    return x + x

def assess4(f, *argv, **kw):
    if f.__name__ != 'test':
        raise ValueError
    if f.__module__ != '__main__':
        raise ValueError
    if len(f.__code__.co_code) % 8 == 4:
        raise ValueError
    return f(*argv, **kw)
Writing a correct wrapper becomes a challenge. And that challenge can be complicated further:
def assess0(f, *argv, **kw):
    if len(f.__code__.co_code) % 8 == 0:
        kw['arg'] = True
        return f(*argv[1:], **kw)
    else:
        kw['arg'] = False
        return f(*argv[:-1], **kw)
A universal wrapper would have to handle both assess0 and assess4 correctly, which is practically impossible. And we have not even touched id() magic: checking the id would cast the acceptable function in stone.
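For illustration, an identity check alone already defeats any possible wrapper; a contrived sketch:
def test(x):
    return x + x

TEST_ID = id(test)

def assess_id(f, *argv, **kw):
    # any wrapper is necessarily a different object than test,
    # so no wrapped version of test can ever pass this check
    if id(f) != TEST_ID:
        raise ValueError
    return f(*argv, **kw)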
Coding etiquette
So you cannot write a perfect wrapper. Why does anyone bother to write one? Why are wrappers so common when they cannot guarantee behavioral equivalence and can possibly introduce non-trivial changes into the code flow?
The simple answer is coding conventions - the famous substitution principle: code should keep its behavioral properties when some object is substituted with another of the same type. Python puts little focus on naming and enforcing types; a rigorous type system is not a must, and you can establish APIs and protocols through documentation and type annotations, as the Python language itself does.
Programs must be written for people to read, and only incidentally for machines to execute. OOP conventions are all in people's minds. So the Python developers broke conventions by requiring some non-standard behavior when overriding object methods. This non-conventional OOP treatment makes it impossible to use decorators to transform the __init__ and __new__ methods.
The final solution
If Python treats __new__ so specially, then a generic wrapper should do the same.
# simplest wrapper possible: just pass the args through
def wraps(f):
    def call(*argv, **kw):
        # add some meaningful manipulations later
        return f(*argv, **kw)
    def call_new(*argv, **kw):
        # add some meaningful manipulations later
        # object.__new__ accepts only the type to instantiate
        return f(argv[0])
    if f is object.__new__:
        return call_new
    # elif other_special_case: ...
    else:
        return call
Now it successfully passes the test:
# check that the wrapper behaves identically
class M:
    def __init__(this, param):
        this.param = param

M.__new__ = M.__new__
m1 = M(1)

M.__new__ = wraps(M.__new__)
m2 = M(2)
The drawback is that you have to implement a distinct workaround for every other convention-breaking function besides __new__ to make your function wrapper even semi-applicable in a universal context. But that is the best you can get out of Python.

Does python allow me to pass dynamic variables to a decorator at runtime?

I am attempting to integrate a very old system and a newer system at work. The best I can do is to utilize an RSS firehose-type feed the old system provides. The goal is to use this RSS feed to make the other system perform certain actions when certain people do things.
My idea is to wrap a decorator around certain functions to check if the user (a user ID provided in the RSS feed) has permissions in the new system.
My current solution has a lot of functions that look like this, which are called based on an action field in the feed:
actions_dict = {
    ...
    'action1': function1
}

actions_dict[RSSFEED['action_taken']](RSSFEED['user_id'])

def function1(user_id):
    if has_permissions(user_id):
        # Do this function
I want to create a has_permissions decorator that takes the user_id so that I can remove this redundant has_permissions check in each of my functions.
@has_permissions(user_id)
def function1():
    # Do this function
Unfortunately, I am not sure how to write such a decorator. All the tutorials I see have the @has_permissions() line with a hardcoded value, but in my case it needs to be passed at runtime and will be different each time the function is called.
How can I achieve this functionality?
In your question, you've used has_permissions as the name of both the check of the user_id and the wanted decorator, so I'm going with an example where the names are clearer: let's make a decorator that calls the underlying (decorated) function only when the color (a string) is 'green'.
Python decorators are function factories
The decorator itself (if_green in my example below) is a function. It takes a function to be decorated as argument (named function in my example) and returns a function (run_function_if_green in the example). Usually, the returned function calls the passed function at some point, thereby "decorating" it with other actions it might run before or after it, or both.
Of course, it might only conditionally run it, as you seem to need:
def if_green(function):
    def run_function_if_green(color, *args, **kwargs):
        if color == 'green':
            return function(*args, **kwargs)
    return run_function_if_green
@if_green
def print_if_green():
    print('what a nice color!')

print_if_green('red')    # nothing happens
print_if_green('green')  # => what a nice color!
What happens when you decorate a function with the decorator (as I did with print_if_green, here), is that the decorator (the function factory, if_green in my example) gets called with the original function (print_if_green as you see it in the code above). As is its nature, it returns a different function. Python then replaces the original function with the one returned by the decorator.
So in the subsequent calls, it's the returned function (run_function_if_green with the original print_if_green as function) that gets called as print_if_green and which conditionally calls further to that original print_if_green.
Function factories can produce functions that take arguments
The call to the decorator (if_green) only happens once for each decorated function, not every time the decorated functions are called. But as the function returned by the decorator that one time permanently replaces the original function, it gets called instead of the original function every time that original function is invoked. And it can take arguments, if we allow it.
I've given it an argument color, which it uses itself to decide whether to call the decorated function. Further, I've given it the idiomatic vararg arguments, which it uses to call the wrapped function (if it calls it), so that I'm allowed to decorate functions taking an arbitrary number of positional and keyword arguments:
@if_green
def exclaim_if_green(exclamation):
    print(exclamation, 'that IS a nice color!')

exclaim_if_green('red', 'Yay')    # again, nothing
exclaim_if_green('green', 'Wow')  # => Wow that IS a nice color!
The result of decorating a function with if_green is that a new first argument gets prepended to its signature, which will be invisible to the original function (as run_function_if_green doesn't forward it). As you are free in how you implement the function returned by the decorator, it could also call the original function with less, more or different arguments, do any required transformation on them before passing them to the original function or do other crazy stuff.
Concepts, concepts, concepts
Understanding decorators requires knowledge and understanding of various other concepts of the Python language. (Most of which aren't specific to Python, but one might still not be aware of them.)
For brevity's sake (this answer is long enough as it is), I've skipped or glossed over most of them. For a more comprehensive speedrun through (I think) all relevant ones, consult e.g. Understanding Python Decorators in 12 Easy Steps!.
The inputs to decorators (arguments, wrapped function) are rather static in Python. There is no way to dynamically pass an argument the way you're asking. If the user id can be extracted from somewhere at runtime inside the decorator function, however, you can achieve what you want.
In Django for example, things like #login_required expect that the function they're wrapping has request as the first argument, and Request objects have a user attribute that they can utilize. Another, uglier option is to have some sort of global object you can get the current user from (see thread local storage).
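A minimal sketch of that thread-local idea, with all names invented for illustration and has_permissions assumed to exist as in the question:
import threading

_request_context = threading.local()

def set_current_user(user_id):
    # call this once per feed item, before dispatching the action
    _request_context.user_id = user_id

def require_permissions(f):
    def wrapper(*args, **kwargs):
        user_id = getattr(_request_context, 'user_id', None)
        if user_id is not None and has_permissions(user_id):
            return f(*args, **kwargs)
    return wrapper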
The short answer is no: you cannot pass dynamic parameters to decorators.
But... you can certainly invoke them programmatically:
First let's create a decorator that can perform a permission check before executing a function:
import functools

def check_permissions(user_id):
    def decorator(f):
        @functools.wraps(f)
        def wrapper(*args, **kw):
            if has_permissions(user_id):
                return f(*args, **kw)
            else:
                # what do you want to do if there aren't permissions?
                ...
        return wrapper
    return decorator
Now, when extracting an action from your dictionary, wrap it using the decorator to create a new callable that does an automatic permission check:
checked_action = check_permissions(RSSFEED['user_id'])(
    actions_dict[RSSFEED['action_taken']])
Now, when you call checked_action it will first check the permissions corresponding to the user_id before executing the underlying action.
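If that dispatch happens in several places, a tiny helper keeps it readable; a sketch using the question's own names, and assuming the actions themselves no longer take the user_id, as in your desired form:
def dispatch(feed_entry):
    action = actions_dict[feed_entry['action_taken']]
    checked = check_permissions(feed_entry['user_id'])(action)
    return checked()

dispatch(RSSFEED)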
You can easily work around it; for example:
from functools import wraps

def some_function():
    print("some_function executed")

def some_decorator(decorator_arg1, decorator_arg2):
    def decorate(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            print(decorator_arg1)
            ret = func(*args, **kwargs)
            print(decorator_arg2)
            return ret
        return wrapper
    return decorate

arg1 = "pre"
arg2 = "post"
decorated = some_decorator(arg1, arg2)(some_function)
In [4]: decorated()
pre
some_function executed
post

Python function loses identity after being decorated

(Python 3)
First of all, I feel my title isn't quite what it should be, so if you stick through the question and come up with a better title, please feel free to edit it.
I have recently learned about Python decorators and annotations, so I wrote two little functions to test this new knowledge.
One of them, called wraps, is supposed to mimic the behaviour of functools.wraps, while the other, called ensure_types, is supposed to check, for a given function and through its annotations, whether the arguments passed to that function are of the correct types.
This is the code I have for those functions:
def wraps(original_func):
    """Update the decorated function with some important attributes from the
    one that was decorated so as not to lose good information"""
    def update_attrs(new_func):
        # Update the __annotations__
        for key, value in original_func.__annotations__.items():
            new_func.__annotations__[key] = value
        # Update the __dict__
        for key, value in original_func.__dict__.items():
            new_func.__dict__[key] = value
        # Copy the __name__
        new_func.__name__ = original_func.__name__
        # Copy the docstring (__doc__)
        new_func.__doc__ = original_func.__doc__
        return new_func
    return update_attrs  # return the decorator
def ensure_types(f):
    """Uses f.__annotations__ to check the expected types for the function's
    arguments. Raises a TypeError if there is no match.
    If an argument has no annotation, object is returned and so, regardless of
    the argument passed, isinstance(arg, object) evaluates to True"""
    @wraps(f)  # say that test_types is wrapping f
    def test_types(*args, **kwargs):
        # Loop through the positional args, get their name and check the type
        for i in range(len(args)):
            # function.__code__.co_varnames is a tuple with the names of the
            # arguments in the order they appear in the function def statement
            var_name = f.__code__.co_varnames[i]
            if not isinstance(args[i], f.__annotations__.get(var_name, object)):
                raise TypeError("Bad type for function argument named '{}'".format(var_name))
        # Loop through the named args, get their value and check the type
        for key in kwargs.keys():
            if not isinstance(kwargs[key], f.__annotations__.get(key, object)):
                raise TypeError("Bad type for function argument named '{}'".format(key))
        return f(*args, **kwargs)
    return test_types
Supposedly, everything is alright until now. Both wraps and ensure_types are supposed to be used as decorators. The problem comes when I define a third decorator, debug_dec, that is supposed to print to the console when a function is called, along with its arguments. The function:
def debug_dec(f):
    """Does some annoying printing for debugging purposes"""
    @wraps(f)
    def profiler(*args, **kwargs):
        print("{} function called:".format(f.__name__))
        print("\tArgs: {}".format(args))
        print("\tKwargs: {}".format(kwargs))
        return f(*args, **kwargs)
    return profiler
That also works nicely. The problem comes when I try to use debug_dec and ensure_types at the same time.
@ensure_types
@debug_dec
def testing(x: str, y: str = "lol"):
    print(x)
    print(y)

testing("hahaha", 3)  # should raise a TypeError for y, but doesn't
But if I change the order in which the decorators are applied, it works just fine.
Can someone please help me understand what is going wrong, and whether there is any way of solving the problem besides swapping those two lines?
EDIT
If I add the lines:
print(testing.__annotations__)
print(testing.__code__.co_varnames)
The output is as follows:
#{'y': <class 'str'>, 'x': <class 'str'>}
#('args', 'kwargs', 'i', 'var_name', 'key')
Although wraps maintains the annotations, it doesn't maintain the function signature. You can see this when you print out co_varnames. Since ensure_types does its checking by comparing the names of the arguments with the names in the annotations dict, it fails to match them up, because the wrapped function has no arguments named x and y (it just accepts the generic *args and **kwargs).
You could try using the decorator module, which lets you write decorators that act like functools.wraps but also preserve the function signature (including annotations).
There is probably also a way to make it work "manually", but it would be a bit of a pain. Basically, what you would have to do is have wraps store the original function's argspec (the names of its arguments), then have ensure_types use this stored argspec instead of the wrapper's argspec when checking the types. Essentially, your decorators would pass the argspec along in parallel with the wrapped functions. However, using decorator is probably easier.
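As a side note, in Python 3 the standard functools.wraps also records the original function on the wrapper as __wrapped__, and inspect.signature follows that attribute, so a checker that reads the signature via inspect instead of __code__.co_varnames would survive wrapping; a small sketch:
import functools
import inspect

def debug_dec(f):
    @functools.wraps(f)
    def profiler(*args, **kwargs):
        return f(*args, **kwargs)
    return profiler

@debug_dec
def testing(x: str, y: str = "lol"):
    return x, y

print(inspect.signature(testing))  # (x: str, y: str = 'lol')
print(testing.__wrapped__)         # the original testing function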

cleaning up nested function calls

I have written several functions that run sequentially, each one taking as its input the output of the previous function. So in order to run them, I have to run this line of code:
make_list(cleanup(get_text(get_page(URL))))
and I just find that ugly and inefficient. Is there a better way to do sequential function calls?
Really, this is the same as any case where you want to refactor commonly-used complex expressions or statements: just turn the expression or statement into a function. The fact that your expression happens to be a composition of function calls doesn't make any difference (but see below).
So, the obvious thing to do is to write a wrapper function that composes the functions together in one place, so everywhere else you can make a simple call to the wrapper:
def get_page_list(url):
    return make_list(cleanup(get_text(get_page(url))))

things = get_page_list(url)
stuff = get_page_list(another_url)
spam = get_page_list(eggs)
If you don't always call the exact same chain of functions, you can always factor out into the pieces that you frequently call. For example:
def get_clean_text(page):
    return cleanup(get_text(page))

def get_clean_page(url):
    return get_clean_text(get_page(url))
This refactoring also opens the door to making the code a bit more verbose but a lot easier to debug, since the chain now appears in only one place instead of many:
def get_page_list(url):
    page = get_page(url)
    text = get_text(page)
    cleantext = cleanup(text)
    return make_list(cleantext)
If you find yourself needing to do exactly this kind of refactoring of composed functions very often, you can always write a helper that generates the refactored functions. For example:
from functools import wraps

def compose1(*funcs):
    @wraps(funcs[0])
    def composed(arg):
        for func in reversed(funcs):
            arg = func(arg)
        return arg
    return composed
get_page_list = compose1(make_list, cleanup, get_text, get_page)
If you want a more complicated compose function (that, e.g., allows passing multiple args/return values around), it can get a bit complicated to design, so you might want to look around on PyPI and ActiveState for the various existing implementations.
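One simple extension in that direction: let the innermost function accept arbitrary arguments and thread a single return value through the rest; a sketch, not a full multi-value compose:
from functools import wraps

def compose(*funcs):
    # compose(f, g, h)(*args, **kwargs) == f(g(h(*args, **kwargs)))
    @wraps(funcs[-1])
    def composed(*args, **kwargs):
        result = funcs[-1](*args, **kwargs)
        for func in reversed(funcs[:-1]):
            result = func(result)
        return result
    return composed

get_page_list = compose(make_list, cleanup, get_text, get_page)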
You could try something like this. I always like breaking up train wrecks (the book "Clean Code" calls these nested calls train wrecks). This is easier to read and debug. Remember, you probably spend twice as long reading your code as writing it, so make it easier to read. You will thank yourself later.
page = get_page(URL)
page_text = get_text(page)
make_list(cleanup(page_text))

# you can also encapsulate that into its own function
def build_page_list_from_url(url):
    page = get_page(url)
    page_text = get_text(page)
    return make_list(cleanup(page_text))
Options:
Refactor: implement this series of function calls as one aptly-named function.
Look into decorators. They're syntactic sugar for 'chaining' functions in this way; e.g. implement cleanup and make_list as decorators, then decorate get_text with them.
Compose the functions. See code in this answer.
You could shorten constructs like that with something like the following:
class ChainCalls(object):
    def __init__(self, *funcs):
        self.funcs = funcs
    def __call__(self, *args, **kwargs):
        result = self.funcs[-1](*args, **kwargs)
        for func in self.funcs[-2::-1]:
            result = func(result)
        return result
def make_list(arg): return 'make_list(%s)' % arg
def cleanup(arg): return 'cleanup(%s)' % arg
def get_text(arg): return 'get_text(%s)' % arg
def get_page(arg): return 'get_page(%r)' % arg
mychain = ChainCalls(make_list, cleanup, get_text, get_page)
print( mychain('http://is.gd') )
Output:
make_list(cleanup(get_text(get_page('http://is.gd'))))

Dictionary or If statements, Jython

I am writing a script at the moment that will grab certain information from HTML using dom4j.
Since Python/Jython does not have a native switch statement I decided to use a whole bunch of if statements that call the appropriate method, like below:
if type == 'extractTitle':
    extractTitle(dom)
if type == 'extractMetaTags':
    extractMetaTags(dom)
I will be adding more depending on what information I want to extract from the HTML and thought about taking the dictionary approach which I found elsewhere on this site, example below:
{
    'extractTitle': extractTitle,
    'extractMetaTags': extractMetaTags
}[type](dom)
I know that each time I run the script the dictionary will be built, but at the same time, if I were to use the if statements, the script would have to check through them until it hits the correct one. What I am really wondering is: which one performs better, and which is generally better practice?
Update: @Brian - Thanks for the great reply. I have a question: what if any of the extract methods require more than one object? E.g.
def handle_extractTag(self, dom, anotherObject):
    # Do something
How would you make the appropriate changes to the handle method to implement this? Hope you know what I mean :)
Cheers
To avoid specifying the tag and handler in the dict, you could just use a handler class with methods named to match the type, e.g.:
class MyHandler(object):
    def handle_extractTitle(self, dom):
        # do something

    def handle_extractMetaTags(self, dom):
        # do something

    def handle(self, type, dom):
        func = getattr(self, 'handle_%s' % type, None)
        if func is None:
            raise Exception("No handler for type %r" % type)
        return func(dom)
Usage:
handler = MyHandler()
handler.handle('extractTitle', dom)
Update:
When you have multiple arguments, just change the handle function to take those arguments and pass them through to the function. If you want to make it more generic (so you don't have to change both the handler functions and the handle method when you change the argument signature), you can use the *args and **kwargs syntax to pass through all received arguments. The handle method then becomes:
def handle(self, type, *args, **kwargs):
    func = getattr(self, 'handle_%s' % type, None)
    if func is None:
        raise Exception("No handler for type %r" % type)
    return func(*args, **kwargs)
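Usage then mirrors the single-argument case, with the extra arguments flowing straight through (assuming a handle_extractTag method exists, as in your example):
handler = MyHandler()
handler.handle('extractTag', dom, anotherObject)  # calls handle_extractTag(dom, anotherObject)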
With your code as posted, the dictionary literal is rebuilt (and thrown away) every time that line runs. Binding it to a name builds it once:
handlers = {
    'extractTitle': extractTitle,
    'extractMetaTags': extractMetaTags
}

handlers[type](dom)
This would work like your original if code.
It depends on how many if statements we're talking about; if it's a very small number, they will be more efficient than a dictionary lookup.
However, as always, I strongly advise you to do whatever makes your code look cleaner until experience and profiling tell you that a specific block of code needs to be optimized.
Your use of the dictionary is not quite correct. In your implementation, all the methods will be called and all the useless results discarded. What is usually done is more something like:
switch_dict = {'extractTitle': extractTitle,
               'extractMetaTags': extractMetaTags}

switch_dict[type](dom)
And that way is faster and more extensible if you have a large (or variable) number of items.
The efficiency question is barely relevant. The dictionary lookup is done with a simple hashing technique; the if statements have to be evaluated one at a time. Dictionaries tend to be quicker.
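If you would rather measure than guess, timeit makes the comparison straightforward; a rough sketch whose absolute numbers will vary with the number of branches and which key is hit:
import timeit

setup = """
def extractTitle(dom): pass
def extractMetaTags(dom): pass
handlers = {'extractTitle': extractTitle, 'extractMetaTags': extractMetaTags}
type = 'extractMetaTags'
dom = None
"""

print(timeit.timeit("handlers[type](dom)", setup=setup))
print(timeit.timeit(
    "extractTitle(dom) if type == 'extractTitle' else extractMetaTags(dom)",
    setup=setup))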
I suggest that you actually have polymorphic objects that do extractions from the DOM.
It's not clear how type gets set, but it sure looks like it might be a family of related objects, not a simple string.
class ExtractTitle(object):
    def process(self, dom):
        return something

class ExtractMetaTags(object):
    def process(self, dom):
        return something
Instead of setting type="extractTitle", you'd do this.
type = ExtractTitle()  # or ExtractMetaTags() or ExtractWhatever()
type.process(dom)
Then, you wouldn't be building this particular dictionary or if-statement.
