Handling a function argument with a decorator - python

At the core, what I'm trying to do is take a number of functions that look like this undecorated validation function:
def f(k: bool):
    def g(n):
        # check that n is valid
        return n
    return g
And make them look like this decorated validation function:
@k
def f():
    def g(n):
        # check that n is valid
        return n
    return g
The idea is that k describes the same functionality across all of the implementing functions.
Specifically, these functions are all returning 'validation' functions for use with the voluptuous validation framework. So all the functions of type f() are returning a function that is later executed by Schema(). k is actually allow_none, which is to say a flag that determines if a None value is ok. A very simple example might be this sample use code:
x = "Some input value."
y = None
input_validator = Schema(f(allow_none=True))
x = input_validator(x) # succeeds, returning x
y = input_validator(y) # succeeds, returning None
input_validator_no_none = Schema(f(allow_none=False))
x = input_validator(x) # succeeds, returning x
y = input_validator(y) # raises an Invalid
Without changing the sample use code I am attempting to achieve the same result by changing the undecorated validation functions to decorated validation functions. To give a concrete example, changing this:
def valid_identifier(allow_none: bool=True):
    min_range = Range(min=1)
    validator = Any(All(int, min_range), All(Coerce(int), min_range))
    return Any(validator, None) if allow_none else validator
To this:
@allow_none(default=True)
def valid_identifier():
    min_range = Range(min=1)
    return Any(All(int, min_range), All(Coerce(int), min_range))
The function returned from these two should be equivalent.
What I've tried to write is this, utilizing the decorator library:
from decorator import decorator

@decorator
def allow_none(default: bool=True):
    def decorate_validator(wrapped_validator, allow_none: bool=default):
        @wraps(wrapped_validator)
        def validator_allowing_none(*args, **kwargs):
            if allow_none:
                return Any(None, wrapped_validator)
            else:
                return wrapped_validator(*args, **kwargs)
        return validator_allowing_none
    return decorate_validator
And I have a unittest.TestCase in order to test if this works as expected:
@allow_none()
def test_wrapped_func():
    return Schema(str)

class TestAllowNone(unittest.TestCase):
    def test_allow_none__success(self):
        test_string = "blah"
        validation_function = test_wrapped_func(allow_none=False)
        self.assertEqual(test_string, validation_function(test_string))
        self.assertEqual(None, validation_function(None))
But my test returns the following failure:
    def validate_callable(path, data):
        try:
>           return schema(data)
E           TypeError: test_wrapped_func() takes 0 positional arguments but 1 was given
I tried debugging this, but couldn't get the debugger to actually enter the decoration. I suspect that because of naming issues, such as raised in this (very lengthy) blog post series, test_wrapped_func isn't getting its argument list properly set, and so the decorator is never even executed, but it may also be something else entirely.
I tried some other variations. By removing the function parentheses from @allow_none:
@allow_none
def test_wrapped_func():
    return Schema(str)
I get a different error:
> validation_function = test_wrapped_func(allow_none=False)
E TypeError: test_wrapped_func() got an unexpected keyword argument 'allow_none'
Dropping the @decorator fails with:
> validation_function = test_wrapped_func(allow_none=False)
E TypeError: decorate_validator() missing 1 required positional argument: 'wrapped_validator'
Which makes sense because @allow_none takes an argument, and so the parentheses would logically be needed. Replacing them gives the original error.
Decorators are subtle, and I'm clearly missing something here. This is similar to currying a function, but it's not quite working. What am I missing about how this should be implemented?

I think you are putting your allow_none=default argument at the wrong nesting level. It should be on the innermost function (the wrapper), rather than the decorator (the middle level).
Try something like this:
def allow_none(default=True):  # this is the decorator factory
    def decorator(validator):  # this is the decorator
        @wraps(validator)
        def wrapper(*args, allow_none=default, **kwargs):  # this is the wrapper
            if allow_none:
                return Any(None, validator)
            else:
                return validator(*args, **kwargs)
        return wrapper
    return decorator
If you don't need the default to be settable, you can get rid of the outermost layer of nesting and just make the default value a constant in the wrapper function (or omit it if your callers will always pass a value). Note that as I wrote it above, the allow_none argument to the wrapper is a keyword-only argument. If you want to pass it as a positional parameter, you can move it ahead of *args, but that requires that it be the first positional argument, which may not be desirable from an API standpoint. More sophisticated solutions are probably possible, but overkill for this answer.
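As a small illustration of that keyword-only behaviour (the names below are only for demonstration, not part of the code above): an argument declared after *args can only be supplied by keyword, and a value passed positionally in that slot simply lands in *args.
def wrapper(*args, allow_none=True, **kwargs):
    # return the flag and the captured positional arguments
    return allow_none, args

print(wrapper(1, 2, allow_none=False))  # (False, (1, 2))
print(wrapper(1, 2, False))             # (True, (1, 2, False)) - False is swallowed by *args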

Related

Python: Decorator that reduces number of parameters in a function by fixing others

Let's say that I have the function
def add(a, b, c):
    return a + b + c
I want a decorator that fixes the value of b, say to 5, and returns a function with only two parameters, a and c:
def add5(a, c):
    return a + c + 5
The function add5 should not have any other parameter. I'm not looking to solve this with a default parameter for b.
You can use functools.partial:
functools.partial(func, /, *args, **keywords)
    Return a new partial object which when called will behave like func called with the positional arguments args and keyword arguments keywords.
from functools import partial

def add(a, b, c):
    return a + b + c
If you want to give a fixed value to the first positional argument, you can do
add5 = partial(add, 5)
print(add5(1, 2))
# 8
As the first positional argument (a) will be replaced by 5, you can't do:
print(add5(a=3, b=4))
# TypeError: add() got multiple values for argument 'a'
If you want to control which parameter you fix, use keyword arguments:
add5 = partial(add, b=5)
print(add5(a=1, c=2))
# 8
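If you would rather have this packaged as a decorator, as the question title suggests, a rough sketch built on functools.partial might look like this (fix_param is a hypothetical name, not part of functools):
from functools import partial

def fix_param(**fixed):
    # return a decorator that replaces the function with a partial
    # in which the given keyword arguments are already bound
    def decorator(func):
        return partial(func, **fixed)
    return decorator

@fix_param(b=5)
def add(a, b, c):
    return a + b + c

print(add(a=1, c=2))  # 8 - b is already fixed to 5
As with partial(add, b=5), the remaining arguments are safest passed by keyword; add(1, 2) would collide with the bound b.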
In Python, functions are first-class objects, which means that:
Functions are objects; they can be referenced, assigned to a variable, and returned from other functions.
Functions can be defined inside another function and can also be passed as an argument to another function.
Decorators are a very powerful and useful tool in Python, since they allow programmers to modify the behavior of a function or class. Decorators let us wrap another function in order to extend the behavior of the wrapped function without permanently modifying it.
With decorators, a function is passed as an argument to another function and then called inside a wrapper function.
in your case:
def my_custom_decorator(f):
    def outer_function(*args):
        res = f(*args)
        return res + 5
    return outer_function

@my_custom_decorator
def A_and_C(a, c):
    return a + c

print(A_and_C(2, 3))
You can do it by
def add5(*args):
    return sum(args) + 5

print(add5(1, 2))
This will sum all the arguments that you pass to the function and add 5 to the total.
Output
8

Closure: a function that returns the value of its previous call

I'm trying to build a function that returns the value of its previous call, using a closure. The first time the function is called, it will return None. I'm not sure how to update last_in from one call to another.
def last_in(x):
    last_in = [None]
    def get():
        temp = last_in[0]
        last_in[0] = x
        # print(last_in)
        return temp
    return get()
For example, print(last_in(1),last_in(2),last_in(3)) should print: None 1 2
The problem with your approach is that whenever you call last_in, i.e. the "outer" function, the previous value stored in last_in (the array, not the function) is reset to None. Instead, the outer function should be called only once so that the value is not reset each time you call it.
Not sure what you need this for, but I think it would make sense to create a decorator function for this, i.e. a function modifying an existing function. This way, all the storing-and-retrieving-the-last-result can be done in the decorator without cluttering the actual function. The outer function (the decorator) is called only once, and the original function is replaced with the decorated version that will correctly retrieve the stored value.
def return_previous(f):
    f.last_result = None
    def _f(*args, **kwargs):
        res = f.last_result
        f.last_result = f(*args, **kwargs)
        return res
    return _f
@return_previous
def some_function(x):
    return x**2

print(some_function(1), some_function(2), some_function(3))
# None 1 4
I like the solution that @tobias_k provides, but here is another alternative which conforms to the current organization/structure of your code.
def last_in(x):
    def get():
        temp = last_in.__dict__.get('prev', None)
        last_in.__dict__['prev'] = x
        return temp
    return get()
print(last_in(1),last_in(2),last_in(3))
None 1 2
This is a slight deviation from the request since it requires a second keyword argument but you could take advantage of the fact that default arguments are only set once (see here) to do something like this:
def last_in(x, __last=[None]):
    last = __last[0]
    __last[0] = x
    return last
__last is set once when the function is declared and since it is mutable, you can update it at each function call and the updated value will persist between calls.
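For completeness, a quick check of this default-argument version, with the same expected output as in the question:
print(last_in(1), last_in(2), last_in(3))
# None 1 2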

python try except as a function to evaluate expressions

I have tried creating a function that tries an expression and returns zero if an error is raised.
def try_or_zero(exp):
    try:
        exp
        return exp
    except:
        return 0
Which obviously doesn't work. It seems the problem is that Python doesn't have any form of lazy evaluation, so the expression is evaluated before it's passed to the function; it raises the error before it gets into the function and therefore never passes through the try logic.
Does anyone know if this can be done in Python?
Cheers
It seems the problem is that python doesn't have any form of lazy evaluation
Err... yes it does, but possibly not in the form you expect. Function arguments ARE indeed eval'd before being passed to the function, so
try_or_zero(foo.bar())
will indeed be executed as:
param = foo.bar()
try_or_zero(param)
Now python functions are plain objects (they can be used as variables, passed around as arguments to functions etc), and they are only invoked when applying the call operator (the parens, with or without arguments) so you can pass a function to try_or_zero and let try_or_zero call the function:
def try_or_zero(func):
    try:
        return func()
    except Exception as e:
        return 0
Now you're going to object that 1) this will not work if the function expects arguments and 2) having to write a function just for this is a PITA - and both objections are valid. Fortunately, Python also has a shortcut for creating simple anonymous functions consisting of a single (even if arbitrarily complex) expression: lambda. Also, Python functions (including "lambda functions" - which are, technically, plain functions) are closures - they capture the context in which they're defined - so it's quite easy to wrap all this together:
a = 42
b = "c"

def add(x, y):
    return x + y

result = try_or_zero(lambda: add(a, b))
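With the values above, add(42, "c") raises a TypeError inside the lambda, which try_or_zero turns into 0:
print(result)  # 0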
A side note about exception handling:
First, don't use a bare except; at least catch Exception (otherwise you might prevent some exceptions - like SystemExit - from working as expected).
Also, preferably only catch the exact exceptions you expect at a given point. In your case, you may want to pass a tuple of exceptions that you want to ignore, i.e.:
def try_or_zero(func, *exceptions):
    if not exceptions:
        exceptions = (Exception,)
    try:
        return func()
    except exceptions as e:
        return 0
a = 42
b = "c"

def add(x, y):
    return x + y

result = try_or_zero(lambda: add(a, b), TypeError)
which will prevent your code from masking unexpected errors.
And finally: you may also want to add support for a return value other than zero in the case of an exception (not all expressions are supposed to return an int ):
# XXX : python3 only, python2 doesn't accept
# keyword args after *args
def try_or(func, *exceptions, default=0):
    if not exceptions:
        exceptions = (Exception,)
    try:
        return func()
    except exceptions as e:
        return default
# adding lists is legit too,
# so here you may want an empty list as the return value
# instead
a = [1, 2, 3]
# but only to lists
b = ""
result = try_or(lambda: a + b, TypeError, default=[])
No need to bother with exec and stuff, use the fact that python functions are objects and thus can be passed as arguments
def try_or_zero(exp):
    try:
        return exp()
    except:
        return 0
And just call try_or_zero(my_awesome_func) (without the () for your method)
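For example, with a hypothetical my_awesome_func that always fails:
def my_awesome_func():
    return 1 / 0  # raises ZeroDivisionError

print(try_or_zero(my_awesome_func))  # 0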
Pass the argument to the function as a str and do the exec inside the function:
def try_or_zero(exp):
    try:
        exec(exp)
        return exp
    except:
        return 0
So your call to the function will look like this:
try_or_zero('1==2')
You can achieve this by enveloping your expressions in a function.
For example:
def ErrorTest():
    # the expression you want to try
    raise Exception
Also, your try function should look like this:
def try_or_zero(exp):
    try:
        exp()  # note the parentheses
    except:
        return 0
And pass ErrorTest to it:
try_or_zero(ErrorTest)
Output: 0
You can also do it by using the eval() function, but you will have to put your code in a string:
def try_or_zero(exp):
    try:
        eval(exp)  # exp must be a string containing an expression, for example '1/0'
    except:
        return 0

Dynamic generation of Python function with given parameter names

I would like to create functions in my Python3 code, making use of data supplied during run-time. I'm stuck on how I can write a function along the lines of
def foo(varname1, varname2):
    return varname1 + varname2
where the strings varname1 and varname2 that give the parameter names are specified as arguments to some constructor function, e.g.:
def makeNewFooFunc(varname1, varname2):
    # do magic
    return foo

fooFunc = makeNewFooFunc('first', 'second')
print(fooFunc(first=1, second=2))
# should return 3
What would the # do magic step be? Is this a job for eval, or is there an alternative?
You don't need to write a function like that.
Just use **kwargs:
def foo_func(**kwargs):
    return sum(kwargs.values())

foo_func(any_name=1, any_name_2=2)
But if you still need to do what you want, you can try:
def make_new_func(var_name_1, var_name_2):
    def foo(**kwargs):
        # make sure kwargs contains only the expected parameters
        assert set(kwargs) == {var_name_1, var_name_2}
        return kwargs[var_name_1] + kwargs[var_name_2]
    return foo

foo_func = make_new_func('a', 'b')
foo_func(a=1, b=2)
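If you really do want a function whose parameter names are chosen at run time, one common approach is to build the source as a string and exec it. A minimal sketch, with illustrative names only, and assuming the names are trusted, valid identifiers:
def make_new_foo_func(varname1, varname2):
    # build the function source with the requested parameter names
    src = f"def foo({varname1}, {varname2}):\n    return {varname1} + {varname2}\n"
    namespace = {}
    exec(src, namespace)
    return namespace["foo"]

foo_func = make_new_foo_func("first", "second")
print(foo_func(first=1, second=2))  # 3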

TypeError: object.__new__() takes no parameters when using inheritance

Hey,
I'm having a hard time implementing something that I guess shouldn't be hard. I've been reading many posts and I still can't figure it out, though it has probably been answered and I might simply not understand the answer :/
So, I have a class defining an algorithm, in the file three_dpll.py in logics/, plus a couple of helper functions:
class three_dpll(object):
    ...
    def __extend__(self, symbol, value, mod):
        """ __extend__(symbol, value) - extends the model
        ...
        """

    def three_dpll(self, formula, symbols, mod):
        """ three_dpll(formula, symbols, mod) - calculates 3-DPLL \n
        NOTE: This algorithm should not be overwritten in any derived class!!"""
        ...
        # find unit clause
        curr_data = self.__find_unit_clause__(formula, mod)
        current_symbol = curr_data[0]
        current_symbol_set.add(current_symbol)
        current_value = curr_data[1]
        if current_symbol != None:
            return three_dpll(formula, symbols - current_symbol_set,
                              self.__extend__(current_symbol, current_value, mod))
        ...
and a logic class that should implement the algorithm for a certain logic, where I might redefine certain methods from logics/three_dpll.py (or any other helper function, for that matter):
from three_dpll import three_dpll

class kleene_logic(three_dpll):
    """ This is the definition of Kleene logic """
    pass
and now calling it from a function in another file:
def satisfiable(logic, formula):
    """ satisfiable - \
    takes a logic and a set of formula and returns true or false"""
    # model is empty dictionary
    model = {}
    # symbols is a set
    symbols = set()
    my_logic = "logics." + logic  # logic is passed as string to the script
    __import__(my_logic, fromlist=['three_dpll'])
    log = modules[my_logic]
    used_logic = log.kleene_logic()
    for clause in formula:
        ite = iter(clause)
        for literal in ite:
            symbols.add(abs(literal))
    try:
        return used_logic.three_dpll(formula, symbols, model)
    except FormulaValid.FormulaValid:
        return True
The error I get is:
in three_dpll
    self.__extend__(current_symbol, current_value, mod))
TypeError: object.__new__() takes no parameters
Any ideas on how to fix this?
Your class three_dpll also has a method three_dpll. When you
return three_dpll(...)
you are creating an instance of that class (instead of calling the method, which is probably what you wanted to do). The class has no __init__() function that could handle the arguments that you are giving. That is what the error message tells you.
What you want is probably something like
return self.three_dpll(...)
which would call the method. Not sure if it solves your problem, but that should explain something.
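A stripped-down sketch of the same name clash, just to illustrate (Demo is a made-up class):
class Demo(object):
    def Demo(self, n):                 # the method shares the class name
        if n == 0:
            return 0
        # "return Demo(n - 1)" would resolve the bare name to the *class* Demo,
        # so Python would try to build a new instance with an argument the
        # default constructor doesn't accept - the TypeError in the question
        return 1 + self.Demo(n - 1)    # call the method on the instance instead

print(Demo().Demo(3))  # 3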
