I have a functions module with several functions that all share some common inputs, plus other inputs that are particular to each function, e.g.
def func_a(time_series, window='1D'):
def func_b(time_series, window='1D', availability_history=None):
def func_c(time_series, window='1D', max_lag=25, use_probability='T'):
I am trying to run these functions in a loop as follows:
func_list = [func_a, func_b, func_c]
windows = ['1D', '5D']
params = ['', hist, (25, 'T')]
for i_func, func in enumerate(func_list):
    class_obj = class_X(A, B, func)
    for window in windows:
        args = (window, params[i_func])  # params is a list or tuple of other params for funcs
        class_obj.run_func(args)
And in another module
class class_X(object):
    def __init__(self, a, b, func_to_run):
        self.a = a
        self.ts = b
        self.method = func_to_run

    def generate_output(self, *args):
        return self.method(self.ts, args)  # time series is common and fixed for all, other params differ or change
The above code doesn't work; I think the functions being called would need to be changed to use *args rather than having fixed, explicitly defined params.
I think *args is meant for functions where the number of input params is not known, but I am trying to use it in a case where the number of input params is known, but varies across the different functions in the loop.
Is there any fix for this where I don't have to modify the functions module and can still pass all the required params as a single object (e.g. list or tuple)?
EDIT-
macromoonshine's answer states I can use kwargs like this:
def generate_output(self, **kwargs):
    return self.method(self.ts, **kwargs)
With this modification you can call generate_output() as follows:
x.generate_output(window='1D', max_lag=25, use_probability='T')
where x is an instance of your class_X.
Can this be enhanced so that I can pass the args other than time_series and window as a lookup value in a loop, e.g.
x.generate_output(window='1D', params[iloop])
where
params[iloop] = max_lag=25, use_probability='T'
I tried doing this:
params = (30, "F")
x.generate_output(window, *params)
but get an error:
TypeError: generate_output() takes 1 positional argument but 4 were given
You can use **kwargs instead, which allows arbitrary keyword parameters. This should be easier than changing each function. You just have to modify your generate_output() method in your code:
def generate_output(self, **kwargs):
    return self.method(self.ts, **kwargs)
With this modification you can call generate_output() as follows:
x.generate_output(window='1D', max_lag=25, use_probability='T')
where x is an instance of your class_X (the time series is already stored on the instance, so it is not passed again).
If you want to pass the kwargs from a dict instead of named parameters, you have to prefix the dictionary variable with **. The adapted code should look like this:
params = [{'max_lag': 35, 'use_probability': 'F'}, ... ]
TS = [1, 2, 3, 4]
for i_func, func in enumerate(func_list):
    class_obj = class_X(A, TS, func)   # A as in the question; TS is the shared time series
    for window in windows:
        req_args = dict(params[i_func])
        req_args['window'] = window
        class_obj.generate_output(**req_args)
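If you prefer to keep supplying the per-function extras positionally, as in the original params list, a small variation of generate_output that unpacks both positional and keyword extras also works without touching the functions module. This is only a sketch reusing the question's names (func_a, func_b, func_c, class_X, A, hist, time_series and windows are assumed to be defined as described above):
class class_X(object):
    def __init__(self, a, b, func_to_run):
        self.a = a
        self.ts = b
        self.method = func_to_run

    def generate_output(self, *args, **kwargs):
        # unpack whatever extras were supplied, positionally and/or by keyword,
        # after the fixed time series
        return self.method(self.ts, *args, **kwargs)

per_func_args = [(), (hist,), (25, 'T')]   # per-function extras; hist assumed defined elsewhere
for i_func, func in enumerate([func_a, func_b, func_c]):
    obj = class_X(A, time_series, func)    # A and time_series as in the question
    for window in windows:
        obj.generate_output(window, *per_func_args[i_func])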
The task is to write a class decorator, which reads a JSON file and makes its key/values to become properties of the class.
But one of conditions is that there has to be the ability to pass values manually (by creating a class object) as well.
I almost did it. There's just a tiny problem. The program reads data from JSON file and passes them successfully to the class. But when passing values manually during creation of an object of the class, values don't change and they are still being taken from JSON.
The problem only disappears when passing the values as keyword arguments.
room = Room(1, 1) # Doesn't work
room = Room(tables=1, chairs=1) # Does work
Since the tests pass the arguments as plain positional numbers, I have to make it work with positional numbers, not keyword arguments.
Here's the code.
from json import load
def json_read_data(file):
def decorate(cls):
def decorated(*args, **kwargs):
if kwargs == {}:
with open(file) as f:
params = {}
for key, value in load(f).items():
params[key] = value
return cls(**params)
else:
return cls(*args, **kwargs)
return decorated
return decorate
@json_read_data('furniture.json')
class Room:
    def __init__(self, tables=None, chairs=None):
        self.tables = tables
        self.chairs = chairs

    def is_it_enough(self):
        return self.chairs * 0.5 - self.tables > 0.4
kitchen = Room() # This is passing values from JSON file
print(kitchen.__dict__) # Prints {'tables': 2, 'chairs': 5}
room = Room(tables=1, chairs=1) # This is passing values manually
print(room.__dict__) # Prints {'tables': 1, 'chairs': 1}
'''
JSON file:
{
    "tables": 2,
    "chairs": 5
}
'''
But if we change to room = Room(1, 1), print(room.__dict__) prints {'tables': 2, 'chairs': 5} again. Please help me solve this problem!
You need to add your arguments to the decorator. Remember that your decorator is called first and then it calls the decorated function.
You could declare your decorator as def json_read_data(file, *args):, and the subsequent calls to cls() would then have to be adapted to accept them. The second one already does; the first one needs modification.
It seems this edit really worked:
def decorated(*args, **kwargs):
    if not args and not kwargs:
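For completeness, here is a sketch of the decorator with that condition in place (same structure as the code above; furniture.json is assumed to contain the keys shown earlier):
from json import load

def json_read_data(file):
    def decorate(cls):
        def decorated(*args, **kwargs):
            # fall back to the JSON file only when no values were passed at all,
            # neither positionally nor by keyword
            if not args and not kwargs:
                with open(file) as f:
                    return cls(**load(f))
            return cls(*args, **kwargs)
        return decorated
    return decorate

@json_read_data('furniture.json')
class Room:
    def __init__(self, tables=None, chairs=None):
        self.tables = tables
        self.chairs = chairs

room = Room(1, 1)   # now uses the manual values: {'tables': 1, 'chairs': 1}
kitchen = Room()    # still falls back to furniture.json: {'tables': 2, 'chairs': 5}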
I've been tinkering with decorators lately and (as an academic exercise) tried to implement a decorator that allows for partial application and/or currying of the decorated function. Furthermore, this decorator should be optionally parameterizable and take a kwarg asap which determines whether the decorated function should return as soon as all mandatory args/kwargs are acquired (default: asap=True), or whether it should keep caching args/kwargs until the function is called without arguments (asap=False).
Here is the decorator I came up with:
import functools

def partialcurry(_f=None, *, asap: bool = True):
    """ Decorator; optionally parameterizable; allows partial application /and/or/ currying
    of the decorated function F. Decorated F fires as soon as all mandatory args and kwargs
    are supplied, or, if ASAP=False, collects args and kwargs and fires only if F is called
    without args/kwargs. """
    def _decor(f, *args, **kwargs):
        _all_args, _all_kwargs = list(args), kwargs
        @functools.wraps(f)
        def _wrapper(*more_args, **more_kwargs):
            nonlocal _all_args, _all_kwargs  # needed for resetting, not mutating
            _all_args.extend(more_args)
            _all_kwargs.update(more_kwargs)
            if asap:
                try:
                    result = f(*_all_args, **_all_kwargs)
                    # reset closured args/kwargs caches
                    _all_args, _all_kwargs = list(), dict()
                except TypeError:
                    result = _wrapper
                return result
            elif not asap:
                if more_args or more_kwargs:
                    return _wrapper
                else:
                    result = f(*_all_args, **_all_kwargs)
                    # again, reset closured args/kwargs caches
                    _all_args, _all_kwargs = list(), dict()
                    return result
        return _wrapper

    if _f is None:
        return _decor
    return _decor(_f)
### examples
@partialcurry
def fun(x, y, z=3):
    return x, y, z
print(fun(1)) # preloaded function object
print(fun(1, 2)) # all mandatory args supplied; (1,1,2); reset
print(fun(1)(2)) # all mandatory args supplied; (1,2,3); reset
print()
@partialcurry(asap=False)
def fun2(x, y, z=3):
    return x, y, z
print(fun2(1)(2, 3)) # all mandatory args supplied; preloaded function object
print(fun2()) # fire + reset
print(fun2(1)(2)) # all mandatory args supplied; preloaded function object
print(fun2(4)()) # load one more and fire + reset
I am sure this can be generally improved (implementing it as a class would be a good idea, for example), and any suggestions are much appreciated. My main question, however, is how to determine whether all mandatory args/kwargs have been supplied, because checking for a TypeError feels too generic and could catch all kinds of unrelated TypeErrors. One idea would be to define a helper function that calculates the number of mandatory arguments, maybe something like this:
def _required_args_cnt(f):
    """ Auxiliary function: calculate the number of /required/ args of a function F. """
    all_args_cnt = f.__code__.co_argcount + f.__code__.co_kwonlyargcount
    def_args_cnt = len(f.__defaults__) if f.__defaults__ else 0
    return all_args_cnt - def_args_cnt
Obviously unsatisfactory..
Any suggestions are much appreciated!
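One less fragile check, not in the original post, would be to ask the function's signature whether the collected arguments can actually be bound, instead of catching the TypeError raised by the call itself. A sketch using the standard library's inspect module:
import inspect

def _ready_to_fire(f, args, kwargs):
    """Return True if args/kwargs satisfy all required parameters of f."""
    try:
        # bind() raises TypeError if a required parameter is still missing,
        # without actually calling f
        inspect.signature(f).bind(*args, **kwargs)
        return True
    except TypeError:
        return False
Inside _wrapper, the asap branch could then call f only when _ready_to_fire(f, _all_args, _all_kwargs) is true, so a TypeError raised from f's own body is no longer mistaken for missing arguments.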
I have a working solution for what I am trying to achieve, but I am looking for a simpler way to do it.
I have a class that encapsulates a function, and a user can pass a function (lambda expression) to it. Those functions always take one input data argument and an arbitrary number of user-defined custom arguments:
c.set_func(lambda x, offset, mul: mul*(x**2 + offset), offset=3, mul=1)
The user can then call a class method that will run the function with a predefined input and the currently set custom-arguments. The user also has the option to change the custom-arguments by just changing attributes of the class.
Here is my code:
from functools import partial

class C:
    def __init__(self):
        self.data = 10  # just an example
        self.func = None
        self.arg_keys = []

    def set_func(self, func, **kwargs):
        self.func = func
        for key, value in kwargs.iteritems():
            # add arguments to __dict__
            self.__dict__[key] = value
            self.arg_keys.append(key)
        # store a list of the argument names
        self.arg_keys = list(set(self.arg_keys))

    def run_function(self):
        # get all arguments from __dict__ that are in the stored argument-list
        argdict = {key: self.__dict__[key] for key in self.arg_keys}
        f = partial(self.func, **argdict)
        return f(self.data)

if __name__ == '__main__':
    # Here is a testrun:
    c = C()
    c.set_func(lambda x, offset, mul: mul*(x**2 + offset), offset=3, mul=1)
    print c.run_function()
    # -> 103
    c.offset = 5
    print c.run_function()
    # -> 105
    c.mul = -1
    print c.run_function()
    # -> -105
The important parts are:
that the user can initially set the function with any number of arguments
that the values of those arguments are stored until changed
Is there any builtin or otherwise simpler solution to this?
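One lighter variant, offered only as a sketch and assuming Python 3, keeps the custom arguments together in a plain dict instead of spreading them over __dict__, which removes the arg_keys bookkeeping while preserving the "change one argument later" workflow:
class C:
    def __init__(self):
        self.data = 10           # just an example
        self.func = None
        self.func_kwargs = {}

    def set_func(self, func, **kwargs):
        self.func = func
        self.func_kwargs = kwargs    # keep all custom arguments together

    def run_function(self):
        return self.func(self.data, **self.func_kwargs)

c = C()
c.set_func(lambda x, offset, mul: mul * (x ** 2 + offset), offset=3, mul=1)
print(c.run_function())          # -> 103
c.func_kwargs['offset'] = 5      # change a single argument later
print(c.run_function())          # -> 105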
Basically I want to do something like this:
How can I hook a function in a python module?
but I want to call the old function after my own code.
like
import whatever

oldfunc = whatever.this_is_a_function

def this_is_a_function(parameter):
    # my own code here
    # and call original function back
    oldfunc(parameter)

whatever.this_is_a_function = this_is_a_function
Is this possible?
I tried copy.copy and copy.deepcopy on the original function, but it didn't work.
Something like this? It avoids using globals, which is generally a good thing.
import whatever
import functools

def prefix_function(function, prefunction):
    @functools.wraps(function)
    def run(*args, **kwargs):
        prefunction(*args, **kwargs)
        return function(*args, **kwargs)
    return run

def this_is_a_function(parameter):
    pass  # Your own code here that will be run before

whatever.this_is_a_function = prefix_function(
    whatever.this_is_a_function, this_is_a_function)
prefix_function is a function that takes two functions: function and prefunction. It returns a function that takes any parameters, and calls prefunction followed by function with the same parameters. The prefix_function function works for any callable, so you only need to program the prefixing code once for any other hooking you might need to do.
@functools.wraps makes it so that the docstring and name of the returned wrapper function are the same as those of the wrapped function.
If you need this_is_a_function to call the old whatever.this_is_a_function with arguments different than what was passed to it, you could do something like this:
import whatever
import functools

def wrap_function(oldfunction, newfunction):
    @functools.wraps(oldfunction)
    def run(*args, **kwargs):
        return newfunction(oldfunction, *args, **kwargs)
    return run

def this_is_a_function(oldfunc, parameter):
    # Do some processing or something to customize the parameters to pass
    newparams = parameter * 2  # Example of a change to newparams
    return oldfunc(newparams)

whatever.this_is_a_function = wrap_function(
    whatever.this_is_a_function, this_is_a_function)
One problem: if whatever is a pure C module, it's typically impossible (or very difficult) to change its internals in the first place.
So, here's an example of monkey-patching the time function from the time module.
import time

old_time = time.time

def new_time():
    print('It is today... but more specifically the time is:')
    return old_time()

time.time = new_time
print(time.time())
# Output:
# It is today... but more specifically the time is:
# 1456954003.2
However, if you are trying to do this to C code, you will most likely get an error like cannot overwrite attribute. In that case, you probably want to subclass the C module.
You may want to take a look at this question.
This is the perfect time to tout my super-simplistic Hooker
def hook(hookfunc, oldfunc):
    def foo(*args, **kwargs):
        hookfunc(*args, **kwargs)
        return oldfunc(*args, **kwargs)
    return foo
Incredibly simple. It will return a function that first runs the desired hook function (with the same parameters, mind you) and then runs the original function that you are hooking, returning that original value. This also works to overwrite a class method. Say we have a static method in a class.
class Foo:
    @staticmethod
    def bar(data):
        for datum in data:
            print(datum, end=" ")  # assuming python3 for this
        print()
But we want to print the length of the data before we print out its elements:
def myNewFunction(data):
    print("The length is {}.".format(len(data)))
And now we simply hook the function:
Foo.bar(["a", "b", "c"])
# => a b c
Foo.bar = hook(Foo.bar, myNewFunction)
Foo.bar(["x", "y", "z"])
# => The length is 3.
# => x y z
Actually, you can replace the target function's func_code. The example below shows how:
# a normal function
def old_func():
    print "i am old"

# a class method
class A(object):
    def old_method(self):
        print "i am old_method"

# a closure function
def make_closure(freevar1, freevar2):
    def wrapper():
        print "i am old_clofunc, freevars:", freevar1, freevar2
    return wrapper

old_clofunc = make_closure('fv1', 'fv2')

# ===============================================

# the new function
def new_func(*args):
    print "i am new, args:", args

# the new closure function
def make_closure2(freevar1, freevar2):
    def wrapper():
        print "i am new_clofunc, freevars:", freevar1, freevar2
    return wrapper

new_clofunc = make_closure2('fv1', 'fv2')

# ===============================================

# hook normal function
old_func.func_code = new_func.func_code
# hook class method
A.old_method.im_func.func_code = new_func.func_code
# hook closure function
# Note: the closure function's `co_freevars` count should be equal
old_clofunc.func_code = new_clofunc.func_code

# ===============================================

# call the old
old_func()
A().old_method()
old_clofunc()
output:
i am new, args: ()
i am new, args: (<__main__.A object at 0x0000000004A5AC50>,)
i am new_clofunc, freevars: fv1 fv2
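A note not in the original answer: the example above is Python 2. In Python 3 the code object lives in __code__ and methods looked up on the class are plain functions, so the equivalent assignments would look roughly like this (the print statements would also need to become print() calls):
# Python 3 equivalents of the hooks above
old_func.__code__ = new_func.__code__        # normal function
A.old_method.__code__ = new_func.__code__    # class method; no im_func in Python 3
old_clofunc.__code__ = new_clofunc.__code__  # closure; free-variable counts must still match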
I have created a class that can take a function with a set of arguments. I would like to run the passed function every time the event handler signals.
I am attaching my code below; it runs when I pass fun2, which has no arguments, but not with fun1. Any suggestions on how I can make the code below work with both fun1 and fun2? If I omit the return statement from fun1, I get an error that 'str' object is not callable.
>>> TimerTest.main()
function 1. this function does task1
my function from init from function1
my function in start of runTimer
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Program Files (x86)\IronPython 2.7\TimerTest.py", line 57, in main
  File "C:\Program Files (x86)\IronPython 2.7\TimerTest.py", line 25, in runTimer
TypeError: str is not callable
import System
from System.Timers import (Timer, ElapsedEventArgs)

class timerTest:
    def __init__(self, interval, autoreset, fun):
        self.Timer = Timer()
        self.Timer.Interval = interval
        self.Timer.AutoReset = autoreset
        self.Timer.Enabled = True
        self.myfunction = fun

    def runTimer(self):
        print 'my function in start of runTimer', self.myfunction()
        self.Timer.Start()

        def OnTimedEvent(s, e):
            print "The Elapsed event was raised at ", e.SignalTime
            print 'printing myfunction...', self.myfunction()
            self.myfunction()

        self.Timer.Elapsed += OnTimedEvent

    def stopTimer(self):
        self.Timer.Stop()
        self.Timer.Dispose = True

def fun1(a, b):
    print 'function 1. this function does task1'
    return 'from function1'

def fun2():
    print 'Function 2. This function does something'
    print 'Test 1...2...3...'
    return 'From function 2'

def main():
    a = timerTest(1000, True, fun1(10, 20))
    a.runTimer()
    b = timerTest(3000, True, fun2)
    b.runTimer()

if __name__ == '__main__':
    main()
I am learning Python and I apologize if my questions are basic.
To change the interval, I stop the timer using a stopTimer method I added to the timerTest class:
def stopTimer(self):
    self.Timer.Stop()
I take the new user input to call the runTimer method which I have revised per Paolo Moretti's suggestions:
def runTimer(self, interval, autoreset, fun, *args):
    self.Timer.Interval = interval
    self.Timer.AutoReset = autoreset
    myfunction = fun
    my_args = args
    self.Timer.Start()

    def OnTimedEvent(s, e):
        print "The Elapsed event was raised at ", e.SignalTime
        myfunction(*my_args)

    self.Timer.Elapsed += OnTimedEvent
Whenever a command button is pressed, the following method is called:
requestTimer.runTimer((self.intervalnumericUpDown.Value * 1000), True, function, *args)
I do not understand why stopping the timer and sending the request causes the runTimer method to be executed multiple times; it seems to depend on how many times I change the interval. I have tried a couple of methods (Close and Dispose) with no success.
A second question, on a slightly different subject:
I have been looking at other .NET Timer classes. How would I translate the following VB sample code into Python? Is "callback As TimerCallback" equivalent to myfunction(*my_args)?
Public Sub New ( _
    callback As TimerCallback, _
    state As Object, _
    dueTime As Integer, _
    period As Integer _
)
per .NET documentation:
callback
Type: System.Threading.TimerCallback
A TimerCallback delegate representing a method to be executed.
I can partially get the timer event to fire if I define a function that takes only the state argument, such as:
def fun2(stateinfo):
    # function code
which works with:
self.Timer = Timer(fun2, self.autoEvent, self.dueTime,self.period)
The function call fails if I replace fun2 with a more generic call, myfunction(*my_args).
You can also use the * syntax for calling a function with an arbitrary argument list:
class TimerTest:
    def __init__(self, interval, autoreset, fun, *args):
        # ...
        self.my_function = fun
        self.my_args = args
        # ...

    def run_timer(self):
        # ...
        def on_timed_event(s, e):
            # ...
            self.my_function(*self.my_args)
        # ...
Usage:
>>> t1 = TimerTest(1000, True, fun1, 10, 20)
>>> t2 = TimerTest(1000, True, fun2)
And check out the PEP 8 style guide as well. Python's preferred coding conventions are different from those of many other common languages.
Question 1
Every time you use the addition assignment operator (+=) you are attaching a new event handler to the event. For example this code:
timer = Timer()

def on_timed_event(s, e):
    print "Hello from my event handler"

timer.Elapsed += on_timed_event
timer.Elapsed += on_timed_event
timer.Start()
will print the "Hello from my event handler" phrase twice.
For more information you can check out the MSDN documentation, in particular "Subscribe to and Unsubscribe from Events".
So, you should probably move the event subscription to the __init__ method, and only start the timer in your run_timer method:
def run_timer(self):
    self.Timer.Start()
You could also add a new method (or use a property) for changing the interval:
def set_interval(self, interval):
    self.Timer.Interval = interval
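The property variant mentioned above could look roughly like this (a sketch; it simply forwards to the underlying Timer object):
@property
def interval(self):
    return self.Timer.Interval

@interval.setter
def interval(self, value):
    self.Timer.Interval = value
With that in place the caller can write t.interval = 5000, where t is the timer wrapper instance, instead of calling a method.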
Question 2
You are right about TimerCallback: it's a delegate representing a method to be executed.
For example, this Timer constructor:
public Timer(
    TimerCallback callback
)
is expecting a void function with a single parameter of type Object.
public delegate void TimerCallback(
    Object state
)
When you are invoking a function using the * syntax you are doing something completely different. It's probably easier if I show you an example:
def foo(a, b, *args):
    print a
    print b
    print args
>>> foo(1, 2, 3, 4, 5)
1
2
(3, 4, 5)
>>> args = (1, 2, 3)
>>> foo(1, 2, *args)
1
2
(1, 2, 3)
Basically in the second case you are invoking a function with additional arguments unpacked from a tuple.
So if you want to pass a function with a different signature to a constructor which accepts a TimerCallback delegate, you have to create a new function, as @Lasse is suggesting.
def my_func(state, a, b):
    pass
You can do this either using the lambda keyword:
t1 = Timer(lambda state: my_func(state, 1, 2))
or by declaring a new function:
def time_proc(state):
    my_func(state, 1, 2)

t2 = Timer(time_proc)
If the function takes no parameters, simply pass it without calling it:
b = timerTest(3000, True, fun2)
If it takes parameters, you need to convert it to a function that doesn't take parameters. What you're doing is calling it, and then you pass the result, which in this case is a string. Instead do this:
a = timerTest(1000, True, lambda: fun1(10, 20))
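Putting both cases together, the question's main() could then look like this (a sketch using the timerTest class exactly as defined in the question):
def main():
    # fun1 takes parameters, so wrap it in a no-argument lambda
    a = timerTest(1000, True, lambda: fun1(10, 20))
    a.runTimer()
    # fun2 takes no parameters, so pass the function itself without calling it
    b = timerTest(3000, True, fun2)
    b.runTimer()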