Function that defines a function in Python

I have a program that defines the function verboseprint to either print or not print to the screen based on a boolean:
# define verboseprint based on whether we're running in verbose mode or not
if in_verbose_mode:
    def verboseprint(*args):
        for arg in args:
            print arg,
        print
    print "Done defining verbose print."
else:
    # if we're not in verbose mode, do nothing
    verboseprint = lambda *a: None
My program uses multiple files, and I'd like to use this definition of verboseprint in all of them. All of the files will be passed the in_verbose_mode boolean. I know that I could just define verboseprint by itself in a file and then import it into all of my other files, but I need the function definition to be able to be declared two different ways based on a boolean.
So in summary: I need a function that can declare another function in two different ways, that I can then import into multiple files.
Any help would be appreciated.

You should look up the factory design pattern. It's basically designed to do exactly what you are talking about, though it would be with classes not functions. That being said, you can get the behavior that you want by having a class that returns one of two possible objects (based on your boolean). They both have the same method but it operates differently (just like your two functions).
class A:
    def method(self):
        # do things one way
        pass

class B:
    def method(self):
        # do things another way
        pass

class Factory:
    def __init__(self, flag):
        self.printer = A() if flag else B()
    def do_thing(self):
        self.printer.method()

fac = Factory(True)
fac.do_thing()  # does A's thing
fac = Factory(False)
fac.do_thing()  # does B's thing

Usually you don't want to define a function this way.
I think the easy way to achieve this is to pass the boolean as a function parameter and branch on it:
def verboseprint(*args, mode=False):
    if mode:
        for arg in args:
            print(arg, end=' ')
        print()
    # if we're not in verbose mode, do nothing
And then import this function to use in your other files.

Related

Is there a way to access the original function in a mocked method/function such that I can modify the arguments and pass it to the original functions?

I'd like to modify the arguments passed to a method in a module, as opposed to replacing its return value.
I've found a way around this, but it seems like something useful and has turned into a lesson in mocking.
module.py
from third_party import ThirdPartyClass
ThirdPartyClass.do_something('foo', 'bar')
ThirdPartyClass.do_something('foo', 'baz')
tests.py
@mock.patch('module.ThirdPartyClass.do_something')
def test(do_something):
    # Instead of directly overriding its return value,
    # I'd like to modify the arguments passed to this function.

    # change return value, no matter the inputs
    do_something.return_value = 'foo'

    # change return value based on inputs, but no access to the original function
    do_something.side_effect = lambda x, y: (y, x)

    # how can I wrap do_something, so that I can modify its inputs
    # and pass them on to the original function, much like a decorator?
I've tried something like the following, but not only is it repetitive and ugly, it doesn't work. After some pdb introspection, I'm wondering if it's simply due to how this third-party library works, as I do see the original functions being called successfully when I drop a pdb inside the side_effect.
Either that, or some auto mocking magic I'm just not following that I'd love to learn about.
def test():
    from third_party import ThirdPartyClass
    original_do_something = ThirdPartyClass.do_something
    with mock.patch('module.ThirdPartyClass.do_something') as mocked_do_something:
        def side_effect(arg1, arg2):
            return original_do_something(arg1, 'overridden')
        mocked_do_something.side_effect = side_effect
        # execute module.py
Any guidance is appreciated!
You may want to use the wraps parameter for the mock call (docs for reference). This way the original function will be called, but it will still have everything from the Mock interface.
So for changing parameters called to original function you may want to try it like that:
org.py:
def func(x):
    print(x)

main.py:
from unittest import mock
import org

of = org.func
def wrapped(a):
    of('--{}--'.format(a))

with mock.patch('org.func', wraps=wrapped):
    org.func('x')
    org.func.assert_called_with('x')
result:
--x--
The trick is to pass the original underlying function that you still want to access as a parameter to the function.
E.g., for race-condition testing, have tempfile.mktemp return an existing pathname:
def mock_mktemp(*, orig_mktemp=tempfile.mktemp, **kwargs):
    """Ensure mktemp returns an existing pathname."""
    temp = orig_mktemp(**kwargs)
    open(temp, 'w').close()
    return temp
Above, orig_mktemp is evaluated when the function is declared, not when it is called, so all invocations will have access to the original method of tempfile.mktemp via orig_mktemp.
I used it as follows:
@unittest.mock.patch('tempfile.mktemp', side_effect=mock_mktemp)
def test_retry_on_existing_temp_path(self, mock_mktemp):
    # Simulate race condition: creation of temp path after tempfile.mktemp
    ...
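The early-binding behavior this answer relies on is easy to demonstrate in isolation (the names below are purely illustrative):

```python
# Minimal demonstration that default arguments are evaluated once,
# at definition time -- the mechanism the answer above relies on.
def original():
    return "original"

def wrapper(*, orig=original):  # binds the current `original` now
    return "wrapped " + orig()

original = lambda: "replaced"   # later rebinding (e.g. by mock.patch)...

# ...does not affect the default captured by `wrapper`:
print(wrapper())  # wrapped original
```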

How can I overload in Python?

I'm trying to make a function that does different things when called on different argument types. Specifically, one of the functions should have the signature
def myFunc(string, string):
and the other should have the signature
def myFunc(list):
How can I do this, given that I'm not allowed to specify whether the arguments are strings or lists?
Python does not support overloading, even by the argument count. You need to do:
def foo(string_or_list, string=None):
    if isinstance(string_or_list, list):
        ...
    else:
        ...
which is pretty silly, or just rethink your design to not have to overload.
There is a recipe at http://code.activestate.com/recipes/577065-type-checking-function-overloading-decorator/ which does what you want;
basically, you wrap each version of your function with @takes and @returns type declarations; when you call the function, it tries each version until it finds one that does not throw a type error.
Edit: here is a cut-down version; it's probably not a good thing to do, but if you gotta, here's how:
from collections import defaultdict

def overloaded_function(overloads):
    """
    Accepts a sequence of ((arg_types,), fn) pairs.
    Creates a dispatcher function.
    """
    dispatch_table = defaultdict(list)
    for arg_types, fn in overloads:
        dispatch_table[len(arg_types)].append([list(arg_types), fn])
    def dispatch(*args):
        for arg_types, fn in dispatch_table[len(args)]:
            if all(isinstance(arg, arg_type) for arg, arg_type in zip(args, arg_types)):
                return fn(*args)
        raise TypeError("could not find an overloaded function to match this argument list")
    return dispatch
and here's how it works:
def myfn_string_string(s1, s2):
    print("Got the strings {} and {}".format(s1, s2))

def myfn_list(lst):
    print("Got the list {}".format(lst))

myfn = overloaded_function([
    ((basestring, basestring), myfn_string_string),
    ((list,), myfn_list),
])

myfn("abcd", "efg")   # prints "Got the strings abcd and efg"
myfn(["abc", "def"])  # prints "Got the list ['abc', 'def']"
myfn(123)             # raises TypeError
*args is probably the better way, but you could do something like:
def myFunc(arg1, arg2=None):
    if arg2 is not None:
        ...  # do this
    else:
        ...  # do that
But that's probably a terrible way of doing it.
Not a perfect solution, but if the second string argument will never legitimately be None, you could try:
def myFunc(firstArg, secondArg=None):
    if secondArg is None:
        ...  # only one arg provided, try treating firstArg as a list
    else:
        ...  # two args provided, try treating them both as strings
Define it as taking variable arguments:
def myFunc(*args):
Then you can check the amount and type of the arguments via len and isinstance, and route the call to the appropriate case-specific function.
It may make for clearer code, however, if you used optional named arguments. It would be better still if you didn't use overloading at all, it's kinda not python's way.
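A minimal sketch of that len/isinstance routing (the helper names and return strings are illustrative, not from the question):

```python
def handle_strings(s1, s2):
    return "strings: {} {}".format(s1, s2)

def handle_list(lst):
    return "list: {}".format(lst)

def myFunc(*args):
    # Route on argument count and type.
    if len(args) == 2 and all(isinstance(a, str) for a in args):
        return handle_strings(*args)
    if len(args) == 1 and isinstance(args[0], list):
        return handle_list(args[0])
    raise TypeError("unsupported arguments: {!r}".format(args))
```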
You can't; for instance, a class instance method can be inserted at run-time.
If you had multiple __init__s for a class, for instance, you'd be better off with multiple @classmethods such as from_strings or from_sequence.
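A minimal sketch of such alternate constructors (the Point class and its fields are illustrative assumptions, not from the question):

```python
class Point:
    # from_strings/from_sequence follow the answer's suggestion for
    # replacing an overloaded __init__ with named @classmethods.
    def __init__(self, x, y):
        self.x, self.y = x, y

    @classmethod
    def from_strings(cls, xs, ys):
        # parse string arguments, then delegate to __init__
        return cls(int(xs), int(ys))

    @classmethod
    def from_sequence(cls, seq):
        # unpack a sequence argument, then delegate to __init__
        return cls(seq[0], seq[1])
```

Each constructor's name documents which "overload" the caller intends, instead of guessing from argument types.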

How to get a function's arguments and values from outside the function?

I have searched a little bit to try to figure this one out but didn't find exactly the solution I was looking for.
This is my use case:
I would like to evaluate expressions from a function's/method's doc-string against the f/m's parameters and values, but from outside the function (when it is being called, but outside its execution).
I can't statically change the source code I am evaluating (can't write in new functionality), but dynamically changing it (i.e. wrapping the function or adding attributes at run-time) is acceptable.
I would prefer to stick with tools in the standard library but am willing to try external libraries if it will make the task a breeze
Here is a simple example of what I am trying to do:
def f1(a, b):
    """a==b"""
    pass

def f2(f):
    f_locals = "get f's args and values before f is executed"
    return eval(f.__doc__, None, f_locals)

>>> f2(f1(2,2))
While I have no clue why you would want to do this, what you've described can be achieved with the inspect module. This example is as close to your original example as I can get.
from inspect import getcallargs

def f1(a, b):
    """a==b"""
    pass

def f2(f, *f_args, **f_kwargs):
    f_callargs = getcallargs(f, *f_args, **f_kwargs)
    return eval(f.__doc__, None, f_callargs)

f2(f1, 2, 2)
This should output True.
Keep in mind that this assumes a great many things about the arguments and docstrings of any such functions passed to f2, not the least of which is that none of the examined functions are malicious or malformed. Why don't you want to call functions normally, and why don't you want to change functions?
Edit: As Pajton pointed out, getcallargs is more appropriate here, and removes the calls to both dict and zip. The above code has been updated to reflect this.
I'm not sure if this is what you are looking for, but here's an alternative without inspect module.
#!/usr/bin/python
# -*- coding: utf-8-unix -*-
"""
This is a sample implementation of Inline.pm (Perl) in Python.

Using the @inline decorator, it is now possible to write any code
in any language in a docstring, and let it compile down to executable
Python code at runtime.

For this specific example, it simply evals the input docstring, so
code in the docstring must be in Python as well.
"""

# Language compiler for MyLang
class MyLang:
    @classmethod
    def compile(cls, docstring):
        # For this example, this simply generates code that
        # evals the docstring.
        def testfunc(*arg, **kw):
            return eval(docstring, None, kw)
        return testfunc

# @inline decorator
def inline(lang):
    def decorate(func):
        parm = func.__code__.co_varnames[0:func.__code__.co_argcount]
        fgen = lang.compile(func.__doc__)
        def wrap(*arg, **kw):
            # turn all args into keyword-args
            kw.update(dict(zip(parm, arg)))
            return fgen(**kw)
        return wrap
    return decorate

@inline(MyLang)
def myadd(a, b):
    """a + b"""

print(myadd(1, 9))
print(myadd(b = 8, a = 2))
print(myadd(a = 3, b = 7))

Efficient way of having a function only execute once in a loop

At the moment, I'm doing stuff like the following, which is getting tedious:
run_once = 0
while 1:
    if run_once == 0:
        myFunction()
        run_once = 1
I'm guessing there is some more accepted way of handling this stuff?
What I'm looking for is having a function execute once, on demand. For example, at the press of a certain button. It is an interactive app which has a lot of user controlled switches. Having a junk variable for every switch, just for keeping track of whether it has been run or not, seemed kind of inefficient.
I would use a decorator on the function to handle keeping track of how many times it runs.
def run_once(f):
    def wrapper(*args, **kwargs):
        if not wrapper.has_run:
            wrapper.has_run = True
            return f(*args, **kwargs)
    wrapper.has_run = False
    return wrapper

@run_once
def my_function(foo, bar):
    return foo + bar
Now my_function will only run once. Other calls to it will return None. Just add an else clause to the if if you want it to return something else. From your example, it doesn't need to return anything ever.
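If you do want repeat calls to return something, one illustrative variant caches and returns the first result instead of None (run_once_cached is a made-up name, not from the answer above):

```python
def run_once_cached(f):
    def wrapper(*args, **kwargs):
        if not wrapper.has_run:
            wrapper.has_run = True
            wrapper.result = f(*args, **kwargs)  # run the real body once
        return wrapper.result  # later calls repeat the first result
    wrapper.has_run = False
    return wrapper

@run_once_cached
def add(a, b):
    return a + b
```

With this variant, `add(1, 2)` runs the body, and every later call returns that first result regardless of its arguments.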
If you don't control the creation of the function, or the function needs to be used normally in other contexts, you can just apply the decorator manually as well.
action = run_once(my_function)
while 1:
    if predicate:
        action()
This will leave my_function available for other uses.
Finally, if you need to run it once, then later reset it and run it once more, you can just do
action = run_once(my_function)
action()  # run once the first time

action.has_run = False
action()  # run once the second time
Another option is to set the func_code code object (__code__ in Python 3) for your function to a code object for a function that does nothing. This should be done at the end of your function body.
For example:
def run_once():
    # Code for something you only want to execute once
    run_once.func_code = (lambda: None).func_code
Here run_once.func_code = (lambda:None).func_code replaces your function's executable code with the code for lambda:None, so all subsequent calls to run_once() will do nothing.
This technique is less flexible than the decorator approach suggested in the accepted answer, but may be more concise if you only have one function you want to run once.
Run the function before the loop. Example:
myFunction()
while True:
    # all the other code being executed in your loop
This is the obvious solution. If there's more than meets the eye, the solution may be a bit more complicated.
I'm assuming this is an action that you want to be performed at most one time, if some conditions are met. Since you won't always perform the action, you can't do it unconditionally outside the loop. Something like lazily retrieving some data (and caching it) if you get a request, but not retrieving it otherwise.
def do_something():
    [x() for x in expensive_operations]
    global action
    action = lambda: None

action = do_something
while True:
    # some sort of complex logic...
    if foo:
        action()
There are many ways to do what you want; however, do note that, as described in the question, it is quite possible that you don't have to call the function inside the loop.
If you insist in having the function call inside the loop, you can also do:
needs_to_run = expensive_function
while 1:
    …
    if needs_to_run: needs_to_run(); needs_to_run = None
    …
I've thought of another (slightly unusual, but very effective) way to do this that doesn't require decorator functions or classes. Instead it just uses a mutable keyword argument, which ought to work in most versions of Python. Most of the time these are something to be avoided, since normally you wouldn't want a default argument value to change from call to call, but that ability can be leveraged in this case and used as a cheap storage mechanism. Here's how that would work:
def my_function1(_has_run=[]):
    if _has_run:
        return
    print("my_function1 doing stuff")
    _has_run.append(1)

def my_function2(_has_run=[]):
    if _has_run:
        return
    print("my_function2 doing some other stuff")
    _has_run.append(1)

for i in range(10):
    my_function1()
    my_function2()

print('----')
my_function1(_has_run=[])  # Force it to run.
Output:
my_function1 doing stuff
my_function2 doing some other stuff
----
my_function1 doing stuff
This could be simplified a little further by doing what @gnibbler suggested in his answer and using an iterator (introduced in Python 2.2):
from itertools import count

def my_function3(_count=count()):
    if next(_count):
        return
    print("my_function3 doing something")

for i in range(10):
    my_function3()

print('----')
my_function3(_count=count())  # Force it to run.
Output:
my_function3 doing something
----
my_function3 doing something
Here's an answer that doesn't involve reassignment of functions, yet still prevents the need for that ugly "is first" check.
__missing__ is supported by Python 2.5 and above.
def do_once_varname1():
    print 'performing varname1'
    return 'only done once for varname1'

def do_once_varname2():
    print 'performing varname2'
    return 'only done once for varname2'

class cdict(dict):
    def __missing__(self, key):
        val = self['do_once_' + key]()
        self[key] = val
        return val

cache_dict = cdict(do_once_varname1=do_once_varname1, do_once_varname2=do_once_varname2)

if __name__ == '__main__':
    print cache_dict['varname1']  # causes 2 prints
    print cache_dict['varname2']  # causes 2 prints
    print cache_dict['varname1']  # just 1 print
    print cache_dict['varname2']  # just 1 print
Output:
performing varname1
only done once for varname1
performing varname2
only done once for varname2
only done once for varname1
only done once for varname2
One object-oriented approach is to make your function a class, a.k.a. a "functor", whose instances automatically keep track of whether they've been run when each instance is created.
Since your updated question indicates you may need many of them, I've updated my answer to deal with that by using a class factory pattern. This is a bit unusual, and it may have been down-voted for that reason (although we'll never know for sure because they never left a comment). It could also be done with a metaclass, but it's not much simpler.
def RunOnceFactory():
    class RunOnceBase(object):  # abstract base class
        _shared_state = {}  # shared state of all instances (Borg pattern)
        has_run = False
        def __init__(self, *args, **kwargs):
            self.__dict__ = self._shared_state
            if not self.has_run:
                self.stuff_done_once(*args, **kwargs)
                self.has_run = True
    return RunOnceBase

if __name__ == '__main__':
    class MyFunction1(RunOnceFactory()):
        def stuff_done_once(self, *args, **kwargs):
            print("MyFunction1.stuff_done_once() called")

    class MyFunction2(RunOnceFactory()):
        def stuff_done_once(self, *args, **kwargs):
            print("MyFunction2.stuff_done_once() called")

    for _ in range(10):
        MyFunction1()  # will only call its stuff_done_once() method once
        MyFunction2()  # ditto
Output:
MyFunction1.stuff_done_once() called
MyFunction2.stuff_done_once() called
Note: You could make a function/class able to do stuff again by adding a reset() method to its subclass that resets the shared has_run attribute. It's also possible to pass regular and keyword arguments to the stuff_done_once() method when the functor is created and the method is called, if desired.
And, yes, it would be applicable given the information you added to your question.
Assuming there is some reason why myFunction() can't be called before the loop
from itertools import count

for i in count():
    if i == 0:
        myFunction()
Here's an explicit way to code this up, where the state of which functions have been called is kept locally (so global state is avoided). I don't much like the non-explicit forms suggested in other answers: it's too surprising to see f() and for this not to mean that f() gets called.
This works by using dict.pop which looks up a key in a dict, removes the key from the dict, and takes a default value to use in case the key isn't found.
def do_nothing(*args, **kwargs):
    pass

# A list of all the functions you want to run just once.
actions = [
    my_function,
    other_function,
]
actions = dict((action, action) for action in actions)

while True:
    if some_condition:
        actions.pop(my_function, do_nothing)()
    if some_other_condition:
        actions.pop(other_function, do_nothing)()
I use the cached_property decorator from functools to run a function just once and save the value. Example from the official documentation: https://docs.python.org/3/library/functools.html
class DataSet:
    def __init__(self, sequence_of_numbers):
        self._data = tuple(sequence_of_numbers)

    @cached_property
    def stdev(self):
        return statistics.stdev(self._data)
You can also use one of the standard library functools.lru_cache or functools.cache decorators in front of the function:
from functools import lru_cache

@lru_cache
def expensive_function():
    return None
https://docs.python.org/3/library/functools.html
If I understand the updated question correctly, something like this should work
def function1():
    print "function1 called"

def function2():
    print "function2 called"

def function3():
    print "function3 called"

called_functions = set()
while True:
    n = raw_input("choose a function: 1, 2 or 3 ")
    func = {"1": function1,
            "2": function2,
            "3": function3}.get(n)
    if func in called_functions:
        print "That function has already been called"
    else:
        called_functions.add(func)
        func()
You have all those 'junk variables' outside of your mainline while True loop. To make the code easier to read those variables can be brought inside the loop, right next to where they are used. You can also set up a variable naming convention for these program control switches. So for example:
# _already_done checkpoint logic
try:
    ran_this_user_request_already_done
except NameError:
    this_user_request()
    ran_this_user_request_already_done = 1
Note that on the first execution of this code the variable ran_this_user_request_already_done is not defined until after this_user_request() is called.
A simple function you can reuse in many places in your code (based on the other answers here):
def firstrun(keyword, _keys=[]):
    """Returns True only the first time it's called with each keyword."""
    if keyword in _keys:
        return False
    else:
        _keys.append(keyword)
        return True
or equivalently (if you like to rely on other libraries):
from collections import defaultdict
from itertools import count

def firstrun(keyword, _keys=defaultdict(count)):
    """Returns True only the first time it's called with each keyword."""
    return not _keys[keyword].next()
Sample usage:
for i in range(20):
    if firstrun('house'):
        build_house()  # runs only once
    if firstrun(42):   # True
        print 'This will print.'
    if firstrun(42):   # False
        print 'This will never print.'
I've taken a more flexible approach inspired by functools.partial function:
DO_ONCE_MEMORY = []

def do_once(id, func, *args, **kwargs):
    if id not in DO_ONCE_MEMORY:
        DO_ONCE_MEMORY.append(id)
        return func(*args, **kwargs)
    else:
        return None
With this approach you are able to have more complex and explicit interactions:
do_once('foobar', print, "first try")
do_once('foo', print, "first try")
do_once('bar', print, "second try")
# first try
# first try
# second try
The exciting part about this approach is that it can be used anywhere and does not require factories; it's just a small memory tracker.
Depending on the situation, an alternative to the decorator could be the following:
from itertools import chain, repeat
func_iter = chain((myFunction,), repeat(lambda *args, **kwds: None))
while True:
next(func_iter)()
The idea is based on iterators, which yield the function once (or, using repeat(myFunction, n), n times), and then endlessly yield the do-nothing lambda.
The main advantage is that you don't need a decorator, which sometimes complicates things; here everything happens in a single (to my mind) readable line. The disadvantage is that you have an ugly next in your code.
Performance wise there seems to be not much of a difference, on my machine both approaches have an overhead of around 130 ns.
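For completeness, a sketch of the repeat(myFunction, n) variant mentioned above (the bookkeeping list is only there to observe the behavior):

```python
from itertools import chain, repeat

calls = []
def my_function():
    calls.append("ran")

# Run my_function at most 3 times, then do nothing forever after.
func_iter = chain(repeat(my_function, 3),
                  repeat(lambda *args, **kwds: None))

for _ in range(10):
    next(func_iter)()

print(len(calls))  # 3
```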
If the condition check needs to happen only once while you are in the loop, a flag signaling that you have already run the function helps. In this case you used a counter, but a boolean variable would work just as well.
signal = False
count = 0

def callme():
    print "I am being called"

while count < 2:
    if signal == False:
        callme()
        signal = True
    count += 1
I'm not sure I understood your problem, but I think you can split the loop in two: the part that calls the function and the part that doesn't, and run them as two consecutive loops.

Is this a good approach to execute a list of operations on a data structure in Python?

I have a dictionary of data, the key is the file name and the value is another dictionary of its attribute values. Now I'd like to pass this data structure to various functions, each of which runs some test on the attribute and returns True/False.
One approach would be to call each function one by one explicitly from the main code. However I can do something like this:
# MYmodule.py
class Mymodule:
    def MYfunc1(self):
        ...
    def MYfunc2(self):
        ...

# main.py
import Mymodule
...
# fill the data structure
...
# Now call all the functions in Mymodule one by one
for funcs in dir(Mymodule):
    if funcs[:2] == 'MY':
        result = Mymodule.__dict__.get(funcs)(dataStructure)
The advantage of this approach is that implementation of main class needn't change when I add more logic/tests to MYmodule.
Is this a good way to solve the problem at hand? Are there better alternatives to this solution?
I'd say a better and much more Pythonic approach would be to define a decorator to indicate which functions you want to use:
class MyFunc(object):
    funcs = []
    def __init__(self, func):
        self.funcs.append(func)

@MyFunc
def foo():
    return 5

@MyFunc
def bar():
    return 10

def quux():
    # Not decorated, so will not be in MyFunc
    return 20

for func in MyFunc.funcs:
    print func()
Output:
5
10
Essentially you're performing the same logic: taking only functions who were defined in a particular manner and applying them to a specific set of data.
Sridhar, the method you proposed is very similar to the one used in the unittest module.
For example, this is how unittest.TestLoader finds the names of all the test methods to run (lifted from /usr/lib/python2.6/unittest.py):
def getTestCaseNames(self, testCaseClass):
    """Return a sorted sequence of method names found within testCaseClass
    """
    def isTestMethod(attrname, testCaseClass=testCaseClass, prefix=self.testMethodPrefix):
        return attrname.startswith(prefix) and hasattr(getattr(testCaseClass, attrname), '__call__')
    testFnNames = filter(isTestMethod, dir(testCaseClass))
    if self.sortTestMethodsUsing:
        testFnNames.sort(key=_CmpToKey(self.sortTestMethodsUsing))
    return testFnNames
Just like your proposal, unittest uses dir to list all the attributes of testCaseClass, and filters the list for those whose name starts with prefix (which is set elsewhere to equal 'test').
I suggest a few minor changes:
If you place the functions in MYmodule.py, then (of course) the import statement must be
import MYmodule
Use getattr instead of .__dict__.get. Not only is it shorter, but it continues to work if you subclass Mymodule. That might not be your intention at this point, but using getattr is probably a good default habit anyway.
for funcs in dir(MYmodule.Mymodule):
    if funcs.startswith('MY'):
        result = getattr(MYmodule.Mymodule, funcs)(dataStructure)
