Mocking a global variable in pytest - python

how do you mock a global variable in pytest? Here is a pair of example files:
File being tested, call it main.py:
MY_GLOBAL = 1

def foo():
    return MY_GLOBAL * 2

def main():
    # some relevant invocation of foo somewhere here
    ...

if __name__ == '__main__':
    main()
File that is testing, call it test_main.py:
from main import foo

class TestFoo(object):
    def test_that_it_multiplies_by_global(self):
        expected = 2  # we could write this, but anyway ...
        actual = foo()
        assert actual == expected
This is just a dummy example, of course, but how would you go about mocking MY_GLOBAL and giving it another value?
Thanks in advance. I've been kind of breaking my head over this, and I bet it's really obvious.

The global variable is an attribute of the module, which you can patch using patch.object:
import main
from unittest.mock import patch

class TestFoo(object):
    def test_that_it_multiplies_by_global(self):
        with patch.object(main, 'MY_GLOBAL', 3):
            assert main.foo() == 6  # not 2
However, you want to make sure you are testing the right thing. Is foo supposed to return 2 (with the fact that it doubles a global whose value is 1 being an implementation detail), or is it supposed to return double whatever value MY_GLOBAL has at the time of the call? The answer to that question will affect how you write your test.
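Since the question mentions pytest specifically, pytest's built-in monkeypatch fixture does the same job and undoes the change automatically when the test ends. A minimal sketch against the main.py above:

import main

def test_that_it_multiplies_by_global(monkeypatch):
    # monkeypatch restores the original value when the test finishes
    monkeypatch.setattr(main, "MY_GLOBAL", 3)
    assert main.foo() == 6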

It's useful to distinguish between module-level constants and global variables. The former are pretty common, but the latter are an anti-pattern in Python. You seem to have a module-level constant (read-only access to the var in normal production code). Global variables (R/W access in production code) should generally be refactored if possible.
For module constants:
If you can do so, it's generally more maintainable to refactor the functions that depend on module constants. This allows direct testing with alternate values as well as a single source of truth for the "constants" and backward compatibility. A minimal refactor is as simple as adding an optional parameter in each function that depends on the "constant" and doing a simple search-and-replace in that function, e.g.:
def foo(value=MY_GLOBAL):
    return value * 2
All other code can continue to call foo() as normal, but if you want to write tests with alternate values of MY_GLOBAL, you can simply call foo(value=7484).
If what you want is an actual global (used with the global keyword and read/write access in production code), try an alternative such as the following.
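For example (a sketch with made-up names, not code from the question): patch.object also works for a true read/write global, because it records the original value and puts it back afterwards regardless of what the code under test did to it in the meantime.

# main.py (hypothetical read/write global)
COUNTER = 0

def bump():
    global COUNTER
    COUNTER += 1
    return COUNTER

# test_main.py
import main
from unittest.mock import patch

def test_bump_starts_from_patched_value():
    with patch.object(main, "COUNTER", 10):
        assert main.bump() == 11
    # patch.object restored the original value, even though bump() reassigned it
    assert main.COUNTER == 0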

Function prototype in python? [duplicate]

Is it possible to forward-declare a function in Python? I want to sort a list using my own cmp function before it is declared.
print "\n".join([str(bla) for bla in sorted(mylist, cmp = cmp_configs)])
I've put the definition of cmp_configs method after the invocation. It fails with this error:
NameError: name 'cmp_configs' is not defined
Is there any way to "declare" cmp_configs method before it's used?
Sometimes, it is difficult to reorganize code to avoid this problem. For instance, when implementing some forms of recursion:
def spam():
    if end_condition():
        return end_result()
    else:
        return eggs()

def eggs():
    if end_condition():
        return end_result()
    else:
        return spam()
Where end_condition and end_result have been previously defined.
Is the only solution to reorganize the code and always put definitions before invocations?
Wrap the invocation into a function of its own so that
foo()

def foo():
    print "Hi!"
will break, but
def bar():
    foo()

def foo():
    print "Hi!"
bar()
will work properly.
The general rule in Python is that a function should be defined before its usage, which does not necessarily mean it needs to be higher in the code.
If you kick-start your script through the following:
if __name__ == "__main__":
    main()
then you probably do not have to worry about things like "forward declaration". You see, the interpreter would go loading up all your functions and then start your main() function. Of course, make sure you have all the imports correct too ;-)
Come to think of it, I've never heard such a thing as "forward declaration" in python... but then again, I might be wrong ;-)
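A minimal self-contained sketch of that layout (names are made up): main() is defined first but only executed at the bottom, after every function in the file has been defined.

def main():
    # greeting() is defined further down; that's fine, because this body
    # only runs once main() is actually called, after the whole file loads
    print(greeting("world"))

def greeting(name):
    return "Hello, " + name + "!"

if __name__ == "__main__":
    main()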
If you don't want to define a function before it's used, and defining it afterwards is impossible, what about defining it in some other module?
Technically you still define it first, but it's clean.
You could create a recursion like the following:
def foo():
    bar()

def bar():
    foo()
Python's functions are anonymous just like values are anonymous, yet they can be bound to a name.
In the above code, foo() does not call a function with the name foo, it calls a function that happens to be bound to the name foo at the point the call is made. It is possible to redefine foo somewhere else, and bar would then call the new function.
Your problem cannot be solved because it's like asking to get a variable which has not been declared.
I apologize for reviving this thread, but there was a strategy not discussed here which may be applicable.
Using reflection it is possible to do something akin to forward declaration. For instance, let's say you have a section of code that looks like this:
# We want to call a function called 'foo', but it hasn't been defined yet.
function_name = 'foo'
# Calling at this point would produce an error
# Here is the definition
def foo():
    bar()
# Note that at this point the function is defined
# Time for some reflection...
globals()[function_name]()
So in this way we have determined what function we want to call before it is actually defined, effectively a forward declaration. In Python the statement globals()[function_name]() is equivalent to foo() when function_name == 'foo', for the reasons discussed above, since Python must look each function up by name before calling it. If you use the timeit module to compare the two statements, their cost is essentially the same.
Of course the example here is very useless, but if one were to have a complex structure which needed to execute a function, but must be declared before (or structurally it makes little sense to have it afterwards), one can just store a string and try to call the function later.
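If you want to check the timing claim yourself, here is a small sketch using the timeit module (Python 3.5+ for the globals argument; the exact numbers depend on your machine):

import timeit

def foo():
    pass

# pass globals() so the timed statements can see foo
print(timeit.timeit("foo()", globals=globals()))
print(timeit.timeit("globals()['foo']()", globals=globals()))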
If the call to cmp_configs is inside its own function definition, you should be fine. I'll give an example.
def a():
    b()  # b() hasn't been defined yet, but that's fine because at this point, we're not
         # actually calling it. We're just defining what should happen when a() is called.

a()  # This call fails, because b() hasn't been defined yet,
     # and thus trying to run a() fails.

def b():
    print "hi"

a()  # This call succeeds because everything has been defined.
In general, putting your code inside functions (such as main()) will resolve your problem; just call main() at the end of the file.
There is no such thing as a forward declaration in Python. You just have to make sure that your function is defined before it is needed.
Note that the body of a function isn't interpreted until the function is executed.
Consider the following example:
def a():
    b()  # won't be resolved until a is invoked.

def b():
    print "hello"

a()  # here b is already defined so this line won't fail.
You can think that a body of a function is just another script that will be interpreted once you call the function.
Sometimes an algorithm is easiest to understand top-down, starting with the overall structure and drilling down into the details.
You can do so without forward declarations:
def main():
    make_omelet()
    eat()

def make_omelet():
    break_eggs()
    whisk()
    fry()

def break_eggs():
    for egg in carton:
        crack(egg)

# ...

main()
# declare a fake function (prototype) with no body
def foo(): pass

def bar():
    # use the prototype however you see fit
    print(foo(), "world!")

# define the actual function (overwriting the prototype)
def foo():
    return "Hello,"

bar()
Output:
Hello, world!
No, I don't believe there is any way to forward-declare a function in Python.
Imagine you are the Python interpreter. When you get to the line
print "\n".join([str(bla) for bla in sorted(mylist, cmp = cmp_configs)])
either you know what cmp_configs is or you don't. In order to proceed, you have to
know cmp_configs. It doesn't matter if there is recursion.
You can't forward-declare a function in Python. If you have logic executing before you've defined functions, you've probably got a problem anyways. Put your action in an if __name__ == '__main__' at the end of your script (by executing a function you name "main" if it's non-trivial) and your code will be more modular and you'll be able to use it as a module if you ever need to.
Also, replace that list comprehension with a generator expression (i.e., print "\n".join(str(bla) for bla in sorted(mylist, cmp=cmp_configs))).
Also, don't use cmp, which is deprecated. Use key and provide a less-than function.
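For example, in Python 3 (where sorted() no longer accepts cmp at all) the ordering is expressed as a key function, and functools.cmp_to_key bridges an existing cmp-style function. A sketch with a stand-in comparison, since the real cmp_configs isn't shown:

from functools import cmp_to_key

mylist = ["banana", "apple", "cherry"]

# Preferred: express the ordering directly as a key function
print("\n".join(str(bla) for bla in sorted(mylist, key=len)))

# Bridge for an existing cmp-style function: wrap it with cmp_to_key
def cmp_configs(a, b):
    return (a > b) - (a < b)

print("\n".join(str(bla) for bla in sorted(mylist, key=cmp_to_key(cmp_configs))))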
Import the file itself. Assuming the file is called test.py:
import test

if __name__ == '__main__':
    test.func()
else:
    def func():
        print('Func worked')
TL;DR: Python does not need forward declarations. Simply put your function calls inside function def definitions, and you'll be fine.
def foo(count):
    print("foo " + str(count))
    if count > 0:
        bar(count - 1)

def bar(count):
    print("bar " + str(count))
    if count > 0:
        foo(count - 1)

foo(3)
print("Finished.")
These mutually recursive function definitions run perfectly well and give:
foo 3
bar 2
foo 1
bar 0
Finished.
However,
bug(13)

def bug(count):
    print("bug never runs " + str(count))

print("Does not print this.")
breaks at the top-level invocation of a function that hasn't been defined yet, and gives:
Traceback (most recent call last):
  File "./test1.py", line 1, in <module>
    bug(13)
NameError: name 'bug' is not defined
Python is an interpreted language, like Lisp. It has no compile-time checking, only run-time function invocations, which succeed if the function name has been bound and fail if it's unbound.
Critically, a function def does not execute any of the calls inside its body; it simply declares what the function body is going to consist of. It doesn't even check that the names it uses exist. So we can do this:
def uncalled():
    wild_eyed_undefined_function()
    print("I'm not invoked!")

print("Only run this one line.")
and it runs perfectly fine (!), with output
Only run this one line.
The key is the difference between definitions and invocations.
The interpreter executes everything that comes in at the top level, which means it tries to invoke anything that is not inside a definition.
Your code is running into trouble because you attempted to invoke a function, at the top level in this case, before it was bound.
The solution is to put your non-top-level function invocations inside a function definition, then call that function sometime much later.
The business about if __name__ == "__main__" is an idiom based on this principle, but you have to understand why, instead of simply blindly following it.
There are certainly much more advanced topics concerning lambda functions and rebinding function names dynamically, but these are not what the OP was asking for. In addition, they can be solved using these same principles: (1) defs define a function, they do not invoke their lines; (2) you get in trouble when you invoke a function symbol that's unbound.
Python does not support forward declarations, but a common workaround is to place the following condition at the end of your script/code:
if __name__ == '__main__': main()
With this, Python reads the entire file first, then evaluates the condition and calls main(), which can call any function defined anywhere in the file because everything has already been read. The condition leverages the special variable __name__, which is set to '__main__' whenever we run the Python file directly (when the code is imported as a module, __name__ holds the module name instead).
"just reorganize my code so that I don't have this problem." Correct. Easy to do. Always works.
You can always provide the function prior to its reference.
"However, there are cases when this is probably unavoidable, for instance when implementing some forms of recursion"
Can't see how that's even remotely possible. Please provide an example of a place where you cannot define the function prior to its use.
Now wait a minute. When your module reaches the print statement in your example, before cmp_configs has been defined, what exactly is it that you expect it to do?
If your posting of a question using print is really trying to represent something like this:
fn = lambda mylist: "\n".join([str(bla)
                               for bla in sorted(mylist, cmp=cmp_configs)])
then there is no requirement to define cmp_configs before executing this statement, just define it later in the code and all will be well.
Now if you are trying to reference cmp_configs as a default value of an argument to the lambda, then this is a different story:
fn = lambda mylist, cmp_configs=cmp_configs: \
    "\n".join([str(bla) for bla in sorted(mylist, cmp=cmp_configs)])
Now you need a cmp_configs variable defined before you reach this line.
[EDIT - this next part turns out not to be correct, since the default argument value is evaluated when the lambda is defined, and that value will be used even if you change the value of cmp_configs later.]
Fortunately, Python being so type-accommodating as it is, does not care what you define as cmp_configs, so you could just preface with this statement:
cmp_configs = None
And the compiler will be happy. Just be sure to declare the real cmp_configs before you ever invoke fn.
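The bracketed edit above is worth seeing in action: default argument values are evaluated once, when the lambda or def is executed, as this small sketch shows (the string is just a stand-in):

cmp_configs = None

# The default value is captured when the lambda is defined...
fn = lambda value=cmp_configs: value

cmp_configs = "the real comparison function"

# ...so rebinding the module-level name later has no effect on the default.
print(fn())          # None
print(cmp_configs)   # the real comparison function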
Python technically has support for forward declaration.
If you define a function/class and set the body to pass, it will have an empty entry in the global table.
You can then "redefine" the function/class later on to implement it.
Unlike C/C++ forward declarations, though, this does not work from outside the scope (i.e. another file), since each file has its own "global" namespace.
example:
def foo(): pass
foo()
def foo(): print("FOOOOO")
foo()
foo is declared both times.
However, the first time foo is called it does not do anything, as the body is just pass.
But the second time foo is called, it executes the new body of print("FOOOOO").
But again, note that this does not fix circular dependencies, because each file has its own name and its own definitions of functions.
example 2:
class bar: pass
print(bar)
This prints <class '__main__.bar'>, but if it had been declared in another file it would be <class 'otherfile.bar'>.
I know this post is old, but I thought this answer would be useful to anyone who keeps finding this post in the many years since it was posted.
One way is to create a handler function. Define the handler early on, and put the handler below all the methods you need to call.
Then when you invoke the handler method to call your functions, they will always be available.
The handler could take an argument nameOfMethodToCall and then use a bunch of if statements to call the right method.
This would solve your issue.
def foo():
    print("foo")
    # take input
    nextAction = input('What would you like to do next?:')
    return nextAction

def bar():
    print("bar")
    nextAction = input('What would you like to do next?:')
    return nextAction

def handler(action):
    if action == "foo":
        nextAction = foo()
    elif action == "bar":
        nextAction = bar()
    else:
        print("You entered invalid input, defaulting to bar")
        nextAction = "bar"
    return nextAction

nextAction = input('What would you like to do next?:')
while 1:
    nextAction = handler(nextAction)

Python global variable in import * [duplicate]

I've run into a bit of a wall importing modules in a Python script. I'll do my best to describe the error, why I run into it, and why I'm trying this particular approach to solve my problem (which I will describe in a second):
Let's suppose I have a module in which I've defined some utility functions/classes, which refer to entities defined in the namespace into which this auxiliary module will be imported (let "a" be such an entity):
module1:
def f():
    print a
And then I have the main program, where "a" is defined, into which I want to import those utilities:
import module1
a=3
module1.f()
Executing the program will trigger the following error:
Traceback (most recent call last):
  File "Z:\Python\main.py", line 10, in <module>
    module1.f()
  File "Z:\Python\module1.py", line 3, in f
    print a
NameError: global name 'a' is not defined
Similar questions have been asked in the past (two days ago, d'uh) and several solutions have been suggested, however I don't really think these fit my requirements. Here's my particular context:
I'm trying to make a Python program which connects to a MySQL database server and displays/modifies data with a GUI. For cleanliness sake, I've defined the bunch of auxiliary/utility MySQL-related functions in a separate file. However they all have a common variable, which I had originally defined inside the utilities module, and which is the cursor object from MySQLdb module.
I later realised that the cursor object (which is used to communicate with the db server) should be defined in the main module, so that both the main module and anything that is imported into it can access that object.
End result would be something like this:
utilities_module.py:
def utility_1(args):
    ...  # code which references a variable named "cur"

def utility_n(args):
    ...  # etcetera
And my main module:
program.py:
import MySQLdb, Tkinter
db=MySQLdb.connect(#blahblah) ; cur=db.cursor() #cur is defined!
from utilities_module import *
And then, as soon as I try to call any of the utilities functions, it triggers the aforementioned "global name not defined" error.
A particular suggestion was to have a "from program import cur" statement in the utilities file, such as this:
utilities_module.py:
from program import cur
#rest of function definitions
program.py:
import Tkinter, MySQLdb
db=MySQLdb.connect(#blahblah) ; cur=db.cursor() #cur is defined!
from utilities_module import *
But that's cyclic import or something like that and, bottom line, it crashes too. So my question is:
How in hell can I make the "cur" object, defined in the main module, visible to those auxiliary functions which are imported into it?
Thanks for your time and my deepest apologies if the solution has been posted elsewhere. I just can't find the answer myself and I've got no more tricks in my book.
Globals in Python are global to a module, not across all modules. (Many people are confused by this, because in, say, C, a global is the same across all implementation files unless you explicitly make it static.)
There are different ways to solve this, depending on your actual use case.
Before even going down this path, ask yourself whether this really needs to be global. Maybe you really want a class, with f as an instance method, rather than just a free function? Then you could do something like this:
import module1
thingy1 = module1.Thingy(a=3)
thingy1.f()
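For illustration, module1 might then look something like this (a hypothetical sketch; Thingy is not from the original code):

# module1.py
class Thingy:
    def __init__(self, a):
        self.a = a          # the former global becomes instance state

    def f(self):
        print(self.a)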
If you really do want a global, but it's just there to be used by module1, set it in that module.
import module1
module1.a=3
module1.f()
On the other hand, if a is shared by a whole lot of modules, put it somewhere else, and have everyone import it:
import shared_stuff
import module1
shared_stuff.a = 3
module1.f()
… and, in module1.py:
import shared_stuff
def f():
    print shared_stuff.a
Don't use a from import unless the variable is intended to be a constant. from shared_stuff import a would create a new a variable initialized to whatever shared_stuff.a referred to at the time of the import, and this new a variable would not be affected by assignments to shared_stuff.a.
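A quick sketch of that difference, assuming a shared_stuff.py that contains nothing but a = 1:

import shared_stuff
from shared_stuff import a   # snapshots the current value into a new local name

shared_stuff.a = 3

print(a)               # still 1 -- bound at import time
print(shared_stuff.a)  # 3 -- attribute access sees the current value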
Or, in the rare case that you really do need it to be truly global everywhere, like a builtin, add it to the builtin module. The exact details differ between Python 2.x and 3.x. In 3.x, it works like this:
import builtins
import module1
builtins.a = 3
module1.f()
As a workaround, you could consider setting environment variables in the outer layer, like this.
main.py:
import os
os.environ['MYVAL'] = str(myintvariable)
mymodule.py:
import os
myval = None
if 'MYVAL' in os.environ:
    myval = os.environ['MYVAL']
As an extra precaution, handle the case when MYVAL is not defined inside the module.
This post is just an observation of some Python behaviour I encountered. Maybe the advice above doesn't work for you if you did the same thing I did below.
Namely, I have a module which contains global/shared variables (as suggested above):
#sharedstuff.py
globaltimes_randomnode=[]
globalist_randomnode=[]
Then I had the main module which imports the shared stuff with:
import sharedstuff as shared
and some other modules, called by the main module, that actually populate these arrays. When those other modules finish, I can clearly see that the arrays are populated, but when reading them back in the main module they were empty. This was rather strange to me (well, I am new to Python). However, when I change the way I import sharedstuff.py in the main module to:
from sharedstuff import *
it worked (the arrays were populated).
Just sayin'
A function uses the globals of the module it's defined in. Instead of setting a = 3, for example, you should be setting module1.a = 3. So, if you want cur available as a global in utilities_module, set utilities_module.cur.
A better solution: don't use globals. Pass the variables you need into the functions that need them, or create a class to bundle all the data together and pass it when initializing the instance.
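A minimal sketch of the class-based version (the names are illustrative, not from the original code):

# utilities_module.py
class DbUtilities:
    def __init__(self, cur):
        self.cur = cur                # the cursor is explicit state, not a global

    def utility_1(self, args):
        self.cur.execute("SELECT 1")  # use self.cur wherever "cur" was used before

# program.py (sketch)
# db = MySQLdb.connect(...)
# utils = DbUtilities(db.cursor())
# utils.utility_1(args)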
The easiest solution to this particular problem would have been to add another function within the module that would have stored the cursor in a variable global to the module. Then all the other functions could use it as well.
module1:
cursor = None

def setCursor(cur):
    global cursor
    cursor = cur

def method(some, args):
    do_stuff(cursor, some, args)
main program:
import module1
cursor = get_a_cursor()
module1.setCursor(cursor)
module1.method()
Since globals are module-specific, you can add the following function to all imported modules, and then use it to:
Add singular variables (in dictionary format) as globals for those modules
Transfer your main module's globals to them
addglobals = lambda x: globals().update(x)
Then all you need to pass on current globals is:
import module
module.addglobals(globals())
Since I haven't seen it in the answers above, I thought I would add my simple workaround, which is just to add a global_dict argument to the function requiring the calling module's globals, and then pass the dict into the function when calling; e.g:
# external_module
def imported_function(global_dict=None):
    print(global_dict["a"])
# calling_module
a = 12
from external_module import imported_function
imported_function(global_dict=globals())
>>> 12
The OOP way of doing this would be to make your module a class instead of a set of unbound methods. Then you could use __init__ or a setter method to set the variables from the caller for use in the module methods.
Update
To test the theory, I created a module and put it on pypi. It all worked perfectly.
pip install superglobals
Short answer
This works fine in Python 2 or 3:
import inspect

def superglobals():
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals
save as superglobals.py and employ in another module thusly:
from superglobals import *
superglobals()['var'] = value
Extended Answer
You can add some extra functions to make things more attractive.
def superglobals():
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals

def getglobal(key, default=None):
    """
    getglobal(key[, default]) -> value

    Return the value for key if key is in the global dictionary, else default.
    """
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals.get(key, default)

def setglobal(key, value):
    _globals = superglobals()
    _globals[key] = value

def defaultglobal(key, value):
    """
    defaultglobal(key, value)

    Set the value of global variable `key` if it is not otherwise set.
    """
    _globals = superglobals()
    if key not in _globals:
        _globals[key] = value
Then use thusly:
from superglobals import *
setglobal('test', 123)
defaultglobal('test', 456)
assert(getglobal('test') == 123)
Justification
The "python purity league" answers that litter this question are perfectly correct, but in some environments (such as IDAPython) which is basically single threaded with a large globally instantiated API, it just doesn't matter as much.
It's still bad form and a bad practice to encourage, but sometimes it's just easier. Especially when the code you are writing isn't going to have a very long life.

Override a "private" method in a python module

I want to test a function in Python, but it relies on a module-level "private" function that I don't want called, and I'm having trouble overriding/mocking it. Scenario:
module.py
def _cmd(command, args):
    # do something nasty
    ...

def function_to_be_tested():
    # do cool things
    _cmd('rm', '-rf /')
    return 1
test_module.py
import module
def test_function():
    assert module.function_to_be_tested() == 1
Ideally, in this test I don't want to call _cmd. I've looked at some other threads, and I've tried the following with no luck:
def test_function():
    def _cmd(command, args):
        # do nothing
        pass
    module._cmd = _cmd
although checking module._cmd against _cmd doesn't give the correct reference. Using mock:
from mock import patch

def _cmd_mock(command, args):
    # do nothing
    pass

@patch('module._cmd', _cmd_mock)
def test_function():
    ...
gives the correct reference when checking module._cmd, although function_to_be_tested still uses the original _cmd (as evidenced by it doing nasty things).
This is tricky because _cmd is a module-level function, and I don't want to move it into a separate module.
[Disclaimer]
The synthetic example posted in this question works; the described issue comes from a specific implementation in the production code. Maybe this question should be closed as off-topic because the issue is not reproducible.
[Note] For the impatient: the solution is at the end of the answer.
Anyway, this question gave me a good point to think about: how can we patch a function reference when we cannot access the variable where the reference is stored?
I have hit issues like this many times. There are lots of ways to end up in this situation; the common ones are:
Decorators: the function we would like to replace is passed as a decorator argument or referenced inside the decorator's implementation
What we would like to patch is a default argument of a method
In both cases refactoring the code might be the best way to deal with it, but what if we are working with legacy code or the decorator is a third-party decorator?
OK, our back is against the wall, but we are using Python, and in Python nothing is impossible. What we need is just a reference to the function/method to patch, and instead of patching the reference we can patch its __code__: yes, I'm talking about patching the bytecode instead of the function.
Let's take a real example. I'm using the default-parameter case because it is simple, but it works in the decorator case too.
def cmd(a):
    print("ORIG {}".format(a))

def cmd_fake(a):
    print("NEW {}".format(a))

def do_work(a, c=cmd):
    c(a)

do_work("a")
cmd = cmd_fake
do_work("b")
Output:
ORIG a
ORIG b
OK, in this case we can test do_work by passing cmd_fake explicitly, but there are some cases where that is impossible: for instance, what if we need to call something like this:
def what_the_hell():
    list(map(lambda a: do_work(a), ["c", "d"]))
What we can do is patch cmd.__code__ instead of cmd, by:
cmd.__code__ = cmd_fake.__code__
So the following code:
do_work("a")
what_the_hell()
cmd.__code__ = cmd_fake.__code__
do_work("b")
what_the_hell()
gives the following output:
ORIG a
ORIG c
ORIG d
NEW b
NEW c
NEW d
Moreover, if we want to use a mock, we can do it by adding the following lines:
from unittest.mock import Mock, call

cmd_mock = Mock()

def cmd_mocker(a):
    cmd_mock(a)

cmd.__code__ = cmd_mocker.__code__
what_the_hell()
cmd_mock.assert_has_calls([call("c"), call("d")])
print("WORKS")
print("WORKS")
That prints out:
WORKS
Maybe I'm done... but the OP is still waiting for a solution to their issue:
from mock import patch, Mock
import module

cmd_mock = Mock()

# A closure for grabbing the right function code
def cmd_mocker(a):
    cmd_mock(a)

@patch.object(module._cmd, '__code__', new=cmd_mocker.__code__)
def test_function():
    ...
Now, I should say: never use this trick unless your back is against the wall. Tests should be simple to understand and to debug... try to debug something like this and you will go mad!
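For completeness: as the disclaimer above says, the synthetic example as posted does not actually need any of this, because function_to_be_tested looks _cmd up in the module's globals at call time, so ordinary patching works. A minimal sketch:

# test_module.py -- a sketch for the synthetic example, not the OP's production setup
from unittest.mock import patch
import module

@patch('module._cmd')
def test_function(mock_cmd):
    assert module.function_to_be_tested() == 1
    mock_cmd.assert_called_once_with('rm', '-rf /')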

How to make a cross-module variable?

The __debug__ variable is handy in part because it affects every module. If I want to create another variable that works the same way, how would I do it?
The variable (let's be original and call it 'foo') doesn't have to be truly global, in the sense that if I change foo in one module, it is updated in others. I'd be fine if I could set foo before importing other modules and then they would see the same value for it.
If you need a global cross-module variable, maybe a simple global module-level variable will suffice.
a.py:
var = 1
b.py:
import a
print a.var
import c
print a.var
c.py:
import a
a.var = 2
Test:
$ python b.py
# -> 1 2
Real-world example: Django's global_settings.py (though in Django apps settings are used by importing the object django.conf.settings).
I don't endorse this solution in any way, shape or form. But if you add a variable to the __builtin__ module, it will be accessible as if it were a global from any other module that includes __builtin__ -- which is all of them, by default.
a.py contains
print foo
b.py contains
import __builtin__
__builtin__.foo = 1
import a
The result is that "1" is printed.
Edit: The __builtin__ module is available as the local symbol __builtins__ -- that's the reason for the discrepancy between two of these answers. Also note that __builtin__ has been renamed to builtins in python3.
I believe that there are plenty of circumstances in which it does make sense and it simplifies programming to have some globals that are known across several (tightly coupled) modules. In this spirit, I would like to elaborate a bit on the idea of having a module of globals which is imported by those modules which need to reference them.
When there is only one such module, I name it "g". In it, I assign default values for every variable I intend to treat as global. In each module that uses any of them, I do not use "from g import var", as this only results in a local variable which is initialized from g only at the time of the import. I make most references in the form g.var, and the "g." serves as a constant reminder that I am dealing with a variable that is potentially accessible to other modules.
If the value of such a global variable is to be used frequently in some function in a module, then that function can make a local copy: var = g.var. However, it is important to realize that assignments to var are local, and global g.var cannot be updated without referencing g.var explicitly in an assignment.
Note that you can also have multiple such globals modules shared by different subsets of your modules to keep things a little more tightly controlled. The reason I use short names for my globals modules is to avoid cluttering up the code too much with occurrences of them. With only a little experience, they become mnemonic enough with only 1 or 2 characters.
It is still possible to make an assignment to, say, g.x when x was not already defined in g, and a different module can then access g.x. However, even though the interpreter permits it, this approach is not so transparent, and I do avoid it. There is still the possibility of accidentally creating a new variable in g as a result of a typo in the variable name for an assignment. Sometimes an examination of dir(g) is useful to discover any surprise names that may have arisen by such accident.
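A minimal sketch of that pattern (file and variable names are made up):

# g.py -- nothing but the shared defaults
verbose = False
max_retries = 3

# worker.py
import g

def do_work():
    if g.verbose:                  # always read through "g." to see the latest value
        print("retries:", g.max_retries)

# main.py
import g
import worker

g.verbose = True                   # rebinding the attribute is visible everywhere
worker.do_work()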
Define a module ( call it "globalbaz" ) and have the variables defined inside it. All the modules using this "pseudoglobal" should import the "globalbaz" module, and refer to it using "globalbaz.var_name"
This works regardless of the place of the change, you can change the variable before or after the import. The imported module will use the latest value. (I tested this in a toy example)
For clarification, globalbaz.py looks just like this:
var_name = "my_useful_string"
You can pass the globals of one module to another:
In Module A:
import module_b
my_var=2
module_b.do_something_with_my_globals(globals())
print my_var
In Module B:
def do_something_with_my_globals(glob):  # glob is simply a dict.
    glob["my_var"] = 3
Global variables are usually a bad idea, but you can do this by assigning to __builtins__:
__builtins__.foo = 'something'
print foo
Also, modules themselves are variables that you can access from any module. So if you define a module called my_globals.py:
# my_globals.py
foo = 'something'
Then you can use that from anywhere as well:
import my_globals
print my_globals.foo
Using modules rather than modifying __builtins__ is generally a cleaner way to do globals of this sort.
You can already do this with module-level variables. Modules are the same no matter what module they're being imported from. So you can make the variable a module-level variable in whatever module it makes sense to put it in, and access it or assign to it from other modules. It would be better to call a function to set the variable's value, or to make it a property of some singleton object. That way if you end up needing to run some code when the variable's changed, you can do so without breaking your module's external interface.
It's not usually a great way to do things — using globals seldom is — but I think this is the cleanest way to do it.
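A sketch of the "call a function to set it" variant mentioned above (the names are made up):

# config.py
_log_level = "INFO"

def set_log_level(level):
    global _log_level
    # a single choke point: validation or change notifications can be added here later
    _log_level = level

def get_log_level():
    return _log_level

Callers then use config.set_log_level("DEBUG") and config.get_log_level(), so the module can add validation or side effects later without changing its interface.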
I wanted to post an answer pointing out that there is a case where the variable won't be found.
Cyclical imports may break the module behavior.
For example:
first.py
import second
var = 1
second.py
import first
print(first.var)  # throws an error: first is still in the middle of being imported here, so var has not been assigned yet.
main.py
import first
In this example it should be obvious, but in a large code-base this can be really confusing.
I wondered if it would be possible to avoid some of the disadvantages of using global variables (see e.g. http://wiki.c2.com/?GlobalVariablesAreBad) by using a class namespace rather than a global/module namespace to pass values of variables. The following code indicates that the two methods are essentially identical. There is a slight advantage in using class namespaces as explained below.
The following code fragments also show that attributes or variables may be dynamically created and deleted in both global/module namespaces and class namespaces.
wall.py
# Note no definition of global variables
class router:
    """ Empty class """
I call this module 'wall' since it is used to bounce variables off of. It will act as a space to temporarily define global variables and class-wide attributes of the empty class 'router'.
source.py
import wall
def sourcefn():
    msg = 'Hello world!'
    wall.msg = msg
    wall.router.msg = msg
This module imports wall and defines a single function sourcefn, which defines a message and emits it by two different mechanisms, one via a module-level variable and one via the router class. Note that the variables wall.msg and wall.router.msg are defined here for the first time in their respective namespaces.
dest.py
import wall
def destfn():
    if hasattr(wall, 'msg'):
        print 'global: ' + wall.msg
        del wall.msg
    else:
        print 'global: ' + 'no message'
    if hasattr(wall.router, 'msg'):
        print 'router: ' + wall.router.msg
        del wall.router.msg
    else:
        print 'router: ' + 'no message'
This module defines a function destfn which uses the two different mechanisms to receive the messages emitted by source. It allows for the possibility that the variable 'msg' may not exist. destfn also deletes the variables once they have been displayed.
main.py
import source, dest
source.sourcefn()
dest.destfn() # variables deleted after this call
dest.destfn()
This module calls the previously defined functions in sequence. After the first call to dest.destfn the variables wall.msg and wall.router.msg no longer exist.
The output from the program is:
global: Hello world!
router: Hello world!
global: no message
router: no message
The above code fragments show that the module/global and the class/class variable mechanisms are essentially identical.
If a lot of variables are to be shared, namespace pollution can be managed either by using several wall-type modules, e.g. wall1, wall2 etc. or by defining several router-type classes in a single file. The latter is slightly tidier, so perhaps represents a marginal advantage for use of the class-variable mechanism.
This sounds like modifying the __builtin__ name space. To do it:
import __builtin__
__builtin__.foo = 'some-value'
Do not use the __builtins__ directly (notice the extra "s") - apparently this can be a dictionary or a module. Thanks to ΤΖΩΤΖΙΟΥ for pointing this out, more can be found here.
Now foo is available for use everywhere.
I don't recommend doing this generally, but the use of this is up to the programmer.
Assigning to it must be done as above, just setting foo = 'some-other-value' will only set it in the current namespace.
I use this for a couple built-in primitive functions that I felt were really missing. One example is a find function that has the same usage semantics as filter, map, reduce.
def builtin_find(f, x, d=None):
    for i in x:
        if f(i):
            return i
    return d
import __builtin__
__builtin__.find = builtin_find
Once this is run (for instance, by importing near your entry point) all your modules can use find() as though, obviously, it was built in.
find(lambda i: i < 0, [1, 3, 0, -5, -10]) # Yields -5, the first negative.
Note: You can do this, of course, with filter and another line to test for zero length, or with reduce in one sort of weird line, but I always felt it was weird.
I could achieve cross-module modifiable (or mutable) variables by using a dictionary:
# in myapp.__init__
Timeouts = {} # cross-modules global mutable variables for testing purpose
Timeouts['WAIT_APP_UP_IN_SECONDS'] = 60
# in myapp.mod1
from myapp import Timeouts
def wait_app_up(project_name, port):
    # wait for app until Timeouts['WAIT_APP_UP_IN_SECONDS']
    ...
# in myapp.test.test_mod1
from myapp import Timeouts
def test_wait_app_up_fail(self):
    timeout_bak = Timeouts['WAIT_APP_UP_IN_SECONDS']
    Timeouts['WAIT_APP_UP_IN_SECONDS'] = 3
    with self.assertRaises(hlp.TimeoutException) as cm:
        wait_app_up(PROJECT_NAME, PROJECT_PORT)
    self.assertEqual("Timeout while waiting for App to start", str(cm.exception))
    Timeouts['WAIT_APP_UP_IN_SECONDS'] = timeout_bak
When launching test_wait_app_up_fail, the actual timeout duration is 3 seconds.
