Conflicting variable and function names in Python

Let's say I've got the following functions:
def xplusy(x, y):
    return x + y

def xplus1(x):
    xplusy = xplusy(x, 1)
    return xplusy
Now if I call a = xplus1(4) it throws the following error:
UnboundLocalError: local variable 'xplusy' referenced before assignment
The error is caused by the naming conflict; if I redefine xplus1 as follows:
def xplus1(x):
    s = xplusy(x, 1)
    return s
it works fine.
Why is this so: can't the compiler properly distinguish between a variable and a function call?
Any ways around it?

In Python, functions are data, and typing is dynamic. This means that the following lines are valid Python:
def func(x):
    return x + 3

func = 3
func is now an int. The original function func is no longer referenced. The fact that func was originally a function doesn't have any bearing on what types of data can be assigned to it in the future. (This is what "dynamic typing" means.)
Since there's no static typing and "function" is just another data type, it wouldn't make sense for the Python interpreter to distinguish between a function and any other piece of data referenced by the same name. So within a given scope, there's no way to use the same unqualified name to mean two different things.
In your particular case, if the code in your xplus1 function meant anything, it would mean "compute the value of xplusy(x, 1) and assign that value to the variable xplusy -- thereby losing the reference to the function xplusy." However, because you assign to xplusy somewhere in the function body, the compiler treats xplusy as a local variable throughout the entire function (an assignment inside a function never implicitly targets an outer scope). That local variable hasn't been bound yet at the point where you try to call it, so xplusy(x, 1) fails. The globally defined function is not used as a fall-back because, again, a single unqualified name can't refer to two different things in the same scope.
Another example demonstrating the "no duplication of variable names within one scope" rule (which I actually only just discovered while playing around with the prompt in my attempt to construct this answer):
>>> def f1():
...     a = xplusy(3,4)
...     xplusy = 5
...     print xplusy
...
>>> f1()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 2, in f1
UnboundLocalError: local variable 'xplusy' referenced before assignment
>>> def f1():
...     a = xplusy(3,4)
...     print a
...
>>> f1()
7
This demonstrates that it really is the scope, not the statement that requires unique names.
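If you want to see where this decision is made, the dis module (a sketch below; the exact bytecode instruction names vary between Python versions) shows that the choice between a global and a local lookup is fixed when the function body is compiled, not when the line runs:
import dis

def xplusy(x, y):
    return x + y

def uses_global(x):
    return xplusy(x, 1)    # no assignment to xplusy, so it is looked up as a global

def shadows_global(x):
    xplusy = xplusy(x, 1)  # an assignment anywhere in the body makes xplusy local everywhere
    return xplusy

dis.dis(uses_global)       # the call compiles to a global lookup (LOAD_GLOBAL xplusy)
dis.dis(shadows_global)    # the call compiles to a local lookup (LOAD_FAST xplusy), hence the UnboundLocalError at run time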
EDIT: This is a really cool post that explains this and other scoping-related behavior: http://me.veekun.com/blog/2011/04/24/gotcha-python-scoping-closures/

In Python, a function is a first-class object, which means it is treated like any other object.
For more information, see "What are "first class" objects?"

The reason this occurs is that xplusy exists in your scope as a global variable, which cannot be rebound from inside a function unless you explicitly declare xplusy as global.
def xplusy(x, y):
    return x + y

def xplus1(x):
    global xplusy
    xplusy = xplusy(x, 1)
    return xplusy
However, this rebinds xplusy to an int, float or whatever xplusy returned the first time, so every call after the first will raise a TypeError.
The more pythonic way to do this would most likely be
def xplus1(x):
    return xplusy(x, 1)
or using the functools module:
from functools import partial
xplus1 = partial(xplusy, y=1)  # since you wanted to override y with 1
if you don't care about which argument is overridden, you could simply do
xplus1 = partial(xplusy, 1)
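A quick sanity check of both spellings (a sketch; the wrapper and partial names are just for illustration, assuming the xplusy defined above):
from functools import partial

def xplusy(x, y):
    return x + y

def xplus1_wrapper(x):
    return xplusy(x, 1)

xplus1_partial = partial(xplusy, y=1)

print(xplus1_wrapper(4))  # 5
print(xplus1_partial(4))  # 5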

There are cases, for example with callbacks, where you refer to a function by its bare name (funcName instead of funcName()). In Python that name is simply a variable bound to the function object, much like a lambda you store in a variable, so the name is already "taken" by the function.
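A minimal sketch of that idea (the names below are made up for illustration): passing a function by name and storing a lambda in a variable both work because a function is just an ordinary value:
def announce(result):
    print("result is", result)

def apply_and_report(func, callback, value):
    callback(func(value))              # call the passed-in function, then the callback

square = lambda n: n * n               # a function stored in a variable
apply_and_report(square, announce, 4)  # prints: result is 16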

In Python, xplusy can reference anything. You can do this:
def xplusy(x, y):
    return x + y

def otherfunction(x, y):
    return x * y

def xplus1(x):
    xplusy = otherfunction
    return xplusy(x, 1)
And the xplusy variable will refer to otherfunction inside xplus1's scope.
>>> xplus1(2)
2
The result is 2 * 1 instead of 2 + 1. Be careful with assignments. Use as many distinct variable names as you need.

Related

Precise rules of variable binding in nested scopes [duplicate]

I seem to have misunderstood something about Python variable binding. What are the precise rules for deciding which variable is accessed given a nested scope with shadowing names?
Let me illustrate with some examples. First the basic shadow.
a = 1
def foo():
    a = 2
    def _foo():
        return a
    return _foo()
print(foo()) # -> 2
Everything is fine here. The value is shadowed and returned accordingly. However, even if the value is assigned after the inner function's definition, the inner value is still the one returned:
def bar():
    def _bar():
        return a
    a = 2
    return _bar()
print(bar()) # -> 2
What's more, defining a function that references a non-existent variable is possible.
def baz():
    def _baz():
        return b
    return _baz()
Then, if b is defined later, the function can be executed. But not if it is defined in another inner scope:
def qux(f):
    b = 3
    return f()
print(qux(baz())) # -> NameError
Now all of these cases could be explained by having Python know about lines that come later in the program, but that conflicts with my knowledge of Python being an interpreted language, advancing line by line. So are statements parsed at once instead of line by line?
A weird behaviour with shadowing class attributes throws me off a bit more.
class C:
    a = 2
    b = a
    def meth(self):
        return a
    c = meth
print(C.b, C().meth(), C.c) # -> 2 1 C.meth
Here a is defined as a class attribute and is successfully used in b, but this does not carry over to the method definition. The method itself can be used in later attributes, but not for example in other methods without going through self.
Is my guess about the binding happening all at once correct? And in that case are class bodies an exception by design, or are they not a scope at all? Or is something else going on here?
I think you might be overthinking it.
By default, variables when created are put in the narrowest enclosing function's scope.
Variables from all enclosing scopes are available in a read-only capacity, be that an enclosing function's scope or the global scope. If you try to assign to this, it'll create a new variable in the narrowest enclosing scope, shadowing those outside. Using the global keyword to bring an external variable into the local scope will stop this from happening, allowing you to assign things to the non-local scope.
Additionally, keep in mind that functions are compiled and evaluated at the time when the def statement is interpreted. For nested functions, essentially, every new call re-evaluates the inner functions. This also means that inner functions have read-only access to the scope of the outer functions. Same rules as usual.
Your bar() example works because, by the time python tries to access the variable a, it is present in at least one of the enclosing scopes. Python doesn't check these things until the last possible moment. Your qux() example doesn't work because the scope in which b is declared does not enclose the scope where _baz() is defined, and thus is not accessible.
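A small illustration of that "last possible moment" behaviour (a sketch; the helper names are made up):
def make_reader():
    def read():
        return value   # not checked until read() is actually called
    value = 10         # bound after the inner def, but before the call
    return read

print(make_reader()())  # 10 -- the name is looked up at call time, not at def time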
Class scopes are weird. When the class is evaluated, all variables defined inside it are bound to the class. However, the class doesn't really count as a scope of its own, for the purpose of the methods enclosed inside it. Think of meth() as an unbound function, declared in the global scope, which C.meth refers to (and, now, C.c). Calling a function via dot notation is a syntactic shorthand:
# the following two are identical
C().meth()
C.meth(C())
and while C.meth is technically bound to C, it's not enclosed in C's class-level namespace. Trying to do C().meth() will fail, because a is not defined with respect to the function. (note that if a is defined in the global scope, the function will work as expected - C.meth() has the global scope as a parent, not C's class-level scope).
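To make the class-scope rule concrete, here is a sketch of the usual ways a method actually reaches a class attribute (the class name D is just for illustration):
class D:
    a = 2

    def bad(self):
        return a          # NameError unless a global 'a' happens to exist

    def good(self):
        return self.a     # instance/attribute lookup, resolved at call time

    def also_good(self):
        return D.a        # explicit class attribute lookup

print(D().good(), D().also_good())  # 2 2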

Why isn't Python switching from enclosing to local variable scope?

I was just checking my mental model about scoping in Python and got confused. The first two examples match my model, the 3rd example doesn't.
I assumed that Python has 4 scopes:
Local
Enclosed
Global
Built-in
I imagine those 4 scopes like dictionaries. The built-in one is pre-defined and the other ones get generated after some actions:
Global: The main script file creates a variable. This scope is killed once the script finishes executing.
Local: Within a function, a variable is created. This scope is killed once the function finishes executing.
Enclosed: A function B is defined within a function A. Once B is called, the local scope of A becomes the enclosed scope of B. This scope is killed once B finishes executing.
I assumed that Python has those 4 dictionaries in memory and essentially tries every time all 4:
Does the variable exist in the local scope? Use it. If not, go to 2.
Does the variable exist in the enclosed scope? Use it. If not, go to 3.
Does the variable exist in the global scope? Use it. If not, go to 4.
Does the variable exist in the built-ins? Use it. If not, throw a NameError.
I especially assumed that a variable could switch from the enclosed scope being used to the local scope being used. This is obviously not the case. Could somebody explain why? Is there perhaps a bigger difference between my mental model and what actually happens?
Example 1
This prints "local"
def foo():
    min = lambda n: "enclosing"
    def bar():
        """Bar is enclosed by 'foo'"""
        min = lambda n: "local"
        print(min([1, 2, 3]))
    bar()

foo()
Example 2
This prints "enclosing"
def foo():
    min = lambda n: "enclosing"
    def bar():
        """Bar is enclosed by 'foo'"""
        print(min([1, 2, 3]))
    bar()

foo()
Example 3
def foo():
    min = lambda n: "enclosing"
    def bar():
        """Bar is enclosed by 'foo'"""
        print(min([1, 2, 3]))
        min = lambda n: "local"
        print(min([1, 2, 3]))
    bar()

foo()
gives
Traceback (most recent call last):
  File "example.py", line 13, in <module>
    foo()
  File "example.py", line 10, in foo
    bar()
  File "example.py", line 6, in bar
    print(min([1,2,3]))
UnboundLocalError: local variable 'min' referenced before assignment
The rule is that if you assign to a variable anywhere inside the body of a function, the variable is considered local - in the whole function.
If you want to refer to a global variable, you have to say so explicitly by declaring
global my_variable
and if you mean to refer to the variable in the closest enclosing scope that defines it, you have to declare it as
nonlocal my_variable
So, there is nothing special happening here: the general rule for deciding that a variable is local still applies.
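For instance, Example 3 can be made to print "enclosing" and then "local" by declaring the name nonlocal (a sketch of one possible fix; note that it also rebinds foo's variable):
def foo():
    min = lambda n: "enclosing"
    def bar():
        nonlocal min                  # 'min' now refers to foo's variable
        print(min([1, 2, 3]))         # enclosing
        min = lambda n: "local"
        print(min([1, 2, 3]))         # local
    bar()

foo()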

Preventing a function from looking up variables outside it [duplicate]

In short, the question: Is there a way to prevent Python from looking up variables outside the current scope?
Details:
Python looks for variable definitions in outer scopes if they are not defined in the current scope. Thus, code like this is liable to break when not being careful during refactoring:
def line(x, a, b):
    return a + x * b

a, b = 1, 1
y1 = line(1, a, b)
y2 = line(1, 2, 3)
If I renamed the function arguments, but forgot to rename them inside the function body, the code would still run:
def line(x, a0, b0):
    return a + x * b  # not an error

a, b = 1, 1
y1 = line(1, a, b)  # correct result by coincidence
y2 = line(1, 2, 3)  # wrong result
I know it is bad practice to shadow names from outer scopes. But sometimes we do it anyway...
Is there a way to prevent Python from looking up variables outside the current scope? (So that accessing a or b raises an Error in the second example.)
Yes, though maybe not in general; you can do it for functions.
What you want is for the function's globals to be empty. You can't replace a function's globals attribute, and you don't want to modify its contents (because that would just amount to getting rid of your global variables and functions).
However, you can create function objects at runtime. The constructor looks like types.FunctionType(code, globals[, name[, argdefs[, closure]]]), and there you can supply a replacement global namespace:
import types

def line(x, a0, b0):
    return a + x * b  # will be an error

a, b = 1, 1
y1 = line(1, a, b)  # correct result by coincidence

line = types.FunctionType(line.__code__, {})
y1 = line(1, a, b)  # fails since the global name is not defined
You can of course clean this up by defining your own decorator:
import types

noglobal = lambda f: types.FunctionType(f.__code__, {}, argdefs=f.__defaults__)

@noglobal
def f():
    return x

x = 5
f()  # will fail
Strictly speaking, you do not forbid the function from accessing global variables; you just make it believe there are no variables in the global namespace. You can also use this to emulate static variables: if the function declares a name global and assigns to it, the name ends up in the function's own sandboxed global namespace.
If you want the function to be able to access part of the real global namespace, you'll need to populate its global sandbox with whatever you want it to see.
No, you cannot tell Python not to look names up in the global scope.
If you could, you would not be able to use any other classes or functions defined in the module, no objects imported from other modules, nor could you use built-in names. Your function namespace becomes a desert devoid of almost everything it needs, and the only way out would be to import everything into the local namespace. For every single function in your module.
Rather than try to break global lookups, keep your global namespace clean. Don't add globals that you don't need to share with other scopes in the module. Use a main() function for example, to encapsulate what are really just locals.
Also, add unit tests. Refactoring without (even just a few) tests is prone to introduce bugs.
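A minimal sketch of that main() pattern (purely illustrative):
def line(x, a, b):
    return a + x * b

def main():
    # a and b live here instead of in the module's global namespace,
    # so a typo inside line() cannot silently pick them up
    a, b = 1, 1
    print(line(1, a, b))
    print(line(1, 2, 3))

if __name__ == "__main__":
    main()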
With @skyking's answer, I was unable to access any imports (I could not even use print). Also, functions with optional arguments are broken (compare How can an optional parameter become required?).
@Ax3l's comment improved that a bit. Still, I was unable to access imported variables (from module import var).
Therefore, I propose this:
import types

def noglobal(f):
    return types.FunctionType(f.__code__, globals().copy(), f.__name__, f.__defaults__, f.__closure__)
For each function decorated with @noglobal, that creates a copy of the globals() defined so far. This keeps imported variables (usually imported at the top of the file) accessible. If you do it like me, defining your functions first and then your variables, this will achieve the desired effect of being able to access imported variables in your function, but not the ones you define in your code. Since copy() creates a shallow copy (Understanding dict.copy() - shallow or deep?), this should be pretty memory-efficient, too.
Note that this way, a function can only call functions defined above itself, so you may need to reorder your code.
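Usage might look like this (a sketch, assuming the noglobal decorator above; math is imported before the decorated functions are defined, so it is in the copied globals, while radius is defined afterwards and is not):
import math
import types

def noglobal(f):
    return types.FunctionType(f.__code__, globals().copy(), f.__name__, f.__defaults__, f.__closure__)

@noglobal
def circle_area(r):
    return math.pi * r * r   # works: math was in globals() when the copy was made

@noglobal
def broken():
    return radius            # radius is defined below, so it is not in the copy

radius = 2.0
print(circle_area(radius))   # 12.566...
broken()                     # raises NameError: name 'radius' is not defined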
For the record, I copy @Ax3l's version from his Gist:
import types

def imports():
    for name, val in globals().items():
        # module imports
        if isinstance(val, types.ModuleType):
            yield name, val
        # functions / callables
        if hasattr(val, '__call__'):
            yield name, val

noglobal = lambda fn: types.FunctionType(fn.__code__, dict(imports()))
To discourage global variable lookup, move your function into another module. Unless it inspects the call stack or imports your calling module explicitly, it won't have access to the globals from the module that calls it.
In practice, move your code into a main() function, to avoid creating unnecessary global variables.
If you use globals because several functions need to manipulate shared state then move the code into a class.
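A sketch of that last suggestion (the class and attribute names are purely illustrative):
class LineModel:
    """Holds the state that several functions previously shared via globals."""

    def __init__(self, a, b):
        self.a = a
        self.b = b

    def line(self, x):
        return self.a + x * self.b

model = LineModel(1, 1)
print(model.line(1))  # 2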
As mentioned by @bers, the decorator by @skyking breaks most Python functionality inside the function, such as print() and the import statement. @bers hacked around the import statement by adding the currently imported modules from globals() at the time of the decorator definition.
This inspired me to write yet another decorator that hopefully does what most people who come looking at this post actually want. The underlying problem is that the new function created by the previous decorators lacked the __builtins__ variable, which contains all of the standard built-in Python functions (e.g. print) available in a freshly opened interpreter.
import types
import builtins

def no_globals(f):
    '''
    A function decorator that prevents functions from looking up variables in outer scope.
    '''
    # need builtins in globals, otherwise we can't import or print inside the function
    new_globals = {'__builtins__': builtins}
    new_f = types.FunctionType(f.__code__, globals=new_globals, argdefs=f.__defaults__)
    new_f.__annotations__ = f.__annotations__  # for some reason annotations aren't copied over
    return new_f
Then the usage goes as follows:
@no_globals
def f1():
    return x

x = 5
f1()  # should raise NameError

@no_globals
def f2(x):
    import numpy as np
    print(x)
    return np.sin(x)

x = 5
f2(x)  # should print 5 and return -0.9589242746631385
Theoretically you can use your own decorator that removes entries from globals() for the duration of a function call. There is some overhead in hiding all the globals, but if there are not too many of them it can be useful. The operation does not create or delete the global objects themselves; it only removes and then restores the references to them in the globals() dictionary. But do not remove the special entries (like __builtins__) or modules, and you probably do not want to remove callables from the global scope either.
from types import ModuleType
import re

# the decorator to hide global variables
def noglobs(f):
    def inner(*args, **kwargs):
        RE_NOREPLACE = '__\w+__'
        old_globals = {}
        # remove keys from globals(), storing the global values in old_globals
        for key, val in globals().iteritems():
            if re.match(RE_NOREPLACE, key) is None and not isinstance(val, ModuleType) and not callable(val):
                old_globals.update({key: val})
        for key in old_globals.keys():
            del globals()[key]
        result = f(*args, **kwargs)
        # restore globals
        for key in old_globals.iterkeys():
            globals()[key] = old_globals[key]
        return result
    return inner

# the example of usage
global_var = 'hello'

@noglobs
def no_globals_func():
    try:
        print 'Can I use %s here?' % global_var
    except NameError:
        print 'Name "global_var" is unavailable here'

def globals_func():
    print 'Can I use %s here?' % global_var

globals_func()
no_globals_func()
print 'Can I use %s here?' % global_var
...
Can I use hello here?
Name "global_var" is unavailable here
Can I use hello here?
Or, you can iterate over all global callables (i.e. functions) in your module and decorate them dynamically (it's a little more code).
The code is for Python 2; I think it's possible to write very similar code for Python 3.
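For what it's worth, a Python 3 version of the same idea might look roughly like this (a sketch, not exhaustively tested; it keeps dunder names, modules and callables, and hides everything else for the duration of the call):
import re
from types import ModuleType

def noglobs(f):
    def inner(*args, **kwargs):
        keep = re.compile(r'__\w+__')
        # collect the plain data globals that should be hidden during the call
        hidden = {
            key: val for key, val in list(globals().items())
            if keep.match(key) is None
            and not isinstance(val, ModuleType)
            and not callable(val)
        }
        for key in hidden:
            del globals()[key]
        try:
            return f(*args, **kwargs)
        finally:
            globals().update(hidden)  # restore the hidden names even if f raises
    return inner

global_var = 'hello'

@noglobs
def no_globals_func():
    try:
        print('Can I use %s here?' % global_var)
    except NameError:
        print('Name "global_var" is unavailable here')

no_globals_func()                          # Name "global_var" is unavailable here
print('Can I use %s here?' % global_var)   # Can I use hello here?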
