Order of evaluation of classes? - python

In a file lib.py I defined a functional class C and an enumeration class E as follows:
class C:
    a = None
    def meth(self, v):
        if v == E.v1:
            print("In C.meth().v1")
            a = E.v1
        if v == E.v2:
            print("In C.meth().v2")
            a = E.v2

from enum import Enum

class E(Enum):
    print("In Enum")
    v1 = 1
    v2 = 2
Then, I import the two classes into my module main.py and use the enumeration:
from lib import C
from lib import E
c = C()
c.meth(E.v1)
When running, I get the following output:
In Enum
In C.meth().v1
Now, since Python is an interpreted language (at least when using IDLE), I'd expect to get an error on the reference to the enumeration in the method meth. Since there is no error and it seems to run OK, I wonder what the (ordering) rules are for referencing classes in the same module and between different modules. Why is there no error?

Name lookup happens at run time. So when you are defining class C and its method meth, then the lookup on E isn’t done yet. So it’s not a problem that you define it afterwards. Instead, the lookup happens when you call the method.
Also, name lookup happens by going up the scopes, so meth will find the original E declared at module level, regardless of whether you import it in your main.py or not. Since you also import E in main.py, which is a reference to the same object, you can reference the same enum value there too.
See also this example:
>>> def test():  # foo is not defined at this time
...     print(foo)
...
>>> test()
NameError: global name 'foo' is not defined
>>> foo = 'bar'  # after defining foo, it works:
>>> test()
bar
When defining methods, variables are never “embedded”; the methods only contain the names and those names are looked up at run-time. However, due to how Python does the lookup, names of local variables are always “around” even if they haven’t been initialized yet. This can result in UnboundLocalErrors:
>>> def test():
...     print(foo)
...     foo = 'baz'
...
>>> test()
UnboundLocalError: local variable 'foo' referenced before assignment
One might expect that foo would be looked up in the outer scope for the first print, but because there is a local foo (even if it wasn’t initialized yet), foo will always* resolve to the local foo.
(* The nonlocal statement allows you to make foo non-local, resolving it to the outer scope, again for all uses of foo in that function.)
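For completeness, a minimal sketch (not from the original answer) of nonlocal in Python 3; it resolves foo to the nearest enclosing function scope, for every use of foo in the inner function:
>>> def test():
...     foo = 'baz'
...     def inner():
...         nonlocal foo   # resolve foo in test's scope, for all uses
...         print(foo)
...         foo = 'qux'
...     inner()
...     print(foo)
...
>>> test()
baz
qux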

When a module is imported, its statements are executed from top to bottom. The statements inside a class definition are executed as well; that is how the methods inside the class get defined. A def statement defines a function, but the statements inside the def are only compiled, not executed.
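A short sketch of that (demo.py is a hypothetical file name):

# demo.py
print("module level")         # runs when the module is imported

class Demo:
    print("class body")       # also runs at import, while the class is built

    def method(self):
        print("method body")  # runs only when the method is actually called

Importing demo prints "module level" and "class body"; "method body" only appears on an actual call to Demo().method().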

The simplest way to understand the order of evaluation in your code is to watch it execute:
http://dbgr.cc/q
Press the play button on the far right of the debug buttons and it will automatically step through.
I think what is confusing to you is that when class E is defined, all statements inside of the E class are run. This is the case for every class definition. This includes calling the print function to say "In Enum", as well as defining the v1 and v2 members of the E class.
The line c.meth(E.v1) isn't executed until both the C and the E classes have been defined, which means that E.v1 has also already been defined. This is why there is no error like you were expecting.

Related

Preventing a function from looking up variables outside it [duplicate]

In short, the question: Is there a way to prevent Python from looking up variables outside the current scope?
Details:
Python looks for variable definitions in outer scopes if they are not defined in the current scope. Thus, code like this is liable to break when not being careful during refactoring:
def line(x, a, b):
    return a + x * b

a, b = 1, 1
y1 = line(1, a, b)
y2 = line(1, 2, 3)
If I renamed the function arguments, but forgot to rename them inside the function body, the code would still run:
def line(x, a0, b0):
    return a + x * b  # not an error

a, b = 1, 1
y1 = line(1, a, b)  # correct result by coincidence
y2 = line(1, 2, 3)  # wrong result
I know it is bad practice to shadow names from outer scopes. But sometimes we do it anyway...
Is there a way to prevent Python from looking up variables outside the current scope? (So that accessing a or b raises an Error in the second example.)
Yes, though maybe not in general; you can do it for functions.
What you want is for the function's globals to be empty. You can't replace the globals, and you don't want to modify their contents (because that would just amount to getting rid of the global variables and functions).
However, you can create function objects at runtime. The constructor looks like types.FunctionType(code, globals[, name[, argdefs[, closure]]]). There you can replace the global namespace:
import types

def line(x, a0, b0):
    return a + x * b  # will be an error

a, b = 1, 1
y1 = line(1, a, b)  # correct result by coincidence

line = types.FunctionType(line.__code__, {})
y1 = line(1, a, b)  # fails since global name 'a' is not defined
You can of course clean this up by defining your own decorator:
import types

noglobal = lambda f: types.FunctionType(f.__code__, {}, argdefs=f.__defaults__)

@noglobal
def f():
    return x

x = 5
f()  # will fail
Strictly speaking, you do not forbid it to access global variables; you just make the function believe there are no variables in the global namespace. You can also use this to emulate static variables: if the function declares a variable global and assigns to it, the variable ends up in the function's own sandboxed global namespace.
If you want the function to be able to access part of the global namespace, you'll need to populate the function's global sandbox with what you want it to see.
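As a sketch of the static-variable remark (my illustration, not part of the answer above): since the rebuilt function carries its own globals dict, a global assignment inside it lands in that private dict and survives between calls:

import builtins
import types

def tick():
    global count                           # assigns into the function's own globals dict
    count = globals().get('count', 0) + 1
    return count

# rebind tick to a copy whose "global" namespace is a private sandbox
tick = types.FunctionType(tick.__code__, {'__builtins__': builtins})
print(tick())  # 1
print(tick())  # 2 -- count persists in the sandbox, invisible to the module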
No, you cannot tell Python not to look names up in the global scope.
If you could, you would not be able to use any other classes or functions defined in the module, no objects imported from other modules, nor could you use built-in names. Your function namespace becomes a desert devoid of almost everything it needs, and the only way out would be to import everything into the local namespace. For every single function in your module.
Rather than try to break global lookups, keep your global namespace clean. Don't add globals that you don't need to share with other scopes in the module. Use a main() function for example, to encapsulate what are really just locals.
Also, add unit testing. Refactoring without (even just a few) tests is prone to introduce bugs.
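A minimal sketch of that advice, reusing the line function from the question:

def line(x, a, b):
    return a + x * b

def main():
    a, b = 1, 1           # locals now, not module globals
    print(line(1, a, b))
    print(line(1, 2, 3))

if __name__ == '__main__':
    main()

If a refactored line accidentally refers to a or b, the call now fails with a NameError instead of silently picking up module globals.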
With @skyking's answer, I was unable to access any imports (I could not even use print). Also, functions with optional arguments are broken (compare: How can an optional parameter become required?).
@Ax3l's comment improved that a bit. Still, I was unable to access imported variables (from module import var).
Therefore, I propose this:
import types

def noglobal(f):
    return types.FunctionType(f.__code__, globals().copy(), f.__name__,
                              f.__defaults__, f.__closure__)
For each function decorated with @noglobal, this creates a copy of the globals() defined so far. This keeps imported variables (usually imported at the top of the document) accessible. If you do it like me, defining your functions first and then your variables, this will achieve the desired effect of being able to access imported variables in your function, but not the ones you define in your own code. Since copy() creates a shallow copy (see: Understanding dict.copy() - shallow or deep?), this should be pretty memory-efficient, too.
Note that this way, a function can only call functions defined above itself, so you may need to reorder your code.
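A hypothetical usage sketch (math and data are example names, and the decorator is repeated so the snippet is self-contained):

import math
import types

def noglobal(f):
    return types.FunctionType(f.__code__, globals().copy(), f.__name__,
                              f.__defaults__, f.__closure__)

@noglobal
def circle_area(r):
    return math.pi * r ** 2   # works: math was imported before decoration

@noglobal
def needs_data():
    return data               # NameError: data did not exist at decoration time

data = [1, 2, 3]
print(circle_area(1.0))       # 3.141592653589793
needs_data()                  # raises NameError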
For the record, I copy @Ax3l's version from his Gist:
import types

def imports():
    for name, val in globals().items():
        # module imports
        if isinstance(val, types.ModuleType):
            yield name, val
        # functions / callables
        if hasattr(val, '__call__'):
            yield name, val

noglobal = lambda fn: types.FunctionType(fn.__code__, dict(imports()))
To discourage global variable lookup, move your function into another module. Unless it inspects the call stack or explicitly imports your calling module, it won't have access to the globals from the module that calls it.
In practice, move your code into a main() function, to avoid creating unnecessary global variables.
If you use globals because several functions need to manipulate shared state then move the code into a class.
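For instance, a minimal sketch of the class suggestion (Counter is a hypothetical example):

class Counter:
    # shared state lives on the instance instead of in module globals
    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1

c = Counter()
c.increment()
print(c.count)  # 1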
As mentioned by @bers, the decorator by @skyking breaks most Python functionality inside the function, such as print() and the import statement. @bers hacked around the import statement by adding the currently imported modules from globals() at the time of decorator definition.
This inspired me to write yet another decorator that hopefully does what most people who come looking at this post actually want. The underlying problem is that the new function created by the previous decorators lacked the __builtins__ variable, which contains all of the standard built-in Python functions (e.g. print) available in a freshly opened interpreter.
import types
import builtins

def no_globals(f):
    '''
    A function decorator that prevents functions from looking up
    variables in the outer scope.
    '''
    # need builtins in globals, otherwise we can't import or print inside the function
    new_globals = {'__builtins__': builtins}
    new_f = types.FunctionType(f.__code__, globals=new_globals, argdefs=f.__defaults__)
    new_f.__annotations__ = f.__annotations__  # for some reason annotations aren't copied over
    return new_f
Then the usage goes as follows:
@no_globals
def f1():
    return x

x = 5
f1()  # should raise NameError

@no_globals
def f2(x):
    import numpy as np
    print(x)
    return np.sin(x)

x = 5
f2(x)  # should print 5 and return -0.9589242746631385
Theoretically, you can use your own decorator that removes entries from globals() for the duration of a function call. There is some overhead in hiding all of the globals, but if there are not too many of them it could be useful. During the operation we do not create or remove the global objects themselves; we just overwrite the references in the dictionary that refers to them. But do not remove special globals (like __builtins__) or modules, and you probably do not want to remove callables from the global scope either.
from types import ModuleType
import re

# the decorator to hide global variables
def noglobs(f):
    def inner(*args, **kwargs):
        RE_NOREPLACE = r'__\w+__'
        old_globals = {}
        # remove keys from globals(), storing the values in old_globals
        for key, val in globals().iteritems():
            if re.match(RE_NOREPLACE, key) is None and not isinstance(val, ModuleType) and not callable(val):
                old_globals.update({key: val})
        for key in old_globals.keys():
            del globals()[key]
        result = f(*args, **kwargs)
        # restore the globals
        for key in old_globals.iterkeys():
            globals()[key] = old_globals[key]
        return result
    return inner
# the example of usage
global_var = 'hello'

@noglobs
def no_globals_func():
    try:
        print 'Can I use %s here?' % global_var
    except NameError:
        print 'Name "global_var" is unavailable here'

def globals_func():
    print 'Can I use %s here?' % global_var

globals_func()
no_globals_func()
print 'Can I use %s here?' % global_var
...
Can I use hello here?
Name "global_var" is unavailable here
Can I use hello here?
Or, you can iterate over all global callables (i.e. functions) in your module and decorate them dynamically (it's a little more code).
The code is for Python 2; I think it's possible to write very similar code for Python 3.

Why does an imported function "as" another name keep its original __name__?

Here:
from os.path import exists as foo
print foo.__name__
we get: 'exists'.
Why not 'foo'? Which attribute would give 'foo'?
You can view import foo as bar as just an assignment. You would not expect a function to change its __name__ attribute when you assign another name to the function.
>>> def foo(): pass
>>>
>>> foo.__name__
'foo'
>>> bar = foo
>>> bar.__name__
'foo'
Thanks. What attribute of the variable bar would return the string 'bar' then?
There is no such attribute. Names (bar) refer to values (the function object) unidirectionally.
The __name__ attribute of a function is set to the name the function was defined with via the def ... syntax. That's why you don't get a meaningful __name__ attribute if you define an anonymous function and assign the name foo after it has been created.
>>> foo = lambda: None
>>> foo.__name__
'<lambda>'
Importing an object just binds a new variable, and all that adding as newname does is let you pick an alternative name to use for the variable in the current namespace.
The __name__ attribute on an object says nothing about the name it is currently bound to, you can have any number of variables as well as containers such as lists or dictionaries pointing to the same object, after all:
def foo(): pass
bar = foo
spam = foo
list_of_functions = [foo]
dictionary_of_functions = {'monty': foo, 'python': foo}
The above created 4 additional references to the function object; you can't have foo.__name__ reflect all of those, and the references in list_of_functions and dictionary_of_functions do not (directly) have names.
Since import foo, import bar as foo, from module import foo and from module import bar as foo all just set the name foo in the current module, they are treated the exact same way as other assignments. You could import the function more than once, under different names, too.
Instead, the __name__ value of a function is set to the name it was defined with in the def <name>(...): statement. It is a debugging aid, at most. It is used in tracebacks, for example, to make it easier to identify lines of code shown in the traceback. You'd only set the __name__ to something else if that would help identify the location better. (Note: in Python 3, there is also the __qualname__ attribute, which is used instead of __name__ as it includes more information on where the function is defined when nested or defined on a class.)
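A quick illustration of the two attributes (Python 3):

class Outer:
    def method(self):
        pass

print(Outer.method.__name__)      # 'method'
print(Outer.method.__qualname__)  # 'Outer.method'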
The as is syntactic sugar in the file/session doing the import, while the __name__ attribute is part of the function object.

Understanding `from ... import ...` behavior [duplicate]

I wonder why importing a variable in Python (3.4) gives a different result than importing the module and referencing it as an attribute, and moreover why a copy seems to be made. Is there a way to bypass the copy (and not by defining a function that simply returns it)?
a.py
v = 1

def set():
    global v
    v = 3
main.py
import a
import b
a.set()
b.foo()
b.py
import a
from a import v

def foo():
    print(v)
    print(a.v)
    print(id(v))
    print(id(a.v))
Output
1
3
1585041872
1585041904
The problem is that you're rebinding a simple scalar value. This is not a problem specific to modules; it would work the same when simply passing the variable into a function and reassigning it there.
The value 1 is imported from a, period. Whatever you do in a afterwards will not change it, because it's a simple immutable scalar value; a.set() rebinds a.v to a different object rather than modifying the one b.v refers to.
If a.v was an object, changes to this object would propagate to any variable holding a reference to it.
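A sketch of that point, as a hypothetical variant of the question's code where a.v is a mutable list:

# a.py
v = [1]

def set():
    v[0] = 3      # mutate the list in place; no rebinding happens

# b.py
from a import v

def foo():
    print(v)      # [3] after a.set(): both names refer to the same list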
Asked a duplicate question myself, and with the help of others I figured out what it is. Here's what I found out. With pydoc links:
from a import v does not add a reference to a.v. Instead, it adds a new variable v to b (i.e. b.v) with the value of a.v at the time the import happened. Changing a.v later does not change the value of b.v.
Python 2
The from form does not bind the module name: it goes through the list of identifiers, looks each one of them up in the module found in step (1), and binds the name in the local namespace to the object thus found.
Python 3
The from form uses a slightly more complex process:
find the module specified in the from clause, loading and initializing it if necessary;
for each of the identifiers specified in the import clauses:
check if the imported module has an attribute by that name
if not, attempt to import a submodule with that name and then check the imported module again for that attribute
if the attribute is not found, ImportError is raised.
otherwise, a reference to that value is stored in the local namespace, using the name in the as clause if it is present, otherwise using the attribute name
The keyword here is in the local namespace.
Let's examine the sequence of events:
a.v = 1 # a.py: v = 1
b.v = a.v # b.py: from a import v
a.v = 3 # a.set()
print(b.v) # foo(): print(v)
print(a.v) # foo(): print(a.v)
As you can see, from a import v actually binds b.v to the value a.v had at that point, and later modifications to the original variable don't affect the copy.
When you say import a, you are creating a reference to the module; a.v is not copied, so a change in one module is noticed in all modules. When you say from a import v, you are binding a new name to the object v refers to at the time of the import. If either name is later rebound, the change is not reflected in the other.
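The usual way to observe updates, then, is to import the module and read the attribute at access time; a short sketch using the question's files:

# b.py
import a

def foo():
    print(a.v)    # looked up on the module object at call time

# main.py
import a
import b

a.set()
b.foo()           # prints 3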

Using global name in a nested function

As I understand the global statement in the code below, it should prevent function_two from rebinding the name test and instead modify test in function_one. However, I get NameError: global name 'test' is not defined.
def function_one():
    test = 1
    def function_two():
        global test
        test += 1
    function_two()
    print test

function_one()
I have looked and I can't find an example like this. What am I missing?
Python 2 does not support the concept of a non-local. Closures (accessing test from a parent function) only support read access, not assignment in Python 2.
The global keyword really does mean global, e.g. that the name lives in the module (global) namespace. The namespace of the function_one() function is not global, it is local (to that function).
In Python 3, you can mark a name as nonlocal, which would make your example work as expected. See PEP 3104 - Access to Names in Outer Scopes.
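For reference, here is the question's code ported to Python 3 with nonlocal:

def function_one():
    test = 1
    def function_two():
        nonlocal test  # rebind test in function_one's scope
        test += 1
    function_two()
    print(test)        # 2

function_one()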
In Python 2, you'll have to resort to tricks instead. Make the name an attribute of the nested function, for example; reading the closed-over function object is allowed, as is setting attributes on it:
def function_one():
    def function_two():
        function_two.test += 1
    function_two.test = 1
    function_two()
    print function_two.test
Another trick is to use a mutable object, such as a list or a dictionary. Again, you are only reading the closed-over name, then altering the resulting object directly:
def function_one():
    test = [1]
    def function_two():
        test[0] += 1
    function_two()
    print test[0]
