I have a Python package called just "package". In it I have an empty __init__.py and two modules. One is called m1.py and contains just one line:
x = 3
The other one is called m2.py and contains this line:
x = 5
Now I try to use those modules. First I do something like this:
from package.m1 import x
print package.m1.x
Of course it does not work; I get this error:
NameError: name 'package' is not defined
And I understand why it does not work. But then I do something like this:
from package.m1 import x
import package.m2
print package.m1.x
And now it does work. Why? How? I did not import package.m1!
I have only one explanation for this:
from package.m1 import x loads the modules package and package.m1. m1 is added to the package module but package is not added to your globals.
import package.m2 now adds the package module to your globals. Since m1 is already part of package it is now accessible via package.m1.
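You can check this in a fresh interpreter (a quick sketch; sys.modules records every module that has been loaded, while your globals only contain the names you explicitly bound):
>>> import sys
>>> from package.m1 import x
>>> 'package' in sys.modules, 'package.m1' in sys.modules
(True, True)
>>> 'package' in globals()
False
Both modules are fully loaded; only the binding of the name package in your namespace is missing, which is exactly what the later import package.m2 supplies.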
Further testing:
>>> from package import m1
>>> package.m1
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'package' is not defined
>>> import package.m2
>>> package.m1
<module 'package.m1' from 'package/m1.py'>
>>> from package import m3
>>> package.m3
<module 'package.m3' from 'package/m3.py'>
Testing continued:
>>> import package.m1
>>> del package
>>> import package
>>> package.m1
<module 'package.m1' from 'package/m1.py'>
The from x import y syntax imports the whole module and then binds the specified object in the current namespace. It can be translated as:
import x
y = x.y
So, you're actually importing package.m1.
Related
I have a package structure like this:
- src
- src/main.py
- src/package1
- src/package1/__init__.py
- src/package1/module1.py
- src/package1/module2.py
... where module2 defines a subclass of a class in module1, and therefore module1 gets referenced by an absolute import path in module2.py.
That is, in src/package1/module2.py:
from package1.module1 import SomeClassFromModule1
The problem occurs in the main.py script:
## here the imports
def main():
    # create an instance of the child class in Module2
    ...

if __name__ == "__main__":
    main()
Option 1 works. That is, in src/main.py:
from package1.module2 import SomeClassFromModule2
some_name = SomeClassFromModule2()
Option 2 does not work. That is, in src/main.py:
import package1.module2.SomeClassFromModule2
some_name = package1.module2.SomeClassFromModule2()
... causes the following error.
ModuleNotFoundError: No module named 'package1.module2.SomeClassFromModule2'; 'package1.module2' is not a package
So why is there this difference between the import and from ... import idioms?
I would be glad for some clarification.
The import x statement binds the whole module x in the file where it is used; all of the functions and classes defined in x are then reachable as attributes of it.
from x import y instead binds only the specific name y (a function or class) from that .py file, rather than the whole module.
In your case, when you import package1.module2, SomeClassFromModule2 is already imported along with the module, so you do not (and cannot) write import package1.module2.SomeClassFromModule2; the import statement only accepts modules and packages.
Since you want to use a class, create an instance of it in order to access it.
Hope this helps.
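For completeness, the two spellings that do work are these (a minimal sketch using the names from the question):
# Option 1: bind only the class
from package1.module2 import SomeClassFromModule2
some_name = SomeClassFromModule2()

# Option 2, fixed: bind the module, then reach the class as an attribute
import package1.module2
some_name = package1.module2.SomeClassFromModule2()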
After some testing, I think you cannot import a function or class using import your_module.your_class.
It's all about packages, modules, functions and classes:
# import a module
>>> import os
>>> os
<module 'os' from ooxx>
# use a module of a module (a little weird)
>>> os.path
<module 'posixpath' from ooxx>
# import a module of a module (a little weird)
>>> import os.path
# use a function
>>> os.path.dirname
<function posixpath.dirname(p)>
# you cannot import a function (or class) by using 'import your.module.func'
# 'import ooxx' always gets a module or package.
>>> import os.path.dirname
ModuleNotFoundError: No module named 'os.path.dirname'; 'os.path' is not a package
# instead, use 'from your_module import your_function_or_class'
>>> from os.path import dirname
>>> dirname
<function posixpath.dirname(p)>
I have installed so many libraries/modules/packages with pip that I can no longer tell which ones belong to the Python standard library and which do not. This causes problems when my code works on my machine but doesn't work anywhere else.
How can I check if a module/library/package that I import in my code is from the python stdlib?
Assume that the checking is done on the machine with all the external libraries/modules/packages, otherwise I could simply do a try-except import on the other machine that doesn't have them.
For example, I am sure these imports work on my machine, but on a machine with only a plain Python install they break:
from bs4 import BeautifulSoup
import nltk
import PIL
import gensim
You'd have to check all modules that have been imported to see if any of these are located outside of the standard library.
The following script is not bulletproof but should give you a starting point:
import sys
import os

external = set()
exempt = set()

paths = (os.path.abspath(p) for p in sys.path)
# sys.real_prefix only exists inside an old-style virtualenv, so fall back to sys.prefix
stdlib = {p for p in paths
          if p.startswith((sys.prefix, getattr(sys, 'real_prefix', sys.prefix)))
          and 'site-packages' not in p}

for name, module in sorted(sys.modules.items()):
    if not module or name in sys.builtin_module_names or not hasattr(module, '__file__'):
        # an import sentinel, built-in module or not a real module, really
        exempt.add(name)
        continue

    fname = module.__file__
    if fname.endswith(('__init__.py', '__init__.pyc', '__init__.pyo')):
        fname = os.path.dirname(fname)

    if os.path.dirname(fname) in stdlib:
        # stdlib path, skip
        exempt.add(name)
        continue

    parts = name.split('.')
    for i, part in enumerate(parts):
        partial = '.'.join(parts[:i] + [part])
        if partial in external or partial in exempt:
            # already listed or exempted
            break
        if partial in sys.modules and sys.modules[partial]:
            # just list the parent name and be done with it
            external.add(partial)
            break

for name in external:
    print(name, sys.modules[name].__file__)
Put this in a new module, import it after all imports in your script, and it'll print all modules that it thinks are not part of the standard library.
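For example, if you saved the script above as check_external.py (the name is just a placeholder), a script with the imports from the question would end like this:
# your_script.py
from bs4 import BeautifulSoup   # third-party, should be reported
import json                     # stdlib, should be exempt

import check_external           # placed last so sys.modules already contains everything above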
The standard library is listed in the Python documentation. You can just search there, or put the module names into a list and check against it programmatically.
Alternatively, python3.4 adds a new isolated mode (-I) that ignores user-defined library paths. In previous versions of python you can use -s to ignore the per-user site-packages directory and -E to ignore the PYTHON* environment variables.
In python2 a very simple way to check if a module is part of the standard library is to clear the sys.path:
>>> import sys
>>> sys.path = []
>>> import numpy
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named numpy
>>> import traceback
>>> import os
>>> import re
However this doesn't work in python3.3+:
>>> import sys
>>> sys.path = []
>>> import traceback
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named 'traceback'
[...]
This is because starting with python3.3 the import machinery was changed, and importing the standard library uses the same mechanism as importing any other module (see the documentation).
In python3.3+ the only way to make sure that only stdlib imports succeed is to put only the standard library path on sys.path, for example:
>>> import os, sys, traceback
>>> lib_path = os.path.dirname(traceback.__file__)
>>> sys.path = [lib_path]
>>> import traceback
>>> import re
>>> import numpy
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named 'numpy'
I used the traceback module to get the library path, since this should work on any system.
For the built-in modules, which are a subset of the stdlib modules, you can check sys.builtin_module_names.
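For example (the exact contents vary between interpreter builds, since only modules compiled into the interpreter itself are listed):
>>> import sys
>>> 'sys' in sys.builtin_module_names
True
>>> 'os' in sys.builtin_module_names
False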
@Bakuriu's answer was very useful to me. The only problem I ran into is checking whether a particular module is stdlib when it has already been imported: in that case sys.modules already has an entry for it, so even with sys.path stripped the import will succeed:
In [1]: import sys
In [2]: import virtualenv
In [3]: sys.path = []
In [4]: try:
   ...:     __import__('virtualenv')
   ...: except ImportError:
   ...:     print(False)
   ...: else:
   ...:     print(True)
   ...:
True
vs
In [1]: import sys
In [2]: sys.path = []
In [3]: try:
   ...:     __import__('virtualenv')
   ...: except ImportError:
   ...:     print(False)
   ...: else:
   ...:     print(True)
   ...:
False
I whipped up the following solution, which seems to work in both Python2 and Python3:
from __future__ import unicode_literals, print_function

import sys
from contextlib import contextmanager
from importlib import import_module


@contextmanager
def ignore_site_packages_paths():
    paths = sys.path
    # remove all third-party paths
    # so that only stdlib imports will succeed
    sys.path = list(filter(
        None,
        filter(lambda i: 'site-packages' not in i, sys.path)
    ))
    yield
    sys.path = paths


def is_std_lib(module):
    if module in sys.builtin_module_names:
        return True

    with ignore_site_packages_paths():
        imported_module = sys.modules.pop(module, None)
        try:
            import_module(module)
        except ImportError:
            return False
        else:
            return True
        finally:
            if imported_module:
                sys.modules[module] = imported_module
You can keep track of the source code here
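Usage then looks something like this (assuming virtualenv is installed in site-packages, as in the sessions above):
>>> is_std_lib('sys')
True
>>> is_std_lib('os')
True
>>> is_std_lib('virtualenv')
False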
I am a newbie to python. Below is my module
mymath.py
pi = 3.142
def circle(radius):
return pi * radius * radius
In the terminal, I run it the following way:
>>> import mymath
>>> mymath.pi
3.142
When I change pi to a local variable, reload(mymath) and do import mymath again, I still get mymath.pi as 3.142. However, the result of mymath.circle(radius) does reflect the change.
def circle(radius):
    pi = 3
    return pi * radius * radius

>>> import imp
>>> imp.reload(mymath)
>>> import mymath
>>> mymath.pi
3.142
>>> mymath.circle(3)
27
Can anyone tell me what might be the issue?
From the docs for imp.reload():
When a module is reloaded, its dictionary (containing the module’s global variables) is retained. Redefinitions of names will override the old definitions, so this is generally not a problem. If the new version of a module does not define a name that was defined by the old version, the old definition remains.
So when you do imp.reload(mymath), even though pi no longer exists as a global name in the module's code, the old definition remains part of the updated module.
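For example, a rough sketch of that retained-name behaviour, using the same echo trick as the example below:
>>> import os, imp
>>> os.system("echo 'pi = 3.142' > mymath.py")
0
>>> import mymath
>>> os.system("echo 'pass' > mymath.py")
0
>>> imp.reload(mymath)
<module 'mymath' from 'mymath.py'>
>>> mymath.pi
3.142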
If you really want to start from scratch, use the following method:
import sys
del sys.modules['mymath']
import mymath
For example:
>>> import os
>>> os.system("echo 'pi = 3.142' > mymath.py")
0
>>> import mymath
>>> mymath.pi
3.142
>>> os.system("echo 'pass' > mymath.py")
0
>>> import sys
>>> del sys.modules['mymath']
>>> import mymath
>>> mymath.pi
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'module' object has no attribute 'pi'
import v_framework as framework
framework.loadModules(["Maintenance"])
framework.Maintenance.showPage()
In framework I have:
def loadModules(aModules):
    d_utility = {"Maintenance": "COOl_M_PAGE"}
    for module in aModules:
        exec("import " + d_utility[module] + " as " + module)
When loadModules is executed, it imports the modules in the v_framework namespace. Since I am importing v_framework as framework, I think I should be able to use the imported module using framework.Maintenance. But it does not work that way.
Is there a way to do what I'm trying to do? Alternatively, is there any way to import modules into a namespace other than the one where exec is executed?
There are libraries for importing modules dynamically. You could use importlib (and another one that might be useful is pkgutil). Now, for your case, I guess this would do the job:
import importlib

mods = {}

def loadModules(aModules):
    global mods
    for module in aModules:
        mods[module] = importlib.import_module(d_utility[module])
        # or maybe globals()[module] = ... would also work (exactly as you expect it to)
UPDATE: exec modifies the function's local namespace, not the global one (I think).
Hope it helps. :)
When you import inside a function, the module is imported/executed as normal, but the name you import under is local to the function, just like any other variable assigned inside a function.
>>> def test_import():
...     import os
...     print os
...
>>> test_import()
<module 'os' from '/usr/lib/python2.7/os.pyc'>
>>> os
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'os' is not defined
os has been imported though, and you can still access it through sys.modules:
>>> import sys
>>> sys.modules['os']
<module 'os' from '/usr/lib/python2.7/os.pyc'>
>>> os = sys.modules['os']
>>> os
<module 'os' from '/usr/lib/python2.7/os.pyc'>
A quick and dirty way to do what you want would be something like this; exec takes an optional mapping to be used as the local and global variables. So you could do
def loadModules(aModules):
    d_utility = {"Maintenance": "COOl_M_PAGE"}
    for module in aModules:
        exec ('import %s as %s' % (d_utility[module], module)) in globals()
Though this is ugly and probably has security implications or something. As jadkik94 mentions, there are libraries that provide cleaner ways to deal with this.
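A cleaner sketch along the lines jadkik94 suggests: inside v_framework.py, use importlib and assign into the module's own globals, so that code doing import v_framework as framework can reach framework.Maintenance without exec (d_utility and the names are the question's placeholders):
# inside v_framework.py
import importlib

d_utility = {"Maintenance": "COOl_M_PAGE"}

def loadModules(aModules):
    for module in aModules:
        # bind each imported module as an attribute of this module,
        # so callers see it as framework.<module>
        globals()[module] = importlib.import_module(d_utility[module])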
Assume the following code structure:
#### 1/hhh/__init__.py: empty
#### 1/hhh/foo/__init__.py:
from hhh.foo.baz import *
#### 1/hhh/foo/bar.py:
xyzzy = 4
#### 1/hhh/foo/baz.py:
import hhh.foo.bar as bar
qux = bar.xyzzy + 10
I run python inside 1/ and do import hhh.foo.baz. It fails:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "hhh/foo/__init__.py", line 1, in <module>
from hhh.foo.baz import *
File "hhh/foo/baz.py", line 1, in <module>
import hhh.foo.bar as bar
AttributeError: 'module' object has no attribute 'foo'
Now I replace baz.py with:
# 1/hhh/foo/baz.py:
from hhh.foo.bar import xyzzy
qux = xyzzy + 10
and again do import hhh.foo.baz. Now it works, although I’m loading the same module, only binding a different name.
Does this mean that the distinction between import module and from module import name goes beyond just identifiers? What exactly is going on here?
(I know I can use relative imports to work around all this, but still I’d like to understand the mechanics. Plus I don’t like relative imports, and neither does PEP 8.)
When you write from hhh.foo.bar import xyzzy, the interpreter will try to load xyzzy from the module hhh.foo.bar. But if you write import hhh.foo.bar as bar, it will first try to find bar by attribute lookup through the hhh.foo module. So it evaluates hhh.foo, which is still doing from hhh.foo.baz import *: hhh.foo.baz tries to evaluate hhh.foo, hhh.foo tries to evaluate hhh.foo.baz, and you get a circular import and the exception.
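If you want to keep the absolute spelling in baz.py, one workaround sketch is importlib.import_module, which hands back the submodule straight from sys.modules instead of looking it up as an attribute of its parent packages (the lookup that fails while hhh.foo is still being initialised):
# 1/hhh/foo/baz.py (sketch)
import importlib

bar = importlib.import_module('hhh.foo.bar')
qux = bar.xyzzy + 10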
In 1/hhh/foo/__init__.py you need to set the __all__ list with the names you want to export, i.e. __all__ = ["xyzzy"].
Why do you import from hhh.foo.bar in hhh.foo? import bar should suffice there.