Runtime import with multiple subdirectories in Python

What is the best way to perform this type of import in Python?
File to be imported, located at one/one_one/one_two/:
filename: two.py
def foo():
    print "venkatttt!"
main file: main.py
s = __import__("one.one_one.one_two.two", fromlist=[])
function_class = getattr(s, "one_one")
function_class1 = getattr(function_class, "one_two")
function_class2 = getattr(function_class1, "two")
print s
print function_class
print function_class1
print function_class2
function_class2.foo()
Output of this code:
<module 'one' from '/opt/auto/src/ex/one/__init__.pyc'>
<module 'one.one_one' from '/opt/auto/src/ex/one/one_one/__init__.pyc'>
<module 'one.one_one.one_two' from '/opt/auto/src/ex/one/one_one/one_two/__init__.pyc'>
<module 'one.one_one.one_two.two' from '/opt/auto/src/ex/one/one_one/one_two/two.py'>
venkatttt!
I am looking for the best way to perform this import.

From your output, I can see you already have __init__.py files in each subdirectory, so you can simply import the module:
>>> from one.one_one.one_two.two import foo
>>> foo()
If you want a handle on each module, you can import them separately:
>>> import one.one_one as function_class
>>> import one.one_one.one_two as function_class1
>>> import one.one_one.one_two.two as function_class2
Finally, you can also import the submodules in one/__init__.py (and list them in __all__) so that they are available automatically when import one is executed.
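If the dotted path is only known at runtime, importlib.import_module is usually a cleaner alternative to __import__ plus a chain of getattr calls; a minimal sketch against the same one/one_one/one_two layout:
import importlib

# import_module returns the *leaf* module, unlike __import__ with an
# empty fromlist, which returns the top-level package "one"
two = importlib.import_module("one.one_one.one_two.two")
two.foo()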

Related

Why do *import ...* and *from x import y* idioms behave so differently here?

I have a package structure like this:
- src
- src/main.py
- src/package1
- src/package1/__init__.py
- src/package1/module1.py
- src/package1/module2.py
... where module2 defines a subclass of a class in module1, and therefore module1 gets referenced by an absolute import path in module2.py.
That is, in src/package1/module2.py:
from package1.module1 import SomeClassFromModule1
The problem occurs in the main.py script:
## here the imports

def main():
    # create an instance of the child class in Module2
    ...

if __name__ == "__main__":
    main()
Option 1 works. That is, in src/main.py:
from package1.module2 import SomeClassFromModule2
some_name = SomeClassFromModule2()
Option 2 does not work. That is, in src/main.py:
import package1.module2.SomeClassFromModule2
some_name = package1.module2.SomeClassFromModule2()
... causes the following error.
ModuleNotFoundError: No module named 'package1.module2.SomeClassFromModule2'; 'package1.module2' is not a package
So why is there this difference between the import and from ... import idiom?
I would be glad for some clarification.
import x makes everything defined in x (functions, classes, and so on) available as attributes of x in the file where it is called.
from x import y brings in only a specific function or class ('y' is the function or class) from that .py file ('x' is the module), instead of binding the whole module.
In your case, when you import package1.module2, SomeClassFromModule2 is already imported along with it, so you do not need to write import package1.module2.SomeClassFromModule2.
Since you want to access a class here, you need to create an object in order to use it.
Hope this helped.
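To make the difference concrete, here is a short sketch using the package1 and module2 names from the question:
# works: import the module, then reference the class as an attribute
import package1.module2
some_name = package1.module2.SomeClassFromModule2()

# also works: bind just the class name
from package1.module2 import SomeClassFromModule2
some_name = SomeClassFromModule2()

# fails: every dotted name in an import statement must be a package or module,
# and SomeClassFromModule2 is a class
import package1.module2.SomeClassFromModule2  # ModuleNotFoundError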
After some tests, I think you cannot import a function or class by using import your_module.your_class.
It's all about package, module, function and class:
# import module
>>>import os
<module 'os' from ooxx>
# use a module of a module (a little weird)
>>>os.path
<module 'posixpath' from ooxx>
# import a module of a module (a little weird)
>>>import os.path
#use function
>>>os.path.dirname
<function posixpath.dirname(p)>
# you cannot import a function (or class) by using 'import your.module.func'
# 'import ooxx' always gets a module or package.
>>>import os.path.dirname
ModuleNotFoundError
No module named 'os.path.dirname'; 'os.path' is not a package
# instead, use 'from your_module import your_function_or_class'
>>>from os.path import dirname
<function posixpath.dirname(p)>
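If the attribute name is only known at runtime, a common pattern (my addition, not part of the answer above) is to import the module with importlib and pull the attribute off it with getattr:
import importlib

# import the module by its dotted path, then look the attribute up by name
mod = importlib.import_module("os.path")
dirname = getattr(mod, "dirname")
print(dirname("/tmp/some/file.txt"))  # prints /tmp/some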

Importing a Python module through an __init__.py script

My directory structure is
app.py
lib/
    __init__.py
    _foo.py
Inside __init__.py I have written
from . import _foo as foo
Then inside app.py I try to make a call
from lib.foo import *
but it throws a ModuleNotFoundError: No module named 'lib.foo' exception.
Basically I want to import everything from _foo.py through the __init__.py script.
While I realize the code works if _foo.py is renamed to foo.py,
I still wonder if there is any way to make the import work through __init__.py.
Not sure about hacking around the import statements, but you could get away with something less explicit like this:
lib/__init__.py
from . import _foo as foo
__all__ = ['foo']
lib/_foo.py
__all__ = [
    'test'
]
test = 1
>>> from lib import *
>>> foo
<module 'lib._foo' from '/path/to/test/lib/_foo.py'>
>>> foo.test
1
>>>
EDIT: You could achieve something more explicit by updating sys.modules at runtime:
app.py
import sys
from lib import _foo
sys.modules['lib.foo'] = _foo
lib/_foo.py
test = 1
keep lib/__init__.py (it can be empty) so that lib is a package
After importing app, lib.foo will be available as a module:
>>> import app
>>> from lib import foo
>>> foo
<module 'lib._foo' from '/path/to/test/lib/_foo.py'>
>>> foo.test
1

Python: determine at runtime which module to load

I have my Python unittest script like below. It takes an argument '-a' to determine whether the test cases should load the base module from foo_PC_A.py or foo_PC_B.py. I use shutil.move() to rename one of those files to foo.py, so all the test case modules (e.g. tm1.py, tm2.py) can simply import foo. This feels like a workaround, though, and not Pythonic. Is there a better way to do this, or a better design that fundamentally resolves the issue?
(run_all_unittest.py)
if sys.argv[1] == '-a':
    shutil.move('foo_PC_A.py', 'foo.py')
else:
    shutil.move('foo_PC_B.py', 'foo.py')

test_module_list = ['tm1', 'tm2', ...]
test_suites = []
for test_module_name in test_module_list:
    test_module = __import__(test_module_name)
    test_suites.append(unittest.TestLoader().loadTestsFromModule(test_module))
alltests = unittest.TestSuite(test_suites)
unittest.TextTestRunner().run(alltests)

if sys.argv[1] == '-a':
    shutil.move('foo.py', 'foo_PC_A.py')
else:
    shutil.move('foo.py', 'foo_PC_B.py')
(tm1.py)
from foo import MyTestCase
...
(foo_PC_A.py)
import <some module only available on PC A>
class MyTestCase(unittest.TestCase):
    ...
(foo_PC_B.py)
# since the PC A modules are not available on PC B,
# just call the pre-built executable via subprocess
import subprocess
class MyTestCase(unittest.TestCase):
    ...
    def test_run(self):
        subprocess.call(...)
You can fool Python into thinking the module has already been loaded. Just import the module dynamically and use sys.modules:
import sys
import importlib
if sys.argv[1] == '-a':
    sys.modules['foo'] = importlib.import_module('foo_PC_A')
else:
    sys.modules['foo'] = importlib.import_module('foo_PC_B')
When any module runs import foo or from foo import ..., Python will use the module you registered in sys.modules.
Note that if foo is moved to a package, the full Python path must be specified, as in:
sys.modules['path.to.foo'] = ...
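For completeness, a sketch of how run_all_unittest.py might look with this approach (module and argument names taken from the question; argument handling kept as simple as the original):
import sys
import importlib
import unittest

# register the platform-specific implementation under the name "foo" before
# any test module executes "from foo import MyTestCase"
if sys.argv[1] == '-a':
    sys.modules['foo'] = importlib.import_module('foo_PC_A')
else:
    sys.modules['foo'] = importlib.import_module('foo_PC_B')

test_suites = []
for test_module_name in ['tm1', 'tm2']:
    test_module = importlib.import_module(test_module_name)
    test_suites.append(unittest.TestLoader().loadTestsFromModule(test_module))

unittest.TextTestRunner().run(unittest.TestSuite(test_suites))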

How to use relative imports without doing python -m?

I have a folder like this:
/test_mod
    __init__.py
    A.py
    test1.py
    /sub_mod
        __init__.py
        B.py
        test2.py
And I want to use relative imports in test1 and test2 like this:
#test1.py
from . import A
from .sub_mod import B
...
#test2.py
from .. import A
from . import B
...
While I develop test1 or test2 I want those imports to work from IDLE; that is, if I press F5 while working on test2, everything should work fine, because I don't want to run python -m test_mod.sub_mod.test2 every time.
I already checked this:
python-relative-imports-for-the-billionth-time
Looking at that, I tried this:
if __name__ == "__main__" and not __package__:
    __package__ = "test_mod.sub_mod"

from .. import A
from . import B
But that didn't work; it gave this error:
SystemError: Parent module 'test_mod.sub_mod' not loaded, cannot perform relative import
In the end I found this solution:
# relative_import_helper.py
import sys, os, importlib

def relative_import_helper(path, nivel=1, verbose=False):
    # walk 'nivel' directories up from the given file, collecting package names
    namepack = os.path.dirname(path)
    packs = []
    for _ in range(nivel):
        temp = os.path.basename(namepack)
        if temp:
            packs.append(temp)
            namepack = os.path.dirname(namepack)
        else:
            break
    # rebuild the dotted package name, make its parent directory importable,
    # and load the package so relative imports can resolve against it
    pack = ".".join(reversed(packs))
    sys.path.append(namepack)
    importlib.import_module(pack)
    return pack
and I use it as:
# test2.py
if __name__ == "__main__" and not __package__:
    print("idle trick")
    from relative_import_helper import relative_import_helper
    __package__ = relative_import_helper(__file__, 2)

from .. import A
...
Then I can use relative imports while working in IDLE.
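For reference, the helper boils down to roughly this inline version for the test_mod/sub_mod layout (a sketch; the path climbing is hard-coded here instead of computed from a level argument):
# test2.py, run directly from IDLE or as a script
if __name__ == "__main__" and not __package__:
    import os, sys, importlib
    here = os.path.dirname(os.path.abspath(__file__))   # .../test_mod/sub_mod
    root = os.path.dirname(os.path.dirname(here))        # directory containing test_mod
    sys.path.append(root)
    importlib.import_module("test_mod.sub_mod")          # load the parent package
    __package__ = "test_mod.sub_mod"

from .. import A
from . import B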

Changing the namespace of the current Python script (class with module name)

I have a Python 2.6 script and library in the following directory structure:
+ bin
  - foo.py
+ lib
  + foo
    - bar.py
I would like users to run bin/foo.py to instantiate the classes within lib/foo/bar.py. To achieve this, in my bin/foo.py script I have the following code:
from __future__ import absolute_import
import foo
klass = foo.bar.Klass()
However, this results in:
AttributeError: 'module' object has no attribute 'bar'
i.e. it thinks that foo refers to the script itself rather than the library foo; renaming bin/foo.py to bin/foo-script.py works as expected.
Is there a way I can keep the bin/foo.py script and still import the library foo from lib/?
The current directory is on the path by default, so you need to remove that before you import the other foo module:
import sys
sys.path = [dir for dir in sys.path if dir != '']
Alternatively, prepend the lib directory so that it takes precedence:
import sys
sys.path = ['../lib'] + sys.path
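Note that a relative entry like '../lib' only resolves correctly when the script is launched from inside bin/; a sketch (my variation, assuming lib/foo is a proper package with an __init__.py) that computes the path from the script's own location instead:
import os
import sys

# put lib/ (resolved relative to this script, not the working directory)
# ahead of the script's own directory on sys.path
_lib = os.path.join(os.path.dirname(os.path.abspath(__file__)), '..', 'lib')
sys.path.insert(0, os.path.abspath(_lib))

import foo.bar
klass = foo.bar.Klass()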
If you just write import foo, it will load the script's own foo module into the current scope. Assuming lib and foo are packages, wouldn't you need to write something like this to make it work?
import lib.foo.bar as foobar
klass = foobar.Klass()
