I am developing a program that runs on two different platforms. Depending on which platform I want to run it on, the import directories and the names of the libraries change. For this reason I set a variable called RUN_ON_PC to True or False.
I want to implement a helper that sets the paths correctly, imports the libraries under the correct name depending on the platform, and exposes them to the main program under a common name. The module myimporthelper is either in the "/mylib" or in the "/sd/mylib" directory. The other module names in these directories differ.
I tried the following, which does not work because the modules imported inside myimporthelper.py are not visible to main.py:
main.py:
RUN_ON_PC = True
import sys

if RUN_ON_PC:
    sys.path.append("/mylib1")
else:
    sys.path.append("/sd/mylib1")

import myimporthelper
myimporthelper.importall(RUN_ON_PC)
a = moduleA.ClassA()   # -> NameError: name 'moduleA' is not defined
myimporthelper.py:
import sys

def importall(run_on_pc):
    if run_on_pc:
        sys.path.append("C:\\Users\\.....\\mylib")
        import module1 as moduleA
    else:
        sys.path.append("/sd/mylib")
        import module_a as moduleA
I want to keep main.py short and outsource the platform-dependent importing to another module. I was not able to find a solution for this and would appreciate any help.
Thanks a lot in advance.
You just have to qualify the name with the helper module's name:
a = myimporthelper.moduleA.ClassA()
But the moduleA name has to be accessible in the helper module's namespace. If you import it inside a function in the helper it won't be, because of function scope, unless you assign it to a name you previously declared as global in that function.
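A minimal sketch of that approach, keeping the names from the question (the PC-side path stays elided as in the original):

myimporthelper.py:
import sys

moduleA = None  # module-level name that main.py will reach through the helper

def importall(run_on_pc):
    global moduleA                     # rebind the module-level name above
    if run_on_pc:
        sys.path.append("C:\\Users\\.....\\mylib")
        import module1 as moduleA      # binds the global moduleA
    else:
        sys.path.append("/sd/mylib")
        import module_a as moduleA

main.py:
import myimporthelper

myimporthelper.importall(RUN_ON_PC)
a = myimporthelper.moduleA.ClassA()    # qualified through the helper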
I'm trying to dynamically build out __all__ in an __init__.py so that it brings classes from many files into one namespace AND offers them as suggestions in my editor. I've got the first part working. What I can't get working, though, is having my editor's auto-complete discover everything that's in __all__ (I also tried updating __dir__ and defining a dir()/__dir__() method, without success).
For example, my directory tree is this:
things/
    __init__.py
    one.py
    two.py
In __init__.py, I have code that automatically discovers the classes One and Two (which live in the correspondingly named files). However, in VS Code, when I type from things import , none of the suggestions are One or Two, whereas when I type from things.one import , One is suggested. If I manually type it all out, everything works fine, but I would really like to have auto-complete working.
If I define __all__ with static names, VS Code auto-complete works as expected.
I've been scouring the Internet on this question to no avail and wonder if anyone has any tips or thoughts on how to accomplish it.
Here is what I have in __init__.py:
"""Import all submodules so they are available from the top level"""
import pkgutil
import sys
from importlib import import_module
from pathlib import Path
__all__ = []
def _class_name(module_name):
"""Assume camelCase class name form snake-case filename."""
return "".join([x.title() for x in module_name.split("_")])
# Loop through all modules in this directory and add to namespace
for (_, mod_name, _) in pkgutil.iter_modules([Path(__file__).parent]):
# Ensure that module isn't already loaded
if mod_name not in sys.modules:
loaded_mod = import_module("." + mod_name, package=__name__)
# Load class from imported module
name = _class_name(mod_name)
loaded_class = getattr(loaded_mod, name, None)
if not loaded_class:
continue
# Add this class to the top-level namespace
setattr(sys.modules[__name__], name, loaded_class)
__all__.append(getattr(loaded_mod, name))
Here is my directory layout and my __init__.py (screenshots omitted). When I use the Ctrl+Space shortcut, I get the expected tips: hi belongs to a.py, while hello belongs to b.py.
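A minimal sketch of an __init__.py consistent with that setup (the module and symbol names a, b, hi, hello are taken from this answer; everything else is assumed). Because the names are imported and listed statically, the editor's analysis can offer them:

__init__.py:
from .a import hi        # a.py defines hi
from .b import hello     # b.py defines hello

__all__ = ["hi", "hello"]   # static names, so Ctrl+Space can suggest them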
My code is structured as follows:
main.py
utils.py
blah.py
The main module uses argparse to read in the location of a configuration YAML file, which is then loaded as a dictionary. Is there a way for utils and blah to import this built-up dictionary?
Edit: I tried using from main import config (config being the dictionary I built) but I get ImportError: cannot import name 'config' from 'main'
Edit2: Main imports the other 2 modules - apologies for leaving out this very important detail
I would recommend making another file, say, globals.py. Import it in main, utils, and blah, and set attributes on it that the other modules can read. For example:
globals.py
configs = {}
main.py
import globals   # flat layout; inside a package this would be "from . import globals"
import yaml
...
with open('user/entered/path.yml') as f:
    user_configs = yaml.safe_load(f)
globals.configs.update(user_configs)  # modifies the shared `configs` dict
utils.py
import globals
...
# need to use one of the configs for something:
try:
    relevant_config = globals.configs['relevant_config']
except KeyError:
    print("User did not input the config field 'relevant_config'")
All modules will see the same globals module instance, which effectively gives you global variables shared across your program.
You could simply keep configs as a global variable in main.py and have utils.py and blah.py import main, but having a designated module for this is cleaner and clearer than having other modules import the main module.
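For completeness, blah.py follows the same pattern; the key name here is just an example:

blah.py
import globals
...
blah_setting = globals.configs.get('blah_setting')  # hypothetical key, read from the shared dict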
Just do
import main
and use it as
main.dictionary
That should do it!
Consider the following:
a.py
foo = 1
b.py
bar = 2
c.py
import a
kik = 3
d.py
import a
import c
def main():
    import b

main()
main()
How many times is a.py loaded?
How many times is b.py loaded?
More generally, I would like to know how Python handles imported files and the functions/variables in them.
Both a and b are loaded once. When you import a module, the module object is cached (in sys.modules), so when you import the same module again the original script is not executed a second time. Locating the module is done using a "finder":
https://www.python.org/dev/peps/pep-0451/#finder
https://docs.python.org/3/library/importlib.html#importlib.abc.MetaPathFinder
This works across modules, so another file that also did import b would bind to the same cached module as the import inside d.py's main().
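One way to see this with the question's files is to add a module-level print to a.py; it fires only once no matter how many times a is imported (the print lines below are additions for illustration):

a.py
print("executing a.py")   # runs only on the very first import
foo = 1

d.py
import a                  # first import: a.py is executed, the line above prints
import c                  # c.py also imports a, but the cached entry in sys.modules is reused

import sys
print(sys.modules["a"] is a)   # True: every importer gets the same module object

def main():
    import b               # b.py is executed on the first call only

main()                     # executes b.py
main()                     # b comes straight from sys.modules, nothing is re-executed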
Some interesting built-in modules can help you understand what happens during an import:
https://docs.python.org/3/reference/import.html#importsystem
When a module is first imported, Python searches for the module and, if found, creates a module object and initializes it.
Note that this only happens on the first import; subsequent imports go through __import__ and find the cached module. The finders consulted during an import are stored in sys.meta_path.
https://docs.python.org/3/library/functions.html#import
You can ask the import system to invalidate those caches, for example:
https://docs.python.org/3/library/importlib.html#importlib.import_module
If you are dynamically importing a module that was created since the interpreter began execution (e.g., created a Python source file), you may need to call invalidate_caches() in order for the new module to be noticed by the import system.
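For example, if the running process writes a new source file and wants to import it right away, invalidate_caches() may be needed first (the file and attribute names are made up for this illustration):

import importlib

# write a brand-new module next to the running script
with open("generated_plugin.py", "w") as f:
    f.write("value = 42\n")

importlib.invalidate_caches()                        # let the finders notice the new file
plugin = importlib.import_module("generated_plugin")
print(plugin.value)                                  # 42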
The imp module (and importlib on Python 3.4+) allows a module to be recompiled and re-executed after import:
import imp
import a
imp.reload(a)
https://docs.python.org/3/library/importlib.html#importlib.reload
Python module’s code is recompiled and the module-level code re-executed, defining a new set of objects which are bound to names in the module’s dictionary by reusing the loader which originally loaded the module.
https://docs.python.org/3/library/imp.html
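Since imp is deprecated, the importlib spelling of the same reload is usually preferred:

import importlib
import a

importlib.reload(a)   # recompiles a.py and re-executes its module-level code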
I'm dynamically importing, from a subdirectory, any .py modules that are not __init__.py or my skeleton.py file. I first build a list that looks like this (.py omitted):
mods_in_dir = ['my_mods.frst_mod','my_mods.scnd_mod']
my_mods being the subdirectory, frst_mod.py and scnd_mod.py being the available modules. All the modules I import contain the class Operation, which always offers the same three functions. Then I import them using importlib.
import importlib

imports = {}
for i in mods_in_dir:
    imports[i] = importlib.import_module(i)
This results in apparently successful imports; I checked with print(sys.modules.keys()), and they show up as [...], 'my_mods.frst_mod', 'my_mods.scnd_mod', [...].
For testing purposes I defined a function in each of the modules:
def return_some_string():
    return "some string"
But I'm unable to call this function. I've tried everything that came to mind, e.g. print(my_mods.frst_mod.Operation.return_some_string()) or print(frst_mod.Operation.return_some_string())
Both resulting in NameError: name 'my_mods' is not defined or NameError: name 'frst_mod' is not defined
Edit: Solved.
#m170897017 helped me solve the problem in my initial attempt: I had missed that the modules were stored in the imports dict and wasn't using it.
print(imports['my_mods.frst_mod'].ModsClassName.return_some_string()) now successfully prints "some string".
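Building on that, all the imported modules can be driven off the dict in one loop (using the Operation class name from the question's description):

for mod_name, mod in imports.items():
    # same attribute access as the line above, for every module
    print(mod_name, mod.Operation.return_some_string())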
Change the dynamic import to this and try again:
...
mods_in_dir = ['my_mods.frst_mod', 'my_mods.scnd_mod']

# __import__('pkg.mod') returns the top-level package, so pass a non-empty
# fromlist to get the submodule itself back
modules = [__import__(name, fromlist=['']) for name in mods_in_dir]

# if you want to invoke func1 in my_mods.frst_mod and get its result
result1 = modules[0].func1()
# if you want to invoke func2 in my_mods.scnd_mod and get its result
result2 = modules[1].func2()
...
There are several posts about this error that I have already read, but I still don't understand what I am doing wrong.
I put it into a minimal example:
Imagine I have a Doc.py, and the package Tools which includes Tool1.py and Tool2.py.
Doc.py:
from Tools import *
import sys

def __main__():
    TOOL_REPORT("Tool1", "Test")

def TOOL_REPORT(tool, path):
    if tool == 'Tool1':
        Tool1.REPORT(path)
    elif tool == 'Tool2':
        Tool2.REPORT(path)
    else:
        sys.stderr.write("This tool is not yet included in Doc. Please check TOOLS for more information.")

if __name__ == "__main__": __main__()
Tool1.py:
def REPORT(path):
    print("Tool1 "+path)
Tool2.py:
def REPORT(path):
    print("Tool2 "+path)
If I run this, I always end up with this error:
File "Doc.py", line 15, in TOOL_REPORT
Tool1.REPORT(path)
NameError: global name 'Tool1' is not defined
I'd appreciate any hint to what is going wrong!
Your Tool1 and Tool2 submodules are not visible until explicitly imported somewhere.
You can import them in the Tools/__init__.py package file:
import Tool1, Tool2
(on Python 3, write from . import Tool1, Tool2 instead, since implicit relative imports were removed), at which point they become available for import from Tools.
Another option is to import the modules from your own code:
import Tools.Tool1, Tools.Tool2
from Tools import *
Only when explicitly imported are submodules also set as attributes of the package.
Python will treat a folder as a package when there is an __init__.py file present in it. Otherwise it is just another folder to Python, not a package it can import things from. So add an __init__.py file to your Tools folder (which makes it a package in Pythonic terms) and then you can import that package in other Python scripts.
One more thing, as a better practice: instead of using
from Tools import *
always name explicitly which modules you want to import, which in your case would be
from Tools import Tool1, Tool2
This will improve the code's readability for others and for you too.
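With the explicit import in place, the NameError disappears because Tool1 and Tool2 are bound as names in Doc.py. A minimal sketch, assuming Tools/ contains an __init__.py:

from Tools import Tool1, Tool2

Tool1.REPORT("Test")   # prints "Tool1 Test"
Tool2.REPORT("Test")   # prints "Tool2 Test"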