I would like to be able to import a Python module that is actually located in a subdirectory of another module.
I am developing a framework with plug-ins.
Since I'm expecting to have a few thousand (there are currently >250 already) and I don't want one big directory containing >1000 files, I have them ordered in directories like this, grouped by the first letter of their name:
framework\
    __init__.py
    framework.py
    tools.py
    plugins\
        __init__.py
        a\
            __init__.py
            atlas.py
            ...
        b\
            __init__.py
            binary.py
            ...
        c\
            __init__.py
            cmake.py
            ...
Since I would not like to impose a burden on developers of other plugins, or on people not needing as many as I have, I would like to put each plugin in the 'framework.plugins' namespace.
That way, someone adding a bunch of private plugins can do so by adding them to the framework.plugins folder and providing an __init__.py file containing:
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)
However, this setup currently forces them to also use the a-z subdirectories.
Sometimes a plugin is extending another plugin, so now I have a
from framework.plugins.a import atlas
and I would like to have
from framework.plugins import atlas
Is there any way to declare a namespace where the full namespace name doesn't actually map to a folder structure?
I am aware of the pkg_resources package, but it is only available via setuptools, and I'd rather not have an extra dependency:
import pkg_resources
pkg_resources.declare_namespace(__name__)
The solution should work in Python 2.4-2.7.3.
Update:
Combining the provided answers, I tried to get a list of all plugins imported in the plugins __init__.py. However, this fails due to dependencies: a plugin in the 'c' folder tries to import a plugin starting with 't', which has not been added yet.
plugins = [x[0].find_module(x[1]).load_module(x[1])
           for x in pkgutil.walk_packages(
               [os.path.join(framework.plugins.__path__[0], chr(y))
                for y in xrange(ord('a'), ord('z') + 1)],
               'framework.plugins.')]
I'm not sure if I'm on the right track here, or just overcomplicating things and would be better off writing my own PEP 302 importer. However, I can't seem to find any decent examples of how these should work.
Update:
I tried to follow the suggestion of wrapping the __getattr__ function in my __init__.py, but to no avail:
import pkgutil
import os
import sys
import types

plugins = [x[1] for x in pkgutil.walk_packages(
    [os.path.join(__path__[0], chr(y)) for y in xrange(ord('a'), ord('z') + 1)])]

class MyWrapper(types.ModuleType):
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def __getattr__(self, name):
        if name in plugins:
            askedattr = name[0] + '.' + name
        else:
            askedattr = name
        attr = getattr(self.wrapped, askedattr)
        return attr

sys.modules[__name__] = MyWrapper(sys.modules[__name__])
A simple solution would be to import your a, b... modules in plugins.__init__, like:
from a import atlas
from b import binary
...
I'd suggest lazy-loading the modules by subclassing types.ModuleType and converting modname to modname[0] + '.' + modname on demand; this should be less work than implementing a module loader.
You can look at apipkg for an example of how to do this.
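Here is a minimal, self-contained sketch of that lazy-aliasing idea. All names below (LazyAliasModule, the throwaway plugins package with its a/atlas.py, the NAME attribute) are illustrative stand-ins, not part of any real API; the demo builds the package in a temp directory just to have something to import.

```python
import importlib
import os
import sys
import tempfile
import types

class LazyAliasModule(types.ModuleType):
    """Wraps a package; resolves 'atlas' to subpackage 'a.atlas' on demand."""

    def __init__(self, wrapped, plugin_names):
        super(LazyAliasModule, self).__init__(wrapped.__name__)
        # Write through __dict__ so these assignments never hit __getattr__.
        self.__dict__['_wrapped'] = wrapped
        self.__dict__['_plugins'] = frozenset(plugin_names)

    def __getattr__(self, name):
        if name in self._plugins:
            # 'atlas' lives in the one-letter subpackage 'a', so import
            # '<pkg>.a.atlas' only on first access.
            mod = importlib.import_module(
                '%s.%s.%s' % (self._wrapped.__name__, name[0], name))
            self.__dict__[name] = mod  # cache: __getattr__ won't fire again
            return mod
        return getattr(self._wrapped, name)

# Throwaway demo package: plugins/a/atlas.py (names are illustrative).
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, 'plugins', 'a'))
open(os.path.join(root, 'plugins', '__init__.py'), 'w').close()
open(os.path.join(root, 'plugins', 'a', '__init__.py'), 'w').close()
with open(os.path.join(root, 'plugins', 'a', 'atlas.py'), 'w') as f:
    f.write("NAME = 'atlas'\n")

sys.path.insert(0, root)
import plugins
sys.modules['plugins'] = LazyAliasModule(plugins, ['atlas'])

from plugins import atlas   # resolved via the wrapper, no '.a.' needed
print(atlas.NAME)
```

Note that getattr(module, 'a.atlas'), as in the question's MyWrapper, can never work, since dotted names are not attributes; that is why this sketch calls importlib.import_module with the full dotted path instead.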
Don't use the pkgutil.extend_path function here; it tries to do the opposite of what you're trying to accomplish:

This will add to the package's __path__ all subdirectories of directories on sys.path named after the package. This is useful if one wants to distribute different parts of a single logical package as multiple directories.
Just extending __path__ with the subdirectories in your framework.plugins.__init__.py works just fine.
So the solution to this problem is to put this in your __init__.py:
import os
__path__.extend([os.path.join(__path__[0], chr(y)) for y in range(ord('a'), ord('z') + 1)])
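As a sanity check, here is a self-contained sketch showing that extending __path__ this way makes the letter subdirectories transparent; it builds a throwaway framework/plugins package in a temp directory, mirroring the question's layout (the atlas.py contents are illustrative).

```python
import os
import sys
import tempfile

root = tempfile.mkdtemp()
plugins_dir = os.path.join(root, 'framework', 'plugins')
os.makedirs(os.path.join(plugins_dir, 'a'))

open(os.path.join(root, 'framework', '__init__.py'), 'w').close()
# plugins/__init__.py extends its own __path__ with the a-z letter dirs
with open(os.path.join(plugins_dir, '__init__.py'), 'w') as f:
    f.write("import os\n"
            "__path__.extend(os.path.join(__path__[0], chr(y))\n"
            "                for y in range(ord('a'), ord('z') + 1))\n")
with open(os.path.join(plugins_dir, 'a', 'atlas.py'), 'w') as f:
    f.write("NAME = 'atlas'\n")

sys.path.insert(0, root)
from framework.plugins import atlas   # works without the '.a.'
print(atlas.NAME)
```

The 'a' directory needs no __init__.py here: it is just one more search-path entry for the framework.plugins package, so atlas imports as framework.plugins.atlas.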
This isn't a particularly fast solution (startup overhead), but what about having plugins.__init__ scrape the filesystem and import each found file into the local namespace?
import glob
import os
import sys

thismodule = sys.modules[__name__]
for plugin in glob.glob("?/*.py"):
    subdir, name = os.path.split(plugin)
    name = os.path.splitext(name)[0]       # 'a/atlas.py' -> 'atlas'
    _temp = __import__(subdir, globals(), locals(), [name], -1)
    setattr(thismodule, name, getattr(_temp, name))
I'm trying to dynamically build out __all__ in an __init__.py that will bring classes from many files into one namespace AND have them offered as suggestions in my editor. I've got the first part working. What I can't get working is having my editor's auto-complete discover everything in __all__ (I also tried updating __dir__ and defining a dir()/__dir__() method, without success).
For example, my directory tree is this:
things/
    __init__.py
    one.py
    two.py
In __init__.py, I have code that automatically discovers classes One and Two (which live in their so-named files). However, in VS Code, when I type from things import , none of the suggestions are One or Two, whereas when I type from things.one import , One is suggested. If I type it all out manually, everything works fine, but I would really like to have auto-complete working.
If I define __all__ with static names, VS Code auto-complete works as expected.
I've been scouring the Internet on this question to no avail and wonder if anyone has any tips or thoughts on how to accomplish it.
Here is what I have in __init__.py:
"""Import all submodules so they are available from the top level"""
import pkgutil
import sys
from importlib import import_module
from pathlib import Path
__all__ = []
def _class_name(module_name):
"""Assume camelCase class name form snake-case filename."""
return "".join([x.title() for x in module_name.split("_")])
# Loop through all modules in this directory and add to namespace
for (_, mod_name, _) in pkgutil.iter_modules([Path(__file__).parent]):
# Ensure that module isn't already loaded
if mod_name not in sys.modules:
loaded_mod = import_module("." + mod_name, package=__name__)
# Load class from imported module
name = _class_name(mod_name)
loaded_class = getattr(loaded_mod, name, None)
if not loaded_class:
continue
# Add this class to the top-level namespace
setattr(sys.modules[__name__], name, loaded_class)
__all__.append(getattr(loaded_mod, name))
Here is my directory and __init__.py setup (shown in screenshots). When I use the Ctrl+Space shortcut, I get the expected tips (hi belongs to a.py while hello belongs to b.py).
I'm creating a package with the following structure
/package
    __init__.py
    /sub_package_1
        __init__.py
        other_stuff.py
    /sub_package_2
        __init__.py
        calc_stuff.py
    /results_dir
I want to ensure that calc_stuff.py will save results to /results_dir, unless otherwise specified (yes, I'm not entirely certain having a results directory in my package is the best idea, but it should work well for now). However, since I don't know from where, or on which machine, calc_stuff.py will be run, I need the package, or at least calc_stuff.py, to know where it is saved.
So far the two approaches I have tried:
from os import path
saved_dir = path.join(path.dirname(__file__), 'results_dir')
and
from pkg_resources import resource_filename
filepath = resource_filename(__name__, 'results_dir')
have only given me paths relative to the root of the package.
What do I need to do to ensure that a statement along the lines of:
pickle.dump(my_data, open(os.path.join(full_path,
                                       'results_dir',
                                       'results.pkl'), 'wb'))
will result in a pickle file being saved into results_dir?
"I'm not entirely certain having a results directory in my package is the best idea" -- me neither :)
But if you put a function like the following inside a module in sub_package_2, it should return a path consisting of (module path minus filename, 'results_dir', the filename you passed the function as an argument):
def get_save_path(filename):
    import os
    return os.path.join(os.path.dirname(__file__), "results_dir", filename)
C:\Users\me\workspaces\workspace-oxygen\test36\TestPackage\results_dir\foo.ext
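If you also want the directory created on first use, a variant along these lines could combine the path logic with os.makedirs and the pickling itself (save_results is a made-up helper name, and the base_dir parameter is added here for testability; it is not from the original answer):

```python
import os
import pickle

def save_results(data, filename, base_dir=None):
    """Pickle data into a results_dir sitting next to this module (sketch)."""
    if base_dir is None:
        # Default to the directory containing this file, as in get_save_path
        base_dir = os.path.dirname(os.path.abspath(__file__))
    results = os.path.join(base_dir, 'results_dir')
    if not os.path.isdir(results):
        os.makedirs(results)            # create results_dir on first use
    out_path = os.path.join(results, filename)
    with open(out_path, 'wb') as f:
        pickle.dump(data, f)
    return out_path
```

Callers then never need to know where the package is installed; the path is always computed relative to the module's own location.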
I am trying to import modules dynamically in Python. Right now, I have a directory called 'modules' with two files inside: mod1.py and mod2.py. They are simple test functions that return the time (i.e. mod1.what_time('now') returns the current time).
From my main application, I can import as follows :
sys.path.append('/Users/dxg/import_test/modules')
import mod1
Then execute :
mod1.what_time('now')
and it works.
I am not always going to know what modules are available in the directory. I wanted to import as follows:
tree = os.listdir('modules')
sys.path.append('/Users/dxg/import_test/modules')
for i in tree:
    import i
However I get the error :
ImportError: No module named i
What am I missing?
The import statement does not work with variable contents (such as strings; see the extended explanation here), but with literal module names. If you want to import dynamically, you can use the importlib.import_module function:
import importlib

tree = os.listdir('modules')
...
for i in tree:
    importlib.import_module(i[:-3])   # strip the '.py' extension
Note:
You cannot import from a directory where the modules are not included under Lib or the current directory like that (adding the directory to the path won't help; see the previous link for why). The simplest solution would be to make this directory (modules) a package (just drop an empty __init__.py file there), and call importlib.import_module('modules.' + i) or use the __import__ function.
You might also review this question. It discusses a similar situation.
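A self-contained sketch of the whole recipe: the temp directory, the modules package, and the mod1 contents below are illustrative stand-ins for the asker's layout.

```python
import os
import sys
import tempfile
from importlib import import_module

base = tempfile.mkdtemp()
pkg_dir = os.path.join(base, 'modules')
os.mkdir(pkg_dir)
open(os.path.join(pkg_dir, '__init__.py'), 'w').close()  # make it a package
with open(os.path.join(pkg_dir, 'mod1.py'), 'w') as f:
    f.write("def what_time(s):\n    return 'time: ' + s\n")

sys.path.insert(0, base)   # the PARENT of 'modules' goes on the path
loaded = {}
for fname in os.listdir(pkg_dir):
    if fname.endswith('.py') and fname != '__init__.py':
        name = fname[:-3]                       # strip '.py'
        loaded[name] = import_module('modules.' + name)

print(loaded['mod1'].what_time('now'))
```

Keeping the imported modules in a dict avoids polluting the caller's namespace and lets you look plugins up by name later.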
You can achieve something like what you are proposing, but it will involve some un-pythonic code. I do not recommend doing this:
dynamic_imports = dict()
for filename in tree:
    name = filename.replace('.py', '')
    dynamic_imports[name] = __import__(name)
I am trying to parse a given path for python source files, import each file and DoStuff™ to each imported module.
def ParsePath(path):
    for root, dirs, files in os.walk(path):
        for source in (s for s in files if s.endswith(".py")):
            name = os.path.splitext(os.path.basename(source))[0]
            m = imp.load_module(name, *imp.find_module(name, [root]))
            DoStuff(m)
The above code works, but packages aren't recognized: ValueError: Attempted relative import in non-package
My question is basically, how do I tell imp.load_module that a given module is part of a package?
You cannot directly tell the Importer Protocol method load_module that the given module is part of a package. Taken from PEP 302 -- New Import Hooks:

The built-in __import__ function (known as PyImport_ImportModuleEx in import.c) will then check to see whether the module doing the import is a package or a submodule of a package. If it is indeed a (submodule of a) package, it first tries to do the import relative to the package (the parent package for a submodule). For example, if a package named "spam" does "import eggs", it will first look for a module named "spam.eggs". If that fails, the import continues as an absolute import: it will look for a module named "eggs". Dotted name imports work pretty much the same: if package "spam" does "import eggs.bacon" (and "spam.eggs" exists and is itself a package), "spam.eggs.bacon" is tried. If that fails "eggs.bacon" is tried. (There are more subtleties that are not described here, but these are not relevant for implementers of the Importer Protocol.)

Deeper down in the mechanism, a dotted name import is split up by its components. For "import spam.ham", first an "import spam" is done, and only when that succeeds is "ham" imported as a submodule of "spam".

The Importer Protocol operates at this level of individual imports. By the time an importer gets a request for "spam.ham", module "spam" has already been imported.

You must then simulate what the built-in import does and load parent packages before loading submodules.
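With modern importlib (imp is deprecated), that parents-first rule can be made explicit. importlib.import_module already loads ancestor packages for you, but this sketch (load_with_parents is a made-up name) spells out the order the built-in import follows:

```python
import importlib

def load_with_parents(dotted_name):
    """Import each ancestor package of dotted_name before the leaf module,
    mirroring what the built-in import does for 'import spam.ham'."""
    parts = dotted_name.split('.')
    module = None
    for i in range(1, len(parts) + 1):
        # 'spam', then 'spam.ham', then 'spam.ham.eggs', ...
        module = importlib.import_module('.'.join(parts[:i]))
    return module
```

For example, load_with_parents('json.decoder') first imports json, then json.decoder, and returns the json.decoder module; each ancestor ends up in sys.modules before its child is loaded.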
The function imp.find_module always takes a plain module name without dots, but the documentation of imp.load_module says
The name argument indicates the full module name (including the package name, if this is a submodule of a package).
So you could try this:
def ParsePath(path):
    for root, dirs, files in os.walk(path):
        for source in (s for s in files if s.endswith(".py")):
            name = os.path.splitext(os.path.basename(source))[0]
            full_name = os.path.splitext(source)[0].replace(os.path.sep, '.')
            m = imp.load_module(full_name, *imp.find_module(name, [root]))
            DoStuff(m)
I had the same problem. Good news is that there is a way of doing it, but you have to use a combination of imp and importlib. Here's an illustrative example:
import imp
import importlib
package_path = r"C:\path_to_package"
package_name = "module"
module_absolute_name = "module.sub_module"
module_relative_name = ".sub_module"
# Load the package first
package_info = imp.find_module(package_name, [package_path])
package_module = imp.load_module(package_name, *package_info)
# Try an absolute import
importlib.import_module(module_absolute_name, package_name)
# Try a relative import
importlib.import_module(module_relative_name, package_name)
This will allow sub_module to import using relative module paths because we've already loaded the parent package and the submodule has been loaded correctly by importlib to know what it's being imported relative to.
I believe this solution is only necessary for those of us stuck on Python 2.*, but I would need someone to confirm that.
In a big application I am working on, several people import the same modules differently, e.g.:
import x
or
from y import x
The side effect is that x is imported twice, which may introduce very subtle bugs if someone is relying on global attributes.
E.g., suppose I have a package mypackage with three files: mymodule.py, main.py and __init__.py.
mymodule.py contents
l = []
class A(object): pass
main.py contents
def add(x):
    from mypackage import mymodule
    mymodule.l.append(x)
    print "updated list", mymodule.l

def get():
    import mymodule
    return mymodule.l

add(1)
print "lets check", get()
add(1)
print "lets check again", get()
it prints
updated list [1]
lets check []
updated list [1, 1]
lets check again []
because now there are two lists in two different modules; similarly, class A is different.
To me it looks serious enough, because the classes themselves will be treated differently.
E.g., the code below prints False:
def create():
    from mypackage import mymodule
    return mymodule.A()

def check(a):
    import mymodule
    return isinstance(a, mymodule.A)

print check(create())
Question:
Is there any way to avoid this, other than enforcing that the module should be imported only one way? Can't this be handled by the Python import mechanism? I have seen several bugs related to this in Django code and elsewhere too.
Each module namespace is imported only once. The issue is that you're importing them differently: the first is an import from the global package, while the second is a local, non-package import. Python sees them as different modules. The first import is internally cached as mypackage.mymodule and the second as mymodule only.
A way to solve this is to always use absolute imports, that is, always give your modules absolute import paths from the top-level package onwards:
def add(x):
    from mypackage import mymodule
    mymodule.l.append(x)
    print "updated list", mymodule.l

def get():
    from mypackage import mymodule
    return mymodule.l
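The two-copies effect is easy to reproduce in isolation. This sketch builds a throwaway mypackage in a temp directory and puts both the package directory and its parent on sys.path, which is exactly the situation described (the file names mirror the question; the sys.path setup is the deliberate mistake):

```python
import importlib
import os
import sys
import tempfile

root = tempfile.mkdtemp()
pkg = os.path.join(root, 'mypackage')
os.mkdir(pkg)
open(os.path.join(pkg, '__init__.py'), 'w').close()
with open(os.path.join(pkg, 'mymodule.py'), 'w') as f:
    f.write('l = []\n')

sys.path.insert(0, root)   # allows 'from mypackage import mymodule'
sys.path.insert(0, pkg)    # allows plain 'import mymodule' -- the bug

a = importlib.import_module('mypackage.mymodule')
b = importlib.import_module('mymodule')
a.l.append(1)
print(a is b)      # False: two separate module objects
print(b.l)         # []  -- the other copy never saw the append
```

Both names point at the same file on disk, but Python caches modules by name, not by path, so 'mymodule' and 'mypackage.mymodule' each get their own namespace and their own list l.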
Remember that your entry point (the file you run, main.py) should also be outside the package. When you want the entry point code to live inside the package, you usually run a small script instead. Example:
runme.py, outside the package:
from mypackage.main import main
main()
And in main.py you add:
def main():
    # your code
I find this document by Jp Calderone to be a great tip on how to (and how not to) structure a Python project. If you follow it you won't have issues. Pay attention to the bin folder: it is outside the package. I'll reproduce the entire text here:
Filesystem structure of a Python project
Do:

name the directory something related to your project. For example, if your project is named "Twisted", name the top-level directory for its source files Twisted. When you do releases, you should include a version number suffix: Twisted-2.5.

create a directory Twisted/bin and put your executables there, if you have any. Don't give them a .py extension, even if they are Python source files. Don't put any code in them except an import of and call to a main function defined somewhere else in your projects.

If your project is expressable as a single Python source file, then put it into the directory and name it something related to your project. For example, Twisted/twisted.py. If you need multiple source files, create a package instead (Twisted/twisted/, with an empty Twisted/twisted/__init__.py) and place your source files in it. For example, Twisted/twisted/internet.py.

put your unit tests in a sub-package of your package (note - this means that the single Python source file option above was a trick - you always need at least one other file for your unit tests). For example, Twisted/twisted/test/. Of course, make it a package with Twisted/twisted/test/__init__.py. Place tests in files like Twisted/twisted/test/test_internet.py.

add Twisted/README and Twisted/setup.py to explain and install your software, respectively, if you're feeling nice.
Don't:

put your source in a directory called src or lib. This makes it hard to run without installing.

put your tests outside of your Python package. This makes it hard to run the tests against an installed version.

create a package that only has a __init__.py and then put all your code into __init__.py. Just make a module instead of a package, it's simpler.

try to come up with magical hacks to make Python able to import your module or package without having the user add the directory containing it to their import path (either via PYTHONPATH or some other mechanism). You will not correctly handle all cases and users will get angry at you when your software doesn't work in their environment.
I can only replicate this if main.py is the file you are actually running. In that case the directory of main.py ends up on sys.path. But you apparently also have a search path set up so that mypackage can be imported.
In that situation Python will not realize that mymodule and mypackage.mymodule are the same module, and you get this effect. This change illustrates it:
def add(x):
    from mypackage import mymodule
    print "mypackage.mymodule path", mymodule
    mymodule.l.append(x)
    print "updated list", mymodule.l

def get():
    import mymodule
    print "mymodule path", mymodule
    return mymodule.l
add(1)
print "lets check",get()
add(1)
print "lets check again",get()
$ export PYTHONPATH=.
$ python mypackage/main.py
mypackage.mymodule path <module 'mypackage.mymodule' from '/tmp/mypackage/mymodule.pyc'>
mymodule path <module 'mymodule' from '/tmp/mypackage/mymodule.pyc'>
But add another main file, in the current directory:
realmain.py:
from mypackage import main
and the result is different:
mypackage.mymodule path <module 'mypackage.mymodule' from '/tmp/mypackage/mymodule.pyc'>
mymodule path <module 'mypackage.mymodule' from '/tmp/mypackage/mymodule.pyc'>
So I suspect that you have your main python file within the package. And in that case the solution is to not do that. :-)