How do I import a Python module given its relative path?
For example, if dirFoo contains Foo.py and dirBar, and dirBar contains Bar.py, how do I import Bar.py into Foo.py?
Here's a visual representation:
dirFoo\
    Foo.py
    dirBar\
        Bar.py
Foo wishes to include Bar, but restructuring the folder hierarchy is not an option.
Assuming that both of your directories are real Python packages (they contain an __init__.py file), here is a safe solution for including modules relative to the location of the script.
I assume you want to do this because you need to ship a set of modules with your script. I use this in production in several products, and it works in many special scenarios, such as scripts called from another directory or executed with exec rather than by opening a new interpreter.
import os, sys, inspect
# realpath() will make your script run, even if you symlink it :)
cmd_folder = os.path.realpath(os.path.abspath(os.path.split(inspect.getfile( inspect.currentframe() ))[0]))
if cmd_folder not in sys.path:
sys.path.insert(0, cmd_folder)
# Use this if you want to include modules from a subfolder
cmd_subfolder = os.path.realpath(os.path.abspath(os.path.join(os.path.split(inspect.getfile( inspect.currentframe() ))[0],"subfolder")))
if cmd_subfolder not in sys.path:
sys.path.insert(0, cmd_subfolder)
# Info:
# cmd_folder = os.path.dirname(os.path.abspath(__file__)) # DO NOT USE __file__ !!!
# __file__ fails if the script is called in different ways on Windows.
# __file__ fails if someone does os.chdir() before.
# sys.argv[0] also fails, because it doesn't always contain the path.
As a bonus, this approach lets you force Python to use your module instead of one installed on the system.
Warning! I don't really know what happens when the current module is inside an egg file. It probably fails too.
Be sure that dirBar has the __init__.py file -- this makes a directory into a Python package.
You could also add the subdirectory to your Python path so that it can be imported like a normal module:
import sys
sys.path.insert(0, <path to dirBar>)  # the directory that contains Bar.py
import Bar
import os
import sys

# Add a 'lib' directory two levels up from this file's directory to sys.path
lib_path = os.path.abspath(os.path.join(__file__, '..', '..', '..', 'lib'))
sys.path.append(lib_path)
import mymodule
Just do these simple things to import a .py file from a different folder.
Let's say you have a directory like:
lib/abc.py
Then just keep an empty file named
__init__.py
in the lib folder. And then use
from lib.abc import <Your Module name>
Keep an __init__.py file in every folder along the hierarchy of the imported module.
If you structure your project this way:
src\
    __init__.py
    main.py
    dirFoo\
        __init__.py
        Foo.py
    dirBar\
        __init__.py
        Bar.py
Then from main.py you should be able to do:
import dirFoo.Foo
Or:
from dirFoo.Foo import FooObject
Per Tom's comment, this does require that the src folder is accessible either via site-packages or your search path. Also, as he mentions, __init__.py is implicitly imported when you first import a module in that package/directory. Typically __init__.py is simply an empty file.
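For the original question (Foo importing Bar), a minimal sketch under this layout, assuming src is on the search path as described; BarObject is a made-up name:
# src/dirFoo/Foo.py
from dirBar.Bar import BarObject  # dirBar is importable because src is on the path

def make_bar():
    return BarObject()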
The easiest method is to use sys.path.append().
However, you may also be interested in the imp module.
It provides access to internal import functions.
import imp

# mod_name is the filename without the .py/.pyc extension
py_mod = imp.load_source(mod_name, filename_path)    # Loads a .py file
py_mod = imp.load_compiled(mod_name, filename_path)  # Loads a .pyc file
This can be used to load modules dynamically when you don't know a module's name.
I've used this in the past to create a plugin-type interface to an application, where the user would write a script with application-specific functions and just drop their script in a specific directory.
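A minimal sketch of that plugin pattern, assuming a hypothetical plugins directory of user scripts (imp.load_source is the same Python 2-era API used above; importlib offers the modern equivalent):
import imp
import os

plugin_dir = 'plugins'  # hypothetical directory containing the user scripts
plugins = {}
for fname in os.listdir(plugin_dir):
    if fname.endswith('.py'):
        mod_name = fname[:-3]  # file name without the .py extension
        plugins[mod_name] = imp.load_source(mod_name, os.path.join(plugin_dir, fname))
# Each loaded module can then expose application-specific functions, e.g. plugins['foo'].run()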
Also, these functions may be useful:
imp.find_module(name[, path])
imp.load_module(name, file, pathname, description)
This is the relevant PEP:
http://www.python.org/dev/peps/pep-0328/
In particular, presuming dirBar sits inside dirFoo, as in the question...
In dirFoo\Foo.py:
from .dirBar import Bar
The easiest way, without any modification to your script, is to set the PYTHONPATH environment variable, because sys.path is initialized from these locations:
The directory containing the input script (or the current directory).
PYTHONPATH (a list of directory names, with the same syntax as the shell variable PATH).
The installation-dependent default.
Just run:
export PYTHONPATH=/absolute/path/to/your/module
Your sys.path will then contain the above path, as shown below:
print sys.path
['', '/absolute/path/to/your/module', '/usr/lib/python2.7', '/usr/lib/python2.7/plat-linux2', '/usr/lib/python2.7/lib-tk', '/usr/lib/python2.7/lib-old', '/usr/lib/python2.7/lib-dynload', '/usr/local/lib/python2.7/dist-packages', '/usr/lib/python2.7/dist-packages', '/usr/lib/python2.7/dist-packages/PIL', '/usr/lib/python2.7/dist-packages/gst-0.10', '/usr/lib/python2.7/dist-packages/gtk-2.0', '/usr/lib/pymodules/python2.7', '/usr/lib/python2.7/dist-packages/ubuntu-sso-client', '/usr/lib/python2.7/dist-packages/ubuntuone-client', '/usr/lib/python2.7/dist-packages/ubuntuone-control-panel', '/usr/lib/python2.7/dist-packages/ubuntuone-couch', '/usr/lib/python2.7/dist-packages/ubuntuone-installer', '/usr/lib/python2.7/dist-packages/ubuntuone-storage-protocol']
In my opinion the best choice is to put __init__.py in the folder and call the file with
from dirBar.Bar import *
It is not recommended to use sys.path.append(), because something might go wrong if you use the same file name as an existing Python package. I haven't tested that, but it would be ambiguous.
The quick-and-dirty way for Linux users
If you are just tinkering around and don't care about deployment issues, you can use a symbolic link (assuming your filesystem supports it) to make the module or package directly visible in the folder of the requesting module.
ln -s (path)/module_name.py
or
ln -s (path)/package_name
Note: A "module" is any file with a .py extension and a "package" is any folder that contains the file __init__.py (which can be an empty file). From a usage standpoint, modules and packages are identical -- both expose their contained "definitions and statements" as requested via the import command.
See: http://docs.python.org/2/tutorial/modules.html
from .dirBar import Bar
instead of:
from dirBar import Bar
just in case another dirBar is installed somewhere and could confuse a reader of Foo.py.
For this case to import Bar.py into Foo.py, first I'd turn these folders into Python packages like so:
dirFoo\
    __init__.py
    Foo.py
    dirBar\
        __init__.py
        Bar.py
Then I would do it like this in Foo.py:
from .dirBar import Bar
If I wanted the namespacing to look like Bar.whatever, or
from . import dirBar
If I wanted the namespacing dirBar.Bar.whatever. This second case is useful if you have more modules under the dirBar package.
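A small illustration of the two styles (do_something is a made-up name; Foo.py must itself be run as part of the package, e.g. python -m dirFoo.Foo from dirFoo's parent directory, for relative imports to work):
from .dirBar import Bar
Bar.do_something()          # namespacing looks like Bar.whatever

from . import dirBar
dirBar.Bar.do_something()   # namespacing looks like dirBar.Bar.whatever
                            # (works here because the Bar submodule was imported above)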
Add an __init__.py file:
dirFoo\
    Foo.py
    dirBar\
        __init__.py
        Bar.py
Then add this code to the start of Foo.py:
import sys
sys.path.append('dirBar')  # a relative path, resolved against the current working directory
import Bar
Relative sys.path example:
# /lib/my_module.py
# /src/test.py
import os
import sys

if __name__ == '__main__' and __package__ is None:
    sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '../lib')))
import my_module
Based on this answer.
Well, as you mention, usually you want to have access to a folder with your modules relative to where your main script is run, so you just import them.
Solution:
I have the script in D:/Books/MyBooks.py and some modules (like oldies.py). I need to import from subdirectory D:/Books/includes:
import sys,site
site.addsitedir(sys.path[0] + '\\includes')
print (sys.path) # Just verify it is there
import oldies
Place a print('done') in oldies.py, so you can verify that everything is going OK. This always works because, by Python's definition, sys.path is initialized upon program startup, and the first item of this list, path[0], is the directory containing the script that was used to invoke the Python interpreter.
If the script directory is not available (e.g. if the interpreter is invoked interactively or if the script is read from standard input), path[0] is the empty string, which directs Python to search modules in the current directory first. Notice that the script directory is inserted before the entries inserted as a result of PYTHONPATH.
Another solution would be to install the py-require package and then use the following in Foo.py
import require
Bar = require('./dirBar/Bar')
You can simply use: from Desktop.filename import something
Example:
given that the file is named test.py in the directory
Users/user/Desktop, and you want to import everything,
the code:
from Desktop.test import *
But make sure you create an empty file called "__init__.py" in that directory.
Here's a way to import a file from one level above, using the relative path.
Basically, just move the working directory up a level (or any relative location), add that to your path, then move the working directory back where it started.
# To import from one level above:
import os
import sys

cwd = os.getcwd()
os.chdir("..")
below_path = os.getcwd()
sys.path.append(below_path)
os.chdir(cwd)
I'm not very experienced with Python, so if anything in my words is wrong, just tell me. If your file hierarchy is arranged like this:
project\
    module_1.py
    module_2.py
module_1.py defines a function called func_1(). module_2.py:
from module_1 import func_1

def func_2():
    func_1()

if __name__ == '__main__':
    func_2()
and you run python module_2.py in cmd, it will run what func_1() defines. That's usually how we import files at the same hierarchy level. But when you write from .module_1 import func_1 in module_2.py, the Python interpreter will say No module named '__main__.module_1'; '__main__' is not a package. So to fix this, we keep the change we just made, move both modules into a package, and add a third module as a caller to run module_2.py.
project\
    package_1\
        module_1.py
        module_2.py
    main.py
main.py:
from package_1.module_2 import func_2

def func_3():
    func_2()

if __name__ == '__main__':
    func_3()
But the reason we add a . before module_1 in module_2.py is that if we don't, and we run main.py, the Python interpreter will say No module named 'module_1'. That's a little tricky, since module_1.py is right beside module_2.py. Now let func_1() in module_1.py do something:
def func_1():
    print(__name__)
__name__ records who calls func_1. Now we keep the . before module_1 and run main.py; it prints package_1.module_1, not module_1. This indicates that the caller of func_1() is at the same hierarchy level as main.py, while the . implies that module_1 is at the same hierarchy level as module_2.py itself. So if there were no dot, main.py would look for module_1 at the same level as itself: it can recognize package_1, but not what is "under" it.
Now let's make it a bit more complicated. You have a config.ini, and a module that defines a function to read it, at the same hierarchy level as main.py.
project\
    package_1\
        module_1.py
        module_2.py
    config.py
    config.ini
    main.py
And for some unavoidable reason, you have to call it from module_2.py, so it has to import from the upper hierarchy. module_2.py:
from .. import config
pass
Two dots mean import from the upper hierarchy (three dots reach one level higher than that, and so on). Now if we run main.py, the interpreter says ValueError: attempted relative import beyond top-level package. The top-level package here is package_1; main.py itself isn't part of any package. Since config.py sits beside main.py, it is outside package_1, not "under" it, so the relative import reaches beyond the top-level package. The simplest way to fix this is:
project\
    package_1\
        module_1.py
        module_2.py
        config.py
        config.ini
    main.py
I think that coincides with the principle of arranging a project's file hierarchy: you should arrange modules with different functions in different folders, leave just a top-level caller on the outside, and then you can import however you want.
This also works, and is much simpler than anything with the sys module:
with open("C:/yourpath/foobar.py") as f:
    exec(f.read())
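A similar standard-library option, not from the original answer: runpy executes a file by path and hands back its globals (the path below is the same example path):
import runpy

namespace = runpy.run_path("C:/yourpath/foobar.py")
# namespace is a dict of the names foobar.py defined, e.g. namespace['some_function']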
Call me overly cautious, but I like to make mine more portable because it's unsafe to assume that files will always be in the same place on every computer. Personally I have the code look up the file path first. I use Linux so mine would look like this:
import os, sys
from subprocess import Popen, PIPE
try:
    path = Popen("find / -name 'file' -type f", shell=True, stdout=PIPE).stdout.read().splitlines()[0]
    if path not in sys.path:
        sys.path.append(path)
except IndexError:
    raise RuntimeError("You must have FILE to run this program!")
That is of course unless you plan to package these together. But if that's the case you don't really need two separate files anyway.
Related
If I have a project with two files, main.py and main2.py, and in main2.py I do import sys; sys.path.append(path) and then import main, which main.py will be imported? The one in my project or the one in path? (The question only poses itself if there are two main.py files, obviously.)
Can I force Python to import a specific one?
The one that is in your project.
According to the Python docs, the import order is:
Built-in Python modules — you can see the list in sys.builtin_module_names (already-imported modules are cached in sys.modules).
The sys.path entries.
The installation-dependent default locations.
You added your directory to sys.path, so it depends on the order of the entries in sys.path.
You can check the order of imports by running this code:
import sys
print(sys.path)
As you will observe, the first entry of sys.path is your module's directory ("project directory") and the last is the one you appended.
You can force Python to use the outer directory by adding it at the beginning of your sys.path:
sys.path.insert(0, path)
Although I would recommend against using the same name for two modules, as it is often hard to manage.
Another thing to keep in mind is that you can't import relative paths to your sys.path. Check this answer for more information on how to work around that: https://stackoverflow.com/a/35259170/6599912
When two paths contain the same module name, using sys.path.append(path) isn't going to change the priority of the imports, since append puts path at the end of the python path list
I'd use
sys.path.insert(0,path)
so modules from path come first
Example: my module foo.py imports main and there's one main.py in the same directory and one main.py in the sub directory
foo.py
main.py
sub/main.py
in foo.py if I do:
import sys
sys.path.append("sub")
import main
print(main.__file__)
I get the main.py file path of the current directory, because sys.path always starts with the main script's directory.
So I could remove that directory from sys.path but it's a bad idea because I could need other modules from there.
Now if I use insert instead:
import sys
sys.path.insert(0,"sub")
import main
print(main.__file__)
I get sub/main.py as sub is the first directory python looks into when searching modules.
When defining multiple python 3 packages in a project, I cannot determine how to configure the environment (using VS Code) such that I can run a file in any package with intra-package imports without breaking extra-package imports.
Example with absolute paths:
src/
    foo/
        a.py
        b.py
    goo/
        c.py
# b.py contents
from foo.a import *
# c.py contents
from foo.b import *
where PYTHONPATH="${workspaceFolder}\src", so it should be able to see the foo and goo directories.
I can run c.py, but running b.py gives a ModuleNotFoundError: "No module named 'foo'".
Modifying b.py to use relative paths:
# modified b.py contents
from a import *
Then allows me to run b.py, but attempting to then run c.py gives a ModuleNotFoundError: "No module named 'b'".
I have also added __init__.py files to the foo/ and goo/ directories, but the errors still occur.
I did set the cwd in the launch.json file to be the ${workspaceFolder} directory, if that is relevant.
A solution I was trying to avoid, but does work in-case there are no better solutions, is putting each individual package-dependency onto the PYTHONPATH environment variable.
E.g., "${workspaceFolder}\src;${workspaceFolder}\foo" when running c.py.
But this is not an easily scalable solution and is also something that needs to be tracked across project changes, so I do not think it is a particularly good or pythonic solution.
In fact, you can see that it didn't work if you print the current path: it just adds ".." to the path instead of the parent directory.
import sys
print(sys.path)
sys.path.append('..')
print(sys.path)
So, what we have to do is add the path of the module's directory to sys.path.
Here we have three ways:
Put the .py file you wrote into a directory that has already been added to the system environment variable;
Create a new .pth file under \Python\Python310\Lib\site-packages and write the path of the module in this new .pth file, one path per line (see the sketch after this list);
Use the PYTHONPATH environment variable, as mentioned above.
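A minimal sketch of the .pth approach from the list above (the site-packages location, file name, and module directory below are all hypothetical):
# Create a .pth file inside site-packages; every non-comment line in it is a
# directory that Python appends to sys.path at interpreter startup.
pth_file = r"C:\Python\Python310\Lib\site-packages\my_paths.pth"
with open(pth_file, "w") as f:
    f.write("D:\\projects\\my_modules\n")  # one path per line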
Then we can use these two ways to import:
sys.path.append('a')
sys.path.insert(0, 'a')
I have the following directory tree:
project/
    A/
        __init__.py
        foo.py
    TestA/
        __init__.py
        testFoo.py
the content of testFoo is:
import unittest
from A import foo
from the project directory I run python testA/testFoo.py
I get a ModuleNotFoundError No module named A
I have two questions: how do I import and run A.foo from TestA.testFoo, and why is it so difficult to grasp the import logic in Python? Isn't there any debugging trick to solve these kinds of issues rapidly? I'm sorry to bother you with such basic questions.
When you execute a file, Python builds the module search path (sys.path, which imports use to find your files) starting from the path of the file being executed; it can then find modules in that directory and in subdirectories containing an __init__.py file. If you want to import from a directory on the same level, you need to modify the Python path, or change the architecture of your project so that the file being executed is always at the top level.
You can add a path to your Python path like this:
import sys
sys.path.insert(0, "/path/to/the/module/directory")  # the directory containing the module, not the .py file itself
You can read more on import system : https://docs.python.org/3/reference/import.html
The best way, in my opinion, is to not touch the Python path and to put your test directory inside the directory that contains the files being tested:
project/
    A/
        __init__.py
        foo.py
        TestA/
            __init__.py
            testFoo.py
Then run the python -m unittest command from your A or project directory; it will search your current directory and subdirectories for tests and execute them (see the example below).
More on unittest here: https://docs.python.org/3/library/unittest.html
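For example, assuming the nested layout above and running from the project directory (testFoo.py matches the default test*.py discovery pattern):
cd project
python -m unittest discover
Discovery adds the top-level directory (project) to sys.path, so from A import foo in testFoo.py resolves without any manual path changes.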
Add the project folder to the Python path first, so that the package A can be found:
import sys
sys.path.insert(0, "/path/to/project")
and try the import again.
Can you try this?
Create an empty file __init__.py in the subdirectory TestA, and add this at the beginning of the main code:
from __future__ import absolute_import
Then import as below:
import A.foo as testfoo
The recommended way in Python 3 may be like below:
echo $PWD
/home/user/project
python -m testA.testFoo
Running a module with python -m is a good way to avoid problems with relative references.
You definitely cannot find A, because Python needs to look through sys.path (and PYTHONPATH) to find the module.
Python automatically adds the directory of the top-level script to sys.path, not the current directory. So if you add print(sys.path) in testFoo.py, you will see that it only adds project/TestA to sys.path.
In other words, the project folder is not included in sys.path, so how could Python find the module A?
So you have to add the project folder to sys.path yourself, and this is only needed in the top-level script, something like the following:
import unittest
import sys
import os
file_path = os.path.abspath(os.path.dirname(__file__)).replace('\\', '/')
lib_path = os.path.abspath(os.path.join(file_path, '..')).replace('\\', '/')
sys.path.append(lib_path)
Imagine this directory structure:
app/
    __init__.py
    sub1/
        __init__.py
        mod1.py
    sub2/
        __init__.py
        mod2.py
I'm coding mod1, and I need to import something from mod2. How should I do it?
I tried from ..sub2 import mod2 but I'm getting an "Attempted relative import in non-package".
I googled around but found only "sys.path manipulation" hacks. Isn't there a clean way?
Edit: all my __init__.py's are currently empty
Edit2: I'm trying to do this because sub2 contains classes that are shared across sub packages (sub1, subX, etc.).
Edit3: The behaviour I'm looking for is the same as described in PEP 366 (thanks John B)
Everyone seems to want to tell you what you should be doing rather than just answering the question.
The problem is that you're running the module as '__main__' by passing the mod1.py as an argument to the interpreter.
From PEP 328:
Relative imports use a module's __name__ attribute to determine that module's position in the package hierarchy. If the module's name does not contain any package information (e.g. it is set to '__main__') then relative imports are resolved as if the module were a top level module, regardless of where the module is actually located on the file system.
In Python 2.6, they're adding the ability to reference modules relative to the main module. PEP 366 describes the change.
Update: According to Nick Coghlan, the recommended alternative is to run the module inside the package using the -m switch.
Here is the solution which works for me:
I do the relative imports as from ..sub2 import mod2
and then, if I want to run mod1.py, I go to the parent directory of app and run the module using the python -m switch: python -m app.sub1.mod1.
The real reason why this problem occurs with relative imports is that relative imports work by taking the __name__ property of the module. If the module is being run directly, then __name__ is set to __main__ and it doesn't contain any information about the package structure. And that's why Python complains about the relative import in non-package error.
So, by using the -m switch you provide the package structure information to python, through which it can resolve the relative imports successfully.
I have encountered this problem many times while doing relative imports. And, after reading all the previous answers, I was still not able to figure out how to solve it, in a clean way, without needing to put boilerplate code in all files. (Though some of the comments were really helpful, thanks to #ncoghlan and #XiongChiamiov)
Hope this helps someone who is fighting with relative imports problem, because going through PEP is really not fun.
main.py
setup.py
app/ ->
    __init__.py
    package_a/ ->
        __init__.py
        module_a.py
    package_b/ ->
        __init__.py
        module_b.py
1. You run python main.py.
2. main.py does: import app.package_a.module_a
3. module_a.py does: import app.package_b.module_b
Alternatively, 2 or 3 could use: from app.package_a import module_a
That will work as long as you have app in your PYTHONPATH. main.py could be anywhere then.
So you write a setup.py to copy (install) the whole app package and subpackages to the target system's python folders, and main.py to target system's script folders.
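A minimal setup.py sketch for that layout, assuming setuptools (the name and version are made up):
from setuptools import setup, find_packages

setup(
    name='app',                 # hypothetical distribution name
    version='0.1',
    packages=find_packages(),   # picks up app, app.package_a and app.package_b
    scripts=['main.py'],        # installs main.py into the target system's script directory
)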
"Guido views running scripts within a package as an anti-pattern" (rejected
PEP-3122)
I have spent so much time trying to find a solution, reading related posts here on Stack Overflow and saying to myself "there must be a better way!". Looks like there is not.
This is solved 100%:
app/
    main.py
    settings/
        local_settings.py
Import settings/local_settings.py in app/main.py:
main.py:
import sys
sys.path.insert(0, "../settings")

try:
    from local_settings import *
except ImportError:
    print('No Import')
explanation of nosklo's answer with examples
note: all __init__.py files are empty.
main.py
app/ ->
    __init__.py
    package_a/ ->
        __init__.py
        fun_a.py
    package_b/ ->
        __init__.py
        fun_b.py
app/package_a/fun_a.py
def print_a():
    print 'This is a function in dir package_a'
app/package_b/fun_b.py
from app.package_a.fun_a import print_a

def print_b():
    print 'This is a function in dir package_b'
    print 'going to call a function in dir package_a'
    print '-'*30
    print_a()
main.py
from app.package_b import fun_b
fun_b.print_b()
if you run $ python main.py it returns:
This is a function in dir package_b
going to call a function in dir package_a
------------------------------
This is a function in dir package_a
main.py does: from app.package_b import fun_b
fun_b.py does from app.package_a.fun_a import print_a
So a file in folder package_b used a file in folder package_a, which is what you want, right?
import os
import sys

def import_path(fullpath):
    """
    Import a file with full path specification. Allows one to
    import from anywhere, something __import__ does not do.
    """
    path, filename = os.path.split(fullpath)
    filename, ext = os.path.splitext(filename)
    sys.path.append(path)
    module = __import__(filename)
    reload(module)  # Might be out of date
    del sys.path[-1]
    return module
I'm using this snippet to import modules from paths, hope that helps
This is unfortunately a sys.path hack, but it works quite well.
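A hypothetical usage of the helper above (the path and the function name are made up):
my_mod = import_path('/path/to/my_module.py')
my_mod.some_function()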
I encountered this problem with another layer: I already had a module of the specified name, but it was the wrong module.
what I wanted to do was the following (the module I was working from was module3):
mymodule\
    __init__.py
    mymodule1\
        __init__.py
        mymodule1_1
    mymodule2\
        __init__.py
        mymodule2_1
import mymodule.mymodule1.mymodule1_1
Note that I had already installed mymodule, but that installation did not include "mymodule1",
and I would get an ImportError because it was trying to import from my installed modules.
I tried to do a sys.path.append, and that didn't work. What did work was a sys.path.insert:
import sys

if __name__ == '__main__':
    sys.path.insert(0, '../..')
So kind of a hack, but got it all to work!
So keep in mind, if you want your choice to override other paths, then you need to use sys.path.insert(0, pathname) to get it to work! This was a very frustrating sticking point for me; a lot of people say to use the "append" function on sys.path, but that doesn't work if a module with the same name is already found on an earlier path entry (I find it very strange behavior).
Let me just put this here for my own reference. I know that it is not good Python code, but I needed a script for a project I was working on and I wanted to put the script in a scripts directory.
import os.path
import sys
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))
As #EvgeniSergeev says in the comments to the OP, you can import code from a .py file at an arbitrary location with:
import imp
foo = imp.load_source('module.name', '/path/to/file.py')
foo.MyClass()
This is taken from this SO answer.
Take a look at http://docs.python.org/whatsnew/2.5.html#pep-328-absolute-and-relative-imports. You could do
from .mod1 import stuff
From Python doc,
In Python 2.5, you can switch import's behaviour to absolute imports using a from __future__ import absolute_import directive. This absolute-import behaviour will become the default in a future version (probably Python 2.7). Once absolute imports are the default, import string will always find the standard library's version. It's suggested that users should begin using absolute imports as much as possible, so it's preferable to begin writing from pkg import string in your code
I found it easier to set the "PYTHONPATH" environment variable to the top folder:
bash$ export PYTHONPATH=/PATH/TO/APP
then:
import sub1.func1
#...more import
Of course, PYTHONPATH is "global", but it hasn't caused trouble for me yet.
On top of what John B said, it seems like setting the __package__ variable should help, instead of changing __main__, which could screw up other things. But as far as I could test, it doesn't completely work as it should.
I have the same problem, and neither PEP 328 nor 366 solves it completely, as both, at the end of the day, need the head of the package to be included in sys.path, as far as I could understand.
I should also mention that I did not find how to format the string that should go into those variables. Is it "package_head.subfolder.module_name" or what?
You have to append the module’s path to PYTHONPATH:
export PYTHONPATH="${PYTHONPATH}:/path/to/your/module/"
A hacky way to do it is to append the script's directory to sys.path at runtime, as follows:
import pathlib
import sys
sys.path.append(str(pathlib.Path(__file__).parent.resolve()))  # sys.path entries should be plain strings
import file_to_import # the actual intended import
In contrast to another solution for this question this uses pathlib instead of os.path.
This method queries and auto-populates the path:
import os
import sys
import inspect

currentdir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
parentdir = os.path.dirname(currentdir)
sys.path.insert(1, parentdir)
# print("currentdir = ", currentdir)
# print("parentdir=", parentdir)
What a debate!
I'm a relative newcomer to Python (but with years of programming experience, and a dislike of Perl). I'm a relative layperson when it comes to the dark art of Apache setup, but I know what I (think I) need to get my little experimental projects working at home.
Here is my summary of what the situation seems to be.
If I use the -m 'module' approach, I need to:-
dot it all together;
run it from a parent folder;
lose the '.py';
create an empty (!) __init__.py file in every sub-folder.
How does that work in a cgi environment, where I have aliased my scripts directory, and want to run a script directly as /dirAlias/cgi_script.py??
Why is amending sys.path a hack? The python docs page states: "A program is free to modify this list for its own purposes." If it works, it works, right? The bean counters in Accounts don't care how it works.
I just want to go up one level and down into a 'modules' dir:-
.../py
    /cgi
    /build
    /modules
so my 'modules' can be imported from either the cgi world or the server world.
I've tried the -m/modules approach, but I think I prefer the following (and am not confused about how to run it in CGI-space):-
Create XX_pathsetup.py in the /path/to/python/Lib dir (or any other dir in the default sys.path list). 'XX' is some identifier that declares an intent to set up my path according to the rules in the file.
In any script that wants to be able to import from the 'modules' dir in the above directory config, simply import XX_pathsetup.
And here's my really simple XX_pathsetup.py:
import sys, os
# Go up one level from the script's directory, then down into 'modules'
pypath = sys.path[0].rsplit(os.sep, 1)[0]
sys.path.insert(0, pypath + os.sep + 'modules')
Not a 'hack', IMHO: one small file to put in the Python 'Lib' dir, and one import statement which declares the intent to modify the path search order.