Shouldn't the imports be absolute by default in Python 2.7?

Imagine the directory structure:
/
    a/
        __init__.py
        b.py
        c.py
    c.py
File /a/b.py looks like:
import c
should_be_absolute = c
All the other files (including __init__) are empty.
When running a test script (using python 2.7):
import a.b
print a.b.should_be_absolute
with PYTHONPATH=/ from an empty directory (so nothing is added to PYTHONPATH from current directory) I get
<module 'a.c' from '/a/c.py'>
where, according to PEP 328 and the statement that import <> is always absolute, I would expect:
<module 'c' from '/c.py'>
The output is as expected when I remove the /a/c.py file.
What am I missing? And if this is the correct behavior - how to import the c module from b (instead of a.c)?
Update:
According to the python-dev mailing list, it appears to be a bug in the documentation. Imports are not absolute by default in Python 2.7.

You need to add from __future__ import absolute_import, or use importlib.import_module('c'), on Python 2.7.
It is the default on Python 3.
There was a bug in Python: __future__.py and its documentation claim absolute imports became mandatory in 2.7, but they didn't.
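For concreteness, a minimal sketch of the fix in /a/b.py (only the __future__ line is new; the rest is the code from the question):

# /a/b.py (Python 2.7)
from __future__ import absolute_import   # make "import c" absolute, as in Python 3

import c                      # now resolves to /c.py, not /a/c.py
should_be_absolute = c

# if the sibling /a/c.py is what you actually want, be explicit:
# from . import c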

If you are only adding / to your PYTHONPATH, then the search order could still be looking for c in the current directory. It would be a lot better if you placed everything under a root package, and referred to it absolutely:
/myPackage/
    a/
        __init__.py
        b.py
        c.py
    __init__.py
    c.py
And a PYTHONPATH like: export PYTHONPATH=/:$PYTHONPATH
So in your a/b.py you would do any of these:
from myPackage import c
from myPackage.c import Foo
import myPackage.c
This way, it is always relative to your package.

"Absolute" doesn't mean the one that you think it does; instead, it means that the "usual" package resolving procedure takes place: first, it looks in the directory of the package, then in all the elements of sys.path; which includes the elements from PYTHONPATH.
If you really want to, you can use tools like the imp module, but I'd recommend against it for something like this. Because in general, you shouldn't ever have to create a module with the same name as one in the standard Python distribution.

Related

File not found on import when the same script is imported onto two other python scripts [duplicate]

This is a python newbie question:
I have the following directory structure:
test/
    test_file.py
a/
    b/
        module.py
where test, a and b are folders. Both test and a are on the same level.
module.py has a class called shape, and I want to instantiate an instance of it in test_file.py. How can I do so?
I have tried:
from a.b import module
but I got:
ImportError: No module named a.b
What you want is a relative import like:
from ..a.b import module
The problem with this is that it doesn't work if you are calling test_file.py as your main module. As stated here:
Note that both explicit and implicit relative imports are based on the name of the current module. Since the name of the main module is always "__main__", modules intended for use as the main module of a Python application should always use absolute imports.
So, if you want to call test_file.py as your main module, then you should consider changing the structure of your modules and using an absolute import, else just use the relative import from above.
The directory a needs to be a package. Add an __init__.py file to make it a package, which is a step up from being a simple directory.
The directory b also needs to be a subpackage of a. Add an __init__.py file.
The directory test should probably also be a package. Hard to say if this is necessary or not. It's usually a good idea for every directory of Python modules to be a formal package.
In order to import, the package needs to be on sys.path; this is built from the directory of the script being run (or the current directory, in interactive mode), the PYTHONPATH environment variable, and the installation defaults. By default, the installed site-packages directories and the script's directory are (effectively) the only places where a package can be found.
That means that a must either be installed, or your current working directory must be the directory one level above a, OR you need to set your PYTHONPATH environment variable to include the directory that contains a.
http://docs.python.org/tutorial/modules.html#the-module-search-path
http://docs.python.org/using/cmdline.html#envvar-PYTHONPATH
Also, http://docs.python.org/library/site.html for complete information on how sys.path is built.
The first thing to do would be to quickly browse the official docs on this.
To make a directory a package, you'll have to add an __init__.py file. This means that you'll have such a file in both the a and b directories. Then you can directly do
import a.b.module
But you'll have to refer to it as a.b.module, which is tedious, so you can use the as form of the import like so:
import a.b.module as mod  # shorter name
and refer to it as mod.
Then you can instantiate things inside mod using the regular conventions like mod.shape().
There are a few other subtleties. Please go through the docs for details.
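Putting the answers above together, a minimal sketch (assuming module.py really does define a class called shape, as stated in the question):

a/
    __init__.py
    b/
        __init__.py
        module.py        # defines class shape (per the question)
test/
    test_file.py

# test/test_file.py
import os
import sys

# put the common parent of "test" and "a" on sys.path
# (setting PYTHONPATH to that directory works just as well)
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))

import a.b.module as mod

my_shape = mod.shape()   # instantiate the class defined in module.py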

Why does importing in imported modules work differently?

If I have
a.py
b.py
In b.py I can import a
But if I have
c.py
m/
    a.py
    b.py
and in c.py do import m.b, suddenly in b.py I get ModuleNotFoundError: No module named 'a'
What's the problem? I don't see how the second case is any different from the first
So... the modules are searched in the directory of the module that was started initially. I just don't understand the reasoning.
I'm not asking how to fix the problem. But rather asking why there's a problem in the first place...
(tested on Python 3.8.8)
It depends on which file you run Python with.
If you run python c.py,
c.py should use:
from m import a, b
and b.py should also use:
from m import a
If you run python b.py, you just need import a inside b.py.
So the file you run determines the reference point for the imports.
Let's start with why your first one works.
Assuming you have a file structure such that:
c.py
m/
    __init__.py
    a.py
    b.py
Let's say your current directory is within the m folder and you run the command.
python b.py
If within your b.py you place something like:
import a
import sys
print(sys.path)
it will work just fine. But let's take a look at the output from print(sys.path).
On Windows, this will look something like ['C:\\path\\to\\myprogram\\m', '<path to python installation>', etc.]
The important thing is that the m folder is on your sys.path and that is how import a gets resolved.
However if we go up one level and run:
python c.py
you should immediately notice that m is no longer on your sys.path and instead is replaced by 'C:\\path\\to\\myprogram'.
That is why it fails. Python automatically puts the directory of the script you run at the front of sys.path, so when you run c.py from one level up, Python no longer knows where to look for the import.
This is an example of an absolute import. You can manipulate where python looks for these imports by modifying sys.path to include the path of the file you want to import.
sys.path.append('C:\\path\\to\\myprogram\\m')
But there's a better way to do it. If m is a package or sub-package (includes an __init__.py) you can use a relative import.
from . import a
However, there is a small caveat to this.
You can only use relative imports when the file using them is being run as a module or package.
So running them directly as a top level script
python b.py
will produce
ImportError: attempted relative import with no known parent package
Luckily, Python has the ability to run a script as a module built in.
If you cd one level up, so that you are in the directory that contains the m package, you can run
python -m m.b
and have it run just fine.
Additionally, since b.py is being treated as a module in c.py because of the import, the relative import will work in this case as well.
python c.py
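For concreteness, a sketch of the relative-import variant just described (it assumes m/ contains an __init__.py, as in the layout shown earlier):

# m/b.py
from . import a        # resolved relative to the package m, not to sys.path

# from the directory containing both c.py and m/, either of these works:
#   python -m m.b       (run b as a module inside the package m)
#   python c.py         (c.py does "import m.b", so b gets a parent package)
# running "python m/b.py" directly would raise:
#   ImportError: attempted relative import with no known parent package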

Attempted relative import in non-package for optional package [duplicate]

Imagine this directory structure:
app/
    __init__.py
    sub1/
        __init__.py
        mod1.py
    sub2/
        __init__.py
        mod2.py
I'm coding mod1, and I need to import something from mod2. How should I do it?
I tried from ..sub2 import mod2 but I'm getting an "Attempted relative import in non-package".
I googled around but found only "sys.path manipulation" hacks. Isn't there a clean way?
Edit: all my __init__.py's are currently empty
Edit2: I'm trying to do this because sub2 contains classes that are shared across sub packages (sub1, subX, etc.).
Edit3: The behaviour I'm looking for is the same as described in PEP 366 (thanks John B)
Everyone seems to want to tell you what you should be doing rather than just answering the question.
The problem is that you're running the module as '__main__' by passing the mod1.py as an argument to the interpreter.
From PEP 328:
Relative imports use a module's __name__ attribute to determine that module's position in the package hierarchy. If the module's name does not contain any package information (e.g. it is set to '__main__') then relative imports are resolved as if the module were a top level module, regardless of where the module is actually located on the file system.
In Python 2.6, they're adding the ability to reference modules relative to the main module. PEP 366 describes the change.
Update: According to Nick Coghlan, the recommended alternative is to run the module inside the package using the -m switch.
Here is the solution which works for me:
I do the relative imports as from ..sub2 import mod2
and then, if I want to run mod1.py then I go to the parent directory of app and run the module using the python -m switch as python -m app.sub1.mod1.
The real reason why this problem occurs with relative imports is that relative imports work by taking the __name__ property of the module. If the module is being directly run, then __name__ is set to __main__ and it doesn't contain any information about the package structure. And that's why Python complains about the relative import in non-package error.
So, by using the -m switch you provide the package structure information to python, through which it can resolve the relative imports successfully.
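For concreteness, a minimal sketch of that approach (nothing here beyond what the answer describes):

# app/sub1/mod1.py
from ..sub2 import mod2    # explicit relative import: needs mod1 to be run as part of the package

if __name__ == '__main__':
    print(mod2)

# from the directory *containing* app/, run:
#   python -m app.sub1.mod1
# running "python app/sub1/mod1.py" directly fails with
# "Attempted relative import in non-package" (Python 2) or
# "attempted relative import with no known parent package" (Python 3)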
I have encountered this problem many times while doing relative imports. And, after reading all the previous answers, I was still not able to figure out how to solve it in a clean way, without needing to put boilerplate code in all files. (Though some of the comments were really helpful, thanks to @ncoghlan and @XiongChiamiov.)
Hope this helps someone who is fighting with the relative-imports problem, because going through the PEPs is really not fun.
main.py
setup.py
app/
    __init__.py
    package_a/
        __init__.py
        module_a.py
    package_b/
        __init__.py
        module_b.py
1. You run python main.py.
2. main.py does: import app.package_a.module_a
3. module_a.py does: import app.package_b.module_b
Alternatively, 2 or 3 could use: from app.package_a import module_a
That will work as long as you have app in your PYTHONPATH. main.py could be anywhere then.
So you write a setup.py to copy (install) the whole app package and subpackages to the target system's python folders, and main.py to target system's script folders.
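For reference, a minimal setup.py sketch for that layout (assuming setuptools; the name and version are placeholders):

# setup.py
from setuptools import setup, find_packages

setup(
    name='app',                 # placeholder metadata
    version='0.1',
    packages=find_packages(),   # picks up app, app.package_a, app.package_b
    scripts=['main.py'],        # installed into the target system's script folder
)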
"Guido views running scripts within a package as an anti-pattern" (rejected
PEP-3122)
I have spent so much time trying to find a solution, reading related posts here on Stack Overflow and saying to myself "there must be a better way!". Looks like there is not.
This is solved 100%:
app/
    main.py
settings/
    local_settings.py
Import settings/local_settings.py in app/main.py:
main.py:
import sys
sys.path.insert(0, "../settings")

try:
    from local_settings import *
except ImportError:
    print('No Import')
explanation of nosklo's answer with examples
note: all __init__.py files are empty.
main.py
app/
    __init__.py
    package_a/
        __init__.py
        fun_a.py
    package_b/
        __init__.py
        fun_b.py
app/package_a/fun_a.py
def print_a():
    print 'This is a function in dir package_a'
app/package_b/fun_b.py
from app.package_a.fun_a import print_a

def print_b():
    print 'This is a function in dir package_b'
    print 'going to call a function in dir package_a'
    print '-'*30
    print_a()
main.py
from app.package_b import fun_b
fun_b.print_b()
if you run $ python main.py it returns:
This is a function in dir package_b
going to call a function in dir package_a
------------------------------
This is a function in dir package_a
main.py does: from app.package_b import fun_b
fun_b.py does from app.package_a.fun_a import print_a
So a file in folder package_b used a file in folder package_a, which is what you want. Right?
import os
import sys

def import_path(fullpath):
    """
    Import a file with full path specification. Allows one to
    import from anywhere, something __import__ does not do.
    """
    path, filename = os.path.split(fullpath)
    filename, ext = os.path.splitext(filename)
    sys.path.append(path)
    module = __import__(filename)
    reload(module)  # might be out of date; reload is a builtin in Python 2 (importlib.reload on Python 3)
    del sys.path[-1]
    return module
I'm using this snippet to import modules from paths, hope that helps
This is unfortunately a sys.path hack, but it works quite well.
I encountered this problem with another layer: I already had a module of the specified name, but it was the wrong module.
what I wanted to do was the following (the module I was working from was module3):
mymodule\
    __init__.py
    mymodule1\
        __init__.py
        mymodule1_1
    mymodule2\
        __init__.py
        mymodule2_1
import mymodule.mymodule1.mymodule1_1
Note that I have already installed mymodule, but in my installation I do not have "mymodule1"
and I would get an ImportError because it was trying to import from my installed modules.
I tried to do a sys.path.append, and that didn't work. What did work was a sys.path.insert
import sys

if __name__ == '__main__':
    sys.path.insert(0, '../..')
So kind of a hack, but got it all to work!
So keep in mind: if you want your path to override other paths, you need to use sys.path.insert(0, pathname) to get it to work! This was a very frustrating sticking point for me; a lot of people say to use the "append" function on sys.path, but that doesn't work if a module of the same name is already found earlier on the path (I find it very strange behaviour).
Let me just put this here for my own reference. I know that it is not good Python code, but I needed a script for a project I was working on and I wanted to put the script in a scripts directory.
import os.path
import sys
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))
As @EvgeniSergeev says in the comments to the OP, you can import code from a .py file at an arbitrary location with:
import imp
foo = imp.load_source('module.name', '/path/to/file.py')
foo.MyClass()
This is taken from this SO answer.
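Note that imp has been deprecated for a long time and was removed in Python 3.12; on modern Python the equivalent, using only importlib, looks roughly like this (same placeholder name and path as above):

import importlib.util

spec = importlib.util.spec_from_file_location('module.name', '/path/to/file.py')
foo = importlib.util.module_from_spec(spec)
spec.loader.exec_module(foo)   # actually execute the module's code
foo.MyClass()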
Take a look at http://docs.python.org/whatsnew/2.5.html#pep-328-absolute-and-relative-imports. You could do
from .mod1 import stuff
From the Python docs:
In Python 2.5, you can switch import's behaviour to absolute imports using a from __future__ import absolute_import directive. This absolute-import behaviour will become the default in a future version (probably Python 2.7). Once absolute imports are the default, import string will always find the standard library's version. It's suggested that users should begin using absolute imports as much as possible, so it's preferable to begin writing from pkg import string in your code.
I found it easier to set the PYTHONPATH environment variable to the top folder:
bash$ export PYTHONPATH=/PATH/TO/APP
then:
import sub1.func1
#...more import
Of course, PYTHONPATH is "global", but it hasn't caused me trouble yet.
On top of what John B said, it seems like setting the __package__ variable should help, instead of changing __main__ which could screw up other things. But as far as I could test, it doesn't completely work as it should.
I have the same problem, and neither PEP 328 nor PEP 366 solve the problem completely, as both, at the end of the day, need the head of the package to be included in sys.path, as far as I could understand.
I should also mention that I did not find how to format the string that should go into those variables. Is it "package_head.subfolder.module_name" or what?
You have to append the module’s path to PYTHONPATH:
export PYTHONPATH="${PYTHONPATH}:/path/to/your/module/"
A hacky way to do it is to append the script's directory to sys.path at runtime as follows:
import pathlib
import sys

# sys.path entries are expected to be strings, so convert the Path object
sys.path.append(str(pathlib.Path(__file__).parent.resolve()))
import file_to_import  # the actual intended import
In contrast to another solution for this question this uses pathlib instead of os.path.
This method queries and auto populates the path:
import inspect
import os
import sys

# directory that contains this file, and its parent
currentdir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
parentdir = os.path.dirname(currentdir)
sys.path.insert(1, parentdir)
# print("currentdir = ", currentdir)
# print("parentdir=", parentdir)
What a debate!
Relative newcomer to python (but years of programming experience, and dislike of perl). Relative lay-person when it comes to the dark art of Apache setup, but I know what I (think I) need to get my little experimental projects working at home.
Here is my summary of what the situation seems to be.
If I use the -m 'module' approach, I need to:-
dot it all together;
run it from a parent folder;
lose the '.py';
create an empty (!) __init__.py file in every sub-folder.
How does that work in a cgi environment, where I have aliased my scripts directory, and want to run a script directly as /dirAlias/cgi_script.py??
Why is amending sys.path a hack? The python docs page states: "A program is free to modify this list for its own purposes." If it works, it works, right? The bean counters in Accounts don't care how it works.
I just want to go up one level and down into a 'modules' dir:-
.../py
    /cgi
    /build
    /modules
so my 'modules' can be imported from either the cgi world or the server world.
I've tried the -m/modules approach but I think I prefer the following (and am not confused how to run it in cgi-space):-
Create XX_pathsetup.py in the /path/to/python/Lib dir (or any other dir in the default sys.path list). 'XX' is some identifier that declares an intent to set up my path according to the rules in the file.
In any script that wants to be able to import from the 'modules' dir in the above directory config, simply import XX_pathsetup.
And here's my really simple XX_pathsetup.py:
import sys, os
pypath = sys.path[0].rsplit(os.sep,1)[0]
sys.path.insert( 0, pypath+os.sep+'modules' )
Not a 'hack', IMHO. 1 small file to put in the python 'Lib' dir, one import statement which declares intent to modify the path search order.

Why can I import successfully without __init__.py?

What exactly is the use of __init__.py? Yes, I know this file makes a directory into an importable package. However, consider the following example:
project/
    foo/
        __init__.py
        a.py
    bar/
        b.py
If I want to import a into b, I have to add the following statements:
sys.path.append('/path_to_foo')
import foo.a
This will run successfully with or without __init__.py. However, if there is no sys.path.append statement, a "no module" error will occur, with or without __init__.py. This makes it seem like only the system path matters, and that __init__.py does not have any effect.
Why would this import work without __init__.py?
__init__.py has nothing to do with whether Python can find your package. You've run your code in such a way that your package isn't on the search path by default, but if you had run it differently or configured your PYTHONPATH differently, the sys.path.append would have been unnecessary.
__init__.py used to be necessary to create a package, and in most cases, you should still provide it. Since Python 3.3, though, a folder without an __init__.py can be considered part of an implicit namespace package, a feature for splitting a package across multiple directories.
During import processing, the import machinery will continue to iterate over each directory in the parent path as it does in Python 3.2. While looking for a module or package named "foo", for each directory in the parent path:
If <directory>/foo/__init__.py is found, a regular package is imported and returned.
If not, but <directory>/foo.{py,pyc,so,pyd} is found, a module is imported and returned. The exact list of extensions varies by platform and whether the -O flag is specified. The list here is representative.
If not, but <directory>/foo is found and is a directory, it is recorded and the scan continues with the next directory in the parent path.
Otherwise the scan continues with the next directory in the parent path.
If the scan completes without returning a module or package, and at least one directory was recorded, then a namespace package is created.
If you really want to avoid __init__.py for some reason, don't mess with sys.path. Rather, create a module object and set its __path__ to a list of directories.
If I want to import a into b, I have to add the following statements:
No! You'd just say import foo.a. All this is provided you run the entire package at once using python -m main.module, where main.module is the entry point to your entire application. It imports all other modules, and the modules that import more modules will look for them from the root of this project. For instance, foo.bar.c would import its sibling as foo.bar.b.
Then it seems that only the system path matters and __init__.py does not have any effect.
You need to modify sys.path only when you are importing modules from locations that are not in your project, or the places where python looks for libraries. __init__.py not only makes a folder look like a package, it also does a few more things like "export" objects to outside world (__all__)
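For example (a sketch; something_public is a made-up name, purely for illustration):

# foo/__init__.py -- besides marking foo as a package, it can re-export names
from .a import something_public      # hypothetical name defined in foo/a.py
__all__ = ['something_public']       # controls what "from foo import *" pulls in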
When you import something it has to either:
Retrieve an already loaded module or
Load the module that was imported
When you do import foo and Python finds a folder called foo in a folder on your sys.path, it will look in that folder for an __init__.py before treating it as a package.
(Note that if the package is not on your sys.path then you would need to append its location to be able to import it.)
If that is not present, it will look for an __init__.pyc version, possibly in the __pycache__ folder; if that is also missing, then the folder foo is not considered a loadable Python package. If no other options for foo are found, an ImportError is raised.
If you try deleting the __init__.pyc file as well, you will see that the initializer script for a package is indeed necessary.
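For what it's worth, here is a small sketch of the Python 3.3+ namespace-package behaviour mentioned above; the layout mirrors the question, just without foo/__init__.py:

project/
    foo/            # no __init__.py: becomes an implicit namespace package on Python 3.3+
        a.py
    bar/
        b.py

# bar/b.py
import os
import sys

# put project/ on sys.path (exporting PYTHONPATH=/path_to_project works just as well)
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))

import foo.a        # succeeds without __init__.py on Python 3.3+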
