Given the following python project, created in PyDev:
├── algorithms
│   ├── __init__.py
│   └── neighborhood
│       ├── __init__.py
│       ├── neighbor
│       │   ├── connector.py
│       │   ├── __init__.py
│       │   ├── manager.py
│       │   └── references.py
│       ├── neighborhood.py
│       ├── tests
│       │   ├── fixtures
│       │   │   └── neighborhood
│       │   └── __init__.py
│       └── web
│           ├── __init__.py
│           └── service.py
├── configuration
│   ├── Config.py
│   └── __init__.py
├── __init__.py
└── webtrack
    ├── teste.py
    ├── .gitignore
    ├── __init__.py
    └── manager
        ├── Data.py
        ├── ImportFile.py
        └── __init__.py
We've been trying with no success to import modules from one folder to another, such as:
from algorithms.neighborhood.neighbor.connector import NeighborhoodConnector
Which yields the result:
Traceback (most recent call last):
  File "teste.py", line 49, in <module>
    from algorithms.neighborhood.neighbor.connector import NeighborhoodConnector
ImportError: No module named algorithms.neighborhood.neighbor.connector
We tried to append its path to the sys.path variable but with no success.
We also tried to use os.walk to insert all paths into PATH variable but still we get the same error, even though we checked PATH does contain the path to find the modules.
We are using Python 2.7 on Linux Ubuntu 13.10.
Is there anything we could be doing wrong?
Thanks in advance,
Getting imports right when running a script that lives within a package is tricky. You can read this section of the (sadly deferred) PEP 395 for a description of a bunch of ways that don't work to run such a script.
Given a file system hierarchy like:
top_level/
    my_package/
        __init__.py
        sub_package/
            __init__.py
            module_a.py
            module_b.py
            sub_sub_package/
                __init__.py
                module_c.py
        scripts/
            __init__.py
            my_script.py
            script_subpackage/
                __init__.py
                script_module.py
There are only a few ways to make running my_script.py work right.
The first would be to put the top_level folder into the PYTHONPATH environment variable, or use a .pth file to achieve the same thing. Or, once the interpreter is running, insert that folder into sys.path (but this can get ugly).
Note that you're adding top_level to the path, not my_package! I suspect this is what you've got messed up in your current attempts at this solution. It's very easy to get wrong.
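As a concrete sketch of the sys.path variant (the relative path is an assumption based on my_script.py living in top_level/my_package/scripts/):
import os
import sys

# top_level is two directories above this file -- add *that* folder, not my_package.
TOP_LEVEL = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", ".."))
if TOP_LEVEL not in sys.path:
    sys.path.insert(0, TOP_LEVEL)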
Then, absolute imports like import my_package.sub_package.module_a will mostly work correctly. (Just don't try importing my_package.scripts.my_script itself while it is running as the __main__ module, or you'll get a weird duplicate copy of the module.)
However, absolute imports will always be more verbose than relative imports, since you always need to specify the full path, even if you're importing a sibling module (or "niece" module, like module_c from module_a). With absolute imports, the way to get module_c is always the big, ugly mouthful of code from my_package.sub_package.sub_sub_package import module_c regardless of what module is doing the importing.
For that reason, using relative imports is often more elegant. Alas, they're hard to get to work from a script. The only ways are:
Run my_script from the top_level folder with the -m flag (e.g. python -m my_package.scripts.my_script) and never by filename.
It won't work if you're in a different folder, or if you use a different method to run the script (like pressing F5 in an IDE). This is somewhat inflexible, but there's not really any way to make it easier (until PEP 395 gets undeferred and implemented).
Set up sys.path like for absolute imports (e.g. add top_level to PYTHONPATH or something), then use a PEP 366 __package__ string to tell Python what the expected package of your script is. That is, in my_script.py you'd want to put something like this above all your relative imports:
if __name__ == "__main__" and __package__ is None:
    __package__ = "my_package.scripts"
This will require updating if you reorganize your file organization and move the script to a different package (but that's probably less work than updating lots of absolute imports).
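Putting the pieces together, a rough sketch of what the top of my_script.py could look like (the path arithmetic assumes the layout above, and the explicit import my_package line is there because, on Python 2.7, the parent package has to be loaded before a relative import from it will resolve):
import os
import sys

if __name__ == "__main__" and __package__ is None:
    # Only needed if top_level isn't already on sys.path / PYTHONPATH.
    sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..")))
    import my_package  # Python 2 wants the parent package loaded first
    __package__ = "my_package.scripts"

from ..sub_package import module_a  # relative imports now resolve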
Once you've implemented one of those solutions, your imports can get simpler. Importing module_c from module_a becomes from .sub_sub_package import module_c. In my_script, relative imports like from ..sub_package import module_a will just work.
I know this is an old post, but I am still going to post my solution.
I had a similar issue. I just added the parent directory to sys.path with the following lines before importing the package:
import os
import sys
sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
from lib import create_graph
The way imports work is slightly different in Python 2 and 3. First Python 3 and the sane way (which you seem to expect). In Python 3, all imports are relative to the folders in sys.path (see here for more about the module search path). Python doesn't use $PATH, by the way.
So you can import anything from anywhere without worrying too much.
In Python 2, imports are first tried relative to the current package and only then absolute (implicit relative imports). The document about packages contains an example layout and some import statements which might be useful for you.
The section "Intra-package References" contains information about how to import between packages.
From all the above, I think that your sys.path is wrong. Make sure the folder which contains algorithms (i.e. not algorithms itself but its parent) is in sys.path.
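Applied to the layout in the question, a minimal sketch of what that means for webtrack/teste.py (assuming teste.py is meant to be run directly):
import os
import sys

# The folder that contains algorithms/ is one level up from webtrack/,
# where this script lives.
PROJECT_ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), ".."))
if PROJECT_ROOT not in sys.path:
    sys.path.insert(0, PROJECT_ROOT)

from algorithms.neighborhood.neighbor.connector import NeighborhoodConnector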
Just set __package__ = None in every .py file. It will set up the whole package hierarchy automatically.
After that you may freely use absolute module names for import.
from algorithms.neighborhood.neighbor.connector import NeighborhoodConnector
Related
I cannot find a way to set things up so that both pylint and the execution of the code (within VSCode or from the command line) work.
There are some similar questions but none seems to apply to my project structure with a src directory under which there will be multiple packages. Here's the simplified project structure:
.
├── README.md
├── src
│   ├── rssita
│   │   ├── __init__.py
│   │   ├── feeds.py
│   │   ├── rssita.py
│   │   └── termcolors.py
│   └── zanotherpackage
│       ├── __init__.py
│       └── anothermodule.py
└── tests
    ├── __init__.py
    └── test_feeds.py
From what I understand, rssita is one of my packages (because of the __init__.py file) with some modules under it, among which the rssita.py file contains the following imports:
from feeds import RSS_FEEDS
from termcolors import PC
The rssita.py code as shown above runs fine both from within VSCode and from the command line (python src/rssita/rssita.py) from the project root, but at the same time pylint (both from within VSCode and from the command line, pylint src or pylint src/rssita) flags the two imports as not found.
If I modify the code as follows:
from rssita.feeds import RSS_FEEDS
from rssita.termcolors import PC
pylint will then be happy but the code will not run anymore since it would not find the imports.
What's the cleanest fix for this?
As far as I'm concerned pylint is right, your setup / PYTHONPATH is screwed up: in Python 3, all imports are absolute by default, so
from feeds import RSS_FEEDS
from termcolors import PC
should look for top-level packages called feeds and termcolors which I don't think exist.
python src/rssita/rssita.py
That really ain't the correct invocation, it's going to set up a really weird PYTHONPATH in order to run a random script.
The correct imports should be package-relative:
from .feeds import RSS_FEEDS
from .termcolors import PC
Furthermore, if you intend to run a package, it should either be a runnable package with a __main__.py:
python -m rssita
or you should run the sub-package as a module:
python -m rssita.rssita
Because you're using a src layout, you'll either need to create a pyproject.toml so you can use an editable install, or you'll have to set PYTHONPATH=src before you run the command. This ensures the packages are visible at the top level of the PYTHONPATH, and thus correctly importable. Though I'm not a specialist in the interaction of src layouts & runnable packages, so there may be better solutions.
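As a rough illustration of the __main__.py option (the main function below is purely hypothetical; use whatever entry point rssita.py actually exposes):
# src/rssita/__main__.py -- sketch only
from .rssita import main  # 'main' is a hypothetical entry-point function

if __name__ == "__main__":
    main()
With that in place, and the package importable (editable install or PYTHONPATH=src), python -m rssita runs the package.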
I have a project structure as follows:
python-test » tree
.
├── asdf
│   ├── __init__.py
│   ├── mycode2.py
│   ├── mycode.py
│   └── scripts
│       └── __init__.py
├── __init__.py
└── scripts
    ├── __init__.py
    └── mymod.py
asdf/mycode.py contains:
import scripts.mymod
and scripts/mymod.py contains:
print('hello world')
All the __init__.pys are empty.
I would like to import scripts.mymod.
I am running asdf/mycode.py which fails because it is looking inside asdf/scripts instead of the root scripts folder for mymod.
> PYTHONPATH=: python asdf/mycode.py
Traceback (most recent call last):
  File "asdf/mycode.py", line 1, in <module>
    import scripts.mymod
ModuleNotFoundError: No module named 'scripts.mymod'
As a workaround, I can manually modify the path, which I have done in asdf/mycode2.py.
import sys
sys.path = [p for p in sys.path if 'asdf' not in p]
import scripts.mymod
This works correctly:
> PYTHONPATH=: python asdf/mycode2.py
hello world
Is there a way to import scripts.mymod in asdf/mycode.py without modifying the sys.path manually, and without changing the PYTHONPATH which has to be set to :?
TL;DR
Python imports are weird and cumbersome, and you are probably best off either creating an entrypoint at the top of the directory tree or using a tool to handle the path manipulations.
Python Imports
Python has two different ways of handling imports. The first is the standard one, e.g. import numpy as np, which is resolved against the directories on sys.path (seeded from the PYTHONPATH environment variable) in order until the requested module is found. Included in that path is the directory of the file that is currently being executed, which you can see in your example in that you need to manually remove it from sys.path.
The other way that Python handles imports is via relative imports, which always lead with a dot, like from . import a or from .. import b. Unfortunately, relative imports only work if the file they are in is not being run directly (the typical case is two files within a package where one of them imports an object from the other, but both of them are intended to be imported by an external script). This is because Python uses the module's __name__ and __package__ globals to resolve the relative path, and when a file is run directly from a shell __name__ is overridden as "__main__".
Consider a file structure:
.
├── a.py
└── b.py
where a.py is
import b
and b.py is
print(__name__)
If you run:
python3 b.py # prints "__main__"
python3 a.py # prints "b"
So, if you want to be able to import scripts/mymod.py from asdf/mycode2.py, you could change the import to:
from ..scripts import mymod
But if you do that, you will not be able to run the asdf/mycode2.py file directly; you will need to create a third file somewhere else that imports asdf/mycode2.py and run that third script instead (such files are called entrypoints). For example:
python-test » tree
.
├── asdf
│   ├── __init__.py
│   ├── mycode2.py
│   ├── mycode.py
│   └── scripts
│       └── __init__.py
├── __init__.py
├── entrypoint.py
└── scripts
    ├── __init__.py
    └── mymod.py
where entrypoint.py is
import asdf.mycode2
An Alternative Approach
The alternative approach would be to develop a tool which handles manipulating Python's sys.path to allow relative imports even when the current file is being run as "__main__". Full disclosure: I am currently working on such a tool, based on Node.js's require function, called repyrer, which is pip-installable as
pip3 install repyrer
It allows you to do this kind of thing:
from repyrer import repyre
mymod = repyre('../scripts/mymod')
which will work even if you run the file directly.
Hope that helps!
I have created a python class as well as a python script, which is supposed to call this class.
The entire code lies inside a Code folder, which contains a Classes and a Scripts folder.
The class is stored in:
Classes
> myClass.py
Whereas the script is stored in:
Scripts
> myScript.py
I try to call the class within my script, using:
from ..Classes.myClass import MY_CLASS
Now, the error message that I receive is: ImportError: attempted relative import with no known parent package
I think this is strange, as the two dots should indicate to python that my parent directory is one level up the hierarchy. But apparently I am missing out on something crucial here. Is there an easy fix to this problem, or do I actually have to include my Classes folder inside my Scripts folder?
Place your modules in a package:
my_app
├── Scripts
│   ├── __init__.py
│   └── myScript.py
├── Classes
│   ├── __init__.py
│   └── myClass.py
└── __init__.py
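With that layout, a sketch of Scripts/myScript.py and how to run it (MY_CLASS is the name from the question; the -m invocation assumes you are in the directory that contains my_app):
# my_app/Scripts/myScript.py
from ..Classes.myClass import MY_CLASS

# Run it as a module so the package context exists:
#   python -m my_app.Scripts.myScript
# Running it by filename (python my_app/Scripts/myScript.py) would still fail with
# "attempted relative import with no known parent package".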
I'm using python 3.7 and ran into a relative import error "Attempted relative import beyond top-level package" with the following folder structure:
├── app
│   ├── __init__.py
│   ├── services
│   │   └── item_service.py
│   └── views
│       ├── home.py
│       └── __init__.py
My goal: import variable foo from the top level __init__.py to item_service.py using
from .. import foo
Pylint gives the error when trying this.
However the same exact import statement works in home.py, and if I add an empty __init__.py file to the services folder, the import works.
So my question is, why? Does python require your module to be in a subpackage in order to relatively import a parent package's contents?
For me it got resolved in the following way:
First import the directory (import dir),
then import the views/class from it (from dir import views/class).
To solve:
Add __init__.py to all directories involved.
Add
import sys
sys.path.append("..")
BEFORE importing from the sibling directory.
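For reference, the working state the question itself describes looks roughly like this (a sketch; it assumes foo is defined in app/__init__.py and that the module is run with the package context intact, e.g. python -m app.services.item_service from the project root rather than by file path):
# app/services/item_service.py -- needs app/services/__init__.py to exist
from .. import foo  # imports the app package and pulls foo from app/__init__.py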
I've a project which uses git submodules. In my python file I want to use functions from another python file in the submodule project.
In order to make it work I had to add the __init__.py file to all subfolders in the path. My folder tree is the following:
myproj
├── gitmodules
│   ├── __init__.py
│   ├── __init__.pyc
│   └── mygitsubmodule
│       ├── __init__.py
│       ├── __init__.pyc
│       └── file.py
└── myfile.py
Is there any way to make it work without touching mygitsubmodule ?
Thanks
You can add to sys.path, in the file where you want to be able to access the module, something like:
import sys
sys.path.append("/home/me/myproj/gitmodules")
import mygitsubmodule
This example adds the path as a raw string to make it clear what's happening. You should really use the more sophisticated, system-independent methods (os.path.dirname, os.path.join and friends) to determine and assemble the path.
Also, I have found it better, when I used this method, to use sys.path.insert(1, ...), as some functionality seems to rely on sys.path[0] being the starting directory of the program.
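A sketch of that more portable version, assuming the snippet lives in myproj/myfile.py from the tree above:
import os
import sys

# Build the path relative to this file instead of hard-coding /home/me/...
GITMODULES_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "gitmodules")
sys.path.insert(1, GITMODULES_DIR)

import mygitsubmodule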
I prefer to avoid modifying sys.path.
The problem is that, when using git submodules, the submodule is a project directory, not a Python package. There is a "gap" between your module and that package, so you can't import it.
Suppose you have created a submodule named foo_project, and there is a foo package inside.
.
├── foo_project
│   ├── README.rst
│   └── foo
│       └── __init__.py
└── main.py
My solution is to create a soft link that exposes that package to your module:
ln -s foo_project/foo foo
.
├── foo_project
│   ├── README.rst
│   └── foo
│       └── __init__.py
├── foo -> foo_project/foo
└── main.py
Now you can import foo in the main.py.
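After creating the link, a quick sketch to check it from main.py (whatever foo actually exports is up to you):
# main.py
import foo
print(foo.__file__)  # resolves through the foo -> foo_project/foo symlink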
For reference,
from submodulefolder.file import func_name
or
import submodulefolder.file as lib_name
where file excludes the .py extension, seems to work, relative to the running script's location, without modifying the subfolder / git submodule with an __init__.py since Python 3.3+ (implicit namespace packages),
as shown here.
Tested on py3.8.5 Linux native and py3.7.8 Anaconda on Windows, both in Spyder's IPython console, as well as natively on Linux via a terminal.