I can't find a way to write my code so that both pylint and the execution of the code (within VSCode or from the command line) work.
There are some similar questions but none seems to apply to my project structure with a src directory under which there will be multiple packages. Here's the simplified project structure:
.
├── README.md
├── src
│   ├── rssita
│   │   ├── __init__.py
│   │   ├── feeds.py
│   │   ├── rssita.py
│   │   └── termcolors.py
│   └── zanotherpackage
│       ├── __init__.py
│       └── anothermodule.py
└── tests
    ├── __init__.py
    └── test_feeds.py
From what I understand, rssita is one of my packages (because of the __init__.py file) with some modules under it, amongst which the rssita.py file contains the following imports:
from feeds import RSS_FEEDS
from termcolors import PC
The rssita.py code as shown above runs fine both from within VSCode and from the command line (python src/rssita/rssita.py) when launched from the project root, but at the same time pylint (both within VSCode and on the command line, as pylint src or pylint src/rssita) flags the two imports as not found.
If I modify the code as follows:
from rssita.feeds import RSS_FEEDS
from rssita.termcolors import PC
pylint will then be happy, but the code will no longer run, since it cannot find the imports.
What's the cleanest fix for this?
As far as I'm concerned pylint is right: your setup / PYTHONPATH is broken. In Python 3, all imports are absolute by default, so
from feeds import RSS_FEEDS
from termcolors import PC
should look for top-level packages called feeds and termcolors which I don't think exist.
python src/rssita/rssita.py
That really isn't the correct invocation: it sets up a really weird PYTHONPATH in order to run a random script.
The correct imports should be package-relative:
from .feeds import RSS_FEEDS
from .termcolors import PC
Furthermore if you intend to run a package, that should be either a runnable package using __main__:
python -m rssita
or you should run the sub-package as a module:
python -m rssita.rssita
Because you're using a src layout, you'll either need to create a pyproject.toml so you can use an editable install, or you'll have to set PYTHONPATH=src before you run the command. This ensures the packages are visible at the top level of the PYTHONPATH, and thus correctly importable. Though I'm not a specialist in the interaction of src layouts and runnable packages, so there may be better solutions.
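To illustrate the PYTHONPATH=src route, here is a minimal throwaway reproduction of the src layout (the file contents are hypothetical stand-ins, not the real project code) showing that the package-absolute imports then resolve:

```shell
# build a tiny stand-in for the project layout (hypothetical contents)
mkdir -p demo/src/rssita
touch demo/src/rssita/__init__.py
echo 'RSS_FEEDS = ["https://example.com/feed"]' > demo/src/rssita/feeds.py
printf 'from rssita.feeds import RSS_FEEDS\nprint(len(RSS_FEEDS))\n' > demo/src/rssita/rssita.py
# run from the project root: putting src/ on PYTHONPATH makes rssita a top-level package
(cd demo && PYTHONPATH=src python3 -m rssita.rssita)   # prints 1
```

The same invocation works without PYTHONPATH once the project is pip-installed in editable mode, because the install makes the src/ packages importable.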
Related
My folder structure is as follows
./fff
├── __init__.py
├── fg
│   ├── __init__.py
│   └── settings
│       ├── __init__.py
│       └── settings.py
└── obng
    └── test.py
I want to import the settings.py inside fg/settings as a module into the test.py
I have added the line
from ..fg.settings import settings
But when I run it, it gives me the following error
Traceback (most recent call last):
File "/mnt/d/Repos/fff/obng/test.py", line 1, in <module>
from ..fg.settings import settings
ImportError: attempted relative import with no known parent package
This style of relative importing is supported as per https://docs.python.org/3/reference/import.html#package-relative-imports
What am I doing wrong here?
It is a matter of how you run your project: you should run it from the parent directory of the top-level package, as in
$ cd /path/to/parent-of-fff
$ python -m fff.obng.test # note: no .py extension
Then relative imports will be resolved correctly. Running a script directly from its own folder is an antipattern.
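As a sketch, recreating the fff layout with hypothetical file contents (and adding an __init__.py to obng as well) shows the relative import resolving when the module is run with -m from the parent directory:

```shell
# recreate the fff layout with stand-in contents
mkdir -p fff/fg/settings fff/obng
touch fff/__init__.py fff/fg/__init__.py fff/fg/settings/__init__.py fff/obng/__init__.py
echo 'VALUE = 42' > fff/fg/settings/settings.py
printf 'from ..fg.settings import settings\nprint(settings.VALUE)\n' > fff/obng/test.py
# run as a module from the parent of fff, not by filename
python3 -m fff.obng.test   # prints 42
```

Running the same file as python fff/obng/test.py reproduces the "attempted relative import with no known parent package" error, because then __package__ is not set.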
Normally you can't use relative imports when you run a Python module as the main module (python filename.py), but there is a hack using __package__ to achieve this. Remember, __package__ is how Python resolves relative imports:
1- Create a file called __init__.py in your root directory, fff. (I can see that you have it; I mention it for completeness.)
2- Put this code on top of your test.py module:
if __name__ == '__main__' and not __package__:
    import sys
    sys.path.insert(0, <path to parent directory of root directory - fff>)
    __package__ = 'fff.obng'
Note: sys.path is where python searches for modules to import them.
3- Now place your relative import statement after the code above (inside the if statement, because we don't want to interfere when your test.py is being imported):
from ..fg.settings import settings
Now you can call your test.py and it will run without problems. I don't recommend using these hacks, but showing the flexibility of the language, and doing exactly what you want to do, is beneficial in some cases.
Other good solutions: an absolute import, which I think is easier and cleaner than this. In addition, take a look at Mr_and_Mrs_D's answer; another good solution is to run your module with the -m command-line flag.
Relative imports are based on the name of the current module. When running
python fff/obng/test.py
the name of test.py will be __main__ and the import will not work.
What will work is having another script called "test.py" outside the fff package that imports fff.obng.test:
fff_top
├── fff
│   ├── fg
│   │   ├── __init__.py
│   │   └── settings
│   │       ├── __init__.py
│   │       └── settings.py
│   ├── __init__.py
│   └── obng
│       ├── __init__.py
│       └── test.py
└── test.py
with fff_top/test.py:
import fff.obng.test
Then, running the "external" test.py should be ok:
python fff_top/test.py
Alternatively, I would recommend dropping relative imports entirely. One way to do this is to use a virtual environment for every package you write, created for example with the venv module:
python -m venv venv
Then, add a setup.py in the root folder with the content:
from setuptools import setup, find_packages
setup(name="fff", packages=find_packages())
and change the imports in obng/test.py:
from fff.fg.settings import settings
Finally, activate your virtual environment:
source venv/bin/activate
and install your package in editable mode:
pip install -e .
Then, after you have completed all the steps above:
python fff/obng/test.py
should work.
In Linux, you could create a symbolic link:
$ ln -s ../folder1 mymodules
$ python
>>> import mymodules.myfancymodule as fancy
I'm in the process of structuring my PyQt5 application to more established conventions. Now it looks like this
MyProj
├── my_qt_tool
│   ├── __init__.py
│   ├── class1.py
│   ├── my_qt_tool.py
│   ├── wizard1.py
│   ├── resources
│   │   └── templates
│   │       └── tool.conf.template
│   └── ui
│       ├── __init__.py
│       ├── mainwindow.py
│       ├── mainwindow.ui
│       ├── wizard_01_start.py
│       ├── wizard_01_start.ui
│       ├── ...
├── my_qt_tool.spec # for PyInstaller
├── bin
│   └── generate_ui_code.py # for compiling Qt *.ui to *.py
├── dist
│   └── my_qt_tool
├── environment.yml # conda environment requirements.
├── LICENSE
└── README.md
So MyProj is the top-level git repo, my_qt_tool is the package of my application, with a subpackage for UI specific code, my_qt_tool.py contains the "main" code which runs the GUI, class1.py handles business logic and wizard1.py is just some extra class for a GUI wizard.
Q1: Is this project structure canonical? Is the main function where it should be? Should *.ui files be separated to resources?
Now, after some haggling with imports, I added my_qt_tool as a source directory in PyCharm to make the imports work, and created a run configuration for my_qt_tool.py with working dir MyProj/my_qt_tool.
Q2: Technically, I want the working dir to be MyProj, but then I would have to reference resources/templates/tool.conf.template with my_qt_tool/resources.., which seems yucky... or is this the way to do it?
Now the imports in my_qt_tool look like this:
from class1 import DataModel
from ui.mainwindow import Ui_MainWindow
...
so no relative imports or the like, because everything is in the same package, right? (Again: to make this work, I had to add my_qt_tool as source directory in my PyCharm project settings...)
Q3: Okay, now the thing that doesn't work. Running PyInstaller on the spec file, which is pretty much stock with Analysis(['my_qt_tool/my_qt_tool.py'], ..., the resulting binary fails to start with the error message: ModuleNotFoundError: No Module named 'class1'. How can I fix this up?
Q1
If the project is going to get larger, you may create module-specific folders, each module having its py and GUI files within; structure it as MVC project folders.
For the MVC folder structure see https://www.tutorialsteacher.com/mvc/mvc-folder-structure, and here is how a model-view architecture can be implemented: https://www.learnpyqt.com/courses/model-views/modelview-architecture/.
Q2
Read resources/templates/tool.conf.template when the application is bootstrapped, instead of referencing it statically. This could be done in generate_ui_code.py, loading all configs as part of the app's bootstrap.
so no relative imports or the like, because everything is in the same package, right? (Again: to make this work, I had to add my_qt_tool as source directory in my PyCharm project settings...)
There is no need to add my_qt_tool as a source directory if the app is bootstrapped properly.
Q3
add these 2 lines top of spec file
import sys
sys.setrecursionlimit(5000)
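Beyond that, a hedged sketch of the relevant spec fragment (this assumes PyInstaller's standard Analysis API; the pathex and hiddenimports values are guesses for this layout, and whether class1 actually needs listing depends on your setup):

```python
# my_qt_tool.spec (fragment) -- a sketch, not a verified fix
a = Analysis(
    ['my_qt_tool/my_qt_tool.py'],
    pathex=['my_qt_tool'],                      # so sibling modules like class1 can be found
    hiddenimports=['class1', 'ui.mainwindow'],  # modules PyInstaller's static analysis may miss
)
```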
If you still encounter the class1 issue, try importing it in the script PyInstaller is called on; in your case that's my_qt_tool.py.
Fix the PyInstaller issue first, and then consider refactoring the folder structure along model-view conventions.
Here are some fairly large project examples:
https://wiki.python.org/moin/PyQt/SomeExistingApplications
https://github.com/topics/pyqt5-desktop-application
In a python project, I have the following directory structure
├── cooccurrence
│   ├── cooccurrence.py
│   ├── __init__.py
├── README.md
└── tests
    ├── __init__.py
    └── test_coccurrence.py
This leads to a rather ceremonial line in my test source files:
from cooccurrence.cooccurrence import CoCreate
How would I simplify this overall setup if I only needed a single module, and conversely, what project structure should I have to manage multiple modules under the same package?
To test, I simply use python -m unittest discover -v, and a solution that can also seamlessly enable using the project within PyCharm would be much appreciated.
You can import names in __init__.py so they are available at the package level. For example, in cooccurrence/__init__.py you can do:
from .cooccurrence import CoCreate
and then in your test file:
from cooccurrence import CoCreate
That is the Pythonic way of doing it.
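A self-contained sketch of this package-level re-export, using a temporary directory as a stand-in for the project root (the class body is a hypothetical placeholder):

```python
import pathlib
import sys
import tempfile

# build a throwaway package mirroring the cooccurrence layout
root = pathlib.Path(tempfile.mkdtemp())
pkg = root / "cooccurrence"
pkg.mkdir()
(pkg / "cooccurrence.py").write_text("class CoCreate:\n    pass\n")
# re-export at package level via an explicit relative import (Python 3)
(pkg / "__init__.py").write_text("from .cooccurrence import CoCreate\n")

sys.path.insert(0, str(root))
from cooccurrence import CoCreate  # the short, package-level import now works

print(CoCreate.__name__)
```

The test file (and any other consumer) can now use the short form without spelling out the inner module.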
Put the following code line in cooccurrence/__init__.py:
from cooccurrence import *
[Note]:
Tested on Python 2.7. This implicit relative import does not work on Python 3, where you would write from .cooccurrence import * instead.
Consider this application:
.
├── LICENSE
├── MANIFEST.in
├── program
│   ├── apple.py
│   ├── __init__.py
│   ├── __main__.py
│   ├── nonfruit.py
│   ├── pear.py
│   ├── strawberry.py
│   └── vegetables
│       ├── carrot.py
│       ├── __init__.py
│       └── lettuce.py
├── README.md
├── setup.cfg
└── setup.py
__main__.py is the file that users should run to use my program. I am distributing my program via PyPI and so I want to be able to install it via pip as well. Because of that, I created a setup.py file with an entry point:
entry_points={
    'console_scripts': ['pg=program.__main__:main'],
}
The problem I'm facing is that there are several imports in my program, and these result in the situation that my program runs 'locally' (by executing python ./__main__.py) but not from the installation (by running pg). Or, depending on the way I import, the other way around.
__main__.py imports nonfruit.py:
from nonfruit import Nonfruit
nonfruit.py imports vegetables/carrot.py:
import vegetables.carrot
ca = vegetables.carrot.Carrot()
I would like to hear some advice in structuring my project regarding imports, so that it runs both locally and from setuptools installation. For example, should I use absolute imports or relative imports? And should I use from X import Y or import X.Y?
I found a solution on Jan-Philip Gehrcke's website.
The instructions are written for Python 3, but I applied them to Python 2.7 with success. The problem I was having originated from my directory becoming a package. Jan advises creating one file to run it from source (bootstrap-runner.py) and one file to run it from installation (bootstrap/__main__.py). Furthermore, he advises using explicit relative imports:
from .X import Y
This will probably be a good guideline in the next applications I'm writing.
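To illustrate the guideline, here is a minimal stand-in for the program package (file contents are hypothetical) using explicit relative imports; it runs via python -m exactly as it would through the pg console-script entry point, since both go through program.__main__:main:

```shell
# minimal stand-in for the program package
mkdir -p program/vegetables
touch program/__init__.py program/vegetables/__init__.py
printf 'class Carrot:\n    pass\n' > program/vegetables/carrot.py
printf 'class Nonfruit:\n    pass\n' > program/nonfruit.py
cat > program/__main__.py <<'EOF'
from .nonfruit import Nonfruit         # explicit relative import
from .vegetables.carrot import Carrot

def main():
    print(Nonfruit.__name__, Carrot.__name__)

if __name__ == '__main__':
    main()
EOF
python3 -m program   # prints: Nonfruit Carrot
```

With python ./__main__.py the relative imports would fail, which is exactly why the runner-script / entry-point split is recommended.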
Given the following python project, created in PyDev:
├── algorithms
│   ├── __init__.py
│   └── neighborhood
│       ├── __init__.py
│       ├── neighbor
│       │   ├── connector.py
│       │   ├── __init__.py
│       │   ├── manager.py
│       │   └── references.py
│       ├── neighborhood.py
│       ├── tests
│       │   ├── fixtures
│       │   │   └── neighborhood
│       │   └── __init__.py
│       └── web
│           ├── __init__.py
│           └── service.py
├── configuration
│   ├── Config.py
│   └── __init__.py
├── __init__.py
└── webtrack
    ├── teste.py
    ├── .gitignore
    ├── __init__.py
    └── manager
        ├── Data.py
        ├── ImportFile.py
        └── __init__.py
We've been trying with no success to import modules from one folder to another, such as:
from algorithms.neighborhood.neighbor.connector import NeighborhoodConnector
Which yields the result:
Traceback (most recent call last):
File "teste.py", line 49, in <module>
from algorithms.neighborhood.neighbor.connector import NeighborhoodConnector
ImportError: No module named algorithms.neighborhood.neighbor.connector
We tried to append its path to the sys.path variable but with no success.
We also tried to use os.walk to insert all paths into PATH variable but still we get the same error, even though we checked PATH does contain the path to find the modules.
We are using Python 2.7 on Linux Ubuntu 13.10.
Is there anything we could be doing wrong?
Thanks in advance,
Getting imports right when running a script that lives within a package is tricky. You can read this section of the (sadly deferred) PEP 395 for a description of a bunch of ways that don't work to run such a script.
Given a file system hierarchy like:
top_level/
    my_package/
        __init__.py
        sub_package/
            __init__.py
            module_a.py
            module_b.py
            sub_sub_package/
                __init__.py
                module_c.py
        scripts/
            __init__.py
            my_script.py
            script_subpackage/
                __init__.py
                script_module.py
There are only a few ways to make running my_script.py work right.
The first would be to put the top_level folder into the PYTHONPATH environment variable, or use a .pth file to achieve the same thing. Or, once the interpreter is running, insert that folder into sys.path (but this can get ugly).
Note that you're adding top_level to the path, not my_package! I suspect this is what you've got messed up in your current attempts at this solution. It's very easy to get wrong.
Then, absolute imports like import my_package.sub_package.module_a will mostly work correctly. (Just don't try importing my_package.scripts.my_script itself while it is running as the __main__ module, or you'll get a weird duplicate copy of the module.)
However, absolute imports will always be more verbose than relative imports, since you always need to specify the full path, even if you're importing a sibling module (or "niece" module, like module_c from module_a). With absolute imports, the way to get module_c is always the big, ugly mouthful of code from my_package.sub_package.sub_sub_package import module_c regardless of what module is doing the importing.
For that reason, using relative imports is often more elegant. Alas, they're hard to get to work from a script. The only ways are:
Run my_script from the top_level folder with the -m flag (e.g. python -m my_package.scripts.my_script) and never by filename.
It won't work if you're in a different folder, or if you use a different method to run the script (like pressing F5 in an IDE). This is somewhat inflexible, but there's not really any way to make it easier (until PEP 395 gets undeferred and implemented).
Set up sys.path like for absolute imports (e.g. add top_level to PYTHONPATH or something), then use a PEP 366 __package__ string to tell Python what the expected package of your script is. That is, in my_script.py you'd want to put something like this above all your relative imports:
if __name__ == "__main__" and __package__ is None:
    __package__ = "my_package.scripts"
This will require updating if you reorganize your file organization and move the script to a different package (but that's probably less work than updating lots of absolute imports).
Once you've implemented one of those solutions, your imports can get simpler. Importing module_c from module_a becomes from .sub_sub_package import module_c. In my_script, relative imports like from ..sub_package import module_a will just work.
I know this is an old post but still I am going to post my solution.
Had a similar issue. Just added the path with the following lines before importing the package:
import os
import sys

sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
from lib import create_graph
The way imports work is slightly different in Python 2 and 3. First Python 3 and the sane way (which you seem to expect). In Python 3, all imports are relative to the folders in sys.path (see here for more about the module search path). Python doesn't use $PATH, by the way.
So you can import anything from anywhere without worrying too much.
In Python 2, imports are relative and sometimes absolute. The document about packages contains an example layout and some import statements which might be useful for you.
The section "Intra-package References" contains information about how to import between packages.
From all the above, I think that your sys.path is wrong. Make sure the folder which contains algorithms (i.e. not algorithms itself but its parent) is in sys.path.
Just set __package__ = None in every .py file. It will set up the whole package hierarchy automatically.
After that you may freely use absolute module names for import.
from algorithms.neighborhood.neighbor.connector import NeighborhoodConnector