PyQt5 Project Structure and PyInstaller ModuleNotFoundError - python

I'm in the process of restructuring my PyQt5 application to follow more established conventions. It now looks like this:
MyProj
├── my_qt_tool
│   ├── __init__.py
│   ├── class1.py
│   ├── my_qt_tool.py
│   ├── wizard1.py
│   ├── resources
│   │   └── templates
│   │       └── tool.conf.template
│   └── ui
│       ├── __init__.py
│       ├── mainwindow.py
│       ├── mainwindow.ui
│       ├── wizard_01_start.py
│       ├── wizard_01_start.ui
│       └── ...
├── my_qt_tool.spec       # for PyInstaller
├── bin
│   └── generate_ui_code.py   # for compiling Qt *.ui to *.py
├── dist
│   └── my_qt_tool
├── environment.yml       # conda environment requirements
├── LICENSE
└── README.md
So MyProj is the top-level git repo, my_qt_tool is my application's package with a subpackage for UI-specific code, my_qt_tool.py contains the "main" code which runs the GUI, class1.py handles the business logic, and wizard1.py is just an extra class for a GUI wizard.
Q1: Is this project structure canonical? Is the main function where it should be? Should the *.ui files be moved into resources?
Now, after some haggling with imports, I added my_qt_tool as a source directory in PyCharm to make the imports work, and created a run configuration for my_qt_tool.py with the working dir MyProj/my_qt_tool.
Q2: Technically, I want the working dir to be MyProj, but then I would have to reference resources/templates/tool.conf.template with my_qt_tool/resources..., which seems yucky... or is this the way to do it?
Now the imports in my_qt_tool look like this:
from class1 import DataModel
from ui.mainwindow import Ui_MainWindow
...
so no relative imports or the like, because everything is in the same package, right? (Again: to make this work, I had to add my_qt_tool as source directory in my PyCharm project settings...)
Q3: Okay, now the thing that doesn't work. Running PyInstaller on the spec file, which is pretty much stock with Analysis(['my_qt_tool/my_qt_tool.py'], ...), the resulting binary fails to start with the error message ModuleNotFoundError: No module named 'class1'. How can I fix this up?

Q1
If the project is going to get larger, you may create module-specific folders where each module keeps its .py and GUI files together; structure it as MVC project folders.
For the MVC folder structure see https://www.tutorialsteacher.com/mvc/mvc-folder-structure, and here is how a model-view architecture can be implemented: https://www.learnpyqt.com/courses/model-views/modelview-architecture/.
Q2
Read resources/templates/tool.conf.template when the application is bootstrapped instead of referencing it statically. This could be done in generate_ui_code.py to load all configs as part of app initialization.
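For example, a minimal sketch that resolves the template relative to the package, so the working directory no longer matters (load_template is a hypothetical helper, not from the original project; the code assumes it lives in a module inside my_qt_tool):
from pathlib import Path

# Resolve the template relative to this file's package, not the working dir
PACKAGE_DIR = Path(__file__).resolve().parent            # .../MyProj/my_qt_tool
TEMPLATE_PATH = PACKAGE_DIR / "resources" / "templates" / "tool.conf.template"

def load_template() -> str:
    """Return the raw template text, regardless of where the process started."""
    return TEMPLATE_PATH.read_text()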
"so no relative imports or the like, because everything is in the same package, right? (Again: to make this work, I had to add my_qt_tool as source directory in my PyCharm project settings...)"
There is no need to add my_qt_tool as a source directory if the app is bootstrapped properly.
Q3
Add these two lines at the top of the spec file:
import sys
sys.setrecursionlimit(5000)
If you still encounter the class1-related issue, try importing it in the script PyInstaller is called on; in your case that's my_qt_tool.py.
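If that still fails, another common fix (not part of the original answer) is to extend the search path and force-include the missing modules in the spec's Analysis call; a minimal sketch following the question's layout, where the exact hiddenimports and datas entries are assumptions:
a = Analysis(
    ['my_qt_tool/my_qt_tool.py'],
    pathex=['my_qt_tool'],                          # lets PyInstaller resolve `from class1 import ...`
    hiddenimports=['class1', 'wizard1'],            # force-include modules the analysis missed
    datas=[('my_qt_tool/resources', 'resources')],  # bundle the config templates as well
)
pyz = PYZ(a.pure)
exe = EXE(pyz, a.scripts, name='my_qt_tool')        # onedir-style sketch; COLLECT omitted for brevity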
Fix the PyInstaller issue first, and then consider refactoring the folder structure toward model-view conventions.
Here are some fairly large project examples:
https://wiki.python.org/moin/PyQt/SomeExistingApplications
https://github.com/topics/pyqt5-desktop-application

Related

Importing a file to a Django project - errors when running manage.py due to location of script

So, I know the source of my error, and I can fix it in a kind of hacky way, but I want to know the best-practices way of solving it, especially as my hacky way runs into issues when running stuff via the command line and throws errors in my IDE.
So, I have a django project, the folder tree looks like this (edited out irrelevant parts)
├── manage.py
├── simulator
│   ├── events.py
│   ├── parser.py
│   ├── parser_test.py
│   ├── patch.py
│   └── simulator.py
├── VSPOMs
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
└── VSPOMsApp
    ├── admin.py
    ├── apps.py
    ├── models.py
    ├── tests.py
    ├── urls.py
    └── views.py
When I run manage.py, that is obviously running in the . directory. views.py is in the VSPOMsApp directory. If I have the following imports in views.py:
from ..simulator.patch import Patch
from ..simulator.simulator import Simulator
The IDE doesn't throw an error, and this is correct, as it is searching the parent directory for a file called patch in a folder called simulator one dir above. However, when I run manage.py this causes me to get an error:
ImportError: attempted relative import beyond top-level package
This is, as mentioned, because manage.py is not running in the same dir as views.py. I can fix this by importing from
simulator.patch
But this is A) hacky and B) runs into other, worse errors.
In patch.py we have to import from events; however, as manage.py is running in a different directory, for it to work in Django the code has to be:
from simulator.events import ColonisationEvent, ExtinctionEvent
This is not only wrong, as events is in the same dir as patch, but it also doesn't work when I need to run from the command line for testing purposes. Any idea how to fix this? I've tried to look up relative imports and such, but I'm really not sure what the best solution is; adding an __init__.py file in simulator didn't help either.
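For reference, the absolute-import variant does behave consistently when everything is launched from the project root, since that keeps the root on sys.path:
# simulator/patch.py -- absolute import, same form for Django and for tests
from simulator.events import ColonisationEvent, ExtinctionEvent

# From the project root:
#   python manage.py runserver        # Django runs with the root on sys.path
#   python -m simulator.parser_test   # -m keeps the root on sys.path, unlike
#                                     # `python simulator/parser_test.py`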

Import packages from outside django project

Context: The package being considered is an application that was developed by another team, and I want to use the functionality it exposes as part of an API call in my Django project.
Directory layout:
<repo>
├── org_wide_django_code
│   ├── my_django_project
│   │   ├── my_django_project
│   │   ├── manage.py
│   │   ├── requirements.txt
│   │   └── my_application
├── frontend
│   └── application
├── other_team_work
│   ├── README.md
│   ├── __init__.py
│   └── implemented_logic
What is the best way for me to use other_team_work in my django project my_django_project?
Prior reading:
Manipulating PYTHONPATH or sys.path is an option
Setting up a .whl or .egg to install other_team_work (also need to add a setup.py there)
I am not sure there is a best way, since this depends a lot on the internal tooling of your organisation. However, the main thing to pay attention to IMO is release and development cycles. Ideally you want both teams to be able to release new features and functionality without impacting the other (for instance, if other_team makes a change to their code for a third team, you would like this to have no possible impact on your application; similarly, other_team should not need to know how and when you use their module to make changes to their application).
One way to do this would be to have other_team_work packaged and installable (using a setup.py) and then pushed to a package registry (for instance GitLab's package registry or GitHub Packages). This way you can use other_team_work as though it were just another Python package with pip (using the --extra-index-url argument) and pin specific versions in a requirements.txt.
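A hypothetical minimal setup.py, placed next to the other_team_work package, might look like this (name and version are placeholders):
# setup.py at the repository root -- a sketch; adjust metadata to your tooling
from setuptools import setup, find_packages

setup(
    name="other-team-work",   # distribution name; the import name stays other_team_work
    version="0.1.0",
    packages=find_packages(include=["other_team_work", "other_team_work.*"]),
)
Once built and pushed to the registry, your project pins a specific version in requirements.txt and installs it with pip's --extra-index-url pointing at that registry.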
The simplest way (but it can be dangerous) is to add other_team_work to your PYTHONPATH. You can do this in your Django settings.py file, after the BASE_DIR definition.
Example:
import sys
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent.parent  # default Django base dir
OTHER_TEAM_WORK_PARENT_DIR = BASE_DIR.parent.parent  # <repo>, which contains other_team_work
sys.path.append(str(OTHER_TEAM_WORK_PARENT_DIR))
This code adds the repository root (the directory containing other_team_work) to PYTHONPATH, and you can then import other_team_work in your code:
from other_team_work import implemented_logic

python package metadata best practice [duplicate]

A typical directory tree of a python project might look like this.
.
├── src
│   ├── __init__.py
│   ├── main.py
│   ├── module1
│   │   ├── __init__.py
│   │   └── foo.py
│   └── module2
│       ├── __init__.py
│       └── bar.py
├── setup.py
└── venv
    └── ...
setup.py contains package metadata like name, version, description, etc.
In some cases, it is useful to have these values inside the application code. For example with FastAPI, you can provide them to the constructor of the API object so that they are shown in the auto-generated docs. Or with Click, you can add a version option.
To avoid duplication, these values should be defined only once and used in both places. However, I have never found a good way to share these values between setup.py and application code.
Importing one from the other does not seem to work, because they are not part of the same package structure.
What is best practice in this case?
In the code (run-time code, not setup.py) use importlib.metadata (or its back-port importlib-metadata). The only thing you need to duplicate is the name of the project (the name of the "distribution package").
For example with a project named MyLibrary:
import importlib.metadata
PROJECT_NAME = 'MyLibrary'
_DISTRIBUTION_METADATA = importlib.metadata.metadata(PROJECT_NAME)
SUMMARY = _DISTRIBUTION_METADATA['Summary']
VERSION = _DISTRIBUTION_METADATA['Version']
Aside: If it is not possible to hard-code the name of the distribution package, there are ways to find it: https://stackoverflow.com/a/63849982
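To illustrate the FastAPI case from the question, reusing the constants above (a sketch; it assumes FastAPI is installed and the distribution's metadata is available):
from fastapi import FastAPI

# The auto-generated docs pick their values up straight from the package metadata
app = FastAPI(title=PROJECT_NAME, description=SUMMARY, version=VERSION)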

Trouble organizing Python library for import to behave in expected manner

I've had a lot of trouble figuring out a key point about how the import mechanism works, and how this relates to organizing packages.
Suppose I've written two or more unrelated, reusable libraries. (I'll use "library" informally as a collection of code and resources, including tests and possibly data, as opposed to a "package" in the formal Python sense.) Here are two imaginary libraries in a parent directory called "my_libraries":
my_libraries/
├── audio_studio
│   ├── src
│   │   ├── distortion.py
│   │   ├── filter.py
│   │   └── reverb.py
│   └── test
│       └── test_audio.py
└── picasso_graphics
    ├── src
    │   ├── brushes.py
    │   ├── colors.py
    │   └── easel.py
    └── test
        └── test_picasso.py
I'm hoping to accomplish all three of the following, all of which seem to me to be normal practice or expectation:
1. MAIN LIBRARY CODE IN SUBDIRECTORY
For neatness of library organization, I want to put the library's core code in a subdirectory such as "src" rather than at the top-level directory. (My point here isn't to debate whether "src" in particular is a good naming approach; I've read multiple pages pro and con. Some people appear to prefer the form foo/foo, but I think I'd have the same problem I'm describing with that too.)
2. ADD TO $PYTHONPATH JUST ONCE
I'd like to be able to add "my_libraries" to $PYTHONPATH or sys.path just once. If I add a new library to "my_libraries", it's automatically discoverable by my scripts.
3. NORMAL-LOOKING import STATEMENTS
I'd like to be able to import from these libraries into other projects in a normal-looking way, without mentioning the "src" directory:
import picasso_graphics.brushes
OR
from picasso_graphics import brushes
HOW TO DO THIS?
Despite much reading and experimentation, I haven't been able to find a solution which satisfies all three of these criteria. The closest I've gotten is to create a picasso_graphics/__init__.py file containing the following:
import os
import sys

base_dir = os.path.dirname(__file__)
src_dir = os.path.join(base_dir, "src")
sys.path.insert(0, src_dir)
This almost does what I want, but I have to break up the imports into two statements, so that the __init__.py file executes with the first import:
import picasso_graphics
import brushes
Am I making a wrong assumption here about what's possible? Is there a solution which satisfies all three of these criteria?
What you want, Sean, is most likely what is called a namespace project; use a tool called pyscaffold to help with writing the boilerplate. Each project should have a setup.cfg with all your project dependencies. Once you have that, create a virtual environment for your project, then install each as follows (inside the environment):
pip install -e audio-studio
pip install -e picasso-graphics
Installing your project into your virtual environment will cause your imports to behave as you want -- between projects.
This is a bit of overhead to get started, I know, but these are skills you want to have sooner or later. setup.cfg, virtual environments, and pip install -e form a magical pattern that just makes things work where other approaches will drive you mad.
Below is a simple example project I created using pyscaffold. Notice there is a package below src and that src does not have an __init__.py. That is a decision made by the pyscaffold folks to help ease import confusion; you should likely adopt it.
my_libraries/
├── audio-studio
│   ├── AUTHORS.rst
│   ├── CHANGELOG.rst
│   ├── LICENSE.txt
│   ├── README.rst
│   ├── requirements.txt
│   ├── setup.cfg
│   ├── setup.py
│   ├── src
│   │   └── audio_studio
│   │       ├── __init__.py
│   │       └── skeleton.py
│   └── tests
│       ├── conftest.py
│       └── test_skeleton.py
└── picasso-graphics
    ├── AUTHORS.rst
    ├── CHANGELOG.rst
    ├── LICENSE.txt
    ├── README.rst
    ├── requirements.txt
    ├── setup.cfg
    ├── setup.py
    ├── src
    │   └── picasso_graphics
    │       ├── __init__.py
    │       └── skeleton.py
    └── tests
        ├── conftest.py
        └── test_skeleton.py

What import system should I use when I want to run my application both 'from source' and from installing with setuptools?

Consider this application:
.
├── LICENSE
├── MANIFEST.in
├── program
│   ├── apple.py
│   ├── __init__.py
│   ├── __main__.py
│   ├── nonfruit.py
│   ├── pear.py
│   ├── strawberry.py
│   └── vegetables
│       ├── carrot.py
│       ├── __init__.py
│       └── lettuce.py
├── README.md
├── setup.cfg
└── setup.py
__main__.py is the file that users should run to use my program. I am distributing my program via PyPI and so I want to be able to install it via pip as well. Because of that, I created a setup.py file with an entry point:
entry_points = {
    'console_scripts': ['pg=program.__main__:main'],
}
The problem I'm facing is that there are several imports in my program, and these result in the situation that my program does run 'locally' (by executing python ./__main__.py), but not from installation (by running pg). Or, depending on the way I import it, the other way around.
__main__.py imports nonfruit.py:
from nonfruit import Nonfruit
nonfruit.py imports vegetables/carrot.py:
import vegetables.carrot
ca = vegetables.carrot.Carrot()
I would like to hear some advice on structuring my project regarding imports, so that it runs both locally and from a setuptools installation. For example, should I use absolute imports or relative imports? And should I use from X import Y or import X.Y?
I found a solution on Jan-Philip Gehrcke's website.
The instructions are written for use with Python 3, but I applied them to Python 2.7 with success. The problem I was having originated from my directory becoming a package. Jan advises creating one file to run it from source (bootstrap-runner.py) and one file to run it from installation (bootstrap/__main__.py). Furthermore, he advises using explicit relative imports:
from .X import Y
This will probably be a good guideline in the next applications I'm writing.
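Applied to the example project above, that advice would look roughly like this (a sketch, not the linked article's code):
# program/__main__.py
from .nonfruit import Nonfruit

# program/nonfruit.py
from .vegetables import carrot
ca = carrot.Carrot()

# Run from source with the package flag so the relative imports resolve:
#   python -m program
# After `pip install .`, the pg console script resolves them the same way.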
