Here is my project directory structure. It includes the project folder, plus a "framework" folder containing packages and modules shared amongst several projects, which resides at the same level in the hierarchy as the project folders:
Framework/
    package1/
        __init__.py
        mod1.py
        mod2.py
    package2/
        __init__.py
        moda.py
        modb.py
My_Project/
    src/
        main_package/
            __init__.py
            main_module.py
    setup.py
    README.txt
Here is a partial listing of the contents of my setup.py file:
from distutils.core import setup

setup(packages=[
          'package1',
          'package2.moda',
          'main_package'
      ],
      package_dir={
          'package1': '../Framework/package1',
          'package2.moda': '../Framework/package2',
          'main_package': 'src/main_package'
      })
Here are the issues:
- No dist or build directories are created.
- A manifest file is created, but all modules in package2 are listed, not just the moda.py module.
- The build terminates with an error: "README.txt: Incorrect function".
I don't know if I have a single issue (possibly related to my directory structure) or multiple issues, but I've read everything I can find on distributing Python applications, and I'm stumped.
If I understand correctly, the paths in package_dir should stop at the parent directory of the directories which are Python packages. In other words, try this:
package_dir={'package1': '../Framework',
             'package2': '../Framework',
             'main_package': 'src'}
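Putting that together, the full call might look like this (a sketch; note that 'package2.moda' names a module rather than a package, so the packages list should name 'package2' itself, or the single module could be shipped via py_modules instead):

from distutils.core import setup

setup(packages=['package1', 'package2', 'main_package'],
      package_dir={'package1': '../Framework',
                   'package2': '../Framework',
                   'main_package': 'src'})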
I've had a similar problem, which I solved by specifying the root folder and the packages inside that root.
My package has the following structure:
.
├── LICENSE
├── README.md
├── setup.py
└── src
    └── common
        ├── __init__.py
        ├── persistence.py
        ├── schemas.py
        └── utils.py
The setup.py contains the package_dir and packages lines:
package_dir={"myutils": "src"},
packages=['myutils.common'],
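For reference, a fuller setup.py under these assumptions might be (the name and version here are illustrative):

from setuptools import setup

setup(
    name='myutils',      # illustrative distribution name
    version='0.1.0',     # illustrative
    package_dir={"myutils": "src"},
    packages=['myutils.common'],
)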
After running python setup.py bdist_wheel and installing the resulting .whl file, the package can be imported with:
import myutils.common
I'm working on a project with the following directory structure:
project/
    package1/
        module1.py
        module2.py
    package2/
        module1.py
        module2.py
    main1.py
    main2.py
    main3.py
    ...
    mainN.py
where each mainX.py file is an executable Python script that imports modules from either package1, package2, or both. package1 and package2 are subpackages meant to be distributed along with the rest of the project (not independently).
The standard thing to do is to put your entry point in the top-level directory. I have N entry points, so I put them all in the top-level directory. The trouble is that N keeps growing, so my top-level directory is getting flooded with entry points.
I could:
- move the mainX.py files to a sub-directory (say, project/run), but then all of the package1 and package2 imports would break;
- extract package1 and package2 to a separate repository and just expect it to be installed on the system (i.e., in the system / user Python path), but that would complicate installation;
- modify the Python path as a precondition or during runtime, but that's messy and could introduce unintended consequences;
- write a single main.py entry point script with argument subparsers respectively pointing to run/main1.py, ..., run/mainN.py, but that would introduce coupling between main.py and each of the run/mainX.py files.
What's the standard, "Pythonic" solution to this issue?
The standard solution is to use console_scripts packaging for your entry points - read about the entry-points specification here. This feature can be used to generate script wrappers like main1.py ... mainN.py at installation time.
Since these script wrappers are generated code, they do not exist in the project source directory at all, so that problem of clutter ("top-level directory is getting flooded with entry points") goes away.
The actual code for the scripts will be defined somewhere within the package, and the places where the main*.py scripts will actually hook into code within the package is defined in the package metadata. You can hook a console script entry-point up to any callable within the package, provided it can be called without arguments (optional arguments, i.e. args with default values, are fine).
project
├── package1
│   ├── __init__.py
│   ├── module1.py
│   └── module2.py
├── package2
│   ├── __init__.py
│   ├── module1.py
│   └── module2.py
├── pyproject.toml
└── scripts
    └── __init__.py
This is the new directory structure. Note the addition of __init__.py files, which indicates that package1 and package2 are packages and not just subdirectories.
For the new files added, here's the scripts/__init__.py:
# these imports should work
# from package1 import ...
# from package2.module1 import ...

def myscript1():
    # put whatever main1.py did here
    print("hello")

def myscript2():
    # put whatever main2.py did here
    print("world")
These don't all need to be in the same file, and you can actually put them anywhere within the package, as long as you update the hooks in the [project.scripts] section of the packaging definition.
And here's that packaging definition:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "mypackage"
version = "0.0.1"

[project.scripts]
"main1.py" = "scripts:myscript1"
"main2.py" = "scripts:myscript2"

[tool.setuptools]
packages = ["package1", "package2", "scripts"]
Now when the package is installed, the console scripts are generated:
$ pip install --editable .
...
Successfully installed mypackage-0.0.1
$ main1.py
hello
$ main2.py
world
As mentioned, those executables do not live in the project directory, but within the site's scripts directory, which will be present on $PATH. The scripts are generated by pip, using vendored code from distlib's ScriptMaker. If you peek at the generated script files you'll see that they're simple wrappers; they just import the callable from within the package and then call it. Any argument parsing, logging configuration, etc. must still be handled within the package code.
$ ls
mypackage.egg-info package1 package2 pyproject.toml scripts
$ which main2.py
/tmp/project/.venv/bin/main2.py
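For illustration, such a generated wrapper typically looks roughly like this (reconstructed here from memory; the exact contents depend on the pip version):

#!/tmp/project/.venv/bin/python
# -*- coding: utf-8 -*-
import re
import sys
from scripts import myscript2
if __name__ == '__main__':
    # strip any script suffix from argv[0], then hand off to the package callable
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(myscript2())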
The exact location of the scripts directory depends on your platform, but it can be checked like this in Python:
>>> import sysconfig
>>> sysconfig.get_path("scripts")
'/tmp/project/.venv/bin'
A solution for you is to sort the entry points into an additional package, but run them as modules rather than directly by file.
project/
    package1/
        module1.py
        module2.py
    package2/
        module1.py
        module2.py
    run/
        main1.py
        main2.py
        main3.py
        ...
        mainN.py
python -m run.main3
This way your current directory (hopefully the project root) will still be the one prepended to sys.path instead of the directory containing the scripts.
More canonical solutions would include:
- configuring export PYTHONPATH=path/to/your/project
- writing a path/to/your/project line in a foobar.pth file inside the site-packages folder of your virtualenv
- using a single entrypoint that features subcommands, e.g. with https://click.palletsprojects.com/en/latest/api/#click.Group (sketched below)
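As a sketch of that last option (assuming click is installed and this lives in a single main.py; the command names are illustrative):

import click

@click.group()
def cli():
    """Single entry point dispatching to subcommands."""

@cli.command()
def main1():
    # put whatever main1.py did here
    click.echo("hello")

@cli.command()
def main2():
    # put whatever main2.py did here
    click.echo("world")

if __name__ == "__main__":
    cli()

You would then invoke the subcommands as python main.py main1, python main.py main2, and so on.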
I am trying to include a python file in the build/lib directory created when running
python setup.py install
In particular, I would like to include a simple configuration file ('definitions.py') that defines a ROOT_DIR variable, which is then used by subpackages. The 'definitions.py' file contains:
import os
ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
My goal is to have configuration files within each subpackage ('config.py') call ROOT_DIR to build their own absolute paths:
import os

from definitions import ROOT_DIR

PACKAGE_DIR = os.path.join(ROOT_DIR, 'package1/')
The idea is drawn from this stackoverflow answer: https://stackoverflow.com/a/25389715.
However, this 'definitions.py' file never shows up in the build directory when running 'setup.py install'.
Here is the directory structure of the project:
project
├── setup.py
├── definitions.py
├── package1
│   ├── __init__.py
│   ├── config.py
│   └── ...
├── package2
│   ├── __init__.py
│   └── ...
└── ...
My multiple attempts have failed (trying, e.g. the suggestions offered in https://stackoverflow.com/a/11848281). As far as I can tell, it's because definitions.py is in the top-level of my project structure (which lacks an __init__.py file).
I have tried:
1) using the 'package_data' argument in setuptools.setup():

   package_data={'package': ['./definitions.py']}

   but definitions.py does not show up in the build (I think because definitions.py is not in a package that has an __init__.py?).
2) using a MANIFEST.in file, but this also does not work (I think because MANIFEST.in does not handle .py files?).
My question:
Is there a way to include definitions.py in the build directory? Or, is there a better way to provide access to absolute paths built from the top-level directory for multiple sub-packages?
If you are looking for a way to access a non-Python data file in the installed module, as in the question you've linked (a configuration file in the top-level package that should be accessible in subpackages), use the pkg_resources machinery instead of inventing a custom path resolution. An example project structure:
project
├── setup.py
└── root
    ├── __init__.py
    ├── config.txt
    ├── sub1
    │   └── __init__.py
    └── sub2
        └── __init__.py
setup.py:
from setuptools import setup

setup(
    name='myproj',
    ...,
    packages=['root', 'root.sub1', 'root.sub2'],  # or setuptools.find_packages()
    package_data={'root': ['config.txt']}
)
Update:
As pointed out by wim in the comments, there's now a backport for importlib.resources (which is only available in Python 3.7 and onwards) - importlib_resources, which offers a modern resource machinery that utilizes pathlib:
import importlib_resources

# access the filepath (a context manager yielding a pathlib.Path)
with importlib_resources.path('root', 'config.txt') as filepath:
    ...
# access the contents as string
contents = importlib_resources.read_text('root', 'config.txt')
# access the contents as file-like object
stream = importlib_resources.open_binary('root', 'config.txt')
Original answer
Using pkg_resources, you can access the root/config.txt from any spot of your package without having to perform any path resolution at all:
import pkg_resources
# access the filepath:
filepath = pkg_resources.resource_filename('root', 'config.txt')
# access the contents as string:
contents = pkg_resources.resource_string('root', 'config.txt')
# access the contents as file-like object:
contents = pkg_resources.resource_stream('root', 'config.txt')
etc.
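For instance, a module in root/sub1 (a hypothetical usage, not part of the original answer) could load the configuration without any path arithmetic:

# root/sub1/loader.py (illustrative)
import pkg_resources

# resource_string returns bytes, so decode for text use
config = pkg_resources.resource_string('root', 'config.txt').decode('utf-8')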
I have a flask app which looks like
my-app
├── src
│   └── python
│       ├── config
│       └── app
├── MANIFEST.in
└── setup.py
The config folder is full of *.yaml files. I want to add all the static config files into my Python egg after running
python setup.py install
My setup.py looks like
import os
from setuptools import setup, find_packages

path = os.path.dirname(os.path.abspath(__file__))

setup(
    name="app",
    version="1.0.0",
    author="Anna",
    description="",
    keywords=[],
    packages=find_packages(path + '/src/python'),
    package_dir={'': path + '/src/python'},
    include_package_data=True
)
I am trying to use the MANIFEST.in to add the config files. However, it always gives this error:
error: Error: setup script specifies an absolute path:
/Users/Anna/Desktop/my-app/src/python/app
setup() arguments must *always* be /-separated paths relative to the
setup.py directory, *never* absolute paths.
I have not used any absolute paths in my code. I've seen other posts trying to bypass this error by removing

include_package_data=True

However, in my case, if I do this to avoid the error, none of my yamls will be added. I was wondering if there are ways to fix this problem. Thanks
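For what it's worth, the error message itself points at the likely fix: pass paths relative to setup.py instead of building absolute ones from __file__. A sketch under that assumption:

from setuptools import setup, find_packages

setup(
    name="app",
    version="1.0.0",
    author="Anna",
    description="",
    keywords=[],
    packages=find_packages('src/python'),  # relative to setup.py
    package_dir={'': 'src/python'},        # relative to setup.py
    include_package_data=True
)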
I have the following project structure:
.
├── docs
├── examples
├── MANIFEST.in
├── README.rst
├── setup.cfg
├── setup.py
└── myproject
I want to bundle my project into a wheel. For this, I use the following setup.py:
#!/usr/bin/env python
from setuptools import setup, find_packages

setup(name='myproject',
      version='1.0',
      description='Great project',
      long_description=open('README.rst').read(),
      author='Myself',
      packages=find_packages(exclude=['tests', 'test', 'examples']))
When running python setup.py bdist_wheel, the examples directory is included in the wheel. How do I prevent this?
According to "Excluding a top-level directory from a setuptools package", I would expect examples to be excluded.
I solved the issue by using a suffixed star, examples*, i.e.:

find_packages(exclude=['*tests', 'examples*'])

(Note that I am writing '*tests' with a leading star because I have test packages within each code package, as in myproject.mypackage.tests. Somehow the suffixed star seems not to be necessary if there is already a prefixed one.)
I am trying to create a Python package, and I have a directory structure like this:
mypkg/
├── __init__.py
├── module1
│   ├── x.py
│   ├── y.py
│   └── z.txt
└── module2
    ├── a.py
    └── b.py
Then I added all the files to MANIFEST.in, and when I check the created archive, it has all the files. But when I do python setup.py install, in dist-packages/mypkg/module1 I see only the Python files and not z.txt.
I have z.txt in both MANIFEST.in and setup.py:
setup(
    packages=[
        'mypkg',
        'mypkg.module1',
        'mypkg.module2',
    ],
    package_data={
        'mypkg': ['module1/z.txt']
    },
    include_package_data=True,
    ...
)
I tried adding the file as data_files as well but that created a directory in /usr/local. I want to keep it inside the source code directory as the code uses that data.
I have read the posts listed below, but I keep getting confused about the right way to keep z.txt in the right location after setup.py install.
MANIFEST.in ignored on "python setup.py install" - no data files installed?
Installing data files into site-packages with setup.py
http://blog.codekills.net/2011/07/15/lies,-more-lies-and-python-packaging-documentation-on--package_data-/
Try using setuptools instead of distutils.
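For reference, the switch is mostly a one-line change of import, keeping the same arguments (a sketch based on the setup() call above):

from setuptools import setup  # instead of: from distutils.core import setup

setup(
    packages=[
        'mypkg',
        'mypkg.module1',
        'mypkg.module2',
    ],
    package_data={
        'mypkg': ['module1/z.txt']
    },
    include_package_data=True,
    # ... other metadata as before
)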
Update: It got fixed when I started using setuptools instead of distutils.core. I think it was some problem with distutils not agreeing with the manifest, while setuptools worked without any changes to the code. I recommend using setuptools in the future; see the setuptools developer's guide.