I have an SO file mymodule.cpython-37m-x86_64-linux-gnu.so that I would like to make pip-installable.
My desired end goal is to have my installed package look like this:
% tree /home/.../python3.7/site-packages
/home/.../python3.7/site-packages
├── mymodule-1.0.0.dist-info
└── mymodule.cpython-37m-x86_64-linux-gnu.so
This is what I have tried so far:
% tree .
.
├── mymodule.cpython-37m-x86_64-linux-gnu.so
├── pyproject.toml
└── setup.cfg
# setup.cfg
[options]
py_modules = mymodule
[options.package_data]
* = mymodule.cpython-37m-x86_64-linux-gnu.so
However, when trying to pip install . I cannot seem to get the .so file to be installed into site-packages.
Interestingly, when there is a file named mymodule.py instead, mymodule.py gets installed in the desired location.
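One approach sometimes used for shipping a prebuilt extension module is a setup.py that marks the distribution as binary and attaches the .so as package data of the "root" package. This is only a sketch, under the assumption that the .so matches the target interpreter and platform; the Distribution subclass and the empty package name are the load-bearing parts:

```python
# setup.py -- sketch for shipping a prebuilt extension module.
# Assumes the .so sits next to setup.py and matches the target platform.
from setuptools import setup
from setuptools.dist import Distribution

class BinaryDistribution(Distribution):
    """Tell setuptools this distribution contains compiled code,
    so the wheel gets a platform-specific tag instead of 'any'."""
    def has_ext_modules(self):
        return True

setup(
    name="mymodule",
    version="1.0.0",
    packages=[""],  # the "root" package: files land directly in site-packages
    package_data={"": ["mymodule.cpython-37m-x86_64-linux-gnu.so"]},
    distclass=BinaryDistribution,
)
```

A wheel built this way is only installable on the matching platform and Python version, which the cp37 tag in the filename already implies.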
Related
I'm trying to use Poetry and the scripts option to run a script, like so:
pyproject.toml
[tool.poetry.scripts]
xyz = "src.cli:main"
Folder layout
.
├── poetry.lock
├── pyproject.toml
├── run-book.txt
└── src
├── __init__.py
└── cli.py
I then perform an install like so:
❯ poetry install
Installing dependencies from lock file
No dependencies to install or update
If I then try to run the command, it's not found:
❯ xyz
zsh: command not found: xyz
Am I missing something here? Thanks!
Poetry is likely installing the script in your user-local directory. On Ubuntu, for example, this is $HOME/.local/bin. If that directory isn't on your PATH, your shell will not find the script.
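As a quick check (a sketch; ~/.local/bin is just the Ubuntu example location mentioned above), you can test whether that directory is on PATH from Python:

```python
# Check whether poetry's user-local script directory is on PATH.
# The ~/.local/bin location is an assumption (Ubuntu default); adjust as needed.
import os

script_dir = os.path.expanduser("~/.local/bin")
on_path = script_dir in os.environ.get("PATH", "").split(os.pathsep)
print(f"{script_dir} {'is' if on_path else 'is NOT'} on PATH")
```

If it is not, add export PATH="$HOME/.local/bin:$PATH" to your shell profile.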
A side note: it is generally a good idea to put a subdirectory with your package name inside the src directory, and to not have an __init__.py directly in src. Also consider renaming cli.py to __main__.py; this allows your package to be run as a script using python -m package_name.
You did everything right, except that you did not activate the virtual environment or run the script (xyz) via poetry run xyz. You can activate the virtualenv via poetry shell; afterwards, xyz should run from your shell.
PS: @jisrael18's answer is totally right. Normally one would have another folder (which is your main Python package) inside the src folder.
.
├── src
│ └── pyproj
│ ├── __init__.py
│ └── __main__.py
...
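To illustrate the __main__.py suggestion: a minimal sketch, where the pyproj package name and the main function are assumptions chosen to line up with an entry point like xyz = "pyproj.__main__:main":

```python
# src/pyproj/__main__.py  (hypothetical file; name and contents are assumptions)
def main():
    # whatever the CLI should do; a placeholder action here
    print("hello from xyz")

# makes `python -m pyproj` work in addition to the installed `xyz` script
if __name__ == "__main__":
    main()
```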
I've read a lot of blog posts and questions on this site about the usage of git submodules and still have no idea how to better use them with python.
I mean, what is the easiest way to manage dependencies if I have a package like this:
├── mypkg
│ └── __init__.py
├── setup.py
└── submodules
├── subm1
└── subm2
Then, what if I need to use "mypkg" as a submodule for "top_level_pkg":
├── setup.py
├── submodules
│ └── mypkg
└── top_level_package
└── __init__.py
I want to run pip install . and have everything resolved correctly (each submodule installed into the venv in the correct order).
What I've tried:
Install each submodule using "pip" run in a subprocess, but this seems hacky and hard to manage (Unexpected installation of GIT submodule)
Use "install_requires" with "setuptools.find_packages()" but without success
Use requirements.txt file for each submodule, but I can't find a way how to automate it so "pip" could automatically install all requirements for all submodules.
Ideally, I imagine a separate setup.py file for each submodule with install_requires=['submodules/subm1', 'submodules/submn'], but setuptools does not support it.
I'm not saying it's impossible, but it is very hard and very tricky. A safer way is to turn each submodule into an installable Python package (with its own setup.py) and install the submodules from Git.
This link describes how to install packages from Git with setup.py: https://stackoverflow.com/a/32689886/2952185
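Building on that link, once each submodule has its own setup.py, it can be declared as a PEP 508 direct reference in the parent's install_requires. A sketch (package names and the repository URL are placeholders, not taken from the question):

```python
# setup.py of the top-level package -- sketch with hypothetical names/URLs
from setuptools import setup, find_packages

setup(
    name="top_level_pkg",
    packages=find_packages(),
    install_requires=[
        # PEP 508 direct reference: "<name> @ <url>"
        "subm1 @ git+https://github.com/example/subm1.git@main",
    ],
)
```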
Thanks to Gijs Wobben and sinoroc, I came up with a solution that works for my case:
install_requires=['subm1 # file://localhost/<CURRENT_DIR>/path/to/subm1']
I have managed to install a Python package from a git submodule together with a main package. These are proprietary and are never published to PyPI. And both pip and tox seem to work just fine.
To set the context, I have a git repo with a single Python package and a single git submodule; the git submodule also contains a single Python package. I think this structure is as generic and simple as it can possibly be, here's a visualization:
main-git-repo-name
├── mainpkg
│ └── __init__.py
├── setup.py
├── tests
└── util-git-repo-name (this is a git submodule)
├── setup.py
├── test
└── utilpkg
└── __init__.py
I wanted pip to install everything in a single invocation, and utilpkg should be usable from mainpkg via just import utilpkg (not nested oddly).
The answer for me was all in setup.py:
First, specify the packages to install and their locations:
packages=find_packages(exclude=["tests"])
    + find_packages(where="util-git-repo-name", exclude=["test"]),
package_dir={
"mainpkg": "mainpkg",
"utilpkg": "util-git-repo-name/utilpkg"
},
Second, copy all the install_requires items from the git submodule package's setup.py file into the top level. In my case the utility package is an API client generated by swagger-codegen, so I had to add:
install_requires=[
"urllib3 >= 1.15", "six >= 1.10", "certifi", "python-dateutil",
...],
Anyhow, when running pip3 install . this config results in exactly what I want in the site-packages area: a directory mainpkg/ and a directory utilpkg/
HTH
Problem
I have read this post, which provides a way to permanently avoid the sys.path hack when importing names between sibling directories. However, I followed the procedure listed in that post and found that I could not import the installed package (i.e. test).
The following are things I have already done
Step 1: create a project that looks like the following. All of the __init__.py files are empty.
test
├── __init__.py
├── setup.py
├── subfolder1
│ ├── __init__.py
│ ├── program1.py
├── subfolder2
│ ├── __init__.py
│ └── program2.py
# setup.py
from setuptools import setup, find_packages
setup(name="test", version="0.1", packages=find_packages())
# program1
def func1():
print("I am from func1 in subfolder1/func1")
# program2
from test.subfolder1 import program1
program1.func1()
Step 2: create a virtual environment in the project root directory (i.e. the test directory):
conda create -n test --clone base
launch a new terminal and conda activate test
pip install -e .
Run conda list and I see the following, which means my test project is indeed installed in the virtual environment:
...
test 0.1 dev_0 <develop>
...
Step 3: go to subfolder2 and run python program2.py, but unexpectedly it returns:
ModuleNotFoundError: No module named 'test.subfolder1'
The issue is that I think test should be available as long as I am in the virtual environment, but that does not seem to be the case here.
Could someone help me? Thank you in advance!
You need to create an empty __init__.py file in subfolder1 to make it a package.
Edit:
You should change the import in program2.py:
from subfolder1 import program1
Or you can move setup.py a level up.
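To keep the from test.subfolder1 import program1 spelling from the question, the second option looks like this: move setup.py one level above the test directory so that find_packages() discovers test itself as the top-level package (a sketch; only the location of setup.py changes):

```python
# setup.py placed in the parent directory of test/ -- sketch
from setuptools import setup, find_packages

setup(
    name="test",
    version="0.1",
    # from this location find_packages() returns
    # ["test", "test.subfolder1", "test.subfolder2"]
    packages=find_packages(),
)
```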
Here is my project directory structure, which includes the project folder, plus
a "framework" folder containing packages and modules shared amongst several projects
which resides at the same level in the hierarchy as the project folders:
Framework/
package1/
__init__.py
mod1.py
mod2.py
package2/
__init__.py
moda.py
modb.py
My_Project/
src/
main_package/
__init__.py
main_module.py
setup.py
README.txt
Here is a partial listing of the contents of my setup.py file:
from distutils.core import setup

setup(
    packages=[
        'package1',
        'package2.moda',
        'main_package',
    ],
    package_dir={
        'package1': '../Framework/package1',
        'package2.moda': '../Framework/package2',
        'main_package': 'src/main_package',
    },
)
Here are the issues:
No dist or build directories are created
Manifest file is created, but all modules in package2 are listed, not just the moda.py module
The build terminates with an error:
README.txt: Incorrect function
I don't know if I have a single issue (possibly related to my directory structure) or multiple issues, but I've read everything I can find on distributing Python applications, and I'm stumped.
If I understand correctly, the paths in package_dir should stop at the parent directory of the directories which are Python packages. In other words, try this:
package_dir={'package1': '../Framework',
             'package2': '../Framework',
             'main_package': 'src'}
I've had a similar problem, which was solved through the specification of the root folder and of the packages inside that root.
My package has the following structure:
.
├── LICENSE
├── README.md
├── setup.py
└── src
└── common
├── __init__.py
├── persistence.py
├── schemas.py
└── utils.py
The setup.py contains the package_dir and packages line:
package_dir={"myutils": "src"},
packages=['myutils.common'],
After running python setup.py bdist_wheel and installing the .whl file, the package can be imported using:
import myutils.common
I am trying to create a Python package, and I have a directory structure like this:
mypkg/
├── __init__.py
├── module1
│ ├── x.py
│ ├── y.py
│ └── z.txt
└── module2
├── a.py
└── b.py
Then I added all the files in MANIFEST.in and when I check the created archive, it had all the files.
When I do python setup.py install, in dist-packages/mypkg/module1 I see only the Python files and not z.txt.
I have z.txt in both MANIFEST.in and setup.py:
setup(
    packages=[
        'mypkg',
        'mypkg.module1',
        'mypkg.module2',
    ],
    package_data={
        'mypkg': ['module1/z.txt'],
    },
    include_package_data=True,
    ...
)
I tried adding the file as data_files as well but that created a directory in /usr/local. I want to keep it inside the source code directory as the code uses that data.
I have read the posts listed below, but I keep getting confused about the right way to keep z.txt in the correct location after setup.py install.
MANIFEST.in ignored on "python setup.py install" - no data files installed?
Installing data files into site-packages with setup.py
http://blog.codekills.net/2011/07/15/lies,-more-lies-and-python-packaging-documentation-on--package_data-/
Try using setuptools instead of distutils.
Update: it got fixed when I switched from distutils.core to setuptools. I think distutils was not agreeing with the manifest, while setuptools worked without any changes to the code. I recommend using setuptools in the future; see the setuptools developers' guide.
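For reference, a minimal sketch of the fixed configuration described in the update (only the import changes from distutils.core to setuptools; the name and version fields are assumptions):

```python
# setup.py -- sketch; the setuptools import is the actual fix
from setuptools import setup

setup(
    name="mypkg",
    version="0.1",
    packages=["mypkg", "mypkg.module1", "mypkg.module2"],
    package_data={"mypkg": ["module1/z.txt"]},  # installed alongside the code
    include_package_data=True,  # also honor MANIFEST.in entries
)
```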