I'm using the Python click package and setuptools to create a simple command-line tool, and I work in a pipenv virtualenv.
My working directory looks like this:
jkt/scripts/app.py
And my setup.py is like this:
from setuptools import setup, find_packages

setup(
    name='jkt',
    version='0.1',
    packages=find_packages(),
    include_package_data=True,
    entry_points='''
        [console_scripts]
        jktool=jkt.scripts.app:my_function
    ''',
)
Then I run the command
pip install --editable .
And run jktool to execute my_function, but I get the error:
ModuleNotFoundError: No module named 'jkt'
However, when app.py sits directly in the jkt directory, I can run my function with this setup.py:
setup(
    name='app',
    version='0.1',
    py_modules=['app'],
    entry_points='''
        [console_scripts]
        app=app:jktools
    ''',
)
After I run pip install -e ., I can use the app command to run my function.
As I mentioned, I can't reproduce your error (Python 3.7 with modern pip seems to work just fine), but there are a couple of things that could potentially be going wrong on older versions.
Since it doesn't look like you put __init__.py files in your subdirectories, find_packages doesn't actually find any packages at all (python3 -c 'from setuptools import find_packages; print(find_packages())' prints the empty list, []). You can fix this in one of three ways:
1. Create empty __init__.py files to explicitly mark those folders as package folders; on a UNIX-like system, touch jkt/__init__.py and touch jkt/scripts/__init__.py are enough to create them.
2. Python 3.3+ only (also requires modern setuptools, so pip install --upgrade setuptools might be necessary): replace your use of find_packages with find_namespace_packages (which recognizes Python 3 era implicit namespace packages).
3. Just get rid of find_packages entirely and list the packages directly, e.g. replace packages=find_packages(), with packages=['jkt', 'jkt.scripts'], (see the sketch below).
Option #2 only works on Python 3.3+, so if your package is intended to work on older versions of Python, go with option #1 or #3.
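For illustration, here is a minimal sketch of the question's setup.py with option #3 applied (everything else unchanged):

from setuptools import setup

setup(
    name='jkt',
    version='0.1',
    # list the package and its subpackage explicitly instead of using find_packages()
    packages=['jkt', 'jkt.scripts'],
    include_package_data=True,
    entry_points='''
        [console_scripts]
        jktool=jkt.scripts.app:my_function
    ''',
)

After re-running pip install --editable ., the jktool command should be able to import jkt.scripts.app.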
I'm trying to package a Python library that has setup-time (and also run-time) dependencies: setup.py imports those modules so that they can tell the setup process where their provided C headers are located:
from distutils.extension import Extension
from pybedtools.helpers import get_includes as pybedtools_get_includes
from pysam import get_include as pysam_get_include

# [...]
extensions = [
    Extension(
        "bam25prime.libcollapsesam", ["bam25prime/libcollapsesam.pyx"],
        include_dirs=pysam_get_include()),
    Extension(
        "bam25prime.libcollapsebed", ["bam25prime/libcollapsebed.pyx"],
        include_dirs=pybedtools_get_includes(),
        language="c++"),
]
# [...]
However, one of the dependencies (pybedtools) needs to be installed with a specific --global-option pip option (see at the end of the post what happens when the option is not provided).
If I understand correctly, the current recommended way to make some dependencies automatically available before setup.py runs is to list them in the [build-system] section of a pyproject.toml file.
I tried the following pyproject.toml:
[build-system]
requires = [
    "pysam",
    "pybedtools # git+https://github.com/blaiseli/pybedtools.git#fix_missing_headers --global-option='cythonize'",
]
build-backend = "setuptools.build_meta"
(By the way, it took me quite some time to figure out how to specify the build-backend; the documentation is not easily discoverable.)
However, this generates the following error upon pip install:
ERROR: Invalid requirement: "pybedtools # git+https://github.com/blaiseli/pybedtools.git#fix_missing_headers --global-option='cythonize'"
Hint: It looks like a path. File 'pybedtools # git+https://github.com/blaiseli/pybedtools.git#fix_missing_headers --global-option='cythonize'' does not exist.
How can I correctly specify options for dependencies?
If I simply specify the package and its URL ("pybedtools # git+https://github.com/blaiseli/pybedtools.git#fix_missing_headers"), the install fails as follows:
Exception:
Cython-generated file 'pybedtools/cbedtools.cpp' not found.
Please install Cython and run
python setup.py cythonize
It was while trying to tackle the above error that I found out about the --global-option pip option.
I can manually run pip install --global-option="cythonize" git+https://github.com/blaiseli/pybedtools.git#fix_missing_headers, and the install works, provided that package's dependencies are already installed; otherwise their install fails because of an unrecognized "cythonize" option (which is another issue...).
Note that this option is only needed when installing "from source" (for instance when installing from a fork on github, as is my case here).
Same as in your other question, I suspect cythonize is a setuptools command, not a global option.
If that is indeed the case, then you would be better off setting an alias in your setup.cfg. Running python setup.py alias install cythonize install should add the following to your setup.cfg:
[aliases]
install = cythonize install
When running pip install later, pip will honor this alias and the cythonize command will be executed right before the install command.
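As a side note on the original "Invalid requirement" error: a VCS dependency in the [build-system] requires list has to be written as a PEP 508 direct reference (name @ URL), and pip accepts no per-requirement options there. Here is a sketch of that form, reusing the URL from the question; note that this fixes only the parse error, not the cythonize step:

[build-system]
requires = [
    "pysam",
    "pybedtools @ git+https://github.com/blaiseli/pybedtools.git#fix_missing_headers",
]
build-backend = "setuptools.build_meta"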
I'm trying to prepare setup.py that will install all necessary dependencies, including google-cloud-pubsub. However, python setup.py install fails with
pkg_resources.UnknownExtra: googleapis-common-protos 1.6.0b6 has no such extra feature 'grpc'
The weird thing is that I can install those dependencies through pip install in my virtualenv.
How can I fix it or work around it? I'm using Python 2.7.15.
Here's a minimal configuration to reproduce the problem:
setup.py
from setuptools import setup

setup(
    name='example',
    install_requires=['google-cloud-pubsub']
)
In your setup.py use the following:
from setuptools import setup

setup(
    name='example',
    install_requires=['google-cloud-pubsub', 'googleapis-common-protos==1.5.3']
)
That seems to get around it.
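To confirm the pin actually took effect after installing, here is a quick hypothetical check using the same pkg_resources machinery that raised the UnknownExtra error:

# check_pin.py -- hypothetical sanity check, run after the install
import pkg_resources

print(pkg_resources.get_distribution("googleapis-common-protos").version)
# expected output: 1.5.3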
I am trying to use setup.py to install a Python package that is kept in a git repository, which we'll call my_dependency. In my_package, I have a setup.py file with:
setup(
    ...
    install_requires=[
        ...
        'my_dependency==VERSION'
    ],
    dependency_links=['git+https://...my_dependency.git#egg=my_dependency-VERSION']
)
When I run my setup file (python setup.py develop), the dependency appears to install; it shows up as my_dependency==VERSION when I run pip freeze. However, when I start a python session and call import my_dependency, I get ImportError: No module named my_dependency.
I don't know if this is possibly the source of the problem, but when running setup.py, I get a warning:
Processing dependencies for my_package==0.1
Searching for my_dependency==VERSION
Doing git clone from https://.../my_dependency.git to /var/folders/.../.../T/easy_install-_rWjyp/my_dependency.git
Best match: my_dependency VERSION
Processing my_dependency.git
Writing /var/folders/.../my_dependency.git/setup.cfg
Running setup.py -q bdist_egg --dist-dir /var/folders/.../my_dependency.git/egg-dist-tmp-UMiNdL
warning: install_lib: 'build/lib' does not exist -- no Python modules to install
Copying my_dependency-VERSION-py2.7.egg to /.../my_package/venv/lib/python2.7/site-packages
Adding my_dependency VERSION to easy-install.pth file
However, I am able to use the package if I install it through pip, like this: pip install -e git+https://.../my_dependency.git#egg=my_dependency-VERSION
For reference, the dependency package structure looks like this:
my_dependency/
    my_dependency/
        __init__.py
    setup.py
And its setup.py contains this:
from setuptools import setup

setup(
    name='my_dependency',
    version='VERSION',
    description='...',
    author='...',
    url='https://...',
    license='MIT',
    install_requires=[
        'numpy',
    ],
    zip_safe=False,
)
The solution was (in retrospect) pretty silly. My dependency package's setup.py was missing this line:
packages=['my_dependency'],
That meant the package was correctly building and installing, but it wasn't actually including the code in the package. This became apparent when I looked at the SOURCES.txt in the egg-info: it didn't include any of the Python source files in the package.
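Put together, the dependency's corrected setup.py looks like this (the same fields as above, with only the missing packages line added):

from setuptools import setup

setup(
    name='my_dependency',
    version='VERSION',
    description='...',
    author='...',
    url='https://...',
    license='MIT',
    packages=['my_dependency'],  # the previously missing line
    install_requires=[
        'numpy',
    ],
    zip_safe=False,
)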
I am trying to add a runnable script for my project with setup.py. I added it to the scripts= argument of setup. The script works fine when I run it from the project, ./solver. I install it with sudo python setup.py install, and try to run it with solver, but I get ImportError: No module named 'model'. How do I correctly install and run my script with setuptools?
SOLVER/
    solver/
        model/
            __init__.py
        view/
            __init__.py
        controller/
            __init__.py
        __init__.py
        main.py
        solver        <-- starts the app
    setup.py
    README.md
    LICENCE
setup.py:
#!/usr/bin/env python3
import os
from setuptools import setup, find_packages

setup(
    name='SOLVER',
    version='1.0.0',
    description='SOLVER app test',
    author=['me'],
    license='BSD',
    classifiers=['Programming Language :: Python :: 3 :: Only'],
    packages=['solver'],
    # packages=find_packages(exclude=["doc", "tests"]),
    install_requires=['numpy>=1.10.4'],
    scripts=['solver/solver'],
)
solver:
#!/usr/bin/env python3
from solver import main

main.gui_mode()
You need to list all the packages, including the sub-packages, in the packages argument. You can use find_packages to generate that list for you. Currently, you're just installing the Python files in the solver/ directory.
from setuptools import setup, find_packages

setup(
    ...
    packages=find_packages(),
    ...
)
You should also use entry_points rather than scripts, especially when all your script does is import and call one function. Setuptools will build scripts from the entry points that use the correct Python binary for the environment they were installed in.
setup(
    ...
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            'solver=solver.main:gui_mode',
        ],
    },
    ...
)
You can install your package in development mode to get your script, rather than writing it yourself.
pip install -e .
You should use pip to install to the system as well. It keeps track of what was installed so you can uninstall it later.
pip install .
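For completeness, the entry point above assumes solver/main.py defines gui_mode (the question's solver script calls main.gui_mode()). A hypothetical minimal stand-in, just to show how the mapping resolves:

# solver/main.py -- hypothetical stand-in; the real gui_mode starts the app
def gui_mode():
    print("starting the solver GUI...")  # placeholder body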
I have a virtualenv with multiple little projects in it. Assume they are all equal in importance, so my folder structure looks something like this:
categorisation_ml/
    categorisation.py
    setup.py
    __init__.py
nlp/
    nlp.py
    setup.py
    __init__.py
etc/
    __init__.py
I want to install both packages into the same virtualenv so that they are both accessible everywhere within the virtualenv.
Using this and this guide, I have created a setup.py script like this (for categorisation in this case):
from setuptools import setup, find_packages

setup(
    name="categorisation",
    version="1.0",
    scripts=['categorisation.py']
)
Then I run python setup.py install, which seems to complete successfully.
When I cd into nlp/, start a Python interpreter, and try import categorisation, I get:
ImportError: No module named categorisation
What am I missing?
It seems that the package structure and setup.py are off. It should be something like this:
irrelevant_package_name/
    __init__.py
    setup.py
    categorisation_ml/
        categorisation.py
        __init__.py
    nlp/
        nlp.py
        __init__.py
and then the install script looking like this:
from setuptools import setup, find_packages

setup(
    name='package_name',
    version='1.0.0',
    description='This is a working setup.py',
    url='http://somesite.com',
    author='Roman',
    author_email='roman@somesite.com',
    packages=find_packages(),
    install_requires=[
        'numpy',
    ],
    zip_safe=False
)
Then install it like this:
python setup.py install   # just installs it as is
python setup.py develop   # keeps track of changes for development
If you then run pip freeze, this should show up:
package_name==1.0.0
And then in Python, imports should look like this:
from categorisation_ml import categorisation
from nlp import nlp
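As a quick sanity check, both packages should then be importable from any directory inside the virtualenv; a hypothetical test script:

# sanity_check.py -- hypothetical; assumes the install above succeeded
from categorisation_ml import categorisation
from nlp import nlp

print("both packages are importable")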