extras_require with readthedocs and poetry - python

I use Poetry to build my package with Cython extensions, so the package install flag is enabled. I'd like to host documentation via Read the Docs. Poetry uses a build.py file to generate setup.py when running the poetry build command. However, Read the Docs does not support build.py, so I provided a setup.py with the following contents:
from setuptools import setup
from build import *  # imports build() from poetry's build.py

global setup_kwargs
setup_kwargs = {}
build(setup_kwargs)  # let build.py populate the setup arguments
setup(**setup_kwargs)
To get rid of the requirements.txt file in the docs folder, I would like to add an extras_require parameter to setup, so the last line in setup.py becomes:
setup(extras_require={'docs': ['toml']}, **setup_kwargs)
Read the Docs calls
/<path>/envs/latest/bin/python -m pip install --upgrade --upgrade-strategy eager --no-cache-dir .[docs]
which warns
WARNING: does not provide the extra 'docs'
and importing toml from docs/conf.py fails.
If I add extras to pyproject.toml:
[tool.poetry.extras]
docs = ['toml']
The warning disappears, but Read the Docs still fails to import toml.
My .readthedocs.yml:
version: 2

sphinx:
  configuration: docs/conf.py

formats: all

python:
  version: 3.7
  install:
    - method: pip
      path: .
      extra_requirements:
        - docs

submodules:
  include: all
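One thing worth ruling out in the setup.py shim above: if build() ever sets extras_require itself, passing it again as a separate keyword would raise a TypeError (duplicate keyword argument). A minimal sketch that merges the extra into setup_kwargs instead, assuming build.py exposes build(setup_kwargs) as above:

from setuptools import setup
from build import build

setup_kwargs = {}
build(setup_kwargs)  # build.py fills in name, version, ext_modules, ...

# Merge the docs extra into whatever build() produced; setdefault keeps
# any extras that build() may already have defined.
setup_kwargs.setdefault('extras_require', {})['docs'] = ['toml']

setup(**setup_kwargs)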

Related

Cannot import personal python module (installed with pip -e) from outside the module's directory

I am developing a package for internal use, and said package has a setup.py file (see below). I am a bit baffled: when I install my package (inside an environment and in editable mode, so changes get reflected as I develop it), the import works if I am in the development directory, but fails with "package not found" from any other directory.
Moreover, pip list shows the package.
To install, I do the following:
~ > cd path/to/package
package > conda activate env
package (env) > pip install -e .
The package installs. Now, the problem is
package (env) > python -c "import mypackage" # works!
package (env) > cd
~ (env) > python -c "import mypackage" # error!
~ (env) > pip list | grep "mypackage"
mypackage 0.1.0
What's happening? The documentation says:
-e, --editable <path/url>
Install a project in editable mode (i.e. setuptools “develop mode”) from a local project path or a VCS url.
Which doesn't really tell me that it should only work in said local project path...
setup.py:

import os
import sys
from setuptools import find_packages, setup

def read(fname):
    return open(os.path.join(os.path.dirname(__file__), fname)).read()

LICENSE = "MIT"

setup(
    name="name",
    version="0.1.0",
    author="organization",
    description="...",
    long_description=read('README.md'),
    packages=find_packages(),
    license=LICENSE,
    classifiers=[
        ...
    ],
)
You need to add the directory containing your module to the system path:

import sys
sys.path.append(yourPathHere)  # the directory that contains yourModule
import yourModule

Edit, for clarity: the above goes into the .py that calls your module, not the setup.py you use for building it.
Huh, I see your environment does see it with the grep. Probably disregard the above then!
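To narrow down whether this is a sys.path problem or a package-discovery problem, a small diagnostic can help (a sketch; "mypackage" stands in for the real import name):

# Run this from both the project directory and your home directory.
import importlib.util

spec = importlib.util.find_spec("mypackage")
print(spec)  # None means the import system cannot locate the package;
             # otherwise spec.origin shows which file would be imported.

If the spec is only found inside the project directory, it is worth checking that find_packages() actually discovered the package: it only picks up directories that contain an __init__.py file.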

Using tox with shared code results in double dependency installation, which ultimately downgrades the dependency

I have multiple modules in one repository. Now I would like to test the packaging of each module individually, although they may have dependencies on each other. Luckily I found the {distshare} dependency option in the documentation.
tox.ini:
[testenv]
deps =
    pytest
    {distshare}/pandas-ml-common-*.zip
Once I run tox, it installs the local zip as one would expect. But since the dependency is also listed in setup.py, the module gets replaced by its older version from PyPI. And yes, you guessed it: this makes the tests fail. How can I avoid the installation from PyPI once a dependency is already installed via distshare?
stdout:
(.venv) $ tox
GLOB sdist-make: /pandas-ml-utils/setup.py
py37 recreate: .tox/pandas_ml_common/py37
py37 installdeps: pytest, .tox/distshare/pandas-ml-common-0.2.0.zip
py37 inst: .tox/pandas_ml_common/.tmp/package/1/pandas-ml-utils-0.2.0.zip
py37 installed: cachetools==4.1.1,...,pandas-ml-common==0.1.15,... <--- here it is again
EDIT:
from setup.py:
packages=find_packages(),
install_requires=["pandas-ml-common", *open("requirements.txt").read().splitlines()],
extras_require={
    "dev": open("dev-requirements.txt").read().splitlines(),
},
include_package_data=True,
The requirements.txt contains only external dependencies like numpy (all currently unpinned).
I would maybe try something like:
[tox]
# ...

[testenv]
deps =
    pytest
    # omit "{distshare}/pandas-ml-common-*.zip"
commands_pre =
    python -m pip install {distshare}/pandas-ml-common-*.zip
# ...
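If this works, it is because tox runs commands_pre after the package and its install_requires have been installed: pip pulls pandas-ml-common from PyPI during that step, and the explicit install of the local zip afterwards replaces it, so the tests run against the local build.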

Cython-generated C/C++ files in Poetry sdist tar.gz for installation without Cython

I use Poetry to build the .tar.gz and .whl of my package. The Cython docs recommend distributing the Cython-generated C files along with the .pyx ones: http://docs.cython.org/en/latest/src/userguide/source_files_and_compilation.html#distributing-cython-modules
What should I add to build.py or pyproject.toml so that the C/C++ files are generated when calling poetry build and poetry build -f sdist?
I tried this (from Create package with cython so users can install it without having cython already installed):
build.py:

from setuptools.command.build_ext import build_ext
from setuptools.command.sdist import sdist as _sdist

...

class sdist(_sdist):
    def run(self):
        # Make sure the compiled Cython files in the distribution are up-to-date
        self.run_command("build_ext")
        _sdist.run(self)

def build(setup_kwargs):
    setup_kwargs.update({
        ...
        'cmdclass': {'sdist': sdist,
                     'build_ext': build_ext}
    })
This did not work for me.
The current version of poetry (1.0.5) ignores a custom build.py when building an sdist, so there's no chance without modifying poetry first. In the meantime, you can use a third-party project like taskipy to replace the poetry build command with a custom one, e.g.:
# pyproject.toml
...

[tool.poetry.dev-dependencies]
cython = "^0.29.15"
taskipy = "^1.1.3"

[tool.taskipy.tasks]
sdist = "cython fib.pyx && poetry build -f sdist"
...
and execute poetry run task sdist instead of poetry build -f sdist.
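If there are several .pyx files, the taskipy command can call a small helper instead of naming each file. A sketch (the cythonize_all.py name and the glob pattern are assumptions; it requires cython in the environment):

# cythonize_all.py
from Cython.Build import cythonize

# Generate the .c/.cpp files next to every .pyx before poetry packs the sdist.
cythonize(["**/*.pyx"], language_level=3)

with the task defined as sdist = "python cythonize_all.py && poetry build -f sdist".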

Specify setup time dependency with `--global-option` for a python package

I'm trying to package a Python library that has setup-time (and also run-time) dependencies: setup.py imports those modules so that they can inform the setup process of the location of some provided C headers:
from distutils.extension import Extension
from pybedtools.helpers import get_includes as pybedtools_get_includes
from pysam import get_include as pysam_get_include

# [...]

extensions = [
    Extension(
        "bam25prime.libcollapsesam", ["bam25prime/libcollapsesam.pyx"],
        include_dirs=pysam_get_include()),
    Extension(
        "bam25prime.libcollapsebed", ["bam25prime/libcollapsebed.pyx"],
        include_dirs=pybedtools_get_includes(),
        language="c++"),
]
# [...]
However, one of the dependencies (pybedtools) needs to be installed with a specific --global-option pip option (see the end of the post for what happens when the option is not provided).
If I understand correctly, the currently recommended way to automatically have dependencies available before setup.py runs is to list them in the [build-system] section of a pyproject.toml file.
I tried the following pyproject.toml:
[build-system]
requires = [
    "pysam",
    "pybedtools # git+https://github.com/blaiseli/pybedtools.git#fix_missing_headers --global-option='cythonize'",
]
build-backend = "setuptools.build_meta"
(By the way, it took me quite some time to figure out how to specify the build-backend; the documentation is not easily discoverable.)
However, this generates the following error upon pip install:
ERROR: Invalid requirement: "pybedtools # git+https://github.com/blaiseli/pybedtools.git#fix_missing_headers --global-option='cythonize'"
Hint: It looks like a path. File 'pybedtools # git+https://github.com/blaiseli/pybedtools.git#fix_missing_headers --global-option='cythonize'' does not exist.
How can I correctly specify options for dependencies?
If I simply specify the package and its URL ("pybedtools # git+https://github.com/blaiseli/pybedtools.git#fix_missing_headers"), the install fails as follows:
Exception:
Cython-generated file 'pybedtools/cbedtools.cpp' not found.
Please install Cython and run
python setup.py cythonize
It was while trying to tackle the above error that I found out about the --global-option pip option.
I can manually run pip install --global-option="cythonize" git+https://github.com/blaiseli/pybedtools.git#fix_missing_headers, and the install works, provided the dependencies of that package are already installed; otherwise their install fails because of an unrecognized "cythonize" option (which is another issue...).
Note that this option is only needed when installing "from source" (for instance when installing from a fork on github, as is my case here).
Same as in your other question, I suspect cythonize is a setuptools command, not a global option.
If that's indeed the case, then you would be better off setting an alias in your setup.cfg. If you run python setup.py alias install cythonize install, it should add the following to your setup.cfg:
[aliases]
install = cythonize install
When running pip install later, pip will honor this alias and the cythonize command will be executed right before the install command.
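To check the hypothesis that cythonize is a command registered by pybedtools' setup.py rather than a pip option, you can list the commands its setup.py knows about from a checkout of the fork:

python setup.py --help-commands

Standard distutils/setuptools commands are listed first; a project-specific cythonize command, if it exists, would appear among the extra commands.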

Python dependency resolution

I previously created a Python package and uploaded it to PyPI. The package depends on 2 other packages, defined within its setup.py file:
from setuptools import setup
from dominos.version import Version

def readme():
    with open('README.rst') as file:
        return file.read()

setup(name='dominos',
      version=Version('0.0.1').number,
      author='Tomas Basham',
      url='https://github.com/tomasbasham/dominos',
      license='MIT',
      packages=['dominos'],
      install_requires=[
          'ratelimit',
          'requests'
      ],
      include_package_data=True,
      zip_safe=False)
As both of these were already installed within my virtualenv, this package ran fine.
Now, trying to consume this package within another Python application (and within a separate virtualenv), I have defined the following requirements.txt file:
dominos==0.0.1
geocoder==1.13.0
For reference, dominos is the package I uploaded to PyPI. Now, running pip install --no-cache-dir -r requirements.txt fails because the dependencies of dominos are missing:
ImportError: No module named ratelimit
Surely pip should be resolving these dependencies, since I have defined them in the setup.py file of dominos. Clarity on this would be great.
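One detail worth a second look (not confirmed by the question alone): setup.py executes from dominos.version import Version, so if importing dominos triggers an import ratelimit at module level, pip cannot even run setup.py in a clean virtualenv, which would produce exactly this ImportError. The usual workaround is to read the version without importing the package. A sketch, assuming a plain __version__ = '0.0.1' string in dominos/version.py (the question's version.py uses a Version class instead):

import re

def get_version():
    # Parse the version string out of the file instead of importing the
    # package, so setup.py needs none of the run-time dependencies.
    with open('dominos/version.py') as f:
        return re.search(r"__version__\s*=\s*'([^']+)'", f.read()).group(1)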
