How does tox install my poetry project in its virtual environment?

I'm using tox with poetry with pyenv and I'm getting quite confused.
I'm using pyenv local 3.6.9 3.7.10 to set several Python versions in my myprojectroot folder. On top of that, I use poetry to manage dependencies and to build mypackage. Finally, I use tox to run the tests automatically against several Python versions.
My problem is that tox creates, for each version (say 3.6.9), a virtual environment located in the myproject/.tox directory. To that end, it installs all the dependencies listed by poetry into that virtual env, but it also installs mypackage! (I checked in the .tox folder.)
Questions:

1. tox usually installs packages with pip, yet I use poetry here. How can it install my package then? Does it build a wheel with poetry and install it afterwards?
2. Does tox pick up modifications to my local code, or should I run tox -r?
3. I recently moved my test folder, so the configuration is now:

pyproject.toml
src
+- mypackage
   +- __init__.py
   +- mypackage.py
tests
+- test_mypackage.py

and I need to run pytest when modifying mypackage. How do I do that?
4. What's the link with skipsdist=True?
Thanks for your help!

ad 1) tox does not build a wheel, but an sdist (source distribution). If you want to build a wheel instead, you need to install the tox-wheel plugin: https://github.com/ionelmc/tox-wheel
But your idea is right otherwise: poetry builds the sdist, and tox uses pip under the hood to install it.
ad 2) tox notices changes to your source code, so there is no need to run tox -r. The tox documentation is a bit thin on this topic; meanwhile, have a look at https://github.com/tox-dev/tox/issues/2003#issuecomment-815832363.
ad 3) pytest does test discovery on its own, so it should be able to find the tests. A command like pytest or poetry run pytest should be enough in your case. Disclaimer: I do not use poetry, so maybe you'd need to be a bit more explicit about the path. The official poetry documentation on tox suggests the command poetry run pytest tests/, see https://python-poetry.org/docs/faq/#is-tox-supported
ad 4) You can read more about skipsdist=True at https://tox.readthedocs.io/en/latest/config.html#conf-skipsdist - it tells tox whether to build an sdist or not. If you do not build an sdist, your tests will not run against the built package, but only against the source code (if you point pytest at it). That may not be what you want, depending on whether you are developing an app or a library, or on other circumstances I do not know.
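For completeness, the poetry FAQ linked in ad 3) suggests a tox.ini along these lines (a sketch, not tested here; isolated_build needs tox >= 3.3, and the env list matches the two interpreters from the question):

[tox]
isolated_build = true
envlist = py36, py37

[testenv]
whitelist_externals = poetry
commands =
    poetry install -v
    poetry run pytest tests/

With isolated_build = true, tox builds the package through poetry's PEP 517 backend rather than via setup.py, which is what makes the tox + poetry combination work without a setup.py.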

Related

Deploy CLI tool as package

I'm developing a CLI tool in Python for in-house use.
I would like to introduce pipenv to my project to manage "dependencies of dependencies", because I encountered a bug caused by a difference between the production and development environments.
However, my CLI tool is installed as a package (httpie and ansible take this strategy), so I have to specify all dependencies in setup.py.
How should I get the "dependencies of dependencies" from Pipfile.lock into setup.py? (Or should I take another approach?)
It is suggested that you do this the other way around: instead of referencing dependencies in the Pipfile, list them in setup.py and reference them in the Pipfile with
pipenv install -e .
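A minimal sketch of that layout (the package name, dependencies and entry point below are made up):

# setup.py -- runtime dependencies live here, not in the Pipfile
from setuptools import setup, find_packages

setup(
    name="mycli",                    # hypothetical package name
    version="0.1.0",
    packages=find_packages(),
    install_requires=[               # example runtime dependencies
        "requests>=2.0",
        "click>=7.0",
    ],
    entry_points={
        "console_scripts": ["mycli=mycli.main:main"],
    },
)

Running pipenv install -e . then makes pipenv resolve install_requires and pin the whole dependency graph, dependencies of dependencies included, in Pipfile.lock.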

How to run a test suite that comes in an sdist?

I would like to be able to run the test suite when "mypackage" has been installed via pip.
I have created the mypackage-version.tar.gz file myself, using python setup.py sdist. mypackage-version.tar.gz does contain the tests/ directory (moreover, the .egg-info/SOURCES.txt inside it lists all the files under tests/ too).
But after running pip install mypackage-version.tar.gz inside a dedicated, brand-new virtual environment, there is no tests/ directory anywhere in the virtual environment:
$ find ./myvenv/ -name "*tests*"
./myvenv/lib/python3.6/site-packages/pip/_vendor/webencodings/__pycache__/tests.cpython-36.pyc
./myvenv/lib/python3.6/site-packages/pip/_vendor/webencodings/tests.py
Reading the pip documentation or the output of pip install --help, I just can't figure out whether there is any way to ask pip to install the tests along with the rest.
I've also tried to unpack the tests/ directory manually (along with pytest.ini), but the tests do not start because pytest complains about not finding the mypackage module (ModuleNotFoundError: No module named 'mypackage'), even though both pytest and mypackage show up in pip list... and both are the ones from the virtual environment (as the which command tells), as are pip and python (which can import mypackage).
Yet manual unpacking looks hacky, so before digging further into this, I'd like to know the right way to go: using pip? Manual unpacking? Anything else? Am I missing something obvious?
EDIT: in the end, I could run the manually unpacked tests using python -m pytest. But this remains a workaround, and I'd still like to know whether there's a more proper (and automated) way to install and run the test suite.
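For reference, that workaround amounts to something like this (paths are illustrative):

$ pip install mypackage-version.tar.gz   # install the package into the venv
$ tar xzf mypackage-version.tar.gz       # unpack the sdist to get tests/ and pytest.ini
$ cd mypackage-version
$ python -m pytest                       # unlike bare pytest, -m also puts the current directory on sys.path

That last point is documented pytest behaviour, and is the likely reason python -m pytest succeeded where bare pytest failed.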

What's the standard way to package a python project with dependencies?

I have a Python project that has a few dependencies (defined under install_requires in setup.py). My ops people require a package to be self-contained and to depend only on a Python installation. The litmus test is that they can get a zip file, then unzip and run it without an internet connection.
Is there an easy way to package an install including dependencies? It is acceptable if I have to build on the OS/architecture that it will eventually run on.
For what it's worth, I've tried both setup.py build and setup.py sdist, but they don't seem to fit the bill since they do not include dependencies. I've also considered virtualenv (which could be installed if absolutely necessary), but that has hard-coded paths, which makes it less than ideal.
There are a few nuances to how pip works. Unfortunately, using --prefix vendor to store all the dependencies of the project doesn't work if any of those dependencies, or dependencies of dependencies, are already installed in a place where pip can find them. pip will skip those and install only the rest into your vendor folder.
In the past I've used virtualenv's --no-site-packages option to solve this issue. At one company we would ship the whole virtualenv, which includes the python binary. In the interest of shipping only the dependencies, you can combine a virtualenv with pip's --prefix switch to give yourself a clean environment that installs to the right place.
I'll provide an example script that creates a temporary virtualenv, activates it, then installs the dependencies to a local vendor folder. This is handy if you are running in CI.
#!/bin/bash
tempdir=$(mktemp -d -t project.XXX) # create a temporary directory
trap "rm -rf $tempdir" EXIT # ensure it is cleaned up
# create the virtualenv and exclude packages outside of it
virtualenv --python=$(which python2.7) --no-site-packages $tempdir/venv
# activate the virtualenv
source $tempdir/venv/bin/activate
# install the dependencies as above
pip install -r requirements.txt --prefix=vendor
In most cases you should be able to "vendor" all the dependencies. It's basically a crude version of virtualenv.
For example, look at how the requests package includes chardet and urllib3 in its own source tree. Here's an example script that should do the initial downloading and copying for you: https://gist.github.com/proppy/1136723
Once you have the dependencies installed, you can reference them with from .some.namespace import dependency_name to make sure that you're using your local versions.
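As a toy illustration of that layout (all names here are made up; requests' real vendoring is more elaborate):

# Layout:
#   mypkg/
#   +- __init__.py
#   +- client.py
#   +- _vendor/
#      +- __init__.py
#      +- chardet/        <- copied dependency source tree

# mypkg/client.py
from ._vendor import chardet   # always uses the vendored copy, never a site-packages install

The relative import guarantees the lookup never leaves your own package.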
It's possible to do this with recent versions of pip (I'm using 8.1.2). On the build machine:
pip install -r requirements.txt --prefix vendor
Then run it:
PYTHONPATH=vendor/lib/python2.7/site-packages python yourapp.py
(This is basically an expansion of @valentjedi's comment. Thanks!)
Let's say you have a Python module app.py with its dependencies in a requirements.txt file.
First, install all your dependencies into an appdeps folder:
python -m pip install -r requirements.txt --target=./appdeps
Then, in your app.py module, add this dependency folder to the Python path:
# app.py
import sys
sys.path.append('appdeps')  # make the vendored dependencies importable

# rest of your module as usual
# ...
This will work the same way as if you were running the script from a venv with all the dependencies installed inside ;>

How to make Travis CI install Python dependencies declared in tests_require?

I have a Python package with a setup.py. It has regular dependencies declared in install_requires and development dependencies declared in tests_require, e.g. flake8.
I thought pip install -e . or running python setup.py test would also install my development dependencies and make them available. Apparently they're not, however, and I'm struggling to set up my Travis CI build right.
install:
- "pip install -e ."
script:
- "python setup.py test"
- "flake8"
A build configured as above will fail, because flake8 will not be found as a valid command. I also tried to invoke flake8 from inside the python setup.py test command (via subprocess), but without success.
Also, I hate the fact that flake8 can't easily be made an integral part of the python setup.py test command, but that's another story.
I prefer to keep most of the configuration in tox.ini and rely on it to install and run whatever is to be run. For testing I use pytest (the solution can easily be modified to use other testing frameworks).
The following files are used:
tox.ini: automates the test
.travis.yml: instructions for Travis
setup.py: installation script to install the package to test
test_requirements.txt: list of requirements for testing
tox.ini
[tox]
envlist = py{26,27,33,34}

[testenv]
commands =
    py.test -sv tests []
deps =
    -rtest_requirements.txt
.travis.yml
sudo: false
language: python
python:
- 2.6
- 2.7
- 3.3
- 3.4
install:
- pip install tox-travis
script:
- tox
test_requirements.txt
Just an ordinary requirements file with whatever you need in there (e.g. flake8, pytest and other dependencies).
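It could be as small as:

# test_requirements.txt
flake8
pytest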
You can see a sample at https://github.com/vlcinsky/awslogs/tree/pbr-setup.py
The fact that it uses pbr, coverage and coveralls there is not relevant to my answer (it works with or without pbr).
The more direct answer is that pip install will not install tests_require; this intentionally separates runtime requirements from test requirements. python setup.py test creates a virtualenv-like environment to run the tests in and undoes this afterwards, so flake8 is unavailable once it is done.
Flake8 has setuptools integration and also integrates with pytest if you use that. pytest itself also integrates with setuptools.
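If you also want the test dependencies to be pip-installable, one common pattern (a sketch, not taken from the answers above) is to expose them as an extra in setup.py:

# setup.py
from setuptools import setup

test_requirements = ["pytest", "flake8"]

setup(
    name="mypackage",                            # hypothetical name
    version="0.1.0",
    install_requires=["requests"],               # example runtime dependency
    tests_require=test_requirements,             # used by python setup.py test
    extras_require={"test": test_requirements},  # installable via pip
)

With that in place, pip install -e ".[test]" installs the package together with its test dependencies, which is often all a CI build needs.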

virtualenv + setuptools issue in Pyramid

I have followed these instructions. That is:
Created a folder blah_project and another folder venv within it.
Run virtualenv --no-site-packages venv to create a virtual environment inside venv.
Activated venv with source venv/bin/activate
Run pip install pyramid
Run pcreate -s alchemy blah
Now, the problem I'm facing is that if I run any command, for instance python blah/setup.py test -q, the required packages are installed not in the appropriate venv subpath but in the current directory. Is that the expected behaviour? How do I set things up so that packages are always installed in the right path?
I looked inside setup.py but can't really find anything relevant, i.e. there is no path passed to the setuptools.setup() function call.
Try
pip install -e .
That will install the requirements into your venv environment.
This is, unfortunately, the expected behaviour of the test subcommand of setup.py. The way we solve this in a lot of our subprojects is by defining a new alias, setup.py dev, which installs both the testing dependencies and the actual dependencies at the same time. I don't have a better solution for you, as this is how setup.py test intentionally works. Below are links to the Pyramid configuration that allows setup.py dev to work.
https://github.com/Pylons/pyramid/blob/master/setup.cfg#L12
https://github.com/Pylons/pyramid/blob/master/setup.py#L99
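Those two links boil down to roughly this pattern (a sketch; see the links for Pyramid's actual files). setup.py declares the test dependencies as an extra:

# setup.py
from setuptools import setup

setup(
    name="blah",                                        # the tutorial's project name
    version="0.0",
    extras_require={"testing": ["pytest", "webtest"]},  # illustrative dependency list
)

and setup.cfg aliases dev to an editable install of the package plus that extra:

[aliases]
dev = develop easy_install blah[testing]

After that, python setup.py dev installs everything into the active virtualenv instead of the current directory.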
