I develop a CLI tool in Python for in-house use.
I would like to introduce pipenv to my project to manage "dependencies of dependencies", because I encountered a bug caused by a difference between the production and development environments.
However, my CLI tool is installed as a package (httpie and ansible take this strategy).
So I have to specify all dependencies in setup.py.
How should I get the "dependencies of dependencies" from Pipfile.lock into setup.py?
(Or should I take another approach?)
It is suggested that you do this the other way around: instead of referencing dependencies in the Pipfile, list them in setup.py, and reference them from the Pipfile with
pipenv install -e .
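For example, a minimal setup.py could look like this (the package name, dependency, and entry point are illustrative, not taken from the question):
from setuptools import setup, find_packages

setup(
    name='mycli',                        # hypothetical name for the CLI tool
    version='0.1.0',
    packages=find_packages(),
    # List only the direct dependencies here; Pipfile.lock pins the
    # transitive ones when you run pipenv lock.
    install_requires=['requests>=2.0'],
    entry_points={'console_scripts': ['mycli=mycli.cli:main']},
)
pipenv install -e . then records the project itself in the Pipfile, and pipenv lock resolves and pins the full dependency tree, including the dependencies of dependencies, in Pipfile.lock.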
I'm using tox with poetry and pyenv, and I'm getting quite confused.
I use pyenv local 3.6.9 3.7.10 to make several Python versions available in my myprojectroot folder. On top of that, I use poetry to manage dependencies and the mypackage build. Finally, I use tox to run automated tests against the several Python versions.
My problem is that tox creates, for, say, version 3.6.9, a virtual environment located in the myproject/.tox directory. To that end, it installs all the dependencies listed by poetry in that virtual env, but it also installs mypackage itself (I checked in the .tox folder).
Questions:
tox usually installs packages with pip, yet I use poetry here. How can it install my package then? Does it build a wheel with poetry and install it afterwards?
Does it pick up modifications to my local code? Should I run tox -r?
I recently moved my test folder configuration to
pyproject.toml
src
+- mypackage
   +- __init__.py
   +- mypackage.py
tests
+- test_mypackage.py
and I need to run pytest when modifying mypackage. How do I do that?
What's the link with skipsdist=True?
Thanks for your help!
ad 1) tox does not build a wheel, but an sdist (source distribution). If you want to build a wheel, you need to install https://github.com/ionelmc/tox-wheel
But your idea is right: poetry builds the sdist, and tox uses pip under the hood to install it.
ad 2) tox notices changes to your source code, so there is no need to run tox -r. The tox documentation lacks a bit of info on this topic. Meanwhile, have a look at https://github.com/tox-dev/tox/issues/2003#issuecomment-815832363.
ad 3) pytest does test discovery on its own, so it should be able to find the tests. A command like pytest or poetry run pytest should be enough in your case. Disclaimer: I do not use poetry, so maybe you'd need to be more explicit about the path. The official poetry documentation on tox suggests the following command: poetry run pytest tests/, see https://python-poetry.org/docs/faq/#is-tox-supported
ad 4) You can read more about skipsdist=True at https://tox.readthedocs.io/en/latest/config.html#conf-skipsdist - it tells tox whether to build an sdist or not. If you do not build an sdist, your tests will not exercise the built package, but only the source code (if you point pytest at it). This may not be what you want, depending on whether you are developing an app or a library, or other circumstances I do not know.
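Putting that together, the tox configuration from the poetry FAQ linked above looks roughly like this (envlist adapted to the question's interpreters; treat it as a sketch rather than a verified config):
[tox]
isolated_build = true
envlist = py36, py37

[testenv]
whitelist_externals = poetry
commands =
    poetry install -v
    poetry run pytest tests/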
My project has django-heroku in its Pipfile as a package.
django-heroku has gunicorn in its Pipfile as a dev-package. See: https://github.com/heroku/django-heroku/blob/master/Pipfile
I would expect that after running pipenv install --dev in my project, I could then run pipenv run gunicorn.
But it throws the following error:
Error: the command gunicorn could not be found within PATH or Pipfile's [scripts].
If dev dependencies aren't available, what's the point of install --dev?
One answer is that the "dev dependencies" of package X are the packages someone would need if they were developing (as opposed to using) package X.
I would expect that after running pipenv install --dev in my project, ...
If you use pipenv install --dev in your project, pipenv should install all the packages that are required to develop your project.
If it recursively installed all dev dependencies all the way down, it might pull in Python profiling packages, test runners, etc., that other packages need for development. Those wouldn't necessarily be appropriate for someone developing your project.
As an example, if my project listed pytest as a dev dependency, I would be unhappy if pipenv installed nose, which could be listed as a dev dependency in some other, out-of-date package.
If developers of your package need gunicorn, you should list it explicitly as a dev dependency of your project.
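A single command does that and records it in the [dev-packages] section of your Pipfile:
pipenv install --dev gunicorn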
I believe the Pipfile you've linked to is only relevant to the development of this package.
However, when the package is installed, it relies on its setup.py:
REQUIRED = [
    'dj-database-url>=0.5.0', 'whitenoise', 'psycopg2', 'django'
]
As you can see, gunicorn is missing.
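Presumably that list is wired into setuptools along these lines (a simplified sketch, not the package's actual setup.py):
from setuptools import setup

setup(
    name='django-heroku',
    version='0.0.0',            # placeholder
    # Only these runtime dependencies are installed for consumers;
    # the dev-packages from the Pipfile (such as gunicorn) are not.
    install_requires=REQUIRED,
)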
When I need to work on one of my pet projects, I simply clone the repository as usual (git clone <url>), edit what I need, run the tests, update the setup.py version, commit, push, build the packages and upload them to PyPI.
What is the advantage of using pip install -e? Should I be using it? How would it improve my workflow?
I find pip install -e extremely useful when simultaneously developing a product and a dependency, which I do a lot.
Example:
You build websites using Django for numerous clients, and have also developed an in-house Django app called locations which you reuse across many projects, so you package it, version it, and make it available for pip to install.
When you work on a project, you install the requirements as usual, which installs locations into site-packages.
But you soon discover that locations could do with some improvements.
So you grab a copy of the locations repository and start making changes. Of course, you need to test these changes in the context of a Django project.
Simply go into your project and type:
pip install -e /path/to/locations/repo
This will replace the locations package in site-packages with a link to the locations repository, meaning any changes to the code in there will automatically be reflected - just reload the page (so long as you're using the development server).
The link always points at the current files in the repository, meaning you can switch branches to see changes or try different things, etc.
The alternative would be to create a new version, publish it, and hope you've not forgotten anything. If you have many such in-house apps, this quickly becomes untenable.
For those who don't have time:
If you install your project with the -e flag (e.g. pip install -e mynumpy) and use it in your code (e.g. from mynumpy import some_function), then when you make any change to some_function, you can use the updated function without reinstalling the package.
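A quick way to see this in action (reusing the hypothetical mynumpy package from above):
pip install -e ./mynumpy
python -c "from mynumpy import some_function; some_function()"
# edit some_function in the mynumpy source tree, then run again:
python -c "from mynumpy import some_function; some_function()"   # change is picked up, no reinstall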
pip install -e is how you install a setuptools-based project in editable (development) mode via pip, dependencies included.
What you typically do to install the project and its dependencies is:
git clone URL
cd project
run pip install -e . or pip install -e .[dev]*
And now all the dependencies should be installed.
*[dev] is the name of an extras group (extras_require) defined in setup.py
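Such a group is declared via extras_require in setup.py; a minimal sketch (names illustrative):
from setuptools import setup

setup(
    name='project',
    version='0.1.0',
    install_requires=['requests'],      # installed for every user
    extras_require={
        'dev': ['pytest', 'flake8'],    # installed only via pip install -e .[dev]
    },
)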
Besides the setuptools egg format, there is also the wheel format for installing Python packages.
Both are built distribution formats, based on the promise that no building or compilation needs to be performed at install time.
I run Vagrant on Mac OS X. I am coding inside a virtual machine with CentOS 6, and I have the same versions of Python and Ruby in my development and production environment. I have these restrictions:
I cannot manually install. Everything must come through RPM.
I cannot use pip install and gem install to install the libraries I want as the system is managed through Puppet, and everything I add will be removed.
yum has old packages. I usually cannot find the latest versions of the libraries.
I would like to put my libraries locally in a lib directory near my scripts, and create an RPM that includes those frozen versions of dependencies. I cannot find an easy way to bundle my libraries for my scripts and push everything into my production server. I would like to know the easiest way to gather my dependencies in Python and Ruby.
I tried:
virtualenv (with --relocatable option)
PYTHONPATH
sys.path.append("lib path")
I don't know which is the right way to go. Also, for Ruby, is there any way to solve my problem with bundler? I see that bundler is mostly associated with Rails. Does it work for small custom scripts?
I like the approach of Node.js and npm: all packages are stored locally in node_modules. I have the nodejs RPM installed, and I deploy a folder with my application on the production server. I would like to do the same in Ruby and Python.
I don't know Node, but what you describe for npm seems to be exactly what a virtualenv is. Once the virtualenv is activated, pip installs only within that virtualenv - so Puppet won't interfere. You can write out your current list of packages to a requirements.txt file with pip freeze, and recreate the whole thing with pip install -r requirements.txt. Ideally you would then deploy with Puppet, and the deploy step would involve creating or updating the virtualenv, activating it, then running that pip command.
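A sketch of that workflow (the paths are illustrative):
virtualenv /opt/myapp/venv            # create an isolated environment
source /opt/myapp/venv/bin/activate   # activate it; pip now installs only inside it
pip install -r requirements.txt       # recreate the frozen dependency set
pip freeze > requirements.txt         # (on the dev machine) record current versions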
Maybe take a look at Docker?
With Docker you could create an image of your specific environment and deploy that.
https://www.docker.com/whatisdocker/
Being new to Python, I seem to have missed out on some knowledge of how you can develop on a program but also keep it in your live environment.
Programs like gpodder can be run directly from the source checkout, which is really handy; however, others want to be "installed" to run.
A lot of programs are distributed with a setup.py, with instructions to run "python ./setup.py install" as root, which will put stuff somewhere in your file system. There are even install commands like "develop" which seem to hold the promise of what I want. So I tried:
export PYTHONPATH=/home/alex/python
python ./setup.py develop --install-dir=/home/alex/python
which downloaded a bunch of stuff locally and seems to magically ensure that the application I'm hacking on is still run out of the source tree. So I guess my roundabout question is: is this the correct way of developing Python code? How do things like easy_install and pip factor into this?
So I tried the following:
python /usr/share/pyshared/virtualenv.py /home/alex/src/goobook
cd /home/alex/src/goobook/googbook.git
/home/alex/src/goobook/bin/python ./setup.py develop
And finally linked the program in question to my ~/bin
cd /home/alex/src/goobook
linkbin.pl bin/goobook
However, invocation throws up a load of extra chatter, which seems to imply it's wrong:
17:17 alex#socrates/i686 [goobook] >goobook --help
/home/alex/bin/goobook:5: UserWarning: Module pkg_resources was already imported from /home/alex/src/goobook/lib/python2.5/site-packages/setuptools-0.6c8-py2.5.egg/pkg_resources.py, but /home/alex/src/goobook/lib/python2.5/site-packages/distribute-0.6.10-py2.5.egg is being added to sys.path
from pkg_resources import load_entry_point
/home/alex/bin/goobook:5: UserWarning: Module site was already imported from /home/alex/src/goobook/lib/python2.5/site.pyc, but /home/alex/src/goobook/lib/python2.5/site-packages/distribute-0.6.10-py2.5.egg is being added to sys.path
from pkg_resources import load_entry_point
Install:
http://pypi.python.org/pypi/virtualenv
to set up a localized virtual environment for your libraries, and:
http://pypi.python.org/pypi/setuptools
i.e. "easy_install" to install new things.
Virtualenv allows you to work in completely independent and isolated Python environments. It will let you easily create multiple environments which have different Python packages installed or different versions of a same package. Virtualenv also lets you easily switch between your different environments.
As of 2012, the de facto preferred tool for package management in Python is pip rather than setuptools. Pip is able to handle dependencies and to install/uninstall globally or inside a virtual environment. Pip even comes out-of-the-box with virtualenv.
Python 3
Also worth mentioning is the fact that virtual environments are becoming a part of Python itself in release 3.3, with the implementation of PEP 405.
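With Python 3.3 and later, creating a virtual environment needs no third-party tool:
python3 -m venv myenv          # PEP 405's venv module, in the standard library
source myenv/bin/activate      # activate it on Unix-like shells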
The Python Packaging User Guide, which "aims to be the authoritative resource on how to package, publish and install Python distributions using current tools", recommends using pip to install in "development mode":
pip install -e <path>
Thus, in the root directory of your package, you can simply run
pip install -e .
See installing from a local source tree.
The best way to develop Python apps with dependencies is to:
Download the desired version of the Python interpreter.
Install and use buildout (http://www.buildout.org/).
Buildout is something like Maven for Java (it will fetch all needed packages automatically).
This way your Python interpreter will not be polluted by third-party packages (this is important if you will be running the developed application on other machines). Additionally, you can integrate buildout with the virtualenv package (this allows you to create a virtual Python interpreter for each project).
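A minimal buildout.cfg could look like this (the part name and egg are illustrative; zc.recipe.egg is one commonly used recipe):
[buildout]
develop = .
parts = app

[app]
recipe = zc.recipe.egg
eggs = myapp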