Making a neat, installable Python library with Click

I'm trying to make a command-line tool with Click in Python, and I can't seem to find any documentation on packaging the library into something installable. Is there a way to do this? At the moment I'm just using a virtual environment and installing it for testing with the commands listed in the docs (http://click.pocoo.org/4/setuptools/#testing-the-script):
$ virtualenv venv
$ . venv/bin/activate
$ pip install --editable .
I'm relatively new to Click so forgive me if I'm missing something painfully obvious.

If you've followed the Setuptools Integration steps in the article you linked to, you're most of the way there. Try installing the package as if it came from pip (maybe in a different virtualenv):
$ virtualenv deploy
$ source deploy/bin/activate
$ pip install .
Then you can invoke your command as normal - it'll be installed under the bin directory in the virtualenv. It's a good idea to try running the command from somewhere else to make sure you don't depend on being inside the project directory (as you probably have been during testing).
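For reference, the Setuptools Integration the docs describe boils down to a setup.py along these lines (a minimal sketch; yourscript and cli are placeholders for your module name and your Click command function):
from setuptools import setup

setup(
    name='yourscript',
    version='0.1',
    py_modules=['yourscript'],
    install_requires=['Click'],
    entry_points='''
        [console_scripts]
        yourscript=yourscript:cli
    ''',
)
The [console_scripts] entry point is what makes setuptools generate the executable in the virtualenv's bin directory.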
Once you're happy that it installs correctly and all the imports work as expected, you can proceed to register your package with PyPI (the Package Index). You can read about how to do that in the Python docs.
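For instance, a typical sequence is to build the distribution files and upload them (twine is the commonly recommended upload tool, installed separately with pip; bdist_wheel needs the wheel package):
$ python setup.py sdist bdist_wheel
$ twine upload dist/*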
That's about it really - setuptools/Click does most of the heavy lifting.

Related

How to build a python package in a dev mode from source code with conda?

I am working on an open-source Python project that depends on a lot of non-Python packages (Perl, R, ...). Consequently, the project uses conda to install its dependencies, since they can't simply be installed with pip.
You can run $ conda install --channel bioconda <awesome_package> to install the stable build of the package. I want to install it in development mode instead. For a purely Python project, I'd do something like this:
pull the source code from github
$ cd path/to/awesome_package && pip install -e .
run tests
modify the code
run tests
etc.
In step 2 above, the pip command installs everything I need in order to work. It uses the setup.py script to do its job, plus requirements.txt and requirements_dev.txt to install the dependencies. And I don't need to rebuild or reinstall anything when I change something in the source code.
How do I do the same with conda instead of pip? How do you provide a list of requirements (python and non-python) to conda?
The most relevant thing I found is this: http://conda.pydata.org/docs/building/bpp.html
However, with this approach I'd need to rebuild and reinstall the package locally every time I want to run the tests, which is something I'd like to avoid.
I am only going to change the python source code of the package I'm working on.
TL;DR: how do I build a package with conda in dev mode from the source code?
Any help is much appreciated.
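One pattern that gets close to pip install -e with conda (a sketch; the flags shown are real conda/pip options, but how well it works depends on how complete the package's recipe metadata is): let conda install only the package's dependencies, then do the editable install with pip while telling it to skip dependency resolution:
$ conda create --name awesome-dev python
$ source activate awesome-dev
$ conda install --channel bioconda --only-deps <awesome_package>
$ cd path/to/awesome_package
$ pip install --no-deps --editable .
After that, edits to the Python source are picked up without rebuilding, just as in the pip-only workflow. conda-build also ships a conda develop command intended for this use case, though it is less widely used.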

Install local dist package into virtualenv

I have a pytest test, let's call it test.py. I used to run this test outside of virtualenv; now I'm trying to run it inside a virtualenv sandbox.
The project is structured like this:
~/project/test # where test.py and all virtualenv files live
~/project/mylibrary
test.py imports from mylibrary. In the past this worked because I had the code in ~/project/mylibrary installed into /usr/lib/python2.7/dist-packages/mylibrary.
I can't run virtualenv with the --system-site-packages flag. I also can't move the code from ~/project/mylibrary into the ~/project/test folder. How can I get access to the code in mylibrary inside my virtualenv?
You don't need to do anything special - as long as you are working inside a virtualenv, python setup.py install will automatically install packages into
$VIRTUAL_ENV/lib/python2.7/site-packages
rather than your system-wide
/usr/lib/python2.7/dist-packages
directory.
In general it's better to use pip install mylibrary/, since this way you can neatly uninstall the package using pip uninstall mylibrary.
If you're installing a working copy of some code that you're developing, it might be a good idea to install it in "editable" mode using pip install -e mylibrary/, which creates a link to your source directory so that your installed module gets updated as you edit the code.
The easiest way would be to add the directory containing the library to your sys.path.

Pythonic Ways of Importing Custom Modules?

I've needed to deal with this for some time, but never really figured out what the most pythonic way of importing/setting up PYTHONPATH for custom modules is. I know I can use virtualenv to manage it, I know I can set it inside of scripts, or through pth files, but none of these seem very clean and pythonic to me, so I'm guessing I'm missing something.
Almost always, all custom modules I'm interested in are contained in the git directory I've cloned down that has whatever script I'm running, if that simplifies things.
I'm guessing virtualenv is the answer, but figured I'd ask in case I'm missing anything.
EDIT: To clarify, this is only a question about custom modules. I'm already using pip for modules from PyPI.
You can also use pip to install packages that are not on PyPI. You just need a URI endpoint and a valid Python package:
Examples:
$ pip install https://github.com/pypa/pip/archive/develop.zip#egg=pip
$ pip install git+https://github.com/pypa/pip.git#egg=pip
$ pip install git+git://github.com/pypa/pip.git#egg=pip
$ pip install /path/to/pip.tar.gz
$ pip install .
Read more at https://pip-installer.org/en/latest/usage.html#pip-install
virtualenv is a good start.
There are also package managers like pip and easy_install that manage third party modules.
In code you can use:
import sys
sys.path.append('/path/to/customModule')
Virtualenv is the way to go with this.
pip install virtualenv
Then make a folder to setup your environments. Inside that folder:
virtualenv <new_env_name>
That'll create a new folder in that directory; inside it there's a bin folder, and you activate the environment by sourcing the activate script in that bin folder. Once it's active, anything you pip install will go into that environment only.
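Concretely, with the name you picked above:
$ source <new_env_name>/bin/activate
(new_env_name) $ pip install some_package
The (new_env_name) prefix in the prompt tells you the environment is active; deactivate returns you to the system Python.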
If you're cloning a git repo whose code you also want to be able to browse easily (e.g. you're also working on that repo), clone it into your working directory and then symlink the package folder into the site-packages directory inside that virtualenv's lib directory. Otherwise, if it's packaged correctly, python setup.py install should install it properly for that virtualenv.
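For example (paths are illustrative):
$ ln -s ~/work_dir/myrepo/mypackage ~/envs/myenv/lib/python2.7/site-packages/mypackage
This leaves the working copy where you can edit it while the virtualenv sees it as installed.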

list python packages consumed by an application

import pkgutil

def get_pkgs():
    # Walk every module/package importable on sys.path and collect the names.
    pkgs = []
    for importer, modname, ispkg in pkgutil.walk_packages(path=None, onerror=lambda x: None):
        pkgs.append(modname)
    return pkgs
The code snippet above gives me all the Python packages in my distribution. Can someone suggest a way to find which of these are actually used by a given Python application?
Snakefood is nice, as Sven Marnach already answered, but it reports dependencies at the file level, and I'm not sure whether it can tell you just which packages are used rather than every single file dependency.
For packages available in the Python Package Index, you could use virtualenv (and pip, which comes with it) to get a simple list of required/used packages.
For example, assuming you also have the excellent virtualenvwrapper tools installed (highly recommended), below is a sequence that shows what the package requirements are for pylint:
$ mkvirtualenv pylint_dep_check --no-site-packages
New python executable in pylint_dep_check/bin/python
Installing setuptools............done.
$ pip freeze #Note the wsgiref 'bug' where it always shows up
wsgiref==0.1.2
$ workon pylint_dep_check
(pylint_dep_check) $ pip install pylint
(... snipped lengthy install text ...)
(pylint_dep_check) $ pip freeze
logilab-astng==0.21.1
logilab-common==0.55.0
pylint==0.23.0
unittest2==0.5.1
wsgiref==0.1.2
Note the all-important use of the --no-site-packages virtualenv creation option, which (surprise!) ensures that your virtualenv is completely fresh and has none of your distribution's site-packages installed. This way it is clear what is needed for the app you installed.
If this is an application that you have developed, a nice way to keep track of your dependencies (and an excellent/clean way to work) is to set up the application in a clean virtualenv (created using the --no-site-packages option again) and then again use pip freeze to figure out what packages you've installed to make it work.
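For example, once the app works in the clean environment, you can snapshot the result and reproduce it later:
(myapp_env) $ pip freeze > requirements.txt
and then, on another machine or in a fresh virtualenv:
$ pip install -r requirements.txt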
The ability to start a "fresh" python installation with the --no-site-packages option is extremely useful. I do it for all applications and to test out packages I'm interested in without cluttering my workspace(s).
If you aren't using virtualenv and pip yet, get on it already. Here is a good introduction:
http://mathematism.com/2009/07/30/presentation-pip-and-virtualenv/
The simplest way is to use the Python dependency analysis tool snakefood.
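For example (assuming snakefood is installed; its sfood command prints the dependencies it finds for a file or directory):
$ sfood /path/to/your/app | less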

Best way to install python packages locally for development

Being new to the Python game, I seem to have missed out on some knowledge of how you can develop on a program but also keep it installed in your live environment.
Programs like gpodder can be run directly from the source checkout, which is really handy; however, others want to be "installed" to run.
A lot of programs are distributed with a setup.py and instructions to run "python ./setup.py install" as root, which will put stuff somewhere in your file system. There are even install commands like "develop" which seem to hold the promise of what I want. So I tried:
export PYTHONPATH=/home/alex/python
python ./setup.py develop --install-dir=/home/alex/python
Which downloaded a bunch of stuff locally and seems to magically ensure the application I'm hacking on is still being run out of the source tree. So I guess my roundabout question is: is this the correct way of developing Python code? How do things like easy_install and pip factor into this?
So I tried the following:
python /usr/share/pyshared/virtualenv.py /home/alex/src/goobook
cd /home/alex/src/goobook/googbook.git
/home/alex/src/goobook/bin/python ./setup.py develop
And finally linked the program in question to my ~/bin
cd /home/alex/src/goobook
linkbin.pl bin/goobook
However invocation throws up a load of extra chatter which seems to imply it's wrong:
17:17 alex#socrates/i686 [goobook] >goobook --help
/home/alex/bin/goobook:5: UserWarning: Module pkg_resources was already imported from /home/alex/src/goobook/lib/python2.5/site-packages/setuptools-0.6c8-py2.5.egg/pkg_resources.py, but /home/alex/src/goobook/lib/python2.5/site-packages/distribute-0.6.10-py2.5.egg is being added to sys.path
from pkg_resources import load_entry_point
/home/alex/bin/goobook:5: UserWarning: Module site was already imported from /home/alex/src/goobook/lib/python2.5/site.pyc, but /home/alex/src/goobook/lib/python2.5/site-packages/distribute-0.6.10-py2.5.egg is being added to sys.path
from pkg_resources import load_entry_point
Install virtualenv (http://pypi.python.org/pypi/virtualenv) to set up a localized virtual environment for your libraries, and setuptools (http://pypi.python.org/pypi/setuptools), i.e. easy_install, to install new things.
Virtualenv allows you to work in completely independent and isolated Python environments. It will let you easily create multiple environments which have different Python packages installed or different versions of a same package. Virtualenv also lets you easily switch between your different environments.
As of 2012, the de facto preferred tool for package management in Python is pip rather than setuptools. Pip is able to handle dependencies and to install/uninstall globally or inside a virtual environment. Pip even comes out-of-the-box with virtualenv.
Python 3
Also worth mentioning is the fact that virtual environments are becoming a part of Python itself in release 3.3, with the implementation of PEP 405.
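On Python 3.3 and later that means an environment can be created with nothing but the standard library:
$ python3 -m venv env
$ source env/bin/activate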
The Python Packaging User Guide, which "aims to be the authoritative resource on how to package, publish and install Python distributions using current tools", recommends using pip to install in "development mode":
pip install -e <path>
Thus in the root directory of your package you can simply
pip install -e .
See installing from a local source tree.
The best way to develop Python apps with dependencies is to:
Download the desired version of the Python interpreter.
Install and use buildout (http://www.buildout.org/).
Buildout is something like Maven for Java (it will fetch all needed packages automatically).
This way your Python interpreter will not be polluted by third-party packages (this is important if you will be running the developed application on other machines). Additionally, you can integrate buildout with the virtualenv package (this lets you create a virtual Python interpreter for each project).
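For the record, a minimal buildout.cfg for developing a single package might look like this (a sketch; myapp is a placeholder for your project's name, and zc.recipe.egg is the standard recipe for building eggs and their console scripts):
[buildout]
develop = .
parts = myapp

[myapp]
recipe = zc.recipe.egg
eggs = myapp
Running bin/buildout (after bootstrapping with bootstrap.py) then generates the project's scripts under bin/, wired to the in-place source tree.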
