Can I set a download cache path for setuptools? I can do that with pip by setting the environment variable PIP_DOWNLOAD_CACHE (How to cache downloaded PIP packages), but I don't know what to do with setuptools.
Every time I run:
python setup.py develop
I want setuptools to cache the downloaded files, so I don't have to download them again.
You can use the options -a, -d and -f to create and use a cache directory. This is what the help text for them says:
--always-copy (-a) Copy all needed packages to install dir
--install-dir (-d) install package to DIR
--find-links (-f) additional URL(s) to search for packages
You first use python setup.py develop -a -d CACHEDIR to cache the needed packages in the directory of your choice, then use that directory as a source in the future with python setup.py develop -f CACHEDIR.
You may need to add any flags you normally use to one, the other, or both invocations, depending on the flag in question.
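For example (a sketch; the cache path is illustrative):
# first run: copy every needed package into the cache directory
python setup.py develop -a -d ~/.setuptools-cache
# later runs: resolve packages from the cache before going to the network
python setup.py develop -f ~/.setuptools-cache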
Related
I'm building a python package to use for 'global' functions (i.e. stuff that I will use in multiple other projects). I have built the package using py -m build and it then puts the MyPackage-0.1.0.tar.gz into the dist directory in my folder.
My goal is to be able to run pip install MyPackage from within any other projects, and it will install the latest build of my package. In other words, I do not want to use something like --find-links. This way, I could also include the package in a requirements.txt file.
I have tried putting the tarball in a directory which is on my system's PATH, and into a subfolder within there (e.g. PathDir/MyPackage/MyPackage-0.1.0.tar.gz), but I keep getting the same 'No matching distribution found' error.
The documentation for pip install says:
pip looks for packages in a number of places: on PyPI (if not disabled via --no-index), in the local filesystem, and in any additional repositories specified via --find-links or --index-url.
When it says 'in the local filesystem', where does it begin its search? Is there a way to change this (e.g. by setting some environment variable)?
When looking for files in the local filesystem, pip has no notion of a search path. You must give a path accessible from the current working directory. It can be an absolute path:
pip install /path/to/MyPackage-0.1.0.tar.gz
a relative path:
cd /path
pip install to/MyPackage-0.1.0.tar.gz
or a simple name if the package file is inside the current working directory:
cd /path/to
pip install MyPackage-0.1.0.tar.gz
I found the answer after a lot of searching, and so here is the solution:
pip uses configuration files to define its internal settings. In these configuration files, you can specify default values for find-links. This means that pip will look there for compatible packages, as well as online.
You can check which configuration values have been set, and which files pip will search for them, by running pip config list -v. You just need to edit/create one of the files listed (e.g. pip.ini) and add the following:
[install]
find-links=file://C:/Users/.../PathDir/MyPackage/
By creating this at the User/Global level (rather than the site level), this installation also works when inside a virtual environment.
Source: https://pip.pypa.io/en/stable/topics/configuration/
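Equivalently, you can have pip write the find-links entry into the user-level file for you (a sketch; the path here is a hypothetical placeholder, not the elided one above):
pip config set install.find-links file://C:/Users/YourName/PathDir/MyPackage/
pip install MyPackage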
pip can install packages from the local file system. They do not even need to be package files; they can be working directories or git checkouts.
Usually I use pip install --editable:
pip install --editable /path/to/my/python/package
With --editable, changes to .py files in the folder are automatically reflected in your application.
You can use --editable in requirements.txt file as well.
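For example, a requirements.txt entry could look like this (the path is illustrative):
--editable /path/to/my/python/package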
I have a Python project that has a few dependencies (defined under install_requires in setup.py). My ops people require a package to be self-contained and only depend on a Python installation. The litmus test would be that they're able to get a zip file, then unzip and run it without an internet connection.
Is there an easy way to package an install including dependencies? It is acceptable if I have to build on the OS/architecture that it will eventually be run on.
For what it's worth, I've tried both setup.py build and setup.py sdist, but they don't seem to fit the bill since they do not include dependencies. I've also considered virtualenv (which could be installed if absolutely necessary), but that has hard-coded paths, which makes it less than ideal.
There are a few nuances to how pip works. Unfortunately, using --prefix vendor to store all the dependencies of the project doesn't work if any of those dependencies, or dependencies of dependencies, are already installed somewhere pip can find them. pip will skip those dependencies and install only the rest to your vendor folder.
In the past I've used virtualenv's --no-site-packages option to solve this issue. At one company we would ship the whole virtualenv, which includes the python binary. In the interest of only shipping the dependencies, you can combine using a virtualenv with the --prefix switch on pip to give yourself a clean environment that installs to the right place.
I'll provide an example script that creates a temporary virtualenv, activates it, then installs the dependencies to a local vendor folder. This is handy if you are running in CI.
#!/bin/bash
tempdir=$(mktemp -d -t project.XXX) # create a temporary directory
trap "rm -rf $tempdir" EXIT # ensure it is cleaned up
# create the virtualenv and exclude packages outside of it
virtualenv --python=$(which python2.7) --no-site-packages $tempdir/venv
# activate the virtualenv
source $tempdir/venv/bin/activate
# install the dependencies as above
pip install -r requirements.txt --prefix=vendor
In most cases you should be able to "vendor" all the dependencies. It's basically a crude version of virtualenv.
For example look at how the requests package includes chardet and urllib3 in its own source tree. Here's an example script that should do the initial downloading and copying for you: https://gist.github.com/proppy/1136723
Once you have the dependencies installed, you can reference them with from .some.namespace import dependency_name to make sure that you're using your local versions.
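As a sketch (the layout and names here are hypothetical), a vendored dependency lives inside your own package tree and is imported relatively:
# myproject/main.py
# assumes the dependency's source tree was copied to myproject/some/namespace/dependency_name/
from .some.namespace import dependency_name  # the vendored copy, not a site-packages install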
It's possible to do this with recent versions of pip (I'm using 8.1.2). On the build machine:
pip install -r requirements.txt --prefix vendor
Then run it:
PYTHONPATH=vendor/lib/python2.7/site-packages python yourapp.py
(This is basically an expansion of @valentjedi's comment. Thanks!)
Let's say you have a Python module app.py with its dependencies in a requirements.txt file.
First, install all the dependencies into an appdeps folder:
python -m pip install -r requirements.txt --target=./appdeps
Then, in your app.py module, add this dependency folder to the Python path:
# app.py
import os
import sys

# resolve appdeps relative to this file so imports work from any working directory
sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), 'appdeps'))

# rest of your module normally
# ...
This will work the same way as if you were running the script from a venv with all the dependencies installed inside. ;>
I don't understand this...
I want to install this https://gist.github.com/sixtenbe/1178136.
It is a peak detection script for python.
Everywhere I look I am told to use pip with the .git extension.
All I see is how to download the .zip, but from there I am lost.
How can I install this?
Thanks.
You can get the individual files in the Gist (or download the Gist as a ZIP and extract it) and put them in your source code folder.
Then you will be able to import them as modules in your own scripts:
import analytic_wfm as AW
AW.ACV_A6( ... )
import peakdetect as PK
PK.peakdetect_parabola( ... )
Let's give it another look.
By "installing a package" we might mean that the package should be available via import.
For that the package directory should reside either in the current directory or in one of the other directories in the import search path.
One such directory is the "user-specific site-packages directory, USER_SITE":
python -c "import site; print(site.getusersitepackages())"
Git URL
First we might need a Git URL. Going to https://gist.github.com/sixtenbe/1178136 we can click on the Embed pop-up and switch it to Clone via HTTPS:
in order to obtain the Git URL: https://gist.github.com/1178136.git.
git and bash
Having the Git URL and a Unix shell (bash), we can install the package manually into the USER_SITE.
Let's go into the USER_SITE first:
cd $(python -c "import site; print(site.getusersitepackages())")
pwd
Now that we are in the USER_SITE, let's download the Gist:
git clone https://gist.github.com/1178136.git analytic_wfm
Finally, let's verify that the package is now available:
cd && python -c "import analytic_wfm.analytic_wfm; print(analytic_wfm.analytic_wfm.__all__)"
If numpy is installed, it prints
['ACV_A1', 'ACV_A2', 'ACV_A3', 'ACV_A4', 'ACV_A5', 'ACV_A6', 'ACV_A7', 'ACV_A8']
pip
Let's try to install a Gist package with pip.
For pip install we should prefix the Git URL with git+:
pip install --user git+https://gist.github.com/1178136.git
This gives us the error:
ERROR: git+https://gist.github.com/1178136.git does not appear to be a
Python project: neither 'setup.py' nor 'pyproject.toml' found.
Looks like the package we've picked is missing the necessary pip configuration!
Let's try another one:
pip install --user git+https://gist.github.com/bf91613a021a536c7ce16cdba9168604.git
This one installs without a problem:
Successfully built llog
Installing collected packages: llog
Successfully installed llog-1.0
That is mainly because it has a setup.py.
Note also that Gist does not support subfolders, and pip seems to depend on them when handling the packages argument, but the code in setup.py can work around this by creating the package subfolder on the fly and copying the Python files there.
Hence, if you want to install that Gist, https://gist.github.com/sixtenbe/1178136, along with the rest of your requirements.txt dependencies, you can fork it and add a setup.py to that effect.
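A minimal setup.py for such a fork might look like this (a sketch; the version and module list are assumptions based on the files in the Gist):
# setup.py
from setuptools import setup

setup(
    name="analytic-wfm",
    version="0.1",
    # declaring the Gist's top-level .py files as plain modules
    # sidesteps the missing-subfolder problem described above
    py_modules=["analytic_wfm", "peakdetect"],
)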
pypi
Given that analytic-wfm can also be found on the Python Package Index, https://pypi.org/project/analytic-wfm/, you can install it with
pip install analytic-wfm
When would the -e, or --editable option be useful with pip install?
For some projects, the last line in requirements.txt is -e . (a dot). What does it do exactly?
As the man page says:
-e,--editable <path/url>
Install a project in editable mode (i.e. setuptools "develop mode") from a local project path or a VCS url.
So you would use this when trying to install a package locally, most often in the case when you are developing it on your system. It will just link the package to the original location, basically meaning any changes to the original package would reflect directly in your environment.
Some nuggets around the same here and here.
An example run can be:
pip install -e .
or
pip install -e ~/ultimate-utils/ultimate-utils-proj-src/
Note that the second form is the full path to the directory where setup.py lives.
Concrete example of using --editable in development
If you play with this test package as in:
cd ~
git clone https://github.com/cirosantilli/vcdvcd
cd vcdvcd
git checkout 5dd4205c37ed0244ecaf443d8106fadb2f9cfbb8
python -m pip install --editable . --user
it outputs:
Obtaining file:///home/ciro/bak/git/vcdvcd
Installing collected packages: vcdvcd
Attempting uninstall: vcdvcd
Found existing installation: vcdvcd 1.0.6
Can't uninstall 'vcdvcd'. No files were found to uninstall.
Running setup.py develop for vcdvcd
Successfully installed vcdvcd-1.0.6
The Can't uninstall 'vcdvcd' message is normal: pip tried to uninstall any existing vcdvcd so it could replace it with the "symlink-like mechanism" produced in the following steps, but failed because there was no previous installation.
Then it generates a file:
~/.local/lib/python3.8/site-packages/vcdvcd.egg-link
which contains:
/home/ciro/vcdvcd
.
and acts as a "symlink" that points the Python interpreter at the source directory.
So now, if I make any changes to the git source code under /home/ciro/vcdvcd, it reflects automatically on importers who can from any directory do:
python -c 'import vcdvcd'
Note however that, with my pip version at least, executable files installed with --editable, such as the vcdcat script provided by that package via scripts= in setup.py, do not get symlinked, just copied to:
~/.local/bin/vcdcat
just like for regular installs, and therefore updates to the git repository won't directly affect them.
By comparison, a regular non --editable install from the git source:
python -m pip uninstall vcdvcd
python -m pip install --user .
produces a copy of the installed files under:
~/.local/lib/python3.8/site-packages/vcdvcd
Uninstall of an editable package as done above requires a new enough pip as mentioned at: How to uninstall editable packages with pip (installed with -e)
Tested in Python 3.8, pip 20.0.2, Ubuntu 20.04.
Recommendation: develop directly in-tree whenever possible
The editable setup is useful when you are testing your patch to a package through another project.
If however you can fully test your change in-tree, just do that instead of generating an editable install which is more complex.
E.g., the vcdvcd package above is set up in a way that you can just cd into the source and run ./vcdcat without pip installing the package itself (in general, you might need to install dependencies from requirements.txt first). The import vcdvcd that that executable does (or possibly your own custom test) then just finds the package correctly in the same directory it lives in.
From Working in "development" mode:
Although not required, it’s common to locally install your project in
“editable” or “develop” mode while you’re working on it. This allows
your project to be both installed and editable in project form.
Assuming you’re in the root of your project directory, then run:
pip install -e .
Although somewhat cryptic, -e is short for
--editable, and . refers to the current working directory, so together, it means to install the current directory (i.e. your
project) in editable mode.
Some additional insights into the internals of setuptools and distutils from “Development Mode”:
Under normal circumstances, the distutils assume that you are going to
build a distribution of your project, not use it in its “raw” or
“unbuilt” form. If you were to use the distutils that way, you would
have to rebuild and reinstall your project every time you made a
change to it during development.
Another problem that sometimes comes up with the distutils is that you
may need to do development on two related projects at the same time.
You may need to put both projects’ packages in the same directory to
run them, but need to keep them separate for revision control
purposes. How can you do this?
Setuptools allows you to deploy your projects for use in a common
directory or staging area, but without copying any files. Thus, you
can edit each project’s code in its checkout directory, and only need
to run build commands when you change a project’s C extensions or
similarly compiled files. You can even deploy a project into another
project’s checkout directory, if that’s your preferred way of working
(as opposed to using a common independent staging area or the
site-packages directory).
To do this, use the setup.py develop command. It works very similarly
to setup.py install, except that it doesn’t actually install anything.
Instead, it creates a special .egg-link file in the deployment
directory, that links to your project’s source code. And, if your
deployment directory is Python’s site-packages directory, it will also
update the easy-install.pth file to include your project’s source
code, thereby making it available on sys.path for all programs using
that Python installation.
It is important to note that pip uninstall cannot uninstall a module that has been installed with pip install -e. So if you go down this route, be prepared for things to get very messy if you ever need to uninstall. A partial solution is to (1) reinstall, keeping a record of the files created, as in sudo python3 setup.py install --record installed_files.txt, and then (2) manually delete all the files listed, as in e.g. sudo rm -r /usr/local/lib/python3.7/dist-packages/tdc7201-0.1a2-py3.7.egg/ (for release 0.1a2 of module tdc7201). This does not clean everything up 100%, however; even after you've done it, importing the (removed!) local library may succeed, and attempting to install the same version from a remote server may fail to do anything (because pip thinks your (deleted!) local version is already up to date).
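As a command sketch of that record-and-delete procedure (the xargs step is an assumption and will misbehave on paths containing spaces):
# (1) reinstall, recording every file the install writes
sudo python3 setup.py install --record installed_files.txt
# (2) remove everything that was recorded
xargs -d '\n' sudo rm -rf -- < installed_files.txt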
As suggested in previous answers, no actual symlinks are created.
How does the '-e' option work? It just updates the file "PYTHONDIR/site-packages/easy-install.pth" with the project path given to the pip install -e command.
So each time Python searches for a package, it will check this directory as well => any changes to the files in this directory are instantly reflected.
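For illustration (the project path is hypothetical): after pip install -e /home/me/myproj, easy-install.pth gains a line like
/home/me/myproj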
I am using pip and trying to install a Python module called pyodbc which has some dependencies on non-Python libraries like unixodbc-dev, unixodbc-bin, unixodbc. I cannot install these dependencies system-wide at the moment, as I am only playing, so I have installed them in a non-standard location. How do I tell pip where to look for these dependencies? More exactly, how do I pass through pip the include dirs (gcc -I) and library dirs (gcc -L -l) to be used when building the pyodbc extension?
pip has a --global-option flag
You can use it to pass additional flags to build_ext.
For instance, to add a --library-dirs (-L) flag:
pip install --global-option=build_ext --global-option="-L/path/to/local" pyodbc
gcc also supports environment variables:
http://gcc.gnu.org/onlinedocs/gcc/Environment-Variables.html
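For instance (a sketch; the prefix is an assumed local install location), gcc picks up extra search paths from CPATH and LIBRARY_PATH:
export CPATH=$HOME/local/include      # searched like -I directories
export LIBRARY_PATH=$HOME/local/lib   # searched like -L directories at link time
pip install pyodbc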
I couldn't find any build_ext documentation, so here is the command-line help:
Options for 'build_ext' command:
--build-lib (-b) directory for compiled extension modules
--build-temp (-t) directory for temporary files (build by-products)
--plat-name (-p) platform name to cross-compile for, if supported
(default: linux-x86_64)
--inplace (-i) ignore build-lib and put compiled extensions into the
source directory alongside your pure Python modules
--include-dirs (-I) list of directories to search for header files
(separated by ':')
--define (-D) C preprocessor macros to define
--undef (-U) C preprocessor macros to undefine
--libraries (-l) external C libraries to link with
--library-dirs (-L) directories to search for external C libraries
(separated by ':')
--rpath (-R) directories to search for shared C libraries at runtime
--link-objects (-O) extra explicit link objects to include in the link
--debug (-g) compile/link with debugging information
--force (-f) forcibly build everything (ignore file timestamps)
--compiler (-c) specify the compiler type
--swig-cpp make SWIG create C++ files (default is C)
--swig-opts list of SWIG command line options
--swig path to the SWIG executable
--user add user include, library and rpath
--help-compiler list available compilers
Building on Thorfin's answer and assuming that your desired include and library locations are in /usr/local, you can pass both in like so:
sudo pip install --global-option=build_ext --global-option="-I/usr/local/include/" --global-option="-L/usr/local/lib" <your package name>
Another way to indicate the location of include files and libraries is to set the relevant environment variables before running pip, e.g.
export LDFLAGS=-L/usr/local/opt/openssl/lib
export CPPFLAGS=-I/usr/local/opt/openssl/include
pip install cryptography
Just FYI... If you are having trouble installing a package with pip, then you can use the
--no-clean option to see what exactly is going on (that is, why the build did not work). For instance, if numpy is not installing properly, you could try
pip install --no-clean numpy
then look at the Temporary folder to see how far the build got. On a Windows machine, this should be located at something like:
C:\Users\Bob\AppData\Local\Temp\pip_build_Bob\numpy
Just to be clear, the --no-clean option tries to install the package, but does not clean up after itself, letting you see what pip was trying to do.
Otherwise, if you just want to download the source code, then I would use the -d flag. For instance, to download the Numpy source code .tar file to the current directory, use:
pip install -d %cd% numpy
I was also helped by Thorfin's answer; I was building GTK3+ on Windows and installing pygobject, and I had difficulty working out how to include multiple folders with pip install.
I tried creating a pip config file as per the pip documentation, but failed.
What worked was the command line:
pip install --global-option=build_ext --global-option="-IlistOfDirectories"
# and/or with: --global-option="-LlistofDirectories"
The separator that works with multiple folders on Windows is the semicolon ';', NOT the colon ':' (it might be different on other operating systems).
sample working command line:
pip install --global-option=build_ext --global-option="-Ic:/gtk-build/gtk/x64/release/include;d:/gtk-build/gtk/x64/release/include/gobject-introspection-1.0" --global-option="-Lc:\gtk-build\gtk\x64\release\lib" pygobject==3.27.1
You can use '\' or '/' in paths, but make sure you do not type a backslash right before a double quote.
The example below will fail because there is a backslash next to the closing double quote:
pip install --global-option=build_ext --global-option="-Ic:\willFail\" --global-option="-Lc:\willFail\" pygobject==3.27.1
Have you ever used virtualenv? It's a Python package that lets you create and maintain multiple isolated environments on one machine. Each can use different modules independently of one another without screwing up dependencies in your system library or in a separate virtual environment.
If you don't have root privileges, you can download and use the virtualenv package from source:
$ curl -O https://pypi.python.org/packages/source/v/virtualenv/virtualenv-X.X.tar.gz
$ tar xvfz virtualenv-X.X.tar.gz
$ cd virtualenv-X.X
$ python virtualenv.py myVE
I followed the above steps this weekend on Ubuntu Server 12.04 and it worked perfectly. Each new virtual environment you create comes with pip by default, so installing packages into your new environment is easy.
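For example (a sketch, continuing with the myVE environment created above; the package is illustrative):
source myVE/bin/activate    # enter the isolated environment
pip install pyodbc          # installed into myVE only, not system-wide
deactivate                  # return to the system environment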
Just in case it's of help to somebody: I still could not find a way to do it through pip, so I ended up simply downloading the package and installing it through its setup.py. I also switched to what seems to be an easier-to-install API called 'pymssql'.