pip install custom include path - python

On an Ubuntu system on which I don't have sudo privileges, I wish to install a package via pip (matplotlib, to be precise), but some of the source packages it needs are not installed on the system (the binaries are, however).
I have created a virtual environment in which to install, and have downloaded the required source code, but I can't place it in the default /usr/include/ etc. When pip runs matplotlib's setup.py script, the source files are reported as missing.
Is there a way to instruct pip or setup.py where to look for the source?
PS: setting CFLAGS or CPPFLAGS adds the locations of the downloaded source to the compile instructions, but setup.py still didn't find the source, so it didn't attempt to compile some components (graphics backends).
PPS: this is similar to, but more specific than, this question

I would suggest doing:
Rebuild whatever binaries you need in your own home directory (this also avoids problems if the apps get upgraded on the system or are otherwise different versions from your source). Assuming the programs use the standard configure scripts, you can do:
mkdir ~/dev
cd app_src
./configure --prefix=$HOME/dev
make && make install
Then when you want to do your pip install, do
export PATH=$HOME/dev/bin:$PATH
export LD_LIBRARY_PATH=$HOME/dev/lib:$LD_LIBRARY_PATH
(Note: what I should really be suggesting is pointing these at your virtualenv, but I haven't hit the issue you're having.)
Do the pip install; if memory serves, pkg-config should pick up the info you want (this assumes matplotlib uses pkg-config to figure out where packages are stored).
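If matplotlib's build does go through pkg-config, pointing PKG_CONFIG_PATH at your new prefix is usually what makes it find the rebuilt libraries. A sketch, assuming the rebuilt packages installed their .pc files under ~/dev/lib/pkgconfig:
export PKG_CONFIG_PATH=$HOME/dev/lib/pkgconfig:$PKG_CONFIG_PATH
pip install matplotlib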

When would the -e, --editable option be useful with pip install?

For some projects the last line in requirements.txt is "-e .". What does it do exactly?
As the man page says:
-e, --editable <path/url>
Install a project in editable mode (i.e. setuptools "develop mode") from a local project path or a VCS url.
So you would use this when trying to install a package locally, most often when you are developing it on your system. It simply links the package to its original location, meaning any changes to the original package are reflected directly in your environment.
Some more nuggets on the same topic here and here.
An example run can be:
pip install -e .
or
pip install -e ~/ultimate-utils/ultimate-utils-proj-src/
Note that the second is the full path to the directory where the setup.py lives.
Concrete example of using --editable in development
If you play with this test package as in:
cd ~
git clone https://github.com/cirosantilli/vcdvcd
cd vcdvcd
git checkout 5dd4205c37ed0244ecaf443d8106fadb2f9cfbb8
python -m pip install --editable . --user
it outputs:
Obtaining file:///home/ciro/bak/git/vcdvcd
Installing collected packages: vcdvcd
Attempting uninstall: vcdvcd
Found existing installation: vcdvcd 1.0.6
Can't uninstall 'vcdvcd'. No files were found to uninstall.
Running setup.py develop for vcdvcd
Successfully installed vcdvcd-1.0.6
The Can't uninstall 'vcdvcd' message is normal: pip tried to uninstall any existing vcdvcd so it could replace it with the "symlink-like mechanism" produced in the following steps, but failed because there was no previous installation.
Then it generates a file:
~/.local/lib/python3.8/site-packages/vcdvcd.egg-link
which contains:
/home/ciro/vcdvcd
.
and acts as a "symlink", telling the Python interpreter where to find the package.
So now, if I make any changes to the git source code under /home/ciro/vcdvcd, they are reflected automatically for importers, who can from any directory do:
python -c 'import vcdvcd'
Note however that, with my pip version at least, binary files installed with --editable, such as the vcdcat script provided by that package via scripts= in setup.py, do not get symlinked, just copied to:
~/.local/bin/vcdcat
just like for regular installs, and therefore updates to the git repository won't directly affect them.
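A quick way to confirm this (path from the output above):
ls -l ~/.local/bin/vcdcat   # a regular file, not a symlink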
By comparison, a regular non --editable install from the git source:
python -m pip uninstall vcdvcd
python -m pip install --user .
produces a copy of the installed files under:
~/.local/lib/python3.8/site-packages/vcdvcd
Uninstalling an editable package as done above requires a new enough pip, as mentioned at: How to uninstall editable packages with pip (installed with -e)
Tested in Python 3.8, pip 20.0.2, Ubuntu 20.04.
Recommendation: develop directly in-tree whenever possible
The editable setup is useful when you are testing your patch to a package through another project.
If however you can fully test your change in-tree, just do that instead of generating an editable install which is more complex.
E.g., the vcdvcd package above is set up in a way that you can just cd into the source and run ./vcdcat without pip installing the package itself (in general, you might still need to install the dependencies from requirements.txt). The import vcdvcd that that executable does (or possibly your own custom test) then just finds the package correctly in the same directory it lives in.
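For instance, a sketch of that in-tree workflow (some.vcd is a placeholder input file):
cd ~/vcdvcd
python -m pip install --user -r requirements.txt   # dependencies only, if needed
./vcdcat some.vcd                                  # runs against the in-tree package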
From Working in "development" mode:
Although not required, it’s common to locally install your project in
“editable” or “develop” mode while you’re working on it. This allows
your project to be both installed and editable in project form.
Assuming you’re in the root of your project directory, then run:
pip install -e .
Although somewhat cryptic, -e is short for --editable, and . refers to the current working directory, so together, it means to install the current directory (i.e. your project) in editable mode.
Some additional insights into the internals of setuptools and distutils from “Development Mode”:
Under normal circumstances, the distutils assume that you are going to
build a distribution of your project, not use it in its “raw” or
“unbuilt” form. If you were to use the distutils that way, you would
have to rebuild and reinstall your project every time you made a
change to it during development.
Another problem that sometimes comes up with the distutils is that you
may need to do development on two related projects at the same time.
You may need to put both projects’ packages in the same directory to
run them, but need to keep them separate for revision control
purposes. How can you do this?
Setuptools allows you to deploy your projects for use in a common
directory or staging area, but without copying any files. Thus, you
can edit each project’s code in its checkout directory, and only need
to run build commands when you change a project’s C extensions or
similarly compiled files. You can even deploy a project into another
project’s checkout directory, if that’s your preferred way of working
(as opposed to using a common independent staging area or the
site-packages directory).
To do this, use the setup.py develop command. It works very similarly
to setup.py install, except that it doesn’t actually install anything.
Instead, it creates a special .egg-link file in the deployment
directory, that links to your project’s source code. And, if your
deployment directory is Python’s site-packages directory, it will also
update the easy-install.pth file to include your project’s source
code, thereby making it available on sys.path for all programs using
that Python installation.
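In other words, from a project root, a sketch (--user keeps it out of the system site-packages):
python setup.py develop --user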
It is important to note that pip uninstall cannot uninstall a module that has been installed with pip install -e. So if you go down this route, be prepared for things to get very messy if you ever need to uninstall. A partial solution is to (1) reinstall, keeping a record of the files created, as in sudo python3 setup.py install --record installed_files.txt, and then (2) manually delete all the files listed, as in e.g. sudo rm -r /usr/local/lib/python3.7/dist-packages/tdc7201-0.1a2-py3.7.egg/ (for release 0.1a2 of module tdc7201). Even this does not clean everything up 100%: after you've done it, importing the (removed!) local library may still succeed, and attempting to install the same version from a remote server may fail to do anything (because pip thinks your (deleted!) local version is already up to date).
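A sketch of that record-then-delete cleanup (filenames as in the example above):
sudo python3 setup.py install --record installed_files.txt
# delete every file the install recorded, one path per line
xargs -a installed_files.txt -d '\n' sudo rm -rf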
As hinted in previous answers, no actual symlinks are created.
How does the -e option work? It simply adds the project path given to pip install -e to the file PYTHONDIR/site-packages/easy-install.pth.
So each time Python searches for a package it will check this directory as well, meaning any changes to the files in your project directory are reflected instantly.
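You can see the mechanism for yourself (the exact path depends on your Python version and on whether you used --user):
cat ~/.local/lib/python3.8/site-packages/easy-install.pth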

Python pip: install in specific non-user directory and ignore system wide packages

I want to set up a Python environment for a whole team and I don't have root access to the server.
I have done a similar thing with Perl and expected to be able to do the same for Python, but I keep running into a problem.
Basically, I want to be able to install a package into /SOME/DIR on the system while ignoring any system wide versions of that package.
However, when I run
pip install --install-option="--prefix=/SOME/DIR/" --upgrade --ignore-installed SOME-MODULE
I keep getting a "permission denied" error because pip keeps trying to remove system-wide packages when upgrading.
What does work is this
pip install --user --upgrade --ignore-installed SOME-MODULE
Which does not try to touch the system-wide packages, but installs the module into a directory under $HOME/.local, which is not what I need.
It seems impossible to combine --user with a --prefix option, so it seems I can either install into an arbitrary path (but then get conflicts with already-installed system-wide packages) or install into my home directory. Neither of these is what I need.
For now I have been using the --user option and then moving the installed files across to /SOME/DIR, which works but seems odd.
Am I missing something? I have read up on virtualenv but this also doesn't quite sound like what I need. Thanks for your help!
Note that --install-option is passed directly to the package's setup.py install command; this requires the installation directory to be on your Python path.
Add it to your PYTHONPATH, e.g.
set -gx PYTHONPATH $PYTHONPATH '/home/user/temp/lib/python3.4/site-packages'
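That is fish-shell syntax; in bash the equivalent would be:
export PYTHONPATH=$PYTHONPATH:/home/user/temp/lib/python3.4/site-packages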
and run pip
pip install django==1.6 --ignore-installed --install-option="--prefix=/home/user/temp"
Mostly this is a pain in the ass if you have to do it for each library. (Note that you will still have potential import conflicts if you want to use certain standard libraries from the default site-packages dir and others from your custom dir.) The best choice is probably, as the comment says, to install virtualenv and virtualenvwrapper.
I had a similar problem (permission denied + no root access); the --build option made it work: pip install --install-option="--prefix=/path/to/local/lib" --build=/tmp wget

python pip specify a library directory and an include directory

I am using pip and trying to install a Python module called pyodbc which has some dependencies on non-Python libraries like unixodbc-dev, unixodbc-bin and unixodbc. I cannot install these dependencies system-wide at the moment, as I am only playing, so I have installed them in a non-standard location. How do I tell pip where to look for these dependencies? More exactly, how do I pass information through pip about include dirs (gcc -I) and library dirs (gcc -L -l) to be used when building the pyodbc extension?
pip has a --global-option flag
You can use it to pass additional flags to build_ext.
For instance, to add a --library-dirs (-L) flag:
pip install --global-option=build_ext --global-option="-L/path/to/local" pyodbc
gcc also supports environment variables:
http://gcc.gnu.org/onlinedocs/gcc/Environment-Variables.html
I couldn't find any build_ext documentation, so here is the command-line help:
Options for 'build_ext' command:
--build-lib (-b) directory for compiled extension modules
--build-temp (-t) directory for temporary files (build by-products)
--plat-name (-p) platform name to cross-compile for, if supported
(default: linux-x86_64)
--inplace (-i) ignore build-lib and put compiled extensions into the
source directory alongside your pure Python modules
--include-dirs (-I) list of directories to search for header files
(separated by ':')
--define (-D) C preprocessor macros to define
--undef (-U) C preprocessor macros to undefine
--libraries (-l) external C libraries to link with
--library-dirs (-L) directories to search for external C libraries
(separated by ':')
--rpath (-R) directories to search for shared C libraries at runtime
--link-objects (-O) extra explicit link objects to include in the link
--debug (-g) compile/link with debugging information
--force (-f) forcibly build everything (ignore file timestamps)
--compiler (-c) specify the compiler type
--swig-cpp make SWIG create C++ files (default is C)
--swig-opts list of SWIG command line options
--swig path to the SWIG executable
--user add user include, library and rpath
--help-compiler list available compilers
Building on Thorfin's answer and assuming that your desired include and library locations are in /usr/local, you can pass both in like so:
sudo pip install --global-option=build_ext --global-option="-I/usr/local/include/" --global-option="-L/usr/local/lib" <your package name>
Another way to indicate the location of include files and libraries is to set the relevant environment variables before running pip, e.g.
export LDFLAGS=-L/usr/local/opt/openssl/lib
export CPPFLAGS=-I/usr/local/opt/openssl/include
pip install cryptography
Just FYI... if you are having trouble installing a package with pip, you can use the --no-clean option to see what exactly is going on (that is, why the build did not work). For instance, if numpy is not installing properly, you could try
pip install --no-clean numpy
then look at the Temporary folder to see how far the build got. On a Windows machine, this should be located at something like:
C:\Users\Bob\AppData\Local\Temp\pip_build_Bob\numpy
Just to be clear, the --no-clean option tries to install the package, but does not clean up after itself, letting you see what pip was trying to do.
Otherwise, if you just want to download the source code, then I would use the -d flag. For instance, to download the Numpy source code .tar file to the current directory, use:
pip install -d %cd% numpy
I was also helped by Thorfin's answer; I was building GTK3+ on Windows and installing pygobject, and was having difficulty working out how to include multiple folders with pip install.
I tried creating a pip config file as per the pip documentation, but failed. What worked was the command line:
pip install --global-option=build_ext --global-option="-IlistOfDirectories"
# and/or with: --global-option="-LlistofDirectories"
The separator that works with multiple folders on Windows is the semicolon ';', NOT the colon ':'; it might be different on other OSes.
sample working command line:
pip install --global-option=build_ext --global-option="-Ic:/gtk-build/gtk/x64/release/include;d:/gtk-build/gtk/x64/release/include/gobject-introspection-1.0" --global-option="-Lc:\gtk-build\gtk\x64\release\lib" pygobject==3.27.1
You can use '\' or '/' in a path, but make sure you do not type a backslash right before a double quote. The line below will fail because there is a backslash next to the closing double quote:
pip install --global-option=build_ext --global-option="-Ic:\willFail\" --global-option="-Lc:\willFail\" pygobject==3.27.1
Have you ever used virtualenv? It's a Python package that lets you create and maintain multiple isolated environments on one machine. Each can use different modules independent of one another without screwing up dependencies in your system library or in another virtual environment.
If you don't have root privileges, you can download and use the virtualenv package from source:
$ curl -O https://pypi.python.org/packages/source/v/virtualenv/virtualenv-X.X.tar.gz
$ tar xvfz virtualenv-X.X.tar.gz
$ cd virtualenv-X.X
$ python virtualenv.py myVE
I followed the above steps this weekend on Ubuntu Server 12.04 and it worked perfectly. Each new virtual environment you create comes with pip by default, so installing packages into your new environment is easy.
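Once created, activate the environment and subsequent pip installs land inside it, e.g.:
$ source myVE/bin/activate
$ pip install matplotlib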
Just in case it's of help to somebody: I still could not find a way to do it through pip, so I ended up simply downloading the package and installing it through its setup.py. I also switched to what seems to be an easier-to-install API called pymssql.

Pythonpath is still ignored and unable to install locally with pip

I'm finding that my PYTHONPATH environment variable is ignored. I'm using Python 2.6 on Ubuntu. I have the following in my .bashrc:
export PYTHONPATH=/my/home/mylibs/lib/python2.6/site-packages/:$PYTHONPATH
Then I install a new version of numpy using:
python setup.py install --prefix=/my/home/mylibs/
and it gets correctly installed locally. However, when I try to install other packages (also using setup.py) that depend on the new version of numpy, they cannot find it, because by default the loaded numpy is the one in /usr/lib, not the one specified in my PYTHONPATH. My PYTHONPATH is set correctly, but the system-wide directory still overrules it.
How can this be fixed? I just want my local version of numpy to be used when I do import numpy. I saw other posts related to this with Python 2.4, but as far as I can tell it never got resolved. Also, I'd like to do this without installing pip or virtualenv for now. It seems like it should be possible using the --prefix or --home options passed to setup.py and then altering PYTHONPATH, but this does not work for me; the system-wide lib dirs are read first.
Edit: I tried to follow the suggestions and use pip. I have a system-wide install of an old pip that does not recognize --user (ver 0.3). I tried to upgrade pip with pip itself, but of course that failed because I cannot install it locally, so pip install pip --upgrade --user is not an option. I downloaded a new version of pip and installed it locally in my home directory, but the system-wide old one is still used when I type pip at the prompt. I looked into the pip package, found runner.py, and tried to use it to install packages with:
runner.py install --user numpy --upgrade
That still fails with permission denied:
OSError: [Errno 13] Permission denied: '/usr/bin/f2py2.6'
It looks like --user is broken. I'm also not sure how this would solve the fact that the system-wide Python uses the system-wide packages in /usr/lib... is there a solution to this? It seems like it's virtually impossible to install local packages in Python nowadays.
Ok, Python will use the first package it finds. The PYTHONPATH gets appended to sys.path, after the system one. So it will normally find the system one first. But the "official" per-user packages directory seems to be placed before that. So create your personal site-packages directory:
mkdir -p $HOME/.local/lib64/python2.7/site-packages
mkdir $HOME/bin
(You may have to change "lib64" to "lib32" or just "lib")
This directory gets placed before the system one on my system. But you should verify it by printing out sys.path.
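For example:
python -c 'import sys; print("\n".join(sys.path))'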
Then install your packages there. However, the --user option in the latest pip version should already place them there.
As a last resort you can manipulate sys.path: insert your directory into sys.path before the system site-packages, then import numpy.
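A minimal sketch of that last resort, using the path from the question:
python -c 'import sys; sys.path.insert(0, "/my/home/mylibs/lib/python2.6/site-packages"); import numpy; print(numpy.__file__)'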
You are getting the permission errors from the scripts installation, which is trying to put scripts in the system location. You can pass an additional option to install scripts into your $HOME/bin directory.
Install like this:
pip install --user --install-option="--install-scripts=$HOME/bin" numpy

How can I get FEniCS working in Ubuntu 12.04 with EPD python?

The FEniCS that comes in the Ubuntu 12.04 repository does not work with Enthought EPD unless I do some crazy stuff with PYTHONPATH, which can often result in EPD using Ubuntu repository Python modules rather than EPD modules.
The alternative, then, is to compile and install all of the FEniCS modules manually. This is screwy because FEniCS needs sudo to install into the normal EPD directory, /usr/local/EPD. If you use sudo, the PATH environment variable is not sourced from ~/.bashrc, so it thinks it's working with the native Python, not EPD. I tried using the -i option of sudo, and that did some screwy things also.
I managed to solve my own problem. There were a bunch of issues with the technique I am about to describe; they are detailed here and here. For reasons that I don't understand, reinstalling Ubuntu fixed the problems described in the links, but that's beyond the scope of what I'm trying to cover here. Suffice it to say that it's good to install Ubuntu with / and /home as separate partitions, because it makes a complete reinstall very easy.
Procedure for Installing FEniCS for use with EPD
Download all of the packages here. Create the directory ~/.local/src/fenics and save them there. Run tar -xvf on all the files in that directory. An easy way to do this is with the command for i in *.tar.gz; do tar -xvf $i; done.
First install the Python modules FFC, FIAT, Instant, Viper and UFL by going into each of their directories and running python setup.py install --user (a consolidated sketch follows this procedure). The --user flag causes them to be installed under ~/.local/lib..., which will be added to your sys.path in Python. You can read more about the --user flag here.
Then navigate to the directories for dolfin and ufc, and in each of them run the following commands: cmake -DCMAKE_INSTALL_PREFIX=~/.local ., make, make install.
Lastly, add source /home/chad/.local/share/dolfin/dolfin.conf to ~/.bashrc using gedit or emacs if you want to use a powerful text editor.
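Putting the per-module setup.py step together in one go, a sketch (the directory names are assumptions; adjust them to whatever the tarballs unpacked to):
cd ~/.local/src/fenics
for d in ffc-* fiat-* instant-* viper-* ufl-*; do
    (cd "$d" && python setup.py install --user)
done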
EDIT
You must also install ScientificPython using python setup.py install --user, and this is relatively painless.
EDIT
This should get you up and running for the demos in ~/.local/share/dolfin/demo/pde/poisson/python. I hope this helps someone.
