python pip: specify a library directory and an include directory

I am using pip and trying to install a Python module called pyodbc, which has some dependencies on non-Python libraries like unixodbc-dev, unixodbc-bin, and unixodbc. I cannot install these dependencies system-wide at the moment, as I am only playing around, so I have installed them in a non-standard location. How do I tell pip where to look for these dependencies? More exactly, how do I pass information through pip about include dirs (gcc -I) and library dirs (gcc -L -l) to be used when building the pyodbc extension?

pip has a --global-option flag
You can use it to pass additional flags to build_ext.
For instance, to add a --library-dirs (-L) flag:
pip install --global-option=build_ext --global-option="-L/path/to/local" pyodbc
gcc also supports environment variables:
http://gcc.gnu.org/onlinedocs/gcc/Environment-Variables.html
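For example, gcc reads extra search paths from the CPATH (treated like -I) and LIBRARY_PATH (used when linking) environment variables; the paths below are only placeholders for wherever you installed the headers and libraries:
export CPATH=/path/to/local/include      # extra header search path, as if passed with -I
export LIBRARY_PATH=/path/to/local/lib   # extra directory gcc searches for libraries when linking
pip install pyodbc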
I couldn't find any build_ext documentation, so here is the command line help
Options for 'build_ext' command:
--build-lib (-b) directory for compiled extension modules
--build-temp (-t) directory for temporary files (build by-products)
--plat-name (-p) platform name to cross-compile for, if supported
(default: linux-x86_64)
--inplace (-i) ignore build-lib and put compiled extensions into the
source directory alongside your pure Python modules
--include-dirs (-I) list of directories to search for header files
(separated by ':')
--define (-D) C preprocessor macros to define
--undef (-U) C preprocessor macros to undefine
--libraries (-l) external C libraries to link with
--library-dirs (-L) directories to search for external C libraries
(separated by ':')
--rpath (-R) directories to search for shared C libraries at runtime
--link-objects (-O) extra explicit link objects to include in the link
--debug (-g) compile/link with debugging information
--force (-f) forcibly build everything (ignore file timestamps)
--compiler (-c) specify the compiler type
--swig-cpp make SWIG create C++ files (default is C)
--swig-opts list of SWIG command line options
--swig path to the SWIG executable
--user add user include, library and rpath
--help-compiler list available compilers

Building on Thorfin's answer and assuming that your desired include and library locations are in /usr/local, you can pass both in like so:
sudo pip install --global-option=build_ext --global-option="-I/usr/local/include/" --global-option="-L/usr/local/lib" <your package name>

Another way to indicate the location of include files and libraries is to set the relevant environment variables before running pip, e.g.
export LDFLAGS=-L/usr/local/opt/openssl/lib
export CPPFLAGS=-I/usr/local/opt/openssl/include
pip install cryptography
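The same approach should work for the pyodbc case from the original question; here the prefix $HOME/unixodbc is just an assumed location for the locally installed unixODBC:
export CPPFLAGS=-I$HOME/unixodbc/include   # where the unixODBC headers were installed (assumed)
export LDFLAGS=-L$HOME/unixodbc/lib        # where the unixODBC libraries were installed (assumed)
pip install pyodbc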

Just FYI... If you are having trouble installing a package with pip, then you can use the
--no-clean option to see exactly what is going on (that is, why the build did not work). For instance, if numpy is not installing properly, you could try
pip install --no-clean numpy
then look at the Temporary folder to see how far the build got. On a Windows machine, this should be located at something like:
C:\Users\Bob\AppData\Local\Temp\pip_build_Bob\numpy
Just to be clear, the --no-clean option tries to install the package, but does not clean up after itself, letting you see what pip was trying to do.
Otherwise, if you just want to download the source code, then I would use the -d flag. For instance, to download the Numpy source code .tar file to the current directory, use:
pip install -d %cd% numpy

Thorfin's answer also helped me; I was building GTK3+ on Windows and installing pygobject, and I was having trouble figuring out how to include multiple folders with pip install.
I tried creating a pip config file as per the pip documentation, but that failed.
What worked was the command line:
pip install --global-option=build_ext --global-option="-IlistOfDirectories"
# and/or with: --global-option="-LlistofDirectories"
The separator that works with multiple folders on Windows is the semicolon ';', NOT the colon ':'; it might be different on other operating systems.
A sample working command line:
pip install --global-option=build_ext --global-option="-Ic:/gtk-build/gtk/x64/release/include;d:/gtk-build/gtk/x64/release/include/gobject-introspection-1.0" --global-option="-Lc:\gtk-build\gtk\x64\release\lib" pygobject==3.27.1
You can use '\' or '/' in paths, but make sure you do not type a backslash right next to a double quote.
The example below will fail because there is a backslash next to the closing double quote:
pip install --global-option=build_ext --global-option="-Ic:\willFail\" --global-option="-Lc:\willFail\" pygobject==3.27.1
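For comparison, on Linux the same idea should work with ':' as the separator (as shown in the build_ext help above); the GTK paths here are made up purely for illustration:
pip install --global-option=build_ext --global-option="-I/opt/gtk/include:/opt/gtk/include/gobject-introspection-1.0" --global-option="-L/opt/gtk/lib" pygobject==3.27.1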

Have you ever used virtualenv? It's a Python package that lets you create and maintain multiple isolated environments on one machine. Each can use different modules, independent of one another, without breaking dependencies in your system library or in another virtual environment.
If you don't have root privileges, you can download and use the virtualenv package from source:
$ curl -O https://pypi.python.org/packages/source/v/virtualenv/virtualenv-X.X.tar.gz
$ tar xvfz virtualenv-X.X.tar.gz
$ cd virtualenv-X.X
$ python virtualenv.py myVE
I followed the above steps this weekend on Ubuntu Server 12.04 and it worked perfectly. Each new virtual environment you create comes with pip by default, so installing packages into your new environment is easy.
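For example, once the environment above exists, activating it and installing a package into it looks roughly like this (pyodbc is just the package from the original question):
source myVE/bin/activate   # switch this shell to the environment's python and pip
pip install pyodbc         # installs into myVE only, not system-wide
deactivate                 # leave the environment when done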

Just in case it's of help to somebody: I still could not find a way to do it through pip, so I ended up simply downloading the package and installing it through its 'setup.py'. I also switched to what seems to be an easier-to-install API called 'pymssql'.

Related

How do I install a package from GitHub gist?

I don't understand this...
I want to install this https://gist.github.com/sixtenbe/1178136.
It is a peak detection script for python.
Everywhere I look I am told to use pip with the .git extension.
All I see is how to download the .zip, but from there I am lost.
How can I install this?
Thanks.
You can get the individual files in the Gist (or download the Gist as a ZIP and extract it) and put them in your source code folder.
Then you will be able to import them as modules in your own scripts:
import analytic_wfm as AW
AW.ACV_A6( ... )
import peakdetect as PK
PK.peakdetect_parabola( ... )
Let's give it another look.
By "installing a package" we might mean that the package should be available via import.
For that the package directory should reside either in the current directory or in one of the other directories in the import search path.
One such directory is the "user-specific site-packages directory, USER_SITE":
python -c "import site; print(site.getusersitepackages())"
Git URL
First we might need a Git URL. Going to https://gist.github.com/sixtenbe/1178136 we can click on the Embed pop-up and switch it to Clone via HTTPS:
in order to obtain the Git URL: https://gist.github.com/1178136.git.
git and bash
Having the Git URL and the Unix shell (bash) we can install the package manually into the USER_SITE.
Let's go into the USER_SITE first:
cd $(python -c "import site; print(site.getusersitepackages())")
pwd
Now that we are in the USER_SITE, let's download the Gist:
git clone https://gist.github.com/1178136.git analytic_wfm
Finally, let's verify that the package is now available:
cd && python -c "import analytic_wfm.analytic_wfm; print(analytic_wfm.analytic_wfm.__all__)"
If numpy is installed, it prints
['ACV_A1', 'ACV_A2', 'ACV_A3', 'ACV_A4', 'ACV_A5', 'ACV_A6', 'ACV_A7', 'ACV_A8']
pip
Let's try to install a Gist package with pip.
For pip install we should prefix the Git URL with git+:
pip install --user git+https://gist.github.com/1178136.git
This gives us the error:
ERROR: git+https://gist.github.com/1178136.git does not appear to be a
Python project: neither 'setup.py' nor 'pyproject.toml' found.
Looks like the package we've picked is missing the necessary pip configuration!
Let's try another one:
pip install --user git+https://gist.github.com/bf91613a021a536c7ce16cdba9168604.git
It installs without a problem:
Successfully built llog
Installing collected packages: llog
Successfully installed llog-1.0
That is mainly because it has a setup.py.
Note also that Gist does not support subfolders, and pip seems to depend on them when handling the packages argument, but the code in setup.py can work around this by creating the package subfolder on the fly and copying the Python files there!
Hence, if you want to pip install that Gist, https://gist.github.com/sixtenbe/1178136, along with the rest of its requirements.txt dependencies, you can fork it and add a setup.py to that effect, as sketched below.
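A minimal setup.py for such a fork might look roughly like this; the distribution name, version and the numpy dependency are assumptions based on the Gist's files, and listing the modules via py_modules sidesteps the missing-subfolder issue mentioned above:
from setuptools import setup

setup(
    name="analytic_wfm",                        # hypothetical distribution name
    version="0.1",                              # hypothetical version
    py_modules=["analytic_wfm", "peakdetect"],  # the Gist's top-level .py files
    install_requires=["numpy"],                 # assumed from the Gist's requirements.txt
)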
pypi
Given that analytic-wfm can also be found on the Python Package Index, https://pypi.org/project/analytic-wfm/, you can install it with
pip install analytic-wfm

When would the -e, --editable option be useful with pip install?

When would the -e, or --editable option be useful with pip install?
For some projects, the last line in requirements.txt is '-e .'. What does it do exactly?
As the man page says:
-e, --editable <path/url>
Install a project in editable mode (i.e. setuptools "develop mode") from a local project path or a VCS url.
So you would use this when trying to install a package locally, most often when you are developing it on your system. It will just link the package to the original location, basically meaning that any changes to the original package are reflected directly in your environment.
An example run can be:
pip install -e .
or
pip install -e ~/ultimate-utils/ultimate-utils-proj-src/
Note that the second is the full path to the directory where the setup.py lives.
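As for the requirements.txt case mentioned in the question, a file along the lines of the following (the pinned packages are purely illustrative) installs the listed dependencies plus the project in the current directory in editable mode:
numpy>=1.20    # ordinary dependencies (illustrative)
requests>=2.0
-e .           # the project itself, installed in editable mode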
Concrete example of using --editable in development
If you play with this test package as in:
cd ~
git clone https://github.com/cirosantilli/vcdvcd
cd vcdvcd
git checkout 5dd4205c37ed0244ecaf443d8106fadb2f9cfbb8
python -m pip install --editable . --user
it outputs:
Obtaining file:///home/ciro/bak/git/vcdvcd
Installing collected packages: vcdvcd
Attempting uninstall: vcdvcd
Found existing installation: vcdvcd 1.0.6
Can't uninstall 'vcdvcd'. No files were found to uninstall.
Running setup.py develop for vcdvcd
Successfully installed vcdvcd-1.0.6
The Can't uninstall 'vcdvcd' message is normal: it tried to uninstall any existing vcdvcd and then replace it with the "symlink-like mechanism" that is produced in the following steps, but it failed because there was no previous installation.
Then it generates a file:
~/.local/lib/python3.8/site-packages/vcdvcd.egg-link
which contains:
/home/ciro/vcdvcd
.
and acts as a "symlink" to the Python interpreter.
So now, if I make any changes to the git source code under /home/ciro/vcdvcd, it reflects automatically on importers who can from any directory do:
python -c 'import vcdvcd'
Note however that, with my pip version at least, binary files installed with --editable, such as the vcdcat script provided by that package via scripts= in setup.py, do not get symlinked, just copied to:
~/.local/bin/vcdcat
just like for regular installs, and therefore updates to the git repository won't directly affect them.
By comparison, a regular non --editable install from the git source:
python -m pip uninstall vcdvcd
python -m pip install --user .
produces a copy of the installed files under:
~/.local/lib/python3.8/site-packages/vcdvcd
Uninstall of an editable package as done above requires a new enough pip as mentioned at: How to uninstall editable packages with pip (installed with -e)
Tested in Python 3.8, pip 20.0.2, Ubuntu 20.04.
Recommendation: develop directly in-tree whenever possible
The editable setup is useful when you are testing your patch to a package through another project.
If however you can fully test your change in-tree, just do that instead of generating an editable install which is more complex.
E.g., the vcdvcd package above is set up in a way that you can just cd into the source and do ./vcdcat without pip installing the package itself (in general, you might need to install dependencies from requirements.txt though), and the import vcdvcd that that executable does (or possibly your own custom test) just finds the package correctly in the same directory it lives in.
From Working in "development" mode:
Although not required, it’s common to locally install your project in
“editable” or “develop” mode while you’re working on it. This allows
your project to be both installed and editable in project form.
Assuming you’re in the root of your project directory, then run:
pip install -e .
Although somewhat cryptic, -e is short for
--editable, and . refers to the current working directory, so together, it means to install the current directory (i.e. your
project) in editable mode.
Some additional insights into the internals of setuptools and distutils from “Development Mode”:
Under normal circumstances, the distutils assume that you are going to
build a distribution of your project, not use it in its “raw” or
“unbuilt” form. If you were to use the distutils that way, you would
have to rebuild and reinstall your project every time you made a
change to it during development.
Another problem that sometimes comes up with the distutils is that you
may need to do development on two related projects at the same time.
You may need to put both projects’ packages in the same directory to
run them, but need to keep them separate for revision control
purposes. How can you do this?
Setuptools allows you to deploy your projects for use in a common
directory or staging area, but without copying any files. Thus, you
can edit each project’s code in its checkout directory, and only need
to run build commands when you change a project’s C extensions or
similarly compiled files. You can even deploy a project into another
project’s checkout directory, if that’s your preferred way of working
(as opposed to using a common independent staging area or the
site-packages directory).
To do this, use the setup.py develop command. It works very similarly
to setup.py install, except that it doesn’t actually install anything.
Instead, it creates a special .egg-link file in the deployment
directory, that links to your project’s source code. And, if your
deployment directory is Python’s site-packages directory, it will also
update the easy-install.pth file to include your project’s source
code, thereby making it available on sys.path for all programs using
that Python installation.
It is important to note that pip uninstall cannot uninstall a module that has been installed with pip install -e. So if you go down this route, be prepared for things to get very messy if you ever need to uninstall. A partial solution is to (1) reinstall, keeping a record of files created, as in sudo python3 setup.py install --record installed_files.txt, and then (2) manually delete all the files listed, as in e.g. sudo rm -r /usr/local/lib/python3.7/dist-packages/tdc7201-0.1a2-py3.7.egg/ (for release 0.1a2 of module tdc7201). This does not clean everything up 100%, however; even after you've done it, importing the (removed!) local library may succeed, and attempting to install the same version from a remote server may fail to do anything (because it thinks your (deleted!) local version is already up to date).
As suggested in the previous answers, no symlinks are created.
How does the '-e' option work? It just updates the file "PYTHONDIR/site-packages/easy-install.pth" with the project path specified in the pip install -e command.
So each time Python searches for a package, it will check this directory as well, and any changes to the files in that directory are reflected instantly.
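A quick way to check this yourself is the small Python snippet below, which prints any easy-install.pth it finds in the system and user site-packages directories; after a pip install -e, the project's source directory should show up in one of them (a sketch, assuming a reasonably recent Python 3):
import site, pathlib

# look for easy-install.pth in the system and user site-packages directories
for d in site.getsitepackages() + [site.getusersitepackages()]:
    pth = pathlib.Path(d) / "easy-install.pth"
    if pth.exists():
        print(pth, "->", pth.read_text().split())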

setuptools download cache path

Can I set a download cache path for setuptools? I can do that with pip by setting the environment variable PIP_DOWNLOAD_CACHE (How to cache downloaded PIP packages), but I don't know what to do with setuptools.
I want that every time I run:
python setup.py develop
setuptools caches the downloaded files, so I don't have to download them again.
You can use the options -a, -d and -f to create and use a cache directory. This is what the help text for them says:
--always-copy (-a) Copy all needed packages to install dir
--install-dir (-d) install package to DIR
--find-links (-f) additional URL(s) to search for packages
You first use python setup.py develop -a -d CACHEDIR to cache the needed packages in the directory of your choice, then use that directory as a source in the future with python setup.py develop -f CACHEDIR.
You may need to add any flags you normally use to one, the other, or both invocations, depending on the flag in question.
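A concrete sketch of those two steps, using an arbitrary ~/pkg-cache directory and the long option names from the help text above:
# first run: copy every needed package into the cache directory
python setup.py develop --always-copy --install-dir ~/pkg-cache
# later runs: resolve packages from the cache instead of downloading them again
python setup.py develop --find-links ~/pkg-cache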

pip install custom include path

On an Ubuntu system on which I don't have sudo privileges, I wish to install a package via pip (matplotlib, to be precise), but some source packages (development headers) are not installed on the system (the binaries, however, are installed).
I have created a virtual environment in which to install, and have downloaded the required source code, but I can't place them in the default /usr/include/ etc. When pip runs matplotlib's setup.py script, the source files are reported as missing.
Is there a way to instruct pip or setup.py where to look for the source?
P.S.: setting CFLAGS or CPPFLAGS adds the locations of the downloaded source to the compile instructions, but setup.py didn't find the source, so it didn't attempt to compile some components (graphic backends).
P.P.S.: this is similar to, but more specific than, this question
I would suggest doing:
Rebuild whatever binaries you need in your own home directory (this also avoids an issue if the apps get upgraded on the system or are otherwise different versions from your source). Assuming the programs use the standard configure scripts, you can do
mkdir ~/dev
cd app_src
./configure --prefix=~/dev
make; make install
Then when you want to do your pip install, do
export PATH=~/dev/bin:$PATH
export LD_LIBRARY_PATH=~/dev/lib
(Note, what I should be suggesting is pointing it to your virtualenv but I haven't had the issue you're having)
Do the pip install; if memory serves, pkg-config should pick up the info you want (this assumes matplotlib uses pkg-config to figure out where packages are stored)
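If matplotlib's setup script does go through pkg-config, you may also need to point pkg-config at the .pc files installed under that prefix; a sketch, assuming the ~/dev prefix used above:
# let pkg-config find the locally built libraries' metadata
export PKG_CONFIG_PATH=~/dev/lib/pkgconfig:$PKG_CONFIG_PATH
pip install matplotlib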

Easy_install's --prefix option doesn't change where it tries to install my package

I want to install Sphinx 1.1.3 for python 2.6. However, I don't have sudo rights. So instead of installing it in the default place, I want to set a different location, using --prefix. Doing the following:
-bash-3.2$ easy_install Sphinx-1.1.3-py2.6.egg --prefix=/homes/ndeklein/python2.6/site-packages/
gives me:
error: can't create or remove files in install directory
The following error occurred while trying to add or remove files in the
installation directory:
[Errno 13] Permission denied: '/usr/lib/python2.4/site-packages/test-easy-install-18534.write-test'
The installation directory you specified (via --install-dir, --prefix, or
the distutils default setting) was:
/usr/lib/python2.4/site-packages/
Am I typing something wrong with the prefix? Also, something I could use instead (which I've used with other packages) is:
python setup.py install --home=/homes/ndeklein/python2.6/site-packages/
but I can't find the setup.py script. I'm guessing that EGGs don't have a setup.py script, is that true?
You need to specify options before the package, so the command should be:
easy_install --prefix=/homes/ndeklein/python2.6/site-packages/ Sphinx-1.1.3-py2.6.egg
This website discusses non-root python installs. It might be useful to you...
http://www.astropython.org/tutorials/user-rootsudo-free-installation-of-python-modules7/
To quote a little bit of it:
A user configuration file, ~/.pydistutils.cfg, will override the internal system path for python package installations, redirecting the built libraries (lib), scripts (bin) and data (share) into user-owned and specified directories. You must simply tell the python installer where these directories are located.
The user file, ~/.pydistutils.cfg, has the following lines, using a pretty obvious syntax:
[install]
install_scripts = ~/usr/bin
install_data = ~/usr/share
install_lib = ~/usr/lib/python2.4/site-packages
Of course, whatever directories you specify there should probably exist and you should put them at the front of your PYTHONPATH:
export PYTHONPATH=~/usr/lib/python2.4/site-packages:${PYTHONPATH}
It also looks like more modern python installations (compared to the things in the link) should be able to use the ~/.local directory:
easy_install --prefix=~/.local ...
There is also:
easy_install --user ...
which will install to a user-specific site directory.
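You can see where that user-specific site directory is on your machine with:
python -m site --user-site   # prints the per-user site-packages directory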
You could try using pip instead of easy_install (pip is recommended over easy_install these days).
Then you can just use
pip install --user Sphinx
see http://www.pip-installer.org/en/latest/installing.html on how to install pip if needed
You may also want to pip install virtualenv and work inside a virtualenv (where pip will install all packages into a local site-packages folder). See http://pypi.python.org/pypi/virtualenv for more info.
