Why are pip packages not installed to $PATH by default? - python

Typically, when using RubyGems in the Ruby ecosystem or npm/yarn in the JavaScript ecosystem, packages are installed somewhere on the $PATH, or at least you are instructed to add the package install location to your $PATH.
With pip in the Python ecosystem there never seems to be an emphasis on this. Instead, you are encouraged to run modules via python -m <name>, etc.
This seems a bit odd to me; was it just a design decision? Is it good practice to put site-packages (or whatever location pip is using) on $PATH? I see that sometimes a binary is added to /usr/local/bin, and sometimes under a different name than the package itself (e.g. django-admin rather than django with pip install Django, whereas in the Ruby/JavaScript ecosystems the binary usually matches the package name), but is that the case all of the time?

You're confusing $PATH and $PYTHONPATH. site-packages is for libraries; $PATH is for programs (binaries and scripts). pip installs libraries into its site-packages and scripts into the corresponding bin/ directory; e.g. if site-packages is /usr/local/lib/pythonX.Y/site-packages, scripts are installed into /usr/local/bin/.
pip doesn't check whether /usr/local/bin/ is in $PATH. I agree with you there: it should check and remind the user to add the bin/ directory to $PATH if it's not already there.
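If you want to see where those two directories are for a given interpreter, you can ask Python itself. Here's a minimal sketch using the standard sysconfig module (the paths in the comments are just illustrative):
import sysconfig
print(sysconfig.get_path("purelib"))   # site-packages, e.g. /usr/local/lib/pythonX.Y/site-packages
print(sysconfig.get_path("scripts"))   # bin/ directory pip uses for scripts, e.g. /usr/local/bin
If the second directory isn't already on $PATH, that's the one to add.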

The site-packages directory for a particular Python installation is automatically added to sys.path when you run that installation's binary. When that binary executes import modname, it looks in the directories on sys.path. So when you run pip install with a particular Python binary, pip by default puts the package in the site-packages for that binary, so that binary can import the package. Advanced users can do more complicated things.
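You can see this from any interpreter; a quick check (output varies by installation):
import sys
# the interpreter's site-packages directory is appended to sys.path at startup,
# so `import modname` finds whatever pip installed there for this interpreter
print(sys.path)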

Related

How to prevent pip install from hardcoding the python.exe location into the exe files?

I'm working on a flexible solution on top of Python virtual environments (venv). Ideally, I would like my environments to be accessible and executable in a location-agnostic fashion and via environment variables. Instead, every time I do pip install, the resulting executables hardcode the absolute path to the environment from where the installation was triggered. As a result, changing the root location of the environment will break all the executables.
I'd like my virtual environment, which is located under %MYVENV_HOME% (with its actual absolute location differing between machines: CI, PROD, UAT, DEV, etc.), to be agnostic to the absolute path from which the pip install was triggered.
I was super happy to find a possible solution using the following approach:
python -m pip install --install-option="--prefix=%MYVENV_HOME%" my_package
But they have disabled it, so no flexibility there; see Using --prefix error with --install option.
Is there any hack or similar that would allow flexibility and base location agnostic installations with pip?
I have also tried the following, but it doesn't work; the actual absolute location is resolved during pip install, fixed, and again hardcoded into the resulting executables:
python -m pip install --prefix %MYVENV_HOME% my_package
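For reference, the hardcoding happens because pip writes the absolute interpreter path into every console script it generates. On a POSIX system a generated script looks roughly like this (the path and the entry point below are hypothetical):
#!/opt/envs/myvenv/bin/python
# console script generated by pip; the shebang pins the environment's absolute location
import sys
from my_package.cli import main   # hypothetical entry point declared by the package
if __name__ == "__main__":
    sys.exit(main())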

Can virtualenv include necessary project packages from site-packages

Running command line:
virtualenv --system-site-packages venv
I'm expecting the venv folder venv\Lib\site-packages to contain all the necessary libraries from the projects, which are located in:
C:\Users\XXX\AppData\Local\Programs\Python\Python36\Lib\site-packages\
But that's not the case; only a few are installed.
For example, my program currently uses pdfminer, which is in
C:\Users\XXXX\AppData\Local\Programs\Python\Python36\Lib\site-packages\
I want it to be included in venv\Lib\site-packages but it is not copied.
Any advice?
--system-site-packages doesn't copy packages; it just allows the Python from the virtualenv to access packages in C:\Users\XXX\AppData\Local\Programs\Python\Python36\Lib\site-packages\.
There is no way to copy packages because they could depend on their installation directory. If you want these packages in the virtualenv, don't use --system-site-packages and instead install all packages into the virtualenv.
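A quick way to confirm this behaviour is to ask the interpreter inside the environment where it imports the package from; a minimal sketch (paths are illustrative):
# run with venv\Scripts\python.exe (or venv/bin/python on POSIX)
import pdfminer
print(pdfminer.__file__)
# with --system-site-packages this prints a path under
# ...\Python36\Lib\site-packages\ - nothing was copied into venv\Lib\site-packages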
A virtualenv environment is the same as if you had just installed a fresh copy of Python. It has no packages other than the standard packages provided with Python. If you want other packages, you have to install them with pip or however you'd do it with the native Python version that you are using.
So in general, just do pip install <packagename>.
If you find yourself often wanting to create virtualenvs with a standard set of base packages, then put together a requirements.txt file listing all of the packages you want to install as a base, and do pip install -r requirements.txt inside a new virtualenv, right after you create it.
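For example, a base requirements.txt might look like this (the package names are just examples):
# requirements.txt - hypothetical base set
requests>=2.0
numpy
lxml
Then, inside the freshly created virtualenv, pip install -r requirements.txt pulls them all in at once.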
One nice thing about a virtualenv is that it's all yours. Your user owns 100% of it, unlike the base Python version that is owned by the system. To install new packages into the base Python version, you often have to have root access (sudo privileges). With virtualenvs, you don't need special permissions (in fact, you'll get all screwed up if you use sudo in a virtualenv) to install all the packages you want. Everything you do lives within your own home directory. Another neat thing is that when you are done with a virtualenv, you just throw away the root directory that contains it.
If it's not mandatory to use virtualenv, I would suggest going with Anaconda; that should pretty much address your concern.
Conda, as a package manager, helps you find and install packages. Quite a few packages are installed by default, to set you up quickly for your project. To see which packages are installed, type conda list in a terminal.
If you need a package that requires a different version of Python, you do not need to switch to a different environment manager, because conda is also an environment manager.
With just a few commands, you can set up a totally separate environment to run that different version of Python, while continuing to run your usual version of Python in your normal environment.
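As a rough sketch, creating and using such an environment looks like this (the environment name and version are just examples):
conda create -n py27 python=2.7
source activate py27      # `conda activate py27` on newer conda versions
conda install numpy
conda list                # show what is installed in the active environment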

Anaconda installation to home directory

I have set up an SSH connection to a remote server. I want to run some of my Python programs on it, so I am having to download all the modules I had been using.
I just downloaded Anaconda (I don't have root access, so I installed it in ~) and added ~/anaconda/bin to my PATH. However, when I try import numpy in Python, it says the module is not found. How do I fix this?
You might be running the wrong version of Python.
To check, use which -a python
james@bodacious:~$ which -a python
/usr/bin/python
james@bodacious:~$
In my case, I'm running the version from /usr/bin/python, and that's the only version found in my $PATH. You should see the version from ~/anaconda/bin in your list as well, and for it to be run when you type python it needs to be at the top.
If it's not, you can check your $PATH and, if necessary, add ~/anaconda/bin to the front of it.
james@bodacious:~$ echo $PATH
/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/Users/james/bin
james@bodacious:~$ PATH=~/anaconda/bin:$PATH
james@bodacious:~$ echo $PATH
/Users/james/anaconda/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/Users/james/bin
james@bodacious:~$
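Once ~/anaconda/bin is at the front of $PATH, you can confirm from inside Python that the right interpreter is being used (the outcomes in the comments are illustrative):
import sys
print(sys.executable)   # should point into ~/anaconda/bin
import numpy            # now resolved from Anaconda's site-packages
print(numpy.__file__)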
I haven't any Fedora/Redhat systems handy, but I believe you can:
yum install numpy
HTH
You've said that all you really want is to be able to use numpy - based on that, using anaconda is probably overkill.
It sounds as though what you're really asking is "Since I don't have root access and can't install system packages, how can I set up a python environment in my home dir that has what I need?"
This sounds like a job for... Super Grover! no wait, I meant virtualenv.
Hopefully your system will already have virtualenv installed for you. If it does, it's fairly simple for you to create your own environment with your own set of packages:
james@bodacious:~$ mkdir venv/
james@bodacious:~$ cd venv/
james@bodacious:venv$ virtualenv .
New python executable in ./bin/python
Installing Setuptools............done.
Installing Pip............done.
james@bodacious:venv$ source bin/activate
(venv)james@bodacious:venv$ pip install numpy
Downloading/unpacking numpy
Downloading numpy-1.7.1.zip (3.1MB): 3.1MB downloaded
Once that completes, you'll have your own copy of numpy, which you can access in this environment just by running cd venv; source bin/activate to set your $PATH and $PYTHONPATH to point at your custom install.
If you don't already have virtualenv installed, things get trickier....

Pip recognize linked site-packages in virtualenv

For a number of reasons, such as when a package takes a long time to compile (e.g. lxml), it seems to be recommended to symlink such packages from the system site-packages directory into a virtualenv.
Some example questions:
Use a single site package (as exception) for a virtualenv
How to install lxml into virtualenv from the local system?
But such packages are not recognized by pip, which will happily try to reinstall them. How do I deal with this?
Okay, it seems the trick is to also link the egg-info directory.
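For illustration, linking both the package and its metadata could look like this (all paths and the version number are hypothetical; adjust them to your system and virtualenv layout):
import os

system_sp = "/usr/lib/python2.7/dist-packages"          # where the system lxml lives
venv_sp = "/home/me/venv/lib/python2.7/site-packages"   # the virtualenv's site-packages

# link the package itself and its egg-info so pip sees it as already installed
for name in ("lxml", "lxml-2.3.2.egg-info"):
    os.symlink(os.path.join(system_sp, name), os.path.join(venv_sp, name))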

Best way to install python packages locally for development

Being new to the Python game, I seem to have missed out on some knowledge of how you can develop on a program while also keeping it in your live environment.
Programs like gpodder can be run directly from the source checkout, which is really handy; however, others want to be "installed" to run.
A lot of programs are distributed with a setup.py and instructions to run "python ./setup.py install" as root, which will put stuff somewhere in your file system. There are even install commands like "develop" which seem to hold the promise of what I want. So I tried:
export PYTHONPATH=/home/alex/python
python ./setup.py develop --install-dir=/home/alex/python
This downloaded a bunch of stuff locally and seems to magically ensure the application I'm hacking on is still being run out of the src tree. So I guess my roundabout question is: is this the correct way of developing Python code? How do things like easy_install and pip factor into this?
So I tried the following:
python /usr/share/pyshared/virtualenv.py /home/alex/src/goobook
cd /home/alex/src/goobook/googbook.git
/home/alex/src/goobook/bin/python ./setup.py develop
And finally linked the program in question to my ~/bin
cd /home/alex/src/goobook
linkbin.pl bin/goobook
However, invocation throws up a load of extra chatter, which seems to imply something is wrong:
17:17 alex@socrates/i686 [goobook] >goobook --help
/home/alex/bin/goobook:5: UserWarning: Module pkg_resources was already imported from /home/alex/src/goobook/lib/python2.5/site-packages/setuptools-0.6c8-py2.5.egg/pkg_resources.py, but /home/alex/src/goobook/lib/python2.5/site-packages/distribute-0.6.10-py2.5.egg is being added to sys.path
from pkg_resources import load_entry_point
/home/alex/bin/goobook:5: UserWarning: Module site was already imported from /home/alex/src/goobook/lib/python2.5/site.pyc, but /home/alex/src/goobook/lib/python2.5/site-packages/distribute-0.6.10-py2.5.egg is being added to sys.path
from pkg_resources import load_entry_point
Install:
http://pypi.python.org/pypi/virtualenv
to set up a localized virtual environment for your libraries, and:
http://pypi.python.org/pypi/setuptools
i.e. "easy_install" to install new things.
Virtualenv allows you to work in completely independent and isolated Python environments. It will let you easily create multiple environments which have different Python packages installed or different versions of a same package. Virtualenv also lets you easily switch between your different environments.
As of 2012, the de facto preferred tool for package management in Python is pip rather than setuptools. Pip is able to handle dependencies and to install/uninstall globally or inside a virtual environment. Pip even comes out-of-the-box with virtualenv.
Python 3
Also worth mentioning is the fact that virtual environments are becoming a part of Python itself in release 3.3, with the implementation of PEP 405.
The Python Packaging User Guide, which "aims to be the authoritative resource on how to package, publish and install Python distributions using current tools", recommends using pip to install in "development mode":
pip install -e <path>
Thus, in the root directory of your package, you can simply run
pip install -e .
See installing from a local source tree.
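For pip install -e . to work, the project root needs packaging metadata, typically a setup.py. A minimal sketch (the project name, version, and entry point are hypothetical):
from setuptools import setup, find_packages

setup(
    name="mypackage",        # hypothetical project name
    version="0.1",
    packages=find_packages(),
    entry_points={
        # optional: exposes a `mycmd` command wired to mypackage/cli.py:main
        "console_scripts": ["mycmd = mypackage.cli:main"],
    },
)
With the editable install, edits to the source tree are picked up immediately without reinstalling.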
The best way to develop Python apps with dependencies is to:
Download the desired version of the Python interpreter.
Install and use buildout (http://www.buildout.org/).
Buildout is something like Maven for Java (it will fetch all needed packages automatically).
This way your Python interpreter will not be polluted by third-party packages (this is important if you will be running the developed application on other machines). Additionally, you can integrate buildout with the virtualenv package (this allows you to create virtual Python interpreters for each project).
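As a rough sketch, a minimal buildout.cfg for a project might look like this (the egg name is hypothetical):
[buildout]
develop = .
parts = app

[app]
recipe = zc.recipe.egg
eggs = mypackage
Running bin/buildout then fetches the listed eggs and generates scripts under bin/ without touching the system interpreter.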
