Bypass confirmation prompt for pip uninstall - python

I'm trying to uninstall all django packages in my superuser environment to ensure that all my webapp dependencies are installed to my virtualenv.
sudo su
sudo pip freeze | grep -E '^django-' | xargs pip -q uninstall
But pip wants to confirm every package uninstall, and there doesn't seem to be a -y option for pip. Is there a better way to uninstall a batch of python modules? Is rm -rf .../site-packages/ a proper way to go? Is there an easy_install alternative?
Alternatively, would it be better to force pip to install all dependencies to the virtualenv rather than relying on the system python modules to meet those dependencies, e.g. pip --upgrade install, but forcing even equally old versions to be installed to override any system modules. I tried activating my virtualenv and then pip install --upgrade -r requirements.txt and that does seem to install the dependencies, even those existing in my system path, but I can't be sure if that's because my system modules were old. And man pip doesn't seem to guarantee this behavior (i.e. installing the same version of a package that already exists in the system site-packages).

Starting with pip version 7.1.2 you can run pip uninstall -y <python package(s)>:
pip uninstall -y package1 package2 package3
or from file
pip uninstall -y -r requirements.txt
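For the original question's batch case, the -y flag can be combined with freeze and xargs; a rough sketch (it assumes the relevant package names all start with django-, which Django itself does not):
pip freeze | grep -E '^django-' | cut -d= -f1 | xargs pip uninstall -y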

Pip does NOT include a --yes option (as of pip version 1.3.1).
WORKAROUND: pipe yes to it!
$ sudo ls # enter pw so not prompted again
$ /usr/bin/yes | sudo pip uninstall pymongo
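The same trick also covers a whole requirements file on pip versions that predate -y (a sketch; it assumes requirements.txt lists exactly the packages you want removed):
$ /usr/bin/yes | sudo pip uninstall -r requirements.txt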

If you want to uninstall every package from requirements.txt,
pip uninstall -y -r requirements.txt

On www.saturncloud.io, in Jupyter notebooks, you can use it like this:
!yes | pip uninstall tensorflow
!yes | pip uninstall gast
!yes | pip uninstall tensorflow-probability

Alternatively, would it be better to force pip to install all dependencies to the virtualenv rather than relying on the system python modules to meet those dependencies,
Yes. Don't mess too much with the system-installed packages. Many parts of the system, particularly on OS X (and even on Debian and its derivatives), depend heavily on them.
pip --upgrade install, but forcing even equally old versions to be installed to override any system modules.
It is not a big deal if a few packages installed in the venv duplicate ones already present in the system installation, particularly if they are of different versions. That's the whole point of virtualenv.
I tried activating my virtualenv and then pip install --upgrade -r requirements.txt and that does seem to install the dependencies, even those existing in my system path, but I can't be sure if that's because my system modules were old. And man pip doesn't seem to guarantee this behavior (i.e. installing the same version of a package that already exists in the system site-packages).
No, it doesn't reinstall packages that are already present in the main installation, unless you created the virtualenv with the --no-site-packages flag or the required and installed versions differ.

Lakshman Prasad was right: pip install --upgrade and/or virtualenv --no-site-packages is the way to go. Uninstalling the system-wide Python modules is bad.
The --upgrade option to pip does install required modules in the virtual env, even if they already exist in the system environment, and even if the required version or latest available version is the same as the system version.
pip install --upgrade
And, using the --no-site-packages option when creating the virtual environment ensures that missing dependencies can't be masked by the presence of those modules in the system path. This helps expose problems during migration of a module from one package to another, e.g. pinax.apps.groups -> django-groups, especially when the problem involves {% load %} templatetags statements in Django, which search all available modules for templatetags directories and the tag definitions within.
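Putting that together, a minimal sketch of the workflow described above (paths and the requirements file name are placeholders; newer virtualenv releases make isolated environments the default and may no longer accept the --no-site-packages flag):
virtualenv --no-site-packages venv
source venv/bin/activate
pip install --upgrade -r requirements.txt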

pip install -U xxxx
can bypass the confirmation prompt.

Related

Pip doesn't require root user to install but requires root user to upgrade

I'm confused by the intended pip usage. Pip comes installed with Python, which is great, but I get the following warnings when new versions come out:
WARNING: You are using pip version 21.1.1; however, version 21.1.3 is available.
You should consider upgrading via the '/usr/local/opt/python@3.8/bin/python3.8 -m pip install --upgrade pip' command.
I follow the instructions to install it using the command they gave. But then it uninstalls my existing pip and is not able to install the new version.
Installing collected packages: pip
Attempting uninstall: pip
Found existing installation: pip 21.1.1
Uninstalling pip-21.1.1:
ERROR: Could not install packages due to an OSError: Cannot move the non-empty directory '/usr/local/lib/python3.8/site-packages/pip-21.1.1.dist-info/': Lacking write permission to '/usr/local/lib/python3.8/site-packages/pip-21.1.1.dist-info/'.
The pip command is now unrecognized, and the official documentation for upgrading pip suggests running:
python -m pip install -U pip
which gives the same permission error.
I Googled this error and found that the community strongly advises against using sudo, based on these questions (this and this). They also advised pip3 install --upgrade pip --user, which gave the same error. The common consensus is to only install pip packages inside virtual environments, but I'm hesitant to have pip completely uninstalled.
So I got pip to install using sudo, but it's unclear whether I've inadvertently affected (or will in future affect) system-wide installations, or how I'd check for that.
I don't understand why installing pip inside /usr/local/ requires sudo, or whether I should be using pip exclusively inside virtual environments and never outside them.
pip can be installed with sudo into a folder that you don't have permission to write to. However, it can still install packages outside of that folder (and thus into a folder you do have write permission for). Still, it is recommended that you don't install pip into a root-owned folder, and instead install it into your home directory.
The command to install pip as root is
sudo apt-get install python3-pip
It should then prompt you for your password. I recommend using sudo whenever you install something system-wide.
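A way to sidestep the permission problem entirely is to upgrade pip inside a virtual environment rather than in /usr/local; a sketch, assuming Python 3 with the venv module and a hypothetical ~/myenv directory:
python3 -m venv ~/myenv
source ~/myenv/bin/activate
python -m pip install --upgrade pip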

Python Pip: Is it safe to delete ~/.local/bin in order to delete all user packages installed by pip?

I installed a few user packages with pip by using
pip3 install --user <package_name>
I did this on a machine running Ubuntu 17.10.
I would like to start afresh. Is it safe to delete ~/.local/bin in order to do so or is there some other, more elegant solution? In particular, I'm worrying about messing with the Python packages, which my system requires to function properly.
The Free Desktop specification notes that ~/.local/bin is a general place for user binaries, so I don't think it's safe to assume that deleting it won't affect anything else.
The best method is to use pip3 uninstall <package> to remove specific packages (note that pip uninstall has no --user flag; it simply removes the installed copy it finds). You can list the packages installed for your user with pip3 list --user.
Edit: A one-liner to delete all pip3 installed packages, using pip3's uninstall method and jq:
pip3 list --user --format=json | jq -r '.[].name' | xargs -I{} pip3 uninstall -y {}
Be careful though, as it will delete everything user installed, whether you use it or not!
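If jq isn't available, a rough equivalent that relies only on pip's own output (it assumes a pip version that supports freeze --user and the usual name==version lines):
pip3 freeze --user | cut -d= -f1 | xargs -n1 pip3 uninstall -y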
You should uninstall the package manually:
pip uninstall package_name

pip 10 and apt: how to avoid "Cannot uninstall X" errors for distutils packages

I am dealing with a legacy Dockerfile. Here is a very simplified version of what I am dealing with:
FROM ubuntu:14.04
RUN apt-get -y update && apt-get -y install \
python-pip \
python-numpy # ...and many other packages
RUN pip install -U pip
RUN pip install -r /tmp/requirements1.txt # includes e.g., numpy==1.13.0
RUN pip install -r /tmp/requirements2.txt
RUN pip install -r /tmp/requirements3.txt
First, several packages are installed using apt, and then several packages are installed using pip. pip version 10 has been released, and part of the release is this new restriction:
Removed support for uninstalling projects which have been installed using distutils. distutils installed projects do not include metadata indicating what files belong to that install and thus it is impossible to actually uninstall them rather than just remove the metadata saying they've been installed while leaving all of the actual files behind.
This leads to the following problem in my setup. For example, first apt installs python-numpy. Later pip tries to install a newer version of numpy from e.g., /tmp/requirements1.txt, and tries to uninstall the older version, but because of the new restriction, it cannot remove this version:
Installing collected packages: numpy
Found existing installation: numpy 1.8.2
Cannot uninstall 'numpy'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
Now I know at this point there are several solutions.
I could avoid installing python-numpy through apt. However, this causes issues because python-numpy pulls in a few other packages as requirements, and I do not know whether another part of the system relies on those packages. And in reality, there are several apt packages installed through the Dockerfile, and each one I remove seems to reveal another Cannot uninstall X error, and removes a number of other packages along with it that our app may or may not rely on.
I could also use the --ignore-installed option when I try to pip install things that have already been installed through apt, but then again I have the same problem of every --ignore-installed argument revealing yet another thing that needs to be ignored.
I could pin pip at an older version that does not have this restriction, but I don't want to be stuck using an outdated version of pip forever.
I have been going around in circles trying to come up with a good solution that involves minimal changes to this legacy Dockerfile, and allows the app we deploy with that file to continue to function as it has been. Any suggestions as to how I can safely get around this problem of pip 10 not being able to install newer versions of distutils packages? Thank you!
UPDATE:
I did not realize that --ignore-installed could be used without a package as an argument to ignore all installed packages. I am considering whether or not this might be a good option for me, and have asked about it here.
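For completeness, the pip-pinning option mentioned in the question would look roughly like this in the Dockerfile (the version bound is illustrative):
RUN pip install -U "pip<10"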
This is the solution I ended up going with, and our apps have been running in production without any issues for close to a month with this fix in place:
All I had to do was to add
--ignore-installed
to the pip install lines in my dockerfile that were raising errors. Using the same dockerfile example from my original question, the fixed dockerfile would look something like:
FROM ubuntu:14.04
RUN apt-get -y update && apt-get -y install \
python-pip \
python-numpy # ...and many other packages
RUN pip install -U pip
RUN pip install -r /tmp/requirements1.txt --ignore-installed # don't try to uninstall existing packages, e.g., numpy
RUN pip install -r /tmp/requirements2.txt
RUN pip install -r /tmp/requirements3.txt
The documentation I could find for --ignore-installed was unclear in my opinion (pip install --help simply says "Ignore the installed packages (reinstalling instead)."), and I asked about the potential dangers of this flag here, but have yet to get a satisfying answer. However, if there are any negative side effects, our production environment has yet to see them, and I think the risk is low to none (at least, that has been our experience). I was able to confirm that in our case, when this flag was used, the existing installation was not uninstalled, but the newer installation was always the one used.
Update:
I wanted to highlight this answer by @ivan_pozdeev. He provides some information that this answer does not include, and he also outlines some potential side effects of my solution.
This is what worked for me:
pip install --ignore-installed <Your package name>
or
sudo pip install --ignore-installed <Your package name>
or (inside jupyter notebook)
import sys
!{sys.executable} -m pip install --ignore-installed <Your package name>
For Windows, write:
conda update --all
pip install --upgrade <Your package name>
OR
conda update --all
pip install <Your package name>
OR
pip install wrapt --upgrade --ignore-installed
pip install <Your package name>
from ERROR: Cannot uninstall 'wrapt'. during upgrade
You can just remove numpy manually but keep the other dependencies installed by apt. Then use pip as before to install the latest version of numpy.
# Manually remove just the numpy installed by distutils
RUN rm /usr/lib/python2.7/dist-packages/numpy-1.8.2.egg-info
RUN rm -r /usr/lib/python2.7/dist-packages/numpy
RUN pip install -U pip
RUN pip install -r /tmp/requirements1.txt
The location of numpy should be the same as above. But if you want to confirm the location, you can run the container without installing the requirements files and issue the following commands in the Python console inside the container.
>>> import numpy
>>> print numpy.__file__
/usr/lib/python2.7/dist-packages/numpy/__init__.pyc

Install package locally

I use Python's pip to install packages. Now I want to install scipy, which is already installed on the system, but as an old version and in a part of the system that I don't have access to. If I try
pip install scipy
pip rightfully tells me that the package is already installed. If I do
pip install scipy --upgrade
pip tries to upgrade the package but I don't have the access rights to do that.
How can I tell pip to install the package local to my user and to ignore the other scipy package?
I think the best way to avoid overriding packages is to use a virtual environment. Python has its own virtual environment tooling, which you can install with:
Python 2.7
> sudo apt-get install python-virtualenv
Python 3
> sudo apt-get install virtualenv
With modern Python versions the equivalent venv module is built in, so a separate install may not be needed. Once virtualenv is available, you can generate a virtual environment by typing:
> virtualenv venv
This creates a folder named venv in the current directory (you can name it whatever you want). The libraries will be installed inside this folder.
So, it's time to activate the virtual environment:
> source venv/bin/activate
You can verify that the environment has been activated by checking that the prompt changes. Once it is active, all packages installed using pip will be installed locally, inside the virtual environment.
(venv)> pip install scipy
You could check this website for more info.
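On Python 3, the built-in venv module gives the same result without installing anything extra; a sketch:
> python3 -m venv venv
> source venv/bin/activate
(venv)> pip install scipy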
Don't forget that you may eventually have to clear your $PYTHONPATH variable so that it doesn't pick up other packages.

How do I remove packages installed with Python's easy_install?

Python's easy_install makes installing new packages extremely convenient. However, as far as I can tell, it doesn't implement the other common features of a dependency manager - listing and removing installed packages.
What is the best way of finding out what's installed, and what is the preferred way of removing installed packages? Are there any files that need to be updated if I remove packages manually (e.g. by rm /usr/local/lib/python2.6/dist-packages/my_installed_pkg.egg or similar)?
pip, an alternative to setuptools/easy_install, provides an "uninstall" command.
Install pip according to the installation instructions:
$ wget https://bootstrap.pypa.io/get-pip.py
$ python get-pip.py
Then you can use pip uninstall to remove packages installed with easy_install
To uninstall an .egg you need to rm -rf the egg (it might be a directory) and remove the matching line from site-packages/easy-install.pth
First you have to run this command:
$ easy_install -m [PACKAGE]
It removes all dependencies of the package.
Then remove egg file of that package:
$ sudo rm -rf /usr/local/lib/python2.X/site-packages/[PACKAGE].egg
All the info is in the other answers, but none summarizes both of your requests, and they seem to make things needlessly complex:
For your removal needs use:
pip uninstall <package>
(install using easy_install pip)
For your 'list installed packages' needs either use:
pip freeze
Or:
yolk -l
which can output more package details.
(Install via easy_install yolk or pip install yolk)
There are several sources on the net suggesting a hack by reinstalling the package with the -m option and then just removing the .egg file in lib/ and the binaries in bin/. Also, discussion about this setuptools issue can be found on the python bug tracker as setuptools issue 21.
Edit: Added the link to the python bugtracker.
If the problem is a serious-enough annoyance to you, you might consider virtualenv. It allows you to create an environment that encapsulates python libraries. You install packages there rather than in the global site-packages directory. Any scripts you run in that environment have access to those packages (and optionally, your global ones as well). I use this a lot when evaluating packages that I am not sure I want/need to install globally. If you decide you don't need the package, it's easy enough to just blow that virtual environment away. It's pretty easy to use. Make a new env:
$>virtualenv /path/to/your/new/ENV
virtualenv installs setuptools for you in the new environment, so you can do:
$>ENV/bin/easy_install
You can even create your own bootstrap scripts that set up your new environment. So, with one command, you can create a new virtual env with, say, Python 2.6, psycopg2 and Django installed by default (you can install an env-specific version of Python if you want).
Official(?) instructions: http://peak.telecommunity.com/DevCenter/EasyInstall#uninstalling-packages
If you have replaced a package with another version, then you can just delete the package(s) you don't need by deleting the PackageName-versioninfo.egg file or directory (found in the installation directory).
If you want to delete the currently installed version of a package (or all versions of a package), you should first run:
easy_install -mxN PackageName
This will ensure that Python doesn't continue to search for a package you're planning to remove. After you've done this, you can safely delete the .egg files or directories, along with any scripts you wish to remove.
try
$ easy_install -m [PACKAGE]
then
$ rm -rf .../python2.X/site-packages/[PACKAGE].egg
To list installed Python packages, you can use yolk -l. You'll need to use easy_install yolk first though.
Came across this question while trying to uninstall the many random Python packages installed over time.
Using information from this thread, this is what I came up with:
cat package_list | xargs -n1 sudo pip uninstall -y
The package_list file is produced by cleaning up (with awk) the output of a pip freeze run in a virtualenv, as sketched below.
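A minimal sketch of producing such a list from pip freeze (it assumes the usual name==version lines; package_list is just the scratch file used above):
pip freeze | awk -F'==' '{print $1}' > package_list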
To remove almost all Python packages:
yolk -l | cut -f 1 -d " " | grep -vE "setuptools|pip|ETC.." | xargs -n1 pip uninstall -y
I ran into the same problem on my MacOS X Leopard 10.6.blah.
Solution is to make sure you're calling the MacPorts Python:
sudo port install python26
sudo port install python_select
sudo python_select python26
sudo port install py26-mysql
Hope this helps.
For me, only deleting the easy-install.pth file worked; after that, pip install django==1.3.7 succeeded.
This worked for me. It's similar to previous answers, but the path to the packages is different.
sudo easy_install -m [PACKAGE]
sudo rm -rf /Library/Python/2.7/site-packages/[PACKAGE].egg
Platform: MacOS High Sierra version 10.13.3
