How do I update Anaconda?

I have Anaconda installed on my computer and I'd like to update it. In Navigator I can see that there are several individual packages that can be updated, but also an anaconda package that sometimes has a version number and sometimes says custom. How do I proceed?

Note: root is the old (pre-conda 4.4) name for the main environment; in conda 4.4 it was renamed to base.
What 95% of people actually want
In most cases what you want to do when you say that you want to update Anaconda is to execute the command:
conda update --all
(This should be preceded by conda update -n base conda, or simply conda update conda, so that you have the latest conda version installed.)
This will update all packages in the current environment to the latest version -- with the small print being that it may use an older version of some packages in order to satisfy dependency constraints (often this won't be necessary and when it is necessary the package plan solver will do its best to minimize the impact).
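If you want to see what this would do before committing to it, conda's --dry-run flag prints the package plan without changing anything. A minimal sketch, assuming conda is on your PATH:

```shell
# Preview the package plan without modifying the environment:
conda update --all --dry-run

# If the plan looks reasonable, run it for real (-y skips the confirmation prompt):
conda update --all -y
```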
This needs to be executed from the command line, and the best way to get there is from Anaconda Navigator, then the "Environments" tab, then click on the triangle beside the base environment, selecting "Open Terminal":
This operation will only update the one selected environment (in this case, the base environment). If you have other environments you'd like to update you can repeat the process above, but first click on the environment. When it is selected there is a triangular marker on the right (see image above, step 3). Or from the command line you can provide the environment name (-n envname) or path (-p /path/to/env), for example to update your dspyr environment from the screenshot above:
conda update -n dspyr --all
Update individual packages
If you are only interested in updating an individual package then simply click on the blue arrow or blue version number in Navigator, e.g. for astroid or astropy in the screenshot above, and this will tag those packages for an upgrade. When you are done you need to click the "Apply" button:
Or from the command line:
conda update astroid astropy
Updating just the packages in the standard Anaconda Distribution
If you don't care about package versions and just want "the latest set of all packages in the standard Anaconda Distribution, so long as they work together", then you should take a look at this gist.
Why updating the Anaconda package is almost always a bad idea
In most cases updating the Anaconda package in the package list will have a surprising result: you may actually downgrade many packages (in fact, this is likely if it indicates the version as custom). The gist above provides details.
Leverage conda environments
Your base environment is probably not a good place to try to manage an exact set of packages: it is going to be a dynamic working space with new packages installed and packages randomly updated. If you need an exact set of packages, create a conda environment to hold them. Thanks to the conda package cache and the way file linking is used, doing this is typically (i) fast and (ii) consumes very little additional disk space. For example:
conda create -n myspecialenv -c bioconda -c conda-forge python=3.5 pandas beautifulsoup seaborn nltk
The conda documentation has more details and examples.
pip, PyPI, and setuptools?
None of this is going to help with updating packages that have been installed from PyPI via pip or any packages installed using python setup.py install. conda list will give you some hints about the pip-based Python packages you have in an environment, but it won't do anything special to update them.
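As a rough sketch (the package name below is just a placeholder example), you can spot the pip-installed packages by the pypi channel marker in conda list, and then update them with pip itself:

```shell
# pip-installed packages show up with a "pypi" channel in conda list:
conda list | grep pypi

# See which of them pip considers outdated:
pip list --outdated

# Update one of them with pip (requests is a placeholder example):
pip install --upgrade requests
```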
Commercial use of Anaconda or Anaconda Enterprise
It is pretty much exactly the same story, with the exception that you may not be able to update the base environment if it was installed by someone else (say to /opt/anaconda/latest). If you're not able to update the environments you are using you should be able to clone and then update:
conda create -n myenv --clone base
conda update -n myenv --all

If you are trying to update your Anaconda version to a new one, you'll notice that running the new installer won't work: it complains that the installation directory is non-empty.
So you should use conda to upgrade as detailed by the official docs:
conda update conda
conda update anaconda
On Windows, if you made a "for all users" installation, it might be necessary to run from an Anaconda prompt with Administrator privileges.
This prevents the error:
ERROR conda.core.link:_execute(502): An error occurred while uninstalling package 'defaults::conda-4.5.4-py36_0'.
PermissionError(13, 'Access is denied')

Open a command prompt or Anaconda prompt and run:
conda update conda
conda update anaconda
It's a good idea to run both commands twice (one after the other) to be sure that all the basic files are updated.
This should put you back on the latest release, which contains packages that are selected by the people at Continuum to work well together.
If you want the latest version of each package, run (this can lead to an unstable environment):
conda update --all
Hope this helps.
Sources:
https://docs.anaconda.com/anaconda/install/update-version
https://github.com/conda/conda/issues/1414#issuecomment-119071154

This is what the official Anaconda documentation recommends:
conda update conda
conda install anaconda=2021.11
You can find the current and past version codes here.
The command will update to a specific release of the Anaconda meta-package.
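Before pinning a release, it can help to check which metapackage version is currently installed. A small sketch: conda list accepts a regex, so anchoring it avoids matching anaconda-client, anaconda-project, and so on:

```shell
# Show only the anaconda metapackage itself, not anaconda-client etc.:
conda list '^anaconda$'

# Then update conda and pin the release you want (version code from the docs):
conda update conda
conda install anaconda=2021.11
```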
Contrary to the claim made in the accepted answer, I feel this is more like what 95% of Anaconda users actually want: upgrading to the latest version of the Anaconda metapackage (put together and tested by the Anaconda distributors) while ignoring the update status of individual packages, which is what conda update --all would do.

Open the Anaconda prompt in the base environment.
Then use conda update conda to update conda itself.
You can then use conda update --all to update all the requirements for Anaconda:
conda update conda
conda update --all

Here's the best practice (in my humble experience): selecting these four packages in Navigator will also update all other dependencies to appropriate versions, which helps keep your environment consistent. Inconsistent environments are a common problem others have expressed in earlier responses. This solution doesn't need the terminal.

If you have trouble getting from, e.g., 3.3.x to 4.x (conda update conda "does not work" to get to the next version), then try being more specific, like so:
conda install conda=4.0 (or conda install anaconda=4.0)
https://www.anaconda.com/blog/developer-blog/anaconda-4-release/
Know what you are doing, because the forced installation could break conda.
If you would like more flexibility/safety, you could use a package manager like Nix (nixpkgs, with nix-shell) or NixOS.

Intro
This answer wraps up many answers and comments; it does not add new code. All credit goes to the other answers, especially the one that shows how to install the official release, fully in line with the docs.
In the following, the "docs" mean the official Anaconda documentation at Updating from older versions. It makes sense to read the docs, it is a short overview.
And since it will be used quite often, here is the definition of metapackage:
A metapackage is a very simple package that has at least a name and a
version. It need not have any dependencies or build steps.
Metapackages may list dependencies to several core, low-level
libraries and may contain links to software files that are
automatically downloaded when executed.
First step
As a first step before the anaconda install, you update conda:
conda update conda
Second step
As a second step, you have three choices: custom or official metapackage, or conda update --all.
1. Custom metapackage
If you are allowed to have the most recent custom metapackage (mind that this might not always be the best choice for standard packages with constrained dependencies), then you can use
conda install anaconda
Docs:
There is a special “custom” version of the Anaconda metapackage that
has all the package dependencies, but none of them are constrained.
The “custom” version is lower in version ordering than any actual
release number.
The starting point for these tests was the installed release 2021.05. From there, conda update anaconda and conda install anaconda both lead to the same new "downgraded" custom version custom-py38_1; see the bottom of the code blocks: anaconda 2021.05-py38_0 --> custom-py38_1. But using update leads to far more installed packages than install here:
update leads to more installation steps than install
(base) C:\WINDOWS\system32>conda update anaconda
Collecting package metadata (current_repodata.json): done
Solving environment: done
## Package Plan ##
environment location: C:\Users\toeft\anaconda3
added / updated specs:
- anaconda
The following packages will be downloaded:
package | build
---------------------------|-----------------
_anaconda_depends-2020.07 | py38_0 6 KB
anaconda-custom | py38_1 36 KB
anaconda-client-1.8.0 | py38haa95532_0 170 KB
anaconda-project-0.10.1 | pyhd3eb1b0_0 218 KB
astroid-2.6.6 | py38haa95532_0 314 KB
astropy-4.3.1 | py38hc7d831d_0 6.1 MB
attrs-21.2.0 | pyhd3eb1b0_0 46 KB
babel-2.9.1 | pyhd3eb1b0_0 5.5 MB
...
xlsxwriter-3.0.1 | pyhd3eb1b0_0 111 KB
xlwings-0.24.7 | py38haa95532_0 887 KB
zeromq-4.3.4 | hd77b12b_0 4.2 MB
zipp-3.5.0 | pyhd3eb1b0_0 13 KB
zope.interface-5.4.0 | py38h2bbff1b_0 305 KB
zstd-1.4.9 | h19a0ad4_0 478 KB
------------------------------------------------------------
Total: 218.2 MB
The following NEW packages will be INSTALLED:
_anaconda_depends pkgs/main/win-64::_anaconda_depends-2020.07-py38_0
cfitsio pkgs/main/win-64::cfitsio-3.470-he774522_6
charset-normalizer pkgs/main/noarch::charset-normalizer-2.0.4-pyhd3eb1b0_0
conda-pack pkgs/main/noarch::conda-pack-0.6.0-pyhd3eb1b0_0
debugpy pkgs/main/win-64::debugpy-1.4.1-py38hd77b12b_0
fonttools pkgs/main/noarch::fonttools-4.25.0-pyhd3eb1b0_0
gmpy2 pkgs/main/win-64::gmpy2-2.0.8-py38h7edee0f_3
libllvm9 pkgs/main/win-64::libllvm9-9.0.1-h21ff451_0
matplotlib-inline pkgs/main/noarch::matplotlib-inline-0.1.2-pyhd3eb1b0_2
mpc pkgs/main/win-64::mpc-1.1.0-h7edee0f_1
mpfr pkgs/main/win-64::mpfr-4.0.2-h62dcd97_1
mpir pkgs/main/win-64::mpir-3.0.0-hec2e145_1
munkres pkgs/main/noarch::munkres-1.1.4-py_0
The following packages will be REMOVED:
jupyter-packaging-0.7.12-pyhd3eb1b0_0
The following packages will be UPDATED:
anaconda-client 1.7.2-py38_0 --> 1.8.0-py38haa95532_0
anaconda-project 0.9.1-pyhd3eb1b0_1 --> 0.10.1-pyhd3eb1b0_0
astroid 2.5-py38haa95532_1 --> 2.6.6-py38haa95532_0
astropy 4.2.1-py38h2bbff1b_1 --> 4.3.1-py38hc7d831d_0
attrs 20.3.0-pyhd3eb1b0_0 --> 21.2.0-pyhd3eb1b0_0
babel 2.9.0-pyhd3eb1b0_0 --> 2.9.1-pyhd3eb1b0_0
bitarray 1.9.2-py38h2bbff1b_1 --> 2.3.0-py38h2bbff1b_1
bleach 3.3.0-pyhd3eb1b0_0 --> 4.0.0-pyhd3eb1b0_0
bokeh 2.3.2-py38haa95532_0 --> 2.3.3-py38haa95532_0
ca-certificates 2021.4.13-haa95532_1 --> 2021.7.5-haa95532_1
certifi 2020.12.5-py38haa95532_0 --> 2021.5.30-py38haa95532_0
cffi 1.14.5-py38hcd4344a_0 --> 1.14.6-py38h2bbff1b_0
click 7.1.2-pyhd3eb1b0_0 --> 8.0.1-pyhd3eb1b0_0
comtypes 1.1.9-py38haa95532_1002 --> 1.1.10-py38haa95532_1002
curl 7.71.1-h2a8f88b_1 --> 7.78.0-h86230a5_0
cython 0.29.23-py38hd77b12b_0 --> 0.29.24-py38hd77b12b_0
dask 2021.4.0-pyhd3eb1b0_0 --> 2021.8.1-pyhd3eb1b0_0
dask-core 2021.4.0-pyhd3eb1b0_0 --> 2021.8.1-pyhd3eb1b0_0
decorator 5.0.6-pyhd3eb1b0_0 --> 5.0.9-pyhd3eb1b0_0
distributed 2021.4.0-py38haa95532_0 --> 2021.8.1-py38haa95532_0
docutils 0.17-py38haa95532_1 --> 0.17.1-py38haa95532_1
et_xmlfile pkgs/main/noarch::et_xmlfile-1.0.1-py~ --> pkgs/main/win-64::et_xmlfile-1.1.0-py38haa95532_0
fsspec 0.9.0-pyhd3eb1b0_0 --> 2021.7.0-pyhd3eb1b0_0
gevent 21.1.2-py38h2bbff1b_1 --> 21.8.0-py38h2bbff1b_1
greenlet 1.0.0-py38hd77b12b_2 --> 1.1.1-py38hd77b12b_0
idna 2.10-pyhd3eb1b0_0 --> 3.2-pyhd3eb1b0_0
imagecodecs 2021.3.31-py38h5da4933_0 --> 2021.6.8-py38h5da4933_0
intel-openmp 2021.2.0-haa95532_616 --> 2021.3.0-haa95532_3372
ipykernel 5.3.4-py38h5ca1d4c_0 --> 6.2.0-py38haa95532_1
ipython 7.22.0-py38hd4e2768_0 --> 7.26.0-py38hd4e2768_0
isort 5.8.0-pyhd3eb1b0_0 --> 5.9.3-pyhd3eb1b0_0
itsdangerous 1.1.0-pyhd3eb1b0_0 --> 2.0.1-pyhd3eb1b0_0
jinja2 2.11.3-pyhd3eb1b0_0 --> 3.0.1-pyhd3eb1b0_0
json5 0.9.5-py_0 --> 0.9.6-pyhd3eb1b0_0
jupyterlab 3.0.14-pyhd3eb1b0_1 --> 3.1.7-pyhd3eb1b0_0
jupyterlab_server 2.4.0-pyhd3eb1b0_0 --> 2.7.1-pyhd3eb1b0_0
keyring 22.3.0-py38haa95532_0 --> 23.0.1-py38haa95532_0
krb5 1.18.2-hc04afaa_0 --> 1.19.2-h5b6d351_0
libcurl 7.71.1-h2a8f88b_1 --> 7.78.0-h86230a5_0
libxml2 2.9.10-hb89e7f3_3 --> 2.9.12-h0ad7f3c_0
lz4-c 1.9.3-h2bbff1b_0 --> 1.9.3-h2bbff1b_1
markupsafe 1.1.1-py38he774522_0 --> 2.0.1-py38h2bbff1b_0
matplotlib 3.3.4-py38haa95532_0 --> 3.4.2-py38haa95532_0
matplotlib-base 3.3.4-py38h49ac443_0 --> 3.4.2-py38h49ac443_0
mkl 2021.2.0-haa95532_296 --> 2021.3.0-haa95532_524
mkl-service 2.3.0-py38h2bbff1b_1 --> 2.4.0-py38h2bbff1b_0
mkl_random 1.2.1-py38hf11a4ad_2 --> 1.2.2-py38hf11a4ad_0
more-itertools 8.7.0-pyhd3eb1b0_0 --> 8.8.0-pyhd3eb1b0_0
nbconvert 6.0.7-py38_0 --> 6.1.0-py38haa95532_0
networkx 2.5-py_0 --> 2.6.2-pyhd3eb1b0_0
nltk 3.6.1-pyhd3eb1b0_0 --> 3.6.2-pyhd3eb1b0_0
notebook 6.3.0-py38haa95532_0 --> 6.4.3-py38haa95532_0
numpy 1.20.1-py38h34a8a5c_0 --> 1.20.3-py38ha4e8547_0
numpy-base 1.20.1-py38haf7ebc8_0 --> 1.20.3-py38hc2deb75_0
openjpeg 2.3.0-h5ec785f_1 --> 2.4.0-h4fc8c34_0
openssl 1.1.1k-h2bbff1b_0 --> 1.1.1l-h2bbff1b_0
packaging 20.9-pyhd3eb1b0_0 --> 21.0-pyhd3eb1b0_0
pandas 1.2.4-py38hd77b12b_0 --> 1.3.2-py38h6214cd6_0
path 15.1.2-py38haa95532_0 --> 16.0.0-py38haa95532_0
pathlib2 2.3.5-py38haa95532_2 --> 2.3.6-py38haa95532_2
pillow 8.2.0-py38h4fa10fc_0 --> 8.3.1-py38h4fa10fc_0
pkginfo 1.7.0-py38haa95532_0 --> 1.7.1-py38haa95532_0
prometheus_client 0.10.1-pyhd3eb1b0_0 --> 0.11.0-pyhd3eb1b0_0
pydocstyle 6.0.0-pyhd3eb1b0_0 --> 6.1.1-pyhd3eb1b0_0
pyerfa 1.7.3-py38h2bbff1b_0 --> 2.0.0-py38h2bbff1b_0
pygments 2.8.1-pyhd3eb1b0_0 --> 2.10.0-pyhd3eb1b0_0
pylint 2.7.4-py38haa95532_1 --> 2.9.6-py38haa95532_1
pyodbc 4.0.30-py38ha925a31_0 --> 4.0.31-py38hd77b12b_0
pytest 6.2.3-py38haa95532_2 --> 6.2.4-py38haa95532_2
python-dateutil 2.8.1-pyhd3eb1b0_0 --> 2.8.2-pyhd3eb1b0_0
pywin32 227-py38he774522_1 --> 228-py38hbaba5e8_1
pyzmq 20.0.0-py38hd77b12b_1 --> 22.2.1-py38hd77b12b_1
qtconsole 5.0.3-pyhd3eb1b0_0 --> 5.1.0-pyhd3eb1b0_0
qtpy 1.9.0-py_0 --> 1.10.0-pyhd3eb1b0_0
regex 2021.4.4-py38h2bbff1b_0 --> 2021.8.3-py38h2bbff1b_0
requests 2.25.1-pyhd3eb1b0_0 --> 2.26.0-pyhd3eb1b0_0
rope 0.18.0-py_0 --> 0.19.0-pyhd3eb1b0_0
scikit-learn 0.24.1-py38hf11a4ad_0 --> 0.24.2-py38hf11a4ad_1
seaborn 0.11.1-pyhd3eb1b0_0 --> 0.11.2-pyhd3eb1b0_0
singledispatch 3.6.1-pyhd3eb1b0_1001 --> 3.7.0-pyhd3eb1b0_1001
six pkgs/main/win-64::six-1.15.0-py38haa9~ --> pkgs/main/noarch::six-1.16.0-pyhd3eb1b0_0
sortedcontainers 2.3.0-pyhd3eb1b0_0 --> 2.4.0-pyhd3eb1b0_0
sphinx 4.0.1-pyhd3eb1b0_0 --> 4.0.2-pyhd3eb1b0_0
sphinxcontrib-htm~ 1.0.3-pyhd3eb1b0_0 --> 2.0.0-pyhd3eb1b0_0
sphinxcontrib-ser~ 1.1.4-pyhd3eb1b0_0 --> 1.1.5-pyhd3eb1b0_0
sqlalchemy 1.4.7-py38h2bbff1b_0 --> 1.4.22-py38h2bbff1b_0
sqlite 3.35.4-h2bbff1b_0 --> 3.36.0-h2bbff1b_0
testpath 0.4.4-pyhd3eb1b0_0 --> 0.5.0-pyhd3eb1b0_0
threadpoolctl 2.1.0-pyh5ca1d4c_0 --> 2.2.0-pyhbf3da8f_0
tifffile 2021.4.8-pyhd3eb1b0_2 --> 2021.7.2-pyhd3eb1b0_2
tqdm 4.59.0-pyhd3eb1b0_1 --> 4.62.1-pyhd3eb1b0_1
typed-ast 1.4.2-py38h2bbff1b_1 --> 1.4.3-py38h2bbff1b_1
typing_extensions 3.7.4.3-pyha847dfd_0 --> 3.10.0.0-pyh06a4308_0
urllib3 1.26.4-pyhd3eb1b0_0 --> 1.26.6-pyhd3eb1b0_1
wheel 0.36.2-pyhd3eb1b0_0 --> 0.37.0-pyhd3eb1b0_0
xlsxwriter 1.3.8-pyhd3eb1b0_0 --> 3.0.1-pyhd3eb1b0_0
xlwings 0.23.0-py38haa95532_0 --> 0.24.7-py38haa95532_0
zeromq 4.3.3-ha925a31_3 --> 4.3.4-hd77b12b_0
zipp 3.4.1-pyhd3eb1b0_0 --> 3.5.0-pyhd3eb1b0_0
zope.interface 5.3.0-py38h2bbff1b_0 --> 5.4.0-py38h2bbff1b_0
zstd 1.4.5-h04227a9_0 --> 1.4.9-h19a0ad4_0
The following packages will be DOWNGRADED:
anaconda 2021.05-py38_0 --> custom-py38_1
install leads to fewer installation steps than update:
(base) C:\WINDOWS\system32>conda install anaconda
Collecting package metadata (current_repodata.json): done
Solving environment: done
## Package Plan ##
environment location: C:\Users\toeft\anaconda3
added / updated specs:
- anaconda
The following packages will be downloaded:
package | build
---------------------------|-----------------
_anaconda_depends-2020.07 | py38_0 6 KB
anaconda-custom | py38_1 36 KB
ca-certificates-2021.7.5 | haa95532_1 113 KB
certifi-2021.5.30 | py38haa95532_0 140 KB
gmpy2-2.0.8 | py38h7edee0f_3 145 KB
libllvm9-9.0.1 | h21ff451_0 61 KB
mpc-1.1.0 | h7edee0f_1 260 KB
mpfr-4.0.2 | h62dcd97_1 1.5 MB
mpir-3.0.0 | hec2e145_1 1.3 MB
openssl-1.1.1l | h2bbff1b_0 4.8 MB
------------------------------------------------------------
Total: 8.4 MB
The following NEW packages will be INSTALLED:
_anaconda_depends pkgs/main/win-64::_anaconda_depends-2020.07-py38_0
gmpy2 pkgs/main/win-64::gmpy2-2.0.8-py38h7edee0f_3
libllvm9 pkgs/main/win-64::libllvm9-9.0.1-h21ff451_0
mpc pkgs/main/win-64::mpc-1.1.0-h7edee0f_1
mpfr pkgs/main/win-64::mpfr-4.0.2-h62dcd97_1
mpir pkgs/main/win-64::mpir-3.0.0-hec2e145_1
The following packages will be UPDATED:
ca-certificates 2021.4.13-haa95532_1 --> 2021.7.5-haa95532_1
certifi 2020.12.5-py38haa95532_0 --> 2021.5.30-py38haa95532_0
openssl 1.1.1k-h2bbff1b_0 --> 1.1.1l-h2bbff1b_0
The following packages will be DOWNGRADED:
anaconda 2021.05-py38_0 --> custom-py38_1
2. Official metapackage (= release)
In the following code snippets, update and install lead to the same results. I use install, as in the docs.
If you do not want to install a custom version of the metapackage but rather need the most recent official release, install with
conda install anaconda=VersionNumber
Find the VersionNumber
At the time of writing (09/2021), the latest available release (Anaconda Individual Edition) is 2021.05, so the command is:
conda install anaconda=2021.05
But how to get hold of this VersionNumber?
Have a look at the Anaconda Release notes of the individual edition. If you need an older version, you need to scroll down that page, for example to find 2020.11. The most recent is always on top of the page. If you use a commercial edition, you need to check other release notes.
Thus, a version code like 2021.05 is the latest-release shortcut you need to find. You can also find the full version name for your OS, for example Anaconda3-2021.05-Windows-x86_64.exe, in the list of available Anaconda versions that is linked directly in the docs. That list is sorted by name and date, so you need to search for the year ("YYYY-MM" / "YYYY-") or scroll through the whole list to find the most recent versions.
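Instead of scrolling the archive page, you can also ask conda itself which metapackage versions it can see. A sketch, assuming the defaults channel is configured:

```shell
# List all anaconda metapackage versions known to your channels;
# the most recent version codes appear at the end of the output:
conda search anaconda

# Then install the release you picked:
conda install anaconda=2021.05
```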
For the example of Windows 10 64-bit, the installer file name Anaconda3-2021.05-Windows-x86_64.exe corresponds to the version code 2021.05 (conda expects the version code, not the installer file name), so the command is again:
conda update anaconda=2021.05
If you install a release after having installed the most recent custom metapackage, you will see some packages being removed and quite a few downgraded slightly. This is because the release lies slightly back in time, but is therefore also fully trusted.
Docs:
conda update anaconda=VersionNumber grabs the specific release of the
Anaconda metapackage, for example conda update anaconda=2019.10. That
metapackage represents a pinned state that has undergone testing as a
collection.
3. Do not use conda update --all
According to the docs (last sentence of the quote below), installing the custom (= most recent) metapackage of 2019.07 can also be done by running
conda update --all
and if you have virtual environments, you need:
conda update -n myenv --all
YET: This was probably an exception for 2019.07. It does not seem to hold for newer metapackage versions. I checked the differences of conda update --all against conda update anaconda in a row-by-row comparison (see below, after the quote). Although they seem like twins at first, there were enough small differences to say that you should keep your hands off conda update --all, since possible conflicting constraints are even mentioned in the docs.
Docs:
conda update --all will unpin everything. This updates all packages in
the current environment to the latest version. In doing so, it drops
all the version constraints from the history and tries to make
everything as new as it can.
This has the same behavior with removing packages. If any packages are
orphaned by an update, they are removed. conda update --all may not be
able to make everything the latest versions because you may have
conflicting constraints in your environment.
With Anaconda 2019.07’s newer Anaconda metapackage, conda update --all
will make the metapackage go to the custom version in order to update
other specs.
The whole output, put side by side row by row, reveals the following remaining differences. This proves that conda update --all does not just install the custom metapackage:
conda update --all output lines not found in conda update anaconda
(base) C:\WINDOWS\system32>conda update --all
The following packages will be downloaded:
anaconda-navigator-2.0.4 | py38_0 5.2 MB
conda-build-3.21.4 | py38haa95532_0 552 KB
conda-content-trust-0.1.1 | pyhd3eb1b0_0 56 KB
conda-repo-cli-1.0.4 | pyhd3eb1b0_0 47 KB
conda-token-0.3.0 | pyhd3eb1b0_0 10 KB
menuinst-1.4.17 | py38h59b6b97_0 96 KB
python-3.8.11 | h6244533_1 16.0 MB
Total: 224.8 MB
The following NEW packages will be INSTALLED:
conda-content-tru~ pkgs/main/noarch::conda-content-trust-0.1.1-pyhd3eb1b0_0
conda-repo-cli pkgs/main/noarch::conda-repo-cli-1.0.4-pyhd3eb1b0_0
conda-token pkgs/main/noarch::conda-token-0.3.0-pyhd3eb1b0_0
The following packages will be UPDATED:
anaconda-navigator 1.10.0-py38_0 --> 2.0.4-py38_0
conda-build 3.20.5-py38_1 --> 3.21.4-py38haa95532_0
et_xmlfile pkgs/main/noarch::et_xmlfile-1.0.1-py~ --> pkgs/main/win-64::et_xmlfile-1.1.0-py38haa95532_0
menuinst 1.4.16-py38he774522_1 --> 1.4.17-py38h59b6b97_0
python 3.8.8-hdbf39b2_5 --> 3.8.11-h6244533_1
six pkgs/main/win-64::six-1.15.0-py38haa9~ --> pkgs/main/noarch::six-1.16.0-pyhd3eb1b0_0
sphinxcontrib-htm~ 1.0.3-pyhd3eb1b0_0 --> 2.0.0-pyhd3eb1b0_0
sphinxcontrib-ser~ 1.1.4-pyhd3eb1b0_0 --> 1.1.5-pyhd3eb1b0_0
conda update anaconda output lines not found in conda update --all
(base) C:\WINDOWS\system32>conda update anaconda
added / updated specs:
- anaconda
The following packages will be downloaded:
cfitsio-3.470 | he774522_6 512 KB
imagecodecs-2021.6.8 | py38h5da4933_0 6.1 MB
jinja2-3.0.1 | pyhd3eb1b0_0 110 KB
tifffile-2021.7.2 | pyhd3eb1b0_2 135 KB
typed-ast-1.4.3 | py38h2bbff1b_1 135 KB
Total: 209.8 MB
The following NEW packages will be INSTALLED:
cfitsio pkgs/main/win-64::cfitsio-3.470-he774522_6
The following packages will be UPDATED:
et_xmlfile pkgs/main/noarch::et_xmlfile-1.0.1-py~ --> pkgs/main/win-64::et_xmlfile-1.1.0-py38haa95532_0
imagecodecs 2021.3.31-py38h5da4933_0 --> 2021.6.8-py38h5da4933_0
jinja2 2.11.3-pyhd3eb1b0_0 --> 3.0.1-pyhd3eb1b0_0
six pkgs/main/win-64::six-1.15.0-py38haa9~ --> pkgs/main/noarch::six-1.16.0-pyhd3eb1b0_0
sphinxcontrib-htm~ 1.0.3-pyhd3eb1b0_0 --> 2.0.0-pyhd3eb1b0_0
sphinxcontrib-ser~ 1.1.4-pyhd3eb1b0_0 --> 1.1.5-pyhd3eb1b0_0
tifffile 2021.4.8-pyhd3eb1b0_2 --> 2021.7.2-pyhd3eb1b0_2
typed-ast 1.4.2-py38h2bbff1b_1 --> 1.4.3-py38h2bbff1b_1
Therefore, conda update --all is not recommended. Better stick to the custom metapackage if you need the most recent possible update, or take the official metapackage if you are fine with a lag of a couple of months and a conflict-free collection of packages matters most (for example, in a production environment).
Result: Which to install: official or custom metapackage?
Some answers or comments say that the custom metapackage install might need to be run twice to get to a proper state. I cannot confirm this (tested with conda install anaconda and conda update anaconda, but I am also in a fresh Python installation). This is still a hint that it might be more stable to install the most recent official metapackage (= release, conda install anaconda=VersionNumber = conda update anaconda=VersionNumber) which can have a lag of some months.
On the other hand, the custom metapackage (the most recent trusted package collection) might be good if you want the most recent versions available. Then run conda install anaconda or the even stronger command conda update anaconda.
This is also the way to update Spyder:
They do not even use conda update conda before conda update anaconda; the latter seems to be enough.
Small "proof": I used conda update conda first, and after that, conda update anaconda had nothing left to do; conda update conda had done all of the tasks.
conda update anaconda
Collecting package metadata (current_repodata.json): done Solving environment: done
# All requested packages already installed.
That again sounds as if both commands now do the same thing; perhaps they only differed in the past.
The choice is up to you; it depends on how urgently you need to be up to date with some packages. Just start the command to see what would happen: you can still enter n to cancel the installation. I am going to take
conda update anaconda
without conda update conda.
And do not use conda update --all unless you need the most recent update of some package, for example as a requirement for another package to be installed. I ran into that when testing --all: only then was a new tensorflow add-on suggested for download, not after the other commands. Normally you will not need to be that up to date, so do not use --all.

Yet, another answer:
conda update -n base conda -c anaconda
where -c specifies your preferred channel (or simply leave it out).
copied from here

I'm using Windows 10. The following updates everything and also installs some new packages, including a Python update (for me it was 3.7.3).
At the shell, try the following (be sure to change the path to wherever your Anaconda3 is installed). It takes some time to update everything.
conda update --prefix X:\XXXXData\Anaconda3 anaconda

To update your installed version to the latest version, say 2019.07, run:
conda install anaconda=2019.07
In most cases, this method can meet your needs and avoid dependency problems.

On Mac, open a terminal and run the following two commands.
conda update conda
conda update anaconda
Make sure to run each command multiple times to update to the current version.

Use:
conda create -n py37 -c anaconda anaconda=5.3.1
conda env export -n py37 --file env.yaml
Locate the env.yaml file in C:\Windows\System32 and run the cmd as administrator:
conda env update -n root -f env.yaml
Then it works!

I also tried updating anaconda using conda install -n base anaconda=2022.10, but this resulted in conflicts indicating that my python version was too low (3.6).
I eventually managed to update using the following command:
conda install -n base anaconda=2022.10 python=3.8
The latest anaconda version code can be found in the release notes.
(In between I also reset my base environment using conda install --rev 0, not sure this was necessary though. In any case, be aware of possible consequences when trying this!)


conda: what difference does it make if we set pip_interop_enabled=True?

There are many posts on this site which reference, typically in passing, the idea of setting pip_interop_enabled=True within some environment. This makes conda and pip3 somehow interact better, I am told. To be precise, people say conda will search PyPI for packages that don't exist in the main channels if this is true. They also say it's "experimental."
Here is conda's documentation about this. It notes that much of conda's behavior in recent versions has also improved even with pip_interop_enabled=False, leading to questions about what this setting even does.
Here is my question: in real terms, what does all of this mean?
Is the only difference that conda will search PyPI if this is True and not if it's False?
Are there other things that it does? For instance, if I need to install some package from pip, will conda know better not to clobber it if this setting is True?
What, to be precise, goes wrong if I set this to True? Are there known edge cases that somehow break things if this "experimental" setting is set to True?
Why would I ever not want to set this?
Not a PyPI Searching Feature
First, let's clarify: Conda will not "search PyPI" - that is not what the pip_interop_enabled configuration option adds. Rather, it enables the solver to allow a package already installed with pip to satisfy a dependency requirement of a Conda package. Note that the option is about Pip interoperability (as distinct from PyPI), and it doesn't matter whether the package was sourced from PyPI, GitHub, a local directory, etc.
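For reference, a sketch of how to turn the option on (myenv is a placeholder environment name): it can be set per command via an environment variable, as in the transcript further below, or persisted with conda config:

```shell
# One-off: enable pip interoperability for a single command
# (myenv is a placeholder environment name):
CONDA_PIP_INTEROP_ENABLED=1 conda install -n myenv scipy

# Or persist the setting in .condarc and verify it:
conda config --set pip_interop_enabled true
conda config --show pip_interop_enabled
```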
Example: scipy -> numpy
Let's consider a simple example to illustrate the behavior. Start with the following environment that has Python 3.10 and numpy installed from PyPI.
pip_interop.yaml
name: pip_interop
channels:
  - conda-forge
dependencies:
  - python=3.10
  - pip
  ## PyPI packages
  - pip:
      - numpy
which we can create with
conda env create -n pip_interop -f pip_interop.yaml
and verify that the numpy is from PyPI:
$ conda list -n pip_interop numpy
# packages in environment at /Users/user/mambaforge/envs/pip_interop:
#
# Name Version Build Channel
numpy 1.24.2 pypi_0
Let's see what would happen installing scipy and, in particular, how its numpy dependency is satisfied.
Installing without Pip interoperability
In default mode, we see the following behavior
$ conda install -n pip_interop scipy
Collecting package metadata (current_repodata.json): done
Solving environment: done
## Package Plan ##
environment location: /Users/user/mambaforge/envs/pip_interop
added / updated specs:
- scipy
The following packages will be downloaded:
package | build
---------------------------|-----------------
cryptography-39.0.1 | py310hdd0c95c_0 1.1 MB
numpy-1.24.2 | py310h788a5b3_0 6.1 MB
scipy-1.10.0 | py310h240c617_2 20.2 MB
------------------------------------------------------------
Total: 27.4 MB
The following NEW packages will be INSTALLED:
appdirs conda-forge/noarch::appdirs-1.4.4-pyh9f0ad1d_0
brotlipy conda-forge/osx-64::brotlipy-0.7.0-py310h90acd4f_1005
certifi conda-forge/noarch::certifi-2022.12.7-pyhd8ed1ab_0
cffi conda-forge/osx-64::cffi-1.15.1-py310ha78151a_3
charset-normalizer conda-forge/noarch::charset-normalizer-2.1.1-pyhd8ed1ab_0
cryptography conda-forge/osx-64::cryptography-39.0.1-py310hdd0c95c_0
idna conda-forge/noarch::idna-3.4-pyhd8ed1ab_0
libblas conda-forge/osx-64::libblas-3.9.0-16_osx64_openblas
libcblas conda-forge/osx-64::libcblas-3.9.0-16_osx64_openblas
libcxx conda-forge/osx-64::libcxx-14.0.6-hccf4f1f_0
libgfortran conda-forge/osx-64::libgfortran-5.0.0-11_3_0_h97931a8_27
libgfortran5 conda-forge/osx-64::libgfortran5-11.3.0-h082f757_27
liblapack conda-forge/osx-64::liblapack-3.9.0-16_osx64_openblas
libopenblas conda-forge/osx-64::libopenblas-0.3.21-openmp_h429af6e_3
llvm-openmp conda-forge/osx-64::llvm-openmp-15.0.7-h61d9ccf_0
numpy conda-forge/osx-64::numpy-1.24.2-py310h788a5b3_0
packaging conda-forge/noarch::packaging-23.0-pyhd8ed1ab_0
pooch conda-forge/noarch::pooch-1.6.0-pyhd8ed1ab_0
pycparser conda-forge/noarch::pycparser-2.21-pyhd8ed1ab_0
pyopenssl conda-forge/noarch::pyopenssl-23.0.0-pyhd8ed1ab_0
pysocks conda-forge/noarch::pysocks-1.7.1-pyha2e5f31_6
python_abi conda-forge/osx-64::python_abi-3.10-3_cp310
requests conda-forge/noarch::requests-2.28.2-pyhd8ed1ab_0
scipy conda-forge/osx-64::scipy-1.10.0-py310h240c617_2
urllib3 conda-forge/noarch::urllib3-1.26.14-pyhd8ed1ab_0
Proceed ([y]/n)?
Observe that despite numpy already being installed in the environment, Conda proposes to replace it with a Conda version. That is, Conda only considers the information in conda-meta/ to determine whether a package is installed and won't check the environment's lib/python3.10/site-packages/.
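You can inspect both bookkeeping locations yourself. A sketch, assuming an activated environment (so $CONDA_PREFIX is set) with numpy installed only via pip:

```shell
# Conda's record of installed packages: one JSON file per package.
# A pip-installed numpy leaves no trace here:
ls "$CONDA_PREFIX/conda-meta" | grep -i numpy

# ...but it does live in site-packages, where pip put it:
ls "$CONDA_PREFIX/lib/python3.10/site-packages" | grep -i numpy
```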
Installing with Pip interoperability
Now we try it with the pip_interop_enabled turned on:
$ CONDA_PIP_INTEROP_ENABLED=1 conda install -n foo scipy
Collecting package metadata (current_repodata.json): done
Solving environment: done
## Package Plan ##
environment location: /Users/user/mambaforge/envs/pip_interop
added / updated specs:
- scipy
The following packages will be downloaded:
package | build
---------------------------|-----------------
cryptography-39.0.1 | py310hdd0c95c_0 1.1 MB
scipy-1.10.0 | py310h240c617_2 20.2 MB
------------------------------------------------------------
Total: 21.3 MB
The following NEW packages will be INSTALLED:
appdirs conda-forge/noarch::appdirs-1.4.4-pyh9f0ad1d_0
brotlipy conda-forge/osx-64::brotlipy-0.7.0-py310h90acd4f_1005
certifi conda-forge/noarch::certifi-2022.12.7-pyhd8ed1ab_0
cffi conda-forge/osx-64::cffi-1.15.1-py310ha78151a_3
charset-normalizer conda-forge/noarch::charset-normalizer-2.1.1-pyhd8ed1ab_0
cryptography conda-forge/osx-64::cryptography-39.0.1-py310hdd0c95c_0
idna conda-forge/noarch::idna-3.4-pyhd8ed1ab_0
libblas conda-forge/osx-64::libblas-3.9.0-16_osx64_openblas
libcblas conda-forge/osx-64::libcblas-3.9.0-16_osx64_openblas
libcxx conda-forge/osx-64::libcxx-14.0.6-hccf4f1f_0
libgfortran conda-forge/osx-64::libgfortran-5.0.0-11_3_0_h97931a8_27
libgfortran5 conda-forge/osx-64::libgfortran5-11.3.0-h082f757_27
liblapack conda-forge/osx-64::liblapack-3.9.0-16_osx64_openblas
libopenblas conda-forge/osx-64::libopenblas-0.3.21-openmp_h429af6e_3
llvm-openmp conda-forge/osx-64::llvm-openmp-15.0.7-h61d9ccf_0
packaging conda-forge/noarch::packaging-23.0-pyhd8ed1ab_0
pooch conda-forge/noarch::pooch-1.6.0-pyhd8ed1ab_0
pycparser conda-forge/noarch::pycparser-2.21-pyhd8ed1ab_0
pyopenssl conda-forge/noarch::pyopenssl-23.0.0-pyhd8ed1ab_0
pysocks conda-forge/noarch::pysocks-1.7.1-pyha2e5f31_6
python_abi conda-forge/osx-64::python_abi-3.10-3_cp310
requests conda-forge/noarch::requests-2.28.2-pyhd8ed1ab_0
scipy conda-forge/osx-64::scipy-1.10.0-py310h240c617_2
urllib3 conda-forge/noarch::urllib3-1.26.14-pyhd8ed1ab_0
Proceed ([y]/n)?
Note that numpy is no longer proposed for replacement, because the existing pip-installed version is now considered able to satisfy the dependency.
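A toy version of this "already satisfied?" check makes the difference concrete. Everything here (the directory probing, the helper name) is illustrative only, not Conda's real implementation:

```python
from pathlib import Path

def spec_satisfied(name, env_prefix, pip_interop=False):
    """Toy sketch of the solver's 'already installed?' check.
    With interop off, only conda-meta/ records count; with interop on,
    pip-installed distributions (*.dist-info dirs in site-packages)
    count as well."""
    prefix = Path(env_prefix)
    meta = prefix / "conda-meta"
    if meta.is_dir() and any(p.stem.rsplit("-", 2)[0] == name
                             for p in meta.glob("*.json")):
        return True
    if pip_interop:
        for site in prefix.glob("lib/python*/site-packages"):
            if any(d.name.split("-")[0] == name
                   for d in site.glob("*.dist-info")):
                return True
    return False
```

With interop off, a pip-installed numpy fails the check and gets scheduled for installation; with interop on, the same environment passes it.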
Why is this experimental?
There may be multiple reasons why this remains experimental after several years. One important reason is that Conda only tests its package builds against Conda builds of the dependencies. So, it cannot guarantee that the packages are functionally exchangeable.
Furthermore, Conda packages often bring in non-Python dependencies. Wheels, the PyPI approach to this, have been on the rise, but they aren't ubiquitous. There are still many "wrapper" packages out there where the PyPI version assumes some binary is on PATH, whereas installing the Conda package guarantees the binary is installed as well.
Another important issue is that the PyPI-Conda name mapping is not well-defined. That is, the name of a package on PyPI may not correspond to its Conda package name. This can lead directly to cryptic issues when the names diverge: Conda will not recognize that a pip-installed package satisfies a requirement when the names don't match. Hence, there is some unexpected heterogeneity in how the interoperability applies.
Example: torch vs pytorch
In the Python ecosystem, the torch module is provided by the PyPI package torch. However, that same package goes by pytorch on Conda channels.
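A handful of such divergences can be captured in a lookup table. The table below is illustrative only; no authoritative PyPI-to-Conda mapping exists, which is exactly the problem:

```python
# A few well-known PyPI -> Conda name divergences (illustrative; there
# is no official mapping to consult).
PYPI_TO_CONDA = {
    "torch": "pytorch",
    "tables": "pytables",
    "msgpack": "msgpack-python",
}

def conda_name(pypi_name):
    """Best-effort guess at the Conda name for a PyPI package."""
    return PYPI_TO_CONDA.get(pypi_name.lower(), pypi_name.lower())
```

Any package not in such a table is assumed to share its name across ecosystems, and that assumption is precisely what breaks for torch/pytorch below.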
Here's how this can lead to inconsistent behavior. Let's begin with torch installed from PyPI:
pip_interop.yaml
name: pip_interop
channels:
- conda-forge
dependencies:
- python=3.10
- pip
## PyPI packages
- pip:
- torch
Creating with:
conda env create -n pip_interop -f pip_interop.yaml
Now if we install torchvision from Conda, even with pip_interop_enabled on, we get:
$ CONDA_PIP_INTEROP_ENABLED=1 conda install -n pip_interop torchvision
Collecting package metadata (current_repodata.json): done
Solving environment: done
## Package Plan ##
environment location: /Users/user/mambaforge/envs/pip_interop
added / updated specs:
- torchvision
The following packages will be downloaded:
package | build
---------------------------|-----------------
cryptography-39.0.1 | py310hdd0c95c_0 1.1 MB
jpeg-9e | hb7f2c08_3 226 KB
libprotobuf-3.21.12 | hbc0c0cd_0 1.8 MB
mkl-2022.2.1 | h44ed08c_16952 113.1 MB
numpy-1.24.2 | py310h788a5b3_0 6.1 MB
pillow-9.4.0 | py310h306a057_1 44.1 MB
pytorch-1.13.1 |cpu_py310h2bbf33f_1 56.9 MB
sleef-3.5.1 | h6db0672_2 1.0 MB
torchvision-0.14.1 |cpu_py310hd5ee960_0 5.9 MB
------------------------------------------------------------
Total: 230.1 MB
The following NEW packages will be INSTALLED:
brotlipy conda-forge/osx-64::brotlipy-0.7.0-py310h90acd4f_1005
certifi conda-forge/noarch::certifi-2022.12.7-pyhd8ed1ab_0
cffi conda-forge/osx-64::cffi-1.15.1-py310ha78151a_3
charset-normalizer conda-forge/noarch::charset-normalizer-2.1.1-pyhd8ed1ab_0
cryptography conda-forge/osx-64::cryptography-39.0.1-py310hdd0c95c_0
freetype conda-forge/osx-64::freetype-2.12.1-h3f81eb7_1
idna conda-forge/noarch::idna-3.4-pyhd8ed1ab_0
jpeg conda-forge/osx-64::jpeg-9e-hb7f2c08_3
lcms2 conda-forge/osx-64::lcms2-2.14-h29502cd_1
lerc conda-forge/osx-64::lerc-4.0.0-hb486fe8_0
libblas conda-forge/osx-64::libblas-3.9.0-16_osx64_openblas
libcblas conda-forge/osx-64::libcblas-3.9.0-16_osx64_openblas
libcxx conda-forge/osx-64::libcxx-14.0.6-hccf4f1f_0
libdeflate conda-forge/osx-64::libdeflate-1.17-hac1461d_0
libgfortran conda-forge/osx-64::libgfortran-5.0.0-11_3_0_h97931a8_27
libgfortran5 conda-forge/osx-64::libgfortran5-11.3.0-h082f757_27
liblapack conda-forge/osx-64::liblapack-3.9.0-16_osx64_openblas
libopenblas conda-forge/osx-64::libopenblas-0.3.21-openmp_h429af6e_3
libpng conda-forge/osx-64::libpng-1.6.39-ha978bb4_0
libprotobuf conda-forge/osx-64::libprotobuf-3.21.12-hbc0c0cd_0
libtiff conda-forge/osx-64::libtiff-4.5.0-hee9004a_2
libwebp-base conda-forge/osx-64::libwebp-base-1.2.4-h775f41a_0
libxcb conda-forge/osx-64::libxcb-1.13-h0d85af4_1004
llvm-openmp conda-forge/osx-64::llvm-openmp-15.0.7-h61d9ccf_0
mkl conda-forge/osx-64::mkl-2022.2.1-h44ed08c_16952
numpy conda-forge/osx-64::numpy-1.24.2-py310h788a5b3_0
openjpeg conda-forge/osx-64::openjpeg-2.5.0-h13ac156_2
pillow conda-forge/osx-64::pillow-9.4.0-py310h306a057_1
pthread-stubs conda-forge/osx-64::pthread-stubs-0.4-hc929b4f_1001
pycparser conda-forge/noarch::pycparser-2.21-pyhd8ed1ab_0
pyopenssl conda-forge/noarch::pyopenssl-23.0.0-pyhd8ed1ab_0
pysocks conda-forge/noarch::pysocks-1.7.1-pyha2e5f31_6
python_abi conda-forge/osx-64::python_abi-3.10-3_cp310
pytorch conda-forge/osx-64::pytorch-1.13.1-cpu_py310h2bbf33f_1
requests conda-forge/noarch::requests-2.28.2-pyhd8ed1ab_0
sleef conda-forge/osx-64::sleef-3.5.1-h6db0672_2
tbb conda-forge/osx-64::tbb-2021.7.0-hb8565cd_1
torchvision conda-forge/osx-64::torchvision-0.14.1-cpu_py310hd5ee960_0
typing_extensions conda-forge/noarch::typing_extensions-4.4.0-pyha770c72_0
urllib3 conda-forge/noarch::urllib3-1.26.14-pyhd8ed1ab_0
xorg-libxau conda-forge/osx-64::xorg-libxau-1.0.9-h35c211d_0
xorg-libxdmcp conda-forge/osx-64::xorg-libxdmcp-1.1.3-h35c211d_0
zstd conda-forge/osx-64::zstd-1.5.2-hbc0c0cd_6
Proceed ([y]/n)?
That is, Conda still tries to install pytorch, which will clobber the existing torch package installed from PyPI. This can leave residual files from the clobbered version intermixed with the clobbering one.
Basically, this is undefined behavior, and Conda may not give you any warning about the potential problems.

how to install the arch package with anaconda?

I am trying to install the arch package https://pypi.org/project/arch/ using Anaconda.
The suggested install runs fine
(base) C:\Users\john>conda install arch-py -c conda-forge
Collecting package metadata (current_repodata.json): done
Solving environment: done
## Package Plan ##
environment location: C:\Users\john\anaconda3
added / updated specs:
- arch-py
The following NEW packages will be INSTALLED:
arch-py conda-forge/win-64::arch-py-4.18-py38h294d835_0
cython conda-forge/win-64::cython-0.29.22-py38h885f38d_0
icc_rt pkgs/main/win-64::icc_rt-2019.0.0-h0cc432a_1
patsy conda-forge/noarch::patsy-0.5.1-py_0
property-cached conda-forge/noarch::property-cached-1.6.4-py_0
scipy pkgs/main/win-64::scipy-1.6.1-py38h14eb087_0
statsmodels conda-forge/win-64::statsmodels-0.12.2-py38h347fdf6_0
The following packages will be UPDATED:
certifi pkgs/main::certifi-2020.12.5-py38haa9~ --> conda-forge::certifi-2020.12.5-py38haa244fe_1
The following packages will be SUPERSEDED by a higher-priority channel:
ca-certificates pkgs/main::ca-certificates-2021.1.19-~ --> conda-forge::ca-certificates-2020.12.5-h5b45459_0
conda pkgs/main::conda-4.9.2-py38haa95532_0 --> conda-forge::conda-4.9.2-py38haa244fe_0
openssl pkgs/main::openssl-1.1.1j-h2bbff1b_0 --> conda-forge::openssl-1.1.1j-h8ffe710_0
Proceed ([y]/n)? y
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
(base) C:\Users\john>spyder
Unfortunately, I cannot import the package correctly when I start Spyder.
from arch import arch_model
ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject
What should I do?
Thanks!
The package requires a recent version of numpy. I tried to remove numpy and reinstall version 1.20.0 (the version needed) without success; Anaconda would stick with 1.19.
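The underlying conflict here is plain version ordering: arch's compiled extension needed NumPy >= 1.20 while 1.19 stayed installed. A stdlib-only sketch of that comparison, for simple X.Y.Z version strings (real version ordering has more rules than this):

```python
def meets_minimum(installed, required):
    """Compare dotted version strings numerically, so that e.g.
    1.19 < 1.20 even though '1.19' > '1.20' as plain strings.
    A simplification: only handles purely numeric X.Y.Z versions."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(required)
```

The "numpy.ndarray size changed" ValueError is the binary symptom of exactly this failed check at import time.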
Ultimately, I did what I should have done a long time ago. Download miniconda (not anaconda) and install only the packages I need. That way, no annoying conflicts when updating packages with conda!

Why does conda want to update unrelated packages when I want to remove just one?

Windows 10
conda 4.9.2 (via miniconda)
I installed a single package that did not require any other dependencies to be installed anew or upgraded. Once I realised that I had installed an unsuitable version of the package, I went to remove it, and this is the screen I was presented with:
(pydata) PS C:\Users\Navneeth> conda remove xlrd
Collecting package metadata (repodata.json): done
Solving environment: done
Warning: 2 possible package resolutions (only showing differing packages):
- defaults/win-64::libtiff-4.1.0-h56a325e_1, defaults/win-64::zstd-1.4.9-h19a0ad4_0
- defaults/win-64::libtiff-4.2.0-hd0e1b90_0, defaults/win-64::zstd-1.4.5-h04227a9_0
## Package Plan ##
environment location: C:\Users\Navneeth\Miniconda3\envs\pydata
removed specs:
- xlrd
The following packages will be downloaded:
package | build
---------------------------|-----------------
decorator-5.0.3 | pyhd3eb1b0_0 12 KB
importlib-metadata-3.7.3 | py38haa95532_1 31 KB
importlib_metadata-3.7.3 | hd3eb1b0_1 11 KB
ipython-7.22.0 | py38hd4e2768_0 998 KB
jupyter_client-6.1.12 | pyhd3eb1b0_0 88 KB
libtiff-4.1.0 | h56a325e_1 739 KB
nbformat-5.1.3 | pyhd3eb1b0_0 44 KB
notebook-6.3.0 | py38haa95532_0 4.4 MB
pandoc-2.12 | haa95532_0 13.2 MB
parso-0.8.2 | pyhd3eb1b0_0 69 KB
pillow-8.2.0 | py38h4fa10fc_0 671 KB
prometheus_client-0.10.0 | pyhd3eb1b0_0 46 KB
prompt-toolkit-3.0.17 | pyh06a4308_0 256 KB
terminado-0.9.4 | py38haa95532_0 26 KB
zipp-3.4.1 | pyhd3eb1b0_0 15 KB
zstd-1.4.9 | h19a0ad4_0 478 KB
------------------------------------------------------------
Total: 21.0 MB
The following packages will be REMOVED:
xlrd-2.0.1-pyhd3eb1b0_0
The following packages will be UPDATED:
decorator 4.4.2-pyhd3eb1b0_0 --> 5.0.3-pyhd3eb1b0_0
importlib-metadata pkgs/main/noarch::importlib-metadata-~ --> pkgs/main/win-64::importlib-metadata-3.7.3-py38haa95532_1
importlib_metadata 2.0.0-1 --> 3.7.3-hd3eb1b0_1
ipython 7.21.0-py38hd4e2768_0 --> 7.22.0-py38hd4e2768_0
jupyter_client 6.1.7-py_0 --> 6.1.12-pyhd3eb1b0_0
nbformat 5.1.2-pyhd3eb1b0_1 --> 5.1.3-pyhd3eb1b0_0
notebook 6.2.0-py38haa95532_0 --> 6.3.0-py38haa95532_0
pandoc 2.11-h9490d1a_0 --> 2.12-haa95532_0
parso 0.8.1-pyhd3eb1b0_0 --> 0.8.2-pyhd3eb1b0_0
pillow 8.1.2-py38h4fa10fc_0 --> 8.2.0-py38h4fa10fc_0
prometheus_client 0.9.0-pyhd3eb1b0_0 --> 0.10.0-pyhd3eb1b0_0
prompt-toolkit 3.0.8-py_0 --> 3.0.17-pyh06a4308_0
sqlite 3.33.0-h2a8f88b_0 --> 3.35.3-h2bbff1b_0
terminado 0.9.2-py38haa95532_0 --> 0.9.4-py38haa95532_0
zipp 3.4.0-pyhd3eb1b0_0 --> 3.4.1-pyhd3eb1b0_0
zstd 1.4.5-h04227a9_0 --> 1.4.9-h19a0ad4_0
The following packages will be DOWNGRADED:
libtiff 4.2.0-he0120a3_0 --> 4.1.0-h56a325e_1
Proceed ([y]/n)?
Why does conda want to update or downgrade all these other packages when nothing of the sort happened when I installed xlrd? Is there a way that I can safely remove just xlrd? (I hear using --force is risky.)
Asymmetry
Conda re-solves when removing. When installing, Conda first attempts a frozen solve, which amounts to keeping all installed packages fixed and searching only for a version of the requested package(s) that is compatible with them. In this specific case, xlrd (v2.1.0) is a noarch package with only a python>=3.6 constraint, so it installs in this frozen-solve pass.
The constraint xlrd will also be added to the explicit specifications.[1]
When removing, Conda first removes the constraint and then re-solves the environment against the new set of explicit specifications. It is in this solve that Conda identifies that newer versions of packages are available and proposes updating them.
So, the asymmetry is that the frozen solve explicitly avoids checking for any new packages, whereas the removal triggers exactly such a check. There is currently no way to avoid this without bypassing dependency checking.
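The asymmetry can be illustrated with a toy solver. This is purely illustrative, not Conda's real algorithm; the versions mirror the pip 20.3.4 to 21.1.1 jump seen in the addendum below:

```python
# Toy model of install-vs-remove asymmetry.  repo maps each package to
# its available versions, oldest first.

def solve(specs, repo, pinned=None):
    """Pick the newest available version of each spec, except for
    packages that are pinned (the frozen-solve case)."""
    pinned = pinned or {}
    return {name: pinned.get(name, repo[name][-1]) for name in specs}

repo = {"python": ["3.8.0"],
        "pip": ["20.3.4", "21.1.1"],
        "xlrd": ["2.0.1"]}
env = {"python": "3.8.0", "pip": "20.3.4"}

# install: frozen solve -- everything already installed stays pinned
after_install = solve({"python", "pip", "xlrd"}, repo, pinned=env)

# remove: the spec is dropped and the environment is re-solved from
# scratch, which is when newer builds get pulled in as a side effect
after_remove = solve({"python", "pip"}, repo)
```

In the frozen pass pip stays at 20.3.4; in the full re-solve it jumps to 21.1.1 even though nothing about pip was requested.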
Mamba
Actually, mamba, a compiled (fast!) drop-in replacement for conda, will remove only the specified package if nothing else depends on it. That is its default behavior in my testing.
Addendum: Still Some Unexplained Behavior
I replicated your experience by first creating an environment with two specs:
name: foo
channels:
- conda-forge
dependencies:
- python=3.8.0
- pip=20
To simulate this being an old environment, I went into envs/foo/conda-meta/history and changed[2] the line
# update specs: ['pip=20', 'python=3.8.0']
to
# update specs: ['python=3.8']
Subsequently running conda install xlrd does as expected. Then conda remove xlrd gives a somewhat odd result:
## Package Plan ##
environment location: /opt/conda/envs/foo
removed specs:
- xlrd
The following packages will be downloaded:
package | build
---------------------------|-----------------
pip-21.1.1 | pyhd8ed1ab_0 1.1 MB conda-forge
------------------------------------------------------------
Total: 1.1 MB
The following packages will be REMOVED:
xlrd-2.0.1-pyhd8ed1ab_3
The following packages will be UPDATED:
pip 20.3.4-pyhd8ed1ab_0 --> 21.1.1-pyhd8ed1ab_0
Proceed ([y]/n)?
This effectively replicates the OP's result; the additional oddity here is that the python package is not proposed for an update, even though I had intentionally loosened its constraint from 3.8.0 to 3.8. It appears that only packages absent from the explicit specifications are subject to updating during package removal.
[1] The explicit specifications are the internally maintained records that Conda keeps of every constraint a user has explicitly specified. One can view the current explicit specifications of an environment with conda env export --from-history. The raw internal records can be found at yourenv/conda-meta/history.
[2] Not a recommended practice!

InvalidArchiveError when installing openssl-1.1.1g through Anaconda 4.7.12 on Windows 10

I am trying to install requests module, and openssl keep causing the InvalidArchiveError.
I am using conda 4.7.12, python 3.8.2, on Windows 10 operating system. I had the same issue when installing other packages requiring openssl-1.1.1g. I have followed the advice from the error message to delete and re-download 'openssl-1.1.1g-he774522_0.tar.bz2', but I keep getting the same error.
Is this version of openssl broken or something? Any help will be greatly appreciated.
lykim@Louis MINGW64 ~/Desktop/master/Learning
$ conda install -c anaconda requests
Collecting package metadata (current_repodata.json): ...working... done
Solving environment: ...working... done
==> WARNING: A newer version of conda exists. <==
current version: 4.7.12
latest version: 4.8.3
Please update conda by running
$ conda update -n base conda
## Package Plan ##
environment location: C:\ProgramData\Anaconda3\envs\pytorch
added / updated specs:
- requests
The following packages will be downloaded:
package | build
---------------------------|-----------------
openssl-1.1.1g | he774522_0 5.8 MB anaconda
------------------------------------------------------------
Total: 5.8 MB
The following NEW packages will be INSTALLED:
brotlipy conda-forge/win-64::brotlipy-0.7.0-py38h1e8a9f7_1000
cffi anaconda/win-64::cffi-1.14.0-py38h7a1dbc1_0
chardet anaconda/win-64::chardet-3.0.4-py38_1003
cryptography anaconda/win-64::cryptography-2.9.2-py38h7a1dbc1_0
idna anaconda/noarch::idna-2.9-py_1
pycparser anaconda/noarch::pycparser-2.20-py_0
pyopenssl anaconda/win-64::pyopenssl-19.1.0-py38_0
pysocks anaconda/win-64::pysocks-1.7.1-py38_0
requests anaconda/win-64::requests-2.23.0-py38_0
urllib3 conda-forge/noarch::urllib3-1.25.9-py_0
win_inet_pton anaconda/win-64::win_inet_pton-1.1.0-py38_0
The following packages will be SUPERSEDED by a higher-priority channel:
certifi conda-forge::certifi-2020.4.5.1-py38h~ --> anaconda::certifi-2020.4.5.1-py38_0
openssl conda-forge --> anaconda
Proceed ([y]/n)? y
Downloading and Extracting Packages
openssl-1.1.1g | 5.8 MB | ########## | 100%
InvalidArchiveError('Error with archive C:\\ProgramData\\Anaconda3\\pkgs\\openssl-1.1.1g-he774522_0.tar.bz2. You probably need to delete and re-download or re-create this file. Message from libarchive was:\n\nCould not unlink')
(pytorch)
I encountered the same problem a couple of times. This time it was solved simply by shutting down Jupyter Notebook, which was running and using openssl; so if your Python is running somewhere, try closing it down. (In addition, I had already removed openssl-1.1.1g-he774522_0.tar.bz2 and a number of openssl-1.1.1g-he774522_0 folders containing those tarballs, so that might be necessary as well.)
You may try to go to your packages directory
C:\ProgramData\Anaconda3\pkgs\
Then delete the openssl-1.1.1g-he774522_0.tar.bz2 file, install libarchive, and reinstall your package.
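If you would rather find those leftovers programmatically than hunt for them by hand, a small sketch (the cache location and file names come from this thread; adjust the path for your install):

```python
from pathlib import Path

def stale_openssl_artifacts(pkgs_dir):
    """List leftover openssl-1.1.1g tarballs and extraction directories
    in a conda package cache -- the things the answers here delete by
    hand, e.g. under C:\\ProgramData\\Anaconda3\\pkgs."""
    return sorted(p.name for p in Path(pkgs_dir).glob("openssl-1.1.1g-*"))
```

Anything this returns is a candidate for deletion before retrying the install (close running Python processes first, per the answers above).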
Another Solution
You can simply install the package into your Anaconda environment using pip instead of conda. Either way, I recommend updating your conda as well.
I encountered the same error. I think it was caused by an earlier failed installation that had left an incomplete openssl-1.1.1g-he774522_0 directory in users\username\Anaconda3\pkgs. Just delete it and the installation completes fine.
I had the same issue. There were some "openssl*" folders. I deleted them, and the "openssl*" .bz2 files as well, and tried to install again. Now it works perfectly.
Sometimes the folders can't be deleted because they're in use by other programs. If that happens, go to Task Manager and close python.exe.
I was able to fix my error by elevating my command window. The delete method wasn't working.

Managing packages: PyCharm vs conda vs pip

I'm new to Python and recently installed PyCharm 2016.3 on Windows 10. I'm also using Anaconda 3.
I don't know much about package management and would like to understand it better. Normally I just use conda update --all but I noticed (by checking the package list of my local PyCharm Interpreter) that this doesn't upgrade all packages to the latest version.
One such package is Pillow of which there's a version 4.0.0 but conda (4.3.11) won't update it past 3.4.2. I tried conda install pillow: 4.0.0 and got:
UnsatisfiableError: The following specifications were found to be in conflict:
- pillow 4.0.0*
- python 3.5*
- spyder-app
Use "conda info <package>" to see the dependencies for each package.
Later I found out that Pillow is also available on conda-forge so I tried conda install -c conda-forge pillow=4.0.0 and got:
The following NEW packages will be INSTALLED:
libiconv: 1.14-vc14_4 conda-forge [vc14]
libxml2: 2.9.3-vc14_9 conda-forge [vc14]
olefile: 0.44-py35_0 conda-forge
vc: 14-0 conda-forge
The following packages will be UPDATED:
freetype: 2.5.5-vc14_2 [vc14] --> 2.7-vc14_0 conda-forge [vc14]
jpeg: 8d-vc14_2 [vc14] --> 9b-vc14_0 conda-forge [vc14]
libtiff: 4.0.6-vc14_2 [vc14] --> 4.0.6-vc14_7 conda-forge [vc14]
pillow: 3.4.2-py35_0 --> 4.0.0-py35_2 conda-forge
The following packages will be SUPERCEDED by a higher-priority channel:
conda: 4.3.11-py35_0 --> 4.2.13-py35_0 conda-forge
conda-env: 2.6.0-0 --> 2.6.0-0 conda-forge
qt: 4.8.7-vc14_9 [vc14] --> 4.8.7-vc14_6 conda-forge [vc14]
I decided not to proceed and instead tried pip install pillow. Since this command doesn't ask for confirmation the package was simply installed. Now when I type conda list I get:
Pillow 4.0.0 <pip>
pillow 3.4.2 py35_0
The package list of the PyCharm Interpreter now shows Pillow as being version 4.0.0 but conda update pillow still returns:
# All requested packages already installed.
pillow 3.4.2 py35_0
My questions are:
1) What should I rely on to keep all my packages up to date, without compatibility issues?
2) Why did conda install pillow: 4.0.0 return an error but conda install -c conda-forge pillow=4.0.0 didn't?
3) What do the * next to pillow 4.0.0 and python 3.5 in the list of dependencies mean?
4) Since now I have both Pillow 3.4.2 (in /anaconda3/pkgs) and Pillow 4.0.0 (in /anaconda3/lib/site-packages) which one would be used if I imported Pillow?
5) Does the superseding conda: 4.3.11-py35_0 --> 4.2.13-py35_0 conda-forge mean conda is getting downgraded?
6) What is the difference between the tags pip, py35_0, py35_4, np111py35_2, etc?
7) PyCharm tells me there's a version 2.9.5 of package Jinja2 but both normal conda and conda-forge only find 2.9.4. From which channel is PyCharm getting this information?
Ok, I can't answer all of your questions, but here goes:
1) Conda takes the "pain up front" approach to dependency/conflict resolution: you have to get all of your packages to play nicely together across the repos/channels you have available just to build a package or keep them in an environment together. You can try running it with --force or --no-deps to push a package in, but that can cause issues for you down the line (I don't know whether that even works with later versions of conda; it changes a lot). For simply keeping packages up to date, and up to latest, I would just use pip. It's come a long way in the last few years (https://glyph.twistedmatrix.com/2016/08/python-packaging.html).
2) I'm not completely sure; I believe it has something to do with providing an explicit non-URL channel for conda to look at. Typically you pass it the URL of the conda-forge repo (I think; again, we don't use conda-forge internally).
3) The * means you are ignoring the patch/build 4.0.0 == Major.Minor.Build. Likewise, 3.5* == any version of 3.5
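That trailing-* semantics can be sketched as a prefix match on the version string (simplified; conda's real version matching is considerably richer than this):

```python
def matches(version, spec):
    """Toy version-spec matcher: a spec ending in '*' is a prefix match
    on the version string, otherwise an exact match.  Illustrative
    only; conda's real MatchSpec handles many more forms."""
    if spec.endswith("*"):
        return version.startswith(spec[:-1])
    return version == spec
```

So python 3.5* accepts any 3.5.x build, while pillow 4.0.0* pins down to that exact major.minor.patch.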
4) I would import Pillow in a terminal and then print out the module to see where it's getting pulled from; why guess?
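A non-invasive way to run that check, without even importing the package, is importlib.util.find_spec from the standard library:

```python
import importlib.util

def which_module(name):
    """Report the file a module would be imported from, without
    actually importing it (None if it can't be found)."""
    spec = importlib.util.find_spec(name)
    return getattr(spec, "origin", None)
```

For the Pillow case, which_module("PIL") would show whether the site-packages copy (pip's 4.0.0) or the conda one wins at import time.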
5) pass (although I think so)
6)
pip : means you installed that package via pip. It will not be picked up if you do conda list --explicit
py35_0 : has a requirement / only available to envs / packages that use python 3.5
py35_4 : not sure (always forget that one)
np111py35_2 : requires python3.5 and also numpy 1.11 (I think *)
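These tags follow a loose convention that can be decoded heuristically. The parser below handles only the np/py/build-number pattern and is a sketch of that convention, not a formal spec:

```python
import re

def decode_build_string(build):
    """Heuristic decoding of conda build-string tags like py35_0 or
    np111py35_2: optional numpy pin, optional python pin, then the
    build number after the underscore.  Returns None for hash-style
    build strings it doesn't recognize."""
    m = re.match(r"^(?:np(\d)(\d+))?(?:py(\d)(\d))?_(\d+)$", build)
    if not m:
        return None
    np_major, np_minor, py_major, py_minor, number = m.groups()
    info = {"build_number": int(number)}
    if py_major:
        info["python"] = f"{py_major}.{py_minor}"
    if np_major:
        info["numpy"] = f"{np_major}.{np_minor}"
    return info
```

So py35_4 is simply the fifth build (build number 4) of a package pinned to Python 3.5, which answers the "not sure" above.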
7) I tend to steer clear of PyCharm, but I believe you can inspect the Python interpreter PyCharm is pointing at to see which environment it's in. From the root environment, you can run conda info to get a list of all the channels you are pointing to.
Note: if you are going to use conda, you may just want to add conda-forge to your channels list instead of passing the -c (but seeing how the other channels are organized should help you see how you should pass the -c flag)
