conda prevent python upgrade on windows

EDIT
The official Python 2.7.11 installer from python.org has a major bug (Py_Initialize from embedded CPython causes a hard crash), which can be avoided by following the instructions in this bug report:
https://bugs.python.org/issue25824
ORIGINAL
Is it possible to prevent conda from updating python on Windows when installing new packages? The problem is that I need to use the official CPython installation.
e.g.
conda install numba
Fetching package metadata: ....
Solving package specifications: .........
Package plan for installation in environment C:\Python\Python27_32b:
The following packages will be downloaded:
package | build
---------------------------|-----------------
python-2.7.11 | 4 22.3 MB
enum34-1.1.2 | py27_0 54 KB
funcsigs-0.4 | py27_0 19 KB
numpy-1.10.4 | py27_0 2.8 MB
llvmlite-0.10.0 | py27_0 4.6 MB
numexpr-2.5.1 | np110py27_0 141 KB
scipy-0.17.0 | np110py27_0 10.8 MB
numba-0.25.0 | np110py27_0 1.6 MB
scikit-learn-0.17.1 | np110py27_0 3.3 MB
------------------------------------------------------------
Total: 45.5 MB
The following NEW packages will be INSTALLED:
enum34: 1.1.2-py27_0
funcsigs: 0.4-py27_0
llvmlite: 0.10.0-py27_0
numba: 0.25.0-np110py27_0
pip: 8.1.1-py27_1
python: 2.7.11-4
The following packages will be UPDATED:
numexpr: 2.5.1-np111py27_0 --> 2.5.1-np110py27_0
scikit-learn: 0.17.1-np111py27_0 --> 0.17.1-np110py27_0
scipy: 0.17.0-np111py27_0 --> 0.17.0-np110py27_0
The following packages will be DOWNGRADED:
numpy: 1.11.0-py27_0 --> 1.10.4-py27_0
Proceed ([y]/n)?

Related

conda conflicts do not make sense when installing graphviz (UnsatisfiableError) [duplicate]

I am attempting to create a conda environment with 3 packages and a specific python version and get the following output:
$ conda create -n testing_junk -y instrain awscli samtools python=3.8
Collecting package metadata (current_repodata.json): done
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: |
Found conflicts! Looking for incompatible packages.
This can take several minutes. Press CTRL-C to abort.
failed \
UnsatisfiableError: The following specifications were found to be incompatible with each other:
Output in format: Requested package -> Available versions
Package ncurses conflicts for:
python=3.8 -> ncurses[version='>=6.1,<6.2.0a0|>=6.2,<7.0a0|>=6.1,<7.0a0']
awscli -> python[version='>=3.8,<3.9.0a0'] -> ncurses[version='5.9.*|5.9|>=6.1,<6.2.0a0|>=6.2,<7.0a0|>=6.1,<7.0a0|>=6.0,<7.0a0|6.0.*']
instrain -> python[version='>=3.4'] -> ncurses[version='5.9.*|5.9|>=6.1,<6.2.0a0|>=6.2,<7.0a0|>=6.1,<7.0a0|>=6.0,<7.0a0|6.0.*']
python=3.8 -> readline[version='>=7.0,<8.0a0'] -> ncurses[version='5.9.*|>=6.0,<7.0a0|6.0.*']
samtools -> ncurses[version='5.9|5.9.*|>=5.9,<5.10.0a0|>=6.1,<6.2.0a0']
Package python conflicts for:
awscli -> python[version='2.7.*|3.5.*|3.6.*|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0|3.4.*']
python=3.8
instrain -> biopython -> python[version='2.7.*|3.5.*|3.6.*|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.8,<3.9.0a0|>=3.7,<3.8.0a0|>=3.5,<3.6.0a0|3.4.*|>3|>=3.5|<3.0.0|>=3.6']
instrain -> python[version='>=3.4']
awscli -> python_abi=3.8[build=*_cp38] -> python[version='3.7.*|3.8.*']
Package ca-certificates conflicts for:
samtools -> openssl[version='>=1.1.1a,<1.1.2a'] -> ca-certificates
python=3.8 -> openssl[version='>=1.1.1g,<1.1.2a'] -> ca-certificates
awscli -> python[version='>=2.7,<2.8.0a0'] -> ca-certificates
Package setuptools conflicts for:
python=3.8 -> pip -> setuptools
instrain -> matplotlib-base -> setuptools[version='>=40.0']
Package libgcc-ng conflicts for:
samtools -> ncurses[version='>=6.1,<6.2.0a0'] -> libgcc-ng[version='>=7.2.0']
samtools -> libgcc-ng[version='>=4.9|>=7.3.0']
Package pypy3.6 conflicts for:
instrain -> numpy -> pypy3.6[version='7.3.0.*|7.3.1.*|>=7.3.1']
awscli -> python[version='>=3.6,<3.7.0a0'] -> pypy3.6[version='7.3.*|7.3.0.*|7.3.1.*']
Package bzip2 conflicts for:
samtools -> bzip2[version='1.0.*|>=1.0.6,<2.0a0|>=1.0.8,<2.0a0']
instrain -> pysam -> bzip2[version='>=1.0.6,<2.0a0|>=1.0.8,<2.0a0']
awscli -> python[version='>=3.7,<3.8.0a0'] -> bzip2[version='>=1.0.6,<2.0a0|>=1.0.8,<2.0a0']
Package zlib conflicts for:
samtools -> zlib[version='1.2.11.*|>=1.2.11,<1.3.0a0|1.2.8.*|1.2.8']
samtools -> curl[version='>=7.59.0,<8.0a0'] -> zlib[version='1.2.*|1.2.11']
Package samtools conflicts for:
samtools
instrain -> pysam -> samtools[version='1.3|1.3.1.*|1.3.1|1.5.*|1.6.*|1.7|1.7.*|1.9.*|>=1.4.1|>=1.4.1,<1.5|>=1.4,<1.5|>=1.3,<1.4|>=1.3']
Package openssl conflicts for:
samtools -> curl[version='>=7.59.0,<8.0a0'] -> openssl[version='1.0.*|>=1.0.2o,<1.0.3a|>=1.0.2m,<1.0.3a']
samtools -> openssl[version='>=1.0.2p,<1.0.3a|>=1.0.2r,<1.0.3a|>=1.1.1a,<1.1.2a']
Package _libgcc_mutex conflicts for:
samtools -> libgcc-ng[version='>=7.3.0'] -> _libgcc_mutex[version='*|0.1',build='main|conda_forge']
python=3.8 -> libgcc-ng[version='>=7.5.0'] -> _libgcc_mutex[version='*|0.1',build='main|conda_forge']
The following specifications were found to be incompatible with your CUDA driver:
- feature:/linux-64::__cuda==10.2=0
- feature:|@/linux-64::__cuda==10.2=0
Your installed CUDA driver is: 10.2
I understand that there is something about the packages that conflict with each other, but I'm unable to interpret this output to understand what the problem is.
For example, in looking at the first block of conflicts (related to ncurses), shouldn't version 6.1 satisfy all requirements listed?
Additionally, for the block about package setuptools, I don't see any problem at all?
Any insight into how to interpret these conflicts so that I can attempt to address them would be much appreciated.
Some Practical Advice
@Quantum7's answer gives a fine literal interpretation of Conda's conflict reporting. However, I wanted to offer a more practical take, which is that this "feature" from Conda is too non-specific to be useful in most non-trivial environments. And sometimes it won't even include the underlying conflict. Don't waste your time with it!
Conda's Conflict Reporting is Often Not Helpful
On the face of it, Conda attempts to report all possible sources of conflict. That is, all sets of paths in the dependency graph that begin from the explicit specifications and end in the same package. This amounts to most of what is reported being innocuous and frankly distracting. For example, the zlib "conflicts":
Package zlib conflicts for:
samtools -> zlib[version='1.2.11.*|>=1.2.11,<1.3.0a0|1.2.8.*|1.2.8']
samtools -> curl[version='>=7.59.0,<8.0a0'] -> zlib[version='1.2.*|1.2.11']
Since samtools depends on zlib both directly and indirectly (mediated through curl), this comes up as two alternate paths that lead to constraints. The catch is that the intersection of the final constraints is not empty, so there is nothing actually incompatible here.
Furthermore, there are cases where none of what is reported is in conflict (e.g., this question or this one), which means parsing through the output could be a complete waste of time.
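For instance, a quick hedged check using conda's own spec parser (conda.models.version; run this with a python in which conda is importable, such as the base env) shows that a single version, 1.2.11, satisfies both zlib constraint strings above, i.e. there is no real conflict:
python -c "
from conda.models import version as cv
print(cv.VersionSpec('1.2.11.*|>=1.2.11,<1.3.0a0|1.2.8.*|1.2.8').match('1.2.11'))  # samtools -> zlib: True
print(cv.VersionSpec('1.2.*|1.2.11').match('1.2.11'))                              # samtools -> curl -> zlib: True
"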
Try Mamba
Instead, if one is actually concerned with resolving conflicts, I find Mamba to be more effective to work with, both in speed and precision.
# install mamba
conda install -n base conda-forge::mamba
# use 'mamba' just like 'conda'
mamba create -n foo instrain awscli samtools python=3.8
Unfortunately, this example simply works now. However, there are other questions where Conda and Mamba unsatisfiability reporting is compared, e.g., this question.
The package version specification is described here, but the important parts are:
, represents AND and has the highest precedence
| represents OR and has lower precedence
Alpha versions (e.g. '6.2.0a0') are used in upper bounds since they are the first possible release of that version
As an example, consider the first line of the ncurses group:
python=3.8 -> ncurses[version='>=6.1,<6.2.0a0|>=6.2,<7.0a0|>=6.1,<7.0a0']
This indicates that your requested python=3.8 depends on ncurses with the following versions:
>=6.1,<6.2.0a0 (any 6.1.x version)
OR >=6.2,<7.0a0 (at least 6.2 but below 7.0)
OR >=6.1,<7.0a0 (redundantly, at least 6.1 but below 7.0)
The lists are difficult to read since they include many unnecessary constraints. However, I also didn't see any real conflicts with your versions. Not trusting my own ability to check package specifications, I found that the conda package itself can do it directly:
>>> from conda.models import version as cv
>>> cv.VersionSpec(">=6.1,<6.2.0a0|>=6.2,<7.0a0|>=6.1,<7.0a0").match("6.1")
True
Running this on all the conflicts you reported, I was able to find versions that satisfied all the stated version requirements: https://gist.github.com/sbliven/aab43e1f0bce1f4ac63aaaaa718df0b3
The only part I can't test is the cuda part, but it does look like you have a graphics card that supports CUDA 10.2.
When I started this answer I was preparing an explanation about how SAT solvers like the one used in conda add constraints iteratively and how that could lead to seemingly valid constraints being output as conflicts. However, there should always be some conflict, so I think your issue must lie elsewhere.
Since samtools seems to have been removed from conda-forge I'm unable to reproduce the example myself, so I'm left confused about the exact error you're seeing. Hopefully understanding the version strings helps in the future.
Edit: Of course, samtools is from bioconda not conda-forge! The following command worked for me:
conda create -n testing_junk -c bioconda -y instrain awscli samtools python=3.8
It resolved to these package versions (maybe something was fixed since your question was posted):
The following packages will be downloaded:
package | build
---------------------------|-----------------
asteval-0.9.16 | pyh5ca1d4c_0 18 KB conda-forge
awscli-1.18.221 | py38h50d1736_0 1.8 MB conda-forge
biopython-1.74 | py38h0b31af3_0 2.5 MB conda-forge
blas-2.14 | openblas 10 KB conda-forge
boost-1.70.0 | py38hbf1eeb5_1 347 KB conda-forge
boost-cpp-1.70.0 | hef959ae_3 18.9 MB conda-forge
botocore-1.19.61 | pyhd8ed1ab_0 4.5 MB conda-forge
brotlipy-0.7.0 |py38h5406a74_1001 357 KB conda-forge
c-ares-1.11.0 | 0 73 KB bioconda
capnproto-0.6.1 | h0ceac7d_2 2.4 MB conda-forge
cffi-1.14.4 | py38h979bc6b_1 219 KB conda-forge
colorama-0.4.3 | py_0 17 KB conda-forge
cryptography-3.3.1 | py38h6b4ec92_1 614 KB conda-forge
docutils-0.15.2 | py38h50d1736_1 739 KB conda-forge
drep-3.0.0 | py_2 59 KB bioconda
fastani-1.32 | he69ab0f_0 151 KB bioconda
future-0.18.2 | py38h50d1736_3 715 KB conda-forge
hdf5-1.10.6 |nompi_h34ad4e8_1111 3.0 MB conda-forge
htslib-1.11 | h422799e_1 1.5 MB bioconda
idna-3.1 | pyhd3deb0d_0 52 KB conda-forge
instrain-1.4.0 | py_0 380 KB bioconda
jmespath-0.10.0 | pyh9f0ad1d_0 21 KB conda-forge
joblib-1.0.0 | pyhd8ed1ab_0 206 KB conda-forge
kiwisolver-1.3.1 | py38hd9c93a9_1 57 KB conda-forge
libblas-3.8.0 | 14_openblas 10 KB conda-forge
libcblas-3.8.0 | 14_openblas 10 KB conda-forge
libdeflate-1.6 | h0b31af3_0 61 KB conda-forge
liblapack-3.8.0 | 14_openblas 10 KB conda-forge
liblapacke-3.8.0 | 14_openblas 10 KB conda-forge
libnghttp2-1.41.0 | h8a08a2b_1 736 KB conda-forge
libopenblas-0.3.7 | h3d69b6c_4 8.2 MB conda-forge
llvm-openmp-8.0.1 | h770b8ee_0 253 KB conda-forge
llvmlite-0.34.0 | py38h3707e27_2 247 KB conda-forge
lmfit-1.0.1 | py_1 69 KB conda-forge
mash-2.2.2 | h194473e_2 449 KB bioconda
matplotlib-base-3.3.4 | py38hb24f337_0 6.8 MB conda-forge
mummer4-4.0.0rc1 | pl526h4a8c4bd_0 699 KB bioconda
numba-0.51.2 | py38h6be0db6_0 3.5 MB conda-forge
openmp-8.0.1 | 0 8 KB conda-forge
pandas-1.2.1 | py38he9f00de_0 10.6 MB conda-forge
pillow-8.1.0 | py38hc1d52f7_1 646 KB conda-forge
pluggy-0.13.1 | py38h50d1736_4 29 KB conda-forge
prodigal-2.6.3 | h01d97ff_2 397 KB bioconda
psutil-5.8.0 | py38h5406a74_1 350 KB conda-forge
pyasn1-0.4.8 | py_0 53 KB conda-forge
pysam-0.16.0.1 | py38hb3e8b06_1 2.1 MB bioconda
pysocks-1.7.1 | py38h50d1736_3 27 KB conda-forge
pytest-6.2.2 | py38h50d1736_0 432 KB conda-forge
pyyaml-5.3.1 | py38h5406a74_2 173 KB conda-forge
rsa-4.4.1 | pyh9f0ad1d_0 27 KB conda-forge
s3transfer-0.3.4 | pyhd8ed1ab_0 51 KB conda-forge
samtools-1.11 | h725deca_0 381 KB bioconda
scikit-learn-0.22.1 | py38hebd9d1a_0 4.7 MB
scipy-1.5.3 | py38h352ea5d_0 19.1 MB conda-forge
seaborn-0.11.1 | hd8ed1ab_1 4 KB conda-forge
seaborn-base-0.11.1 | pyhd8ed1ab_1 217 KB conda-forge
statsmodels-0.12.1 | py38hc7193ba_2 10.5 MB conda-forge
tornado-6.1 | py38h5406a74_1 643 KB conda-forge
uncertainties-3.1.5 | pyhd8ed1ab_0 75 KB conda-forge
------------------------------------------------------------
Total: 110.1 MB
Sometimes packages have conflicts because they are downloaded from different channels. Try this in a terminal:
conda config --add channels conda-forge # add conda-forge channel
conda config --set channel_priority strict # set priority of channel
then try:
conda create -n testing_junk -c conda-forge python=3.8 -y instrain awscli samtools
I hope you find it useful.

How to diagnose a conda install that forces downgrade of other packages?

I tried to use Anaconda to install Pandas. However, as part of doing so, conda forcibly downgraded other unrelated packages, such as TensorFlow and Numpy. Pandas eventually installed with version 1.2.4.
I don't want these other packages to be downgraded. How do I diagnose this problem? Is the problem with Pandas, conda, or the conda package server?
> conda install pandas
The following packages will be downloaded:
package | build
---------------------------|-----------------
h5py-2.10.0 | py39hec9cf62_0 1.1 MB
intel-openmp-2020.2 | 254 947 KB
mkl-2020.2 | 256 213.9 MB
mkl-service-2.3.0 | py39he8ac12f_0 58 KB
mkl_fft-1.3.0 | py39h54f3939_0 195 KB
mkl_random-1.0.2 | py39h63df603_0 379 KB
numpy-1.19.2 | py39h89c1606_0 21 KB
numpy-base-1.19.2 | py39h2ae0177_0 5.3 MB
pandas-1.2.4 | py39h2531618_0 11.4 MB
python-dateutil-2.8.2 | pyhd3eb1b0_0 241 KB
pytz-2021.1 | pyhd3eb1b0_0 244 KB
scipy-1.6.2 | py39h91f5cce_0 20.6 MB
tensorflow-2.4.1 |mkl_py39h4683426_0 3 KB
tensorflow-base-2.4.1 |mkl_py39h43e0292_0 125.5 MB
------------------------------------------------------------
Total: 379.7 MB
The following NEW packages will be INSTALLED:
pandas pkgs/main/linux-64::pandas-1.2.4-py39h2531618_0
python-dateutil pkgs/main/noarch::python-dateutil-2.8.2-pyhd3eb1b0_0
pytz pkgs/main/noarch::pytz-2021.1-pyhd3eb1b0_0
The following packages will be DOWNGRADED:
h5py 3.2.1-py39h6c542dc_0 --> 2.10.0-py39hec9cf62_0
intel-openmp 2021.3.0-h06a4308_3350 --> 2020.2-254
mkl 2021.3.0-h06a4308_520 --> 2020.2-256
mkl-service 2.4.0-py39h7f8727e_0 --> 2.3.0-py39he8ac12f_0
mkl_fft 1.3.0-py39h42c9631_2 --> 1.3.0-py39h54f3939_0
mkl_random 1.2.2-py39h51133e4_0 --> 1.0.2-py39h63df603_0
numpy 1.20.3-py39hf144106_0 --> 1.19.2-py39h89c1606_0
numpy-base 1.20.3-py39h74d4b33_0 --> 1.19.2-py39h2ae0177_0
scipy 1.6.2-py39had2a1c9_1 --> 1.6.2-py39h91f5cce_0
tensorflow 2.5.0-mkl_py39h4a0693c_0 --> 2.4.1-mkl_py39h4683426_0
tensorflow-base 2.5.0-mkl_py39h35b2a3d_0 --> 2.4.1-mkl_py39h43e0292_0
Proceed ([y]/n)?

Why does conda want to update unrelated packages when I want to remove just one?

Windows 10
conda 4.9.2 (via miniconda)
I installed a single package that did not require any other dependencies to be installed anew or upgraded. Once I realised that I had installed an unsuitable version of the package, I went to remove it, and this is the screen I was presented with:
(pydata) PS C:\Users\Navneeth> conda remove xlrd
Collecting package metadata (repodata.json): done
Solving environment: |
Warning: 2 possible package resolutions (only showing differing packages):
- defaults/win-64::libtiff-4.1.0-h56a325e_1, defaults/win-64::zstd-1.4.9-h19a0ad4_0
- defaults/win-64::libtiff-4.2.0-hd0e1b90_0, defaults/win-64::zstd-1.4.5-h04227a9_0
done
## Package Plan ##
environment location: C:\Users\Navneeth\Miniconda3\envs\pydata
removed specs:
- xlrd
The following packages will be downloaded:
package | build
---------------------------|-----------------
decorator-5.0.3 | pyhd3eb1b0_0 12 KB
importlib-metadata-3.7.3 | py38haa95532_1 31 KB
importlib_metadata-3.7.3 | hd3eb1b0_1 11 KB
ipython-7.22.0 | py38hd4e2768_0 998 KB
jupyter_client-6.1.12 | pyhd3eb1b0_0 88 KB
libtiff-4.1.0 | h56a325e_1 739 KB
nbformat-5.1.3 | pyhd3eb1b0_0 44 KB
notebook-6.3.0 | py38haa95532_0 4.4 MB
pandoc-2.12 | haa95532_0 13.2 MB
parso-0.8.2 | pyhd3eb1b0_0 69 KB
pillow-8.2.0 | py38h4fa10fc_0 671 KB
prometheus_client-0.10.0 | pyhd3eb1b0_0 46 KB
prompt-toolkit-3.0.17 | pyh06a4308_0 256 KB
terminado-0.9.4 | py38haa95532_0 26 KB
zipp-3.4.1 | pyhd3eb1b0_0 15 KB
zstd-1.4.9 | h19a0ad4_0 478 KB
------------------------------------------------------------
Total: 21.0 MB
The following packages will be REMOVED:
xlrd-2.0.1-pyhd3eb1b0_0
The following packages will be UPDATED:
decorator 4.4.2-pyhd3eb1b0_0 --> 5.0.3-pyhd3eb1b0_0
importlib-metadata pkgs/main/noarch::importlib-metadata-~ --> pkgs/main/win-64::importlib-metadata-3.7.3-py38haa95532_1
importlib_metadata 2.0.0-1 --> 3.7.3-hd3eb1b0_1
ipython 7.21.0-py38hd4e2768_0 --> 7.22.0-py38hd4e2768_0
jupyter_client 6.1.7-py_0 --> 6.1.12-pyhd3eb1b0_0
nbformat 5.1.2-pyhd3eb1b0_1 --> 5.1.3-pyhd3eb1b0_0
notebook 6.2.0-py38haa95532_0 --> 6.3.0-py38haa95532_0
pandoc 2.11-h9490d1a_0 --> 2.12-haa95532_0
parso 0.8.1-pyhd3eb1b0_0 --> 0.8.2-pyhd3eb1b0_0
pillow 8.1.2-py38h4fa10fc_0 --> 8.2.0-py38h4fa10fc_0
prometheus_client 0.9.0-pyhd3eb1b0_0 --> 0.10.0-pyhd3eb1b0_0
prompt-toolkit 3.0.8-py_0 --> 3.0.17-pyh06a4308_0
sqlite 3.33.0-h2a8f88b_0 --> 3.35.3-h2bbff1b_0
terminado 0.9.2-py38haa95532_0 --> 0.9.4-py38haa95532_0
zipp 3.4.0-pyhd3eb1b0_0 --> 3.4.1-pyhd3eb1b0_0
zstd 1.4.5-h04227a9_0 --> 1.4.9-h19a0ad4_0
The following packages will be DOWNGRADED:
libtiff 4.2.0-he0120a3_0 --> 4.1.0-h56a325e_1
Proceed ([y]/n)?
Why does conda want to update or downgrade all these other packages when the opposite wasn't done when I installed xlrd? Is there a way that I can safely remove just xlrd? (I hear using --force is risky.)
Asymmetry
Conda re-solves when removing. When installing, Conda first attempts a frozen solve, which amounts to keeping all installed packages fixed and just searching for versions of the requested package(s) that are compatible. In this specific case, xlrd (v2.1.0) is a noarch package with only a python>=3.6 constraint, so it installs cleanly in this frozen solve pass.
The constraint xlrd will also be added to the explicit specifications.[1]
When removing, Conda first removes the constraint and then re-solves the environment with the new set of explicit specifications. It is in this solve that Conda identifies that newer versions of packages are available and proposes updating them.
So, the asymmetry is that the frozen solve explicitly avoids checking for any new packages, but the removal will trigger such a check. There is not currently a way to avoid this without bypassing dependency checking.
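If you do want to bypass it, the (risky, as you note) escape hatch is Conda's --force flag, which skips the solve entirely. A minimal sketch, using the environment name from your prompt:
# removes only xlrd and touches nothing else, but performs no consistency checking
conda remove -n pydata --force xlrd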
Mamba
Actually, mamba, a compiled (fast!) drop-in replacement for conda, will remove only the specified package if it doesn't have anything depending on it. That is its default behavior in my testing.
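For example, a hedged sketch with the environment name from the question (assuming mamba is installed in your base env):
# removes just xlrd, leaving the rest of the environment untouched
mamba remove -n pydata xlrd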
Addendum: Still Some Unexplained Behavior
I replicated your experience by first creating an environment with two specs:
name: foo
channels:
- conda-forge
dependencies:
- python=3.8.0
- pip=20
To simulate this being an old environment, I went into the envs/foo/conda-meta/history and changed[2] the line
# update specs: ['pip=20', 'python=3.8.0']
to
# update specs: ['python=3.8']
Subsequently running conda install xlrd works as expected. Then conda remove xlrd gives a somewhat odd result:
## Package Plan ##
environment location: /opt/conda/envs/foo
removed specs:
- xlrd
The following packages will be downloaded:
package | build
---------------------------|-----------------
pip-21.1.1 | pyhd8ed1ab_0 1.1 MB conda-forge
------------------------------------------------------------
Total: 1.1 MB
The following packages will be REMOVED:
xlrd-2.0.1-pyhd8ed1ab_3
The following packages will be UPDATED:
pip 20.3.4-pyhd8ed1ab_0 --> 21.1.1-pyhd8ed1ab_0
Proceed ([y]/n)?
This effectively replicates the OP's result; however, the additional oddity here is that the python package is not suggested for an update, even though I had intentionally loosened its constraint from 3.8.0 to 3.8. It appears that only packages not in the explicit specifications are subject to updating during package removal.
[1] The explicit specifications are the internally maintained records that Conda keeps of every constraint a user has explicitly specified. One can view the current explicit specifications of an environment with conda env export --from-history. The raw internal records can be found at yourenv/conda-meta/history.
[2] Not a recommended practice!
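For reference, a hedged example of inspecting the records mentioned in footnote [1] (environment name from the question; replace the path placeholder with your env's actual location):
conda env export -n pydata --from-history   # summarized explicit specs
cat <yourenv>/conda-meta/history            # raw internal records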

conda install packages without upgrading python [duplicate]

I had been using Anaconda with python 2.7
$ python
Python 2.7.14 |Anaconda custom (64-bit)| (default, Dec 7 2017, 17:05:42)
[GCC 7.2.0] on linux2
Type "help", "copyright", "credits" or "license" for more information.
When I decided to install tensorflow-gpu (since for some reason I had the non-GPU version), the command I used was:
$ conda install -c anaconda tensorflow-gpu
However, after it was done (details on the output of this command follow), I no longer had conda:
$ conda install -c conda-forge keras
Traceback (most recent call last):
File "/home/me/anaconda2/bin/conda", line 12, in <module>
from conda.cli import main
ModuleNotFoundError: No module named 'conda'
(Note: I also no longer had Keras) and was now running Python 3.6(!?):
$ python
Python 3.6.8 |Anaconda, Inc.| (default, Dec 30 2018, 01:22:34)
[GCC 7.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>>
What happened? How do I stop it from happening again? This happened once before and I ended up deleting all my anaconda files, then reinstalling. I don't want to make that a habit.
The output of my conda install was:
$ conda install -c anaconda tensorflow-gpu
Collecting package metadata: done
Solving environment: done
## Package Plan ##
environment location: /home/me/anaconda2
added / updated specs:
- tensorflow-gpu
The following packages will be downloaded:
package | build
---------------------------|-----------------
_tflow_190_select-0.0.1 | gpu 2 KB anaconda
absl-py-0.7.0 | py36_0 156 KB anaconda
astor-0.7.1 | py36_0 43 KB anaconda
c-ares-1.15.0 | h7b6447c_1 98 KB anaconda
ca-certificates-2018.12.5 | 0 123 KB anaconda
certifi-2018.11.29 | py36_0 146 KB anaconda
cudatoolkit-9.0 | h13b8566_0 340.4 MB anaconda
cudnn-7.1.2 | cuda9.0_0 367.8 MB anaconda
cupti-9.0.176 | 0 1.6 MB anaconda
curl-7.63.0 | hbc83047_1000 145 KB anaconda
gast-0.2.2 | py36_0 138 KB anaconda
git-2.11.1 | 0 9.5 MB anaconda
grpcio-1.16.1 | py36hf8bcb03_1 1.1 MB anaconda
krb5-1.16.1 | h173b8e3_7 1.4 MB anaconda
libcurl-7.63.0 | h20c2e04_1000 550 KB anaconda
libedit-3.1.20181209 | hc058e9b_0 188 KB anaconda
libssh2-1.8.0 | h1ba5d50_4 233 KB anaconda
markdown-3.0.1 | py36_0 107 KB anaconda
mkl_fft-1.0.10 | py36ha843d7b_0 170 KB anaconda
mkl_random-1.0.2 | py36hd81dba3_0 407 KB anaconda
ncurses-6.1 | he6710b0_1 958 KB anaconda
numpy-1.15.4 | py36h7e9f1db_0 47 KB anaconda
numpy-base-1.15.4 | py36hde5b4d6_0 4.3 MB anaconda
openssl-1.1.1 | h7b6447c_0 5.0 MB anaconda
pip-18.1 | py36_0 1.8 MB anaconda
protobuf-3.5.2 | py36hf484d3e_1 610 KB anaconda
python-3.6.8 | h0371630_0 34.4 MB anaconda
qt-4.8.7 | 2 34.1 MB anaconda
setuptools-40.6.3 | py36_0 625 KB anaconda
six-1.12.0 | py36_0 22 KB anaconda
sqlite-3.26.0 | h7b6447c_0 1.9 MB anaconda
tensorboard-1.9.0 | py36hf484d3e_0 3.3 MB anaconda
tensorflow-1.9.0 |gpu_py36h02c5d5e_1 3 KB anaconda
tensorflow-base-1.9.0 |gpu_py36h6ecc378_0 170.8 MB anaconda
tensorflow-gpu-1.9.0 | hf154084_0 2 KB anaconda
termcolor-1.1.0 | py36_1 7 KB anaconda
tk-8.6.8 | hbc83047_0 3.1 MB anaconda
werkzeug-0.14.1 | py36_0 423 KB anaconda
wheel-0.32.3 | py36_0 35 KB anaconda
------------------------------------------------------------
Total: 985.7 MB
The following NEW packages will be INSTALLED:
_tflow_190_select anaconda/linux-64::_tflow_190_select-0.0.1-gpu
c-ares anaconda/linux-64::c-ares-1.15.0-h7b6447c_1
cudatoolkit anaconda/linux-64::cudatoolkit-9.0-h13b8566_0
cudnn anaconda/linux-64::cudnn-7.1.2-cuda9.0_0
cupti anaconda/linux-64::cupti-9.0.176-0
krb5 anaconda/linux-64::krb5-1.16.1-h173b8e3_7
pip anaconda/linux-64::pip-18.1-py36_0
tensorflow-gpu anaconda/linux-64::tensorflow-gpu-1.9.0-hf154084_0
The following packages will be UPDATED:
absl-py conda-forge/noarch::absl-py-0.1.10-py~ --> anaconda/linux-64::absl-py-0.7.0-py36_0
ca-certificates conda-forge::ca-certificates-2018.11.~ --> anaconda::ca-certificates-2018.12.5-0
curl pkgs/main::curl-7.60.0-h84994c4_0 --> anaconda::curl-7.63.0-hbc83047_1000
gast 0.2.0-py27_0 --> 0.2.2-py36_0
grpcio pkgs/main::grpcio-1.12.1-py27hdbcaa40~ --> anaconda::grpcio-1.16.1-py36hf8bcb03_1
libcurl pkgs/main::libcurl-7.60.0-h1ad7b7a_0 --> anaconda::libcurl-7.63.0-h20c2e04_1000
libedit pkgs/main::libedit-3.1-heed3624_0 --> anaconda::libedit-3.1.20181209-hc058e9b_0
markdown conda-forge/noarch::markdown-2.6.11-p~ --> anaconda/linux-64::markdown-3.0.1-py36_0
mkl_fft pkgs/main::mkl_fft-1.0.6-py27hd81dba3~ --> anaconda::mkl_fft-1.0.10-py36ha843d7b_0
ncurses pkgs/main::ncurses-6.0-h9df7e31_2 --> anaconda::ncurses-6.1-he6710b0_1
openssl conda-forge::openssl-1.0.2p-h14c3975_~ --> anaconda::openssl-1.1.1-h7b6447c_0
protobuf conda-forge::protobuf-3.5.2-py27hd28b~ --> anaconda::protobuf-3.5.2-py36hf484d3e_1
python pkgs/main::python-2.7.14-h1571d57_29 --> anaconda::python-3.6.8-h0371630_0
setuptools pkgs/main::setuptools-38.4.0-py27_0 --> anaconda::setuptools-40.6.3-py36_0
six pkgs/main::six-1.11.0-py27h5f960f1_1 --> anaconda::six-1.12.0-py36_0
sqlite pkgs/main::sqlite-3.23.1-he433501_0 --> anaconda::sqlite-3.26.0-h7b6447c_0
tensorflow conda-forge::tensorflow-1.3.0-py27_0 --> anaconda::tensorflow-1.9.0-gpu_py36h02c5d5e_1
tk pkgs/main::tk-8.6.7-hc745277_3 --> anaconda::tk-8.6.8-hbc83047_0
wheel pkgs/main::wheel-0.30.0-py27h2bc6bb2_1 --> anaconda::wheel-0.32.3-py36_0
The following packages will be SUPERSEDED by a higher-priority channel:
certifi conda-forge::certifi-2018.11.29-py27_~ --> anaconda::certifi-2018.11.29-py36_0
git pkgs/main::git-2.17.0-pl526hb75a9fb_0 --> anaconda::git-2.11.1-0
libssh2 pkgs/main::libssh2-1.8.0-h9cfc8f7_4 --> anaconda::libssh2-1.8.0-h1ba5d50_4
mkl_random pkgs/main::mkl_random-1.0.2-py27hd81d~ --> anaconda::mkl_random-1.0.2-py36hd81dba3_0
numpy pkgs/main::numpy-1.15.4-py27h7e9f1db_0 --> anaconda::numpy-1.15.4-py36h7e9f1db_0
numpy-base pkgs/main::numpy-base-1.15.4-py27hde5~ --> anaconda::numpy-base-1.15.4-py36hde5b4d6_0
qt pkgs/main::qt-5.9.4-h4e5bff0_0 --> anaconda::qt-4.8.7-2
tensorflow-base pkgs/main::tensorflow-base-1.9.0-eige~ --> anaconda::tensorflow-base-1.9.0-gpu_py36h6ecc378_0
werkzeug pkgs/main::werkzeug-0.14.1-py27_0 --> anaconda::werkzeug-0.14.1-py36_0
The following packages will be DOWNGRADED:
astor 0.7.1-py27_0 --> 0.7.1-py36_0
tensorboard 1.10.0-py27hf484d3e_0 --> 1.9.0-py36hf484d3e_0
termcolor 1.1.0-py27_1 --> 1.1.0-py36_1
Proceed ([y]/n)? y
Downloading and Extracting Packages
tensorflow-gpu-1.9.0 | 2 KB | ########################################################################################################################################## | 100%
absl-py-0.7.0 | 156 KB | ########################################################################################################################################## | 100%
six-1.12.0 | 22 KB | ########################################################################################################################################## | 100%
git-2.11.1 | 9.5 MB | ########################################################################################################################################## | 100%
_tflow_190_select-0. | 2 KB | ########################################################################################################################################## | 100%
setuptools-40.6.3 | 625 KB | ########################################################################################################################################## | 100%
c-ares-1.15.0 | 98 KB | ########################################################################################################################################## | 100%
cupti-9.0.176 | 1.6 MB | ########################################################################################################################################## | 100%
libssh2-1.8.0 | 233 KB | ########################################################################################################################################## | 100%
gast-0.2.2 | 138 KB | ########################################################################################################################################## | 100%
ncurses-6.1 | 958 KB | ########################################################################################################################################## | 100%
protobuf-3.5.2 | 610 KB | ########################################################################################################################################## | 100%
tensorflow-base-1.9. | 170.8 MB | ########################################################################################################################################## | 100%
ca-certificates-2018 | 123 KB | ########################################################################################################################################## | 100%
python-3.6.8 | 34.4 MB | ########################################################################################################################################## | 100%
cudatoolkit-9.0 | 340.4 MB | ########################################################################################################################################## | 100%
qt-4.8.7 | 34.1 MB | ########################################################################################################################################## | 100%
sqlite-3.26.0 | 1.9 MB | ########################################################################################################################################## | 100%
astor-0.7.1 | 43 KB | ########################################################################################################################################## | 100%
tensorboard-1.9.0 | 3.3 MB | ########################################################################################################################################## | 100%
mkl_fft-1.0.10 | 170 KB | ########################################################################################################################################## | 100%
mkl_random-1.0.2 | 407 KB | ########################################################################################################################################## | 100%
certifi-2018.11.29 | 146 KB | ########################################################################################################################################## | 100%
wheel-0.32.3 | 35 KB | ########################################################################################################################################## | 100%
numpy-base-1.15.4 | 4.3 MB | ########################################################################################################################################## | 100%
numpy-1.15.4 | 47 KB | ########################################################################################################################################## | 100%
curl-7.63.0 | 145 KB | ########################################################################################################################################## | 100%
openssl-1.1.1 | 5.0 MB | ########################################################################################################################################## | 100%
tk-8.6.8 | 3.1 MB | ########################################################################################################################################## | 100%
libedit-3.1.20181209 | 188 KB | ########################################################################################################################################## | 100%
markdown-3.0.1 | 107 KB | ########################################################################################################################################## | 100%
werkzeug-0.14.1 | 423 KB | ########################################################################################################################################## | 100%
krb5-1.16.1 | 1.4 MB | ########################################################################################################################################## | 100%
termcolor-1.1.0 | 7 KB | ########################################################################################################################################## | 100%
pip-18.1 | 1.8 MB | ########################################################################################################################################## | 100%
libcurl-7.63.0 | 550 KB | ########################################################################################################################################## | 100%
tensorflow-1.9.0 | 3 KB | ########################################################################################################################################## | 100%
grpcio-1.16.1 | 1.1 MB | ########################################################################################################################################## | 100%
cudnn-7.1.2 | 367.8 MB | ########################################################################################################################################## | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
(OK - I see the change to Python 3.6 now, but that's still a nasty thing to have to be careful about. Is there some way to force it to leave my Python version alone?)
Cause
Changing Python versions without updating the conda package breaks Conda. The Python version change (2.7.14 -> 3.6.8) created a situation where the new python has a new site-packages that no longer contains a conda package; if you had only updated within 2.7.x, this wouldn't have been an issue.
Conda includes both a set of binaries (e.g., what you're invoking when you type conda in a shell) and a Python package by the same name. The Python package is necessary for Conda as a whole to function, and it gets loaded whenever you try to use conda.
It is problematic that many packages on Anaconda seem to be triggering Python version changes, but not subsequently triggering a conda package update. This sounds like something the dependency resolver is overlooking; that is, the default behavior should be to protect the integrity of the base environment where conda lives.
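A hedged way to confirm that diagnosis (paths from your output; the exact site-packages layout depends on the Python version now installed):
# if the conda Python package survived the transition, this prints a version;
# if it is gone (matching the traceback above), the import fails
/home/me/anaconda2/bin/python -c "import conda; print(conda.__version__)"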
Trying to Recover
One possible route to recovery is to temporarily use micromamba (a standalone build of mamba) to repair the base environment. You can do all the following from any directory, so maybe use a temporary one or wherever you put downloads. Please report in the comments if this works or needs adjusting!
Installing Micromamba
Download the appropriate micromamba for your platform (here we'll use the latest linux-64 build). The actual binary will be at bin/micromamba:
# download and unpack
wget -qO- https://micro.mamba.pm/api/micromamba/linux-64/latest | tar -xvj bin/micromamba
Temporarily set MAMBA_ROOT_PREFIX to the base of your install. Typically this is the anaconda3 or miniconda3 folder; in this case, we'll use the path given by the OP:
export MAMBA_ROOT_PREFIX=/home/me/anaconda2
Temporarily configure the shell to add the micromamba command:
eval "$(./bin/micromamba shell hook -s posix)"
Test that it works by checking the configuration information:
micromamba info
The key thing to check is that base environment: correctly identifies where your base env is and shows it as (writable). You should also see your base env's pkgs folder listed under package cache:.
Reinstall conda for the Current Python
(Re-)Install the conda package in the base env:
micromamba install -n base conda
Make sure that the build of Conda that is suggested corresponds to the version of Python currently installed. The --force-reinstall flag might be useful if it claims the requirement is already satisfied. Alternatively, try
micromamba upgrade -n base conda
Open a new shell and see if conda is working. You don't need to keep micromamba around. However, I do enthusiastically encourage users to permanently install mamba (see the next step).
(Optional) Install Mamba in base
Consider also installing Mamba directly in the base environment. It is a compiled (fast!) alternative frontend to Conda environment management.
micromamba install -n base mamba
One can then use mamba in most places where conda would be used.
Last Recourse
If all else fails, you may just have to reinstall. Others have reported installing to a different directory and still being able to access their existing environments.
Preventions
Avoiding Breakage through Better Practice
First, just a general (opinionated) recommendation: leverage virtual environments more. This doesn't directly solve the problem, but it will give you a workflow that is significantly less prone to such pitfalls. You shouldn't have accepted such a huge change in the first place, especially not to base. Personally, I rarely install things in base outside of infrastructure (emacs, jupyter-related things, conda, etc.).[1] Software packages go into project-specific or at least development-type environments.
For example, were I doing the install shown, I would have made a new environment for it:
mamba create -n tf36 anaconda::tensorflow-gpu python=3.6
or whatever Python version you actually wish to work in.
Direct Solution: Pinning
Conda does support package pinning, and this is the more direct way to ensure you never again ruin your base install through a Python 2 to 3 transition. Namely, in the environment's conda-meta folder, create a file named pinned and add the line
python 2.7.*
Note that some users have reported similar issues for 3.6 -> 3.7 transitions, so I believe including the minor version here is necessary. See the documentation on pinning.
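A minimal sketch, using the base environment path from your output (adjust to your install):
# record the pin
echo "python 2.7.*" >> /home/me/anaconda2/conda-meta/pinned
# sanity check: a dry run of the same install should now keep python at 2.7
# (or report a conflict) instead of silently jumping to 3.x
conda install --dry-run -c anaconda tensorflow-gpu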
[1] Note that I use a Miniforge variant (Mambaforge), not the Anaconda installer, so I have more control over base from the start.
I solved this issue by removing any PYTHONHOME system environment variable(s).
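For example (hedged; the exact steps depend on your OS and shell), on Linux/macOS:
echo "$PYTHONHOME"   # check whether it is set at all
unset PYTHONHOME     # clear it for the current shell session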

anaconda update all downgrades packages

When I try to update all packages in my Anaconda3 virtualenv using the conda update --all command, instead of upgrading all packages, Anaconda tells me that some packages would be downgraded.
This is the output of the conda update --all command:
Fetching package metadata: ....
Solving package specifications: ......................................................................................................................................................................................................................................................................................
Package plan for installation in environment /home/xiaolong/development/anaconda3/envs/jupyter:
The following packages will be downloaded:
package | build
---------------------------|-----------------
mkl-rt-11.1 | p0 100.1 MB
numpy-1.10.2 | py35_p0 5.8 MB
pillow-3.1.1 | py35_0 812 KB
werkzeug-0.11.4 | py35_0 420 KB
clyent-1.2.1 | py35_0 13 KB
numexpr-2.4.4 | np110py35_p0 334 KB
scipy-0.16.1 | np110py35_p0 23.2 MB
bokeh-0.11.1 | py35_0 3.1 MB
datashape-0.5.1 | py35_0 91 KB
scikit-learn-0.17 | np110py35_p1 8.8 MB
odo-0.4.2 | py35_0 176 KB
------------------------------------------------------------
Total: 142.8 MB
The following NEW packages will be INSTALLED:
mkl-rt: 11.1-p0
The following packages will be UPDATED:
bokeh: 0.11.0-py35_0 --> 0.11.1-py35_0
clyent: 1.2.0-py35_0 --> 1.2.1-py35_0
datashape: 0.5.0-py35_0 --> 0.5.1-py35_0
odo: 0.4.0-py35_0 --> 0.4.2-py35_0
pillow: 3.1.0-py35_0 --> 3.1.1-py35_0
werkzeug: 0.11.3-py35_0 --> 0.11.4-py35_0
The following packages will be DOWNGRADED:
numexpr: 2.4.6-np110py35_1 --> 2.4.4-np110py35_p0 [mkl]
numpy: 1.10.4-py35_0 --> 1.10.2-py35_p0 [mkl]
scikit-learn: 0.17-np110py35_2 --> 0.17-np110py35_p1 [mkl]
scipy: 0.17.0-np110py35_1 --> 0.16.1-np110py35_p0 [mkl]
Proceed ([y]/n)?
I'd like to know why this is happening. Why would some packages be downgraded? Maybe a better question is:
What changed in those packages being downgraded, such that updating other packages requires reverting them to earlier versions?
From this I hope to work out whether I need any feature of the current version of, for example, scipy, or whether I can let it be downgraded.
