Does it make sense to use Conda + Poetry?

Does it make sense to use Conda + Poetry for a Machine Learning project? Allow me to share my (novice) understanding and please correct or enlighten me:
As far as I understand, Conda and Poetry have different purposes but are largely redundant:
Conda is primarily an environment manager (in fact, not necessarily for Python), but it can also manage packages and dependencies.
Poetry is primarily a Python package manager (say, an upgrade of pip), but it can also create and manage Python environments (say, an upgrade of Pyenv).
My idea is to use both and compartmentalize their roles: let Conda be the environment manager and Poetry the package manager. My reasoning is that (it sounds like) Conda is best for managing environments and can be used for compiling and installing non-python packages, especially CUDA drivers (for GPU capability), while Poetry is more powerful than Conda as a Python package manager.
I've managed to make this work fairly easily by using Poetry within a Conda environment. The trick is to not use Poetry to manage the Python environment: I'm not using commands like poetry shell or poetry run, only poetry init, poetry install, etc. (after activating the Conda environment).
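Concretely, the day-to-day workflow is something like this (a sketch based on the files below; N is the environment name from environment.yml):
conda env create -f environment.yml   # Conda creates the env: Python, CUDA toolkit, etc.
conda activate N                      # Conda manages the environment...
poetry install                        # ...and Poetry installs Python packages into it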
For full disclosure, my environment.yml file (for Conda) looks like this:
name: N
channels:
  - defaults
  - conda-forge
dependencies:
  - python=3.9
  - cudatoolkit
  - cudnn
and my pyproject.toml file looks like this:
[tool.poetry]
name = "N"
authors = ["B"]
[tool.poetry.dependencies]
python = "3.9"
torch = "^1.10.1"
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
To be honest, one of the reasons I proceeded this way is that I was struggling to install CUDA (for GPU support) without Conda.
Does this project design look reasonable to you?

I have experience with a Conda + Poetry setup, and it's been working fine. The great majority of my dependencies are specified in pyproject.toml, but when there's something that's unavailable in PyPI, or installing it with Conda is easier, I add it to environment.yml. Moreover, Conda is used as a virtual environment manager, which works well with Poetry: there is no need to use poetry run or poetry shell, it is enough to activate the right Conda environment.
Tips for creating a reproducible environment
Add Poetry, possibly with a version number (if needed), as a dependency in environment.yml, so that you get Poetry installed when you run conda create, along with Python and other non-PyPI dependencies.
Add conda-lock, which gives you lock files for Conda dependencies, just like you have poetry.lock for Poetry dependencies.
Consider using mamba which is generally compatible with conda, but is better at resolving conflicts, and is also much faster. An additional benefit is that all users of your setup will use the same package resolver, independent from the locally-installed version of Conda.
By default, use Poetry for adding Python dependencies. Install packages via Conda if there's a reason to do so (e.g. in order to get a CUDA-enabled version). In such a case, it is best to specify the package's exact version in environment.yml, and after it's installed, to add an entry with the same version specification to Poetry's pyproject.toml (without ^ or ~ before the version number). This will let Poetry know that the package is there and should not be upgraded.
If you use different channels that provide the same packages, it might not be obvious which channel a particular package will be downloaded from. One solution is to specify the channel for the package using the :: notation (see the pytorch entry below), and another solution is to enable strict channel priority. Unfortunately, in Conda 4.x there is no way to enable this option through environment.yml.
Note that Python adds user site-packages to sys.path, which may cause lack of reproducibility if the user has installed Python packages outside Conda environments. One possible solution is to make sure that the PYTHONNOUSERSITE environment variable is set to True (or to any other non-empty value).
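For example, on Linux/macOS:
export PYTHONNOUSERSITE=True   # any non-empty value works
python -c "import sys; print(sys.flags.no_user_site)"   # should print 1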
Example
environment.yml:
name: my_project_env
channels:
  - pytorch
  - conda-forge
  # We want to have a reproducible setup, so we don't want default channels,
  # which may be different for different users. All required channels should
  # be listed explicitly here.
  - nodefaults
dependencies:
  - python=3.10.*  # or don't specify the version and use the latest stable Python
  - mamba
  - pip  # pip must be mentioned explicitly, or conda-lock will fail
  - poetry=1.*  # or 1.1.*, or no version at all -- as you want
  - tensorflow=2.8.0
  - pytorch::pytorch=1.11.0
  - pytorch::torchaudio=0.11.0
  - pytorch::torchvision=0.12.0

# Non-standard section listing target platforms for conda-lock:
platforms:
  - linux-64
virtual-packages.yml (may be used e.g. when we want conda-lock to generate CUDA-enabled lock files even on platforms without CUDA):
subdirs:
  linux-64:
    packages:
      __cuda: 11.5
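To make conda-lock pick this file up, you can (to the best of my knowledge; check conda-lock --help for your version) pass it explicitly:
conda-lock --virtual-package-spec virtual-packages.yml -k explicit --conda mamba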
First-time setup
You can avoid playing with the bootstrap env and simplify the example below if you have conda-lock, mamba and poetry already installed outside your target environment.
# Create a bootstrap env
conda create -p /tmp/bootstrap -c conda-forge mamba conda-lock poetry='1.*'
conda activate /tmp/bootstrap
# Create Conda lock file(s) from environment.yml
conda-lock -k explicit --conda mamba
# Set up Poetry
poetry init --python=~3.10 # version spec should match the one from environment.yml
# Fix package versions installed by Conda to prevent upgrades
poetry add --lock tensorflow=2.8.0 torch=1.11.0 torchaudio=0.11.0 torchvision=0.12.0
# Add conda-lock (and other packages, as needed) to pyproject.toml and poetry.lock
poetry add --lock conda-lock
# Remove the bootstrap env
conda deactivate
rm -rf /tmp/bootstrap
# Add Conda spec and lock files
git add environment.yml virtual-packages.yml conda-linux-64.lock
# Add Poetry spec and lock files
git add pyproject.toml poetry.lock
git commit
Usage
The above setup may seem complex, but it can be used in a fairly simple way.
Creating the environment
conda create --name my_project_env --file conda-linux-64.lock
conda activate my_project_env
poetry install
Activating the environment
conda activate my_project_env
Updating the environment
# Re-generate Conda lock file(s) based on environment.yml
conda-lock -k explicit --conda mamba
# Update Conda packages based on re-generated lock file
mamba update --file conda-linux-64.lock
# Update Poetry packages and re-generate poetry.lock
poetry update

To anyone using michau's answer above but having issues including Poetry in environment.yml: at the time of writing, Poetry versions 1.2 or greater aren't available on conda-forge. You can still include Poetry 1.2+ in the .yml by installing it through pip instead:
dependencies:
  - python=3.9.*
  - mamba
  - pip
  - pip:
    - "poetry>=1.2"

Related

Why some specific versions of packages are not available in conda channels [duplicate]

I am attempting to use Conda to create an environment from a Pip requirements file. The contents of the file are
requirements.txt
numpy==1.18.2
torch==1.4.0
torchvision==0.5.0
scikit-learn==0.22.2.post1
Pillow==8.3.2
pydicom==1.4.2
pandas==1.0.3
Running the command
conda create -n $name --file requirements.txt
gives a PackageNotFound error as the channels are missing.
How do I amend this?
Possible Issues
There are a few potential issues.
Conda pytorch
First, not all packages in Conda go by the same name as they do in other repositories. Part of this is due to the nature of Conda being a general package repository, rather than a language-specific one. In particular, the torch module is delivered via the Conda pytorch package.
So that has to change.
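For example (a sketch; the channel and version are taken from the requirements in question):
conda install -c pytorch pytorch=1.4.0   # PyPI's "torch" is "pytorch" on Conda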
NumPy version unavailable
That particular build of NumPy does not appear to be available in either defaults or conda-forge channels.
$ mamba search numpy=1.18.2
No match found for: numpy=1.18.2. Search: *numpy*=1.18.2
PackagesNotFoundError: The following packages are not available from current channels:
- numpy=1.18.2
Current channels:
- https://conda.anaconda.org/conda-forge/osx-64
- https://conda.anaconda.org/conda-forge/noarch
- https://conda.anaconda.org/bioconda/osx-64
- https://conda.anaconda.org/bioconda/noarch
- https://repo.anaconda.com/pkgs/main/osx-64
- https://repo.anaconda.com/pkgs/main/noarch
- https://repo.anaconda.com/pkgs/r/osx-64
- https://repo.anaconda.com/pkgs/r/noarch
Why would this happen? For most Python packages, Conda works downstream of the PyPI repository. When new releases come out, the Conda Forge bot (for example) will auto-generate a pull request to the corresponding feedstock. Sometimes these don't "just work" and need some troubleshooting to get built. Occasionally, the process to get the builds working won't finish before a new release hits. This results in a newer pull request superseding the previous one, and can lead to the old pull request being abandoned. This results in gaps in the coverage of PyPI by Conda Forge, which is exactly what happened here.
If you can tolerate a different version, conda-forge does provide v1.18.1 (below) and v1.18.4 (above).
Otherwise, if you require exact replication of package versions, then you will have to source this from PyPI. I'll show this in the end.
Channel issues
Missing channels
OP does not indicate the channel configuration. The torchvision==0.5.0 package, for example, is only available through the pytorch channel.
Masked channels
Another issue here could be the use of the channel_priority: strict setting. If this setting were used, it is possible a channel with the version required might be a priori excluded by the SAT solver simply because the package (but not the correct version) is available in a higher priority channel. These days channel_priority: flexible is the default and can be set with:
conda config --set channel_priority flexible
Solutions
Exact replication (PyPI only)
Given the package names and versions, these packages likely originated from PyPI. If you need to exactly replicate the original environment - say, for reproducing scientific results - then I'd recommend sourcing everything from PyPI. The best way to do this is to use Conda to source Python and Pip, then let Pip install the requirements.txt.
Judging from the package versions, we're talking Python 3.7 or 3.8. You'd probably be fine with just python=3.8, but a precise guesstimate from release dates would be python=3.8.2. So, try something like:
environment.yaml
name: my_env
channels:
  - conda-forge
dependencies:
  - python=3.8.2
  - pip
  - pip:
    - -r requirements.txt
Then create the environment with
conda env create -n $name -f environment.yaml
making sure the requirements.txt is in the folder with the YAML.
If adding packages to this environment later, I would recommend only using pip install. Otherwise, Conda may have issues.
Conda-only environment
Assuming the numpy=1.18.2 can be substituted, a Conda-only environment might be something like:
environment.yaml
name: my_env
channels:
  - pytorch
  - conda-forge
dependencies:
  - python=3.8
  - numpy=1.18.1  # alternatively, 1.18.4
  - pytorch=1.4.0
  - torchvision=0.5.0
  - scikit-learn=0.22.2.post1
  - pillow=8.3.2
  - pydicom=1.4.2
  - pandas=1.0.3
Again, creating with:
conda env create -n $name -f environment.yaml
Note that in YAML only one = is used. This would be the best approach if you plan to install additional packages through Conda in an ad hoc manner (e.g., conda install).
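For comparison, the same pin in the two syntaxes:
numpy==1.18.2   # Pip / requirements.txt (two equals signs)
numpy=1.18.2    # Conda environment.yaml (one equals sign)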
Mixed Conda-Pip environment
You could also try a mixed environment mostly similar to the last one, but having Pip specifically provide numpy==1.18.2. I wouldn't recommend this, since the other dependencies will definitely bring in NumPy first from Conda, and then Pip will clobber it to provide the exact version.


How to export and import a conda environment without errors

I exported a conda environment in this way:
conda env export > environment.yml
Then I committed and pushed the environment.yml file to the git repo.
From another computer I cloned the repo and then tried to create the conda environment:
conda env create -f environment.yml
First I got a warning:
Warning: you have pip-installed dependencies in your environment file,
but you do not list pip itself as one of your conda dependencies.
Conda may not use the correct pip to install your packages, and they
may end up in the wrong place. Please add an explicit pip dependency.
I'm adding one for you, but still nagging you
I don't know why conda export does not include pip in the environment definition.
Then I got errors like wrong/unavailable versions of packages:
es-core-news-sm==3.0.0 version not found
I just removed the version part and only left the name of the package and got it work with:
conda env update --prefix ./env --file environment.yml --prune
I would like to know how I can avoid this behavior.
es-core-news-sm==3.0 does not exist on PyPI, where only 3.1 and 2.3.1 are available, hence your error message.
This is of course something very specific to the environment that you have and the packages that you have installed. In your specific case, just removing the version can be a fix, but no guarantee that this will work in all cases.
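If you want to check for yourself which versions are available, one common trick (assuming the package is published on PyPI, as described above) is to request an empty pin and read pip's error message:
pip install es-core-news-sm==
# pip errors out and lists every version it can find on PyPI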
As for the cause, I can only guess, but what I expect happened in your case is:
You installed es-core-news-sm==3.0 to your environment
The developers of that package created a newer version and decided to delete the old version
Exporting the environment does correctly state that it contains es-core-news-sm==3.0
Creating an environment from the .yaml from step 3 fails, because the package is no longer available (see step 2)
An alternative (depending on your use case) could be to use conda-pack, which can create a packed version of your environment that you can then unpack. This only works, though, if the OS on the source and target machines is the same.
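A minimal sketch of that workflow (my_env is a hypothetical environment name):
# On the source machine:
conda install -c conda-forge conda-pack
conda pack -n my_env -o my_env.tar.gz
# On the target machine (same OS):
mkdir -p my_env
tar -xzf my_env.tar.gz -C my_env
source my_env/bin/activate
conda-unpack   # rewrites path prefixes inside the unpacked env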

Why does conda create try to install weird packages?

I am trying to install a new conda environment that will be totally separate from my other environments, so I run:
conda create --name foot35 python=3.5
Anaconda then asks for my approval to install these NEW packages:
asn1crypto: 0.22.0-py35he3634b9_1
ca-certificates: 2017.08.26-h94faf87_0
cachecontrol: 0.12.3-py35h3f82863_0
certifi: 2017.7.27.1-py35hbab57cd_0
cffi: 1.10.0-py35h4132a7f_1
chardet: 3.0.4-py35h177e1b7_1
colorama: 0.3.9-py35h32a752f_0
cryptography: 2.0.3-py35h67a4558_1
distlib: 0.2.5-py35h12c42d7_0
html5lib: 0.999999999-py35h79d4e7f_0
idna: 2.6-py35h8dcb9ae_1
lockfile: 0.12.2-py35h667c6d9_0
msgpack-python: 0.4.8-py35hdef45cb_0
openssl: 1.0.2l-vc14hcac20b0_2 [vc14]
packaging: 16.8-py35h5fb721f_1
pip: 9.0.1-py35h69293b5_3
progress: 1.3-py35ha84af61_0
pycparser: 2.18-py35h15a15da_1
pyopenssl: 17.2.0-py35hea705d1_0
pyparsing: 2.2.0-py35hcabcaab_1
pysocks: 1.6.7-py35hb30ac0d_1
python: 3.5.4-hedc2606_15
requests: 2.18.4-py35h54a615f_1
setuptools: 36.5.0-py35h21a22e4_0
six: 1.10.0-py35h06cf344_1
urllib3: 1.22-py35h8cc84eb_0
vc: 14-h2379b0c_1
vs2015_runtime: 14.0.25123-hd4c4e62_1
webencodings: 0.5.1-py35h5d527fb_1
wheel: 0.29.0-py35hdbcb6e6_1
win_inet_pton: 1.0.1-py35hbef1270_1
wincertstore: 0.2-py35hfebbdb8_0
I don't know why it suggests these specific ones. I looked up lockfile and its website says:
Note: This package is deprecated.
Here is a screenshot of my command prompt as additional information.
I am trying to do a clean install that is unrelated/independent to the root environment.
Why is conda trying to install these things and how do I fix it?
conda create will "Create a new conda environment from a list of specified packages." ( https://conda.io/docs/commands/conda-create.html )
What list??!? The .condarc file is the conda configuration file.
https://conda.io/docs/user-guide/configuration/use-condarc.html#overview
The .condarc file can change many parameters, including:
Where conda looks for packages.
If and how conda uses a proxy server.
Where conda lists known environments.
Whether to update the bash prompt with the current activated environment name.
Whether user-built packages should be uploaded to Anaconda.org.
**Default packages or features to include in new environments.**
Additionally, if you ever typed conda config, even accidentally...
The .condarc file is not included by default, but it is automatically created in your home directory the first time you run the conda config command.
A .condarc file may also be located in the root environment, in which case it overrides any in the home directory.
If you would like a single clean env, Boshika's recommendation of the --no-default-packages flag will work for that one instance; beyond that, you can check and modify the default packages for all future envs. ( https://conda.io/docs/user-guide/configuration/use-condarc.html#always-add-packages-by-default-create-default-packages )
Always add packages by default (create_default_packages)
When creating new environments, add the specified packages by default. The default packages are installed in every environment you create. You can override this option at the command prompt with the --no-default-packages flag. The default is to not include any packages.
EXAMPLE:
create_default_packages:
  - pip
  - ipython
  - scipy=0.15.0
Lockfile may be there due to legacy requirements across all operating systems. Hopefully, you have the tools to remove it if you choose.
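To check whether such defaults are configured on your machine, you can inspect the setting directly:
conda config --show create_default_packages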
To prevent conda from installing the default packages, you can use the --no-default-packages flag:
conda create --name foot35 --no-default-packages python=3.5
Please don't lose hope; it was very weird for me too. What you have to do is follow these steps:
1. Download Anaconda for your system from the official site and install it: https://repo.continuum.io
2. After the installation, you can select the packages you need from there; there is no need to download anything else, as it is full of packages.
3. If you want to work in Python, download the Spyder IDE; it is very useful for machine learning libraries.
Don't create other environments instead of the default root environment, otherwise you will have to duplicate all the files again. If there is an error while installing in root, close the window, run it again as administrator, and after that it works fine.
Because all the files are in your root environment, you don't have to worry about paths in the future, and you can easily install and uninstall packages there: numpy, pandas, tensorflow (and its GPU variant), scikit-learn, etc.
Thank you
These packages are generally useful if you wish to pip install ... anything. Without many of them, doing a pip install requests could result in errors such as these (and more):
No Module named Setuptools
pip: command not found
pip is configured with locations that require TLS/SSL, however the ssl module in Python is not available
The issue that the conda create ... exposes is that the packages it wants to pull down are variable (based on when you create the environment). If you wish to maintain the same environment for yourself and for those who may collaborate with you, then freezing or pinning conda create's default installed packages may be necessary.
One way of doing this is creating your environment with conda env create using a conda environment YAML file such as this example:
dependencies:
  - ca-certificates=2018.03.07
  - certifi=2018.4.16
  - libedit=3.1.20170329
  - libffi=3.2.1
  - ncurses=6.1
  - openssl=1.0.2o
  - pip=10.0.1
  - python=3.6.6
  - readline=7.0
  - setuptools=40.0.0
  - sqlite=3.24.0
  - tk=8.6.7
  - wheel=0.31.1
  - xz=5.2.4
  - zlib=1.2.11
conda env create -n <NAME_OF_ENVIRONMENT> -f <PATH_TO_CONDA_REQUIREMENTS_FILE>
(note it's conda env create not conda create)

anaconda update all possible packages?

I tried conda search --outdated; there are lots of outdated packages. For example, scipy is 0.17.1 but the latest is 0.18.0. However, when I do conda update --all, it will not update any packages.
update 1
conda update --all --alt-hint
Fetching package metadata .......
Solving package specifications: ..........
# All requested packages already installed.
# packages in environment at /home/user/opt/anaconda2:
#
update 2
I can update those packages separately. I can do conda update scipy. But why can I not update all of them in one go?
TL;DR: dependency conflicts. Updating one package can require (through its requirements) downgrading another.
You are right:
conda update --all
is actually the way to go. [1] Conda always tries to upgrade the packages to the newest version in the series (say Python 2.x or 3.x).
Dependency conflicts
But it is possible that there are dependency conflicts (which prevent a further upgrade). Conda usually warns very explicitly if they occur.
e.g. X requires Y <5.0, so Y will never be >= 5.0
That's why you 'cannot' upgrade them all.
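To see what conda would (or would not) change without touching the environment, you can do a dry run:
conda update --all --dry-run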
Resolving
Update 1: for some time now, mamba has proven to be an extremely powerful drop-in replacement for conda in terms of dependency resolution, and (in my experience) it finds solutions to problems where conda fails. A way to invoke it without installing mamba is via the --solver=libmamba flag (requires conda-libmamba-solver), as pointed out by matteo in the comments.
To add: maybe an upgrade could work, but a newer version of X working with Y > 5.0 may simply not be available in conda. It is possible to install with pip, since more packages are available via pip. But be aware that pip also installs packages even when dependency conflicts exist, and that it usually breaks your conda environment in the sense that you cannot reliably install with conda anymore. If you do that, do it as a last resort and only after all packages have been installed with conda. It's rather a hack.
A safe way you can try is to add conda-forge as a channel when upgrading (add -c conda-forge as a flag), or any other channel you find that contains your package, if you really need this new version. This way conda also searches these places for available packages.
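For example:
conda update --all -c conda-forge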
Considering your update: You can upgrade them each separately, but doing so will not only include an upgrade but also a downgrade of another package as well. Say, to add to the example above:
X > 2.0 requires Y < 5.0, X < 2.0 requires Y > 5.0
So upgrading Y > 5.0 implies downgrading X to < 2.0 and vice versa.
(this is a pedagogical example, of course, but it's the same in reality, usually just with more complicated dependencies and sub-dependencies)
So you still cannot upgrade them all by doing the upgrades separately; the dependencies are just not satisfiable, so sooner or later an upgrade will downgrade an already upgraded package again. Or it will break the compatibility of the packages (which you usually don't want!), which is only possible by explicitly invoking an ignore-dependencies and force command. But that is only to hack your way around issues, definitely not the normal-user case!
[1] If you actually want to update the packages of your installation, which you usually don't. The command run in the base environment will update the packages in it, but usually you should work with virtual environments (conda create -n myenv and then conda activate myenv). Executing conda update --all inside such an environment will update the packages inside that environment. However, since the base environment is also an environment, the answer applies to both cases in the same way.
To answer more precisely to the question:
conda (the same conda whether from Miniconda or Anaconda) updates everything, but ONLY within a specific version series of a package, i.e. within major and minor versions. That's the paradigm.
In the documentation you will find: "NOTE: Conda updates to the highest version in its series, so Python 2.7 updates to the highest available in the 2.x series and 3.6 updates to the highest available in the 3.x series."
If Wang does not give a reproducible example, one can only guess. For example, is it really the virtual environment he wants to update, or could Wang get what he/she wants with:
conda update -n ENVIRONMENT --all
*PLEASE read the docs before executing "update --all"!
This does not necessarily lead to an update of all packages. Because conda tries to resolve the relationship of dependencies between all packages in your environment, it can lead to DOWNGRADED packages without warnings.
If you only want to update almost all, you can create a pin file
echo "conda ==4.0.0" >> ~/miniconda3/envs/py35/conda-meta/pinned
echo "numpy 1.7.*" >> ~/miniconda3/envs/py35/conda-meta/pinned
before running the update.
If later on you want to ignore the file in your env for an update, you can do:
conda update --all --no-pin
You should not do update --all. If you need it nevertheless, it is safer to test this in a cloned environment first.
First step should always be to backup your current specification:
conda list -n py35 --explicit
(but even so there is not always a link to the source available - like for jupyterlab extensions)
Next you can clone and update:
conda create -n py356 --clone py35
conda activate py356
conda config --set pip_interop_enabled True # for conda>=4.6
conda update --all
Update: currently I would use mamba (or micromamba) as a conda package-manager replacement.
Update: because the idea of conda is nice but it does not work out very well for complex environments, I personally prefer the combination of nix-shell (or lorri) and poetry [as superior pip/conda :-)] (see the poetry2nix intro).
Alternatively, you can use nix and mach-nix (where you only need your requirements file; it resolves and builds environments best).
On Linux / macOS you could use nix like
nix-env -iA nixpkgs.python37
to enter an environment that has, in this case, Python 3.7 (you can of course change the version)
or as a very good Python (advanced) environment you can use mach-nix (with nix) like
mach-nix env ./env -r requirements.txt
(which even supports conda [but currently in beta])
or via api like
nix-shell -p nixFlakes --run "nix run github:davhau/mach-nix#with.ipython.pandas.seaborn.bokeh.scikit-learn "
Finally, if you really need to work with packages that are not compatible due to their dependencies, it is possible with technologies like NixOS/nix-pkgs.
Imagine the dependency graph of packages: when the number of packages grows large, the chance of encountering a conflict when upgrading or adding packages is much higher. To avoid this, simply create a new environment in Anaconda.
Be frugal, install only what you need. For me, I installed the following packages in my new environment:
pandas
scikit-learn
matplotlib
notebook
keras
And I have 84 packages in total.
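In practice that can be as simple as (the environment name is hypothetical):
conda create -n my_ml_env pandas scikit-learn matplotlib notebook keras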
I agree with Mayou36.
For example, I was making the mistake of installing new packages in the base environment, using conda for some packages and pip for others.
Why is this bad?
1. "None of this is going to help with updating packages that have been installed from PyPI via pip, or any packages installed using python setup.py install. conda list will give you some hints about the pip-based Python packages you have in an environment, but it won't do anything special to update them."
And I had all my projects in the same single environment! And I used update --all, which is bad and did not update everything.
So, the best thing to do is to create a new environment for each project. Why?
2. "A Conda environment is a directory that contains a specific collection of Conda packages that you have installed. For example, you may be working on a research project that requires NumPy 1.18 and its dependencies, while another environment associated with a finished project has NumPy 1.12 (perhaps because version 1.12 was the most current version of NumPy at the time the project finished). If you change one environment, your other environments are not affected. You can easily activate or deactivate environments, which is how you switch between them."
So, to wrap it up:
1. Create a new environment for each project.
2. Be aware of the differences between conda and pip.
3. Only include the packages that you will actually need, and update them properly only if necessary.
If working in MS Windows, you can use Anaconda Navigator. Click on the environment; in the drop-down box it's "Installed" by default. You can select "Updatable" and start from there.
To update all possible packages I used conda update --update-all.
It works!
I solved this problem with conda and pip.
First, I ran:
conda uninstall qt
conda uninstall matplotlib
conda uninstall PyQt5
After that, I opened cmd and ran:
pip uninstall qt
pip uninstall matplotlib
pip uninstall PyQt5
Lastly, you should reinstall matplotlib with pip: pip install matplotlib
