Do Anaconda Cloud packages manually pulled from the website come with all of the package's dependencies?
For example, I have package A that I need for a python project. It has a dependency tree like below:
pip show package_A
Name: package_A
Version: 1.0.1
Requires: package_X, package_Y
pip show package_X
Name: package_X
Version: 2.0.2
Requires:
pip show package_Y
Name: package_Y
Version: 3.0.3
Requires: package_M
pip show package_M
Name: package_M
Version: 4.0.4
Requires:
So if I wanted to manually pull down package_A from the Anaconda Cloud site, would I need to pull the *.tar.bz2 files for all of the packages, or would the package_A-1.0.1-py36hafb9ca4_1.tar.bz2 file include all of the dependencies as well?
I use pip to show the dependencies, but I will be using conda to install. Something like:
conda install /libs/package_A-1.0.1-py36hafb9ca4_1.tar.bz2
The conda install command will resolve and install all dependencies automatically, provided they were configured within the package. You can check a package's dependencies by running:
conda info package_A=1.0.1=py36hafb9ca4_1
However, if you install directly from a tarball, there is no dependency check. To install local packages you can use the --use-local option:
conda install --use-local package_A=1.0.1=py36hafb9ca4_1
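For the dependency tree in the question, that means fetching every tarball in the tree yourself and installing the leaves first, since a direct tarball install performs no dependency resolution. A hedged sketch (the build strings on the dependency tarballs are made up for illustration):

```shell
# package_A's tarball contains only package_A, so pull the whole tree
# and install dependencies before the packages that require them:
conda install /libs/package_M-4.0.4-py36_0.tar.bz2   # needed by package_Y
conda install /libs/package_X-2.0.2-py36_0.tar.bz2   # needed by package_A
conda install /libs/package_Y-3.0.3-py36_0.tar.bz2   # needed by package_A
conda install /libs/package_A-1.0.1-py36hafb9ca4_1.tar.bz2
```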
I run the following code to install a package using pip (in this case from GitHub) on a server and on my local machine using Conda to handle my environments:
conda activate base
conda env remove --name test-phonetic
conda create --name test-phonetic python=3.8 -y &&
conda activate test-phonetic &&
python -m pip install --upgrade pip &&
# remember to set your GIT_TOKEN
python -m pip install -e git+https://${GIT_TOKEN}@github.com/username/phonetic-transcription.git@feature/transcriptor-class#egg=phonetic-transcription # from branch "feature/transcriptor-class"
I receive the following output from pip when running on the server:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
jupyter-client 7.3.1 requires entrypoints, which is not installed.
jupyter-client 7.3.1 requires jupyter-core>=4.9.2, which is not installed.
jupyter-client 7.3.1 requires pyzmq>=22.3, which is not installed.
jupyter-client 7.3.1 requires tornado>=6.0, which is not installed.
jupyter-client 7.3.1 requires traitlets, which is not installed.
I receive no error message when installing on my local machine.
Another user receives the following output when installing on the server via the same commands (also using conda for environment management):
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
bokeh 2.4.2 requires Jinja2>=2.9, which is not installed.
bokeh 2.4.2 requires numpy>=1.11.3, which is not installed.
bokeh 2.4.2 requires tornado>=5.1, which is not installed.
Why do we receive different "error" messages in each case given that Conda environments are theoretically distinct and do not see each other?
At first I thought Conda was using versions of packages from other environments, and I considered using the --copy flag to install all packages as copies instead of the hard or soft links that Conda presumably uses across environments to save space; but that turns out to be an option of conda install, not pip install.
Why are the errors inconsistent when running the same code on different machines and how can I resolve these "errors"?
The question "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed" has no answers, and its comments suggest using a clean environment to resolve the issue, but I have already done this. It is a similar story for "cannot resolve urllib3 version issue".
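One way to narrow this down (my own suggestion, not something from the linked questions) is to ask pip itself which installed distributions have unmet requirements, and then reinstall exactly those:

```shell
# List every installed distribution whose declared requirements are unmet
python -m pip check

# Then reinstall the missing pieces it reports, e.g. for the
# jupyter-client conflicts shown above:
python -m pip install entrypoints "jupyter-core>=4.9.2" "pyzmq>=22.3" \
    "tornado>=6.0" traitlets
```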
I would like to build a Conda package for my project. However, one of its dependencies is pip-only (not uploaded to any Conda channel). How do I include a pip-only package when using the conda-build command?
I tried using Conda skeleton to build a package from PyPI URL but it doesn't work because the file on PyPI site is a .whl file instead of a tar.gz file like in the conda skeleton tutorial. How should I solve this problem?
This is the error I got when running conda build:
conda_build.exceptions.DependencyNeedsBuildingError: Unsatisfiable dependencies for platform osx-64: {'plaidml'}
and this is the error from the skeleton build for the plaidml package, using conda skeleton pypi plaidml-keras:
Error: No source urls found for plaidml-keras
Is there a good practice of how to include the pip only package when building conda package?
I looked around in the conda-build docs, and it looks like you can build a conda package using a wheel as a dependency. From the conda-build user guide:
To build a conda package from a wheel file, install the .whl file in
the conda recipe's bld.bat or build.sh file.
You may download the .whl file in the source section of the conda
recipe's meta.yaml file.
You may instead put the URL directly in the pip install command.
EXAMPLE: The conda recipe for TensorFlow has a pip install command in
build.sh with the URL of a .whl file. The meta.yaml file does not
download or list the .whl file.
Note
It is important to pip install only the one desired package. Whenever
possible, install dependencies with conda and not pip.
We strongly recommend using the --no-deps option in the pip install
command.
If you run pip install without the --no-deps option, pip will often
install dependencies in your conda recipe and those dependencies will
become part of your package. This wastes space in the package and
increases the risk of file overlap, file clobbering, and broken
packages.
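Putting that advice together, a minimal build.sh for a recipe whose only pip-only dependency is plaidml-keras might look like the sketch below; the package name comes from the question, everything else is an assumption:

```shell
#!/bin/bash
# build.sh -- install only the pip-only package itself;
# --no-deps keeps pip from bundling its dependencies into the conda package
pip install --no-deps plaidml-keras

# Everything else should be listed under requirements: in meta.yaml,
# so that conda, not pip, installs those dependencies.
```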
I know this topic has been beat to death but I have not been able to find a solution to the problem I'm having on SO or elsewhere, so I suspect that there may be a bug somewhere in my system.
I am on an older RHEL 6 platform with Python 3.4. I am developing an application that will run on this platform that uses Qt. I've installed all of the relevant libraries via yum (e.g. qt-devel, pyqt4-devel, etc.) and now want to install my application package as an "editable" package using pip install -e mypkg. I also have a couple of dependency requirements that are not on yum and must be installed via pip.
What I would like to do is create a virtualenv that "inherits" the system packages installed via yum but allows me to pip install my own packages into a virtualenv directory in my home directory.
From my Googling it looks like the best way to do this is to create a virtual env with the system's site packages directory:
$ python3 -m venv --system-site-packages ~/venv
However, when I try to install a package to this virtualenv's site-packages directory, it attempts to install it under /usr/lib and I get a Permission denied error.
So it appears that the --system-site-packages option makes my virtualenv completely share the site-packages directory from my system instead of using it as a "base", where further packages can be layered on top.
This answer states that using pip install -I should do what I want, but that does not appear to be the case:
(venv) $ pip3 install -I bitstring
...
error: could not create '/usr/lib/python3.4/site-packages/bitstring.py': Permission denied
Create the virtual environment without the --system-site-packages switch. After the environment is created, go to the folder it was created in. It should contain a file pyvenv.cfg. Edit this file. It has (among other text) the line
include-system-site-packages = false
Change this line to:
include-system-site-packages = true
Activate the environment. Module installations will now go to the virtual environment and the system site packages are visible too.
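The same edit can be scripted. A small sketch against a hand-written pyvenv.cfg (the file below mirrors what python3 -m venv writes; the values are illustrative):

```shell
# A freshly created environment's pyvenv.cfg looks roughly like this:
cat > /tmp/pyvenv.cfg <<'EOF'
home = /usr/bin
include-system-site-packages = false
version = 3.4.3
EOF

# Flip the flag in place (GNU sed syntax); re-activating the environment
# then makes system site packages visible while new installs stay in the venv
sed -i 's/^include-system-site-packages = false$/include-system-site-packages = true/' /tmp/pyvenv.cfg

grep include-system-site-packages /tmp/pyvenv.cfg
# include-system-site-packages = true
```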
With Python 3.8, it seems --system-site-packages works as expected:
python3 -m venv --system-site-packages myProject
cat myProject/pyvenv.cfg
home = /usr/bin
include-system-site-packages = true
version = 3.8.5
After installing astroid, isort, and wrapt, I got:
pip list -v
Package Version Location Installer
---------------------- -------------------- ------------------------------------------------------- ---------
apturl 0.5.2 /usr/lib/python3/dist-packages
astroid 2.4.2 /home/to/no/MR/auto-gen/lib/python3.8/site-packages pip
isort 5.6.4 /home/to/no/MR/auto-gen/lib/python3.8/site-packages pip
jedi 0.15.2 /usr/lib/python3/dist-packages
keyring 18.0.1 /usr/lib/python3/dist-packages
wrapt 1.12.1 /home/to/no/MR/auto-gen/lib/python3.8/site-packages pip
Already installed "system" packages are taken from /usr/lib/python3/dist-packages, while locally (venv) installed packages come from /home/to/no/MR/auto-gen/lib/python3.8/site-packages.
I used pip to install the Resource module into the default conda environment on my laptop (C:\Users\my_username\Anaconda2); I think it is called root. I installed pip into the conda environment, so I'm 90% sure Resource was installed within the environment. And indeed, running conda list shows that the package is listed in the environment. Here is a section of the output:
# packages in environment at C:\Users\conna\Anaconda2:
#
qtpy 1.2.1 py27_0
requests 2.14.2 py27_0
Resource 0.2.0 <pip>
rope 0.9.4 py27_1
ruamel_yaml 0.11.14 py27_1
scandir 1.5 py27_0
scikit-image 0.13.0 np112py27_0
However when I run
conda update Resource
I get the following error:
PackageNotInstalledError: Package is not installed in prefix.
prefix: C:\Users\conna\Anaconda2
package name: Resource
How is it possible that conda list shows the module is present but conda update can't see them? I also noticed that conda update doesn't recognize any packages with <pip>. What is happening?
conda only manages the packages that are installed using a conda command. If you installed a package with pip (or using python setup.py install or develop) it will show up with conda list (because that shows all packages no matter how they were installed) but conda won't manage that package. Simply because it doesn't know how!
So if you installed a package with pip you also need to upgrade/update it with pip:
pip install [package_name] --upgrade
Try this:
pip install Resource --upgrade
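The `<pip>` marker comes from installation metadata rather than from conda's own records: pip writes an INSTALLER file into each distribution's .dist-info directory, which is how tools can tell who installed what. A small modern-Python sketch reading that file (the helper name is mine, and conda's actual detection logic may differ):

```python
from importlib import metadata
from typing import Optional

def installer_of(package: str) -> Optional[str]:
    """Return the tool recorded in the package's INSTALLER metadata
    file (e.g. 'pip'), or None if the package or file is absent."""
    try:
        dist = metadata.distribution(package)
    except metadata.PackageNotFoundError:
        return None
    text = dist.read_text("INSTALLER")  # one line, written by pip at install time
    return text.strip() if text else None

# A package that was never installed has no metadata at all
print(installer_of("surely-not-a-real-package-xyz"))  # None
```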
Can conda install be used to install source distributions (i.e., non-archived import packages that have a setup.py)?
Yes and no. You cannot conda install a source distribution per se. However, as the Conda documentation says, Conda ships with pip, so you should be able to pip install -e . your package. You can also install it with the traditional python setup.py [install|develop].
Remember to activate your Conda environment before installation if you're using one instead of site packages.
As mentioned by vaiski, you can use pip and/or setup.py to build and install the package, but this method is not ideal because packages installed with pip and conda do not respect each other's dependencies.
Thus, if the source distribution includes a conda build recipe (meta.yaml), you can create the Anaconda archive on your own machine using the conda-build tool:
$ conda build meta.yaml
Afterwards, you will have a local tar.gz of the built package, with metadata that conda can understand. This is what you download from the internet whenever you install a package using conda.
Finally, you can install the package you built locally by name, using:
$ conda install --use-local package_name