I work with conda environments and need some pip packages as well, e.g. pre-compiled wheels from ~gohlke.
At the moment I have two files: environment.yml for conda with:
# run: conda env create --file environment.yml
name: test-env
dependencies:
- python>=3.5
- anaconda
and requirements.txt for pip, which can be used after activating the above conda environment:
# run: pip install -r requirements.txt
docx
gooey
http://www.lfd.uci.edu/~gohlke/pythonlibs/bofhrmxk/opencv_python-3.1.0-cp35-none-win_amd64.whl
Is there a possibility to combine them in one file (for conda)?
Pip dependencies can be included in the environment.yml file like this (docs):
# run: conda env create --file environment.yml
name: test-env
dependencies:
- python>=3.5
- anaconda
- pip
- numpy=1.13.3 # pin version for conda
- pip:
    # works for regular pip packages
    - docx
    - gooey
    - matplotlib==2.0.0 # pin version for pip
    # and for wheels
    - http://www.lfd.uci.edu/~gohlke/pythonlibs/bofhrmxk/opencv_python-3.1.0-cp35-none-win_amd64.whl
It also works for .whl files in the same directory (see Dengar's answer) as well as with common pip packages.
One can also use the requirements.txt directly in the YAML. For example,
name: test-env
dependencies:
- python>=3.5
- anaconda
- pip
- pip:
    - -r requirements.txt
Basically, any option you can pass to pip install can also be used in the YAML. See the Advanced Pip Example for a showcase of other capabilities.
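For illustration, here is a hedged sketch of a few other pip options the nested list accepts (the extra index URL, git repository, and local path below are placeholders of my own, not from the question):
name: test-env
dependencies:
  - python>=3.5
  - pip
  - pip:
      # pass extra arguments to pip, e.g. an additional package index
      - --extra-index-url https://example.com/simple
      # install directly from a git repository
      - git+https://github.com/someuser/somepackage.git
      # editable install of a local project
      - -e ./my_local_package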
Important Note
A previous version of this answer (and Conda's Advanced Pip Example) used a non-standard file: URI syntax:
- -r file:requirements.txt
Pip v21.2.1 introduced stricter URI parsing and no longer supports this form. See this answer for details.
Just want to add that referencing a wheel in the same directory also works. I was getting this error when using the full URL:
HTTP error 404 while getting http://www.lfd.uci.edu/~gohlke/pythonlibs/f9r7rmd8/opencv_python-3.1.0-cp35-none-win_amd64.whl
I ended up downloading the wheel and saving it in the same directory as the yml file.
name: test-env
dependencies:
- python>=3.5
- anaconda
- pip
- pip:
    - opencv_python-3.1.0-cp35-none-win_amd64.whl
If you want to do it automatically, it seems that running:
conda env export > environment.yml
already captures the pip packages you need. In my case there was no need to run pip freeze > requirements4pip.txt separately, or to include it as a
- pip:
    - -r file:requirements.txt
section, as another answer has mentioned.
See my yml file:
$ cat environment.yml
name: myenv
channels:
- pytorch
- dglteam
- defaults
- conda-forge
dependencies:
- _libgcc_mutex=0.1=main
- absl-py=0.12.0=py38h06a4308_0
- aiohttp=3.7.4=py38h27cfd23_1
- async-timeout=3.0.1=py38h06a4308_0
- attrs=20.3.0=pyhd3eb1b0_0
- beautifulsoup4=4.9.3=pyha847dfd_0
- blas=1.0=mkl
- blinker=1.4=py38h06a4308_0
- brotlipy=0.7.0=py38h27cfd23_1003
- bzip2=1.0.8=h7b6447c_0
- c-ares=1.17.1=h27cfd23_0
- ca-certificates=2021.4.13=h06a4308_1
- cachetools=4.2.1=pyhd3eb1b0_0
- cairo=1.14.12=h8948797_3
- certifi=2020.12.5=py38h06a4308_0
- cffi=1.14.0=py38h2e261b9_0
- chardet=3.0.4=py38h06a4308_1003
- click=7.1.2=pyhd3eb1b0_0
- conda=4.10.1=py38h06a4308_1
- conda-build=3.21.4=py38h06a4308_0
- conda-package-handling=1.7.3=py38h27cfd23_1
- coverage=5.5=py38h27cfd23_2
- cryptography=3.4.7=py38hd23ed53_0
- cudatoolkit=11.0.221=h6bb024c_0
- cycler=0.10.0=py38_0
- cython=0.29.23=py38h2531618_0
- dbus=1.13.18=hb2f20db_0
- decorator=4.4.2=pyhd3eb1b0_0
- dgl-cuda11.0=0.6.1=py38_0
- dill=0.3.3=pyhd3eb1b0_0
- expat=2.3.0=h2531618_2
- filelock=3.0.12=pyhd3eb1b0_1
- fontconfig=2.13.1=h6c09931_0
- freetype=2.10.4=h7ca028e_0
- fribidi=1.0.10=h7b6447c_0
- gettext=0.21.0=hf68c758_0
- glib=2.66.3=h58526e2_0
- glob2=0.7=pyhd3eb1b0_0
- google-auth=1.29.0=pyhd3eb1b0_0
- google-auth-oauthlib=0.4.4=pyhd3eb1b0_0
- graphite2=1.3.14=h23475e2_0
- graphviz=2.40.1=h21bd128_2
- grpcio=1.36.1=py38h2157cd5_1
- gst-plugins-base=1.14.0=h8213a91_2
- gstreamer=1.14.0=h28cd5cc_2
- harfbuzz=1.8.8=hffaf4a1_0
- icu=58.2=he6710b0_3
- idna=2.10=pyhd3eb1b0_0
- importlib-metadata=3.10.0=py38h06a4308_0
- intel-openmp=2021.2.0=h06a4308_610
- jinja2=2.11.3=pyhd3eb1b0_0
- joblib=1.0.1=pyhd3eb1b0_0
- jpeg=9b=h024ee3a_2
- kiwisolver=1.3.1=py38h2531618_0
- lcms2=2.12=h3be6417_0
- ld_impl_linux-64=2.33.1=h53a641e_7
- libarchive=3.4.2=h62408e4_0
- libffi=3.2.1=hf484d3e_1007
- libgcc-ng=9.1.0=hdf63c60_0
- libgfortran-ng=7.3.0=hdf63c60_0
- libglib=2.66.3=hbe7bbb4_0
- libiconv=1.16=h516909a_0
- liblief=0.10.1=he6710b0_0
- libpng=1.6.37=h21135ba_2
- libprotobuf=3.14.0=h8c45485_0
- libstdcxx-ng=9.1.0=hdf63c60_0
- libtiff=4.1.0=h2733197_1
- libuuid=1.0.3=h1bed415_2
- libuv=1.40.0=h7b6447c_0
- libxcb=1.14=h7b6447c_0
- libxml2=2.9.10=hb55368b_3
- lz4-c=1.9.2=he1b5a44_3
- markdown=3.3.4=py38h06a4308_0
- markupsafe=1.1.1=py38h7b6447c_0
- matplotlib=3.3.4=py38h06a4308_0
- matplotlib-base=3.3.4=py38h62a2d02_0
- mkl=2020.2=256
- mkl-service=2.3.0=py38h1e0a361_2
- mkl_fft=1.3.0=py38h54f3939_0
- mkl_random=1.2.0=py38hc5bc63f_1
- multidict=5.1.0=py38h27cfd23_2
- ncurses=6.2=he6710b0_1
- networkx=2.5.1=pyhd3eb1b0_0
- ninja=1.10.2=hff7bd54_1
- numpy=1.19.2=py38h54aff64_0
- numpy-base=1.19.2=py38hfa32c7d_0
- oauthlib=3.1.0=py_0
- olefile=0.46=pyh9f0ad1d_1
- openssl=1.1.1k=h27cfd23_0
- pandas=1.2.4=py38h2531618_0
- pango=1.42.4=h049681c_0
- patchelf=0.12=h2531618_1
- pcre=8.44=he6710b0_0
- pillow=8.2.0=py38he98fc37_0
- pip=21.0.1=py38h06a4308_0
- pixman=0.40.0=h7b6447c_0
- pkginfo=1.7.0=py38h06a4308_0
- protobuf=3.14.0=py38h2531618_1
- psutil=5.8.0=py38h27cfd23_1
- py-lief=0.10.1=py38h403a769_0
- pyasn1=0.4.8=py_0
- pyasn1-modules=0.2.8=py_0
- pycosat=0.6.3=py38h7b6447c_1
- pycparser=2.20=py_2
- pyjwt=2.0.1=pyhd8ed1ab_1
- pyopenssl=20.0.1=pyhd3eb1b0_1
- pyparsing=2.4.7=pyhd3eb1b0_0
- pyqt=5.9.2=py38h05f1152_4
- pysocks=1.7.1=py38h06a4308_0
- python=3.8.2=hcf32534_0
- python-dateutil=2.8.1=pyhd3eb1b0_0
- python-libarchive-c=2.9=pyhd3eb1b0_1
- python_abi=3.8=1_cp38
- pytorch=1.7.1=py3.8_cuda11.0.221_cudnn8.0.5_0
- pytz=2021.1=pyhd3eb1b0_0
- pyyaml=5.4.1=py38h27cfd23_1
- qt=5.9.7=h5867ecd_1
- readline=8.1=h27cfd23_0
- requests=2.25.1=pyhd3eb1b0_0
- requests-oauthlib=1.3.0=py_0
- ripgrep=12.1.1=0
- rsa=4.7.2=pyhd3eb1b0_1
- ruamel_yaml=0.15.100=py38h27cfd23_0
- scikit-learn=0.24.1=py38ha9443f7_0
- scipy=1.6.2=py38h91f5cce_0
- setuptools=52.0.0=py38h06a4308_0
- sip=4.19.13=py38he6710b0_0
- six=1.15.0=pyh9f0ad1d_0
- soupsieve=2.2.1=pyhd3eb1b0_0
- sqlite=3.35.4=hdfb4753_0
- tensorboard=2.4.0=pyhc547734_0
- tensorboard-plugin-wit=1.6.0=py_0
- threadpoolctl=2.1.0=pyh5ca1d4c_0
- tk=8.6.10=hbc83047_0
- torchaudio=0.7.2=py38
- torchtext=0.8.1=py38
- torchvision=0.8.2=py38_cu110
- tornado=6.1=py38h27cfd23_0
- typing-extensions=3.7.4.3=0
- typing_extensions=3.7.4.3=py_0
- urllib3=1.26.4=pyhd3eb1b0_0
- werkzeug=1.0.1=pyhd3eb1b0_0
- wheel=0.36.2=pyhd3eb1b0_0
- xz=5.2.5=h7b6447c_0
- yaml=0.2.5=h7b6447c_0
- yarl=1.6.3=py38h27cfd23_0
- zipp=3.4.1=pyhd3eb1b0_0
- zlib=1.2.11=h7b6447c_3
- zstd=1.4.5=h9ceee32_0
- pip:
    - aioconsole==0.3.1
    - lark-parser==0.6.5
    - lmdb==0.94
    - pexpect==4.6.0
    - progressbar2==3.39.3
    - ptyprocess==0.7.0
    - pycapnp==1.0.0
    - python-utils==2.5.6
    - sexpdata==0.0.3
    - tqdm==4.56.0
prefix: /home/miranda9/miniconda3/envs/myenv
Note that, at the time of this writing, running conda env create --file environment.yml on this exported file results in an error:
$ conda env create --file environment.yml
CondaValueError: prefix already exists: /home/miranda9/miniconda3/envs/myenv
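If you hit that error, two simple ways around it (a sketch; the alternative name below is just an example) are to delete the prefix: line from the yml, or to override the name on the command line:
# give the new environment a different name than the one recorded in the yml
conda env create --file environment.yml --name myenv-copy
# or remove the clashing environment first and recreate it
conda env remove --name myenv
conda env create --file environment.yml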
Related
I'm trying to create an environment with conda after having installed Miniconda3 on macOS. I am using the conda env create -f SEM1-CB.yml command in the terminal, but it gives me the following error:
Collecting package metadata (repodata.json): done
Solving environment: failed
ResolvePackageNotFound:
- xz==5.2.5=h62dcd97_1
- mkl-service==2.4.0=py310h2bbff1b_0
- numpy-base==1.21.2=py310h0829f74_0
- libzlib==1.2.11=h8ffe710_1013
- vc==14.2=hb210afc_5
- lz4-c==1.9.3=h8ffe710_1
- numpy==1.21.2=py310hfca59bb_0
- jpeg==9d=h8ffe710_0
- mkl_random==1.2.2=py310h4ed8f06_0
- zlib==1.2.11=h8ffe710_1013
- libffi==3.4.2=h8ffe710_5
- mkl==2021.4.0=haa95532_640
- setuptools==58.5.3=py310h5588dad_0
- mkl_fft==1.3.1=py310ha0764ea_0
- ca-certificates==2021.10.26=haa95532_4
- libtiff==4.3.0=hd413186_2
- openssl==3.0.0=h8ffe710_2
- intel-openmp==2021.4.0=haa95532_3556
- libdeflate==1.8=h8ffe710_0
- zstd==1.5.0=h6255e5f_0
- python==3.10.0=hcf16a7b_2_cpython
- vs2015_runtime==14.29.30037=h902a5da_5
- lcms2==2.12=h2a16943_0
- lerc==3.0=h0e60522_0
- openjpeg==2.4.0=hb211442_1
- pillow==8.4.0=py310h22f3323_0
- tk==8.6.11=h8ffe710_1
- ucrt==10.0.20348.0=h57928b3_0
- freetype==2.10.4=h546665d_1
- jbig==2.1=h8d14728_2003
- libpng==1.6.37=h1d00b33_2
- sqlite==3.36.0=h8ffe710_2
- bzip2==1.0.8=h8ffe710_4
The SEM1-CB.yml file is as follows:
name: SEM1-CB
channels:
- conda-forge
- defaults
dependencies:
- blas=1.0=mkl
- bzip2=1.0.8=h8ffe710_4
- ca-certificates=2021.10.26=haa95532_4
- freetype=2.10.4=h546665d_1
- intel-openmp=2021.4.0=haa95532_3556
- jbig=2.1=h8d14728_2003
- jpeg=9d=h8ffe710_0
- lcms2=2.12=h2a16943_0
- lerc=3.0=h0e60522_0
- libdeflate=1.8=h8ffe710_0
- libffi=3.4.2=h8ffe710_5
- libpng=1.6.37=h1d00b33_2
- libtiff=4.3.0=hd413186_2
- libzlib=1.2.11=h8ffe710_1013
- lz4-c=1.9.3=h8ffe710_1
- mkl=2021.4.0=haa95532_640
- mkl-service=2.4.0=py310h2bbff1b_0
- mkl_fft=1.3.1=py310ha0764ea_0
- mkl_random=1.2.2=py310h4ed8f06_0
- numpy=1.21.2=py310hfca59bb_0
- numpy-base=1.21.2=py310h0829f74_0
- olefile=0.46=pyh9f0ad1d_1
- openjpeg=2.4.0=hb211442_1
- openssl=3.0.0=h8ffe710_2
- pillow=8.4.0=py310h22f3323_0
- pip=21.3.1=pyhd8ed1ab_0
- pysimplegui=4.55.1=pyhd8ed1ab_0
- python=3.10.0=hcf16a7b_2_cpython
- python_abi=3.10=2_cp310
- setuptools=58.5.3=py310h5588dad_0
- six=1.16.0=pyhd3eb1b0_0
- sqlite=3.36.0=h8ffe710_2
- tk=8.6.11=h8ffe710_1
- tzdata=2021e=he74cb21_0
- ucrt=10.0.20348.0=h57928b3_0
- vc=14.2=hb210afc_5
- vs2015_runtime=14.29.30037=h902a5da_5
- wheel=0.37.0=pyhd8ed1ab_1
- xz=5.2.5=h62dcd97_1
- zlib=1.2.11=h8ffe710_1013
- zstd=1.5.0=h6255e5f_0
- pip:
    - click==8.0.3
    - colorama==0.4.4
    - flask==2.0.2
    - itsdangerous==2.0.1
    - jinja2==3.0.3
    - markupsafe==2.0.1
    - pymata4==1.15
    - pyserial==3.5
    - werkzeug==2.0.2
prefix:
I need to create and activate this environment in order to add it as an interpreter in Visual Studio Code. I have tried the procedure on Microsoft Windows and it works, but I need it to work on macOS as well.
Thank you in advance for any help!
In the SEM1-CB.yml file you have Windows-specific conda package builds specified, which aren't available in the macOS (osx-64) index, hence conda ends up showing the ResolvePackageNotFound error. You can verify this yourself by checking the available builds at https://anaconda.org/conda-forge/xz/files.
To fix this, export your conda environment file on Windows with conda env export --from-history --name env_name > file_name.yml, or manually remove the build strings from the file.
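As a rough sketch of that workflow (the environment name is taken from the yml in the question; --from-history only records the packages you explicitly asked for):
# on Windows: export without platform-specific build strings
conda env export --from-history --name SEM1-CB > SEM1-CB.yml
# on macOS: recreate the environment from the portable file
conda env create -f SEM1-CB.yml
Note that --from-history does not capture pip-installed packages, so the pip: section (flask, pymata4, etc.) may need to be re-added to the exported file by hand.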
I have conda installed locally on my Windows PC and also installed remotely on a Linux server. I already have conda packages installed locally on my Windows PC, and I want to install the same packages on the Linux server. I have already tried the following steps:
Create a requirements.txt file containing the currently installed packages and their versions, using the Anaconda Prompt on my Windows PC with the command conda list -e > requirements.txt.
Transfer this requirements.txt file to my Linux server.
Install these packages in my conda base environment using the command conda install --yes --file requirements.txt.
However, I get the following error message on my Linux server when I try to complete step 3:
Solving environment: failed
PackagesNotFoundError: The following packages are not available from current channels:
- ca-certificates==2020.12.8=haa95532_0
- m2w64-gcc-libs-core==5.3.0=7
- audioread==2.1.8=pypi_0
- pywin32-ctypes==0.2.0=py38_1000
- notebook==6.1.5=py38haa95532_0
- librosa==0.8.0=pypi_0
- numpy-base==1.19.2=py38ha3acd2a_0
- psutil==5.7.2=py38he774522_0
- numpy==1.19.2=py38hadc3359_0
- regex==2020.11.13=py38h2bbff1b_0
- spyder-kernels==1.10.0=py38haa95532_0
- appdirs==1.4.4=pypi_0
- ujson==4.0.1=py38ha925a31_0
- setuptools==51.0.0=py38haa95532_2
- sklearn-crfsuite==0.3.6=pypi_0
- pywinpty==0.5.7=py38_0
- m2w64-gmp==6.1.0=2
- pyyaml==5.3.1=py38he774522_1
- bzip2==1.0.8=he774522_0
- sounddevice==0.4.1=pypi_0
- certifi==2020.12.5=py38haa95532_0
- gpytorch==1.3.0=pypi_0
- winpty==0.4.3=4
- pyzmq==20.0.0=py38hd77b12b_1
- pytorch==1.7.1=py3.8_cpu_0
- lazy-object-proxy==1.4.3=py38h2bbff1b_2
- zeromq==4.3.3=ha925a31_3
- ipython==7.19.0=py38hd4e2768_0
- mkl_fft==1.2.0=py38h45dec08_0
- conda-package-handling==1.7.2=py38h76e460a_0
- vc==14.2=h21ff451_1
- cpuonly==1.0=0
- pip==20.3.1=py38haa95532_0
- tornado==6.1=py38h2bbff1b_0
- libarchive==3.4.2=h5e25573_0
- msys2-conda-epoch==20160418=1
- pandocfilters==1.4.3=py38haa95532_1
- scikit-learn==0.23.2=pypi_0
- torchaudio==0.7.2=py38
- soundfile==0.10.3.post1=pypi_0
- gsl==2.4=hfa6e2cd_4
- kiwisolver==1.3.0=py38hd77b12b_0
- argon2-cffi==20.1.0=py38he774522_1
- dataclasses==0.6=pypi_0
- libtiff==4.1.0=h56a325e_1
- torchvision==0.8.2=py38_cpu
- m2w64-libwinpthread-git==5.0.0.4634.697f757=2
- numba==0.51.2=pypi_0
- pooch==1.2.0=pypi_0
- cvxopt==1.2.0=py38hdc3235a_0
- tabulate==0.8.7=pypi_0
- pillow==8.0.1=py38h4fa10fc_0
- libpng==1.6.37=h2a8f88b_0
- libiconv==1.15=h1df5818_7
- rtree==0.9.4=py38h21ff451_1
- qt==5.9.7=vc14h73c81de_0
- ruamel_yaml==0.15.87=py38he774522_1
- libsodium==1.0.18=h62dcd97_0
- yaml==0.2.5=he774522_0
- m2w64-gcc-libs==5.3.0=7
- libspatialindex==1.9.3=h33f27b4_0
- jedi==0.17.2=py38haa95532_1
- tk==8.6.10=he774522_0
- six==1.15.0=py38haa95532_0
- python-crfsuite==0.9.7=pypi_0
- spyder==4.2.0=py38haa95532_0
- cffi==1.14.4=py38hcd4344a_0
- xz==5.2.5=h62dcd97_0
- console_shortcut==0.1.1=4
- sqlite==3.33.0=h2a8f88b_0
- pycosat==0.6.3=py38h2bbff1b_0
- pyrsistent==0.17.3=py38he774522_0
- markupsafe==1.1.1=py38he774522_0
- bcrypt==3.2.0=py38he774522_0
- libuv==1.40.0=he774522_0
- brotlipy==0.7.0=py38h2bbff1b_1003
- mistune==0.8.4=py38he774522_1000
- wrapt==1.11.2=py38he774522_0
- powershell_shortcut==0.0.1=3
- mkl-service==2.3.0=py38h196d8e1_0
- pysocks==1.7.1=py38haa95532_0
- typeguard==2.10.0=pypi_0
- jpeg==9b=hb83a4c4_2
- libxml2==2.9.10=hb89e7f3_3
- freetype==2.10.4=hd328e21_0
- python==3.8.5=h5fd99cc_1
- liblief==0.10.1=ha925a31_0
- sip==4.19.13=py38ha925a31_0
- scipy==1.5.4=pypi_0
- pywin32==227=py38he774522_1
- nltk==3.5=pypi_0
- py-lief==0.10.1=py38ha925a31_0
- threadpoolctl==2.1.0=pypi_0
- zlib==1.2.11=h62dcd97_4
- cudatoolkit==10.2.89=h74a9793_1
- zstd==1.4.5=h04227a9_0
- mkl_random==1.1.1=py38h47e9c7a_0
- glpk==4.65=hdc00fd2_2
- ninja==1.10.2=py38h6d14046_0
- joblib==0.17.0=pypi_0
- typed-ast==1.4.1=py38he774522_0
- pandas==1.1.3=py38ha925a31_0
- llvmlite==0.34.0=pypi_0
- resampy==0.2.2=pypi_0
- pynacl==1.4.0=py38h62dcd97_1
- vs2015_runtime==14.27.29016=h5e58377_2
- icu==58.2=ha925a31_3
- matplotlib-base==3.3.2=py38hba9282a_0
- menuinst==1.4.16=py38he774522_1
- pyqt==5.9.2=py38ha925a31_4
- cryptography==3.3.1=py38hcd4344a_0
- jupyter_core==4.7.0=py38haa95532_0
- ax-platform==0.1.19=pypi_0
- botorch==0.3.3=pypi_0
- win_inet_pton==1.1.0=py38haa95532_0
- pkginfo==1.6.1=py38haa95532_0
- openssl==1.1.1i=h2bbff1b_0
- wincertstore==0.2=py38_0
- matplotlib==3.3.2=pypi_0
- lz4-c==1.9.2=hf4a77e7_3
- pandoc==2.11=h9490d1a_0
- conda==4.9.2=py38haa95532_0
- ad3==2.3.dev0=pypi_0
- retrying==1.3.3=pypi_0
- plotly==4.14.1=pypi_0
- m2w64-gcc-libgfortran==5.3.0=6
- watchdog==0.10.4=py38haa95532_0
- chardet==3.0.4=py38haa95532_1003
Current channels:
- https://repo.anaconda.com/pkgs/main/linux-64
- https://repo.anaconda.com/pkgs/main/noarch
- https://repo.anaconda.com/pkgs/free/linux-64
- https://repo.anaconda.com/pkgs/free/noarch
- https://repo.anaconda.com/pkgs/r/linux-64
- https://repo.anaconda.com/pkgs/r/noarch
- https://repo.anaconda.com/pkgs/pro/linux-64
- https://repo.anaconda.com/pkgs/pro/noarch
To search for alternate channels that may provide the conda package you're
looking for, navigate to
https://anaconda.org
and use the search bar at the top of the page.
I am aware that the problem is that I am not looking in the correct conda channels, as the error message suggests, but I am not sure how to solve this problem.
Thanks for the help.
The official documentation has some additional steps for getting an environment to work cross-platform; here is the link to that.
However, if you are not using packages that are only available from Anaconda channels, you can do the following.
Have pip in both of your conda environments (on Windows and on the Linux server).
Make the requirements.txt file using pip freeze instead of conda's equivalent. From the Windows machine:
$ pip freeze > requirements.txt
Install the packages on the Linux server normally with pip:
$ pip install -r requirements.txt
I am not saying that this is the best option, but this way is usually easier and also works with other environment management tools like pyenv.
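If you do rely on conda-only packages, one possible middle ground (my own suggestion, not part of the answer above) is to export the environment without the Windows-specific build strings and recreate it from that file on the server:
# on the Windows PC
conda env export --no-builds > environment.yml
# on the Linux server (remove or edit the prefix: line first, if present)
conda env create -f environment.yml
Exact version pins can still fail to resolve on linux-64 when a package has no Linux build at that version; loosening the pins or exporting with --from-history is the usual next step.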
I would like to know how to install Python libraries from a yml file without making a new environment. I already have a tensorflow environment in conda, and I want to install the list of libraries below into that tensorflow environment. The only way I know is to add each of these libraries manually, but the list is too long to do by hand. Please give me a solution for this.
This is the yml file:
name: virtual_platform
channels:
- menpo
- conda-forge
- peterjc123
- defaults
dependencies:
- ffmpeg=3.2.4=1
- freetype=2.7=vc14_1
- imageio=2.2.0=py35_0
- libtiff=4.0.6=vc14_7
- olefile=0.44=py35_0
- pillow=4.2.1=py35_0
- vc=14=0
- alabaster=0.7.10=py35_0
- astroid=1.5.3=py35_0
- babel=2.5.0=py35_0
- bleach=1.5.0=py35_0
- certifi=2016.2.28=py35_0
- cffi=1.10.0=py35_0
- chardet=3.0.4=py35_0
- colorama=0.3.9=py35_0
- decorator=4.1.2=py35_0
- docutils=0.14=py35_0
- entrypoints=0.2.3=py35_0
- html5lib=0.9999999=py35_0
- icu=57.1=vc14_0
- imagesize=0.7.1=py35_0
- ipykernel=4.6.1=py35_0
- ipython=6.1.0=py35_0
- ipython_genutils=0.2.0=py35_0
- isort=4.2.15=py35_0
- jedi=0.10.2=py35_2
- jinja2=2.9.6=py35_0
- jpeg=9b=vc14_0
- jsonschema=2.6.0=py35_0
- jupyter_client=5.1.0=py35_0
- jupyter_core=4.3.0=py35_0
- lazy-object-proxy=1.3.1=py35_0
- libpng=1.6.30=vc14_1
- markupsafe=1.0=py35_0
- mistune=0.7.4=py35_0
- mkl=2017.0.3=0
- nbconvert=5.2.1=py35_0
- nbformat=4.4.0=py35_0
- numpy=1.13.1=py35_0
- numpydoc=0.7.0=py35_0
- openssl=1.0.2l=vc14_0
- pandocfilters=1.4.2=py35_0
- path.py=10.3.1=py35_0
- pickleshare=0.7.4=py35_0
- pip=9.0.1=py35_1
- prompt_toolkit=1.0.15=py35_0
- psutil=5.2.2=py35_0
- pycodestyle=2.3.1=py35_0
- pycparser=2.18=py35_0
- pyflakes=1.6.0=py35_0
- pygments=2.2.0=py35_0
- pylint=1.7.2=py35_0
- pyqt=5.6.0=py35_2
- python=3.5.4=0
- python-dateutil=2.6.1=py35_0
- pytz=2017.2=py35_0
- pyzmq=16.0.2=py35_0
- qt=5.6.2=vc14_6
- qtawesome=0.4.4=py35_0
- qtconsole=4.3.1=py35_0
- qtpy=1.3.1=py35_0
- requests=2.14.2=py35_0
- rope=0.9.4=py35_1
- setuptools=36.4.0=py35_1
- simplegeneric=0.8.1=py35_1
- singledispatch=3.4.0.3=py35_0
- sip=4.18=py35_0
- six=1.10.0=py35_1
- snowballstemmer=1.2.1=py35_0
- sphinx=1.6.3=py35_0
- sphinxcontrib=1.0=py35_0
- sphinxcontrib-websupport=1.0.1=py35_0
- spyder=3.2.3=py35_0
- testpath=0.3.1=py35_0
- tornado=4.5.2=py35_0
- traitlets=4.3.2=py35_0
- vs2015_runtime=14.0.25420=0
- wcwidth=0.1.7=py35_0
- wheel=0.29.0=py35_0
- win_unicode_console=0.5=py35_0
- wincertstore=0.2=py35_0
- wrapt=1.10.11=py35_0
- zlib=1.2.11=vc14_0
- opencv3=3.1.0=py35_0
- pytorch=0.1.12=py35_0.1.12cu80
- torch==0.1.12
- torchvision==0.1.9
- pip:
    - ipython-genutils==0.2.0
    - jupyter-client==5.1.0
    - jupyter-core==4.3.0
    - prompt-toolkit==1.0.15
    - pyyaml==3.12
    - rope-py3k==0.9.4.post1
    - torch==0.1.12
    - torchvision==0.1.9
    - win-unicode-console==0.5
You can use the conda env update command:
conda env update --name <your env name> -f <your file>.yml
or, if the environment you want to update is already activated, then
conda env update -f <your file>.yml
If you want to create the environment from your yml file:
conda env create -f environment.yml
The name of your environment is virtual_platform. If you want another name, just edit the name field in your yml to the desired name.
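Applied to the situation in the question, that would look roughly like this ("tensorflow" being the name of the already-existing environment and virtual_platform.yml the file shown above):
conda env update --name tensorflow -f virtual_platform.yml
Be aware that this installs the yml's dependencies into the existing environment, so hard pins such as python=3.5.4 may force conda to change packages that are already installed there.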
It is not recommended to install packages into your base environment, but if that is what you want (and I believe you should not), you need to create a requirements.txt from the dependencies listed in your yml.
Copy and paste all the dependency packages and their versions into requirements.txt as:
python ==3.5
ffmpeg=3.2.4
freetype=2.7
imageio=2.2.0
...
Then do:
conda install --yes --file requirements.txt
The problem is that this will fail if any dependency fails to install. So I would recommend installing from the yml, which means having an environment separate from the rest.
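If copying that list by hand is too tedious, a rough shell sketch like the following (my own suggestion, assuming the usual name=version=build layout of exported conda dependencies) can generate the file; pip-style entries with == are deliberately skipped:
# keep only conda entries, drop the leading '- ' and the trailing build string
grep -E '^[[:space:]]*- [A-Za-z0-9_.-]+=[^=]+=[^=]+$' virtual_platform.yml \
  | sed -E 's/^[[:space:]]*- //; s/=[^=]+$//' > requirements.txt
conda install --yes --file requirements.txt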
I just need to import an Anaconda .yml environment file into a virtualenv virtual environment.
The reason I need to do this is that I cannot install and run the Anaconda distribution on the NVIDIA Jetson TX2 developer board (it is not compatible with the ARM architecture). Virtualenv and Jupyter, on the other hand, install and run flawlessly.
The .yml file is listed like this:
name: tfdeeplearning
channels:
- defaults
dependencies:
- bleach=1.5.0=py35_0
- certifi=2016.2.28=py35_0
- colorama=0.3.9=py35_0
- cycler=0.10.0=py35_0
- decorator=4.1.2=py35_0
- entrypoints=0.2.3=py35_0
- html5lib=0.9999999=py35_0
- icu=57.1=vc14_0
- ipykernel=4.6.1=py35_0
- ipython=6.1.0=py35_0
- ipython_genutils=0.2.0=py35_0
- ipywidgets=6.0.0=py35_0
- jedi=0.10.2=py35_2
- jinja2=2.9.6=py35_0
- jpeg=9b=vc14_0
- jsonschema=2.6.0=py35_0
- jupyter=1.0.0=py35_3
- jupyter_client=5.1.0=py35_0
- jupyter_console=5.2.0=py35_0
- jupyter_core=4.3.0=py35_0
- libpng=1.6.30=vc14_1
- markupsafe=1.0=py35_0
- matplotlib=2.0.2=np113py35_0
- mistune=0.7.4=py35_0
- mkl=2017.0.3=0
- nbconvert=5.2.1=py35_0
- nbformat=4.4.0=py35_0
- notebook=5.0.0=py35_0
- numpy=1.13.1=py35_0
- openssl=1.0.2l=vc14_0
- pandas=0.20.3=py35_0
- pandocfilters=1.4.2=py35_0
- path.py=10.3.1=py35_0
- pickleshare=0.7.4=py35_0
- pip=9.0.1=py35_1
- prompt_toolkit=1.0.15=py35_0
- pygments=2.2.0=py35_0
- pyparsing=2.2.0=py35_0
- pyqt=5.6.0=py35_2
- python=3.5.4=0
- python-dateutil=2.6.1=py35_0
- pytz=2017.2=py35_0
- pyzmq=16.0.2=py35_0
- qt=5.6.2=vc14_6
- qtconsole=4.3.1=py35_0
- requests=2.14.2=py35_0
- scikit-learn=0.19.0=np113py35_0
- scipy=0.19.1=np113py35_0
- setuptools=36.4.0=py35_1
- simplegeneric=0.8.1=py35_1
- sip=4.18=py35_0
- six=1.10.0=py35_1
- testpath=0.3.1=py35_0
- tk=8.5.18=vc14_0
- tornado=4.5.2=py35_0
- traitlets=4.3.2=py35_0
- vs2015_runtime=14.0.25420=0
- wcwidth=0.1.7=py35_0
- wheel=0.29.0=py35_0
- widgetsnbextension=3.0.2=py35_0
- win_unicode_console=0.5=py35_0
- wincertstore=0.2=py35_0
- zlib=1.2.11=vc14_0
- pip:
    - ipython-genutils==0.2.0
    - jupyter-client==5.1.0
    - jupyter-console==5.2.0
    - jupyter-core==4.3.0
    - markdown==2.6.9
    - prompt-toolkit==1.0.15
    - protobuf==3.4.0
    - tensorflow==1.3.0
    - tensorflow-tensorboard==0.1.6
    - werkzeug==0.12.2
    - win-unicode-console==0.5
prefix: C:\Users\Marcial\Anaconda3\envs\tfdeeplearning
pip can install from a requirements.txt file, which would look like the items in the sequence under the pip key of your .yml file, but without the dashes:
ipython-genutils==0.2.0
jupyter-client==5.1.0
jupyter-console==5.2.0
jupyter-core==4.3.0
markdown==2.6.9
prompt-toolkit==1.0.15
protobuf==3.4.0
tensorflow==1.3.0
tensorflow-tensorboard==0.1.6
werkzeug==0.12.2
win-unicode-console==0.5
Assuming that the end of your file actually looks like:
.
.
.
- wincertstore=0.2=py35_0
- zlib=1.2.11=vc14_0
- pip:
    - ipython-genutils==0.2.0
    - jupyter-client==5.1.0
    - jupyter-console==5.2.0
    - jupyter-core==4.3.0
    - markdown==2.6.9
    - prompt-toolkit==1.0.15
    - protobuf==3.4.0
    - tensorflow==1.3.0
    - tensorflow-tensorboard==0.1.6
    - werkzeug==0.12.2
    - win-unicode-console==0.5
prefix: C:\Users\Marcial\Anaconda3\envs\tfdeeplearning
(i.e. the entries under pip: are indented so that this is a valid YAML file), and is named anaconda-project.yml, you can do:
import ruamel.yaml

yaml = ruamel.yaml.YAML()
data = yaml.load(open('anaconda-project.yml'))

requirements = []
for dep in data['dependencies']:
    if isinstance(dep, str):
        # conda entry of the form name=version=build
        package, package_version, python_version = dep.split('=')
        if python_version == '0':
            continue
        requirements.append(package + '==' + package_version)
    elif isinstance(dep, dict):
        # the nested pip: list, already in pip's own format
        for preq in dep.get('pip', []):
            requirements.append(preq)

with open('requirements.txt', 'w') as fp:
    for requirement in requirements:
        print(requirement, file=fp)
resulting in a requirements.txt file, which can be used with:
pip install -r requirements.txt
Please note:
the non-pip packages might not be available from PyPI
the current pip version is 18.1, the one in that requirements list is old
that, according to the official YAML FAQ, the recommended extension for YAML files is .yaml; .yml should only be used if your filesystem or tooling restricts you to three-character extensions, which on modern filesystems is never the case. I don't know if Anaconda is, as so often, non-conformant, or whether you have a choice in the matter.
since the introduction of binary wheels a few years ago, with many packages supporting them, it is often (and for me always) possible to just use virtualenvs and pip, thereby circumventing the problems caused by Anaconda not being 100% compliant and not being up to date with all its packages (compared to PyPI).
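For completeness, a minimal sketch of then using the generated requirements.txt on the Jetson (the environment name and path are placeholders):
# create and activate a virtualenv with the system Python 3
virtualenv -p python3 ~/envs/tfdeeplearning
source ~/envs/tfdeeplearning/bin/activate
# install whatever translated cleanly from the conda file
pip install -r requirements.txt
Keep in mind that some pinned entries (e.g. tensorflow==1.3.0) have no pre-built ARM wheels on PyPI, so on the TX2 they will need a platform-specific wheel or a relaxed version.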