How to add local tar.gz package in conda meta.yaml requirement? - python

I have a meta.yaml recipe for conda, to build a package (we will call it mypackage).
I want this package to use a local (tar.bz2) file in its requirements section (build & run) (we will call it locapackagedep).
Here is an example of what I would like to do:
requirements:
  build:
    - setuptools
    - wheel
    - nodejs=16
    - yarn
    - jupyterlab
    - /my/path/to/locapackagedep
    - ipympl
  host:
    - python {{ python }}
  run:
    - python=3.8
    - jupyterlab
    - locapackagedep
I can't find any documentation on this.

I believe one needs to specify the package by name, and use the -c flag to indicate a local path that contains the build.
Something like:
requirements:
  build:
    - setuptools
    - wheel
    - nodejs=16
    - yarn
    - jupyterlab
    - locapackagedep
    - ipympl
  host:
    - python {{ python }}
  run:
    - python=3.8
    - jupyterlab
    - locapackagedep
and
conda build -c file://my/path/to .
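Note that for the -c flag to resolve anything, the local path has to be a real conda channel, i.e. a directory of built packages with channel metadata, not a source tree. A hedged sketch of that workflow (all paths are placeholders):

# 1. Build the dependency first; conda-build writes the .tar.bz2 into its output folder
conda build /my/path/to/locapackagedep-recipe

# 2. Index the directory holding the built packages so it becomes a channel (writes repodata.json)
conda index /my/path/to/local-channel

# 3. Build mypackage, letting the solver pull locapackagedep from the local channel
conda build -c file:///my/path/to/local-channel .

Since conda-build indexes its own output directory, conda build -c local . usually also finds a dependency that was built earlier on the same machine.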

Related

Can't find NVidia cudatoolkit package when creating environment from .yml

Hopefully someone can help me; I've tried to search the internet for a solution but can't seem to find one.
I'm trying to create an environment using conda env create -n deltaconv -f environment.yml and somehow I'm getting this response from conda:
[Folder]>conda env create -n deltaconv -f environment.yml
Collecting package metadata (repodata.json): done
Solving environment: failed
ResolvePackageNotFound:
- nvidia::cudatoolkit=11.3
I've just freshly installed Miniconda for the task and the environment looks like this:
channels:
  - nvidia
  - pytorch
  - pyg
dependencies:
  - pip=21.2.4
  - python=3.9.12
  - setuptools=52.0.0
  - wheel=0.36.2
  - protobuf~=3.19.0
  - nvidia::cudatoolkit=11.3
  - pytorch::pytorch=1.11.0
  - pytorch::torchvision
  - pytorch::torchaudio
  - pyg::pyg=2.0.4
  - pip:
      - numpy==1.21.5
      - progressbar2==4.0.0
      - tensorboard==2.8.0
      - jupyter==1.0.0
      - openmesh==1.2.1
      - h5py==3.6.0
      - pytest==7.1.2
      - deltaconv==1.0.0
Does anyone know why conda is unable to find cudatoolkit 11.3 from the nvidia channel?
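One hedged way to narrow this down (a suggestion, not from the original post) is to ask conda directly what the nvidia channel publishes for your platform:

# List every cudatoolkit build the nvidia channel offers for this machine
conda search "nvidia::cudatoolkit"

# Inspect the specific pin, including which platform subdirs it exists for
conda search "nvidia::cudatoolkit=11.3" --info

If nothing comes back for your platform, the pin simply isn't available there and a different version or channel would be needed.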

Conda can't install packages from requirements.txt available in conda-forge, although package exists in conda-forge

I added conda-forge to the conda channels:
$ conda config --show channels
channels:
  - conda-forge
  - defaults
my requirements.txt contains, among others, these lines:
ipython-genutils==0.2.0
jupyter-client==6.1.12
jupyterlab-pygments==0.1.2
appnope==0.1.2
jupyterlab-widgets==1.0.0
data==0.4
prometheus-client==0.11.0
latex==0.7.0
scipy==1.5.4
jupyter-core==4.7.1
jupyter-console==6.4.0
async-generator==1.10
vg==1.10.0
sklearn==0.0
postgis==1.0.4
When I try to create a new environment from this requirements.txt using conda with
conda create --name myenv --file requirements.txt
I get the following errors:
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: failed
PackagesNotFoundError: The following packages are not available from current channels:
- ipython-genutils==0.2.0
- jupyter-client==6.1.12
- jupyterlab-pygments==0.1.2
- appnope==0.1.2
- jupyterlab-widgets==1.0.0
- data==0.4
- prometheus-client==0.11.0
- latex==0.7.0
- scipy==1.5.4
- jupyter-core==4.7.1
- jupyter-console==6.4.0
- async-generator==1.10
- vg==1.10.0
- sklearn==0.0
- postgis==1.0.4
Current channels:
- https://conda.anaconda.org/conda-forge/linux-64
- https://conda.anaconda.org/conda-forge/noarch
- https://repo.anaconda.com/pkgs/main/linux-64
- https://repo.anaconda.com/pkgs/main/noarch
- https://repo.anaconda.com/pkgs/r/linux-64
- https://repo.anaconda.com/pkgs/r/noarch
To search for alternate channels that may provide the conda package you're
looking for, navigate to
https://anaconda.org
and use the search bar at the top of the page.
As you can see, conda-forge is listed under "current channels" and ipython-genutils==0.2.0 is available in conda-forge. However, the package is not found. How can I fix this problem?
I tried both conda config --set channel_priority flexible and ... stable
I run Ubuntu 20.04 LTS, Python 3.10 and Conda 4.12.0
It looks to me like this should have been a requirements.txt to be used by pip. Note that conda packages can have slightly different names than what is available on pypi.
ipython-genutils is not the correct name: looking at the link you have provided, the package is called ipython_genutils, with an underscore. The same is true for the other packages that you have written with a hyphen; they should all be spelled with an underscore.
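If the list is long, a quick (and admittedly blunt) way to do that renaming is a search-and-replace over the file; this is only safe here because none of the remaining conda package names legitimately keep a hyphen:

# Write a conda-friendly copy with hyphens turned into underscores (keep the original for pip)
sed 's/-/_/g' requirements.txt > conda-requirements.txt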
That leaves
- sklearn==0.0
- latex==0.7.0
- vg==1.10.0
- scipy==1.5.4
- postgis==1.0.4
- data==0.4
- appnope==0.1.2
sklearn==0.0 seems to be a corrupt line in your file; the package's actual name is scikit-learn. latex, vg and data are not available on conda channels as far as I can tell. The same goes for scipy==1.5.4; only 1.5.3 and 1.6 are available. postgis on conda-forge only goes back to 2.4.3 (see here), and it also seems to be different from what is available on PyPI. appnope is a package only available for macOS, see its description:
Simple package for disabling App Nap on macOS >= 10.9, which can be problematic.
So with that in mind, we can create a yml file that installs from both conda channels and from pip. Changes to your file: replaced - with _, removed appnope, added a pip dependency, renamed sklearn to scikit-learn and moved it, together with latex, scipy, vg, data and postgis, into the pip requirements. If you are flexible about scipy==1.5.4, I would advise changing it to scipy==1.5.3 or scipy==1.6.0 and moving scipy and scikit-learn out of the pip-installed packages:
name: myenv
dependencies:
  - ipython_genutils==0.2.0
  - jupyter_client==6.1.12
  - jupyterlab_pygments==0.1.2
  - jupyterlab_widgets==1.0.0
  - prometheus_client==0.11.0
  - jupyter_core==4.7.1
  - jupyter_console==6.4.0
  - async_generator==1.10
  - pip
  - pip:
      - scikit-learn
      - latex==0.7.0
      - scipy==1.5.4
      - vg==1.10.0
      - data==0.4.0
      - postgis==1.0.4
Save this as environment.yml and then do
conda env create -f environment.yml
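One hedged addition to the file above: if conda-forge is only configured globally on your machine, it can be worth pinning the channels inside the yml as well, so the environment resolves the same way elsewhere. environment.yml supports a channels: key for exactly this:

name: myenv
channels:
  - conda-forge
  - defaults
dependencies:
  # ... same list as above ...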

How to specify channel-specific package dependencies in conda package meta.yaml? [duplicate]

I am writing a package for conda-forge and needed to specify a dependency on another conda-forge package. Essentially, I need to install a pinned version of the conda-forge gdal package, because it actually compiles the version of libtiff that supports BIGTIFF files.
Now if I was installing gdal into a conda environment, I would write something like.
conda install -c conda-forge gdal=2.4.4
I would like to get this version of gdal=2.4.4 from conda-forge installed when installing the package. In the meta.yaml file, I can specify package dependencies like so, but I did not see how to specify a URL to a tar file, or whatever else would work.
{% set version = "0.0.1" %}

package:
  name: mypackage
  version: {{ version }}

source:
  url: https://github.com/myrepo/{{ version }}.tar.gz
  sha256: ****6a63

build:
  number: 1
  skip: true  # [win and py27]
  entry_points:
    - mycli = mypackage.main:main

requirements:
  build:
    - python
    -
  host:
    - python
    - pip
    - numpy
    - gdal  # <----- want to specify from conda-forge
  run:
    - python
    - gdal  # <----- want to specify from conda-forge
Any suggestions about how to do this would be appreciated.
I don't think it's possible to specify the channel in meta.yaml. The following issue is still unresolved in the conda-build issue tracker:
https://github.com/conda/conda-build/issues/532
As a workaround, if you know the exact version of gdal that you need, you can specify the exact version and "build string" in the recipe.
The only annoying thing is that you'll have to list gdal once for every combination of platform and python version your recipe needs to support.
requirements:
  build:
    - python
    -
  host:
    - python
    - pip
    - numpy
    - gdal 2.4.4 py36h02fde04_1  # [osx and py==36]
    - gdal 2.4.4 py37h622575a_1  # [osx and py==37]
    - gdal 2.4.4 py38h57202bd_1  # [osx and py==38]
    - gdal 2.4.4 py36hbb8311d_1  # [linux and py==36]
    - gdal 2.4.4 py37hf8c3989_1  # [linux and py==37]
    - gdal 2.4.4 py38hfe926b7_1  # [linux and py==38]
  run:
    - python
    - gdal 2.4.4 py36h02fde04_1  # [osx and py==36]
    - gdal 2.4.4 py37h622575a_1  # [osx and py==37]
    - gdal 2.4.4 py38h57202bd_1  # [osx and py==38]
    - gdal 2.4.4 py36hbb8311d_1  # [linux and py==36]
    - gdal 2.4.4 py37hf8c3989_1  # [linux and py==37]
    - gdal 2.4.4 py38hfe926b7_1  # [linux and py==38]
(I copied those from the gdal package listing on the conda-forge channel.)
BTW, since you mentioned that the really important difference for you is libtiff, should you be pinning libtiff instead of gdal? Or maybe both?
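For example, something along these lines; the exact bound is a guess on my part (BigTIFF support came with the libtiff 4.x series), so check which libtiff builds conda-forge actually links that gdal against:

requirements:
  run:
    - python
    - gdal 2.4.4
    - libtiff >=4.0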
Edit:
It would be nice to avoid repeating the whole list of build strings in the host and run sections.
As you suggested in the comments, one option is to define the build string in conda_build_config.yaml:
# conda_build_config.yaml
gdal_build:
  - py36h02fde04_1  # [osx and py==36]
  - py37h622575a_1  # [osx and py==37]
  - py38h57202bd_1  # [osx and py==38]
  - py36hbb8311d_1  # [linux and py==36]
  - py37hf8c3989_1  # [linux and py==37]
  - py38hfe926b7_1  # [linux and py==38]
# meta.yaml
requirements:
  build:
    - python
    -
  host:
    - python
    - pip
    - numpy
    - gdal 2.4.4 {{ gdal_build }}
  run:
    - python
    - gdal 2.4.4 {{ gdal_build }}
Another option is to define a lookup table in a jinja variable, directly in meta.yaml. This is slightly uglier, perhaps, but at least all of the logic is contained in a single file. I'm not sure which to prefer.
{% set platform = 'linux' if linux else 'osx' if osx else 'win' %}
{%
  set gdal_builds = {
    'osx': {
      36: 'py36h02fde04_1',
      37: 'py37h622575a_1',
      38: 'py38h57202bd_1',
    },
    'linux': {
      36: 'py36hbb8311d_1',
      37: 'py37hf8c3989_1',
      38: 'py38hfe926b7_1',
    }
  }
%}

requirements:
  build:
    - python
    -
  host:
    - python
    - pip
    - numpy
    - gdal 2.4.4 {{ gdal_builds[platform][py] }}
  run:
    - python
    - gdal 2.4.4 {{ gdal_builds[platform][py] }}

How to install all missing dependencies in an anaconda environment file?

Good night! I have a .yml file with this structure:
name: web
channels:
  - defaults
dependencies:
  - zope.event=4.4=py37_0
  - zope.interface=5.1.0=py37haf1e3a3_0
  - zstd=1.4.5=h41d2c2f_0
  - pip:
      - asgiref==3.2.10
      - cloudpickle==1.3.0
that is actually much bigger than this. When I run conda env create --file ambiente.yml I get Solving environment: failed ResolvePackageNotFound: and a list of all missing dependencies. How can I install all dependencies at once?
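A hedged workaround, not from the original post: the =py37_0 / h41d2c2f_0-style suffixes pin exact build strings that usually only exist for the platform the file was exported from, and dropping them (conda env export --no-builds produces a file in this form) often lets the solver find matching packages:

name: web
channels:
  - defaults
dependencies:
  - zope.event=4.4
  - zope.interface=5.1.0
  - zstd=1.4.5
  - pip:
      - asgiref==3.2.10
      - cloudpickle==1.3.0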

Conda custom package installs into wrong python version directory

After building a conda package and installing it into a new, empty environment, my package cannot be imported because it gets placed into the python3.8/site-packages directory, whereas the environment's python executable and all of the package dependencies are under python3.7.
Starting from an empty environment:
conda create -n myenv
conda install --use-local mypackage
The resulting install ends up with the following:
~/miniconda3/envs/myenv/lib/python3.8/site-packages
|-mypackage/
|-mypackage-0.0.0-py3.8.egg.info/
~/miniconda3/envs/myenv/lib/python3.7/site-packages
|- all of the dependencies...
The resulting conda env also ends up with its python version set to 3.7. So obviously, when I open a python console and attempt to import my package, it fails. The perplexing thing is that I do have an import test in my meta.yaml that imports my package, and it seems to pass during the conda build process.
If I pin the python version in my meta.yaml to python=3.7 instead of python>=3.7, it works: my package ends up installed in python3.7/site-packages with everything else and works fine.
The relevant build requirements from my meta.yaml:
requirements:
  build:
    - setuptools
    - nodejs>=14.5.0
    - mkdocs>=1.1.2
    - mkdocs-material>=5.4.0
    - mkdocs-material-extensions>=1.0
  host:
    - python
  run:
    - python>=3.7
    - rabbitmq-server>=3.7.16
    - pika>=1.1.0
    - pyzmq>=19.0.1
    - pyyaml>=5.3.1
    - numpy>=1.18.5
    - sqlalchemy>=1.3.18
    - sqlite>=3.28.0
    - netifaces>=0.10.9
    - psutil>=5.7.0
    - uvloop>=0.14.0
    - numexpr>=2.7.1
    - fastapi>=0.59.0
    - uvicorn>=0.11.3

test:
  imports:
    - mypackage
The relevant line from my conda recipe build.sh:
$PYTHON setup.py install
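For reference, two hedged sketches of how the python pin could look: the first is the workaround the poster says works; the second uses the {{ python }} jinja variable (as in the first recipe on this page) to keep host and run on the same interpreter per build variant, and is an assumption on my part rather than a confirmed fix:

requirements:
  host:
    - python=3.7          # workaround described above: host and run pinned identically
  run:
    - python=3.7

# or, alternatively, let conda-build substitute the same python version into both sections
requirements:
  host:
    - python {{ python }}
  run:
    - python {{ python }}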
