I am trying to build (conda-build) and test (pytest) my Python project with the help of Anaconda and TeamCity.
TeamCity works fine for Linux, but for Windows the test build step fails with the following error (Linux and Windows use the same meta.yaml file):
failed to get install actions, retrying: exception was:
Unsatisfiable dependencies for platform win-64: {'urlib3==1.26.11=pyhd8ed1ab_0', ... <GIANT LIST OF LIBRARIES>
It is important to say that the first step (building the packages with conda-build) works.
But the second step (the test step) fails with the error above, so I think the error is related to the pytest libraries.
**meta.yaml**
{% set version = "1.1.0" %}

package:
  name: myname
  version: {{ version }}

source:
  path: ..

build:
  number: 0
  skip: True
  pin_depends: strict

requirements:
  build:
    - python >=3.8,<3.9
  run:
    - python >=3.8,<3.9
    - docker-py
    - holidays
    - scipy
    - requests >=2.23.0
    - numpy <1.22.0
    - matplotlib >=3.5.0,<4.0
    ...
    - requests-negotiate-sspi  # [win]

test:
  requires:
    - pytest
    - pytest-asyncio
    - pytest-cov
    - pytz
    - coverage
    - dateutils
    - pytest-xdist
    - pytest-benchmark
    - teamcity-messages
    - pytest-bdd
  source_files:
    - conftest.py
  commands:
    - pytest <PATH_TO_FILE>
TeamCity test step looks like this:
conda-build <MY_PATH>\win_64\*.tar.bz2 <MY CHANNELS> --override-channels --test
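Since `pin_depends: strict` makes the test environment resolve against exact build pins, one way to narrow down which pin is unsatisfiable is to ask conda whether the pinned build actually exists for win-64. A sketch; the package spec and channel below are placeholders — substitute the ones from your error message:

```shell
# Check which builds of a pinned package exist for the Windows target;
# the spec and channel are illustrative placeholders.
conda search "urllib3==1.26.11" --platform win-64 -c conda-forge --override-channels
```

If the pinned build string from the error does not appear in the output for win-64, that package is the one breaking the solve.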
GitLab version: 13.6.6
GitLab Runner version: 11.2.0
My .gitlab-ci.yml:
image: "python:3.7"

before_script:
  - pip install flake8

flake8:
  stage: test
  script:
    - flake8 --max-line-length=79
  tags:
    - test
The only information Pipelines gives is that the script failed, and the output of the failed job is "No job log". How can I get more detailed error output?
Using artifacts can help you.
image: "python:3.7"

before_script:
  - pip install flake8

flake8:
  stage: test
  script:
    - flake8 --max-line-length=79
    - cd path/to
  tags:
    - test
  artifacts:
    when: on_failure
    paths:
      - path/to/test.log
The log file can be downloaded via the web interface.
Note: using when: on_failure ensures that test.log is only collected if the build fails, saving disk space on successful builds.
I want to cache the dependencies in requirements.txt. See https://learn.microsoft.com/en-us/azure/devops/pipelines/release/caching?view=azure-devops#pythonpip. Here is my azure-pipelines.yml:
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/python

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

strategy:
  matrix:
    Python38:
      python.version: '3.8'

variables:
  PIP_CACHE_DIR: $(Pipeline.Workspace)/.pip

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'

- task: Cache@2
  inputs:
    key: 'python | "$(Agent.OS)" | requirements.txt'
    restoreKeys: |
      python | "$(Agent.OS)"
      python
    path: $(PIP_CACHE_DIR)
  displayName: Cache pip packages

- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'

- script: |
    pip install pytest pytest-azurepipelines
    pytest
  displayName: 'pytest'
The dependencies specified in my requirements.txt are installed in every pipeline run.
The pipeline task Cache@2 gives the following output.
Starting: Cache pip packages
==============================================================================
Task : Cache
Description : Cache files between runs
Version : 2.0.1
Author : Microsoft Corporation
Help : https://aka.ms/pipeline-caching-docs
==============================================================================
Resolving key:
- python [string]
- "Linux" [string]
- requirements.txt [file] --> EBB7474E7D5BC202D25969A2E11E0D16251F0C3F3F656F1EE6E2BB7B23868B10
Resolved to: python|"Linux"|jNwyZU113iWcGlReTrxg8kzsyeND5OIrPLaN0I1rRs0=
Resolving restore key:
- python [string]
- "Linux" [string]
Resolved to: python|"Linux"|**
Resolving restore key:
- python [string]
Resolved to: python|**
ApplicationInsightsTelemetrySender will correlate events with X-TFS-Session 85b76fe3-b469-4330-a584-db569bc45342
Getting a pipeline cache artifact with one of the following fingerprints:
Fingerprint: `python|"Linux"|jNwyZU113iWcGlReTrxg8kzsyeND5OIrPLaN0I1rRs0=`
Fingerprint: `python|"Linux"|**`
Fingerprint: `python|**`
There is a cache miss.
ApplicationInsightsTelemetrySender correlated 1 events with X-TFS-Session 85b76fe3-b469-4330-a584-db569bc45342
Finishing: Cache pip packages
Enabling system diagnostics and viewing the log of "Post-job: Cache pip packages" showed why no cache was created.
##[debug]Evaluating condition for step: 'Cache pip packages'
##[debug]Evaluating: AlwaysNode()
##[debug]Evaluating AlwaysNode:
##[debug]=> True
##[debug]Result: True
Starting: Cache pip packages
==============================================================================
Task : Cache
Description : Cache files between runs
Version : 2.0.1
Author : Microsoft Corporation
Help : https://aka.ms/pipeline-caching-docs
==============================================================================
##[debug]Skipping because the job status was not 'Succeeded'.
Finishing: Cache pip packages
There were failing tests in the build pipeline, so the job never finished with status 'Succeeded' and the cache was never saved. After I removed the failing tests, the cache was saved and used.
I have a travis job that looks like this:
jobs:
  include:
    - stage: "Unit tests"
      language: python
      python:
        - "3.6"
        - "3.7"
      install:
        - pip install -r requirements.txt
      script:
        - python -m unittest test.client
I would expect this unit test to run as two jobs, one for Python 3.6 and one for 3.7; however, it only ever runs for the first version listed. Am I missing something here? I followed the guide from the docs.
Thanks
The Python versions must not be defined within the jobs but at the root level:

python:
  - "3.6"
  - "3.7"
jobs:
  ...
I found this out because Travis recently introduced a build config validation. It can be found under your build -> View config -> Build config validation.
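Applied to the config from the question, a corrected version might look like this (a sketch; the stage name and test module are taken from the question):

```yaml
language: python
# versions at the root level expand into one job per version
python:
  - "3.6"
  - "3.7"
install:
  - pip install -r requirements.txt
script:
  - python -m unittest test.client
```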
First of all, I'm a total newbie, so please bear with me :)
I run this:
conda env create -f env.yml
Here's the yml file:
name: DAND
channels: !!python/tuple
- defaults
dependencies:
- _nb_ext_conf=0.3.0=py27_0
- anaconda-client=1.6.0=py27_0
- appnope=0.1.0=py27_0
- backports=1.0=py27_0
- backports_abc=0.5=py27_0
- beautifulsoup4=4.5.1=py27_0
- clyent=1.2.2=py27_0
- configparser=3.5.0=py27_0
- cycler=0.10.0=py27_0
- decorator=4.0.10=py27_1
- entrypoints=0.2.2=py27_0
- enum34=1.1.6=py27_0
- freetype=2.5.5=1
- functools32=3.2.3.2=py27_0
- get_terminal_size=1.0.0=py27_0
- icu=54.1=0
- ipykernel=4.5.2=py27_0
- ipython=5.1.0=py27_1
- ipython_genutils=0.1.0=py27_0
- ipywidgets=5.2.2=py27_0
- jinja2=2.8=py27_1
- jsonschema=2.5.1=py27_0
- jupyter=1.0.0=py27_3
- jupyter_client=4.4.0=py27_0
- jupyter_console=5.0.0=py27_0
- jupyter_core=4.2.1=py27_0
- libpng=1.6.22=0
- markupsafe=0.23=py27_2
- matplotlib=1.5.3=np111py27_1
- mistune=0.7.3=py27_1
- mkl=11.3.3=0
- nb_anacondacloud=1.2.0=py27_0
- nb_conda=2.0.0=py27_0
- nb_conda_kernels=2.0.0=py27_0
- nbconvert=4.2.0=py27_0
- nbformat=4.2.0=py27_0
- nbpresent=3.0.2=py27_0
- nltk=3.2.1=py27_0
- notebook=4.3.0=py27_0
- numpy=1.11.2=py27_0
- openssl=1.0.2j=0
- pandas=0.19.1=np111py27_0
- path.py=8.2.1=py27_0
- pathlib2=2.1.0=py27_0
- pexpect=4.0.1=py27_0
- pickleshare=0.7.4=py27_0
- pip=9.0.1=py27_1
- prompt_toolkit=1.0.9=py27_0
- ptyprocess=0.5.1=py27_0
- pygments=2.1.3=py27_0
- pymongo=3.3.0=py27_0
- pyparsing=2.1.4=py27_0
- pyqt=5.6.0=py27_1
- python=2.7.12=1
- python-dateutil=2.6.0=py27_0
- python.app=1.2=py27_4
- pytz=2016.10=py27_0
- pyyaml=3.12=py27_0
- pyzmq=16.0.2=py27_0
- qt=5.6.2=0
- qtconsole=4.2.1=py27_1
- readline=6.2=2
- requests=2.12.3=py27_0
- scikit-learn=0.17.1=np111py27_2
- scipy=0.18.1=np111py27_0
- seaborn=0.7.1=py27_0
- setuptools=27.2.0=py27_0
- simplegeneric=0.8.1=py27_1
- singledispatch=3.4.0.3=py27_0
- sip=4.18=py27_0
- six=1.10.0=py27_0
- sqlite=3.13.0=0
- ssl_match_hostname=3.4.0.2=py27_1
- terminado=0.6=py27_0
- tk=8.5.18=0
- tornado=4.4.2=py27_0
- traitlets=4.3.1=py27_0
- unicodecsv=0.14.1=py27_0
- wcwidth=0.1.7=py27_0
- wheel=0.29.0=py27_0
- widgetsnbextension=1.2.6=py27_0
- xlrd=1.0.0=py27_0
- yaml=0.1.6=0
- zlib=1.2.8=3
- pip:
  - backports-abc==0.5
  - backports.shutil-get-terminal-size==1.0.0
  - backports.ssl-match-hostname==3.4.0.2
  - ipython-genutils==0.1.0
  - jupyter-client==4.4.0
  - jupyter-console==5.0.0
  - jupyter-core==4.2.1
  - nb-anacondacloud==1.2.0
  - nb-conda==2.0.0
  - nb-conda-kernels==2.0.0
  - prompt-toolkit==1.0.9
prefix: /Users/mat/anaconda/envs/DAND
The error I run into:
Collecting package metadata (repodata.json): done
Solving environment: failed
ResolvePackageNotFound:
- jupyter_console==5.0.0=py27_0
- freetype==2.5.5=1
- pyzmq==16.0.2=py27_0
- configparser==3.5.0=py27_0
- scipy==0.18.1=np111py27_0
- libpng==1.6.22=0
- ...then the list goes on, listing all of the dependencies in the yml file except the ones under pip
Things I've attempted:
I got this yaml file from a Udacity online class I'm taking; I downloaded it from the website, so I don't think the conda env export --no-builds > env.yml method applies to me.
I tried the solution in here: I simply moved everything under the pip block and ran into a new error. Maybe I'm misunderstanding the solution.
The new error I run into:
Warning: you have pip-installed dependencies in your environment file, but you do not list pip itself as one of your conda dependencies. Conda may not use the correct pip to install your packages, and they may end up in the wrong place. Please add an explicit pip dependency. I'm adding one for you, but still nagging you.
Collecting package metadata (repodata.json): done
Solving environment: done
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
Ran pip subprocess with arguments:
['/Users/yulia/anaconda3/envs/DAND/bin/python', '-m', 'pip', 'install', '-U', '-r', '/Users/yulia/data analysis -uda/condaenv.mo_ctuap.requirements.txt']
Pip subprocess output:
Pip subprocess error:
ERROR: Double requirement given: backports_abc==0.5=py27_0 (from -r /Users/yulia/data analysis -uda/condaenv.mo_ctuap.requirements.txt (line 12)) (already in backports-abc==0.5 (from -r /Users/yulia/data analysis -uda/condaenv.mo_ctuap.requirements.txt (line 1)), name='backports-abc')
CondaEnvException: Pip failed
I read some other posts suggesting to use pip to install the requirements.txt file, and some posts about the "CondaEnvException: Pip failed" situation, but they didn't give explicit solutions, and most of the time I'm really confused by them.
Please let me know what I'm missing here; this is getting frustrating, as I cannot set up the proper environment to continue the class. Thank you so much in advance!
UPDATE
It seems that things may work better in the end if you skip the env file entirely. Instead, create an env with the required dependencies manually; this way the libraries are up to date and the notebooks appear to work properly.
$ conda create -n DAND python=2 numpy pandas matplotlib seaborn
Look for required libraries in your course's "Setting up your system" (or similar) section. The ones in my example are based on Udacity's "Intro to Data Analysis" course.
Older answer
I had a similar problem and what eventually worked for me was adding two more channels in the channels section of this YAML file.
Before:
channels: !!python/tuple
- defaults
After:
channels: !!python/tuple
- defaults
- conda-forge
- anaconda
Then all the packages, even with the version restrictions, were found.
In case you get errors about conflicting versions, make sure to set the conda config channel_priority to false:
$ conda config --set channel_priority false
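If the extra channels still don't resolve the ResolvePackageNotFound errors, another option (not from the original answer, just a sketch) is to strip the exact build-string suffix (e.g. `=py27_0`) from each conda dependency, so the solver only has to match the version:

```shell
# Drop the trailing build string (the second "=segment") from each
# "name=version=build" conda dependency. Pip entries ("name==version")
# have no second lone "=" at the end and are left untouched.
# Demo on two representative lines:
printf '%s\n' '- freetype=2.5.5=1' '- backports-abc==0.5' \
  | sed -E 's/(=[^=]+)=[^=]+$/\1/'
# prints:
# - freetype=2.5.5
# - backports-abc==0.5
```

Applied to the real file, that would be `sed -E 's/(=[^=]+)=[^=]+$/\1/' env.yml > env-nobuild.yml` followed by `conda env create -f env-nobuild.yml` (file names illustrative).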
Is there a way to configure travis-ci to make the Python versions dependent on a certain env var?
Please consider the following travis.yml config:
language: python
python:
  - "2.5"
  - "2.6"
  - "2.7"
env:
  - DJANGO=1.3.4
  - DJANGO=1.4.2
  - DJANGO=https://github.com/django/django/zipball/master
install:
  - pip install -q Django==$DJANGO --use-mirrors
  - pip install -e . --use-mirrors
script:
  - python src/runtests.py
Along with Django 1.3 (DJANGO=1.3.4) and 1.4 (DJANGO=1.4.2), I also want to test against the latest development version of Django (DJANGO=https://github.com/django/django/zipball/master), which is basically Django 1.5.
The problem I see is that travis-ci will automatically run the integration against all specified Python versions. Django 1.5, however, no longer supports Python 2.5. Is it possible to omit it for the Django development version, so that I get only these integrations:
DJANGO=1.3.4 --> python "2.5", "2.6", "2.7"
DJANGO=1.4.2 --> python "2.5", "2.6", "2.7"
DJANGO=https://github.com/django/django/zipball/master --> python "2.6", "2.7"
UPDATE:
Here's a link to a live example based on Odi's answer, which I've been using successfully for a few months:
https://github.com/deschler/django-modeltranslation/blob/master/.travis.yml
You can specify configurations that you want to exclude from the build matrix (i.e. combinations that you don't want to test).
Add this to your .travis.yml:
matrix:
  exclude:
    - python: "2.5"
      env: DJANGO=https://github.com/django/django/zipball/master
Note: only exact matches will be excluded.
See the build documentation (section The Build Matrix) for further information.
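Combined with the config from the question, the full matrix might be sketched like this (versions and env values are taken from the question):

```yaml
language: python
python:
  - "2.5"
  - "2.6"
  - "2.7"
env:
  - DJANGO=1.3.4
  - DJANGO=1.4.2
  - DJANGO=https://github.com/django/django/zipball/master
matrix:
  exclude:
    # skip Django master on Python 2.5, which it no longer supports
    - python: "2.5"
      env: DJANGO=https://github.com/django/django/zipball/master
```

This yields 3 × 3 = 9 combinations minus the one excluded, i.e. the 8 integrations listed above.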