I set up Codecov with GitLab pipelines a while back and was able to see coverage reports in Codecov. A few commits after the initial setup, the reports stopped processing, and I have not been able to figure out what I'm doing wrong or how to get them processing again.
In GitLab pipelines I run tox and pip install codecov:
test:
  stage: test
  script:
    - pip install circuitpython-build-tools Sphinx sphinx-rtd-theme tox codecov
    - tox
    - codecov -t $CODECOV_TOKEN
  artifacts:
    paths:
      - htmlcov/
In tox I run coverage:
[testenv:coverage]
deps =
    -rrequirements.txt
    -rtest-requirements.txt
commands =
    coverage run -m unittest discover tests/
    coverage html
In Codecov I can see that the upload attempts to process, but it fails without much description:
There was an error processing coverage reports.
I've referenced the Python tutorials, but can't see what I'm getting wrong:
https://github.com/codecov/codecov-python
https://github.com/codecov/example-python
It looks like this was something on the codecov.io side. I didn't change anything, but this morning the coverage reports were parsing and the badge was working again.
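If it ever stops processing again, one thing I might try (an assumption on my part, not something Codecov confirmed) is writing a machine-readable XML report in the tox env as well, so the uploader has something other than the HTML output to send:

[testenv:coverage]
deps =
    -rrequirements.txt
    -rtest-requirements.txt
commands =
    coverage run -m unittest discover tests/
    coverage xml
    coverage html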
I have installed nose2 with the following command:
pip3 install nose2
And I have installed coverage in a custom path with the following command:
pip3 install --target=/tmp/coverage_pkg coverage
I want to execute the test cases and generate a coverage report. Is there a way to point nose2 at the coverage package installed in the custom path?
I tried to execute the following command:
nose2 --with-coverage
I got the following output:
Warning: you need to install "coverage_plugin" extra requirements to use this plugin. e.g. `pip install nose2[coverage_plugin]`
----------------------------------------------------------------------
Ran 0 tests in 0.000s
OK
I want to generate the coverage report using the coverage package installed in the custom path.
Can someone please explain whether I can achieve this?
I was able to achieve this using the following command:
cd <custom location where packages are installed>
python3 -m nose2 --with-coverage -s "path_to_source_dir" --coverage "path_to_source_dir"
You need to run this from the location where nose2 and coverage are installed (the custom dir, e.g. /tmp/coverage_pkg).
For further configuration, and to generate a JUnit report alongside the coverage report, you can use a unittest.cfg file to control unit testing and a .coveragerc file to control coverage reporting.
Sample unittest.cfg:
[unittest]
plugins = nose2.plugins.junitxml
start-dir =
code-directories =
test-file-pattern = *_test.py
test-method-prefix = t
[junit-xml]
always-on = True
keep_restricted = False
path = nose2-junit.xml
test_fullname = False
[coverage]
always-on = True
coverage =
coverage-config = .coveragerc
coverage-report = term-missing
    xml
Sample .coveragerc:
[run]
branch = True
[xml]
output = coverage.xml
[report]
show_missing = True
omit =
    /test/*
    /poc1.py
    /bin/__init__.py
Use the command below with the config file to generate the unit test report and the coverage report.
python3 -m nose2 --config "<path to config file>/unittest.cfg"
The error message includes the exact command you need:
pip install nose2[coverage_plugin]
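If you want to keep everything under the custom target path from the question, a rough sketch (assuming the same /tmp/coverage_pkg location) is to install the extra into that path and put it on PYTHONPATH when running nose2:

pip3 install --target=/tmp/coverage_pkg "nose2[coverage_plugin]"
PYTHONPATH=/tmp/coverage_pkg python3 -m nose2 --with-coverage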
I recently added a Cobertura coverage report to my repository, but it still does not show coverage in an MR's diff.
Here is the job of my .gitlab-ci.yml that generates the coverage report:
coverage-report:
  stage: coverage
  script:
    - tox -e coverage-report
  coverage: '/(?i)total.*? (100(?:\.0+)?\%|[1-9]?\d(?:\.\d+)?\%)$/'
  artifacts:
    name: "coverage"
    paths:
      - public/coverage
    expire_in: 1 week
    reports:
      cobertura: public/coverage/coverage.xml
    expose_as: "coverage"
And here is my tox.ini:
[tox]
envlist =
    coverage-report
minversion = 3.4

[testenv:coverage-report]
basepython = python2.7-32
skip_install = True
deps =
    coverage
commands =
    coverage run -m pytest -s -vv -x --junitxml=public/test-report.xml tests/
    coverage report
    coverage html
    coverage xml
I am pretty sure everything goes well with the report itself, because not only does its XML exist under public/coverage (which I can see through the published artifacts), but the coverage % summary also shows up in the job and in the MR. But the coverage still does not show up in the MR's diff. I also tried opening the Network tab of my browser and looking for the merge_requests/26/coverage_reports.json HTTP request, and that comes up empty (more specifically, the response is {"files":{}}), which I do not think is supposed to happen.
I am using Python 2.7-32 and Coverage.py to get the report. My GitLab is self-hosted, version 14.9.5-ee. Here is a link to download my coverage.xml. It is not the complete coverage, but it shows 2 files that appear in the MR's diff yet have no coverage information.
Specify the coverage report in this form, using the coverage_report key:
artifacts:
  expire_in: 1 week
  paths:
    - public/coverage
  reports:
    coverage_report:
      coverage_format: cobertura
      path: public/coverage/coverage.xml
Also, it can take some time for the coverage to be reflected in the diff. GitLab processes it in a background job, and once that completes it should show up. It sometimes takes a while for me.
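If the diff still stays empty after that, one more thing worth checking (a guess on my part, not something I have verified against this exact setup) is whether the file paths inside coverage.xml match the paths of the files in the repository. Coverage.py 5.0+ can record repository-relative paths via .coveragerc:

[run]
relative_files = True
branch = True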
I am trying to publish code coverage results on the pipeline run summary page. This is my pipeline.yaml file:
- bash: |
    pip install .[test]
    pip install pytest pytest-azurepipelines pytest-cov
    pytest --junitxml=junit.xml --cov=./src_dir --cov-report=xml --cov-report=html tests
  displayName: Test

- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: '$(System.DefaultWorkingDirectory)/**/coverage.xml'
    reportDirectory: '$(System.DefaultWorkingDirectory)/**/htmlcov'
The coverage report always shows 0%.
How do I get the correct code coverage results?
Thanks!
I was able to get the correct code coverage using the following in my Azure pipeline "Test" stage:
echo "
[run]
source =
$(Build.Repository.Name)" > .coveragerc
coverage run --context=mytest -m pytest -v -rA --junitxml=junit.xml --rootdir=tests
coverage report -m --context=mytest
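If the summary page still shows 0%, one addition worth trying (my own sketch, not verified against this exact pipeline) is to write a Cobertura-style XML at the end of the Test script with coverage xml -o coverage.xml and point the publish task at that single file instead of a glob:

- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: '$(System.DefaultWorkingDirectory)/coverage.xml'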
tl;dr:
I'm setting up CI for a project of mine, hosted on GitHub, using tox and Travis CI. At the end of the build, I run coveralls to push the coverage reports to coveralls.io. I would like to make this command conditional: it should run only when the tests execute on Travis, not when they run on my local machine. Is there a way to make this happen?
The details:
The package I'm trying to test is a Python package. I'm using / planning to use the following 'infrastructure' to set up the tests:
The tests themselves are of the py.test variety.
The CI scripting, so to speak, is from tox. This lets me run the tests locally, which is rather important to me; I don't want to have to push to GitHub every time I need a test run. I also use numpy and matplotlib in my package, so running an inane number of test cycles on Travis CI seems overly wasteful to me. As such, ditching tox and simply using .travis.yml alone is not an option.
The CI server is Travis CI.
The relevant test scripts look something like this:
.travis.yml
language: python
python: 2.7
env:
  - TOX_ENV=py27
install:
  - pip install tox
script:
  - tox -e $TOX_ENV
tox.ini
[tox]
envlist = py27

[testenv]
passenv = TRAVIS TRAVIS_JOB_ID TRAVIS_BRANCH
deps =
    pytest
    coverage
    pytest-cov
    coveralls
commands =
    py.test --cov={envsitepackagesdir}/mypackage --cov-report=term --basetemp={envtmpdir}
    coveralls
This file lets me run the tests locally. However, due to the final coveralls call, the test run fails in principle, with:
py27 runtests: commands[1] | coveralls
You have to provide either repo_token in .coveralls.yml, or launch via Travis
ERROR: InvocationError: ...coveralls'
This is an expected error. The passenv bit sends along the information Travis provides so that coveralls can write to coveralls.io, and without Travis there to provide it, the command should fail. I don't want this run to push results to coveralls.io, either. I'd like coveralls to run only if the tests are occurring on Travis CI. Is there any way I can have this command run conditionally, or set up a build configuration which achieves the same effect?
I've already tried moving the coveralls portion into .travis.yml, but when it is executed there, coveralls seems unable to locate the appropriate .coverage file to send over. I made various attempts in this direction, none of which resulted in a successful submission to coveralls.io except the combination listed above. The following is what I would have hoped would work, given that when I run tox locally I do end up with a .coverage file where I'd expect it, in the root folder of my source tree.
No submission to coveralls.io
language: python
python: 2.7
env:
  - TOX_ENV=py27
install:
  - pip install tox
  - pip install python-coveralls
script:
  - tox -e $TOX_ENV
after_success:
  - coveralls
An alternative solution would be to prefix the coveralls command with a dash (-) to tell tox to ignore its exit code, as explained in the documentation. This way, even failures from coveralls will be ignored, and tox will consider the test execution successful when run locally.
Using the example configuration above, it would be as follows:
[tox]
envlist = py27

[testenv]
passenv = TRAVIS TRAVIS_JOB_ID TRAVIS_BRANCH
deps =
    pytest
    coverage
    pytest-cov
    coveralls
commands =
    py.test --cov={envsitepackagesdir}/mypackage --cov-report=term --basetemp={envtmpdir}
    - coveralls
I have a similar setup with Travis, tox and coveralls. My idea was to only execute coveralls if the TRAVIS environment variable is set. However, it seems this is not so easy to do, as tox has trouble parsing commands with quotes and ampersands. Additionally, this confused Travis a lot.
Eventually I wrote a simple python script run_coveralls.py:
#!/usr/bin/env python
import os
from subprocess import call

if __name__ == '__main__':
    if 'TRAVIS' in os.environ:
        rc = call('coveralls')
        raise SystemExit(rc)
In tox.ini, replace your coveralls command with python {toxinidir}/run_coveralls.py.
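For example, the commands section from the earlier configuration would then look like this (the wrapper is the run_coveralls.py file shown above):

commands =
    py.test --cov={envsitepackagesdir}/mypackage --cov-report=term --basetemp={envtmpdir}
    python {toxinidir}/run_coveralls.py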
I am using an environment variable to run additional commands.
tox.ini
commands =
    coverage run runtests.py
    {env:POST_COMMAND:python --version}
.travis.yml
language: python
python:
  - "3.6"
install: pip install tox-travis
script: tox
env:
  - POST_COMMAND=codecov -e TOX_ENV
Now in my local setup it prints the Python version; when run on Travis it runs codecov.
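In other words (the tox invocation itself is identical; only the environment differs):

# locally, POST_COMMAND is unset, so the fallback runs after the tests:
tox    # -> python --version
# on Travis, .travis.yml exports POST_COMMAND, so the same call runs:
tox    # -> codecov -e TOX_ENV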
Alternative solution if you use a Makefile and don't want a new .py file:
define COVERALL_PYSCRIPT
import os
from subprocess import call

if __name__ == '__main__':
    if 'TRAVIS' in os.environ:
        rc = call('coveralls')
        raise SystemExit(rc)
    print("Not in Travis CI, skipping coveralls")
endef
export COVERALL_PYSCRIPT

coveralls: ## runs coveralls if TRAVIS in env
	@python -c "$$COVERALL_PYSCRIPT"
In tox.ini, add make coveralls to commands (see the sketch below).
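A minimal tox.ini sketch of that last step (whitelist_externals is needed so tox will allow running make; the test command is copied from the earlier examples):

[testenv]
passenv = TRAVIS TRAVIS_JOB_ID TRAVIS_BRANCH
whitelist_externals = make
deps =
    pytest
    coverage
    pytest-cov
    coveralls
commands =
    py.test --cov={envsitepackagesdir}/mypackage --cov-report=term --basetemp={envtmpdir}
    make coveralls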
Is it possible to configure PyCharm / IntelliJ IDEA to run tox tests? I want to test my code against different Python versions in separate environments. I have tried to configure it, but so far I have only managed to configure single py.test support.
See this PyCharm issue: PY-9727
Credit goes to Andrey Vlasovskikh at PyCharm
As a workaround, you can create a Python file in your project that imports and launches Tox:
import tox
tox.cmdline()
and then run this file via your project interpreter using its context menu.
You'll get hyperlinks in the console output for stack traces, but nothing more.
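For reference, a slightly fuller version of that runner file (a sketch assuming tox 3.x, where tox.cmdline() accepts an argument list; the file name run_tox.py is my own):

# run_tox.py - launch tox from a normal PyCharm run configuration
import sys

import tox

if __name__ == "__main__":
    # forward any extra arguments, e.g. "-e py37" to run a single environment
    tox.cmdline(sys.argv[1:])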
I'm afraid it's not supported; PyCharm will use the configured interpreter to run the tests.
You are welcome to submit a feature request.
Seven years have passed, and now you can run tox inside PyCharm. If you use pytest for testing, it will even show the test results the same way as when running tests locally through PyCharm.
To run tests in PyCharm I use a config like this:
[tox]
envlist = py3.{7,8},codestyle,flake8,lint
minversion = 3.7

[testenv]
usedevelop = true
deps =
    pytest
    pytest-cov
commands = pytest --cov=src

[testenv:codestyle]
deps = pycodestyle
commands = pycodestyle src tests

[testenv:flake8]
deps = flake8
commands = flake8 src tests

[testenv:lint]
deps = pylint
commands = pylint src tests --rcfile=.pylintrc
Here I described why it looks like this. Also, there is integration with GitHub CI to run it on every push; a sketch of that follows below.
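The GitHub CI part could look roughly like this (a sketch; the workflow file name, action versions, and the version matrix are my own assumptions):

# .github/workflows/tox.yml
name: tox
on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.7", "3.8"]
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - run: pip install tox
      - run: tox -e py${{ matrix.python-version }}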