Context
After setting up a conda environment.yml and trying to create the environment with Python 3.8, I am experiencing some difficulties.
Attempts
I tried explicitly specifying the Python version in the environment creation command:
conda env create --file environment.yml python=3.8
I tried explicitly including the python version in the environment.yml file:
...
dependencies:
  - anaconda
  - python=3.8
  - conda:
      # Run python tests.
      - pytest-cov
...
And I tried explicitly installing python 3.8 inside the environment with:
conda activate env_name
conda install python==3.8
Which yields a conflict:
Found conflicts! Looking for incompatible packages.
So I tried to determine what the conflict is by evaluating the roughly 2000 output lines that describe the conflicts. I think the first conflict is most relevant:
UnsatisfiableError: The following specifications were found to be incompatible with a past
explicit spec that is not an explicit spec in this operation (pip):
- python=3.8 -> pip
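One way to see every spec conda has recorded as "explicit" for an environment is to parse its `conda-meta/history` file. A minimal sketch, assuming the usual history-file format; the sample text below is illustrative, not this environment's actual history:

```python
import ast
import re

# Illustrative contents; in practice read <env-prefix>/conda-meta/history.
sample_history = """\
==> 2022-05-01 10:00:00 <==
# cmd: conda env create --file environment.yml
# update specs: ['pip', 'python=3.10']
==> 2022-05-02 09:30:00 <==
# cmd: conda install python=3.8
# update specs: ['python=3.8']
"""

def explicit_specs(history_text):
    """Collect every spec explicitly requested in this environment's history."""
    specs = []
    for line in history_text.splitlines():
        # conda records each transaction's explicit specs as "# update specs: [...]"
        match = re.match(r"#\s*update specs:\s*(\[.*\])", line)
        if match:
            specs.extend(ast.literal_eval(match.group(1)))
    return specs

print(explicit_specs(sample_history))
# → ['pip', 'python=3.10', 'python=3.8']
```

Specs that appear in earlier transactions but not in the current one are the "past explicit specs" the UnsatisfiableError refers to.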
Environment
The conda environment consists of the following four files:
environment.yml
pyproject.toml
requirements.txt
.pre-commit-config.yaml
They have the following contents:
environment.yml:
# This file is to automatically configure your environment. It allows you to
# run the code with a single command without having to install anything
# (extra).
# First run: conda env create --file environment.yml
# If you change this file, run: conda env update --file environment.yml
# Instructions for this networkx-to-lava-nc repository only. First time usage
# On Ubuntu (this is needed for lava-nc):
# sudo apt upgrade
# sudo apt full-upgrade
# yes | sudo apt install gcc
# Conda configuration settings. (Specify which modules/packages are installed.)
name: networkx-to-lava
channels:
  - conda-forge
dependencies:
  - python=3.8
  - conda:
      # Run python tests.
      - pytest-cov
      # Generate plots.
      - matplotlib
      # Run graph software quickly.
      - networkx
  - pip
  - pip:
      # Run pip install on .tar.gz file in GitHub repository (For lava-nc only).
      - https://github.com/lava-nc/lava/releases/download/v0.3.0/lava-nc-0.3.0.tar.gz
      # Turns relative import paths into absolute import paths.
      - absolufy-imports
      # Auto format Python code to make it flake8 compliant.
      - autoflake
      # Scan Python code for security issues.
      - bandit
      # Code formatting compliance.
      - black
      # Correct code misspellings.
      - codespell
      # Verify percentage of code that has at least 1 test.
      - coverage
      # Auto formats the Python documentation written in the code.
      - docformatter
      # Check code against style guidelines.
      - flake8
      # Auto sort the import statements.
      - isort
      # Auto format Markdown files.
      - mdformat
      # Auto check static typing.
      - mypy
      # Auto generate documentation.
      - pdoc3
      # Auto check programming style aspects.
      - pylint
      # Auto generate docstrings.
      - pyment
      # Identify and remove dead code.
      - vulture
      # Include GitHub pre-commit hook.
      - pre-commit
      # TODO: identify exact function (and usage).
      # Seems to be an autoformatter like black, but installed using npm instead of pip.
      - prettier
      # Automatically upgrades Python syntax to the new Python version syntax.
      - pyupgrade
      # Another static type checker for python, like mypy.
      - pyright
pyproject.toml:
# This file is used to configure black, isort and mypy so that the packages don't conflict.
# This file is read by the pre-commit program.
[tool.black]
line-length = 79
include = '\.pyi?$'
exclude = '''
/(
\.git
| \.mypy_cache
| build
| dist
)/
'''
[tool.coverage.run]
# Due to a strange bug with the xml output of coverage.py not writing the full
# path of the sources, the full root directory is presented as a source
# alongside the main package. As a result, any importable Python file/package
# needs to be included in the omit list.
source = [
"foo",
".",
]
# Excludes the following directories from the coverage report
omit = [
"tests/*",
"setup.py",
]
[tool.isort]
profile = "black"
[tool.mypy]
ignore_missing_imports = true
[tool.pylint.basic]
bad-names=[]
[tool.pylint.messages_control]
# Example: Disable error on needing a module-level docstring
disable=[
"import-error",
"invalid-name",
"fixme",
]
[tool.pytest.ini_options]
# Runs coverage.py through use of the pytest-cov plugin
# An xml report is generated and results are output to the terminal
addopts = "--cov --cov-report xml:cov.xml --cov-report term"
# Sets the minimum allowed pytest version
minversion = 5.0
# Sets the path where test files are located (Speeds up Test Discovery)
testpaths = ["tests"]
requirements.txt:
# This file ensures that the pre-commit service is run every time you commit.
# Basically it ensures people only push files to Git that are up to standard.
pre-commit
.pre-commit-config.yaml:
# This file specifies which checks are performed by the pre-commit service.
# The pre-commit service prevents people from pushing code to git that is not
# up to standards. The reason mirrors are used instead of the actual
# repositories for e.g. black and flake8 is that those repositories also
# need to contain a pre-commit hook file, which they often don't by default.
# So to resolve that, a mirror is created that includes such a file.
default_language_version:
  python: python3.10 # or python3
repos:
  # Test if the python code is formatted according to the Black standard.
  - repo: https://github.com/Quantco/pre-commit-mirrors-black
    rev: 22.3.0
    hooks:
      - id: black-conda
        args:
          - --safe
          - --target-version=py36
  # Test if the python code is formatted according to the flake8 standard.
  - repo: https://github.com/Quantco/pre-commit-mirrors-flake8
    rev: 4.0.1
    hooks:
      - id: flake8-conda
  # Test if the import statements are sorted correctly.
  - repo: https://github.com/PyCQA/isort
    rev: 5.10.1
    hooks:
      - id: isort
        args: ["--profile", "black", --line-length=79]
  # Test if the variable typing is correct. (Variable typing is when you say:
  # def is_larger(nr: int) -> bool: instead of def is_larger(nr). It makes
  # it explicit what type of input and output a function has.)
  # - repo: https://github.com/python/mypy
  - repo: https://github.com/pre-commit/mirrors-mypy
    # - repo: https://github.com/a-t-0/mypy
    rev: v0.950
    hooks:
      - id: mypy
  # Tests if there are spelling errors in the code.
  - repo: https://github.com/codespell-project/codespell
    rev: v2.1.0
    hooks:
      - id: codespell
  # Performs static code analysis to check for programming errors.
  - repo: local
    hooks:
      - id: pylint
        name: pylint
        entry: pylint
        language: system
        types: [python]
        args:
          [
            "-rn", # Only display messages
            "-sn", # Don't display the score
          ]
  # Runs additional tests that are created by the pre-commit software itself.
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.2.0
    hooks:
      # Check user did not add large files.
      - id: check-added-large-files
      # Check if `.py` files are written in valid Python syntax.
      - id: check-ast
      # Require literal syntax when initializing empty or zero Python builtin types.
      - id: check-builtin-literals
      # Checks if there are filenames that would conflict if case is changed.
      - id: check-case-conflict
      # Checks if the Python functions have docstrings.
      - id: check-docstring-first
      # Checks if any `.sh` files have a shebang like #!/bin/bash
      - id: check-executables-have-shebangs
      # Verifies json format of any `.json` files in repo.
      - id: check-json
      # Checks if there are any existing merge conflicts caused by the commit.
      - id: check-merge-conflict
      # Checks for symlinks which do not point to anything.
      - id: check-symlinks
      # Checks if xml files are formatted correctly.
      - id: check-xml
      # Checks if .yml files are valid.
      - id: check-yaml
      # Checks if debugger imports are performed.
      - id: debug-statements
      # Detects symlinks changed to regular files with content path symlink was pointing to.
      - id: destroyed-symlinks
      # Checks if you don't accidentally push a private key.
      - id: detect-private-key
      # Replaces double quoted strings with single quoted strings.
      # This is not compatible with Python Black.
      # - id: double-quote-string-fixer
      # Makes sure files end in a newline and only a newline.
      - id: end-of-file-fixer
      # Removes UTF-8 byte order marker.
      - id: fix-byte-order-marker
      # Add <# -*- coding: utf-8 -*-> to the top of python files.
      - id: fix-encoding-pragma
      # Checks if there are different line endings, like \n and crlf.
      - id: mixed-line-ending
      # Asserts `.py` files in folder `/test/` (by default) end in `_test.py`.
      - id: name-tests-test
        # Override default to check if `.py` files in `/test/` START with `test_`.
        args: ['--django']
      # Ensures JSON files are properly formatted.
      - id: pretty-format-json
      # Sorts entries in requirements.txt and removes incorrect pkg-resources entries.
      - id: requirements-txt-fixer
      # Sorts simple YAML files which consist only of top-level keys.
      - id: sort-simple-yaml
      # Removes trailing whitespace at the end of lines.
      - id: trailing-whitespace
  - repo: https://github.com/PyCQA/autoflake
    rev: v1.4
    hooks:
      - id: autoflake
        args: ["--in-place", "--remove-unused-variables", "--remove-all-unused-imports", "--recursive"]
        name: AutoFlake
        description: "Format with AutoFlake"
        stages: [commit]
  - repo: https://github.com/PyCQA/bandit
    rev: 1.7.4
    hooks:
      - id: bandit
        name: Bandit
        stages: [commit]
  # Enforces formatting style in Markdown (.md) files.
  - repo: https://github.com/executablebooks/mdformat
    rev: 0.7.14
    hooks:
      - id: mdformat
        additional_dependencies:
          - mdformat-toc
          - mdformat-gfm
          - mdformat-black
  - repo: https://github.com/MarcoGorelli/absolufy-imports
    rev: v0.3.1
    hooks:
      - id: absolufy-imports
        files: '^src/.+\.py$'
        args: ['--never', '--application-directories', 'src']
  - repo: https://github.com/myint/docformatter
    rev: v1.4
    hooks:
      - id: docformatter
  - repo: https://github.com/pre-commit/pygrep-hooks
    rev: v1.9.0
    hooks:
      - id: python-use-type-annotations
      - id: python-check-blanket-noqa
      - id: python-check-blanket-type-ignore
  # Updates the syntax of `.py` files to the specified python version.
  # It is not compatible with the pre-commit hook fix-encoding-pragma.
  # - repo: https://github.com/asottile/pyupgrade
  #   rev: v2.32.1
  #   hooks:
  #     - id: pyupgrade
  #       args: [--py310-plus]
  - repo: https://github.com/markdownlint/markdownlint
    rev: v0.11.0
    hooks:
      - id: markdownlint
Package Conflict Output
conda install python=3.8
Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: |
Found conflicts! Looking for incompatible packages.
This can take several minutes. Press CTRL-C to abort.
failed \
UnsatisfiableError: The following specifications were found to be incompatible with a past
explicit spec that is not an explicit spec in this operation (pip):
- python=3.8 -> pip
The following specifications were found to be incompatible with each other:
Output in format: Requested package -> Available versions
Package openjpeg conflicts for:
openjpeg
pillow -> openjpeg[version='>=2.3.0,<3.0a0|>=2.4.0,<2.5.0a0']
matplotlib-base -> pillow[version='>=6.2.0'] -> openjpeg[version='>=2.3.0,<3.0a0|>=2.4.0,<2.5.0a0']
Package ncurses conflicts for:
wheel -> python -> ncurses[version='6.0.*|>=6.0,<7.0a0|>=6.1,<7.0a0|>=6.2,<7.0a0|>=6.3,<7.0a0']
krb5 -> libedit[version='>=3.1.20210216,<3.2.0a0'] -> ncurses[version='6.0.*|>=6.1,<7.0a0|>=6.2,<7.0.0a0|>=6.2,<7.0a0|>=6.3,<7.0a0']
pluggy -> python[version='>=3.7,<3.8.0a0'] -> ncurses[version='6.0.*|>=6.0,<7.0a0|>=6.1,<7.0a0|>=6.2,<7.0a0|>=6.3,<7.0a0']
...
...
- tornado -> libgcc-ng[version='>=7.5.0'] -> __glibc[version='>=2.17']
- unicodedata2 -> libgcc-ng[version='>=7.5.0'] -> __glibc[version='>=2.17']
- xorg-libxau -> libgcc-ng[version='>=9.3.0'] -> __glibc[version='>=2.17']
- xorg-libxdmcp -> libgcc-ng[version='>=9.3.0'] -> __glibc[version='>=2.17']
- xz -> libgcc-ng[version='>=7.5.0'] -> __glibc[version='>=2.17']
- zlib -> libgcc-ng[version='>=7.5.0'] -> __glibc[version='>=2.17']
- zstd -> libgcc-ng[version='>=7.5.0'] -> __glibc[version='>=2.17']
Your installed version is: 2.33
Question
How can I determine the conflicting package in this conda environment and/or how can I create the environment using python 3.8?
After specifying the python version as the first dependency, and removing the unneeded elements as suggested by merv, I found a working YAML. I removed anaconda and the conda: entry. Furthermore, I ensured default_language_version in the .pre-commit-config.yaml file was set to:
default_language_version:
  python: python3.8 # or python3
I also deleted the .mypy_cache folder in the .git repository (even though I think this was not required). And I deleted the directory /home/<username>/.cache/pre-commit before running pre-commit run --all-files. (I thought it was worth mentioning as it is inherent to this environment.yml)
I did not have to specify the python version in the environment creation command. Instead, I ran:
conda env create --file environment.yml
The working environment.yml content is:
# This file is to automatically configure your environment. It allows you to
# run the code with a single command without having to install anything
# (extra).
# First run: conda env create --file environment.yml
# If you change this file, run: conda env update --file environment.yml
# Instructions for this networkx-to-lava-nc repository only. First time usage
# On Ubuntu (this is needed for lava-nc):
# sudo apt upgrade
# sudo apt full-upgrade
# yes | sudo apt install gcc
# Conda configuration settings. (Specify which modules/packages are installed.)
name: nx2lava
channels:
  - conda-forge
dependencies:
  # Specify specific python version.
  - python=3.8
  # Run python tests.
  - pytest-cov
  # Generate plots.
  - matplotlib
  # Run graph software quickly.
  - networkx
  - pip
  - pip:
      # Run pip install on .tar.gz file in GitHub repository (For lava-nc only).
      - https://github.com/lava-nc/lava/releases/download/v0.3.0/lava-nc-0.3.0.tar.gz
      # Turns relative import paths into absolute import paths.
      - absolufy-imports
      # Auto format Python code to make it flake8 compliant.
      - autoflake
      # Scan Python code for security issues.
      - bandit
      # Code formatting compliance.
      - black
      # Correct code misspellings.
      - codespell
      # Verify percentage of code that has at least 1 test.
      - coverage
      # Auto formats the Python documentation written in the code.
      - docformatter
      # Check code against style guidelines.
      - flake8
      # Auto sort the import statements.
      - isort
      # Auto format Markdown files.
      - mdformat
      # Auto check static typing.
      - mypy
      # Auto generate documentation.
      - pdoc3
      # Auto check programming style aspects.
      - pylint
      # Auto generate docstrings.
      - pyment
      # Identify and remove dead code.
      - vulture
      # Include GitHub pre-commit hook.
      - pre-commit
      # TODO: identify exact function (and usage).
      # Seems to be an autoformatter like black, but installed using npm instead of pip.
      - prettier
      # Automatically upgrades Python syntax to the new Python version syntax.
      - pyupgrade
      # Another static type checker for python, like mypy.
      - pyright
In this environment, python --version returns:
Python 3.8.13
Related
I have a GUI program I'm managing, written in Python. For the sake of not having to worry about environments, it's distributed as an executable built with PyInstaller. I can run this build from a function defined in the module as MyModule.build() (because to me it makes more sense to manage that script alongside the project itself).
I want to automate this to some extent, such that when a new release is added on GitLab, it can be built and attached to the release by a runner. The approach I currently have is functional but hacky:
I use the GitLab API to download the source of the tag for the release. I run python -m pip install -r {requirements_path} and python -m pip install {source_path} in the runner's environment. Then I import and run the MyModule.build() function to generate an executable, which is then uploaded and linked to the release with the GitLab API.
Obviously the middle section is wanting. What are the best approaches for similar projects? Can the package and requirements be installed in a separate venv from the one the runner script is running in?
One workflow would be to push a tag to create your release. The following jobs have a rules: configuration so they only run on tag pipelines.
One job will build the executable file. Another job will create the GitLab release using the file created in the first job.
build:
  rules:
    - if: "$CI_COMMIT_TAG" # Only run when tags are pushed
  image: python:3.9-slim
  variables:
    PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"
  cache: # https://docs.gitlab.com/ee/ci/caching/#cache-python-dependencies
    paths:
      - .cache/pip
      - venv/
  script:
    - python -m venv venv
    - source venv/bin/activate
    - python -m pip install -r requirements.txt # package requirements
    - python -m pip install pyinstaller # build requirements
    - pyinstaller --onefile --name myapp mypackage/__main__.py
  artifacts:
    paths:
      - dist
create_release:
  rules:
    - if: "$CI_COMMIT_TAG"
  needs: [build]
  image: registry.gitlab.com/gitlab-org/release-cli:latest
  script: # zip/upload your binary wherever it should be downloaded from
    - echo "Uploading release!"
  release: # create GitLab release
    tag_name: $CI_COMMIT_TAG
    name: 'Release of myapp version $CI_COMMIT_TAG'
    description: 'Release created using the release-cli.'
    assets: # link uploaded asset(s) to the release
      links:
        - name: 'release-zip'
          url: 'https://example.com/downloads/myapp/$CI_COMMIT_TAG/myapp.zip'
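The `script:` placeholder in `create_release` could, for example, upload the built binary to GitLab's generic package registry so the asset link has something real to point at. A sketch, assuming the generic packages API; the package name `myapp` and the asset URL are placeholders to adapt:

```yaml
create_release:
  # ... same rules/needs as above ...
  script:
    - |
      curl --header "JOB-TOKEN: ${CI_JOB_TOKEN}" \
           --upload-file dist/myapp \
           "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/myapp/${CI_COMMIT_TAG}/myapp"
```

The `needs: [build]` line is what makes the `dist/` artifact from the build job available in this job's workspace.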
My current pylint configuration:
Installed via pip (in requirements.txt).
The .pre-commit-config.yaml:
- repo: local
  hooks:
    - id: pylint
      name: pylint
      entry: pylint
      language: system
      types: [ python ]
      files: ^src/
      args:
        [
          "-rn", # display messages
          "--rcfile=.pylintrc",
          "--fail-under=8.5"
        ]
Execution method:
source venv/bin/activate &&\
pip freeze &&\
pre-commit install &&\
pre-commit run --all-files
When all .py files receive a score higher than 8.5, pylint just passes and does not display any messages. Is there any method to see all of the messages even if --fail-under is met (so we know what is wrong with the files)?
There is a setting which will force the output to always display: verbose: true. But it is only intended for debugging purposes, as it tends to make the output noisy and your contributors will be likely to mentally filter it out as warning noise.
Using your example:
- repo: local
  hooks:
    - id: pylint
      name: pylint
      entry: pylint
      language: system
      types: [ python ]
      files: ^src/
      args:
        [
          "-rn", # display messages
          "--rcfile=.pylintrc",
          "--fail-under=8.5"
        ]
      verbose: true
Unrelated, but there's no reason to use args for a repo: local hook, since nothing can override it; you can specify those options directly in entry:
entry: pylint -rn --rcfile=.pylintrc --fail-under=8.5
disclaimer: I created pre-commit
I have a project on ReadTheDocs that I'm trying to build. I'm using a very basic .readthedocs.yaml file that reads:
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the version of Python and other tools you might need
build:
  os: ubuntu-20.04
  tools:
    python: "3.9"
    # You can also specify other tool versions:
    # nodejs: "16"
    # rust: "1.55"
    # golang: "1.17"

# Build documentation in the docs/ directory with Sphinx
sphinx:
  builder: html
  configuration: docs/source/conf.py
  fail_on_warning: true

# If using Sphinx, optionally build your docs in additional formats such as PDF
# formats:
#   - pdf

# Optionally declare the Python requirements required to build your docs
python:
  install:
    - requirements: docs/requirements.txt

conda:
  environment: environment.yml
Unfortunately, the RTD build logs seem to tell me that after cloning and writing out the environment.yml file, the build process runs python env create --quiet --name develop --file environment.yml. This obviously fails with "no such file or directory" (Error 2) since, well, no such file or directory as env exists in the directory structure. Shouldn't RTD be running conda create here? How do I make it do the right thing?
Thanks,
Eli
This problem is described in https://github.com/readthedocs/readthedocs.org/issues/8595
In summary, python: "3.9" now means that CPython 3.9 + venv are used, regardless of conda.environment.
If one wants to use a conda environment, python: "miniconda3-4.7" or python: "mambaforge-4.10" need to be specified.
In the future, a better error message should be shown at least. Feel free to upvote the issue.
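Putting that together, the relevant part of the .readthedocs.yaml would look something like the following sketch (tool-version strings taken from the RTD config reference; adjust to the versions available when you build):

```yaml
version: 2
build:
  os: ubuntu-20.04
  tools:
    # Use a conda-based Python so that conda.environment is honored:
    python: "mambaforge-4.10"  # or "miniconda3-4.7"
conda:
  environment: environment.yml
```

With a plain CPython value such as "3.9", the conda.environment key is effectively ignored and a venv is used instead.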
I have a simple python project with a single file currently. It exists within the static/cgi-bin folder of my project. Currently, in the base of my directory, I have a .pre-commit-config.yaml file, and I have not touched the files in the .git/hooks folder. I would like to create pre-commit and pre-push hooks, but I cannot seem to get it working.
When I try to commit, the following happens:
isort................................................(no files to check)Skipped
flake8...............................................(no files to check)Skipped
black................................................(no files to check)Skipped
When I try to push, I get the following error:
pytest...................................................................Failed
hookid: pytest
============================= test session starts ==============================
platform darwin -- Python 2.7.15, pytest-4.0.2, py-1.7.0, pluggy-0.8.0
rootdir: /Users/.../deployment, inifile:
collected 0 items
========================= no tests ran in 0.01 seconds =========================
error: failed to push some refs to '...git'
Note that deployment is the folder I am working in.
My code in the yaml file is:
repos:
  - repo: local
    hooks:
      - id: isort
        name: isort
        entry: isort
        language: system
        types: [python]
        stages: [commit]
      - id: flake8
        name: flake8
        language: system
        entry: flake8
        types: [python]
        stages: [commit]
      - id: black
        language_version: python3.6
        name: black
        language: system
        entry: black
        types: [python]
        stages: [commit]
      - id: pytest
        name: pytest
        language: system
        entry: pytest
        pass_filenames: false
        always_run: true
        stages: [push]
pre-commit will pass a list of staged files matching types / files to the entry listed.
Your commit shows "(no files to check) Skipped" because there were no Python files in your commit. You probably want to run pre-commit run --all-files when first introducing new hooks.
As for your pytest hook, pytest exits nonzero when it does not run any tests, so that is failing.
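If you want the push hook to pass before any real tests exist, a single placeholder test is enough for pytest to collect something and exit 0. A sketch; the file name tests/test_smoke.py is an assumption:

```python
# tests/test_smoke.py -- hypothetical placeholder file.
# pytest exits nonzero when it collects no tests, which makes the
# pre-commit hook fail; one trivial test makes collection succeed.
def test_smoke():
    assert True
```

Delete the placeholder once real tests are in place.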
I took these from the YAML file of packages that my current environment is missing. How do I just install these within my current environment?
channels:
  - defaults
dependencies:
  - appdirs=1.4.3=py36h28b3542_0
  - asn1crypto=0.24.0=py36_0
  - attrs=18.2.0=py36h28b3542_0
  - blas=1.0=mkl
  - cffi=1.11.5=py36h6174b99_1
  - constantly=15.1.0=py36h28b3542_0
  - cryptography=2.3.1=py36hdbc3d79_0
  - freetype=2.9.1=hb4e5f40_0
  - html5lib=1.0.1=py36_0
  - idna=2.7=py36_0
  - incremental=17.5.0=py36_0
  - intel-openmp=2019.0=118
  - libgfortran=3.0.1=h93005f0_2
  - libxml2=2.9.4=0
  - libxslt=1.1.29=hc208041_6
  - lxml=4.1.1=py36h6c891f4_0
  - mkl=2019.0=118
  - mkl_fft=1.0.6=py36hb8a8100_0
  - mkl_random=1.0.1=py36h5d10147_1
  - numpy=1.15.3=py36h6a91979_0
  - numpy-base=1.15.3=py36h8a80b8c_0
  - pyasn1=0.4.4=py36h28b3542_0
  - pyasn1-modules=0.2.2=py36_0
  - pycparser=2.19=py36_0
  - pyopenssl=18.0.0=py36_0
  - service_identity=17.0.0=py36h28b3542_0
  - twisted=17.5.0=py36_0
  - zope=1.0=py36_1
  - zope.interface=4.5.0=py36h1de35cc_0
  - pip:
      - absl-py==0.2.2
      - ete3==3.1.1
      - grpcio==1.12.1
Conda Env Update
If you have a YAML file, then the most efficacious way to apply it to a given env is with conda env update:
conda env update --file environment.yml
⚠️ Warning: The conda env commands don't prompt you to review and approve the transactions - they simply execute the changes. Be sure to carefully review the YAML file to ensure all of the changes are desired.
Conda Install
The format that Conda accepts for conda install --file is the one that matches the output of conda list --export. It's not YAML, but a simple text file with one package per line, similar to the one produced by pip freeze, except for the single equals sign ('=' rather than '==').
conda list --export
appdirs=1.4.3=py36h28b3542_0
asn1crypto=0.24.0=py36_0
...
zope=1.0=py36_1
zope.interface=4.5.0=py36h1de35cc_0
Note that the builds are not required, e.g., the following would also work and may actually be slightly more portable across architectures:
appdirs=1.4.3
asn1crypto=0.24.0
...
zope=1.0
zope.interface=4.5.0
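Stripping the build strings can be scripted; a minimal sketch that keeps only `name=version` from `conda list --export` output (the input lines below are illustrative):

```python
def strip_builds(export_lines):
    """Drop the trailing build string from conda-export-style spec lines."""
    specs = []
    for line in export_lines:
        line = line.strip()
        # Skip blank lines and the comment header conda emits.
        if not line or line.startswith("#"):
            continue
        parts = line.split("=")  # name=version=build -> 3 parts
        specs.append("=".join(parts[:2]))  # keep name=version only
    return specs

lines = [
    "# This file may be used to create an environment using conda",
    "appdirs=1.4.3=py36h28b3542_0",
    "asn1crypto=0.24.0=py36_0",
    "zope.interface=4.5.0=py36h1de35cc_0",
]
print(strip_builds(lines))
# → ['appdirs=1.4.3', 'asn1crypto=0.24.0', 'zope.interface=4.5.0']
```

This is only for conda-export lines; pip-style `name==version` entries would need separate handling.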
Unfortunately, conda install doesn't support PyPI packages; you'd have to install those separately via pip install in your activated env.