Azure DevOps pipelines: cache Python dependencies - python

I want to cache the dependencies in requirements.txt. See https://learn.microsoft.com/en-us/azure/devops/pipelines/release/caching?view=azure-devops#pythonpip. Here is my azure-pipelines.yml:
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/python
trigger:
- master
pool:
  vmImage: 'ubuntu-latest'
strategy:
  matrix:
    Python38:
      python.version: '3.8'
variables:
  PIP_CACHE_DIR: $(Pipeline.Workspace)/.pip
steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'
- task: Cache@2
  inputs:
    key: 'python | "$(Agent.OS)" | requirements.txt'
    restoreKeys: |
      python | "$(Agent.OS)"
      python
    path: $(PIP_CACHE_DIR)
  displayName: Cache pip packages
- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'
- script: |
    pip install pytest pytest-azurepipelines
    pytest
  displayName: 'pytest'
The dependencies specified in my requirements.txt are nevertheless downloaded and installed on every pipeline run.
The Cache@2 task gives the following output:
Starting: Cache pip packages
==============================================================================
Task : Cache
Description : Cache files between runs
Version : 2.0.1
Author : Microsoft Corporation
Help : https://aka.ms/pipeline-caching-docs
==============================================================================
Resolving key:
- python [string]
- "Linux" [string]
- requirements.txt [file] --> EBB7474E7D5BC202D25969A2E11E0D16251F0C3F3F656F1EE6E2BB7B23868B10
Resolved to: python|"Linux"|jNwyZU113iWcGlReTrxg8kzsyeND5OIrPLaN0I1rRs0=
Resolving restore key:
- python [string]
- "Linux" [string]
Resolved to: python|"Linux"|**
Resolving restore key:
- python [string]
Resolved to: python|**
ApplicationInsightsTelemetrySender will correlate events with X-TFS-Session 85b76fe3-b469-4330-a584-db569bc45342
Getting a pipeline cache artifact with one of the following fingerprints:
Fingerprint: `python|"Linux"|jNwyZU113iWcGlReTrxg8kzsyeND5OIrPLaN0I1rRs0=`
Fingerprint: `python|"Linux"|**`
Fingerprint: `python|**`
There is a cache miss.
ApplicationInsightsTelemetrySender correlated 1 events with X-TFS-Session 85b76fe3-b469-4330-a584-db569bc45342
Finishing: Cache pip packages

Enabling system diagnostics and viewing the log of the 'Post-job: Cache pip packages' step showed the reason why no cache was created:
##[debug]Evaluating condition for step: 'Cache pip packages'
##[debug]Evaluating: AlwaysNode()
##[debug]Evaluating AlwaysNode:
##[debug]=> True
##[debug]Result: True
Starting: Cache pip packages
==============================================================================
Task : Cache
Description : Cache files between runs
Version : 2.0.1
Author : Microsoft Corporation
Help : https://aka.ms/pipeline-caching-docs
==============================================================================
##[debug]Skipping because the job status was not 'Succeeded'.
Finishing: Cache pip packages
There were failing tests in the build pipeline. The Cache task only saves a new cache in its post-job step when the job succeeds, so the failed test step prevented the cache from ever being created. After I removed the failing tests, the cache was saved and used on subsequent runs.
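To confirm on a later run that the restored cache is actually being used, a small diagnostic step can be added before the install step. This is a minimal sketch, assuming pip >= 20.1 (which provides the pip cache subcommand); the step is purely illustrative:
- script: |
    echo "PIP_CACHE_DIR is $(PIP_CACHE_DIR)"
    pip cache dir       # the directory pip resolves as its cache
    pip cache list      # wheels currently cached (empty on a cache miss)
  displayName: 'Inspect pip cache'
On a warm run, the Cache step's log should report a hit on the first fingerprint, and the install step should show pip using cached wheels instead of downloading them.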

Related

python - Automated building of executables

I have a GUI program I'm managing, written in Python. For the sake of not having to worry about environments, it's distributed as an executable built with PyInstaller. I can run this build from a function defined in the module as MyModule.build() (because to me it makes more sense to manage that script alongside the project itself).
I want to automate this to some extent, such that when a new release is added on GitLab, it can be built and attached to the release by a runner. The approach I currently have is functional but hacky:
I use the GitLab API to download the source of the tag for the release. I run python -m pip install -r {requirements_path} and python -m pip install {source_path} in the runner's environment. Then I import and run the MyModule.build() function to generate an executable, which is then uploaded and linked to the release with the GitLab API.
Obviously the middle section is wanting. What are the best approaches for similar projects? Can the package and requirements be installed in a separate venv from the one the runner script is running in?
One workflow would be to push a tag to create your release. The following jobs have a rules: configuration so they only run on tag pipelines.
One job will build the executable file. Another job will create the GitLab release using the file created in the first job.
build:
  rules:
    - if: "$CI_COMMIT_TAG" # Only run when tags are pushed
  image: python:3.9-slim
  variables:
    PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"
  cache: # https://docs.gitlab.com/ee/ci/caching/#cache-python-dependencies
    paths:
      - .cache/pip
      - venv/
  script:
    - python -m venv venv
    - source venv/bin/activate
    - python -m pip install -r requirements.txt # package requirements
    - python -m pip install pyinstaller # build requirements
    - pyinstaller --onefile --name myapp mypackage/__main__.py
  artifacts:
    paths:
      - dist

create_release:
  rules:
    - if: "$CI_COMMIT_TAG"
  needs: [build]
  image: registry.gitlab.com/gitlab-org/release-cli:latest
  script: # zip/upload your binary wherever it should be downloaded from
    - echo "Uploading release!"
  release: # create GitLab release
    tag_name: $CI_COMMIT_TAG
    name: 'Release of myapp version $CI_COMMIT_TAG'
    description: 'Release created using the release-cli.'
    assets: # link uploaded asset(s) to the release
      links:
        - name: 'release-zip'
          url: 'https://example.com/downloads/myapp/$CI_COMMIT_TAG/myapp.zip'
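The script: section above only echoes a placeholder; the binary still has to be uploaded somewhere. One option, sketched below under the assumption that the project's generic package registry is an acceptable download location (the package name myapp is illustrative, and curl must be available in the job image), is to upload the PyInstaller output with the predefined job token and point the release asset at that URL:
create_release:
  rules:
    - if: "$CI_COMMIT_TAG"
  needs: [build]
  image: registry.gitlab.com/gitlab-org/release-cli:latest
  script:
    # upload the binary from the build job's artifact to the generic package registry;
    # CI_JOB_TOKEN, CI_API_V4_URL and CI_PROJECT_ID are predefined CI variables
    - curl --fail --header "JOB-TOKEN: $CI_JOB_TOKEN" --upload-file dist/myapp "$CI_API_V4_URL/projects/$CI_PROJECT_ID/packages/generic/myapp/$CI_COMMIT_TAG/myapp"
  release:
    tag_name: $CI_COMMIT_TAG
    name: 'Release of myapp version $CI_COMMIT_TAG'
    description: 'Release created using the release-cli.'
    assets:
      links:
        - name: 'myapp'
          url: '$CI_API_V4_URL/projects/$CI_PROJECT_ID/packages/generic/myapp/$CI_COMMIT_TAG/myapp'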

How to template Python tasks in Azure DevOps pipelines

I have two repositories A & B.
Azure Repository A - Contains a python app
Azure Repository B - Contains .yml templates and .py scripts I want to run in the .yml templates
According to the documentation, I cannot do this: when the template is expanded into the calling repository A's pipeline, it acts like an include directive and simply injects the YAML; it neither knows nor cares about the .py files in the template repository.
What are my options, short of writing all my .py routines inline?
Azure Repo A's pipeline YAML file:
trigger: none

resources:
  pipelines:
    - pipeline: my_project_a_pipeline
      source: trigger_pipeline
      trigger:
        branches:
          include:
            - master
  repositories:
    - repository: template_repo_b
      type: git
      name: template_repo_b
      ref: main

stages:
  - template: pipelines/some_template.yml@template_repo_b
    parameters:
      SOME_PARAM_KEY: "some_param_value"
Azure Repo B's some_template.yml:
parameters:
  - name: SOME_PARAM_KEY
    type: string

stages:
  - stage: MyStage
    displayName: "SomeStage"
    jobs:
      - job: "MyJob"
        displayName: "MyJob"
        steps:
          - bash: |
              echo Bashing
              ls -la
            displayName: 'Execute Warmup'
          - task: PythonScript@0
            inputs:
              scriptSource: "filePath"
              scriptPath: /SOME_PATH_ON_REPO_B/my_dumb_script.py
              script: "my_dumb_script.py"
Is there an option to wire the .py files into a completely separate repo C, add C to the resources of B's templates, and be on my way?
EDIT:
I can see "In Azure templates repository, is there a way to mention repository for a filePath parameter of azure task 'pythonScript'?", but then how do I consume the Python package? Can I still use the PythonScript task? It sounds like I would then need to call my pip-packaged code straight from bash.
I figured it out: how to pip install .py files in Azure DevOps pipelines, using Azure repositories, via a template in the same repo.
Just add a reference to the template repo itself at the top of any template.
In the consuming repo:
resources:
  repositories:
    - repository: this_template_repo
      type: git
      name: this_template_repo
      ref: master
Then add a job, referencing yourself by that name:
- job: "PIP_INSTALL_LIBS"
  displayName: "pip install libraries to agent"
  steps:
    - checkout: this_template_repo
      path: this_template_repo
    - bash: |
        python3 -m pip install setuptools
        python3 -m pip install -e $(Build.SourcesDirectory)/somepypimodule/src --force-reinstall --no-deps
      displayName: 'pip install pip package'
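On hosted agents each job starts on a fresh machine, so any step that needs the package has to run in the same job, after the install step. A minimal sketch of such a follow-up step (somepypimodule is taken from the path above and is illustrative):
    - bash: |
        # runs in the same job as PIP_INSTALL_LIBS, so the editable install is still present
        python3 -c "import somepypimodule; print(somepypimodule.__file__)"
      displayName: 'Verify pip-installed template code'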

How to get detailed error information when gitlab-ci fails

Gitlab version is 13.6.6
Gitlab-runner version is 11.2.0
My .gitlab-ci.yml:
image: "python:3.7"
before_script:
  - pip install flake8
flake8:
  stage: test
  script:
    - flake8 --max-line-length=79
  tags:
    - test
The only information Pipelines shows is that the script failed, and the failed job's output is "No job log". How can I get more detailed error output?
Using artifacts can help you.
image: "python:3.7"
before_script:
- pip install flake8
flake8:
stage: test
script:
- flake8 -max-line-length=79
- cd path/to
tags:
- test
artifacts:
when: on_failure
paths:
- path/to/test.log
The log file can be downloaded via the web interface.
Note: using when: on_failure ensures that test.log is only collected when the build fails, saving disk space on successful builds.
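As written, though, nothing in the script actually creates path/to/test.log, so there is nothing to collect. A minimal sketch of one way to produce the file, assuming a reasonably recent flake8 (whose --output-file and --tee flags write findings to a file while still echoing them to stdout):
flake8:
  stage: test
  script:
    - mkdir -p path/to
    # flake8 writes the report to test.log before exiting nonzero,
    # so the artifact exists even when the job fails
    - flake8 --max-line-length=79 --output-file=path/to/test.log --tee
  artifacts:
    when: on_failure
    paths:
      - path/to/test.log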

How to add --no-bundler command in azure build pipeline

I am having some trouble running my function app in Python. When I publish the function directly with func azure functionapp publish air-temperature-v2 --no-bundler, it is published straight to portal.azure and works as expected. However, if I commit and push to Azure Repos and let it generate its build, everything succeeds, but when I try to run the function it fails with a "module named 'pandas' not found" error. It works fine locally and online (using the --no-bundler command). My question is: how can I add the --no-bundler option in an Azure Python pipeline? My YAML is as follows:
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/python
trigger:
- master
pool:
  vmImage: 'ubuntu-latest'
strategy:
  matrix:
    Python36:
      python.version: '3.6'
steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'
- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'
- script: python HttpExample/__init__.py
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.SourcesDirectory)'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/Application$(Build.BuildId).zip'
    replaceExistingArchive: true
    verbose: # (no value); this input is optional
- task: PublishBuildArtifacts@1
#- script: |
#    pip install pytest pytest-azurepipelines
#    pytest
#  displayName: 'pytest'
# ignore
- task: AzureFunctionApp@1
  inputs:
    azureSubscription: 'zohair-rg'
    appType: 'functionAppLinux'
    appName: 'air-temperature-v2'
    package: '$(Build.ArtifactStagingDirectory)/Application$(Build.BuildId).zip'
    startUpCommand: 'func azure functionapp publish air-temperature-v2 --no-bundler'
I have even tried adding the --no-bundler command as the startup command, but it still does not work.
This could be related to an azure-functions-core-tools version issue. Please try the following and redeploy:
Update your azure-functions-core-tools version to the latest.
Try to deploy your build using the command below:
func azure functionapp publish <app_name> --build remote
There was a similar issue raised some time back; I can't recall the link, but this fix worked.
Alternatively, have you considered the Azure CLI task to deploy the Azure Function? Here is a detailed article explaining Azure CI/CD for Python functions using the Azure CLI:
https://clemenssiebler.com/deploy-azure-functions-python-azure-devops/
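If the deployment has to stay inside the pipeline task rather than go through the core-tools CLI, a remote (Oryx) build can also be requested via app settings. A minimal sketch, assuming a Linux consumption plan; SCM_DO_BUILD_DURING_DEPLOYMENT and ENABLE_ORYX_BUILD are the documented remote-build switches, and the other inputs mirror the question's task:
- task: AzureFunctionApp@1
  inputs:
    azureSubscription: 'zohair-rg'
    appType: 'functionAppLinux'
    appName: 'air-temperature-v2'
    package: '$(Build.ArtifactStagingDirectory)/Application$(Build.BuildId).zip'
    # ask the Kudu/Oryx remote build to pip install requirements.txt during deployment
    appSettings: '-SCM_DO_BUILD_DURING_DEPLOYMENT true -ENABLE_ORYX_BUILD true'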
Hope it helps.

Is there a Python/Django equivalent to Rails bundler-audit?

I'm fairly new to Django so apologies in advance if this is obvious.
In Rails projects, I use a gem called bundler-audit to check that the patch levels of the gems I'm installing don't include known security vulnerabilities. Normally, I incorporate running bundler-audit into my CI pipeline so that any time I deploy, I get a warning (and a failed build) if a gem has a security vulnerability.
Is there a similar system for checking vulnerabilities in Python packages?
After writing out this question, I searched around some more and found Safety, which was exactly what I was looking for.
In case anyone else is setting up CircleCI for a Django project and wants to check their packages for vulnerabilities, here is the configuration I used in my .circleci/config.yml:
version: 2
jobs:
  build:
    # build and run tests
  safety_check:
    docker:
      - image: circleci/python:3.6.1
    steps:
      - checkout
      - run:
          command: |
            python3 -m venv env3
            . env3/bin/activate
            pip install safety
            # specify requirements.txt
            safety check -r requirements.txt
  merge_master:
    # merge passing code into master

workflows:
  version: 2
  test_and_merge:
    jobs:
      - build:
          filters:
            branches:
              ignore: master
      - safety_check:
          filters:
            branches:
              ignore: master
      - merge_master:
          filters:
            branches:
              only: develop
          requires:
            - build
            # code is only merged if safety check passes
            - safety_check
To check that this works, run pip install insecure-package && pip freeze > requirements.txt, then push and watch the CircleCI build fail.
