I'm trying to create a GitHub workflow that runs a Python script (which outputs three graphs), adds those graphs to the README.md, then commits the changes to the repo so the graphs display on the README page. I would like it to be triggered by a new push.
As a bash script it would look like this:
git pull
python analysis_1.py
git add .
git commit -m "triggered on action"
git push
I'm not really sure where to start or how to set up the action. I tried setting one up, but it wouldn't make any changes.
See this answer for how to commit back to your repository during a workflow.
In your case it might look something like this. Tweak it where necessary.
on:
  push:
    branches:
      - master

jobs:
  updateGraphs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v1
        with:
          python-version: '3.x'
      - name: Generate graphs
        run: python analysis_1.py
      - name: Update graphs
        run: |
          git config --global user.name 'Your Name'
          git config --global user.email 'your-username@users.noreply.github.com'
          git commit -am "Update graphs"
          git push
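One caveat with the commit step above: git commit -am only stages files Git already tracks, so newly created graph images would be skipped, and the commit fails outright when the script changes nothing. A hardened sketch of that last step (assuming the graphs are written somewhere inside the working tree):

- name: Update graphs
  run: |
    git config --global user.name 'Your Name'
    git config --global user.email 'your-username@users.noreply.github.com'
    git add -A                                                   # also stages newly created graph files
    git diff --cached --quiet || git commit -m "Update graphs"   # commit only if something changed
    git push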
Alternatively, raise a pull request instead of committing immediately, using the create-pull-request action.
on:
  push:
    branches:
      - master

jobs:
  updateGraphs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v1
        with:
          python-version: '3.x'
      - name: Generate graphs
        run: python analysis_1.py
      - name: Create Pull Request
        uses: peter-evans/create-pull-request@v2
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          commit-message: Update graphs
          title: Update graphs
          branch: update-graphs
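Note that committing the image files alone will not make them show up on the README page; the README has to reference them by path, e.g. with a line like ![graph 1](graphs/graph_1.png) (the graphs/ path is just an illustration; use whatever location analysis_1.py writes to).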
My repo contains a main.py that generates an HTML map and saves the results in a CSV. I want the action to:
execute the Python script (this part seems to be OK);
add, commit, and push the generated files to the main branch, so that they are available in the page associated with the repo.
name: refresh map

on:
  schedule:
    - cron: "30 11 * * *" # runs at 11:30 UTC every day

jobs:
  getdataandrefreshmap:
    runs-on: ubuntu-latest
    steps:
      - name: checkout repo content
        uses: actions/checkout@v3 # checkout the repository content to the github runner
      - name: setup python
        uses: actions/setup-python@v4
        with:
          python-version: 3.8 # install the python version needed
      - name: Install dependencies
        run: |
          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
      - name: execute py script
        uses: actions/checkout@v3
        run: |
          python main.py
          git config user.name github-actions
          git config user.email github-actions@github.com
          git add .
          git commit -m "crongenerated"
          git push
The GitHub Action does not pass when I include the second uses: actions/checkout@v3 and the git commands.
Thanks in advance for your help
If you want to run a script, then you don't need an additional checkout step for that. There is a difference between steps that use actions and steps that execute shell scripts directly. You can read more about it here.
In your configuration file, you kind of mix the two in the last step. You don't need an additional checkout step because the repo from the first step is still checked out. So you can just use the following workflow:
name: refresh map

on:
  schedule:
    - cron: "30 11 * * *" # runs at 11:30 UTC every day

jobs:
  getdataandrefreshmap:
    runs-on: ubuntu-latest
    steps:
      - name: checkout repo content
        uses: actions/checkout@v3 # checkout the repository content to the github runner
      - name: setup python
        uses: actions/setup-python@v4
        with:
          python-version: 3.8 # install the python version needed
      - name: Install dependencies
        run: |
          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
      - name: execute py script
        run: |
          python main.py
          git config user.name github-actions
          git config user.email github-actions@github.com
          git add .
          git commit -m "crongenerated"
          git push
I tested it with a dummy repo and everything worked.
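One more thing worth checking if the push is rejected with a 403: on repositories where the default GITHUB_TOKEN is read-only, the job needs explicit write access to the repository contents. A minimal sketch, added at the job level:

jobs:
  getdataandrefreshmap:
    runs-on: ubuntu-latest
    permissions:
      contents: write  # lets the token set up by actions/checkout push commits
    steps:
      # ... same steps as above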
I've set up coverage.py in my GitHub Actions to generate a code coverage report. It displays the report if I go to each action run, but how do I store the report in my repository and display the code coverage percentage as a badge as well?
This is my build.yml file with the code coverage:
name: Build

on:
  push:
    branches: ["main"]
  pull_request:
    branches: ["main"]
  workflow_dispatch:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Install Python Dependencies
        run: |
          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
      - name: Run linter on src directory
        run: |
          pylint --rcfile=pylintrc src
      - name: Run tests and coverage report
        run: |
          coverage run --data-file tests/.coverage -m pytest -s
          coverage json --data-file tests/.coverage -o buf/tests/coverage.json
          coverage report --data-file tests/.coverage
Is it possible to add a pylint badge to the repo from the actions as well?
Yes, it's possible. Here's a blog post detailing a simple way: https://nedbatchelder.com/blog/202209/making_a_coverage_badge.html
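In short, the approach from that post: pull the total percentage out of the coverage.json you already generate, then publish it as a shields.io endpoint badge stored in a gist. A sketch of the extra steps, assuming you have created a public gist and stored a gist-scoped token as a GIST_TOKEN secret (both are assumptions, not part of your current setup):

- name: Extract total coverage
  run: |
    # read the "totals" percentage from the JSON report produced above
    TOTAL=$(python -c "import json; print(json.load(open('buf/tests/coverage.json'))['totals']['percent_covered_display'])")
    echo "total=$TOTAL" >> "$GITHUB_ENV"
- name: Update coverage badge
  uses: schneegans/dynamic-badges-action@v1.6.0
  with:
    auth: ${{ secrets.GIST_TOKEN }}  # gist-scoped personal access token (assumed secret)
    gistID: YOUR_GIST_ID             # placeholder: the gist that holds the badge JSON
    filename: coverage-badge.json
    label: coverage
    message: ${{ env.total }}%
    color: green

The README then embeds the badge via shields.io's endpoint URL pointing at the raw gist file. The same trick works for a pylint badge if you extract the score from pylint's output instead.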
I am trying to set up a simple GitHub Actions workflow for SQL linting using the sqlfluff package. Here is the Sunrise Movement workflow, which is simple and clean.
name: Lint Models

on: [pull_request]

jobs:
  lint-models:
    runs-on: ubuntu-latest
    steps:
      - uses: "actions/checkout@v2"
      - uses: "actions/setup-python@v2"
        with:
          python-version: "3.8"
      - name: Install SQLFluff
        run: "pip install sqlfluff==0.12.0"
      - name: Lint models
        run: "sqlfluff lint models"
When I tried to run it in GitHub Actions, it gave me the following error message. I am not quite sure why it is throwing an error. Help is appreciated, as I am trying to learn GitHub Actions for the first time.
You have this:
run: "sqlfluff lint models"
This says to lint the directory called models. That directory does not exist in your repo (is it a subfolder?).
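If the SQL actually lives in a subfolder, point the linter at that path instead; the transform/models path below is just an illustration:

- name: Lint models
  run: "sqlfluff lint transform/models"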
I have two repositories A & B.
Azure Repository A - Contains a python app
Azure Repository B - Contains .yml templates and .py scripts I want to run in the .yml templates
According to the documentation, I cannot do this: when I expand the template into the calling repository A's pipeline, it acts like a code directive and just injects the YAML; it will not know or care about the .py files in the repository.
What are my options, short of writing all my .py routines inline?
Azure Repo A's pipeline YAML file:
trigger: none

resources:
  pipelines:
    - pipeline: my_project_a_pipeline
      source: trigger_pipeline
      trigger:
        branches:
          include:
            - master
  repositories:
    - repository: template_repo_b
      type: git
      name: template_repo_b
      ref: main

stages:
  - template: pipelines/some_template.yml@template_repo_b
    parameters:
      SOME_PARAM_KEY: "some_param_value"
Azure Repo B's some_template.yml:
parameters:
  - name: SOME_PARAM_KEY
    type: string

stages:
  - stage: MyStage
    displayName: "SomeStage"
    jobs:
      - job: "MyJob"
        displayName: "MyJob"
        steps:
          - bash: |
              echo Bashing
              ls -la
            displayName: 'Execute Warmup'
          - task: PythonScript@0
            inputs:
              scriptSource: "filePath"
              scriptPath: /SOME_PATH_ON_REPO_B/my_dumb_script.py
              script: "my_dumb_script.py"
Is there an option to wire the .py files into a completely separate repo C, add C to the resources of B's templates, and be on my way?
EDIT:
I can see "In Azure templates repository, is there a way to mention repository for a filePath parameter of azure task 'pythonScript'?", but then how do I consume the Python package? Can I still use the PythonScript task? It sounds like I would then need to call my pip-packaged code straight from bash.
I figured it out: how to pip install .py files in Azure DevOps pipelines, using Azure repositories, via a template in the same repo.
Just add a reference to yourself at the top of any template.
In the consuming repo
resources:
  repositories:
    - repository: this_template_repo
      type: git
      name: this_template_repo
      ref: master
Then add a job, referencing yourself by that name:
- job: "PIP_INSTALL_LIBS"
  displayName: "pip install libraries to agent"
  steps:
    - checkout: this_template_repo
      path: this_template_repo
    - bash: |
        python3 -m pip install setuptools
        python3 -m pip install -e $(Build.SourcesDirectory)/somepypimodule/src --force-reinstall --no-deps
      displayName: 'pip install pip package'
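To close the loop on the PythonScript question: once the pip install step has run, any later step in the same job can import the package, so you can call it from bash (or from a PythonScript task with inline code). A hypothetical follow-up step, with the module name assumed:

- bash: |
    python3 -m somepypimodule   # module name is an assumption; use your package's entry point
  displayName: 'run pip-installed package'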
I am trying to publish a Python package to PyPI, from a Github workflow, but the authentication fails for "Test PyPI". I successfully published to Test PyPI from the command line, so my API token must be correct. I also checked for leading and trailing spaces in the secret value (i.e., on GitHub).
As the last commits show, I tried a few things without success.
I first tried to inline simple bash commands into the workflow as follows, but I was not able to get my secrets into environment variables; nothing showed up in the logs when I printed these variables.
- name: Publish on Test PyPI
  env:
    TWINE_USERNAME: __token__
    TWINE_PASSWORD: ${{ secrets.PYPI_TEST_TOKEN }}
    TWINE_REPOSITORY_URL: "https://test.pypi.org/legacy/"
  run: |
    echo "$TWINE_PASSWORD"
    pip install twine
    twine check dist/*
    twine upload dist/*
I also tried to use a dedicated GitHub Action as follows, but it does not work either. I guess the problem comes from the secrets not being available in my workflow. What puzzled me is that my workflow uses another token/secret just fine! Though, if I put that one in an environment variable, nothing is printed out either. I also recreated my secrets under different names (PYPI_TEST_TOKEN and TEST_PYPI_API_TOKEN), but to no avail.
- name: Publish to Test PyPI
  uses: pypa/gh-action-pypi-publish@release/v1
  with:
    user: __token__
    password: ${{ secrets.TEST_PYPI_API_TOKEN }}
    repository_url: https://test.pypi.org/legacy/
I guess I am missing something obvious (as usual). Any help is highly appreciated.
I eventually figured it out. My mistake was that I defined my secrets within an environment and, by default, workflows do not run in any specific environment. For this to happen, I have to explicitly name the environment in the job description as follows:
jobs:
  publish:
    environment: CI  # <--- /!\ Here is the link to the environment
    needs: build
    runs-on: ubuntu-latest
    if: startsWith(github.ref, 'refs/tags/v')
    steps:
      - uses: actions/checkout@v2
      # Some more steps here ...
      - name: Publish to Test PyPI
        env:
          TWINE_USERNAME: "__token__"
          TWINE_PASSWORD: ${{ secrets.TEST_PYPI_API_TOKEN }}
          TWINE_REPOSITORY_URL: "https://test.pypi.org/legacy/"
        run: |
          echo KEY: '${TWINE_PASSWORD}'
          twine check dist/*
          twine upload --verbose --skip-existing dist/*
The documentation mentions it actually.
Thanks to those who commented for pointing me in the right direction.
This is the problem I struggled with too. Since I am working with multiple environments that all share identically named secrets with different values, the following solution worked for me. The isolated pieces are described here and there, but it wasn't obvious how to put them together.
First, I define the workflow so that the environment is selected during the workflow_dispatch event:
on:
  workflow_dispatch:
    inputs:
      environment:
        type: choice
        description: Select the environment
        required: true
        options:
          - TEST
          - UAT
I then reference it in the jobs context:
jobs:
  run-portal-tests:
    runs-on: ubuntu-latest
    environment: ${{ github.event.inputs.environment }}
Finally, I map the secrets into the step that needs them:
- name: Run tests
  env:
    ENDPOINT: ${{ secrets.ENDPOINT }}
    TEST_USER: ${{ secrets.TEST_USER }}
    TEST_USER_PASSWORD: ${{ secrets.TEST_USER_PASSWORD }}
    CLIENT_ID: ${{ secrets.CLIENT_ID }}
    CLIENT_SECRET: ${{ secrets.CLIENT_SECRET }}
  run: python3 main.py
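One wrinkle if the same workflow is ever triggered by push or schedule as well: github.event.inputs is empty for those events, so the environment expression resolves to nothing. A common fallback pattern (the TEST default here is an assumption; adjust to your environment names):

jobs:
  run-portal-tests:
    runs-on: ubuntu-latest
    # fall back to TEST when the run was not started via workflow_dispatch
    environment: ${{ github.event.inputs.environment || 'TEST' }}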