Setting up GitHub Actions SQL linting - Python

I am trying to set up a simple GitHub Actions workflow for SQL linting using the sqlfluff package. Here is the Sunrise Movement workflow, which is simple and clean.
name: Lint Models
on: [pull_request]
jobs:
  lint-models:
    runs-on: ubuntu-latest
    steps:
      - uses: "actions/checkout@v2"
      - uses: "actions/setup-python@v2"
        with:
          python-version: "3.8"
      - name: Install SQLFluff
        run: "pip install sqlfluff==0.12.0"
      - name: Lint models
        run: "sqlfluff lint models"
When I tried to run it in GitHub Actions, it gave me the following error message. I am not quite sure why it is throwing an error. Help is appreciated, as I am trying to learn GitHub Actions for the first time.

You have this:
run: "sqlfluff lint models"
This says to lint the directory called models. That directory does not exist in your repo (is it a subfolder?).
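If the models live in a subfolder, point the lint step at that path instead. A minimal sketch (the transform/models path below is only a placeholder; substitute wherever the models actually sit in your repo):
- name: Lint models
  # "transform/models" is a placeholder; use the real path to your models
  run: "sqlfluff lint transform/models"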

Related

Terraform not found in Bitbucket

So I am trying to create a pipeline on Bitbucket. On my local computer, I navigate to the folder (cd terraform/environments/dev) and run terraform init without an issue. However, when I run the test pipeline on Bitbucket, it stops on the second step because
bash: terraform: command not found
How can I fix this? I believe I need to install terraform on bitbucket somehow but I am not sure how to do so. Do I use python pip commands? If so, how and why?
image: atlassian/default-image:2
pipelines:
  branches:
    test:
      - step:
          name: 'Navigate to Dev'
          script:
            - cd terraform/environments/dev
          condition:
            changesets:
              includePaths:
                - "terraform/modules"
                - "terraform/environments/dev"
      - step:
          name: 'Initialize Terraform'
          script:
            - terraform init
You need the correct image for your build agent. In this situation, the agent basically only needs terraform installed and accessible:
image: hashicorp/terraform
This will fix your issue. You can also, of course, pin the image tag to your specific version of Terraform.
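Putting it together, a minimal sketch of the corrected pipeline (the 1.5 tag is only an example; also note that each step runs in a fresh container, so the cd and the terraform init need to happen in the same step):
image: hashicorp/terraform:1.5  # pin to your Terraform version
pipelines:
  branches:
    test:
      - step:
          name: 'Initialize Terraform'
          script:
            # each step starts in a fresh container, so change directory
            # and run init within the same script
            - cd terraform/environments/dev
            - terraform init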

How to template Python tasks in Azure DevOps pipelines

I have two repositories A & B.
Azure Repository A - Contains a Python app
Azure Repository B - Contains .yml templates and the .py scripts I want to run from those templates
According to the documentation, I cannot do this, because when the template is expanded into calling repository A's pipeline, it works like an include directive and just injects the YAML; it does not know or care about the .py files in the template repository.
What are my options, short of writing all my .py routines inline?
Azure Repo A's pipeline YAML file:
trigger: none
resources:
  pipelines:
    - pipeline: my_project_a_pipeline
      source: trigger_pipeline
      trigger:
        branches:
          include:
            - master
  repositories:
    - repository: template_repo_b
      type: git
      name: template_repo_b
      ref: main
stages:
  - template: pipelines/some_template.yml@template_repo_b
    parameters:
      SOME_PARAM_KEY: "some_param_value"
Azure Repo B's some_template.yml:
parameters:
  - name: SOME_PARAM_KEY
    type: string
stages:
  - stage: MyStage
    displayName: "SomeStage"
    jobs:
      - job: "MyJob"
        displayName: "MyJob"
        steps:
          - bash: |
              echo Bashing
              ls -la
            displayName: 'Execute Warmup'
          - task: PythonScript@0
            inputs:
              scriptSource: "filePath"
              scriptPath: /SOME_PATH_ON_REPO_B/my_dumb_script.py
              script: "my_dumb_script.py"
Is there an option to wire the .py files into a completely separate repo C, add C to the resources of B's templates, and be on my way?
EDIT:
I can see "In Azure templates repository, is there a way to mention repository for a filePath parameter of azure task 'pythonScript'?", but then how do I consume the Python package? Can I still use the PythonScript task? It sounds like I would then need to call my pip-packaged code straight from bash.
I figured it out: how to pip install .py files in Azure DevOps pipelines, using Azure repositories, via a template in the same repo. Just add a reference to yourself at the top of any template. In the consuming repo:
resources:
  repositories:
    - repository: this_template_repo
      type: git
      name: this_template_repo
      ref: master
Then add a job, referencing yourself by that name:
- job: "PIP_INSTALL_LIBS"
  displayName: "pip install libraries to agent"
  steps:
    - checkout: this_template_repo
      path: this_template_repo
    - bash: |
        python3 -m pip install setuptools
        python3 -m pip install -e $(Build.SourcesDirectory)/somepypimodule/src --force-reinstall --no-deps
      displayName: 'pip install pip package'
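Once that job has run, the package is importable from any later step in the same job; a minimal follow-up step to verify it (somepypimodule is the hypothetical package name from the snippet above):
- bash: |
    # "somepypimodule" is the hypothetical package installed above
    python3 -c "import somepypimodule; print(somepypimodule.__file__)"
  displayName: 'verify pip-installed package'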

Committing Python-generated graphs to a repo using GitHub Actions

I'm trying to create a GitHub workflow that runs a Python script (which outputs three graphs), adds those graphs to the readme.md, then commits the changes to the repo and displays the graphs on the readme page. I would like it to trigger on a new push.
As a bash script, it would look like this:
git pull
python analysis_1.py
git add .
git commit -m "triggered on action"
git push
I'm not really sure where to start on this or how to set up the action. I tried setting one up, but it wouldn't make any changes.
See this answer for how to commit back to your repository during a workflow.
In your case it might look something like this. Tweak it where necessary.
on:
  push:
    branches:
      - master
jobs:
  updateGraphs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v1
        with:
          python-version: '3.x'
      - name: Generate graphs
        run: python analysis_1.py
      - name: Update graphs
        run: |
          git config --global user.name 'Your Name'
          git config --global user.email 'your-username@users.noreply.github.com'
          git commit -am "Update graphs"
          git push
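One caveat: git commit exits non-zero when there is nothing to commit, which would fail the workflow on runs where the graphs did not change. A guarded variant of the last step (a sketch, using the same names as above):
- name: Update graphs
  run: |
    git config --global user.name 'Your Name'
    git config --global user.email 'your-username@users.noreply.github.com'
    # only commit and push if the generated graphs actually changed
    git diff --quiet || (git commit -am "Update graphs" && git push)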
Alternatively, raise a pull request instead of committing immediately, using the create-pull-request action.
on:
  push:
    branches:
      - master
jobs:
  updateGraphs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v1
        with:
          python-version: '3.x'
      - name: Generate graphs
        run: python analysis_1.py
      - name: Create Pull Request
        uses: peter-evans/create-pull-request@v2
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          commit-message: Update graphs
          title: Update graphs
          branch: update-graphs

Fail build if coverage lowers

I have GitHub Actions that build and test my Python application. I am also using pytest-cov to generate a code coverage report. This report is being uploaded to codecov.io.
I know that codecov.io can't fail your build if the coverage lowers, so how do I go about failing the build in GitHub Actions if the coverage drops? Do I have to check the previous values and compare them with the new ones "manually" (by writing a script)? Or is there an existing solution for this?
One solution is a job with two steps:
Check whether the coverage has dropped
Build, depending on the result of step 1
If step 1 fails, there is no build.
You can write a Python script that exits with an error if the coverage drops.
Try something like this:
jobs:
  build:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@v1
      - name: Set Up Python
        uses: actions/setup-python@v1
      - name: Test Coverage
        run: python check_coverage.py
      - name: Build
        if: success()
        run: python do_something.py # <= here you're doing your build
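If you don't want to maintain a coverage baseline yourself, pytest-cov also exposes coverage.py's fail-under check, which fails the step whenever total coverage is below a fixed threshold (rather than comparing against the previous run). A minimal sketch of that step (the myapp package name and the 80% threshold are just examples):
- name: Test Coverage
  run: |
    pip install pytest pytest-cov
    # exits non-zero, failing the job, if total coverage is below 80%
    pytest --cov=myapp --cov-fail-under=80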
I hope it helps.
There is nothing built-in; instead, you could use one of the many integrations, such as SonarQube, if you don't want to write a custom script.

Is there a Python/Django equivalent to Rails bundler-audit?

I'm fairly new to Django so apologies in advance if this is obvious.
In Rails projects, I use a gem called bundler-audit to check that the patch level of the gems I'm installing doesn't include security vulnerabilities. Normally, I incorporate running bundler-audit into my CI pipeline so that any time I deploy, I get a warning (and a failed build) if a gem has a security vulnerability.
Is there a similar system for checking vulnerabilities in Python packages?
After writing out this question, I searched around some more and found Safety, which was exactly what I was looking for.
In case anyone else is setting up CircleCI for a Django project and wants to check their packages for vulnerabilities, here is the configuration I used in my .circleci/config.yml:
version: 2
jobs:
  build:
    # build and run tests
  safety_check:
    docker:
      - image: circleci/python:3.6.1
    steps:
      - checkout
      - run:
          command: |
            python3 -m venv env3
            . env3/bin/activate
            pip install safety
            # specify requirements.txt
            safety check -r requirements.txt
  merge_master:
    # merge passing code into master
workflows:
  version: 2
  test_and_merge:
    jobs:
      - build:
          filters:
            branches:
              ignore: master
      - safety_check:
          filters:
            branches:
              ignore: master
      - merge_master:
          filters:
            branches:
              only: develop
          requires:
            - build
            # code is only merged if safety check passes
            - safety_check
To check that this works, run pip install insecure-package && pip freeze > requirements.txt, then push and watch the Circle build fail.
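The check itself is CI-agnostic; on GitHub Actions (used elsewhere on this page), it would be a single step (a sketch; adjust the requirements.txt path to your layout):
- name: Check dependencies for vulnerabilities
  run: |
    pip install safety
    safety check -r requirements.txt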
