Unrecognized arguments error when running pytest from a YAML workflow - python

I have a GitHub action which runs some unit tests when a push is made to the repository. All the commands in the YAML execute successfully, such as installing requirements.txt, but it then returns the following error when it tries to run the pytest command
python3 -m pytest verify/test.py --ds myapp.settings_pytest.
ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]
__main__.py: error: unrecognized arguments: --ds myapp.settings_pytest.
Strangely, the command runs fine locally, so I am confused as to why I only encounter this when it is run from my YAML file. I am also encountering the same error when the same YAML file runs on my AWS build server.
test.yml
name: Run tests
on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
env:
  django_secret_key: ${{ secrets.DJANGO_SECRET_KEY }}
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      max-parallel: 4
      matrix:
        python-version: [3.8]
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install Dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
      - name: Run Tests
        run: |
          export DJANGO_SETTINGS_MODULE="myapp.settings_pytest"
          python3 -m pytest verify/test.py --ds myapp.settings_pytest

It has nothing to do with the YML. The --ds option is registered by the pytest-django plugin, so you have to install it to get access to --ds:
pip install pytest-django
Or better, keep it in your requirements.txt.
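Why a missing plugin surfaces as "unrecognized arguments" can be sketched with argparse: pytest assembles its command line from its core options plus whatever installed plugins register, so a flag from an absent plugin is simply unknown. A minimal sketch (argparse stands in for pytest's real option machinery):

```python
import argparse

# pytest builds its CLI from core options plus options that installed
# plugins register; --ds only exists once pytest-django is loaded.
parser = argparse.ArgumentParser(prog="pytest")
parser.add_argument("file_or_dir", nargs="*")

# Without the plugin, --ds is unrecognized:
_, unknown = parser.parse_known_args(
    ["verify/test.py", "--ds", "myapp.settings_pytest"])
print(unknown)  # ['--ds', 'myapp.settings_pytest']

# Simulate installing pytest-django, which registers the option:
parser.add_argument("--ds", dest="ds", default=None)
args = parser.parse_args(["verify/test.py", "--ds", "myapp.settings_pytest"])
print(args.ds)  # myapp.settings_pytest
```

The same flag parses fine once the option is registered, which is why the command works locally (where pytest-django is installed) but not in the fresh CI environment.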

Related

Python unit testing with Docker and environment variables

I am trying to run unit tests for my application with a Docker container (and possibly in a GitHub workflow), but I can't figure out how to correctly pass env variables to it.
So normally for the building process I have a pretty standard Dockerfile
FROM python:3.7-alpine3.15
WORKDIR /app
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt
COPY src/ .
CMD [ "python3", "main.py" ]
and a workflow that builds it and pushes the image to Docker Hub. Then of course the usual docker run --env-file=.env ... command to run the application fetching the variables from a file.
Now I am adding tests to the code. The application needs some env variables to function properly (auth keys and other stuff), and so of course also to run the tests. I don't want to export the variables in my system and run the test from my terminal, so I want to use Docker. But I'm not really sure how to properly do it.
My goal is to be able to run the tests locally and to also have a workflow that runs on PRs, without committing the variables in the repo.
This is what I've tried so far:
Add the test to the Dockerfile: adding RUN python -m unittest discover -s tests doesn't really work, because at build time Docker doesn't have access to the .env file
Add a GitHub workflow with the test command: even using a GitHub environment to store the secrets and deploying the job into it, for some reason the variables aren't fetched. Plus, I would like to be able to test the code before pushing the changes, and have this workflow run only on PRs.
jobs:
  test:
    runs-on: ubuntu-latest
    environment: test
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Set up Python 3.7
        uses: actions/setup-python@v2
        with:
          python-version: 3.7
          cache: 'pip'
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run tests
        shell: bash
        env:
          EMAIL: ${{ secrets.EMAIL }}
          AUTH_KEY: ${{ secrets.AUTH_KEY }}
          ZONE_NAME: ${{ secrets.ZONE_NAME }}
          RECORD_ID: ${{ secrets.RECORD_ID }}
          CHECK_INTERVAL: ${{ secrets.CHECK_INTERVAL }}
          SENDER_ADDRESS: ${{ secrets.SENDER_ADDRESS }}
          SENDER_PASSWORD: ${{ secrets.SENDER_PASSWORD }}
          RECEIVER_ADDRESS: ${{ secrets.RECEIVER_ADDRESS }}
        run: |
          python -m unittest discover -t src -s tests
Here you can find the full source code if needed.
Change
RUN python -m unittest discover -s tests
to
CMD python -m unittest discover -s tests
and unittest will be launched not at the build stage but when the test container starts, at which point you can use your env file.
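Put together, a test variant of the Dockerfile above might look like this (a sketch; it assumes the tests live in a tests/ directory next to src/, which isn't shown in the question):

```dockerfile
FROM python:3.7-alpine3.15
WORKDIR /app
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt
# copy both the application and the tests into the image
COPY src/ ./src
COPY tests/ ./tests
# runs at container start, so --env-file is honored:
#   docker build -t myapp-tests .
#   docker run --env-file=.env myapp-tests
CMD ["python3", "-m", "unittest", "discover", "-t", "src", "-s", "tests"]
```

Locally this keeps the variables in .env out of the repo, and in the workflow the same command works because the `env:` block of the step exports the secrets into the container's environment.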

GitHub secrets cannot be read from Python

Here is my workflow file
jobs:
  build:
    runs-on: windows-latest
    environment: Main
    env:
      MAINAPI: ${{ secrets.MAINAPI }}
      TESTAPI: ${{ secrets.TESTAPI }}
      BRAINID: ${{ secrets.BRAINID }}
      BRAINKEY: ${{ secrets.BRAINKEY }}
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python 3.10
        uses: actions/setup-python@v3
        with:
          python-version: "3.10"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install flake8 pytest
          python -m pip install PyAudio-0.2.11-cp311-cp311-win_amd64.whl
          pip install -r requirements.txt --no-deps
      - name: Lint with flake8
        run: |
          # stop the build if there are Python syntax errors or undefined names
          flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
          # exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
          flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
      - name: Test with pytest
        run: |
          pytest
        env:
          MAINAPI: ${{ secrets.MAINAPI }}
          TESTAPI: ${{ secrets.TESTAPI }}
          BRAINID: ${{ secrets.BRAINID }}
          BRAINKEY: ${{ secrets.BRAINKEY }}
Here is my code
import os

mainapi = str(os.environ["MAINAPI"])
apiurl = str(os.environ["TESTAPI"])
I have set the secrets as environment secrets, as repository secrets, and even as an environment. However, none of them seems to work. Please help.
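One thing worth checking on the Python side is failing loudly when a variable is absent: `os.environ["MAINAPI"]` raises a bare KeyError, which is easy to misread. A sketch with a hypothetical helper (get_secret is not part of the question's code) that names the missing variable; GitHub masks secret values in logs, so printing the name is safe:

```python
import os

def get_secret(name: str) -> str:
    """Read a required secret from the environment, failing with a clear message."""
    value = os.environ.get(name)
    if not value:
        # GitHub masks secret *values* in logs, so naming the variable is safe
        raise RuntimeError(
            f"{name} is not set: check that the job declares "
            f"`environment: Main` and maps the secret under `env:`")
    return value

try:
    mainapi = get_secret("MAINAPI")
    print("MAINAPI loaded")
except RuntimeError as err:
    print(err)
```

Environment secrets in particular are only visible to a job that declares the matching `environment:` key, which the workflow above does at the job level.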

Can't find installed modules in azure devops pipeline

I'm building a staged pipeline (env prep, tests, code) and have currently hit a blocker. It seems like each stage is its own individual process: my requirements.txt is installed correctly, but then the test stage raises a ModuleNotFoundError. I'd appreciate any hints on how to make it work :)
yaml:
trigger: none
parameters:
  - name: "input_files"
    type: object
    default: ['a-rg', 't-rg', 'd-rg', 'p-rg']
stages:
  - stage: 'Env_prep'
    jobs:
      - job: "install_requirements"
        steps:
          - script: |
              python -m pip install --upgrade pip
              python -m pip install -r requirements.txt
  - stage: 'Tests'
    jobs:
      - job: 'Run_tests'
        steps:
          - script: |
              python -m pytest -v tests/variableGroups_tests.py
Different jobs and stages can be executed on different agents in Azure Pipelines. In your case, installing the requirements is a direct prerequisite of running the tests, so everything should be in one job:
trigger: none
parameters:
  - name: "input_files"
    type: object
    default: ['a-rg', 't-rg', 'd-rg', 'p-rg']
stages:
  - stage: Test
    jobs:
      - job:
        steps:
          - script: |
              python -m pip install --upgrade pip
              python -m pip install -r requirements.txt
            displayName: Install Required Components
          - script: |
              python -m pytest -v tests/variableGroups_tests.py
            displayName: Run Tests
Breaking those into separate script steps isn't even necessary unless you want the log output to be separate in the console.

Trouble installing Weasyprint & Cairo on Linux Web App

I have a Django app that uses Weasyprint to generate PDF outputs. This works fine on my local development machine.
I am able to successfully deploy to Azure Web Apps, but get the following error message:
2020-11-17T07:34:14.287002623Z OSError: no library called "cairo" was found
2020-11-17T07:34:14.287006223Z no library called "libcairo-2" was found
2020-11-17T07:34:14.287009823Z cannot load library 'libcairo.so.2': libcairo.so.2: cannot open shared
object file: No such file or directory
2020-11-17T07:34:14.287016323Z cannot load library 'libcairo.2.dylib': libcairo.2.dylib: cannot open
shared object file: No such file or directory
2020-11-17T07:34:14.287020123Z cannot load library 'libcairo-2.dll': libcairo-2.dll: cannot open
shared object file: No such file or directory
Per Weasyprint's documentation (https://weasyprint.readthedocs.io/en/stable/install.html#debian-ubuntu), I have attempted to make the recommended installations via a custom deployment script which looks like this:
jobs:
  build:
    name: Build and Deploy
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ env.PYTHON_VERSION }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ env.PYTHON_VERSION }}
      - name: Install dependencies
        run: |
          sudo apt-get install build-essential python3-dev python3-pip python3-setuptools python3-wheel python3-cffi libcairo2 libpango-1.0-0 libpangocairo-1.0-0 libgdk-pixbuf2.0-0 libffi-dev shared-mime-info
          python -m pip install --upgrade pip
          pip install -r requirements.txt
      - name: Deploy Web App using GH Action azure/webapps-deploy
        uses: azure/webapps-deploy@v2
        with:
          app-name: ${{ env.AZURE_WEBAPP_NAME }}
          publish-profile: ${{ secrets.AZURE_WEBAPP_PUBLISH_PROFILE }}
          package: ${{ env.AZURE_WEBAPP_PACKAGE_PATH }}
However, my problem persists and I still receive the same message.
Does anybody have experience installing Weasyprint & Cairo on a Linux-based Web App?
I appreciate any help in advance.
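Note that the `sudo apt-get install` step in the workflow above runs on the GitHub-hosted runner, not on the App Service instance that later serves the app, which would be consistent with the install appearing to succeed while the runtime error persists. One quick way to check whether the libraries are actually visible on the machine serving the app is a ctypes probe run inside the deployed environment (a sketch using only the standard library; the library names are the ones WeasyPrint's error message asks for):

```python
import ctypes.util

# WeasyPrint loads cairo and pango through the shared-library loader,
# so if find_library can't see them here, WeasyPrint won't either.
for lib in ("cairo", "pango-1.0", "pangocairo-1.0"):
    found = ctypes.util.find_library(lib)
    print(f"{lib}: {found or 'NOT FOUND'}")
```

Running this over SSH on the Web App before and after the manual `apt-get install` shows exactly which environment is missing the packages.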
UPDATE
Currently, I am able to deploy using the default deployment script created by Azure (shown below). I am then able to SSH into the deployment machine and manually activate the virtual environment & install the requisite packages. This process works and my application now works as expected.
I'd like to roll this command into the deployment process somehow (either as part of the default script or via a post deployment action).
GITHUB ACTIONS DEPLOYMENT SCRIPT
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - name: Set up Python version
        uses: actions/setup-python@v1
        with:
          python-version: '3.6'
      - name: Build using AppService-Build
        uses: azure/appservice-build@v2
        with:
          platform: python
          platform-version: '3.6'
      - name: 'Deploy to Azure Web App'
        uses: azure/webapps-deploy@v2
        with:
          app-name: {{appname}}
          slot-name: {{slotname}}
          publish-profile: {{profilename}}
MANUAL VIRTUAL ENV ACTIVATION & INSTALLS
source /home/site/wwwroot/pythonenv3.6/bin/activate
sudo apt-get install {{ additional packages }}
The required dependencies and other things you want to install can be added to the .yml file, but whether they take effect for your web app still needs to be tested; specific problems need to be analyzed individually.
If that doesn't work, it is recommended to SSH in and install them manually.
I added Linux commands to my .yml file to apt-get install the packages.
Below is my .yml file. It works fine.
# Docs for the Azure Web Apps Deploy action: https://github.com/Azure/webapps-deploy
# More GitHub Actions for Azure: https://github.com/Azure/actions
name: Build and deploy Python app to Azure Web App - pyodbcInstallENV
on:
  push:
    branches:
      - master
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - name: Set up Python version
        uses: actions/setup-python@v1
        with:
          python-version: '3.8'
      - name: Install custom env
        run: |
          cd /home
          sudo apt-get update
          sudo apt-get install g++
          sudo apt-get install unixodbc-dev
          pip install pyodbc
      - name: Build using AppService-Build
        uses: azure/appservice-build@v2
        with:
          platform: python
          platform-version: '3.8'
      - name: 'Deploy to Azure Web App'
        uses: azure/webapps-deploy@v2
        with:
          app-name: 'pyodbcInstallENV'
          slot-name: 'production'
          publish-profile: ${{ secrets.AzureAppService_PublishProfile_d712769***2017c9521 }}

How to reference a directory in my repo during a GitHub action build?

I have some test data that I use for unit tests with pytest. I set their location with environment variables. Looking at my pytest logs, the build sees the environment vars, but the locations they reference don't exist. In the GitHub Actions docs the repo should be in /home/runner/Repo/. Below is my folder structure. Does anyone see any obvious issues?
Repo/
    notebooks/
    repo/
        __init__.py
        tests/
            tests_db.hdf5
            Sample_Raw/
            ...
            __init__.py
            test_obj1.py
            test_obj2.py
        obj1.py
        obj2.py
        utils.py
build yaml
name: build-test
on:
  push:
    branches:
      - '*' # all branches for now
jobs:
  build-and-run:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest]
        python-version: [3.8]
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
      - name: Generate coverage report
        env:
          DB_URL: /home/runner/work/Repo/repo/tests/test_db.hdf5
          RAW_FOLDER: /home/runner/work/Repo/repo/tests/Sample_Raw/
        run: |
          pip install pytest
          pip install pytest-cov
          pytest --cov=./ --cov-report=xml
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v1
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
          file: ./coverage.xml
          name: codecov-umbrella
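Hard-coded runner paths are fragile: on hosted runners the checkout lands in /home/runner/work/&lt;repo&gt;/&lt;repo&gt; (repository name repeated, case-sensitive), and Actions exports that location as GITHUB_WORKSPACE. It's also worth comparing names carefully; the tree above lists tests_db.hdf5 while DB_URL points at test_db.hdf5. A sketch that derives the paths instead of hard-coding them:

```python
import os
from pathlib import Path

# GITHUB_WORKSPACE is set by GitHub Actions to the checkout directory
# (/home/runner/work/<repo>/<repo> on hosted runners); falling back to
# the current directory keeps the same code working locally.
workspace = Path(os.environ.get("GITHUB_WORKSPACE", "."))

db_url = workspace / "repo" / "tests" / "tests_db.hdf5"
raw_folder = workspace / "repo" / "tests" / "Sample_Raw"

# .exists() makes a bad path fail visibly instead of deep inside a test
print(db_url, db_url.exists())
```

The same derivation can live in a conftest.py so the env vars in the workflow only need to carry paths relative to the workspace.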
