Run a single folder of pytest tests in VS Code - python

I have a sort of "micro-service" Python repo with a setup similar to the following:
sample
├── bar
│   ├── src
│   │   └── main.py
│   └── tests
│       └── test_main.py
├── foo
│   ├── src
│   │   └── main.py
│   └── tests
│       └── test_main.py
└── shared
    ├── src
    │   └── main.py
    └── tests
        └── test_main.py
In VS Code I only have the option of running all tests across foo, bar, and shared, or running individual test methods in the subfolders. What I want is to be able to quickly run just the tests in foo/tests/.
Is there some way I can configure pytest/Python to do this? I don't want to split each top-level folder into its own workspace, because I regularly jump back and forth between them and don't want to have a separate window open for each one.
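One way to handle this on the VS Code side (a sketch, assuming the Microsoft Python extension's pytest integration; the folder name comes from the layout above) is to point test discovery at the service you're currently working on via .vscode/settings.json:
// .vscode/settings.json
{
    "python.testing.pytestEnabled": true,
    "python.testing.unittestEnabled": false,
    "python.testing.pytestArgs": ["foo/tests"]
}
Swapping the entry in python.testing.pytestArgs is quicker than juggling separate workspaces, though it does mean editing the setting when you move between services.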

According to the pytest documentation, you should be able to run the command pytest foo/tests/ in the terminal.
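In practice that looks like this (test_main.py comes from the layout above; the -k pattern is illustrative):
# Run everything under foo/tests from the repository root:
pytest foo/tests/
# Or narrow down to a single file, or to tests matched by name:
pytest foo/tests/test_main.py
pytest foo/tests/test_main.py -k "some_test_name"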

Related

Dynamically create and run tests in GitHub Actions

I have a Git project with a directory structure similar to this:
.
├── .github/
│   └── workflows/
│       └── test.yml
└── my_packages/
    ├── package_1/
    │   ├── tests/
    │   │   └── test_package.py
    │   ├── package_logic.py
    │   ├── configurations.yml
    │   └── requirements.py
    ├── package_2/
    │   ├── tests/
    │   │   └── test_package.py
    │   ├── package_logic.py
    │   ├── configurations.yml
    │   └── requirements.py
    ├── ...
    └── package_n/
        ├── tests/
        │   └── test_package.py
        ├── package_logic.py
        ├── configurations.yml
        └── requirements.py
There are n packages in this repository, and every package is independent of the others, with its own requirements.py, logic, and tests.
There is also a test.yml file, currently not implemented, that needs to create a testing environment for every package separately and run its tests (pytest). Another requirement is that every package defines, in its configurations.yml file, the Python version its tests should run under. And finally, tests should run only for packages whose code was changed.
How can this be implemented in GitHub Actions? And is it possible to run every package in a separate job (or at least in parallel)?
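A rough sketch of one way to wire this up (not a verified solution): detect changed packages with a change-detection action (dorny/paths-filter here, a third-party action) and fan the result out into a job matrix, so every changed package gets its own parallel job. The python_version key inside configurations.yml is an assumed name, and the filter list would need entries for package_3 through package_n:
# .github/workflows/test.yml (sketch)
name: test
on: [push, pull_request]
jobs:
  changes:
    runs-on: ubuntu-latest
    outputs:
      packages: ${{ steps.filter.outputs.changes }}
    steps:
      - uses: actions/checkout@v4
      - uses: dorny/paths-filter@v3
        id: filter
        with:
          filters: |
            package_1: 'my_packages/package_1/**'
            package_2: 'my_packages/package_2/**'
  test:
    needs: changes
    if: needs.changes.outputs.packages != '[]'
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        package: ${{ fromJSON(needs.changes.outputs.packages) }}
    steps:
      - uses: actions/checkout@v4
      - name: Read the package's Python version (assumes a python_version key)
        id: config
        run: echo "python=$(yq '.python_version' my_packages/${{ matrix.package }}/configurations.yml)" >> "$GITHUB_OUTPUT"
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ steps.config.outputs.python }}
      - name: Install requirements and run tests
        run: |
          pip install -r my_packages/${{ matrix.package }}/requirements.py
          pytest my_packages/${{ matrix.package }}/tests
Because each matrix entry is its own job, the packages are tested in parallel and in isolated environments.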

Accessing Folder Structure for Spark Submit zipped PyFile

I have a folder structure that looks like this:
repo
└── src
    ├── main.py
    ├── __init__.py
    ├── utils
    │   ├── __init__.py
    │   └── helpers.py
    └── predict
        ├── __init__.py
        └── predict.py
I am submitting a Spark job with src/predict/predict.py as the application file and --py-files src.zip, a zip of the entire src directory (the spark-submit command is built from main.py).
In predict.py, I want to reference a function in helpers.py. I do this by attempting an import: from src.utils.helpers import helper_func
However, this does not work: ModuleNotFoundError: No module named 'src'. The imports work locally, but I am trying to understand how this works on EMR. Ideally I don't want to have to change the imports between local and EMR; I just want them to be the same.
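For reference, a sketch of the usual shape of the fix: build src.zip from the repo root, so that src/ (with its __init__.py) sits at the top level inside the archive, which is what makes from src.utils.helpers import ... resolvable on the executors:
# Zip the directory itself, not its contents:
cd repo
zip -r src.zip src
# Ship the archive alongside the driver script:
spark-submit --py-files src.zip src/predict/predict.py
If the zip is built from inside src/ instead, the archive contains utils/ and predict/ at its top level and import src.... cannot resolve.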

Pytest: import in another test causing mocks to fail

I've been asked to take a look at some tests in our project. The component tests all run individually without problems, but when run together we get hundreds of errors. Given the scale of the problem I'm no longer actively looking into it, but I've become really frustrated trying to figure out why it's happening.
A section of the project layout is below. There are modules dealing with various aspects of the project defined in the 'lambdas' folder, with a 'common' folder that contains utilities used by other files. I'm specifically looking at some tests in the 'episodes/api' directory.
.
└── lambdas
    ├── access
    ├── ...
    ├── common
    │   ├── dal
    │   ├── utils
    │   │   ├── api_utils.py
    │   │   ├── audit_utils.py
    │   │   └── date_utils.py
    │   └── ...
    └── episodes
        └── api
            ├── accept_participant
            │   ├── component_tests
            │   │   ├── __init__.py
            │   │   └── accept_participant_test.py
            │   ├── unit_tests
            │   │   ├── __init__.py
            │   │   └── accept_participant_test.py
            │   ├── accept_participant.py
            │   └── __init__.py
            └── reject_participant
                ├── component_tests
                │   ├── __init__.py
                │   └── reject_participant_test.py
                ├── unit_tests
                │   ├── __init__.py
                │   └── reject_participant_test.py
                ├── reject_participant.py
                └── __init__.py
accept_participant.py imports two methods from common.utils.audit_utils, one of which uses a boto3.client to communicate with SQS. The component test accept_participant_test mocks out the boto3.client call, and everything works fine when it is run on its own. It also works fine when run in conjunction with the other component test, as long as the unit tests are not physically present. If the unit tests are run, or even if they're present but deselected (using a pytest.mark), the accept_participant_test component test fails.
The unit test causing the problem is reject_participant_test. It imports common.utils.audit_utils in order to check the value of an enum in it. When this line is commented out, the component test passes OK.
I'm probably missing a gotcha here somewhere, but why does this one import cause a problem? The unit test isn't even being run. I'm thinking I might need to change the boto3.client patch, but I've tried all sorts of variations, and it either tells me the target is not a package or doesn't exist in the namespace.
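For what it's worth, a minimal sketch of the classic import-time gotcha that produces exactly this symptom (the names below are illustrative, not taken from the project):
# common/utils/audit_utils.py (illustrative)
import boto3
sqs_client = boto3.client("sqs", region_name="eu-west-1")  # created once, at import time

# component test (illustrative)
from unittest import mock

# Too late if another test module already imported audit_utils: the module,
# and the real client it created, are cached in sys.modules, so patching
# boto3.client no longer affects them.
with mock.patch("boto3.client"):
    ...

# Patching the attribute the code under test actually uses works regardless
# of import order, because it swaps the reference on the cached module:
with mock.patch("common.utils.audit_utils.sqs_client"):
    ...
If reject_participant_test's import is what first pulls audit_utils into sys.modules, that would explain why its mere presence changes the outcome even when deselected: collection imports the file, the file imports audit_utils, and the client is created before the component test's patch is applied.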

Nose2 fails to import modules, but only when run in Docker

I have a Python Celery project with the following structure:
.
├── celeryconfig.py
├── Dockerfile
├── _templates
└── framework
    ├── celery
    │   ├── celery.py
    │   ├── __init__.py
    │   └── __init__.cpython-36.pyc
    ├── tasks
    │   ├── tasks.py
    │   └── __init__.py
    ├── utilities
    │   ├── utilities.py
    │   └── __init__.py
    └── tests
        ├── __init__.py
        └── test_celery.py
I run this from the top-level directory using the command celery framework.celery.celery worker.
I have tests in the tests directory that I run using nose2. When I run nose2 from the top-level directory, the tests pass without issue. However, when the tests are run as part of a Docker build, the nose2 process fails because it can't make sense of the imports. For example:
# test_celery.py
from framework.tasks.tasks import my_function

def test_my_function():
    # Do some testing.
    ...
The import of the function itself resolves, but it then fails on the imports inside the imported file, which look like:
# tasks.py
from framework.utilities.utilities import some_other_function
Part of the issue I'm having is that Celery is particular about how it is structured, so attempts to restructure the directory have just resulted in Celery not being able to start. I'm not quite sure why the tests only fail in a Docker build, which looks like this:
FROM google/cloud-sdk:198.0.0-alpine
COPY . /project
WORKDIR /project
RUN apk add --no-cache python3 python3-dev && pip3 install -r requirements.txt
RUN nose2
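One thing worth trying (an assumption, not a confirmed fix): make the project root importable explicitly inside the image and invoke nose2 through the interpreter that installed it, so module resolution doesn't depend on how the build step resolves the working directory:
ENV PYTHONPATH=/project
RUN python3 -m nose2 -v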

Python module import issue with a subdirectory

I have the following directory structure in my Python3 project:
.
├── README.md
├── requirements.txt
├── schemas
│   ├── collector.sql
│   └── controller.sql
├── src
│   ├── controller.db
│   ├── controller.py
│   └── measurement_agent.py
├── tests
│   ├── regression.py
│   ├── test-invalid-mac.py
│   ├── test-invalid-url.py
│   ├── test-register-ma-json.py
│   ├── test-register-ma-return-code.py
│   ├── test-send-capabilities-return-code.py
│   ├── test-valid-mac.py
│   └── test-valid-url.py
└── todo
In my tests folder I have some regression tests that are run to check the consistency of the code in src/measurement_agent.py. The problem is that I do not want to manually add the path to measurement_agent.py in order to import from it. I would like to know if there is a trick to tell Python to look in the project root for the import I am trying to use.
Currently I am doing:
import os.path
import sys

ma = os.path.abspath('..') + '/src'
sys.path.append(ma)
from measurement_agent import check_hardware_address
and would like to have just
from measurement_agent import check_hardware_address
without using any os.path tricks.
Any suggestions?
Relative imports
Make sure there is an __init__.py in all folders, including the top-most (the project root).
Use a relative import, like this:
from ..src import measurement_agent
Now, to run your code, cd up to the parent of your project directory and run:
python -m parent.tests.regression
(where parent is the name of your project's root folder)
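An alternative that avoids relative imports altogether (a sketch, not part of the original answer): give the project a minimal setup.py and install it editably with pip install -e ., after which from measurement_agent import check_hardware_address works from anywhere, including the tests folder.
# setup.py (sketch; the project name is arbitrary)
from setuptools import setup

setup(
    name="measurement-agent",
    version="0.1",
    package_dir={"": "src"},  # top-level modules live in src/
    py_modules=["measurement_agent", "controller"],
)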
