Pytest and database cleanup after running tests - Python

I am using Flask to build a web service and pytest for testing.
I am using pytest fixtures to set up and tear down the test resources, but I need to test a POST endpoint that creates records in the database.
How do we clean up these records?

You can use a fixture to do the cleanup:

@pytest.fixture
def cleanup():
    yield
    # This is executed when the test using the fixture is done
    db_cleanup()

def test_records_created(cleanup):  # pylint: disable=redefined-outer-name,unused-argument
    response = app.test_client().post('/path', json=payload)
    assert response.status_code == 200
    assert ...


Reuse function as pytest fixture

I have a function in my code that is being used by FastAPI to provide a db session to the endpoints:

def get_db() -> Generator[Session, None, None]:
    try:
        db = SessionLocal()
        yield db
    finally:
        db.close()
I want to use the same function as a pytest fixture. If I do something like the following, the fixture is not being recognized:
pytest.fixture(get_db, name="db", scope="session")

def test_item_create(db: Session) -> None:
    ...
test_item_create throws an error about db not being a fixture: fixture 'db' not found.
I could rewrite get_db in my conftest.py and wrap it with pytest.fixture to get things working, but I was wondering if there's a better way of reusing existing functions as fixtures. If I have more helper functions like get_db, it'd be nice not to have to rewrite them for tests.
I think pytest cannot find the fixture as things are written in your example. Maybe you are trying to get to something like this?
db = pytest.fixture(get_db, name="db", scope="session")

def test_item_create(db: Session) -> None:
    ...
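A runnable sketch of that pattern, with a stand-in session class (FakeSession and this SessionLocal are hypothetical; substitute your real sessionmaker). The key point is that the wrapped function must be bound to a module-level name in conftest.py so pytest can discover it:

```python
from typing import Generator
import pytest

class FakeSession:
    """Stand-in for a SQLAlchemy Session (assumption, to keep the sketch runnable)."""
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

SessionLocal = FakeSession  # assumption: your real session factory

def get_db() -> Generator[FakeSession, None, None]:
    # Production helper, unchanged.
    try:
        db = SessionLocal()
        yield db
    finally:
        db.close()

# conftest.py: wrap the production helper instead of rewriting it.
# pytest.fixture(...) returns a decorator, so it can be applied by hand;
# name="db" sets the fixture's name regardless of the variable name.
db = pytest.fixture(scope="session", name="db")(get_db)
```

Assigning the result is what was missing in the question: calling `pytest.fixture(...)` without binding its return value to a name leaves nothing for pytest to collect. The original `get_db` is untouched and still usable by FastAPI.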

How can I measure code coverage of API integration tests?

In my company we have a Django project that has API endpoints.
We are using pytest as our testing framework, and some of our tests execute requests to these endpoints using requests.Session(). For example, a test performs a GET request to /api/account/data.

Content of root/dir/tests/test_api.py:

def test_some_api(self):
    client = requests.Session()
    response = client.get('/api/account/data')
    assert response.status_code == 200
When performing the request, the backend executes this function:

# root/models/account/api.py
def user_data(request, format=None):
    """
    @api {get} /api/account/data
    """
    if request.method == "GET":
        return APIResponse(code=200)
We would like to measure the code executed in `user_data` by the test, but I failed to make it work. We are using:
coverage==4.5.4 # forced to use an old version as we have dependencies conflict
pytest-cov==2.10.1
To measure coverage I run this command from root
pytest -v --cov-report html:html-report --cov=dir --cov-config=dir/.coveragerc
The .coveragerc contains only files to exclude.
I verified in the backend logs that when the test runs, it executes the if block inside user_data.
I have tried adding this code to conftest.py as described in the coverage docs, but it still didn't measure the execution of user_data:
@pytest.fixture(scope='session', autouse=True)
def run_cov():
    process = coverage.process_startup()
    source_path = os.path.join(_home_path, "/models/account")
    cov = coverage.Coverage(config_file='root/.coveragerc', source=[source_path])
    cov.start()
    yield cov  # Tests run here
    cov.stop()
    cov.save()
    cov.html_report(directory='html-report')
I've seen this solution: How to generate coverage report for http based integration tests?, but failed to understand how to implement it on my end.
I expected the user_data code to be measured in the coverage report.
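For what it's worth: if the Django backend that handles the requests runs in a separate process from pytest, `--cov` only measures the test process itself, so the view code never appears in the report. One approach is coverage.py's subprocess measurement. This is a sketch, assuming you start the server yourself and that coverage's startup hook is installed (e.g. a `.pth` file in site-packages that calls `coverage.process_startup()`, as described in the coverage docs):

```shell
# .coveragerc must enable per-process data files:
#   [run]
#   parallel = True
#   source = dir

# Start the backend with subprocess measurement enabled; the environment
# variable tells coverage.process_startup() which config to use:
COVERAGE_PROCESS_START="$PWD/.coveragerc" python manage.py runserver &

# Run the tests against the live server, then merge the per-process
# data files and build the report:
pytest -v
coverage combine
coverage html -d html-report
```

With this setup the fixture in conftest.py is unnecessary; the server process records its own coverage data, and `coverage combine` merges it with whatever the test process measured.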

Parallelize integration tests with pytest-xdist

I was wondering if it is possible to use an in-memory SQLite database to run integration tests in parallel, using pytest and pytest-xdist, on a FastAPI application.
Update
I have a good number of tests that I would like to run during my CI (GitLab CI). However, due to the number of IOPS each test needs when SQLite is backed by a file, the job times out, so I would like to use an in-memory database and parallelize the tests with pytest-xdist.
Every endpoint uses FastAPI's dependency injection for the db context, and what I have tried is to create a fixture for the app like so:
@pytest.fixture(scope="function")
def app():
    """
    Pytest fixture that creates an instance of the FastAPI application.
    """
    app = create_app()
    app.dependency_overrides[get_db] = override_get_db
    return app

def override_get_db():
    SQLALCHEMY_DATABASE_URL = "sqlite:///:memory:"
    engine = create_engine(
        SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
    )
    Base.metadata.drop_all(bind=engine)
    Base.metadata.create_all(bind=engine)
    TestLocalSession = sessionmaker(autocommit=False, autoflush=False, bind=engine)
    init_db(session=TestLocalSession)
    engine.execute("PRAGMA foreign_keys=ON;")
    try:
        db = TestLocalSession()
        yield db
    finally:
        db.close()
Because the endpoints are all sync, I also need to use httpx instead of the built-in TestClient:
@pytest.fixture(scope='function')
async def client(app):
    """
    Pytest fixture that creates an httpx AsyncClient for the app.
    """
    async with AsyncClient(
        app=app, base_url=f"{settings.BASE_URL}{settings.API_PREFIX}"
    ) as client:
        yield client
The issue I have when I run this test (without pytest-xdist) is that the database is created in a separate thread from the one injected into the endpoints, so I always get a SQL error: sqlite3.OperationalError: no such table: certification
Any suggestions on how to solve this? Thanks.

Pytest for Django REST Framework API returns 301

I made a simple Django application which returns {'result': 'OK'} for the endpoint '/api/v1/test/ping/'. Now I am trying to test it with pytest.
My test directory:

/test
    conftest.py
    /test_app
        test_api.py
My conftest.py:

import pytest
from rest_framework.test import APIClient

@pytest.fixture
def api_client():
    return APIClient
My test_api.py:

import pytest

def test_api1(api_client):
    response = api_client().get("/api/v1/test/ping")
    assert response.status_code == 200
Test script execution fails:

test_api.py::test_api1 FAILED [100%]
test\test_gml_api\test_api.py:3 (test_api1)
301 != 200

But the code works correctly if I run the server and check it manually! Please give me advice on how to solve this.

Running a single test works, but running multiple tests fails - Flask and Pytest

This is really strange. I have the following simple Flask application:

- root
    - myapp
        - a route with /subscription_endpoint
    - tests
        - test_az.py
        - test_bz.py

test_az.py and test_bz.py both look the same. There is a setup (taken from https://diegoquintanav.github.io/flask-contexts.html) and then one simple test:
import pytest
from myapp import create_app
import json

@pytest.fixture(scope='module')
def app(request):
    from myapp import create_app
    return create_app('testing')

@pytest.fixture(autouse=True)
def app_context(app):
    """Creates a flask app context"""
    with app.app_context():
        yield app

@pytest.fixture
def client(app_context):
    return app_context.test_client(use_cookies=True)

def test_it(client):
    sample_payload = {"test": "test"}
    response = client.post("/subscription_endpoint", json=sample_payload)
    assert response.status_code == 500
Running pytest will run both files, but test_az.py succeeds while test_bz.py fails. The HTTP request returns a 404 error, meaning test_bz cannot find the route in the app.
If I run them individually, they both succeed. This is very strange! It seems like the first test is somehow influencing the second test.
I have actually added a third test, test_cz.py, which fails as well. So only the first one ever passes. I feel like this has something to do with those fixtures, but I have no idea where to look.
Create a conftest.py for the shared fixtures (e.g. the client fixture) and use the same fixtures in both tests.
If the provided code really is duplicated in the other file, then you are creating two fixtures for the client. I would first clean this up: create a single conftest.py that contains all the fixtures, and then use them from your tests; this might help you.
Check out also how to use pytest as described in the Flask documentation.
