Disable before_serving function while running pytest in Quart - Python

I am using a Quart app.
I am calling a service in my before_serving function (app_initionalization) and I do not want that call to happen in pytest. I want to disable my before_serving function, or mock it somehow.
import pytest

@pytest.mark.asyncio
async def test_my_api_call(test_app: Pint, headers: dict):
    test_client = test_app.test_client()
    response = await test_client.get("/get_user", headers=headers)
    assert response.status_code == 200
This is my test_app.
@pytest.fixture(name="test_app", scope="function")
async def _test_app(s3_client, tmp_path, async_mongodb):
    os.environ["BLOB_STORE"] = str(tmp_path)
    db_config['db'] = async_mongodb
    async with app.test_app() as test_app:
        yield test_app

Your fixture runs the before-serving startup functions because it uses the test app,
    async with app.test_app() as test_app:
As you don't wish to run these, you can change your fixture to:
@pytest.fixture(name="test_app", scope="function")
async def _test_app(s3_client, tmp_path, async_mongodb):
    os.environ["BLOB_STORE"] = str(tmp_path)
    db_config['db'] = async_mongodb
    return app

Related

How to forward headers using FastAPI - Tracing use cases

Below is a simple server written with FastAPI and run with Uvicorn.
In order to send the value to the next hop, the '/destination' URL, I need to pass it to the forward_request method.
In this implementation passing the value is easy, because the call depth is only one function more.
But if I have a function that calls a function that calls a function..., I need to pass the value again and again.
Is there a simpler way to share the value downstream without passing it explicitly?
Is there a way, without using inspect or some dark magic, to understand what is the scope that the function forward_request lives in?
Why am I asking this question?
I am using Jaeger for tracing and I need to forward the header x-request-id that I receive in the 1st server (/source in this example) to the 2nd server (/destination in this example).
If FastAPI/Uvicorn processed just one request at a time, I could have shared the value via a singleton class and accessed it from anywhere, but since requests are handled in parallel, the function forward_request doesn't have the context of who called it.
Some black magic
The function forward_request could inspect the call stack and figure out from it what value was received, but that is an ugly way to do things.
Server Code
import asyncio
from datetime import datetime

import aiohttp
import uvicorn
from fastapi import FastAPI, Request

app = FastAPI()

@app.get("/source")
async def route1(value: int, request: Request):
    print(f'source {value}')
    start = datetime.now().strftime("%H:%M:%S")
    await asyncio.sleep(5)
    resp: dict = await forward_request(value)
    end = datetime.now().strftime("%H:%M:%S")
    return {
        "start": start,
        "end": end,
        "value": value,
        "resp": resp
    }

@app.get("/destination")
async def route2(value, request: Request):
    print(f'destination {value}')
    start = datetime.now().strftime("%H:%M:%S")
    await asyncio.sleep(5)
    end = datetime.now().strftime("%H:%M:%S")
    return {
        "start": start,
        "end": end,
        "value": value
    }

async def forward_request(value: int) -> dict:
    async with aiohttp.ClientSession() as session:
        async with session.get('http://127.0.0.1:5000/destination', params={'value': value}) as resp:
            return await resp.json()

if __name__ == "__main__":
    uvicorn.run("main2:app", host="127.0.0.1", port=5000, log_level="info", workers=1)

Testing asynchronous FastAPI endpoints with dependencies

I've encountered this problem and I can't see any solution, though it must be a common one. So maybe I'm missing something here.
I'm working on a FastAPI app with asynchronous endpoints and an asynchronous database connection. The database connection is passed as a dependency. I want to write some asynchronous tests for said app.
engine = create_async_engine(connection_string, echo=True)

def get_session():
    return sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)

@router.post("/register")
async def register(
    user_data: UserRequest,
    authorize: AuthJWT = Depends(),
    async_session: sessionmaker = Depends(get_session),
):
    """Register new user."""
    if authorize.get_jwt_subject():
        raise LogicException("already authorized")
    session: AsyncSession
    async with async_session() as session:
        query = await session.execute(
            select(UserModel).where(UserModel.name == user_data.name)
        )
        ...
I'm using AsyncSession to work with the database, so in my test the db connection also has to be asynchronous.
engine = create_async_engine(
    SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
)
app.dependency_overrides[get_session] = lambda: sessionmaker(
    engine, class_=AsyncSession, expire_on_commit=False
)

@pytest.mark.asyncio
async def test_create_user():
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)
    async with AsyncClient(app=app, base_url="http://test") as ac:
        response = await ac.post(
            "/register",
            json={"name": "TestGuy", "password": "TestPass"},
        )
    assert response.status_code == 200, response.text
When running the test, I get the following error:
...
coin_venv\lib\site-packages\fastapi\routing.py:217: in app
solved_result = await solve_dependencies(
coin_venv\lib\site-packages\fastapi\dependencies\utils.py:529: in solve_dependencies
solved = await run_in_threadpool(call, **sub_values)
AttributeError: module 'anyio' has no attribute 'to_thread'
I concluded that the error appears only when there is a dependency in an endpoint. The weird part is that I don't even have anyio in my environment.
So, is there a way to test asynchronous FastAPI endpoints with dependencies and an asynchronous db connection? Surely there must be something; it's not like this situation is unique...
UPD: I tried using the decorator @pytest.mark.anyio and also installed trio and anyio. Now pytest seems to discover two distinct tests in this one:
login_test.py::test_create_user[asyncio]
login_test.py::test_create_user[trio]
Both fail, the first one with what seems to be a valid error in my code, and the second one with:
RuntimeError: There is no current event loop in thread 'MainThread'.
I guess that is true, though I don't really know whether pytest creates an event loop to test async code. Anyway, I don't need the second test; why is it here, and how can I get rid of it?
It turned out I can specify the backend to run the tests, like this:
@pytest.fixture
def anyio_backend():
    return 'asyncio'
So now only the right tests run.
pytest runs on a different event loop (not get_running_loop), so when you try to run your code in the same context it raises an exception. I suggest you consider using nest_asyncio (https://pypi.org/project/nest-asyncio/), so that pytest can run in the same event loop.
import nest_asyncio
nest_asyncio.apply()

How to mock httpx.AsyncClient() in Pytest

I need to write a test case for a function which fetches data from an API. In it I use httpx.AsyncClient() as a context manager, but I don't understand how to write a test case for that function.
async def make_dropbox_request(url, payload, dropbox_token):
    async with httpx.AsyncClient(timeout=None, follow_redirects=True) as client:
        headers = {
            'Content-Type': 'application/json',
            'authorization': 'Bearer ' + dropbox_token
        }
        # make the api call
        response = await client.post(url, headers=headers, json=payload)
        if response.status_code not in [200]:
            print('Dropbox Status Code: ' + str(response.status_code))
        if response.status_code in [200, 202, 303]:
            return json.loads(response.text)
        elif response.status_code == 401:
            raise DropboxAuthenticationError()
        elif response.status_code == 429:
            sleep_time = int(response.headers['Retry-After'])
            if sleep_time < 1 * 60:
                await asyncio.sleep(sleep_time)
                raise DropboxMaxRateLimitError()
            raise DropboxMaxDailyRateLimitError()
        raise DropboxHTTPError()
I need to write the test cases without calling the API. Therefore, I believe I need to mock client.post(), but I don't understand how to do that. If anyone can help me figure this out, that would be really helpful.
TL;DR: use return_value.__aenter__.return_value to mock the async context.
Assuming you are using pytest and pytest-mock, you can use the mocker fixture to mock httpx.AsyncClient.
Since the post function is async, you will need an AsyncMock.
Finally, since you use an async context manager, you will also need return_value.__aenter__.return_value to properly mock the returned context. Note that for a synchronous context manager you'd use __enter__ instead of __aenter__.
@pytest.fixture
def mock_AsyncClient(mocker: MockerFixture) -> Mock:
    mocked_AsyncClient = mocker.patch(f"{TESTED_MODULE}.AsyncClient")
    mocked_async_client = Mock()
    response = Response(status_code=200)
    mocked_async_client.post = AsyncMock(return_value=response)
    mocked_AsyncClient.return_value.__aenter__.return_value = mocked_async_client
    return mocked_async_client
I also faced the same issue and handled it with the patch decorator. I share my code here so that it might help others.
from unittest.mock import patch

import pytest
import httpx

from app.services import your_service

@pytest.mark.anyio
@patch(
    'app.services.your_service.httpx.AsyncClient.post',
    return_value=httpx.Response(200, json={'id': '9ed7dasdasd-08ff-4ae1-8952-37e3a323eb08'})
)
async def test_get_id(mocker):
    result = await your_service.get_id()
    assert result == '9ed7dasdasd-08ff-4ae1-8952-37e3a323eb08'
You can try out the RESPX mocking library to test and mock your HTTPX clients.
In your case, something like this should do it:
async def make_dropbox_request(url, payload, dropbox_token):
    ...
    response = await client.post(url, headers=headers, json=payload)
    ...
    return response.json()
@respx.mock
async def test_dropbox_endpoint():
    url = "https://dropbox-api/some-endpoint/"
    endpoint = respx.post(url).respond(json={"some": "data"})
    result = await make_dropbox_request(url, ..., ...)
    assert endpoint.called
    assert result == {"some": "data"}
To be DRY and not repeat the mocking in each test, you can set up your own pytest fixture, or a respx instance, globally that pre-mocks all Dropbox API endpoints, and then in each test just alter the response/error depending on the scenario, to get full test coverage of make_dropbox_request.
@pytest.fixture()
async def dropbox_mock():
    async with respx.mock() as dropbox:
        # default endpoints and their responses
        dropbox.post("some-endpoint", name="foo").respond(404)
        dropbox.post("some-other-endpoint", name="bar").respond(404)
        # ^ name routes for access in tests
        yield dropbox
async def test_some_case(dropbox_mock):
    dropbox_mock["foo"].respond(json={})
    ...

Caching async requests in Pytest test function

I have implemented a test function in pytest which loads data from files, casts it into Python objects, and provides a new object for each test.
Each of these objects contains a request I need to make to the server and the expected responses. The function looks like this:
@pytest.mark.asyncio
@pytest.mark.parametrize('test', TestLoader.load(JSONTest, 'json_tests'))
async def test_json(test: JSONTest, groups: Set[TestGroup], client: httpx.AsyncClient):
    skip_if_not_in_groups(test, groups)
    request = Request(url=test.url, body=test.body.dict())
    response = await client.post(request.url, json=request.body)
    # Assertions down here...
Many times I send requests that hit the same http endpoint with the same body, so the response is the same, but I'm testing for different things in the response.
Because of that, I thought of implementing an in-memory cache so that within a test run the same request won't be made twice.
What I tried was to create a request object with its own __hash__ implementation and use @asyncstdlib.lru_cache on the function, but it didn't seem to work:
# Does not work...
@asyncstdlib.lru_cache
async def send_request(request: Request, client: httpx.AsyncClient):
    return await client.post(request.url, json=request.body)

@pytest.mark.asyncio
@pytest.mark.parametrize('test', TestLoader.load(JSONTest, 'json_tests'))
async def test_json(test: JSONTest, groups: Set[TestGroup], client: httpx.AsyncClient):
    skip_if_not_in_groups(test, groups)
    request = Request(url=test.url, body=test.body.dict())
    response = await send_request(request)
The client I'm using, httpx.AsyncClient, also implements __hash__; it comes from a pytest fixture in conftest.py with 'session' scope:
# conftest.py
@pytest.fixture(scope='session')
def event_loop(request):
    loop = asyncio.get_event_loop_policy().new_event_loop()
    yield loop
    loop.close()

@pytest.fixture(scope='session')
async def client() -> httpx.AsyncClient:
    async with httpx.AsyncClient() as client:
        yield client
Just let go of the opaque third-party cache and cache yourself.
Since you don't need to clean up the cache during a single run, a plain dictionary will do:
_cache = {}

async def send_request(request: Request, client: httpx.AsyncClient):
    if request.url not in _cache:
        _cache[request.url] = await client.post(request.url, json=request.body)
    return _cache[request.url]

Is it possible to use Flask-RestX with Flask's 2.0+ async/await?

Usage of async/await was introduced in Flask 2.0 (https://flask.palletsprojects.com/en/2.0.x/async-await/).
I am using Flask-RestX, so is it possible to use async/await in RestX request handlers?
Something like:
@api.route('/try-async')
class MyResource(Resource):
    @api.expect(some_schema)
    async def get(self):
        result = await async_function()
        return result
is not working, and when I try to reach this endpoint I get the error:
TypeError: Object of type coroutine is not JSON serializable
Is there any info on that?
Package versions:
flask==2.0.1
flask-restx==0.4.0
and I've also installed flask[async] as the documentation suggests.
I've gotten around this by using an internal redirect:
@api.route('/try-async')
class MyResource(Resource):
    @api.expect(some_schema)
    def get(self):
        return redirect(url_for('.hidden_async'), code=307)

# plain Flask route (not a RestX resource), so the view can be async
@app.route('/hidden-async', methods=['GET'])
async def hidden_async():
    result = await async_function()
    return result
Redirecting with code=307 ensures the method and body are unchanged after the redirect (Link), so passing data to the async function is possible as well.
@api.route('/try-async')
class MyResource(Resource):
    @api.expect(some_schema)
    def post(self):
        return redirect(url_for('.hidden_async'), code=307)

@app.route('/hidden-async', methods=['POST'])
async def hidden_async():
    data = request.get_json()
    tasks = [async_function(d) for d in data]
    result = await asyncio.gather(*tasks)
    return result
