I'm trying to test my FastAPI endpoints by overriding the injected database, using the method officially recommended in the FastAPI documentation.
The function I inject the db with is a closure that lets me build any desired database from a MongoClient by giving it the database name, while (I assume) still working with FastAPI's Depends, since it returns the inner function. No error is thrown, so I think this approach is correct:
# app
def build_db(name: str):
    def close():
        return build_singleton_whatever(MongoClient, args....)
    return close
Adding it to the endpoint:
# endpoint
@app.post("/notification/feed")
async def route_receive_notifications(db: Database = Depends(build_db("someDB"))):
    ...
And finally, attempting to override it in the tests:
# pytest
# test_endpoint.py
fastapi_app.dependency_overrides[app.build_db] = lambda x: lambda: x
However, the dependency doesn't seem to override at all and the test ends up creating a MongoClient with the IP of the production database as in normal execution.
So, any ideas on overriding FastAPI dependencies that are given parameters in their endpoints?
I have tried creating a mock closure function with no success:
def mock_closure(*args):
    def close():
        return args
    return close
app.dependency_overrides[app.build_db] = mock_closure('otherDB')
And I have also tried providing the same signature, including the parameter, with still no success:
app.dependency_overrides[app.build_db('someDB')] = mock_closure('otherDB')
Edit note: I'm also aware I could create a separate function for each desired database and use that as the dependency, but I would much prefer this dynamic version, as it scales better to using more databases in my apps and avoids writing essentially duplicated functions just so they can be cleanly injected.
I use the following fixtures to override the main db with a test db:
import pytest
from typing import Callable

from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from settings import get_settings

@pytest.fixture()
async def get_engine():
    engine = create_async_engine(get_settings().test_db_url)
    yield engine
    await engine.dispose()

@pytest.fixture()
async def db_session(get_engine) -> AsyncSession:
    async with get_engine.begin() as connection:
        # async_session is the project's async sessionmaker
        async with async_session(bind=connection) as session:
            yield session
            await session.close()

@pytest.fixture()
def override_get_async_session(db_session: AsyncSession) -> Callable:
    async def _override_get_async_session():
        yield db_session
    return _override_get_async_session
There are two issues with your implementation getting in your way:
As you are calling build_db right in the route_receive_notifications function definition, the latter receives the nested close function as its dependency, and that inner function is impossible to override. To fix this you need to avoid calling your dependency right away while still providing it with the db name. For that you can either define a new dependency to inject the name into build_db:
# app
def get_db_name():
    return "someDB"

def build_db(name: str = Depends(get_db_name)):
    ...
# endpoint
@app.post("/notification/feed")
async def route_receive_notifications(db: Database = Depends(build_db)):
    ...
or use functools.partial (shorter but less elegant):
# endpoint
from functools import partial
@app.post("/notification/feed")
async def route_receive_notifications(db: Database = Depends(partial(build_db, "someDB"))):
    ...
FastAPI requires the dependency-overriding function to have the same signature as the original dependency. Simply switching from *args to a single parameter is enough, although using the same argument name and type makes it easier to maintain. Of course, you need to provide the function itself as the value in dependency_overrides, without calling it:
def mock_closure(name: str):
    def close():
        return name
    return close

app.dependency_overrides[app.build_db] = mock_closure
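The failure of the original override attempts can be reproduced without FastAPI at all. This sketch re-declares build_db and mock_closure as toy stand-ins to show why a freshly built closure can never match as a dictionary key, while the function object itself can:

```python
# Toy re-declarations of build_db and mock_closure, no FastAPI required.
def build_db(name: str):
    def close():
        return f"real-{name}"
    return close

def mock_closure(name: str):
    def close():
        return f"mock-{name}"
    return close

# build_db("someDB") returns a brand-new closure on every call, so a key
# created in a test can never equal the one stored at route definition:
print(build_db("someDB") is build_db("someDB"))  # -> False

# The function object itself is a single stable object, so it works:
overrides = {}  # stand-in for app.dependency_overrides
overrides[build_db] = mock_closure
print(build_db in overrides)  # -> True
```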
Related
I have a class that inherits from another class in which we build a client:
class Client(ClientLibrary):
    def __init__(self, hosts=[{'host': <HOST_ADDRESS>, 'port': <PORT>}], **kwargs):
        '''Alternative constructor, where I'd pass in some defaults to simplify connection'''
        super().__init__(hosts, **kwargs)

    def some_method(self):
        ...
I want to test this class, and already have a test server set up that I want to connect to for testing. My initial approach was to create a MockClient that inherits from the original Client but swaps out the hosts parameter for the test host like so:
# I create a mock client that inherits from the original `Client` class, but passes in the host and port of the test server.
class MockClient(Client):
    def __init__(self, hosts=[{'host': MOCK_HOST, 'port': MOCK_PORT}]):
        super().__init__(hosts=hosts)
The idea was then that I'd use this mock client in the tests; however, I have faced a lot of issues when testing functions that encapsulate the original Client class. I have tried patching it but keep running into issues.
Is there a better way to approach this? And can this be done using pytest fixtures?
I want to be able to perform the following sorts of tests:
class TestFunctionThatUtilisesClient:
    def test_in_which_class_is_constructed_explicitly(self):
        client = Client()
        r = client.some_method()
        assert r == 'something'

    def test_in_which_class_is_constructed_implicitly(self):
        r = another_method()  # Client() is called somewhere in here
        assert r == 'something else'
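One way to approach this is to patch the Client name in the module where it is looked up, rather than subclassing and hoping the right class gets constructed. The sketch below uses toy stand-ins for Client and another_method (all names and host values are illustrative) and performs the patch with plain attribute assignment; pytest's monkeypatch fixture does exactly this and undoes it automatically:

```python
import sys

# Toy stand-ins for the real classes; hosts and values are illustrative only.
class Client:
    def __init__(self, hosts=None):
        self.hosts = hosts or [{'host': 'prod.example', 'port': 9200}]

    def some_method(self):
        return self.hosts[0]['host']

def another_method():
    # builds a Client internally, like the code under test
    return Client().some_method()

class MockClient(Client):
    def __init__(self):
        super().__init__(hosts=[{'host': 'localhost', 'port': 9201}])

# Replace the Client name in the module where it is *looked up*; in pytest
# this is one line inside a fixture or test:
#     monkeypatch.setattr(mymodule, "Client", MockClient)
mod = sys.modules[__name__]
mod.Client = MockClient
print(another_method())  # -> localhost
```

Because the patch targets the name the code under test resolves, it covers both the explicit `Client()` construction and the implicit one inside `another_method`.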
This might be a newbie question, but I can't get dependency_overrides to work for testing.
Following the docs this should be simple to implement but I'm missing something…
Consider the following code:
In main.py:
from fastapi import FastAPI
from routes import router
app = FastAPI()
app.include_router(router)
In routes.py:
from fastapi import APIRouter, status
from fastapi.param_functions import Depends
from fastapi.responses import JSONResponse
from authentication import Authentication
router = APIRouter()
@router.get("/list/", dependencies=[Depends(Authentication(role=user))])
async def return_all():
    response = JSONResponse(
        status_code=status.HTTP_200_OK,
        content="Here is all the objects!"
    )
    return response
In test_list.py:
from unittest import TestCase
from fastapi.testclient import TestClient
from main import app
from authentication import Authentication
def override_dependencies():
    return {"Some": "Thing"}

client = TestClient(app)
app.dependency_overrides[Authentication] = override_dependencies

class ListTestCase(TestCase):
    def test_list_get(self):
        response = client.get("/list/")
        self.assertEqual(200, response.status_code)
Gives the following error:
self.assertEqual(200, response.status_code)
AssertionError: 200 != 403
i.e., it tried to authenticate but was denied, so it doesn't seem to override my dependency.
Note that Depends is used in the path operation decorator (@router.get), and not in the function as in the docs…
I see that you are using Authentication(role=user) as a dependency, but then you are trying to override Authentication; these are two different callables, the former actually being Authentication(role=user).__call__, so I guess FastAPI is not able to match the correct override.
The problem with this approach is that Authentication(role=user).__call__ is a method of a throwaway instance, but to override a dependency in dependency_overrides you must be able to address the callable statically, so that it is the same object every time you refer to it.
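The mismatch can be reproduced without FastAPI, using a minimal Authentication stand-in (all names here are illustrative):

```python
# Toy stand-in for the real Authentication class.
class Authentication:
    def __init__(self, role):
        self.role = role

    def __call__(self):
        raise PermissionError("403")

# Every Authentication(role=...) expression builds a distinct object, so an
# instance created in a test never equals the one the route was declared with:
print(Authentication(role="user") == Authentication(role="user"))  # -> False

# A single shared instance is a stable key that an override can match:
require_user = Authentication(role="user")
overrides = {}  # stand-in for app.dependency_overrides
overrides[require_user] = lambda: {"Some": "Thing"}
print(require_user in overrides)  # -> True
```

Keeping the dependency instance in one module-level variable, and using that same variable both in the route and as the override key, is what makes it addressable statically.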
I had to implement a similar thing, and I solved it by injecting the authentication logic like this:
class RestIdentityValidator:
    methods: List[AuthMethod]

    def __init__(self, *methods: AuthMethod):
        self.methods = list(dict.fromkeys([e for e in AuthMethod] if methods is None else methods))

    def __call__(
        self,
        bearer_token_identity: IdentityInfo | None = Depends(get_bearer_token_identity),
        basic_token_identity: IdentityInfo | None = Depends(get_basic_token_identity)
    ) -> IdentityInfo | None:
        identity = None
        for auth_method in self.methods:
            match auth_method:
                case AuthMethod.BEARER:
                    identity = bearer_token_identity
                case AuthMethod.BASIC:
                    identity = basic_token_identity
                case _:
                    pass
            if identity is not None:
                break
        return identity
and then used it like this:
@router.get("/users", response_model=Response[List[UserOut]])
async def get_all_users(
    identity: IdentityInfo = Depends(RestIdentityValidator(AuthMethod.BEARER)),
    ...
Then you can inject and override this normally. The fact that you are using it in the decorator shouldn't make any difference in this case.
Now this sample is not implementing the same functionality as yours, but I hope it can give you a hint on how to solve your problem.
I need to monkeypatch a class method that is decorated with the @method decorator of the jsonrpcserver library. The class implements an RPC server that is started as an asyncio server and launched using a pytest fixture like this:
# conftest.py
@pytest.fixture(autouse=True, scope="module")
@pytest.mark.asyncio
async def rpc_server(...):
    rpc_server = RpcServer(
        addr="127.0.0.1",
        port=9500,
        ...
    )
    task = asyncio.create_task(rpc_server.start())
    yield rpc_server
    task.cancel()
The test should monkeypatch one of the methods of the RpcServer class:
# test_rpc.py
@pytest.mark.asyncio
async def test_rpc_server_exception(
    rpc_server: RpcServer,
    ...
    monkeypatch: MonkeyPatch,
):
    async def raise_runtime_error():
        raise RuntimeError()

    monkeypatch.setattr(
        RpcServer, "method_to_be_patched", raise_runtime_error, raising=True
    )
    ...  # making an rpc request to launch method_to_be_patched
    assert ...
method_to_be_patched is invoked by async_dispatch of the jsonrpcserver library once a new request is received, and looks like this:
# rpc_server.py
@method
async def method_to_be_patched(self, ...) -> str:
    ...
    return ...
The problem is that monkeypatch is not patching anything and the test passes without raising any exception (like I need it to). I've tried to monkeypatch both RpcServer and the instance yielded from the pytest fixture without any success; yet when debugging, the class method correctly points to the dummy function, but the original one is still invoked.
EDIT: the issue arises because of how Python imports work. As far as I understand, when importing like from ... import ... I'm creating a new reference, so basically I'm patching the reference created in test_rpc.py and not the one in rpc_server.py (correct me if I'm wrong).
So I tried
# test_rpc.py
@pytest.mark.asyncio
async def test_rpc_server_exception(
    rpc_server: RpcServer,
    ...
    monkeypatch: MonkeyPatch,
):
    async def raise_runtime_error():
        raise RuntimeError()

    import network  # the package containing rpc_server.py
    monkeypatch.setattr(
        network.rpc_server.RpcServer, "method_to_be_patched", raise_runtime_error, raising=True
    )
    ...  # making an rpc request to launch method_to_be_patched
    assert ...
but still not getting the intended behaviour.
The project tree is like this
/src
    |rpc_server.py
/test
    |conftest.py
    /e2e
        |test_rpc.py
The solution is to monkeypatch where the method is invoked. Since I'm using jsonrpcserver here, I had to monkeypatch the call method defined inside the async_dispatch module, and now it works as I expected.
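The import-binding behaviour described in the edit can be demonstrated with two in-memory modules. This is a generic sketch of the principle (patch where the name is looked up, not where it is defined), not jsonrpcserver's actual module layout:

```python
import sys
import types

# Build two tiny modules in memory: "lib" defines helper(),
# while "app" imports it with `from lib import helper`.
lib = types.ModuleType("lib")
exec("def helper():\n    return 'real'", lib.__dict__)
sys.modules["lib"] = lib

app = types.ModuleType("app")
exec("from lib import helper\n\ndef run():\n    return helper()", app.__dict__)
sys.modules["app"] = app

# Patching the definition site does NOT change what app.run() calls,
# because app holds its own reference bound at import time.
lib.helper = lambda: "patched"
print(app.run())  # -> real

# Patching the lookup site (the module that uses the name) does work.
app.helper = lambda: "patched"
print(app.run())  # -> patched
```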
I have a bunch of functions that look like this:
def insert_user(user: User, db: Connection) -> None:
    ...

def add_role_to_user(user: User, role: str, db: Connection) -> None:
    ...

def create_post(post: Post, owner: User, db: Connection) -> None:
    ...

# etc.
What these functions all have in common is that they take a Connection parameter called db that they use to modify a database. For performance reasons, I want the functions to be able to pass the db parameter between each other and not create a new connection every time. However, for convenience reasons, I also don't want to have to create and pass the db parameter every time I call the functions myself.
For that reason I have created a decorator:
def provide_db(fn):
    ...
This decorator checks if the keyword arguments contain the key "db", and if not, it creates a connection and passes it to the function. Usage:
@provide_db
def insert_user(user: User, db: Connection) -> None:
    ...
This works perfectly! I can now call the database functions without worrying about connecting to the database, and the functions can pass the db parameters to each other.
However, for this to be typed properly, the decorator needs to modify the function signature of the wrapped function, changing the db parameter from Connection to Optional[Connection].
Is this currently possible with Python's type hints? If so, how is it done?
This is the provide_db function:
def provide_db(fn):
    """Decorator that defaults the db argument to a new connection

    This pattern allows callers to not have to worry about passing a db
    parameter, but also allows functions to pass db connections to each
    other to avoid the overhead of creating unnecessary connections.
    """
    if "db" not in fn.__code__.co_varnames:
        raise ValueError("Wrapped function doesn't accept a db argument")
    db_arg_index = fn.__code__.co_varnames.index("db")

    @wraps(fn)
    def wrapper(*args, **kwargs) -> Result:
        if len(args) > db_arg_index and args[db_arg_index] is not None:
            pass  # db was passed as a positional argument
        elif "db" in kwargs and kwargs["db"] is not None:
            pass  # db was passed as a keyword argument
        else:
            kwargs["db"] = connect()
        return fn(*args, **kwargs)

    wrapper.__annotations__["db"] = Optional[fn.__annotations__["db"]]
    return wrapper
Function annotations are documented as writable in the datamodel and in PEP 526.
Follow this simplified example:
from __future__ import annotations
from typing import Optional

def provide_db(func):
    func.__annotations__["db"] = Optional[func.__annotations__["db"]]
    return func

@provide_db
def f(db: Connection):
    pass

print(f.__annotations__)
For this requirement, it seems good to have all these functions in a class:
class DbOps(object):
    def __init__(self, db):
        self.__db = db

    def insert_user(self, user: User) -> None:
        # use self.__db to connect to the db and insert the user
        ...
Then:
db_ops = DbOps(db)
db_ops.insert_user(...)
etc
The following doesn't execute foo and gives
RuntimeWarning: coroutine 'foo' was never awaited
# urls.py
async def foo(data):
    # process data ...
    ...

@api_view(['POST'])
def endpoint(request):
    data = request.data.get('data')
    # How to call foo here?
    foo(data)
    return Response({})
Django is a synchronous framework, but it supports async behaviour.
Here is a code snippet that may help.
import asyncio
from channels.db import database_sync_to_async

def get_details(tag):
    response = another_sync_function()
    # Create a new event loop to execute the async function
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    async_result = loop.run_until_complete(remove_tags(response, tag))
    loop.close()

# Async function
async def remove_tags(response, tag_id):
    # do something here
    # call another function only for executing database queries
    await tag_query(response, tag_id)

@database_sync_to_async
def tag_query(response, tag_id):
    Mymodel.objects.get(all_tag_id=tag_id).delete()
This way I called an async function from a synchronous function.
Reference: the database_sync_to_async decorator.
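On Python 3.7+ the manual loop management above (new_event_loop, run_until_complete, close) can usually be replaced with asyncio.run, assuming the synchronous view is not itself already running inside an event loop; foo here is a hypothetical stand-in for the real async work:

```python
import asyncio

async def foo(data):
    # stand-in for the real async processing
    await asyncio.sleep(0)
    return data.upper()

def endpoint(data):
    # asyncio.run creates a fresh event loop, runs foo to completion and
    # closes the loop; it must not be called from inside a running loop.
    return asyncio.run(foo(data))

print(endpoint("hello"))  # -> HELLO
```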
Found a way to do it.
Create another file bar.py in the same directory as urls.py.
# bar.py
def foo(data):
    # process data
    ...
# urls.py
from multiprocessing import Process
from .bar import foo

@api_view(['POST'])
def endpoint(request):
    data = request.data.get('data')
    p = Process(target=foo, args=(data,))
    p.start()
    return Response({})
You can't await foo in this context. Since Django is mainly a synchronous framework, it doesn't interact well with asynchronous code. The best advice I can give is to try to avoid using an asynchronous function here, or perhaps use another method of concurrency (i.e. threading or multiprocessing).
Note: there is a great answer given about Django's synchronous nature that can be found here: Django is synchronous or asynchronous?.
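The threading option mentioned above can be sketched as follows; foo and endpoint are illustrative stand-ins, and the Thread object is returned only so the example can be checked, a real view would simply start it and return a Response:

```python
import threading

results = []

def foo(data):
    # stand-in for slow processing done off the request thread
    results.append(data * 2)

def endpoint(data):
    # Fire-and-forget: a Thread is much lighter than a Process and shares
    # memory with the view, at the cost of the GIL for CPU-bound work.
    t = threading.Thread(target=foo, args=(data,))
    t.start()
    return t

endpoint(21).join()
print(results)  # -> [42]
```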