Unable to monkeypatch an RPC server class method in Python

I need to monkeypatch a class method that is decorated with the @method decorator of the jsonrpcserver library. The class implements an RPC server that is started as an asyncio server and is launched using a pytest fixture like this:
# conftest.py
@pytest.fixture(autouse=True, scope="module")
@pytest.mark.asyncio
async def rpc_server(...):
    rpc_server = RpcServer(
        addr="127.0.0.1",
        port=9500,
        ...
    )
    task = asyncio.create_task(rpc_server.start())
    yield rpc_server
    task.cancel()
The test should monkeypatch one of the methods of the RpcServer class:
# test_rpc.py
@pytest.mark.asyncio
async def test_rpc_server_exception(
    rpc_server: RpcServer,
    ...
    monkeypatch: MonkeyPatch,
):
    async def raise_runtime_error():
        raise RuntimeError()

    monkeypatch.setattr(
        RpcServer, "method_to_be_patched", raise_runtime_error, raising=True
    )
    ...  # making an RPC request to trigger method_to_be_patched
    assert ...
method_to_be_patched is invoked by async_dispatch of the jsonrpcserver library once a new request is received, and looks like this:
# rpc_server.py
@method
async def method_to_be_patched(self, ...) -> str:
    ...
    return ...
The problem is that monkeypatch does not patch anything and the test passes without raising the exception I need. I've tried monkeypatching both RpcServer and the instance yielded from the pytest fixture without success; when debugging, the class method correctly points to the dummy function, yet the original one is still invoked.
EDIT: the issue arises because of how Python imports work. As far as I understood, when importing with from ... import ... I'm creating a new reference, so basically I'm patching the reference created in test_rpc.py and not the one in rpc_server.py (correct me if I'm wrong).
So I tried
# test_rpc.py
@pytest.mark.asyncio
async def test_rpc_server_exception(
    rpc_server: RpcServer,
    ...
    monkeypatch: MonkeyPatch,
):
    async def raise_runtime_error():
        raise RuntimeError()

    import network  # the package containing rpc_server.py
    monkeypatch.setattr(
        network.rpc_server.RpcServer, "method_to_be_patched", raise_runtime_error, raising=True
    )
    ...  # making an RPC request to trigger method_to_be_patched
    assert ...
but still not getting the intended behaviour.
The project tree looks like this:
/src
 | rpc_server.py
/test
 | conftest.py
 | /e2e
    | test_rpc.py

The solution is to monkeypatch where the method is invoked. Since I'm using jsonrpcserver, I had to monkeypatch the call function defined inside the async_dispatch module, and now it works as I expected.
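For reference, a minimal sketch of that approach; the exact module path (jsonrpcserver.async_dispatch) and the signature of call depend on the installed jsonrpcserver version, so treat this as illustrative only:

# test_rpc.py
import jsonrpcserver.async_dispatch as async_dispatch  # module path assumed, check your version

@pytest.mark.asyncio
async def test_rpc_server_exception(rpc_server: RpcServer, monkeypatch: MonkeyPatch):
    async def raise_runtime_error(*args, **kwargs):
        raise RuntimeError()

    # Patch the dispatcher-level call so every dispatched request hits the dummy coroutine
    monkeypatch.setattr(async_dispatch, "call", raise_runtime_error, raising=True)
    ...  # make an RPC request that would normally reach method_to_be_patched
    assert ...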


Overriding FastAPI dependencies that have parameters

I'm trying to test my FastAPI endpoints by overriding the injected database using the officially recommended method in the FastAPI documentation.
The function I inject the db with is a closure that lets me build any desired database from a MongoClient by giving it the database name, while (I assume) still working with FastAPI's Depends, since it returns a closure. No error is thrown, so I think this method is correct:
# app
def build_db(name: str):
    def close():
        return build_singleton_whatever(MongoClient, args....)
    return close
Adding it to the endpoint:
# endpoint
@app.post("/notification/feed")
async def route_receive_notifications(db: Database = Depends(build_db("someDB"))):
    ...
And finally, attempting to override it in the tests:
# pytest
# test_endpoint.py
fastapi_app.dependency_overrides[app.build_db] = lambda x: lambda: x
However, the dependency doesn't seem to override at all and the test ends up creating a MongoClient with the IP of the production database as in normal execution.
So, any ideas on overriding FastAPI dependencies that are given parameters in their endpoints?
I have tried creating a mock closure function with no success:
def mock_closure(*args):
    def close():
        return args
    return close
app.dependency_overrides[app.build_db] = mock_closure('otherDB')
And I have also tried providing the same signature, including the parameter, with still no success:
app.dependency_overrides[app.build_db('someDB')] = mock_closure('otherDB')
Edit note: I'm also aware I can create a separate function that builds my desired database and use that as the dependency, but I would much prefer to use this dynamic version, as it scales better to using more databases in my apps and avoids writing essentially repeated functions just so they can be cleanly injected.
I use the following fixtures to override the main db with a db for testing:
from typing import Callable

import pytest
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine

from settings import get_settings

@pytest.fixture()
async def get_engine():
    engine = create_async_engine(get_settings().test_db_url)
    yield engine
    await engine.dispose()

@pytest.fixture()
async def db_session(get_engine) -> AsyncSession:
    async with get_engine.begin() as connection:
        # async_session is the project's sessionmaker
        async with async_session(bind=connection) as session:
            yield session
            await session.close()

@pytest.fixture()
def override_get_async_session(db_session: AsyncSession) -> Callable:
    async def _override_get_async_session():
        yield db_session
    return _override_get_async_session
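To actually use it, the override has to be registered on the app. A minimal sketch, assuming your FastAPI instance is called app and the real dependency is named get_async_session:

from fastapi.testclient import TestClient

@pytest.fixture()
def client(override_get_async_session: Callable):
    # Swap the real session dependency for the test one, then hand out a test client
    app.dependency_overrides[get_async_session] = override_get_async_session
    with TestClient(app) as test_client:
        yield test_client
    app.dependency_overrides.clear()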
There are two issues with your implementation getting in your way:
As you are calling build_db right in the route_receive_notifications function definition, the latter receives the nested close function as a dependency, and it's impossible to override it. To fix this you need to avoid calling your dependency right away while still providing it with the db name. For that you can either define a new dependency to inject the name into build_db:
# app
def get_db_name():
    return "someDB"

def build_db(name: str = Depends(get_db_name)):
    ...

# endpoint
@app.post("/notification/feed")
async def route_receive_notifications(db: Database = Depends(build_db)):
    ...
or use functools.partial (shorter but less elegant):
# endpoint
from functools import partial

@app.post("/notification/feed")
async def route_receive_notifications(db: Database = Depends(partial(build_db, "someDB"))):
    ...
FastAPI requires the overriding function to have the same signature as the original dependency. Simply switching from *args to a single parameter is enough, although using the same argument name and type makes it easier to support in the future. Of course, you need to provide the function itself as the value for dependency_overrides, without calling it:
def mock_closure(name: str):
    def close():
        return name
    return close

app.dependency_overrides[app.build_db] = mock_closure
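A quick sketch of how the override could then be exercised in a test; fastapi_app and the route path are taken from the question, so adjust them to your project:

from fastapi.testclient import TestClient

def test_route_receive_notifications_uses_mock_db():
    fastapi_app.dependency_overrides[app.build_db] = mock_closure
    with TestClient(fastapi_app) as client:
        response = client.post("/notification/feed")
    ...  # assert against the overridden db instead of the production MongoClient
    fastapi_app.dependency_overrides.clear()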

Pytest Unit Testing Sending Email SMTP

I want to test the following function in pytest without actually creating an SMTP server and sending email to the specified address.
def send_email_investor_owner_occupier(self, user_type, smtp_server):
    message = MIMEMultipart()
    message['From'] = self.email_address
    message['To'] = self.email_address
    suburb_set = {suburb for location in user_type.locations for suburb in location.keys()}
    suburb_string = ','.join(suburb_set)
    message['Subject'] = f'Investment Properties in {suburb_string} last updated {user_type.date_posted}'
    body = f'{user_type.properties} with {user_type.bedrooms} bedrooms, {user_type.bathrooms} bathrooms, {user_type.car_spaces} car spaces, priced between ${user_type.min_price} and ${user_type.max_price} in {suburb_string} last updated {user_type.date_posted}. Key metrics calculated for {user_type.loan_type} {user_type.variable_loan_type if user_type.variable_loan_type is not None else ""} loan with {user_type.lvr/100} lvr for {user_type.loan_term} years with {user_type.mortgage_interest}% interest.'
    message.attach(MIMEText(body, "plain"))
    # MIMEMultipart.attach expects a Message object, so wrap the CSV data first
    csv_attachment = MIMEText(self.property_data_io.getvalue(), 'csv')
    csv_attachment.add_header('Content-Disposition', 'attachment', filename='property_data.csv')
    message.attach(csv_attachment)
    with smtplib.SMTP(smtp_server, self.port) as server:
        server.starttls()
        server.login(self.email_address, self.password)
        server.sendmail(self.email_address, self.email_address, message.as_string())
        server.quit()  # redundant: the context manager already quits on exit
I am aware that the SMTP server can be mocked using the Python unittest module, as per https://jingwen-z.github.io/how-to-send-emails-with-python/, but I cannot figure it out using pytest. I would like to be able to do this in pytest if possible, given that all my other tests use that framework.
I'm not entirely sure what you are looking for, as your function doesn't return anything. But I guess you can assert which functions are called inside send_email_investor_owner_occupier.
To use mocker in pytest, I install the pytest-mock module.
You can then define your test as:
def test_something(mocker):
    mocker.patch("foo.bar", return_value=True)
    assert foo.bar()
If you know how to mock in unittest, then mocking in pytest is pretty easy.
Say that the Python module containing the send_email_investor_owner_occupier method is called my_email_sender.py; then you patch my_email_sender.smtplib.SMTP.
One way of doing it would be using the patch decorator:
from unittest.mock import patch

@patch("my_email_sender.smtplib.SMTP")
def test_send_email_investor_owner_occupier_patch_decorator(smtp_mock):
    # call send_email_investor_owner_occupier
    # assert like: smtp_mock.assert_called_once_with("server", port)
    pass
Another way would be to use the pytest-mock plugin to pytest:
pip install pytest-mock
def test_send_email_investor_owner_occupier_pytest_mock_plugin(mocker):
    smtp_mock = mocker.MagicMock(name='smtp_mock')
    mocker.patch('my_email_sender.smtplib.SMTP', new=smtp_mock)
    # call send_email_investor_owner_occupier
    # assert like: smtp_mock.assert_called_once_with("server", port)
    pass
This might seem a bit tedious, but if you have several tests which check the mail sending and you want to mock all of them easily without copying the same code you can mix this approach with a pytest fixture:
import pytest

@pytest.fixture
def smtp_mock(mocker):
    smtp_mock = mocker.MagicMock(name='smtp_mock')
    mocker.patch('my_email_sender.smtplib.SMTP', new=smtp_mock)
    yield smtp_mock

def test_send_email_investor_owner_occupier_pytest_fixture(smtp_mock):
    # call send_email_investor_owner_occupier
    # assert like: smtp_mock.assert_called_once_with("server", port)
    pass

def test_send_email_investor_owner_occupier_pytest_fixture_2(smtp_mock):
    # call send_email_investor_owner_occupier
    # assert like: smtp_mock.assert_called_once_with("server", port)
    pass

def test_send_email_investor_owner_occupier_pytest_fixture_3(smtp_mock):
    # call send_email_investor_owner_occupier
    # assert like: smtp_mock.assert_called_once_with("server", port)
    pass
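If you fill one of these placeholders in, the assertions against the mocked SMTP context manager would look roughly like this; EmailSender, its constructor arguments and user_type are hypothetical stand-ins for your own class and test data:

def test_send_email_calls_smtp(smtp_mock):
    sender = EmailSender(email_address="me@example.com", password="secret", port=587)  # hypothetical class and args
    sender.send_email_investor_owner_occupier(user_type, "smtp.example.com")           # user_type: your own test data
    # smtplib.SMTP was replaced by smtp_mock, so both the constructor call and the
    # context-managed sendmail call can be asserted on
    smtp_mock.assert_called_once_with("smtp.example.com", 587)
    smtp_mock.return_value.__enter__.return_value.sendmail.assert_called_once()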

How to mock logging with pytest in a FastAPI call

I am working on a project with FastAPI.
As the title says, I have an endpoint which logs an event when an HTTPException occurs, and when the request starts and finishes.
Something like this:
@router.get(
    "/chat/{chat_id}/messages/",
    status_code=status.HTTP_200_OK,
)
async def get_messages(chat_message: GetMessageValidator = Depends(), request: Request = None):
    logging.info(request.url.path + " request started")  # LOGGING
    if chat_message.chat_id_validator(chat_message.chat_id):
        logging.error(request.url.path + settings.GET_MESSAGES_CHAT_ID_ERROR)  # LOGGING
        raise HTTPException(
            status_code=404, detail=settings.GET_MESSAGES_CHAT_ID_ERROR
        )
    logging.info(request.url.path + " request OK")  # LOGGING
    return chat_message
And I have built a test with pytest which calls that endpoint, something like this:
@dataclass
class ChatMessage:
    from_user: str
    to_user: str
    chat_id: str
    body: str

@pytest.mark.asyncio()
async def test_pagination_get_messages(client: AsyncSession,
                                        user_token_header):
    conversation = [
        ChatMessage(frank_id, pepe_id, chat_record.chat_id, 'Hello Pepe!')
    ]
    page_1 = await client.get(  # ENDPOINT CALL
        f"/api/v1/chat/messages/",
        json={
            "chat_id": str(chat_record.chat_id),
            "quantity": 3
        },
        headers=user_token_header
    )
    assert page_1.status_code == 200
The pytest response is okay, but I don't want the logging events to fire when I call the endpoint from a test, and I don't have an idea of how to avoid the logging calls while the test for that endpoint is running.
Can you give me an idea or a solution for how to mock logging events in the endpoint when it is called from pytest?
Thanks!
Looking at the source code, logging.info() is a wrapper around logging.root.info(), which in turn calls logging.root._log(), as do the other logging functions. Therefore, you should be able to decorate your test with unittest.mock.patch:
from unittest.mock import patch

@pytest.mark.asyncio()
@patch("logging.root._log")
async def test_pagination_get_messages(...):
    ...
Another option would be to remove all logging handlers in a test setup function.
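A minimal sketch of a closely related alternative, using logging.disable in an autouse fixture instead of removing handlers; it silences every record at or below the given level for the duration of each test:

import logging
import pytest

@pytest.fixture(autouse=True)
def silence_logging():
    logging.disable(logging.CRITICAL)  # mute all log records during the test
    yield
    logging.disable(logging.NOTSET)    # restore normal logging afterwards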
But before doing any of this: why, though? Pytest captures output by default (unless you explicitly enable it), and logging to stdout/stderr isn't exactly resource-heavy.

Mock entire client class with pytest

I have a class that inherits from another class in which we build a client:
class Client(ClientLibrary):
    def __init__(self, hosts=[{'host': <HOST_ADDRESS>, 'port': <PORT>}], *args, **kwargs):
        '''Alternative constructor, where I'd pass in some defaults to simplify connection.'''
        super().__init__(hosts, *args, **kwargs)

    def some_method(self):
        ...
I want to test this class, and already have a test server set up that I want to connect to for testing. My initial approach was to create a MockClient that inherits from the original Client but swaps out the hosts parameter for the test host like so:
# I create a mock client that inherits from the original `Client` class,
# but passes in the host and port of the test server.
class MockClient(Client):
    def __init__(self, hosts=[{'host': MOCK_HOST, 'port': MOCK_PORT}]):
        super().__init__(hosts=hosts)
The idea was then that I'd use this mock client in the tests; however, I have faced a lot of issues when testing functions that encapsulate the original Client class. I have tried patching it but keep running into issues.
Is there a better way to approach this? And can this be done using pytest fixtures?
I want to be able to perform the following sorts of tests:
class TestFunctionThatUtilisesClient:
    def test_in_which_class_is_constructed_explicitly(self):
        client = Client()
        r = client.some_method()
        assert r == 'something'

    def test_in_which_class_is_constructed_implicitly(self):
        r = another_method()  # Client() is called somewhere in here
        assert r == 'something else'
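One possible shape for this with pytest fixtures, sketched under the assumption that the code under test looks Client up via a hypothetical another_module import path (adjust to wherever Client is actually resolved):

import pytest

@pytest.fixture
def mock_client(monkeypatch):
    client = MockClient()
    # Patch the name the code under test resolves, so implicit Client() calls
    # inside another_method() also return the test-server client.
    monkeypatch.setattr("another_module.Client", lambda *args, **kwargs: client)
    return client

def test_in_which_class_is_constructed_implicitly(mock_client):
    r = another_method()  # Client() resolved inside now returns mock_client
    assert r == 'something else'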

How to call asynchronous function in Django?

The following doesn't execute foo and gives
RuntimeWarning: coroutine 'foo' was never awaited
# urls.py
async def foo(data):
    ...  # process data

@api_view(['POST'])
def endpoint(request):
    data = request.data.get('data')
    # How to call foo here?
    foo(data)
    return Response({})
Django is a synchronous framework, but it supports async behavior.
Sharing a code snippet which may help:
import asyncio
from channels.db import database_sync_to_async

def get_details(tag):
    response = another_sync_function()
    # Create a new event loop in which to run the async function
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    async_result = loop.run_until_complete(remove_tags(response, tag))
    loop.close()

# Async function
async def remove_tags(response, tag_id):
    # do something here
    # call another function only for executing database queries
    await tag_query(response, tag_id)

@database_sync_to_async
def tag_query(response, tag_id):
    Mymodel.objects.get(all_tag_id=tag_id).delete()
This is how I called an async function from a synchronous one.
Reference: the database_sync_to_async decorator.
Found a way to do it.
Create another file bar.py in the same directory as urls.py.
# bar.py
def foo(data):
    ...  # process data

# urls.py
from multiprocessing import Process
from .bar import foo

@api_view(['POST'])
def endpoint(request):
    data = request.data.get('data')
    p = Process(target=foo, args=(data,))
    p.start()
    return Response({})
You can't await foo in this context. Seeing that Django is mainly a synchronous framework, it doesn't interact well with asynchronous code. The best advice I can give is to try to avoid using an asynchronous function here, or perhaps use another method of concurrency (i.e. threading or multiprocessing), as sketched below.
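A minimal sketch of the threading variant, assuming it's acceptable for the view to return before foo finishes; the names mirror the question's endpoint:

import asyncio
import threading

def run_foo_in_background(data):
    # Run the coroutine to completion in a daemon thread so the synchronous
    # view can return immediately.
    threading.Thread(target=lambda: asyncio.run(foo(data)), daemon=True).start()

@api_view(['POST'])
def endpoint(request):
    data = request.data.get('data')
    run_foo_in_background(data)
    return Response({})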
Note: there is a great answer given about Django's synchronous nature that can be found here: Django is synchronous or asynchronous?.
