I've encountered this problem and can't see any solution, though it must be a common one, so maybe I'm missing something.
I'm working on a FastAPI app with asynchronous endpoints and an asynchronous database connection. The database connection is passed in as a dependency. I want to write some asynchronous tests for this app.
engine = create_async_engine(connection_string, echo=True)

def get_session():
    return sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)

@router.post("/register")
async def register(
    user_data: UserRequest,
    authorize: AuthJWT = Depends(),
    async_session: sessionmaker = Depends(get_session),
):
    """Register a new user."""
    if authorize.get_jwt_subject():
        raise LogicException("already authorized")
    session: AsyncSession
    async with async_session() as session:
        query = await session.execute(
            select(UserModel).where(UserModel.name == user_data.name)
        )
        ...
I'm using AsyncSession to work with the database, so in my tests the DB connection also has to be asynchronous.
engine = create_async_engine(
    SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
)

app.dependency_overrides[get_session] = lambda: sessionmaker(
    engine, class_=AsyncSession, expire_on_commit=False
)
@pytest.mark.asyncio
async def test_create_user():
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)

    async with AsyncClient(app=app, base_url="http://test") as ac:
        response = await ac.post(
            "/register",
            json={"name": "TestGuy", "password": "TestPass"},
        )
    assert response.status_code == 200, response.text
When running the test, I get the following error:
...
coin_venv\lib\site-packages\fastapi\routing.py:217: in app
solved_result = await solve_dependencies(
coin_venv\lib\site-packages\fastapi\dependencies\utils.py:529: in solve_dependencies
solved = await run_in_threadpool(call, **sub_values)
AttributeError: module 'anyio' has no attribute 'to_thread'
I concluded that the error appears only when an endpoint has a dependency. The weird part is that I don't even have anyio installed in my environment.
So, is there a way to test asynchronous FastAPI endpoints with dependencies and an asynchronous DB connection? Surely there must be; it's not like this situation is unique...
UPD: I tried using the @pytest.mark.anyio decorator and also installed trio and anyio. Now pytest seems to discover two distinct tests in this one:
login_test.py::test_create_user[asyncio]
login_test.py::test_create_user[trio]
Both fail, the first one with what seems to be a valid error in my code, and the second one with:
RuntimeError: There is no current event loop in thread 'MainThread'.
I guess that's true, though I don't really know whether pytest creates an event loop to test async code. Anyway, I don't need the second test; why is it here, and how can I get rid of it?
It turned out I can specify the backend to run the tests with, like this:
@pytest.fixture
def anyio_backend():
    return 'asyncio'
So now only the right tests are running.
pytest runs on a different event loop (not the one get_running_loop would return in your code), so when you try to run your code in the same context it raises an exception. I suggest you consider using nest_asyncio (https://pypi.org/project/nest-asyncio/) so that pytest can run in the same event loop:
import nest_asyncio
nest_asyncio.apply()
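To see the underlying problem nest_asyncio patches away, here is a minimal, dependency-free sketch: by design, asyncio refuses to start a second event loop from inside a running one.

```python
import asyncio

async def outer():
    coro = asyncio.sleep(0)
    try:
        # Starting a second event loop from inside a running one is rejected.
        asyncio.run(coro)
    except RuntimeError as e:
        return str(e)
    finally:
        coro.close()  # avoid a "coroutine was never awaited" warning

msg = asyncio.run(outer())
print(msg)  # asyncio.run() cannot be called from a running event loop
```

nest_asyncio.apply() monkey-patches the loop so that such nested runs are allowed, which is why it helps when pytest and your code end up contending for the same loop.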
I am using Prefect, and I tried to download a file from S3.
When I hard-coded the AWS credentials, the file downloaded successfully:
import asyncio
from prefect_aws.s3 import s3_download
from prefect_aws.credentials import AwsCredentials
from prefect import flow, get_run_logger

@flow
async def fetch_taxi_data():
    logger = get_run_logger()
    credentials = AwsCredentials(
        aws_access_key_id="xxx",
        aws_secret_access_key="xxx",
    )
    data = await s3_download(
        bucket="hongbomiao-bucket",
        key="hm-airflow/taxi.csv",
        aws_credentials=credentials,
    )
    logger.info(data)

if __name__ == "__main__":
    asyncio.run(fetch_taxi_data())
Now I tried to load the credentials from Prefect Blocks.
I created an AWS Credentials Block (shown as a screenshot in the original post). However,
aws_credentials_block = AwsCredentials.load("aws-credentials-block")
data = await s3_download(
    bucket="hongbomiao-bucket",
    key="hm-airflow/taxi.csv",
    aws_credentials=aws_credentials_block,
)
throws the error:
AttributeError: 'coroutine' object has no attribute 'get_boto3_session'
And
aws_credentials_block = AwsCredentials.load("aws-credentials-block")
credentials = AwsCredentials(
    aws_access_key_id=aws_credentials_block.aws_access_key_id,
    aws_secret_access_key=aws_credentials_block.aws_secret_access_key,
)
data = await s3_download(
    bucket="hongbomiao-bucket",
    key="hm-airflow/taxi.csv",
    aws_credentials=credentials,
)
throws the error:
AttributeError: 'coroutine' object has no attribute 'aws_access_key_id'
I didn't find any useful documentation about how to use it.
Am I supposed to use Blocks to load credentials? If so, what is the correct way to use Blocks in Prefect? Thanks!
I just found that the snippet in the screenshot in the question is missing an await.
After adding await, it works now!
aws_credentials_block = await AwsCredentials.load("aws-credentials-block")
data = await s3_download(
    bucket="hongbomiao-bucket",
    key="hm-airflow/taxi.csv",
    aws_credentials=aws_credentials_block,
)
UPDATE:
Got an answer from Michael Adkins on GitHub, and thanks!
await is only needed if you're writing an async flow or task. For users writing synchronous code, an await is not needed (and not possible). Most of our users are writing synchronous code and the example in the UI is in a synchronous context so it does not include the await.
I saw the source code at
https://github.com/PrefectHQ/prefect/blob/1dcd45637914896c60b7d49254a34e95a9ce56ea/src/prefect/blocks/core.py#L601-L604
@classmethod
@sync_compatible
@inject_client
async def load(cls, name: str, client: "OrionClient" = None):
    # ...
So I think that as long as a function has the @sync_compatible decorator, it can be used as both an async and a sync function.
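The idea behind such a decorator can be sketched with a toy version (this is an illustration of the pattern, not Prefect's actual implementation): if no event loop is running, drive the coroutine to completion on the spot; otherwise hand it back for the caller to await.

```python
import asyncio
import functools

def sync_compatible(fn):
    """Toy sketch: run the coroutine immediately in a sync context,
    return it for awaiting in an async context."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        coro = fn(*args, **kwargs)
        try:
            asyncio.get_running_loop()
        except RuntimeError:
            return asyncio.run(coro)   # sync caller: no loop running
        return coro                    # async caller: must `await` the result
    return wrapper

@sync_compatible
async def load(name):
    return f"loaded {name}"

print(load("aws-credentials-block"))            # sync call, no await needed

async def main():
    print(await load("aws-credentials-block"))  # async call, await required

asyncio.run(main())
```

This also explains the original error: inside an async flow, the undecorated-looking call returns a coroutine, so forgetting the await leaves you holding a 'coroutine' object with no get_boto3_session attribute.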
Consider the following FastAPI setup:
application.add_event_handler(
    "startup",
    create_start_app_handler(application, settings),
)

def create_start_app_handler(
    app: FastAPI,
    settings: AppSettings,
) -> Callable:
    async def start_app() -> None:
        await connect_to_db(app, settings)
    return start_app

async def connect_to_db(app: FastAPI, settings: AppSettings) -> None:
    db_url = settings.DATABASE_URL
    engine = create_engine(db_url, pool_size=settings.POOL_SIZE, max_overflow=settings.MAX_OVERFLOW)
    SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
    db = SessionLocal()

    def close_db():
        db.close()
        engine.dispose()

    app.state.db = db
    app.state.close_db = close_db
close_db is used to close the database connection on app shutdown.
I have the following dependencies defined:
def _get_db(request: Request) -> Generator:
    yield request.app.state.db

def get_repository(
    repo_type: Type[BaseRepository],
) -> Callable[[Session], BaseRepository]:
    def _get_repo(
        sess: Session = Depends(_get_db),
    ) -> BaseRepository:
        return repo_type(sess)
    return _get_repo
Would this still allow me to take advantage of connection pooling?
Also, this feels a little hacky and I could use some feedback if there's anything in particular that I should not be doing.
To be blunt: it seems overly complicated for something that is pretty well covered in the docs.
In your case, you create only one instance of SessionLocal() and share it across all your requests (because you store it in app.state). In other words: no, this will not use connection pooling; it will use only one connection.
A better approach is to yield an instance per request, either via middleware or via a dependency. That way the connection is actually closed once the incoming request has been fully handled. For example, like this:
def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.get("/")
def root(db: SessionLocal = Depends(get_db)):
    return "hello world"
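Why the finally block reliably closes the session can be seen with a plain generator, which is essentially how FastAPI drives a yield-dependency (the names here are illustrative stand-ins, no FastAPI required):

```python
closed = []

class FakeSession:
    """Stand-in for a SessionLocal() session; only records that close() ran."""
    def close(self):
        closed.append(True)

def get_db():
    db = FakeSession()
    try:
        yield db
    finally:
        db.close()

gen = get_db()
db = next(gen)    # FastAPI injects `db` into the endpoint at this point
# ... endpoint handles the request ...
gen.close()       # after the response, the generator is finalized -> finally runs
assert closed == [True]
```

Because a fresh session is created and closed per request, the engine's pool can hand out and reclaim connections normally, which is what restores pooling.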
I am not sure how you ended up where you did, but I would recommend refactoring a fair bit.
So I'm using django_channels to handle some WebSocket stuff, and since Django 3.1 you can write unittest-like tests for async code, so I decided to go with that.
For some reason, when the Consumer is reached, it can't see the test data.
I'm using model_bakery (but I also tried plain Django ORM) and have a very simple test.
class TestChatConsumer(TestCase):
    url = '/ws/chat/'

    def setUp(self):
        self.user = baker.make_recipe('registration.user')
        self.chat = baker.make_recipe('chat.chat')

    async def test_setup_channel_layer_ok(self):
        consumer = WebsocketCommunicator(
            application=AuthMiddlewareStack(ChatConsumer.as_asgi()),
            path=self.url,
        )
        consumer.scope['user'] = self.user
        await consumer.connect()
        await consumer.send_json_to({
            'type': 'setup_channel_layer',
            'chat': self.chat.pk,
        })
        response = await consumer.receive_json_from()
        self.assertEqual(response['type'], 'info')
        self.assertEqual(response['content']['message'], 'Chat connected!')
The problem is that the entry is created in the test, but when the consumer accesses the database, the entry seems to be missing.
Do you know if there's some desync between the test database and the consumer, or something similar?
Edit: I added a dummy function to the consumer to check what it sees from the test (the test, the consumer code, and the output at the breakpoint were posted as screenshots).
So in the end, the problem was django.test.TestCase.
It was fixed by changing class TestChatConsumer(TestCase): to class TestChatConsumer(TransactionTestCase):.
TestCase wraps each test in a transaction that is rolled back rather than committed, so data created in setUp is not visible to the consumer, which talks to the database over a separate connection. TransactionTestCase actually commits to the test database (and truncates tables between tests), so the consumer can see the entries.
I'm trying to make an async database request using SQLAlchemy, as described in the example at
https://docs.sqlalchemy.org/en/14/orm/extensions/asyncio.html#synopsis-orm under the title "Preventing Implicit IO when Using AsyncSession".
As I understand the code, there are two ways to create a session:
using async_session = AsyncSession(engine, expire_on_commit=False)
or using sessionmaker with the class_=AsyncSession parameter.
My code looks like the following:
async def setup_connection(self):
    self.logger.info("Pause to be sure that i am async")
    await asyncio.sleep(1)
    self.logger.info("Async Pause finished")
    self.logger.info("Database engine uri: %s", self.database_engine_uri)
    self.database_engine = create_async_engine(self.database_engine_uri, pool_size=self.pool_size, echo=True)
    async_session = AsyncSession(self.database_engine, expire_on_commit=False)
    self.logger.info("Before hang")
    async with async_session() as session:
        self.logger.info("Inside with")
        ...
After I execute my code, I get the following:
2021-06-17 16:07:52,942 File:database.py Function:setup_connection Line:46 Pause to be sure that i am async
2021-06-17 16:07:53,943 File:database.py Function:setup_connection Line:48 Async Pause finished
2021-06-17 16:07:53,943 File:database.py Function:setup_connection Line:49 Database engine uri: mysql+aiomysql://user:password@10.111.117.9/db
2021-06-17 16:07:53,954 File:database.py Function:setup_connection Line:53 Before hang
It feels like the code just hangs at "async with async_session() as session:", because the next log message never appears. Can you please help me with the proper and simplest way to use asyncio with SQLAlchemy?
In this unit test, I would like to check that the device has been created and that the expiry date is 7 days in the future.
database.py
import databases
database = databases.Database(settings.sqlalchemy_database_uri)
Unit Test:
from database.database import database

def test_successful_register_expiry_set_to_seven_days():
    response = client.post(
        "/register/",
        headers={},
        json={"device_id": "u1"},
    )
    assert response.status_code == 201
    query = device.select(whereclause=device.c.id == "u1")
    d = database.fetch_one(query)
    assert d.expires_at == datetime.utcnow().replace(microsecond=0) + timedelta(days=7)
Because d is a coroutine object, it fails with the message:
AttributeError: 'coroutine' object has no attribute 'expires_at'
And I can't use await inside a unit test.
d = await database.fetch_one(query)
What am I missing, please?
Well, it is never awaited: your code gets back a coroutine object, which is returned before it is ever handed to the event loop's scheduler.
If you are using an asynchronous driver, you need to await the call. Are there any workarounds for this? Yes.
You can use asyncio.run(awaitable) to run the coroutine inside an event loop.
import asyncio
d = asyncio.run(database.fetch_one(query))
If an event loop has already been created (for example by your framework), you may want to reuse it instead; you can get it with asyncio.get_event_loop() and drive the coroutine to completion with run_until_complete (note this only works if that loop is not already running):
import asyncio
asyncio.get_event_loop().run_until_complete(database.fetch_one(query))
You can also use the @pytest.mark.asyncio decorator from pytest-asyncio (see its documentation), which lets the test itself be a coroutine so that await works directly:
@pytest.mark.asyncio
async def test_dummy():
    await some_awaitable()