FastAPI DB Context - python

I think I am misunderstanding how dependency injection is used in FastAPI, specifically in the context of DB sessions.
My current setup is FastAPI, SQLAlchemy & Alembic (although I am writing the raw SQL myself), Pydantic, etc. Pretty straightforward.
I have basic CRUD routes which communicate directly with my repository layer, and all is working. In these methods I am able to successfully use the DB dependency injection. See example code below:
Dependencies
def get_database(request: Request) -> Database:
    return request.app.state._db

def get_repository(Repo_type: Type[BaseRepository]) -> Callable:
    def get_repo(db: Database = Depends(get_database)) -> BaseRepository:
        return Repo_type(db)
    return get_repo
Example GET by ID Route
@router.get("/{id}/", response_model=TablePub, name="Get Table by id")
async def get_table_by_id(
    id: UUID, table_repo: TableRepository = Depends(get_repository(TableRepository))
) -> TableInDB:
    table = await table_repo.get_table_by_id(id=id)
    if not table:
        raise HTTPException(status_code=HTTP_404_NOT_FOUND, detail="No Table found with that id.")
    return table
Corresponding Repository
from databases import Database

class BaseRepository:
    def __init__(self, db: Database) -> None:
        self.db = db

class TableRepository(BaseRepository):
    async def get_table_by_id(self, *, id: UUID) -> Optional[TableInDB]:
        table = await self.db.fetch_one(
            query=GET_TABLE_BY_ID_QUERY,
            values={"id": id},
        )
        if not table:
            return None
        return TableInDB(**table)
Now I want to start doing some more complex operations and want to add a service layer to house all of the business logic.
What is the correct way to structure this so that I can reuse the repositories I have already written? For example, I want to return all Sales for a Table, but I need to get the table number from the DB before I can query the Sales table. The route takes table_id as a param -> service layer, where I fetch the table by ID (using the existing repo) -> from that object, get the table number, then make a request to an external API that requires the table number as a param.
What I have so far:
Route
@router.get("/{table_id}", response_model=SalesPub, name="Get Sale Entries by table id")
async def get_sales_by_table_id(
    table_id: UUID = Path(..., title="ID of the Table to get Sales Entries for")):
    response = await SalesService.get_sales_from_external_API(table_id=table_id)
    return response
Service Layer 'SalesService'
async def get_sales_from_external_API(
    table_id: UUID,
    table_repo: TableRepository = Depends(get_repository(TableRepository))
) -> TableInDB:
    table_data = await table_repo.get_table_by_id(id=table_id)
    if table_data is None:
        logger.info(f"No table with id {table_id} could be found")
    table_number = table_data.number
    client_id = table_data.client_id
    sales = await salesGateway.call_external_API(table_number, client_id)
    return sales
The code breaks at this line: table_data = await table_repo.get_table_by_id(id=table_id)
with the error AttributeError: 'Depends' object has no attribute 'get_table_by_id'
What I don't understand is that the code is almost identical to the route method that gets the table by ID. The TableRepository behind the Depends object does have a get_table_by_id method. What am I doing incorrectly, and is this the best way to split business logic from database actions?
Thanks in advance

I seem to have found a solution to this, although I'm not sure it is the best way.
Depends only works in FastAPI routes and dependency functions; I was trying to use it in a regular function.
I needed to resolve table_repo via Depends in the route and pass it in as a parameter to the external API call function.
@router.get("/table/{table_id}/", response_model=SalePub, name="Get Sales by table id")
async def get_sales_by_table_id(
    table_id: UUID = Path(..., title="ID of the Table to get Sales Entries for"),
    table_repo: TableRepository = Depends(get_repository(TableRepository))):
    response = await get_sales_entries_from_pos(table_id=table_id, table_repo=table_repo)
    return response
The issue I am foreseeing is that if I have a large service that needs access to many repos, I have to wire up each of them on the router through Depends, which seems a bit strange to me.
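One pattern that avoids wiring every repository through the router is to give the service a constructor and resolve the service itself through its own dependency function, so the route declares only one Depends and the repositories are injected a level down. Here is a self-contained sketch of the idea; the fake repository and the echoed sales string are stand-ins for illustration, not the question's real code:

```python
import asyncio
from dataclasses import dataclass
from typing import Optional
from uuid import UUID, uuid4


@dataclass
class TableInDB:
    """Minimal stand-in for the question's TableInDB model."""
    id: UUID
    number: int
    client_id: str


class FakeTableRepository:
    """Stand-in for TableRepository; the real one would hit the DB."""
    def __init__(self, tables: dict) -> None:
        self.tables = tables

    async def get_table_by_id(self, *, id: UUID) -> Optional[TableInDB]:
        return self.tables.get(id)


class SalesService:
    """Repositories arrive through the constructor, not through Depends."""
    def __init__(self, table_repo) -> None:
        self.table_repo = table_repo

    async def get_sales_for_table(self, table_id: UUID) -> Optional[str]:
        table = await self.table_repo.get_table_by_id(id=table_id)
        if table is None:
            return None
        # The real service would call the external sales API with
        # table.number and table.client_id; here we just echo the number.
        return f"sales-for-table-{table.number}"


def get_sales_service(table_repo) -> SalesService:
    # In FastAPI this provider would declare
    #   table_repo: TableRepository = Depends(get_repository(TableRepository))
    # and the route would declare
    #   service: SalesService = Depends(get_sales_service)
    return SalesService(table_repo)


tid = uuid4()
service = get_sales_service(FakeTableRepository({tid: TableInDB(tid, 7, "c1")}))
print(asyncio.run(service.get_sales_for_table(tid)))  # sales-for-table-7
```

With this shape, adding a repo to a large service means touching only the service constructor and its provider function; the router stays a single Depends(get_sales_service).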

Related

How do I destructure an API with python Django and django-rest-framework?

I have successfully compiled and run a Django REST project consuming the CocktailDB API. On the local server, when I open http://127.0.0.1:8000/api/ I get
{
"ingredients": "http://127.0.0.1:8000/api/ingredients/",
"drinks": "http://127.0.0.1:8000/api/drinks/",
"feeling-lucky": "http://127.0.0.1:8000/api/feeling-lucky/"
}
But when I go to one of the links mentioned in the json result above, for example:
http://127.0.0.1:8000/api/ingredients/
I get an empty [] with a status 200 OK!
I need an endpoint to GET drinks and ingredients before I can destructure to specific details using angular.
I implemented a helper folder in the app with the API function below:
class TheCoctailDBAPI:
    THECOCTAILDB_URL = 'https://www.thecocktaildb.com/api/json/v1/1/'

    async def __load_coctails_for_drink(self, drink, session):
        for i in range(1, 16):
            ingredientKey = 'strIngredient' + str(i)
            ingredientName = drink[ingredientKey]
            if not ingredientName:
                break
            if ingredientName not in self.ingredients:
                async with session.get(f'{TheCoctailDBAPI.THECOCTAILDB_URL}search.php?i={ingredientName}') \
                        as response:
                    result = json.loads(await response.text())
                    self.ingredients[ingredientName] = result['ingredients'][0]
What was your expected response?
Add the function that is called by this API as well as the DB settings in the question, so that we can properly help you.
Are you sure that you are connecting and pulling data from a remote location? It looks to me like your local DB is empty, so the API has no data to return.

How can I get more than one document from mongodb with fastapi?

my db model looks like this...
from pydantic import BaseModel

class Store(BaseModel):
    name: str
    store_code: str
and there can be stores with the same name in the DB but different store_codes.
What I want is to filter all the information for stores sharing the same name.
For example, if my DB is like this...
{ name: lg, store_code: 123 }
{ name: lg, store_code: 456 }
I'd like to see all those two documents
My Python FastAPI code is like this:
from fastapi import FastAPI, HTTPException
from database import *

app = FastAPI()

@app.get("/api/store{store_name}", response_model=Store)
async def get_store_by_name(store_name):
    response = await fetch_store_by_name(store_name)
    if response:
        return response
    raise HTTPException(status_code=404)
and this is my mongo query code...
from pymongo import MongoClient
from model import Store

client = MongoClient(host='localhost', port=27017)
database = client.store

async def fetch_store_by_name(store_name:str):
    document = collection.find({"name":store_name})
    return document
I thought the result would eventually contain two documents,
but there's always an error like this:
pydantic.error_wrappers.ValidationError: 1 validation error for Store
response
value is not a valid dict (type=type_error.dict)
Can anyone help me, please?
++++
I just changed my query like this
async def fetch_store_by_name(store_name:str):
    stores = []
    cursor = collection.find({"name":store_name})
    for document in cursor:
        stores.append(document)
    return stores
This should return the two documents I expected, but it still raises
ValueError: [TypeError("'ObjectId' object is not iterable"), TypeError('vars() argument must have __dict__ attribute')]
I think my FastAPI code has a problem, but I really have no idea what...
async def fetch_store_by_name(store_name:str):
    stores = []  # ---Fault in this line---
    cursor = collection.find({"name":store_name})
    for document in cursor:
        stores.append(document)
    return stores
stores should be a string value, not a list, as MongoDB will try to find it as the default value that you provided - in this case, str.
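For what it's worth, both tracebacks are also consistent with two other issues: response_model=Store expects a single object (many documents need response_model=List[Store]), and pymongo documents carry an ObjectId under "_id", which Pydantic cannot serialize. A minimal sketch of shaping the documents before validation; the documents below are made up, and in the real query you can instead exclude the field with a projection, collection.find({"name": store_name}, {"_id": 0}):

```python
from typing import List

# Hypothetical documents, shaped like pymongo's find() yields them;
# in a real collection "_id" holds an ObjectId, which Pydantic cannot
# serialize - hence the errors in the question.
raw_docs = [
    {"_id": "object-id-1", "name": "lg", "store_code": "123"},
    {"_id": "object-id-2", "name": "lg", "store_code": "456"},
]


def fetch_store_by_name(docs: List[dict], store_name: str) -> List[dict]:
    """Return every matching document, minus the _id field, so each one
    validates against the Store model (route: response_model=List[Store])."""
    return [
        {k: v for k, v in doc.items() if k != "_id"}
        for doc in docs
        if doc["name"] == store_name
    ]


stores = fetch_store_by_name(raw_docs, "lg")
print(len(stores))  # 2
```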

Why do I get different sessions when opening the same database with SQLAlchemy, and how can I update it?

I am writing some tests with pytest; I want to test creating a user and sending the email with the POST method.
After some debugging, I know the issue is that I open two databases in memory, even though they are the same database, SessionLocal().
So how can I fix this? I tried db.flush(), but it doesn't work.
This is the POST method code:
@router.post("/", response_model=schemas.User)
def create_user(
    *,
    db: Session = Depends(deps.get_db),  # get_db is SessionLocal()
    user_in: schemas.UserCreate,
    current_user: models.User = Depends(deps.get_current_active_superuser),
) -> Any:
    """
    Create new user.
    """
    user = crud.user.get_by_email(db, email=user_in.email)
    if user:
        raise HTTPException(
            status_code=400,
            detail="The user with this username already exists in the system.",
        )
    user = crud.user.create(db, obj_in=user_in)
    print("====post====")
    print(db.query(models.User).count())
    print(db)
    if settings.EMAILS_ENABLED and user_in.email:
        send_new_account_email(
            email_to=user_in.email, username=user_in.email, password=user_in.password
        )
    return user
and the test code is:
def test_create_user_new_email(
    client: TestClient, superuser_token_headers: dict, db: Session  # db is SessionLocal()
) -> None:
    username = random_email()
    password = random_lower_string()
    data = {"email": username, "password": password}
    r = client.post(
        f"{settings.API_V1_STR}/users/", headers=superuser_token_headers, json=data,
    )
    assert 200 <= r.status_code < 300
    created_user = r.json()
    print("====test====")
    print(db.query(User).count())
    print(db)
    user = crud.user.get_by_email(db, email=username)
    assert user
    assert user.email == created_user["email"]
and the test result is
> assert user
E assert None
====post====
320
<sqlalchemy.orm.session.Session object at 0x7f0a9f660910>
====test====
319
<sqlalchemy.orm.session.Session object at 0x7f0aa09c4d60>
Your code does not provide enough information to help you; the key issues are probably in what is hidden and only explained by your comments.
It also seems like you are confusing SQLAlchemy sessions and databases. If you are not familiar with these concepts, I highly recommend you have a look at the SQLAlchemy documentation.
Looking at your code structure, it seems like you are using FastAPI.
If you want to test SQLAlchemy with pytest, I recommend using a pytest fixture with SQL transactions.
Here is my suggestion for implementing such a test. I'll suppose that you want to run the tests on your actual database rather than create a new database especially for the tests. This implementation is heavily based on this GitHub gist (the author made a "feel free to use" statement, so I suppose he is OK with me copying his code here):
# test.py
import pytest
from sqlalchemy import create_engine
from sqlalchemy.orm import Session
from fastapi.testclient import TestClient
from myapp.models import BaseModel
from myapp.main import app  # import your fastapi app
from myapp.database import get_db  # import the dependency

client = TestClient(app)

# scope="session" means that the engine will last for the whole test session
@pytest.fixture(scope="session")
def engine():
    return create_engine("postgresql://localhost/test_database")

# at the end of the test session, drop the created metadata using a fixture with yield
@pytest.fixture(scope="session")
def tables(engine):
    BaseModel.metadata.create_all(engine)
    yield
    BaseModel.metadata.drop_all(engine)

# here scope="function" (the default), so each time a test finishes, the database is cleaned
@pytest.fixture
def dbsession(engine, tables):
    """Returns an sqlalchemy session, and after the test tears down everything properly."""
    connection = engine.connect()
    # begin the nested transaction
    transaction = connection.begin()
    # use the connection with the already started transaction
    session = Session(bind=connection)
    yield session
    session.close()
    # roll back the broader transaction
    transaction.rollback()
    # put back the connection to the connection pool
    connection.close()
## end of the gist.github code

@pytest.fixture
def db_fastapi(dbsession):
    def override_get_db():
        db = dbsession
        try:
            yield db
        finally:
            db.close()
    client.app.dependency_overrides[get_db] = override_get_db
    yield dbsession

# Now you can run your test
def test_create_user_new_email(db_fastapi):
    username = random_email()
    # ...

Google Admin SDK Latency

I am experiencing latency when doing queries against the Google Admin API.
def get_user(self, email: str) -> dict:
    res = (
        self.service.users()
        .list(
            domain="gmail.com",
            projection="full",
            query="email={0}".format(email),
        )
        .execute()
    )
    if "users" not in res or len(res["users"]) != 1:
        msg = "Could not find user %s" % email
        logging.error(msg)
        raise GoogleAdminNonExistentUser(msg)
    return res["users"][0]

def create_user(
    self,
    email: str,
    first_name: str,
    last_name: str,
    org_unit_path: str,
    manager_email: str,
) -> dict:
    user_info = {...}
    try:
        res = self.service.users().insert(body=user_info).execute()
        return res
    except HttpError as error:
        exc = self._generate_error(error)
        logger.exception(exc.message)
        raise exc
Take for example these two calls. In my test suite, I do a test for creating a user and immediately deleting them. In the next test I create the same user and update custom attributes. I then validate that those attributes were set.
test_create_delete():
    create_user(EMAIL)
    delete_user(EMAIL)

test_create_update():
    create_user(EMAIL)  # This will variably error out if the delete_user from the last request hasn't replicated throughout Google
    update_user(EMAIL, UPDATE_INFO)
    user = get_user()
    # This assertion will variably fail if get_user() fetches old data
    assert the update info is in user
I could litter the tests with sleeps, but build time is important. Is there a way to force the Google Admin API to return the freshest data possible?
Not possible, because it has many dependent services. We have ongoing projects to improve performance, though. (I'm one of the developers behind these services and just ran across this question.)
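In the meantime, a bounded polling helper keeps tests deterministic without scattering fixed sleeps. A sketch; wait_until and try_get_user are hypothetical names, not part of the Admin SDK:

```python
import time


def wait_until(check, timeout_s: float = 10.0, interval_s: float = 0.5):
    """Poll `check` until it returns a truthy value or the timeout expires.

    A bounded alternative to fixed sleeps for eventually-consistent APIs:
    `check` is any zero-argument callable, e.g. one that calls get_user()
    and returns None instead of raising when the user is missing.
    """
    deadline = time.monotonic() + timeout_s
    while True:
        result = check()
        if result:
            return result
        if time.monotonic() >= deadline:
            raise TimeoutError("condition not met within %.1fs" % timeout_s)
        time.sleep(interval_s)


# Hypothetical use in the test suite (try_get_user is an assumed wrapper
# that swallows GoogleAdminNonExistentUser and returns None):
#     user = wait_until(lambda: try_get_user(EMAIL), timeout_s=30)
```

The test then proceeds as soon as replication catches up, and the timeout bounds the worst case instead of paying it on every run.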

Python flasgger: Get query with extended conditions? (more, less, between...)

I am developing a Python application based on Flask that connects to a PostgreSQL database and exposes the API using flasgger (Swagger UI).
I have already defined a basic API (handling entries by ID, etc.) as well as a query API to match different parameters (name=='John Doe' for example).
I would like to expand this query API to integrate more complex queries such as lower than, higher than, between, contains, etc.
I searched the internet but couldn't find a proper way to do it. Any suggestions?
I found this article which was useful but does not say anything about the implementation of the query: https://hackernoon.com/restful-api-designing-guidelines-the-best-practices-60e1d954e7c9
Here is briefly how it looks like so far (some extracted code):
GET_query.yml:
Return an account information
---
tags:
  - accounts
parameters:
  - name: name
    in: query
    type: string
    example: John Doe
  - name: number
    in: query
    type: string
    example: X
  - name: opened
    in: query
    type: boolean
    example: False
  - name: highlighted
    in: query
    type: boolean
    example: False
  - name: date_opened
    in: query
    type: Date
    example: 2018-01-01
Blueprint definition:
ACCOUNTS_BLUEPRINT = Blueprint('accounts', __name__)

Api(ACCOUNTS_BLUEPRINT).add_resource(
    AccountQueryResource,
    '/accounts/<query>',
    endpoint='accountq'
)
Api(ACCOUNTS_BLUEPRINT).add_resource(
    AccountResource,
    '/accounts/<int:id>',
    endpoint='account'
)
Api(ACCOUNTS_BLUEPRINT).add_resource(
    AccountListResource,
    '/accounts',
    endpoint='accounts'
)
Resource:
from flasgger import swag_from
from urllib import parse
from flask_restful import Resource
from flask_restful.reqparse import Argument
from flask import request as req
...

class AccountQueryResource(Resource):
    """ Verbs relative to the accounts """

    @staticmethod
    @swag_from('../swagger/accounts/GET_query.yml')
    def get(query):
        """ Handle complex queries """
        logger.debug('Recv %s:%s from %s', req.url, req.data, req.remote_addr)
        query = dict(parse.parse_qsl(parse.urlsplit(req.url).query))
        logger.debug('Get query: {}'.format(query))
        try:
            account = AccountRepository.filter(**query)
        except Exception as e:
            logger.error(e)
            return {'error': '{}'.format(e)}, 409
        if account:
            result = AccountSchema(many=True).dump(account)
            logger.debug('Get query returns: {}({})'.format(account, result))
            return {'account': result}, 200
        logger.debug('Get query returns: {}'.format(account))
        return {'message': 'No account corresponds to {}'.format(query)}, 404
And finally the repository:
class AccountRepository:
    """ The repository for the account model """

    @staticmethod
    def get(id):
        """ Query an account by ID """
        account = AccountModel.query.filter_by(id=id).first()
        logger.debug('Get ID %d: got:%s', id, account)
        return account

    @staticmethod
    def filter(**kwargs):
        """ Query an account """
        account = AccountModel.query.filter_by(**kwargs).all()
        logger.debug('Filter %s: found:%s', kwargs, account)
        return account
...
I don't know about your exact problem, but I had a similar one, and I fixed it with:
query = []
if location:
    query.append(obj.location == location)
I then apply this list of conditions with
obj.query.filter(*query).all()
where, in the examples above, obj is the name of a model you have created.
How does this help? It allows you to fill in the variables dynamically, and each condition stands on its own; you can use ==, !=, <=, etc.
Note you should use filter, not filter_by; then you can use as many operators as you like.
You can read link1 and link2 for documentation on how to query SQLAlchemy.
edit:
name = request.args.get("name")
address = request.args.get("address")
age = request.args.get("age")

query = []
if name:
    query.append(Myobject.name == name)
if address:
    query.append(Myobject.address == address)
if age:
    query.append(Myobject.age >= age)  # look how we select people with age over the provided number!

query_result = Myobject.query.filter(*query).all()
The ifs help when the user does not provide a value: those conditions are then not included in the main query. With get, any parameter the user does not supply comes back as None, and the corresponding condition won't be added to the query list.
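Building on the same idea, the operator can be carried in the query string itself via a suffix convention like age__gte=30. The "field__op" naming and the OPS table below are assumptions for illustration, not something Flask or flasgger provides; "contains" relies on SQLAlchemy's ColumnOperators.contains():

```python
# Map "field__op" query keys onto SQLAlchemy-style column comparisons.
OPS = {
    "gte": lambda col, v: col >= v,
    "lte": lambda col, v: col <= v,
    "ne": lambda col, v: col != v,
    "contains": lambda col, v: col.contains(v),
}


def build_filters(model, args: dict) -> list:
    """Turn request args into a list usable as model.query.filter(*conditions)."""
    conditions = []
    for key, value in args.items():
        field, _, op = key.partition("__")
        col = getattr(model, field)  # e.g. AccountModel.date_opened
        conditions.append(OPS[op](col, value) if op else col == value)
    return conditions
```

With the AccountModel above, AccountQueryResource.get could then run AccountModel.query.filter(*build_filters(AccountModel, query)).all(), covering equals, lower/higher than, and contains; "between" falls out as a pair of __gte/__lte parameters on the same field.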
