FastAPI to read from an existing database table in PostgreSQL - Python

I am trying to create a FastAPI app that reads from an already existing table in a PostgreSQL database, but it is giving me an internal server error. I would appreciate your direction on what might be wrong with the code.
The existing table looks like this:
schema: testSchema
table: test_api

id | email
---+---------------
 1 | test@***.com
 2 | test2@***.com
import databases
import sqlalchemy
from fastapi import FastAPI
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

engine = sqlalchemy.create_engine("my_database_connection")
Base = declarative_base()
database = databases.Database("my_database_connection")
metadata = sqlalchemy.MetaData()
metadata.reflect(bind=engine, schema='testSchema')
test_api_tb = metadata.tables['testSchema.test_api']

class testAPI(Base):
    __tablename__ = test_api_tb
    id = Column(Integer, primary_key=True)
    email = Column(String(256))

app = FastAPI()

@app.get("/testing_api/")
def read_users():
    query = test_api_tb.select()
    return database.execute(query)
The error I am getting from the logs:
RecursionError: maximum recursion depth exceeded in comparison

The best thing you can do is to read the official documentation at fastapi.tiangolo.com; it is excellent and explains all the basics in a very detailed way.
SQL relational databases are used very often with FastAPI and are also covered in the documentation here; you can find a step-by-step tutorial on how to use PostgreSQL with SQLAlchemy and FastAPI.
There are a few parts to make this work. The first part is to connect to the database:
engine = create_engine(my_database_connection)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()
We create the engine with the connection string as you did; then we need a session factory in order to connect to the database. At the end we create a Base class which will help us define the models and schemas.
Now we need to create the model using the Base class, just as you did above.
We need to make sure that __tablename__ matches the name of the table in the database; since your table lives in the testSchema schema, we also point the model at that schema with __table_args__:
class testAPIModel(Base):
    __tablename__ = "test_api"
    __table_args__ = {"schema": "testSchema"}
    id = Column(Integer, primary_key=True)
    email = Column(String(256))
Now comes the main part. We bind the engine to the Base class's metadata; the following call creates any tables that do not exist yet and leaves the existing test_api table untouched:
Base.metadata.create_all(bind=engine)
Now we will create a dependency function that opens a database session and closes the connection when we are done with the query.
def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
Now we can create the FastAPI app instance and get the data from the database.
@app.get("/testing_api/")
def read_users(db: Session = Depends(get_db)):
    users = db.query(testAPIModel).all()
    return users
We are using the Depends(get_db) to inject the db session from the function we wrote above.
The full code:
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker, Session
from fastapi import Depends, FastAPI
from sqlalchemy import Column, Integer, String

my_database_connection = "postgresql://user:password@server_ip/db_name"
engine = create_engine(my_database_connection)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()

class testAPIModel(Base):
    __tablename__ = "test_api"
    __table_args__ = {"schema": "testSchema"}  # the table lives in the testSchema schema
    id = Column(Integer, primary_key=True)
    email = Column(String(256))

Base.metadata.create_all(bind=engine)

app = FastAPI()

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.get("/testing_api/")
def read_users(db: Session = Depends(get_db)):
    users = db.query(testAPIModel).all()
    return users
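A quick way to smoke-test the endpoint, assuming the code above is saved as main.py, is FastAPI's TestClient:

# Sketch: hits the /testing_api/ route in-process (the real database still has
# to be reachable, since the endpoint queries it).
from fastapi.testclient import TestClient
from main import app   # assumed module name

client = TestClient(app)
response = client.get("/testing_api/")
print(response.status_code)   # 200 if the query succeeded
print(response.json())        # list of {"id": ..., "email": ...} rows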
Good Luck!
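As an aside, the RecursionError in your snippet most likely comes from assigning the reflected Table object to __tablename__, which expects a plain string. If you would rather keep the reflected table and the databases package from your original code instead of an ORM session, a minimal async sketch (connection string assumed) could look like this:

import databases
import sqlalchemy
from fastapi import FastAPI

DATABASE_URL = "postgresql://user:password@server_ip/db_name"   # assumed

engine = sqlalchemy.create_engine(DATABASE_URL)
metadata = sqlalchemy.MetaData()
metadata.reflect(bind=engine, schema="testSchema")
test_api_tb = metadata.tables["testSchema.test_api"]

database = databases.Database(DATABASE_URL)
app = FastAPI()

@app.on_event("startup")
async def startup():
    await database.connect()

@app.on_event("shutdown")
async def shutdown():
    await database.disconnect()

@app.get("/testing_api/")
async def read_users():
    rows = await database.fetch_all(test_api_tb.select())
    return [dict(row) for row in rows]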

Related

SQLAlchemy Not Creating Tables in Postgres Database

I am having trouble writing tables to a postgres database using SQLAlchemy ORM and Python scripts.
I know the problem has something to do with incorrect Session imports because when I place all the code below into a single file, the script creates the table without trouble.
However, when I break the script up into multiple files (necessary for this project), I receive the error "psycopg2.errors.UndefinedTable: relation "user" does not exist".
I have read many posts here on SO and tried reorganising my files, changing the function call order, switching from non-scoped to scoped sessions, eliminating and adding Base.metadata.create_all(bind=engine) in various spots, and changing how the sessions are organised and created in base.py, as well as other things, but the script still errors and I'm not sure which code sequence is out of order.
The code currently looks like:
base.py:
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import scoped_session, sessionmaker

# SQLAlchemy requires creating an Engine to interact with our database.
engine = create_engine('postgresql://user:pass@localhost:5432/testdb', echo=True)

# Create a configured ORM 'Session' factory to get a new Session bound to this engine
# _SessionFactory = sessionmaker(bind=engine)

# Use scoped session
db_session = scoped_session(
    sessionmaker(
        bind=engine,
        autocommit=False,
        autoflush=False
    )
)

# Create a Base class for our classes definitions
Base = declarative_base()
models.py:
from sqlalchemy import Column, DateTime, Integer, Text
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = 'users'

    id = Column(Integer, primary_key=True)
    email = Column(Text, nullable=False, unique=True)
    name = Column(Text)
    date_last_seen = Column(DateTime(timezone=True))

    def __init__(self, email, name, date_last_seen):
        self.email = email
        self.name = name
        self.date_last_seen = date_last_seen
inserts.py:
from datetime import date
from base import db_session, engine, Base
from models import User

def init_db():
    # Generate database schema based on our definitions in model.py
    Base.metadata.create_all(bind=engine)

    # Extract a new session from the session factory
    # session = _SessionFactory()

    # Create instance of the User class
    alice = User('alice@throughthelooking.glass', 'Alice', date(1865, 11, 26))

    # Use the current session to persist data
    db_session.add(alice)

    # Commit current session to database and close session
    db_session.commit()
    db_session.close()

    print('Initialized the db')
    return

if __name__ == '__main__':
    init_db()
Thank you for any insight you're able to offer!
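For what it's worth, one plausible cause given the code as posted: models.py creates its own declarative_base(), so the Base that inserts.py imports from base.py has no tables registered on it and Base.metadata.create_all(bind=engine) creates nothing. A minimal sketch of the usual fix is to share a single Base:

# models.py -- sketch assuming base.py stays as posted:
# import the shared Base instead of creating a second declarative_base(),
# so User is registered on the metadata that inserts.py calls create_all() on.
from sqlalchemy import Column, DateTime, Integer, Text
from base import Base   # the shared declarative base from base.py

class User(Base):
    __tablename__ = 'users'

    id = Column(Integer, primary_key=True)
    email = Column(Text, nullable=False, unique=True)
    name = Column(Text)
    date_last_seen = Column(DateTime(timezone=True))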

Why is this column not found in this SQL table, when the table was defined with it included?

I'm trying to create two SQLite databases through Python 3.8 to store user information. I have two scripts defining the databases and one to fill them with test data. The first database works fine. It is defined by the script create_logindb.py:
from sqlalchemy import *
from sqlalchemy import create_engine, ForeignKey
from sqlalchemy import Column, Date, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship, backref
import bcrypt

engine = create_engine('sqlite:///upa.db', echo=True)
Base = declarative_base()

def create_hashed_password(plain_text_password):
    # Hash a password for the first time
    # (Using bcrypt, the salt is saved into the hash itself)
    return bcrypt.hashpw(plain_text_password.encode('utf-8'), bcrypt.gensalt()).decode('utf8')

def check_hashed_password(plain_text_password, hashed_password):
    # Check hashed password. Using bcrypt, the salt is saved into the hash itself
    return bcrypt.checkpw(plain_text_password.encode('utf-8'), hashed_password.encode('utf-8'))

########################################################################
class User(Base):
    """"""
    def __init__(self, username, password, app_location):
        self.username = username
        self.password = password
        self.app_location = app_location

    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    username = Column(String)
    password = Column(String)
    app_location = Column(String)

#----------------------------------------------------------------------
Base.metadata.create_all(engine)
I have a similar database defined with create_redirectauthdb:
from sqlalchemy import create_engine
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

engine = create_engine('sqlite:////Users/rpfhome/Documents/POS II/log in/ra.db', echo=True)
Base = declarative_base()

########################################################################
class Redirected_User(Base):
    """"""
    def __init__(self, username, hash_time, hash_value, user_ip, user_store):
        self.username = username
        self.hash_time = hash_time
        self.hash_value = hash_value
        self.user_ip = user_ip        # ip of user
        self.user_store = user_store  # location of user data

    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    username = Column(String)
    hash_time = Column(String)
    hash_value = Column(String)
    user_ip = Column(String)
    user_store = Column(String)

#----------------------------------------------------------------------
# create tables
Base.metadata.create_all(engine)
and I am creating the test accounts with:
import datetime
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from create_logindb import *
from create_redirectauthdb import *
import bcrypt
import random

engine = create_engine('sqlite:///upa.db', echo=True)

# def create_hashed_password(plain_text_password):
#     # Hash a password for the first time
#     # (Using bcrypt, the salt is saved into the hash itself)
#     return bcrypt.hashpw(plain_text_password.encode('utf-8'), bcrypt.gensalt())

# create a Session
Session = sessionmaker(bind=engine)
session = Session()

user = User("admin", create_hashed_password("password"), 'localhost:8000')
session.add(user)
user = User("python", create_hashed_password("python"), 'localhost:8001')
session.add(user)
user = User("jumpiness", create_hashed_password("python"), 'localhost:8002')
session.add(user)

# commit the record to the database
session.commit()

engine2 = create_engine('sqlite:////Users/rpfhome/Documents/POS II/log in/ra.db', echo=True)
Session2 = sessionmaker(bind=engine2)
session2 = Session()

user = Redirected_User("admin", 'uninitialized_datetime', str('%032x' % random.getrandbits(128)), '0.0.0.0', '$Home/Documents/POS II/log in/')
session2.add(user)
user = Redirected_User("python", 'uninitialized_datetime', str('%032x' % random.getrandbits(128)), '0.0.0.0', '$Home/Documents/POS II/log in/')
session2.add(user)
user = Redirected_User("jumpiness", 'uninitialized_datetime', str('%032x' % random.getrandbits(128)), '0.0.0.0', '$Home/Documents/POS II/log in/')
session2.add(user)

# commit the record to the database
session2.commit()
This is just a toy example I'm using to get started. The first database, upa.db, is created and populated fine. Trying to populate the second database raises the error:
...
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) table users has no column named hash_time
[SQL: INSERT INTO users (username, hash_time, hash_value, user_ip, user_store) VALUES (?, ?, ?, ?, ?)]
[parameters: ('admin', 'uninitialized_datetime', '287cac9fcc2ab4760d8318190f182630', '0.0.0.0', '$Home/Documents/POS II/log in/')]
(Background on this error at: http://sqlalche.me/e/13/e3q8)
The background page for the error doesn't seem very helpful in this case. I was wondering why it's saying there's no column named hash_time when the database was defined with that column included. I thought this might have to do with the fact that both databases include a table named "users", so I renamed the table "users2" in the script that defines the database, but then got an error that there is no table called users2 when running the script that fills it with example data. This also doesn't make much sense to me. I've also tried changing all instances of Base to Base2 in the second script, but this seems to have no effect. Why does the database in the script that fills it with example data not seem to match the structure of the database defined in the script that creates it?
As @rfkortekaas pointed out, it was a simple bug. session2 was created from Session() for the first database and wouldn't work on the second.
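A minimal sketch of that fix:

# Create the second session from the factory bound to engine2,
# not from the first database's factory.
Session2 = sessionmaker(bind=engine2)
session2 = Session2()   # was: session2 = Session()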

Can I bind a Session to a specific schema in SQLAlchemy?

I work with a postgres database which has multiple (similar) schemas. In my codebase, I reflect the different schemas in separate schema_xy.py files. I also have a base.py file which contains a base class with abstract table definitions, for tables that are present in multiple schemas.
My base.py file:
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base

engine = create_engine('postgresql+psycopg2://dbuser@dbhost:5432/dbname')
Base = declarative_base(bind=engine)

class User(Base):
    __abstract__ = True
    id = ..
And an example Schema1.py file, which inherits the User table from base but also has a schema-specific table S1Table:
from sqlalchemy import create_engine, MetaData
from sqlalchemy.ext.declarative import declarative_base
from .base import User

engine = create_engine('postgresql+psycopg2://dbuser@dbhost:5432/dbname')
Schema1Base = declarative_base(bind=engine, metadata=MetaData(schema='Schema1'))

class User(Schema1Base, User):
    __tablename__ = "user"

class S1Table(Schema1Base):
    __tablename__ = "s1table"
    foo = ...
My question is, how do I best instantiate sessions for querying and uploading data, which are 'bound' to a specific schema, i.e. how do I make sure that I query/manipulate the User table from the correct schema?
I have found this blog post
http://www.blog.pythonlibrary.org/2010/09/10/sqlalchemy-connecting-to-pre-existing-databases/
which defines a loadSession function for Base like so:
def loadSession():
    metadata = Base.metadata
    Session = sessionmaker(bind=engine)
    session = Session()
    return session
But I don't understand what the unused metadata is supposed to do here exactly.
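One option worth sketching here (an assumption on my part, not something confirmed in the thread): SQLAlchemy's schema_translate_map lets you point the same models at a concrete schema per engine, so every Session created from that engine is effectively bound to that schema. As for loadSession, the metadata = Base.metadata line is simply unused and can be dropped.

# Sketch (assumed approach): one set of models, one engine per schema via
# schema_translate_map, and a sessionmaker per engine.
from sqlalchemy import create_engine, Column, Integer
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = "user"      # no schema here; it is filled in per engine
    id = Column(Integer, primary_key=True)

def make_session_for_schema(schema_name):
    # Map the "default" (None) schema of all tables to the requested schema.
    engine = create_engine(
        "postgresql+psycopg2://dbuser@dbhost:5432/dbname",
        execution_options={"schema_translate_map": {None: schema_name}},
    )
    return sessionmaker(bind=engine)()

session = make_session_for_schema("Schema1")
users_in_schema1 = session.query(User).all()   # queries "Schema1"."user"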

Connecting to google-sql with Python3 flexible engine through Flask/SQLAlchemy

I've upgraded my App Engine app to the flexible environment and am now refactoring code. I haven't worked with Flask outside of the standard environment and haven't used SQLAlchemy. I've set up my databases and have had valid, functioning connections before in the standard environment. I'm now trying to execute a simple SQL query in the Python 3 flexible environment:
SELECT id, latitude, longitude FROM weatherData
I now have a valid connection to the database through the following:
app = Flask(__name__)
app.config['WEATHER_DATABASE_URI'] = os.environ['WEATHER_DATABASE_URI']
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
db = SQLAlchemy(app)
The respective environment variables are in my app.yaml file.
I understand that SQLAlchemy uses ORMs, but in all the examples I've seen they've created a class as a 'buffer' between the client and the database, first to create the table and then to perform CRUD operations, e.g.:
engine = create_engine('sqlite:///student.db', echo=True)
Base = declarative_base()

class Student(Base):
    """"""
    __tablename__ = "student"

    id = Column(Integer, primary_key=True)
    username = Column(String)
    firstname = Column(String)
    lastname = Column(String)
    university = Column(String)

    #----------------------------------------------------------------------
    def __init__(self, username, firstname, lastname, university):
        """"""
        self.username = username
        self.firstname = firstname
        self.lastname = lastname
        self.university = university

# create tables
Base.metadata.create_all(engine)
I notice that in this case they're using engine which doesn't seem relevant to me. In short, how can I perform the aforementioned SQL query?
Thanks :)
SQLAlchemy uses its Engine class to control interactions with the database. First, you create an engine specifying how you want to connect:
db = sqlalchemy.create_engine(
    # Equivalent URL:
    # mysql+pymysql://<db_user>:<db_pass>@/<db_name>?unix_socket=/cloudsql/<cloud_sql_instance_name>
    sqlalchemy.engine.url.URL(
        drivername='mysql+pymysql',
        username=db_user,
        password=db_pass,
        database=db_name,
        query={
            'unix_socket': '/cloudsql/{}'.format(cloud_sql_instance_name)
        }
    )
)
Second, you use the engine to retrieve a connection to the instance and perform your actions:
with db.connect() as conn:
    recent_votes = conn.execute(
        "SELECT candidate, time_cast FROM votes "
        "ORDER BY time_cast DESC LIMIT 5"
    ).fetchall()
This allows SQLAlchemy to manage your connections in a more efficient way. If you want to see these snippets in the context of an application, take a look at this example application.
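If you want to keep the Flask-SQLAlchemy object you already set up, the same pattern works through db.engine; here is a sketch under the assumption that the connection string is stored under SQLALCHEMY_DATABASE_URI (the key Flask-SQLAlchemy actually reads):

import os
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
# assumption: reuse the WEATHER_DATABASE_URI environment variable, but store it
# under SQLALCHEMY_DATABASE_URI so Flask-SQLAlchemy picks it up
app.config['SQLALCHEMY_DATABASE_URI'] = os.environ['WEATHER_DATABASE_URI']
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
db = SQLAlchemy(app)

with app.app_context():  # newer Flask-SQLAlchemy versions need an app context
    with db.engine.connect() as conn:
        rows = conn.execute(
            "SELECT id, latitude, longitude FROM weatherData"  # wrap in text() on SQLAlchemy 2.0+
        ).fetchall()

for row in rows:
    print(row)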

Using Flask-SQLAlchemy without Flask

I had a small web service built using Flask and Flask-SQLAlchemy that only held one model. I now want to use the same database, but with a command line app, so I'd like to drop the Flask dependency.
My model looks like this:
class IPEntry(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    ip_address = db.Column(db.String(16), unique=True)
    first_seen = db.Column(db.DateTime(),
        default=datetime.datetime.utcnow
    )
    last_seen = db.Column(db.DateTime(),
        default=datetime.datetime.utcnow
    )

    @validates('ip')
    def validate_ip(self, key, ip):
        assert is_ip_addr(ip)
        return ip
Since db will no longer be a reference to flask.ext.sqlalchemy.SQLAlchemy(app), how can I convert my model to use just SQLAlchemy? Is there a way for the two applications (one with Flask-SQLAlchemy, the other with SQLAlchemy) to use the same database?
you can do this to replace db.Model:
from sqlalchemy import orm
from sqlalchemy.ext.declarative import declarative_base
import sqlalchemy as sa

base = declarative_base()
engine = sa.create_engine(YOUR_DB_URI)
base.metadata.bind = engine
session = orm.scoped_session(orm.sessionmaker())(bind=engine)

# after this:
#   base == db.Model
#   session == db.session
#   other db.* values are in sa.*
#   ie: old: db.Column(db.Integer, db.ForeignKey('s.id'))
#       new: sa.Column(sa.Integer, sa.ForeignKey('s.id'))
#   except relationship and backref, those are in orm
#   ie: orm.relationship, orm.backref

# so to define a simple model
class UserModel(base):
    __tablename__ = 'users'  # <- must declare name for db table
    id = sa.Column(sa.Integer, primary_key=True)
    name = sa.Column(sa.String(255), nullable=False)
then to create the tables:
base.metadata.create_all()
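To make that mapping concrete, here is the IPEntry model from the question ported to the plain-SQLAlchemy base above (a sketch: the table name is assumed, and is_ip_addr comes from the original project):

import datetime
from sqlalchemy.orm import validates

class IPEntry(base):                               # `base` and `sa` as defined above
    __tablename__ = 'ip_entry'                     # assumed; use whatever name the Flask app created
    id = sa.Column(sa.Integer, primary_key=True)
    ip_address = sa.Column(sa.String(16), unique=True)
    first_seen = sa.Column(sa.DateTime(), default=datetime.datetime.utcnow)
    last_seen = sa.Column(sa.DateTime(), default=datetime.datetime.utcnow)

    @validates('ip_address')                       # validate the actual column name
    def validate_ip(self, key, ip):
        assert is_ip_addr(ip)                      # helper from the original code base
        return ip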
This is how to use SQLAlchemy without Flask (for example, to write a batch of objects to a PostgreSQL database):
from sqlalchemy import Column, Integer, String
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

# Define variables DB_USERNAME, DB_PASSWORD, DB_HOST, DB_PORT, DB_NAME
SQLALCHEMY_DATABASE_URI = f'postgresql://{DB_USERNAME}:{DB_PASSWORD}@{DB_HOST}:{DB_PORT}/{DB_NAME}'

# ----- This is related code -----
engine = create_engine(SQLALCHEMY_DATABASE_URI, echo=True)
Base = declarative_base()
Session = sessionmaker(bind=engine)
session = Session()
# ----- This is related code -----

class MyModel(Base):
    __tablename__ = 'my_table_name'
    id = Column(Integer, primary_key=True)
    value = Column(String)

# create the table after the model is defined, so the metadata knows about it
Base.metadata.create_all(engine)

objects = [MyModel(id=0, value='a'), MyModel(id=1, value='b')]
session.bulk_save_objects(objects)
session.commit()
Check out github.com/mardix/active-alchemy
Active-Alchemy is a framework-agnostic wrapper for SQLAlchemy that makes it really easy to use by implementing a simple active-record-like API, while still using db.session underneath. It is inspired by Flask-SQLAlchemy.
There is a great article about Flask-SQLAlchemy: how it works, and how to modify models to use them outside of Flask:
http://derrickgilland.com/posts/demystifying-flask-sqlalchemy/
The SQLAlchemy docs have a good tutorial with examples that sound like what you want to do.
It shows how to connect to a db, set up mappings, create the schema, and query/save to the db.
This does not completely answer your question, because it does not remove the Flask dependency, but you can use SQLAlchemy in scripts and tests by just not running the Flask app.
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy import MetaData
test_app = Flask('test_app')
test_app.config['SQLALCHEMY_DATABASE_URI'] = 'database_uri'
test_app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
metadata = MetaData(schema='myschema')
db = SQLAlchemy(test_app, metadata=metadata)
class IPEntry(db.Model):
    pass
One difficulty you may encounter is the requirement of using db.Model as a base class for your models if you want to target both the web app and independent scripts using the same codebase. A possible way to tackle it is to use dynamic polymorphism and wrap the class definition in a function:
def get_ipentry(db):
    class IPEntry(db.Model):
        pass
    return IPEntry
As you construct the class at run time in the function, you can pass in different SQLAlchemy instances. The only downside is that you need to call the function to construct the class before using it.
db = SQLAlchemy(...)
IpEntry = get_ipentry(db)
IpEntry.query.filter_by(id=123).one()
Flask (>1.0) attempts to provide helpers for sharing code between a web application and a command line interface; I personally think it might be cleaner, lighter and easier to build libraries unbound to Flask, but you might want to check:
https://flask.palletsprojects.com/en/2.1.x/cli/
https://flask.palletsprojects.com/en/2.1.x/api/#flask.Flask.cli
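For example, a minimal custom command (module and model names assumed) that reuses the web app's models might look like this; it runs as `flask list-ips` once FLASK_APP points at the module:

import click
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///test.db'   # assumed URI
db = SQLAlchemy(app)

class IPEntry(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    ip_address = db.Column(db.String(16), unique=True)

@app.cli.command("list-ips")
def list_ips():
    """Print every stored IP using the same models the web app uses."""
    for entry in IPEntry.query.all():
        click.echo(entry.ip_address)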
Create database and table
import os
from sqlalchemy import create_engine
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

if os.path.exists('test.db'):
    os.remove('test.db')

Base = declarative_base()

class Person(Base):
    __tablename__ = 'person'
    id = Column(Integer(), primary_key=True)
    name = Column(String())

engine = create_engine('sqlite:///test.db')
Base.metadata.create_all(engine)
Using Flask-SQLAlchemy directly
from flask import Flask
from sqlalchemy import MetaData
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy import Column, Integer, String

app = Flask(__name__)
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = True
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///test.db'
db = SQLAlchemy(app, metadata=MetaData())

class Person(db.Model):
    __tablename__ = 'person'
    id = Column(Integer(), primary_key=True)
    name = Column(String())

person = Person(name='Bob')
db.session.add(person)
db.session.commit()
print(person.id)
