I've upgraded my App Engine app from the standard to the flexible environment and am now refactoring code. I haven't worked with Flask outside of standard and haven't used SQLAlchemy. I've set up my databases and had valid, functioning connections in the standard environment. I'm now trying to execute a simple SQL query in the Python 3 flexible environment:
SELECT id, latitude, longitude FROM weatherData
I now have a valid connection to the database through the following:
app = Flask(__name__)
app.config['WEATHER_DATABASE_URI'] = os.environ['WEATHER_DATABASE_URI']
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
db = SQLAlchemy(app)
The respective environment variables are in my app.yaml file.
I understand that SQLAlchemy uses an ORM, but in all the examples I've seen they create a class as a 'buffer' between the client and the database, first to create the table and then to perform CRUD operations. E.g.:
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

engine = create_engine('sqlite:///student.db', echo=True)
Base = declarative_base()

class Student(Base):
    """Model for the student table."""
    __tablename__ = "student"

    id = Column(Integer, primary_key=True)
    username = Column(String)
    firstname = Column(String)
    lastname = Column(String)
    university = Column(String)

    #----------------------------------------------------------------------
    def __init__(self, username, firstname, lastname, university):
        """Store the student's details."""
        self.username = username
        self.firstname = firstname
        self.lastname = lastname
        self.university = university

# create tables
Base.metadata.create_all(engine)
I notice that in this case they're using engine, which doesn't seem relevant to my use case. In short, how can I perform the aforementioned SQL query?
Thanks :)
SQLAlchemy uses its Engine class to control interactions with the database. First, you create an engine specifying how you want to connect:
db = sqlalchemy.create_engine(
    # Equivalent URL:
    # mysql+pymysql://<db_user>:<db_pass>@/<db_name>?unix_socket=/cloudsql/<cloud_sql_instance_name>
    sqlalchemy.engine.url.URL(
        drivername='mysql+pymysql',
        username=db_user,
        password=db_pass,
        database=db_name,
        query={
            'unix_socket': '/cloudsql/{}'.format(cloud_sql_instance_name)
        }
    )
)
Second, you use the engine to retrieve a connection to the instance and perform your actions:
with db.connect() as conn:
    recent_votes = conn.execute(
        "SELECT candidate, time_cast FROM votes "
        "ORDER BY time_cast DESC LIMIT 5"
    ).fetchall()
This lets SQLAlchemy manage your connections more efficiently. If you want to see these snippets in the context of an application, take a look at this example application.
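Applied to the query from the question, a minimal sketch (assuming the Flask-SQLAlchemy db object set up above and SQLAlchemy 1.4's text() construct; run it inside an application context):

from sqlalchemy import text

# Borrow a connection from the engine Flask-SQLAlchemy manages
with db.engine.connect() as conn:
    rows = conn.execute(
        text("SELECT id, latitude, longitude FROM weatherData")
    ).fetchall()

for row in rows:
    print(row.id, row.latitude, row.longitude)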
My app.py file
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'postgres:////tmp/test.db'
db = SQLAlchemy(app) # refer https://flask-sqlalchemy.palletsprojects.com/en/2.x/api/#flask_sqlalchemy.SQLAlchemy
One of my model classes, where I imported db
from app import db
Base = declarative_base()

# User class
class User(db.Model, Base):
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(80), unique=True, nullable=False)
    email = db.Column(db.String(120), unique=True, nullable=False)

    def __repr__(self):
        return '<User %r>' % self.username

    def get_user_by_id(self, id):
        return self.query.get(id)
My database has the same set of tables in different schemas (multi-tenancy), and I need to select the schema on the fly for each request initiated by a particular tenant, using before_request (grabbing the tenant_id from the subdomain of the URL).
I found that selecting the schema name on the fly is supported via schema_translate_map (ref. https://docs.sqlalchemy.org/en/14/core/connections.html#translation-of-schema-names), which is set through execution_options (https://docs.sqlalchemy.org/en/14/core/connections.html#sqlalchemy.engine.Connection.execution_options).
In my code snippet above, where you see db = SQLAlchemy(app): as per the official documentation, two parameters can be set on SQLAlchemy object creation, session_options and engine_options, but there is no execution_options (ref. https://flask-sqlalchemy.palletsprojects.com/en/2.x/api/#flask_sqlalchemy.SQLAlchemy).
So how do I set schema_translate_map when I am creating the SQLAlchemy object?
I tried this -
db = SQLAlchemy(app,
    session_options={
        "autocommit": True,
        "autoflush": False,
        "schema_translate_map": {
            None: "public"
        }
    }
)
But obviously, it did not work, because schema_translate_map belongs under execution_options, as mentioned here: https://docs.sqlalchemy.org/en/14/core/connections.html#translation-of-schema-names
Does anyone have an idea how to set schema_translate_map at the time of creating the SQLAlchemy object?
My goal is to set it dynamically for each request. I want to control it from this centralized place, rather than going into each model file and specifying it when I execute queries.
I am aware this can be done differently, as suggested here: https://stackoverflow.com/a/56490246/1560470, but I need to set it somewhere around db = SQLAlchemy(app) in app.py only. I then import db in all my model classes (as shown above), and in those model classes all queries execute under the selected schema.
I found a way to accomplish it. This is what was needed:
db = SQLAlchemy(app,
    session_options={
        "autocommit": True,
        "autoflush": False
    },
    engine_options={
        "execution_options": {
            "schema_translate_map": {
                None: "public",
                "abc": "xyz"
            }
        }
    }
)
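To meet the per-request goal, one possible extension of this setup (my assumption, not part of the answer above) is to set the translate map on the session's connection in a before_request hook; the subdomain-to-schema mapping below is made up for illustration:

from flask import request

@app.before_request
def select_tenant_schema():
    # Hypothetical: derive the schema name from the first subdomain label
    tenant_schema = request.host.split(".")[0]
    # Apply the translate map to the connection the session will use
    db.session.connection(
        execution_options={"schema_translate_map": {None: tenant_schema}}
    )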
I am trying to create a FastAPI app that reads from an already existing table in a PostgreSQL database, but it is giving me an internal server error. I would appreciate your direction on what might be wrong with the code.
The existing table looks like this:

schema: testSchema
table: test_api

id | email
---+--------------
 1 | test@***.com
 2 | test2@***.com
import sqlalchemy
import databases
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from fastapi import FastAPI

engine = sqlalchemy.create_engine("my_database_connection")
Base = declarative_base()
database = databases.Database("my_database_connection")
metadata = sqlalchemy.MetaData()
metadata.reflect(bind=engine, schema='testSchema')
test_api_tb = metadata.tables['testSchema.test_api']

class testAPI(Base):
    __tablename__ = test_api_tb
    id = Column(Integer, primary_key=True)
    email = Column(String(256))

app = FastAPI()

@app.get("/testing_api/")
def read_users():
    query = test_api_tb.select()
    return database.execute(query)
The error I am getting from the logs
RecursionError: maximum recursion depth exceeded in comparison
The best thing you can do is to read the official documentation at fastapi.tiangolo.com; it is amazing and explains all the basics in a very detailed way.
SQL relational databases are used very often with FastAPI and are also covered in the documentation here, where you can find a step-by-step tutorial about how to use PostgreSQL with SQLAlchemy and FastAPI.
There are a few parts to make this work. The first part is to connect to the database:
engine = create_engine(my_database_connection)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()
We create the engine with the connection string as you did; then we create a session in order to talk to the database. Finally, we create a Base class, which will help us create the models and schemas.
Now we need to create the model using the Base class, just as you did above. We need to make sure that __tablename__ matches the name of the table in the database:
class testAPIModel(Base):
    __tablename__ = "test_api"

    id = Column(Integer, primary_key=True)
    email = Column(String(256))
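One caveat worth noting (my assumption from the question's setup, not part of the original answer): the asker's table lives in the testSchema schema rather than the default one, so the model may also need to point at that schema:

class testAPIModel(Base):
    __tablename__ = "test_api"
    __table_args__ = {"schema": "testSchema"}  # the question's table is in "testSchema"

    id = Column(Integer, primary_key=True)
    email = Column(String(256))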
Now comes the main part. We need to make sure we bind the engine of the database to the base class using
Base.metadata.create_all(bind=engine)
Now we will create a function that creates a db session instance for each request and closes the connection when we are done with the query:
def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
Now we can create the FastAPI app instance and get the data from the database.
@app.get("/testing_api/")
def read_users(db: Session = Depends(get_db)):
    users = db.query(testAPIModel).all()
    return users
We are using Depends(get_db) to inject the db session from the function we wrote above.
The full code:
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker, Session
from fastapi import Depends, FastAPI
from sqlalchemy import Column, Integer, String

my_database_connection = "postgresql://user:password@server_ip/db_name"
engine = create_engine(my_database_connection)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()

class testAPIModel(Base):
    __tablename__ = "test_api"

    id = Column(Integer, primary_key=True)
    email = Column(String(256))

Base.metadata.create_all(bind=engine)

app = FastAPI()

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.get("/testing_api/")
def read_users(db: Session = Depends(get_db)):
    users = db.query(testAPIModel).all()
    return users
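As a quick way to exercise the endpoint (my addition, not from the original answer; it assumes the database is reachable), FastAPI's built-in TestClient can call the route in-process:

from fastapi.testclient import TestClient

client = TestClient(app)
response = client.get("/testing_api/")
print(response.status_code, response.json())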
Good Luck!
I'm trying to create two SQLite databases through Python 3.8 to store user information. I have two scripts defining the databases and one to fill them with test data. The first database works fine. It is defined by the script create_logindb.py:
from sqlalchemy import *
from sqlalchemy import create_engine, ForeignKey
from sqlalchemy import Column, Date, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship, backref
import bcrypt
engine = create_engine('sqlite:///upa.db', echo=True)
Base = declarative_base()
def create_hashed_password(plain_text_password):
    # Hash a password for the first time
    # (Using bcrypt, the salt is saved into the hash itself)
    return bcrypt.hashpw(plain_text_password.encode('utf-8'), bcrypt.gensalt()).decode('utf8')

def check_hashed_password(plain_text_password, hashed_password):
    # Check a hashed password. Using bcrypt, the salt is saved into the hash itself
    return bcrypt.checkpw(plain_text_password.encode('utf-8'), hashed_password.encode('utf-8'))

########################################################################
class User(Base):
    """Model for the login users table."""
    __tablename__ = "users"

    id = Column(Integer, primary_key=True)
    username = Column(String)
    password = Column(String)
    app_location = Column(String)

    def __init__(self, username, password, app_location):
        self.username = username
        self.password = password
        self.app_location = app_location

#----------------------------------------------------------------------
Base.metadata.create_all(engine)
I have a similar database defined in create_redirectauthdb.py:
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

engine = create_engine('sqlite:////Users/rpfhome/Documents/POS II/log in/ra.db', echo=True)
Base = declarative_base()

########################################################################
class Redirected_User(Base):
    """Model for the redirected users table."""
    __tablename__ = "users"

    id = Column(Integer, primary_key=True)
    username = Column(String)
    hash_time = Column(String)
    hash_value = Column(String)
    user_ip = Column(String)
    user_store = Column(String)

    def __init__(self, username, hash_time, hash_value, user_ip, user_store):
        self.username = username
        self.hash_time = hash_time
        self.hash_value = hash_value
        self.user_ip = user_ip  # ip of user
        self.user_store = user_store  # location of user data

#----------------------------------------------------------------------
# create tables
Base.metadata.create_all(engine)
and I am creating the test accounts with:
import datetime
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from create_logindb import *
from create_redirectauthdb import *
import bcrypt
import random

engine = create_engine('sqlite:///upa.db', echo=True)

# def create_hashed_password(plain_text_password):
#     # Hash a password for the first time
#     # (Using bcrypt, the salt is saved into the hash itself)
#     return bcrypt.hashpw(plain_text_password.encode('utf-8'), bcrypt.gensalt())

# create a Session
Session = sessionmaker(bind=engine)
session = Session()

user = User("admin", create_hashed_password("password"), 'localhost:8000')
session.add(user)
user = User("python", create_hashed_password("python"), 'localhost:8001')
session.add(user)
user = User("jumpiness", create_hashed_password("python"), 'localhost:8002')
session.add(user)

# commit the records to the database
session.commit()

engine2 = create_engine('sqlite:////Users/rpfhome/Documents/POS II/log in/ra.db', echo=True)
Session2 = sessionmaker(bind=engine2)
session2 = Session()

user = Redirected_User("admin", 'uninitialized_datetime', str('%032x' % random.getrandbits(128)), '0.0.0.0', '$Home/Documents/POS II/log in/')
session2.add(user)
user = Redirected_User("python", 'uninitialized_datetime', str('%032x' % random.getrandbits(128)), '0.0.0.0', '$Home/Documents/POS II/log in/')
session2.add(user)
user = Redirected_User("jumpiness", 'uninitialized_datetime', str('%032x' % random.getrandbits(128)), '0.0.0.0', '$Home/Documents/POS II/log in/')
session2.add(user)

# commit the records to the database
session2.commit()
This is just a toy example I'm using to get started. The first database, upa.db, is created and populated fine. Trying to populate the second database produces the error:
...
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) table users has no column named hash_time
[SQL: INSERT INTO users (username, hash_time, hash_value, user_ip, user_store) VALUES (?, ?, ?, ?, ?)]
[parameters: ('admin', 'uninitialized_datetime', '287cac9fcc2ab4760d8318190f182630', '0.0.0.0', '$Home/Documents/POS II/log in/')]
(Background on this error at: http://sqlalche.me/e/13/e3q8)
The background page for the error doesn't seem very helpful in this case. I was wondering why it says there's no column named hash_time when the database was defined with that column included. I thought this might have to do with the fact that both databases include a table named "users", so I renamed the table to "users2" in the script that defines the second database, but then got an error that there is no table called users2 when running the script that fills it with example data. That also doesn't make much sense to me. I've also tried changing all instances of Base to Base2 in the second script, but this seems to have no effect. Why does the database in the script that fills it with example data not seem to match the structure defined in the script that creates it?
As @rfkortekaas pointed out, it was a simple bug: session2 was created from Session(), which is bound to the first database, so it wouldn't work on the second.
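The corrected line, using the sessionmaker bound to the second engine:

session2 = Session2()  # bound to engine2, i.e. the ra.db database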
I have pushed a python-django project to Heroku and it works well. In the view.py file of my Django app, I added a function that connects to a local MySQL database to retrieve data. The function in view.py is as follows:
@login_required
def results(request):
    data = []
    data1 = []
    owner = request.user
    owner = str(owner)
    db = MySQLdb.connect(user='root', db='aaa', passwd='xxxxx', host='localhost')
    cursor = db.cursor()
    # note: the stray comma before FROM in the original query was a syntax error;
    # passing the value as a parameter also avoids SQL injection
    cursor.execute("SELECT search_content, id, title, author, institute FROM result_split WHERE username = %s", (owner,))
    data = cursor.fetchall()
    db.close()
    return render(request, "webdevelop/results.html", {"datas": data})
But when I try to open the page that shows the data from the MySQL database on the deployed Heroku site, I get the error: "OperationalError at /results/
(2003, "Can't connect to MySQL server on 'localhost' ([Errno 111] Connection refused)")". How can I get this Heroku project to connect to my local MySQL database? Or should I choose an alternative?
Firstly, you need to ensure that the user and password you're using to connect to MySQL are correct and that the user has the correct privileges to work with the selected database.
Then you can check that MySQL is accepting connections on localhost.
As for directly addressing the Connection Refused exception, check things like the MySQL socket used to communicate with localhost applications like your Django project; the socket must exist and be configured in MySQL.
I also recommend taking a look at something like SQLAlchemy for Python, which will help you interact directly with the database using Python objects. For example,
Connecting to the database:
from sqlalchemy import *
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker, relationship, scoped_session, mapper
from config import DB_URL
"""Database Declaration"""
metadata = MetaData()
Base = declarative_base(name='Base', mapper=mapper, metadata=metadata)
engine = create_engine(DB_URL, pool_recycle=1800)
Session = sessionmaker(bind=engine, autocommit=False, autoflush=True)
session = scoped_session(Session)
You can now use the session variable to perform queries and updates, using the functions it inherits from the SQLAlchemy Session class.
SQLAlchemy also includes a declarative model for telling Python what your tables look like. For example,
class Clinic(Base):
    __tablename__ = 'clinic'

    clinic_id = Column(Integer, primary_key=True)
    clinic_name = Column(VARCHAR)
    address = Column(VARCHAR)
    city = Column(VARCHAR)
    zip = Column(VARCHAR)
    phone = Column(VARCHAR)
    user_id = Column(VARCHAR)
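For instance, a lookup through the scoped session might look like this (the filter value is made up for illustration):

# Query the clinic table through the scoped session
clinics = session.query(Clinic).filter(Clinic.city == 'Springfield').all()
for clinic in clinics:
    print(clinic.clinic_name, clinic.phone)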
These examples work well for my projects in Flask and should work well enough in Django.
I have a Flask app that uses Flask-SQLAlchemy and I'm trying to configure it to use multiple databases with the Flask-Restless package.
According to the docs, configuring your models to use multiple databases with __bind_key__ seems pretty straightforward.
However it doesn't seem to be working for me.
I create my app and initialise my database like this:
from flask import Flask
from flask.ext.sqlalchemy import SQLAlchemy

SQLALCHEMY_DATABASE_URI = 'postgres://db_user:db_pw@localhost:5432/db_name'
SQLALCHEMY_BINDS = {
    'db1': SQLALCHEMY_DATABASE_URI,
    'db2': 'mysql://db_user:db_pw@localhost:3306/db_name'
}

app = Flask(__name__)
db = SQLAlchemy(app)
Then define my models including __bind_key__, which should tell SQLAlchemy which DB it needs to use:
class PostgresModel(db.Model):
    __tablename__ = 'postgres_model_table'
    __bind_key__ = 'db1'

    id = db.Column(db.Integer, primary_key=True)
    ...

class MySQLModel(db.Model):
    __tablename__ = 'mysql_model_table'
    __bind_key__ = 'db2'

    id = db.Column(db.Integer, primary_key=True)
    ...
Then I fire up Flask-Restless like this:
manager = restless.APIManager(app, flask_sqlalchemy_db=db)
manager.init_app(app, db)
auth_func = lambda: is_authenticated(app)

manager.create_api(PostgresModel,
                   methods=['GET'],
                   collection_name='postgres_model',
                   authentication_required_for=['GET'],
                   authentication_function=auth_func)

manager.create_api(MySQLModel,
                   methods=['GET'],
                   collection_name='mysql_model',
                   authentication_required_for=['GET'],
                   authentication_function=auth_func)
The app runs fine, and when I hit http://localhost:5000/api/postgres_model/[id] I get the expected JSON response of the object from the Postgres DB (I'm guessing this is because its credentials are in SQLALCHEMY_DATABASE_URI).
However, when I hit http://localhost:5000/api/mysql_model/[id], I get a mysql_model_table does not exist error, indicating that it's looking in the Postgres DB, not the MySQL one.
What am I doing wrong here?
This was not working because of a simple typo:
__bind_key = 'db1'
should have been
__bind_key__ = 'db1'
I've updated the original question and fixed the typo as an example of how this can work for others.
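For reference, the corrected model declaration from the question, with the fixed attribute name:

class PostgresModel(db.Model):
    __tablename__ = 'postgres_model_table'
    __bind_key__ = 'db1'  # double underscores on both sides

    id = db.Column(db.Integer, primary_key=True)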