I'm trying to use SQLite/SpatiaLite with GeoAlchemy2. It seems to be possible according to that link.
My problem, I think, comes from the custom engine.
What I have so far:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from geoalchemy2 import Geometry
# ...and other imports

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:////Users/cricket/Documents/peas project/open-peas/localapp/test.db'
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
app.config['SQLALCHEMY_ECHO'] = True

db = SQLAlchemy(app)

class Polygon(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(64), unique=True)
    point = db.Column(Geometry("POLYGON"))

@app.before_first_request
def init_request():
    db.create_all()
When I start the script, I get the message below:
cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) near "POLYGON": syntax error [SQL: '\nCREATE TABLE polygon (\n\tid INTEGER NOT NULL, \n\tname VARCHAR(64), \n\tpoint geometry(POLYGON,-1), \n\tPRIMARY KEY (id), \n\tUNIQUE (name)\n)\n\n'] (Background on this error at: http://sqlalche.me/e/e3q8)
Any idea how I could fix that?
I had the same problem and it took a while to work out. There are a bunch of layers (SQLAlchemy, Flask, SQLite, SpatiaLite, Flask's SQLAlchemy extension, ...) working together. I hope this helps:
from sqlalchemy import event

db = SQLAlchemy(app)

@event.listens_for(db.engine, "connect")
def load_spatialite(dbapi_conn, connection_record):
    # From https://geoalchemy-2.readthedocs.io/en/latest/spatialite_tutorial.html
    dbapi_conn.enable_load_extension(True)
    dbapi_conn.load_extension('/usr/lib/x86_64-linux-gnu/mod_spatialite.so')
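Note that the extension path is platform-specific (on macOS, for instance, the file is typically named mod_spatialite.dylib), so verify the location on your system. With the listener registered before the first connection is made, creating the tables is then, as a minimal sketch:

# Minimal sketch: the "connect" listener fires when create_all() opens the
# first connection, so SpatiaLite is loaded before any DDL runs.
with app.app_context():
    db.create_all()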
Struggling with the same question, even with the accepted answer, I eventually realized that the management argument of the Geometry() constructor was missing, as explained here.
The following model should send the right SQL command to the DB API:
class Polygon(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(64), unique=True)
    point = db.Column(Geometry("POLYGON", management=True))
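To sanity-check the setup, here is a hedged usage sketch that inserts one row; the name and the WKT string are made-up example values:

# Hypothetical usage sketch: store a polygon supplied as WKT.
from geoalchemy2.elements import WKTElement

square = Polygon(
    name="unit-square",  # example value
    point=WKTElement("POLYGON((0 0, 1 0, 1 1, 0 1, 0 0))"),
)
db.session.add(square)
db.session.commit()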
My app.py file
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'postgres:////tmp/test.db'
db = SQLAlchemy(app) # refer https://flask-sqlalchemy.palletsprojects.com/en/2.x/api/#flask_sqlalchemy.SQLAlchemy
One of my model classes, where I import db:
from sqlalchemy.ext.declarative import declarative_base

from app import db

Base = declarative_base()

# User class
class User(db.Model, Base):
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(80), unique=True, nullable=False)
    email = db.Column(db.String(120), unique=True, nullable=False)

    def __repr__(self):
        return '<User %r>' % self.username

    def get_user_by_id(self, id):
        return self.query.get(id)
My database has the same set of tables in different schemas (multi-tenancy), and I need to select the schema on the fly for each request initiated by a particular tenant, using before_request (grabbing the tenant_id from the subdomain of the URL).
I found that Postgres supports selecting the schema name on the fly via schema_translate_map (ref. https://docs.sqlalchemy.org/en/14/core/connections.html#translation-of-schema-names), which is set through execution_options (https://docs.sqlalchemy.org/en/14/core/connections.html#sqlalchemy.engine.Connection.execution_options).
In my code snippet above, where you see db = SQLAlchemy(app): as per the official documentation, two parameters can be set when creating the SQLAlchemy object, session_options and engine_options, but there is no execution_options (ref. https://flask-sqlalchemy.palletsprojects.com/en/2.x/api/#flask_sqlalchemy.SQLAlchemy).
So how do I set schema_translate_map when creating the SQLAlchemy object?
I tried this:
db = SQLAlchemy(app,
                session_options={
                    "autocommit": True,
                    "autoflush": False,
                    "schema_translate_map": {
                        None: "public"
                    }
                })
But obviously it did not work, because schema_translate_map belongs under execution_options, as mentioned here: https://docs.sqlalchemy.org/en/14/core/connections.html#translation-of-schema-names
Does anyone have an idea how to set schema_translate_map at the time of creating the SQLAlchemy object?
My goal is to set it dynamically for each request. I want to control it from this centralized place, rather than going into each model file and specifying it when I execute queries.
I am aware this can be done differently, as suggested here: https://stackoverflow.com/a/56490246/1560470
But I need to set it somewhere around db = SQLAlchemy(app) in the app.py file only. I then import db in all my model classes (as shown above), and in those model classes all queries execute under the selected schema.
I found a way to accomplish it. This is what was needed:
db = SQLAlchemy(app,
                session_options={
                    "autocommit": True,
                    "autoflush": False
                },
                engine_options={
                    "execution_options": {
                        "schema_translate_map": {
                            None: "public",
                            "abc": "xyz"
                        }
                    }
                })
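Building on that, here is a hedged sketch of switching the schema per request instead of fixing it at object-creation time; get_tenant_schema() is a hypothetical helper you would implement (e.g. parsing the request subdomain), and the sketch assumes SQLAlchemy 1.4+ session semantics:

@app.before_request
def bind_tenant_schema():
    # get_tenant_schema() is hypothetical: derive the schema name from
    # the tenant's subdomain, a header, etc.
    schema = get_tenant_schema()
    db.session.connection(
        execution_options={"schema_translate_map": {None: schema}}
    )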
I am trying to create a FastAPI service that reads from an already existing table in a PostgreSQL database, but it is giving me an internal server error. I would appreciate your direction on what might be wrong with the code.
The existing table looks like this:

schema: testSchema
table:  test_api

id | email
---+---------------
 1 | test@***.com
 2 | test2@***.com
import databases
import sqlalchemy
from fastapi import FastAPI
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

engine = sqlalchemy.create_engine("my_database_connection")
Base = declarative_base()
database = databases.Database("my_database_connection")

metadata = sqlalchemy.MetaData()
metadata.reflect(bind=engine, schema='testSchema')
test_api_tb = metadata.tables['testSchema.test_api']

class testAPI(Base):
    __tablename__ = test_api_tb
    id = Column(Integer, primary_key=True)
    email = Column(String(256))

app = FastAPI()

@app.get("/testing_api/")
def read_users():
    query = test_api_tb.select()
    return database.execute(query)
The error I am getting from the logs:
RecursionError: maximum recursion depth exceeded in comparison
The best thing you can do is read the official documentation at fastapi.tiangolo.com; it is amazing and explains all the basics in a very detailed way.
SQL relational databases are used very often with FastAPI and are also mentioned in the documentation here; you can find a step-by-step tutorial about how to use PostgreSQL with SQLAlchemy and FastAPI.
There are a few parts to make this work. The first part is to connect to the database:
engine = create_engine(my_database_connection)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()
We create the engine with the connection string, as you did; then we need to create a session in order to connect to the database. At the end we create a Base class, which will help us create the models and schemas.
Now we need to create the model using the Base class, just as you did above. We need to make sure that __tablename__ is the same as the name of the table in the database:
class testAPIModel(Base):
    __tablename__ = "test_api"
    id = Column(Integer, primary_key=True)
    email = Column(String(256))
Now comes the main part. We need to make sure we bind the engine of the database to the Base class:
Base.metadata.create_all(bind=engine)
Next, we create a helper function that yields a db session instance and closes the connection once we are done with the query:
def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
Now we can create the FastAPI app instance and get the data from the database.
@app.get("/testing_api/")
def read_users(db: Session = Depends(get_db)):
    users = db.query(testAPIModel).all()
    return users
We are using Depends(get_db) to inject the db session from the helper function we wrote above.
The full code:
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker, Session
from fastapi import Depends, FastAPI
from sqlalchemy import Column, Integer, String

my_database_connection = "postgresql://user:password@server_ip/db_name"
engine = create_engine(my_database_connection)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()

class testAPIModel(Base):
    __tablename__ = "test_api"
    id = Column(Integer, primary_key=True)
    email = Column(String(256))

Base.metadata.create_all(bind=engine)

app = FastAPI()

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.get("/testing_api/")
def read_users(db: Session = Depends(get_db)):
    users = db.query(testAPIModel).all()
    return users
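To try it out quickly, here is a hedged sketch using FastAPI's test client (it needs the requests/httpx dependency installed):

# Hedged usage sketch: exercise the endpoint without starting a server.
from fastapi.testclient import TestClient

client = TestClient(app)
response = client.get("/testing_api/")
print(response.status_code, response.json())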
Good Luck!
I am trying to populate a database with two tables in SQLAlchemy. I have already created the database test_database, and now I am trying to create the two tables inside it. I have already checked with \l that this database was successfully created. Following is the code for the file create.py, which creates the two tables:
import os

from flask import Flask, render_template, request
from flask_sqlalchemy import SQLAlchemy
from models import *

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = 'postgresql://shammun:my_password@localhost:5432/test_database.db'
app.config["SQLALCHEMY_TRACK_MODIFICATIONS"] = False

# db = SQLAlchemy()
db.init_app(app)

def main():
    db.create_all()

if __name__ == "__main__":
    with app.app_context():
        main()
This file create.py imports models.py, which defines the two tables; its code is given below:
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

class Flight(db.Model):
    __tablename__ = "flights"
    id = db.Column(db.Integer, primary_key=True)
    origin = db.Column(db.String, nullable=False)
    destination = db.Column(db.String, nullable=False)
    duration = db.Column(db.Integer, nullable=False)

class Passenger(db.Model):
    __tablename__ = "passengers"
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String, nullable=False)
    flight_id = db.Column(db.Integer, db.ForeignKey("flights.id"), nullable=False)
Now, when I run the file create.py in the terminal, I get the following error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) FATAL: database "test_database.db" does not exist
Actually, this is almost the same question that I asked in this post, db.create_all() doesn't create a database, a month ago. The only difference is that back then I asked the wrong question, namely that the database was not created. The real question is why the database isn't found and why this error is thrown. As that question was closed, and as I have tried many times over a long period to resolve this without finding any solution, I am asking almost the same question again. I would much appreciate it if someone could help lift me out of this bottleneck where I have been stuck for so long.
Check which port your Postgres is running on using the command \conninfo, because I suspect your PostgreSQL database is running on a different port.
The default port of PostgreSQL is 5432, but if that port is already occupied by some other application, Postgres tries the next empty port and ends up running on 5433.
So in your SQLALCHEMY_DATABASE_URI app config variable, try changing 5432 to 5433 and see if it works.
Edit 1:
Try removing .db from your database name: use test_database instead of test_database.db. Postgres treats the name literally, so the .db suffix (a SQLite habit) makes the URI point at a database that was never created.
Change this:
app.config["SQLALCHEMY_DATABASE_URI"] = 'postgresql://shammun:my_password#localhost:5432/test_database.db'
To this:
app.config["SQLALCHEMY_DATABASE_URI"] = 'postgresql://shammun:my_password#localhost:5432/test_database'
I'm trying to enable full-text search on a model's column using SQLAlchemy-Searchable. I followed the instructions in their quickstart guide and applied the fix specified in this GitHub issue, given that I'm using Flask. I also already created and applied the migrations as specified in the Alembic Migrations section of the docs. However, the following exception is being raised:
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) function
tsq_parse(unknown, unknown) does not exist
LINE 3: WHERE quote.qt_search_vector @@ tsq_parse('pg_catalog.englis...
^
HINT: No function matches the given name and argument types. You might need
to add explicit type casts. [SQL: 'SELECT quote.id AS quote_id, quote.song_id AS
quote_song_id, quote.stanza_id AS quote_stanza_id, quote.popularity_count AS
quote_popularity_count, quote.quote_text AS quote_quote_text,
quote.qt_search_vector AS quote_qt_search_vector \nFROM quote \nWHERE
quote.qt_search_vector @@ tsq_parse(%(tsq_parse_1)s, %(tsq_parse_2)s) \n
LIMIT %(param_1)s'] [parameters: {'tsq_parse_1': 'pg_catalog.english',
'tsq_parse_2': '"ipsum"', 'param_1': 10}]
(Background on this error at: http://sqlalche.me/e/f405)
Am I missing something?
__init__.py
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy_searchable import make_searchable
Base = declarative_base()
make_searchable(Base.metadata)
... more code ...
Model
class QuoteQuery(BaseQuery, SearchQueryMixin):
    pass

class Quote(db.Model):
    query_class = QuoteQuery
    __table_args__ = (
        db.UniqueConstraint('song_id', 'stanza_id', 'quote_text'),)

    id = db.Column(db.Integer, primary_key=True)
    song_id = db.Column(
        db.Integer, db.ForeignKey('song.id'), nullable=False)
    stanza_id = db.Column(
        db.Integer, db.ForeignKey('stanza.id'), nullable=True)
    popularity_count = db.Column(
        db.BigInteger, unique=False, nullable=False, server_default='1')
    quote_text = db.Column(db.Text, unique=False, nullable=False)
    qt_search_vector = db.Column(TSVectorType('quote_text'))
Query
term = 'lorem'
Quote.query.search('"' + term + '"').all()
This was a problem with the metadata I was providing to the make_searchable function. Fixed make_searchable call:
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy_searchable import make_searchable
db = SQLAlchemy()
make_searchable(db.metadata)
After this, I called db.create_all() and the full-text search started working as expected.
For those not using Flask, this was my solution:
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import configure_mappers
from sqlalchemy_searchable import make_searchable

db_engine = create_engine(...)

Base = declarative_base()
make_searchable(Base.metadata)

# ... define your models on Base here ...

configure_mappers()  # must run after the models are defined
Base.metadata.create_all(db_engine)
The missing piece was the Base.metadata.create_all(db_engine) call.
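For completeness, a hedged sketch of actually running a query in this plain-SQLAlchemy setup, using the search() helper that sqlalchemy_searchable provides (the session construction and the search term are example values):

# Hedged usage sketch: full-text search without Flask's SearchQueryMixin.
from sqlalchemy.orm import sessionmaker
from sqlalchemy_searchable import search

Session = sessionmaker(bind=db_engine)
session = Session()
results = search(session.query(Quote), 'lorem').all()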
I ended up here after troubleshooting this for a day; maybe this will help someone else. I was also sometimes getting missing function parse_websearch instead of tsq_parse, but I wasn't able to narrow down why it gave one or the other before I fixed it.
My issue was that I had upgraded sqlalchemy_searchable (from 0.10.2 to 1.4) and my database version (Postgres 9.6) was now too old for it. I needed to manually create some functions that sqlalchemy_searchable now relies on, but those in turn need a Postgres function I was missing (specifically websearch_to_tsquery, which was added in Postgres 11).
I was using Flask-Migrate/Alembic for migrations. I got help from here and ended up making a blank migration (flask db revision does that) and pasting in the SQL from here. My new migration looked like this:
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = 'AAA'
down_revision = 'BBB'
branch_labels = None
depends_on = None

def upgrade():
    command = """
    CREATE OR REPLACE FUNCTION parse_websearch(config regconfig, search_query text)
    RETURNS tsquery AS $$
    SELECT
        string_agg(
            (
                CASE
                    WHEN position('''' IN words.word) > 0 THEN CONCAT(words.word, ':*')
                    ELSE words.word
                END
            ),
            ' '
        )::tsquery
    FROM (
        SELECT trim(
            regexp_split_to_table(
                websearch_to_tsquery(config, lower(search_query))::text,
                ' '
            )
        ) AS word
    ) AS words
    $$ LANGUAGE SQL IMMUTABLE;

    CREATE OR REPLACE FUNCTION parse_websearch(search_query text)
    RETURNS tsquery AS $$
    SELECT parse_websearch('pg_catalog.simple', search_query);
    $$ LANGUAGE SQL IMMUTABLE;"""
    op.execute(command)

def downgrade():
    op.execute('DROP FUNCTION public.parse_websearch(regconfig, text);')
    op.execute('DROP FUNCTION public.parse_websearch(text);')
I have a Flask app that uses Flask-SQLAlchemy, and I'm trying to configure it to use multiple databases with the Flask-Restless package.
According to the docs, configuring your models to use multiple databases with __bind_key__ seems pretty straightforward.
However, it doesn't seem to be working for me.
I create my app and initialise my database like this:
from flask import Flask
from flask.ext.sqlalchemy import SQLAlchemy

SQLALCHEMY_DATABASE_URI = 'postgres://db_user:db_pw@localhost:5432/db_name'
SQLALCHEMY_BINDS = {
    'db1': SQLALCHEMY_DATABASE_URI,
    'db2': 'mysql://db_user:db_pw@localhost:3306/db_name'
}

app = Flask(__name__)
db = SQLAlchemy(app)
Then I define my models, including __bind_key__, which should tell SQLAlchemy which DB it needs to use:
class PostgresModel(db.Model):
    __tablename__ = 'postgres_model_table'
    __bind_key__ = 'db1'
    id = db.Column(db.Integer, primary_key=True)
    ...

class MySQLModel(db.Model):
    __tablename__ = 'mysql_model_table'
    __bind_key__ = 'db2'
    id = db.Column(db.Integer, primary_key=True)
    ...
Then I fire up Flask-Restless like this:
manager = restless.APIManager(app, flask_sqlalchemy_db=db)
manager.init_app(app, db)

auth_func = lambda: is_authenticated(app)

manager.create_api(PostgresModel,
                   methods=['GET'],
                   collection_name='postgres_model',
                   authentication_required_for=['GET'],
                   authentication_function=auth_func)

manager.create_api(MySQLModel,
                   methods=['GET'],
                   collection_name='mysql_model',
                   authentication_required_for=['GET'],
                   authentication_function=auth_func)
The app runs fine, and when I hit http://localhost:5000/api/postgres_model/[id] I get the expected JSON response of the object from the Postgres DB (I'm guessing this is because its credentials are in SQLALCHEMY_DATABASE_URI).
However, when I hit http://localhost:5000/api/mysql_model/[id], I get a mysql_model_table does not exist error, indicating that it's looking in the Postgres DB, not the MySQL one.
What am I doing wrong here?
This was not working because of a simple typo:
__bind_key = 'db1'
should have been:
__bind_key__ = 'db1'
I've updated the original question and fixed the typo as an example of how this can work for others.
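With the typo fixed, here is a hedged sketch of creating the tables on both binds (Flask-SQLAlchemy 2.x semantics, where create_all() targets every configured bind by default):

# Hedged sketch: create tables on all binds, or target one explicitly.
with app.app_context():
    db.create_all()             # all binds, including 'db1' and 'db2'
    db.create_all(bind='db2')   # only the MySQL bind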