Join query in PostgreSQL using Python Flask SQLAlchemy ORM

I have two tables X and Y.
X has two columns id and name,
Y has three columns id, city, country
I need a join query that returns the city and country values for a given name, joining on id. id is a foreign key.
I tried directly in PostgreSQL with this query,
select * from x inner join y on x.name = 'xyz'
It gives the exact result. But when I try the same query using the SQLAlchemy ORM in my Python Flask app, it doesn't work. I'm new to Python and the Flask framework, and I don't know how to achieve this.
I declared the table definitions and a serialize function in a separate Python file inside my models directory.
In my main.py file, I import those models and tried this kind of query, but it's not working:
from models import x, y
response = x.query.join(y).filter_by(name=name).all()
I'm getting an error message like this:
sqlalchemy.exc.InvalidRequestError
InvalidRequestError: Could not find a FROM clause to join from. Tried joining to <class 'models.y'>, but got: Can't find any foreign key relationships between 'x' and 'y'.

If you prefer to write out regular SQL statements, then why not use SQLAlchemy to do just that? It's not a requirement that you use chained methods like join, filter_by, etc. to query your database.
In order to answer this question, I need to make some assumptions about what's in your models file. I'll assume it's something like this:
from flask_sqlalchemy import SQLAlchemy
import datetime

db = SQLAlchemy()

class BaseModel(db.Model):
    """Base data model for all objects"""
    # more code here

class x(BaseModel, db.Model):
    # more table setup code here

class y(BaseModel, db.Model):
    # more table setup code here
If that's the case, then here's what you can do to execute plain old parameterized SQL statements:
from flask import Flask
from models import db
import json
app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'your_database_connection_string'
db.init_app(app)
result = db.session.execute("select * from x inner join y on x.name = :name", {"name": "xyz"})

# If no rows were returned in the result, return an empty list
if not result.returns_rows:
    response = []
# Otherwise convert the result rows to a plain list of dicts
else:
    response = [dict(row.items()) for row in result]

# Output the query result as JSON
print(json.dumps(response))
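One caveat not in the original answer: SQLAlchemy 1.4+ requires raw SQL strings to be wrapped in text(). A plain-SQLAlchemy sketch of the same parameterized-query pattern (the table and data here are illustrative, using an in-memory SQLite database):

```python
from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")  # in-memory database for illustration

with engine.connect() as conn:
    conn.execute(text("CREATE TABLE x (id INTEGER, name TEXT)"))
    conn.execute(text("INSERT INTO x VALUES (1, 'xyz')"))
    result = conn.execute(
        text("SELECT * FROM x WHERE name = :name"), {"name": "xyz"}
    )
    # row._mapping gives a dict-like view of each row (SQLAlchemy 1.4+)
    response = [dict(row._mapping) for row in result]

print(response)  # [{'id': 1, 'name': 'xyz'}]
```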
I find this method of running SQL queries in Flask with SQLAlchemy far easier to follow and understand than trying to use all of the different method chaining that you were attempting in your original post.

Although it is possible to avoid the ORM altogether, as in Elliot's example, the ORM does have its advantages if the database is small. The most convenient way to solve your problem is by adding a relationship. You can then query the results like this, for example:
tree = db.session.query(Tree).first()
for leaf in tree.leaves:
    print(leaf.id)
If you cannot create the relationship, the proper syntax for a join is as follows:
db.session.query(Leaf).join(Tree, Leaf.tree_id == Tree.id)
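For completeness, here is a plain-SQLAlchemy sketch of the Tree/Leaf relationship implied above (the model layout is an assumption; the same relationship() call works inside Flask-SQLAlchemy models, requires SQLAlchemy 1.4+):

```python
from sqlalchemy import Column, Integer, ForeignKey, create_engine
from sqlalchemy.orm import declarative_base, relationship, Session

Base = declarative_base()

class Tree(Base):
    __tablename__ = "tree"
    id = Column(Integer, primary_key=True)
    # the relationship that makes tree.leaves work
    leaves = relationship("Leaf", backref="tree")

class Leaf(Base):
    __tablename__ = "leaf"
    id = Column(Integer, primary_key=True)
    tree_id = Column(Integer, ForeignKey("tree.id"))

engine = create_engine("sqlite://")  # in-memory DB for illustration
Base.metadata.create_all(engine)

session = Session(engine)
session.add(Tree(id=1, leaves=[Leaf(id=10), Leaf(id=11)]))
session.commit()

tree = session.query(Tree).first()
leaf_ids = [leaf.id for leaf in tree.leaves]
print(sorted(leaf_ids))  # [10, 11]
```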


Flask-sqlalchemy: Can I create multiple bind keys for the same db.model?

I have 2 empty databases. I have two tasks at hand.
TASK 1:
I want to create these two tables m1 & m2 in my primary database (sqlite) and add data to the tables.
TASK 2:
Then I want to copy those two tables (with data) from sqlite to my secondary database (mysql).
from flask_sqlalchemy import SQLAlchemy
from flask import Flask

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///db'
app.config['SQLALCHEMY_BINDS'] = {
    'mysql': "mysql+pymysql://root@localhost/db",
}
db = SQLAlchemy(app)

class m1(db.Model):
    __tablename__ = 'm1'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(255))

    def __init__(self, name):
        self.name = name

class m2(db.Model):
    __tablename__ = 'm2'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(255))

    def __init__(self, name):
        self.name = name
Task 1 is easy. I use create_all() to create the tables and db.session.add() to add data to them.
Now, Task 2 is the tricky one. I can't create tables in mysql because my session is bound to the sqlite engine. I also can't switch sessions to query data from my primary database (sqlite) and add it to my secondary database (mysql).
Possible ways to solve this issue (according to me).
Creating multiple bind keys for the same table/model and switching engines whenever I want. For example: I grab data from table m1 in sqlite, switch to the mysql engine, create table m1 there and fill in the data (I don't know how).
Somehow creating multiple sessions (you can't create them, but if you have a way)
Anything else that you suggest.
Rules:
I'm talking about flask_sqlalchemy here (so don't write about plain sqlalchemy).
I don't want to use raw SQL queries. Abstraction is important.
Before you jump in to mark this question as a duplicate, understand its scenario. If you find anything relevant, post it in the comments. If it really covers my case, you can go on and close it as a duplicate.

Get Primary Key column name from table in sqlalchemy (Core)

I am using SQLAlchemy Core, so I am not using a declarative base class like in other similar questions.
How to get the primary key of a table using the engine?
So, I just ran into the same problem. You need to create a Table object that reflects the table for which you are looking to find the primary key.
from sqlalchemy import create_engine, Table, MetaData
dbUrl = 'postgres://localhost:5432/databaseName' #will be different for different db's
engine = create_engine(dbUrl)
meta = MetaData()
table = Table('tableName', meta, autoload=True, autoload_with=engine)
primaryKeyColName = table.primary_key.columns.values()[0].name
The Table construct above is useful for a number of different functions. I use it quite a bit for managing geospatial tables since I do not really like any of the current packages out there.
In your comment you mention that you are not defining a table... I think that means you aren't creating a SQLAlchemy model of the table. With the approach above, you don't need to do that and can gather all sorts of information from a table in a more dynamic fashion. Especially useful if you are handed someone else's messy database!
Hope that helps.
I'd like to comment, but I do not have enough reputation for that.
Building on greenbergé's answer:
from sqlalchemy import create_engine, Table, MetaData
dbUrl = 'postgres://localhost:5432/databaseName' #will be different for different db's
engine = create_engine(dbUrl)
meta = MetaData()
table = Table('tableName', meta, autoload=True, autoload_with=engine)
[0] is not always the whole PK; it is only when the PK has a single column.
table.primary_key.columns.values() is a list.
To get all the columns of a multi-column PK you could use:
primaryKeyColNames = [pk_column.name for pk_column in table.primary_key.columns.values()]
The two answers above retrieve the primary key from a metadata object.
Even though that works well, sometimes you may want to retrieve the primary key from an instance of a SQLAlchemy model, without knowing the model class itself (for example, if you want a helper function, say get_primary_key, that accepts a DB model instance and outputs its primary keys).
For this we can use the inspect function from the inspection module:
from sqlalchemy import inspect

def get_primary_key(model_instance):
    inspector = inspect(model_instance)
    model_columns = inspector.mapper.columns
    return [c.description for c in model_columns if c.primary_key]
You could also use the __mapper__ object directly:
def get_primary_key(model_instance):
    model_columns = model_instance.__mapper__.columns
    return [c.description for c in model_columns if c.primary_key]
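A quick self-contained check of the instance-based helper described above (the Widget model is illustrative; requires SQLAlchemy 1.4+):

```python
from sqlalchemy import Column, Integer, String, inspect
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Widget(Base):
    __tablename__ = "widget"
    id = Column(Integer, primary_key=True)
    name = Column(String)

def get_primary_key(model_instance):
    # inspect() on an instance returns its InstanceState; the mapper
    # exposes the mapped columns, from which we keep the PK columns
    inspector = inspect(model_instance)
    return [c.description for c in inspector.mapper.columns if c.primary_key]

w = Widget(id=1, name="a")
print(get_primary_key(w))  # ['id']
```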
For a reflected table this works:
insp = inspect(self.db.engine)
pk_temp = insp.get_pk_constraint(self.__tablename__)['constrained_columns']

Python and sqlite3 data structure to store table name and columns for multiple reuse

I'm using the Python sqlite3 API to create a database.
In all the examples I saw in the documentation, table names and column names are hardcoded inside queries. This could be a problem if I reuse the same table multiple times (i.e., creating the table, inserting records into it, reading data from it, altering it and so on), because in case of a table modification I would need to change the hardcoded names in multiple places, which is not good programming practice.
How can I solve this problem?
I thought of creating a class with just a constructor method to store all these string names, and using it inside the class that operates on the database. But as I'm not an expert Python programmer, I would like to share my thoughts...
class TableA(object):
    def __init__(self):
        self.table_name = 'tableA'
        self.name_col1 = 'first_column'
        self.type_col1 = 'INTEGER'
        self.name_col2 = 'second_column'
        self.type_col2 = 'TEXT'
        self.name_col3 = 'third_column'
        self.type_col3 = 'BLOB'
and then inside the DB class:
table_A = TableA()

def insert_table(self):
    conn = sqlite3.connect(self._db_name)
    query = 'INSERT INTO ' + table_A.table_name + ..... <SNIP>
    conn.execute(query)
Is this a proper way to proceed?
I don't know what's proper but I can tell you that it's not conventional.
If you really want to structure tables as classes, you could consider an object relational mapper like SQLAlchemy. Otherwise, the way you're going about it, how do you know how many column variables you have? What about storing a list of 2-item lists? Or a list of dictionaries?
self.column_list = []
self.column_list.append({'name': 'first', 'type': 'integer'})
The way you're doing it sounds pretty novel. Check out how an ORM like SQLAlchemy structures this and see how they do it.
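As a runnable sketch of that list-of-dicts idea (table and column names here are illustrative), the metadata can be used to build the SQL strings in one place:

```python
import sqlite3

class TableA:
    def __init__(self):
        self.table_name = 'tableA'
        # one dict per column instead of numbered attributes
        self.columns = [
            {'name': 'first_column', 'type': 'INTEGER'},
            {'name': 'second_column', 'type': 'TEXT'},
        ]

    def create_sql(self):
        # build the CREATE TABLE statement from the column metadata
        cols = ', '.join(f"{c['name']} {c['type']}" for c in self.columns)
        return f"CREATE TABLE {self.table_name} ({cols})"

table_a = TableA()
conn = sqlite3.connect(':memory:')
conn.execute(table_a.create_sql())
conn.execute(f"INSERT INTO {table_a.table_name} VALUES (1, 'hello')")
row = conn.execute(f"SELECT * FROM {table_a.table_name}").fetchone()
print(row)  # (1, 'hello')
```

If the table changes, only the columns list needs updating; every query derives its names from it.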
If you are going to start using classes to provide an abstraction layer for your database tables, you might as well start using an ORM. Some examples are SQLAlchemy and SQLObject, both of which are extremely popular.
Here's a taste of SQLAlchemy:
from sqlalchemy import Column, Integer, String
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class TableA(Base):
    __tablename__ = 'tableA'
    id = Column(Integer, primary_key=True)
    first_column = Column(Integer)
    second_column = Column(String)
    # etc...

engine = create_engine('sqlite:///test.db')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

ta = TableA(first_column=123, second_column='Hi there')
session.add(ta)
session.commit()
Of course you would choose semantic names for the table and columns, but you can see that declaring a table is something along the lines of what you were proposing in your question, i.e. using a class. Inserting records is simplified by creating instances of that class.
I personally don't like to use libraries and frameworks without a proper reason. So, absent such a reason, I would write a thin wrapper around sqlite.
class Column(object):
    def __init__(self, col_name="FOO", col_type="INTEGER"):
        # standard initialization
        self.col_name = col_name
        self.col_type = col_type
And then a table class that encapsulates operations on the database:
class Table(object):
    def __init__(self, list_of_columns, cursor):
        # initialization
        self.columns = list_of_columns
        self.cursor = cursor
    # create-update-delete commands go here
In the Table class you can encapsulate all the database operations you want.
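Fleshed out, that wrapper might look like the following (method and table names are illustrative, not from the original answer):

```python
import sqlite3

class Column:
    def __init__(self, col_name="FOO", col_type="INTEGER"):
        self.col_name = col_name
        self.col_type = col_type

class Table:
    def __init__(self, name, list_of_columns, cursor):
        self.name = name
        self.columns = list_of_columns
        self.cursor = cursor

    def create(self):
        # derive the DDL from the Column objects
        cols = ', '.join(f"{c.col_name} {c.col_type}" for c in self.columns)
        self.cursor.execute(f"CREATE TABLE {self.name} ({cols})")

    def insert(self, *values):
        # parameterized insert; only the identifiers are interpolated
        placeholders = ', '.join('?' for _ in values)
        self.cursor.execute(
            f"INSERT INTO {self.name} VALUES ({placeholders})", values
        )

conn = sqlite3.connect(':memory:')
t = Table('demo', [Column('id', 'INTEGER'), Column('name', 'TEXT')], conn.cursor())
t.create()
t.insert(1, 'abc')
print(conn.execute("SELECT * FROM demo").fetchall())  # [(1, 'abc')]
```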

Return query with columns from multiple tables in SQLAlchemy

I haven't been able to find an answer to this, but I'm sure it must be somewhere.
My question is similar to this question: sqlalchemy: how to join several tables by one query?
But I need a query result, not a tuple. I don't have access to the models, so I can't change them, and I can't modify the consuming functions to accept a tuple.
I have two tables, UserInformation and MemberInformation, both with a foreign key and relationship to Principal, but not to each other.
How can I get all the records and columns from both tables in one query?
I've tried:
query = DBSession.query(MemberInformation).join(UserInformation, MemberInformation.pId == UserInformation.pId)
but it only returns the columns of MemberInformation
and:
query = DBSession.query(MemberInformation, UserInformation).join(UserInformation, MemberInformation.pId == UserInformation.pId)
but that returns a tuple.
What am I missing here?
Old question, but worth answering because I see it's got a lot of view activity.
You need to create a relationship and then tell SQLAlchemy how to load the related data. Not sure what your tables / relationship looks like, but it might look something like this:
# Create relationship (assumes: from sqlalchemy.orm import relationship, joinedload)
MemberInformation.user = relationship(
    "UserInformation",
    foreign_keys=[MemberInformation.pId],
    lazy="joined",
)

# Execute query
query = DBSession.query(MemberInformation) \
    .options(joinedload(MemberInformation.user)) \
    .all()

# All objects are in memory. Evaluating the following will NOT result in additional
# database interaction
for member in query:
    # member is a MemberInformation object, member.user is a UserInformation object
    print(f'Member: {member} User: {member.user}')
Ideally, the relationship would be defined in your models. It can, however, be defined at run time like in the example above.
The only way I found to do this is to use a statement instead of a query:
stmt = select([table1, table2.col.label('table2_col')]).select_from(join(table1, table2, table1.t1_id == table2.t2_id))
obj = session.execute(stmt).fetchall()

Python, sqlalchemy and a fast way to get tables with schemas

I've been using SQLAlchemy for my REST implementation and now I want to get a list of tables from the database. I have tried these:
# first attempt
engine = create_engine(datasource.connection_string)
insp = reflection.Inspector.from_engine(engine)
tables = insp.get_table_names()
view = insp.get_view_names()
# second attempt
meta = MetaData()
meta.reflect(bind=engine, views=True)
While they both work perfectly fine, they both have their downsides:
The first does not give me the schema, only the object name.
The second gives me the world, but runs dog slow.
For both there is no filtering.
Using a SQL statement is not an option since I need it to be somewhat cross-db.
Is there a way to quickly load the objects including schema? Filtering is of less importance since I can do that on the list of objects fast enough.
get_table_names() and meta.reflect() only load tables from one schema at a time. If an explicit schema name is not given, the "default" schema is assumed, which is the schema that your database connection automatically selects when you connect; this is configured on the database side in conjunction with the user account used to connect.
If you want to see the name of this schema, use the default_schema_name property:
>>> from sqlalchemy import create_engine, inspect
>>> e = create_engine("postgresql://scott:tiger@localhost/test")
>>> insp = inspect(e)
>>> insp.default_schema_name
u'public'
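To enumerate every schema rather than just the default one, the inspector's get_schema_names() can be combined with the schema argument of get_table_names(). A sketch, using an in-memory SQLite database as a stand-in for a real connection string:

```python
from sqlalchemy import create_engine, inspect, text

engine = create_engine("sqlite://")  # stand-in for your real connection string
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE example (id INTEGER PRIMARY KEY)"))

insp = inspect(engine)
tables = {}
for schema in insp.get_schema_names():
    # pass schema= explicitly instead of relying on the default schema
    tables[schema] = insp.get_table_names(schema=schema)
print(tables)
```

This still issues one reflection call per schema, but it avoids the full meta.reflect() cost because only names are fetched, not column definitions.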
