How to find nested value in postgres JSONB using sqlalchemy? - python

I have a postgres table with a JSONB column named entity. An example of the JSON structure is:
{
    "some_key": "some value",
    "properties": {
        "name": ["name_value"]
    }
}
I need to find records by name_value. I can get them with this query:
SELECT entity FROM my_table WHERE entity->'properties' @> '{"name": ["name_value"]}';
The problem is: I cannot find a way to build that same query with the SQLAlchemy ORM.
Update:
My model is created dynamically, because multiple tables share the same structure and differ only by table name. Here is the code:
...
Base = declarative_base(engine)

class ForeignBase(AbstractConcreteBase, Base):
    pass

def make_model(name):
    class Entity(ForeignBase):
        __tablename__ = name
        __mapper_args__ = {"polymorphic_identity": name, "concrete": True}
        __table_args__ = {"extend_existing": True}

        id = Column(String, primary_key=True)
        entity = Column(JSONB, nullable=True)
        ...

    configure_mappers()
    return Entity
And then I use three functions to initialize my models and fetch the one I currently need:
def get_model_list():
    tables_names_list = get_table_name_list()
    model_list = [make_model(PREFIX + table_name) for table_name in tables_names_list]
    return model_list

def get_tables_dict():
    return {table.__tablename__: table for table in ForeignBase.__subclasses__()}

def get_table_object(table_name):
    return get_tables_dict().get(table_name)

You can use the following SQLAlchemy expression. The @> operator corresponds to contains() on JSONB columns in SQLAlchemy:
Session().query(MyTable).filter(
    MyTable.entity["properties"].contains({"name": ["name_value"]})
).with_entities(MyTable.entity).all()
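As a quick sanity check that contains() maps to the @> operator, the filter expression can be compiled against the PostgreSQL dialect without a database. A minimal sketch; MyTable is a placeholder model mirroring the question's column:

```python
from sqlalchemy import Column, String
from sqlalchemy.dialects import postgresql
from sqlalchemy.dialects.postgresql import JSONB
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class MyTable(Base):
    # Placeholder model mirroring the question's JSONB column.
    __tablename__ = "my_table"
    id = Column(String, primary_key=True)
    entity = Column(JSONB, nullable=True)

# entity["properties"] renders as entity -> 'properties',
# and .contains() renders as the @> containment operator.
expr = MyTable.entity["properties"].contains({"name": ["name_value"]})
sql = str(expr.compile(dialect=postgresql.dialect()))
print(sql)
```

Printing the compiled statement is a cheap way to confirm the ORM expression produces the SQL you had in mind before running it against real data.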

Related

In SQLAlchemy, how to use 'on_conflict_do_update' when one uses a dictionary for the values

In SQLAlchemy I want to insert a new row into my_table, but if the value for the column "name" already exists, then I want to update that column instead:
dico = {
    "name": "somename",
    "other_column": "some_value"
}
result = conn.execute(
    insert(my_table).on_conflict_do_update(
        constraint="name",
        set_=dico
    ),
    [dico]
)
But I get this error:
'Insert' object has no attribute 'on_conflict_do_update'
Very important: I need to specify the values for the insert/update as a dictionary.
Thank you
Suppose you have a table like this:
from sqlalchemy import Column, String, Integer, UniqueConstraint
from sqlalchemy.orm import declarative_base
from sqlalchemy.dialects.postgresql import insert

Base = declarative_base()

class Table(Base):
    __tablename__ = "example"
    __table_args__ = (UniqueConstraint("name"),)

    row_id = Column(Integer, primary_key=True)
    name = Column(String, unique=True)
    other_column = Column(String)
You can use on_conflict_do_update like this:
dico = {
    "name": "somename",
    "other_column": "some_value"
}

stmt = insert(Table).values(dico)
stmt = stmt.on_conflict_do_update(
    constraint="example_name_key",
    set_={col: getattr(stmt.excluded, col) for col in dico}
)
It's important to know that you have to import insert from sqlalchemy.dialects.postgresql (not from sqlalchemy), and that stmt.excluded holds the "new" values for your update.
EDIT:
To get your constraint names dynamically, you can use the inspector:
inspector = sqlalchemy.inspect(engine)
inspector.get_unique_constraints(Table.__tablename__)  # or provide the table name as a string
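If you'd rather not depend on the auto-generated constraint name at all, index_elements lets you target the unique column directly. A minimal sketch (no database needed; the statement is just compiled to inspect the generated SQL):

```python
from sqlalchemy import Column, Integer, String, UniqueConstraint
from sqlalchemy.dialects import postgresql
from sqlalchemy.dialects.postgresql import insert
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Example(Base):
    __tablename__ = "example"
    __table_args__ = (UniqueConstraint("name"),)

    row_id = Column(Integer, primary_key=True)
    name = Column(String)
    other_column = Column(String)

dico = {"name": "somename", "other_column": "some_value"}

stmt = insert(Example).values(dico)
stmt = stmt.on_conflict_do_update(
    index_elements=["name"],  # conflict target: the unique column itself
    set_={col: getattr(stmt.excluded, col) for col in dico},
)
sql = str(stmt.compile(dialect=postgresql.dialect()))
print(sql)
```

With index_elements the conflict target is rendered as ON CONFLICT (name), so the statement keeps working even if the constraint name changes between environments.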

SQLAlchemy - Return filtered table AND corresponding foreign table values

I have the following SQLAlchemy mapped classes:
class ShowModel(db.Model):
    __tablename__ = 'shows'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(100))
    episodes = db.relationship('EpisodeModel', backref='episode', lazy='dynamic')

class EpisodeModel(db.Model):
    __tablename__ = 'episodes'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(200))
    show_id = db.Column(db.Integer, db.ForeignKey('shows.id'))
    info = db.relationship('InfoModel', backref='episode', lazy='dynamic')

class InfoModel(db.Model):
    __tablename__ = 'info'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(100))
    episode_id = db.Column(db.Integer, db.ForeignKey('episodes.id'))
I'm trying and failing to figure out how to perform a query that searches the info table for a specific name column value AND then return the shows and episodes table rows that are associated with it.
Using the following query allows me to return the specific info row that matches the filter_by(name=name) query
InfoModel.query.filter_by(name=name).all()
But I am really struggling to figure out how to also get the values of the corresponding foreign key rows that have a relationship with the specific info row. Is there a proper way to do this with the join statement or something similar? Thank you very much for any help on this, as I'm still trying to get the hang of working with SQL databases, and databases in general.
Edit -
If, for example, I use the query InfoModel.query.filter_by(name="ShowName1").all(), my json() representation returns
{
    "name": "InfoName1",
    "id": 1,
    "episode_id": 1
}
But I also want to return the associated foreign-table values, so that my json() representation returns -
{
    "name": "ShowName1",
    "id": 1,
    "episodes": {
        "name": "EpisodeName1",
        "id": 1,
        "show_id": 1,
        "info": {
            "name": "InfoName1",
            "id": 1,
            "episode_id": 1
        }
    }
}
And I apologize for fumbling over my use of jargon here, making my question appear more complicated than it is.
Because you have lazy loading enabled, the joined tables will only be loaded when they are accessed. What you can do is force a join. Something like the following should work for you:
shows = (
    session.query(ShowModel)
    .join(EpisodeModel)
    .join(InfoModel)
    .filter(ShowModel.name == "foo")
    .all()
)
You can also change your load configuration to be "eager", or any number of other options. I don't like to do this by default though, as it makes for accidentally expensive queries: https://docs.sqlalchemy.org/en/latest/orm/loading_relationships.html
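A self-contained sketch of the join approach combined with contains_eager, which tells the ORM to populate the relationships from the rows the join already fetched. Plain SQLAlchemy and in-memory SQLite stand in here for the Flask-SQLAlchemy models above, and the backref names are simplified:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import contains_eager, declarative_base, relationship, sessionmaker

Base = declarative_base()

# Plain-SQLAlchemy stand-ins for the Flask-SQLAlchemy models in the question.
class Show(Base):
    __tablename__ = "shows"
    id = Column(Integer, primary_key=True)
    name = Column(String(100))
    episodes = relationship("Episode", backref="show")

class Episode(Base):
    __tablename__ = "episodes"
    id = Column(Integer, primary_key=True)
    name = Column(String(200))
    show_id = Column(Integer, ForeignKey("shows.id"))
    info = relationship("Info", backref="episode")

class Info(Base):
    __tablename__ = "info"
    id = Column(Integer, primary_key=True)
    name = Column(String(100))
    episode_id = Column(Integer, ForeignKey("episodes.id"))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

show = Show(name="ShowName1")
episode = Episode(name="EpisodeName1", show=show)
session.add(Info(name="InfoName1", episode=episode))
session.commit()

# Join through all three tables, filter on the info name, and tell the ORM
# that the joined rows should populate the relationships (contains_eager),
# so no extra lazy-load queries fire when the collections are accessed.
shows = (
    session.query(Show)
    .join(Show.episodes)
    .join(Episode.info)
    .options(contains_eager(Show.episodes).contains_eager(Episode.info))
    .filter(Info.name == "InfoName1")
    .all()
)
print(shows[0].name, shows[0].episodes[0].info[0].name)
```

Note that with contains_eager the episodes/info collections hold only the rows that matched the filter, which is usually what you want when serializing to the nested json() shape shown above.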

sqlalchemy: Select from table where column in QUERY

I have a situation where I am trying to count the number of rows in a table where a column's value appears in a subquery. For example, let's say that I have SQL like this:
SELECT count(*) FROM table1
WHERE column1 IN (SELECT column2 FROM table2);
I have my tables defined like so:
class table1(Base):
    __tablename__ = "table1"
    __table_args__ = {'schema': 'myschema'}
    acct_id = Column(DECIMAL(precision=15), primary_key=True)

class table2(Base):
    __tablename__ = "table2"
    __table_args__ = {'schema': 'myschema'}
    ban = Column(String(length=128), primary_key=True)
The tables are reflected from the database so there are other attributes present that aren't explicitly specified in the class definition.
I can try to write my query but here is where I am getting stuck...
qry=self.session.query(func.?(...)) # what to put here?
res = qry.one()
I tried looking through the documentation here but I don't see any comparable implementation of the 'in' keyword, which is a feature of many SQL dialects.
I am using Teradata as my backend if that matters.
sub_stmt = session.query(table2.ban)
stmt = session.query(table1).filter(table1.acct_id.in_(sub_stmt))
data = stmt.all()
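To get the count(*) itself rather than the rows, func.count() with select_from can wrap the same IN filter. A runnable sketch with simplified stand-in models (Integer on both key columns so the comparison is well-defined) and in-memory SQLite:

```python
from sqlalchemy import Column, Integer, create_engine, func
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

# Simplified stand-ins for the reflected tables in the question; both key
# columns are Integer here so the IN comparison is unambiguous.
class Table1(Base):
    __tablename__ = "table1"
    acct_id = Column(Integer, primary_key=True)

class Table2(Base):
    __tablename__ = "table2"
    ban = Column(Integer, primary_key=True)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add_all([Table1(acct_id=1), Table1(acct_id=2), Table2(ban=1)])
session.commit()

# Mirrors: SELECT count(*) FROM table1
#          WHERE acct_id IN (SELECT ban FROM table2)
sub_stmt = session.query(Table2.ban)
count = (
    session.query(func.count())
    .select_from(Table1)
    .filter(Table1.acct_id.in_(sub_stmt))
    .scalar()
)
print(count)  # 1
```

scalar() unwraps the single count value from the one-row, one-column result, matching the qry.one() pattern the question was reaching for.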

How to properly use SQLAlchemy's '@aggregated' class attribute decorator

I'm trying to use SQLAlchemy-Utils' @aggregated decorator to define an attribute ('gross_amount') for a class, Receipt. This gross_amount attribute is the sum of Item.gross_amount for all Item instances associated with the Receipt instance by a foreign key.
I.e., a receipt is made up of items, and I want to define a receipt 'gross_amount' value which is just the total $ of all of the items on the receipt.
I've modeled my code after this document http://sqlalchemy-utils.readthedocs.io/en/latest/aggregates.html
So it looks like this...
from sqlalchemy import Column, Integer, ForeignKey
from sqlalchemy.sql import func
from sqlalchemy import orm
from sqlalchemy_utils import aggregated

class Receipt(Base):
    __tablename__ = "receipts"
    __table_args__ = {'extend_existing': True}

    id = Column(Integer, index=True, primary_key=True, nullable=False)

    @aggregated('itemz', Column(Integer))
    def gross_amount(self):
        return func.sum(Item.gross_amount)

    itemz = orm.relationship(
        'Item',
        backref='receipts'
    )

class Item(Base):
    __tablename__ = "items"

    id = Column(Integer, index=True, primary_key=True, nullable=False)

    '''
    FE relevant
    '''
    gross_amount = Column(Integer)
    receipt_id = Column(Integer, ForeignKey("receipts.id"), nullable=False)
In my migration, am I supposed to have a column in the receipts table for gross_amount?
1) When I DO define this column in the receipts table, any Receipt.gross_amount for any instance just points to the gross_amount values defined in the receipts table.
2) When I DO NOT define this column in the receipts table, I get a SQLAlchemy error whenever I execute a SELECT against the database:
ProgrammingError: (psycopg2.ProgrammingError) column receipts.gross_amount does not exist
FWIW, my SQLAlchemy packages are the latest distributed through pip...
SQLAlchemy==1.1.11
SQLAlchemy-Utils==0.32.14
And my local db on which I'm running this for now is PostgreSQL 9.6.2
What am I doing wrong here? Any patient help would be greatly appreciated!
Yes, you do need to add the column to the table:
CREATE TABLE receipts (
    id INTEGER NOT NULL,
    gross_amount INTEGER,  -- <<< See, it's here :)
    PRIMARY KEY (id)
);
INSERT INTO receipts VALUES (1, 7);
INSERT INTO receipts VALUES (2, 7);
CREATE TABLE items (
    id INTEGER NOT NULL,
    gross_amount INTEGER,
    receipt_id INTEGER NOT NULL,
    PRIMARY KEY (id),
    FOREIGN KEY (receipt_id) REFERENCES receipts (id)
);
Tested with this self-contained snippet:
from sqlalchemy import Column, Integer, ForeignKey, create_engine, orm
from sqlalchemy.orm import sessionmaker
from sqlalchemy.sql import func
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy_utils import aggregated

Base = declarative_base()

class Receipt(Base):
    __tablename__ = "receipts"
    __table_args__ = {'extend_existing': True}

    id = Column(Integer, index=True, primary_key=True, nullable=False)

    @aggregated('itemz', Column(Integer))
    def gross_amount(self):
        return func.sum(Item.gross_amount)

    itemz = orm.relationship('Item', backref='receipts')

class Item(Base):
    __tablename__ = "items"

    id = Column(Integer, index=True, primary_key=True, nullable=False)
    gross_amount = Column(Integer)
    receipt_id = Column(Integer, ForeignKey("receipts.id"), nullable=False)

    def __init__(self, amount):
        self.gross_amount = amount

engine = create_engine('sqlite:///xxx.db', echo=True)
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

receipt = Receipt()
receipt.itemz.append(Item(5))
receipt.itemz.append(Item(2))
session.add(receipt)
session.commit()

print(receipt.gross_amount)
Of course, there's also another approach called hybrid_property, which basically allows you to do both ORM- and database-level queries without adding an extra column to your database:
@hybrid_property
def gross_sum(self):
    return sum(i.gross_amount for i in self.itemz)

@gross_sum.expression
def gross_sum(cls):
    return select([func.sum(Item.gross_amount)]).\
        where(Item.receipt_id == cls.id).\
        label('gross_sum')
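Here is a self-contained version of the hybrid approach in SQLAlchemy 1.4+ select() syntax (a sketch with simplified columns and in-memory SQLite for demonstration; note scalar_subquery(), which lets the expression be compared in a filter):

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine, func, select
from sqlalchemy.ext.hybrid import hybrid_property
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

class Receipt(Base):
    __tablename__ = "receipts"
    id = Column(Integer, primary_key=True)
    itemz = relationship("Item", backref="receipt")

    @hybrid_property
    def gross_sum(self):
        # Python side: sums the already-loaded items of one instance.
        return sum(i.gross_amount for i in self.itemz)

    @gross_sum.expression
    def gross_sum(cls):
        # SQL side: correlated scalar subquery, usable in filters/order_by.
        return (
            select(func.sum(Item.gross_amount))
            .where(Item.receipt_id == cls.id)
            .scalar_subquery()
            .label("gross_sum")
        )

class Item(Base):
    __tablename__ = "items"
    id = Column(Integer, primary_key=True)
    gross_amount = Column(Integer)
    receipt_id = Column(Integer, ForeignKey("receipts.id"), nullable=False)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

receipt = Receipt()
receipt.itemz.extend([Item(gross_amount=5), Item(gross_amount=2)])
session.add(receipt)
session.commit()

print(receipt.gross_sum)  # Python side: 7
total = session.query(Receipt).filter(Receipt.gross_sum == 7).count()
print(total)  # SQL side: 1
```

Unlike @aggregated, nothing is stored: the Python side recomputes from loaded items, and the SQL side emits the subquery each time, so no extra column or migration is needed.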
The reason you're getting this error is because the new column you're adding (gross_amount) has not been created in the receipts table in the database.
Meaning, your current database table only has one created column (id). For the aggregated column to work, it needs to contain an additional column called gross_amount.
This additional column has to allow null values.
One way to go about doing that is through SQL directly in PostgreSQL:
ALTER TABLE receipts ADD gross_amount int;
Alternatively, if there's no data yet, you can drop and recreate the table via SQLAlchemy. It should create this extra column automatically.
I'm not sure what you mean by the last part:
When I DO define this column in the receipts table, any
Receipt.gross_amount for any instance just points to the gross_amount
values defined in the receipts table.
That's where it's supposed to point. Do you mean that it doesn't contain any values, even though there are values for this receipt's items in Item? If so, I would double-check that this is the case (and, per the examples here, refresh the database session before checking the results).

How to query with joins using sql alchemy?

I am trying to use SQLAlchemy with MySQL as the backend. The following are my table schemas (defined for the ORM using SQLAlchemy):
class ListItem(Base):
    """"""
    __tablename__ = "listitem"

    ListItemID = Column(Integer, primary_key=True)
    ListItemTypeID = Column(Integer, ForeignKey("listitemtype.ListItemTypeID"))
    ModelID = Column(Integer, ForeignKey("model.ModelID"))
    RefCode = Column(String(25))

    def __init__(self, ListItemTypeID, ModelID, RefCode):
        self.ListItemTypeID = ListItemTypeID
        self.ModelID = ModelID
        self.RefCode = RefCode

class Model(Base):
    """"""
    __tablename__ = "model"

    ModelID = Column(Integer, primary_key=True)
    Name = Column(String(255))

    def __init__(self, Name):
        self.Name = Name
I am not including the class mappers for the other reference tables like (ListItemType).
I would like to know how to query joining the "ListItem" table to the "Model" table and "ListItemType" table.
The SQL equivalent would be:
SELECT listitem.ListItemID, model.Name, listitemtype.Name
FROM listitem
JOIN listitemtype ON listitem.ListItemTypeID = listitemtype.ListItemTypeID
JOIN model ON listitem.ModelID = model.ModelID;
I am fairly new with using sqlalchemy. Thanks for any help in advance.
If the columns already have a foreign key relationship, the following should work.
Read the docs on joins.
result = (
    session.query(ListItem)
    .join(ListItemType)
    .join(Model)
    .with_entities(ListItem.ListItemID, Model.Name, ListItemType.Name)
    .all()
)
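A runnable version of that query with sample data and explicit ON clauses (in-memory SQLite; the ListItemType mapping, omitted from the question, is filled in here as an assumption):

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class ListItemType(Base):
    # Assumed mapping; the question omitted this class.
    __tablename__ = "listitemtype"
    ListItemTypeID = Column(Integer, primary_key=True)
    Name = Column(String(255))

class Model(Base):
    __tablename__ = "model"
    ModelID = Column(Integer, primary_key=True)
    Name = Column(String(255))

class ListItem(Base):
    __tablename__ = "listitem"
    ListItemID = Column(Integer, primary_key=True)
    ListItemTypeID = Column(Integer, ForeignKey("listitemtype.ListItemTypeID"))
    ModelID = Column(Integer, ForeignKey("model.ModelID"))
    RefCode = Column(String(25))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add_all([
    ListItemType(ListItemTypeID=1, Name="TypeA"),
    Model(ModelID=1, Name="ModelA"),
    ListItem(ListItemID=1, ListItemTypeID=1, ModelID=1, RefCode="R1"),
])
session.commit()

# Explicit ON clauses mirror the hand-written SQL in the question.
rows = (
    session.query(ListItem.ListItemID, Model.Name, ListItemType.Name)
    .select_from(ListItem)
    .join(Model, ListItem.ModelID == Model.ModelID)
    .join(ListItemType, ListItem.ListItemTypeID == ListItemType.ListItemTypeID)
    .all()
)
print(rows)
```

When the foreign keys are declared, the ON clauses can be dropped and SQLAlchemy infers them; spelling them out keeps the join unambiguous if more relationships are added later.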
