I'm trying to insert a JSON string into a Snowflake table defined below:

from sqlalchemy import Column, Integer, String, Sequence
from sqlalchemy.orm import declarative_base
from snowflake.sqlalchemy import VARIANT

Base = declarative_base()

class Identify(Base):
    __tablename__ = 'identify'
    event_id = Column(Integer, Sequence('event_id_seq'), unique=True, primary_key=True)
    user_id = Column(String, nullable=False)
    traits = Column(VARIANT, nullable=False)
The json_str that I'm trying to insert is:
json_str = '''
{
    "address": "xyz",
    "age": 32,
    "avatar": "xyz",
    "birthday": {
        "seconds": 20,
        "nanos": 10
    }
}
'''
I'm using the following code to insert the json_str into the table:

session = Session()
obj = Identify(user_id=request.user_id,
               traits=json_str)
session.add(obj)
session.commit()
session.close()
Which yields the following error:

snowflake.connector.errors.ProgrammingError: 002023 (22000): SQL compilation error: Expression type does not match column data type, expecting VARIANT but got VARCHAR(3038) for column TRAITS
Is there a way to insert the json_str or a dict without writing an SQL insert statement that uses the TO_VARIANT conversion function?
Is that valid JSON? Looks to me like you have unmatched brackets...
-Paul-
The API doc says VARIANT is supported for fetching data, by converting it to a string.
I've found no way to store data other than an INSERT statement that converts a string to VARIANT.
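For what it's worth, here is a minimal sketch of that workaround using a textual INSERT ... SELECT, which applies PARSE_JSON on the server side (as far as I know, Snowflake does not allow PARSE_JSON directly in a VALUES clause). It assumes the session, table, and json_str from the question:

from sqlalchemy import text

# Server-side conversion: PARSE_JSON turns the bound string into a VARIANT.
session.execute(
    text(
        "INSERT INTO identify (user_id, traits) "
        "SELECT :user_id, PARSE_JSON(:traits)"
    ),
    {"user_id": request.user_id, "traits": json_str},
)
session.commit()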
In SQLAlchemy I want to insert a new row in my_table, but if the value for the column "name" already exists, I want to update that column instead:
dico = {
    "name": "somename",
    "other_column": "some_value"
}

result = conn.execute(
    insert(my_table).on_conflict_do_update(
        constraint="name",
        set_=dico
    ),
    [dico]
)
But I get this error:
'Insert' object has no attribute 'on_conflict_do_update'
Very important: I need to specify the values for the insert/update as a dictionary.
Thank you
Suppose you have a table like this:

from sqlalchemy import Column, String, Integer, UniqueConstraint
from sqlalchemy.orm import declarative_base
from sqlalchemy.dialects.postgresql import insert

Base = declarative_base()

class Table(Base):
    __tablename__ = "example"
    __table_args__ = (UniqueConstraint("name"),)

    row_id = Column(Integer, primary_key=True)
    name = Column(String, unique=True)
    other_column = Column(String)
You can use on_conflict_do_update like this:

dico = {
    "name": "somename",
    "other_column": "some_value"
}

stmt = insert(Table).values(dico)
stmt = stmt.on_conflict_do_update(
    constraint="example_name_key",
    set_={col: getattr(stmt.excluded, col) for col in dico}
)
It is important to import insert from sqlalchemy.dialects.postgresql (the generic sqlalchemy.insert has no on_conflict_do_update method, which is exactly the error you hit), and note that stmt.excluded holds the "new" values for your update.
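For completeness, a minimal sketch of executing the statement (the connection string is a placeholder; everything else comes from the snippets above):

from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:pass@localhost/db")  # hypothetical DSN
Base.metadata.create_all(engine)

with engine.begin() as conn:
    conn.execute(stmt)  # inserts, or updates the existing "somename" row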
EDIT:
To get your constraint names dynamically, you can use the inspector:

inspector = sqlalchemy.inspect(engine)
inspector.get_unique_constraints(Table.__tablename__)  # or provide the table name as a string
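Putting the two together, a sketch that looks up the constraint name at runtime and feeds it into the upsert (this assumes the table has exactly one unique constraint):

import sqlalchemy

inspector = sqlalchemy.inspect(engine)
# get_unique_constraints returns a list of dicts with "name" and "column_names".
constraint_name = inspector.get_unique_constraints(Table.__tablename__)[0]["name"]

stmt = insert(Table).values(dico)
stmt = stmt.on_conflict_do_update(
    constraint=constraint_name,
    set_={col: getattr(stmt.excluded, col) for col in dico},
)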
I have a Postgres table with a JSONB column named entity. An example of the JSON structure is:

{
    "some_key": "some value",
    "properties": {
        "name": ["name_value"]
    }
}

I need to find records by name_value. I can get them with this query:

SELECT entity FROM my_table WHERE entity->'properties' @> '{"name": ["name_value"]}';

The problem is: I cannot find a way to build that same query using the SQLAlchemy ORM.
Update:
My models are created dynamically, because multiple tables share the same structure and differ only by table name. Here is the code:

...
Base = declarative_base(engine)

class ForeignBase(AbstractConcreteBase, Base):
    pass

def make_model(name):
    class Entity(ForeignBase):
        __tablename__ = name
        __mapper_args__ = {"polymorphic_identity": name, "concrete": True}
        __table_args__ = {"extend_existing": True}

        id = Column(String, primary_key=True)
        entity = Column(JSONB, nullable=True)
    ...
    configure_mappers()
    return Entity
And then I use three functions to initialize my models and get the one I currently need:

def get_model_list():
    tables_names_list = get_table_name_list()
    model_list = [make_model(PREFIX + table_name) for table_name in tables_names_list]
    return model_list

def get_tables_dict():
    return {table.__tablename__: table for table in ForeignBase.__subclasses__()}

def get_table_object(table_name):
    return get_tables_dict().get(table_name)
You can use the following SQLAlchemy expression. The operator @> maps to contains() in SQLAlchemy for JSONB columns:

Session().query(MyTable).filter(
    MyTable.entity["properties"].contains({"name": ["name_value"]})
).with_entities(MyTable.entity).all()
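Applied to the dynamic models from your update, a sketch (the table name passed in is a placeholder):

Entity = get_table_object(PREFIX + "some_table")  # hypothetical table name

results = (
    Session()
    .query(Entity)
    .filter(Entity.entity["properties"].contains({"name": ["name_value"]}))
    .with_entities(Entity.entity)
    .all()
)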
I have the following SQLAlchemy mapped classes:

class ShowModel(db.Model):
    __tablename__ = 'shows'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(100))
    episodes = db.relationship('EpisodeModel', backref='show', lazy='dynamic')

class EpisodeModel(db.Model):
    __tablename__ = 'episodes'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(200))
    show_id = db.Column(db.Integer, db.ForeignKey('shows.id'))
    info = db.relationship('InfoModel', backref='episode', lazy='dynamic')

class InfoModel(db.Model):
    __tablename__ = 'info'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(100))
    episode_id = db.Column(db.Integer, db.ForeignKey('episodes.id'))
I'm trying, and failing, to figure out how to perform a query that searches the info table for a specific name column value AND then returns the associated shows and episodes rows.

Using the following query lets me return the specific info row that matches the filter_by(name=name) criteria:

InfoModel.query.filter_by(name=name).all()

But I am really struggling to figure out how to also get the values of the corresponding foreign-key rows related to that specific info row. Is there a proper way to do this with a join or something similar? Thank you very much for any help on this, as I'm still trying to get the hang of working with SQL databases, and databases in general.
Edit -
If, for example, I use the query InfoModel.query.filter_by(name="ShowName1").all(), my json() representation returns:

{
    "name": "InfoName1",
    "id": 1,
    "episode_id": 1
}
But I also want to return the associated foreign-table values, so that my json() representation returns:

{
    "name": "ShowName1",
    "id": 1,
    "episodes": {
        "name": "EpisodeName1",
        "id": 1,
        "show_id": 1,
        "info": {
            "name": "InfoName1",
            "id": 1,
            "episode_id": 1
        }
    }
}
And I apologize for fumbling over my use of jargon here, making my question appear more complicated than it is.
Because you have lazy loading enabled, the related rows are only loaded when they are accessed. What you can do is force a join. Something like the following should work for you:

shows = (
    session.query(ShowModel)
    .join(EpisodeModel)
    .join(InfoModel)
    .filter(InfoModel.name == "foo")
    .all()
)
You can also change your relationship loading configuration to "eager", among a number of other options. I don't like to do this by default, though, as it makes for accidentally expensive queries: https://docs.sqlalchemy.org/en/latest/orm/loading_relationships.html
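For example, a sketch of per-query eager loading with joinedload. Note this assumes the relationships are switched away from lazy='dynamic', since dynamic relationships cannot be eagerly loaded:

from sqlalchemy.orm import joinedload

# Filter on the info name, then eagerly populate shows -> episodes -> info
# so the serialized output includes the nested rows without extra queries.
shows = (
    session.query(ShowModel)
    .join(EpisodeModel)
    .join(InfoModel)
    .filter(InfoModel.name == "InfoName1")
    .options(joinedload(ShowModel.episodes).joinedload(EpisodeModel.info))
    .all()
)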
This has been a tricky one; I hope someone out there can help us all out by posting a (or their) method of creating an index on a nested key of a JSON (or JSONB) column in PostgreSQL using SQLAlchemy (I'm specifically using Flask-SQLAlchemy, but I do not think that will matter much for the answer).
I've tried all sorts of permutations of the index creations below and get everything from KeyErrors, to "'c' is not an attribute", to "the operator 'getitem' is not supported on this expression".
Any help would be greatly appreciated.
# Example JSON; the nested property is "level2_A"
{
    'level1': {
        'level2_A': 'test value',
    }
}

class TestThing(db.Model):
    __tablename__ = 'test_thing'
    id = db.Column(db.BigInteger(), primary_key=True)
    data = db.Column(JSONB)
    __table_args__ = (
        db.Index('ix_1', TestThing.data['level1']['level2_A']),
        db.Index('ix_2', data['level1']['level2_A'].astext),
        db.Index('ix_3', "TestThing.c.data['level1']['level2_A'].astext"),
        db.Index('ix_4', TestThing.c.data['level1']['level2_A'].astext),
        db.Index('ix_5', "test_thing.c.data['level1']['level2_A']"),
    )
The solution I've found is using text() to create a functional index.
Two example indexes here, depending on whether you want to cast the result to text or not:

from sqlalchemy.sql.expression import text
from sqlalchemy.schema import Index

class TestThing(db.Model):
    __tablename__ = 'test_thing'
    id = db.Column(db.BigInteger(), primary_key=True)
    data = db.Column(JSONB)
    __table_args__ = (
        Index("ix_6", text("(data->'level1'->'level2_A')")),
        Index("ix_7", text("(data->'level1'->>'level2_A')")),
    )

Which results in the following SQL to create the indexes:

CREATE INDEX ix_6 ON test_thing (((data -> 'level1'::text) -> 'level2_A'::text) jsonb_ops);
CREATE INDEX ix_7 ON test_thing (((data -> 'level1'::text) ->> 'level2_A'::text) text_ops);
class TestThing(db.Model):
    __tablename__ = 'test_thing'
    id = db.Column(db.BigInteger(), primary_key=True)
    data = db.Column(JSONB)
    __table_args__ = (
        Index("ix_7", "(data->'level1'->>'level2_A')"),
    )

This should ideally work without the text(), because -> returns json(b) and ->> returns text.
The generated query will be:

CREATE INDEX ix_7 ON test_thing (((data->'level1')->>'level2_A') text_ops);
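Alternatively, a sketch of building the same functional index from the column expression itself, defined after the class body so the mapped attribute is accessible (which sidesteps the original problem of referencing TestThing inside its own definition; ix_8 is an assumed name):

from sqlalchemy.schema import Index

class TestThing(db.Model):
    __tablename__ = 'test_thing'
    id = db.Column(db.BigInteger(), primary_key=True)
    data = db.Column(JSONB)

# Defined after the class, so TestThing.data resolves;
# the Index attaches itself to the underlying table.
Index("ix_8", TestThing.data["level1"]["level2_A"].astext)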
I have a MySQL table as follows:

create table USER
(
    USERID int not null auto_increment,
    USERAVATAR varchar(1024) default NULL,
    primary key (USERID)
);
I have created an entry in the table where USERID = 1 and USERAVATAR = NULL.
In Main.py
user_list = session.query(USER).all()
return jsonify(users=[r.serialize() for r in user_list])
sqltables.py
class USER(Base):
    __tablename__ = 'USER'
    USERID = Column(Integer, primary_key=True)
    USERAVATAR = Column(String(1024))

    def serialize(self):
        return unicode({
            'id': self.USERID,
            'userAvatar': self.USERAVATAR
        })
The issue is that even though USERAVATAR has been set to NULL in the database, I'm getting None in my JSON output:
{
    "users": [
        "{'userAvatar': None, 'id': 1}"
    ]
}
Would anyone know what might be the problem here?
Your serialize function is casting the row into a string. Is that what you want in your JSON output, an array of strings instead of an array of objects?
If not, change your serialize function to not use unicode().
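For instance, a minimal sketch of serialize returning a plain dict, so that jsonify renders NULL columns as JSON null:

def serialize(self):
    # Return a dict, not a string; jsonify turns None into JSON null.
    return {
        'id': self.USERID,
        'userAvatar': self.USERAVATAR
    }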