Problem when switching from mysqlclient to pymysql: saving a dict as a string - python

I need to change a project that currently uses the mysqlclient library to use pymysql because of license issues.
The project uses sqlalchemy and doesn't use mysqlclient directly, so I thought I would only need to change the connection string, but I seem to have hit an edge case.
I have places in the code where some columns are defined in the sqlalchemy model as String, but for some reason (old code) the code tries to put a dict there. This works because the value is cast to str (this is the expected behaviour for all types: if I put an int there it is cast to str).
When I change from mysqlclient to pymysql this behaviour seems to break, but only for dicts.
Here is sample code that replicates the issue:
import sqlalchemy
from sqlalchemy import Column, Integer, String, DateTime, func, text, MetaData
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

SCHEMA = 'testing'

con = "mysql+pymysql://{USERNAME}:{PASSWORD}@{HOST}/{SCHEMA}?charset=utf8mb4".format(USERNAME='redacted',
                                                                                     PASSWORD='redacted',
                                                                                     HOST='127.0.0.1:3306',
                                                                                     SCHEMA=SCHEMA)

engine = sqlalchemy.create_engine(con, pool_recycle=3600, pool_size=20, pool_pre_ping=True, max_overflow=100)
metadata = MetaData(bind=engine)
base = declarative_base(metadata=metadata)


class TestModel(base):
    __tablename__ = 'test_table'
    __table_args__ = {'autoload': False,
                      'schema': SCHEMA}

    id = Column(Integer(), primary_key=True, nullable=False, index=True)
    test_value = Column(String(50), nullable=False)
    date_created = Column(DateTime, server_default=func.now(), index=True)
    date_modified = Column(DateTime, server_default=text('CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP'), index=True)


metadata.create_all()

session_maker = sessionmaker(bind=engine)
session = session_maker()

row = TestModel()
row.test_value = {}
session.add(row)
session.commit()
This causes the following error: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '})' at line 1
If you change pymysql to mysqldb in the connection string, the code works.
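For reference, the only difference between the failing and the working setup is the driver name in the URL (credentials redacted as in the snippet above):

con = "mysql+pymysql://USER:PASSWORD@127.0.0.1:3306/testing?charset=utf8mb4"   # breaks for dict values
con = "mysql+mysqldb://USER:PASSWORD@127.0.0.1:3306/testing?charset=utf8mb4"   # works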
My question is this:
Is there a workaround, or is there a sqlalchemy hook I can use to cast the dicts myself?
Also, if anyone knows about other issues when moving from mysqlclient to pymysql, I would appreciate any tips; I can't seem to find any documentation of the differences (apart from the license part).

Is there a sqlalchemy hook I can use to cast the dicts myself?

You could add a validator to your TestModel class:

from sqlalchemy.orm import validates


class TestModel(base):
    # ... columns as defined in the question ...

    @validates("test_value")
    def validate_test_value(self, key, thing):
        if isinstance(thing, dict):
            return str(thing)
        else:
            return thing
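A quick sanity check, assuming the TestModel, session, and validator from above (the dict literal is just an illustrative value):

row = TestModel()
row.test_value = {'some': 'value'}   # the validator turns this into "{'some': 'value'}"
session.add(row)
session.commit()

print(session.query(TestModel.test_value).order_by(TestModel.id.desc()).first())
# -> ("{'some': 'value'}",)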

Related

SQLAlchemy: create_engine() syntax error with PostGIS

Currently I am trying to create a PostGIS database with SQLAlchemy. I plan on normalizing my database by making several tables for shapefile data to be uploaded. My code is as follows:
import datetime

from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, String, BigInteger, DateTime, MetaData
from geoalchemy2 import Geometry

Base = declarative_base()


class meta_link(Base):
    __tablename__ = 'META_LINK'

    ID = Column(BigInteger, primary_key=True)
    FARM = Column(String)
    FIELD = Column(String)
    YEAR = Column(Integer)
    CROP = Column(String)
    TYPE = Column(String)
    TIMESTAMP = Column(DateTime, default=datetime.datetime.utcnow)
I also defined some other tables, but they follow exactly the same pattern as the one listed above. Currently I am trying to create the tables by doing the following:
engine = create_engine('postgresql://myusername:mypassword@localhost:5432/mydatabasename')
metadata = MetaData()
metadata.create_all(bind=engine)
When I try to run the python script I get the following error:
File "./app.py", line 83
engine = create_engine('postgresql://postgres:postgresql#localhost:5432/ammar_DIFM_database')
^
SyntaxError: invalid syntax
I am currently trying to discover my bug, yet I cannot seem to figure it out. I also tried adding "echo=true" at the end of the create_engine statement, yet it did not work. How do I fix this syntax error?
My bad, I missed a closing parenthesis on the previous line. Make sure your parentheses are balanced before you post.

SQLAlchemy searchable: "Function tsq_parse does not exist" on full text search

I'm trying to enable full text search on a model's column using SQLAlchemy-searchable. I followed the instructions on their quickstart guide and applied the fix specified in this github issue given that I'm using Flask. Also, I already created and applied the migrations as specified in Alembic Migrations docs section. However, the following exception is being raised:
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) function
tsq_parse(unknown, unknown) does not exist
LINE 3: WHERE quote.qt_search_vector @@ tsq_parse('pg_catalog.englis...
                                        ^
HINT: No function matches the given name and argument types. You might need
to add explicit type casts.
[SQL: 'SELECT quote.id AS quote_id, quote.song_id AS quote_song_id,
quote.stanza_id AS quote_stanza_id, quote.popularity_count AS quote_popularity_count,
quote.quote_text AS quote_quote_text, quote.qt_search_vector AS quote_qt_search_vector
\nFROM quote \nWHERE quote.qt_search_vector @@ tsq_parse(%(tsq_parse_1)s, %(tsq_parse_2)s)
\nLIMIT %(param_1)s']
[parameters: {'tsq_parse_1': 'pg_catalog.english', 'tsq_parse_2': '"ipsum"', 'param_1': 10}]
(Background on this error at: http://sqlalche.me/e/f405)
Am I missing something?
__init__.py
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy_searchable import make_searchable
Base = declarative_base()
make_searchable(Base.metadata)
... more code ...
Model
from flask_sqlalchemy import BaseQuery
from sqlalchemy_searchable import SearchQueryMixin
from sqlalchemy_utils.types import TSVectorType


class QuoteQuery(BaseQuery, SearchQueryMixin):
    pass


class Quote(db.Model):
    query_class = QuoteQuery
    __table_args__ = (
        db.UniqueConstraint('song_id', 'stanza_id', 'quote_text'),)

    id = db.Column(db.Integer, primary_key=True)
    song_id = db.Column(
        db.Integer, db.ForeignKey('song.id'), nullable=False)
    stanza_id = db.Column(
        db.Integer, db.ForeignKey('stanza.id'), nullable=True)
    popularity_count = db.Column(
        db.BigInteger, unique=False, nullable=False, server_default='1')
    quote_text = db.Column(db.Text, unique=False, nullable=False)
    qt_search_vector = db.Column(TSVectorType('quote_text'))
Query
term = 'lorem'
Quote.query.search('"' + term + '"').all()
This was a problem with the metadata I was passing to the make_searchable function. Fixed make_searchable call:
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy_searchable import make_searchable
db = SQLAlchemy()
make_searchable(db.metadata)
After this, I called db.create_all() and the full text search started working as expected.
For those not using Flask, this was my solution:
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import configure_mappers
from sqlalchemy_searchable import make_searchable

db_engine = create_engine(...)

Base = declarative_base()
make_searchable(Base.metadata)

# ... define your models here ...

configure_mappers()
Base.metadata.create_all(db_engine)

The missing piece was the Base.metadata.create_all(db_engine) call.
I ended up here after troubleshooting this for a day; maybe it will help someone else. Sometimes the missing function was parse_websearch instead of tsq_parse, but I wasn't able to narrow down why it reported one or the other before I fixed it.
My issue was that I had upgraded sqlalchemy_searchable (from 0.10.2 to 1.4) and my database version (Postgres 9.6) was now too old. I needed to create some of sqlalchemy_searchable's functions manually, but I was missing a built-in Postgres function (specifically websearch_to_tsquery, which was added in Postgres 11).
I was using Flask-Migrate/Alembic for migrations. I got help from here and ended up making a blank migration (flask db revision does that) and pasting in the SQL from here. My new migration looked like this:
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = 'AAA'
down_revision = 'BBB'
branch_labels = None
depends_on = None


def upgrade():
    command = """
    CREATE OR REPLACE FUNCTION parse_websearch(config regconfig, search_query text)
    RETURNS tsquery AS $$
    SELECT
        string_agg(
            (
                CASE
                    WHEN position('''' IN words.word) > 0 THEN CONCAT(words.word, ':*')
                    ELSE words.word
                END
            ),
            ' '
        )::tsquery
    FROM (
        SELECT trim(
            regexp_split_to_table(
                websearch_to_tsquery(config, lower(search_query))::text,
                ' '
            )
        ) AS word
    ) AS words
    $$ LANGUAGE SQL IMMUTABLE;

    CREATE OR REPLACE FUNCTION parse_websearch(search_query text)
    RETURNS tsquery AS $$
    SELECT parse_websearch('pg_catalog.simple', search_query);
    $$ LANGUAGE SQL IMMUTABLE;"""
    op.execute(command)


def downgrade():
    op.execute('DROP FUNCTION public.parse_websearch(regconfig, text);')
    op.execute('DROP FUNCTION public.parse_websearch(text);')

Sqlalchemy if table does not exist

I wrote a module which creates an empty database file:
from sqlalchemy import create_engine, MetaData


def create_database():
    engine = create_engine("sqlite:///myexample.db", echo=True)
    metadata = MetaData(engine)
    metadata.create_all()
But in another function, I want to open the myexample.db database and create tables in it if it doesn't already have that table.
E.g., the first table I would subsequently create would be:
Table(Variable_TableName, metadata,
      Column('Id', Integer, primary_key=True, nullable=False),
      Column('Date', Date),
      Column('Volume', Float))
(Since it is initially an empty database, it will have no tables in it, but subsequently I can add more tables to it. That's what I'm trying to say.)
Any suggestions?
I've managed to figure out what I intended to do. I used engine.dialect.has_table(engine, Variable_tableName) to check whether the database already contains the table. If it doesn't, the code proceeds to create the table in the database.
Sample code:
engine = create_engine("sqlite:///myexample.db") # Access the DB Engine
if not engine.dialect.has_table(engine, Variable_tableName): # If table don't exist, Create.
metadata = MetaData(engine)
# Create a table with the appropriate Columns
Table(Variable_tableName, metadata,
Column('Id', Integer, primary_key=True, nullable=False),
Column('Date', Date), Column('Country', String),
Column('Brand', String), Column('Price', Float),
# Implement the creation
metadata.create_all()
This seems to be giving me what i'm looking for.
Note that the 'Base.metadata' documentation states this about create_all:

Conditional by default, will not attempt to recreate tables already present in the target database.

And create_all takes these arguments: create_all(self, bind=None, tables=None, checkfirst=True); according to the documentation, checkfirst:

Defaults to True, don't issue CREATEs for tables already present in the target database.

So if I understand your question correctly, you can just skip the condition.
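A minimal sketch of that behaviour, using an illustrative SQLite file and table name rather than anything from the question:

from sqlalchemy import Column, Date, Float, Integer, MetaData, Table, create_engine

engine = create_engine("sqlite:///myexample.db")
metadata = MetaData()

Table('prices', metadata,
      Column('Id', Integer, primary_key=True, nullable=False),
      Column('Date', Date),
      Column('Volume', Float))

# checkfirst defaults to True: missing tables are created,
# tables that already exist are left untouched.
metadata.create_all(bind=engine)
metadata.create_all(bind=engine)  # safe to call again; nothing is recreated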
The accepted answer prints a warning that engine.dialect.has_table() is only for internal use and not part of the public API. The message suggests this as an alternative, which works for me:
import os
import sqlalchemy

# Set up a connection to a SQLite3 DB
test_db = os.getcwd() + "/test.sqlite"
db_connection_string = "sqlite:///" + test_db
engine = sqlalchemy.create_engine(db_connection_string)

# The recommended way to check for existence
sqlalchemy.inspect(engine).has_table("BOOKS")
See also the SQLAlchemy docs.
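Putting it together, a short sketch of a conditional create using the inspector (the BOOKS table is just an illustrative placeholder):

import sqlalchemy
from sqlalchemy import Column, Integer, MetaData, Table

engine = sqlalchemy.create_engine("sqlite:///test.sqlite")
metadata = MetaData()
books = Table('BOOKS', metadata,
              Column('id', Integer, primary_key=True))

# Only issue the CREATE if the inspector reports the table as missing.
if not sqlalchemy.inspect(engine).has_table('BOOKS'):
    books.create(bind=engine)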
For those who define their tables, among other tables, in a models.tables file: this is a code snippet for finding the class that represents the table we want to create (so later we can reuse the same code to query it). Even together with the if check written above, I still run the creation with checkfirst=True:
ORMTable.__table__.create(bind=engine, checkfirst=True)
models.tables

class TableA(Base):
    ...

class TableB(Base):
    ...

class NewTableC(Base):
    # __tablename__ and primary key omitted for brevity
    id = Column('id', Text)
    name = Column('name', Text)
form
Then in the form action file:
engine = create_engine("sqlite:///myexample.db")
if not engine.dialect.has_table(engine, table_name):
# Added to models.tables the new table I needed ( format Table as written above )
table_models = importlib.import_module('models.tables')
# Grab the class that represents the new table
# table_name = 'NewTableC'
ORMTable = getattr(table_models, table_name)
# checkfirst=True to make sure it doesn't exists
ORMTable.__table__.create(bind=engine, checkfirst=True)
engine.dialect.has_table does not work for me with cx_Oracle; I get AttributeError: 'OracleDialect_cx_oracle' object has no attribute 'default_schema_name'.
I wrote a workaround function:
from sqlalchemy.engine.base import Engine


def orcl_tab_or_view_exists(in_engine: Engine, in_object: str, in_object_name: str) -> bool:
    """Checks if an Oracle table or view exists on the current in_engine connection.

    in_object: 'table' | 'view'
    in_object_name: table_name | view_name
    """
    obj_query = """SELECT {o}_name FROM all_{o}s
                   WHERE owner = SYS_CONTEXT('userenv', 'current_schema')
                   AND {o}_name = '{on}'""".format(o=in_object, on=in_object_name.upper())

    with in_engine.connect() as connection:
        result = connection.execute(obj_query)
        return len(list(result)) > 0
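For reference, calling the workaround might look like this (the connection URL and object name are placeholders, not taken from the answer):

from sqlalchemy import create_engine

engine = create_engine("oracle+cx_oracle://user:password@host:1521/?service_name=SERVICE")
if not orcl_tab_or_view_exists(engine, "table", "my_table"):
    print("my_table does not exist in the current schema")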
This is the code that works for me to create all the tables for the model classes defined on the Base class:
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.ext.declarative import declarative_base

DB_URL = "mysql+mysqldb://<user>:<password>@<host>:<port>/<db_name>"
scoped_engine = create_engine(DB_URL)
Base = declarative_base()


class YourTable(Base):
    __tablename__ = 'your_table'

    id = Column(Integer, primary_key=True)


Base.metadata.create_all(scoped_engine)

SQLAlchemy alembic AmbiguousForeignKeysError for declarative type but not for equivalent non-declarative type

I have the following alembic migration:
revision = '535f7a49839'
down_revision = '46c675c68f4'

from alembic import op
import sqlalchemy as sa
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
from datetime import datetime

Session = sessionmaker()

Base = declarative_base()
metadata = sa.MetaData()

# This table definition works
organisations = sa.Table(
    'organisations',
    metadata,
    sa.Column('id', sa.Integer, primary_key=True),
    sa.Column('creator_id', sa.Integer),
    sa.Column('creator_staff_member_id', sa.Integer),
)

"""
# This doesn't...
class organisations(Base):
    __tablename__ = 'organisations'
    id = sa.Column(sa.Integer, primary_key=True)
    creator_id = sa.Column(sa.Integer)
    creator_staff_member_id = sa.Column(sa.Integer)
"""


def upgrade():
    bind = op.get_bind()
    session = Session(bind=bind)
    session._model_changes = {}  # if you are using Flask-SQLAlchemy, this works around a bug

    print(session.query(organisations).all())
    raise Exception("don't succeed")


def downgrade():
    pass
Now the query session.query(organisations).all() works when I use the imperatively-defined table (the one not commented out). But if I use the declarative version, which as far as I understand should be equivalent, I get an error:
sqlalchemy.exc.AmbiguousForeignKeysError: Could not determine join
condition between parent/child tables on relationship
StaffMember.organisation - there are multiple foreign key paths
linking the tables. Specify the 'foreign_keys' argument, providing a
list of those columns which should be counted as containing a foreign
key reference to the parent table.
Now I understand what this error means: I have two foreign keys from organisations to staff_members in my actual models. But why does alembic care about these, and how does it even know they exist? How does this migration know that something called StaffMember exists? As far as I understand, alembic should only know about the models you explicitly tell it about in the migration.
Turns out the problem was with the Flask-Script setup I was using to call alembic: the command was importing the code that initialises my Flask app, which was itself importing my actual models. Since mapper configuration runs across every mapped class SQLAlchemy knows about, the imported StaffMember model (with its ambiguous foreign keys to organisations) got configured as soon as the migration issued its query.

Get last inserted value from MySQL using SQLAlchemy

I've just run across a fairly vexing problem, and after testing I have found that NONE of the available answers are sufficient.
I have seen various suggestions but none seem to be able to return the last inserted value for an auto_increment field in MySQL.
I have seen examples that mention the use of session.flush() to add the record and then retrieve the id; however, that always seems to return 0.
I have also seen examples that mention the use of session.refresh(), but that raises the following error: InvalidRequestError: Could not refresh instance ''
What I'm trying to do seems insanely simple but I can't seem to figure out the secret.
I'm using the declarative approach.
So, my code looks something like this:
from sqlalchemy import Column, Unicode
from sqlalchemy.dialects.mysql import INTEGER


class Foo(Base):
    __tablename__ = 'tblfoo'
    __table_args__ = {'mysql_engine': 'InnoDB'}

    ModelID = Column(INTEGER(unsigned=True), default=0, primary_key=True, autoincrement=True)
    ModelName = Column(Unicode(255), nullable=True, index=True)
    ModelMemo = Column(Unicode(255), nullable=True)


f = Foo(ModelName='Bar', ModelMemo='Foo')
session.add(f)
session.flush()
At this point, the object f has been pushed to the DB, and has been automatically assigned a unique primary key id. However, I can't seem to find a way to obtain the value to use in some additional operations. I would like to do the following:
my_new_id = f.ModelID
I know I could simply execute another query to lookup the ModelID based on other parameters but I would prefer not to if at all possible.
I would much appreciate any insight into a solution to this problem.
Thanks for the help in advance.
The problem is that you are setting a default for the auto-increment column. When the INSERT runs, the server log shows:
2011-12-21 13:44:26,561 INFO sqlalchemy.engine.base.Engine.0x...1150 INSERT INTO tblfoo (`ModelID`, `ModelName`, `ModelMemo`) VALUES (%s, %s, %s)
2011-12-21 13:44:26,561 INFO sqlalchemy.engine.base.Engine.0x...1150 (0, 'Bar', 'Foo')
ID : 0
So the output is 0, which is the default value; it gets passed explicitly because you set a default on the autoincrement column. If I run the same code without the default, it gives the correct output.
Please try this code
from sqlalchemy import create_engine
engine = create_engine('mysql://test:test#localhost/test1', echo=True)
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
from sqlalchemy.orm import sessionmaker
Session = sessionmaker(bind=engine)
session = Session()
from sqlalchemy import Column, Integer, Unicode
class Foo(Base):
__tablename__ = 'tblfoo'
__table_args__ = {'mysql_engine':'InnoDB'}
ModelID = Column(Integer, primary_key=True, autoincrement=True)
ModelName = Column(Unicode(255), nullable=True, index=True)
ModelMemo = Column(Unicode(255), nullable=True)
Base.metadata.create_all(engine)
f = Foo(ModelName='Bar', ModelMemo='Foo')
session.add(f)
session.flush()
print "ID :", f.ModelID
Try using session.commit() instead of session.flush(). You can then use f.ModelID.
Not sure why the flagged answer worked for you, but in my case it did not actually insert the row into the table; I needed to call commit() at the end. So the last few lines of code are:

f = Foo(ModelName='Bar', ModelMemo='Foo')
session.add(f)
session.flush()
print("ID:", f.ModelID)
session.commit()
