How to set ON CONFLICT? - python

In SQLAlchemy ORM with SQLite, how do I make a table definition with something like a raw UNIQUE(fld) ON CONFLICT REPLACE?
It should be the equivalent of a raw query like:
CREATE TABLE tbl (fld TEXT UNIQUE, UNIQUE(fld) ON CONFLICT IGNORE)
There is INSERT … ON DUPLICATE KEY UPDATE for MySQL, but that is a statement clause, not part of the table definition.

SQLAlchemy has supported ON CONFLICT in SQLite constraints since version 1.3.
A constraint can be applied inline in a column definition:
import sqlalchemy as sa

metadata = sa.MetaData()

tbl1 = sa.Table(
    't1',
    metadata,
    sa.Column('id', sa.Integer, primary_key=True),
    sa.Column(
        'name', sa.String, unique=True, sqlite_on_conflict_unique='REPLACE'
    ),
    sa.Column('colour', sa.String),
)
generating this DDL
CREATE TABLE t1 (
    id INTEGER NOT NULL,
    name VARCHAR,
    colour VARCHAR,
    PRIMARY KEY (id),
    UNIQUE (name) ON CONFLICT REPLACE
)
Or as a separate constraint:
tbl2 = sa.Table(
    't2',
    metadata,
    sa.Column('id', sa.Integer, primary_key=True),
    sa.Column('name', sa.String),
    sa.Column('colour', sa.String),
    sa.Column('size', sa.Integer),
    sa.UniqueConstraint('name', 'colour', sqlite_on_conflict='IGNORE'),
)
generating
CREATE TABLE t2 (
    id INTEGER NOT NULL,
    name VARCHAR,
    colour VARCHAR,
    size INTEGER,
    PRIMARY KEY (id),
    UNIQUE (name, colour) ON CONFLICT IGNORE
)
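The effect of REPLACE can be checked directly. Here is a minimal sketch (SQLAlchemy 1.4+ calling style, reusing tbl1 and metadata from above with an in-memory SQLite engine): inserting a second row with a duplicate name silently replaces the first.

engine = sa.create_engine('sqlite://')
metadata.create_all(engine)

with engine.begin() as conn:
    conn.execute(tbl1.insert(), {'name': 'x', 'colour': 'red'})
    # same name again: ON CONFLICT REPLACE deletes the red row and keeps this one
    conn.execute(tbl1.insert(), {'name': 'x', 'colour': 'blue'})
    print(conn.execute(sa.select(tbl1)).all())  # [(2, 'x', 'blue')]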

Related

how to remove a foreign key constraint using flask and alembic

I am using Flask with Alembic, and I have the two tables below, linked by a foreign key constraint:
table_one = Table("table_one", meta.Base.metadata,
    Column("id", BigInteger, primary_key=True),
    Column("filename", VARCHAR(65535)),
    Column("mission", VARCHAR(65535)),
)

table_two = Table("table_two", meta.Base.metadata,
    Column("id", BigInteger, primary_key=True),
    Column("file_id", BigInteger, ForeignKey("table_one.id")),
    Column("username", ArrowType(timezone=True)),
)
I am trying to get rid of table_one with the Alembic revision below:
def upgrade():
    op.drop_table('table_one')
    op.drop_constraint('table_two_id_fkey', 'table_two', type_='foreignkey')
    op.drop_column('table_two', 'file_id')
    op.drop_column('table_two', 'id')

def downgrade():
    op.add_column('table_two', sa.Column('id', sa.BIGINT(), autoincrement=True, nullable=False))
    op.add_column('table_two', sa.Column('file_id', sa.BIGINT(), autoincrement=False, nullable=True))
    op.create_foreign_key('table_two_file_id_fkey', 'table_two', 'table_one', ['file_id'], ['id'])
    op.create_table('table_one',
        sa.Column('id', sa.BIGINT(), autoincrement=True, nullable=False),
        sa.Column('filename', sa.VARCHAR(length=65535), autoincrement=False, nullable=True),
        sa.Column('mission', sa.VARCHAR(length=65535), autoincrement=False, nullable=True),
        sa.PrimaryKeyConstraint('id', name='table_one_pkey')
    )
but unfortunately there seems to be an issue with the cascade, and I am facing the error below:
psycopg2.errors.DependentObjectsStillExist: cannot drop table table_one because other objects depend on it
DETAIL:  constraint table_two_file_id_fkey on table table_two depends on table table_one
HINT:  Use DROP ... CASCADE to drop the dependent objects too.
Does anyone have an idea how to solve this issue?
In case anyone is trying to drop a foreign key, as the question title says:
You can remove a foreign key using the drop_constraint() function in Alembic:
op.drop_constraint(constraint_name="FK_<target>_<source>", table_name="<source>")
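For the specific error above, the upgrade() just needs reordering: the foreign key (and the columns using it) must be dropped before the table it references. A sketch of the reordered migration, using the constraint name reported in the error message (table_two_file_id_fkey):

def upgrade():
    # drop the dependent foreign key and columns first...
    op.drop_constraint('table_two_file_id_fkey', 'table_two', type_='foreignkey')
    op.drop_column('table_two', 'file_id')
    op.drop_column('table_two', 'id')
    # ...then the referenced table can be dropped without CASCADE
    op.drop_table('table_one')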

SQLAlchemy / Alembic wanting to drop & re-create index for foreign key

Total newbie to Alembic, SQLAlchemy, and Python here. I've gotten to the point where Alembic is comparing existing objects in the database against the declarative classes I've made, and there's one pesky index (for a foreign key) that Alembic refuses to leave in place in my initial migration.
I'm completely at a loss as to why the migration is continually trying to drop and re-create this index, which, if I leave in the migration I'll wager is going to fail anyway. Plus, if I don't reconcile the class to the database this will likely come up every time I auto-generate migrations.
Here's the pertinent part of what is in the upgrade method:
op.drop_index(
    'vndr_prod_tp_cat_category_fk_idx',
    table_name='vendor_product_types_magento_categories'
)
In the downgrade method:
op.create_index(
    'vndr_prod_tp_cat_category_fk_idx',
    'vendor_product_types_magento_categories',
    ['magento_category_id'],
    unique=False
)
...here's the DDL for the table as it exists in MySQL:
CREATE TABLE `vendor_product_types_magento_categories` (
  `id` bigint(20) unsigned NOT NULL AUTO_INCREMENT,
  `vendor_product_type_id` bigint(20) unsigned NOT NULL,
  `magento_category_id` bigint(20) unsigned NOT NULL,
  `sequence` tinyint(3) unsigned NOT NULL,
  `created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
  `updated_at` timestamp NULL DEFAULT NULL ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (`id`),
  UNIQUE KEY `vendor_product_types_magento_categories_uq` (`vendor_product_type_id`,`magento_category_id`,`sequence`),
  KEY `vndr_prod_tp_cat_category_fk_idx` (`magento_category_id`),
  CONSTRAINT `vndr_prod_tp_cat_magento_category_fk` FOREIGN KEY (`magento_category_id`) REFERENCES `magento_categories` (`id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
  CONSTRAINT `vndr_prod_tp_cat_product_type_fk` FOREIGN KEY (`vendor_product_type_id`) REFERENCES `vendor_product_types` (`id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB AUTO_INCREMENT=101 DEFAULT CHARSET=utf8
...and here's the class I wrote:
from sqlalchemy import Column, Integer, UniqueConstraint, ForeignKeyConstraint, Index
from sqlalchemy.dialects.mysql import TIMESTAMP
from sqlalchemy.sql import text

from .base import Base


class VendorProductTypesMagentoCategories(Base):
    __tablename__ = 'vendor_product_types_magento_categories'

    id = Column(Integer, primary_key=True)
    vendor_product_type_id = Column(
        Integer,
        nullable=False
    )
    magento_category_id = Column(
        Integer,
        nullable=False
    )
    sequence = Column(Integer, nullable=False)
    created_at = Column(TIMESTAMP, server_default=text('CURRENT_TIMESTAMP'), nullable=False)
    updated_at = Column(
        TIMESTAMP,
        server_default=text('NULL ON UPDATE CURRENT_TIMESTAMP'),
        nullable=True
    )

    __table_args__ = (
        UniqueConstraint(
            'vendor_product_type_id',
            'magento_category_id',
            'sequence',
            name='vendor_product_types_magento_categories_uq'
        ),
        ForeignKeyConstraint(
            ('vendor_product_type_id',),
            ('vendor_product_types.id',),
            name='vndr_prod_tp_cat_product_type_fk'
        ),
        ForeignKeyConstraint(
            ('magento_category_id',),
            ('magento_categories.id',),
            name='vndr_prod_tp_cat_category_fk_idx'
        ),
    )

    def __repr__(self):
        return '<VendorProductTypesMagentoCategories (id={}, vendor_name={}, product_type={})>'.format(
            self.id,
            self.vendor_name,
            self.product_type
        )
You define your magento_categories foreign key in your Python code as
ForeignKeyConstraint(
    ('magento_category_id',),
    ('magento_categories.id',),
    name='vndr_prod_tp_cat_category_fk_idx'
)
Here you use vndr_prod_tp_cat_category_fk_idx as the name of the foreign key constraint, not as the name of the underlying index, which explains why SQLAlchemy wants to drop the index.
You should name the foreign key vndr_prod_tp_cat_magento_category_fk, matching the existing DDL, and add a separate Index() construct named vndr_prod_tp_cat_category_fk_idx to create the index.
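A sketch of the corrected __table_args__, with the constraint and index names taken from the existing DDL (Index is already imported in the class above):

__table_args__ = (
    UniqueConstraint(
        'vendor_product_type_id',
        'magento_category_id',
        'sequence',
        name='vendor_product_types_magento_categories_uq'
    ),
    ForeignKeyConstraint(
        ('vendor_product_type_id',),
        ('vendor_product_types.id',),
        name='vndr_prod_tp_cat_product_type_fk'
    ),
    ForeignKeyConstraint(
        ('magento_category_id',),
        ('magento_categories.id',),
        name='vndr_prod_tp_cat_magento_category_fk'
    ),
    # index named separately, matching KEY `vndr_prod_tp_cat_category_fk_idx` in the DDL
    Index('vndr_prod_tp_cat_category_fk_idx', 'magento_category_id'),
)

With the names reconciled this way, autogenerate should no longer see a difference between the model and the database.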

How can I print schema / table definitions in SQLAlchemy

How can I print out the schema of all tables using SQLAlchemy?
This is how I do it using SQLite3: I run a query against sqlite_master to print out the schema of all tables in the database:
import sqlite3

conn = sqlite3.connect("example.db")
cur = conn.cursor()
rs = cur.execute(
    """
    SELECT name, sql FROM sqlite_master
    WHERE type='table'
    ORDER BY name;
    """)
for name, schema, *args in rs:
    print(name)
    print(schema)
    print()
With output that can look like this:
albums
CREATE TABLE "albums"
(
    [AlbumId] INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
    [Title] NVARCHAR(160) NOT NULL,
    [ArtistId] INTEGER NOT NULL,
    FOREIGN KEY ([ArtistId]) REFERENCES "artists" ([ArtistId])
        ON DELETE NO ACTION ON UPDATE NO ACTION
)

artists
CREATE TABLE "artists"
(
    [ArtistId] INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
    [Name] NVARCHAR(120)
)
Is there a way to do it with pure SQLAlchemy API calls, something better than this?
metadata = sqlalchemy.MetaData()
metadata.reflect(engine)
insp = sqlalchemy.inspect(engine)

for table_name in metadata.tables:
    print(table_name)
    for column in insp.get_columns(table_name):
        for name, value in column.items():
            print(' ', end='')
            if value:
                field = name if value in [True, 'auto'] else value
                print(field, end=' ')
        print()
Output:
albums
    AlbumId INTEGER autoincrement primary_key
    Title NVARCHAR(160) autoincrement
    ArtistId INTEGER autoincrement
artists
    ArtistId INTEGER autoincrement primary_key
    Name NVARCHAR(120) nullable autoincrement
This bit in the SQLAlchemy docs may help: they suggest doing this:
def dump(sql, *multiparams, **params):
    print(sql.compile(dialect=engine.dialect))

engine = create_engine('postgresql://', strategy='mock', executor=dump)
metadata.create_all(engine, checkfirst=False)
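Note that the mock strategy was removed in SQLAlchemy 1.4; on 1.4+ the equivalent sketch uses create_mock_engine instead:

from sqlalchemy import create_mock_engine

def dump(sql, *multiparams, **params):
    # print each DDL statement compiled for the target dialect
    print(sql.compile(dialect=engine.dialect))

engine = create_mock_engine('postgresql://', dump)
metadata.create_all(engine, checkfirst=False)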

Add autoincrement column that is not primary key Alembic

I want to add an autoincrement column that is not a primary key to an existing MySQL database.
The command issued on the server required for this operation is the following:
ALTER TABLE `mytable` ADD `id` INT UNIQUE NOT NULL AUTO_INCREMENT FIRST
The issue I face is how to replicate this table change through an Alembic migration. I have tried:
from alembic import op
import sqlalchemy as sa

def upgrade():
    op.add_column('mytable', sa.Column('id', sa.INTEGER(),
                                       nullable=False, autoincrement=True))
but when I try to insert a row with the following command:
INSERT INTO `mytable` (`col1`, `col2`) VALUES ('foo', 'bar');
where col1 and col2 are non-nullable columns, I expect the table to generate the id automatically. Instead I get:
ERROR 1364 (HY000): Field 'id' doesn't have a default value
If I inspect the SQL autogenerated by Alembic using the following command:
alembic upgrade 'hash-of-revision' --sql
it spits out, for the given revision:
ALTER TABLE mytable ADD COLUMN id INTEGER NOT NULL;
This means that either Alembic or SQLAlchemy is ignoring the autoincrement field when generating the SQL for the migration.
Is there a way I can solve this? Or can I establish a migration based on a custom SQL command?
First we add the column and allow it to have null values:

from sqlalchemy import Column
from sqlalchemy.dialects.mysql import BIGINT

op.add_column('abc', Column('id', BIGINT(unsigned=True), comment='This column stores the type of phrase'))

Then we create the primary key:

op.create_primary_key('abc_pk', table_name='abc', columns=['id'])

I'm not sure why it allows me to add a primary key on an empty column, but I guess it's because it runs inside a transaction block.
Then we alter the column to make it autoincrementing:

op.alter_column(
    table_name='abc',
    column_name='id',
    existing_type=BIGINT(unsigned=True),
    autoincrement=True,
    existing_autoincrement=True,
    nullable=False,
)
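As for the final part of the question: yes, a migration can run a custom SQL command directly via Alembic's op.execute(), which sidesteps the autogeneration issue entirely. A minimal sketch using the exact ALTER statement from the question (the downgrade here assumes simply dropping the column):

def upgrade():
    # run the raw MySQL statement verbatim
    op.execute(
        "ALTER TABLE `mytable` ADD `id` INT UNIQUE NOT NULL AUTO_INCREMENT FIRST"
    )

def downgrade():
    op.drop_column('mytable', 'id')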
As of SQLAlchemy 1.4+ and Alembic 1.9 you can use the Identity type, which, according to the docs, supersedes the Serial type.
This declarative ORM model:
class ProductOption(Base):
    __tablename__: str = 'product_options'

    id: Mapped[int] = mapped_column(Integer, server_default=Identity(start=1, cycle=True), primary_key=True)
    uuid: Mapped[UUID] = mapped_column(UUID, nullable=False, unique=True)
    name: Mapped[str] = mapped_column(String(50), nullable=False)
    price: Mapped[Decimal] = mapped_column(Numeric(16, 4), nullable=False)
    cost: Mapped[Decimal] = mapped_column(Numeric(16, 4), nullable=False)
    unit: Mapped[str] = mapped_column(String(8), nullable=False)
    duration: Mapped[int] = mapped_column(Integer)
results in the following Alembic --autogenerate migration:
op.create_table(
    "product_options",
    sa.Column(
        "id",
        sa.Integer(),
        sa.Identity(always=False, start=1, cycle=True),
        nullable=False,
    ),
    sa.Column("uuid", sa.UUID(), nullable=False),
    sa.Column("name", sa.String(length=50), nullable=False),
    sa.Column("price", sa.Numeric(precision=16, scale=4), nullable=False),
    sa.Column("cost", sa.Numeric(precision=16, scale=4), nullable=False),
    sa.Column("unit", sa.String(length=8), nullable=False),
    sa.Column("duration", sa.Integer(), nullable=False),
    sa.PrimaryKeyConstraint("id"),
    sa.UniqueConstraint("uuid"),
)
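On PostgreSQL (assumed here, given the UUID column), this migration emits DDL along these lines, with Identity(always=False) rendering as GENERATED BY DEFAULT:

CREATE TABLE product_options (
    id INTEGER GENERATED BY DEFAULT AS IDENTITY (START WITH 1 CYCLE) NOT NULL,
    uuid UUID NOT NULL,
    name VARCHAR(50) NOT NULL,
    price NUMERIC(16, 4) NOT NULL,
    cost NUMERIC(16, 4) NOT NULL,
    unit VARCHAR(8) NOT NULL,
    duration INTEGER NOT NULL,
    PRIMARY KEY (id),
    UNIQUE (uuid)
)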

Cannot map ForeignKey due to dual Primary Keys

I'm trying to create tables on the fly from existing data... however, the table I need has dual primary keys, and I can't find how to satisfy the restrictions.
I.e. I start with the following two tables...
self.DDB_PAT_BASE = Table('DDB_PAT_BASE', METADATA,
    Column('PATID', INTEGER(), primary_key=True),
    Column('PATDB', INTEGER(), primary_key=True),
    Column('FAMILYID', INTEGER()),
)

self.DDB_ERX_MEDICATION_BASE = Table('DDB_ERX_MEDICATION_BASE', METADATA,
    Column('ErxID', INTEGER(), primary_key=True),
    Column('ErxGuid', VARCHAR(length=36)),
    Column('LastDownload', DATETIME()),
    Column('LastUpload', DATETIME()),
    Column('Source', INTEGER()),
)
When I try the following, it works...

t = Table('testtable', METADATA,
    Column('ErxID', INTEGER(), ForeignKey('DDB_ERX_MEDICATION_BASE.ErxID')),
)
t.create()

However, both of the following give me the error...

t = Table('testtable', METADATA,
    Column('PATID', INTEGER(), ForeignKey('DDB_PAT_BASE.PATID')),
)
t.create()

t = Table('testtable', METADATA,
    Column('PATID', INTEGER(), ForeignKey('DDB_PAT_BASE.PATID')),
    Column('PATDB', INTEGER(), ForeignKey('DDB_PAT_BASE.PATDB')),
)
t.create()
sqlalchemy.exc.OperationalError: (pymssql.OperationalError) (1776, "There are no primary or candidate keys in the referenced table 'DDB_PAT_BASE' that match the referencing column list in the foreign key 'FK__testtabl__PATID__3FD3A585'.DB-Lib error message 20018, severity 16:\nGeneral SQL Server error: Check messages from the SQL Server\nDB-Lib error message 20018, severity 16:\nGeneral SQL Server error: Check messages from the SQL Server\n") [SQL: '\nCREATE TABLE [testtable] (\n\t[PATID] INTEGER NULL, \n\tFOREIGN KEY([PATID]) REFERENCES [DDB_PAT_BASE] ([PATID])\n)\n\n']
The table you are pointing to has a composite primary key, not multiple primary keys. Hence, you need a single composite foreign key whose column lists pair positionally (PATID references DDB_PAT_BASE.PATID, and PATDB references DDB_PAT_BASE.PATDB), not two separate foreign keys each pointing at half of the composite key:
t = Table('testtable', METADATA,
    Column('PATID', INTEGER()),
    Column('PATDB', INTEGER()),
    ForeignKeyConstraint(['PATID', 'PATDB'], ['DDB_PAT_BASE.PATID', 'DDB_PAT_BASE.PATDB']),
)
t.create()
