I initially defined one of my SQLAlchemy models as:
class StreetSegment(db.Model):
    id = db.Column(db.Integer, autoincrement=True)    # surrogate ID
    seg_id = db.Column(db.Integer, primary_key=True)  # assigned in another system; don't increment
not realizing that seg_id would become a SERIAL field in my Postgres database. What I really wanted was an INTEGER field with a PK constraint (no autoincrement). I turned off autoincrement like so:
class StreetSegment(db.Model):
    id = db.Column(db.Integer, autoincrement=True)
    seg_id = db.Column(db.Integer, primary_key=True, autoincrement=False)  # <--- see here
but the change isn't reflected when I run migrate in Alembic. I've also tried writing a custom migration with the following operations:
def upgrade():
    op.alter_column('street_segment', 'seg_id', autoincrement=False)

def downgrade():
    op.alter_column('street_segment', 'seg_id', autoincrement=True)
but that gives the warning "autoincrement and existing_autoincrement only make sense for MySQL". So my question is: is there any way of using Alembic to convert a SERIAL column to a plain INTEGER in Postgres?
Just set the type explicitly to the one you want, using the type_ argument (note the trailing underscore). This should work:

from sqlalchemy import Integer

op.alter_column('street_segment', 'seg_id', type_=Integer)
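In Postgres a SERIAL column is just an INTEGER whose default comes from a sequence, so the operative change is dropping that default. A fuller migration sketch (assuming Postgres's default serial sequence name, street_segment_seg_id_seq):

import sqlalchemy as sa
from alembic import op

def upgrade():
    # The type change is effectively a no-op (SERIAL is already INTEGER);
    # dropping the server default is what stops the autoincrement.
    op.alter_column('street_segment', 'seg_id',
                    type_=sa.Integer(),
                    server_default=None)
    op.execute('DROP SEQUENCE IF EXISTS street_segment_seg_id_seq')

def downgrade():
    op.execute('CREATE SEQUENCE street_segment_seg_id_seq OWNED BY street_segment.seg_id')
    op.alter_column('street_segment', 'seg_id',
                    server_default=sa.text("nextval('street_segment_seg_id_seq')"))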
I am inserting data into a remote Postgres db. I recently added a new column to one of the tables, and now I'm getting this error when inserting new data into the table with the new column:
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.UndefinedColumn) column "build_name" of relation "DryRun" does not exist
LINE 1: ...tance_id, profile_name, host_box, default_config, build_name...
I tried what I found online, such as setting nullable=True or giving the column a default value (default='Unknown'), but I'm still hitting the same error.
Here's my DryRun model.
class DryRun(Base):
    __tablename__ = 'DryRun'

    id = Column(Integer, primary_key=True, autoincrement=True)
    instance_id = Column(Integer, nullable=True)
    profile_name = Column(String, nullable=False)
    host_box = Column(String, nullable=False)
    default_config = Column(JSON, nullable=False)
    build_name = Column(String, nullable=True)
This was originally posted as a comment.
The issue is that the model now differs from what's in the database: adding the attribute to the class doesn't alter the existing table. You can run an ALTER TABLE command to add the column, either within the command-line tool psql or using the pgAdmin GUI. And if you're anticipating more such changes, consider reading about migrations, which are supported by other modules; for instance, alembic: https://alembic.sqlalchemy.org/en/latest/
For just one column addition, migrations most likely don't make sense though. Here's an example of an ALTER TABLE command for your use-case:
# run this command to get to a text-based interface to your Postgres database
$ psql
# alter table command for your use-case:
ALTER TABLE "DryRun" ADD COLUMN build_name VARCHAR;
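If you do adopt Alembic later, the migration equivalent of that ALTER TABLE would look roughly like this (a sketch; revision boilerplate omitted):

import sqlalchemy as sa
from alembic import op

def upgrade():
    op.add_column('DryRun', sa.Column('build_name', sa.String(), nullable=True))

def downgrade():
    op.drop_column('DryRun', 'build_name')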
I'm using Flask-SQLAlchemy to create some tables for a MySQL database, and I have a problem when setting some foreign keys. The tables are:
class Specific(db.Model):
    code = db.Column(db.String(30), primary_key=True, index=True)
    serial_number = db.Column(db.String(30), primary_key=True, index=True)
    equipment_id = db.Column(db.String(30), db.ForeignKey('general.id'), index=True)

    # setting relationship fields
    tickets_equip_code = db.relationship('Ticket', backref='specific_code', lazy='dynamic')
    tickets_serial_number = db.relationship('Ticket', backref='specific_serial', lazy='dynamic')
class Ticket(db.Model):
    id = db.Column(db.String(30), primary_key=True)
    solicitant = db.Column(db.String(30), db.ForeignKey('user.id'), index=True, unique=False)

    # setting the foreign keys
    equipment_code = db.Column(db.String(30), db.ForeignKey('specific.code'), nullable=False)
    equipment_serial_number = db.Column(db.String(30), db.ForeignKey('specific.serial_number'), nullable=False)
The problem comes when I try to create all the tables using db.create_all() in the console; I get the following error:
pymysql.err.OperationalError 1822 "Failed to add the foreign key constraint. Missing index for constraint 'fk_ticket_serial_number' in the referenced table 'specific'"
I have other db models with FKs and they seem to work well; these two models are the ones causing the trouble, but I can't figure it out. I've even tried running the script with only one of the two FKs in the mentioned models, and it runs fine. I have also tried setting both primary keys in the Specific model using __table_args__, and I tried setting the FK constraints in Ticket manually using __table_args__ (that's why the name 'fk_ticket_serial_number' appears), but it had no effect.
Do you guys have any ideas what might be going wrong?
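For reference, MySQL error 1822 usually means the referenced columns are not the leading columns of any index: serial_number is only the second column of Specific's composite primary key. One common fix is a single composite ForeignKeyConstraint covering both PK columns instead of two independent column-level FKs (a sketch, not necessarily the asker's final code):

class Ticket(db.Model):
    id = db.Column(db.String(30), primary_key=True)
    solicitant = db.Column(db.String(30), db.ForeignKey('user.id'), index=True)
    equipment_code = db.Column(db.String(30), nullable=False)
    equipment_serial_number = db.Column(db.String(30), nullable=False)

    __table_args__ = (
        # one FK spanning both columns, matching Specific's composite PK
        db.ForeignKeyConstraint(
            ['equipment_code', 'equipment_serial_number'],
            ['specific.code', 'specific.serial_number'],
            name='fk_ticket_specific',
        ),
    )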
I'm trying to create objects in a Postgres db.
I'm using this approach https://websauna.org/docs/narrative/modelling/models.html#uuid-primary-keys
import sqlalchemy as sa
from sqlalchemy_utils import UUIDType  # column type from the sqlalchemy-utils package

class Role(Base):
    __tablename__ = 'role'

    # Pass `binary=False` to fall back to CHAR instead of BINARY
    id = sa.Column(UUIDType(binary=False), primary_key=True)
But when I create an object:
user_role = Role(name='User')
db.session.add(user_role)
db.session.commit()
I have the following error:
sqlalchemy.exc.IntegrityError: (psycopg2.IntegrityError) null value in column "id" violates not-null constraint
Looks like I didn't provide any ID. So how can I make the database auto-generate it, or generate one on my own?
You appear to be using this code. It's missing a default for the column. You're emulating this SQL:
id UUID PRIMARY KEY DEFAULT uuid_generate_v4()
But you've already linked to the correct code.
import sqlalchemy
from sqlalchemy.dialects.postgresql import UUID

id = Column(UUID(as_uuid=True),
            primary_key=True,
            server_default=sqlalchemy.text("uuid_generate_v4()"))
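Note that uuid_generate_v4() comes from Postgres's uuid-ossp extension, which has to be enabled once per database, e.g. from a migration (a sketch):

from alembic import op

def upgrade():
    op.execute('CREATE EXTENSION IF NOT EXISTS "uuid-ossp"')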
Alternatively, if you don't want to load a Postgres UUID extension, you can generate the UUIDs in Python.
from uuid import uuid4

id = Column(UUID(as_uuid=True),
            primary_key=True,
            default=uuid4)
You could use the uuid module and just set a column default. For example:
from uuid import uuid4
from sqlalchemy import Column, String

class Role(Base):
    __tablename__ = 'role'

    # str() so a plain string, not a UUID object, is bound to the String column
    id = Column(String, primary_key=True, default=lambda: str(uuid4()))
What I actually ended up with is:
import uuid

class SomeClass(db.Model):
    __tablename__ = 'someclass'

    id = db.Column(UUID(as_uuid=True),
                   primary_key=True,
                   default=lambda: uuid.uuid4().hex)
import uuid

# uuid4() generates a random UUID
myId = uuid.uuid4()
print(myId)
When I'm trying to delete a category instance identified by 'id', together with its category_image and file instances, like this:
c = Category.query.get(id)
for ci in c.images:
db.session.delete(ci)
db.session.flush()
for ci in c.images:
db.session.delete(ci.file)
db.session.flush() # if I put db.session.commit() here, all is fine
db.session.delete(c)
db.session.commit()
I'm getting an AssertionError: Dependency rule tried to blank-out primary key column 'category_image.id_category' on instance ''. But when I replace the flush that comes after deleting category_image.file with a commit, it works. I noticed this after I changed the CategoryImage table to an intermediary (association) table. Before that change it had its own, non-composite PK, and everything worked properly. Here are my current model definitions.
class File(db.Model):
    __tablename__ = 'file'

    id_file = Column(Integer, Sequence('seq_id_file'), primary_key=True, nullable=False)
    name = Column(Text, nullable=False)
    path = Column(Text, nullable=False, unique=True)
    protected = Column(Boolean, nullable=False, default=False)

class Category(db.Model):
    __tablename__ = 'category'

    id_category = Column(Integer, Sequence('seq_id_category'), primary_key=True, nullable=False)
    name = Column(UnicodeText, nullable=False, unique=True)
    images = relationship('CategoryImage', backref='images')

class CategoryImage(db.Model):
    __tablename__ = 'category_image'
    __table_args__ = (
        PrimaryKeyConstraint('id_category', 'id_file', name='seq_id_category_image'),
    )

    id_category = Column(Integer, ForeignKey(Category.id_category), nullable=False)
    id_file = Column(Integer, ForeignKey(File.id_file), nullable=False)
    id_size_type = Column(Integer, nullable=True)  # value truncated in the original; True assumed
    file = relationship(File)
Now I'm trying to figure out what just happened. Correct me if I'm using things wrong.
I just noticed that I have to delete objects related through the intermediate model in the same order as the columns were declared in __table_args__, PrimaryKeyConstraint('id_category', 'id_file'). So when I perform it this way: session.delete(category_image), session.delete(category), session.delete(file), and then commit (or flush everywhere before the commit), all works fine. If anyone spots something about this in the SQLAlchemy docs, let me know.
Here is what is happening. Once you call session.delete() on an object, it is marked for deletion but not yet deleted from the db. When you then call flush(), the db still contains the object (nothing has been committed yet) while the session has it marked as deleted, so the two become inconsistent. To make the delete smooth, wrap all your delete operations in a single transaction: delete everything from the session first, then call db.session.commit() once so the session ends up consistent with the db.
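A minimal sketch of that single-commit approach, reusing the models from the question:

c = Category.query.get(id)
for ci in c.images:
    db.session.delete(ci)       # mark the association row for deletion
    db.session.delete(ci.file)  # and its file, in the same unit of work
db.session.delete(c)
db.session.commit()             # one commit; no intermediate flushes needed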
Hope it helps.
I'm using SQLAlchemy + alembic to manage my database. I had a string field which was 10 characters long, and later found out that it has to be 20. So I updated the model definition.
class Foo(db.Model):
    __tablename__ = 'foos'

    id = db.Column(db.Integer, primary_key=True)
    foo_id = db.Column(db.Integer, db.ForeignKey('users.id'))
    name = db.Column(db.String(80))
When I run alembic revision --autogenerate, this change was not detected. I did read the documentation and suspected that this might not be supported. How do I manage such changes to the DB gracefully?
You need to enable optional column type checking.
See the Alembic autogenerate documentation for notes on what is checked by default. In env.py:
context.configure(
    # ...
    compare_type=True,
)
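With compare_type enabled, autogenerate should then emit an alter_column for the widened field, roughly like this (a sketch; lengths follow the question's 10-to-20 example):

import sqlalchemy as sa
from alembic import op

def upgrade():
    op.alter_column('foos', 'name',
                    existing_type=sa.String(length=10),
                    type_=sa.String(length=20),
                    existing_nullable=True)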