MySQL DB migration: change of string length - python

I'm using SQLAlchemy + Alembic to manage my database. I had a string field that was 10 characters long and later found out it has to be 20, so I updated the model definition.
class Foo(db.Model):
    __tablename__ = 'foos'
    id = db.Column(db.Integer, primary_key=True)
    foo_id = db.Column(db.Integer, db.ForeignKey('users.id'))
    name = db.Column(db.String(80))
When I ran alembic revision --autogenerate, this change was not detected. I did read the documentation and suspected that this might not be supported. How do I manage such changes in the DB gracefully?

You need to enable Alembic's optional column type comparison; by default, autogenerate does not detect type changes such as a widened string. See the Alembic autogenerate documentation for notes on what is checked by default:
context.configure(
    # ...
    compare_type=True,
)
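For reference, with compare_type enabled, a fresh alembic revision --autogenerate should pick up the width change and emit an alter_column operation. A minimal sketch of what the generated migration body might look like, assuming the column went from String(10) to String(20) (the table and column names mirror the model above; adjust to your schema):

from alembic import op
import sqlalchemy as sa

def upgrade():
    op.alter_column('foos', 'name',
                    existing_type=sa.String(length=10),
                    type_=sa.String(length=20))

def downgrade():
    op.alter_column('foos', 'name',
                    existing_type=sa.String(length=20),
                    type_=sa.String(length=10))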

Related

NameError in flask db migrate - Table not defined

I ran flask db upgrade to create the Todos table. Then I added a new table, established a relationship with the existing table, and also added a new field to the existing table.
I would expect flask db migrate to record the differences (adding the new table TodoList and the new field in Todo); however, it raises a NameError - table not defined.
class TodoList(db.Model):
    __tablename__ = 'todolists'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(), nullable=False)
    todos.db.relationship('Todo', backref='list', lazy=True)

class Todo(db.Model):
    __tablename__ = 'todos'
    id = db.Column(db.Integer, primary_key=True)
    description = db.Column(db.String(), nullable=False)
    completed = db.Column(db.Boolean, nullable=False, default=False)
    list_id = db.Column(db.Integer, db.ForeignKey(todolists.id), nullable=True)
Error details:

File "C:\Users\rg\anaconda3\lib\site-packages\flask\_compat.py", line 39, in reraise
    raise value
File "C:\Users\rg\class_demos\Todoapp\app.py", line 26, in <module>
    class Todo(db.Model):
File "C:\Users\rg\class_demos\Todoapp\app.py", line 32, in Todo
    list_id = db.Column(db.Integer, db.ForeignKey(todolists.id), nullable=True)
NameError: name 'todolists' is not defined
Solutions tried:
- The Flask app is set to the current Python module.
- Tried swapping the order of the two table definitions.
- Searched extensively; this error seems to be common, but I couldn't find a case matching the one above.

Any help is much appreciated.
The error stems from how you are creating the relationship between the Todo and TodoList models.
list_id needs to be defined as a foreign key to todolists.id, meaning it references an id value from the todolists table. You have used todolists.id as a bare Python name, which is not defined at that point; the target must be given as a string. Define list_id as:
list_id = db.Column(db.Integer, db.ForeignKey("todolists.id"))
To create the relationship on the TodoList side, you need to assign the field todos with db.relationship. This is not an actual database column but a high-level view of the relationship between TodoList and Todo, which is why it doesn't appear in the database diagram. Create the field as:
todos = db.relationship('Todo', backref='list', lazy=True)
My bad. Missed the quotes for todolists.id
list_id = db.Column(db.Integer, db.ForeignKey("todolists.id"), nullable=True)
This fixed the issue :)
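For completeness, with both fixes applied, the working models from this thread look roughly like this:

class TodoList(db.Model):
    __tablename__ = 'todolists'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(), nullable=False)
    todos = db.relationship('Todo', backref='list', lazy=True)

class Todo(db.Model):
    __tablename__ = 'todos'
    id = db.Column(db.Integer, primary_key=True)
    description = db.Column(db.String(), nullable=False)
    completed = db.Column(db.Boolean, nullable=False, default=False)
    list_id = db.Column(db.Integer, db.ForeignKey('todolists.id'), nullable=True)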

sqlalchemy added new column and `UndefinedColumn` Error

I am inserting data into a remote Postgres DB and recently added a new column to one of the tables. Now I'm getting this error when inserting new data into the table with the new column:
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.UndefinedColumn) column "build_name" of relation "DryRun" does not exist
LINE 1: ...tance_id, profile_name, host_box, default_config, build_name...
I tried what I found online, such as setting nullable=True or giving it a default value with default='Unknown', but I'm still hitting the same error.
Here's my DryRun model.
class DryRun(Base):
    __tablename__ = 'DryRun'
    id = Column(Integer, primary_key=True, autoincrement=True)
    instance_id = Column(Integer, nullable=True)
    profile_name = Column(String, nullable=False)
    host_box = Column(String, nullable=False)
    default_config = Column(JSON, nullable=False)
    build_name = Column(String, nullable=True)
This was originally posted as a comment.
The issue is that the model now differs from what's in the database. You can run an ALTER TABLE command to add the column within the command-line tool psql, or using the pgAdmin GUI. If you're anticipating more such changes, consider reading about migrations, which are supported by other modules, for instance Alembic: https://alembic.sqlalchemy.org/en/latest/
For just one column addition, though, migrations most likely don't make sense. Here's an example of an ALTER TABLE command for your use case:
# run this command to get a text-based interface to your Postgres database
$ psql

-- ALTER TABLE statement for your use case:
ALTER TABLE "DryRun" ADD COLUMN build_name VARCHAR;

What could be the cause of "Dependency rule tried to blank-out primary key column"

When I'm trying to delete a Category instance identified by id together with its CategoryImage and File instances, like this:
c = Category.query.get(id)
for ci in c.images:
    db.session.delete(ci)
db.session.flush()
for ci in c.images:
    db.session.delete(ci.file)
db.session.flush()  # if I put db.session.commit() here, all is fine
db.session.delete(c)
db.session.commit()
I'm getting an AssertionError: Dependency rule tried to blank-out primary key column 'category_image.id_category' on instance ''. But when I replace the flush that follows deleting the CategoryImage files with a commit, it works. I noticed this after I changed CategoryImage into an intermediary (association) table; before that change it had its own, non-composite primary key and everything worked properly. Here are my current model definitions.
class File(db.Model):
    __tablename__ = 'file'
    id_file = Column(Integer, Sequence('seq_id_file'), primary_key=True, nullable=False)
    name = Column(Text, nullable=False)
    path = Column(Text, nullable=False, unique=True)
    protected = Column(Boolean, nullable=False, default=False)

class Category(db.Model):
    __tablename__ = 'category'
    id_category = Column(Integer, Sequence('seq_id_category'), primary_key=True, nullable=False)
    name = Column(UnicodeText, nullable=False, unique=True)
    images = relationship('CategoryImage', backref='images')

class CategoryImage(db.Model):
    __tablename__ = 'category_image'
    __table_args__ = (
        PrimaryKeyConstraint('id_category', 'id_file', name='seq_id_category_image'),
    )
    id_category = Column(Integer, ForeignKey(Category.id_category), nullable=False)
    id_file = Column(Integer, ForeignKey(File.id_file), nullable=False)
    id_size_type = Column(Integer, nullable=False)
    file = relationship(File)
Now I'm trying to figure out what just happened. Correct me if I'm using things wrong.
I just noticed that I have to delete the objects related through the intermediate model in the same order as declared in __table_args__: PrimaryKeyConstraint('id_category', 'id_file'). So when I perform session.delete(category_image), session.delete(category), session.delete(file) and commit (or flush everywhere before the commit), everything works fine. If anyone spots something about this in the SQLAlchemy docs, let me know.
Here is what is happening. Calling session.delete() on an object marks it for deletion but does not yet delete it from the db. When you then call flush() after the deletes, the db still holds the object (nothing is committed yet) while the session has already marked it as deleted, so the two become inconsistent. To make the delete go smoothly, wrap your delete operations in a single transaction and, once all the objects are deleted from the session, call db.session.commit() once to bring the db in line with the session.
Hope it helps.
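As a sketch of that advice applied to the code in the question - delete the association rows, then the category, then the files, with a single commit and no intermediate flushes (the File objects are collected up front, since deleting the CategoryImage rows detaches them):

c = Category.query.get(id)
files = [ci.file for ci in c.images]  # keep references before the links go away
for ci in c.images:
    db.session.delete(ci)             # association rows first
db.session.delete(c)                  # then the category
for f in files:
    db.session.delete(f)              # then the files
db.session.commit()                   # one commit for the whole unit of work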

How to ignore some models to migrate? [duplicate]

I am using Flask-SQLAlchemy to define my models, and then Flask-Migrate to auto-generate migration scripts for deployment onto a PostgreSQL database. I have defined a number of SQL views on the database that I use in my application, like the one below.
However, Flask-Migrate now generates a migration file for the view, as it thinks it's a table. How do I correctly get Flask-Migrate / Alembic to ignore the view during autogenerate?
SQL view name: vw_SampleView, with two columns: id and rowcount.
class ViewSampleView(db.Model):
    __tablename__ = 'vw_report_high_level_count'
    info = dict(is_view=True)
    id = db.Column(db.String(), primary_key=True)
    rowcount = db.Column(db.Integer(), nullable=False)
Which means I can now do queries like so:
ViewSampleView.query.all()
I tried following the instructions at http://alembic.zzzcomputing.com/en/latest/cookbook.html and added the info = dict(is_view=True) portion to my model and the following bits to my env.py file, but I don't know where to go from here.
def include_object(object, name, type_, reflected, compare_to):
    """
    Exclude views from Alembic's consideration.
    """
    return not object.info.get('is_view', False)

...

context.configure(url=url, include_object=include_object)
I think (though I haven't tested it) that you can mark your table as a view with the __table_args__ attribute:
class ViewSampleView(db.Model):
    __tablename__ = 'vw_report_high_level_count'
    __table_args__ = {'info': dict(is_view=True)}
    id = db.Column(db.String(), primary_key=True)
    rowcount = db.Column(db.Integer(), nullable=False)
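One refinement worth considering (untested, following the Alembic cookbook pattern): include_object is also called for columns, indexes, and constraints, so it is safer to consult info only when type_ is "table" and let everything else through:

def include_object(object, name, type_, reflected, compare_to):
    if type_ == "table":
        # Table.info is populated from __table_args__, which is why the
        # marker must be set there rather than as a plain class attribute.
        return not object.info.get("is_view", False)
    return True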

Alembic migration to convert serial field to integer

I initially defined one of my SQLAlchemy models as:
class StreetSegment(db.Model):
    id = db.Column(db.Integer, autoincrement=True)    # surrogate ID
    seg_id = db.Column(db.Integer, primary_key=True)  # assigned in another system; don't increment
not realizing that seg_id would become a SERIAL field in my Postgres database. What I really wanted was an INTEGER field with a PK constraint (no autoincrement). I turned off autoincrement like so:
class StreetSegment(db.Model):
    id = db.Column(db.Integer, autoincrement=True)
    seg_id = db.Column(db.Integer, primary_key=True, autoincrement=False)  # <--- see here
but the change isn't reflected when I run migrate in Alembic. I've also tried writing a custom migration with the following operations:
def upgrade():
    op.alter_column('street_segment', 'seg_id', autoincrement=False)

def downgrade():
    op.alter_column('street_segment', 'seg_id', autoincrement=True)
but that gives the warning "autoincrement and existing_autoincrement only make sense for MySQL". So my question is: is there any way of using Alembic to convert a SERIAL to an INTEGER in Postgres?
Just set the type explicitly to the one you want. Note that the keyword argument is type_ (with a trailing underscore), not _type. This should work (assuming the standard import sqlalchemy as sa at the top of the migration):
op.alter_column('street_segment', 'seg_id', type_=sa.Integer())
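One caveat: in Postgres, SERIAL is just INTEGER plus a sequence-backed default, so changing the type alone may leave the nextval() default in place. A sketch of a fuller conversion, assuming Postgres's default sequence name street_segment_seg_id_seq (verify against your schema before using):

from alembic import op
import sqlalchemy as sa

def upgrade():
    # change the type and drop the nextval() default that SERIAL installed
    op.alter_column('street_segment', 'seg_id',
                    type_=sa.Integer(),
                    server_default=None)
    # optional: drop the now-unused sequence (name assumed from PG's convention)
    op.execute('DROP SEQUENCE IF EXISTS street_segment_seg_id_seq')

def downgrade():
    # restoring SERIAL behaviour would require recreating the sequence and
    # reattaching the default; left as a no-op in this sketch
    pass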
