I have run flask db upgrade to create the todos table. I then added a new table, established a relationship with the existing table, and also added a new field to the existing table.
I would expect flask db migrate to record the differences (the new TodoList table and the new field in Todo), but instead it fails with a NameError saying the table name is not defined.
class TodoList(db.Model):
    __tablename__ = 'todolists'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(), nullable=False)
    todos.db.relationship('Todo', backref='list', lazy = True)

class Todo(db.Model):
    __tablename__ = 'todos'
    id = db.Column(db.Integer, primary_key=True)
    description = db.Column(db.String(), nullable=False)
    completed = db.Column(db.Boolean, nullable=False, default=False)
    list_id = db.Column(db.Integer, db.ForeignKey(todolists.id), nullable=True)
Error details:
File "C:\Users\rg\anaconda3\lib\site-packages\flask\_compat.py", line 39, in reraise
raise value
File "C:\Users\rg\class_demos\Todoapp\app.py", line 26, in <module>
class Todo(db.Model):
File "C:\Users\rg\class_demos\Todoapp\app.py", line 32, in Todo
list_id = db.Column(db.Integer, db.ForeignKey(todolists.id), nullable=True)
NameError: name 'todolists' is not defined
Solutions tried:
The Flask app is set to the current Python module.
Tried swapping the order of the two model definitions (defining one before the other).
Searched extensively; this error seems to be common, but I could not find a case matching the one above.
Any help is much appreciated.
The error stems from how you are creating the relationship between the Todo and TodoList models.
list_id needs to be defined as a foreign key to the id column of the todolists table, meaning it references an id value from that table. You passed the bare name todolists.id, but todolists is not a Python name in scope, which is why you get the NameError. Pass the target as a string instead, so SQLAlchemy resolves it by table name:
list_id = db.Column(db.Integer, db.ForeignKey("todolists.id"))
To create the relationship on the TodoList side, define a todos field with db.relationship. This is not an actual database column, but a high-level view of the relationship between TodoList and Todo, which is why it does not appear in the database diagram. You can define it as:
todos = db.relationship('Todo', backref='list', lazy=True)
My bad. I missed the quotes around todolists.id.
list_id = db.Column(db.Integer, db.ForeignKey("todolists.id"), nullable=True)
This fixed the issue :)
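For completeness, a corrected sketch of both models based on the snippets above (assuming the usual db = SQLAlchemy(app) setup used elsewhere in the app):

class TodoList(db.Model):
    __tablename__ = 'todolists'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(), nullable=False)
    todos = db.relationship('Todo', backref='list', lazy=True)

class Todo(db.Model):
    __tablename__ = 'todos'
    id = db.Column(db.Integer, primary_key=True)
    description = db.Column(db.String(), nullable=False)
    completed = db.Column(db.Boolean, nullable=False, default=False)
    # String target: resolved by table name at mapper configuration time,
    # so no NameError is raised while the class body is being executed.
    list_id = db.Column(db.Integer, db.ForeignKey('todolists.id'), nullable=True)

With this in place, flask db migrate should detect both the new todolists table and the new list_id column on todos.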
Related
Using SQL Alchemy and declarative base to define two related tables
JiraBase = declarative_base(cls=_Shared)

class JiraSource(JiraBase):
    id = Column(BigInteger, primary_key=True, autoincrement=True)
    name = Column(String)
    __tablename__ = 'jira_source'

class JiraProject(JiraBase):
    id = Column(BigInteger, primary_key=True, autoincrement=True)
    source = Column(BigInteger, ForeignKey('jira_source.id'), nullable=False)
    project_id = Column(String, nullable=False)
    project_key = Column(String, nullable=False)
    name = Column(String)
    __table_args__ = (Index('source', 'project_id', 'project_key', unique=True),)
    __tablename__ = 'jira_project'
The idea behind the Index is that projects are uniquely identified by an id and key for a given JiraSource.
The tables are being dropped and created via
JiraBase.metadata.drop_all(bind=self.engine)
JiraBase.metadata.create_all(bind=self.engine)
This actually drops and creates the tables fine; however, it also throws an exception:
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.DuplicateTable) relation "source" already exists
[SQL: CREATE UNIQUE INDEX source ON jira_issue (issue_id)]
(Background on this error at: http://sqlalche.me/e/13/f405)
I'm wondering whether using both a ForeignKey and an Index that relate to source is bad practice or something. When I comment out the Index entirely, or drop the source field from it, the error goes away. Is it a problem to reuse a column that is already a foreign key in an index?
Any idea why I'm seeing this error? For now I am simply catching and ignoring it in code.
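One possible angle, sketched here rather than taken from the thread: the first positional argument to Index() is the index name, and PostgreSQL requires index names to be unique per schema, so two indexes both named 'source' (this one and whichever one exists on jira_issue) will collide. A sketch of how the __table_args__ inside JiraProject could be written so that source is actually one of the indexed columns and the name cannot clash (the specific name here is just an illustration):

__table_args__ = (
    # Explicit, table-specific index name; 'source' becomes an indexed column.
    Index('uq_jira_project_source_project_key', 'source', 'project_id', 'project_key', unique=True),
)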
When I try to delete a Category instance identified by id, together with its CategoryImage and File instances, like this:
c = Category.query.get(id)
for ci in c.images:
    db.session.delete(ci)
db.session.flush()
for ci in c.images:
    db.session.delete(ci.file)
db.session.flush()  # if I put db.session.commit() here, all is fine
db.session.delete(c)
db.session.commit()
I'm getting an AssertionError: Dependency rule tried to blank-out primary key column 'category_image.id_category' on instance ''. But when I replace the flush that follows deleting the category_image files with a commit, it works. I noticed this after I changed the CategoryImage table into an intermediate (association) table. Before that change it had its own, non-composite primary key and everything worked properly. Here are my current model definitions.
class File(db.Model):
    __tablename__ = 'file'
    id_file = Column(Integer, Sequence('seq_id_file'), primary_key=True, nullable=False)
    name = Column(Text, nullable=False)
    path = Column(Text, nullable=False, unique=True)
    protected = Column(Boolean, nullable=False, default=False)

class Category(db.Model):
    __tablename__ = 'category'
    id_category = Column(Integer, Sequence('seq_id_category'), primary_key=True, nullable=False)
    name = Column(UnicodeText, nullable=False, unique=True)
    images = relationship('CategoryImage', backref='images')

class CategoryImage(db.Model):
    __tablename__ = 'category_image'
    __table_args__ = (
        PrimaryKeyConstraint('id_category', 'id_file', name='seq_id_category_image'),
    )
    id_category = Column(Integer, ForeignKey(Category.id_category), nullable=False)
    id_file = Column(Integer, ForeignKey(File.id_file), nullable=False)
    id_size_type = Column(Integer, nullable=True)  # assuming nullable=True here
    file = relationship(File)
Now I'm trying to figure out what just happened. Correct me if I'm using things wrong.
I just noticed that I have to delete objects that are related through the intermediate model in the same order as declared in __table_args__, i.e. PrimaryKeyConstraint('id_category', 'id_file'). So when I perform it this way: session.delete(category_image), session.delete(category), session.delete(file), and then commit (or flush everywhere before the commit), it all works fine. If anyone spots something about this in the SQLAlchemy docs, let me know.
Here is what is happening. Once you call session.delete() on an object, it is marked for deletion but not yet deleted from the database. When you then call flush() (note: the database still has the row, since nothing has been committed), the session has already marked the object as deleted, so the session and the database can become inconsistent. To make the delete smooth, wrap your delete operations in a single transaction and, once the objects are deleted from the session, call db.session.commit() once to bring the database in line with the session.
Hope it helps.
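For illustration, a minimal sketch of the ordering described above, using the models from the question and a single commit at the end:

c = Category.query.get(id)
files = [ci.file for ci in c.images]  # keep references before the rows are deleted
for ci in c.images:
    db.session.delete(ci)             # association rows first
db.session.delete(c)                  # then the category
for f in files:
    db.session.delete(f)              # finally the files
db.session.commit()                   # single commit, no intermediate flushes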
I am using Flask-SQLAlchemy to define my models, and then using Flask-Migrate to auto-generate migration scripts for deployment onto a PostgreSQL database. I have defined a number of SQL Views on the database that I use in my application like below.
However, Flask-Migrate now generates a migration file for the view as it thinks it's a table. How do I correctly get Flask-Migrate / Alembic to ignore the view during autogenerate?
SQL View name: vw_SampleView with two columns: id and rowcount.
class ViewSampleView(db.Model):
    __tablename__ = 'vw_report_high_level_count'
    info = dict(is_view=True)

    id = db.Column(db.String(), primary_key=True)
    rowcount = db.Column(db.Integer(), nullable=False)
Which means I can now do queries like so:
ViewSampleView.query.all()
I tried following the instructions at http://alembic.zzzcomputing.com/en/latest/cookbook.html, added the info = dict(is_view=True) portion to my model and the following bits to my env.py file, but I don't know where to go from here.
def include_object(object, name, type_, reflected, compare_to):
    """
    Exclude views from Alembic's consideration.
    """
    return not object.info.get('is_view', False)

...

context.configure(url=url, include_object=include_object)
I think (though I haven't tested it) that you can mark your Table as a view with the __table_args__ attribute:
class ViewSampleView(db.Model):
    __tablename__ = 'vw_report_high_level_count'
    __table_args__ = {'info': dict(is_view=True)}

    id = db.Column(db.String(), primary_key=True)
    rowcount = db.Column(db.Integer(), nullable=False)
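If that works, the include_object hook from the question should pick it up. A slightly more defensive sketch (guarding on type_, since only tables carry that info dict), passed to context.configure exactly as in the question:

def include_object(object, name, type_, reflected, compare_to):
    # Skip anything whose Table info carries is_view=True; everything else
    # falls through to Alembic's default autogenerate behaviour.
    if type_ == "table" and object.info.get("is_view", False):
        return False
    return True

context.configure(url=url, include_object=include_object)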
This error happened when I tried to access the page. I didn't get errors when I created the tables, but it seems there are still problems.
The models are like this:
class User(UserMixin, db.Model):
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(64), index=True, unique=True)
    sell_items = db.relationship('Item', backref='user')

class Item(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    item_name = db.Column(db.String(64), index=True)
    item_image = db.Column(db.String(200), index=True)
    price = db.Column(db.Float(10), index=True)
    user_id = db.Column(db.Integer, db.ForeignKey('user.id'))
    user = db.relationship('User', backref='sell_items')
The whole error message is this
Triggering mapper: 'Mapper|User|user'. Original exception was: Error creating backref 'user' on relationship 'User.sell_items': property of that name exists on mapper 'Mapper|Item|item'
How can I fix this? What I want to do is refer to the username of the user who sells the item, but I cannot. There is a problem with the relationships between the models.
When you use backref, the reverse relationship is created automatically, so backref should only be used on one side of the relationship. In your case, you can remove sell_items from the User model, and User will automatically get that relationship from Item.
If you want to declare the relationship explicitly on both sides (for example, to customize its name), use back_populates='name_of_relationship_on_other_model' on each side instead.
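For example, a sketch of the two-sided form with back_populates, keeping only the relevant fields from the models above:

class User(UserMixin, db.Model):
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(64), index=True, unique=True)
    sell_items = db.relationship('Item', back_populates='user')

class Item(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    item_name = db.Column(db.String(64), index=True)
    user_id = db.Column(db.Integer, db.ForeignKey('user.id'))
    user = db.relationship('User', back_populates='sell_items')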
Alternatively, in your Item class, replace this line:
user = db.relationship('User', backref='sell_items')
with this line:
user_id = db.Column(db.Integer, db.ForeignKey('user.id'), nullable=False)
It should work that way. From there you can query like item = Item.query.first() and then use item.user (the backref created by User.sell_items) to get the user who posted the item.
I hope it helps.
I'm trying to insert the role data in this association table:
class Association(db.Model):
    __tablename__ = 'associations'
    user_id = db.Column(db.Integer, db.ForeignKey('users.id'), primary_key=True)
    team_id = db.Column(db.Integer, db.ForeignKey('teams.id'), primary_key=True)
    role = db.Column(db.String)
    user = db.relationship("User", back_populates="teams")
    team = db.relationship("Team", back_populates="users")
It's linked to 2 other tables:
class User(db.Model, UserMixin):
    __tablename__ = 'users'
    id = db.Column(db.Integer, primary_key=True)
    email = db.Column(db.String)
    username = db.Column(db.String, nullable=False, unique=True, index=True)
    password_hash = db.Column(db.String)
    teams = db.relationship("Association", back_populates="user")

class Team(db.Model):
    __tablename__ = 'teams'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String)
    player = db.Column(db.String)
    users = db.relationship("Association", back_populates="team")
However, when I try to insert the role in the association table:
t = Team.query.filter_by(id=team_name1_id).first()
a = Association(role=form.role.data)
a.user = User.query.filter_by(id=current_user.id).first()
t.users.append(a)
db.session.add(t)
db.session.commit()
I get this error:
IntegrityError: (raised as a result of Query-invoked autoflush; consider using a session.no_autoflush block if this flush is occurring prematurely)
(sqlite3.IntegrityError) NOT NULL constraint failed: association.team_id
[SQL: u'INSERT INTO association (user_id, role) VALUES (?, ?)']
[parameters: (1, u'point guard')]
Do you know how can I fix it?
The line that causes the autoflush is
t.users.append(a)
You've not defined your relationship loading strategies, so they default to "select". Since you're accessing the Team.users relationship attribute for the first time, it will emit a SELECT in order to fetch the related objects.
You've also made changes to the session indirectly in
a.user = User.query.filter_by(id=current_user.id).first()
which, due to the save-update cascade (on by default), places the new Association instance into the session as well.
In order to keep the DB's and session's state consistent SQLAlchemy has to flush your changes to the DB before the query is emitted, and so the half initialized Association is sent to the DB before being associated with the Team instance.
Possible solutions:
Temporarily disable the autoflush feature for that particular append because you know you're adding a new Association:
with db.session.no_autoflush:
t.users.append(a)
The SELECT will still be emitted, though.
Reverse the operation. Instead of appending the Association instance to the Team.users collection, assign the Team to the Association.team:
# Instead of t.users.append(a):
a.team = t
A less obvious solution is to eager load the contents of the relationship when you fetch the Team instance, so that no SELECT is necessary:
t = db.session.query(Team).options(db.joinedload(Team.users)).first()
or not load them at all using the db.noload(Team.users) option.
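For example, mirroring the joinedload line above (a sketch):
t = db.session.query(Team).options(db.noload(Team.users)).first()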
Note that
db.session.add(t)
is redundant. The fetched Team instance is already in the session – otherwise the lazy load of Team.users could not have proceeded in the first place.
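Putting the second option together, a minimal sketch based on the models and reasoning above:

t = Team.query.filter_by(id=team_name1_id).first()
a = Association(role=form.role.data)
a.user = User.query.filter_by(id=current_user.id).first()
a.team = t  # assign the parent directly instead of t.users.append(a)
# No db.session.add() needed: t is already persistent, and a is picked up
# by the save-update cascade through a.user.
db.session.commit()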