I am trying to add some rows to a table in a local SQLite file with SQLAlchemy, but the rows don't get saved. I don't get any primary-key errors, and I am committing my inserts/session.
What's really confusing me is that the same code works fine (my inserts are saved) when I change the engine to connect to a local PostgreSQL server. I am calling the add_values function from another file.
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# Base and Offer come from the application's models module.
def add_values(offers):
    engine = create_engine("sqlite:///mydb.db", echo=True)
    Base.metadata.create_all(bind=engine)
    Session = sessionmaker(bind=engine)
    # Session.begin() commits on exit, so the commit() below is redundant.
    with Session.begin() as session:
        list_of_offers = []
        for ele in offers:
            offer = Offer(ele.id, ele.asin, ele.price, ele.currency,
                          ele.condition, ele.seller, ele.datetime,
                          ele.buyboxWinner, ele.numberOfSellers)
            list_of_offers.append(offer)
        session.add_all(list_of_offers)
        session.commit()
I tried switching the database to a PostgreSQL server, and there the inserts are saved.
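A likely culprit (an assumption, since the question doesn't show how the function is invoked): `sqlite:///mydb.db` is a path relative to the process's current working directory, so calling `add_values` from a script started in another directory writes to a different `mydb.db`. A minimal sketch of how to check and pin the path:

```python
import os

from sqlalchemy import create_engine

# "sqlite:///mydb.db" is a *relative* path: SQLite resolves it against the
# current working directory, not against the file that defines add_values().
# Started from a different directory, the rows can land in a second mydb.db.
engine = create_engine("sqlite:///mydb.db", echo=True)

# Print where the file actually lives and compare it with the file you open.
print(os.path.abspath("mydb.db"))

# One hypothetical fix: pin an absolute path (e.g. build it from __file__
# in your own module) so every caller hits the same database file.
db_path = os.path.abspath("mydb.db")
engine = create_engine(f"sqlite:///{db_path}", echo=True)
```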
I'm trying to build a Flask web app that interfaces with a remote, pre-existing SQL Server database.
I have a successful connection, according to the output of echo='debug'.
I'm using the "only" parameter of .reflect() because the database I'm connecting to has hundreds of tables. Without it, the entire database gets reflected and it runs way too slow.
engine = create_engine('mssql+pymssql://user:pass@server/db', echo='debug')
conn = engine.connect()
meta = MetaData(engine).reflect(schema='dbo', only=['Bookings'])
table = meta.tables['Bookings']

select_st = select([table]).where(
    table.c.ID == 'id-1234')
res = conn.execute(select_st)
for _row in res:
    print(_row)
The problem is that I'm getting the error:
table = meta.tables['Bookings']
AttributeError: 'NoneType' object has no attribute 'tables'
My guess is that .tables doesn't work with the subset 'Bookings' that I've passed it, because .tables is called on the meta object, which it believes should be a database, not a table.
How can I get SQLAlchemy features to work with that 'only' parameter? Again, I'm building a Flask web app, so the solution needs to be able to interface with Base.automap.
.reflect() does not return a value; it returns None, so meta is None.
Try it like this:
meta = MetaData(engine)
meta.reflect(schema='dbo', only=['Bookings'])
table = meta.tables['Bookings']
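The same pattern can be seen end-to-end in a self-contained sketch; an in-memory SQLite table named `Bookings` stands in here for the real `dbo.Bookings`, since no SQL Server is assumed to be available:

```python
from sqlalchemy import Column, MetaData, String, Table, create_engine

# Stand-in for dbo.Bookings, backed by an in-memory SQLite database.
engine = create_engine("sqlite://")
setup = MetaData()
Table("Bookings", setup, Column("ID", String, primary_key=True))
setup.create_all(engine)

# reflect() populates the MetaData in place and returns None...
meta = MetaData()
result = meta.reflect(bind=engine, only=["Bookings"])
assert result is None

# ...so look the table up on meta.tables afterwards.
table = meta.tables["Bookings"]
print(table.c.keys())  # ['ID']
```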
I have a small Flask app that I am working on. It consists of a search box, which uses a raw SELECT statement outside of the SQLAlchemy ORM to do a full-text search on a virtual FTS5 table:
sqlstr = "SELECT * FROM items INNER JOIN Restaurants on Restaurants.rest_id = items.restaurant_id WHERE items MATCH '{}' ORDER BY item_rank DESC".format(q)
crsr = conn.execute(sqlstr)
search_data = crsr.fetchall()
crsr.close()
The user gets the results and can then thumbs-up a menu item. When that happens, I take them to a "vote" Flask route to record the thumbs-up and make a log entry in my analytics table, and then redirect to the last page. This is where it gets wonky.
db.session.query(items).filter(items.id==item_id).update({items.item_rank:items.item_rank +1})
db.session.commit()
## RECORDING IT IN ANALYTICS
voted_item = Analytics(user_id=str(current_user),action=str('vote'),item_id=item_id)
db.session.add(voted_item)
db.session.commit()
I don't get any errors in my log files. When I hit vote, I get the Flask message that says it was successful. I am successfully logging the vote in a separate Analytics table, but for some reason it's not being recorded in the items table.
I've tried a few things. I went back to a raw SQL statement to update the items virtual table, something like this:
sqlstr = "UPDATE items SET item_rank = item_rank +1 where id = {}".format(item_id)
crsr = conn.execute(sqlstr)
crsr.close()
But I was getting "database is locked" errors, I assume because I was accessing the db outside of the SQLAlchemy ORM. Any ideas? Is it because of the FTS5 virtual table? Or going back and forth between raw SQL and the ORM?
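A sketch of a parameterized alternative, assuming the `items` layout from the question (a plain table stands in for the FTS5 virtual table here): binding `item_id` instead of using `str.format` avoids SQL injection, and keeping all statements on one connection avoids holding two SQLite handles at once, a common source of "database is locked".

```python
from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")

# engine.begin() opens one connection/transaction and commits on exit.
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE items (id INTEGER PRIMARY KEY, item_rank INTEGER)"))
    conn.execute(text("INSERT INTO items (id, item_rank) VALUES (1, 5)"))
    # Bind item_id as a parameter instead of formatting it into the string.
    conn.execute(
        text("UPDATE items SET item_rank = item_rank + 1 WHERE id = :item_id"),
        {"item_id": 1},
    )
    rank = conn.execute(text("SELECT item_rank FROM items WHERE id = 1")).scalar()

print(rank)  # 6
```

In the Flask app itself, running the same `text()` statement through `db.session.execute(...)` (instead of a separate raw connection) would keep the update in the session's own transaction.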
I'm trying to reflect a system table in an MS SQL Server database:
from sqlalchemy import engine, create_engine, MetaData, Table

meta = MetaData()
url = engine.url.URL(
    "mssql+pyodbc",
    username=credentials["username"],
    password=credentials["pswd"],
    host=credentials["host"],
    database=credentials["db"],
    query=dict(driver="ODBC Driver 13 for SQL Server"),
)
e = create_engine(url)
conn = e.connect()
tt = Table("objects", meta, autoload=True, autoload_with=e, schema="sys")
for c in tt.columns:
    print(c.name)
At the end I get a NoSuchTable error. I tried to reflect other system tables (sys.triggers, sys.sql_modules) with the same result. With ordinary tables this code works normally: I can list columns and make other queries. The login I use in my application has the "db_owner" role, so it has enough permissions, and if I write something like this
for item in conn.execute("select * from sys.triggers"):
    print(item)
it works fine.
What am I doing wrong? Is there any other way to work with data from system tables, besides executing raw SQL and wrapping the results in dataclasses, etc.?
I was trying to reflect system views in an MS SQL Server database. After adding echo='debug' to the engine, I realized that SQLAlchemy looks up table and view metadata in INFORMATION_SCHEMA on MSSQL.
The system tables and views are not listed in INFORMATION_SCHEMA.TABLES or INFORMATION_SCHEMA.VIEWS.
(I'm using SQLAlchemy version 1.3.5.)
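Since reflection consults INFORMATION_SCHEMA and therefore cannot see system objects, a fallback is a plain `text()` query: result rows support named access, so no manual wrapping in dataclasses is needed. The sketch below uses SQLite's `sqlite_master` catalog as a stand-in for `sys.objects`, since no SQL Server is assumed to be available:

```python
from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE bookings (id INTEGER)"))
    # Query the catalog table directly; rows come back as named tuples.
    rows = conn.execute(
        text("SELECT name, type FROM sqlite_master WHERE type = 'table'")
    ).fetchall()

for row in rows:
    print(row.name, row.type)  # bookings table
```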
When you add echo='debug' to your engine you can see the steps it goes through when talking to the database. In my case (an Oracle database) it sends out a query that left-joins all_col_comments onto all_tab_cols, and in this query you'll see it uses owner = <schema value>.
I found that the system tables are owned by 'SYS', so by setting the schema to 'SYS' it will be able to find the system table just fine. A small code example to clarify where to set the schema:
table = db.Table('USER_SOURCE', metadata, schema='SYS', autoload=True, autoload_with=engine)
I have a Flask app that uses SQLAlchemy to read and write a Postgres schema. When I use the .delete() method, it only flushes; the actual changes never reach the database.
Session = sessionmaker(autocommit=False, autoflush=False, bind=conn)
sess = Session()
sess = sess.query(Table).filter(Column.id == 1).delete()
sess.commit()
I tried without scoped_session, but still the same issue.
You're overwriting sess with the number of deleted rows, and then trying to call .commit() on that number. The .delete() method returns the number of rows to be deleted (http://docs.sqlalchemy.org/en/rel_1_0/orm/query.html#sqlalchemy.orm.query.Query.delete).
Additionally, you set autoflush=False when you created your session, so pending changes are not flushed automatically before each query; commit() still flushes them, but there's no reason to keep the option here. I suggest this:
Session = sessionmaker(autocommit=False, bind=conn)
sess = Session()
rows_deleted = sess.query(Table).filter(Column.id == 1).delete()
sess.commit()
print(f"{rows_deleted} rows were deleted")
db.session.query(Model).filter(Model.id==123).delete()
db.session.commit()
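Both answers boil down to the same pattern: keep the session in its own variable, capture the count that delete() returns, and commit afterwards. A runnable sketch with an in-memory SQLite database, assuming a minimal `Item` model in place of the question's `Table`:

```python
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Item(Base):
    """Minimal stand-in model for the question's mapped class."""
    __tablename__ = "item"
    id = Column(Integer, primary_key=True)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

sess = Session()
sess.add_all([Item(id=1), Item(id=2)])
sess.commit()

# delete() returns the row count -- don't overwrite the session with it.
rows_deleted = sess.query(Item).filter(Item.id == 1).delete()
sess.commit()
print(rows_deleted)  # 1
```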
I'm trying to do a two-phase commit using SQLAlchemy 0.6.8 with PostgreSQL 8.3.4, but I think I'm missing something...
The workflow goes like this:
session = sessionmaker(engine)(autocommit=True)
tx = session.connection().begin_twophase(xid)  # doesn't issue any SQL
session.begin()
session.add(obj1)
session.flush()
tx.prepare()
then from another session
session = sessionmaker(engine)(autocommit=True)
session.connection().commit_prepared(xid, recover=True) # recover=True because otherwise it complains that you can't issue a COMMIT PREPARED from inside a transaction
This doesn't raise any error, but doesn't write anything to the table either... O_o
What am I missing?
I even tried blocking the application after prepare() and issuing a COMMIT PREPARED 'xid' from pgAdmin, but still nothing gets written.
I managed to get it working; here's how:
session = sessionmaker(engine)(twophase=True)
session.add(obj1)
session.prepare()

# Find the transaction id
for k, v in session.transaction._connections.iteritems():
    if isinstance(k, Connection):
        return v[1].xid
then from another session
session = sessionmaker(engine)(twophase=True)
session.connection().commit_prepared(xid, recover=True)