I'm trying to do a two-phase commit using SQLAlchemy 0.6.8 with PostgreSQL 8.3.4, but I think I'm missing something...
The workflow goes like this:
session = sessionmaker(engine)(autocommit=True)
tx = session.connection().begin_twophase(xid) # Doesn't issue any SQL
session.begin()
session.add(obj1)
session.flush()
tx.prepare()
Then, from another session:
session = sessionmaker(engine)(autocommit=True)
session.connection().commit_prepared(xid, recover=True) # recover=True because otherwise it complains that you can't issue a COMMIT PREPARED from inside a transaction
This doesn't raise any error, but it doesn't write anything to the table either... O_o
What am I missing?
I even tried blocking the application after the prepare() and issuing a COMMIT PREPARED 'xid' from pgAdmin, but still nothing gets written.
I managed to get it working; here's how:
session = sessionmaker(engine)(twophase=True)
session.add(obj1)
session.prepare()
# Find the transaction id (digs into a private attribute)
for k, v in session.transaction._connections.iteritems():
    if isinstance(k, Connection):
        xid = v[1].xid
        break
Then, from another session:
session = sessionmaker(engine)(twophase=True)
session.connection().commit_prepared(xid, recover=True)
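Put together, a minimal sketch of how the two halves of that working flow fit together (my restatement, assuming the same engine is available on both sides and the PostgreSQL server allows prepared transactions, i.e. max_prepared_transactions > 0; the xid extraction still relies on the private _connections attribute shown above):
from sqlalchemy.engine import Connection
from sqlalchemy.orm import sessionmaker

def prepare_phase(engine, obj):
    # Phase 1: stage the object and PREPARE TRANSACTION, returning the xid.
    session = sessionmaker(engine)(twophase=True)
    session.add(obj)
    session.prepare()
    # Dig the generated xid out of the session's transaction bookkeeping.
    for conn, value in session.transaction._connections.iteritems():
        if isinstance(conn, Connection):
            return value[1].xid

def commit_phase(engine, xid):
    # Phase 2: COMMIT PREPARED from a separate session, as in the snippet above.
    session = sessionmaker(engine)(twophase=True)
    session.connection().commit_prepared(xid, recover=True)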
I am trying to add some rows to my table with SQLAlchemy, using a local SQLite file. However, the rows don't get saved. I don't get any primary key errors, and I am also committing my inserts/session.
What's really confusing me is that my code works fine (my inserts are saved) when I change my engine to connect to a local PostgreSQL server. I am calling the add_values function from another file.
def add_values(offers):
    engine = create_engine("sqlite:///mydb.db", echo=True)
    Base.metadata.create_all(bind=engine)
    Session = sessionmaker(bind=engine)
    with Session.begin() as session:
        list_of_offers = []
        for ele in offers:
            offer = Offer(ele.id, ele.asin, ele.price, ele.currency, ele.condition,
                          ele.seller, ele.datetime, ele.buyboxWinner, ele.numberOfSellers)
            list_of_offers.append(offer)
        session.add_all(list_of_offers)
        session.commit()
I tried changing my database to the PostgreSQL server.
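A quick sanity check worth adding (a hypothetical helper, not part of the original code; Offer is the model from the snippet above) is to open the same SQLite file through a fresh session and count the rows, to confirm whether the inserts landed in the mydb.db you are inspecting:
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

def count_offers():
    # Open the same SQLite file again and count what is actually stored there.
    engine = create_engine("sqlite:///mydb.db", echo=True)
    Session = sessionmaker(bind=engine)
    with Session() as session:
        return session.query(Offer).count()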
Per the docs, we should use the following pattern with a sessionmaker object:
Session = sessionmaker(engine)
with Session.begin() as session:
    session.add(some_object)
In a multithreaded environment, we are also supposed to use a single scoped_session and share it. So in my __init__.py I create one and import it everywhere else in my program:
engine = create_engine(config.SQLALCHEMY_DATABASE_URI)
Session = scoped_session(sessionmaker(bind=engine))
The question is, how am I supposed to combine these two approaches? This seems to be the suggested way, but it errors out:
from myapp import Session
with Session.begin() as session:
    query_result = session.query(MyModel).all()
----
Exception has occurred: AttributeError
'SessionTransaction' object has no attribute 'query'
I tried the following and it works, but it seems like it doesn't follow the docs, and I'm afraid it breaks something not obvious. Can anyone confirm if this is correct?
from myapp import Session
with Session() as session, session.begin():
    query_result = session.query(MyModel).all()
I've been looking around at other replies and seeing very little that addresses the specific question.
From the Session.begin() docs:
The Session object features autobegin behavior, so that normally it is not necessary to call the Session.begin() method explicitly. However, it may be used in order to control the scope of when the transactional state is begun.
You can use Session.begin() (new in 1.4) to obtain a SessionTransaction instance usable as a context manager which will autocommit on successful exit.
Calling begin() through the scoped_session returns a SessionTransaction right away, as per your error, rather than a Session, so you do not need to (and should not) begin it again.
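To make the difference concrete, a small sketch (my own illustration, assuming SQLAlchemy 1.4+ and a throwaway in-memory SQLite engine):
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

engine = create_engine("sqlite://")  # throwaway in-memory DB, just for illustration
factory = sessionmaker(bind=engine)
Session = scoped_session(factory)

with factory.begin() as session:
    print(type(session))  # a Session: sessionmaker.begin() yields one

with Session.begin() as obj:
    print(type(obj))  # a SessionTransaction: hence no .query() attribute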
All in all, you can definitely do the stacked context manager, but it's unnecessary, so you might as well stick to using the original flow:
Session = scoped_session(...)
with Session() as session:  # NB: session here is the thread-local Session
    ...
    session.commit()
Or use the proxied Session directly:
Session = scoped_session(...)

@on_request_end  # framework-specific "request finished" hook
def remove_session(req):
    Session.remove()

@route("/xyz", ...)  # framework-specific route decorator
def handle_xyz():
    instance = Class(...)
    Session.add(instance)
    Session.commit()
I have a Flask app that uses SQLAlchemy to read and write a Postgres schema. When I use the .delete() function, it only flushes; the actual changes never reach the database.
Session = sessionmaker(autocommit=False, autoflush=False, bind=conn)
sess = Session()
sess.query(Table).filter(Column.id==1).delete()
sess.commit()
I tried without scoped_session, but still the same issue.
The .delete() method returns the number of rows to be deleted, not a query or session (http://docs.sqlalchemy.org/en/rel_1_0/orm/query.html#sqlalchemy.orm.query.Query.delete), so if you assign its result back to sess you end up overwriting your session with that number and then trying to commit an integer.
Additionally, you set autoflush=False when you created your session, which means pending changes are not flushed to the DB automatically before queries run. I suggest this:
Session = sessionmaker(autocommit=False, bind=conn)
sess = Session()
rows_deleted = sess.query(Table).filter(Column.id==1).delete()
sess.commit()
print(str(rows_deleted) + " rows were deleted")
db.session.query(Model).filter(Model.id==123).delete()
db.session.commit()
I am using SQLAlchemy's provided contextmanager to handle sessions for me. What I don't understand is how to get the automatically generated ID, because (1) the ID is not created until commit() is called, yet (2) the newly created instance is only available inside the context manager's scope:
def save_soft_file(name, is_geo=False):
    with session_scope() as session:
        soft_file = models.SoftFile(name=name, is_geo=is_geo)
        session.add(soft_file)
        # id is not available here, because the session has not been committed
    # soft_file is not available here, because the session is out of context
    return soft_file.id
What am I missing?
Use session.flush() to execute pending commands within the current transaction.
def save_soft_file(name, is_geo=False):
    with session_scope() as session:
        soft_file = models.SoftFile(name=name, is_geo=is_geo)
        session.add(soft_file)
        session.flush()
        return soft_file.id
If an exception occurs after a flush but before the session goes out of scope, the changes will be rolled back to the beginning of the transaction. In that case your soft_file would not actually be written to the database, even though it had been given an ID.
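For example, a small sketch of that caveat (reusing the question's session_scope() and models.SoftFile, with a deliberately raised error):
def save_then_fail(name):
    with session_scope() as session:
        soft_file = models.SoftFile(name=name)
        session.add(soft_file)
        session.flush()
        print(soft_file.id)           # an id has already been assigned here...
        raise RuntimeError("boom")    # ...but the resulting rollback discards the row anyway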
I am trying to use the session to pass some data from one page to another.
Here is the code I wrote in ajax.py:
def save_cookie(request, query):
    request.session['query'] = query
But when I call this dajaxice function, an error occurs. As we all know, when something goes wrong with dajaxice in an HTML page, the error message is always just "something goes wrong".
I tried to debug save_cookie, but the mock request object I created has no session attribute. However, if I do request.session = 'blah' first, it works. If I call save_cookie(request, query) directly, it pops up an error saying the request object has no attribute 'session'...
The code is straightforward, and I don't see any mistake in it. Does anyone know the cause?
Never used dajaxice / dajax so I can't really help here. Just a few points:
Did you enable (and properly configure) session support? https://docs.djangoproject.com/en/1.3/topics/http/sessions/
You can use the logging module (or a plain "print" statement, but then you won't have the whole traceback) to trace the exception, i.e.:
def save_cookie(request, query):
    try:
        request.session['query'] = query
    except Exception, e:
        print e
        raise
The output of the print statement should now appear in the shell you started the dev server from (assuming you're working with the dev server... you ARE working with the dev server, aren't you?).
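A variant using the logging module mentioned above (the logger name is just an example), which keeps the full traceback in your logs:
import logging

logger = logging.getLogger(__name__)

def save_cookie(request, query):
    try:
        request.session['query'] = query
    except Exception:
        logger.exception("save_cookie failed")  # logs the full traceback
        raise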
Still using the dev server, you can use pdb to switch to interactive debugging:
def save_cookie(request, query):
    import pdb; pdb.set_trace()
    request.session['query'] = query
Then try to access the URL in your browser, switch back to your shell, and you're in a pdb session where you can inspect the request and (if there is one) the request.session object, etc.
NB: don't do this if running behind Apache or any other web server; only with the built-in dev server.
request.session = 'blah' will create the session attribute on the request object if it doesn't exist (and possibly replace the real session object if it already existed), so it's neither a valid test nor something sensible to do.
My 2 cents...
Disclaimer: I don't know anything about dajaxice.
The following will work on a mock request object:
def save_cookie(request, query):
    if not hasattr(request, 'session'):
        request.session = dict()
    request.session['query'] = query
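If you want the test request to carry a real session instead of a plain dict, one option (a sketch assuming the Django 1.3-era APIs linked above; newer Django versions require passing get_response when instantiating the middleware) is to build the request with RequestFactory and let SessionMiddleware attach the session:
from django.contrib.sessions.middleware import SessionMiddleware
from django.test.client import RequestFactory

def make_request_with_session(path="/"):
    # Build a bare request, then let the session middleware attach a real
    # session backend to it before handing it to save_cookie().
    request = RequestFactory().get(path)
    SessionMiddleware().process_request(request)
    request.session.save()
    return request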