I'm new to Python and Python's MySQL adapter. I'm not sure if I'm missing something obvious here:
self.db = MySQLdb.connect(...)  # db details omitted
cursor = self.db.cursor()
# WORKS
cursor.execute("SELECT site_id FROM users WHERE username=%s", (username,))
record = cursor.fetchone()
# DOES NOT SEEM TO WORK
cursor.execute("DELETE FROM users WHERE username=%s", (username,))
Any ideas?
I'd guess that you are using a storage engine that supports transactions (e.g. InnoDB) but you don't call db.commit() after the DELETE. The effect of the DELETE is discarded if you don't commit.
See http://mysql-python.sourceforge.net/FAQ.html#my-data-disappeared-or-won-t-go-away:
Starting with 1.2.0, MySQLdb disables autocommit by default, as required by the DB-API standard (PEP-249). If you are using InnoDB tables or some other type of transactional table type, you'll need to do connection.commit() before closing the connection, or else none of your changes will be written to the database.
See also this similar SO question: Python MySQLdb update query fails
Perhaps you are violating a foreign key constraint.
To your code above, just add a call to self.db.commit().
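A minimal sketch of the fix, assuming self.db is the connection the cursor was created from:
cursor.execute("DELETE FROM users WHERE username=%s", (username,))
self.db.commit()  # persist the DELETE; without this, InnoDB discards the change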
The feature is far from an annoyance: it saves you from data corruption issues when there are errors in your queries.
The problem might be that you are not committing the changes. It can be done with
conn.commit()
Read more on this here.
With the default settings in Django (version 3.1), is it safe to do the following?
with connection.cursor() as cursor:
    cursor.execute("BEGIN")

# Some SQL operations

commit_or_rollback = "COMMIT" if success else "ROLLBACK"
with connection.cursor() as cursor:
    cursor.execute(commit_or_rollback)
Or must I set autocommit to False with the set_autocommit method first, since Django's autocommit closes transactions? Or is autocommit isolated, so there will be no problem with my code?
In case you're asking why I'm using raw SQL for transactions: I've tried managing transactions manually as the docs indicate, but it had some issues in a multi-process environment, so I had to implement it with raw queries.
OK, I've been reading more, testing in my project and watching the queries Django executes in the database logs. It seems to be safe to use transactions with raw SQL: autocommit begins a new transaction with any operation and doesn't interfere with transactions opened by other connections.
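For completeness, a minimal sketch of the same idea with an explicit rollback guard; do_work() is a hypothetical stand-in for the actual SQL operations run on this connection:
from django.db import connection

def run_in_raw_transaction(do_work):
    # Open an explicit transaction; with autocommit on, BEGIN only affects this connection.
    with connection.cursor() as cursor:
        cursor.execute("BEGIN")
    try:
        do_work()
    except Exception:
        with connection.cursor() as cursor:
            cursor.execute("ROLLBACK")
        raise
    with connection.cursor() as cursor:
        cursor.execute("COMMIT")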
Currently using the cx_Oracle module in Python to connect to my Oracle database. I would like to only allow the user of the program to do read-only executions, like SELECT, and NOT INSERT/DELETE queries.
Is there something I can do to the connection/cursor variables once I establish the connection to prevent writable queries?
I am using Python.
Appreciate any help.
Thanks.
One possibility is to issue the statement "set transaction read only" as in the following code:
import cx_Oracle
conn = cx_Oracle.connect("cx_Oracle/welcome")
cursor = conn.cursor()
cursor.execute("set transaction read only")
cursor.execute("insert into c values (1, 'test')")
That will result in the following error:
ORA-01456: may not perform insert/delete/update operation inside a READ ONLY transaction
Of course you'll have to make sure that you create a Connection class that calls this statement when it is first created and after each and every commit() and rollback() call. And it can still be circumvented by calling a PL/SQL block that performs a commit or rollback.
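A minimal sketch of such a wrapper, relying on cx_Oracle's support for subclassing Connection; the credentials in the commented usage line are placeholders:
import cx_Oracle

class ReadOnlyConnection(cx_Oracle.Connection):
    """Re-issue SET TRANSACTION READ ONLY so plain DML raises ORA-01456."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._begin_read_only()

    def _begin_read_only(self):
        cursor = self.cursor()
        cursor.execute("set transaction read only")
        cursor.close()

    def commit(self):
        super().commit()
        self._begin_read_only()  # commit starts a new transaction, so lock it down again

    def rollback(self):
        super().rollback()
        self._begin_read_only()

# conn = ReadOnlyConnection("user/password@dsn")  # placeholder credentials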
The only other possibility that I can think of right now is to create a restricted user or role which simply doesn't have the ability to insert, update, delete, etc., and make sure the application uses that user or role. This one at least is foolproof, but a lot more effort up front!
I want to come up with a minimal set of queries/lines of code that extracts the table metadata within a database, on as many database versions as possible. I'm using PostgreSQL. I'm trying to do this with Python, but I've no clue how, as I'm a Python newbie.
I appreciate your ideas/suggestions on this issue.
You can ask your database driver, in this case psycopg2, to return some metadata about a database connection you've established. You can also ask the database directly about some of its capabilities, or schemas, but this is highly dependent on the version of the database you're connecting to, as well as the type of database.
Here's an example taken from http://bytes.com/topic/python/answers/438133-find-out-schema-psycopg for PostgreSQL:
>>> import psycopg2 as db
>>> conn = db.connect('dbname=billings user=steve password=xxxxx port=5432')
>>> curs = conn.cursor()
>>> curs.execute("""SELECT table_name FROM information_schema.tables WHERE table_schema='public' AND table_type='BASE TABLE'""")
>>> curs.fetchall()
[('contacts',), ('invoicing',), ('lines',), ('task',), ('products',), ('project',)]
However, you probably would be better served using an ORM like SQLAlchemy. This will create an engine which you can query about the database you're connected to, as well as normalize how you connect to varying database types.
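For instance, a minimal sketch using SQLAlchemy's inspector; the connection URL is a placeholder:
from sqlalchemy import create_engine, inspect

# Placeholder URL; swap in your own credentials, host and database name.
engine = create_engine("postgresql+psycopg2://steve:xxxxx@localhost:5432/billings")

inspector = inspect(engine)
for table_name in inspector.get_table_names(schema="public"):
    # Per-column metadata: name, type, nullable, default, ...
    columns = inspector.get_columns(table_name, schema="public")
    print(table_name, [col["name"] for col in columns])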
If you need help with SQLAlchemy, post another question here! There's TONS of information already available by searching the site.
I'm trying to implement a server-side cursor in order to "bypass" the Django ORM's weakness when it comes to fetching a huge amount of data from the database.
But I don't understand how named cursors are supposed to be defined, since my current code doesn't seem to work properly. I define the cursor in this way:
import psycopg2
import psycopg2.extras
from uuid import uuid4

id = 'cursor%s' % uuid4().hex
connection = psycopg2.connect('my connection string here')
cursor = connection.cursor(id, cursor_factory=psycopg2.extras.RealDictCursor)
The cursor seems to work in that it can be iterated over and returns the expected records as Python dictionaries, but when I try to close it (cursor.close()) I get the exception:
psycopg2 OperationalError: cursor *the generated cursor id* does not exist
WTF?! So what is the object I'm using to retrieve stuff from the database?
Is psycopg2 falling back to a default (unnamed) cursor since the one I defined is not found in my database? (And if so, my big question: is it mandatory to define a cursor at the db level before using psycopg2?) I'm really confused, can you help me?
I made a really simple and silly mistake of forgetting to run ./manage.py makemigrations and ./manage.py migrate before running ./manage.py test which caused this error.
(I'm aware this doesn't answer the original question, but since this is the first result from Google I thought I would contribute. Hopefully that's okay)
I've had this problem when playing around with my models and launching the tests with pytest.
What resolved the problem for me was to reset the database of my test unit. I used --create-db like so:
pytest backend/test_projects/partners/test_actions.py --create-db
I had a similar problem and found the solution. Just disable server-side cursors as described here: https://docs.djangoproject.com/en/2.2/ref/settings/#disable-server-side-cursors
'default': {
    ...
    'USER': DB_USER,
    'PASSWORD': DB_PASSWORD,
    'NAME': DB_NAME,
    'DISABLE_SERVER_SIDE_CURSORS': True,
    ...
},
From the psycopg2 documentation:
"Named cursors are usually created WITHOUT HOLD, meaning they live only as long as the current transaction. Trying to fetch from a named cursor after a commit() or to create a named cursor when the connection transaction isolation level is set to AUTOCOMMIT will result in an exception."
Which is to say that these cursors do not need to be explicitly closed.
http://initd.org/psycopg/docs/usage.html#server-side-cursors
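A minimal sketch of the intended pattern, keeping the fetch and the close inside the same transaction; the connection string, table name and process() handler below are placeholders:
import psycopg2
import psycopg2.extras
from uuid import uuid4

connection = psycopg2.connect('my connection string here')  # placeholder

name = 'cursor%s' % uuid4().hex
cursor = connection.cursor(name, cursor_factory=psycopg2.extras.RealDictCursor)
cursor.itersize = 2000  # rows fetched from the server per network round trip

cursor.execute("SELECT * FROM some_big_table")  # placeholder table
for row in cursor:
    process(row)  # placeholder per-row handler

cursor.close()       # still inside the transaction, so the named cursor exists
connection.commit()  # committing earlier would have destroyed the WITHOUT HOLD cursor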
I have a Python script that connects to a local MySQL db. I know it is connecting correctly because I can do this and get the proper results:
cursor.execute("SELECT * FROM reel")
But when I try to do any insert statements it just does nothing. No error messages, no exceptions. Nothing shows up in the database when I check it from sqlyog. This is what my code looks like:
self.cursor.executemany("INSERT INTO reel (etime,etext) VALUES (%s,%s)", tups)
where tups is a list of tuples that look like this: ('0000-00-00 00:00:00', 'text'). No errors show up, and if I copy-paste the generated SQL query into sqlyog it works. I've also tried generating the query and running cursor.execute() on it: no errors, but no result either. Anyone know what I'm doing wrong?
You need to commit on the connection after self.cursor.executemany("INSERT INTO reel (etime,etext) VALUES (%s,%s)", tups); note that commit() is a method of the connection object, not the cursor.
Starting with 1.2.0, MySQLdb disables autocommit by default, as required by the DB-API standard (PEP-249). If you are using InnoDB tables or some other type of transactional table type, you'll need to do connection.commit() before closing the connection, or else none of your changes will be written to the database.
Conversely, you can also use connection.rollback() to throw away any changes you've made since the last commit.
Important note: Some SQL statements -- specifically DDL statements like CREATE TABLE -- are non-transactional, so they can't be rolled back, and they cause pending transactions to commit.
It's a FAQ.
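A minimal sketch of the corrected flow, assuming self.db is the MySQLdb connection the cursor was created from (the attribute name is an assumption):
try:
    self.cursor.executemany("INSERT INTO reel (etime,etext) VALUES (%s,%s)", tups)
    self.db.commit()    # persist the whole batch
except MySQLdb.Error:
    self.db.rollback()  # discard the partial batch on error
    raise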