I'm using this website: https://remotemysql.com/ for a SQL database. When I try to update a value from the SQL console in phpMyAdmin, it works:
UPDATE users SET id='someid' WHERE username='myusername';
but with the Python MySQL connector it doesn't:
import mysql.connector
mydb = mysql.connector.connect(
    host="remotemysql.com",
    user="blahblah",
    password="blahblah",
    database="blahblah",
)
mycursor = mydb.cursor()
mycursor.execute("UPDATE users SET id='someid' WHERE username='myusername';")
mydb.close()
The command executes and doesn't throw an error, but in phpMyAdmin there's no visible change. Other commands, like reading data, work. How can I make the update work?
There are two possible reasons:
You should call mydb.commit() on the connection after your statement, since UPDATE is a DML statement and mysql.connector does not autocommit by default.
Or no rows match your WHERE clause.
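A minimal sketch of the corrected script, assuming the same placeholder credentials as above:

import mysql.connector

mydb = mysql.connector.connect(
    host="remotemysql.com",
    user="blahblah",
    password="blahblah",
    database="blahblah",
)

mycursor = mydb.cursor()
mycursor.execute("UPDATE users SET id='someid' WHERE username='myusername';")

# Without this commit, the UPDATE is rolled back when the connection closes.
mydb.commit()

# rowcount reveals whether the WHERE clause matched anything at all.
print(mycursor.rowcount, "row(s) updated")

mydb.close()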
Related
I need to insert the logs from my test case into a table in a PostgreSQL database. I was able to connect to the DB, but I can't figure out how to insert the log line into the table. I have tried the code below, but it doesn't work:
import logging
import psycopg2
from io import StringIO
from config import config
params = config()
conn = psycopg2.connect(**params)
print(conn)
curr = conn.cursor()
try:
    if not hw.has_connection():
        logging.error('Failure: Unable to reach website! ==> ' + str(driver.find_element_by_xpath('//span[@jsselect="heading" and @jsvalues=".innerHTML:msg"]').text))
        return
    elif hw.is_websiteReachable():
        logging.info("Success: website is reachable!")
        curr.execute("""INSERT INTO db_name(logs) VALUES (%s)""", ("Success: website is reachable!"))
        conn.commit()
except:
    logging.error("Failure: Unable to reach website!")
    return
I am a total beginner at this. I have searched, but I couldn't find a clear example or guide about it. The above code throws the exception even though the website is reachable. Sorry if I sound dumb.
It looks like you're incorrectly constructing your SQL statement. Instead of INSERT INTO db_name(table_name) ... it should be INSERT INTO table_name(column_name) .... If you've correctly connected to the appropriate database in your connection settings, you usually don't have to specify the database name each time you write your SQL.
Therefore I would recommend the following modification (assuming your table is called logs and it has a column named message):
# ...
sql = 'INSERT INTO logs(message) VALUES (%s);'
msg = 'Success: website is reachable!'
curr.execute(sql, (msg,))  # (msg,) with the trailing comma is a one-element tuple; (msg) is just a string
conn.commit()
You can read the psycopg2 docs here for more information as well, for example on passing named parameters to your SQL queries in Python.
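A minimal sketch of that named-parameter style, reusing the hypothetical logs table and message column from above:

# %(name)s placeholders are filled from a dict instead of a tuple.
sql = 'INSERT INTO logs(message) VALUES (%(msg)s);'
curr.execute(sql, {'msg': 'Success: website is reachable!'})
conn.commit()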
You can also check out a solution that I personally use in my server-side projects: you just need to give a connection string to the CRUD object and everything else is done for you. For Postgres you can use:
'postgresql+psycopg2://username:password@host:port/database'
or
'postgresql+pg8000://username:password@host:port/database'
For more details, check SQLAlchemy Engine Configuration.
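A minimal sketch of wiring such a connection string into an SQLAlchemy engine and performing the insert (the credentials, host, and logs table are placeholders, not verified names):

from sqlalchemy import create_engine, text

# Hypothetical connection string; substitute your real credentials.
engine = create_engine('postgresql+psycopg2://username:password@host:5432/database')

# engine.begin() commits automatically when the block exits cleanly.
with engine.begin() as conn:
    conn.execute(text('INSERT INTO logs(message) VALUES (:msg)'),
                 {'msg': 'Success: website is reachable!'})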
I'm attempting to bulk insert a csv into a table in SQL server. The catch is, the data doesn't match the columns of the destination table. The destination table has several audit columns that are not found in the source file. The solution I found for this is to insert into a view instead. The code is pretty simple:
from sqlalchemy import create_engine
engine = create_engine('mssql+pyodbc://[DSN]')
conn = engine.connect()
sql = "BULK INSERT [table view] FROM '[source file path]' WITH (FIELDTERMINATOR = ',',ROWTERMINATOR = '\n')"
conn.execute(sql)
conn.close()
When I run the SQL statement inside of SSMS it works perfectly. When I try to execute it from inside a Python script, the script runs but no data winds up in the table. What am I missing?
Update: It turns out bulk inserting into a normal table doesn't work either.
Before closing the connection, you need to call commit() or the SQL actions will be rolled back on connection close.
conn.commit()
conn.close()
It turns out that instead of using SQLAlchemy, I had to use pypyodbc. Not sure why this worked and the other way didn't. Example code found here: How to Speed up with Bulk Insert to MS Server from Python with Pyodbc from CSV
This works for me, after checking the SQLAlchemy transactions reference. I don't explicitly call conn.commit(), as:
The block managed by each .begin() method has the behavior such that the transaction is committed when the block completes.
with engine.begin() as conn:
    conn.execute(sql_bulk_insert)
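Put together with the snippet from the question, a minimal end-to-end sketch might look like this (the DSN, view name, and file path placeholders are kept from the question):

from sqlalchemy import create_engine, text

engine = create_engine('mssql+pyodbc://[DSN]')

# Double backslash so SQL Server, not Python, interprets the \n terminator.
sql_bulk_insert = text(
    "BULK INSERT [table view] FROM '[source file path]' "
    "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')"
)

# The transaction is committed when the block completes without error,
# so no explicit conn.commit() is needed.
with engine.begin() as conn:
    conn.execute(sql_bulk_insert)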
I need to run an SQL query in OpenERP from Python code as a user other than postgres, one that has only SELECT privileges. Is there a way for the cursor (cr) to receive a connection string?
OK, I found an easy solution for this. As OpenERP uses psycopg as its PostgreSQL database cursor, I explicitly created a psycopg connection object with the parameters I need:
conn = psycopg1.connect(database=cr.dbname, user=dbuser, password=dbpass)
cur = conn.cursor()
Be careful: if you want to use dictfetchall, you need to import psycopg1:
from psycopg2 import psycopg1
cur.execute(sql)
res = cur.dictfetchall()
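As a side note, plain psycopg2 can give the same dict-per-row behaviour without the psycopg1 compatibility layer, via a cursor factory; a minimal sketch, assuming the same dbname, user, and password variables as above:

import psycopg2
import psycopg2.extras

conn = psycopg2.connect(database=cr.dbname, user=dbuser, password=dbpass)

# RealDictCursor makes fetchall() return one dict per row,
# much like the old dictfetchall().
cur = conn.cursor(cursor_factory=psycopg2.extras.RealDictCursor)
cur.execute(sql)
res = cur.fetchall()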
I am trying to update rows in my MySQL database with a Python 2.7 script. I am updating a field that has the unique option enabled. When I try to create a duplicate entry, Python does not give me an error message.
connection = MySQLdb.connect(
    host=DB_HOST,
    db=DB_DB,
    user='root', passwd='',
    charset="utf8"
)
cur = connection.cursor()
sql = "UPDATE type SET article_code='Duplicate_Code' where id=9"
try:
    cur.execute(sql)
    connection.commit()
    print "No ERROR"
except:
    print "ERROR"
connection.close()
OUTPUT: No ERROR
The dataset, however, is not updated. If I enter the same SQL Code in the phpMyAdmin interface, I get the following message:
#1062 - Duplicate entry 'Duplicate_Code' for key 'article_code'
I would like my Python script to end up in the except branch.
What am I doing wrong here?
If you read through the examples here: http://mysql-python.sourceforge.net/MySQLdb.html#mysqldb you'll see that you should use something more akin to this:
cur.execute("""UPDATE type SET article_code=%s where id=%s""", ('Duplicate_Code', 9))
as the 9 in your query will get converted to '9', which probably does not exist as a key, hence your query does nothing, meaning it always succeeds.
Also, as a side note, it's advisable to always use this pattern, as MySQLdb will escape your values for you; otherwise you or others may be tempted to do something that opens you up to SQL injection.
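To actually land in the except branch only on genuine failures, it can also help to catch the specific exception; a minimal sketch (MySQLdb raises IntegrityError for duplicate-key violations):

try:
    cur.execute("""UPDATE type SET article_code=%s where id=%s""",
                ('Duplicate_Code', 9))
    connection.commit()
    print "No ERROR"
except MySQLdb.IntegrityError as e:
    # e.g. (1062, "Duplicate entry 'Duplicate_Code' for key 'article_code'")
    print "ERROR: %s" % (e,)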
I'm connecting to MySQL in my Kivy application.
import mysql.connector
con = mysql.connector.Connect(host='XXX', port=XXX, user='XXX', password='XXX', database='XXX')
cur = con.cursor()
db = cur.execute("""SELECT SQL_NO_CACHE * from abc""")
data = cur.fetchall()
print (data)
After inserting into or deleting from table abc from another connection, I call the same query in Python, but the data is not updated.
I added the query "SET SESSION query_cache_type = OFF;" before the select query, but it didn't work. Someone said a "select NOW() ..." query is not cacheable, but that didn't work either. What should I do?
I solved this by adding the following after fetchall():
con.commit()
Calling the same select query without doing a commit won't update the results.
The solution is to use, once:
con.autocommit = True
or, after each select query:
con.commit()
With autocommit enabled, there is a commit after each query. Otherwise, subsequent selects will render the same result.
This behaviour seems to be MySQL Bug #42197, related to the query cache and autocommit. The status is won't fix!
In a few months this should be irrelevant, because MySQL 8.0 is dropping the query cache.
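A minimal sketch of the autocommit variant with mysql.connector, reusing the placeholder credentials from the question:

import mysql.connector

# autocommit can be passed at connect time (or set later via the
# con.autocommit property); every statement is then committed
# immediately, so repeated SELECTs see other connections' changes.
con = mysql.connector.connect(host='XXX', port=XXX, user='XXX',
                              password='XXX', database='XXX',
                              autocommit=True)

cur = con.cursor()
cur.execute("SELECT * FROM abc")
print(cur.fetchall())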
I encountered the same problem, which has been solved above with conn.commit(), and I found that different DBMSs behave differently: not all of them cache at the connection level. You can also try:
conn.autocommit = True
which will auto-commit after each of your select queries.
The MySQL query cache is flushed when tables are modified, so it wouldn't have that effect. It's impossible to say without seeing the rest of your code, but it's most likely that your INSERT / DELETE query is failing to run.