I am writing my first Python script, and I am trying to connect it to a MySQL database to insert rows.
import MySQLdb
db = MySQLdb.connect("localhost","root","xxx","pytest" )
cursor = db.cursor()
cursor.execute("INSERT INTO `first_table` (`name`) VALUES ('boop') ")
When I check the MySQL database via phpMyAdmin, it contains no rows. However, if the auto-incrementing ID was 5 and I then run the script 2 times, the next row I insert gets id = 8, so the script has been incrementing the primary key but not inserting the rows?
The script reports no MySQL errors, so I'm a bit lost here.
In your case, please use:
import MySQLdb
db = MySQLdb.connect("localhost","root","jimmypq79","pytest" )
cursor = db.cursor()
cursor.execute("INSERT INTO `first_table` (`name`) VALUES ('boop') ")
db.commit()
Or put this at the top of the code, like this:
db = MySQLdb.connect("localhost","root","jimmypq79","pytest" )
db.autocommit(True)
You can call
db.autocommit(True)
on the connection at the beginning of the code to commit the changes automatically.
Or you can call
db.commit()
after the insert query.
I'm working in a Jupyter Notebook and using pymysql. I can read from the database, so the connection must be established, but my INSERT statements don't seem to take effect.
connection = pymysql.connect(endpoint, user, passwd, db)
insert = [('Popowice',363000),('Wroclaw',389991),('Biskupin',359000)]
sql = "INSERT INTO housing_wroclaw (`District`, `Price`) VALUES (%s, %s)"
cursor = connection.cursor()
cursor.executemany(sql,insert)
This piece of code, with my credentials, returns 3 (the number of inserted tuples) and no errors, but the database just doesn't have those records. I also tried looping through the values with execute() rather than executemany(), but neither worked, and the latter is apparently better anyway.
Below is my working SELECT statement:
cursor = connection.cursor()
cursor.execute('SELECT * from housing_wroclaw')
rows = cursor.fetchall()
How can I INSERT? Why doesn't it work?
You must call connection.commit() after inserting data to make it persistent.
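For example, a minimal sketch of the insert from the question with the missing commit added (the connection values are placeholders):

import pymysql

# Placeholder connection values; use your own endpoint, user, password and database name
connection = pymysql.connect(host="endpoint", user="user", password="passwd", db="db")

insert = [('Popowice', 363000), ('Wroclaw', 389991), ('Biskupin', 359000)]
sql = "INSERT INTO housing_wroclaw (`District`, `Price`) VALUES (%s, %s)"

cursor = connection.cursor()
cursor.executemany(sql, insert)
connection.commit()   # without this the rows are discarded when the connection closes
cursor.close()
connection.close()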
I have a MySQL database of some measurements taken by a device, and I'm looking for a way to retrieve specific columns from it, where the user chooses which columns they need from a Python interface/front end. All the solutions I've seen so far either retrieve all columns or have the columns specified in the code itself.
Is there a possible way I could do this?
Thanks!
Your query can look something like this:
select
table_name, table_schema, column_name
from information_schema.columns
where table_schema in ('schema1', 'schema2')
and column_name like '%column_name%'
order by table_name;
You can definitely pass the column_name as a parameter (fetched from the Python code) and run the query dynamically.
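For example, a minimal sketch of running that query from Python with the column-name fragment bound as a parameter (the credentials and schema names are placeholders):

import MySQLdb

# Placeholder credentials; the column-name fragment comes from the Python front end
db = MySQLdb.connect("host", "username", "password", "information_schema")
fragment = input("Column name to search for: ")

cursor = db.cursor()
cursor.execute(
    """
    SELECT table_name, table_schema, column_name
    FROM information_schema.columns
    WHERE table_schema IN ('schema1', 'schema2')
      AND column_name LIKE %s
    ORDER BY table_name
    """,
    ('%' + fragment + '%',))
for table_name, table_schema, column_name in cursor.fetchall():
    print(table_schema, table_name, column_name)
db.close()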
import MySQLdb
import MySQLdb.cursors

# Get the column name from the user (it must exist in the table)
column = input()

# Open the database connection; a DictCursor lets rows be indexed by column name
db = MySQLdb.connect("host", "username", "password", "DB_name",
                     cursorclass=MySQLdb.cursors.DictCursor)

# Prepare a cursor object using the cursor() method
cursor = db.cursor()

# Execute the SQL query using the execute() method
cursor.execute("SELECT * FROM TABLE")

# Fetch all rows using the fetchall() method
result_set = cursor.fetchall()
for row in result_set:
    print(row[column])

# Disconnect from the server
db.close()
Or you can use .execute() to run a query that selects only the column the user asked for.
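For example, a minimal sketch of that approach (the table name measurements and the credentials are placeholders): because column names cannot be passed as %s parameters, check the user's choice against the table's actual columns before building the query.

import MySQLdb

# Placeholder credentials and table name ('measurements' stands in for the real table)
db = MySQLdb.connect("host", "username", "password", "DB_name")
cursor = db.cursor()

column = input("Column to retrieve: ")

# Column names cannot be bound with %s, so validate the user's choice against
# the table's real columns before interpolating it into the query.
cursor.execute("SHOW COLUMNS FROM measurements")
valid_columns = {row[0] for row in cursor.fetchall()}
if column not in valid_columns:
    raise ValueError("Unknown column: " + column)

cursor.execute("SELECT `{}` FROM measurements".format(column))
for (value,) in cursor.fetchall():
    print(value)

db.close()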
I am trying to execute a very long statement from python which has around 1.3 million characters using following code:
import pyodbc

conn_str = ('Driver={SQL Server};'
            'SERVER=MYSERVER;DATABASE=MyDatabase;Trusted_Connection=yes')
conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
try:
    cursor.execute("A SQL statement with 1.3m characters")
    cursor.commit()
except Exception as e:
    print(e)
finally:
    conn.close()
It's basically a long list of insert statements.
I am watching SQL Profiler as this runs against my SQL Server, and each time it executes a different number of the INSERT statements. It inserts data up to around 40k characters and then suddenly stops. I was thinking there might be a maximum number of characters a SQL statement can hold, but since a different number of statements executes each run, that doesn't sound like the issue here?
Does anyone have any idea what's happening here and how I could get around it?
Thanks,
Joe
Edit:
here is the query:
SET XACT_ABORT ON;
SET QUOTED_IDENTIFIER ON;
IF (select max(id)
from Table1) = 87648
BEGIN
BEGIN TRY
BEGIN TRANSACTION
INSERT INTO Table1 VALUES (87649, 'G4KG72HF6','87649');
INSERT INTO Table1 VALUES (87650, 'G4KG72HF6','87650');
INSERT INTO Table1 VALUES (87651, 'GDGVFKVW6','87651');
INSERT INTO Table1 VALUES (87652, 'GYAPWLNU1','87652');
INSERT INTO Table1 VALUES (87653, 'GYAPWLNU1','87653');
INSERT INTO Table1 VALUES (87654, 'H884542A2','87654');
INSERT INTO Table1 VALUES (87655, 'HT2XM4U83','87655');
INSERT INTO Table1 VALUES (87656, 'GPD9P39C7','87656');
INSERT INTO Table1 VALUES (87657, 'J2ZBUN7Q7','87657');
INSERT INTO Table1 VALUES (87658, 'JBWS35M69','87658');
INSERT INTO Table1 VALUES (87659, 'JMU6ANZN7','87659');
INSERT INTO Table1 VALUES (87660, 'JWRLK6D48','87660');
INSERT INTO Table1 VALUES (87661, 'K6NZSPSL2','87661');
--- a lot more inserts happening here
COMMIT
END TRY
BEGIN CATCH
PRINT N'ERROR: ' + ERROR_MESSAGE()
IF @@TRANCOUNT > 0
BEGIN
ROLLBACK
PRINT N'Transaction rolled back'
END
END CATCH
END
ELSE
PRINT 'Max id in Table1 != 87648, has this script been run already?'
It's basically a long list of insert statements
Since your SQL text does not begin with SET NOCOUNT ON;, each INSERT statement is generating an update count that gets queued so it can be returned to your Python app, and there is a limit as to how long that queue can be.
So, just prepend SET NOCOUNT ON; to your SQL text to avoid the problem.
(See this GitHub issue for a more thorough discussion.)
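A minimal sketch of that fix, assuming the generated script is already held in a string (the variable name big_sql and its contents are placeholders):

import pyodbc

conn_str = ('Driver={SQL Server};'
            'SERVER=MYSERVER;DATABASE=MyDatabase;Trusted_Connection=yes')
conn = pyodbc.connect(conn_str)
cursor = conn.cursor()

# Stand-in for the real 1.3 million character script generated elsewhere
big_sql = "INSERT INTO Table1 VALUES (87649, 'G4KG72HF6','87649');"

try:
    # SET NOCOUNT ON stops each INSERT from queuing an update count for the client
    cursor.execute("SET NOCOUNT ON;\n" + big_sql)
    cursor.commit()
except Exception as e:
    print(e)
finally:
    conn.close()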
What I'm trying to achieve is placing data generated by a Python script into a MySQL database. So far I have been able to generate the data with Python and print it, but I'm not too sure how to place it into the MySQL table. I've read that you can use MySQLdb as a connector between the two, and currently I am able to place data into the table, but only data that I type manually.
Hopefully the code makes sense, but I am trying to place the values from BidSize, AskSize, BidPrice, and AskPrice into the table.
from wrapper_v3 import IBWrapper, IBclient
from swigibpy import Contract as IBcontract

if __name__=="__main__":
    """
    This simple example returns streaming price data
    """
    callback = IBWrapper()
    client=IBclient(callback)

    ibcontract = IBcontract()
    ibcontract.secType = "FUT"
    ibcontract.expiry="201612"
    ibcontract.symbol="GE"
    ibcontract.exchange="GLOBEX"

    ans=client.get_IB_market_data(ibcontract)
    print "Bid size, Ask size; Bid price; Ask price"
    print ans

    import MySQLdb as mdb

    con = mdb.connect('localhost', 'testuser', 'test623', 'testdb')

    with con:
        cur = con.cursor()
        cur.execute("DROP TABLE IF EXISTS Histdata")
        cur.execute("CREATE TABLE Histdata(Id INT PRIMARY KEY AUTO_INCREMENT, \
            Name VARCHAR(25), BidSize VARCHAR(20), AskSize VARCHAR(20), BidPrice VARCHAR(20), AskPrice VARCHAR(20))")
        cur.execute("INSERT INTO Histdata(Name,BidSize,AskSize,BidPrice,AskPrice) VALUES('test','test','test','test','test')")
Parameterize your query like this:
# assuming ans is a list
bid_size, ask_size, bid_price, ask_price = ans
cur.execute('''
INSERT INTO Histdata
(Name,BidSize,AskSize,BidPrice,AskPrice)
VALUES(%s,%s,%s,%s,%s)''', ('test', bid_size, ask_size, bid_price, ask_price))
As Fabricator correctly pointed out, please parameterize your queries to prevent SQL injection.
However, the reason you are not seeing the data written to your table is that you did not commit the transaction. Your cursor adds queries to the transaction, but only after you issue a commit is the data actually written to the database.
Add con.commit() after adding your cursor queries to finalize the transaction.
In older versions of the MySQLdb adapter for Python, data would be committed automatically, but this is considered bad practice because the programmer should remain in control of transactions. This behaviour is also laid out in Python's DB API, which all database adapters are asked to follow.
As a side note: Don't forget to close both your cursor and your connection when you are done:
cursor.close()
conn.close()
Otherwise your connection will stay open in an idle state and you could potentially deplete your pool of connection slots.
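Putting it together, a minimal sketch of the insert section with parameterization, an explicit commit, and cleanup (the market-data values are stand-ins for what ans returns):

import MySQLdb as mdb

# Stand-in values; in the real script these come from ans returned by get_IB_market_data()
bid_size, ask_size, bid_price, ask_price = 10, 12, 98.5, 98.75

con = mdb.connect('localhost', 'testuser', 'test623', 'testdb')
cur = con.cursor()
try:
    cur.execute(
        "INSERT INTO Histdata (Name, BidSize, AskSize, BidPrice, AskPrice) "
        "VALUES (%s, %s, %s, %s, %s)",
        ('test', bid_size, ask_size, bid_price, ask_price))
    con.commit()   # make the insert permanent
finally:
    cur.close()    # release the cursor
    con.close()    # free the connection slot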
I have a strange problem that I'm having trouble both duplicating and solving.
I'm using the pyodbc library in Python to access an MS Access 2007 database. The script basically just imports a CSV file into Access, plus a few other tricks.
I am trying to first save a 'Gift Header', then get the auto-incremented id (GiftRef) that it is saved with, and use this value to save one or more associated 'Gift Details'.
Everything works exactly as it should - 90% of the time. The other 10% of the time Access seems to get stuck and repeatedly returns the same value for cur.execute("select last(GiftRef) from tblGiftHeader").
Once it gets stuck, it returns this value for the duration of the script. It does not happen while processing a specific entry or at any specific time in the execution; it seems to happen completely at random.
Also, I know that it is returning the wrong value: the Gift Headers are being saved and are being given new, unique IDs, but for whatever reason that value is not being returned correctly when queried.
SQL = "insert into tblGiftHeader (PersonID, GiftDate, Initials, Total) VALUES "+ str(header_vals) + ""
cur.execute(SQL)
gift_ref = [s[0] for s in cur.execute("select last(GiftRef) from tblGiftHeader")][0]
cur.commit()
Any thoughts or insights would be appreciated.
In Access SQL the LAST() function does not necessarily return the most recently created AutoNumber value. (See here for details.)
What you want is to do a SELECT @@IDENTITY immediately after you commit your INSERT, like this:
import pyodbc
cnxn = pyodbc.connect('DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\\Users\\Public\\Database1.accdb;')
cursor = cnxn.cursor()
cursor.execute("INSERT INTO Clients (FirstName, LastName) VALUES (?, ?)", ['Mister', 'Gumby'])
cursor.commit()
cursor.execute("SELECT ##IDENTITY AS ID")
row = cursor.fetchone()
print row.ID
cnxn.close()
Yep! That seems to be a much more reliable way of getting the last id. I believe my initial code was based on the example here http://www.w3schools.com/sql/sql_func_last.asp which I suppose I took out of context.
Thanks for the assist! Here is the updated version of my original code (with connection string):
MDB = 'C:\\Users\\Public\\database.mdb'
DRV = '{Microsoft Access Driver (*.mdb)}'
conn = pyodbc.connect('DRIVER={};DBQ={}'.format(DRV,MDB))
curs = conn.cursor()
SQL = "insert into tblGiftHeader (PersonID, GiftDate, Initials, Total) VALUES "+ str(header_vals) + ""
curs.execute(SQL)
curs.commit()
curs.execute("SELECT ##IDENTITY AS ID")
row = curs.fetchone()
gift_ref = row.ID