I'm trying to figure out what is happening during a MySQL INSERT statement.
I have a utility I am writing that opens my database (a Django database in MySQL) and adds a record from another database into it. I think I am including all the fields it needs, and if I actually cut and paste the INSERT statement that gets generated, it works.
It does not work programmatically, however. It doesn't seem to generate an error, and the last insert row seems to indicate success, but the actual record never gets there (I think it is being rolled back for some reason). I just don't see what the problem is. I successfully use the same cursor to check something else in the same database right before this, so the cursor should be good.
Below is the insert code in Python.
create_string = """Insert INTO trackx_site_program
SET air_date = '%s',
air_time = '%s',
service = '%s',
block_time = '%s',
block_time_delta = %d,
running_time = '%s',
running_time_delta = %d,
remaining_time = '%s',
remaining_time_delta = %d,
title = '%s',
locked_flag = %d,
deleted_flag = %d,
library = '%s',
mc = '%s',
producer = '%s',
editor = '%s',
remarks = '%s',
audit_time = '%s',
audit_user = 'todd' """ % (
air_date, air_time, service_name, block_time, block_time_delta,
running_time, running_time_delta, remaining_time, remaining_time_delta,
title, locked_flag, deleted_flag, library, mc, producer, editor,
remarks, audit_time)
print(" Create String = \n %s" % create_string)
num_rows = new_trackx_cursor.execute(create_string)
print ("Num_rows from execute = %s" % num_rows)
new_program_id = new_trackx_cursor.lastrowid
print("Last Row ID Inserted was %s " % new_program_id)
new_trackx_cursor.close()
sys.exit("Exiting after insert")
An example of the string is below:
Create String =
Insert INTO trackx_site_program
SET air_date = '2001-06-13',
air_time = '18:00:00',
service = 'TheService',
block_time = '0:57:00',
block_time_delta = 3420000000,
running_time = '00:00:00',
running_time_delta = 0,
remaining_time = '0:57:00',
remaining_time_delta = 3420000000,
title = 'My061301',
locked_flag = 1,
deleted_flag = 0,
library = 'K061301-PM',
mc = 'ToddS',
producer = 'TheProducer',
editor = 'theEditor',
remarks = 'REGULAR PROGRAM',
audit_time = '2001-06-13 10:55:16',
audit_user = 'toadyb'
Num_rows from execute = 1
Last Row ID Inserted was 22
Exiting after insert
The actual database table looks like this:
desc trackx_site_program;
+----------------------+--------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+----------------------+--------------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| air_date | date | NO | | NULL | |
| air_time | time(6) | NO | | NULL | |
| service | varchar(10) | NO | | NULL | |
| block_time | time(6) | NO | | NULL | |
| block_time_delta | bigint(20) | NO | | NULL | |
| running_time | time(6) | NO | | NULL | |
| running_time_delta | bigint(20) | NO | | NULL | |
| remaining_time | time(6) | NO | | NULL | |
| remaining_time_delta | bigint(20) | NO | | NULL | |
| title | varchar(190) | NO | | NULL | |
| locked_flag | tinyint(1) | NO | | NULL | |
| locked_expiration | datetime(6) | YES | | NULL | |
| deleted_flag | tinyint(1) | NO | | NULL | |
| library | varchar(190) | YES | | NULL | |
| mc | varchar(64) | NO | | NULL | |
| producer | varchar(64) | NO | | NULL | |
| editor | varchar(64) | NO | | NULL | |
| remarks | longtext | YES | | NULL | |
| audit_time | datetime(6) | NO | | NULL | |
| audit_user | varchar(32) | YES | | NULL | |
+----------------------+--------------+------+-----+---------+----------------+
21 rows in set (0.00 sec)
As you can see, I print out the actual string, and if I cut and paste it into a mysql session, it works just fine.
Anybody know what gives here? Is there a step I am missing?
Thanks
Not quite sure why, but my connections in Python were all created with autocommit=0 (False), so my transactions were automatically rolling back. I did a commit on the database connection handle after the insert, and the transaction completed successfully.
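A minimal sketch of that fix against the code above (new_trackx_db is a hypothetical name for the connection handle the cursor came from):

num_rows = new_trackx_cursor.execute(create_string)
new_program_id = new_trackx_cursor.lastrowid

# DB-API connections start with autocommit disabled, so the INSERT sits in an
# open transaction and is rolled back when the connection closes. Commit it:
new_trackx_db.commit()

# Alternatively, turn autocommit on right after connecting:
# new_trackx_db.autocommit(True)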
Related
I have created a small Excel file to try this process.
+----------+-------------+------+-----+---------+-------+
| Field    | Type        | Null | Key | Default | Extra |
+----------+-------------+------+-----+---------+-------+
| S no     | int(11)     | NO   | PRI | NULL    |       |
| Order    | int(11)     | YES  |     | NULL    |       |
| Price    | int(11)     | YES  |     | NULL    |       |
| comments | varchar(45) | YES  |     | NULL    |       |
+----------+-------------+------+-----+---------+-------+
4 rows in set (1.21 sec)
This is the code I am using in Python for the same:
import xlrd
import pymysql

book = xlrd.open_workbook(r'C:\Users\hp\Desktop\excel-python-mysql.xlsx')
sheet = book.sheet_by_index(0)
# Connect to the database
connection = pymysql.connect(host='localhost',
user='root',
password='*****',
db='trial',
charset='utf8mb4',
cursorclass=pymysql.cursors.DictCursor)
cursor = connection.cursor()
query = """INSERT INTO demo(Sno,Order,Price,Comments)VALUES(%d,%d,%d,%s)"""
for r in range(1,sheet.nrows):
    Sno = sheet.cell(r,0).value
    Order = sheet.cell(r,1).value
    Price = sheet.cell(r,2).value
    Comments = sheet.cell(r,3).value
    values = (Sno,Order,Price,Comments)
    cursor.execute(query,values)
cursor.close()
connection.commit()
connection.close()
but I am facing an error stating "%d format: a number is required, not str". I want to move the data to MySQL to use it later in Metabase.
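The likely cause is that PyMySQL only supports %s placeholders: it escapes every parameter into a string before formatting the query, so %d chokes on the escaped value. A minimal sketch of the loop with that fixed, under that assumption (note that Order is a reserved word in MySQL, so it is backtick-quoted here; table and column names are taken from the question):

query = "INSERT INTO demo (Sno, `Order`, Price, Comments) VALUES (%s, %s, %s, %s)"
for r in range(1, sheet.nrows):
    # xlrd returns numeric cells as floats; cast them if the columns are INT
    Sno = int(sheet.cell(r, 0).value)
    Order = int(sheet.cell(r, 1).value)
    Price = int(sheet.cell(r, 2).value)
    Comments = sheet.cell(r, 3).value
    cursor.execute(query, (Sno, Order, Price, Comments))
cursor.close()
connection.commit()  # PyMySQL does not autocommit by default
connection.close()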
So I'm trying to get prices of cryptocurrencies via an API every N seconds and store them in a MySQL DB in Python 3. So far I have written this code:
import requests
import pymysql
import json
import threading

def addEthPrice():
    threading.Timer(11, addEthPrice).start()
    url = 'https://min-api.cryptocompare.com/'
    urlPrice = 'https://min-api.cryptocompare.com/data/pricemulti?fsyms=BTC,ETH&tsyms=USD,EUR,GBP'
    login_data = dict(login='*', password='*')
    session = requests.session()
    r = session.post(urlPrice, data=login_data)
    print("Response Code: ", r, "\n")
    data = json.loads(r.content.decode())
    conn = pymysql.connect(host='localhost', port=3306, user='root', passwd='*', db='eth')
    sql = "INSERT INTO price (date, ethGBP, ethUSD, ethEUR) VALUES(UTC_TIMESTAMP, %s, %s, %s)"
    params = (str(data['ETH']['GBP']),
              str(data['ETH']['USD']),
              str(data['ETH']['EUR']))
    try:
        with conn.cursor() as c:
            res = c.execute(sql, params)
            print(params)
            print("mysql response: ", res)
    finally:
        conn.close()

if __name__ == "__main__":
    addEthPrice()
Now this code runs fine on Windows and stores the data in the DB. Here is the description of the table for Windows:
+--------+---------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+--------+---------------+------+-----+---------+-------+
| date | timestamp | YES | | NULL | |
| ethGBP | decimal(12,8) | YES | | NULL | |
| ethUSD | decimal(12,8) | YES | | NULL | |
| ethEUR | decimal(12,8) | YES | | NULL | |
+--------+---------------+------+-----+---------+-------+
4 rows in set (0.00 sec)
And here is the description for the Linux table:
+--------+---------------+------+-----+-------------------+-----------------------------+
| Field | Type | Null | Key | Default | Extra |
+--------+---------------+------+-----+-------------------+-----------------------------+
| date | timestamp | NO | | CURRENT_TIMESTAMP | on update CURRENT_TIMESTAMP |
| ethGBP | decimal(12,8) | YES | | NULL | |
| ethUSD | decimal(12,8) | YES | | NULL | |
| ethEUR | decimal(12,8) | YES | | NULL | |
+--------+---------------+------+-----+-------------------+-----------------------------+
4 rows in set (0.04 sec)
I used the same syntax to create both tables, but the Linux one has added parameters.
Here's my question: when I run this script, the data gets added fine to the WAMP MySQL but not to the Linux MySQL version. Why is this, and how can I make it work on both?
Linux version: MySQL 5.7.18-0ubuntu0.16.04.1
Windows version: MySQL 5.7.14
Thanks.
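Given the accepted answer further up, one suspect worth checking is a missing commit: PyMySQL connections default to autocommit=False, so if the Linux table uses a transactional engine such as InnoDB, the INSERT is discarded when the connection closes. A sketch of the change, under that assumption:

    try:
        with conn.cursor() as c:
            res = c.execute(sql, params)
            print(params)
            print("mysql response: ", res)
        conn.commit()  # without this, a transactional engine rolls the row back
    finally:
        conn.close()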
I have a database with tables: person, player, coach, and team. All the tables have an auto-increment id field as the primary key. Person has id, firstname, lastname. Player and coach both have the id field, as well as person_id and team_id as foreign keys to tie them to a team.id or person.id field in the other tables.
I have one master CSV file, and from it I want to import all the values into the different MySQL tables with ids.
I also want to check each value against the database: if the value is already in the database, do not import it.
I have used CSV parsing and an indexing function, but I am not able to make it work. Can anyone help me with that?
My SQL tables are below:
mysql> describe person;
+-----------+-------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+-----------+-------------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| firstname | varchar(30) | NO | | NULL | |
| lastname | varchar(30) | NO | | NULL | |
+-----------+-------------+------+-----+---------+----------------+
mysql> describe player;
+-----------+---------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+-----------+---------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| person_id | int(11) | NO | MUL | NULL | |
| team_id | int(11) | NO | MUL | NULL | |
+-----------+---------+------+-----+---------+----------------+
mysql> describe team;
+-----------+-------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+-----------+-------------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| teamname | varchar(25) | NO | | NULL | |
| location | varchar(40) | NO | | NULL | |
| city | varchar(25) | NO | | NULL | |
| state | varchar(2) | NO | | NULL | |
| venue | varchar(35) | NO | | NULL | |
| league_id | int(11) | NO | MUL | NULL | |
+-----------+-------------+------+-----+---------+----------------+
My CSV file is:
First Name  Last Name  teamname  Location  city  state  venue
abc         cdf        india     csv       bng   kar    abc
After importing:
id  First Name  Last Name  teamname  Location  city  state  venue  comment
1   1           1          1         1         1     1      abc    abc
I am trying with some small code:
import csv

# initialize empty lists
name, cities, countries, states = [], [], [], []
with open('ind.csv', 'rb') as csvfile:
    reader = csv.reader(csvfile, delimiter=',')
    reader.next()  # skip header
    for row in reader:
        name.append(row[0])
        cities.append(row[2])
        states.append(row[3])
        countries.append(row[4])

# de-duplicate each column
cl = list(set(countries))
sl = list(set(states))
citl = list(set(cities))
inf1 = list(set(name))

with open('countries.csv', 'w') as cfile:
    writer = csv.writer(cfile, delimiter=',')
    writer.writerow(['country_id', 'name'])
    for i, x in enumerate(cl):
        writer.writerow([i, x])

with open('state.csv', 'w') as cfile:
    writer = csv.writer(cfile, delimiter=',')
    # header order matches the data written below: id, state, country_id
    writer.writerow(['state_id', 'state', 'country_id'])
    for i, x in enumerate(sl):
        writer.writerow([i, x, cl.index(countries[states.index(x)])])

with open('cities.csv', 'w') as cfile:
    writer = csv.writer(cfile, delimiter=',')
    writer.writerow(['city_id', 'city', 'st_id', 'country_id'])
    for i, x in enumerate(citl):
        writer.writerow([i, x, sl.index(states[cities.index(x)]),
                         cl.index(countries[cities.index(x)])])

with open('inf123.csv', 'w') as cfile:
    writer = csv.writer(cfile, delimiter=',')
    writer.writerow(['Name_id', 'Name', 'city_id', 'st_id', 'country_id'])
    for i, x in enumerate(inf1):
        writer.writerow([i, x,
                         citl.index(cities[name.index(x)]),
                         sl.index(states[name.index(x)]),
                         cl.index(countries[name.index(x)])])
import MySQLdb
import csv

mydb = MySQLdb.connect(host="localhost",  # the host
                       user="root",       # username
                       passwd="root",     # password
                       db="abcm")         # name of the database

cursor = mydb.cursor()
csv_data = csv.reader(file('countries.csv'))
csv_data.next()  # skip the header row written above
for row in csv_data:
    # unquoted %s placeholders: the driver adds the quoting itself
    cursor.execute('INSERT INTO country(id, name) VALUES (%s, %s)', row)
# commit the transaction and close the cursor
mydb.commit()
cursor.close()
print "Done"

cursor = mydb.cursor()
csv_data = csv.reader(file('state.csv'))
csv_data.next()  # skip the header row
for row in csv_data:
    # column order matches the CSV: id, state name, country id
    cursor.execute('INSERT INTO state(id, name, country) VALUES (%s, %s, %s)', row)
mydb.commit()
cursor.close()
print "Done"
I have one master csv file, from that I want import all the values in
MySql different tables with ids.
This is not possible because the import routine doesn't know where you want to put the data.
If your master CSV file contained a column with the table name, you could then:
1. import your CSV file into a temporary table, and
2. use different SQL statements to move the data into the appropriate tables.
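A minimal sketch of that staging approach, assuming the CSV has already been loaded into a temporary table (staging is a hypothetical name) with the same column names as the question's schema; the league_id value is a placeholder, since the CSV does not carry one:

# fan the staged rows out into the real tables; the NOT EXISTS guard
# skips values that are already in the database
cursor.execute("""
    INSERT INTO person (firstname, lastname)
    SELECT s.firstname, s.lastname FROM staging s
    WHERE NOT EXISTS (SELECT 1 FROM person p
                      WHERE p.firstname = s.firstname
                        AND p.lastname = s.lastname)
""")
cursor.execute("""
    INSERT INTO team (teamname, location, city, state, venue, league_id)
    SELECT s.teamname, s.location, s.city, s.state, s.venue, 1
    FROM staging s
    WHERE NOT EXISTS (SELECT 1 FROM team t WHERE t.teamname = s.teamname)
""")
mydb.commit()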
I have two tables: user and post,
and their structures are:
post:
+---------+----------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+---------+----------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| name | char(30) | YES | | NULL | |
| user_id | int(11) | YES | | NULL | |
+---------+----------+------+-----+---------+----------------+
user:
+---------+----------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+---------+----------+------+-----+---------+----------------+
| user_id | int(11) | NO | PRI | NULL | auto_increment |
| name | char(30) | YES | | NULL | |
| email | char(30) | YES | | NULL | |
+---------+----------+------+-----+---------+----------------+
I get this (the keys of the data dict):
['post.user_id', 'user_id', 'name', 'email', 'post.name', 'id']
My Python code is:
import MySQLdb
import MySQLdb.cursors
con = MySQLdb.connect(user = "root", passwd = "123456", db = "mydb", cursorclass=MySQLdb.cursors.DictCursor)
cur = con.cursor()
cur.execute("select * from user, post where user.user_id = post.user_id")
print cur.fetchone().keys()
But why are the keys of the data dict like that? Thanks. My English is not very good; excuse me.
When you select *, you ask for all columns in both user and post. Since user and post have columns with overlapping names, the table name is added before a few of them, to create unique keys.
I'm not sure what you were expecting, but you can explicitly control the keys you get by giving the columns aliases:
"select user.user_id as user_id, post.name as post_name, user.name as user_name ..."
I have a table with the following attributes.
+--------------+--------------+------+-----+-------------------+----------------+
| Field | Type | Null | Key | Default | Extra |
+--------------+--------------+------+-----+-------------------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| url | varchar(255) | YES | | NULL | |
| mail | varchar(255) | YES | | NULL | |
| date_entered | timestamp | NO | | CURRENT_TIMESTAMP | |
| active | tinyint(1) | NO | | 1 | |
+--------------+--------------+------+-----+-------------------+----------------+
Now I want to insert only the date_entered and have the other attributes take their default values.
I'm doing it this way because I need the id field to match exactly another id I inserted into a different table.
This is the code:
tx.execute(\
"insert into company_career (date_entered) "
"values (%s)",
(time.time())
)
This is the error:
query = query % db.literal(args)
exceptions.TypeError: not all arguments converted during string formatting
How do I fix it?
Try this:
tx.execute(
    "insert into company_career (date_entered) "
    "values (FROM_UNIXTIME(%s))",
    (time.time(),)  # trailing comma: execute() expects a sequence of parameters
)
When you are providing values in an insert query, don't forget to enclose them in quotes:
"insert into company_career (date_entered) values ('%s')"
I did manage to insert with: insert into company(url,mail,date_entered) values("","","now");
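Given the schema above, where date_entered already defaults to CURRENT_TIMESTAMP and active defaults to 1, a sketch of an insert that simply lets every column take its default (url and mail are nullable, so they become NULL):

tx.execute("insert into company_career () values ()")  # all columns take their defaults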