How to get column names in MySQLdb with Python 2.7?

If I use a select * query it works well, but when I try to query the column names too, it doesn't work (maybe because I have a column called "FROM", but that's why I quoted it as 'FROM'!?).
Here is my code:
connection = MySQLdb.connect(host='localhost',
                             user='admin',
                             passwd='',
                             db='database1',
                             use_unicode=True,
                             charset="utf8")
cursor = connection.cursor()
query = """ select ACTUAL_TIME, 'FROM, ID
union all
select ACTUAL_TIME, FROM , ID
from TEST
into outfile '/tmp/test.csv'
fields terminated by ';'
enclosed by '"'
lines terminated by '\n';
"""
cursor.execute(query)
connection.commit()
cursor.close()
I get this error message:
raise errorvalue
_mysql_exceptions.OperationalError: (1054, "Unknown column 'ACTUAL_TIME' in 'field list'")
EDIT: SHOW CREATE TABLE TEST;
| TEST | CREATE TABLE `TEST` (
`ACTUAL_TIME` varchar(100) DEFAULT NULL,
`FROM` varchar(100) DEFAULT NULL,
`STATUS` varchar(100) DEFAULT NULL,
`ID` int(10) NOT NULL AUTO_INCREMENT,
PRIMARY KEY (`ID`)
) ENGINE=InnoDB AUTO_INCREMENT=76287 DEFAULT CHARSET=utf8 |

Try this (FROM is a reserved word, so the column name has to be escaped with backticks):
connection = MySQLdb.connect(host='localhost',
                             user='admin',
                             passwd='',
                             db='database1',
                             use_unicode=True,
                             charset="utf8")
cursor = connection.cursor()
query = """ select 'ACTUAL_TIME', 'FROM', 'ID' -- single-quoted string literals for the header row
union all
select `ACTUAL_TIME`, `FROM`, `ID` -- backticks around the column names
from TEST
into outfile '/tmp/test.csv'
fields terminated by ';'
enclosed by '"'
lines terminated by '\n';
"""
cursor.execute(query)
connection.commit()
cursor.close()
Or else you can use SHOW COLUMNS to get the column names (see the sketch below), or:
cursor.execute("SELECT * FROM table_name LIMIT 0")
print cursor.description

To map each fetched row to a dict keyed by column name:
columns = cursor.description
result = [{columns[index][0]: column for index, column in enumerate(value)} for value in cursor.fetchall()]
print(result)
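A minimal sketch of the SHOW COLUMNS approach mentioned above, assuming the same cursor and the TEST table from the question:
cursor.execute("SHOW COLUMNS FROM TEST")
# each row is (Field, Type, Null, Key, Default, Extra); the column name is the first element
column_names = [row[0] for row in cursor.fetchall()]
print column_names  # ['ACTUAL_TIME', 'FROM', 'STATUS', 'ID']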

Related

A script doesn't execute the command in cur.execute()

I'm new to PostgreSQL and psycopg2, and I'm facing a problem.
import psycopg2

def create_tables(whichone):
    in_str_station = "CREATE TABLE station (id SMALLINT NOT NULL PRIMARY KEY, name VARCHAR(40) NOT NULL," \
                     " country VARCHAR(3) NOT NULL, latitude VARCHAR(10), " \
                     "longitude VARCHAR(10),height SMALLINT);"
    in_str_dailyData = "CREATE TABLE daily_data (id BIGSERIAL NOT NULL PRIMARY KEY," \
                       "s_id SMALLINT NOT NULL REFERENCES station(id)," \
                       "d_date DATE NOT NULL, d_mean NUMERIC(6, 1), quality SMALLINT);"
    int_str_monthlyMean = "CREATE TABLE monthly_mean (id BIGSERIAL NOT NULL PRIMARY KEY," \
                          "s_id SMALLINT NOT NULL REFERENCES station(id)," \
                          "m_date DATE NOT NULL, m_mean NUMERIC(9, 3)," \
                          "var NUMERIC(9, 3), std NUMERIC(9, 3));"
    in_str_yearlymean = "CREATE TABLE yearly_mean (id BIGSERIAL NOT NULL PRIMARY KEY, " \
                        "s_id SMALLINT NOT NULL REFERENCES station(id)," \
                        "y_date DATE NOT NULL, y_mean NUMERIC(9, 3),var NUMERIC(9, 3)," \
                        "std NUMERIC(9, 3), var_m NUMERIC(9, 3), std_m NUMERIC(9, 3));"
    database_list = {'station': in_str_station, 'monthly_mean': int_str_monthlyMean,
                     'daily_data': in_str_dailyData, 'yearly_mean': in_str_yearlymean}
    try:
        conn = psycopg2.connect(
            host="localhost",
            database="climate",
            user="postgres",
            password="1")
        cur = conn.cursor()
        in_str = database_list.get(whichone)
        cur.execute(in_str)
        output_ = cur.fetchall()
        print(output_)
        cur.close()
    except (Exception, psycopg2.DatabaseError) as error:
        print(error)
    finally:
        if conn is not None:
            conn.close()
After I run the script, no matter which one of the in_str_ statements I choose, the table is not created. I have checked that when I copy the content of the in_str I executed in cur.execute and run it in the PostgreSQL shell, everything works.
Where did I make the mistake?
Call conn.commit() after cur.execute(), but before conn.close(). Transactions are not implicitly committed with psycopg2:
If the connection is closed (using the close() method) or destroyed (using del or by letting it fall out of scope) while a transaction is in progress, the server will discard the transaction.
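For example, a minimal sketch of where the commit fits in the try block from the question (the surrounding names are taken from the question's code):
        cur = conn.cursor()
        in_str = database_list.get(whichone)
        cur.execute(in_str)
        conn.commit()  # persist the CREATE TABLE; closing without committing discards it
        cur.close()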
I don't know if it's just how you pasted it here, but the indentation is wrong. You need to indent the code inside the function:
def foo(a):
    pass

Error inserting records into MariaDB table using Python: 1366 (22007): Incorrect integer value: '%s' for column

I am inserting records into a MariaDB table from a file using Python. The population column in the file is empty, and I want it to go into the table as an empty (NULL) value as well. The population column in the table is defined as an integer and accepts NULL. I am trying the code below.
Table definition:
CREATE TABLE local_db.table_x (
    Unique_code varchar(50) NOT NULL,
    city varchar(200) DEFAULT NULL,
    state varchar(50) DEFAULT NULL,
    population bigint(20) DEFAULT NULL,
    Govt varchar(50) DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
input_file = "input_file"
csv_data = csv.reader(open(input_file))
try:
connection = mysql.connector.connect(host='localhost',
database='local_db',
user='root',
password='root',
port = '3306')
cursor = connection.cursor()
for row in csv_data:
cursor.execute("""
INSERT INTO table_x(Unique_code,city,state,population,Govt)
VALUES("%s", "%s", "%s","%s", "%s")
ON DUPLICATE KEY UPDATE city = VALUES(city),state = VALUES(state), \
population = VALUES(population),Govt = VALUES(Govt)""")
connection.commit()
print(cursor.rowcount, "Record inserted successfully into table_x")
cursor.close()
except mysql.connector.Error as error:
print("Failed to insert record into table_x table {}".format(error))
finally:
if (connection.is_connected()):
connection.close()
print("MySQL connection is closed")
But I am getting the error below:
Failed to insert record into table_x table 1366 (22007): Incorrect integer value: '%s' for column local_db.table_x.population at row 1
MySQL connection is closed
In another thread it was suggested to change
SET sql_mode = ""
but that's not an option for me, since I would be running this on an organization server that I cannot change just for this. Please suggest what code changes I can make here to handle this situation.
The population column is a bigint(20) DEFAULT NULL, hence you cannot give it the literal value "%s"; it has to be passed an integer.
Also, on the values front it can hold either NULL or any integer value.
So in case you want it to be NULL and not an empty string (""), you can skip inserting a value into the population column altogether by removing it from the insert command:
INSERT INTO table_x(Unique_code,city,state,Govt)
VALUES("%s", "%s", "%s","%s")
ON DUPLICATE KEY UPDATE city = VALUES(city),state = VALUES(state), Govt = VALUES(Govt)
Assuming that each row of your CSV has 5 values, corresponding to the code, city, state, population, and government, in that order, you should be using this syntax for the insert query:
for row in csv_data:
    params = row  # csv.reader already yields each row as a list of fields
    sql = """
        INSERT INTO table_x (Unique_code, city, state, population, Govt)
        VALUES(%s, %s, %s, %s, %s)
        ON DUPLICATE KEY UPDATE city = VALUES(city), state = VALUES(state),
            population = VALUES(population), Govt = VALUES(Govt)"""
    cursor.execute(sql, params)
The first parameter to execute() should be the SQL prepared statement, and the second parameter should be a sequence of values. In this case, it should contain 5 values, corresponding to the five %s placeholders in the VALUES clause of your query (MySQL Connector uses %s, not ?, as its placeholder). An example of a valid tuple here might be:
params = ('ABC123', 'New York', 'New York', 15000000, 'USA',)
Then call via:
cursor.execute(sql, params)
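Since the question specifically wants an empty population field stored as NULL, a minimal sketch of one way to do that (this is an assumption about the intended behavior, not part of the original answer) is to convert empty CSV fields to None before executing; MySQL Connector sends None as SQL NULL:
for row in csv_data:
    # turn empty CSV fields (e.g. the population column) into None so they are stored as NULL
    params = [field if field != '' else None for field in row]
    cursor.execute(sql, params)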

executemany() not inserting all records to database

I'm trying to use executemany() to insert four (4) records at a time into a MySQL database table, using the MySQL Connector driver.
The problem is that executemany() is consistently "skipping" the first 3 records without throwing an error, and I can't understand what's wrong.
I'm trying to insert the values in a list:
my_records = [(334, 20533, 387.5, 'Label1'), (335, 20534, 387.5, 'Label2'), (336, 108659, 387.5, 'Label3'), (337, 108660, 387.5, 'Label4')]
And then here's my code:
try:
    mydb = mysql.connector.connect(
        host=os.getenv('DBM_HOST', 'x.x.x.x'),
        user=os.getenv('DBM_USER', 'username'),
        passwd=os.getenv('DBM_PASSWORD', 'password'),
        database=os.getenv('DBM_NAME', 'my_database')
    )
    mycursor = mydb.cursor()
    sql = """
        INSERT INTO my_table (
            batch_id, user_id, assessment, label
        ) VALUES (
            %s, %s, %s, %s
        )
    """
    mycursor.executemany(sql, my_records)
    mydb.commit()
    mycursor.close()
    mydb.close()
    return True
except Exception as exception:
    print(exception)
    return None
Does anyone know what's happening? Or why the first 3 records are not inserting?
Here's the table structure:
Field Type Null Key Default Extra
id int(10) unsigned NO PRI auto_increment
batch_id int(10) unsigned NO MUL
user_id int(10) unsigned NO
assessment decimal(10,2) unsigned NO
label varchar(250) YES
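As a debugging aid only (a sketch, not a diagnosis of the skipped rows), MySQL Connector exposes rowcount after executemany(), which can be compared with the size of the input list:
mycursor.executemany(sql, my_records)
mydb.commit()
# rowcount reflects the rows affected by the last executemany() call
print("{} rows inserted out of {}".format(mycursor.rowcount, len(my_records)))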

pymysql.err.ProgrammingError: You have an error in your SQL syntax

I am trying to insert data from a Python dictionary into a MySQL DB, but I don't understand what is wrong with my SQL query.
I am getting this error:
pymysql.err.ProgrammingError: (1064, u"You have an error in your SQL
syntax; check the manual that corresponds to your MySQL server version
for the right syntax to use near ''DiedIn' ('name', 'city') VALUES
('\'Ethel_Merman\'', '\'New_York_City\\n\'')' at line 1")
This is my code:
import pymysql.cursors

wasBornIn = {}
with open("wasBornIn.txt") as f:
    for line in f:
        (key, val) = line.split(':')
        wasBornIn[key] = val

diedIn = {}
with open("diedIn.txt") as f:
    for line in f:
        (key, val) = line.split(':')
        diedIn[key] = val

isLocatedIn = {}
with open("isLocatedIn.txt") as f:
    for line in f:
        (key, val) = line.split(':')
        isLocatedIn[key] = val

connection = pymysql.connect(host='********', user='******', password='******', db='*******',
                             charset='utf8mb4', cursorclass=pymysql.cursors.DictCursor)
try:
    with connection.cursor() as cursor:
        # Create a new record
        sql = "DROP TABLE DiedIn"
        cursor.execute(sql)
    with connection.cursor() as cursor:
        # Create a new record
        sql = "DROP TABLE isLocatedIn"
        cursor.execute(sql)
    with connection.cursor() as cursor:
        # Create a new record
        sql = "DROP TABLE BornIn"
        cursor.execute(sql)
    with connection.cursor() as cursor:
        sql = "CREATE TABLE `DiedIn`(`name` varchar(100) COLLATE utf8_bin NOT NULL, `city` varchar(50) COLLATE utf8_bin NOT NULL, " \
              "PRIMARY KEY(`name`)) ENGINE = InnoDB DEFAULT CHARSET = utf8" \
              " COLLATE = utf8_bin;"
        cursor.execute(sql)
    with connection.cursor() as cursor:
        sql = "CREATE TABLE `isLocatedIn`(`name` varchar(150) COLLATE utf8_bin NOT NULL, `location` varchar(50) COLLATE utf8_bin NOT NULL, " \
              "PRIMARY KEY(`name`)) ENGINE = InnoDB DEFAULT CHARSET = utf8" \
              " COLLATE = utf8_bin;"
        cursor.execute(sql)
    with connection.cursor() as cursor:
        sql = "CREATE TABLE `BornIn`(`name` varchar(100) COLLATE utf8_bin NOT NULL, `city` varchar(50) COLLATE utf8_bin NOT NULL, " \
              "PRIMARY KEY(`name`)) ENGINE = InnoDB DEFAULT CHARSET = utf8" \
              " COLLATE = utf8_bin;"
        cursor.execute(sql)
    with connection.cursor() as cursor:
        for key, value in diedIn.iteritems():
            strKey = repr(key)
            strValue = repr(value)
            sql = "INSERT INTO 'DiedIn' ('name', 'city') VALUES (%s, %s);"
            cursor.execute(sql, (strKey, strValue))
    # connection is not autocommit by default. So you must commit to save
    # your changes.
    connection.commit()
finally:
    connection.close()
Thanks for the help.
Try:
sql = "INSERT INTO `DiedIn` (`name`, `city`) VALUES (%s, %s);"
Identifiers (the table and column names) need backticks rather than single quotes, and the %s placeholders should not be quoted, because pymysql quotes and escapes the parameters itself.

pymysql not inserting data; but "autoincrement" increases

This is a follow-up to https://stackoverflow.com/questions/33336963/use-a-python-dictionary-to-insert-into-mysql/33337128#33337128.
import pymysql

conn = pymysql.connect(server, user, password, "db")
cur = conn.cursor()
ORFs = {'E7': '562', 'E6': '83', 'E1': '865', 'E2': '2756 '}
table = "genome"
cols = ORFs.keys()
vals = ORFs.values()
sql = "INSERT INTO %s (%s) VALUES(%s)" % (
    table, ",".join(cols), ",".join(vals))
print sql
print ORFs.values()
cur.execute(sql)
cur.close()
conn.close()
Thanks to Xiaohen, my program works (i.e. it does not throw any errors), but when I go and check the mysql database, the data is not inserted. I noticed that the autoincrement ID column does increase with every failed attempt. So this suggests that I am at least making contact with the database?
As always, any help is much appreciated
EDIT: I included the output from mysql> show create table genome;
| genome | CREATE TABLE `genome` (
`ID` int(11) NOT NULL AUTO_INCREMENT,
`state` char(255) DEFAULT NULL,
`CG` text,
`E1` char(25) DEFAULT NULL,
`E2` char(25) DEFAULT NULL,
`E6` char(25) DEFAULT NULL,
`E7` char(25) DEFAULT NULL,
PRIMARY KEY (`ID`)
) ENGINE=InnoDB AUTO_INCREMENT=15 DEFAULT CHARSET=latin1 |
I think I figured it out.
I will add the info here in case someone else comes across this question:
I need to add conn.commit() to the script
You can use
try:
    cur.execute(sql)
except Exception, e:
    print e
If your code is wrong, the exception can tell you.
There is another issue as well: the cols and vals may not match up. The values should be
vals = [ORFs[col] for col in cols]
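Putting the two fixes together (the conn.commit() from the edit above plus parameter binding instead of string interpolation for the values), a minimal sketch assuming the genome table from the question:
cols = ORFs.keys()
placeholders = ",".join(["%s"] * len(cols))
sql = "INSERT INTO genome (%s) VALUES (%s)" % (",".join(cols), placeholders)
cur.execute(sql, [ORFs[col] for col in cols])  # pymysql quotes the values itself
conn.commit()  # without the commit the row is rolled back, though the AUTO_INCREMENT id is still consumed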
