I have a SQL query which opens a csv file and dumps it into a table of a database. I am trying to dump multiple files at once, using a python script to iterate over the files. I tried embedding the same SQL query inside the script, but it throws an error.
This is the script I have.
import csv
import MySQLdb
connection = MySQLdb.connect(host='localhost',
                             user='root',
                             passwd='password',
                             db='some_db')
cursor = connection.cursor()
query = """ LOAD DATA INFILE 'c:\\example.csv' INTO TABLE new_table FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"' Lines terminated by '\n' IGNORE 1 LINES """
cursor.execute(query)
conenction.commit()
cursor.close()
And for some reason, the python script looks up example.csv at a different location.
This is the error that is thrown out :
raise errorclass, errorvalue
InternalError: (29, "File 'C:\\Documents and Settings\\All Users\\Application Data\\MySQL\\MySQL Server 5.5\\data\\example.csv' not found (Errcode: 2)")
Any help would be greatly appreciated. I am also searching on stackoverflow for help on dumping the csv files into different tables of a database. Any ideas on that?
You probably need the LOAD DATA LOCAL syntax to make sure the data is read relative to the client and not the server. Change
query = """ LOAD DATA INFILE 'c:\\example.csv' INTO TABLE new_table FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"' Lines terminated by '\n' IGNORE 1 LINES """
to
query = """ LOAD DATA LOCAL INFILE 'c:\\example.csv' INTO TABLE new_table FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"' Lines terminated by '\n' IGNORE 1 LINES """
Watch your spelling! conenction.commit() should be connection.commit()
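For the multiple-files part (and the different-tables question), here is a minimal sketch of how the loop might look once LOCAL and the spelling are fixed. The folder path, the file-name-to-table-name rule, and the local_infile=1 connection flag are assumptions on my part, so adjust them to your setup:

import os
import MySQLdb

CSV_DIR = r'c:\csv-files'   # assumed folder that holds the csv files

connection = MySQLdb.connect(host='localhost',
                             user='root',
                             passwd='password',
                             db='some_db',
                             local_infile=1)   # needed client-side for LOAD DATA LOCAL
cursor = connection.cursor()

for name in sorted(os.listdir(CSV_DIR)):
    if not name.lower().endswith('.csv'):
        continue
    # assumed convention: sales.csv is loaded into a table named sales
    table = os.path.splitext(name)[0]
    path = os.path.join(CSV_DIR, name).replace('\\', '/')
    query = ("LOAD DATA LOCAL INFILE '%s' INTO TABLE `%s` "
             "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\"' "
             "LINES TERMINATED BY '\\n' IGNORE 1 LINES" % (path, table))
    cursor.execute(query)

connection.commit()
cursor.close()
connection.close()

The table name is interpolated with plain Python string formatting here, so this is only safe as long as the file names come from a folder you control.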
I am trying to update a mysql database field with a concatenation. I have to read my file line by line, and I need to append each loaded line to the existing string. I have to do it like this because my goal is to insert a 3 GB whitespace-separated text file into one longtext field, and MySQL can only handle about 1 GB per insert.
The problem with my code is that if I add the field name to the concat function literally, like seq=concat(seq, %s), I get a SQL syntax error, but when I pass the field name in as a parameter, python treats it as a plain string.
So, long story short, with this input file:
aaa
bbb
ccc
I want to have an updated mysql field like this:
aaabbbccc
But I get this: seqccc
Any idea how I should work with the field name to get this to work?
import mysql.connector
connection = mysql.connector.connect(host='localhost',
                                     database='sys',
                                     user='Pannka',
                                     password='???')
cursor = connection.cursor()

with open('test.txt', 'r') as f:
    for line in f:
        sql = "update linedna set seq=concat(%s, %s) where id=1"
        val = ('seq', line.rstrip())
        print(line.rstrip())
        cursor.execute(sql, val)
        connection.commit()
cursor.close()
connection.close()
f.close()
print(0)
I think that you want:
sql = "update linedna set seq = concat(seq, %s) where id=1"
val=(line.rstrip())
cursor.execute(sql, val)
connection.commit()
This will append each new line to the end of the value already stored in column seq.
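For completeness, a minimal sketch of the whole loop with that change applied. Note the trailing comma in the parameter tuple: (line.rstrip()) without it is just a string, not a one-element tuple. The file name and connection details are the ones from the question, kept as placeholders:

import mysql.connector

connection = mysql.connector.connect(host='localhost',
                                     database='sys',
                                     user='Pannka',
                                     password='???')
cursor = connection.cursor()

sql = "update linedna set seq = concat(seq, %s) where id=1"
with open('test.txt', 'r') as f:
    for line in f:
        # one-element tuple: the trailing comma matters
        cursor.execute(sql, (line.rstrip(),))
connection.commit()   # a single commit after the loop is enough

cursor.close()
connection.close()

One thing to watch: if seq starts out as NULL, concat(seq, ...) stays NULL, so the column should start out as an empty string ''.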
I have a program that is exporting MSSQL data and importing it into MySQL. I have a function that is importing as follows:
def importMySql(mycursor, exportedfilename, table, delimiter):
    file_loc = str(sys.path[0] + "\\" + exportedfilename.lower() + ".out").replace("\\", "\\\\")
    mycursor.execute("LOAD DATA LOCAL INFILE '%s' INTO TABLE %s FIELDS TERMINATED BY '%s' LINES TERMINATED BY '\r\n'" % (str(file_loc), table, delimiter))
The cursor (MySQLdb) raises the following warnings:
C:\Users\tfy\Documents\PyProj\UTL (Export, Import, RDF)\eic.py:98: Warning: Data truncated for column 'DateofCharges' at row 1194
mycursor.execute("LOAD DATA LOCAL INFILE '%s' INTO TABLE %s FIELDS TERMINATED BY '%s' LINES TERMINATED BY '\r\n'" %(str(file_loc), table, delimiter))
C:\Users\tfy\Documents\PyProj\UTL (Export, Import, RDF)\eic.py:98: Warning: Data truncated for column 'DateofCharges' at row 2009
mycursor.execute("LOAD DATA LOCAL INFILE '%s' INTO TABLE %s FIELDS TERMINATED BY '%s' LINES TERMINATED BY '\r\n'" %(str(file_loc), table, delimiter))
C:\Users\tfy\Documents\PyProj\UTL (Export, Import, RDF)\eic.py:98: Warning: Data truncated for column 'DateofCharges' at row 4793
mycursor.execute("LOAD DATA LOCAL INFILE '%s' INTO TABLE %s FIELDS TERMINATED BY '%s' LINES TERMINATED BY '\r\n'" %(str(file_loc), table, delimiter))
but I need to control the warning to only output:
Warning: Data truncated for column 'DateofCharges' at row 1194
Warning: Data truncated for column 'DateofCharges' at row 2009
Warning: Data truncated for column 'DateofCharges' at row 4739
I have looked around and found plenty of information that illustrates how to create custom warnings. However, I am not sure how I would achieve the above. I do not want to turn off the warnings, I just want to "format" them. I thought about editing the actual MySQLdb file, but it is in .egg format so I was unable to do that. I also played around with warning.format() but was unsuccessful.
Thanks!
So this is the easiest way I have found... Not sure why I did not think of this originally... but I simply suppressed the warnings issued by the cursor:
import warnings
warnings.filterwarnings("ignore", category = MySQLdb.Warning)
I then added this code to my importMySql function:
mycursor.execute("SHOW WARNINGS")
warnings = mycursor.fetchall()
for i in range(len(warnings)):
print "Warning - " +warnings[i][2]
I figured out how to use pprint for this. As in the OP's solution, the default warnings need to be suppressed; then add the show_warnings() call and use the new print format.
from warnings import filterwarnings
import MySQLdb as mdb
from pprint import pprint
filterwarnings('ignore', category = mdb.Warning)
con = mdb.connect(...)
cur = con.cursor()
query = "Update table ..."
cur.execute(query)
con.commit()
warnings = con.show_warnings() # return in tuple type
pprint(warnings, width=100, depth=2) # width is the number of characters per output line, depth is how many nesting levels of the warnings tuple to print
Using MySQLdb
You could monkey patch MySQLdb to achieve this:
import types

def warning_check(self):
    if not self._warnings:
        self.messages = ()
        return
    self.messages = self._get_db().show_warnings()
Then patch the Cursor object in your function like this:
cur._warning_check = types.MethodType(warning_check, cur)
Then, when you are done executing LOAD DATA.., you can print the messages:
cur.execute("LOAD DATA..")
for msg in cur.messages:
print "Warning: {msg}".format(msg=msg[2])
Using MySQL Connector/Python
Using MySQL Connector/Python, you would do something like this:
cnx.get_warnings = True
cur.execute("LOAD DATA..")
for msg in cur.fetchwarnings():
    print "Warning: {msg}".format(msg=msg[2])
(Note that you need the client flag set with the connection argument client_flags=[mysql.connector.ClientFlag.LOCAL_FILES])
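Putting the Connector/Python pieces together, a minimal sketch (the connection details, file and table names are placeholders, not from the original post; newer Connector/Python versions may also want allow_local_infile=True):

import mysql.connector

cnx = mysql.connector.connect(host='localhost',
                              user='root',
                              password='password',
                              database='some_db',
                              client_flags=[mysql.connector.ClientFlag.LOCAL_FILES])
cnx.get_warnings = True          # collect warnings after each statement
cur = cnx.cursor()

cur.execute("LOAD DATA LOCAL INFILE 'c:/example.csv' INTO TABLE new_table "
            "FIELDS TERMINATED BY ',' IGNORE 1 LINES")

# fetchwarnings() returns None when there were no warnings
for msg in cur.fetchwarnings() or []:
    # each msg is a (level, code, message) tuple
    print "Warning: {msg}".format(msg=msg[2])

cnx.commit()
cur.close()
cnx.close()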
Is it possible to run your mysql code in a subprocess? If so, you can use Python's subprocess module to run the mysql client, read its output from stdout, and format it accordingly. For example, use process.stdout.readline().
You can refer to this question: Starting and Controlling an External Process via STDIN/STDOUT with Python
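A rough sketch of that idea, assuming the mysql command-line client is installed and on the PATH; --show-warnings makes the client print the warnings, and the script reformats them line by line (credentials, file and table names are placeholders):

import subprocess

cmd = ['mysql', '--local-infile=1', '-u', 'root', '-ppassword', 'test',
       '--show-warnings', '-e',
       "LOAD DATA LOCAL INFILE 'c:/example.csv' INTO TABLE new_table "
       "FIELDS TERMINATED BY ',' IGNORE 1 LINES"]

process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for raw in iter(process.stdout.readline, ''):
    line = raw.strip()
    if line.startswith('Warning'):
        print line          # reformat however you like here
process.wait()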
I am using the following script to load multiple .csv files into a MySQL database, but I am getting no errors (in the IDLE window) and the data does not load.
Here is the erroneous script
#!C:\Python27\python.exe
import MySQLdb
import os
import string
# Open database connection
db = MySQLdb.connect (host="localhost",port=3307,user="root",\
passwd="gamma123",db="test")
cursor=db.cursor()
l = os.listdir(".")
for file_name in l:
print file_name
cursor=db.cursor()
if (file_name.find("DIV.csv")>-1):
#Query under testing
sql = """LOAD DATA LOCAL INFILE file_name \
INTO TABLE system_work \
FIELDS TERMINATED BY ',' \
OPTIONALLY ENCLOSED BY '"' \
LINES TERMINATED BY '\r\n' \
IGNORE 1 LINES;;"""
try:
# Execute the SQL command
cursor.execute(sql)
# Commit your changes in the database
db.commit()
except:
# Rollback in case there is any error
db.rollback()
# disconnect from server
db.close()
But when I try to load a single file using the following python script, it works fine.
Please help....
#!C:\Python27\python.exe
import MySQLdb
import os
import string
# Open database connection
db = MySQLdb.connect (host="localhost",port=3307,user="root",\
passwd="gamma123",db="test")
cursor=db.cursor()
#Query under testing
sql = """LOAD DATA LOCAL INFILE 'Axle.csv' \
INTO TABLE system_work \
FIELDS TERMINATED BY ',' \
OPTIONALLY ENCLOSED BY '"' \
LINES TERMINATED BY '\r\n' \
IGNORE 1 LINES;;"""
try:
# Execute the SQL command
cursor.execute(sql)
# Commit your changes in the database
db.commit()
except:
# Rollback in case there is any error
db.rollback()
# disconnect from server
db.close()
You need to interpolate the filename into the SQL string; you are just sending the literal text file_name to the server. You could use the str.format() method for that, any {} placeholder can then be replaced by a variable of your choosing.
You also must indent the try and except blocks to be within the for loop:
sql = """LOAD DATA LOCAL INFILE '{}'
INTO TABLE system_work
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\\r\\n'
IGNORE 1 LINES;;"""
for file_name in l:
print file_name
if file_name.endswith('DIV.csv'):
try:
cursor = db.cursor()
cursor.execute(sql.format(file_name))
db.commit()
except Exception:
# Rollback in case there is any error
db.rollback()
The cursor.execute() method is passed the sql string with the file_name variable interpolated. The {} part on the first line (LOAD DATA LOCAL INFILE '{}') will be replaced by the value in file_name before passing the SQL statement to MySQL.
I also simplified the filename test; presumably it is enough if the filename ends with DIV.csv.
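To make the interpolation concrete, a quick interpreter session (the file name is made up):

>>> sql = "LOAD DATA LOCAL INFILE '{}' INTO TABLE system_work"
>>> print sql.format('report_DIV.csv')
LOAD DATA LOCAL INFILE 'report_DIV.csv' INTO TABLE system_work

Keep in mind that str.format() does no SQL quoting of its own, so this relies on the file names not containing quote characters.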
Note that it might just be easier to use the mysqlimport utility; you can achieve the exact same results with:
mysqlimport --fields-terminated-by=, --fields-optionally-enclosed-by=\" \
--local --lines-terminated-by=\r\n --user=root --password=gamma123 \
test *DIV.csv
if (file_name.find("DIV.csv")>-1): unless all of your files are actually called DIV.csv should that be if (file_name.find(".csv")>-1): (that would probably be more efficient testing the last four letters of the file name by the way)
I have a txt file with many mysql inserts (1.5 million).
I need to read this file with python and divide this file at each ';' for each query and run the query with python. How can I divide this file at each ';'? And run the query with python?
Until now my code is:
import MySQLdb
db = MySQLdb.connect(host = "localhost",
user="root",
passwd="da66ro",
db="test")
f = open('E:/estudos/projetos/tricae/tests_python.txt')
First open the file:
with open('youfilename.sql', 'r') as f:
    fileAsString = f.read().replace("\n", "")
    sqlStatements = fileAsString.split(";")
Then to run the query:
cursor = db.cursor()
for statement in sqlStatements:
    try:
        cursor.execute(statement)
        db.commit()
    except:
        db.rollback()
But of course you must realize this is a terrible idea. What happens when you have a quoted ";" character in a string you are inserting? You'll have to be a bit more clever than what you posed in the question - in general it's a bad idea to assume anything about the data you are inserting into a database.
Or even worse than a broken query: what about malicious code? SQL injection? Never trust input you haven't sanitized.
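If the file really can contain quoted semicolons, one option is to lean on a SQL-aware splitter instead of a plain split. This sketch assumes the third-party sqlparse package is installed (pip install sqlparse) and reuses the db connection from the question:

import sqlparse

with open('E:/estudos/projetos/tricae/tests_python.txt') as f:
    raw = f.read()

cursor = db.cursor()
# sqlparse.split() understands string literals, so a ';' inside a quoted
# value does not cut a statement in half the way str.split(';') would
for statement in sqlparse.split(raw):
    statement = statement.strip()
    if not statement:
        continue
    cursor.execute(statement)
db.commit()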
OK, so first you need to read the file and split it on ";", which is done with the split() function. Then you can loop over the statements or select which queries to execute (or just execute the whole file without splitting). You can find numerous examples of each of these steps, and I'm sure it will be easy enough to combine them into what you need.
I need to read this file with python and divide this file at
each ';' for each query and run the query with python.
I am new to Python. Here is my workaround:
import io

myfile = open('69_ptc_group_mappingfile_mysql.sql')
data = (myfile.read().decode("utf-8-sig").encode("utf-8")).lower()

query_list = []
if 'delimiter' not in data:
    query_list = (data.strip()).split(";")
else:
    tempv = (data.rstrip()).split('delimiter')
    for i in tempv:                            # was tempV, which is undefined
        if (i.strip()).startswith("//"):
            i = i.rstrip().split("//")
            for a in i:
                if len(a) != 0:
                    query_list.append(a.strip())
        else:
            corr = ((i.rstrip()).split(";"))
            for i in corr:
                if len(i.rstrip()) != 0:
                    query_list.append(i.rstrip())

print query_list
for j in query_list:
    cursor.execute(j)
I wanted a script that iterates through the csv files in a folder and dumps them into a MySQL database. I was able to dump one csv file, but I have trouble passing the file name into the SQL query.
This is the code I use
file_path="C:\csv-files"
files=os.listdir(file_path)
files.sort()
for n in files:
cursor.execute(" LOAD DATA LOCAL INFILE '%s' INTO TABLE new_table FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"' Lines terminated by '\n' IGNORE 1 LINES ",(n))
And I get the following error
raise errorclass, errorvalue
ProgrammingError: (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'file1.csv'' INTO TABLE new_table FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY' at line 1")
If I use the file name directly instead of passing it, it works fine.
If you can see in the error thrown out, there seems to be an error in the SQL Script.
This would be the whole code
import csv
import MySQLdb
import sys
import os

connection = MySQLdb.connect(host='localhost',
                             user='root',
                             passwd='password',
                             db='some_db')
cursor = connection.cursor()

file_path = "C:\csv-files"
files = os.listdir(file_path)
files.sort()

for n in files:
    print n
    cursor.execute(" LOAD DATA LOCAL INFILE %s INTO TABLE new_table FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"' Lines terminated by '\n' IGNORE 1 LINES " %n)

connection.commit()
cursor.close()
First, replace '%s' with %s in the query. MySQLdb handles any quoting automatically.
Here's the code with some corrections and changes:
import MySQLdb
import os
CSV_DIR = "C:\csv-files"
connection = MySQLdb.connect(host='localhost',
user='root',
passwd='password',
db='some_db',
local_infile=1)
cursor = connection.cursor()
try:
for filename in sorted(os.listdir(CSV_DIR)):
cursor.execute("""LOAD DATA LOCAL INFILE %s
INTO TABLE new_table
FIELDS
TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES""",
(os.path.join(CSV_DIR, filename),))
connection.commit()
finally:
cursor.close()
NOTE: I set the local_infile parameter to 1 in MySQLdb.connect and pass the filename in a tuple to execute.
Works for me.