Python sqlite3 - OperationalError

This is my code. It reads a bunch of files containing SQL commands in an identical format (i.e. comments prefaced with -, along with some blank lines, hence the if condition in my loop):
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import csv, sqlite3, os

conn = sqlite3.connect('db_all.db')
c = conn.cursor()
files = os.listdir('C:\\Users\\ghb\\Desktop\\database_schema')
for file in files:
    string = ''
    with open(file, 'rb') as read:
        for line in read.readlines():
            if line[0] != '-' and len(line) != 0:
                string = string + line.rstrip()  # error also occurs if I skip .rstrip
    print string  # for debugging purposes
    c.executescript(string)
    string = ''
conn.close()
Error:
Traceback (most recent call last):
File "C:/Users/ghb/Desktop/database_schema/database_schema.py", line 16, in <module>
c.executescript(string)
sqlite3.OperationalError: near "SET": syntax error
For fear of clutter, here is the output of the string variable (without the INSERT INTO data, that is confidential):
SET SQL_MODE="NO_AUTO_VALUE_ON_ZERO";
CREATE TABLE IF NOT EXISTS `career` (
`carKey` int(11) NOT NULL AUTO_INCREMENT,
`persID` bigint(10) unsigned zerofill NOT NULL,
`persKey` int(6) unsigned NOT NULL,
`wcKey` int(2) unsigned NOT NULL,
`wtKey` int(2) unsigned DEFAULT NULL,
`pwtKey` int(2) unsigned DEFAULT NULL,
`dptId` bigint(10) unsigned NOT NULL,
`dptNr` int(4) unsigned NOT NULL,
`dptalias` varchar(10) COLLATE utf8_icelandic_ci NOT NULL,
`class` enum('A','B') COLLATE utf8_icelandic_ci NOT NULL,
`getfilm` enum('yes','no') COLLATE utf8_icelandic_ci NOT NULL DEFAULT 'yes',
`finished` enum('true','false') COLLATE utf8_icelandic_ci NOT NULL DEFAULT 'false',
`startDate` date NOT NULL,
`endDate` date DEFAULT NULL,
`regDate` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`user` tinyint(4) NOT NULL,
`status` set('LÉST','BRFL','BRFD','BRNN') COLLATE utf8_icelandic_ci DEFAULT NULL,
`descr` text COLLATE utf8_icelandic_ci,
PRIMARY KEY (`carKey`),
KEY `pwtKey` (`pwtKey`),
KEY `wtKey` (`wtKey`),
KEY `dptId` (`dptId`),
KEY `user` (`user`),
KEY `persID` (`persID`),
KEY `persKey` (`persKey`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_icelandic_ci AUTO_INCREMENT=2686 ;
Input sample:
INSERT INTO `career` (`carKey`, `persID`, `persKey`, `wcKey`, `wtKey`, `pwtKey`, `dptId`, `dptNr`, `dptalias`, `class`, `getfilm`, `finished`, `startDate`, `endDate`, `regDate`, `user`,
(5, 34536, 346, 22, 44, 34, 3454356, 33, 'asdasd', 'ASDASD', 'ASDSD', 'true', '1991-02-04', '2010-05-02', '2009-05-02 00:01:02', 1, NULL, 'HH:
'),
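The first statement of that dump is enough to reproduce the error in isolation: SET SQL_MODE is MySQL-specific syntax that SQLite rejects outright, so executescript fails as soon as it parses it. A minimal sketch:

```python
import sqlite3

# The dump was generated by MySQL; "SET SQL_MODE" is not valid SQLite SQL,
# so executescript raises OperationalError on the very first statement.
conn = sqlite3.connect(':memory:')
try:
    conn.executescript('SET SQL_MODE="NO_AUTO_VALUE_ON_ZERO";')
except sqlite3.OperationalError as exc:
    print(exc)  # near "SET": syntax error
conn.close()
```

The rest of the dump (ENGINE=MyISAM, enum, COLLATE clauses, AUTO_INCREMENT) is equally MySQL-specific, so the script would need to be translated, not just concatenated, before SQLite can run it.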

Related

SQLite3: "sqlite3.OperationalError: no such column: dateandtime" when making Primary Key?

I'm currently trying to create a database using sqlite3 with Python, however I'm having trouble setting up a primary key. I'm aware of what one is, and how it uniquely identifies the table, but I want to change it from the standard "rowid" it comes with to the current date. When I try to add things into the table, however, it comes up with this:
File "Economic_Analyser.py", line 258, in <module>
startup()
File "Economic_Analyser.py", line 243, in startup
c.execute("INSERT INTO economicdata VALUES (dateandtime, up_GDPgrowthRate, up_GDP, up_GNP, up_GDPperCapita, up_GDPagriculture, up_GDPconstruction, up_GDPmanufacturing, up_GDPmining, up_GDPpublicadmin, up_GDPservices, up_GDPtransport, up_GDPtourism, up_UnemploymentRate, up_EmploymentRate, up_InflationRate, up_CPI, up_InterestRate, up_BalanceOfTrade, up_CurrentAccount, up_Imports, up_Exports, up_FDI, up_GovernmentSpending, up_GovernmentDebt, up_BusinessConfidence, up_Bankruptcies, up_CompetitiveRank, up_CorruptionRank, up_ConsumerConfidence, up_CorporateTaxRate, up_IncomeTaxRate)")
sqlite3.OperationalError: no such column: dateandtime
As you can see from my actual code below, I've declared that the date is the Primary Key but cannot add the data to it. I've changed the code multiple times based on what I've seen other people do but it hasn't worked. Just to clarify this isn't all of my code - just the parts that I think matter. Any help would be appreciated!!
try:
    data_attempt = open("Economic_Analyser.db")
except:
    print("- Database not found. Creating 'Economic_Analyser.db' .")
    databasevariables()
    c.execute("""CREATE TABLE economicdata (
        dateandtime text NOT NULL PRIMARY KEY,
        GDPgrowthRate decimal NOT NULL,
        GDP decimal NOT NULL,
        GNP decimal NOT NULL,
        GDPperCapita decimal NOT NULL,
        GDPagriculture decimal NOT NULL,
        GDPconstruction decimal NOT NULL,
        GDPmanufacturing decimal NOT NULL,
        GDPmining decimal NOT NULL,
        GDPpublicadmin decimal NOT NULL,
        GDPservices decimal NOT NULL,
        GDPtransport decimal NOT NULL,
        GDPtourism decimal NOT NULL,
        UnemploymentRate decimal NOT NULL,
        EmploymentRate decimal NOT NULL,
        InflationRate decimal NOT NULL,
        CPI decimal NOT NULL,
        InterestRate decimal NOT NULL,
        BalanceOfTrade decimal NOT NULL,
        CurrentAccount decimal NOT NULL,
        Imports decimal NOT NULL,
        Exports decimal NOT NULL,
        FDI decimal NOT NULL,
        GovernmentSpending decimal NOT NULL,
        GovernmentDebt decimal NOT NULL,
        BusinessConfidence decimal NOT NULL,
        Bankruptcies decimal NOT NULL,
        CompetitiveRank decimal NOT NULL,
        CorruptionRank decimal NOT NULL,
        ConsumerConfidence decimal NOT NULL,
        CorporateTaxRate decimal NOT NULL,
        IncomeTaxRate decimal NOT NULL
    )""")
    conn.commit()
    c.execute("""CREATE TABLE users (
        username text,
        password text
    )""")
    conn.commit()
    conn.close()

if internet_access == True:
    databasevariables()
    c.execute("INSERT INTO economicdata VALUES (dateandtime, up_GDPgrowthRate, up_GDP, up_GNP, up_GDPperCapita, up_GDPagriculture, up_GDPconstruction, up_GDPmanufacturing, up_GDPmining, up_GDPpublicadmin, up_GDPservices, up_GDPtransport, up_GDPtourism, up_UnemploymentRate, up_EmploymentRate, up_InflationRate, up_CPI, up_InterestRate, up_BalanceOfTrade, up_CurrentAccount, up_Imports, up_Exports, up_FDI, up_GovernmentSpending, up_GovernmentDebt, up_BusinessConfidence, up_Bankruptcies, up_CompetitiveRank, up_CorruptionRank, up_ConsumerConfidence, up_CorporateTaxRate, up_IncomeTaxRate)")
    conn.commit()
    conn.close()
    print("- Most recent data has been saved.")
else:
    print("- Failed.")

def databasevariables():
    global conn
    conn = sqlite3.connect("Economic_Analyser.db")
    global c
    c = conn.cursor()
You're putting the column names in the wrong place in your INSERT statement. They should go immediately after the table name, enclosed in parentheses, and then you need a placeholder for each value to be inserted.
Instead, you can do something like this:
columns = ['dateandtime',
           'up_GDPgrowthRate',
           'up_GDP',
           'up_GNP',
           'up_GDPperCapita',
           'up_GDPagriculture',
           'up_GDPconstruction',
           'up_GDPmanufacturing',
           'up_GDPmining',
           'up_GDPpublicadmin',
           'up_GDPservices',
           'up_GDPtransport',
           'up_GDPtourism',
           'up_UnemploymentRate',
           'up_EmploymentRate',
           'up_InflationRate',
           'up_CPI',
           'up_InterestRate',
           'up_BalanceOfTrade',
           'up_CurrentAccount',
           'up_Imports',
           'up_Exports',
           'up_FDI',
           'up_GovernmentSpending',
           'up_GovernmentDebt',
           'up_BusinessConfidence',
           'up_Bankruptcies',
           'up_CompetitiveRank',
           'up_CorruptionRank',
           'up_ConsumerConfidence',
           'up_CorporateTaxRate',
           'up_IncomeTaxRate']
placeholders = ",".join('?' * len(columns))
insert_stmt = f"""INSERT INTO economicdata ({','.join(columns)})
                  VALUES ({placeholders});"""
c.execute(insert_stmt, values)  # values: a sequence with one entry per column
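As a runnable sketch of the same placeholder pattern, against an in-memory SQLite table with a trimmed-down column list and made-up values for illustration:

```python
import sqlite3

# Trimmed-down illustration of the column-list + placeholder pattern;
# the column subset and the values below are invented for the demo.
columns = ['dateandtime', 'up_GDPgrowthRate', 'up_GDP']
values = ['2020-01-01 00:00', 1.5, 21000.0]

conn = sqlite3.connect(':memory:')
c = conn.cursor()
c.execute("CREATE TABLE economicdata (dateandtime text PRIMARY KEY, "
          "up_GDPgrowthRate decimal, up_GDP decimal)")

placeholders = ",".join("?" * len(columns))
insert_stmt = (f"INSERT INTO economicdata ({','.join(columns)}) "
               f"VALUES ({placeholders})")
c.execute(insert_stmt, values)  # data passed separately, never interpolated
conn.commit()

print(c.execute("SELECT * FROM economicdata").fetchone())
# → ('2020-01-01 00:00', 1.5, 21000.0)
```

Passing the data as the second argument to execute (rather than formatting it into the SQL string) is also what protects against SQL injection.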

Parameterized query with pyodbc and mysql8 returns 0 for columns with int data types

Python: 2.7.12
pyodbc: 4.0.24
OS: Ubuntu 16.4
DB: MySQL 8
driver: MySQL 8
Expected behaviour: resultset should have numbers in columns with datatype int
Actual Behaviour: All of the columns with int data type have 0's (If parameterised query is used)
Here's the queries -
1.
cursor.execute("SELECT * FROM TABLE where id =7")
Result set:
[(7, 1, None, 1, u'An', u'Zed', None, u'Ms', datetime.datetime(2016, 12, 20, 0, 0), u'F', u'Not To Be Disclosed', None, None, u'SPRING', None, u'4000', datetime.datetime(2009, 5, 20, 18, 55), datetime.datetime(2019, 1, 4, 14, 25, 58, 763000), 0, None, None, None, bytearray(b'\x00\x00\x00\x00\x01(n\xba'))]
2.
cursor.execute("SELECT * FROM patients where patient_id=?", [7])
or
cursor.execute("SELECT * FROM patients where patient_id=?", ['7'])
or
cursor.execute("SELECT * FROM patients where patient_id IN ", [7])
Result set:
[(0, 0, None, 0, u'An', u'Zed', None, u'Ms', datetime.datetime(2016, 12, 20, 0, 0), u'F', u'Not To Be Disclosed', None, None, u'SPRING', None, u'4000', datetime.datetime(2009, 5, 20, 18, 55), datetime.datetime(2019, 1, 4, 14, 25, 58, 763000), 0, None, None, None, bytearray(b'\x00\x00\x00\x00\x01(n\xba'))]
Rest of the result set is okay except for the columns with int data type that all have 0's if paramterized query is used.
It seems like it should have worked without issues. Can I get some help here?
Edit : Here's the schema of the table:
CREATE TABLE `patient
`lastname` varchar(30) DEFAULT NULL,
`known_as` varchar(30) DEFAULT NULL,
`title` varchar(50) DEFAULT NULL,
`dob` datetime DEFAULT NULL,
`sex` char(1) DEFAULT NULL,
`address1` varchar(30) DEFAULT NULL,
`address2` varchar(30) DEFAULT NULL,
`address3` varchar(30) DEFAULT NULL,
`city` varchar(30) DEFAULT NULL,
`state` varchar(16) DEFAULT NULL,
`postcode` char(4) DEFAULT NULL,
`datecreated` datetime NOT NULL,
`dateupdated` datetime(6) DEFAULT NULL,
`isrep` tinyint(1) DEFAULT NULL,
`photo` longblob,
`foreign_images_imported` tinyint(1) DEFAULT NULL,
`ismerged` tinyint(1) DEFAULT NULL,
`rowversion` varbinary(8) DEFAULT NULL,
PRIMARY KEY (`patient_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;
You have encountered this bug in MySQL Connector/ODBC.
EDIT: The bug has now been fixed.
The following (Python 3) test code verifies that MySQL Connector/ODBC returns zero (incorrect), while mysqlclient returns the correct value:
import MySQLdb  # from mysqlclient
import pyodbc

host = 'localhost'
user = 'root'
passwd = 'whatever'
db = 'mydb'
port = 3307
charset = 'utf8mb4'

use_odbc = False  # or True

print(f'{"" if use_odbc else "not "}using ODBC ...')
if use_odbc:
    connection_string = (
        f'DRIVER=MySQL ODBC 8.0 ANSI Driver;'
        f'SERVER={host};UID={user};PWD={passwd};DATABASE={db};PORT={port};'
        f'charset={charset};'
    )
    cnxn = pyodbc.connect(connection_string)
    print(f'{cnxn.getinfo(pyodbc.SQL_DRIVER_NAME)}, version {cnxn.getinfo(pyodbc.SQL_DRIVER_VER)}')
else:
    cnxn = MySQLdb.connect(
        host=host, user=user, passwd=passwd, db=db, port=port, charset=charset
    )

int_value = 123
crsr = cnxn.cursor()
crsr.execute("CREATE TEMPORARY TABLE foo (id varchar(10) PRIMARY KEY, intcol int, othercol longblob)")
crsr.execute(f"INSERT INTO foo (id, intcol) VALUES ('Alfa', {int_value})")
sql = f"SELECT intcol, othercol FROM foo WHERE id = {'?' if use_odbc else '%s'}"
crsr.execute(sql, ('Alfa',))
result = crsr.fetchone()[0]
print(f'{"pass" if result == int_value else "FAIL"} -- expected: {repr(int_value)} ; actual: {repr(result)}')
Console output with use_odbc = True:
using ODBC ...
myodbc8a.dll, version 08.00.0018
FAIL -- expected: 123 ; actual: 0
Console output with use_odbc = False:
not using ODBC ...
pass -- expected: 123 ; actual: 123
FWIW, I just posted a question where I was seeing this in version 3.1.14 of the ODBC connector but NOT in version 3.1.10.

load CSV into MySQL table with ODO python package - date error 1292

I'm trying to import a simple CSV file that I downloaded from Quandl into a MySQL table with the odo python package
t = odo('C:\ProgramData\MySQL\MySQL Server 5.6\Uploads\WIKI_20160725.partial.csv',
        'mysql+pymysql://' + self._sql._user + ':'
        + self._sql._password + '@localhost/testDB::QUANDL_DATA_WIKI')
The first row looks like this in the CSV:
A 7/25/2016 46.49 46.52 45.92 46.14 1719772 0 1 46.49 46.52 45.92 46.14 1719772
The MySQL table is defined as follows:
Ticker varchar(255) NOT NULL,
Date date NOT NULL,
Open numeric(15,2) NULL,
High numeric(15,2) NULL,
Low numeric(15,2) NULL,
Close numeric(15,2) NULL,
Volume bigint NULL,
ExDividend numeric(15,2),
SplitRatio int NULL,
OpenAdj numeric(15,2) NULL,
HighAdj numeric(15,2) NULL,
LowAdj numeric(15,2) NULL,
CloseAdj numeric(15,2) NULL,
VolumeAdj bigint NULL,
PRIMARY KEY(Ticker,Date)
It throws an exception 1292 with the following info:
sqlalchemy.exc.InternalError: (pymysql.err.InternalError) (1292, "Incorrect date value: '7/25/2016' for column 'Date' at row 1") [SQL: 'LOAD DATA INFILE %(path)s\n INTO TABLE QUANDL_DATA_WIKI\n CHARACTER SET %(encoding)s\n FIELDS\n TERMINATED BY %(delimiter)s\n ENCLOSED BY %(quotechar)s\n ESCAPED BY %(escapechar)s\n LINES TERMINATED BY %(lineterminator)s\n IGNORE %(skiprows)s LINES\n '] [parameters: {'path': 'C:\ProgramData\MySQL\MySQL Server 5.6\Uploads\WIKI_20160725.partial.csv', 'quotechar': '"', 'skiprows': 0, 'lineterminator': '\r\n', 'escapechar': '\', 'delimiter': ',', 'encoding': 'utf8'}]
Does anyone have an idea what is wrong with the date in the first row? It doesn't seem to match it to the MySql database
MySQL has problems with this date conversion: a DATE column expects ISO format (YYYY-MM-DD), not '7/25/2016'. I noticed that when I defined the date field as a varchar
Date varchar(255) NOT NULL
then the CSV file was read properly.
In my SQL, the conversion of the string to a date then looks like this:
STR_TO_DATE(Date, "%m/%d/%Y")
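Alternatively, the dates can be normalized before the file ever reaches MySQL. A minimal sketch using only the stdlib, with an in-memory stream standing in for the real Quandl file and the date assumed to be the second CSV column, as in the sample row:

```python
import csv
import io
from datetime import datetime

# Rewrite the CSV so dates are already in MySQL's native YYYY-MM-DD
# format before handing the file to LOAD DATA / odo. The one-line
# stream below stands in for the real WIKI_20160725.partial.csv.
src = io.StringIO("A,7/25/2016,46.49\n")
dst = io.StringIO()

writer = csv.writer(dst)
for row in csv.reader(src):
    # Column index 1 holds the M/D/YYYY date in the sample data.
    row[1] = datetime.strptime(row[1], "%m/%d/%Y").strftime("%Y-%m-%d")
    writer.writerow(row)

print(dst.getvalue().strip())  # A,2016-07-25,46.49
```

With real files, `src` and `dst` would be `open(...)` handles; this avoids the varchar detour and the STR_TO_DATE conversion entirely.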

UTF-8 not working when connecting to MySQL database in Python

I am struggling to make Python play nice with my UTF-8 encoded MySQL database containing, for example, the Norwegian characters, æøå. I have searched around for hours, but have not been able to find anything that works as expected. Here is an example table extracted from the database:
mysql> select * from my_table;
+----+-----------------+
| id | shop_group_name |
+----+-----------------+
| 1 | Frukt og grønt |
| 2 | Kjøtt og fisk |
| 3 | Meieriprodukter |
| 4 | Frysevarer |
| 5 | Bakevarer |
| 6 | Tørrvarer |
| 7 | Krydder |
| 8 | Hermetikk |
| 9 | Basisvarer |
| 10 | Diverse |
+----+-----------------+
10 rows in set (0.00 sec)
So the data is definitely UTF-8 encoded. When running the below Python code, however, it does not give the output in UTF-8. What could be wrong with it? It has nothing to do with the zipping; the tuples returned by cursor.execute(query) have already messed up the encoding.
#!/usr/bin/env python
import MySQLdb

db = MySQLdb.connect(host="localhost",
                     user="test",
                     passwd="passwd",
                     db="mydb",
                     charset='utf8',
                     use_unicode=True)

# Set desired conversion of data.
db.converter[MySQLdb.FIELD_TYPE.NEWDECIMAL] = float
db.converter[MySQLdb.FIELD_TYPE.DATETIME] = str
db.converter[MySQLdb.FIELD_TYPE.LONGLONG] = int
db.converter[MySQLdb.FIELD_TYPE.LONG] = int

cursor = db.cursor()
query = 'SELECT * FROM my_table'
allResults = {}
cursor.execute(query)
columns = [desc[0] for desc in cursor.description]
rows = cursor.fetchall()
results = []
for row in rows:
    row = dict(zip(columns, row))
    results.append(row)
allResults['my_table'] = results
cursor.close()
db.close()
The allResults dictionary now contains:
{
'my_table': [
{
'id': 1,
'shop_group_name': 'Fruktoggr\xf8nt'
},
{
'id': 2,
'shop_group_name': 'Kj\xf8ttogfisk'
},
{
'id': 3,
'shop_group_name': 'Meieriprodukter'
},
{
'id': 4,
'shop_group_name': 'Frysevarer'
},
{
'id': 5,
'shop_group_name': 'Bakevarer'
},
{
'id': 6,
'shop_group_name': 'T\xf8rrvarer'
},
{
'id': 7,
'shop_group_name': 'Krydder'
},
{
'id': 8,
'shop_group_name': 'Hermetikk'
},
{
'id': 9,
'shop_group_name': 'Basisvarer'
},
{
'id': 10,
'shop_group_name': 'Diverse'
}
]
}
I cannot really see what I am doing wrong. I am running the tests in Python 2.7.6 in Ubuntu.
Update (changing tables to UTF-8)
I tried changing the tables to UTF-8 by dumping the database and changing the character set and collation in the dump file and then inserting it into a new database. For example, this part of the dump file corresponds to the example above. This is how it was:
DROP TABLE IF EXISTS `my_table`;
/*!40101 SET @saved_cs_client = @@character_set_client */;
/*!40101 SET character_set_client = utf8 */;
CREATE TABLE `my_table` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`shop_group_name` varchar(100) DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=11 DEFAULT CHARSET=latin1;
/*!40101 SET character_set_client = @saved_cs_client */;
And this is what I changed this part to:
DROP TABLE IF EXISTS `my_table`;
/*!40101 SET @saved_cs_client = @@character_set_client */;
/*!40101 SET character_set_client = utf8 */;
CREATE TABLE `my_table` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`shop_group_name` varchar(100) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=11 DEFAULT CHARSET=utf8;
/*!40101 SET character_set_client = @saved_cs_client */;
However, this is still not working. The output is still the same as above. Running SELECT CHARACTER_SET_NAME FROM information_schema.columns WHERE TABLE_NAME = 'my_table'; now produces utf8.
When you create your table, create your columns in UTF-8:
CREATE TABLE my_table (
...
shop_group_name VARCHAR(100) CHARACTER SET utf8 COLLATE utf8_general_ci
);
If you don't specify the character set and collation, then MySQL uses its defaults for both. Alternatively, you can set the defaults in my.cnf.
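As a side note, the \xf8 bytes in the question's output are consistent with Latin-1 encoded data, which matches the original DEFAULT CHARSET=latin1 table definition: changing the table declaration in a dump does not re-encode rows that were already stored as latin1. A quick stdlib check (sample string taken from the table above):

```python
# '\xf8' is the Latin-1 code point for 'ø'; strings like 'Kj\xf8tt...'
# in the question are therefore Latin-1 data, not UTF-8.
raw = b'Kj\xf8tt og fisk'      # bytes as stored under latin1
print(raw.decode('latin-1'))   # Kjøtt og fisk

# The same character stored as UTF-8 would be a two-byte sequence:
print('ø'.encode('utf-8'))     # b'\xc3\xb8'
```

So besides declaring the column as utf8, the existing rows themselves need converting (e.g. re-importing the data, or ALTER TABLE ... CONVERT TO CHARACTER SET utf8) before the connector will see real UTF-8.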

Insert dictionary keys values into a MySQL database

I have executed this code to insert a dictionary into my table in database,
d = {'err': '0', 'tst': '0', 'Type o': 'FTP', 'recip': 'ADMIN', 'id': '101', 'origin': 'REPORT', 'Type recip': 'SMTP', 'date': '2010-01-10 18:47:52'}
db = MySQLdb.connect("localhost","admin","password","database")
cursor = db.cursor()
cursor.execute("""INSERT INTO mytable(ID, ERR, TST, DATE, ORIGIN, TYPE_O, RECIP, TYPE_RECIP) VALUES (%(id)s, %(err)s, %(tst)s, %(date)s, %(origin)s, %(Type o)s, %(recip)s, %(Type recip)s)""", d)
db.commit()
db.close()
Create statement of my table:
CREATE TABLE mytable (
`ID` tinyint unsigned NOT NULL,
`ERR` tinyint NOT NULL,
`TST` tinyint unsigned NOT NULL,
`DATE` datetime NOT NULL,
`ORIGIN` varchar(30) NOT NULL,
`TYPE_O` varchar(10) NOT NULL,
`RECIP` varchar(30) NOT NULL,
`TYPE_RECIP` varchar(10) NOT NULL,
PRIMARY KEY (`ID`,`DATE`)
) ENGINE = InnoDB;
But I get an error; it says:
1064, "you have an error in your SQL syntax; check the manual that
corresponds to your MySQL server version... )
Be aware of SQL injections and use the second argument to execute for inserting your query parameters:
cursor.execute("""
INSERT INTO
table
(name, age, origin, date)
VALUES
(%(name)s, %(age)s, %(origin)s, %(date)s)
""", d)
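The same dict-driven pattern can be sketched with the stdlib sqlite3 module. Note that sqlite3 uses :name placeholders rather than MySQLdb's %(name)s style, and the keys containing spaces ('Type o', 'Type recip') are renamed here, since they cannot appear inside a :name placeholder:

```python
import sqlite3

# Dict-driven parameterized INSERT, sqlite3 flavor of the pattern above.
# Keys with spaces from the original dict are renamed for the demo.
d = {'err': '0', 'tst': '0', 'type_o': 'FTP', 'recip': 'ADMIN',
     'id': '101', 'origin': 'REPORT', 'type_recip': 'SMTP',
     'date': '2010-01-10 18:47:52'}

db = sqlite3.connect(':memory:')
cur = db.cursor()
cur.execute("CREATE TABLE mytable (ID, ERR, TST, DATE, ORIGIN, "
            "TYPE_O, RECIP, TYPE_RECIP)")
cur.execute("""INSERT INTO mytable
               (ID, ERR, TST, DATE, ORIGIN, TYPE_O, RECIP, TYPE_RECIP)
               VALUES (:id, :err, :tst, :date, :origin,
                       :type_o, :recip, :type_recip)""", d)
db.commit()

print(cur.execute("SELECT ID, TYPE_O FROM mytable").fetchone())
# → ('101', 'FTP')
```

With MySQLdb, the equivalent fix is the same idea: keep the placeholders in VALUES and pass the dict as the second argument to execute.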
