This is my first post here, and I have found so many answers here that I am confident someone can help me. Since I have been working with Python for only about half a year, this may be a very basic beginner's question. Forgive me if so...
In my project I have several different functions; two of them are listed below. mysqlLogbookIndex runs in a thread and is supposed to refresh a list with the names of some tables from a database. The connection to the DB already exists by the time the function is called, and so far it works fine: at the end I can see a tuple containing all the table names.
But now the second function, create_flight, comes in. It is a callback for a Tkinter button and creates some new tables in my database. It uses the same pre-opened connection as mysqlLogbookIndex.
I expected to see the new tables in my tuple on the next cycle of mysqlLogbookIndex, but instead result turned into None.
Do you know why?
Widget function:
def create_flight(self):
# *********************************************************************************************
# * Create new flight table *
# *********************************************************************************************
if self.sql_write.get():
# ********************************************
# * Try to connect message *
# ********************************************
self.printLog(self.lbl_sql_write, self.LANG['tryCreateTable'], 'normal')
logging.info('Creating table for flight tracking...')
# ********************************************
# * Create name tables *
# ********************************************
event = self.oprSett['mysql']['event']
now = str(int(time.time()))
mainName = "tbl_%s_%s_" % (event, now)
trackingTable = mainName + "flighttrack"
logging.debug('Name of tracking table: %s', trackingTable)
unitsTable = mainName + "units"
logging.debug('Name of units table: %s', unitsTable)
headerTable = mainName + "header"
logging.debug('Name of header table: %s', headerTable)
# ********************************************
# * Read SQL parameter *
# ********************************************
logging.debug('Reading CSV file for table structure...')
csvFile = "config/newFlight.csv"
try:
sqlCsv = csv.reader(open(csvFile, 'rb'),
delimiter = ',',
quotechar = '"',
quoting = csv.QUOTE_ALL
)
except:
msg = csvFile
msg += "\n\n"
msg += self.LANG['e13']
tkMessageBox.showerror("Error 13", msg)
self.printLog(self.lbl_sql_write, self.LANG['e13'], 'error')
logging.error('File not found!')
#print "[Error 13] " + self.LANG['e13']
return 0
# Transfer data from CSV file into own list
sqlVars = []
for row in sqlCsv:
if len(row) == 4 and row[0][0] != "#": # valid row that is not a comment
sqlVars.append(row)
# *************************************************
# * Create SQL statement to create tracking table *
# *************************************************
# Head for creating new table
sql = "CREATE TABLE IF NOT EXISTS `%s` (\n" % trackingTable
sql += " `ID` int(11) NOT NULL AUTO_INCREMENT,\n" # Becomes primary key
# Parse SQL variables from CSV file
for row in sqlVars:
if len(row[2]) > 0: # Data type requires length
sql += " `%s` %s(%s) NOT NULL COMMENT '%s',\n" % (row[0], row[1], row[2], row[3])
else: # Data type does not require a length
sql += " `%s` %s NOT NULL COMMENT '%s',\n" % (row[0], row[1], row[3])
# Footer of SQL statement for creating new table
sql += " PRIMARY KEY (`ID`)\n"
sql += ") ENGINE=MyISAM DEFAULT CHARSET=latin1 COLLATE=latin1_german1_ci AUTO_INCREMENT=0;\n"
sql += "\n"
# In debug mode print SQL statement to console
#logging.debug('SQL statement to create tracking table:\n%s', sql)
# **********************************************
# * Create SQL statement to create units table *
# **********************************************
# Head for creating new table
sql += "CREATE TABLE IF NOT EXISTS `%s` (\n" % unitsTable
sql += " `ID` int(11) NOT NULL AUTO_INCREMENT,\n" # Becomes primary key
sql += " `Dataref` varchar(10) COLLATE latin1_german1_ci NOT NULL,\n"
sql += " `Unit` varchar(10) COLLATE latin1_german1_ci NOT NULL,\n"
sql += " PRIMARY KEY (`ID`)\n"
sql += ") ENGINE=MyISAM DEFAULT CHARSET=latin1 COLLATE=latin1_german1_ci AUTO_INCREMENT=1 ;\n"
sql += "\n"
# Parse SQL variables from CSV file
for row in sqlVars:
# Insert units in tbl_units
sql += "INSERT INTO %s VALUES ('', '%s', '%s');\n" % (unitsTable, row[0], row[3])
sql += "\n"
# In debug mode print SQL statement to console
#logging.debug('SQL statement to create units table:\n%s', sql)
# ***********************************************
# * Create SQL statement to create header table *
# ***********************************************
# Head for creating new table
sql += "CREATE TABLE IF NOT EXISTS `%s` (\n" % headerTable
#sql += " `ID` int(11) NOT NULL AUTO_INCREMENT,\n" # Becomes primary key
sql += " `Parameter` char(21) COLLATE latin1_german1_ci NOT NULL,\n"
sql += " `Value` varchar(100) COLLATE latin1_german1_ci NOT NULL,\n"
sql += " PRIMARY KEY (`Parameter`)\n"
sql += ") ENGINE=MyISAM DEFAULT CHARSET=latin1 COLLATE=latin1_german1_ci AUTO_INCREMENT=1 ;\n"
sql += "\n"
# IGC syntax from: http://carrier.csi.cam.ac.uk/forsterlewis/soaring/igc_file_format/igc_format_2008.html
# Adding header parameters, some values are coming later
sql += "INSERT INTO %s VALUES ('AXXX001', '');\n" % (headerTable) # Manufacturer code
sql += "INSERT INTO %s VALUES ('HFFXA', '035');\n" % (headerTable) # Fix accuracy
sql += "INSERT INTO %s VALUES ('HFDTE', '');\n" % (headerTable) # UTC date of flight
sql += "INSERT INTO %s VALUES ('HFPLTPILOT', '');\n" % (headerTable) # Pilots name
sql += "INSERT INTO %s VALUES ('HFGTYGLIDERTYPE', 'KA8B');\n" % (headerTable) # Glider type
sql += "INSERT INTO %s VALUES ('HFGIDGLIDERID', 'D1389');\n" % (headerTable) # Glider callsign
sql += "INSERT INTO %s VALUES ('HFDTM100DATUM', 'WGS-1984');\n" % (headerTable) # GPS datum
sql += "INSERT INTO %s VALUES ('HFGPSGPS', 'X-PLANE 10');\n" % (headerTable) # Manufacturer of GPS module
sql += "INSERT INTO %s VALUES ('HFFTYFRTYPE', 'FLORIANMEISSNER,HCM');\n" % (headerTable) # Logger type
sql += "INSERT INTO %s VALUES ('HFRFWFIRMWAREVERSION', '%s');\n" % (headerTable, self.VERSION) # Firmware version
sql += "INSERT INTO %s VALUES ('HFRHWHARDWAREVERSION', '%s');\n" % (headerTable, self.XPLANEVERSION) # Hardware version
sql += "INSERT INTO %s VALUES ('HFCCLCOMPETITIONCLASS', 'CLUB');\n" % (headerTable) # Competition class
# ********************************************
# * Handover SQL statement to create table *
# * to DB. *
# ********************************************
logging.debug('SQL statement to create all tables:\n%s', sql)
try:
cur = self.conn.cursor()
cur.execute(sql)
cur.close()
except pymysql.Error, e:
tkMessageBox.showerror("Error 9", self.LANG['e9'])
self.printLog(self.lbl_sql_write, self.LANG['e9'], 'error')
#print "[Error 9] " + self.LANG['e9']
logging.error('Could not create tracking table!')
logging.debug(e)
return 0
# ********************************************
# * Print success message *
# ********************************************
logging.info('Tracking table created.')
self.printLog(self.lbl_sql_write, self.LANG['doneCreateTable'], 'success')
text = self.LANG['flightId'] + " " + trackingTable
self.printLog(self.lbl_sql_write, text, 'normal', timestamp=False)
# Make trackingTable public
self.tableName = trackingTable
# Enable tambour register to write in database
self.tambourInMysql = True
Refresh Loop:
def mysqlLogbookIndex(self):
delay = float(self.oprSett['logbook']['refresh_delay']) / 1000
# ********************************************************************
# * Run only if MySQL connection from writeMysql.py is active *
# ********************************************************************
while self.sql_write.get():
# ********************************************
# * Query to get list with tables *
# ********************************************
logging.info('Querying list with tables from database')
dbName = self.oprSett['mysql']['db']
sql = "SELECT TABLE_NAME\n"
sql += "FROM `information_schema`.`TABLES`\n"
sql += "WHERE `TABLE_SCHEMA` LIKE '%s'" % dbName
try:
cur = self.conn.cursor()
cur.execute(sql)
result = cur.fetchall()
cur.close()
except pymysql.Error, e:
logging.warning('Could not take flights from database!')
logging.debug(e)
self.printLog(self.lbl_sql_write, self.LANG['e20'], 'error')
#continue
else:
logging.info('Logbook refreshed.')
print result
Thanks in advance for all helpful posts...
Found the error more or less. It seems to be a MySQL issue.
In my original design I connected to the DB "hcm" and tried to get some table information from the "information_schema" DB. If "hcm" was changed, the query returned None.
Solution:
I established a second independent MySQL connection in my function for the refresh loop and connected to "information_schema" directly.
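A minimal sketch of that approach, assuming pymysql and placeholder credentials (host, user and password below are illustrative, not the real configuration):
import pymysql

# Separate, independent connection used only by the refresh loop;
# it talks to information_schema directly, so changes made over the
# main connection cannot turn the result into None.
info_conn = pymysql.connect(host='localhost',
                            user='user',
                            password='secret',
                            database='information_schema')

def list_tables(db_name):
    cur = info_conn.cursor()
    cur.execute("SELECT TABLE_NAME FROM `TABLES` WHERE `TABLE_SCHEMA` = %s",
                (db_name,))
    result = cur.fetchall()
    cur.close()
    return result

print(list_tables('hcm'))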
Related
I need to set some user meta in my WordPress install through a local Python script. Hence I can't use WP's update_user_meta for it - it has to be done manually.
import mysql.connector as mysql
cnx = mysql.connect(host=HOST, database=DATABASE, user=USER, password=PASSWORD)
cursor = cnx.cursor()
get_meta = ("SELECT * FROM `ff_usermeta` WHERE `user_id`= 1 AND (`meta_key`='nickname' OR `meta_key`='info' OR `meta_key`='bg' OR `meta_key`='avatar' OR `meta_key`='profile_updated')")
cursor.execute(get_meta)
meta = cursor.fetchall()
#some processing of the result
cursor.execute(q, (...))
cnx.commit()
cursor.close()
cnx.close()
Now I need to check whether the result has meta for each of the keys.
If the key already exists for this user, it needs to run an UPDATE for this meta.
If this user has no meta with this key yet, it has to INSERT a new row.
if(there's no 'nickname' in meta_key on either of 5 or less rows):
q = ("INSERT INTO `ff_usermeta` ...")
else:
q = ("UPDATE `ff_usermeta` ...")
...and four more times like that?.. It seems like a good place for a loop, but I don't really like the idea of writing five separate queries, especially since there might be more fields in the future.
I was thinking along the lines of searching the fetchall result for matches in meta_key and, if found, adding the required data to one array, if not, to another, and then just running one UPDATE and one INSERT at the end, assuming neither is empty. If I were to write it in semi-PHP style, it would look roughly like this:
if(in_array("nickname", meta))
for_update .= "`nickname`='"+data[0]+"', "
else:
fields .= "`nickname`, "
vals .= "'"+data[0]+"', "
if(in_array("bg", meta)):
for_update .= "`bg`='"+data[1]+"', "
else:
fields .= "`bg`, "
vals .= "'"+data[1]+"', "
if(for_update):
update = ("UPDATE `ff_usermeta` SET "+for_update+" WHERE 1")
if(fields):
insert = ("INSERT INTO `ff_usermeta`("+fields+") VALUES ("+vals+")")
But I have absolutely no clue how to translate this correctly to Python. I had to google things like "why does the dot not work to add one string to another". Any advice? Or perhaps there is a better way? Thanks!
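For reference, the PHP-style string building above maps onto Python roughly as in the sketch below (a sketch only: meta is assumed to be the list of rows from fetchall() with meta_key in its second column, and data the list of new values):
existing_keys = {row[1] for row in meta}  # meta_key values already stored for this user

for_update = ""
fields = ""
vals = ""

# PHP's in_array(...) becomes a membership test, and .= becomes +=
if "nickname" in existing_keys:
    for_update += "`nickname`='" + data[0] + "', "
else:
    fields += "`nickname`, "
    vals += "'" + data[0] + "', "

if "bg" in existing_keys:
    for_update += "`bg`='" + data[1] + "', "
else:
    fields += "`bg`, "
    vals += "'" + data[1] + "', "

if for_update:
    update = "UPDATE `ff_usermeta` SET " + for_update.rstrip(", ") + " WHERE user_id = 1"
if fields:
    insert = "INSERT INTO `ff_usermeta` (" + fields.rstrip(", ") + ") VALUES (" + vals.rstrip(", ") + ")"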
It is not complete; you cannot update your rows that way, but it should give you a starting point for building your query.
The first SELECT returns exactly one row if the user_id exists.
user_id doesn't seem like the right key for this, but it is enough to show what you can do.
If the query returns no entry, the code inserts data that you get from elsewhere.
The UPDATE and the INSERT are both still wrong in this form, since you have to insert up to five new rows or update at most five rows, but that part is left for you to program.
import mysql.connector as mysql
HOST = "localhost"
DATABASE = ""
USER = "root"
PASSWORD = "mypassword"
cnx = mysql.connect(host=HOST, database=DATABASE, user=USER, password=PASSWORD)
cursor = cnx.cursor(dictionary=True)  # dictionary cursor, so rows can be read as data["nickname"]
user_id = 1
get_meta = ("""SELECT umeta_id, user_id,
MAX(IF(`meta_key`='nickname', meta_value, '')) AS 'nickname',
MAX(IF(`meta_key`='info', meta_value, '')) AS 'info',
MAX(IF(`meta_key`='bg', meta_value, '')) AS 'bg',
MAX(IF(`meta_key`='avatar', meta_value, '')) AS 'avatar',
MAX(IF(`meta_key`='profile_updated', meta_value, '')) AS 'profile_updated'
FROM `ff_usermeta` WHERE `user_id`= %s GROUP BY umeta_id, user_id""")
cursor.execute(get_meta, (user_id,))
data = cursor.fetchone()  # None if there is no row for this user_id
if data:
for_update = ""
#some processing of the result
if not data["nickname"]:
for_update += "`nickname`='"+data["nickname"]+"', "
if not data["bg"]:
for_update += "`bg`='"+data["bg"]+"', "
query = ("UPDATE `ff_usermeta` SET " + for_update.rstrip(", ") + " WHERE user_id = " + str(user_id))
else:
#here are no data to be gathered as there is no user_id present add new user
nickname = ""
bg= ""
info = ""
avatar = ""
profile_updated = ""
fields= ""
vals = ""
fields += "`nickname`,`info`, `bg`,`avatar`,`profile_updated`"
vals += "'"+nickname+"', '"+info+"', '"+bg+"', '"+avatar+"', '"+profile_updated+"'"
query = ("INSERT INTO `ff_usermeta`("+fields+") VALUES ("+vals+")")
cursor.execute(query)
cnx.commit()
cursor.close()
cnx.close()
I tried my best to adapt the suggestion above, but couldn't figure out how to make it work. Eventually I went another way, and it seems to work, so I'll post the full code in case anyone finds it useful.
What it does: it checks the queue in a table of validation requests, then parses a page (separate function) and updates the user profile accordingly.
import mysql.connector as mysql
import time
from datetime import datetime
cnx = mysql.connect(host=HOST, database=DATABASE, user=USER, password=PASSWORD)
while True: #endless loop as a temporary scheduler
cursor = cnx.cursor()
#getting first request in the queue - 0: id, 1: url, 2: parse, 3: status, 4: user, 5: user_page, 6: req_date, 7: action
cursor.execute("SELECT * FROM `ff_qq` WHERE status = 0 LIMIT 1")
row = cursor.fetchone()
if row:
status = 1 #processed
if row[7] == "verify":
get_user = ("SELECT * FROM `ff_users` WHERE ID = %s LIMIT 1")
cursor.execute(get_user, (row[4],))
user = cursor.fetchone() #0 - ID, 5 - user_url, 8 - user_status, 9 - display_name
#separate function that returns data to insert into mysql
udata = verify(row) #0 - nickname, 1 - fb_av, 2 - fb_bg, 3 - fb_info, 4 - owner
ustat = row[1].split("/authors/")
if udata['owned'] or user[8] == ustat[1]:
update_user = ("UPDATE `ff_users` SET user_status = %s, display_name = %s, user_url = %s WHERE ID = %s LIMIT 1")
cursor.execute(update_user, (ustat[1], udata['nickname'], row[1], user[0]))
status = 2 #success
get = ("SELECT `meta_value` FROM `ff_usermeta` WHERE `user_id`= %s AND `meta_key`='ff_capabilities' LIMIT 1")
cursor.execute(get, (row[4],))
rights = cursor.fetchone()
if rights == 'a:1:{s:10:"subscriber";b:1;}':
promote = ("UPDATE `ff_usermeta` SET `meta_value` = 'a:1:{s:6:\"author\";b:1;}' "
"WHERE `user_id` = %s AND `meta_key`='ff_capabilities' LIMIT 1")
cursor.execute(promote, (row[0],))
#list of meta_key values in same order as returned data
ff = ['nickname', 'fb_av', 'fb_bg', 'fb_info']
for x in range(len(ff)): # goes through each key of the above list
if udata[ff[x]]: #yes this actually works, who would've thought?..
#current meta_key added directly into the string
get = ("SELECT `meta_value` FROM `ff_usermeta` WHERE `user_id`= %s AND `meta_key`='" + ff[x] + "' LIMIT 1")
cursor.execute(get, (row[4],))
meta = cursor.fetchone()
if(meta): #update if it exists, otherwise insert new row
qq = ("UPDATE `ff_usermeta` SET `meta_value` = %s "
"WHERE `user_id` = %s AND `meta_key`='" + ff[x] + "' LIMIT 1")
else:
qq = ("INSERT INTO `ff_usermeta`(`meta_value`, `meta_key`, `user_id`) "
"VALUES (%s, '" + ff[x] + "', %s)")
cursor.execute(qq, (udata[ff[x]], row[0])) #same execute works for both
else:
status = 3 #verification failed
#update queue to reflect its status
update = ("UPDATE `ff_qq` SET status = %s WHERE id = %s LIMIT 1")
cursor.execute(update, (status, row[0]))
cnx.commit()
cursor.close()
now = datetime.now()
print(now.strftime("%d.%m.%Y %H:%M:%S"))
time.sleep(180) #sleep until it's time to re-check the queue
cnx.close()
I have two functions (in Python). The first function defines a new variable which I have to insert into an SQL table (first column). The second one does the same thing, but I want to insert its variable (the second one) next to the first variable, i.e. in the second column but in the same row. How can I do this with SQL?
connloc = sqlite3.connect("request.db")
sqlloc = "create table requests (" \
" chat_id INTEGER NOT NULL PRIMARY KEY,"\
" locpar varchar(20)," \
" stoppar varchar(20)," \
" locdes varchar(20) ," \
" stopdes varchar(20) );"
connloc.execute(sqlloc)
def name_loc(chat, message):
for i in result:
if message.text == i:
item = [i]
cloc = connloc.cursor()
cloc.execute("INSERT INTO requests(locpar) VALUES (?);", item)
connloc.commit()
def name_stop(chat, message):
for i in result:
for t in result[i]:
if message.text == t:
item = [t]
cloc = connloc.cursor()
cloc.execute("INSERT INTO requests(stoppar) VALUES (?);", item)
connloc.commit()
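One way to get both values into the same row is sketched below; it assumes chat.id is available in both callbacks and identifies the request (that is an assumption, not something shown in the question). The first callback inserts the row once together with its chat_id, the second one updates that same row instead of inserting a new one:
def name_loc(chat, message):
    for i in result:
        if message.text == i:
            cloc = connloc.cursor()
            # create the row once, keyed by chat_id, with the first value
            cloc.execute("INSERT OR IGNORE INTO requests(chat_id, locpar) VALUES (?, ?);",
                         (chat.id, i))
            connloc.commit()

def name_stop(chat, message):
    for i in result:
        for t in result[i]:
            if message.text == t:
                cloc = connloc.cursor()
                # fill the second column of the existing row instead of adding a new row
                cloc.execute("UPDATE requests SET stoppar = ? WHERE chat_id = ?;",
                             (t, chat.id))
                connloc.commit()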
I would break it up into a two-step process by defining two methods, one for table generation and a second one for populating the new table, like this:
def create_table(ptbl):
""" Assemble DDL (Data Definition Language) Table Create statement and build
sqlite3 db table
Args:
string: new db table name.
Returns:
Status string, '' or 'SUCCESS'.
"""
retval = ''
sqlCmd = ''
try:
conn = sqlite3.connect(sqlite_file)
c = conn.cursor()
if ptbl == 'TBL_EXAMPLE':
sqlCmd = 'CREATE TABLE IF NOT EXISTS ' + ptbl + ' (FIELD1 TEXT, FIELD2 INTEGER, FIELD3 TEXT, ' \
'FIELD4 TEXT, FIELD5 TEXT)'
else:
pass
if sqlCmd != '':
c.execute(sqlCmd)
conn.commit()
conn.close()
retval = 'SUCCESS'
except sqlite3.Error as e:
retval = 'FAIL'
print(e)
return retval
and then populate it as you like with the values (inserting your new row with those two specific values you mentioned).
Now, I'm populating from a CSV file here, but I think it'll give you a really solid start on this task.
def populate_tbl_file_marker_linenums(p_fml_tbl, p_fml_datafile):
""" Read csv and load data into TBL_FILE_MARKER_LINENUMS table ...
Args:
p_fml_tbl (TEXT) target table name
p_fml_datafile (TEXT) name of csv file to load into table
Returns:
retval (TEXT) - Status of method, e.g., 'SUCCESS'
"""
retval = ''
mode = 'r'
try:
conn = sqlite3.connect(sqlite_file)
c = conn.cursor()
csv_dataset = open(p_fml_datafile, mode)
csv_reader = csv.reader(csv_dataset)
c.executemany('INSERT INTO ' + p_fml_tbl + ' (FIELD1, FIELD2, FIELD3, FIELD4, FIELD5) VALUES (?, ?, ?, ?, ?)', csv_reader)
conn.commit()
conn.close()
retval = 'SUCCESS'
except sqlite3.Error as e:
print(e)
return retval
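A possible way to call the two methods together; sqlite_file is assumed to point at your database file, and data.csv (an illustrative name) must match the five-field layout:
if create_table('TBL_EXAMPLE') == 'SUCCESS':
    populate_tbl_file_marker_linenums('TBL_EXAMPLE', 'data.csv')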
I get this error when adding data to the database.
How do I solve this?
Error:
mysql.connector.errors.ProgrammingError: 1054 (42S22): Unknown column 'hn' in 'field list'
I know this column does not exist, but I am not sending data to such a column anyway.
My Python code:
def addToTable(table_name,connection,column_name_list,*data_list):
if(len(column_name_list) != len(data_list)):
raise ValueError("'column_name_list' length has to be equal to 'data_list' length. Please check the parameters")
cursor = connection.cursor() # initializing a cursor
for column_data in range(len(data_list[0])):
addList = list()
for data in range(len(data_list)):
added = str(data_list[data][column_data])
addList.append(added)
cursor.execute("INSERT INTO " + table_name + " VALUES (" + ", ".join(str(k) for k in addList) + ")")
mydb.commit()
print("Added {} in {} ...".format(added, table_name))
Sample query sent from python code:
INSERT INTO deneme VALUES (hn, 1212, asdmailcom)
calling the function:
names = ["hn","ben","alex",]
numbers = [1212,1245,54541]
mails = ["asdmailcom","fghmailcom","xyzmailcom"]
columns = ["de","ne","me"]
mydb = mysql.connector.connect(host="127.0.0.1",
user="root",
passwd="1234",
database="deneme",
auth_plugin='mysql_native_password')
addToTable("deneme",mydb,columns,names,numbers,mails)
My table name is 'deneme', database name is 'deneme'. Columns : 'de' varchar(45), 'ne' varchar(45), 'me' varchar(45)
I solved the problem; the explanation is in the comment lines.
def addToTable(table_name,connection,column_name_list,*data_list):
if(len(column_name_list) != len(data_list)):
raise ValueError("'column_name_list' length has to be equal to 'data_list' length. Please check the parameters")
cursor = connection.cursor() # initializing a cursor
for column_data in range(len(data_list[0])):
addList = list()
for data in range(len(data_list)):
added = str(data_list[data][column_data])
added = "'"+added+"'" # wrap the value in quotes so MySQL treats it as a string literal rather than a column name
# example: without this line
# query ---> INSERT INTO table_name (column1, column2, ...) VALUES (lorem,ipsum,sit)
# example: with this line
# query ---> INSERT INTO table_name (column1, column2, ...) VALUES ('lorem','ipsum','sit')
addList.append(added)
cursor.execute("INSERT INTO " + table_name + " VALUES (" + ", ".join(str(k) for k in addList) + ")")
mydb.commit()
print("Added {} in {} ...".format(added, table_name))
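As a side note, a safer variant is to let the connector fill in placeholders instead of quoting the values by hand; a minimal sketch, reusing the 'deneme' table and the mydb connection from above (the helper name add_row is made up for illustration):
def add_row(connection, table_name, column_name_list, row):
    cursor = connection.cursor()
    columns = ", ".join("`" + c + "`" for c in column_name_list)
    placeholders = ", ".join(["%s"] * len(row))
    # the connector substitutes the %s placeholders and escapes/quotes each value itself
    cursor.execute("INSERT INTO `" + table_name + "` (" + columns + ") VALUES (" + placeholders + ")", row)
    connection.commit()

add_row(mydb, "deneme", ["de", "ne", "me"], ("hn", 1212, "asdmailcom"))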
I have a large CSV file of 30 million rows (1.6 GB), and I am using pymysql to load the data from the CSV into MySQL tables.
I have removed all constraints from the table schema to make the load faster and have also set the timeout values to large values.
def setTimeOutLimit(connection):
try:
with connection.cursor() as cursor:
query = "SET GLOBAL innodb_lock_wait_timeout = 28800"
cursor.execute(query)
query2 = "SET innodb_lock_wait_timeout = 28800"
cursor.execute(query2)
query3 = "SET GLOBAL connect_timeout = 28800"
cursor.execute(query3)
query4 = "SET GLOBAL wait_timeout = 28800"
cursor.execute(query4)
query5 = "SET GLOBAL interactive_timeout = 28800"
cursor.execute(query5)
query6 = "SET GLOBAL max_allowed_packet = 1073741824"
cursor.execute(query6)
except:
conn.close()
sys.exit(" Could not set timeout limit ")
The data gets inserted into the table, but I need to make one of the columns a primary key, so I am creating another table that makes that column the primary index while ignoring duplicate values (tableName_1 is the old table, tableName is the new table).
def createNewTableFromOld(connection, tableName):
try:
pprint( " Creating new table from old table with constraints" )
with connection.cursor() as cursor:
query = (" CREATE TABLE " + tableName +
" Like " + tableName + "_1")
cursor.execute(query)
query2 = (" ALTER TABLE " + tableName +
" ADD PRIMARY KEY(TimeStamp) ")
cursor.execute(query2)
query3 = (" INSERT IGNORE INTO " + tableName +
" SELECT * FROM " + tableName + "_1")
cursor.execute(query3)
query4 = ("DROP TABLE " + tableName + "_1")
cursor.execute(query4)
connection.commit()
except:
conn.close()
sys.exit(" Could not create table with Primary Key ")
While this method is executing, somewhere after 5-6 minutes I get this error:
pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query ([WinError 10054] An existing connection was forcibly closed by the remote host)')
And when I check the services, MySQL80 has crashed and stopped. I have also set max_allowed_packet to 1 GB in the my.ini file, and all timeouts are manually set to 8 hours. What could be the issue?
The original table schema is:
query = ("CREATE TABLE IF NOT EXISTS " + table + " ("
" TimeStamp DECIMAL(15, 3), " +
" Value DECIMAL(30, 11), " +
" Quality INT, " +
" TagName varchar(30) )"
)
I finally solved the issue by setting innodb_buffer_pool_size in the my.ini file to 2 GB; it was previously only 4M.
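For reference, the setting lives under the [mysqld] section of my.ini; the MySQL service has to be restarted for it to take effect:
[mysqld]
innodb_buffer_pool_size = 2G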
I can't retrieve the data from my SQLite database in Python.
connection = sqlite3.connect('db')
connection.cursor().execute('CREATE TABLE IF NOT EXISTS users ( \
id TEXT, \
name TEXT, \
avatar TEXT \
)')
# In cycle:
query = 'INSERT INTO users VALUES ("' + str(friend.id) + '", "' + friend.name + '", "' + friend.avatar +'" )'
print query
connection.cursor().execute(query)
connection.commit()
# After cycle
print connection.cursor().fetchall()
Sample output of query variable:
INSERT INTO users VALUES ("111", "Some Name", "http://avatar/path" )
As a result, fetchall returns an empty tuple. Why?
UPD
Forgotten code:
connection.cursor().execute('SELECT * FROM users')
connection.cursor().fetchall()
→
[]
INSERT does not return data. To get the data back out, you'll have to issue a SELECT statement.
import sqlite3
con = sqlite3.connect("db")
con.execute("create table users(id, name, avatar)")
con.execute("insert into users(id, name, avatar) values (?, ?, ?)", (friend.id, friend.name, friend.avatar))
con.commit()
for row in con.execute("select * from users"):
print row
con.close()
Because the create table string as displayed is syntactically invalid Python, as is the insert into string.
Actually, the answer to your first question is: because you use different cursors.
connection.cursor() creates a new cursor in the connection you created before. fetchall() gives you the results of the query you executed before in that same cursor. I.e. what you did was this:
# After cycle
cursor1 = connection.cursor()
cursor1.execute('SELECT * FROM users')
cursor2 = connection.cursor()
cursor2.execute("")
cursor2.fetchall()
What you should have done was this:
# After cycle
cursor = connection.cursor()
cursor.execute('SELECT * FROM users')
print cursor.fetchall()