I am trying to populate an Azure SQL database using pyodbc.
I have around 14,000 items to push into the database every month, and running and committing each single upsert is really slow.
I am therefore trying to batch them up and push them all at once. Sadly, when doing this I am not getting anything in the database. Is there any way to make this work?
i = 0
queryUpdate = ""
for values in valuesList:
    if i == 100:
        i = 0
        queryUpdate = queryUpdate.replace('\'NULL\'', "NULL") #set null to real sql null
        queryUpdate = queryUpdate.replace("True", "1") #replace true with 1 to match the bit type
        queryUpdate = queryUpdate.replace("False", "0") #replace false with 0 to match the bit type
        try:
            cursor.execute(queryUpdate)
        except pyodbc.Error as ex:
            print(f"[x] Error when running the query:\n[x] {queryUpdate}\n[x] Exception: {ex}")
        queryUpdate = ""
    queryUpdate += f"""\n
    if exists (select * from {tableName} with (updlock,serializable) where {colNames[keyIndex]} = {values[keyIndex]})
    begin
        update {tableName} set """
    for i in range(len(colNames)):
        queryUpdate += f"{colNames[i]} = '{values[i]}'"
        if i != len(colNames) - 1:
            queryUpdate += ","
    queryUpdate += f"""
        where {colNames[keyIndex]} = {values[keyIndex]}
    end
    else
    begin
        insert into {tableName} ({','.join(colNames)})
        values {tuple(values)}
    end;
    """
    i += 1

try:
    conn.commit()
except pyodbc.Error as ex:
    print(f"[x] Error when committing to the database.\n[x] Exception: {ex}")
else:
    print("[+] Commit confirmed !")

cursor.close()
conn.close()
print("[+] Connection closed.")
Related
I am trying to get the number of rows returned by a query, using Python with MySQL. I've been using rowcount, but the returned value is always 0.
def getLastData(md5TwitterDate):
    conectar = connection()
    try:
        cursor = conectar.cursor()
        query = "SELECT fecha,numeroreporte,internoid FROM jsywe_ncnsismos_sismosigp WHERE internoid = '%s' and published=1"
        cursor.execute(query, md5TwitterDate)
        lastEvent = cursor.fetchall()
        rowsa = cursor.rowcount
    except Error as ex:
        print("Error to get data: ", ex)
    finally:
        if conectar.is_connected():
            conectar.close()
    #return False if rowsa > 0 else True
    return rowsa
I also tried assigning the return value of cursor.execute to a variable, but in that case I always get None:
def getLastData(md5TwitterDate):
    conectar = connection()
    try:
        cursor = conectar.cursor()
        query = "SELECT fecha,numeroreporte,internoid FROM jsywe_ncnsismos_sismosigp WHERE internoid = '%s' and published=1"
        rowsa = cursor.execute(query, md5TwitterDate)
        lastEvent = cursor.fetchall()
    except Error as ex:
        print("Error to get data: ", ex)
    finally:
        if conectar.is_connected():
            conectar.close()
    #return False if filas > 0 else True
    return rowsa
I tested the query directly on the database and it works; it returns 1 row.
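For reference, a minimal sketch of the parameter handling that usually causes this (assuming mysql-connector-python): the %s placeholder must not be wrapped in quotes, and the parameters must be passed as a tuple. Note also that cursor.execute() itself returns None in this driver, which is why the second variant always gives None; the count has to come from cursor.rowcount after fetching, or from len() of the fetched rows.

# Sketch only: unquoted placeholder and a one-element parameter tuple.
query = ("SELECT fecha, numeroreporte, internoid "
         "FROM jsywe_ncnsismos_sismosigp "
         "WHERE internoid = %s AND published = 1")
cursor.execute(query, (md5TwitterDate,))   # note the trailing comma: a 1-tuple
lastEvent = cursor.fetchall()
rowsa = cursor.rowcount                    # valid once the rows have been fetched
# or simply: rowsa = len(lastEvent)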
I need to set some user meta in my WordPress site through a local Python script, so I can't use WP's update_user_meta for it; it has to be done manually.
import mysql.connector as mysql
cnx = mysql.connect(host=HOST, database=DATABASE, user=USER, password=PASSWORD)
cursor = cnx.cursor()
get_meta = ("SELECT * FROM `ff_usermeta` WHERE `user_id`= 1 AND (`meta_key`='nickname' OR `meta_key`='info' OR `meta_key`='bg' OR `meta_key`='avatar' OR `meta_key`='profile_updated')")
cursor.execute(get_meta)
meta = cursor.fetchall()
#some processing of the result
cursor.execute(q, (...))
cnx.commit()
cursor.close()
cnx.close()
Now I need to check whether the result contains meta for each of the keys.
If a key already exists for this user, it needs to run an UPDATE for that meta.
If the user still has no meta for that key, it has to INSERT a new row.
if(there's no 'nickname' in meta_key on either of 5 or less rows):
    q = ("INSERT INTO `ff_usermeta` ...")
else:
    q = ("UPDATE `ff_usermeta` ...")
...and 4 more times like that?.. It seems like a good place for a loop, but I don't really like the idea of issuing 5 separate queries, especially since there might be more fields in the future.
I was thinking along the lines of searching the fetchall result for matches in meta_key and, if a key is found, adding the required data to one array, and if not, to another, then running just one UPDATE and one INSERT at the end, assuming both are non-empty. If I were to write it in semi-PHP style, it would look roughly like this:
if(in_array("nickname", meta))
for_update .= "`nickname`='"+data[0]+"', "
else:
fields .= "`nickname`, "
vals .= "'"+data[0]+"', "
if(in_array("bg", meta)):
for_update .= "`bg`='"+data[1]+"', "
else:
fields .= "`bg`, "
vals .= "'"+data[1]+"', "
if(for_update):
update = ("UPDATE `ff_usermeta` SET "+for_update+" WHERE 1")
if(fields):
insert = ("INSERT INTO `ff_usermeta`("+fields+") VALUES ("+vals+")")
But I have absolutely no clue how to translate this correctly to Python; I ended up googling things like "why doesn't the dot operator work to add one string to another". Any advice? Or perhaps there is a better way? Thanks!
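For reference, the pseudocode above translates to Python fairly directly; string concatenation uses + instead of PHP's dot. This is only a sketch: it assumes meta is the fetchall() result from the SELECT earlier in the question with meta_key at column index 2, that data holds the new values in the same order as in the pseudocode, and that in real code the values would be passed as query parameters rather than concatenated into the SQL.

# Sketch only: literal Python translation of the PHP-style pseudocode.
keys = {row[2] for row in meta}   # meta_key values that already exist (assumed index 2)
for_update, fields, vals = "", "", ""

if "nickname" in keys:
    for_update += "`nickname`='" + data[0] + "', "
else:
    fields += "`nickname`, "
    vals += "'" + data[0] + "', "

if "bg" in keys:
    for_update += "`bg`='" + data[1] + "', "
else:
    fields += "`bg`, "
    vals += "'" + data[1] + "', "

# ...same pattern for the remaining keys...

if for_update:
    # restricted to the same user_id as in the question's SELECT
    update = "UPDATE `ff_usermeta` SET " + for_update.rstrip(", ") + " WHERE `user_id` = 1"
if fields:
    insert = ("INSERT INTO `ff_usermeta` (" + fields.rstrip(", ") + ") "
              "VALUES (" + vals.rstrip(", ") + ")")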
This is not complete, and you cannot update your rows the way you sketched it in the question, but it should give you a starting point for building your query.
The first SELECT returns exactly one row if the user_id exists.
user_id doesn't seem like the right choice for this, but it is enough to show what you can do.
If the query returns no row, the code inserts data that you obtain from elsewhere.
Both the UPDATE and the INSERT are wrong in this form, because you have to insert 5 new rows or update at most 5 rows; that part is for you to program.
import mysql.connector as mysql

HOST = "localhost"
DATABASE = ""
USER = "root"
PASSWORD = "mypassword"

cnx = mysql.connect(host=HOST, database=DATABASE, user=USER, password=PASSWORD)
cursor = cnx.cursor(dictionary=True)  # dictionary cursor so rows can be read by column name
user_id = 1

get_meta = ("""SELECT umeta_id, user_id,
    MAX(IF(`meta_key`='nickname', meta_value, '')) AS 'nickname',
    MAX(IF(`meta_key`='info', meta_value, '')) AS 'info',
    MAX(IF(`meta_key`='bg', meta_value, '')) AS 'bg',
    MAX(IF(`meta_key`='avatar', meta_value, '')) AS 'avatar',
    MAX(IF(`meta_key`='profile_updated', meta_value, '')) AS 'profile_updated'
    FROM `ff_usermeta` WHERE `user_id` = %s
    GROUP BY umeta_id, user_id""")
cursor.execute(get_meta, (user_id,))
data = cursor.fetchone()

if data:
    for_update = ""
    #some processing of the result
    if not data["nickname"]:
        for_update += "`nickname`='" + data["nickname"] + "', "
    if not data["bg"]:
        for_update += "`bg`='" + data["bg"] + "', "
    query = ("UPDATE `ff_usermeta` SET " + for_update.rstrip(", ")
             + " WHERE user_id = " + str(user_id))
else:
    # no data to gather as there is no row for this user_id yet: add a new user
    nickname = ""
    bg = ""
    info = ""
    avatar = ""
    profile_updated = ""
    fields = "`nickname`, `info`, `bg`, `avatar`, `profile_updated`"
    vals = ("'" + nickname + "', '" + info + "', '" + bg + "', '"
            + avatar + "', '" + profile_updated + "'")
    query = ("INSERT INTO `ff_usermeta`(" + fields + ") VALUES (" + vals + ")")

cursor.execute(query)
cnx.commit()
cursor.close()
cnx.close()
I tried my best to adapt the suggestion above, but I couldn't figure out how to make it work. Eventually I went another way, and it seems to work, so I'll post the full code in case anyone finds it useful.
What it does: it checks the queue in a table of validation requests, then parses a page (separate function) and updates the user profile accordingly.
import mysql.connector as mysql
import time
from datetime import datetime

cnx = mysql.connect(host=HOST, database=DATABASE, user=USER, password=PASSWORD)

while True: #endless loop as a temporary scheduler
    cursor = cnx.cursor()
    #getting first request in the queue - 0: id, 1: url, 2: parse, 3: status, 4: user, 5: user_page, 6: req_date, 7: action
    cursor.execute("SELECT * FROM `ff_qq` WHERE status = 0 LIMIT 1")
    row = cursor.fetchone()
    if row:
        status = 1 #processed
        if row[7] == "verify":
            get_user = ("SELECT * FROM `ff_users` WHERE ID = %s LIMIT 1")
            cursor.execute(get_user, (row[4],))
            user = cursor.fetchone() #0 - ID, 5 - user_url, 8 - user_status, 9 - display_name
            #separate function that returns data to insert into mysql
            udata = verify(row) #0 - nickname, 1 - fb_av, 2 - fb_bg, 3 - fb_info, 4 - owner
            ustat = row[1].split("/authors/")
            if udata['owned'] or user[8] == ustat[1]:
                update_user = ("UPDATE `ff_users` SET user_status = %s, display_name = %s, user_url = %s WHERE ID = %s LIMIT 1")
                cursor.execute(update_user, (ustat[1], udata['nickname'], row[1], user[0]))
                status = 2 #success
                get = ("SELECT `meta_value` FROM `ff_usermeta` WHERE `user_id`= %s AND `meta_key`='ff_capabilities' LIMIT 1")
                cursor.execute(get, (row[4],))
                rights = cursor.fetchone()
                if rights and rights[0] == 'a:1:{s:10:"subscriber";b:1;}':
                    promote = ("UPDATE `ff_usermeta` SET `meta_value` = 'a:1:{s:6:\"author\";b:1;}' "
                               "WHERE `user_id` = %s AND `meta_key`='ff_capabilities' LIMIT 1")
                    cursor.execute(promote, (row[0],))
                #list of meta_key values in same order as returned data
                ff = ['nickname', 'fb_av', 'fb_bg', 'fb_info']
                for x in range(len(ff)): #goes through each one of the above list
                    if udata[ff[x]]: #yes this actually works, who would've thought?..
                        #current meta_key added directly into the string
                        get = ("SELECT `meta_value` FROM `ff_usermeta` WHERE `user_id`= %s AND `meta_key`='" + ff[x] + "' LIMIT 1")
                        cursor.execute(get, (row[4],))
                        meta = cursor.fetchone()
                        if(meta): #update if it exists, otherwise insert new row
                            qq = ("UPDATE `ff_usermeta` SET `meta_value` = %s "
                                  "WHERE `user_id` = %s AND `meta_key`='" + ff[x] + "' LIMIT 1")
                        else:
                            qq = ("INSERT INTO `ff_usermeta`(`meta_value`, `meta_key`, `user_id`) "
                                  "VALUES (%s, '" + ff[x] + "', %s)")
                        cursor.execute(qq, (udata[ff[x]], row[0])) #same execute works for both
            else:
                status = 3 #verification failed
        #update queue to reflect its status
        update = ("UPDATE `ff_qq` SET status = %s WHERE id = %s LIMIT 1")
        cursor.execute(update, (status, row[0]))
        cnx.commit()
    cursor.close()
    now = datetime.now()
    print(now.strftime("%d.%m.%Y %H:%M:%S"))
    time.sleep(180) #sleep until it's time to re-check the queue

cnx.close()
I'm in need of a simple Python script that will fetch data from an MSSQL database on a trigger and then send it to Telegram. My problem is that I cannot write an appropriate while loop to hold it until the SQL trigger fires. Here's the code:
import pymssql

conn = pymssql.connect(server='serv', user='user', password='pwd', database='DB')
cursor = conn.cursor()
print('connection success')

# Select Query
print('Reading data from table')
with conn.cursor() as cursor:
    row = str(0)
    while row == 0:
        cursor.execute("""
            CREATE TRIGGER Server_enter
            ON pLogData
            AFTER INSERT
            AS SELECT [TimeVal],[Remark],[Name],[FirstName]
            FROM [data1]
            INNER JOIN [data2]
            ON [ID]=[ID]
            WHERE [Remark] LIKE '%server%' """)
        row = cursor.fetchone()
    print(str(row[0]) + " " + str(row[1]) + " " + str(row[2]) + " " + str(row[3]))
Right now it just prints "Reading data from table" and the process finishes.
How do I write an appropriate loop?
The problem is how you're setting up your loop. You have:
row = str(0)
while row == 0:
    # code
however, the string "0" is not equal to the integer 0, so the loop body will never execute and will simply be skipped.
You will hit a similar issue on subsequent iterations, because fetchone doesn't return 0 when no data is available; it returns None. You should base the loop condition on that:
row = None
while row is None:
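Putting it together, a minimal polling sketch along those lines might look like this. It assumes the SELECT is run directly on each pass (a CREATE TRIGGER statement itself returns no result set to fetch) and that the join columns are qualified with their table names; adjust both to your schema.

# Sketch only: poll until the query returns a row, then print it.
import time

row = None
while row is None:
    cursor.execute("""
        SELECT [TimeVal], [Remark], [Name], [FirstName]
        FROM [data1]
        INNER JOIN [data2] ON [data1].[ID] = [data2].[ID]
        WHERE [Remark] LIKE '%server%' """)
    row = cursor.fetchone()
    if row is None:
        time.sleep(5)   # wait a bit before polling again
print(str(row[0]) + " " + str(row[1]) + " " + str(row[2]) + " " + str(row[3]))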
I have some Python code that gets data from one database (SQL Server) and inserts it into another (MySQL). I am trying to add a WHERE NOT EXISTS to the INSERT query so that only new rows are inserted, but I need to use one of the values in the SageResult tuples a second time for the primary key check.
Code:
import mysql.connector
import pyodbc

def insert_VPS(SageResult):
    query = """
    INSERT INTO SOPOrderReturn(SOPOrderReturnID, DocumentTypeID, DocumentNo, DocumentDate, CustomerID, CustomerTypeID, CurrencyID, SubtotalGoodsValue, TotalNetValue, TotalTaxValue, TotalGrossValue, SourceTypeID, SourceDocumentNo)
    VALUES(%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s)
    WHERE NOT EXISTS (SELECT * FROM SOPOrderReturn WHERE SOPOrderReturnID = %1$s)"""
    try:
        mydbVPS = mysql.connector.connect(
            host="address",
            user="user",
            passwd="password",
            database="database"
        )
        VPScursor = mydbVPS.cursor()
        #print(SageResult)
        VPScursor.executemany(query, SageResult)
        mydbVPS.commit()
    except Exception as e:
        print('InsertError:', e)
    finally:
        VPScursor.close()
        mydbVPS.close()

def main():
    selectQuery = """
    SELECT TOP 51 [SOPOrderReturnID]
        ,[DocumentTypeID]
        ,[DocumentNo]
        ,[DocumentDate]
        ,[CustomerID]
        ,[CustomerTypeID]
        ,[CurrencyID]
        ,[SubtotalGoodsValue]
        ,[TotalNetValue]
        ,[TotalTaxValue]
        ,[TotalGrossValue]
        ,[SourceTypeID]
        ,[SourceDocumentNo]
    FROM [Live].[dbo].[SOPOrderReturn]
    """
    try:
        mydbSage = pyodbc.connect('Driver={SQL Server};'
                                  'Server=CRMTEST;'
                                  'Database=Live;'
                                  'UID=sa;'
                                  'PWD=password;')
        Sagecursor = mydbSage.cursor()
        Sagecursor.execute(selectQuery)
        #SageResult = tuple(Sagecursor.fetchall())
        SageResult = []
        while True:
            row = Sagecursor.fetchone()
            if row:
                SageResult.append(tuple(row))
            else:
                break
        #SageResult = Sagecursor.fetchall()
        mydbSage.commit()
    except Exception as e:
        print('MainError:', e)
    finally:
        Sagecursor.close()
        mydbSage.close()
    insert_VPS(SageResult)

if __name__ == '__main__':
    main()
Output:
D:\xampp\htdocs\stripe\group\beta>sql-sync.py
InsertError: 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 'WHERE NOT EXISTS (SELECT * FROM SOPOrderReturn WHERE SOPOrderReturnID = %1$s),(1' at line 3
The part in question is the query string variable. Everything else in here works fine. I basically need to use the SOPOrderReturnID value from the tuple a second time where I currently have %1$s
What is the issue with the query syntax? Is my use of %1$s correct?
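For what it's worth, MySQL/MariaDB does not accept a WHERE clause on INSERT ... VALUES, and mysql.connector has no %1$s-style positional reuse, so the ID has to be passed a second time as an extra parameter. One common workaround is INSERT ... SELECT ... WHERE NOT EXISTS; the sketch below reuses the SageResult tuples from the code above and is untested against the actual schema.

# Sketch only: repeat the ID as a 14th parameter instead of reusing %1$s.
query = """
    INSERT INTO SOPOrderReturn (SOPOrderReturnID, DocumentTypeID, DocumentNo,
        DocumentDate, CustomerID, CustomerTypeID, CurrencyID, SubtotalGoodsValue,
        TotalNetValue, TotalTaxValue, TotalGrossValue, SourceTypeID, SourceDocumentNo)
    SELECT %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s
    FROM DUAL
    WHERE NOT EXISTS (
        SELECT 1 FROM SOPOrderReturn WHERE SOPOrderReturnID = %s
    )
"""
# Append the first column (SOPOrderReturnID) again at the end of each row tuple.
params = [row + (row[0],) for row in SageResult]
VPScursor.executemany(query, params)
mydbVPS.commit()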
class code:
def rowcount(self):
    return self._psql_cur.rowcount
And the code in the main program:
def sql_query(self, sql, *data, **kwdata):
    """
    NOTE: This function returns a generator. So if you use it to do any kind of update to the dbms that doesn't
    return anything, it won't be executed!
    """
    self.last_select_id += 1
    n_retrials = kwdata.get("___n_retrials", 0)
    if n_retrials > 10:
        raise OperationalError
    assert not (len(data) > 0 and len(set(kwdata) - {"___n_retrials"}) > 0), \
        "Pass either keyword-based data or comma-separated data."
    time_start = time.time()
    n_records_retrieved = 0
    status = None
    toclose = False
    print "*********************inside db.py******************"
    if self.logfile is not None:
        self.logfile.write(">>> {} {} {} START SELECT\n{}\ndata={}\nkwdata={}\n\n".format(
            self.cursor_id, self.last_select_id, time_start, sql, data, kwdata))
    print "\n************* QUERY:\n", sql
    print "\n***************************"
    try:
        if len(data) > 0:
            print "\n**************printing data***********", data
            print "\n******************printing sql**************************", sql
            print "\n*******************************************************"
            # self._psql_cur.execute(sql, data)
            cur, toclose = self._execute_query(sql, data)
        elif len(kwdata) > 0:
            # self._psql_cur.execute(sql, kwdata)
            cur, toclose = self._execute_query(sql, kwdata)
        else:
            cur, toclose = self._execute_query(sql, None)
        print "################check###################"
        n_records_reported = cur.rowcount
        print "\n*************printing rowcount**********", n_records_reported
        # Yield records
        for record in cur:
            n_records_retrieved += 1
            if n_records_retrieved == n_records_reported:
                status = "Finished"
            yield record
The following code contains _execute_query:
def _execute_query(self, sql, args):
    # sql = sql.lower().strip()
    # print sql
    sql_strip = sql.lower().strip()
    print "-------4", args
    # print self.dbname, sql_strip
    if sql_strip.startswith("select ") or \
            (sql_strip.startswith("with ")
             # and "update " not in sql_strip and "insert " not in sql_strip
             ):
        # Try to close previous named cursor
        # if self._psql_cur is not None and not self._psql_cur.closed:
        #     try:
        #         self._psql_cur.close()
        #     except ProgrammingError:
        #         pass
        # self._psql_cur.scroll(self._psql_cur.rowcount, mode="absolute")
        # self._psql_cur.fetchone()
        # self._psql_cur.fetchone()
        # Create a new named cursor
        self._psql_cur = self.connection.get_cursor()
        print self.dbname, "NAMED", self._psql_cur
        # Execute query
        self._psql_cur.execute(sql, args)
        rows = self._psql_cur.fetchall()
        print "FETCHED RESULT: ", rows
        print sql
        return rows, True
        #self._psql_cur.execute("""select * from recipes""")
        #rows = self._psql_cur.fetchall()
        #print "---------------5 ", rows[0]
        #self._psql_cur.fetchall()
        return self._psql_cur, True
    else:
        # if "insert " in sql or "update " in sql or "delete " in sql or "create" in sql:
        # print self.dbname, "UNNAMED"
        # In this case, do not use the named (server side) cursor
        # self._psql_unnamed_cur = self._connection.get_cursor(named=False)
        self._psql_unnamed_cur.execute(sql, args)
        return self._psql_unnamed_cur, False
I can't figure out why I'm getting this error.
I am trying to get data from a database here. This is part of code available on GitHub (the PACKAGE QUERY project). This is the output I am getting:
Exception occured while executing query:
File "src/dbms/db.py", line 378, in sql_query
n_records_reported = cur.rowcount
AttributeError: 'list' object has no attribute 'rowcount'
Exception during experiment
'list' object has no attribute 'rowcount'
Please tell me if you need more information about this. :-)
Your _execute_query method returns a list when your query starts with select or with:
if sql_strip.startswith("select ") or \
        (sql_strip.startswith("with ")
         # and "update " not in sql_strip and "insert " not in sql_strip
         ):
    # ...
    rows = self._psql_cur.fetchall()
    print "FETCHED RESULT: ", rows
    print sql
    return rows, True
rows is a list, not a cursor, so it won't have that attribute. Either return the cursor there, or use len() to get the count of rows.
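For example, a minimal sketch of the len() route, keeping _execute_query as it is and only adjusting the lines in sql_query that read the count:

cur, toclose = self._execute_query(sql, data)
if isinstance(cur, list):
    n_records_reported = len(cur)   # fetchall() result: a plain list has no .rowcount
else:
    n_records_reported = cur.rowcount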