Why is encrypted data changed when retrieved from the database? - Python

I am having a problem with data corruption. I save the data to a database after encrypting it, but when I compare it with the same data after retrieving it, the values are not equal.
It seems to me that the data is changed after saving.
import base64
import hashlib
import MySQLdb
from Crypto.Cipher import AES

RPasswd = "pw2013"
Passwd = "2password"
TVkey = hashlib.sha256(Passwd).digest()
# base64-encode the password before encryption to remove special characters
Rencoded = base64.b64encode(RPasswd)
# pad the length to a multiple of 16 with a space character for encryption
Rencoded += ((16 - len(Rencoded) % 16) * " ")
# encrypt password to be stored as the user's credentials
encRPasswd = AES.new(TVkey, AES.MODE_ECB).encrypt(Rencoded)
# Open database connection
db = MySQLdb.connect("localhost","root","root","db" )
# prepare a cursor object using cursor() method
cursor = db.cursor()
try:
    cursor.execute("select `group_id`, `group`, `password` from db.groups where group_id = 1")
    #print "number of rows" + cursor.rowcount
    results = cursor.fetchall()
    for row in results:
        print row[0]
        print row[1]
        pw = row[2]
        print "test"
    db.commit()
except:
    db.rollback()
# disconnect from server
db.close()
if encRPasswd == pw:
    print "same"
else:
    print "not same"
When I run the code I get "not same". Ideally, both values should be the same.
Any hint?
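One common cause of this symptom: the AES ciphertext is raw binary, and a text column (or a column that is too short) will not necessarily store those bytes unchanged. A minimal sketch, assuming the `password` column is a text type and the same Python 2 / PyCrypto setup as above - base64-encode the ciphertext before storing it, and compare the fetched value against the same encoding (storing the column as a binary type such as VARBINARY/BLOB is the other option):
import base64
import hashlib
from Crypto.Cipher import AES

def encrypt_for_storage(plain, key_passwd="2password"):
    key = hashlib.sha256(key_passwd).digest()
    padded = base64.b64encode(plain)
    padded += (16 - len(padded) % 16) * " "          # same space padding as above
    cipher = AES.new(key, AES.MODE_ECB).encrypt(padded)
    return base64.b64encode(cipher)                  # text-safe value to store and compare

# store encrypt_for_storage(RPasswd) in the `password` column,
# then compare it directly with the fetched string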

Related

Unable to use WHERE with execute() in SQL

The program just gets the phone number based on the car plate number.
I keep getting an error: "could not process parameter, it must be of type list, tuple or dict".
import mysql.connector as mysql

def get_number(carplatenum):
    # Connect to the database
    conn = mysql.connect(
        host="127.0.0.1",
        user="root",
        passwd="",
        database="py")
    print(conn)
    # Create a cursor
    cursor = conn.cursor()
    sql = "SELECT number FROM gov_info WHERE carplate=col1=%s"
    params = (carplatenum)
    # Execute the SQL query
    cursor.execute(sql, params)
    # Fetch and save the result
    result = cursor.fetchone()
    # Close the cursor and connection
    cursor.close()
    # return the number if a match is found, otherwise return None
    if result:
        return result[0]
    else:
        return None

# Example usage:
carplate = 'SJJ4649G'
number = get_number(carplate)
if number:
    print(f"The number for carplate {carplate} is {number}")
else:
    print(f"No match found for carplate {carplate}.")
If I run it with a normal SQL query such as "SELECT number FROM gov_info WHERE carplate='SJJ4649G'", it gives the correct output.
The changes that fixed it:
sql = "SELECT number FROM gov_info WHERE carplate=%s"
params = (carplatenum,)
cursor.execute(sql,params)
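The trailing comma is what actually changes the type here: without it the parentheses are just grouping, so params is still a plain string, which is why the connector complains that parameters must be a list, tuple or dict. A quick illustration:
params = (carplatenum)    # still just a string - parentheses alone don't create a tuple
params = (carplatenum,)   # a one-element tuple, which cursor.execute() accepts
params = [carplatenum]    # a list works as well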

MySQL update or insert based on fetchall results in Python

I need to set some user meta in my WordPress install through a local Python script, so I can't use WP's update_user_meta for it - it has to be done manually.
import mysql.connector as mysql
cnx = mysql.connect(host=HOST, database=DATABASE, user=USER, password=PASSWORD)
cursor = cnx.cursor()
get_meta = ("SELECT * FROM `ff_usermeta` WHERE `user_id`= 1 AND (`meta_key`='nickname' OR `meta_key`='info' OR `meta_key`='bg' OR `meta_key`='avatar' OR `meta_key`='profile_updated')")
cursor.execute(get_meta)
meta = cursor.fetchall()
#some processing of the result
cursor.execute(q, (...))
cnx.commit()
cursor.close()
cnx.close()
Now I need to check if the result has meta with each of the keys.
If the key already exists for this user, it needs to run UPDATE for this meta.
If this user still has no meta with this key, it has to INSERT a new row.
if (there's no 'nickname' in meta_key on any of the 5 or fewer rows):
    q = ("INSERT INTO `ff_usermeta` ...")
else:
    q = ("UPDATE `ff_usermeta` ...")
...and 4 more times like that?.. Seems like a good place for a loop, but I don't really like the idea of running 5 separate queries, especially since there might be more fields in the future.
I was thinking along the lines of searching the fetchall result for matches in meta_key and, if found, adding the required data to one array, if not - to another, and then just running one UPDATE and one INSERT at the end, assuming both are not empty. If I were to write it in semi-PHP style, it would look roughly like this:
if (in_array("nickname", meta)):
    for_update .= "`nickname`='"+data[0]+"', "
else:
    fields .= "`nickname`, "
    vals .= "'"+data[0]+"', "
if (in_array("bg", meta)):
    for_update .= "`bg`='"+data[1]+"', "
else:
    fields .= "`bg`, "
    vals .= "'"+data[1]+"', "
if (for_update):
    update = ("UPDATE `ff_usermeta` SET "+for_update+" WHERE 1")
if (fields):
    insert = ("INSERT INTO `ff_usermeta`("+fields+") VALUES ("+vals+")")
But I have absolutely no clue how to translate it correctly to Python. I had to google things like "why does the dot not work to add one string to another". Any advice? Or perhaps there is a better way? Thanks!
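For reference, a rough Python rendering of the snippet above (a sketch only: += replaces PHP's .=, the meta_key values are pulled out of the fetched rows first, data is assumed to hold the new values as in the pseudocode, and the column index for meta_key may need adjusting):
meta_keys = [m[2] for m in meta]  # assumes meta_key is the third column of SELECT *

for_update, fields, vals = "", "", ""

if "nickname" in meta_keys:
    for_update += "`nickname`='" + data[0] + "', "
else:
    fields += "`nickname`, "
    vals += "'" + data[0] + "', "

if "bg" in meta_keys:
    for_update += "`bg`='" + data[1] + "', "
else:
    fields += "`bg`, "
    vals += "'" + data[1] + "', "

if for_update:
    update = "UPDATE `ff_usermeta` SET " + for_update.rstrip(", ") + " WHERE `user_id` = 1"
if fields:
    insert = ("INSERT INTO `ff_usermeta` (" + fields.rstrip(", ") + ") "
              "VALUES (" + vals.rstrip(", ") + ")")
(Building SQL by concatenating values like this is fragile; the parameterized queries used elsewhere on this page are safer.)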
This is not complete - you cannot update your rows that way - but it gives you something to start building your query from.
The first SELECT gets exactly one row if the user_id exists.
user_id doesn't seem like the right choice for this, but it is enough to show what you can do.
If the query returns no entry, the code inserts some data you get from elsewhere.
Both the UPDATE and the INSERT are wrong in this form, as you would have to insert 5 new rows or update at most 5 rows - that part is left for you to program.
import mysql.connector as mysql

HOST = "localhost"
DATABASE = ""
USER = "root"
PASSWORD = "mypassword"

cnx = mysql.connect(host=HOST, database=DATABASE, user=USER, password=PASSWORD)
cursor = cnx.cursor(dictionary=True)  # dictionary cursor so rows can be accessed by column name

user_id = 1
get_meta = ("""SELECT umeta_id, user_id,
    MAX(IF(`meta_key`='nickname', meta_value, '')) AS 'nickname',
    MAX(IF(`meta_key`='info', meta_value, '')) AS 'info',
    MAX(IF(`meta_key`='bg', meta_value, '')) AS 'bg',
    MAX(IF(`meta_key`='avatar', meta_value, '')) AS 'avatar',
    MAX(IF(`meta_key`='profile_updated', meta_value, '')) AS 'profile_updated'
    FROM `ff_usermeta` WHERE `user_id`= %s GROUP BY umeta_id, user_id""")
cursor.execute(get_meta, (user_id,))
data = cursor.fetchone()
if data:
    for_update = ""
    # some processing of the result
    if not data["nickname"]:
        for_update += "`nickname`='" + data["nickname"] + "', "
    if not data["bg"]:
        for_update += "`bg`='" + data["bg"] + "', "
    query = ("UPDATE `ff_usermeta` SET " + for_update + " WHERE user_id = " + str(user_id))
else:
    # no data to gather as there is no such user_id present - add a new user
    nickname = ""
    bg = ""
    info = ""
    avatar = ""
    profile_updated = ""
    fields = ""
    vals = ""
    fields += "`nickname`, `info`, `bg`, `avatar`, `profile_updated`"
    vals += "'" + nickname + "', '" + info + "', '" + bg + "', '" + avatar + "', '" + profile_updated + "'"
    query = ("INSERT INTO `ff_usermeta`(" + fields + ") VALUES (" + vals + ")")
cursor.execute(query)
cnx.commit()
cursor.close()
cnx.close()
I tried my best to adapt the suggestion above, but couldn't figure out how to make it work. Eventually I went another way, and it seems to work somehow, so I'll post the full code in case anyone finds it useful.
What it does: checks the queue in a table of validation requests, then parses a page (separate function) and updates the user profile accordingly.
import mysql.connector as mysql
import time
from datetime import datetime

cnx = mysql.connect(host=HOST, database=DATABASE, user=USER, password=PASSWORD)

while True:  # endless loop as a temporary scheduler
    cursor = cnx.cursor()
    # getting first request in the queue - 0: id, 1: url, 2: parse, 3: status, 4: user, 5: user_page, 6: req_date, 7: action
    cursor.execute("SELECT * FROM `ff_qq` WHERE status = 0 LIMIT 1")
    row = cursor.fetchone()
    if row:
        status = 1  # processed
        if row[7] == "verify":
            get_user = ("SELECT * FROM `ff_users` WHERE ID = %s LIMIT 1")
            cursor.execute(get_user, (row[4],))
            user = cursor.fetchone()  # 0 - ID, 5 - user_url, 8 - user_status, 9 - display_name
            # separate function that returns data to insert into mysql
            udata = verify(row)  # 0 - nickname, 1 - fb_av, 2 - fb_bg, 3 - fb_info, 4 - owner
            ustat = row[1].split("/authors/")
            if udata['owned'] or user[8] == ustat[1]:
                update_user = ("UPDATE `ff_users` SET user_status = %s, display_name = %s, user_url = %s WHERE ID = %s LIMIT 1")
                cursor.execute(update_user, (ustat[1], udata['nickname'], row[1], user[0]))
                status = 2  # success
                get = ("SELECT `meta_value` FROM `ff_usermeta` WHERE `user_id`= %s AND `meta_key`='ff_capabilities' LIMIT 1")
                cursor.execute(get, (row[4],))
                rights = cursor.fetchone()
                if rights and rights[0] == 'a:1:{s:10:"subscriber";b:1;}':
                    promote = ("UPDATE `ff_usermeta` SET `meta_value` = 'a:1:{s:6:\"author\";b:1;}' "
                               "WHERE `user_id` = %s AND `meta_key`='ff_capabilities' LIMIT 1")
                    cursor.execute(promote, (row[0],))
                # list of meta_key values in same order as returned data
                ff = ['nickname', 'fb_av', 'fb_bg', 'fb_info']
                for x in range(len(ff)):  # goes through each one of the above list
                    if udata[ff[x]]:  # yes this actually works, who would've thought?..
                        # current meta_key added directly into the string
                        get = ("SELECT `meta_value` FROM `ff_usermeta` WHERE `user_id`= %s AND `meta_key`='" + ff[x] + "' LIMIT 1")
                        cursor.execute(get, (row[4],))
                        meta = cursor.fetchone()
                        if meta:  # update if it exists, otherwise insert new row
                            qq = ("UPDATE `ff_usermeta` SET `meta_value` = %s "
                                  "WHERE `user_id` = %s AND `meta_key`='" + ff[x] + "' LIMIT 1")
                        else:
                            qq = ("INSERT INTO `ff_usermeta`(`meta_value`, `meta_key`, `user_id`) "
                                  "VALUES (%s, '" + ff[x] + "', %s)")
                        cursor.execute(qq, (udata[ff[x]], row[0]))  # same execute works for both
            else:
                status = 3  # verification failed
        # update queue to reflect its status
        update = ("UPDATE `ff_qq` SET status = %s WHERE id = %s LIMIT 1")
        cursor.execute(update, (status, row[0]))
        cnx.commit()
    cursor.close()
    now = datetime.now()
    print(now.strftime("%d.%m.%Y %H:%M:%S"))
    time.sleep(180)  # sleep until it's time to re-check the queue
cnx.close()
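As a side note, not part of either answer above: if a UNIQUE index on (`user_id`, `meta_key`) were added to the table (a stock WordPress usermeta table does not have one), MySQL's INSERT ... ON DUPLICATE KEY UPDATE could replace the per-key select/update/insert with a single statement, roughly:
upsert = ("INSERT INTO `ff_usermeta` (`user_id`, `meta_key`, `meta_value`) "
          "VALUES (%s, %s, %s) "
          "ON DUPLICATE KEY UPDATE `meta_value` = VALUES(`meta_value`)")
for key in ff:
    if udata[key]:
        cursor.execute(upsert, (row[4], key, udata[key]))  # row[4] = user id from the queue row
cnx.commit()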

How to maintain a "while" loop in Python?

I need a simple Python script that will fetch data from an MSSQL database on a trigger and then send it to Telegram. My problem is that I cannot write an appropriate while loop to hold it until the SQL trigger fires. Here's the code:
import pymssql

conn = pymssql.connect(server='serv', user='user', password='pwd', database='DB')
cursor = conn.cursor()
print('connection success')

# Select Query
print('Reading data from table')
with conn.cursor() as cursor:
    row = str(0)
    while row == 0:
        cursor.execute("""
            CREATE TRIGGER Server_enter
            ON pLogData
            AFTER INSERT
            AS SELECT [TimeVal],[Remark],[Name],[FirstName]
            FROM [data1]
            INNER JOIN [data2]
            ON [ID]=[ID]
            WHERE [Remark] LIKE '%server%' """)
        row = cursor.fetchone()
        print(str(row[0]) + " " + str(row[1]) + " " + str(row[2]) + " " + str(row[3]))
Right now it only prints "Reading data from table" and then the process finishes.
How do I write the loop properly?
The problem is how you're setting up your loop. You have:
row = str(0)
while row == 0:
    # code
however, the string "0" is not equal to the integer 0, so the loop will never execute and will be skipped.
You will hit a similar issue on subsequent iterations as fetchone doesn't return 0 when no data is available, it returns None. You should be basing the loop condition on that:
row = None
while row is None:
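Put together, the polling loop could look roughly like this (a sketch only: it assumes the statement run inside the loop is a plain SELECT that eventually returns a row - the CREATE TRIGGER in the question returns no result set - and it qualifies the join columns and adds a short sleep, neither of which was in the original):
import time

row = None
while row is None:
    cursor.execute("""
        SELECT [TimeVal], [Remark], [Name], [FirstName]
        FROM [data1]
        INNER JOIN [data2] ON [data1].[ID] = [data2].[ID]
        WHERE [Remark] LIKE '%server%' """)
    row = cursor.fetchone()
    if row is None:
        time.sleep(10)  # wait before polling the table again
print(str(row[0]) + " " + str(row[1]) + " " + str(row[2]) + " " + str(row[3]))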

Return text from an Oracle BLOB column using Python

I have this code to read the text from a BLOB column of my database:
import cx_Oracle

ip = 'your host'
port = 1521
SID = 'ORCL'
USER = 'user'
PASSWORD = 'password'

dsn = cx_Oracle.makedsn(ip, port, SID)
orcl = cx_Oracle.connect(USER + '/' + PASSWORD + '@' + dsn)
curs = orcl.cursor()
sql = """SELECT blob_column from table"""
curs.execute(sql)
rows = curs.fetchall()
for x in rows:
    list_ = list(x)
    print(x[0].read)
But when I print inside the for loop, I get this result:
<built-in method read of cx_Oracle.LOB object at 0x0547EAE8>
<built-in method read of cx_Oracle.LOB object at 0x0547EAD0>
<built-in method read of cx_Oracle.LOB object at 0x0711D770>
How can I return the text from my BLOB column?
For LOBs that fit in memory, you'll probably find it a lot faster to read them as strings or bytes; see https://cx-oracle.readthedocs.io/en/latest/user_guide/lob_data.html
def OutputTypeHandler(cursor, name, defaultType, size, precision, scale):
    if defaultType == cx_Oracle.CLOB:
        return cursor.var(cx_Oracle.LONG_STRING, arraysize=cursor.arraysize)
    if defaultType == cx_Oracle.BLOB:
        return cursor.var(cx_Oracle.LONG_BINARY, arraysize=cursor.arraysize)

# assumes an open cx_Oracle connection and cursor, and a table lob_tbl(id, c CLOB, b BLOB)
idVal = 1
textData = "The quick brown fox jumps over the lazy dog"
bytesData = b"Some binary data"
cursor.execute("insert into lob_tbl (id, c, b) values (:1, :2, :3)",
               [idVal, textData, bytesData])

connection.outputtypehandler = OutputTypeHandler
cursor.execute("select c, b from lob_tbl where id = :1", [idVal])
clobData, blobData = cursor.fetchone()
print("CLOB length:", len(clobData))
print("CLOB data:", clobData)
print("BLOB length:", len(blobData))
print("BLOB data:", blobData)
Got it!
wkt = rows[0][0].read()  # This works for me!
print(wkt.decode("utf-16"))  # my text came back as UTF-16, so I had to decode it
orcl.close()

Python code not creating tables in the database, but able to query the results (Postgres)

My use case is to create a temp table in the Postgres database, fetch records from it, and insert them into a different table.
The code I used is:
from __future__ import print_function
import psycopg2
import sys
import pprint
from os.path import join, dirname, abspath
import xlrd
import os.path

newlist = []
itemidlist = []

def main():
    conn_string = "host='prod-dump.cvv9i14mrv4k.us-east-1.rds.amazonaws.com' dbname='ebdb' user='ebroot' password='*********'"
    # print the connection string we will use to connect
    # print "Connecting to database" % (conn_string)
    # get a connection; if a connection cannot be made an exception will be raised here
    conn = psycopg2.connect(conn_string)
    # conn.cursor will return a cursor object, you can use this cursor to perform queries
    cursor = conn.cursor()
    dealer_id = input("Please enter dealer_id: ")
    group_id = input("Please enter group_id: ")
    scriptpath = os.path.dirname('__file__')
    filename = os.path.join(scriptpath, 'Winco - Gusti.xlsx')
    xl_workbook = xlrd.open_workbook(filename, "rb")
    xl_sheet = xl_workbook.sheet_by_index(0)
    print('Sheet Name: %s' % xl_sheet.name)
    row = xl_sheet.row(0)
    from xlrd.sheet import ctype_text
    print('(Column #) type:value')
    for idx, cell_obj in enumerate(row):
        cell_type_str = ctype_text.get(cell_obj.ctype, 'unknown type')
        #print('(%s) %s %s' % (idx, cell_type_str, cell_obj.value))
    num_cols = xl_sheet.ncols
    for row_idx in range(0, xl_sheet.nrows):  # Iterate through rows
        num_cols = xl_sheet.ncols
        id_obj = xl_sheet.cell(row_idx, 1)  # Get cell object by row, col
        itemid = id_obj.value
        #if itemid not in itemidlist:
        itemidlist.append(itemid)
    # execute our Query
    '''
    cursor.execute("""
    if not exists(SELECT 1 FROM model_enable AS c WHERE c.name = %s);
    BEGIN;
    INSERT INTO model_enable (name) VALUES (%s)
    END;
    """ %(itemid,itemid))
    '''
    cursor.execute("drop table temp_mbp1")
    try:
        cursor.execute("SELECT p.model_no, pc.id as PCid, g.id AS GROUPid into public.temp_mbp1 FROM products p, \
            model_enable me, products_clients pc, groups g WHERE p.model_no = me.name \
            and p.id = pc.product_id and pc.client_id = %s and pc.client_id = g.client_id and g.id = %s" \
            % (dealer_id, group_id))
    except (Exception, psycopg2.DatabaseError) as error:
        print(error)
    cursor.execute("select count(*) from public.temp_mbp1")
    # retrieve the records from the database
    records = cursor.fetchall()
    # print out the records using pretty print
    # note that the NAMES of the columns are not shown, instead just indexes.
    # for most people this isn't very useful so we'll show you how to return
    # columns as a dictionary (hash) in the next example.
    pprint.pprint(records)

if __name__ == "__main__":
    main()
The try/except block in the middle of the program is not throwing any error, but the table is not getting created in the Postgres database, as far as I can see in the data admin tool.
The output shown is:
Please enter dealer_id: 90
Please enter group_id: 13
Sheet Name: Winco Full 8_15_17
(Column #) type:value
[(3263,)]
Thanks,
Santosh
You didn't commit the changes, so they aren't saved in the database. Add to the bottom, just below the pprint statement:
conn.commit()
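In context, the tail of main() would then look roughly like this (alternatively, using the connection as a context manager - "with conn:" - makes psycopg2 commit automatically on a clean exit):
    pprint.pprint(records)
    conn.commit()   # without this, the table created by SELECT ... INTO is discarded
    cursor.close()
    conn.close()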
