I have the following code
engine = create_engine('database info',convert_unicode=True)
result = engine.execute(" select title from table")
line=(result.fetchall())
print(line)
Some of the results have "\xa0" inserted in them
Is there a way to return my query so that I don't have to try to fix it after the fact?
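\xa0 is the non-breaking space character, so it is most likely stored in the data itself rather than added by SQLAlchemy. If it turns out that it has to be cleaned up after fetching, a minimal sketch (assuming each row's first column is the title) might look like:
result = engine.execute("select title from table")
# Replace non-breaking spaces (U+00A0) with ordinary spaces after fetching
titles = [row[0].replace(u'\xa0', u' ') for row in result.fetchall()]
print(titles)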
Related
Saving a string into a SQLite table, retrieving it again, and comparing it to the original requires some filters to work, and I don't know exactly why.
tl;dr
How can I retrieve string data from the SQLite DB without requiring filter no. 3, as it's dangerous for more complex strings?
import sqlite3
RAWSTRING = 'This is a DB Teststing'
# create database and table
currentdb = sqlite3.connect('test.db')
currentdb.execute('''CREATE TABLE tickertable (teststring text)''')
# enter RAWSTRING into the database
currentdb.execute('''INSERT INTO tickertable VALUES(?);''', (RAWSTRING,))
# get RAWSTRING from database
cursorObj = currentdb.cursor()
cursorObj.execute('SELECT * FROM tickertable')
DB_RAWSTRING = cursorObj.fetchall()
currentdb.commit()
currentdb.close()
# Prints This is a DB Teststing
print('originalstring : ', RAWSTRING)
# Prints [('This is a DB Teststing',)]
print('retrieved from DB: ', DB_RAWSTRING)
# Get first entry from List because fetchall gives a list
FILTER1_DB_RAWSTRING = DB_RAWSTRING[0]
# Convert the list element to a string because it's still a tuple and comparing a tuple to a string fails
FILTER2_DB_RAWSTRING = str(FILTER1_DB_RAWSTRING)
# Remove the annoying extra DB characters; I don't know why they exist anyway
FILTER3_DB_RAWSTRING = FILTER2_DB_RAWSTRING.replace("'", "").replace("(", "").replace(")", "").replace(",", "")
if RAWSTRING == FILTER3_DB_RAWSTRING:
    print('Strings are the same as they should')
else:
    print('Strings are not the same because of db weirdness')
So here's your problem: fetchall returns a list of tuples. This means that casting them to a string puts pesky parentheses around each row and commas between the elements of each row. If you'd like to retrieve the raw information from each column, that can be done by indexing the tuples:
entries = cursorObj.fetchall()
first_row = entries[0]
first_item = first_row[0]
print(first_item)
This ought to print just the content of the first row and column in the DB. If not, let me know!
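Applied to the snippet in the question, a minimal sketch (reusing RAWSTRING and cursorObj from above) would be:
row = cursorObj.fetchone()    # first row of the result
DB_RAWSTRING = row[0]         # first (and only) column: the stored text
if RAWSTRING == DB_RAWSTRING:
    print('Strings are the same as they should')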
I have the following dataframe in pandas
I need to insert all the values into a data warehouse with Chinese characters, but the Chinese characters are inserted as junk (?????) (百å¨è‹±åšï¼ˆèˆŸå±±ï¼‰å•¤é…’有é™å…¬å¸) like the one above.
The insert query is prepared dynamically.
I need help on how to handle the following scenario:
Read the file as UTF-8 and write it into a data warehouse over a pyodbc connection using character set UTF-8.
df=pd.read_csv(filename,dtype='str',encoding='UTF-8')
cnxn = database_connect() ##Connect to database##
cnxn.setencoding(ctype=pyodbc.SQL_CHAR, encoding='UTF-8')
cnxn.autocommit = True
cursor = cnxn.cursor()
for y in range(len(df)):
    inst = 'insert into ' + tablename + ' values ('
    for x in range(len(clm)):
        if str(df.iloc[y, x]) == 'nan':
            df.iloc[y, x] = ''
        if x != len(clm) - 1:
            inst_val = inst_val + "'" + str(df.iloc[y, x]).strip().replace("'", '') + "'" + ","
        else:
            inst_val = inst_val + "'" + str(df.iloc[y, x]).strip().replace("'", '') + "'" + ")"
    inst = inst + inst_val  # prepare insert statement from values inside in-memory data
    inst_val = ''
    print("Inserting value into table")
    try:
        cursor.execute(inst)  # execute insert statement
        print("1 row inserted")
    except Exception as e:
        print(inst)
        print(e)
The values should be inserted into the SQL data warehouse just as they appear in the file.
You are using dynamic SQL to construct string literals containing Chinese characters, but you are creating them as
insert into tablename values ('你好')
when SQL Server expects Unicode string literals to be of the form
insert into tablename values (N'你好')
You would be better off using a proper parameterized query to avoid such issues:
sql = "insert into tablename values (?)"
params = ('你好',)
cursor.execute(sql, params)
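Applied to the loop in the question, a rough sketch (assuming df, clm, tablename, and cursor as defined above, and that the columns in df line up with clm) could look like:
# one ? placeholder per column; the values travel separately as parameters
placeholders = ', '.join(['?'] * len(clm))
sql = 'insert into ' + tablename + ' values (' + placeholders + ')'
for y in range(len(df)):
    row = ['' if str(v) == 'nan' else str(v).strip() for v in df.iloc[y, :len(clm)]]
    cursor.execute(sql, row)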
I have to connect to an Oracle database and see if a table exists. While I can get a list of the tables, I'm having trouble seeing if the table I'm looking for is in the list. Some tables have an associated table which I'll have to join on, some do not, so I have to check.
What is in my list: ('NYSDOH_CI_EI_HOSPITAL',)
sql = "SELECT table_name FROM all_tables"
cur.execute(sql)
searchstr = 'NYSDOH_CI_EI_HOSPITAL'
p = re.compile(searchstr)
#create data array to load in SQL results in.
ciDataSet = []
cxRows = cur.fetchall()
for i in cxRows:
    #print i # list of tables
    if p.match(str(i)):
        print i
It doesn't find it, even if I use a wildcard.
fetchall() returns a list of tuples.
So when you do
for i in cxRows:
'i' is of type tuple. In your case, this tuple will have only a single value. You can access it using i[0] and match it with p.
Currently you are converting a tuple to a string, so the regular expression will not match.
Corrected code:
sql = "SELECT table_name FROM all_tables"
cur.execute(sql)
searchstr = 'NYSDOH_CI_EI_HOSPITAL'
p = re.compile(searchstr)
#create data array to load in SQL results in.
ciDataSet = []
cxRows = cur.fetchall()
for i in cxRows:
    #print i # list of tables
    if p.match(str(i[0])):
        print i
To improve on the syntax of @vaichidrewar, you could simplify the fetch loop to:
for tabname, in cur:
    if p.match(str(tabname)):
        print(tabname)
But it's going to be more efficient to do the reg exp matching in the query:
sql = "select table_name from all_tables where regexp_like(table_name, :tn, 'i')"
searchstr = 'EMP'
cur.execute(sql, (searchstr,))
for tabname, in cur:
    print(tabname)
The 'i' option does a case-insensitive match. You can adjust the regexp as you like.
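If all you need is an exact-name existence check rather than a pattern match, a plain equality comparison with a bind variable is a simpler sketch (assuming the same cur cursor as above):
cur.execute("select count(*) from all_tables where table_name = :tn",
            {'tn': 'NYSDOH_CI_EI_HOSPITAL'})
table_exists = cur.fetchone()[0] > 0
print(table_exists)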
I'm looking to take an array list and attach it to a string.
Python 2.7.10, Windows 10
The list is loaded from a mySQL table and the output is this:
skuArray = [('000381001238',) ('000381001238',) ('000381001238',) ('FA200513652',) ('000614400967',)]
I'm wanting to take this list and attach it to a separate query
the problem:
query = "SELECT ItemLookupCode,Description, Quantity, Price, LastReceived "
query = query+"FROM Item "
query = query+"WHERE ItemLookupCode IN ("+skuArray+") "
query = query+"ORDER BY LastReceived ASC;"
I get the error:
TypeError: cannot concatenate 'str' and 'tuple' objects
My guess here is that I need to format the string as:
'000381001238', '000381001238', '000381001238', 'FA200513652','000614400967'
Ultimately the string needs to read:
query = query+"WHERE ItemLookupCode IN ('000381001238', '000381001238', '000381001238', 'FA200513652','000614400967') "
I have tried the following:
skuArray = ''.join(skuArray.split('(', 1))
skuArray = ''.join(skuArray.split(')', 1))
Second Try:
skus = [sku[0] for sku in skuArray]
stubs = ','.join(["'?'"]*len(skuArray))
msconn = pymssql.connect(host=r'*', user=r'*', password=r'*', database=r'*')
cur = msconn.cursor()
query ='''
SELECT ItemLookupCode,Description, Quantity, Price, LastReceived
FROM Item
WHERE ItemLookupCode IN { sku_params }
ORDER BY LastReceived ASC;'''.format(sku_params = stubs)
cur.execute(query, params=skus)
row = cur.fetchone()
print row[3]
cur.close()
msconn.close()
Thanks in advance for your help!
If you want to do the straight inline SQL you could use a list comprehension:
', '.join(["'{}'".format(sku[0]) for sku in skuArray])
Note: You need to add commas between tuples (based on example)
That said, if you want to do some SQL, I would encourage you to parameterize your request with ?
Here is an example of how you would do something like that:
skuArray = [('000381001238',), ('000381001238',), ('000381001238',), ('FA200513652',), ('000614400967',)]
skus = [sku[0] for sku in skuArray]
stubs = ','.join(['?'] * len(skuArray))
qry = '''
SELECT ItemLookupCode, Description, Quantity, Price, LastReceived
FROM Item
WHERE ItemLookupCode IN ({sku_params})
ORDER BY LastReceived ASC;'''.format(sku_params=stubs)
# assuming pyodbc; connection syntax may be off
conn.execute(qry, skus)
Why?
Non-parameterized queries are a bad idea because they leave you vulnerable to SQL injection, and that is easy to avoid.
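Since the question actually connects with pymssql rather than pyodbc, and pymssql uses %s-style placeholders, a rough sketch adapted to that driver might be:
skus = [sku[0] for sku in skuArray]
placeholders = ', '.join(['%s'] * len(skus))   # pymssql expects %s placeholders
qry = ('SELECT ItemLookupCode, Description, Quantity, Price, LastReceived '
       'FROM Item '
       'WHERE ItemLookupCode IN (' + placeholders + ') '
       'ORDER BY LastReceived ASC')
cur.execute(qry, tuple(skus))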
Assuming that skuArray is a list, like this:
>>> skuArray = [('000381001238',), ('000381001238',), ('000381001238',), ('FA200513652',), ('000614400967',)]
You can format your string like this:
>>> ', '.join(["'{}'".format(x[0]) for x in skuArray])
"'000381001238', '000381001238', '000381001238', 'FA200513652', '000614400967'"
I am new to Python. I am writing a script which queries the database for a URL string. Below is my snippet.
db.execute('select sitevideobaseurl, videositestring '
           'from site, video '
           'where siteID = 1 and site.SiteID = video.VideoSiteID limit 1')
result = db.fetchall()
for row in result:
    videosite = row[0:2]
    print videosite
It gives me baseURL from a table and the video site string from another table.
output: ('http://www.youtube.com/watch?v={0}', 'uqcSJR_7fOc')
I wish to format the output by removing the parentheses, quotes, and commas, and replacing the {0} in the baseURL with the sitestring: uqcSJR_7fOc.
Something like https://www.youtube.com/watch?v=uqcSJR_7fOc in the final output, and I wish to write this to a file.
Thanks for your time and help in advance.
Use str.format.
db.execute('select sitevideobaseurl, videositestring '
           'from site, video '
           'where siteID = 1 and site.SiteID = video.VideoSiteID limit 1')
result = db.fetchall()
for row in result:
    videosite = row[0:2]
    print videosite[0].format(videosite[1])
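If the result also needs to go to a file, as the question mentions, a minimal sketch (the output filename is just illustrative) might be:
with open('video_urls.txt', 'w') as out:
    for row in result:
        url = row[0].format(row[1])   # e.g. https://www.youtube.com/watch?v=uqcSJR_7fOc
        out.write(url + '\n')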
You can use a "replace" command in either Sql or Python.
str_url = str_url.replace('[','')
str_url = str_url.replace(']','')
str_url = str_url.replace('?','')
Something like this in Python, which is the easier of the two.
Repeat for as many characters as you want to chop out.
My str_url is your "videosite".