I'm updating a SQL Server table with the output of Google reverse geocoding (which is in JSON format), using the code below:
import urllib.request
import pyodbc

cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=localhost;DATABASE=mydb;UID=test;PWD=abc#123;autocommit=True')
cursor = cnxn.cursor()
wp = urllib.request.urlopen("http://maps.googleapis.com/maps/api/geocode/json?latlng=18.5504,73.9412&sensor=false")
pw = wp.read()
#print(pw)
cursor.execute("UPDATE GEOCODE_tbl SET JSON_str = ? WHERE GEOCODE_ID = ?", pw,749904)
print('Done')
cnxn.commit()
But it gives this error:
('22018', '[22018] [Microsoft][ODBC SQL Server Driver][SQL Server]Operand type clash: image is incompatible with nvarchar(max) (206) (SQLExecDirectW)')
What kind of error is that?
The JSON_str column is meant to hold that JSON output, and I'm running this update for the rows whose JSON_str column is still NULL.
Does anyone have any idea about it?
The value pw is not of type str: wp.read() returns bytes, which pyodbc binds as a binary (image) parameter, hence the type clash with the nvarchar(max) column. Try converting your query to this:
cursor.execute("UPDATE GEOCODE_tbl SET JSON_str = ? WHERE GEOCODE_ID = ?", (str(pw), 749904))
Good luck!
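As a side note, since pw is the bytes object returned by wp.read(), decoding it explicitly stores the JSON text itself rather than Python's b'...' representation (a small sketch reusing the cursor and ID from the question):

# pw is the raw bytes from wp.read(); decode to get the JSON text itself
json_text = pw.decode('utf-8')
cursor.execute(
    "UPDATE GEOCODE_tbl SET JSON_str = ? WHERE GEOCODE_ID = ?",
    (json_text, 749904),
)
cnxn.commit()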
Related
For some reason, I need to store the array below in SQL Server as a single text value (including the single quotes), using pyodbc.
['Sachin', 'Yuvraj']
I am inserting the above value using the code below:
tes_table= SQLCURSOR.execute('''INSERT INTO Test_Table(test_name) VALUES ('{}')
'''.format(arr))
I am getting the below error.
pyodbc.ProgrammingError: ('42000', "[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Incorrect syntax near 'Sachin'. (102) (SQLExecDirectW)")
[13/Oct/2020 23:54:53] "POST /api/save HTTP/1.1" 500 77431
This is another example of why using string formatting to embed data values into SQL command text is a bad idea. In this case the rendered string literal creates a syntax error because the single quotes are not properly escaped.
>>> arr = ['Sachin', 'Yuvraj']
>>> "... VALUES ('{}')".format(arr)
"... VALUES ('['Sachin', 'Yuvraj']')"
Instead, you should be using a proper parameterized query:
sql = """\
INSERT INTO Test_Table (test_name) VALUES (?)
"""
tes_table = SQLCURSOR.execute(sql, str(arr))
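If the goal is to read the list back out of the column later, serializing it as JSON instead of calling str() is a common alternative (a sketch reusing the question's SQLCURSOR and Test_Table; json.loads can parse the stored value back into a list):

import json

arr = ['Sachin', 'Yuvraj']
sql = "INSERT INTO Test_Table (test_name) VALUES (?)"

# stores the string '["Sachin", "Yuvraj"]'
SQLCURSOR.execute(sql, json.dumps(arr))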
I want to save an API response in a table of my database. I'm using Postgres along with psycopg2.
This is my code:
import json
import requests
import psycopg2
def my_func():
    response = requests.get("https://path/to/api/")
    data = response.json()
    while data['next'] is not None:
        response = requests.get(data['next'])
        data = response.json()
        for item in data['results']:
            try:
                connection = psycopg2.connect(user="user",
                                              password="user",
                                              host="127.0.0.1",
                                              port="5432",
                                              database="mydb")
                cursor = connection.cursor()
                postgres_insert_query = """ INSERT INTO table_items (NAME VALUES (%s)"""
                record_to_insert = print(item['name'])
                cursor.execute(postgres_insert_query, record_to_insert)
                connection.commit()
                count = cursor.rowcount
                print(count, "success")
            except (Exception, psycopg2.Error) as error:
                if(connection):
                    print("error", error)
            finally:
                if(connection):
                    cursor.close()
                    connection.close()

my_func()
I just want to sort of "print" all the resulting data from my request into the db. I'm a bit confused, as you can see: what would be the "print" equivalent here? Concretely, I want to save the name field from the API response into the database table, or actually INSERT it; I guess psycopg2 has some function for this circumstance? Any example you could provide?
EDIT
Sorry, I forgot to mention: if I run this code it throws this:
PostgreSQL connection is closed
A particular name
Failed to insert record into table_items table syntax error at or near "VALUES"
LINE 1: INSERT INTO table_items (NAME VALUES (%s)
There are a few issues here. I'm not sure what the API is or what it is returning, but I will make some assumptions and suggestions based on those.
There is a syntax error in your query: it is missing a closing parenthesis. It should be:
postgres_insert_query = 'INSERT INTO table_items (NAME) VALUES (%s)'
(I'm also assuming that NAME is a real column in your database.)
Even with this correction, you will have a problem since:
record_to_insert = print(item['name']) will set record_to_insert to None. The return value of the print function is always None. The line should instead be:
record_to_insert = item['name']
(assuming the key name in the dict item is actually the field you're looking for)
I believe calls to execute must pass replacements as a tuple, so the line cursor.execute(postgres_insert_query, record_to_insert) should be:
cursor.execute(postgres_insert_query, (record_to_insert,))
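Putting those three fixes together, the inner part of the loop might look like this (a minimal sketch that keeps the question's table and column names and assumes connection, cursor, and item already exist):

postgres_insert_query = "INSERT INTO table_items (NAME) VALUES (%s)"
record_to_insert = item['name']  # the value itself, not the result of print()
cursor.execute(postgres_insert_query, (record_to_insert,))  # single-element tuple
connection.commit()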
I am trying to pull data from a REST API and insert it into SQL Server. If the script inserts PhotoBinary and FileType together it works, but as soon as I add the ID, which is an integer, I get the error below. If I have it pull only the ID from the API, that also works.
I am trying to pull three pieces of information:
The EmployeeID, which is an int.
The binary string representation of the image.
The file type of the original file, e.g. .jpg.
The target table is set up as:
Create table Employee_Photo
(
EmployeeID int,
PhotoBinary varchar(max),
FileType varchar(10)
)
The error I get is:
Traceback (most recent call last):
File "apiphotopullwithid.py", line 64, in <module>
cursor.execute("INSERT INTO dbo.Employee_Photo([EmployeeID],[PhotoBinary],[FileType]) values (?,?,?)", row['EMPID'],row['Photo'],row['PhotoType'])
pyodbc.ProgrammingError: ('42000', '[42000] [Microsoft][ODBC SQL Server Driver][SQL Server]The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 5 (""): The supplied value is not a valid instance of data type float. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision. (8023) (SQLExecDirectW)')
import json
import pandas as pd
import sqlalchemy
import pyodbc
import requests

url = "https://someurl.com/api/PersonPhoto"

headers = {
    'Accept': "application/json",
    'Authorization': "apikey XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    'Content-Type': "application/json",
    'cache-control': "no-cache"
}

response = requests.request("GET", url, headers=headers)
data = json.loads(response.text)

ID, Photo, PhotoType = [], [], []

for device in data['PersonPhoto']:
    ID.append(device[u'ID'])
    Photo.append(device[u'Photo'])
    PhotoType.append(device[u'PhotoType'])

df = pd.DataFrame([ID, Photo, PhotoType]).T
df.columns = ['EMPID', 'Photo', 'PhotoType']
df = df.astype({'EMPID': 'Int64'})

connStr = pyodbc.connect(
    "DRIVER={SQL Server};"
    "SERVER=SQLTest;"
    "Database=Intranet123;"
    "Trusted_Connection=yes;"
    #"UID=ConnectME;"
    #"PWD={Password1}"
)
cursor = connStr.cursor()

for index, row in df.iterrows():
    cursor.execute("INSERT INTO dbo.Employee_Photo([EmployeeID],[PhotoBinary],[FileType]) values (?,?,?)", row['EMPID'], row['Photo'], row['PhotoType'])

connStr.commit()
cursor.close()
connStr.close()
In most Python database APIs, including pyodbc, which follow the PEP 249 specs, the parameters argument to cursor.execute() is a sequence (i.e., a tuple or list). Therefore, bind all values into one iterable rather than passing them as three separate arguments:
sql = "INSERT INTO dbo.Employee_Photo ([EmployeeID],[PhotoBinary],[FileType]) VALUES (?,?,?)"
# TUPLE
cursor.execute(sql, (row['EMPID'], row['Photo'], row['PhotoType']))
# LIST
cursor.execute(sql, [row['EMPID'], row['Photo'], row['PhotoType']])
By the way, you can avoid the explicit iterrows loop and let executemany iterate implicitly, using Pandas' DataFrame.values:
# EXECUTE PARAMETERIZED QUERY
sql_cols = ['EMPID', 'Photo', 'PhotoType']
cursor.executemany(sql, df[sql_cols].values.tolist())
connStr.commit()
Actually, you do not even need Pandas as a middle layer (reserve that library for the data-analysis work) and can interact with the returned JSON directly:
# NESTED LIST OF TUPLES
vals = [(int(device[u'ID']), device[u'Photo'], device[u'PhotoType'])
        for device in data['PersonPhoto']]
cursor.executemany(sql, vals)
connStr.commit()
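If the DataFrame is large, pyodbc's fast_executemany flag can reduce the number of round trips for the executemany call (an optional optimization, sketched here; it generally works best with the newer Microsoft ODBC drivers mentioned in the next answer):

cursor.fast_executemany = True  # send parameter sets in batches instead of one by one
cursor.executemany(sql, vals)
connStr.commit()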
You're using the old Windows built-in SQL Server driver. Try the newer ODBC Driver 17 for SQL Server, which Microsoft provides for multiple platforms.
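For example, a trusted connection string with that driver might look like this (a sketch reusing the server and database names from the question; the driver name matches the one shown in the other error message in this thread):

import pyodbc

connStr = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"  # newer driver instead of {SQL Server}
    "SERVER=SQLTest;"
    "Database=Intranet123;"
    "Trusted_Connection=yes;"
)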
Don't read too much into the error message. Something is malformed in the network protocol layer.
Can you dump the types and values of the parameters causing the issue? My guess is that the driver is setting the parameter types incorrectly. E.g.:
for index, row in df.iterrows():
    empid = row['EMPID']
    photo = row['Photo']
    photoType = row['PhotoType']
    print("empid is ", type(empid), " photo is ", type(photo), " photoType is ", type(photoType))
    print("empid: ", empid, " photo: ", photo, " photoType: ", photoType)
    cursor.execute("INSERT INTO dbo.Employee_Photo([EmployeeID],[PhotoBinary],[FileType]) values (?,?,?)", empid, photo, photoType)

connStr.commit()
cursor.close()
connStr.close()
stmt = "UPDATE requests SET (hostname,domainname,naptrsrvptrinitial,cnameptrfinal,publiclist,privatelist) = ('%s','%s','%s','%s','{%s}','{%s}') WHERE requestid = %d"%(str(myjson['hostName']),str(myjson['domainName']),str(myjson['customRecord']),str(myjson['canName']),publicArray,privateArray,int(myjson['requestID']))
curs.execute(stmt)
I have the above query, which is vulnerable to SQL injection; below is the query rewritten to mitigate SQL injection.
curs.execute("UPDATE requests SET (hostname,domainname,naptrsrvptrinitial,cnameptrfinal,publiclist,privatelist) = (%s,%s,%s,%s,{%s},{%s}) WHERE requestid = %s",(str(myjson['hostName']),str(myjson['domainName']),str(myjson['customRecord']),str(myjson['canName']),publicArray,privateArray,int(myjson['requestID'])))
If I pass the array placeholders {%s} in the above query, it throws an error. How do I resolve this?
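A minimal sketch of one way around this, assuming the driver is psycopg2 and that publiclist and privatelist are Postgres array columns: psycopg2 adapts Python lists to SQL arrays on its own, so the lists can be bound as ordinary %s parameters without the {...} wrapper.

# sketch: pass the Python lists directly; psycopg2 adapts list -> ARRAY
curs.execute(
    "UPDATE requests "
    "SET (hostname, domainname, naptrsrvptrinitial, cnameptrfinal, publiclist, privatelist) "
    "= (%s, %s, %s, %s, %s, %s) "
    "WHERE requestid = %s",
    (str(myjson['hostName']), str(myjson['domainName']), str(myjson['customRecord']),
     str(myjson['canName']), list(publicArray), list(privateArray), int(myjson['requestID'])),
)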
I'm new to Python. I'm executing this very basic query:
connection = psycopg2.connect("dbname='test' host='localhost' user='admin' password='pass' port='9100'")
cur = connection.cursor()
cur.execute("""SELECT id FROM pages WEHERE uri = %(uri)s""", {'uri': uri})
row = cur.fetchall()
and keep getting this error:
<class 'psycopg2.ProgrammingError'>
('syntax error at or near "uri"\nLINE 1: SELECT id FROM pages WEHERE uri = \'http://example.com/index.php...\n ^\n',)
uri is a string and has the value http://example.com/index.php
Could you please help me? This is driving me crazy.
It should be:
cur.execute("""SELECT id FROM pages WHERE uri = %(uri)s""", {'uri': uri})
That is, it should be WHERE instead of WEHERE. Since WEHERE is not a recognized SQL keyword, the server throws a syntax error at that point.
The error itself is self-explanatory. Next time, read the error message closely.