I want to delete rows in a Snowflake table based on a list converted from a DataFrame. However, when I execute the code below, I get the error TypeError: not all arguments converted during string formatting.
Here is some mock-up code similar to my setup:
import pandas as pd

data = ['5003s00000gUnEqAAK','5003s00000gUnEqAAK']
id_df = pd.DataFrame(data,columns=['ID'])
l = id_df['ID'].to_list()
delete_query = "delete from xx.table1 where id in (%s)" % ','.join('?' * len(l))
conn.cursor().execute(delete_query, l)
What did I do wrong that prevents the engine from formatting the variable correctly?
I expected the engine to plug the list into the WHERE clause correctly.
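One common fix is to build one marker per value and pass the list as a separate argument to execute, rather than formatting the values into the SQL string. The sketch below uses sqlite3 (which also uses ? markers) to demonstrate the pattern; with the Snowflake connector you would set snowflake.connector.paramstyle = 'qmark' first, or use %s markers with the default paramstyle.

```python
import sqlite3

ids = ['5003s00000gUnEqAAK', '5003s00000gUnEqAAJ']

conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute("CREATE TABLE table1 (id TEXT)")
cur.executemany("INSERT INTO table1 (id) VALUES (?)", [(i,) for i in ids])

# One "?" marker per value; the values travel separately from the SQL string
placeholders = ','.join('?' * len(ids))
delete_query = f"DELETE FROM table1 WHERE id IN ({placeholders})"
cur.execute(delete_query, ids)

print(cur.rowcount)  # 2
```

The key point is that execute receives two arguments: the SQL text with markers, and the sequence of values. Formatting the values themselves into the string (or calling `delete_query.l`) is what triggers the formatting error.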
I am trying to call a SQL function from Python. I read some values from an Excel file and moved them into a list with json.dumps(list). Afterwards these values should be matched against the ones in SQL to retrieve the corresponding other columns.
try:
    ps_connection = psycopg2.connect(user="x",
                                     password="y",
                                     host="z",
                                     port="w",
                                     database="w")
    cursor = ps_connection.cursor()
    cursor.execute(select "Issuer Status" from api.match_sec_str_array("list"))
except psycopg2.OperationalError as connection_error:
    print('Unable to connect to the database')
    print(connection_error)

if cursor.description:
    # read the names of the columns
    columns = [desc[0] for desc in cursor.description]
    # read the actual data
    data = cursor.fetchall()

cursor.close()
ps_connection.close()
pd.DataFrame(data, columns=columns)
The error is this one:
Input In [15]
cursor.execute(select 'Issuer Status' from api.match_sec_str_array("list"))
^
SyntaxError: invalid syntax
Do you have any idea if I missed anything? I am also not sure whether cursor.execute expects a string and I have to build the query myself. Any help is much appreciated. Thanks a lot in advance!
In cursor.execute() you need to pass a string containing your query. You are currently passing bare keywords, which Python does not recognize.
First build your query string, then pass it to the execute method.
Have a look at the Psycopg docs: https://www.psycopg.org/docs/usage.html
If you need to build the query string out of the result of some calls (like that api object) consider using the string format method or f-strings.
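A minimal sketch of that string-building step, using the function name and json.dumps payload from the question (adapt them to your schema). The execute call is commented out because it needs a live connection; note that the value itself is passed separately through a %s placeholder rather than interpolated into the string:

```python
import json

# Values read from the Excel file (example data)
values = ["ABC", "DEF"]
payload = json.dumps(values)

# The query is an ordinary Python string; %s is psycopg2's placeholder
query = 'select "Issuer Status" from api.match_sec_str_array(%s)'

# With a live connection:
# cursor.execute(query, (payload,))  # psycopg2 substitutes %s safely
print(query)
print(payload)  # ["ABC", "DEF"]
```

Letting psycopg2 do the substitution (instead of f-string interpolating the payload) also protects against quoting and injection problems.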
I have a list l = ['A','B','C', ...] with a dynamic number of elements. I want to run the statement below using SQL via Python.
sqlstr = "select * from [table] where ID in (%s) and column2=?"
sqlstr = sqlstr % ','.join('?' * len(l))
However, when I try to run
pd.read_sql(sqlstr, conn, params=[l, parameter2])
I get this error:
The SQL contains 4 parameter markers, but 2 parameters were supplied
I understand why there is an error: the list is passed as one single parameter rather than as individual parameters. But I don't know how to fix it.
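The usual fix is to flatten the list into the parameter sequence so that each element lines up with one ? marker. A sketch with sqlite3 standing in for the real connection (table and column contents are made up):

```python
import sqlite3
import pandas as pd

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE t (ID TEXT, column2 TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [('A', 'x'), ('B', 'y'), ('C', 'x')])

l = ['A', 'B', 'C']
parameter2 = 'x'

sqlstr = "select * from t where ID in (%s) and column2=?"
sqlstr = sqlstr % ','.join('?' * len(l))

# Flatten: one parameter per marker, not a nested list
df = pd.read_sql(sqlstr, conn, params=l + [parameter2])
print(len(df))  # 2
```

With params=[l, parameter2] the driver sees two parameters (a list and a scalar) for four markers; with params=l + [parameter2] it sees four scalars, one per marker.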
I notice that the auto_convert_lobs argument does not work as expected when I iterate a result via a cx_Oracle.Cursor object.
From the documentation, cx_Oracle converts CLOBs into strings by default: http://docs.sqlalchemy.org/en/latest/dialects/oracle.html#lob-objects
The query I'm iterating is a function that returns a cursor instead of rows :
SELECT returns_a_cursor() FROM dual
-- pseudo code of returns_a_cursor():
FUNCTION returns_a_cursor() RETURN SYS_REFCURSOR
sql := 'SELECT ...';
OPEN cursor FOR sql;
RETURN cursor;
The only reason I'm using a function instead of a direct query is readability and organization.
My Python code looks like this:
engine = create_engine('oracle+cx_oracle://...')
conn = engine.connect()
result = conn.execute('SELECT returns_a_cursor() FROM dual').fetchone()
for row in result[0]:
print row
This "cursor" method returns my CLOB columns as cx_Oracle.LOB objects instead of strings.
If I change my code to execute a query without the function, the CLOB data is converted to string as expected.
engine = create_engine('oracle+cx_oracle://...')
conn = engine.connect()
result = conn.execute('SELECT a_clob_column FROM a_table')
for row in result:
print row
Is there a way that I can keep the auto_convert_lobs setting in my use case ?
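One workaround, sketched here as an assumption rather than documented cx_Oracle behaviour: the dialect's LOB conversion is applied to the outer result set, not to rows fetched from a nested ref cursor, so you can read the LOBs yourself. cx_Oracle.LOB objects expose a read() method, and duck-typing on it keeps the loop generic (the FakeLob stub below just demonstrates the helper without an Oracle connection):

```python
def lob_to_str(value):
    """Return value.read() for LOB-like objects, the value itself otherwise."""
    return value.read() if hasattr(value, "read") else value

# In the question's loop this would become:
# for row in result[0]:
#     print([lob_to_str(col) for col in row])

# Stub standing in for a cx_Oracle.LOB:
class FakeLob:
    def read(self):
        return "clob text"

print(lob_to_str(FakeLob()))  # clob text
print(lob_to_str("plain"))    # plain
```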
I'm trying to return a hard-coded value in my SQL query, but when running the query using pyodbc, some records return '\x0e' instead of the hard-coded value (in this case '16'). If I run the query on the server (MS SQL Server 2008), it returns all the correct results and values.
The beginning of the query looks like this:
My SQL Code:
Select '"16","' + S.ShipNum + '","'
My python code:
cursor.execute("""Select '\"16\",\"' + SS.ShipNum + '\",\"'
Is there another way to guarantee a value is returned from a query?
\016 is the octal representation of \x0e.
So I would think it has more to do with the way you are escaping your double quotes. In your Python you are actually getting \16 and not "16" as you desire.
You could try a prepared statement:
ps = db.prepare("SELECT 16")
ps()
returns:
[(16,)]
Additional examples can be seen here: http://python.projects.pgfoundry.org/docs/0.8/driver.html#parameterized-statements
You can see all of the ASCII and other character sets here: http://donsnotes.com/tech/charsets/ascii.html
It looks like you're trying to create a comma-delimited, quoted string representation of the row. Don't try to do this in the database query; string formatting isn't one of T-SQL's strengths.
Pass the static value using a parameter, then join the row values. Using sys.databases for the example:
params = ("Some value",)
sql = "SELECT ?, name, user_access_desc FROM sys.databases"
for row in cursor.execute(sql):
print(','.join('"{0}"'.format(column) for column in row))
I am currently taking numeric values (amongst many other string and numeric values) from a set of access databases and uploading them to a single MS SQL Server database.
I am using 32-bit Python 3.3 and the respective pyodbc package.
I was wondering if there is a way to capture the fact that a numeric field is empty in the Access database without the driver returning the string 'None' instead.* The syntax used is as follows:
access_con = pyodbc.connect(connection_string)
access_cur = access_con.cursor()
access_SQL = 'SELECT * FROM ' + source_table
rows = access_cur.execute(access_SQL).fetchall()
for row in rows:
[Statement uploading each row to SQL Server using an INSERT INTO statement]
Any help would be appreciated; whether as a solution or as a more direct way to transfer the data.
*EDIT: 'None' is only a string because I converted it to one when building the INSERT INTO statement. Using row.replace('None','NULL') replaced every 'None' with 'NULL', which the ODBC driver interpreted as a NULL value.
None is a Python object, not a string. It is the equivalent of NULL in SQL Server, or an "empty" column value in Access.
For example, take an Access table Table1 with a Number column whose first value is empty. The relevant Python code produces:
...
>>> cursor = connection.cursor()
>>> rows = cursor.execute('select * from Table1').fetchall()
>>> print(rows)
[(None, ), (1, )]
This sample confirms the empty Access value is returned as None.
This PyODBC Documentation provides a good explanation of how ODBC and Python data types are mapped.
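Rather than formatting the row values into the INSERT INTO string (which is what turns None into the text 'None'), you can pass the row as parameters and let the driver map None to NULL. A sketch with sqlite3 standing in for the SQL Server connection; pyodbc uses the same ? marker:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute("CREATE TABLE target (Number INTEGER)")

rows = [(None,), (1,)]  # as fetched from Access
# None is bound as SQL NULL by the driver; no string replacement needed
cur.executemany("INSERT INTO target (Number) VALUES (?)", rows)

cur.execute("SELECT Number FROM target")
print(cur.fetchall())  # [(None,), (1,)]
```

This sidesteps the replace('None','NULL') workaround entirely, and also handles quoting of string columns correctly.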