How to get the output of DDL commands with Python cx_Oracle

I am currently working with cx_Oracle.
With SELECT statements I am able to use the fetchall() function to get rows.
But how do I get the output for statements that fall under the Data Definition Language (DDL) category?
For example, after executing a GRANT statement with cursor.execute(), the expected output (assuming the statement is valid) would be:
"GRANT executed successfully"
But how do I get this with cx_Oracle in Python?

The answer is that you have to print it yourself, which is what SQL*Plus does.
DDL statements are not queries, because they do not return data. They return a success or error condition to the tool that executed them, which can then print any message it likes. In your case the tool is cx_Oracle. There isn't a way to get the type (GRANT, CREATE, etc.) of the statement automatically in cx_Oracle. Your application can either print a generic message like 'statement executed successfully', or you can extract the first keyword(s) from the SQL statement so you can print a message like SQL*Plus does.
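For example, a small helper of our own (a sketch, not a cx_Oracle API) can mimic SQL*Plus-style feedback by pulling the leading verb out of the statement text once cursor.execute() returns without raising:

```python
# Sketch: mimic SQL*Plus-style feedback for DDL. The verb extraction is a
# hypothetical helper of our own, not part of cx_Oracle; cursor.execute()
# raises DatabaseError on failure and returns nothing for DDL on success.
def ddl_message(sql):
    verb = sql.strip().split(None, 1)[0].upper()
    return verb + " executed successfully"

# e.g. after cursor.execute(sql) completes without an exception:
print(ddl_message("GRANT SELECT ON emp TO scott"))
# GRANT executed successfully
```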

Related

Python Cassandra update statement error "mismatched input 're' expecting K_WHERE"?

I am trying to update a Cassandra database using the Python client as follows.
def update_content(session, id, content):
    update_statement = """
    UPDATE mytable SET content='{}' WHERE id={}
    """
    session.execute(update_statement.format(content, id))
It works in most cases, but in some scenarios the content is a string of the form
content = "Content Message -'[re]...)"
which results in the error: Exception calling application: <Error from server: code=2000 [Syntax error in CQL query] message="line 2:61 mismatched input 're' expecting K_WHERE (
I am not sure why this is happening. Is Cassandra trying to interpret the string as a regex somehow?
I tried printing the data before the update and it seems fine:
"UPDATE mytable SET content='Content Message -'[re]...)' WHERE id=2"
To avoid such problems you should stop using .format to create CQL statements and start using prepared statements, which allow you to:
- avoid problems with unescaped special characters, like '
- do basic type checking
- get better performance, because the query is parsed once and only the data is sent over the wire
- get token-aware query routing, meaning the query is sent directly to one of the replicas that holds the data for the partition
Your code needs to be modified as follows:
prep_statement = session.prepare('UPDATE mytable SET content=? WHERE id=?')
def update_content(session, id, content):
    session.execute(prep_statement, [content, id])
Please note that the statement needs to be prepared only once, because preparation involves a round trip to the cluster nodes to parse the query.
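To see concretely why the .format approach fails, note that the apostrophe inside the content terminates the CQL string literal early, leaving 're' as a stray token (minimal reproduction, no Cassandra needed):

```python
# Demonstration of why .format breaks: the apostrophe inside the content
# closes the CQL string literal early, so everything after it ('[re]...)
# is parsed as bare tokens instead of string data.
content = "Content Message -'[re]...)"
stmt = "UPDATE mytable SET content='{}' WHERE id={}".format(content, 2)
print(stmt)
# UPDATE mytable SET content='Content Message -'[re]...)' WHERE id=2
```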

Escaping single quotes in postgres/python

In making some slow code run more efficiently, I ran into a snag with escape characters. The original code (which works) was something like this:
sql = f"""INSERT INTO {schema}.{table}
(seg_id, road_type, road_name)
VALUES ({s_id}, '{road_type}', %s);"""
# road_name may have an apostrophe. Let the execute() handle it.
cur.execute(sql, (road_name,))
This query ran many thousands of times, and the execute calls made the code sluggish because each one communicates with a remote server. I would like to build up a series of SQL statements and execute them all at once, something like this:
sql += f"""INSERT INTO {schema}.{table}
(seg_id, road_type, road_name)
VALUES ({s_id}, '{road_type}', '{road_name}');\n"""
The sql statement is not executed immediately, but is "stacked" so that I can execute a collection of sql statements in a single cur.execute(sql) which happens later. However, since it does not have the road_name as an argument in the execute statement, it fails when there's a road name with an apostrophe (like Eugene O'Neill Drive).
What is a viable alternative?
NOTE: The query is in a function that is called from multiple processes. The {variables} are being passed as arguments.
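One viable approach is to escape the apostrophes yourself when stacking statements: doubling an embedded single quote is the SQL-standard escape. A minimal sketch (the schema/table name and sample row are hypothetical; with psycopg2 it is generally safer to let the driver do the quoting, e.g. via cur.mogrify() or psycopg2.extras.execute_values()):

```python
# Sketch of the "stacked statements" idea with manual quoting. Doubling
# embedded single quotes ('') is the SQL-standard escape; prefer driver-side
# quoting (cur.mogrify / execute_values) in production code.
def sql_literal(text):
    return "'" + text.replace("'", "''") + "'"

sql = ""
rows = [(1, "avenue", "Eugene O'Neill Drive")]  # hypothetical sample data
for s_id, road_type, road_name in rows:
    sql += ("INSERT INTO myschema.roads (seg_id, road_type, road_name) "
            f"VALUES ({s_id}, {sql_literal(road_type)}, {sql_literal(road_name)});\n")
print(sql)
```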

Python execute sql query using sqlcmd

Hi, I want to execute a query using sqlcmd, so I am calling it with subprocess.call(). It sometimes works, but in a loop it does not; it only executes the last argument. Please help; below is the sample code I am trying:
import subprocess
host = 'hostname'
db = 'SQLTest'
sqlcmd = "C:\Program Files\Microsoft SQL Server\100\Tools\Binn\SQLCMD.EXE"
query = "INSERT INTO [dbo].[Test](type,ident,lat,long,y_proj,x_proj,new_seg,display,color,altitude,depth,temp,time,model,filename,ltime) VALUES ('TRACK','ACTIVE LOG','40.79015493','-77.85914183','4627311.94501541','1779470.5827101','False','False','255','351.979858398438','0','0','2008/06/11-14:33:33','eTrex Venture','','2008/06/11 09:33:33')"
for x in range(0, 5):
    subprocess.call([sqlcmd, '-S', host, '-d', db, '-Q', query])
Or is there any other method? I even tried the pymysql module, but it shows an authentication error.
I found the error: it was related to the query I was passing. The queries were read from a text file, and all of them had extra spaces except the last one, which was the one I had been using for single testing. After fixing that, it worked.
That's a very creative solution!
When you say "in a loop it does not work", could you tell us what's happening? Is there an error message? Does it run, but no data is inserted into the table? Can you get this to work properly outside a loop?
The first thing I notice is
sqlcmd = "c:\program files\...."
You might want to make that a raw string, by putting an "r" in front of the quotes, like so:
sqlcmd = r"c:\program files\...."
That will prevent Python from trying to interpret the backslashes as special characters.
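The difference is easy to demonstrate with a shorter path containing \n (note that the real path above also contains \100, which Python reads as an octal escape for '@'):

```python
# In a plain string, recognized escapes like \n become control characters;
# a raw string keeps every backslash literally.
plain = "C:\Tools\new"   # \T survives as backslash-T, but \n becomes a newline
raw = r"C:\Tools\new"    # all backslashes kept literally
print(repr(plain))
print(repr(raw))
```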
It looks like you're trying to talk to a SQL Server, so pymysql is not going to help (that's for talking to a MySQL Server). I would suggest looking into pyodbc or pymssql as an alternative.

Unable to enter data into SQL DB

I am trying to insert values to a SQL DB where I pull data from a dictionary. I ran into a problem when my program tries to enter 0xqb_QWQDrabGr7FTBREfhCLMZLw4ztx into a column named VersionId. The following is my sample code and error.
cursor.execute("""insert into [TestDB].[dbo].[S3_Files] ([Key],[IsLatest],[LastModified],[Size(Bytes)],[VersionID]) values (%s,%s,%s,%s,%s)""",(item['Key'],item['IsLatest'],item['LastModified'],item['Size'],item['VersionId']))
conn_db.commit()
pymssql.ProgrammingError: (102, "Incorrect syntax near 'qb_QWQDrabGr7FTBREfhCLMZLw4ztx'.DB-Lib error message 20018, severity 15:\nGeneral SQL Server error: Check messages from the SQL Server\n")
Based on the error I assume SQL does not like the 0x in the beginning of the VersionId string because of security issues. If my assumption is correct, what are my options? I also cannot change the value of the VersionId.
Edit: This is what I get when I print that cursor command
insert into [TestDB].[dbo].[S3_Files] ([Key],[IsLatest],[LastModified],[Size(Bytes)],[VersionID]) values (Docs/F1/Trades/Buy/Person1/Seller_Provided_-_Raw_Data/GTF/PDF/GTF's_v2/NID3154229_23351201.pdf,True,2015-07-22 22:05:38+00:00,753854,0xqb_QWQDrabGr7FTBREfhCLMZLw4ztx)
Edit 2: The odd thing is that when I try to enter the insert command manually in SQL Server Management Studio, it doesn't like the apostrophe (') in the path name in the first parameter, so I escaped that character, added quotes (') around each value except the number, and the command worked. At this point I am pretty stumped on why the insert is not working.
Edit 3: I decided to do a try/except on every insert, and I see that the VersionIds that get caught have the pattern 0x... Again, does anyone know if my security assumption is correct?
I guess that's what happens when our libraries try to be smarter than us...
No SQL Server around to test, but I assume the 0x values are failing because the way pymssql passes the parameter causes the server to interpret it as a hexadecimal string, and the 'q' following the '0x' does not fit its expectation of the characters 0-9 and A-F.
I don't have enough information to know whether this is a library bug and/or whether it can be worked around; the pymssql documentation is not very extensive, but I would try the following:
- if you can, check in SQL Server Profiler what command is actually coming in
- build your own command as a string and see if the error persists (remember Bobby Tables before putting that in production, though: https://xkcd.com/327/)
- try to work around it by adding quotes, etc.
- switch to another library / use SQLAlchemy
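A quick way to test the hexadecimal hypothesis against the OP's "Edit 3" observation is to flag VersionIds that start with "0x" but are not valid hex, which is exactly what a parser expecting an unquoted 0x... token to be a hex literal would choke on (the helper name and pattern are our own sketch):

```python
import re

# Hypothetical check: a value starting with "0x" that is NOT entirely
# hex digits would break a parser treating the unquoted token as a
# hexadecimal literal.
HEX_LITERAL = re.compile(r"^0x[0-9A-Fa-f]+$")

def looks_like_broken_hex(version_id):
    return version_id.startswith("0x") and not bool(HEX_LITERAL.match(version_id))

print(looks_like_broken_hex("0xqb_QWQDrabGr7FTBREfhCLMZLw4ztx"))  # True
print(looks_like_broken_hex("0x1A2B"))                            # False
print(looks_like_broken_hex("plain-version-id"))                  # False
```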

Query filtering in Django with sqlite

I've tried the following query with Django:
def search(search_text):
    q = Info.objects.filter(title__contains=str(search_text))
    print q.query
The query that gets printed is
SELECT "d"."id", "d"."foo" FROM "d_info" WHERE "d_info"."title" LIKE %hello% ESCAPE '\'
The query fails because the text after LIKE doesn't have quotes around it. The query succeeds when I run it at the SQL prompt with quotes around the text after LIKE, as below:
SELECT "d"."id", "d"."foo" FROM "d_info" WHERE "d_info"."title" LIKE '%hello%' ESCAPE '\'
How do I get Django to add the quotes around search_text so that the query succeeds?
I'm using Django with sqlite3.
I tried this out with PostgreSQL 8.3. The query is generated without quotes, but executing the filter returns a valid queryset with the expected instances. Can you try executing
q = Info.objects.filter(title__contains=str(search_text))
print q.count()
and see if it works?
Posting my comment above as the answer
It turns out the query works within Django, but if I print the query, copy it, and execute it in a MySQL or SQLite shell, it doesn't work.
Django is probably printing the query incorrectly.
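What most likely happens is that Django never sends a quoted literal at all: the pattern is bound as a parameter by the driver, so the printed SQL template is not the exact text that executes. The same behavior can be reproduced with the stdlib sqlite3 driver (hypothetical table mirroring the one above):

```python
import sqlite3

# The LIKE pattern is passed as a bound parameter (?), so the SQL text
# itself never contains a quoted '%hello%', yet the query still matches.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE d_info (id INTEGER, title TEXT)")
conn.execute("INSERT INTO d_info VALUES (1, 'say hello world')")
sql = "SELECT id FROM d_info WHERE title LIKE ? ESCAPE '\\'"
rows = conn.execute(sql, ("%hello%",)).fetchall()
print(rows)  # [(1,)]
```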
