I would like to store Windows paths in MySQL without escaping the backslashes. How can I do this in Python? I am using MySQLdb to insert records into the database. When I use MySQLdb.escape_string(), I notice that the backslashes are removed.
Have a look at os.path.normpath(thePath)
I can't remember if it's that one, but there IS a standard os.path formatting function that yields double backslashes, which can be stored in a db "as is" and reused later "as is". I no longer have a Windows machine and cannot test it.
Just use a dictionary to add slashes wherever required to make the query valid:
http://codepad.org/7mjbwKBf
def addslashes(s):
    escape_map = {"\0": "\\0", "\\": "\\\\"}  # add more mappings here as needed
    return ''.join(escape_map.get(c, c) for c in s)

query = r"INSERT INTO MY_TABLE (id, path) VALUES (23, 'c:\windows\system\')"
print(addslashes(query))
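If the real goal is just to get the path into the table intact, letting the driver bind the value avoids hand-rolled escaping entirely. A minimal sketch using the stdlib sqlite3 module so it runs stand-alone (MySQLdb works the same way, with %s placeholders instead of ?):

```python
import sqlite3

path = r"C:\Windows\System32\drivers"  # raw string: backslashes stay literal

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE my_table (id INTEGER, path TEXT)")
# Parameter binding stores the value as-is; no manual escaping needed.
db.execute("INSERT INTO my_table VALUES (?, ?)", (23, path))
stored, = db.execute("SELECT path FROM my_table").fetchone()
print(stored)  # C:\Windows\System32\drivers
```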
This question already has an answer here:
How to prevent pandas dataframe from adding double quotes around #tmp when using sqlalchemy and sybase? (1 answer)
I'm using a framework (sqlalchemy and pandas) that creates temporary tables.
However, it creates the table with the name surrounded by quotes, and in my case, since I'm using Sybase, this returns a permission error.
When I create the table manually, without quotes, it works perfectly.
To work around it, I put \b at the beginning of the string:
table_name = f'\b{table_name}'
When I test with \b, it erases the left quote, but I can't find a way to delete the closing quote.
Note: I already tested table_name = f'\b{table_name}' + '\u007f'
For example:
table_name="#test"
df.to_sql(con=engine,name=table_name,index=False)
Generates following create:
CREATE TABLE "#test" (nome TEXT NULL)
I'm getting an error because of the quotes.
However, with this code, I can remove left quote:
table_name="\b#test"
df.to_sql(con=engine,name=table_name,index=False)
It generates:
CREATE TABLE #test" (nome TEXT NULL)
Thanks
If you have access to the string itself, you can always slice it to remove whatever characters you want. To remove the first and last ones:
>>> table_name = '"some test table"'
>>> table_name[1:-1]  # No surrounding " characters.
'some test table'
Adding delete characters to the string (what you currently have) just affects the console output. Those characters are still present.
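If the quotes may or may not be present, str.strip('"') is a safer variant: it removes the characters only when they are actually there. A small sketch:

```python
table_name = '"#test"'
print(table_name.strip('"'))       # #test

# Already-bare names pass through unchanged.
print('#already_bare'.strip('"'))  # #already_bare
```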
Assuming this is Sybase ASE and the error the OP is receiving is related to a syntax issue with the use of quotes, consider:
- using double quotes around (table/column) names is referred to (in ASE) as quoted identifiers
- ASE's default behavior is for quoted identifiers to be off (ie, generate an error if double quotes are used around (table/column) names)
- in order to use quoted identifiers, the client application needs to explicitly enable them, either via a connection parameter or via the explicit command set quoted_identifier on
- ASE also supports square brackets ([]) around (table/column) names, with the added benefit that there is no need to explicitly set quoted_identifier on
Again, assuming this is Sybase ASE, I'd want to find out if the client-side app can a) use square brackets ([]) in place of double quotes, b) enable quoted_identifier support, or c) disable quoted identifiers (ie, not wrap names in double quotes).
My problem is with the pandas to_sql function: when I pass the table name as a parameter, it automatically adds quotes around the table name.
You appear to be using the internal SQLAlchemy dialect for Sybase which is in the process of being replaced by the external SAP ASE (Sybase) dialect that I maintain. The issue with DDL rendering of #temp tables was fixed in version 1.0.1.
I'm using python to read values from REST API and write to a MySQL table. One of the returned values is a JSON, which I want to store in the DB.
The value returned by the API has escaped quotes and looks something like:
{\"dashboard\":\"val\"}
When I use print, I see that the escape characters are replaced with the actual quotes (which is the desired outcome):
{"dashboard":"val"}
However, when I'm using the MySQLdb execute or executemany (with tokenised params) - the value written to the database has all the double quotes replaced with single quotes, making it a non-valid json:
{'dashboard':'val'}
How do I avoid that?
You should change your library to mysql.connector or pymysql, because MySQLdb has some problems you cannot predict, even when your parameters and base SQL are correct. I recommend mysql.connector because it is the official MySQL library: https://dev.mysql.com/downloads/connector/python/
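Whichever driver you use, one common cause of single quotes appearing in the stored value is passing the Python dict itself (whose str() uses single quotes) instead of a JSON string. A small sketch, assuming the API value is available as a dict:

```python
import json

payload = {"dashboard": "val"}

# The dict's own string form uses single quotes and is not valid JSON:
print(str(payload))       # {'dashboard': 'val'}

# json.dumps produces valid JSON with double quotes; pass this string as
# the query parameter so it is stored verbatim.
doc = json.dumps(payload)
print(doc)                # {"dashboard": "val"}
```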
So I am trying to create a script with the MySQL query "show databases like 'login';"
I am able to use string substitution for "login", but I am unable to get the single quotes around it.
Below is how I am trying to do it, but I can't get the single quotes even if I escape them with "\".
db = "'"+val+"'"
print("DB...", db)
run.cmd("echo 'res = cur.execute(\"SHOW DATABASES like %s;\")\n' >> /run.py" % (db))
Since your string is wrapped in double quotes you shouldn't need to escape the single quotes. Just add the val directly into the format, then add the single quotes to the string being formatted.
run.cmd("echo 'res = cur.execute(\"SHOW DATABASES like '%s';\")\n' >> /run.py" % (val))
Also, is there any reason why you're appending a bash command into a python script?
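It can also help to build the line that should end up in run.py directly in Python first, leaving the echo-to-file mechanics aside: the literal single quotes belong in the format string, not in the substituted value. A minimal sketch of just the quoting:

```python
val = "login"

# The single quotes live in the template, not in val itself.
line = 'res = cur.execute("SHOW DATABASES like \'%s\';")' % val
print(line)  # res = cur.execute("SHOW DATABASES like 'login';")
```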
SHOW DATABASES like '\'%s\'';\" -> this is what worked.
I know that variants of this topic have been discussed elsewhere, but none of the other threads were helpful.
I want to hand over a string from python to sql. It might however happen that apostrophes (') occur in the string. I want to escape them with a backslash.
sql = "update tf_data set authors=\'"+(', '.join(authors).replace("\'","\\\'"))+"\' where tf_data_id="+str(tf_data_id)+";"
However, this will always give \\' in my string. Therefore, the backslash itself is escaped and the sql statement doesn't work.
Can someone help me or give me an alternative to the way I am doing this?
Thanks
Simply don't.
Also don't concatenate sql queries as these are prone to sql injections.
Instead, use a parameterized query:
sql = "update tf_data set authors=%(authors)s where tf_data_id=%(data_id)s"
# or :authors and :data_id, I get confused with all those sql dialects out there
authors = ', '.join(authors)
data_id = str(tf_data_id)
# db or whatever your db instance is called
db.execute(sql, {'authors': authors, 'data_id': data_id})
You're using double-quoted strings, but still escaping the single quotes within them. That's not required, all you need to do is escape the backslash that you want to use in the replace operation.
>>> my_string = "'Hello there,' I said."
>>> print(my_string)
'Hello there,' I said.
>>> print(my_string.replace("'", "\\'"))
\'Hello there,\' I said.
Note that I'm using print. If you just ask Python to show you its representation of the string after the replace operation, you'll see double backslashes because they need to be escaped.
>>> my_string.replace("'", "\\'")
"\\'Hello there,\\' I said."
As others have alluded to, if you are using a Python package to execute your SQL, use the provided methods with parameter placeholders (if available).
My answer addresses the escaping issues mentioned.
Use a string literal with the prefix r:
print(r"""the\quick\fox\\\jumped\'""")
Output:
the\quick\fox\\\jumped\'
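One caveat with raw strings, relevant for Windows paths: a raw string literal cannot end with a single backslash (r"c:\temp\" is a SyntaxError), so a trailing backslash has to be appended separately. A small sketch:

```python
# r"c:\windows\system\" would be a SyntaxError, so append the last
# backslash as a separate, non-raw literal.
path = r"c:\windows\system" + "\\"
print(path)  # c:\windows\system\
```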
I have a little script that creates a certain INSERT SQL statement for me.
For PostgreSQL I need to wrap the values to be inserted in single quotes.
Unfortunately some of the value strings to be inserted also contain a single quote, and I need to escape them automatically.
for line in f:
    out.write("('" + line[:2] + "', '" + line[3:-1] + "'),\n")
How can I make sure that any single quote (e.g. ' ) inside line[3:-1] is automatically escaped?
Thanks,
UPDATE:
e.g. the line
CI|Cote D'ivoire
fails due to the '
Update 2:
I can't use double quotes in values, e.g.
INSERT INTO "App_country" (country_code, country_name) VALUES ("AF", "Afghanistan")
I get the error message: ERROR: column "AF" does not exist
This however works fine:
INSERT INTO "App_country" (country_code, country_name) VALUES ('AF', 'Afghanistan')
As described in PEP 249, the DB-API is a generic interface to various databases. Different implementations exist for different databases. For Postgres there is psycopg. From the docs:
cur.execute(
... """INSERT INTO some_table (an_int, a_date, a_string)
... VALUES (%s, %s, %s);""",
... (10, datetime.date(2005, 11, 18), "O'Reilly"))
You simply pass your parameters in a tuple. The underlying library escapes them for you. This is much safer and easier than trying to roll your own.
The SQL standard way to escape a quote is to double it:
'This won''t be a problem.'
So replace every quote with two quotes (and use double quotes in Python to stay sane):
out.write("('" + line[:2] + "', '" + line[3:-1].replace("'", "''") + "'),\n")
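Applied to the failing line from the question (trailing newline included, matching the line[3:-1] slice), the doubling looks like this:

```python
line = "CI|Cote D'ivoire\n"

# Doubling each single quote gives standard SQL escaping.
escaped = line[3:-1].replace("'", "''")  # Cote D''ivoire
print("('" + line[:2] + "', '" + escaped + "'),")  # ('CI', 'Cote D''ivoire'),
```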
Never use generated, rolled-your-own escaping for DML. Use the appropriate DB-API module, as Keith has mentioned. Work has gone into it to make sure escaping from various sources and type conversion can occur almost transparently. If you're using DDL such as CREATE TABLE whatever (...), you can be slightly more slack-handed if you trust your own data source.
using data shown in example:
import sqlite3
text = "CI|Cote D'ivoire"  # had to be escaped as it's a string literal, but from another data source - possibly not...
code, name = text.split('|', 1)
db = sqlite3.connect(':memory:')
db.execute('create table something(code, name)')
db.execute('insert into something(code, name) values(?, ?)', (code, name))
for row in db.execute('select * from something'):
    print(row)
# ('CI', "Cote D'ivoire")
For a complete solution to add escape characters to a string, use:
re.escape(string)
>>> re.escape('\ a.*$')
'\\\\\\ a\\.\\*\\$'
Note that re.escape is meant for regular-expression metacharacters, not SQL, and since Python 3.7 it only escapes characters that are special in regexes, so the output differs between versions.
For more, see: http://docs.python.org/library/re.html
Not sure if there are some SQL related limitations, but you could always use double quotes to surround your string that contains the single quote.
Eg.
print("That's all Folks!")
or single quotes to surround double quotes:
print('The name of the file is "rosebud".')