I wrote Python code that builds and executes SQL query statements using pymysql.
If I take the SQL statement generated by the code and run it directly against the database, it works normally.
However, when I execute the same statement from the code with cursor.execute(sql), no data appears in the database.
I also passed the local_infile=True option when connecting. How should I write this code?
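A minimal sketch of that pattern, with placeholder connection details and a placeholder INSERT statement: note the explicit conn.commit() at the end, which pymysql requires before the rows become visible, since it does not autocommit by default.

import pymysql

# Placeholder connection details; local_infile=True is only needed for LOAD DATA LOCAL INFILE
conn = pymysql.connect(
    host="localhost",
    user="user",
    password="password",
    database="mydb",
    local_infile=True,
)

sql = "INSERT INTO my_table (col1, col2) VALUES (%s, %s)"  # placeholder statement

with conn.cursor() as cursor:
    cursor.execute(sql, ("value1", "value2"))

conn.commit()  # without this, the insert is discarded when the connection closes
conn.close()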
I am trying to run a stored procedure in SQL Server (Synapse) using pyodbc. This SP merely performs an INSERT into a table.
sql =""" EXEC my_sp (?,?,?)"""
values = (str(v1),str(v2),int(v3))
cur.execute(sql,(values))
conn.commit()
The code above does not raise any error, but when I query the table, I don't see the records. However, when I run the same SP from the SSMS console, it works fine.
I can't figure out why this is so. Any clue?
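For reference, a sketch of the ODBC call-escape form that pyodbc also accepts for executing a stored procedure, reusing the names from the question; whether it changes the behaviour seen here is a separate matter.

# ODBC escape syntax for procedure calls; my_sp, v1, v2, v3 are from the question
sql = "{CALL my_sp (?, ?, ?)}"
values = (str(v1), str(v2), int(v3))
cur.execute(sql, values)
conn.commit()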
I need to extract a bunch of records from a SingleStore database and insert them into another table. For performance, the ideal way to do this is to build a query string with an INSERT INTO ... SELECT statement and iterate through it on a daily basis.
I can't seem to get Python to actually execute the query in the database, even though it appears to run successfully:
fh = 'file_containing_insert_select_query.sql'
qry = open(fh).read()

for i in range(2):
    qry_new = some_custom_function_to_replace_dates(qry, i)
    engine = tls.custom_engine_function()
    engine.execute(qry_new)
I've verified that the SQL statements created by my custom function can be copied and pasted into a SQL editor and executed successfully, but they won't run from Python... any thoughts?
After executing the query, you need to commit the transaction with connection.commit() (where connection is the database connection created from your credentials and host address), so that the rows inserted by the Python program are actually saved.
If you want it to run really fast, it's usually better to use set-oriented SQL, like INSERT INTO targetTbl … SELECT … FROM …; that way, the data doesn't have to round-trip through the client application.
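As a rough sketch of how the commit advice could look with SQLAlchemy, reusing the names from the question; engine.begin() opens a transaction that is committed when the block exits.

from sqlalchemy import text

fh = 'file_containing_insert_select_query.sql'
qry = open(fh).read()

for i in range(2):
    qry_new = some_custom_function_to_replace_dates(qry, i)
    engine = tls.custom_engine_function()
    # begin() opens a transaction and commits it automatically on exit
    with engine.begin() as conn:
        conn.execute(text(qry_new))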
I am new to SQLAlchemy.
I have a local Postgres server, and I want to use SQLAlchemy to create a database.
I have the following code:
from sqlalchemy import text, bindparam

connection = engine.connect()
connection.execute(
    text("CREATE DATABASE :database_name").bindparams(bindparam('database_name', quote=False)),
    database_name="test_db"
)
But this unfortunately single-quotes the database name parameter, which does not work in Postgres. The logs from SQLAlchemy:
[SQL: CREATE DATABASE %(database_name)s]
[parameters: {'database_name': 'test_db'}]
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.SyntaxError) syntax error at or near "'test_db'"
LINE 1: CREATE DATABASE 'test_db'
In the Postgres logs, it executes the following statement, which is invalid because of the single quotes. A valid one would have double quotes:
CREATE DATABASE 'test_db'
Is there a way for the bind parameter not to be quoted in the resulting statement? I do not want to do the parameter quoting and string building myself: I think this abstraction should be handled by SQLAlchemy (in case I ever change the underlying database engine, for example), and parameter binding also looks like the mechanism SQLAlchemy promotes to avoid SQL injection.
The same question applies to other Postgres statements, such as creating a user with a password or granting privileges to an existing user, which all need Postgres-specific quoting.
You cannot use bind parameters in statements other than SELECT, INSERT, UPDATE or DELETE.
You'll have to construct the CREATE DATABASE statement as a string containing the database name. Something like:
from psycopg2 import sql

cursor.execute(
    sql.SQL("CREATE DATABASE {}").format(sql.Identifier('test_db'))
)
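For context, a more complete sketch of the same idea (connection details are placeholders). Note that CREATE DATABASE cannot run inside a transaction block, so the psycopg2 connection has to be in autocommit mode.

import psycopg2
from psycopg2 import sql

# Placeholder connection details for a local server
conn = psycopg2.connect(host="localhost", user="postgres", password="postgres")
conn.autocommit = True  # CREATE DATABASE is not allowed inside a transaction block

with conn.cursor() as cursor:
    # sql.Identifier quotes the name as an identifier: CREATE DATABASE "test_db"
    cursor.execute(
        sql.SQL("CREATE DATABASE {}").format(sql.Identifier('test_db'))
    )

conn.close()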
I would like to extract table creation scripts using PyMySQL. When I execute a statement like cursor.execute("show create table table_name;"), it does not return the expected result.
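For what it's worth, a minimal sketch of how SHOW CREATE TABLE is typically read back with PyMySQL (connection details and table_name are placeholders): cursor.execute() itself only returns a row count, so the DDL has to be fetched from the cursor, where it sits in the second column of the result row.

import pymysql

# Placeholder connection details
conn = pymysql.connect(host="localhost", user="user", password="password", database="mydb")

with conn.cursor() as cursor:
    cursor.execute("SHOW CREATE TABLE table_name")
    row = cursor.fetchone()        # returns (table_name, create_statement)
    create_statement = row[1]      # the CREATE TABLE script
    print(create_statement)

conn.close()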
I am running into MySQL behavior on Google Cloud SQL that I have never seen before.
Every MySQL command we try works from a Python script except INSERT. We can create the table and show tables, but when we insert data, nothing appears in the table. Yet if we copy that exact same INSERT statement into the MySQL command line and run it, the insert works fine.
BUT here is the unusual part: even though the Python script fails to insert data, the UID AUTO_INCREMENT field has incremented for every empty, failed insert. For example, if the Python script fails to insert a row, the next time we run an insert from the MySQL command line we see that the UID field has incremented by one.
It is as if MySQL started to insert the data, auto-incremented the UID field, but then the data never arrived.
We are using MySQL on Google Cloud SQL. The insert is a simple test:
insert into queue (filename, text) VALUES ('test', 'test')
Any ideas what this is or how to debug it?
It turns out AUTOCOMMIT is set to OFF on Google Cloud SQL, so every SQL INSERT must be followed by a commit statement.
For example:
import MySQLdb as mdb

db = mdb.connect(ip, login, pword)   # host, user, password
query = "insert into tbname (entity, attribute) VALUES ('foo', 'bar')"

cursor = db.cursor()
cursor.execute(query)
db.commit()   # without this, the insert is rolled back when the connection closes
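Alternatively, if you would rather not call commit() after every statement, MySQLdb connections can be switched to auto-committing mode with db.autocommit(True) right after connecting; PyMySQL offers the same idea via the autocommit=True connection argument.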