I'm new to Python, having just moved from PHP 5. I read about the qmark parameter style for preparing a query, but I get the following:
Traceback (most recent call last):
File "t.py", line 10, in <module>
cursor.execute("SELECT * FROM object WHERE otype = ?", ["user"])
File "/usr/local/lib/python3.4/site-packages/pymysql/cursors.py", line 130, in execute
query = query % self._escape_args(args, conn)
TypeError: not all arguments converted during string formatting
Is that only for sqlite connector?
import sys
import pymysql
try:
    connection = pymysql.connect(host="127.0.0.1", user="root", passwd="pass", db="xxx")
except:
    sys.exit("Database connection error")
cursor = connection.cursor()
cursor.execute("SELECT * FROM object WHERE otype = ?", "user")
for row in cursor:
    print(row)
cursor.close()
connection.close()
Yes. Unfortunately the DB-API specification left it up to each implementer to choose the parameter style. pysqlite chose ?, while MySQLdb chose %s. Since pymysql is a drop-in replacement for MySQLdb, it also uses %s.
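Applied to the snippet from the question, the query just needs %s instead of ?, with the value still passed separately so the driver escapes it (a minimal sketch reusing the connection details from the question):
import pymysql

connection = pymysql.connect(host="127.0.0.1", user="root", passwd="pass", db="xxx")
cursor = connection.cursor()

# placeholders are written as %s (the same style MySQLdb uses)
cursor.execute("SELECT * FROM object WHERE otype = %s", ("user",))
for row in cursor:
    print(row)

cursor.close()
connection.close()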
Related
I am trying to insert data into my table in Microsoft SQL Server using a Python script.
Initially I was trying to upload an Excel file, but I got error messages and have scaled the task down. At this point I am just trying to push some data through to an already existing table in my database. In this example my server name is SERVERNAME, the database is DATABASENAME and my table is TABLE. The script is:
import pyodbc
conn = pyodbc.connect('Driver={SQL Server};'
'Server=SERVERNAME;'
'Database=DATABASENAME;'
'Trusted_Connection=yes;')
cursor = conn.cursor()
cursor.execute('INSERT INTO DATABASENAME.dbo.TABLE(Name, Age, City)
VALUES ('Alan', '30', 'London');
for row in cursor:
    print(row)
I get this error message:
Traceback (most recent call last):
File "<pyshell#38>", line 1, in <module>
exec(open("C:\\pandasql.py").read())
File "<string>", line 9
cursor.execute('INSERT INTO DATABASENAME.dbo.TABLE (Name, Age, City)
^
SyntaxError: EOL while scanning string literal
I want the one row with data to be seen in my database. What am I doing wrong?
If you want to use single quotes within a single-quoted string, you need to escape them by adding a \ before them, or use a double-quoted string instead.
Example:
import pyodbc
conn = pyodbc.connect('Driver={SQL Server};'
'Server=SERVERNAME;'
'Database=DATABASENAME;'
'Trusted_Connection=yes;')
cursor = conn.cursor()
cursor.execute("INSERT INTO DATABASENAME.dbo.TABLE(Name, Age, City) VALUES ('Alan', '30', 'London')")
If you want your changes to be saved to the database, you also need to call commit on the connection or set autocommit to True:
# either enable autocommit
conn = pyodbc.connect('...CONNECTION STRING...', autocommit=True)
# or after inserting the row:
conn.commit()
If you want to retrieve the resulting row, you need to select it first, e.g.:
cursor.execute("SELECT * FROM DATABASENAME.dbo.TABLE")
for row in cursor:
    print(row)
or use an OUTPUT clause on your INSERT statement:
cursor.execute("""
INSERT INTO DATABASENAME.dbo.TABLE(Name, Age, City)
OUTPUT Inserted.Name, Inserted.Age, Inserted.City
VALUES ('Alan', '30', 'London')
""")
for row in cursor:
    print(row)
Full example for your code snippet:
import pyodbc
conn = pyodbc.connect('Driver={SQL Server};'
'Server=SERVERNAME;'
'Database=DATABASENAME;'
'Trusted_Connection=yes;')
cursor = conn.cursor()
cursor.execute("""
INSERT INTO DATABASENAME.dbo.TABLE(Name, Age, City)
OUTPUT Inserted.Name, Inserted.Age, Inserted.City
VALUES ('Alan', '30', 'London')
""")
for row in cursor:
    print(row)
conn.commit()
The syntax highlighting in the snippet should already show you the problem: you begin the string with a single quote and then accidentally end it partway through with another single quote.
A simple fix would be to escape that quote (by adding a \ in front of it); a better approach, which also helps secure your code against SQL injection, is to use parameterized (prepared) statements.
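With pyodbc's ? placeholders, the INSERT from the question could be written as follows (a sketch reusing the connection and cursor from above):
# the values are passed separately, so the driver handles quoting and escaping
cursor.execute(
    "INSERT INTO DATABASENAME.dbo.TABLE (Name, Age, City) VALUES (?, ?, ?)",
    ('Alan', '30', 'London'),
)
conn.commit()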
I'm unable to delete a specific table in my PostgreSQL database. That table is called "user". When I try to run the snippet of code below,
import psycopg2
conn = psycopg2.connect("dbname='mydatabase' user='postgres' host='localhost' password='mypassword'")
cur = conn.cursor()
cur.execute("DROP TABLE user;")
conn.commit()
conn.close()
It spits out the following error:
Traceback (most recent call last):
File "dev_psycog.py", line 20, in <module>
cur.execute("DROP TABLE user;")
psycopg2.ProgrammingError: syntax error at or near "user"
LINE 1: DROP TABLE user;
I can delete any other table in my database just fine, but I can't seem to delete my table called "user". Is it because "user" is a reserved keyword?
Quote "user" as below
import psycopg2
conn = psycopg2.connect("dbname='mydatabase' user='postgres' host='localhost' password='mypassword'")
cur = conn.cursor()
cur.execute('DROP TABLE "user";')
conn.commit()
conn.close()
See the PostgreSQL documentation on identifiers:
There is a second kind of identifier: the delimited identifier or quoted identifier. It is formed by enclosing an arbitrary sequence of characters in double-quotes (").
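If the table name arrives in a variable, newer psycopg2 versions also ship a psycopg2.sql module that builds the quoted identifier for you; a minimal sketch reusing the conn and cur from above:
from psycopg2 import sql

table_name = "user"
# sql.Identifier() renders the name as a properly double-quoted identifier
cur.execute(sql.SQL("DROP TABLE {}").format(sql.Identifier(table_name)))
conn.commit()
This avoids hand-writing the double quotes and works for any identifier that needs quoting.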
I am making a project where I connect to a database with Python then update and change things. I have run into problems when trying to retrieve information.
I am using this code:
import sqlite3
conn = sqlite3.connect('Project.db')
print ("Opened database sucessfully")
cursor = conn.execute("SELECT ID,ResidentTitle,ResidentForname FROM Residents")
for row in cursor:
    print ("ID = "), row[0]
    print ("ResidentTitle ="), row[1]
    print ("Name ="), row[2]
print ("done");
conn.close()
From this I am getting the following error:
Traceback (most recent call last):
File "C:/sqlite/Sqlplz.py", line 7, in <module>
cursor = conn.execute("SELECT ID,ResidentTitle,ResidentForname FROM Residents")
sqlite3.OperationalError: no such table: Residents
How can I resolve this error?
cursor = conn.execute("SELECT ID,ResidentTitle,ResidentForname FROMResidents")
-------------------------------------------------------------------^
You are missing a space; you should update it like this:
cursor = conn.execute("SELECT ID,ResidentTitle,ResidentForname FROM Residents")
The problem is fixed; the issue was a broken save file.
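For anyone hitting the same no such table error, a quick way to see which tables a SQLite file actually contains is to query sqlite_master (a small sketch, assuming the same Project.db file):
import sqlite3

conn = sqlite3.connect('Project.db')
# sqlite_master lists every table defined in this database file
for (name,) in conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'"):
    print(name)
conn.close()
Note that sqlite3.connect() silently creates an empty database if the path does not exist, which is a common cause of this error.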
I'm planning to insert data into the column family (CF) below, which has a compound key.
CREATE TABLE event_attend (
    event_id int,
    event_type varchar,
    event_user_id int,
    PRIMARY KEY (event_id, event_type)  -- compound key
);
But I can't insert data into this CF from Python using the cql driver.
(http://code.google.com/a/apache-extras.org/p/cassandra-dbapi2/)
import cql
connection = cql.connect(host, port, keyspace)
cursor = connection.cursor()
cursor.execute("INSERT INTO event_attend (event_id, event_type, event_user_id) VALUES (1, 'test', 2)", dict({}) )
I get the following traceback:
Traceback (most recent call last):
File "./v2_initial.py", line 153, in <module>
db2cass.execute()
File "./v2_initial.py", line 134, in execute
cscursor.execute("insert into event_attend (event_id, event_type, event_user_id ) values (1, 'test', 2)", dict({}))
File "/usr/local/pythonbrew/pythons/Python-2.7.2/lib/python2.7/site-packages/cql-1.4.0-py2.7.egg/cql/cursor.py", line 80, in execute
response = self.get_response(prepared_q, cl)
File "/usr/local/pythonbrew/pythons/Python-2.7.2/lib/python2.7/site-packages/cql-1.4.0-py2.7.egg/cql/thrifteries.py", line 80, in get_response
return self.handle_cql_execution_errors(doquery, compressed_q, compress)
File "/usr/local/pythonbrew/pythons/Python-2.7.2/lib/python2.7/site-packages/cql-1.4.0-py2.7.egg/cql/thrifteries.py", line 98, in handle_cql_execution_errors
raise cql.ProgrammingError("Bad Request: %s" % ire.why)
cql.apivalues.ProgrammingError: Bad Request: unable to make int from 'event_user_id'
What am I doing wrong?
It looks like you are trying to follow the example in:
http://pypi.python.org/pypi/cql/1.4.0
import cql
con = cql.connect(host, port, keyspace)
cursor = con.cursor()
cursor.execute("CQL QUERY", dict(kw='Foo', kw2='Bar', kwn='etc...'))
However, if you only need to insert one row (like in your question), just drop the empty dict() parameter.
Also, since you are using a composite key, make sure you use CQL 3:
http://www.datastax.com/dev/blog/whats-new-in-cql-3-0
connection = cql.connect('localhost:9160', cql_version='3.0.0')
The following code should work (just adapt it to localhost if needed):
import cql
con = cql.connect('172.24.24.24', 9160, keyspace, cql_version='3.0.0')
print ("Connected!")
cursor = con.cursor()
CQLString = "INSERT INTO event_attend (event_id, event_type, event_user_id) VALUES (131, 'Party', 3156);"
cursor.execute(CQLString)
For Python 2.7, 3.3, 3.4, 3.5, and 3.6, you can install the driver with
$ pip install cassandra-driver
And in Python:
import cassandra
Documentation on passing parameters to CQL queries can be found at https://datastax.github.io/python-driver/getting_started.html#passing-parameters-to-cql-queries
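A minimal sketch of the same insert with cassandra-driver (the host and keyspace name are placeholders; adjust them to your setup):
from cassandra.cluster import Cluster

cluster = Cluster(['127.0.0.1'])
session = cluster.connect('my_keyspace')  # assumed keyspace name

# cassandra-driver uses %s placeholders for positional parameters
session.execute(
    "INSERT INTO event_attend (event_id, event_type, event_user_id) VALUES (%s, %s, %s)",
    (1, 'test', 2),
)
cluster.shutdown()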
I have this statement:
cursor = connection.cursor()
query = "SELECT * from table"
cursor.execute(query)
res = cursor.fetchall()
How can I find out how many rows were returned? Is there something like num_rows()?
cursor = connection.cursor()
query = "SELECT * from table"
cursor.execute(query)
print cursor.rowcount
According to the Python Database API Specification v2.0, the rowcount attribute of the cursor object should return the number of rows that the last query produced or affected (the latter is for queries that alter the database). If your database module conforms to the API specification, you should be able to use the rowcount attribute.
The num_rows() function you are looking for does not exist in the MySQLdb module. There is an internal module called _mysql which has a result class with a num_rows method, but you shouldn't be using that - the very existence of _mysql is considered an implementation detail.
The most voted answer did not work for me; it should be like this:
cursor = connection.cursor()
query = "SELECT * FROM table"
cursor.execute(query)
cursor.fetchall()
print (cursor.rowcount)
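If you only need the number of rows and not the data itself, it is usually cheaper to let the database count them instead of fetching every row; a small sketch using the same placeholder table name as above:
cursor = connection.cursor()
cursor.execute("SELECT COUNT(*) FROM table")
# fetchone() returns a one-element tuple holding the count
(count,) = cursor.fetchone()
print(count)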