MySQL CGI script - Python

Below is my CGI script, where I am trying to run an INSERT command.
#! C:\Python27\python.exe -u
import cgi
import MySQLdb
import xml.sax.saxutils
query = cgi.parse()
db = MySQLdb.connect(
    host = "127.0.0.1",
    user = "root",
    passwd = "mysql123",
    db = "welcome")
print 'Content-type: text/plain\n\n<?xml version="1.0" encoding="utf-8"?>\n<result>'
try:
    c = db.cursor()
    c.execute("insert into welcome.registrations values ('test','test',now())")
    print '\t<update>true</update>'
except:
    print '\t<update>false</update>'
print "</result>"
When I go to the URL - .com/cgi-bin/reg.cgi, I don't see any insert happening in the MySQL DB.

You need to do db.commit() after c.execute().
Or you could do:
with db:
    c = db.cursor()
    c.execute("insert into welcome.registrations values ('test','test',now())")

Make sure you do a db.commit() after every insert (or once at the end; either will work):
try:
    c = db.cursor()
    c.execute("insert into welcome.registrations values ('test','test',now())")
    db.commit()
finally:
    db.close()  # Make sure you close the connection, else leftover connections can end up hanging the MySQL server.

Related

MySQL connection with Python through PythonAnywhere

I want to connect to a MySQL database using Python on PythonAnywhere, without creating a Flask/Django application.
I have seemingly managed to connect through MySQLdb, using the code below, but I do not receive a response when I run the code. Any solutions?
import MySQLdb
db = MySQLdb.connect(
    host = "myuser.mysql.pythonanywhere-services.com",
    user = "myuser",
    passwd = XXX,
    db = "myuser$db_name"
)
cursor = db.cursor()
cursor.execute("SELECT * FROM table_name")
for x in cursor:
    print(x)
cursor.close()
db.close()
You retrieve all rows in the table, without error.
cursor.execute("SELECT * FROM table_name")
for x in cursor:
print(x)
Yet you see no output. This is normal for a table that contains zero rows.
Consider doing one or more INSERTs, and a COMMIT, prior to the query.
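For example, a quick way to confirm the connection really works end to end is to insert a row and commit it before the SELECT. This is only a sketch; the column names below are placeholders, not taken from your schema:
cursor = db.cursor()
# Placeholder column names; adjust to your actual table definition.
cursor.execute("INSERT INTO table_name (col1, col2) VALUES (%s, %s)", ("hello", "world"))
db.commit()  # without this, other sessions will not see the row
cursor.execute("SELECT * FROM table_name")
for x in cursor:
    print(x)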

Python update query in MariaDB

I am trying to update my MariaDB table via Python code. When I run the query, nothing happens in my database. Please check the code below and let me know where I made a mistake in the update.
import mariadb
connection= mariadb.connect(user="user1", database="db1", host="ippp" ,password="pass")
cursor= connection.cursor()
cursor.execute("UPDATE product_options_combinations SET quantity=5944 WHERE item_code ='31628'")
cursor.close()
connection.close()
Hello, here is a clean code example for you of how to update it.
import pymysql
# Create a connection object
# IP address of the MySQL database server
Host = "localhost"
# User name of the database server
User = "user"
# Password for the database user
Password = ""
database = "GFG"
conn = pymysql.connect(host=Host, user=User, password=Password, database=database)
# Create a cursor object
cur = conn.cursor()
query = f"UPDATE PRODUCT SET price = 1400 WHERE PRODUCT_TYPE = 'broadband'"
cur.execute(query)
#To commit the changes
conn.commit()
conn.close()
You just need to add connection.commit() to your code. I also recommend using parametrized SQL, preferably with a list of tuples (more can be added if needed) together with cursor.executemany(), which is more performant for DML statements such as this:
import mariadb
connection= mariadb.connect(user="user1",
password="pass",
host="ippp",
port=3306,
database="db1")
cursor= connection.cursor()
dml="""
UPDATE product_options_combinations
SET quantity=%s
WHERE item_code =%s
"""
val = [
    (5944, '31628')
]
cursor.executemany(dml,val)
connection.commit()
cursor.close()
connection.close()
Are you sure that the connection is working properly?
Have you tried implementing a try/except block to print MariaDB errors?
Something like this:
# Connect to MariaDB Platform
import sys
import mariadb
try:
    conn = mariadb.connect(
        user="user",
        password="password",
        host="xx.xx.xx.xx",
        port=3306,
        database="db_name"
    )
except mariadb.Error as e:
    print(f"Error connecting to MariaDB Platform: {e}")
    sys.exit(1)
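If the connection succeeds, you can extend the same pattern to the UPDATE itself and commit it, printing any error instead of failing silently. A sketch reusing the table and values from the question:
cur = conn.cursor()
try:
    cur.execute(
        "UPDATE product_options_combinations SET quantity=%s WHERE item_code=%s",
        (5944, '31628'))
    conn.commit()  # the change is not persisted until you commit
    print(f"Rows updated: {cur.rowcount}")
except mariadb.Error as e:
    print(f"Error running UPDATE: {e}")
finally:
    cur.close()
    conn.close()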

Add JSON or XML or CSV data to MySQL server

Is there any way of sending either JSON, XML, or CSV data to a local MySQL server?
I'm new to MySQL, and wasn't able to find anything online.
Any of these formats will work, as I have code that can convert my data into whichever format I require, i.e. JSON, XML, or CSV.
Any help is appreciated!
1) JSON: how to store JSON data in a MySQL DB using Python.
If your JSON looks like the following and you want to store it as key/value rows in a MySQL table, follow the first example.
Example: 1
JSON format
{
    "first_key" : 10,
    "second_key" : 20
}
Python core script for JSON.
import sys
import json
import MySQLdb

# jdata is assumed to hold the JSON string shown above
myjson = json.loads(jdata)

def dbconnect():
    try:
        db = MySQLdb.connect(
            host='localhost',
            user='root',
            passwd='',
            db='myjson_db'
        )
    except Exception as e:
        sys.exit("Can't connect to database")
    return db

db = dbconnect()
cursor = db.cursor()
sql = """INSERT INTO my_table (array_key, array_value) VALUES (%s, %s)"""
for array_key, array_value in myjson.items():
    cursor.execute(sql, (array_key, array_value))
db.commit()  # persist the inserts
If you want to store the data in only one column, you can follow the second example below.
Example: 2
import sys
import json
import MySQLdb

# jdata is assumed to hold the JSON string shown above
myjson = json.loads(jdata)

def dbconnect():
    try:
        db = MySQLdb.connect(
            host='localhost',
            user='root',
            passwd='',
            db='myjson_db'
        )
    except Exception as e:
        sys.exit("Can't connect to database")
    return db

db = dbconnect()
cursor = db.cursor()
sql = """INSERT INTO my_table (json_column) VALUES (%s)"""
# Serialize the dict back to a JSON string for a single column; note the one-element tuple.
cursor.execute(sql, (json.dumps(myjson),))
db.commit()
2) XML: how to store XML data in a MySQL DB using Python.
XML data
<?xml version="1.0" encoding="UTF-8" ?>
<first_key>10</first_key>
<second_key>20</second_key>
The next step is to install a Python script that converts XML to JSON (xml2json) and import xml2json in our core Python script.
Python Core script for XML
import sys
import json
import MySQLdb
import xml2json

# xmldata is assumed to hold the XML string shown above
xml_data = json.loads(xml2json.xml2json(xmldata))
# The data-store logic is the same as in example 1 and example 2.

def dbconnect():
    try:
        db = MySQLdb.connect(
            host='localhost',
            user='root',
            passwd='',
            db='myxml_db'
        )
    except Exception as e:
        sys.exit("Can't connect to database")
    return db

db = dbconnect()
cursor = db.cursor()
sql = """INSERT INTO my_table (xml_data) VALUES (%s)"""
# Serialize the converted data back to a JSON string for storage; note the one-element tuple.
cursor.execute(sql, (json.dumps(xml_data),))
db.commit()
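If you would rather avoid the external xml2json dependency, a rough sketch using only the standard library produces the same kind of dictionary (this assumes the XML fragment above is wrapped in a single root element):
import json
import xml.etree.ElementTree as ET

xmldata = "<root><first_key>10</first_key><second_key>20</second_key></root>"
root = ET.fromstring(xmldata)
# Build {tag: text} pairs from the child elements, then reuse the insert logic above.
xml_as_dict = {child.tag: child.text for child in root}
payload = json.dumps(xml_as_dict)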
3) CSV: how to store CSV data in a MySQL DB using Python.
import sys
import csv
import MySQLdb

csv_data = csv.reader(open('my_csv_file.csv'))

def dbconnect():
    try:
        db = MySQLdb.connect(
            host='localhost',
            user='root',
            passwd='',
            db='mycsv_db'
        )
    except Exception as e:
        sys.exit("Can't connect to database")
    return db

db = dbconnect()
cursor = db.cursor()
for row in csv_data:
    cursor.execute('INSERT INTO my_csv_table (csv_first_column, csv_second_column) '
                   'VALUES (%s, %s)', row)
db.commit()
I'm unaware of any way of inserting JSON, XML or CSV into a MySQL database directly.
You can parse the data in a script which inserts it into the database using a module such as MySQL-python.
My Python isn't great, but hopefully this example will suffice.
#!/usr/bin/python
# Import MySQL module to interact with the database.
import MySQLdb
# Import json module to convert the JSON into a Python data structure.
import json

# Convert the JSON to a usable format (assumed to decode to a dict with 'foo' and 'bar' keys).
data = json.loads(yourjson)

# Connect to the MySQL server.
db = MySQLdb.connect(host='yourhost',
                     user='youruser',
                     passwd='yourpassword',
                     db='yourschema')

# Create an object to handle SQL statements.
cur = db.cursor()

# Attempt to execute the SQL statement; if it fails, revert any changes.
try:
    cur.execute('INSERT INTO your_table SET col1 = %s, col2 = %s', (data['foo'], data['bar']))
    db.commit()
except:
    db.rollback()

Execute SQL command in Python

When I run this code on my Raspberry Pi, nothing happens; the code runs smoothly but there are no results.
#!/usr/bin/python
import MySQLdb
db = MySQLdb.connect(host="localhost", # your host, usually localhost
user="root", # your username
passwd="raspberry", # your password
db="raspberry") # name of the data base
cur = db.cursor()
cur.execute("UPDATE visitors SET nb_visits = nb_visits+1 WHERE id = 1")
You should commit your transaction for the changes to take effect:
cur.execute("UPDATE visitors SET nb_visits = nb_visits+1 WHERE id = 1")
db.commit()

psycopg2 not actually inserting data

I need to insert JSON data from Tornado into Postgres, so here's a test like this:
from psycopg2 import connect
conn = connect("user='pguser' host='localhost' dbname='pgdb' password='pgpass'")
cursor = conn.cursor()
data = '[{"id":"sdf","name":"wqe","author":"vb"}]'
for row in eval(data):
    print row
    cursor.execute("""INSERT INTO books(id,name,author) VALUES('%s','%s','%s')""" % \
        (row['id'], row['name'], row['author'])
    )
>>> cursor.execute("SELECT * FROM books")
>>> cursor.fetchall()
[('sdf', 'wqe', 'vb')]
>>>
$> psql -d pgdb -U pguser -W
Password for user pguser:
psql (9.1.6)
Type "help" for help.
pgdb=> select * from books;
id | name | author
----+------+--------
(0 rows)
As you can see, after doing the SELECT in the Python shell there is some data, but in psql there are 0 rows! What might I be doing wrong?
Python 2.7.2+
You didn't commit the transaction.
Psycopg2 opens a transaction automatically, and you must tell it to commit in order to make the data visible to other sessions.
See the psycopg2 FAQ and the connection.commit() method.
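Applied to the code in the question, a minimal fix might look like the sketch below. I also swapped eval for json.loads and let psycopg2 handle parameter quoting instead of interpolating values into the SQL string:
import json
for row in json.loads(data):  # json.loads is safer than eval for JSON text
    cursor.execute("INSERT INTO books (id, name, author) VALUES (%s, %s, %s)",
                   (row['id'], row['name'], row['author']))
conn.commit()  # makes the rows visible to other sessions, such as psql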
Just had the same perplexing issue. To put options together:
as @Craig Ringer writes, after cursor.execute you can run connection.commit()
cursor.execute('INSERT INTO table VALUES(DEFAULT, %s)', (email,))
...
connection.commit()
OR
after connect set autocommit
connection = connect("user='pguser' host='localhost' dbname='pgdb' password='pgpass'")
connection.autocommit = True
OR
use set_session to set autocommit
connection = connect("user='pguser' host='localhost' dbname='pgdb' password='pgpass'")
connection.set_session(autocommit=True)
All worked for me.
