I'm attempting to get a Python script to insert data into a database without having it drop the table first. I'm sure this isn't hard to do, but I can't seem to get the code right.
Here is the full Python script:
#!/usr/bin/python
# -*- coding: utf-8 -*-
import requests
import hashlib
import time
import MySQLdb
#Dont forget to fill in PASSWORD and URL TO saveTemp (twice) in this file
sensorids = ["28-000004944b63", "28-000004c01b2c"]
avgtemperatures = []
for sensor in range(len(sensorids)):
    temperatures = []
    for polltime in range(0, 3):
        text = ''
        while text.split("\n")[0].find("YES") == -1:
            # Open the file that we viewed earlier so that python can see what is in it. Replace the serial number as before.
            tfile = open("/sys/bus/w1/devices/" + sensorids[sensor] + "/w1_slave")
            # Read all of the text in the file.
            text = tfile.read()
            # Close the file now that the text has been read.
            tfile.close()
            time.sleep(1)
        # Split the text with new lines (\n) and select the second line.
        secondline = text.split("\n")[1]
        # Split the line into words, referring to the spaces, and select the 10th word (counting from 0).
        temperaturedata = secondline.split(" ")[9]
        # The first two characters are "t=", so get rid of those and convert the temperature from a string to a number.
        temperature = float(temperaturedata[2:])
        # Put the decimal point in the right place and display it.
        temperatures.append(temperature / 1000 * 9.0 / 5.0 + 32.0)
    avgtemperatures.append(sum(temperatures) / float(len(temperatures)))
print avgtemperatures[0]
print avgtemperatures[1]
#connect to db
db = MySQLdb.connect("localhost","user","password","temps" )
#setup cursor
cursor = db.cursor()
#create temps table
cursor.execute("DROP TABLE IF EXISTS temps")
sql = """CREATE TABLE temps (
temp1 FLOAT,
temp2 FLOAT )"""
cursor.execute(sql)
#insert to table
try:
cursor.execute("""INSERT INTO temps VALUES (%s,%s)""",(avgtemperatures[0],avgtemperatures[1]))
db.commit()
except:
db.rollback()
#show table
cursor.execute("""SELECT * FROM temps;""")
print cursor.fetchall()
((188L, 90L),)
db.close()
This is the part I need assistance with:
If I have it drop the table it works fine, but I don't want it to drop the table; I just want to insert the new data into the same table.
#connect to db
db = MySQLdb.connect("localhost","user","pasword1","temps" )
#setup cursor
cursor = db.cursor()
#create temps table
cursor.execute("DROP TABLE IF EXISTS temps")
sql = """CREATE TABLE temps (
    temp1 FLOAT,
    temp2 FLOAT )"""
cursor.execute(sql)
#insert to table
try:
    cursor.execute("""INSERT INTO temps VALUES (%s,%s)""", (avgtemperatures[0], avgtemperatures[1]))
    db.commit()
except:
    db.rollback()
#show table
cursor.execute("""SELECT * FROM temps;""")
print cursor.fetchall()
# example output: ((188L, 90L),)
db.close()
You shouldn't have to drop a table each time you want to enter data. In fact, it defeats the whole purpose of the database, since you would remove all the previous data each time you run your script.
You should create the table only if it does not exist. Use the following:
sql = """CREATE TABLE IF NOT EXISTS temps (
temp1 FLOAT,
temp2 FLOAT )"""
cursor.execute(sql)
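Put together, a minimal sketch of the corrected block, using the placeholder connection values from the question, so repeated runs keep accumulating rows instead of wiping the table:
#connect to db
db = MySQLdb.connect("localhost", "user", "password", "temps")
cursor = db.cursor()
#create the table only if it is missing; existing rows are left alone
cursor.execute("""CREATE TABLE IF NOT EXISTS temps (
    temp1 FLOAT,
    temp2 FLOAT )""")
#insert to table
try:
    cursor.execute("""INSERT INTO temps VALUES (%s,%s)""",
                   (avgtemperatures[0], avgtemperatures[1]))
    db.commit()
except:
    db.rollback()
db.close()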
I've had this problem with updating. Try adding COMMIT to the end of your SQL. I use psycopg2 to connect to a PostgreSQL database. Here is an example:
def simple_insert():
    sql = '''INSERT INTO films VALUES ('UA502', 'Bananas', 105, '1971-07-13', 'Comedy', '82 minutes'); COMMIT;'''
    try:
        conn = psycopg2.connect(database)
        cur = conn.cursor()
        cur.execute(sql)
    except:
        raise
I think your problem is that you're not committing the transaction, and the COMMIT command should fix it.
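For reference, a minimal sketch of the same insert using the driver's own commit call instead of an inline COMMIT; the DSN and the films table are placeholders:
import psycopg2

def simple_insert_commit():
    conn = psycopg2.connect("dbname=test user=postgres")  # placeholder DSN
    try:
        cur = conn.cursor()
        cur.execute("INSERT INTO films VALUES ('UA502', 'Bananas', 105, '1971-07-13', 'Comedy', '82 minutes')")
        conn.commit()  # persist the transaction
    except:
        conn.rollback()  # undo on any error
        raise
    finally:
        conn.close()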
Related
What the following script is supposed to do:
Connect to a PostgreSQL database
Grab the last id entry in the final table
Compare that entry to data uploaded into a staging table from a .csv, using the [id] column (trying to avoid duplicates)
Insert data from the staging table into the final table (only entries whose timestamp id is greater than the last entry from the previous data)
Truncate the staging table
The code as written below works but is unfinished. I am at the point where I have to compare timestamp1 (or t1) to the id column in the staging table, and I'm unsure of how to go about that.
This is the spot in the code:
#insert new entries into final db table
cursor.execute("INSERT INTO test SELECT * FROM stagingtable WHERE ####
I am hoping for a bit of assistance or guidance with what needs to be done. My Python skills are new, and it has taken me a good deal of effort to get this far. I'm sure a loop is required, but I'm not sure how to apply the timestamp format "%Y/%m/%d %H:%M:%S.%f" of the id column to the comparison. When applied correctly, the difference between a new entry and the last timestamp id should be positive, and for entries that already exist, zero or negative. Some may suggest that a MERGE INTO would work, but the final table will continually collect data without truncating earlier uploads, so (to my understanding) comparing data with the MERGE method would take longer and longer.
import csv
import pyodbc
import time
from datetime import datetime

#connect to database
#DB connection string
print("Establishing Database connection...")
con = pyodbc.connect('DSN=sqldatabase')
cursor = con.cursor()
print("...Connected to database.")

#recall last timestamp entry in final db table
timestamp1 = cursor.execute("select max(id) from test;").fetchval()

#read file and copy data into staging table
print("Reading file contents and copying into staging table...")
with open('C:\\Users\\user\\Desktop\\test2.csv') as csvfile:
    readCSV = csv.reader(csvfile, delimiter=',')
    columns = next(readCSV) #skips the header row
    query = 'insert into stagingtable({0}) values ({1})'
    query = query.format(','.join(columns), ','.join('?' * len(columns)))
    for data in readCSV:
        cursor.execute(query, data)
    con.commit()

timestamp2 = cursor.execute("select max(id) from stagingtable;").fetchval()
t1 = datetime.strptime(timestamp1, "%Y/%m/%d %H:%M:%S.%f")
# t2 = datetime.strptime(timestamp2, "%Y/%m/%d %H:%M:%S.%f")
# difference = t2 - t1
# print(difference)

#insert new entries into final db table
cursor.execute("INSERT INTO test SELECT * FROM stagingtable WHERE ####

#clear staging table
print("Clearing previous data download...")
cursor.execute("TRUNCATE TABLE stagingtable")
con.commit()
con.close()
print("...Completed clearing staging table.")
import csv
import pyodbc
import time
from datetime import datetime

#connect to database
#DB connection string
print("Establishing Database connection...")
con = pyodbc.connect('DSN=SQLdatabase')
cursor = con.cursor()
print("...Connected to database.")

#recall last timestamp entry in db table
t1 = datetime.strptime(cursor.execute("SELECT MAX(id) FROM test;").fetchval(), "%Y/%m/%d %H:%M:%S.%f")

#read file and copy data into table
print("Reading file contents and copying into table...")
with open('C:\\Users\\user\\Desktop\\test2.csv') as csvfile:
    readCSV = csv.reader(csvfile, delimiter=',')
    columns = next(readCSV) #skips the header row
    t2 = datetime.strptime(next(readCSV)[0], "%Y/%m/%d %H:%M:%S.%f")
    while t2 < t1:
        t2 = datetime.strptime(next(readCSV)[0], "%Y/%m/%d %H:%M:%S.%f")
    query = 'insert into test({0}) values ({1})'
    query = query.format(','.join(columns), ','.join('?' * len(columns)))
    for data in readCSV:
        cursor.execute(query, data)
    con.commit()
    print("Data posted to table")
I did away with the staging table. This was the final outcome.
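For what it's worth, the comparison the original question asked about can also be pushed into SQL with a parameterized WHERE clause, keeping the staging table. This is only a sketch, and it assumes the id column is stored as text in the fixed-width "%Y/%m/%d %H:%M:%S.%f" format, so string comparison matches chronological order:
#insert only staging rows newer than the last id already in the final table
#timestamp1 is the value fetched earlier with select max(id) from test
cursor.execute("INSERT INTO test SELECT * FROM stagingtable WHERE id > ?", timestamp1)
con.commit()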
I have a problem getting the query results from my Python code. The connection to the database seems to work, but I always get the error:
"InterfaceError: No result set to fetch from."
Can somebody help me with my problem? Thank you!
import mysql.connector

cnx = mysql.connector.connect(
    host="127.0.0.1",
    user="root",
    passwd="*****",
    db="testdb"
)
cursor = cnx.cursor()
query = ("Select * from employee ;")
cursor.execute(query)
row = cursor.fetchall()
If your problem is still not solved, you can consider replacing the Python MySQL driver package and using pymysql.
You can write code like this:
#!/usr/bin/python
import pymysql

db = pymysql.connect(host="localhost",  # your host, usually localhost
                     user="test",       # your username
                     passwd="test",     # your password
                     db="test")         # name of the database
# you must create a Cursor object. It will let
# you execute all the queries you need
cur = db.cursor()
query = ("SELECT * FROM employee")
# Use all the SQL you like
cur.execute(query)
# print the first cell of every row
for row in cur.fetchall():
    print(row[0])
db.close()
This should be able to find the result you want.
Add this to your code:
for i in row:
    print(i)
You did not print anything, which is why it looks as if nothing is working; this will print each row on a separate line.
First try print(row); if that fails, try iterating over the results with a for loop, and remove the semicolon from the SELECT query statement:
cursor = connection.cursor()
rows = cursor.execute('SELECT * FROM [DBname].[dbo].TableName where update_status is null ').fetchall()
for row in rows:
    ds = row[0]
    state = row[1]
Here row[0] represents the first column in the table, row[1] represents the second column, and so on.
I have some Python code that selects data from Oracle Spatial and inserts it into Spatialite. My problem is that the cursor contains the geometry in binary, and I can't figure out how to read the binary into the Spatialite insert statement. Just to add: this all works if I use WKT, but some of the geometries are too long, hence the reason for the binary format.
Can anyone help please?
# Import system modules
import cx_Oracle
from pyspatialite import dbapi2 as sl_db

def db_connect():
    # Build connect from TNS names
    o_db = cx_Oracle.connect("xxxxx", "xxxxx", "xxxxx_gl_dev")
    cursor = o_db.cursor()
    return cursor

def db_lookup(cursor):
    # Select records
    sql = "SELECT sdo_util.to_wkbgeometry(a.shape), a.objectid FROM span a WHERE a.objectid = 1382372"
    cursor.execute(sql)
    row = cursor.fetchall()
    return row

def db_insert(row):
    # Insert rows into the new Spatialite table
    database_name = 'C:\\Temp\\MYDATABASE.sqlite'
    db_connection = sl_db.connect(database_name)
    db_cursor = db_connection.cursor()
    sql = 'INSERT INTO "SPAN_OFL" ("geometry", "OBJECTID") VALUES (GeomFromWKB(?, 27700), ?);'
    db_cursor.executemany(sql, row)
    db_connection.commit()
    db_connection.close()

# main code
cursor = db_connect()
row = db_lookup(cursor)
db_insert(row)
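A hedged guess at what may be tripping this up: cx_Oracle typically returns BLOB columns (such as the output of sdo_util.to_wkbgeometry) as LOB objects rather than plain bytes, so the rows may need to be materialized before being handed to executemany. A minimal sketch of that conversion; the helper name is my own:
def materialize_rows(rows):
    # read each cx_Oracle LOB into bytes so sqlite can bind it as a blob
    result = []
    for geom, objectid in rows:
        data = geom.read() if geom is not None else None
        result.append((data, objectid))
    return result

# usage: db_insert(materialize_rows(db_lookup(cursor)))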
I am trying to make a random code generator in Python that writes to a database. I have the codes generating and writing to the database, but instead of adding full codes to the database it inserts single letters. Here is my code for the code generator:
import string
import random
import sqlite3
def id_generator():
    db = sqlite3.connect('codes.db')
    c = db.cursor()
    number_of_codes = 10
    stringLength = 9
    id_code = input("what letter should this begin with: \n")
    id_code = id_code.upper()
    dict_ofcodes = []
    for x in range(0, number_of_codes):
        codez = ''.join(random.choice(string.ascii_uppercase) for i in range(stringLength))
        final_codez = id_code + codez
        dict_ofcodes.insert(x, final_codez)
    print(dict_ofcodes)
    dict_ofcodes_tuple = tuple(dict_ofcodes)
    print(dict_ofcodes_tuple)
    for x in range(0, number_of_codes):
        c.executemany(''' INSERT INTO codes(codes) VALUES(?)''', dict_ofcodes_tuple[x])
    db.commit()
    db.close()

id_generator()
Here is what it prints:
['AALRRIULNC', 'AZTKZBKTLK', 'ATWMWYWICO', 'AWQJIJYEJH', 'AQFIONPUNJ', 'AMJRXUIJXM', 'AUDRLSBLSG', 'ABXYXDUMPD', 'AUAXRQURBH', 'ADQEVIRDFU']
('AALRRIULNC', 'AZTKZBKTLK', 'ATWMWYWICO', 'AWQJIJYEJH', 'AQFIONPUNJ', 'AMJRXUIJXM', 'AUDRLSBLSG', 'ABXYXDUMPD', 'AUAXRQURBH', 'ADQEVIRDFU')
It writes single letters of the codes to the database:
A
F
Y
and so on
The code I used to create the schema is contained in a Python file:
import sqlite3
def writeDB():
    db = sqlite3.connect('codes.db')
    c = db.cursor()
    # Create table
    c.execute('''CREATE TABLE codes (codes TEXT)''')
    # Save (commit) the changes
    db.commit()
    # can also close the connection if done with it.
    # be sure any changes have been committed or they will be lost.
    db.close()

writeDB()
I created the file with the Mac terminal.
How can I write the full codes to the database?
The problem is with this line:
c.executemany(''' INSERT INTO codes(codes) VALUES(?)''', dict_ofcodes_tuple[x])
executemany is used to iterate over a list of parameter sets and run the SQL statement once for each set, so your dict_ofcodes_tuple[x] is treated as a character array and the INSERT is called once per character.
If you want to insert the entire string as one value, use execute() instead:
c.execute(''' INSERT INTO codes(codes) VALUES(?)''', (dict_ofcodes_tuple[x],))
or
c.execute(''' INSERT INTO codes(codes) VALUES(?)''', [dict_ofcodes_tuple[x]])
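Alternatively, if you want one call to insert all the codes, executemany will run the statement once per parameter set when each code is wrapped in its own tuple; a minimal sketch under that assumption:
# each code becomes a one-element tuple, so executemany runs the
# INSERT once per code instead of once per character
params = [(code,) for code in dict_ofcodes_tuple]
c.executemany(''' INSERT INTO codes(codes) VALUES(?)''', params)
db.commit()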
I'm migrating a script from another language to Python. I watered this down on the specifics of the database calls, etc., but this is what the file looks like. I intentionally made some queries fail as I was testing the transaction, and it did not rollback() the queries executed prior to the forced error. I am a little confused as to how transactions work with Python. The example I followed was this one; it was a loop with several queries nested within transactions, so I adapted the code according to what I understood from it.
#!/usr/bin/python
import MySQLdb
import thread
import os

# Open database connection
# added local_infile=1 to allow the import to work, otherwise you get an error
db = MySQLdb.connect(CONNECTION ARGS...)

# define our function that will be called from our thread
def import_queued_file(conn,distid):
    # prepare a cursor object using cursor() method
    cursor = conn.cursor()
    # total lines imported for all files for a distributor
    total_lines_imported = 0
    # current lines imported for each file on each iteration
    current_lines_imported = 0
    # default this to 0, this will have the total lines for our imports on each iteration
    previous_lines_imported = 0
    # initialize the file exists flag to 0
    file_exists = 0
    # sql statement to retrieve the file(s) for a specific distributor
    sql = """
        SELECT
        ...
        FROM ...
        WHERE ...
    """
    # execute the sql statement
    cursor.execute(sql)
    # if we have records, execute the code below
    if (cursor.rowcount > 0):
        # set the records to the files variable
        files = cursor.fetchall()
        # set a variable to count iterations
        # we'll use this to determine if we need to drop the table
        cnt = 0
        # keep track of the total number of lines imported per distributor (may be multiple files)
        lines_imported = 0
        # loop the recordset
        for col in files:
            # increment the cnt variable
            cnt += 1
            # set file_exists to 0 at the beginning of the iteration
            file_exists = 0
            # set some variables to be used in our sql load data statement
            var1 = col[1]
            var2 = col[2]
            ....
            # this is the path of our file that we will be using for MySQL LOAD DATA also
            # TODO: REFACTOR SO THAT THE /home/webex/backup/ IS NOT HARD CODED
            inventoryfile = "/path/to/file/%s" % (filepath)
            # check to see if we have a file
            if (os.path.exists(inventoryfile)):
                try:
                    # set file exists to true
                    file_exists = 1
                    # if cnt > 1, it means we have more than 1 file for this distributor
                    # only drop the table if this is the first iteration
                    if (cnt == 1):
                        # drop table sql statement
                        sql = "DROP TABLE IF EXISTS %s" % (temptable)
                        # execute the sql command
                        cur = conn.cursor()
                        cur.execute(sql)
                        cur.close()
                    # assign the create table statement to the sql variable
                    sql = """
                        CREATE TABLE IF NOT EXISTS
                        .......
                        .......
                        ) ENGINE=MyISAM DEFAULT CHARSET=utf8
                    """ % (temptable)
                    # execute the sql statement
                    cur = conn.cursor()
                    cur.execute(sql)
                    cur.close()
                    # query the temptable to see if we have any records
                    sql = "SELECT COUNT(0) AS total FROM %s" % (temptable)
                    cur = conn.cursor()
                    cur.execute(sql)
                    # get the count of how many records exist in the database
                    # fetch the count before closing the cursor
                    number_of_line_items = cur.fetchall()
                    cur.close()
                    previous_lines_imported = number_of_line_items[0][0]
                    # load data local infile sql statement
                    sql = """
                        LOAD DATA LOCAL INFILE ...
                    """
                    # execute the load data infile sql statement
                    cur = conn.cursor()
                    cur.execute(sql)
                    cur.close()
                    # clean up the table by removing...
                    # rows that don't have a part_number,
                    # rows that have part_number's less than 3 characters
                    sql = """
                        DELETE FROM ...
                    """ % (temptable)
                    # execute the delete query
                    cur = conn.cursor()
                    cur.execute(sql)
                    cur.close()
                    # query the temptable to see if we have any records after the import
                    sql = "SELECT COUNT(0) AS total FROM %s" % (temptable)
                    # execute the count query
                    cur = conn.cursor()
                    cur.execute(sql)
                    # get the count of how many records exist in the database after the import
                    number_of_line_items = cur.fetchall()
                    cur.close()
                    # get the current lines imported
                    current_lines_imported = number_of_line_items[0][0] - previous_lines_imported
                    # add the current lines imported to the total lines imported
                    total_lines_imported += current_lines_imported
                    # update distributor_file_settings table last_updated_on field
                    sql = """
                        UPDATE ...
                    """ % (file_id,distributor__id)
                    print sql
                    # execute the update query
                    cur = conn.cursor()
                    cur.execute(sql)
                    cur.close()
                    # commit the transaction
                    conn.commit()
                except:
                    conn.rollback()
    # no records exist for this distributor
    else:
        print "dist doesn't exist"
    cursor.close()

import_queued_file(db,42)

# prepare a cursor object using cursor() method
cursor = db.cursor()

# select distinct file settings
sql = """
    SELECT ...
"""

# disconnect from server
db.close()
After reviewing the code again and again, the issue turned out to be the table engine. MyISAM tables do not support transactions, so the rollback was silently ignored; after changing the engine to InnoDB, it worked as expected.
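For anyone hitting the same thing, a small sketch of how to check the engine of your tables and convert one to InnoDB so rollback() actually takes effect; the connection values and the table name temptable are placeholders:
import MySQLdb

db = MySQLdb.connect("localhost", "user", "password", "mydb")
cur = db.cursor()
# list each table with its storage engine
cur.execute("SHOW TABLE STATUS")
for row in cur.fetchall():
    print row[0], row[1]  # table name, engine
# switch a placeholder table to a transactional engine
cur.execute("ALTER TABLE temptable ENGINE=InnoDB")
cur.close()
db.close()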