So I have a MySQL table and I am trying to take each element from one of the fields of the table. The field and the table are both called "keywords". The field contains many different random words, and I am trying to take all of those and save them to a text file. Any help on how to implement this would be great; here is what I have so far.
#!/usr/bin/env python
import MySQLdb
db = MySQLdb.connect(host="", user="", passwd="", db="")
cursor = db.cursor()
sql = """SELECT DISTINCT keywords FROM keywords"""
cursor.execute(sql)
cursor.fetchall()
db.close()
for s in sql:
    tweets = open("keywords.txt", "w")
What I was thinking is to turn whatever the query fetches into a list, if possible, and write that to the file. But I am open to any suggestions, thanks.
Something like this should work:
import MySQLdb
db = MySQLdb.connect(host="", user="", passwd="", db="")
cursor = db.cursor()
sql = """SELECT DISTINCT keywords FROM keywords"""
tweets = open("keywords.txt", "w")
cursor.execute(sql)
for row in cursor:
    print >> tweets, row[0]
tweets.close()
db.close()
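A small variant of the same idea, in case you prefer a with block so the file is closed automatically even if something fails (only the file handling changes; the query is the one from the question):
import MySQLdb

db = MySQLdb.connect(host="", user="", passwd="", db="")
cursor = db.cursor()
cursor.execute("SELECT DISTINCT keywords FROM keywords")

# the with block closes keywords.txt automatically when it ends
with open("keywords.txt", "w") as tweets:
    for row in cursor:
        tweets.write(row[0] + "\n")

db.close()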
I have the code below, which selects some data from one table, finds related data in another table, updates the data in the related table, and deletes the data from the first table. The use of cursors below works, but I am not sure it is best. Do I need to define a new cursor(x) = db.cursor() each time like this?
db = MySQLdb.connect(host=cred.host, user=cred.user, password=cred.password,
                     db=cred.db, port=cred.port)
cursor = db.cursor()
cursor.execute("SELECT * FROM tbl_sqs order by timeOfOfferChange ASC limit 200")
for reprice in cursor.fetchall():
    # do initial processing of data retrieved from tbl_sqs
    # select the current value(s) from tbl_inventory_data that are for the same product from the same seller
    cursor2 = db.cursor()  # prepare a cursor object using cursor() method
    cursor2.execute("SELECT * FROM tbl_inventory_data WHERE `asin`=%s and `user`=%s", (ASIN, SellerId))
    db.commit()
    for row in cursor2.fetchall():  # iterate over inventory items
        cursor3 = db.cursor()  # prepare a cursor object using cursor() method
        cursor3.execute("UPDATE tbl_inventory_data SET `…..WHERE `seller-sku`=%s AND `user`=%s"))
        db.commit()
    cursor4 = db.cursor()
    cursor4.execute("DELETE FROM tbl_sqs WHERE MessageId=%s", (message_id,))  # delete the message just processed
    db.commit()
You don't need to create a new cursor for every query. It's best to create a cursor once and reuse it until you are done with the database. Each cursor creation has some overhead, and in big or busy databases it might cause problems.
Also, when you are done with the database, it is good to close the cursor:
cursor.close()
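For example, a minimal sketch of that advice applied to the code above, reusing a single cursor (ASIN, SellerId and message_id come from the processing steps that are elided in the question, and the UPDATE is left as a comment for the same reason):
cursor = db.cursor()
cursor.execute("SELECT * FROM tbl_sqs ORDER BY timeOfOfferChange ASC LIMIT 200")
messages = cursor.fetchall()  # pull the outer result set into memory first
for reprice in messages:
    # ... initial processing of the tbl_sqs row, as in the question ...
    cursor.execute("SELECT * FROM tbl_inventory_data WHERE `asin`=%s AND `user`=%s",
                   (ASIN, SellerId))
    inventory_rows = cursor.fetchall()
    for row in inventory_rows:
        # run the UPDATE from the question here, on the same cursor
        pass
    cursor.execute("DELETE FROM tbl_sqs WHERE MessageId=%s", (message_id,))
    db.commit()
cursor.close()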
I am trying to read a GeoJSON file and insert the records into a Postgres table, using the Python code below.
import json
import psycopg2
conn = psycopg2.connect(host="<<ip_address>>", database="DB1", user="<<id>>", password="pwd")
cur = conn.cursor()

with open('NTA_shape.json') as f:
    Geojson_data = json.load(f)

for feature in Geojson_data['features']:
    type_val = feature['geometry']['type']
    geom = feature['geometry']['coordinates']
    ntaname = feature['properties']['NTAName']
    boroname = feature['properties']['BoroName']
    data = {"type": type_val, "coordinates": geom}
    sql = """INSERT INTO <<Table_NAME>> (geom, ntaname, boroname) VALUES (ST_GeomFromGeoJSON(%s), %s, %s)"""
    nta_boro = (json.dumps(data), ntaname, boroname)
    cur.execute(sql, nta_boro)
    conn.commit()

conn.close()
But when I query the table, a lot of records are missing.
If I print the json.dumps(data) variable, it shows all the records.
I am not sure what I am missing during the table insert.
Kindly help.
I was able to fix it with the change below:
nta_boro=(json.dumps(data,),ntaname,boroname)
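If it helps anyone else debug the same symptom, one way to check how many rows actually landed is to count them right after the loop and compare against the number of features in the file (the table name is a placeholder, as in the question):
cur.execute("SELECT count(*) FROM <<Table_NAME>>")
inserted = cur.fetchone()[0]
print(len(Geojson_data['features']), "features in file,", inserted, "rows in table")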
I am using Python 2.7 and pyodbc to talk to a PostgreSQL database. The execute statement freezes and does not return when I use a prepared statement. If I use straight SQL, it works fine. Any ideas?
Here is what I have in code:
import pyodbc

DBconn = pyodbc.connect("DSN=devDB;UID=dev;PWD=dev", autocommit=True)
cursor = DBconn.cursor()
sql = """ select distinct age from user where name = (?) """
params = ('john',)
cursor.execute(sql, params)
#works fine
#sql = """ select distinct age from user where name = 'john' """
#cursor.execute(sql)
row = cursor.fetchone()
print(row)
cursor.close()
DBconn.close()
PS: This query should only return one row, and the data is really small.
I have a simple single table in Python 2.7 SQLite. I just want to port the table to an external .csv file.
I have been reading a few tutorials, and they write gobs and gobs of code to do this.
It seems like there should be a simple way to call up the table ('SELECT * FROM Table') and save it to .csv.
Thanks
You could also use the csv module for output, especially if your string fields contain commas.
#!/usr/bin/python3
import sqlite3
connection = sqlite3.connect('example_database')
cursor = connection.cursor()
cursor.execute('drop table if exists example_table')
cursor.execute('create table example_table(string varchar(10), number int)')
cursor.execute('insert into example_table (string, number) values(?, ?)', ('hello', 10))
cursor.execute('insert into example_table (string, number) values(?, ?)', ('goodbye', 20))
cursor.close()
cursor = connection.cursor()
cursor.execute('select * from example_table')
for row in cursor.fetchall():
    print('{},{}'.format(row[0], row[1]))
cursor.close()
connection.close()
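Since the answer mentions the csv module but the snippet above formats the rows by hand, here is a rough variant that writes the same query result with csv.writer (the output file name is made up):
#!/usr/bin/python3
import csv
import sqlite3

connection = sqlite3.connect('example_database')
cursor = connection.cursor()
cursor.execute('select * from example_table')

with open('example_table.csv', 'w', newline='') as out_file:
    writer = csv.writer(out_file)          # handles quoting when fields contain commas
    writer.writerow(['string', 'number'])  # header row
    writer.writerows(cursor.fetchall())

cursor.close()
connection.close()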
I have this code in Python:
import sqlite3

conn = sqlite3.connect("people.db")
cursor = conn.cursor()
sql = 'create table if not exists people (id integer, name VARCHAR(255))'
cursor.execute(sql)
conn.commit()
sql = 'insert into people VALUES (3, "test")'
cursor.execute(sql)
conn.commit()
sql = 'insert into people VALUES (5, "test")'
cursor.execute(sql)
conn.commit()
print 'Printing all inserted'
cursor.execute("select * from people")
for row in cursor.fetchall():
    print row
cursor.close()
conn.close()
But it seems it never saves to the database; there are always the same elements in the db, as if nothing was being saved.
On the other hand, if I try to access the db file via sqlite I get this error:
Unable to open database "people.db": file is encrypted or is not a database
I found in some other answers to use conn.commit instead of conn.commit(), but it does not change the results.
Any idea?
Bingo! People, I had the same problem, and one of the reasons was very simple. I am using Debian Linux, and the error was:
Unable to open database "people.db": file is encrypted or is not a database
The database file was in the same dir as my Python script. The connect line was:
conn = sqlite3.connect('./testcases.db')
I changed this to:
conn = sqlite3.connect('testcases.db')
No dot and slash. Error fixed, everything works.
If someone thinks this is useful, you're welcome.
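As a quick sanity check for this kind of path mix-up, you can print which file the script is actually going to open before connecting (just a debugging sketch, using the file name from this answer):
import os
import sqlite3

db_path = 'testcases.db'
print(os.getcwd())               # directory that relative paths are resolved against
print(os.path.abspath(db_path))  # the file sqlite3.connect() will open (or create)
conn = sqlite3.connect(db_path)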
This seems to work alright for me ("In database" increases on each run):
import random, sqlite3
conn = sqlite3.connect("people.db")
cursor = conn.cursor()
sql = 'create table if not exists people (id integer, name VARCHAR(255))'
cursor.execute(sql)
for x in xrange(5):
    cursor.execute('insert into people VALUES (?, "test")', (random.randint(1, 10000),))
conn.commit()
cursor.execute("select count(*) from people")
print "In database:", cursor.fetchone()[0]
You should commit after making changes, i.e.:
myDatabase.commit()
Can you open the db with a tool like SQLite Administrator? This would prove the db format is OK.
If I search for that error, the solutions point to version issues between SQLite and the db driver used. Maybe you can check your versions, or AKX could post the working combination.
Regards, khz