How to render data from PostgreSQL to CSV in a Python Flask app?

I am new to Python and trying to write some code. I am trying to run a SELECT query, but I am not able to render the data to a CSV file.
This is the psql query (it is cut off in the post):
\copy (
    SELECT
        sr.imei,
        sensors.label,
        sr.created_at,
        sr.received_at,
        sr.type_id,
But how do I write this in Python and render it to a CSV file?
Thanking you,
Vikas

sql = "COPY (SELECT * FROM sensor_readings WHERE reading=blahblahblah) TO STDOUT WITH CSV DELIMITER ';'"
with open("/tmp/sensor_readings.csv", "w") as file:
cur.copy_expert(sql, file)
I think you just need to change the sql for your use, and it should work.
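The same idea works for the poster's multi-line query. A minimal end-to-end sketch, assuming psycopg2 and placeholder connection details:

import psycopg2

conn = psycopg2.connect(database="routing_template", user="postgres",
                        host="localhost", password="xxxx")
cur = conn.cursor()
copy_sql = """
    COPY (
        SELECT sr.imei, sensors.label, sr.created_at, sr.received_at, sr.type_id, sr.data
        FROM sensor_readings AS sr
        LEFT JOIN sensors ON sr.imei = sensors.imei
    ) TO STDOUT WITH CSV HEADER
"""
with open("/tmp/sensor_readings.csv", "w") as f:
    cur.copy_expert(copy_sql, f)  # Postgres streams the CSV straight into the file
cur.close()
conn.close()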

Install psycopg2 via pip install psycopg2, then you need something like this:
import csv
import psycopg2

query = """
    SELECT
        sr.imei,
        sensors.label,
        sr.created_at,
        sr.received_at,
        sr.type_id,
        sr.data
    FROM sensor_readings AS sr
    LEFT JOIN sensors ON sr.imei = sensors.imei
    WHERE sr.imei NOT LIKE 'test%' AND sr.created_at > '2019-02-01'
    ORDER BY sr.received_at DESC
"""

conn = psycopg2.connect(database="routing_template", user="postgres", host="localhost", password="xxxx")
cur = conn.cursor()
cur.execute(query)

with open('result.csv', 'w') as f:
    writer = csv.writer(f, delimiter=',')
    for row in cur.fetchall():
        writer.writerow(row)

cur.close()
conn.close()
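Since the question is about a Flask app: instead of writing to a file on disk, you can build the CSV in memory and return it as a download. A minimal sketch, reusing the query string from above; the route and filename are illustrative:

import csv
import io

import psycopg2
from flask import Flask, Response

app = Flask(__name__)

@app.route('/sensor_readings.csv')
def sensor_readings_csv():
    # same placeholder connection details as in the answer above
    conn = psycopg2.connect(database="routing_template", user="postgres",
                            host="localhost", password="xxxx")
    cur = conn.cursor()
    cur.execute(query)
    buf = io.StringIO()
    csv.writer(buf).writerows(cur.fetchall())
    cur.close()
    conn.close()
    # serve the in-memory CSV as a file download
    return Response(
        buf.getvalue(),
        mimetype='text/csv',
        headers={'Content-Disposition': 'attachment; filename=result.csv'},
    )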

Related

"no such table: Sport" on exporting a table from sqlite db (django view)

In my Django view, after updating a table, I added this code to export that table to a CSV file:
import sqlite3 as sql
import os
import csv

# export data
print("Export data into csv file..............")
conn = sql.connect('sqlite3.db')  # I tried: db.sqlite3 -> same
cursor = conn.cursor()
cursor.execute("select * from Sport")
with open("heartrateai_data.csv", "w") as csv_file:
    csv_writer = csv.writer(csv_file, delimiter="\t")
    csv_writer.writerow([i[0] for i in cursor.description])
    csv_writer.writerows(cursor)
dirpath = os.getcwd() + "/heartrateai_data.csv"
print("Data exported successfully into {}".format(dirpath))
conn.close()
But it gives me the error: Exception Value: no such table: Sport.
I am sure the table name is correct because it is the same as in my model.py.
I am not sure whether the connection and connection-close lines are correct. I am new to this.
Edit 2:
I saw that the correct way to write the path is with 'E:\\...' or with r'E:...'. I wrote it like this in my code, conn = sql.connect(r'E:\Work\django\analysisData\db.sqlite3'), but I get the same error: "No such table: Sport".
Try this:
python manage.py makemigrations
python manage.py migrate
It may just be a Django migrations issue.
After I added these lines to my code:
con = sql.connect(r'E:\Work\django\analysisData\db.sqlite3')
cursor = con.cursor()
cursor.execute("SELECT name FROM sqlite_master WHERE type='table';")
print(cursor.fetchall())
I saw that the name of the table isn't the same as in model.py. The tables are named projectName_NameOfTable.
I modified the table name, and I don't get that error anymore.
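For reference, Django's default table name is the app label followed by the lowercased model name, so a model named Sport lands in a table like appname_sport. A quick check, with a hypothetical app label:

# Django default table name: <app_label>_<model name, lowercased>
cursor.execute("SELECT * FROM analysisData_sport")
print(cursor.fetchall())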

Python Error - no viable alternative input when trying to insert values from file

I'm trying to insert some values from a CSV file through Python, but I'm getting a no viable alternative at input error. When I specify the values instead of %s the code works, but when I try to use %s it fails. This is my code:
import jaydebeapi
import jpype
import pyodbc
import pandas as pd
import csv

conn = pyodbc.connect("myconnection")
cursor = conn.cursor()
with open('/Users/user/Desktop/TEST.csv') as f:
    reader = csv.reader(f)
    for row in reader:
        cursor.execute("INSERT INTO mytable (user_id, email) VALUES(%s,%s)", row)
# close the connection to the database.
conn.commit()
cursor.close()
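For what it's worth, pyodbc uses the qmark paramstyle, so placeholders should be ? rather than %s; the %s form is the kind of thing that triggers a parser error like this. A minimal sketch of the loop with that change, table and column names as in the question:

with open('/Users/user/Desktop/TEST.csv') as f:
    reader = csv.reader(f)
    for row in reader:
        # pyodbc expects "?" placeholders (qmark paramstyle)
        cursor.execute("INSERT INTO mytable (user_id, email) VALUES (?, ?)", row)
conn.commit()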

Python: CSV writer very slow, need advice to speed it up

I'm using a simple script to pull data from an Oracle DB and write the data to a CSV file using the csv writer.
The table I'm querying contains about 25k records. The script runs perfectly, except that it is very slow: it takes 25 minutes to finish.
How could I speed this up by altering the code? Any tips from you heroes are welcome.
#
# Load libraries
#
from __future__ import print_function
import cx_Oracle
import time
import csv

#
# Connect to Oracle and select the proper data
#
con = cx_Oracle.connect('secret')
cursor = con.cursor()
sql = "select * from table"

#
# Determine how and where the filename is created
#
path = "c:\\path\\"
filename = time.strftime("%Y%m%d-%H%M%S")
extensionname = ".csv"

csv_file = open(path + filename + extensionname, "w")
writer = csv.writer(csv_file, delimiter=',', lineterminator="\n",
                    quoting=csv.QUOTE_NONNUMERIC)
cursor.execute(sql)
for row in cursor:
    writer.writerow(row)
cursor.close()
con.close()
csv_file.close()
Did you try the writerows function from the csv module? Instead of writing each record one by one, it lets you write them all at once, which should speed things up.
data = [{'col1': 1, 'col2': 2}]  # data rows as dicts; replace with your real rows
with open('csv_file.csv', 'w', newline='') as csv_file:
    writer = csv.DictWriter(csv_file, fieldnames=data[0].keys())
    writer.writeheader()
    writer.writerows(data)
Alternatively, you can use the pandas module to write big chunks of data to a CSV file. This method is explained with examples here.
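For the cx_Oracle loop in the question, the same batching idea applies without building dicts: raise the cursor's arraysize and write whole batches with writerows. A sketch, assuming the connection and filename variables from the question:

cursor = con.cursor()
cursor.arraysize = 5000  # fetch this many rows per round trip to Oracle
cursor.execute(sql)
with open(path + filename + extensionname, "w", newline='') as csv_file:
    writer = csv.writer(csv_file, delimiter=',', quoting=csv.QUOTE_NONNUMERIC)
    while True:
        rows = cursor.fetchmany()  # returns up to cursor.arraysize rows
        if not rows:
            break
        writer.writerows(rows)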

Dump a MySQL table to a CSV file and save it in a given location using Python script

I used the following Python script to dump a MySQL table to a CSV file. But the file is saved in the same folder where the Python script lives. I want to save it in another folder. How can I do that? Thank you.
print('Writing database to csv file')

import MySQLdb
import csv
import time
import datetime
import os

currentDate = datetime.datetime.now().date()
user = ''
passwd = ''
host = ''
db = ''
table = ''

con = MySQLdb.connect(user=user, passwd=passwd, host=host, db=db)
cursor = con.cursor()
query = "SELECT * FROM %s;" % table
cursor.execute(query)
with open('Data on %s.csv' % currentDate, 'w') as f:
    writer = csv.writer(f)
    for row in cursor.fetchall():
        writer.writerow(row)
print('Done')
Change the open() call to use a full path:
with open('/full/path/tofile/Data on %s.csv' % currentDate, 'w') as f:
This solves your problem X. But you have a problem Y, namely: "How do I efficiently dump CSV data from MySQL without having to write a lot of code?"
The answer to problem Y is SELECT ... INTO OUTFILE.
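A minimal sketch of that approach, reusing the cursor from the question. Note that the file is written by the MySQL server on the server's filesystem, the MySQL user needs the FILE privilege, and the output path here is illustrative:

outfile_query = """
    SELECT * FROM %s
    INTO OUTFILE '/var/lib/mysql-files/%s.csv'
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
""" % (table, table)
cursor.execute(outfile_query)  # MySQL itself writes the CSV, no Python loop needed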

Python, converting CSV file to SQL table

I have a CSV file without headers and am trying to create a SQL table from certain columns in the file. I tried the solutions given here: Importing a CSV file into a sqlite3 database table using Python, but I keep getting an error that col1 is not defined. I then tried inserting headers into my CSV file and am still getting a KeyError.
Any help is appreciated! (I am not very familiar with SQL at all.)
If the .csv file has no headers, you don't want to use DictReader; DictReader assumes line 1 is a set of headers and uses them as keys for every subsequent line. This is probably why you're getting KeyErrors.
A modified version of the example from that link:
import csv, sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE t (col1, col2);")
with open('data.csv', 'r', newline='') as fin:
    dr = csv.reader(fin)
    dicts = ({'col1': line[0], 'col2': line[1]} for line in dr)
    to_db = ((i['col1'], i['col2']) for i in dicts)
    cur.executemany("INSERT INTO t (col1, col2) VALUES (?, ?);", to_db)
con.commit()
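Conversely, if the file does have a header row, a csv.DictReader version works, provided the dictionary keys match the header names exactly; a mismatch there is the usual cause of the KeyError mentioned in the question. A sketch assuming headers named col1 and col2 and the table from above (the filename is illustrative):

with open('data_with_headers.csv', 'r', newline='') as fin:
    dr = csv.DictReader(fin)  # keys are taken from the header row
    to_db = ((row['col1'], row['col2']) for row in dr)
    cur.executemany("INSERT INTO t (col1, col2) VALUES (?, ?);", to_db)
con.commit()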
The code below reads all the CSV files in a path and loads their data into an existing table. Note that LOAD DATA LOCAL INFILE is MySQL syntax, so it needs a MySQL connection (the original post mixed this up with sqlite3, which cannot run this statement):

import glob
import mysql.connector

# allow_local_infile is required for LOAD DATA LOCAL INFILE
cnx = mysql.connector.connect(user='user', host='localhost', password='password',
                              database='dbname', allow_local_infile=True)
cursor = cnx.cursor()
path = 'path/to/csv'
for file in glob.glob(path + "/*.csv"):
    add_csv_file = """LOAD DATA LOCAL INFILE '%s' INTO TABLE tablename
        FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n' IGNORE 1 LINES;""" % file
    print("add_csv_file: %s" % file)
    cursor.execute(add_csv_file)
cnx.commit()
cursor.close()
cnx.close()
Let me know if this works.
