Use of variables in postgres script - python

I want to use a variable in my SQL script. The variable's value is 100 (just a number). I have stored it as a .csv file in this directory: C:\Users\Dino\Desktop\my_file.csv.
In the SQL script I want to run this:
import os
from ask_db import ask_db_params  # this script creates the connection to the database
import sys, os

def my_function(cur, conn):
    sql = """
    \set outputdir 'C:\\Users\\Dino\\Desktop'
    \set new_var :outputdir '\\my_file.csv'
    copy my_file to :'new_var';"""
    cur.execute(sql)
    conn.commit()

if __name__ == '__main__':
    try:
        conn = ask_db_params()
        cur = conn.cursor()
        analysis_data(cur, conn)
        logging.info('Data analysed.')
    except Exception as e:
        logging.error('Failure', exc_info=True)
        exit(1)
I have the error:
syntax error at or near "\" ....
It refers to the first line.
Any help regarding the syntax?
P.S. I'm running Python to call the SQL script, on Windows OS.

That won't work. \set is a psql command, not an SQL command.
You will have to use string manipulation in Python to construct an SQL string that looks like this:
COPY my_file TO 'C:\Users\Dino\Desktop\my_file.csv'
Then use execute() with that SQL string.
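For example, a minimal sketch using the paths from the question (note that COPY ... TO writes the file from the database server process, so the path must be accessible to the server):

import os

def my_function(cur, conn):
    outputdir = r'C:\Users\Dino\Desktop'
    target = os.path.join(outputdir, 'my_file.csv')
    # Build the final statement in Python; the path ends up as an ordinary SQL string literal.
    cur.execute("COPY my_file TO '%s'" % target)
    conn.commit()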

Related

Python and sqlite3 simplest example not working

I'm trying to execute this simple Python script, but it seems to do nothing: I don't get any error, and when I run the query directly in sqlite3 it works. I have no idea why it isn't working; can anyone help me?
import sqlite3 as lite
import sys

con = None
try:
    con = lite.connect('/home/pi/Moranberries/web/moranberries.db')
    cur = con.cursor()
    cur.execute("INSERT INTO sensor_interior (temperatura,humedad) VALUES (111,222)")
except lite.Error, e:
    print "Error %s:" % e.args[0]
    sys.exit(1)
finally:
    if con:
        con.close()
To execute this script I named it prueba.py and call it from the terminal like this:
python prueba.py
There is no error message.
You're not committing your changes to the DB. If you call con.commit() after cur.execute, it should write the changes.
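That is, the relevant part of the script becomes:

cur.execute("INSERT INTO sensor_interior (temperatura,humedad) VALUES (111,222)")
con.commit()  # without this, the change is discarded when the connection closes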

psycopg2 - Issue with for loop when trying to insert data to postgres table - eval() arg 1 must be a string or code object

I have come to my wits' end with this one.
So I created a script that captures data from the operating system and stores it in a dictionary. The next step should be for that data to be inserted into a postgres database. However, I am getting the following error when the for loop that iterates over the data is executed.
/usr/bin/python2.7 /home/gmastrokostas/PycharmProjects/learning/Violent_Python.py
Traceback (most recent call last):
  File "/home/gmastrokostas/PycharmProjects/learning/Violent_Python.py", line 52, in <module>
    for db_loop in eval(server_info):
TypeError: eval() arg 1 must be a string or code object
Below is the postgres table script and also the Python script itself. The name of the table is servers and the columns are "hostname", "OS", "RAM", "CPU".
SQL SCRIPT TO CREATE THE TABLE - PostgreSQL 9.4
CREATE TABLE servers
(
  hostname character varying(50),
  "OS" character varying(25),
  "RAM" numeric(4,2),
  "CPU" character varying(50)
)
WITH (
  OIDS=FALSE
);
ALTER TABLE servers
  OWNER TO dbuser;
PYTHON SCRIPT
#!/usr/bin/python
import psutil
import os
import math
import platform
import subprocess
import socket
import psycopg2
from decimal import *

# Used by the convert_values function.
factor = 1024

def Host_name():
    hostname = socket.gethostname()
    return hostname

def OS_make():
    # CPU INFORMATION.
    # cpu_version = platform.dist()[0]+" "+platform.dist()[1]
    cpu_version = platform.dist()[0]
    return cpu_version

def Virtual_memory_usage():
    cvr_info = psutil.virtual_memory().total
    return round(cvr_info, 2)

def convert_values():
    cvr_info = Virtual_memory_usage()
    i = int(math.log(cvr_info)/math.log(factor))
    result = float(cvr_info)/factor**i
    return result

def get_processor_info():
    if platform.system() == "Windows":
        return platform.processor()
    elif platform.system() == "Darwin":
        return subprocess.check_output(['/usr/sbin/sysctl', "-n", "machdep.cpu.brand_string"]).strip()
    elif platform.system() == "Linux":
        command = "cat /proc/cpuinfo | grep 'model name' | head -1 | awk -F: '{print $2}'"
        return subprocess.check_output(command, shell=True).strip()
    return ""

# Adjusting precision and then appending to list information taken from Virtual_memory_usage function
mem_frmt = "{0:.2f}".format(convert_values())

server_info = {'hostname': Host_name(), 'OS': OS_make(), 'RAM': mem_frmt, 'CPU': get_processor_info()}

conn = psycopg2.connect("host='10.0.0.41' dbname='serverinfo' user='dbuser'")
cur = conn.cursor()

for db_loop in eval(server_info):
    print db_loop  # For testing
    cur.execute("""INSERT INTO servers(hostname,OS,RAM,CPU) VALUES('%s','%s','%s','%s')""" % \
                (db_loop['hostname'], db_loop['OS'], db_loop['RAM'], db_loop['CPU'])
                # FOR TESTING BUT IT DOES NOT WORK EITHER
                # cur.execute("""INSERT INTO servers(hostname) VALUES('%s')""" % \
                #             (db_loop['hostname'])
                )
conn.commit()
The issue is that you are trying to eval a dictionary. The eval function compiles and evaluates code, i.e. a string. I'm not sure what your intention was in using eval there.
The good news is that you don't need to invoke eval. And even better news is that I don't see you need a loop at all.
You have constructed a dictionary of named data to insert. All you need to do is open the connection, get a cursor and insert the server_info data, similar to what you're doing with the db_loop line.
There are multiple errors in your code. First of all, you want to insert a single row of data (server_info), so you don't need the loop. Also, the call to eval() doesn't make any sense, because that function is used to evaluate Python code in a string and you're passing it a dictionary as an argument.
Then, you should not quote the SQL parameters yourself (e.g., use "%s" and not "'%s'").
And finally, given that you already have the data in a dictionary, use that to provide bound variables to the execute() method.
I would write the code as follows:
server_info = {'hostname': Host_name(), 'OS':OS_make(), 'RAM':mem_frmt, 'CPU':get_processor_info()}
conn = psycopg2.connect("host='10.0.0.41' dbname='serverinfo' user='dbuser'")
curs = conn.cursor()
curs.execute("""INSERT INTO servers (hostname,"OS","RAM","CPU") VALUES (%(hostname)s,%(OS)s,%(RAM)s,%(CPU)s)""", server_info)
conn.commit()
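Note the double quotes around "OS", "RAM" and "CPU": since the table was created with quoted, mixed-case column names, PostgreSQL treats them as case-sensitive, and an unquoted OS would be folded to os and reported as a non-existent column.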

psycopg2 across python files

I am writing a Python application (console based) that makes use of a PostgreSQL database (via psycopg2) and R (via rpy). It is a large procedure-based application that involves several steps, sometimes repeats steps, and does not always involve all of them.
The layout I have is the following:
main_file.py
modules/__init__.py
modules/module1.py
modules/module2.py
functions/__init__.py
functions/function1.py
functions/function2.py
The __init__ files just state import module1, module2 or import function1, function2, depending on which __init__ file it is.
The content of the main_file.py looks something like this:
import modules
from functions import function1

class myClass():
    def my_function(self):
        scripts = [
            # modules.module1.function,
            modules.module2.function,
        ]
        print "Welcome to the program."
        function1.connect()
        for i in scripts:
            i
        cur.close()
        print "End of program"

if __name__ == '__main__':
    myClass().my_function()
The reason for the loop is so I can comment out certain steps if I don't need them. The connect() function I'm trying to call creates the psycopg2 connection. It looks like this (inside the function1.py file):
import sys
import psycopg2

def connect():
    try:
        con = psycopg2.connect(database=dbname, user=dbuser)
        cur = con.cursor()
        db = cur.execute
    except psycopg2.DatabaseError, e:
        if con:
            con.rollback()
        print e
        sys.exit
In the main_file.py example I'm trying to run module2, which needs to connect to the database, using something like the following:
def function():
    db("SELECT * INTO new_table FROM old_table")
    con.commit()
How do I get Python (2.7) to recognise the global names db, cur and con, thus connecting to the database once and keeping the connection active through all steps of the program?
You should add a function to the module that initializes the DB and returns the created DB objects, and then have every module that wants to use the DB call that function:
function1.py
import sys
import psycopg2

con = cur = db = None

def connect():
    global con, cur, db
    try:
        con = psycopg2.connect(database=dbname, user=dbuser)
        cur = con.cursor()
        db = cur.execute
    except psycopg2.DatabaseError, e:
        if con:
            con.rollback()
        print e
        sys.exit

def get_db():
    if not (con and cur and db):
        connect()
    return (con, cur, db)
function2.py
import function1

con, cur, db = function1.get_db()

def function():
    db("SELECT * INTO new_table FROM old_table")
    con.commit()
There's no way to make certain variables global to every single module in a package. You have to explicitly import them from whatever module they live in, or return them from a function call.
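With that in place, any module can fetch the shared objects once and reuse them; a minimal usage sketch, assuming the layout from the question:

from functions import function1

con, cur, db = function1.get_db()  # connects on the first call, reuses the connection afterwards
db("SELECT * INTO new_table FROM old_table")
con.commit()
cur.close()
con.close()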

How to execute code when a Python script is closed out?

I have a Raspberry Pi that records temperatures and stores them in a MySQL database on my website. I often toy with the script, so I am hitting Ctrl+C on the running script and re-executing it. I would like to properly issue close() on the database connection. How can I run a line of code when the script exits in Python?
import MySQLdb
con = MySQLdb.connect(...)
cursor = con.cursor()
# ...
#if script closes:
#con.close()
import MySQLdb

con = MySQLdb.connect(...)
cursor = con.cursor()
try:
    # do stuff with your DB
finally:
    con.close()
The finally clause is executed on success as well as on error (exception).
If you hit Ctrl-C, you get a KeyboardInterrupt exception.
or:
import atexit

def close_db(con):
    con.close()
    # any other stuff you need to run on exiting

import MySQLdb
con = MySQLdb.connect(...)
# ...
atexit.register(close_db, con=con)
See the atexit module documentation for more info.
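Note that atexit handlers run on any normal interpreter exit, including sys.exit() and an unhandled exception such as the KeyboardInterrupt raised by Ctrl-C, but not when the process is killed by an unhandled signal or via os._exit().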

can't execute '.read create.sql'

This might be an obvious error, but I'm trying to create a database within Python from a script I've already created.
conn = sqlite3.connect('testDB')
c = conn.cursor()
c.execute('.read create.sql')
This gives an error "sqlite3.OperationalError: near ".": syntax error"
If I do the same thing at the sqlite3 command line, it works fine:
[me#myPC ~]$ sqlite3 testDB
SQLite version 3.3.6
Enter ".help" for instructions
sqlite> .read create.sql
sqlite>
It seems that any commands that start with a . give me problems.
Just pass the content of the file to the executescript method (unlike execute, which runs a single statement, executescript runs all of the ;-separated statements in the string):
conn = sqlite3.connect('testDB')
c = conn.cursor()
SQL = open('create.sql').read()
c.executescript(SQL)
Commands starting with . are interpreted by the sqlite3 CLI client itself, not by the backend. So there is no way to run them through the Python API; you have to read the file and execute its queries yourself, i.e. in Python.
