How to insert Python logs into a PostgreSQL table?

I need to insert the logs from my test case into a table in a PostgreSQL database.
I was able to connect to the DB, but I can't figure out how to insert the log line into the table. I have tried the code below, but it doesn't work:
import logging
import psycopg2
from io import StringIO
from config import config

params = config()
conn = psycopg2.connect(**params)
print(conn)
curr = conn.cursor()
try:
    if not hw.has_connection():
        logging.error('Failure: Unable to reach website! ==> ' + str(driver.find_element_by_xpath('//span[@jsselect="heading" and @jsvalues=".innerHTML:msg"]').text))
        return
    elif hw.is_websiteReachable():
        logging.info("Success: website is reachable!")
        curr.execute("""INSERT INTO db_name(logs) VALUES (%s)""", ("Success: website is reachable!"))
        conn.commit()
except:
    logging.error("Failure: Unable to reach website!")
    return
I am a total beginner at this. I have searched, but I couldn't find a clear example or guide about it. The above code throws the exception even though the website is reachable. Sorry if I sound dumb.

It looks like you're incorrectly constructing your SQL statement. Instead of INSERT INTO db_name(table_name) ... it should be INSERT INTO table_name(column_name) .... If you've correctly connected to the appropriate database in your connection settings, you usually don't have to specify the database name each time you write your SQL.
Therefore I would recommend the following modification (assuming your table is called logs and it has a column named message):
# ...
sql = 'INSERT INTO logs(message) VALUES (%s);'
msg = 'Success: website is reachable!'
curr.execute(sql, (msg,))  # note the trailing comma: the second argument must be a sequence, not a bare string
conn.commit()
You can read the psycopg2 docs for more information on passing parameters to your SQL queries in Python.
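If you want every log record written to the table automatically, one option is a custom logging handler. Below is a minimal sketch, assuming the logs(message) table from above and a psycopg2 connection built the same way as in the question; the PostgresHandler name is made up for illustration.

import logging
import psycopg2
from config import config

class PostgresHandler(logging.Handler):
    # hypothetical handler: writes each formatted log record into logs(message)
    def __init__(self, conn):
        super().__init__()
        self.conn = conn

    def emit(self, record):
        try:
            with self.conn.cursor() as cur:
                cur.execute('INSERT INTO logs(message) VALUES (%s);',
                            (self.format(record),))
            self.conn.commit()
        except Exception:
            self.handleError(record)

conn = psycopg2.connect(**config())
logging.getLogger().addHandler(PostgresHandler(conn))
logging.error("Failure: Unable to reach website!")  # now also lands in the table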

You can check a good solution that I personally use in my in-server projects: you just give a connection string to the CRUD object and everything is handled for you. For Postgres you can use:
'postgresql+psycopg2://username:password@host:port/database'
or
'postgresql+pg8000://username:password@host:port/database'
For more details, check the SQLAlchemy Engine Configuration docs.
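For example, a minimal SQLAlchemy sketch (the logs table and message column are placeholders carried over from the answer above):

from sqlalchemy import create_engine, text

engine = create_engine('postgresql+psycopg2://username:password@host:port/database')
with engine.begin() as conn:  # the transaction commits automatically when the block exits
    conn.execute(text('INSERT INTO logs(message) VALUES (:msg)'),
                 {'msg': 'Success: website is reachable!'})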

Related

PostgreSQL Queries using Python

I am trying to access tables from a database using Python. There was some code on the website: https://rnacentral.org/help/public-database
import psycopg2.extras

def main():
    conn_string = "host='hh-pgsql-public.ebi.ac.uk' dbname='pfmegrnargs' user='reader' password='NWDMCE5xdipIjRrp'"
    conn = psycopg2.connect(conn_string)
    cursor = conn.cursor(cursor_factory=psycopg2.extras.DictCursor)
    # retrieve a list of RNAcentral databases
    query = "SELECT * FROM rnc_database"
    cursor.execute(query)
    for row in cursor:
        print(row)
When I run this code, I get back a list of databases. I want to access tables from one of these databases, but I don't know what the schema for those tables is or what the values in each returned row represent. I have been looking at 'PostgreSQL to Python' resources, but all of them are about accessing tables when you already know the names of the tables and their columns. Is there code for how I can access the table names from the database?
Thank you
Edit: sorry, I thought I had linked the website before.
The dataset you want to use has a schema diagram here: https://rnacentral.org/help/public-database
For general-purpose exploration I would use a tool like https://dbeaver.io/; it will show you all the schemas in the DB, the tables inside each schema, and so forth. (In DBeaver you would enter the same host, database, user, and password as in your script.)
If you want to keep using a Python script to explore the DB, this SQL query should help:
SELECT *
FROM pg_catalog.pg_tables
WHERE schemaname != 'pg_catalog' AND
schemaname != 'information_schema';
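A short sketch of running that query over the same connection as in the question (illustrative only):

import psycopg2

conn = psycopg2.connect("host='hh-pgsql-public.ebi.ac.uk' dbname='pfmegrnargs' user='reader' password='NWDMCE5xdipIjRrp'")
cursor = conn.cursor()
cursor.execute("""
    SELECT schemaname, tablename
    FROM pg_catalog.pg_tables
    WHERE schemaname != 'pg_catalog' AND schemaname != 'information_schema';
""")
for schema, table in cursor:
    print(schema, table)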

Python pyodbc SQLite SQL injections

I use pyodbc in my Python Flask project for the SQLite DB connection.
I know and understand SQL injections, but this is my first time dealing with them, so I tried to execute some injections against my own code.
I have a function in my database.py file that builds the SQL string by concatenation:
def open_issue(self, data_object):
    cursor = self.conn.cursor()
    # data_object is the issue i get from the user
    name = data_object["name"]
    text = data_object["text"]
    rating_sum = 0
    # if the user provides an issue
    if name:
        # check if issue is already in db
        test = cursor.execute(f'''SELECT name FROM issue WHERE name = "{name}"''')
        data = test.fetchall()
        # if not in db insert
        if len(data) == 0:
            # insert the issue
            cursor.executescript(f'''INSERT INTO issue (name, text, rating_sum)
                VALUES ("{name}", "{text}", {rating_sum})''')
        else:
            print("nothing inserted!")
In the api.py file, the open_issue() function gets called:
@self.app.route('/open_issue')
def insertdata():
    # data sent from client
    # data_object = flask.request.json
    # unit test dictionary
    data_object = {"name": "injection-test-table",
                   "text": "'; CREATE TABLE 'injected_table-1337';--"}
    DB().open_issue(data_object)
The "'; CREATE TABLE 'injected_table-1337';--" sql injection has not created the injected_table-1337, instead it got inserted normally like a string into the text column of the injection-test-table.
So i don't really know if i am safe for the standard ways of SQL injection (this project will only be hosted locally but good security is always welcome)
And secondary: are there ways with pyodbc to check if a string contains sql syntax or symbols, so that nothing will get inserted in my example or do i need to check the strings manually?
Thanks a lot
As it turns out, with SQLite you are at much less risk of SQL injection issues because, by default, neither Python's built-in sqlite3 module nor the SQLite ODBC driver allows multiple statements to be executed in a single .execute call (commonly known as an "anonymous code block"). This code:
thing = "'; CREATE TABLE bobby (id int primary key); --"
sql = f"SELECT * FROM table1 WHERE txt='{thing}'"
crsr.execute(sql)
throws this for sqlite3
sqlite3.Warning: You can only execute one statement at a time.
and this for SQLite ODBC
pyodbc.Error: ('HY000', '[HY000] only one SQL statement allowed (-1) (SQLExecDirectW)')
Still, you should follow best practice and use a proper parameterized query:
thing = "'; CREATE TABLE bobby (id int primary key); --"
sql = "SELECT * FROM table1 WHERE txt=?"
crsr.execute(sql, (thing, ))
because this will also correctly handle parameter values that would cause errors if injected directly, e.g.,
thing = "it's good to avoid SQL injection"

Bulk Insert into SQL Server with Python not working

I'm attempting to bulk insert a CSV into a table in SQL Server. The catch is, the data doesn't match the columns of the destination table. The destination table has several audit columns that are not found in the source file. The solution I found for this is to insert into a view instead. The code is pretty simple:
from sqlalchemy import create_engine
engine = create_engine('mssql+pyodbc://[DNS]')
conn = engine.connect()
sql = "BULK INSERT [table view] FROM '[source file path]' WITH (FIELDTERMINATOR = ',',ROWTERMINATOR = '\n')"
conn.execute(sql)
conn.close()
When I run the SQL statement inside of SSMS it works perfectly. When I try to execute it from inside a Python script, the script runs but no data winds up in the table. What am I missing?
Update: It turns out bulk inserting into a normal table doesn't work either.
Before closing the connection, you need to call commit(), or the SQL actions will be rolled back when the connection closes.
conn.commit()
conn.close()
It turns out that instead of using SQLAlchemy, I had to use pypyodbc. Not sure why this worked when the other way didn't. Example code can be found here: How to Speed up with Bulk Insert to MS Server from Python with Pyodbc from CSV
This works for me after checking the SQLAlchemy transactions reference. I don't explicitly call conn.commit(), as "the block managed by each .begin() method has the behavior such that the transaction is committed when the block completes."
with engine.begin() as conn:
    conn.execute(sql_bulk_insert)

[OpenERP] How to make an SQL query with another database user from Python code?

I need to run an SQL query in OpenERP, from Python code, as a user other than postgres that has only SELECT privileges. Is there a way for the cursor (cr) to receive a connection string?
OK, I found an easy solution for this. As OpenERP uses psycopg as its PostgreSQL database cursor, I explicitly created a psycopg connection object with the parameters I need:
conn = psycopg1.connect(database=cr.dbname, user=dbuser, password=dbpass)
cur = conn.cursor()
Be careful: if you want to use dictfetchall, you need to import psycopg1 (the compatibility module shipped with psycopg2):
from psycopg2 import psycopg1

cur.execute(sql)
res = cur.dictfetchall()

How to access a SQL Server 2008 stored procedure with a table-valued parameter in Python

I’m looking for a way to take a result set and use it to find records in a table that resides in SQL Server 2008, without spinning through the records one at a time. The result sets that will be used to find the records could number in the hundreds of thousands.

So far I am pursuing creating a table in memory using sqlite3 and then trying to feed that table to a stored procedure that takes a table-valued parameter. The work on the SQL Server side is done: the user-defined type is created, the test procedure accepting a table-valued parameter exists, and I’ve tested it through T-SQL and it appears to work just fine. In Python, a simple in-memory table was created through sqlite3.

Now the catch: the only documentation I have found for accessing a stored procedure with a table-valued parameter uses ADO.NET and VB, nothing in Python. Unfortunately, I’m not enough of a programmer to translate. Has anyone used a SQL Server stored procedure with a table-valued parameter from Python? Is there another approach I should look into?
Here are some links:
Decent explanation of table-valued parameters and how to set them up in SQL and use them from .NET:
http://www.sqlteam.com/article/sql-server-2008-table-valued-parameters
http://msdn.microsoft.com/en-us/library/bb675163.aspx#Y2142
Explanation of using ADO in Python (almost what I need, just missing the structured parameter type):
http://www.mayukhbose.com/python/ado/ado-command-3.php
My simple code:
--TSQL to create the type in the SQL database
create Type PropIDList as Table
(Prop_Id BigInt primary key)

--TSQL to create the stored procedure in the SQL database. Note the reference to the user-defined type.
create procedure PropIDListTest @PIDList PropIDList READONLY
as
SET NOCOUNT ON
select * from
@PIDList p
SET NOCOUNT OFF

--TSQL to test the objects.
--Declare a variable of the user-defined type (a table that has prop_id)
declare @pidlist as propidlist
--Populate the variable
insert into @pidlist(prop_id)
values(1000)
insert into @pidlist(prop_id)
values(2000)
--Pass the table variable to the stored procedure
exec PropIDListTest @pidlist
Now the tough part: Python.
Here is the code that creates the in-memory table:
import getopt, sys, string, os, tempfile, shutil
import _winreg, win32api, win32con
from win32com.client import Dispatch
from adoconstants import *
import sqlite3

conn1 = sqlite3.connect(':memory:')
c = conn1.cursor()
# Create table
c.execute('''create table PropList
             (PropID bigint)''')
# Insert a row of data
c.execute("""insert into PropList
             values (37921019)""")
# Save (commit) the changes
conn1.commit()
c.execute('select * from PropList order by propID')
# lets print out what we have to make sure it works
for row in c:
    print row
OK, and here is my attempt at calling the procedure through Python:
conn = Dispatch('ADODB.Connection')
conn.ConnectionString = "Provider=sqloledb.1; Data Source=nt38; Integrated Security = SSPI;database=pubs"
conn.Open()
cmd = Dispatch('ADODB.Command')
cmd.ActiveConnection = conn
cmd.CommandType = adCmdStoredProc
cmd.CommandText = "PropIDListTest @pidlist = ?"
param1 = cmd.CreateParameter('@PIDList', adUserDefined)  # I "think" the parameter type is the key, and yes, it is most likely wrong here.
cmd.Parameters.Append(param1)
cmd.Parameters.Value = conn1  # Yeah, this is probably wrong as well
(rs, status) = cmd.Execute()
while not rs.EOF:
    OutputName = rs.Fields("Prop_ID").Value.strip().upper()
    print OutputName
    rs.MoveNext()
rs.Close()
rs = None
conn.Close()
conn = None
# We can also close the cursor if we are done with it
c.close()
conn1.close()
I have coded TVPs from ADO.NET before.
Here is a question on TVPs in classic ADO that I am interested in: sql server - Classic ADO and Table-Valued Parameters in Stored Procedure - Stack Overflow. It does not give a direct answer, but it offers alternatives.
The XML option is easier; you have probably already considered it. It would require more server-side processing.
Here is the MSDN link for low-level ODBC programming of TVPs: Table-Valued Parameters (ODBC). This one is the closest answer if you can switch to ODBC.
You could also pass a CSV string to an nvarchar(max) parameter and then split it with a CLR SplitString function; that one is fast but has default behaviour I disagree with.
Please post back what works or does not work for you.
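For what it's worth, modern pyodbc (4.0.25 and later, used with a SQL Server ODBC driver) can pass a table-valued parameter directly as a list of row tuples. A rough sketch against the PropIDListTest procedure above; the driver name and connection details are placeholders:

import pyodbc

conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=nt38;DATABASE=pubs;Trusted_Connection=yes")
crsr = conn.cursor()
# each inner tuple is one row of the PropIDList table type
pid_rows = [(1000,), (2000,)]
crsr.execute("{CALL PropIDListTest (?)}", (pid_rows,))
for row in crsr.fetchall():
    print(row)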
