I would like to access a SQL Server instance from Windows 10 using Python and pyodbc, not with the account I am currently logged into Windows with, but with a different Windows account.
Running the script with runas as a different user does not work, because the user that has access to the DB has no access to the Python directory, and I cannot add him to the users of my PC.
Is it possible?
So, I ended up solving this not using pyodbc but using pymssql instead. I'm not sure if that helps the OP, though. Just figured I'd share.
import dotenv, os, pymssql, win32security, win32con
from modules.utility import write_to_json

dotenv.load_dotenv()

def impersonate_user(username, password, domain):
    """Impersonate the security context of another user."""
    handler = win32security.LogonUser(
        username, domain, password,
        win32con.LOGON32_LOGON_INTERACTIVE,
        win32con.LOGON32_PROVIDER_DEFAULT)
    win32security.ImpersonateLoggedOnUser(handler)
    write_to_json(get_records_from_database(), 'JSON_FILE_NAME')
    handler.Close()

def terminate_impersonation():
    """Terminate the impersonation of another user."""
    win32security.RevertToSelf()

def connect_pymssql():
    con = pymssql.connect(
        server=os.getenv('DB_SERVER'),
        database=os.getenv('DB_NAME'),
        port=os.getenv('DB_PORT')
    )
    cur = con.cursor()
    return con, cur

def get_records_from_database():
    con, cur = connect_pymssql()
    cur.execute("select * from [TABLE];")
    result = cur.fetchall()
    con.close()
    return [list(x) for x in result] if result else None
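A rough usage sketch of how these functions fit together (the domain, username and password values here are only placeholders; the environment variables and the write_to_json helper are the ones from the code above):

# hypothetical credentials of the Windows account that has DB access
DOMAIN = 'MYDOMAIN'
USERNAME = 'other_windows_user'
PASSWORD = 'secret'

try:
    # runs get_records_from_database() under the other user's security context
    # and writes the result to JSON (done inside impersonate_user above)
    impersonate_user(USERNAME, PASSWORD, DOMAIN)
finally:
    # always revert to the original user's security context
    terminate_impersonation()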
Let me start off by saying I am extremely new to Python and PostgreSQL, so I feel like I'm in way over my head. My end goal is to get connected to the dvdrental database in PostgreSQL and be able to access/manipulate the data. So far I have:
created a .config folder with a database.ini inside that holds my login credentials.
in my src I have a config.py file that uses ConfigParser, see below:
from configparser import ConfigParser

def config(filename='.config/database.ini', section='postgresql'):
    # create a parser
    parser = ConfigParser()
    # read config file
    parser.read(filename)
    # get section, default to postgresql
    db = {}
    if parser.has_section(section):
        params = parser.items(section)
        for param in params:
            db[param[0]] = param[1]
    else:
        raise Exception('Section {0} not found in the {1} file'.format(section, filename))
    return db
Then, also in my src, I have a tasks.py file that has a basic connect function, see below:
import pandas as pd
from clients.config import config
import psycopg

def connect():
    """ Connect to the PostgreSQL database server """
    conn = None
    try:
        # read connection parameters
        params = config()
        # connect to the PostgreSQL server
        print('Connecting to the PostgreSQL database...')
        conn = psycopg.connect(**params)
        # create a cursor
        cur = conn.cursor()
        # execute a statement
        print('PostgreSQL database version:')
        cur.execute('SELECT version()')
        # display the PostgreSQL database server version
        db_version = cur.fetchone()
        print(db_version)
        # close the communication with the PostgreSQL
        cur.close()
    except (Exception, psycopg.DatabaseError) as error:
        print(error)
    finally:
        if conn is not None:
            conn.close()
            print('Database connection closed.')

if __name__ == '__main__':
    connect()
Now this runs and prints out the PostgreSQL database version, which is all well and great, but I'm struggling to figure out how to change the code so that it's more generalized and maybe just creates a cursor.
I need the connect function to basically just connect to the dvdrental database and create a cursor so that I can then use my connection to select from the database in other needed "tasks" -- for example I'd like to be able to create another function like the below:
def select_from_table(cursor, table_name, schema):
    cursor.execute(f"SET search_path TO {schema}, public;")
    results = cursor.execute(f"SELECT * FROM {table_name};").fetchall()
    return results
but I'm struggling with how to just create a connection to the dvdrental database & a cursor so that I'm able to actually fetch data and create pandas tables with it and whatnot.
so it would be like
task 1 is connecting to the database
task 2 is interacting with the database (selecting tables and whatnot)
task 3 is converting the result from 2 into a pandas df
Thanks so much for any help! This is for a project in a class I am taking, and I am extremely overwhelmed and have been googling and researching non-stop and seemingly end up nowhere fast.
The fact that you established the connection is honestly the hardest step. I know it can be overwhelming but you're on the right track.
Just copy these three lines from connect into the select_from_table method
params = config()
conn = psycopg.connect(**params)
cursor = conn.cursor()
It will look like this (the cursor parameter is no longer needed since the function now creates its own cursor; conn.close() is also added at the end):
def select_from_table(table_name, schema):
    params = config()
    conn = psycopg.connect(**params)
    cursor = conn.cursor()
    cursor.execute(f"SET search_path TO {schema}, public;")
    results = cursor.execute(f"SELECT * FROM {table_name};").fetchall()
    conn.close()
    return results
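For task 3 (turning a query result into a pandas DataFrame), a minimal sketch could look like the following; it reuses the config() helper and psycopg from above, and the table/schema arguments are whatever you want to query from dvdrental (the actor/public example at the bottom is only an illustration):

import pandas as pd
import psycopg

from clients.config import config

def table_to_dataframe(table_name, schema):
    """Fetch an entire table and return it as a pandas DataFrame."""
    params = config()
    with psycopg.connect(**params) as conn:
        with conn.cursor() as cur:
            cur.execute(f'SELECT * FROM "{schema}"."{table_name}";')
            rows = cur.fetchall()
            # column names come from the cursor description
            columns = [desc[0] for desc in cur.description]
    return pd.DataFrame(rows, columns=columns)

# usage, e.g. the actor table in the public schema of dvdrental:
# df = table_to_dataframe('actor', 'public')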
I am trying to read the database of my MariaDB server. I have set it up like this:
CREATE DATABASE database1;
CREATE USER RaspberryPi@'%' IDENTIFIED BY 'password';
GRANT ALL PRIVILEGES ON database1.* TO RaspberryPi@'%';
USE database1;
CREATE TABLE Users;
This is on my Raspberry Pi 4 with the IP 192.168.0.92. Now I have this Python script on my Windows computer:
import mysql.connector
mydb = mysql.connector.connect(host="192.168.0.92", user="RaspberryPi", passwd="password", database="database1")
mycursor = mydb.cursor()
mycursor.execute("SELECT * FROM Users")
print(mycursor)
Now I convert my Python script to an .exe file using PyInstaller. My problem is that if I give this file to other people, they can easily convert the .exe file back to the original .py file and then they have my login credentials. Can I code it somehow so that the username and password aren't exposed, or so that the script can't be converted back?
Thanks.
You can use this snippet:
# Module Imports
import mariadb
import sys

# Connect to MariaDB Platform
try:
    conn = mariadb.connect(
        user="db_user",
        password="db_user_passwd",
        host="192.0.2.1",
        port=3306,
        database="employees"
    )
except mariadb.Error as e:
    print(f"Error connecting to MariaDB Platform: {e}")
    sys.exit(1)

# Get Cursor
cur = conn.cursor()
For more reference, see https://mariadb.com/resources/blog/how-to-connect-python-programs-to-mariadb/
Update:
To make it more secure, you can prompt for the username and password as input instead of hardcoding them.
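A rough sketch of that idea (input() and getpass come from the standard library; the host, port and database values are taken from the question and only serve as an example):

import getpass
import sys

import mariadb

# prompt for credentials at runtime so they are not baked into the script / .exe
user = input("DB user: ")
password = getpass.getpass("DB password: ")  # hidden input

try:
    conn = mariadb.connect(
        user=user,
        password=password,
        host="192.168.0.92",
        port=3306,
        database="database1",
    )
except mariadb.Error as e:
    print(f"Error connecting to MariaDB: {e}")
    sys.exit(1)

cur = conn.cursor()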
I have written a Python tool with a wxPython GUI whose main task is to collect a lot of user input regarding customer data, product data and so on and save it to a SQL database, at the moment locally in a SQLite3 database for testing, and now I am switching to MS Azure so that everybody can work in the same database.
As I now plan to use an MS Azure SQL DB, I have a few questions and I am hoping this is the right place to ask:
What is the best library to connect to Azure via Python? I found pyodbc and pymssql, but I think both need an extra driver installed? Is this true, and is this a problem in real use cases?
I have many modules, like Manage_Customer.py and Manage_Factory.py and so on. In all of them I connect to my database. I have no module that acts as a SQL master and handles the overhead.
So my code looks like this most of the time:
import wx
import sqlite3

SQL_PATH = "Database_Test.db"

class ManageCustomerToDB(wx.Dialog):

    def __init__(self, *args, **kw):
        super(ManageCustomerToDB, self).__init__(*args, **kw)

    def InitUI(self):
        # [GUI and so on...]

        # I do this one time inside a module:
        conn = sqlite3.connect(SQL_PATH)
        self.c = conn.cursor()
        # Use functions like the ones below...

    def GetCustomerData(self):
        self.c.execute("SELECT * FROM Customer WHERE CustomerID = ?", (self.tc_customer_id.GetValue(),))
        customer_data = self.c.fetchall()
        # Do something with Customer Data

    def GetPersonData(self):
        self.c.execute("SELECT * FROM Person WHERE PersonID = ?", (self.tc_person_id.GetValue(),))
        person_data = self.c.fetchall()
        # Do something with Person Data
I hope this example shows what I do. Are there any big mistakes I am making?
After a read in SQL, don't I have to close the DB connection in some way?
Thanks for your help, and let me know if I can improve my question or give more details.
It is not a good idea to create a new connection to Azure SQL for every CRUD operation. This wastes resources, and once the number of accesses reaches a certain level it will have a noticeable impact on SQL Server performance.
I suggest you use a database connection pool. The pool manager initializes several connections to the SQL Server instance and then reuses these connections when requested.
There is an existing package you can take advantage of: DBUtils. You can use its PooledDB together with pyodbc.
A sample showing how a database connection pool works:
import pyodbc
from DBUtils.PooledDB import PooledDB

class Database:

    def __init__(self, server, driver, port, database, username, password):
        self.server = server
        self.driver = driver
        self.port = port
        self.database = database
        self.username = username
        self.password = password
        self._CreatePool()

    def _CreatePool(self):
        self.Pool = PooledDB(creator=pyodbc, mincached=2, maxcached=5, maxshared=3, maxconnections=6, blocking=True,
                             DRIVER=self.driver, SERVER=self.server, PORT=self.port, DATABASE=self.database,
                             UID=self.username, PWD=self.password)

    def _Getconnect(self):
        self.conn = self.Pool.connection()
        cur = self.conn.cursor()
        if not cur:
            raise Exception("connection error")
        else:
            return cur

    # query sql
    def ExecQuery(self, sql):
        cur = self._Getconnect()
        cur.execute(sql)
        relist = cur.fetchall()
        cur.close()
        self.conn.close()
        return relist

    # non-query sql
    def ExecNoQuery(self, sql):
        cur = self._Getconnect()
        cur.execute(sql)
        self.conn.commit()
        cur.close()
        self.conn.close()

def main():
    server = 'jackdemo.database.windows.net'
    database = 'jackdemo'
    username = 'jack'
    port = 1433
    password = '*********'
    driver = '{ODBC Driver 17 for SQL Server}'
    ms = Database(server=server, driver=driver, port=port, database=database, username=username, password=password)
    resList = ms.ExecQuery("select * from Users")
    print(resList)

if __name__ == '__main__':
    main()
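As a side note, if the SQL will contain user input, a parameterized variant of ExecQuery is safer than building the query string yourself. A small sketch of an extra method you could add to the Database class above (pyodbc uses "?" qmark placeholders):

    # parameterized query variant (sketch) - add to the Database class above
    def ExecQueryParams(self, sql, params=()):
        cur = self._Getconnect()
        cur.execute(sql, params)   # pyodbc substitutes the "?" placeholders
        relist = cur.fetchall()
        cur.close()
        self.conn.close()          # returns the connection to the pool
        return relist

Usage would then be something like ms.ExecQueryParams("select * from Users where Id = ?", (1,)).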
Answers to your questions:
Q1: What is the best library to connect to Azure via Python? I found pyodbc and pymssql, but I think both need an extra driver installed? Is this true, and is this a problem in real use cases?
Answer: Both of them would be OK. ODBC stands for Open Database Connectivity, so it can be used to connect to many databases. I see the Microsoft tutorial uses pyodbc, so maybe it is the better choice.
Q2: I have many modules, like Manage_Customer.py and Manage_Factory.py and so on. In all of them I connect to my database. I have no module that acts as a SQL master and handles the overhead.
Answer: Use a database connection pool; see the sketch below for how a single shared module could provide it.
Q3: After a read in SQL, don't I have to close the DB connection in some way?
Answer: If you use a database connection pool, the connection is put back into the pool after you call the close() method.
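A minimal sketch of such a shared "SQL master" module (the file names db_master.py and database.py as well as the connection values are hypothetical placeholders): every other module imports the one pooled Database instance instead of opening its own connections.

# db_master.py (hypothetical shared module)
from database import Database  # the pooled Database class shown above, assumed saved as database.py

db = Database(
    server='yourserver.database.windows.net',  # placeholder connection values
    driver='{ODBC Driver 17 for SQL Server}',
    port=1433,
    database='yourdb',
    username='youruser',
    password='yourpassword',
)

# Manage_Customer.py (usage sketch)
# from db_master import db
# rows = db.ExecQuery("SELECT * FROM Customer WHERE CustomerID = 1")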
It's more of a theoretical question, but I have been trying to find a correct answer for hours and I haven't arrived at a solution yet. I have a big Flask app and it contains multiple routes.
@app.route('/try')
@app.route('/new')
and many others. I am using MySQLdb for the database. Previously I had this at the start of the application:
import MySQLdb as mysql
db = mysql.connect('localhost', 'root', 'password', 'db')
cursor = db.cursor()
It works fine, but after a while it generates the error "local variable 'cursor' referenced before assignment". This may be because MySQL closes the connection after some time. So I put
cursor = db.cursor() in every route function and close it after I have done the processing, like this:
db = mysql.connect('localhost', 'root', 'password', 'db')

@app.route('/')
def home():
    cursor = db.cursor()
    ...some processing...
    cursor.close()
    return render_template('home.html')

@app.route('/new')
def home_new():
    cursor = db.cursor()
    ...some processing...
    cursor.close()
    return render_template('homenew.html')
Now I want to ask: is this approach right? Should I define a cursor for each request and close it?
This is how I have my MySQLdb setup
def requestConnection():
    "Create new connection. Return connection."
    convt = cv.conversions.copy()
    convt[3] = int
    conn = db.connect(host=c.SQL_HOST, port=c.SQL_PORT, user=c.SQL_USER, passwd=c.SQL_PASSWD, db=c.SQL_DB, conv=convt, use_unicode=True, charset="utf8")
    return conn

def requestCursor(conn):
    return conn.cursor(db.cursors.DictCursor)
Then, at the start of every SQL function I do this:
def executeQuery(query):
    "Execute a given query. Used for debug purposes."
    conn = requestConnection()
    cur = requestCursor(conn)
    cur.execute(query)
    r = cur.fetchall()
    cur.close()
    conn.close()
    return r
I change the conversions because I had to convert Float values in the DB to int for my work, but you can skip that step.
If you don't skip it, you need these imports:
import MySQLdb as db # https://github.com/farcepest/MySQLdb1
import MySQLdb.converters as cv
Hope it helps!
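A quick usage sketch (the users table and its id/name columns are only hypothetical examples; c above is your own config module holding the SQL_* settings):

# DictCursor returns each row as a dict keyed by column name
rows = executeQuery("SELECT * FROM users")
for row in rows:
    print(row["id"], row["name"])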
I have a .ini configuration file where I have the server name, database name, username and password with which I can connect my app to MSSQL:
self.db = pyodbc.connect(
    'driver={SQL Server};server=homeserver;database=testdb;uid=home;pwd=1234')
The corresponding data from the connect statement above is now in config.ini:
self.configwrite = ConfigParser.RawConfigParser()
configread = SafeConfigParser()
configread.read('config.ini')
driver = configread.get('DataBase Settings','Driver')
server = str(configread.get('DataBase Settings','Server'))
db = str(configread.get('DataBase Settings','Database'))
user = str(configread.get('DataBase Settings','Username'))
password = str(configread.get('DataBase Settings','Password'))
How can I pass these variables in the pyodbc connect statement?
I tried this:
self.db = pyodbc.connect('driver={Driver};server=server;database=db;uid=user;pwd=password')
But I am getting an error.
Other options for the connect function:
# using keywords for SQL Server authentication
self.db = pyodbc.connect(driver=driver, server=server, database=db,
                         user=user, password=password)
# using keywords for Windows authentication
self.db = pyodbc.connect(driver=driver, server=server, database=db,
                         trusted_connection='yes')
self.db = pyodbc.connect('driver={%s};server=%s;database=%s;uid=%s;pwd=%s' % ( driver, server, db, user, password ) )
%s is used to insert variables into the string;
the variables are placed into the string in the order they appear after the % operator.
Mixing strings and input variables in a SQL connection string using the pyodbc library - Python
Inspired by this answer:
conn = pyodbc.connect('Driver={SQL Server};'
                      'Server=' + servername + ';'
                      'Database=master;'
                      'UID=sa;'
                      'PWD=' + password + ';'
                      )
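For completeness, the same connection string can also be built with an f-string (Python 3.6+); this is a minimal sketch assuming driver, server, db, user and password hold the plain values read from config.ini earlier (driver without the surrounding braces):

self.db = pyodbc.connect(
    f'driver={{{driver}}};server={server};database={db};'
    f'uid={user};pwd={password}'
)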