pyodbc trusted connection not working on IIS - python

I'm building a Python web app on Flask that uses pyodbc to read data from a SQL Server database. Only trusted connections are allowed per the database's policy.
It works perfectly well when I run it locally, but when I host it on IIS (via WFastCGI) it fails with an Internal Server Error; the connection attempt below is what fails:
pyodbc.connect(blabla, trusted_connection='yes')
I just want to know how the user's credentials can be picked up so the connection to SQL Server can proceed.

This call has worked for me in the past (connecting to SQL Server):
conn = pyodbc.connect('''
TRUSTED_CONNECTION=Yes;
DRIVER={SQL Server};
SERVER={myServer};
DATABASE=myDB;''')
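If the connection opens, a quick sanity check is to run a trivial query over it. A minimal sketch, where myServer and myDB are placeholders carried over from the snippet above:
import pyodbc

# Open the same trusted connection, then verify it with a trivial query.
conn = pyodbc.connect('''
TRUSTED_CONNECTION=Yes;
DRIVER={SQL Server};
SERVER={myServer};
DATABASE=myDB;''')
cursor = conn.cursor()
cursor.execute('SELECT @@VERSION;')
print(cursor.fetchone()[0])
conn.close()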

Related

confusion on full process of setting up heroku postgresdb for python app

So I have followed this guide https://devcenter.heroku.com/articles/connecting-heroku-postgres#connecting-in-python and provisioned a database for my Heroku app. In my server code, I added the database URL and the SSL-require option. However, how does my server actually get permission to write to the database?
Locally, with psycopg2 you would have:
conn = psycopg2.connect(database="dbName", user="postgres", password="password", host="127.0.0.1", port="5432")
and that's how your server authenticates itself and gets permission to write to the Postgres database. With Heroku, however, none of that extra information is provided except for the database URL. What is the remaining process I must follow to allow my server, deployed on Heroku, to write to its associated provisioned database?
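For what it's worth, the guide linked above connects with the URL alone, because the credentials are embedded in it. A minimal sketch, assuming the DATABASE_URL config var that the Heroku Postgres add-on sets:
import os
import psycopg2

# On Heroku the add-on sets DATABASE_URL to something like
# postgres://user:password@host:5432/dbname -- the user and password are
# part of the URL itself, so nothing else needs to be supplied.
conn = psycopg2.connect(os.environ['DATABASE_URL'], sslmode='require')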

Connection Timing Out When Accessing Gcloud MySQL from Python

I have a python application where I'm trying to access a MySQL database on Google's cloud service.
I've been following this setup guide for connecting via an external application (Python) and I am using the pymysql package. I'm attempting to connect via the proxy and have already authenticated via gcloud auth login from the console.
As of now, I CAN access the database via the console, but I need to be able to make queries from my python script to build it out. When I try running it as is, I get the following error:
OperationalError: (2003, "Can't connect to MySQL server on '34.86.47.192' (timed out)")
Here's the function I'm using, with security sensitive info starred out:
import os
import subprocess
import pymysql

def uploadData():
    # cd to the directory with the MySQL exe
    os.chdir('C:\\Program Files\\MySQL\\MySQL Server 8.0\\bin')
    # Invoke the proxy
    subprocess.call('start cloud_sql_proxy_x64.exe -instances=trans-cosine-289719:us-east4:compuweather', shell=True)
    # Create connection
    # I have also tried host = '127.0.0.1' for localhost here
    conn = pymysql.connect(host='34.86.47.192',
                           user='root',
                           password='*******',
                           db='gribdata')
    try:
        c = conn.cursor()
        # Use the right database
        db_query = 'use gribdata'
        c.execute(db_query)
        query = 'SELECT * FROM clients'
        c.execute(query)
        result = c.fetchall()
        print(result)
    except pymysql.Error as e:
        print(e)
    finally:
        conn.close()
Yeah, this one's pretty limited in documentation, but what you want to do is run it from its hosted IP and configure access for your machine's external IP address on the server. So you want to use that IP (34.xxx.xxx.xxx) rather than the 127 loopback localhost IP.
To get it to work, go to your connections tab and add a new connection within Gcloud. Make sure the public address box is checked, the IP is correct, and you save once done.
There are some excellent details here from some Gcloud engineers. It looks like some of the source documentation is outdated and this is the way to connect now.
First of all, confirm that the Cloud SQL proxy is indeed installed in the directory where you expect it to be. The Cloud SQL proxy is not part of MySQL Server, so you should not find it in C:\\Program Files\\MySQL\\MySQL Server 8.0\\bin, at least not by default. Instead, the Cloud SQL proxy is a tool provided by Google; it is just an .exe file that can be stored in any directory you wish. For instructions on how to download the proxy, check the docs.
The Cloud SQL proxy creates a secure link between the Cloud SQL instance and your machine. What it does is forward a local port on your machine to the Cloud SQL instance. Thus, the host IP you should use while the proxy is running is 127.0.0.1:
conn = pymysql.connect(host='127.0.0.1',
                       user='root',
                       password='*******',
                       db='gribdata')
When starting the Cloud SQL proxy with a TCP socket, append the port to which you want Cloud SQL's traffic forwarded to the end of the instance connection name: =tcp:3306
subprocess.call('start cloud_sql_proxy_x64.exe -instances=trans-cosine-289719:us-east4:compuweather=tcp:3306', shell=True)
Have you tried connecting to Cloud SQL from the console? Once connected, you should see a message in the console displaying "Listening on 127.0.0.1:3306". Your connection command should be
"cloud_sql_proxy_x64.exe -instances=trans-cosine-289719:us-east4:compuweather=tcp:3306"
Try starting the Cloud SQL proxy from the console, then create the connection with pymysql, using "127.0.0.1" as the host. A combined sketch follows below.
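Putting the two answers together, a minimal sketch of the whole flow, assuming cloud_sql_proxy_x64.exe is on the PATH or in the working directory (the instance name and credentials are the ones from the question):
import subprocess
import time
import pymysql

# Start the proxy, forwarding local port 3306 to the Cloud SQL instance.
subprocess.Popen(['cloud_sql_proxy_x64.exe',
                  '-instances=trans-cosine-289719:us-east4:compuweather=tcp:3306'])
time.sleep(5)  # give the proxy a moment to open the tunnel

# Connect through the proxy on the loopback address, not the public IP.
conn = pymysql.connect(host='127.0.0.1', port=3306,
                       user='root', password='*******',  # starred out as in the question
                       db='gribdata')
try:
    with conn.cursor() as c:
        c.execute('SELECT * FROM clients')
        print(c.fetchall())
finally:
    conn.close()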

Jupyter Notebook unable to connect to local MySQL database

I'm trying to connect my Jupyter notebook (I use Google Colab) to my local MySQL database. When I run this script in PyCharm, my preferred IDE, it works without a problem. Only when I run it in Google Colab or Project Jupyter does this happen; both get the same error.
import pymysql
import pandas as pd
mySQLuser = 'username'
mySQLpasswd = 'password'
mySQLhost = '127.0.0.1'
mySQLport = 3306  # pymysql documents the port as an int
mySQLdatabase = 'databasename'
connection = pymysql.connect(user=mySQLuser, passwd=mySQLpasswd, host=mySQLhost, port=mySQLport, db=mySQLdatabase)
I've also tried the same with sqlalchemy(create_engine) and mysql.connector, but get the same error.
OperationalError: (2003, "Can't connect to MySQL server on '127.0.0.1' ([Errno 111] Connection refused)")
I've also tried granting all privileges to the user and that didn't change anything. Another suggestion I've seen online is to null or change the bind-address in the configuration file (my.ini), but that didn't affect anything either.
Are you sure your Colab/Jupyter instances run locally?
If that's not the case, you will not be able to reach the SQL database on localhost (127.0.0.1) until you make it accessible remotely, which involves several steps: making the database server reachable from outside, changing the mySQLhost address in your code, and authorizing connections from the host server in your SQL settings. A sketch of what the connection looks like after those steps is below.
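A minimal sketch of that final connection, assuming the steps above are done; 'your-public-host' is a placeholder for wherever the database is actually reachable:
import pymysql

# Colab runs on Google's servers, so 127.0.0.1 there refers to *their* machine.
# Point the client at the database's public address instead.
connection = pymysql.connect(user='username',
                             passwd='password',
                             host='your-public-host',  # placeholder, not 127.0.0.1
                             port=3306,
                             db='databasename')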
Got it to work. The service provider was rejecting the requests despite me configuring the network to allow it, but I sorted it out.

Connect to remote Postgres server via SQLAlchemy

I am trying to send some commands to a remote Postgres server using SQLAlchemy but each time I receive an error.
Please note that I can connect to the remote server over SSH with my username and password; I have done so from my local terminal, PuTTY, and WinSCP, so the problem appears to be in the Python code I have written.
from sqlalchemy import create_engine

# create postgres engine to connect to the database
engine = create_engine('postgres://server_username:server_password@server_name:port/database')
with engine.connect() as conn:
    ex = conn.execute("SELECT version();")
    conn.close()  # not needed but kept just in case
print(ex)
Running the code above yields the following error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) SSL SYSCALL error: Connection reset by peer (0x00002746/10054)
expected authentication request from server, but received S
I have also tried adding the SSL verification parameter as follows
create_engine('postgres://server_username:server_password@server_name:port/database?sslmode=verify-full')
which returned the error
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) root certificate file "C:\Users\aris.pavlides\AppData\Roaming/postgresql/root.crt" does not exist
Either provide the file or change sslmode to disable server certificate verification.
at which point I had nothing to lose so I disabled certificate verification altogether
create_engine('postgres://server_username:server_password@server_name:port/database?sslmode=disable')
which returned the initial error message.
Do you have any ideas on how I can modify the code to make it work?

ServerTimeoutError with MongoDB Atlas using PyMongo (running setup code)

I'm running the starter code from MongoDB, trying to connect to a cluster I just set up in MongoDB Atlas. I'm getting a Server Selection Timeout Error.
I haven't been able to find anything in the MongoDB troubleshooting docs about this issue.
import pymongo
from pprint import pprint

client = pymongo.MongoClient(
    "mongodb+srv://USR:PWD@cluster0-eoik8.mongodb.net/test?retryWrites=true&w=majority")
db = client.admin
pprint(db.command("serverStatus"))
I've properly encoded the URL (using an online URL encoding service).
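For reference, the same percent-encoding can be done with Python's standard library instead of an online service; a minimal sketch, with USR and PWD standing in for the real credentials:
from urllib.parse import quote_plus

# Percent-encode the credentials so characters like @, :, or / cannot break the URI.
user = quote_plus('USR')
password = quote_plus('PWD')
uri = f"mongodb+srv://{user}:{password}@cluster0-eoik8.mongodb.net/test?retryWrites=true&w=majority"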
The error is
pymongo.errors.ServerSelectionTimeoutError: connection closed,connection closed,connection closed
The reason it wasn't working was that I hadn't whitelisted my IP address (under Network Access in Atlas).
