Cannot connect to PostgreSQL through Google Cloud SQL Proxy - Python

I'm running a test Python script that uploads a small amount of data to a PostgreSQL database using SQLAlchemy. Apache Airflow (hosted by Google Cloud Composer) runs this script on a schedule. The script always runs fine when I run it on Airflow locally and connect directly to the DB.
When I run it on Cloud Composer, however, I need to connect through Google's Cloud SQL Proxy for security reasons.
Here's the weird part: the script runs fine about 50% of the time when connecting through the cloud-sql-proxy. The other 50% of the time I get the error (psycopg2.OperationalError) FATAL: password authentication failed for user "XXXX". Then it usually retries and connects just fine.
I'm assuming it's probably something to do with how I'm managing my connection (how it's being opened, pooled, or closed), so here is how I'm doing that. I'm using SQLAlchemy in my Python script like so:
from sqlalchemy import create_engine
engine = create_engine(f"postgresql://{USER}:{PASS}@{HOST}/{DB_NAME}")
conn = engine.connect()
# Upload a bit of data here
conn.close()
engine.dispose()
I've always used this way of uploading data locally, so I'm a bit confused about why it wouldn't work with the Cloud SQL Proxy / Google Cloud Composer (hosted Airflow).
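In case it helps, here is a minimal sketch of a more defensive version of that pattern, using SQLAlchemy's pool_pre_ping option plus a short retry loop, on the assumption that the proxy occasionally rejects or drops the first connection attempt. The placeholder values stand in for the question's USER, PASS, HOST, and DB_NAME:

import time
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

# Placeholder connection details from the question.
USER, PASS, HOST, DB_NAME = "db_user", "db_password", "127.0.0.1", "my_db"

# pool_pre_ping tests each pooled connection with a lightweight
# "SELECT 1" before handing it out, discarding stale ones.
engine = create_engine(
    f"postgresql://{USER}:{PASS}@{HOST}/{DB_NAME}",
    pool_pre_ping=True,
)

# Since the error in the question clears on Airflow's own retry,
# a small retry loop around the initial connect may be enough.
for attempt in range(3):
    try:
        conn = engine.connect()
        break
    except OperationalError:
        if attempt == 2:
            raise
        time.sleep(2)

# Upload a bit of data here
conn.close()
engine.dispose()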

Related

Unable to connect remotely to mysql database hosted online

I am trying to remotely connect to a MySQL database from my company network, using the mysql.connector module in Python. The DB is hosted online (siteground.com).
Error: Error while connecting to MySQL 2003: Can't connect to MySQL server on.... (10061 No connection could be made because the target machine actively refused it)
I have the correct hostname, username, password etc.., and have added my company's public IP address as an allowed remote access host.
I have been able to successfully connect using the exact same procedure from a python notebook hosted on google colaboratory (in google drive).
Any ideas on what the issue might be?
Thank you
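For context, a minimal sketch of the kind of connection being described, with a hypothetical host and credentials and an explicit timeout so a blocked port fails fast rather than hanging:

import mysql.connector
from mysql.connector import Error

try:
    conn = mysql.connector.connect(
        host="example.siteground.com",  # hypothetical host
        port=3306,
        user="db_user",                 # placeholder credentials
        password="db_password",
        database="my_db",
        connection_timeout=10,
    )
    print("Connected:", conn.is_connected())
    conn.close()
except Error as exc:
    # Error 2003 (10061) means the TCP connection itself was refused -
    # typically a firewall blocking outbound port 3306, not bad
    # credentials.
    print("Error while connecting to MySQL:", exc)

Since the same code works from Google Colab, an outbound firewall rule on the company network is the most likely difference.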

Run a SQL Server Agent Job from Python

I am trying to trigger a SQL Server Agent job (it takes a backup of the DB and places it into a directory) from Python. Unfortunately, I haven't found anything about Python triggering a SQL Server Agent job (only the other way around: a SQL Server Agent job triggering a Python script).
Once I get that backup, I want to restore this db into a different SQL Server using the same python script.
Thanks for any help!!
You can start the job with Transact-SQL from Python:
EXEC dbo.sp_start_job N'My Job Name';
GO
See the documentation for more information.
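A minimal sketch of issuing that statement from Python with pyodbc; the driver name, server, credentials, and job name are all placeholders, and autocommit is enabled so the call is not left inside an open transaction:

import pyodbc

# Hypothetical connection string; msdb is where SQL Server Agent's
# stored procedures live.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-sql-server;"           # placeholder server
    "DATABASE=msdb;"
    "UID=my_user;PWD=my_password;",
    autocommit=True,
)

cursor = conn.cursor()
# sp_start_job only queues the job and returns immediately, so the
# backup may still be running after this call completes.
cursor.execute("EXEC msdb.dbo.sp_start_job ?", "My Job Name")
conn.close()

Because the job runs asynchronously, a script that restores the backup afterwards would need to poll for completion (for example via msdb.dbo.sp_help_job) before starting the restore.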

Connect to Cloud SQL from a local Python IDE

How do I connect to Google Cloud SQL from local Python code? I am using the PyCharm IDE, so I need the detailed process to establish a connection with Google Cloud MySQL.
You can find the detailed process here:
Connecting to Cloud SQL - MySQL
1. If you haven't already, set up a Python development environment by following the Python setup guide and create a project.
2. Create a 2nd Gen Cloud SQL instance by following these instructions. Note the connection string, database user, and database password that you create.
3. Create a database for your application by following these instructions. Note the database name.
4. Create a service account with the 'Cloud SQL Client' permissions by following these instructions. Download a JSON key to use to authenticate your connection.
5. Running locally: launch the proxy with TCP or launch the proxy with a Unix domain socket (a sketch of the TCP option follows below).
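A sketch of step 5 using the TCP option with the legacy cloud_sql_proxy binary; the instance connection name, key file, and credentials are placeholders:

# In a shell, launch the proxy on a local TCP port first:
#
#   ./cloud_sql_proxy \
#       -instances=my-project:us-central1:my-instance=tcp:3306 \
#       -credential_file=service-account-key.json
#
# The proxy listens on 127.0.0.1:3306 and forwards traffic to the
# Cloud SQL instance over an encrypted tunnel, so the Python code
# connects to localhost rather than the instance's IP.
import mysql.connector

conn = mysql.connector.connect(
    host="127.0.0.1",        # the local proxy endpoint
    port=3306,
    user="db_user",          # placeholder user from step 2
    password="db_password",  # placeholder password from step 2
    database="my_db",        # placeholder database from step 3
)
print(conn.is_connected())
conn.close()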

Time out when connecting to Redshift from AWS EC2 using psycopg2

I'm working on a simple Python program to query a Redshift cluster using psycopg2. When I run the code on my local machine it works as expected: it creates the connection, runs the queries, and I get the expected outcome. However, I loaded it onto my EC2 instance because I want to schedule several runs a week, and there the execution fails with the following error:
psycopg2.OperationalError: could not connect to server: Connection timed out
Is the server running on host "xxxx" and accepting
TCP/IP connections on port 5439?
Considering that the code works without problems on the local machine and the security settings should be the same on EC2, do you have any suggestions and/or workarounds?
Thanks a lot.
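For reference, a minimal sketch of the connection being described, with a hypothetical endpoint and credentials and an explicit connect_timeout; a timeout on port 5439 from EC2 but not from a local machine usually points at the cluster's security group not allowing inbound traffic from the instance:

import psycopg2

conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",  # hypothetical endpoint
    port=5439,
    dbname="dev",            # placeholder database
    user="awsuser",          # placeholder credentials
    password="my_password",
    connect_timeout=10,      # fail fast instead of hanging on a blocked port
)

with conn.cursor() as cur:
    cur.execute("SELECT 1")
    print(cur.fetchone())
conn.close()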

Can't connect to PyMongo DB on Kubernetes

Everything worked fine when I ran it on Docker, but after I migrated it to Kubernetes it stopped connecting to the DB. It says:
pymongo.errors.ServerSelectionTimeoutError: connection closed
whenever I try to access a page that uses the DB.
I connect like this:
app.config['MONGO_DBNAME'] = 'pymongo_db'
app.config['MONGO_URI'] = 'mongodb://fakeuser:FakePassword@ds1336984.mlab.com:63984/pymongo_db'
Any way to get it to connect?
Edit:
I think it has more to do with the Istio sidecars: when deployed on Kubernetes without Istio, it runs normally. The issue only appears when running with Istio.
Most likely Istio (the Envoy sidecar) is controlling egress traffic. You can check whether you have any ServiceEntry and VirtualService resources in your cluster for your specific application:
$ kubectl -n <your-namespace> get serviceentry
$ kubectl -n <your-namespace> get virtualservice
If they exist, check if they are allowing traffic to ds1336984.mlab.com. If they don't exist you will have to create them.
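A sketch of what such a ServiceEntry might look like for the host in the question; the resource name and namespace are placeholders:

apiVersion: networking.istio.io/v1beta1
kind: ServiceEntry
metadata:
  name: external-mongo      # placeholder name
  namespace: my-namespace   # placeholder namespace
spec:
  hosts:
    - ds1336984.mlab.com
  location: MESH_EXTERNAL   # the database lives outside the mesh
  ports:
    - number: 63984         # the port from the connection URI
      name: mongo
      protocol: TCP
  resolution: DNS

Applying this with kubectl apply -f should let the sidecar open egress connections to that host and port.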
