Google Cloud SQL - OS environment not set to Google App Engine - python

I am attempting to connect to a Google Cloud SQL instance in Python and have gone through Google's tutorial: https://cloud.google.com/appengine/docs/python/cloud-sql/
I am essentially cloning Google's tutorial code, and for some reason this line isn't working for me:
if (os.getenv('SERVER_SOFTWARE') and
        os.getenv('SERVER_SOFTWARE').startswith('Google App Engine/')):
This if statement is not being entered and I'm not sure why; the code then defaults to accessing a local database via the else branch. How is the SERVER_SOFTWARE environment variable set? I'm new to all of this, but because it is not getting set, I am not able to access my Google Cloud SQL instance. How do I make sure this if statement is entered?

SERVER_SOFTWARE is an environment variable that is automatically set by GAE. It could either be something like Google App Engine/x.x.xx when deployed or Development/x.x when running locally.
Basically, the section of the code you're referring to checks whether your app is deployed and running on GAE servers; if so, it connects to a Google Cloud SQL instance, and otherwise, if your app is running locally, it attempts to connect to a local MySQL instance.
It's done that way because you wouldn't normally want to mess with your production (deployed) data while developing and testing locally, as many things could go wrong.
Since you're stating that the if statement is not being entered, it's safe to assume that you are trying to run the program locally but are expecting it to connect to a Google Cloud SQL instance. The next few lines in the example you provided explain how to do that:
db = MySQLdb.connect(host='127.0.0.1', port=3306, db='guestbook', user='root', charset='utf8')
# Alternatively, connect to a Google Cloud SQL instance using:
# db = MySQLdb.connect(host='ip-address-of-google-cloud-sql-instance', port=3306, user='root', charset='utf8')
So what you need to do is comment out the first line (the one where it attempts to connect to a localhost MySQL server) and uncomment the one where it connects to the Google Cloud SQL instance (note that you will have to update several parameters to reflect your configuration, e.g. the host parameter and possibly others).
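Putting the two branches together, a minimal sketch of the tutorial's pattern looks like this; the /cloudsql/ instance path and database name are placeholders you would replace with your own:

import os
import MySQLdb

if (os.getenv('SERVER_SOFTWARE') and
        os.getenv('SERVER_SOFTWARE').startswith('Google App Engine/')):
    # Deployed on App Engine: connect through the Cloud SQL unix socket.
    db = MySQLdb.connect(unix_socket='/cloudsql/your-project-id:your-instance-name',
                         db='guestbook', user='root', charset='utf8')
else:
    # Running locally: connect to a local MySQL server, or point host at the
    # Cloud SQL instance's IP address as described above.
    db = MySQLdb.connect(host='127.0.0.1', port=3306, db='guestbook',
                         user='root', charset='utf8')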

Related

Connecting to Cloud SQL postgres instance with SSL in python

I am trying to connect to a Postgres instance I have in Cloud SQL. I have everything set up and am able to connect to it if SSL encryption is turned off. But now that I have it on, I am running into an error when I try to connect.
import os

import sqlalchemy
from google.cloud.sql.connector import Connector, IPTypes

def run():
    connector = Connector()

    def getconn():
        conn = connector.connect(
            os.getenv("CONNECTION_NAME"),
            "pg8000",
            user=os.getenv("DB_USERNAME"),
            password=os.getenv("DB_PASSWORD"),
            db=os.getenv("DB_NAME"),
            ip_type=IPTypes.PRIVATE,
        )
        return conn

    pool = sqlalchemy.create_engine(
        "postgresql+pg8000://",
        creator=getconn,
    )
    pool.execute("CREATE TABLE........;")
All the certs are stored in Secret Manager as strings, so I am using environment variables to grab them, which is why I used cadata, for example. But I am running into this error: cadata does not contain a certificate. Why is this error coming up?
I'd recommend using the Cloud SQL Python Connector to connect to Cloud SQL from Python, as it will generate the SSL context for you, meaning there is no need to manage SSL certificates! It also has the additional benefit of not requiring you to authorize networks, etc.
You can find a code sample for the Python Connector similar to the one you are using for establishing a TCP connection; a sketch is included below.
There is also an interactive getting started Colab Notebook that will walk you through using the Python Connector without you needing to change a single line of code!
It makes connecting to Cloud SQL both easy and secure.
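As a rough illustration, a minimal Python Connector setup over a TCP (public IP) connection might look like this; the instance connection name and credentials are hypothetical placeholders:

import sqlalchemy
from google.cloud.sql.connector import Connector

connector = Connector()

def getconn():
    # Hypothetical instance connection name ("project:region:instance")
    # and credentials; substitute your own.
    return connector.connect(
        "my-project:us-central1:my-instance",
        "pg8000",
        user="my-user",
        password="my-password",
        db="my-db",
    )

engine = sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)
with engine.connect() as conn:
    print(conn.execute(sqlalchemy.text("SELECT 1")).scalar())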

Connecting to SQL Server with Python SQLAlchemy impersonating a specific windows account

I am trying to connect to a SQL Server instance using SQLAlchemy through Python; however, I require the SQL connection to come from a specific Windows AD account that isn't the one I run VSCode with. Is there any way to modify the connection string below to explicitly feed SQL Server a Windows login that isn't the same login I use to run VSCode?
(I am able to connect with this connection string if I "Run as a different user" in VSCode; however, the AD accounts with SQL access do not have shared drive access and therefore cannot access shared files, so this won't scale long-term.)
import urllib.parse

from sqlalchemy import create_engine

# Build an ODBC connection string that uses Windows integrated authentication
# (Trusted_Connection=Yes) and URL-encode it for SQLAlchemy's odbc_connect.
params = urllib.parse.quote_plus('DRIVER={SQL Server};SERVER={server};DATABASE={database};Trusted_Connection=Yes')
engine = create_engine(f'mssql+pyodbc:///?odbc_connect={params}')
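For completeness: if running under the other Windows account is not feasible, one common fallback is SQL Server authentication with explicit credentials in the connection string. Note that this is SQL auth, not the Windows AD impersonation the question asks about, and the login and password below are hypothetical:

import urllib.parse

from sqlalchemy import create_engine

# SQL Server authentication: explicit UID/PWD instead of Trusted_Connection.
params = urllib.parse.quote_plus(
    'DRIVER={ODBC Driver 17 for SQL Server};SERVER={server};DATABASE={database};'
    'UID=my_sql_login;PWD=my_password')
engine = create_engine(f'mssql+pyodbc:///?odbc_connect={params}')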

Accessing an Azure Database for MySQL Single Server from outside Azure

Moving this question from DevOps Stack Exchange where it got only 5 views in 2 days:
I would like to query an Azure Database for MySQL Single Server.
I normally interact with this database using a universal database tool (dBeaver) installed onto an Azure VM. Now I would like to interact with this database using Python from outside Azure. Ultimately I would like to write an API (FastAPI) allowing multiple users to connect to the database.
I ran a simple test from a Jupyter notebook, using SQLAlchemy as my ORM and specifying the pem certificate as a connection argument:
import pandas as pd
from sqlalchemy import create_engine
cnx = create_engine('mysql://XXX', connect_args={"ssl": {"ssl_ca": "mycertificate.pem"}})
I then tried reading data from a specific table (e.g. mytable):
df = pd.read_sql('SELECT * FROM mytable', cnx)
Alas I ran into the following error:
'Client with IP address 'XX.XX.XXX.XXX' is not allowed to connect to this MySQL server'
According to my colleagues, a way to fix this issue would be to whitelist my IP address.
While this may be an option for a couple of users with static IP addresses, I am not sure whether it is a valid solution in the long run.
Is there a better way to access an Azure Database for MySQL Single Server from outside Azure?
As mentioned in the comments, you need to whitelist the IP address range(s) in the Azure portal for your MySQL database resource. This is a well-accepted and secure approach.
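For illustration, the same firewall rule can also be created with the Azure CLI; a minimal sketch, with hypothetical resource group, server, and rule names:

az mysql server firewall-rule create \
    --resource-group my-resource-group \
    --server-name my-mysql-server \
    --name AllowClientIP \
    --start-ip-address XX.XX.XXX.XXX \
    --end-ip-address XX.XX.XXX.XXX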

Connecting to Cloud SQL from Google Cloud Function using Python and SQLAlchemy

I read all the documentation related to connecting to MySQL hosted in Cloud SQL from GCF and still can't connect. I also tried all the hints in the SQLAlchemy documentation related to this.
I am using the following connection:
con = 'mysql+pymysql://USER:PASSWORD@/MY_DB?unix_socket=/cloudsql/Proj_ID:Zone:MySQL_Instance_ID'
mysqlEngine = sqlalchemy.create_engine(con)
The error I got was:
(pymysql.err.OperationalError) (2003, "Can't connect to MySQL server on 'localhost' ([Errno 111] Connection refused)") (Background on this error at: http://sqlalche.me/e/e3q8)
You need to make sure you are using the correct /cloudsql/<INSTANCE_CONNECTION_NAME> (This is in the format <PROJECT_ID>:<REGION>:<INSTANCE_ID>). This should be all that's needed if your Cloud SQL instance is in the same project and region as your Function.
The GCF docs also strongly recommend limiting your pool to a single connection. This means you should set both pool_size=1 and max_overflow=0 in your engine settings.
If you would like to see an example of how to set these settings, check out this sample application on Github.
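As a quick illustration, a minimal engine configuration with those pool settings might look like this (reusing the placeholder connection string from the question):

import sqlalchemy

engine = sqlalchemy.create_engine(
    'mysql+pymysql://USER:PASSWORD@/MY_DB?unix_socket=/cloudsql/Proj_ID:Zone:MySQL_Instance_ID',
    pool_size=1,     # a single connection, per the GCF recommendation
    max_overflow=0,  # never create connections beyond the pool
)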
I believe that your problem is with the connection name represented by <PROJECT_ID>:<REGION>:<INSTANCE_ID> at the end of the con string variable, which, by the way, should be quoted:
con = 'mysql+pymysql://USER:PASSWORD@/MY_DB?unix_socket=/cloudsql/<PROJECT_ID>:<REGION>:<INSTANCE_ID>'
Check if you are writing it right with this command:
gcloud sql instances describe <INSTANCE_ID> | grep connectionName
If this is not the case, keep in mind these considerations present in the Cloud Functions official documentation:
First Generation MySQL instances must be in the same region as your Cloud Function. Second Generation MySQL instances as well as PostgreSQL instances work with Cloud Functions in any region.
Your Cloud Function has access to all Cloud SQL instances in your project. You can access Second Generation MySQL instances as well as PostgreSQL instances in other projects if your Cloud Function's service account (listed on the Cloud Function's General tab in the GCP Console) is added as a member in IAM on the project with the Cloud SQL instance(s) with the Cloud SQL Client role.
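As an illustration of that IAM setup, granting the Cloud SQL Client role to a function's service account might look like this; the project and service-account names are hypothetical:

gcloud projects add-iam-policy-binding sql-instance-project \
    --member="serviceAccount:my-project@appspot.gserviceaccount.com" \
    --role="roles/cloudsql.client"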
After a long thread with Google Support, we found the reason: we simply had to enable public access to Cloud SQL, without any firewall rule. It is undocumented and can drive you crazy, but the support team's silver bullet is to say: it is in beta!
I was having this issue. The service account was correct and had the correct permissions, and the connection string was exactly the same as in my App Engine application. Still, I got this in the logs:
dial unix /cloudsql/project:region:instance connect: no such file or directory
Switching from a 2nd generation Cloud Function to 1st generation solved it. I didn't see it documented anywhere that 2nd generation functions couldn't connect to Cloud SQL instances.

Connect to Google Datastore from an existing Google Compute Engine instance in Python

I'm trying to connect to Datastore from an existing Compute Engine instance and I'm getting:
[ python 2.7 - googledatastore-v1beta2_rev1_2.1.0-py2.7 ]
googledatastore.connection.RPCError: commit RPC client failure with HTTP(403) Forbidden: Unauthorized.
The Datastore API is enabled and permissions are set, but the GCE instance is in a different zone (same project).
What else could it be?
GCE env:
DATASTORE_DATASET = project_id
DATASTORE_PRIVATE_KEY_FILE = absolute path to pem file
DATASTORE_SERVICE_ACCOUNT = service_account_email
Any tips on what I should do or check? I'm confused because I have exactly the same configuration in my local environment: when I click "play" in PyCharm, everything works well ;)
Maybe I missed something...
Thanks for your help ;)
This is currently a bug in the Cloud Datastore client library. If you are running on GCE, it will try to use the scope rules and then fail before trying other authentication methods.
