Using Cloud SQL (Postgres) in Cloud Function - python

I'm trying to connect to a Cloud SQL instance running PostgreSQL 11 from a Cloud Function Python script. Ideally, I want to write to the DB with a simple pandas.DataFrame.to_sql(), which requires a sqlalchemy.create_engine() connection.
This is my current attempt (I've tried both with and without the /.s.PGSQL.5432 suffix):
engine = sqlalchemy.create_engine('postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME?host=/cloudsql/INSTANCE_CONNECTION_NAME/.s.PGSQL.5432')
engine.connect()
I also tried the code sample provided by Google, which uses psycopg2 directly, but it didn't work either:
Error: function crashed. Details:
could not connect to server: Connection refused
Is the server running locally and accepting
connections on Unix domain socket "/cloudsql/vida-production:us-central1:vida-ops/.s.PGSQL.5432"?
Edit:
Thanks @kurtisvg for pointing this out. Since the Cloud Function is trying to connect to a Cloud SQL instance in a different project, I have to add the Cloud Function's service account to the IAM of the Cloud SQL project.
After setting IAM, the Python code I use to connect sqlalchemy engine to Cloud SQL is this:
# Don't add the "/.s.PGSQL.5432" suffix, because the library adds it back automatically.
engine = sqlalchemy.create_engine('postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME?host=/cloudsql/INSTANCE_CONNECTION_NAME')
engine.connect()
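With that engine, the original pandas use case works as well. A minimal sketch, assuming the placeholder credentials above and a hypothetical table my_table:
import pandas as pd
import sqlalchemy
# Same URL as above; USERNAME, PASSWORD, DB_NAME and
# INSTANCE_CONNECTION_NAME are placeholders.
engine = sqlalchemy.create_engine(
    'postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME'
    '?host=/cloudsql/INSTANCE_CONNECTION_NAME')
df = pd.DataFrame({'id': [1, 2], 'value': ['a', 'b']})
# 'my_table' is a hypothetical table name; to_sql creates it if missing.
df.to_sql('my_table', engine, if_exists='append', index=False)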

Make sure to check the following things:
Have the Cloud SQL Admin API enabled
Make sure the Functions default service account has the Cloud SQL Client IAM role
If you are connecting between two projects, make sure to enable the Admin API and grant permissions (in both projects) to the service account that runs the function you are trying to connect from.

Related

How to set up a Cloud SQL connection for Cloud Run with run_v2.services.CloudSQLInstance in the Python client

I use the Python client library for the Cloud Run API to update a Cloud Run service. The steps are as follows:
Get Google credentials
Build google.cloud.run_v2.ServicesClient with these credentials
Build google.cloud.run_v2.RevisionTemplate
Build google.cloud.run_v2.Service
Run request with google.cloud.run_v2.UpdateServiceRequest
I need to set up a Cloud SQL connection for this Cloud Run service. Any idea how to glue CloudSqlInstance into the code?
https://cloud.google.com/python/docs/reference/run/latest/google.cloud.run_v2.types.CloudSqlInstance
A Google search gives only GUI examples.
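For what it's worth, one way to wire this in (a sketch, not a definitive answer, under the assumption that the Cloud SQL connection is declared as a volume on the revision template; all resource names are placeholders) is:
from google.cloud import run_v2
client = run_v2.ServicesClient()
service = client.get_service(
    name="projects/PROJECT/locations/REGION/services/SERVICE")
# Declare the Cloud SQL connection as a volume on the revision template...
service.template.volumes.append(
    run_v2.Volume(
        name="cloudsql",
        cloud_sql_instance=run_v2.CloudSqlInstance(
            instances=["PROJECT:REGION:INSTANCE"]),
    )
)
# ...and mount it where the Unix socket is expected.
service.template.containers[0].volume_mounts.append(
    run_v2.VolumeMount(name="cloudsql", mount_path="/cloudsql"))
operation = client.update_service(
    request=run_v2.UpdateServiceRequest(service=service))
operation.result()  # block until the new revision is ready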

Access GCP Cloud SQL from AI notebook?

I'm looking for Python code that can be used in a GCP AI Platform notebook to query GCP Cloud SQL (specifically PostgreSQL). Unfortunately, I haven't found any relevant resources or tutorials, either from GCP officially or from unaffiliated sources.
I work on the Developer Relations team for Cloud SQL at Google. The quickest and simplest way to connect to a Cloud SQL instance from an AI Platform Notebook is to use our newly released Cloud SQL Python Connector. To install it, run pip install "cloud-sql-python-connector[pg8000]". This also installs the pg8000 driver, which is used to connect to Postgres.
Then, you can use the following code to get a connection object which you can use to run queries:
from google.cloud.sql.connector import Connector
connector = Connector()
conn = connector.connect(
    "project-name:region-name:instance-name",
    "pg8000",
    user="postgres",
    password="password",
    db="db_name",
)
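The returned object behaves like a standard DB-API connection, so a quick sanity check could look like this (the table name is hypothetical):
cur = conn.cursor()
cur.execute("SELECT * FROM my_table LIMIT 5")  # my_table is a placeholder
print(cur.fetchall())
cur.close()
conn.close()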
An alternative, which is language-independent, is to download and run the Cloud SQL Auth Proxy in a separate terminal from your notebook.
For either of these solutions, make sure to enable the Cloud SQL Admin API first.

Is there any way to connect a GAE application with a custom Google Cloud Database?

I'm trying to connect my webapp2 application to a 'in-cloud' database.
To run it locally I'm using the following flags:
--datastore_path=/<path>/<to>/<project>/.db/datastore
--blobstore_path=/<path>/<to>/<project>/.db/blobstore
The problem is that I don't want a local path for my datastore/blobstore.
Is there any way to connect to an 'in-cloud' database by passing a different path? I can't find any solution like that.
You can use the Remote API:
The Remote API library allows any Python client to access services available to App Engine applications. For example, if your App Engine application uses Datastore or Google Cloud Storage, a Python client could access those storage resources using the Remote API.
This gives you remote access to Google Cloud Datastore from your webapp2 application.
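A minimal sketch of that setup, assuming the legacy Python 2 App Engine SDK and a placeholder app ID:
from google.appengine.ext.remote_api import remote_api_stub
# '<your-app-id>' is a placeholder; this authenticates with your
# local gcloud OAuth credentials.
remote_api_stub.ConfigureRemoteApiForOAuth(
    '<your-app-id>.appspot.com', '/_ah/remote_api')
# From here on, Datastore calls (e.g. through ndb) go to the live
# app's Datastore instead of the local --datastore_path file.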

How to deploy a PostgreSQL Python app which is hosted in Google Cloud?

I have a problem deploying a Python app that writes additional info to a SQL database. PostgreSQL is hosted in Google Cloud, and to connect to it I'm using a credentials file and invoking the proxy from the command line in the path where the JSON is stored. The command, in PowerShell:
>> cd "path where stored"
>> cloud_sql_proxy -credential_file=./credentials.json -instances="[PROJECT-ID]:[REGION]:[INSTANCE-NAME]=tcp:5435"
The question: how do I deploy this to Google App Engine, and how can I call the function with the JSON without typing it into the command line?
I see you are using the Cloud SQL Proxy to connect to your PostgreSQL instance.
In the specific case of App Engine the public facing documentation offers clear instructions and code on how to achieve this.
You basically need to provide the App Engine's service account the correct permissions (the recommended role will be Cloud SQL Client) and adapt the code in your Python application to connect to the instance.
The linked page shows how to use SQLAlchemy to make this connection for both public and private IP.
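For reference, the pattern in that documentation looks roughly like this (a sketch assuming the pg8000 driver and SQLAlchemy 1.4+; all credential values are placeholders):
import sqlalchemy
engine = sqlalchemy.create_engine(
    sqlalchemy.engine.url.URL.create(
        drivername="postgresql+pg8000",
        username="DB_USER",    # placeholder
        password="DB_PASS",    # placeholder
        database="DB_NAME",    # placeholder
        # On App Engine standard the instance is reachable over a Unix socket:
        query={"unix_sock": "/cloudsql/PROJECT:REGION:INSTANCE/.s.PGSQL.5432"},
    )
)
with engine.connect() as conn:
    print(conn.execute(sqlalchemy.text("SELECT NOW()")).fetchone())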

How to create a connection to a Google Cloud instance with Python code?

I'm working entirely in Python and need to connect my Cloud SQL instance to my Python project (a sort of software). What I need is to make the connection using only Python, without cloud_sql_proxy, so that the client doesn't need to install the Google Cloud SDK.
I used cloud_sql_proxy and need a way to do the same thing without the Google SDK:
cloud_sql_proxy -instances=Instance-Name=tcp:3306
I expect that, without installing the Google SDK, the client can access the database using only Python.
If you really need to do this:
Expose your Cloud SQL instance to the IP address that the Python code runs on. Do this in the console under Cloud SQL -> Connections -> Authorized networks.
Connect via the instance's IP address using your chosen database connection tool. Looking at your snippet, you are using Postgres, so I would suggest psycopg2 (see the sketch below).
Otherwise, if your Python code is also running in GCP, you can use the internal IP (provided that they are in the same network).
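A minimal psycopg2 sketch for the authorized-networks route (all values are placeholders):
import psycopg2
conn = psycopg2.connect(
    host="INSTANCE_PUBLIC_IP",  # public IP, or internal IP inside GCP
    port=5432,
    dbname="DB_NAME",
    user="DB_USER",
    password="DB_PASS",
)
cur = conn.cursor()
cur.execute("SELECT version()")
print(cur.fetchone())
conn.close()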
