Access GCP Cloud SQL from AI notebook? - python

Looking for relevant Python code to use in a GCP AI Platform notebook to query GCP Cloud SQL (specifically PostgreSQL). Unfortunately, I haven't found any relevant resources/tutorials, either from GCP officially or from unaffiliated sources.

I work on the Developer Relations team for Cloud SQL at Google. The quickest and simplest way to connect to a Cloud SQL instance from an AI Platform Notebook is to use our newly released Cloud SQL Python Connector. To install it, run pip install "cloud-sql-python-connector[pg8000]". This also installs the pg8000 driver, which is used to connect to Postgres.
Then, you can use the following code to get a connection object which you can use to run queries:
from google.cloud.sql.connector import connector

conn = connector.connect(
    "project-id:region:instance-name",  # the instance connection name
    "pg8000",                           # the database driver to use
    user="postgres",
    password="password",
    db="db_name"
)
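The object returned is a standard DB-API connection, so a quick smoke test (the query here is just illustrative) looks like:
cursor = conn.cursor()
cursor.execute("SELECT version();")  # any simple query to verify connectivity
print(cursor.fetchone())
conn.close()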
An alternative, which is language-independent, is to download and run the Cloud SQL Auth Proxy in a separate terminal from your notebook.
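If you go the proxy route, once it is listening locally any standard Postgres driver can connect as if the database were on localhost. A minimal sketch with psycopg2, assuming the proxy was started with =tcp:5432 and placeholder credentials:
import psycopg2

# The Cloud SQL Auth Proxy forwards 127.0.0.1:5432 to the instance
conn = psycopg2.connect(
    host="127.0.0.1",
    port=5432,
    dbname="db_name",
    user="postgres",
    password="password",
)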
For either of these solutions, make sure to enable the Cloud SQL Admin API first.

Related

Is there a way to programmatically DROP a SQL Database in Azure?

I am working on a process to automatically remove and add databases to Azure. When the database isn't in use, it can be removed from Azure and placed in cheaper S3 storage as a .bacpac.
I am using Microsoft's SqlPackage.exe from a PowerShell script to export and import these databases to and from Azure. I invoke the script via Python so that I can also use boto3.
The issue I have is with the down direction at step 3. The sequence would be:
1. Download the Azure SQL DB to a .bacpac (can be achieved with SqlPackage.exe)
2. Upload this .bacpac to cheaper S3 storage (using the boto3 Python SDK)
3. Delete the Azure SQL Database (it appears the Azure Blob Python SDK can't help me, and SqlPackage.exe does not have a delete function)
Is step 3 impossible to automate with a script? Could a workaround be to SqlPackage.exe import a small dummy .bacpac with the same name to overwrite the old bigger DB?
Thanks.
To remove an Azure SQL Database using PowerShell, you will need to use the Remove-AzSqlDatabase cmdlet.
To remove an Azure SQL Database using the Azure CLI, you will need to use az sql db delete.
If you want to write code in Python to delete the database, you will need to use the Azure SDK for Python, as sketched below.
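A minimal sketch of the Python route, using the azure-mgmt-sql package (1.0+; older releases expose databases.delete instead of begin_delete). The subscription, resource group, server, and database names are placeholders:
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

credential = DefaultAzureCredential()
sql_client = SqlManagementClient(credential, "subscription-id")

# Irreversible: make sure the .bacpac export to S3 has succeeded first
sql_client.databases.begin_delete(
    resource_group_name="my-resource-group",
    server_name="my-sql-server",
    database_name="my-database",
).wait()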

How to deploy a PostgreSQL Python app which is hosted in Google Cloud?

I have a problem deploying a Python app which writes additional info to a SQL database. The PostgreSQL instance is hosted in Google Cloud, and to connect to it I'm using a credentials file, invoking the proxy from the command line in the directory where the JSON is stored. The command, in PowerShell:
>>CD "path where stored"
>>cloud_sql_proxy -credential_file=./credentials.json -instances="[PROJECT-ID]:[REGION]:[INSTANCE-NAME]=tcp:5435"
The question: how do I deploy this to Google App Engine, and how can I supply the credentials JSON without typing the command into the terminal?
I see you are using the Cloud SQL Proxy to connect to your PostgreSQL instance.
In the specific case of App Engine, the public-facing documentation offers clear instructions and code on how to achieve this.
You basically need to grant App Engine's service account the correct permissions (the recommended role is Cloud SQL Client) and adapt the code in your Python application to connect to the instance.
The linked page shows how to use SQLAlchemy to achieve this connection for both public and private IP.
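For reference, the documented pattern boils down to pointing SQLAlchemy at the Unix socket App Engine exposes under /cloudsql. A minimal sketch with the pg8000 driver and SQLAlchemy 1.4+ (credentials and the instance connection name are placeholders):
import sqlalchemy

# pg8000 receives the socket path through the "unix_sock" query argument
engine = sqlalchemy.create_engine(
    sqlalchemy.engine.url.URL.create(
        drivername="postgresql+pg8000",
        username="USERNAME",
        password="PASSWORD",
        database="DB_NAME",
        query={"unix_sock": "/cloudsql/PROJECT-ID:REGION:INSTANCE-NAME/.s.PGSQL.5432"},
    )
)
with engine.connect() as connection:
    connection.execute(sqlalchemy.text("SELECT 1"))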

Using Cloud SQL (Postgres) in Cloud Function

I'm trying to connect to Cloud SQL running PostgreSQL 11 using a Cloud Function Python script. Ideally, I want to connect to the DB using a simple pandas.DataFrame.to_sql() which requires a sqlalchemy.create_engine() to connect.
This is my current attempt (I've tried both with and without the /.s.PGSQL.5432 suffix):
engine = sqlalchemy.create_engine('postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME?host=/cloudsql/INSTANCE_CONNECTION_NAME/.s.PGSQL.5432')
engine.connect()
I also tried the code sample provided by Google, which uses psycopg2 directly, but it didn't work either:
Error: function crashed. Details:
could not connect to server: Connection refused
Is the server running locally and accepting
connections on Unix domain socket "/cloudsql/vida-production:us-central1:vida-ops/.s.PGSQL.5432"?
Edit:
Thanks @kurtisvg for pointing this out. Since the Cloud Function is trying to connect to a Cloud SQL instance in a different project, I have to add the Cloud Function's service account to the IAM of the Cloud SQL project.
After setting IAM, the Python code I use to connect sqlalchemy engine to Cloud SQL is this:
# Don't add the "/.s.PGSQL.5432" suffix, because the library adds it back automatically
engine = sqlalchemy.create_engine('postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME?host=/cloudsql/INSTANCE_CONNECTION_NAME')
engine.connect()
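With the engine in hand, the original pandas.DataFrame.to_sql() goal is then straightforward; a minimal sketch (the DataFrame contents and table name are placeholders):
import pandas as pd
import sqlalchemy

engine = sqlalchemy.create_engine(
    'postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME?host=/cloudsql/INSTANCE_CONNECTION_NAME')
df = pd.DataFrame({"id": [1, 2], "value": ["a", "b"]})
df.to_sql("my_table", engine, if_exists="append", index=False)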
Make sure to check the following things:
Have the Cloud SQL Admin API enabled
Make sure the Function's default service account has the Cloud SQL Client IAM role
If you are connecting between two projects, make sure to enable the Admin API and grant permissions (for both projects) to the Service Account that is running the function you are trying to connect from.

How to create a connection with google cloud instance with python code?

I'm working entirely in Python and need to connect my Cloud SQL instance to my Python project (a piece of software). What I need is to make the connection using only Python, without cloud_sql_proxy, so that the client does not need to install the Google Cloud SDK.
I currently use cloud_sql_proxy and need a way to achieve the same thing without the Google SDK:
cloud_sql_proxy -instances=Instance-Name:tcp:3306
I expect the client to be able to access the database using only Python, without installing the Google SDK.
If you really need to do this:
Expose your Cloud SQL instance to the IP address that the Python code runs from. Do this in the console under Cloud SQL -> Connections -> Authorized networks.
Connect via the instance's IP address using your chosen database driver. Looking at your snippet, you are using Postgres, so I would suggest psycopg2 (see the sketch below).
Otherwise, if your Python code is also running in GCP, you can use the internal IP (provided they are in the same network).
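A minimal sketch of that direct connection with psycopg2 (the IP and credentials are placeholders; sslmode is worth setting because traffic leaves the internal network):
import psycopg2

conn = psycopg2.connect(
    host="203.0.113.10",  # the instance's public IP
    port=5432,
    dbname="db_name",
    user="postgres",
    password="password",
    sslmode="require",    # encrypt traffic over the public internet
)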

How can I make a database that can be accessed by multiple people running the same program on different computers?

I am looking to create a Python application that accesses a database. However, people must be able to access this database from different computers and always receive an up-to-date version. I understand that this would have to be some sort of cloud-based database, but I cannot seem to find an API with Python bindings or a module that allows me to do this. Does anybody know of an API or module that I could use?
You can try Google Cloud Datastore and the App Engine Datastore (NDB), which fulfil your requirements:
https://developers.google.com/datastore/
https://developers.google.com/appengine/docs/python/ndb/
And for an API you can use the Remote API.
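As a taste of what the NDB route looks like, a minimal model sketch (this runs inside the App Engine Python runtime; the model and property names are placeholders):
from google.appengine.ext import ndb

class Note(ndb.Model):
    text = ndb.StringProperty()

# Writes go to Datastore, which every running instance of the app shares
note = Note(text="hello")
key = note.put()
print(key.get().text)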
If you are going to use AWS, you can use Amazon RDS for the database and Elastic Beanstalk for deploying your Python application to the cloud.
This link provides information on how to implement the database part on AWS: Adding an Amazon RDS DB Instance to Your Python Application Environment.
If you want to use Microsoft Azure, then you can refer to the following links:
Azure SQL Database libraries for Python
Use Python to query an Azure SQL database
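Those Azure tutorials typically connect with pyodbc; a minimal sketch (server, database, and credentials are placeholders):
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-server.database.windows.net;"
    "DATABASE=my-db;UID=my-user;PWD=my-password"
)
cursor = conn.cursor()
cursor.execute("SELECT @@version;")
print(cursor.fetchone()[0])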
