I'm working entirely in Python and need to connect my Cloud SQL instance to my Python project (a piece of software). What I need is to make the connection using Python alone, without cloud_sql_proxy, so that the client does not need to install the Google Cloud SDK.
So far I have used cloud_sql_proxy, and I need a way to replace that without the Google Cloud SDK:
cloud_sql_proxy -instances=Instance-Name=tcp:3306
I expect that, using only Python and without installing the Google Cloud SDK, the client can access the database.
If you really need to do this:
Expose your Cloud SQL instance to the IP address that the Python code runs on. Do this in the console under Cloud SQL -> Connections -> Authorized networks.
Connect via the IP address of the instance using your chosen database connection tool. Looking at your snippet, you are using Postgres, so I would suggest psycopg2 (see the sketch below).
Otherwise, if your Python code is also running in GCP, you can use the internal IP (provided that they are in the same network).
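For example, a minimal sketch with psycopg2, assuming the client's IP has been authorized as described above; every connection value here is a placeholder:
import psycopg2

# Connect to the Cloud SQL instance over its public IP; this only works
# once the client's IP is listed under Authorized networks.
conn = psycopg2.connect(
    host="203.0.113.10",   # public IP of the Cloud SQL instance
    port=5432,
    dbname="mydb",
    user="postgres",
    password="secret",
)
with conn.cursor() as cur:
    cur.execute("SELECT 1")
    print(cur.fetchone())
conn.close()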
I have a problem deploying a Python app that writes additional info to a SQL database. PostgreSQL is hosted in Google Cloud, and for connecting to it I'm using a credential file, invoking the proxy through the command line in the directory where the JSON is stored. The command, in PowerShell:
>>CD "path where stored"
>>cloud_sql_proxy -credential_file=./credentials.json -instances="[PROJECT-ID]:[REGION]:[INSTANCE-NAME]=tcp:5435"
The question is: how do I deploy this to Google App Engine, and how can I make the connection using the JSON credentials without typing the command into cmd?
I see you are using the Cloud SQL Proxy to connect to your PostgreSQL instance.
In the specific case of App Engine, the public-facing documentation offers clear instructions and code on how to achieve this.
You basically need to grant the App Engine service account the correct permissions (the recommended role is Cloud SQL Client) and adapt the code in your Python application to connect to the instance.
The link shared shows how to use SQLAlchemy to achieve this connection for both Public and Private IP.
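As a rough sketch of that adapted connection code, assuming the psycopg2 driver and the Unix socket path that the App Engine runtime provides (the instance connection name, database, and credentials are placeholders):
import sqlalchemy

# App Engine standard exposes Cloud SQL through a Unix domain socket at
# /cloudsql/<INSTANCE_CONNECTION_NAME>, so no proxy process or credential
# file is needed at runtime.
engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://db_user:db_pass@/db_name"
    "?host=/cloudsql/project-id:region:instance-name"
)

with engine.connect() as conn:
    print(conn.execute(sqlalchemy.text("SELECT 1")).scalar())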
I have built an app with Python that uses a MySQL database. I would love to share some functionality with different applications, and that calls for an online database. Kindly give me some insight into how I can move a Python MySQL database online and how to make calls to it, in order to facilitate sharing data between different applications.
I don't know exactly what you mean by a Python database, but there are some options here that you might want to consider.
First, you can use Heroku to host your app and Heroku Postgres to host your database. Alternatively, you can use an AWS EC2 machine to host your app and its database (in case it's custom code that you can't call from a browser using Heroku). With both of these options you can access your database and the app; with the second one you can also install other services such as SSH.
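Either way, the Python side stays the same: every application connects to the remote host with an ordinary connection URL. A rough sketch, assuming SQLAlchemy with the PyMySQL driver (DATABASE_URL is the variable Heroku sets; the fallback URL is a placeholder for a self-hosted MySQL instance):
import os
import sqlalchemy

# Read the connection URL from the environment (as Heroku provides it),
# falling back to a placeholder URL for an EC2-hosted MySQL server.
db_url = os.environ.get(
    "DATABASE_URL",
    "mysql+pymysql://user:password@db-host.example.com:3306/appdb",
)
engine = sqlalchemy.create_engine(db_url)

with engine.connect() as conn:
    print(conn.execute(sqlalchemy.text("SELECT 1")).scalar())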
I have tried to follow Google's documentation on how to set up local development using a database (https://cloud.google.com/appengine/docs/standard/python/tools/using-local-server#Python_Using_the_Datastore). However, I do not have the experience level to follow along, and I am not even sure that was the right guide. The application is a Django project that uses Python 2.7. To run the local host, I usually type dev_appserver.py --host 127.0.0.1 .
My questions are:
How do I download the Datastore database from Google Cloud? I do not want to download the entire database, just enough data to populate localhost so I can run tests.
Once the database is downloaded, what do I need to do to connect it to localhost? Do I have to change a parameter somewhere?
Do I need to download the Datastore at all? Can I just make a duplicate on the cloud and then connect to that?
When I run localhost, should it not already be connected to the Datastore, since the site works when it is running on the cloud? Where can I find the connection URI?
Thanks for the help
The development server is meant to simulate the whole App Engine Environment, if you examine the output of the dev_appserver.py command you'll see something like Starting Cloud Datastore emulator at: http://localhost:PORT. Your code will interact with that bundled Datastore automatically, pushing and retrieving data according to the code you wrote. Your data will be saved on a file in local storage and will persist across different runs of the development server unless it's explicitly deleted.
This option doesn't provide facilities to import data from your existing Cloud Datastore instance, although it's a ready-to-go solution if your testing procedures can afford populating the local database with mock data through a custom script that does so programmatically. If you decide on this approach, just write the data creation script and execute it before running the tests.
Now, there is another option to simulate local Datastore using the Cloud SDK that comes with handy features for your purposes. You can find the available information for it under Running the Datastore Emulator documentation page. This emulator has support to import entities downloaded from your production Cloud Datastore as well as for exporting them into files.
Back to your questions:
Export data from the Cloud instance into a GCS bucket following this, then download the data from the bucket to your filesystem following this, and finally import the data into the emulator with the command shown here (a rough sketch of this last step follows this list).
To use the emulator you need to first run gcloud beta emulators datastore start in one terminal and then, in a separate tab, run dev_appserver.py --support_datastore_emulator=true --datastore_emulator_port=8081 app.yaml.
The development server uses one of the two aforementioned emulators; in both cases it is not connected to your Cloud Datastore. You might create another project intended for development purposes with a copy of your database and deploy your application there, so you don't use the emulator at all.
Requests to Datastore are made through the endpoint https://datastore.googleapis.com/v1/projects/project-id, although this is not related to how the emulators manage connections on your local server.
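As mentioned above, a rough sketch of the final import step, assuming the emulator is listening on localhost:8081 and the export files were already downloaded (the project ID and file path are placeholders):
import requests

# The Datastore emulator exposes an HTTP import endpoint; input_url must be
# the absolute path to the downloaded overall_export_metadata file.
resp = requests.post(
    "http://localhost:8081/v1/projects/your-project-id:import",
    json={"input_url": "/path/to/export/export.overall_export_metadata"},
)
resp.raise_for_status()
print(resp.json())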
Hope this helps.
I'm trying to connect to Cloud SQL running PostgreSQL 11 using a Cloud Function Python script. Ideally, I want to connect to the DB using a simple pandas.DataFrame.to_sql() which requires a sqlalchemy.create_engine() to connect.
This is my current attempt (I've tried both with and without the /.s.PGSQL.5432 suffix):
engine = sqlalchemy.create_engine('postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME?host=/cloudsql/INSTANCE_CONNECTION_NAME/.s.PGSQL.5432')
engine.connect()
I also tried the code sample provided by Google, which uses psycopg2 directly, but it didn't work either.
Error: function crashed. Details:
could not connect to server: Connection refused
Is the server running locally and accepting
connections on Unix domain socket "/cloudsql/vida-production:us-central1:vida-ops/.s.PGSQL.5432"?
Edit:
Thanks @kurtisvg for pointing this out. Since the Cloud Function is trying to connect to a Cloud SQL instance that is in a different project, I have to add the Cloud Function's service account to the IAM of the Cloud SQL project.
After setting IAM, the Python code I use to connect the SQLAlchemy engine to Cloud SQL is this:
# Don't add the "/.s.PGSQL.5432" suffix, because it will be added back automatically by the library...
engine = sqlalchemy.create_engine('postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME?host=/cloudsql/INSTANCE_CONNECTION_NAME')
engine.connect()
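With that engine, the original pandas.DataFrame.to_sql() goal works directly; the DataFrame and table name here are just illustrative:
import pandas as pd

# Append a small example DataFrame to a table through the engine above.
df = pd.DataFrame({"id": [1, 2], "name": ["a", "b"]})
df.to_sql("my_table", engine, if_exists="append", index=False)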
Make sure to check the following things:
Have the Cloud SQL Admin API enabled
Make sure the Functions default service account has the Cloud SQL Client IAM role
If you are connecting between two projects, make sure to enable the Admin API and grant permissions (for both projects) to the Service Account that is running the function you are trying to connect from.
We have a web server in our company and created a MySQL server on OpenShift.
We need to use Python to access the database server without rhc port-forward.
Is there another way to access MySQL on OpenShift directly?
Thanks
You can access the gear directly, just as you would any other database not housed on OpenShift.
When you created the MySQL cartridge you should have been given a connection string:
mysql://OPENSHIFT_DB_GEAR_DNS:OPENSHIFT_DB_PORT/...
You can use that provided connection string and authentication to access the application.
Note: these strings above are environment variables on the gear, and will typically translate to something like:
mysql://app-namespace.rhcloud.com:55582/
This can be used from outside of the gear by other applications to access the database that is hosted on OpenShift.
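For example, a minimal sketch with PyMySQL, plugging in the resolved values from that connection string (the credentials and database name are placeholders for your own):
import pymysql

# Host and port come from the gear's OPENSHIFT_DB_GEAR_DNS and
# OPENSHIFT_DB_PORT values shown above; the rest are placeholders.
conn = pymysql.connect(
    host="app-namespace.rhcloud.com",
    port=55582,
    user="adminUser",
    password="adminPassword",
    database="appdb",
)
try:
    with conn.cursor() as cur:
        cur.execute("SELECT VERSION()")
        print(cur.fetchone())
finally:
    conn.close()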
Note: the OpenShift forums have a lot of material covering this topic.