I have a problem deploying a Python app that writes additional info to an SQL database. PostgreSQL is hosted in Google Cloud, and to connect to it I'm using a credential file and invoking the Cloud SQL Proxy from the directory where the JSON is stored. The command, in PowerShell:
>>CD "path where stored"
>>cloud_sql_proxy -credential_file=./credentials.json -instances="[PROJECT-ID]:[REGION]:[INSTANCE-NAME]=tcp:5435"
The question is: how do I deploy this to Google App Engine, and how can I make the connection with the JSON credentials without typing the command into cmd?
I see you are using the Cloud SQL Proxy to connect to your PostgreSQL instance.
In the specific case of App Engine, the public-facing documentation offers clear instructions and code on how to achieve this.
You basically need to grant App Engine's service account the correct permissions (the recommended role is Cloud SQL Client) and adapt the code in your Python application to connect to the instance.
The linked documentation shows how to use SQLAlchemy to achieve this connection for both public and private IP.
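For reference, here is a minimal sketch of what that connection code can look like on App Engine standard, assuming psycopg2 and SQLAlchemy are in your dependencies and that the placeholder user, password, database name and instance connection name (PROJECT:REGION:INSTANCE) are replaced with your own values. On App Engine standard the instance is exposed on a Unix socket under /cloudsql/, so no proxy process or credential file is needed in the deployed app:

import sqlalchemy

# Placeholders below are not real values; fill in your own user, password,
# database name and instance connection name (PROJECT:REGION:INSTANCE).
engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://DB_USER:DB_PASS@/DB_NAME"
    "?host=/cloudsql/PROJECT:REGION:INSTANCE",
    pool_size=5,
    max_overflow=2,
)

with engine.connect() as conn:
    conn.execute(sqlalchemy.text("SELECT 1"))  # simple connectivity check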
I'm trying to connect my webapp2 application to an 'in-cloud' database.
To run it locally I'm using the following flags:
--datastore_path=/<path>/<to>/<project>/.db/datastore
--blobstore_path=/<path>/<to>/<project>/.db/blobstore
The problem is that I don't want a local path to my datastore/blobstore.
Is there any way to connect to an 'in-cloud' database by passing a different path? I can't find any solution like that.
You can use Remote API:
The Remote API library allows any Python client to access services
available to App Engine applications.
For example, if your App Engine application uses Datastore or Google
Cloud Storage, a Python client could access those storage resources
using the Remote API.
in order to get remote access to Google Cloud Datastore from your webapp2 application.
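For illustration, a rough sketch of using it from a local Python 2.7 script; the project ID and the Greeting model are made-up placeholders, and it assumes the remote_api builtin is enabled in your app.yaml:

from google.appengine.ext import ndb
from google.appengine.ext.remote_api import remote_api_stub

class Greeting(ndb.Model):
    # Hypothetical model, only for illustration.
    content = ndb.StringProperty()

# Point the local API stubs at the deployed app; authentication uses OAuth.
remote_api_stub.ConfigureRemoteApiForOAuth(
    'YOUR-PROJECT-ID.appspot.com', '/_ah/remote_api')

# From here on, ordinary ndb calls go to the real Cloud Datastore.
print(Greeting.query().count())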
I have built an app that uses a MySQL database with Python. I would love to share some functionality with different applications, which calls for an online database. Kindly give me some insight into how I can move a Python MySQL database online and how to make calls to it, in order to facilitate sharing of data between different applications.
I don't know exactly what you are calling a Python database, but there are some options you might want to consider.
First, you can use Heroku to host your app and Heroku Postgres to host your database. Or you can use an AWS EC2 machine to host your app and its database (in case it's custom code that you can't call from a browser using Heroku). With both of these options you can access your database and the app; with the second one you can also install other services such as SSH.
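To illustrate the "make calls to it" part, here is a minimal sketch of querying a remotely hosted MySQL database from Python with SQLAlchemy; the hostname, credentials and the PyMySQL driver are assumptions, so swap in whatever your hosting provider gives you:

import sqlalchemy

# Hypothetical connection details supplied by your hosting platform
# (Heroku config vars, your EC2 instance's address, etc.).
engine = sqlalchemy.create_engine(
    "mysql+pymysql://DB_USER:DB_PASS@your-db-host.example.com:3306/DB_NAME"
)

with engine.connect() as conn:
    rows = conn.execute(sqlalchemy.text("SELECT 1")).fetchall()
    print(rows)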
I have tried to follow Google's documentation on how to set up local development using a database (https://cloud.google.com/appengine/docs/standard/python/tools/using-local-server#Python_Using_the_Datastore). However, I do not have the experience level to follow along, and I am not even sure that was the right guide. The application is a Django project that uses Python 2.7. To run the local host, I usually type dev_appserver.py --host 127.0.0.1 .
My questions are:
How do I download the Datastore database from Google Cloud? I do not want to download the entire database, just enough data to populate localhost so I can run tests.
Once the database is downloaded, what do I need to do to connect it to localhost? Do I have to change a parameter somewhere?
Do I even need to download the Datastore? Can I just make a duplicate in the cloud and then connect to that Datastore?
When I run localhost, should it not already be connected to the Datastore, since the site works when it is running in the cloud? Where can I find the connection URI?
Thanks for the help
The development server is meant to simulate the whole App Engine environment; if you examine the output of the dev_appserver.py command you'll see something like Starting Cloud Datastore emulator at: http://localhost:PORT. Your code will interact with that bundled Datastore automatically, pushing and retrieving data according to the code you wrote. Your data will be saved to a file in local storage and will persist across different runs of the development server unless it's explicitly deleted.
This option doesn't provide facilities to import data from your existing Cloud Datastore instance, although it's a ready-to-go solution if your testing procedures can afford populating the local database with mock data through a custom script that does so programmatically. If you decide on this approach, just write the data-creation script and execute it before running the tests.
Now, there is another option for simulating Datastore locally using the Cloud SDK, which comes with handy features for your purposes. You can find the available information under the Running the Datastore Emulator documentation page. This emulator supports importing entities downloaded from your production Cloud Datastore as well as exporting them into files.
Back to your questions:
Export data from the Cloud instance into a GCS bucket following this, then download the data from the bucket to your filesystem following this, and finally import the data into the emulator with the command shown here (a rough command sketch follows this list).
To use the emulator you need to first run gcloud beta emulators datastore start in a Cloud Shell and then in a separate tab run dev_appserver.py --support_datastore_emulator=true --datastore_emulator_port=8081 app.yaml.
The development server uses one of the two aforementioned emulators; in both cases it is not connected to your Cloud Datastore. You might instead create another project aimed at development purposes with a copy of your database and deploy your application there, so you don't use the emulator at all.
Requests to Datastore are made through the endpoint https://datastore.googleapis.com/v1/projects/project-id, although this is not related to how the emulators manage the connections in your local server.
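As mentioned above, a rough command sketch of the export/download/import steps; the bucket name, project ID, local paths and the emulator port (8081) are placeholders, and the exact name of the .overall_export_metadata file comes from the export you ran:

# 1. Export the production Datastore into a GCS bucket.
gcloud datastore export gs://YOUR_BUCKET/datastore-backup

# 2. Download the export files to your local filesystem.
gsutil -m cp -r gs://YOUR_BUCKET/datastore-backup ./datastore-backup

# 3. With the emulator running, import the export through its REST endpoint.
curl -X POST localhost:8081/v1/projects/YOUR_PROJECT_ID:import \
  -H 'Content-Type: application/json' \
  -d '{"input_url": "/full/path/to/datastore-backup/[EXPORT].overall_export_metadata"}'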
Hope this helps.
I'm trying to connect to Cloud SQL running PostgreSQL 11 from a Cloud Function Python script. Ideally, I want to connect to the DB using a simple pandas.DataFrame.to_sql(), which requires a sqlalchemy.create_engine() to connect.
This is my current attempt (I've tried both with and without the /.s.PGSQL.5432 suffix):
engine = sqlalchemy.create_engine('postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME?host=/cloudsql/INSTANCE_CONNECTION_NAME/.s.PGSQL.5432')
engine.connect()
I also tried the code sample provided by Google, which uses psycopg2 directly, but it also didn't work:
Error: function crashed. Details:
could not connect to server: Connection refused
Is the server running locally and accepting
connections on Unix domain socket "/cloudsql/vida-production:us-central1:vida-ops/.s.PGSQL.5432"?
Edit:
Thanks @kurtisvg for pointing this out. Since the Cloud Function is trying to connect to a Cloud SQL instance that is in a different project, I have to add the Cloud Function's service account to the IAM of the Cloud SQL project.
After setting up IAM, the Python code I use to connect the sqlalchemy engine to Cloud SQL is this:
# Don't add the "/.s.PGSQL.5432" suffix, because it will already be added back automatically by the library...
engine = sqlalchemy.create_engine('postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME?host=/cloudsql/INSTANCE_CONNECTION_NAME')
engine.connect()
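With that engine in place, the pandas call from the question works as usual; a brief sketch, assuming a DataFrame df and a hypothetical table name:

import pandas as pd

df = pd.DataFrame({"id": [1, 2], "value": ["a", "b"]})
# "my_table" is a made-up table name; if_exists="append" keeps existing rows.
df.to_sql("my_table", con=engine, if_exists="append", index=False)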
Make sure to check the following things:
Have the Cloud SQL Admin API enabled
Make sure the Functions default service account has the Cloud SQL Client IAM role
If you are connecting between two projects, make sure to enable the Admin API and grant permissions (for both projects) to the Service Account that is running the function you are trying to connect from.
I am looking to create a Python application that accesses a database. However, people must be able to access this database from different computers and always receive an up-to-date version. I understand that this would have to be some sort of cloud-based database, but I cannot seem to find an API with Python bindings or a module that allows me to do this. Does anybody know of an API or module that I could use to do this?
You can try Google Cloud Datastore and the App Engine Datastore (NDB) API, which fulfil your requirements:
https://developers.google.com/datastore/ https://developers.google.com/appengine/docs/python/ndb/
And for the API you can use the Remote API.
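For a flavour of the NDB API, here is a minimal sketch of a model inside an App Engine (Python 2.7) application; the model and property names are made up for illustration:

from google.appengine.ext import ndb

class Note(ndb.Model):
    text = ndb.StringProperty()
    created = ndb.DateTimeProperty(auto_now_add=True)

# Writes and reads go to the App Engine Datastore.
key = Note(text='hello').put()
print(key.get().text)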
If you are going to use AWS, you can use Amazon RDS for the database and Elastic Beanstalk for deploying your Python application to the cloud.
This link provides information on how to implement the database part on AWS: Adding an Amazon RDS DB Instance to Your Python Application Environment
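When an RDS instance is attached to an Elastic Beanstalk environment, the connection settings are exposed to your application as environment variables; a rough sketch of reading them from Python, assuming a MySQL RDS instance and the PyMySQL driver (both assumptions):

import os
import pymysql

# Environment variables set by Elastic Beanstalk for an attached RDS instance.
conn = pymysql.connect(
    host=os.environ['RDS_HOSTNAME'],
    port=int(os.environ['RDS_PORT']),
    user=os.environ['RDS_USERNAME'],
    password=os.environ['RDS_PASSWORD'],
    db=os.environ['RDS_DB_NAME'],
)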
If you want to use Microsoft Azure, then you can refer to the following links:
Azure SQL Database libraries for Python
Use Python to query an Azure SQL database
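The second link boils down to a pyodbc connection along these lines; the server, database and credential values are placeholders, and the ODBC driver version may differ on your machine:

import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:your-server.database.windows.net,1433;"
    "Database=your-database;Uid=your-username;Pwd=your-password;"
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
)
cursor = conn.cursor()
cursor.execute("SELECT @@version")
print(cursor.fetchone()[0])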