While setting permissions for a new Google Compute Engine VM instance, I noticed that "Cloud Datastore" is a service I can grant my VM access to.
As far as I can tell, remote_api and the Python Protobuf Datastore API both use service accounts, which bypass VM permissions.
Does Google have any Datastore libraries that support VM-permission-based authentication?
Yes.
There's also the Cloud Datastore API, which can be accessed quite easily with gcloud-python, the idiomatic Python client for Google Cloud Platform services. Specifically, its datastore client is a convenience wrapper for invoking the APIs/factories with a dataset ID (the same as your Cloud project ID).
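For illustration, here's a minimal sketch using the current google-cloud-datastore package (the successor to gcloud-python); the project ID and the entity kind are placeholders:

    from google.cloud import datastore

    # On a GCE VM, the client picks up the instance's default credentials,
    # so access is governed by the scopes/permissions granted to the VM.
    client = datastore.Client(project="my-project")

    # Write and read back a simple entity.
    key = client.key("Task", 1234)
    entity = datastore.Entity(key=key)
    entity["description"] = "Buy milk"
    client.put(entity)

    print(client.get(key))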
Related
I am trying to get Firestore working and I am wondering which Python package I should use. It appears that there is some overlap between the functionality of the firebase_admin.firestore and google.cloud.firestore, as well as between firebase_admin.credentials and google.auth.credentials. But it also seems like there is some incompatibility between them, or at least they can't be used together.
What is the difference between these Python packages, and which is it recommended that a beginner should use?
Thanks!
Here’s a simple answer for beginners, as requested:
Google provides Firebase and Google Cloud Platform as two different suites of products. Some Google products are shared across these, Cloud Firestore being one of them.
So, if one is using Cloud Firestore from within a Firebase project, it is recommended to use the firebase_admin.firestore and firebase_admin.credentials packages, along with other Firebase APIs as required.
If, on the other hand, one is using a Google Cloud Platform project, it is recommended to use the google.cloud.firestore and google.auth.credentials packages, along with other GCP Cloud APIs as required.
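For concreteness, here's a minimal sketch of both initialization paths; the key file name and project ID are placeholders:

    # Option 1: Firebase project, via the Firebase Admin SDK.
    import firebase_admin
    from firebase_admin import credentials, firestore

    cred = credentials.Certificate("serviceAccount.json")  # placeholder key file
    firebase_admin.initialize_app(cred)
    db = firestore.client()

    # Option 2: GCP project, via the Google Cloud client library.
    from google.cloud import firestore as gc_firestore

    db2 = gc_firestore.Client(project="my-project")  # placeholder project ID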
HTH, reach out for any additional questions/queries.
If you plan to use other Firebase functionality (e.g. eventually have mobile users that authenticate with Firebase, and connect to their Firestore instance from their phones or web pages), then firebase_admin is the best choice.
Otherwise, if you plan on using Firestore without Firebase, then google.cloud.firestore would be more straightforward.
The same applies to the credentials libraries.
And the main differences are between the Cloud services:
Cloud Firestore supports SDKs for Android, iOS, and Web. Combined with Cloud Firestore security rules and Firebase Auth, the mobile and web SDKs support serverless app architectures where clients connect directly to your Cloud Firestore database. With a serverless architecture, you do not need to maintain an intermediary server between your clients and your Cloud Firestore database.
The Firebase Admin SDKs bundle the Google Cloud client libraries for Cloud Firestore alongside client libraries and SDKs for several other Firebase features. They are meant for accessing your Firebase products from a backend server you control, which could be Cloud Functions or even your desktop. The SDK will typically have full access to everything, as determined by the service account you used to initialize it.
I'm trying to connect my webapp2 application to an 'in-cloud' database.
To run it locally I'm using the following dev_appserver.py flags:
    --datastore_path=/<path>/<to>/<project>/.db/datastore
    --blobstore_path=/<path>/<to>/<project>/.db/blobstore
The problem is that I don't want a local path to my datastore/blobstore.
Is there any way to connect to an 'in-cloud' database by passing a different path? I can't find any solution like that.
You can use Remote API:
The Remote API library allows any Python client to access services available to App Engine applications. For example, if your App Engine application uses Datastore or Google Cloud Storage, a Python client could access those storage resources using the Remote API.
in order to get remote access to Google Cloud Datastore from webapp2.
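As a rough sketch (assuming remote_api is enabled in your app.yaml, and with "your-app-id" as a placeholder hostname), configuring the stub makes subsequent Datastore calls go to the cloud instead of a local file:

    from google.appengine.ext.remote_api import remote_api_stub

    # Point the local API stubs at the deployed app.
    remote_api_stub.ConfigureRemoteApiForOAuth(
        "your-app-id.appspot.com", "/_ah/remote_api")

    # From here on, db/ndb Datastore calls operate on the cloud Datastore.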
I am trying to build a Python script and deploy it as an HTTP function/serverless cloud function on Pivotal Cloud Foundry or GKE. I have gone through several articles, and most of them mention using a service account: download its JSON key, set an environment variable to the JSON key's location, and run the script.
But how can I provide the locally downloaded JSON key file when I deploy it to the cloud?
I have gone through the links below, but I couldn't understand them as I am new to GCP. Can anyone give me a detailed answer on how to achieve this?
Google Cloud Vision API - Python
Google cloud vision api- OCR
https://cloud.google.com/vision/docs/quickstart-client-libraries#client-libraries-usage-python
According to the docs, during function execution, Cloud Functions uses the service account PROJECT_ID@appspot.gserviceaccount.com as its identity. For instance, when making requests to Google Cloud Platform services using the Google Cloud Client Libraries, Cloud Functions can automatically obtain and use tokens to authorize the services this identity has permissions to use.
By default, the runtime service account has the Editor role, which lets it access many GCP services. In your case, you will need to enable the Vision API and grant the default service account with necessary permissions. Check out the Function Identity Docs for more details.
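For example, here's a minimal sketch of an HTTP Cloud Function that calls the Vision API through the runtime service account's default credentials, with no JSON key file (assuming the 2.x google-cloud-vision client; the bucket path is a placeholder):

    from google.cloud import vision

    def detect_labels(request):
        # No explicit credentials: the client uses the function's identity.
        client = vision.ImageAnnotatorClient()
        image = vision.Image(
            source=vision.ImageSource(image_uri="gs://my-bucket/photo.jpg"))
        response = client.label_detection(image=image)
        return ", ".join(l.description for l in response.label_annotations)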
Instead of a service account JSON file, you could use an API key if that's easier for you. However, if you use an API key, you will only be able to send image bytes or specify a public image.
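A hedged sketch of the API-key route, calling the Vision REST endpoint directly with image bytes (YOUR_API_KEY and the file name are placeholders):

    import base64
    import json
    import urllib.request

    API_KEY = "YOUR_API_KEY"  # placeholder
    with open("photo.jpg", "rb") as f:
        content = base64.b64encode(f.read()).decode()

    body = json.dumps({"requests": [{
        "image": {"content": content},
        "features": [{"type": "LABEL_DETECTION"}],
    }]}).encode()

    req = urllib.request.Request(
        "https://vision.googleapis.com/v1/images:annotate?key=" + API_KEY,
        data=body, headers={"Content-Type": "application/json"})
    print(urllib.request.urlopen(req).read().decode())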
I'm developing a platform that needs to retrieve the deployed/configured services from the AWS cloud. When I checked on Google, I learned that we need to set up a third-party user with a cross-account role.
Scenario:
Consider that in my AWS account I have configured S3, Cognito, EC2, a load balancer, DynamoDB, etc. I want to build an engine in Python that retrieves the services the user has deployed on AWS.
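As a hedged starting point, this kind of enumeration is usually done with boto3; assuming credentials from the cross-account role are already in place, listing a couple of service types might look like:

    import boto3

    session = boto3.Session()  # or a session built from assumed-role credentials

    # List S3 buckets.
    s3 = session.client("s3")
    print([b["Name"] for b in s3.list_buckets()["Buckets"]])

    # List EC2 instance IDs.
    ec2 = session.client("ec2")
    print([i["InstanceId"]
           for r in ec2.describe_instances()["Reservations"]
           for i in r["Instances"]])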
I have not found a satisfactory answer/tutorial for this, but I'm sure it must be out there. My goal is to access Google Drive programmatically using my credentials. A secondary and lower-priority goal is to do this properly and that means using OAuth rather than ClientLogin.
Thus: How do you authenticate with the Google Drive API using your own credentials for your own Google Drive (without creating an application on the Google Developers Console)?
All of the documentation assumes an application, but what I'm writing is merely helper scripts in Python 2.7 for my own benefit.
"How do you authenticate with the Google Drive API using your own credentials for your own Google Drive (without creating an application on the Google Developers Console)?"
You can't. The premise of OAuth is that the user is granting access to the application, and so the application must be registered. In Google's case, that's the API/Cloud Console.
In your case, there is no need to register each application that uses your helper scripts. Just create an app called helper_scripts, embed its client ID in your script source, and then reuse those scripts in as many applications as you like.
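As a sketch, using the modern google-auth-oauthlib and google-api-python-client libraries (the question mentions Python 2.7, but the flow is the same idea), a one-time registered helper_scripts app could authenticate like this; the secrets file name is a placeholder:

    from google_auth_oauthlib.flow import InstalledAppFlow
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]

    # Opens a browser for consent against your own Google account.
    flow = InstalledAppFlow.from_client_secrets_file("client_secrets.json", SCOPES)
    creds = flow.run_local_server(port=0)

    drive = build("drive", "v3", credentials=creds)
    for f in drive.files().list(pageSize=10).execute().get("files", []):
        print(f["name"])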