I am trying to get Firestore working and I am wondering which Python package I should use. There appears to be some overlap between the functionality of firebase_admin.firestore and google.cloud.firestore, as well as between firebase_admin.credentials and google.auth.credentials. But it also seems like there is some incompatibility between them, or at least that they can't be used together.
What is the difference between these Python packages, and which one is recommended for a beginner?
Thanks!
Here’s a simple answer for beginners, as requested:
Google provides Firebase and Google Cloud Platform as two different suites of products. Some Google products are shared across these, Cloud Firestore being one of them.
So, if one is using Cloud Firestore from within a Firebase project, it is recommended to use the firebase_admin.firestore and firebase_admin.credentials packages, along with other Firebase APIs as required.
If, on the other hand, one is using Cloud Firestore from within a Google Cloud Platform project, it is recommended to use the google.cloud.firestore and google.auth.credentials packages, along with other GCP Cloud APIs as required.
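For illustration, here is a minimal sketch of what initialization looks like down each path (the key file path is a placeholder; both clients end up talking to the same Firestore database):

    # Option A: inside a Firebase project -> firebase_admin
    import firebase_admin
    from firebase_admin import credentials, firestore

    cred = credentials.Certificate("serviceAccountKey.json")  # placeholder path
    firebase_admin.initialize_app(cred)
    db = firestore.client()

    # Option B: plain GCP project -> google-cloud-firestore
    from google.cloud import firestore as gc_firestore

    db = gc_firestore.Client()  # uses Application Default Credentials via google.auth

Under the hood, firebase_admin.firestore is a thin wrapper around google.cloud.firestore, so the db object you get back behaves the same either way; the difference is mainly in how credentials are supplied and which other Firebase APIs you have at hand.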
HTH, reach out for any additional questions/queries.
If you plan to use other Firebase functionality (e.g. eventually have mobile users that authenticate with Firebase, and connect to their Firestore instance from their phones or web pages), then firebase_admin is the best choice.
Otherwise, if you plan on using Firestore without Firebase, then google.cloud.firestore would be more straightforward.
The same applies to the credentials libraries.
And the main differences are between the Cloud services:
Cloud Firestore supports SDKs for Android, iOS, and Web. Combined with Cloud Firestore security rules and Firebase Auth, the mobile and web SDKs support serverless app architectures where clients connect directly to your Cloud Firestore database. With a serverless architecture, you do not need to maintain an intermediary server between your clients and your Cloud Firestore database.
The Firebase Admin SDKs bundle the Google Cloud client libraries for Cloud Firestore alongside client libraries and SDKs for several other Firebase features. They are for accessing your Firebase products from a backend server you control, which could be Cloud Functions or even your desktop. The Admin SDK will typically have full access to everything, as determined by the service account you used to initialize it.
Related
I am trying to build a Python script and deploy it as an HTTP function/serverless cloud function on Pivotal Cloud Foundry or GKE. I have gone through several articles, and most of them mention using a service account (SA), downloading the JSON key, setting an environment variable to the JSON key location, and then running the script.
But how can I provide a locally downloaded JSON key file when I deploy it on the cloud?
I have gone through the links below but I couldn't understand them, as I am new to GCP. Can anyone provide me with a more elaborate answer on how I can achieve this?
Google Cloud Vision API - Python
Google cloud vision api- OCR
https://cloud.google.com/vision/docs/quickstart-client-libraries#client-libraries-usage-python
According to the docs, during function execution, Cloud Functions uses the service account PROJECT_ID@appspot.gserviceaccount.com as its identity. For instance, when making requests to Google Cloud Platform services using the Google Cloud Client Libraries, Cloud Functions can automatically obtain and use tokens to authorize the services this identity has permissions to use.
By default, the runtime service account has the Editor role, which lets it access many GCP services. In your case, you will need to enable the Vision API and grant the default service account with necessary permissions. Check out the Function Identity Docs for more details.
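As a rough sketch, a Vision call from inside a Cloud Function can then look like this, with no key file at all (the image URI is a placeholder, and this assumes the google-cloud-vision 2.x client):

    # main.py for an HTTP-triggered Cloud Function (sketch)
    from google.cloud import vision

    def detect_text(request):
        # No JSON key needed: the client library picks up the function's
        # runtime service account from the environment automatically.
        client = vision.ImageAnnotatorClient()
        image = vision.Image(source=vision.ImageSource(image_uri="gs://your-bucket/sample.png"))
        response = client.text_detection(image=image)
        return "\n".join(t.description for t in response.text_annotations)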
Instead of an SA JSON file, you could use an API key if that's easier for you. However, if you use an API key, you will only be able to send image bytes or specify a public image.
We are using GCP's Firebase with Firestore for a new mobile app we are developing. As part of this effort we need to deploy a number of cloud functions which will act as Firestore triggers for doing some back end processing.
Our intention is to keep the deploys encapsulated inside of Firebase by using the Firebase CLI tools. However, when we attempt to initialize the Firebase project for functions using the "firebase init functions" call, the only two language options are "JavaScript" and "TypeScript", and the only deployable stack seems to be Node.js.
On previous GCP projects we had deployed Python based cloud functions (using the gcloud cli) and ideally we'd like to continue using Python for our Firebase cloud functions. So my questions are:
Is it possible to deploy Python-based Firebase cloud functions? If not:
Can we simply go back to deploying Python-based GCP cloud functions using the gcloud CLI and still have them work as Firestore triggers?
Thanks
The Firebase CLI does not support deploying functions written in Python.
You can certainly write Cloud Firestore triggers in Python and deploy them with gcloud.
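To make that concrete, here is a minimal sketch of a Python Firestore trigger and a (1st gen) gcloud deploy command; the project ID, runtime, and collection path are placeholders:

    # main.py
    def on_user_write(data, context):
        # Background function fired on any write to a matching document.
        print("Triggered by change to: " + str(context.resource))
        print("New value: " + str(data.get("value")))

    # Deployed with something like:
    #   gcloud functions deploy on_user_write \
    #     --runtime python39 \
    #     --trigger-event providers/cloud.firestore/eventTypes/document.write \
    #     --trigger-resource "projects/YOUR_PROJECT/databases/(default)/documents/users/{userId}"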
One thing you might not be aware of: the underlying Cloud Functions product is the same no matter how you deploy your functions. Firebase just adds tools and APIs on top of the existing Google Cloud Functions infrastructure. There is really no such thing as a "Firebase Cloud Function". There is just Cloud Functions, and you have options about how you can write and deploy them, either using gcloud, or the Firebase CLI.
Just have a small doubt: I am building a small application in Python which will use the Firebase database and storage, and I have been reading a lot about the Firebase Admin SDK. What is the basic difference between the normal Firebase services and the Firebase Admin SDK, or are they one and the same? Also, if I am starting development now, is using the Firebase Admin SDK recommended?
In addition, I will integrate the database and storage with my Android application.
I'll add on to what Doug said in his answer, I suspect you might be confusing the front-end Firebase packages with back-end admin packages.
If you are going to have the client interact with Firebase, you'll need to use the front-end packages. Depending on the platform your front-end is being presented on (Web, iOS, Android, etc.), there are different options to suit your platform: the Web uses the JavaScript SDK, iOS uses the iOS SDK, and so on.
The Admin SDKs allow you to add back-end functionality. Because there are so many different languages that can be used on the back-end, there are many flavors of the Admin SDK.
This release-notes page does a good job demonstrating the many "Firebase" packages available... both front-end and back-end.
If you're writing code in Python and you want to access Firebase and Cloud services, the recommended option is to use the Admin SDK. It's designed to be the easiest way to read and write data in your database, upload and download files in your Cloud Storage buckets, and perform other administrative functions, such as authenticated user management.
I'm not sure what you mean by "normal Firebase services".
Python Firebase Admin SDK: if you use the Firebase Admin SDK, then you have admin access to things without worrying about the restrictions and rules on your database.
Python Firebase: if you use python-firebase (the normal Firebase client), then you need to authenticate each time you access your data, which means the rules and permissions you set on your documents do apply.
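As a rough sketch of the Admin SDK route for your case (database plus storage), using Firestore for the database part and placeholder key path and bucket name:

    import firebase_admin
    from firebase_admin import credentials, firestore, storage

    # Placeholder service account key and bucket name.
    cred = credentials.Certificate("serviceAccountKey.json")
    firebase_admin.initialize_app(cred, {"storageBucket": "your-project-id.appspot.com"})

    # Firestore access via the Admin SDK is not subject to client security rules.
    db = firestore.client()
    db.collection("users").document("alice").set({"plan": "free"})

    # Cloud Storage access through the same app.
    bucket = storage.bucket()
    bucket.blob("uploads/avatar.png").upload_from_filename("avatar.png")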
My goal is to build an app powered by Google App Engine + NDB Datastore which facilitates a RESTful API so that I can use VueJS in the frontend.
I am at a loss with this task. From my research, I have been pointed towards endpoints-proto-datastore and Google Cloud Endpoints. The front page of endpoints-proto-datastore states the following:
This library is intended to be used with the Python version of Google Cloud Endpoints.
This sentence suggests that even this library requires, depends on, or relates to Google Cloud Endpoints. I have tried to get started with Google Cloud Endpoints and ended up dabbling with swagger.io and the OpenAPI spec. I've been down the rabbit hole for months. At this point, I'm seeking some clarification.
I have the following questions:
What is the best approach to achieving my goal?
What is the relationship between these two pieces of the puzzle (Cloud Endpoints and endpoints-proto-datastore)?
What is the simplest way to get started with my goal?
Thank you.
endpoints-proto-datastore is a library designed to be used with the Cloud Endpoints Framework for Python. However, endpoints-proto-datastore is not itself part of the Cloud Endpoints Framework, and is not supported by Google.
While Cloud Datastore is accessible from any environment, the ndb Datastore library can be used only in App Engine's Standard environment, so you would need to use the Cloud Endpoints Framework for Python rather than the regular Cloud Endpoints functionality.
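If it helps orientation, a bare-bones Endpoints Framework service (the layer endpoints-proto-datastore builds on) looks roughly like this; the API name and message are made up for illustration:

    import endpoints
    from protorpc import message_types, messages, remote

    class EchoResponse(messages.Message):
        content = messages.StringField(1)

    @endpoints.api(name="echo", version="v1")
    class EchoApi(remote.Service):
        @endpoints.method(message_types.VoidMessage, EchoResponse,
                          path="echo", http_method="GET", name="echo")
        def echo(self, request):
            return EchoResponse(content="hello")

    api = endpoints.api_server([EchoApi])

endpoints-proto-datastore then layers on top of this, generating the request/response plumbing directly from your ndb models instead of hand-written message classes.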
While setting permissions for a new Google Compute Engine VM instance, I noticed that "Cloud Datastore" is a service I can grant my VM access to.
As far as I can tell, remote_api and the Python Protobuf Datastore API both use service accounts, which bypass VM permissions.
Does Google have any Datastore libraries that support VM-permission-based authentication?
Yes.
There's also the Cloud Datastore API, which can be accessed quite easily with gcloud-python, the idiomatic Python client for Google Cloud Platform services. Specifically, its datastore client is a convenience wrapper for invoking the APIs/factories with a dataset ID, which is the same as the cloud project ID.
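On a Compute Engine VM, that client picks up credentials and the project from the VM's service account and metadata server, so no key file is involved. A minimal sketch (the entity kind and fields are made up):

    from google.cloud import datastore  # successor to the old gcloud-python package

    client = datastore.Client()  # uses the VM's default service account

    key = client.key("Task", "sample-task")
    entity = datastore.Entity(key=key)
    entity.update({"description": "Buy milk", "done": False})
    client.put(entity)

    print(client.get(key))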