I am attempting to use the Google Admin SDK to read user data inside Cloud Functions. I develop on a local machine and then run a Google Cloud Build that deploys the function. How can I initialize the Admin SDK in Python using a key (or a set of keys)? Ideally, I'd like to keep the secret key in a local .env file (out of source control), then use Cloud KMS to generate an encrypted version of the key that can be kept in source control. However, I do not see how I can initialize the Python Admin SDK with just a series of keys. How is this achievable?
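To illustrate what I'm after, something along these lines — a rough sketch only; the env var names, admin user, and domain below are placeholders I made up:

```python
import os

from google.oauth2 import service_account
from googleapiclient.discovery import build

# SA_PRIVATE_KEY and SA_CLIENT_EMAIL are made-up env var names; they would
# come from the local .env file (or from a decrypted secret in CI).
info = {
    "type": "service_account",
    "private_key": os.environ["SA_PRIVATE_KEY"].replace("\\n", "\n"),
    "client_email": os.environ["SA_CLIENT_EMAIL"],
    "token_uri": "https://oauth2.googleapis.com/token",
}

credentials = service_account.Credentials.from_service_account_info(
    info,
    scopes=["https://www.googleapis.com/auth/admin.directory.user.readonly"],
    # Admin SDK calls generally require domain-wide delegation, so the
    # service account impersonates a super-admin user of the domain.
    subject="admin@example.com",
)

directory = build("admin", "directory_v1", credentials=credentials)
users = directory.users().list(domain="example.com", maxResults=10).execute()
```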
I'm trying to get started with the Google Analytics Data API v1 (GA4) using Python and a Jupyter Notebook. I'm following the instructions at https://developers.google.com/analytics/devguides/reporting/data/v1/quickstart-client-libraries and get to Step 3, "Configure authentication".
The docs then say you need to set GOOGLE_APPLICATION_CREDENTIALS="[PATH]". I downloaded the key file to my computer and added it to the project folder, but I still can't authenticate with the service account.
On GitHub (https://github.com/googleapis/python-analytics-data#installation) they say you need to use a virtual environment. Is that so? Will it work without one?
I am using a service account, not OAuth 2.0.
To be clear, GOOGLE_APPLICATION_CREDENTIALS is an environment variable. It is used by many of the Google client libraries to load the credentials for any of the APIs. The question I marked this as a duplicate of shows a number of ways to set it.
As you still seem to be a little unsure, here is some additional information.
As stated in the docs:
An easy way to provide service account credentials is by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable. The API client will use the value of this variable to find the service account key JSON file.
You need to set an env var on your machine to the path of the service account key file.
There are a number of examples of how to do that.
Authenticating as a service account
Set GOOGLE_APPLICATION_CREDENTIALS in Python project to use Google API
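If you'd rather not fiddle with shell configuration, you can also set the variable from Python itself before creating the client. A minimal sketch, with a placeholder path to the downloaded key:

```python
import os

# Placeholder path -- point this at wherever you saved the JSON key.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account-key.json"

from google.analytics.data_v1beta import BetaAnalyticsDataClient

# The client library reads the env var and loads the key file on its own.
client = BetaAnalyticsDataClient()
```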
I got it working. Here's how it goes, step by step, for beginners:
https://developers.google.com/analytics/devguides/reporting/data/v1/quickstart-client-libraries
Click the blue "Enable the Google Analytics Data API v1" button: a project is created, a service account is created in Google Cloud, and the Google Analytics Data API is enabled. Download the JSON key file.
After that, you need to add the service account to the GA4 property.
Then you need to authenticate following the method from https://cloud.google.com/docs/authentication/production#setting_the_environment_variable
This requires passing credentials manually. Using the code from the article, you point it at the path to the JSON file on your computer.
When trying to authenticate the service account, you may get a 403: GET storage.googleapis.com/storage/v1/… : starting-account-smhpwtovr5jj@test-data-api-1654114095791.iam.gserviceaccount.com does not have storage.buckets.list access to the Google Cloud project.
To fix this, go to IAM and grant the service account a role such as Owner or Storage Admin. After that, GOOGLE_APPLICATION_CREDENTIALS is found and you can start using the Data API library.
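For instance, instead of relying on the environment variable, you can load the key file directly when constructing the client. A minimal sketch — the key path and GA4 property ID below are placeholders:

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import DateRange, Metric, RunReportRequest

# Load the key file directly instead of going through the env var.
client = BetaAnalyticsDataClient.from_service_account_json(
    "/path/to/service-account-key.json"
)

# Placeholder GA4 property ID.
request = RunReportRequest(
    property="properties/123456789",
    metrics=[Metric(name="activeUsers")],
    date_ranges=[DateRange(start_date="7daysAgo", end_date="today")],
)
response = client.run_report(request)
for row in response.rows:
    print(row.metric_values[0].value)
```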
I'm trying to zip deploy an Azure Function from blob storage.
I have set SCM_DO_BUILD_DURING_DEPLOYMENT to true.
I have also set WEBSITE_RUN_FROM_PACKAGE to the remote URL.
I am able to deploy easily if the package is at a remote URL; however, I can't seem to do it when it is stored as a blob on Azure.
The preferred runtime for this is Python.
To zip deploy from a storage account, navigate to your .zip blob in the storage account and generate a SAS token for that blob.
Then add the resulting URL to your Function App's Application settings as WEBSITE_RUN_FROM_PACKAGE.
NOTE: This option is the only one supported for running from a package on Linux hosted in a Consumption plan.
For more information, refer to "Run your functions from a package file in Azure".
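If you'd rather script the SAS generation than click through the portal, here is a rough Python sketch using the azure-storage-blob package — the account, container, blob, and key below are all placeholders:

```python
from datetime import datetime, timedelta

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Generate a read-only SAS token for the deployment package blob.
sas_token = generate_blob_sas(
    account_name="mystorageaccount",
    container_name="deployments",
    blob_name="functionapp.zip",
    account_key="<storage-account-key>",  # from the account's Access keys blade
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(days=365),
)

# This full URL is the value to put in WEBSITE_RUN_FROM_PACKAGE.
package_url = (
    "https://mystorageaccount.blob.core.windows.net/"
    f"deployments/functionapp.zip?{sas_token}"
)
print(package_url)
```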
I am trying to build a Python script and deploy it as an HTTP/serverless cloud function on Pivotal Cloud Foundry or GKE. I have gone through several articles, and most of them mention creating a service account, downloading its JSON key, setting an environment variable to the key file's location, and then running the script.
But how can I provide that locally downloaded JSON key file when I deploy to the cloud?
I have gone through the links below, but as I am new to GCP I couldn't follow them. Can anyone give me an elaborated answer on how to achieve this?
Google Cloud Vision API - Python
Google cloud vision api- OCR
https://cloud.google.com/vision/docs/quickstart-client-libraries#client-libraries-usage-python
According to the docs, during function execution Cloud Functions uses the service account PROJECT_ID@appspot.gserviceaccount.com as its identity. For instance, when making requests to Google Cloud Platform services using the Google Cloud Client Libraries, Cloud Functions can automatically obtain and use tokens to authorize the services this identity has permission to use.
By default, the runtime service account has the Editor role, which lets it access many GCP services. In your case, you will need to enable the Vision API and grant the default service account the necessary permissions. Check out the Function Identity docs for more details.
Instead of an SA JSON file, you could use an API key if that's easier for you. However, with an API key you can only send image bytes or specify a publicly accessible image.
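To make this concrete, here is a minimal sketch of an HTTP function that calls the Vision API with no key file at all, relying on the runtime service account; the bucket path is a placeholder:

```python
from google.cloud import vision

def annotate(request):
    # No key file needed here: inside Cloud Functions the client library
    # automatically authenticates as the runtime service account.
    client = vision.ImageAnnotatorClient()

    image = vision.Image()
    # Placeholder: an image in a bucket the runtime service account can read.
    image.source.image_uri = "gs://my-bucket/sample.jpg"

    response = client.text_detection(image=image)
    return {"texts": [t.description for t in response.text_annotations]}
```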
We are using GCP's Firebase with Firestore for a new mobile app we are developing. As part of this effort we need to deploy a number of cloud functions which will act as Firestore triggers for doing some back end processing.
Our intention is to keep the deploys encapsulated inside Firebase by using the Firebase CLI tools. However, when we attempt to initialize the Firebase project for functions using "firebase init functions", the only two language options are JavaScript and TypeScript, and the only deployable stack seems to be Node.js.
On previous GCP projects we had deployed Python based cloud functions (using the gcloud cli) and ideally we'd like to continue using Python for our Firebase cloud functions. So my questions are:
Is it possible to deploy Python-based Firebase cloud functions? If not:
Can we simply go back to deploying Python-based GCP cloud functions using the gcloud CLI and still have them work as Firestore triggers?
Thanks
The Firebase CLI does not support deploying functions written in python.
You can certainly write Cloud Firestore triggers in python and deploy them with gcloud.
One thing you might not be aware of: the underlying Cloud Functions product is the same no matter how you deploy your functions. Firebase just adds tools and APIs on top of the existing Google Cloud Functions infrastructure. There is really no such thing as a "Firebase Cloud Function". There is just Cloud Functions, and you have options about how you can write and deploy them, either using gcloud, or the Firebase CLI.
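For completeness, a minimal sketch of a Python Firestore trigger and its gcloud deploy command (1st-gen background-function signature; the project, collection, and function names are placeholders):

```python
# main.py -- background function with the 1st-gen (data, context) signature.
def on_user_write(data, context):
    """Runs whenever a document in the watched collection is written."""
    # context.resource is the full path of the document that changed.
    print(f"Triggered by change to: {context.resource}")
    # The new document state (if any) arrives under data["value"].
    print(data.get("value", {}).get("fields"))

# Deployed with something like (placeholders for project/collection):
#   gcloud functions deploy on_user_write \
#     --runtime python39 \
#     --trigger-event providers/cloud.firestore/eventTypes/document.write \
#     --trigger-resource "projects/YOUR_PROJECT/databases/(default)/documents/users/{userId}"
```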
I have not found a satisfactory answer/tutorial for this, but I'm sure it must be out there. My goal is to access Google Drive programmatically using my credentials. A secondary and lower-priority goal is to do this properly and that means using OAuth rather than ClientLogin.
Thus: How do you authenticate with the Google Drive API using your own credentials for your own Google Drive (without creating an application on the Google Developers Console)?
All of the documentation assumes an application, but what I'm writing is merely helper scripts in Python 2.7 for my own benefit.
"How do you authenticate with the Google Drive API using your own credentials for your own Google Drive (without creating an application on the Google Developers Console)?"
You can't. The premise of OAuth is that the user is granting access to the application, and so the application must be registered. In Google's case, that's the API/Cloud Console.
In your case, there is no need to register each application that uses your helper scripts. Just create an app called helper_scripts, embed the client ID in your script source, and then reuse those scripts in as many applications as you like.
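To sketch what that looks like today (this needs Python 3 and the google-auth-oauthlib / google-api-python-client packages, not Python 2.7; the file names are placeholders):

```python
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]

# client_secret.json is downloaded once when you register the helper_scripts
# app in the console; it can be reused by every script you write.
flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
# Opens a browser one time so you can grant your own account access.
creds = flow.run_local_server(port=0)

drive = build("drive", "v3", credentials=creds)
files = drive.files().list(pageSize=10, fields="files(id, name)").execute()
for f in files.get("files", []):
    print(f["name"], f["id"])
```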