I've been using GCP for some time.
I created a new GCP Project to test new functions, and enabled the Runtime API.
However, clicking Deploy pops up this message:
"User does not have the 'iam.serviceAccounts.actAs' permission on webcalc-taskc2#appspot.gserviceaccount.com required to create the function. You can fix this by running gcloud iam service-accounts add-iam-policy-binding webcalc-taskc2#appspot.gserviceaccount.com --member=user: --role=roles/iam.serviceAccountUser"
I entered this into Cloud Shell, filling in my Owner email for GCP. Still no luck ;(
You have to grant your account permission to use (act as) the service account.
According to the Google documentation, a service account is a special type of Google account intended to represent a non-human user that needs to authenticate and be authorized to access data in Google APIs.
It seems that "webcalc-taskc2@appspot.gserviceaccount.com" is the service account that runs your Cloud Function.
The pop-up message you received says that your user account does not have the 'iam.serviceAccounts.actAs' permission on that service account.
So you have to run the command from the message, replacing --member=user: with --member=user:YOUR_ACCOUNT_EMAIL.
You can check your Cloud Function's service account on the Details -> General information tab.
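If you prefer to apply that binding from Python instead of the shell, a rough equivalent using the google-api-python-client IAM API would look like the sketch below (it assumes credentials that are allowed to change IAM policies; the member email is a placeholder):

import googleapiclient.discovery

PROJECT_ID = "webcalc-taskc2"
SA_EMAIL = f"{PROJECT_ID}@appspot.gserviceaccount.com"
MEMBER = "user:your-owner-email@example.com"  # placeholder: your GCP Owner account

iam = googleapiclient.discovery.build("iam", "v1")
resource = f"projects/{PROJECT_ID}/serviceAccounts/{SA_EMAIL}"

# Read the service account's current IAM policy, add the binding, write it back.
policy = iam.projects().serviceAccounts().getIamPolicy(resource=resource).execute()
policy.setdefault("bindings", []).append(
    {"role": "roles/iam.serviceAccountUser", "members": [MEMBER]}
)
iam.projects().serviceAccounts().setIamPolicy(
    resource=resource, body={"policy": policy}
).execute()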
We have created a Flutter Web app that fetches BigQuery data through the BigQuery API from a Cloud Function. We were using a service account for authentication, but since we want to make our application public, we need to use end-user OAuth credentials instead.
I have tried to deploy the code from this link to a Cloud Function for testing, but the function keeps running and shuts down because of a timeout. I then checked the logs and found the reason: the Cloud Function cannot open a browser for authentication the way the code would when run locally.
Logs:
Function execution started
Please visit this URL to authorize this application: https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=XXXXXXXXXXXXXXXX&redirect_uri=http%3A%2F%2Flocalhost%3A8080%2F&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery&state=IXYIkeZUTaisTMGUGkVbgnohlor7Jx&access_type=offline.
Function execution took 180003 ms, finished with status: 'timeout'
I am confused as to how I can now authenticate and authorize a user once and reuse those credentials for every other BigQuery API call in our web app.
I think you are missing the point of the use of Cloud Functions. The documentation you shared clearly states:
This guide explains how to authenticate by using user accounts for access to the BigQuery API when your app is installed onto users' machines.
This is never the case for a Cloud Function, since it is hosted on a Google Cloud server and is available to you via an HTTP request or as a background process.
Because of that, a Cloud Function interacts with other GCP products by using service accounts, and if you want to set up authentication you will have to do it at the Cloud Function layer. I recommend you take a look at this documentation, which explains the principles of authentication with Cloud Functions.
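In practice that means the Cloud Function itself queries BigQuery with its runtime service account via Application Default Credentials; there is no browser OAuth flow at all. A minimal sketch of such an HTTP function, assuming the google-cloud-bigquery client library (the project, dataset, and table names are placeholders):

import json
from google.cloud import bigquery

def get_data(request):
    # The client picks up the function's runtime service account automatically.
    client = bigquery.Client()
    query = "SELECT name, value FROM `my-project.my_dataset.my_table` LIMIT 10"
    rows = client.query(query).result()
    return json.dumps({"rows": [dict(row) for row in rows]})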
I am attempting to programmatically create a federated user account and set up that account using automation.
The reason is that we want to create customized user environments (by logging the user in).
By default, when a domain is federated at IBM, the user account in that domain is not created; the user has to log in for the account to be created. See the notes from the documentation:
I want the user to be created so that automation scripts can provision services and resources using the Schematics SDK (Workspaces).
I found that the user can be logged in, which triggers account creation, by using the CLI:
https://cloud.ibm.com/docs/account?topic=account-federated_id
The problem is that when using the CLI, it prompts for a one-time passcode for SSO (federated) logins.
It says that to avoid the one-time codes in automation scripts, you have to use an API key:
https://cloud.ibm.com/docs/account?topic=account-federated_id#api_key
However, you can only get the API key after the user has been created, which brings me to this question: how do we get the API key before the user has logged in? Is there a programmatic way, or what have others done in Python, to get around this one-time passcode prompt so that the federated user can be logged in and their account and environment created for Schematics and other automation scripts to deploy instances, etc.?
I have developed a Google Cloud Function (GCF) in Python, which I want to access from a web service deployed on AWS (also written in Python). While the GCF was in development, it had the Cloud Functions Invoker role granted to allUsers; I assume that is why it didn't ask for an authorization token when called.
I want to revoke this public access and make it so that the function can only be called from the web service code and is not publicly accessible.
Possible approach: In my research I have found that this can be done with the following steps:
Remove all the unnecessary members who have permissions on the GCF.
Create a new service account that has restricted access to only use the GCF.
Download the service account key (JSON) and use it in the AWS web application.
Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of that service account key (JSON) file.
Questions
How do I generate the access token using the service account, which can then be sent as an Authorization: Bearer header in the HTTP call made to the GCF? Without this token, the GCF should return an error.
The docs say not to put the service account key in the source code, so what is the best way to go about it? They suggest using KMS, which seems like overkill.
Do not embed secrets related to authentication in source code, such as API keys, OAuth tokens, and service account credentials. You can use an environment variable pointing to credentials outside of the application's source code, such as Cloud Key Management Service.
What are the bare minimum permissions I will require for the service account?
Please feel free to correct me if my understanding is wrong and there is a better, preferable way to do this.
UPDATE: The web service on AWS will call the GCF in a server-to-server fashion. There is no need to propagate the client-end (end-user) credentials.
In your description, you don't mention who or what will call your GCF. A user? A Compute Engine instance? Another GCF? In any case, this page can help you find code examples.
Yes, a secret in plain text pushed to Git is no longer a secret! Here again, I don't know what is performing the call. If it's Compute Engine, Cloud Functions, Cloud Run, or any other GCP service, don't use a JSON key file; use the component's identity instead. I would say: create a service account and assign it to that component. Tell me more about where you are deploying if you want more help!
Related to 2: if you have a service account, the minimal role is roles/cloudfunctions.invoker. It's the minimal role needed to invoke a function:
gcloud beta functions add-iam-policy-binding RECEIVING_FUNCTION \
--member='serviceAccount:CALLING_FUNCTION_IDENTITY' \
--role='roles/cloudfunctions.invoker'
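Since your UPDATE says the AWS web service calls the GCF server-to-server, here is a minimal sketch of minting an identity token from the key file and sending it as a Bearer header, assuming the google-auth and requests libraries and that GOOGLE_APPLICATION_CREDENTIALS points at the key file; the function URL is a placeholder:

import google.auth.transport.requests
import google.oauth2.id_token
import requests

FUNCTION_URL = "https://REGION-PROJECT_ID.cloudfunctions.net/my-function"  # placeholder

# Mint an identity token whose audience is the function's URL.
auth_request = google.auth.transport.requests.Request()
token = google.oauth2.id_token.fetch_id_token(auth_request, FUNCTION_URL)

# Call the function; a caller without roles/cloudfunctions.invoker (or without
# a valid token) gets a 401/403 instead of a result.
response = requests.get(FUNCTION_URL, headers={"Authorization": f"Bearer {token}"})
print(response.status_code, response.text)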
Right now, when I kick off a Dataflow job, it runs as my username. Is there a way to run a GCP Dataflow job as a service account? If so, can you provide any samples?
Much appreciated!
To set up service account based authentication:
In the GCP Console, go to the Create service account key page.
Here
From the Service account list, select New service account.
In the Service account name field, enter a name.
From the Role list, select Project > Owner.
Note: The Role field authorizes your service account to access resources. You can view and change this field later by using the GCP Console. If you are developing a production app, specify more granular permissions than Project > Owner. For more information, see granting roles to service accounts.
Click Create.
A JSON file that contains your key downloads to your computer.
Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the file path of the JSON file that contains your service account key. This variable only applies to your current shell session, so if you open a new session, set the variable again.
You can get more information here: https://cloud.google.com/dataflow/docs/quickstarts/quickstart-python
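As a sample, here is a rough sketch of submitting a Beam pipeline to Dataflow with those service account credentials, assuming the apache-beam[gcp] package (the project, region, bucket, key path, and service account email are placeholders):

import os
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Point ADC at the downloaded key so the job is submitted as the service
# account instead of your user credentials (the GOOGLE_APPLICATION_CREDENTIALS step above).
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/key.json"

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",
    region="us-central1",
    temp_location="gs://my-bucket/temp",
    # Optionally also run the Dataflow workers as a specific service account.
    service_account_email="dataflow-sa@my-project.iam.gserviceaccount.com",
)

with beam.Pipeline(options=options) as p:
    (p | beam.Create(["hello", "dataflow"]) | beam.Map(print))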
I want to create a Python project to get data for a certain page from Google Analytics.
I have created a new project in the Google Developers Console (console.developers.google.com) and got my OAuth 2.0 credentials in JSON format. I'm loosely following this tutorial: Python Quickstart.
I already got redirected to the OAuth consent screen and selected my account, but the script keeps getting:
"User does not have sufficient permissions for this profile".
Will adding access to my user in Google Analytics be enough?
And how can I run this project from a shell on a remote server? I won't have the ability to just open a browser and select a Google account from the CLI...
"User does not have sufficient permissions for this profile".
The user you are authenticating with does not have access to the Google Analytics account you are trying to access; this is the profile (view) ID that you are using in your code.
Make sure that you are authenticating your application with the same user you use to log into Google Analytics.
Double-check the profile ID that you are using in your code.
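For reference, a minimal sketch of where that profile (view) ID goes in an Analytics Reporting API v4 request, assuming google-api-python-client and credentials you have already obtained (the view ID is a placeholder and must belong to a view your user can access):

from googleapiclient.discovery import build

VIEW_ID = "123456789"  # placeholder: Admin -> View Settings -> View ID in Google Analytics

def run_report(credentials):
    analytics = build("analyticsreporting", "v4", credentials=credentials)
    body = {
        "reportRequests": [{
            "viewId": VIEW_ID,
            "dateRanges": [{"startDate": "7daysAgo", "endDate": "today"}],
            "metrics": [{"expression": "ga:pageviews"}],
            "dimensions": [{"name": "ga:pagePath"}],
        }]
    }
    return analytics.reports().batchGet(body=body).execute()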