Secure Google Cloud Functions Calls from Server-Side, Authentication strategy? - python

I have developed a Google Cloud Function (GCF) in Python, which I want to access from a web service deployed on AWS (also written in Python). During the development phase of the GCF, it had the Cloud Function Invoker permission set to allUsers, which I assume is why it didn't ask for an authorization token when called.
I want to revoke this public access so that the function can only be called from the web service code and is not publicly accessible.
Possible approach: In my research I have found that this can be done with the following steps:
Remove all unnecessary members who have permissions on the GCF.
Create a new service account whose access is restricted to invoking the GCF.
Download the service account key (JSON) and use it in the AWS web application.
Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of that service account key (JSON) file.
Questions
How do I generate the access token using the service account, so that it can be sent as an Authorization: Bearer header in the HTTP call made to the GCF? Without this token, the GCF should return an error.
The docs say not to put the service account key in the source code. What, then, is the best way to go about it? They suggest using Cloud KMS, which seems like overkill:
Do not embed secrets related to authentication in source code, such as API keys, OAuth tokens, and service account credentials. You can use an environment variable pointing to credentials outside of the application's source code, such as Cloud Key Management Service.
What are the bare minimum permissions I will require for the service account?
Please feel free to correct me if you think my understanding is wrong and there is a better way to do it.
UPDATE: The web service on AWS will call the GCF in a server-to-server fashion. There is no need to propagate the client-end (end-user) credentials.

In your description, you don't mention who or what will call your GCF. A user? A Compute Engine instance? Another GCF? In any case, this page can help you find code examples.
Yes, a secret in plain text pushed to Git is no longer a secret! Here again, I don't know what is performing the call. If it's Compute Engine, Cloud Functions, Cloud Run, or any other GCP service, don't use a JSON key file; use the component's identity instead. I would say: create a service account and assign it to that component. Tell me more about where you are deploying if you want more help!
Regarding the minimal permissions: if you have a service account, the minimal role is roles/cloudfunctions.invoker, which is all that is needed to invoke a function:
gcloud beta functions add-iam-policy-binding RECEIVING_FUNCTION \
--member='serviceAccount:CALLING_FUNCTION_IDENTITY' \
--role='roles/cloudfunctions.invoker'
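To answer question 1: a private Cloud Function expects a Google-signed ID token whose audience is the function URL, not a plain OAuth2 access token. Here is a minimal Python sketch using the google-auth library (the function URL and key path are placeholders):

import google.auth.transport.requests
import requests
from google.oauth2 import service_account

# placeholders: your function's trigger URL and the downloaded key file
FUNCTION_URL = "https://REGION-PROJECT.cloudfunctions.net/my-function"
KEY_PATH = "/path/to/service-account-key.json"

# mint an ID token whose audience is the function URL
credentials = service_account.IDTokenCredentials.from_service_account_file(
    KEY_PATH, target_audience=FUNCTION_URL
)
credentials.refresh(google.auth.transport.requests.Request())

# call the function with the token in the Authorization header
response = requests.post(
    FUNCTION_URL,
    headers={"Authorization": f"Bearer {credentials.token}"},
    json={"message": "hello"},
)
print(response.status_code, response.text)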

Related

GCP IAM: Granting a role to a service account while/after creating it via python API

Goal:
Using Python, I want to create a service account in a project on the Google Cloud Platform and grant that service account one role.
Problem:
The docs explain here how to grant a single role to the service account. However, it seems to be possible only by using the Console or the gcloud tool, not with Python. The alternative for Python is to update the whole IAM policy of the project to grant the role to the single service account and overwrite the policy (described here). However, overwriting the whole policy seems quite risky because, in case of an error, the policy of the whole project could be lost. I therefore want to avoid that.
Question:
I'm creating a service account using the Python code provided here in the docs. Is it possible to grant the role while creating the service account, either with this code or in any other way?
Creating a service account, creating a service account key, downloading a service account JSON key file, and granting a role are separate steps. There is no single API to create a service account and grant a role at the same time.
Any time you update a project's IAM bindings there is risk. Google prevents multiple applications from updating IAM at the same time, but it is still possible to lock everyone (users and services) out of a project by overwriting the policy with no members.
I recommend that you create a test project and develop and debug your code against that project. Use credentials that have no permissions to your other projects. Otherwise use the CLI or Terraform to minimize your risks.
The API is very easy to use provided that you understand the API, IAM bindings, and JSON data structures.
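To make the read-modify-write pattern concrete, here is a minimal sketch (the project ID, member, and role are placeholders). The policy returned by getIamPolicy carries an etag; sending it back with setIamPolicy makes a concurrent modification fail instead of being silently overwritten:

from googleapiclient import discovery

PROJECT_ID = "my-test-project"  # placeholder: use a disposable test project
MEMBER = "serviceAccount:my-sa@my-test-project.iam.gserviceaccount.com"
ROLE = "roles/logging.viewer"  # placeholder role

crm = discovery.build("cloudresourcemanager", "v1")

# read the current policy (includes the etag used for concurrency control)
policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()

# append a binding instead of replacing the existing ones
policy.setdefault("bindings", []).append({"role": ROLE, "members": [MEMBER]})

# write the modified policy back; fails if the etag is stale
crm.projects().setIamPolicy(resource=PROJECT_ID, body={"policy": policy}).execute()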
As mentioned in John's answer, you should be very careful when manipulating IAM; if something goes wrong, it could leave services completely inoperable.
Here is a Google document on manipulating IAM resources using the REST API.
The owner role can be granted to a user, serviceAccount, or a group that is part of an organization. For example, group@myownpersonaldomain.com could be added as an owner to a project in the myownpersonaldomain.com organization, but not the examplepetstore.com organization.

Daemon application authentication for OneDrive files

I have a OneDrive for Business user account within a large organization. I'd like to have a daemon service running (Python) that automatically uploads files to this user's OneDrive.
This service will be running in a headless VM, so browser-based authentication (especially if it needs to be done more than once) is very difficult.
What are my options for authenticating this app to allow it to write to the user's OneDrive? I've registered an app and created a client secret for it. I was experimenting with the authorization flow described here, but that SDK is deprecated and no longer supported, so I'd prefer to use Graph if possible.
What are my options for authentication with Python in this scenario, and is any sample code / example available?
Both delegated and application permissions are supported by the MS Graph API: https://learn.microsoft.com/en-us/graph/api/drive-list?view=graph-rest-1.0&tabs=http. Application permissions might not be acceptable for your use case, since they would allow access to all users' OneDrives?
Application permissions would definitely be the easiest choice.
But you can also implement this scenario using delegated permissions.
You would need the user to initialize the process by authenticating interactively once.
When they do that, store the refresh token in a secret store accessible by the server application.
Then it can use the refresh token to get a new refresh token + access token when needed.
This approach has some more complexity but does allow you to only give access to this one user's OneDrive for the app.
Also, keep in mind that refresh tokens can expire.
The user would need to re-authenticate if that happens.
If this process is critical, application permissions can be a really good idea despite the downsides.
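If you go with application permissions, a minimal sketch with the MSAL Python library could look like this (the tenant, client ID/secret, user, and file name are placeholders, and the app registration is assumed to have the Files.ReadWrite.All application permission with admin consent):

import msal
import requests

TENANT_ID = "your-tenant-id"          # placeholder
CLIENT_ID = "your-client-id"          # placeholder
CLIENT_SECRET = "your-client-secret"  # placeholder; load from a secret store
USER = "someone@yourtenant.com"       # the OneDrive owner

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)

# client credentials flow: no user interaction, token issued to the app itself
result = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

# simple upload (files under ~4 MB) to the root of the user's OneDrive
with open("report.pdf", "rb") as f:
    resp = requests.put(
        f"https://graph.microsoft.com/v1.0/users/{USER}/drive/root:/report.pdf:/content",
        headers={"Authorization": f"Bearer {result['access_token']}"},
        data=f,
    )
print(resp.status_code)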

How to use default credentials with DWD to access Drive/Gmail APIs

We want to use Application Default Credentials (Python code running in GCP) to perform domain-wide delegation to access the Gmail/Drive APIs. The main reason is that using default credentials relieves us from having to create and manage a GCP service account key (which is very sensitive), whereas for code running in GCP (App Engine/Cloud Functions) key management is handled securely for us.
We know that Google's professional services team has published how to do this for accessing the Admin SDK APIs here; however, we're not able to make this work with the Gmail/Drive APIs.
Does anyone know if this is technically possible, and if so how?
From what I understood of your question, you don't want to use a service account, but instead Application Default Credentials (ADC).
Basically, you will always need a service account, but if you are running your app on Compute Engine, Kubernetes Engine, the App Engine flexible environment, or Cloud Functions, it will not be necessary to create one on your own, as stated HERE.
You will only need to get the credentials for your project, and then you will be able to call the Gmail API as you normally would:
from google.auth import compute_engine
credentials = compute_engine.Credentials()
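Continuing that idea, a hedged sketch of how those credentials would be wired into a Gmail client (the getProfile call is just illustrative; note that for domain-wide delegation the service account must still be authorized in the Workspace Admin console, and a plain metadata-server credential may not carry the delegated subject, which may be why this fails for Gmail/Drive):

from google.auth import compute_engine
from googleapiclient import discovery

# ADC from the Compute Engine metadata server; no key file is ever downloaded
credentials = compute_engine.Credentials()

# build a Gmail client with those credentials
gmail = discovery.build("gmail", "v1", credentials=credentials)

# illustrative call; with domain-wide delegation you would impersonate a user
print(gmail.users().getProfile(userId="me").execute())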

How to handle keys and credentials when deploying to Google Cloud Functions?

I have several Cloud Functions (in Python) that require a modular package auth containing a subfolder with credentials (mostly JSON files of Google service account keys or Firebase configurations).
From a security perspective, I have obviously not included these files in Git, by adding the folder (auth/credentials) to the .gitignore file.
However, I am now stuck on what to do when deploying the Cloud Function (.gcloudignore). If I deploy it with the credentials, I imagine these keys are exposed on the server? How can I overcome this?
I have heard some people speak of environment variables, but I am not sure whether that is any more secure than just deploying the files.
What is the Google Way of doing it?
You have two primary solutions available to you. The first is that the Cloud Function can run with the identity of a custom Service Account. This service account can then be associated with all the roles necessary for your logic to achieve its task. The value of this is that no credentials need be explicitly known to your logic. The environment in which your calls are being made "implicitly" has all that it needs.
See: Per-function identity
The second mechanism, which is more in line with what you are currently doing, uses the concept of the Compute Metadata Server. This metadata server can supply the tokens necessary to make onward calls. It is configured separately from your Cloud Function logic, which merely retrieves the data as needed.
See: Fetching identity and access tokens.
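As a minimal sketch of the second mechanism, a Cloud Function can ask the metadata server for a Google-signed ID token without any key material in the code (the audience URL is a placeholder):

import requests

METADATA_URL = (
    "http://metadata.google.internal/computeMetadata/v1/"
    "instance/service-accounts/default/identity"
)

def fetch_id_token(audience: str) -> str:
    # the metadata server is only reachable from inside GCP; the header is mandatory
    resp = requests.get(
        METADATA_URL,
        params={"audience": audience},
        headers={"Metadata-Flavor": "Google"},
    )
    resp.raise_for_status()
    return resp.text

# placeholder audience: the URL of the service you want to call
token = fetch_id_token("https://my-downstream-service.example.com")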

Securing Google Cloud Functions using WSO2

I'm writing an application (web and mobile) where I would like to use WSO2 for user authentication, authorization and SSO.
My mobile app will authenticate the users against WSO2-IS.
All the APIs used by the app are Google Cloud Functions written in Python.
I would like to add a security layer to my GCFs.
From my understanding I can use WSO2-AM as a bridge between the app and the GCFs to provide security, but I would like to leverage the high scalability of the GCF architecture and avoid WSO2-AM becoming a bottleneck.
Is it possible to use WSO2-AM and have the GCF check access permissions against it, so the app can call the API directly instead of going through WSO2-AM as a bridge?
If yes, could you point me to some documentation/blog post/anything that could help?
In WSO2 APIM, the gateway does all the authentication and authorization stuff when the requests go through it (to the backend).
So, in the case of,
(1) OAuth2 tokens, the gateway talks to the key manager to validate the token, subscription (API-to-Application) and token scopes.
(2) Self-contained JWT tokens, the gateway can do all these validations itself.
So now in your case, since you don't want to send the requests through the gateway, you have to do what the gateway does within the cloud function itself. In that case, self-contained JWT tokens will be the best choice, as they can be validated without connecting to the key manager.
In addition, the gateway keeps a token cache so that it doesn't have to validate the same token again and again. You can have a similar cache (if possible) within your cloud functions too. However, in your case, you will have to externalize the cache due to the short-lived nature of cloud functions.
Here is the gateway code[1] which does the token, scopes and subscription validations. You can use it as a guide to write yours.
[1] https://github.com/wso2/carbon-apimgt/blob/master/components/apimgt/org.wso2.carbon.apimgt.gateway/src/main/java/org/wso2/carbon/apimgt/gateway/handlers/security/jwt/JWTValidator.java
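As a rough sketch of doing that validation inside the function with the PyJWT library (the JWKS URL is a placeholder, and the issuer/audience/claim checks depend on your WSO2 configuration):

import jwt
from jwt import PyJWKClient

# placeholder: the JWKS endpoint exposed by your WSO2 key manager
JWKS_URL = "https://wso2-is.example.com/oauth2/jwks"

jwk_client = PyJWKClient(JWKS_URL)

def validate_token(token: str) -> dict:
    # fetch the signing key matching the token's "kid" header
    signing_key = jwk_client.get_signing_key_from_jwt(token)
    # verify signature and expiry; tune issuer/audience checks to your setup
    claims = jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        options={"verify_aud": False},
    )
    return claims  # inspect scope/subscription claims here, as the gateway does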
