Google Service Account + Drive API unable to set expirationTime (Python)

I'm trying to use a service account and the Google Drive API to automate sharing a folder, and I want to be able to set the expirationTime property.
I've found previous threads that mention setting it in a permissions.update() call, but whatever I try I get the same generic error: 'Expiration dates cannot be set on this item.'
I've validated that I'm passing the correct date format; I've even shared a folder manually from my Drive account and then used permissions.list() to read the expirationTime from the returned data.
I've also tried creating a folder in my user drive, making my service account an editor, and then sharing that folder via the API, but I get the same problem.
Is there something that prevents a service account from setting this property?
To note: I haven't enabled domain-wide delegation and tried impersonating yet.
Sample code:
update_body = {
    'role': 'reader',
    'expirationTime': '2023-03-13T23:59:59.000Z'
}
driveadmin.permissions().update(
    fileId='<idhere>', permissionId='<idhere>', body=update_body).execute()

Checking the documentation for the feature, it seems that it's only available on paid Google Workspace subscriptions, as mentioned in the Google Workspace updates blog. You are most likely getting the error 'Expiration dates cannot be set on this item' because the service account is treated as a regular Gmail account, and you can see in the availability section of the update that the feature is not available for that type of account.
If you perform impersonation with your Google Workspace user, I'm pretty sure you won't receive the error, as long as you have one of the subscriptions for which the feature is enabled. You can find more information about how to perform impersonation and domain-wide delegation here.
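Once domain-wide delegation is granted, the impersonated call might look roughly like this. This is only a sketch, not a tested implementation: the key file path, user email, file ID and permission ID are all placeholders.

```python
SCOPES = ['https://www.googleapis.com/auth/drive']

def expiring_reader_body(expiry_iso):
    """Request body for permissions.update(): reader access until expiry_iso."""
    return {'role': 'reader', 'expirationTime': expiry_iso}

def update_permission(key_file, subject, file_id, permission_id, expiry_iso):
    # Imported here so expiring_reader_body() works without the SDKs installed.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build
    creds = service_account.Credentials.from_service_account_file(
        key_file, scopes=SCOPES).with_subject(subject)  # impersonate the user
    drive = build('drive', 'v3', credentials=creds)
    return drive.permissions().update(
        fileId=file_id, permissionId=permission_id,
        body=expiring_reader_body(expiry_iso)).execute()
```

The key point is with_subject(): the API call is then made as the Workspace user, whose subscription determines whether expirationTime is allowed.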

Related

Azure SDK for Python: Copy blobs

For my current Python project I'm using the Microsoft Azure SDK for Python.
I want to copy a specific blob from one container path to another, and I've already tested some of the options described here.
Overall they basically "work", but unfortunately the new_blob.start_copy_from_url(source_blob_url) command always leads to an error: ErrorCode:CannotVerifyCopySource.
Is someone getting the same error message here, or has an idea how to solve it?
I also tried passing the source_blob_url with a SAS token, but it still doesn't work. I have the feeling that there is some connection to the access levels of the storage account, but so far I haven't been able to figure it out. Hopefully someone here can help me.
As you mentioned, you might be receiving this error due to insufficient permissions on the SAS token.
The difference in my case was that I had used a storage-account-level SAS token from the Azure portal, instead of generating one directly for the blob client with the Azure SDK function.
A SAS grants access to specific areas of your storage account through a set of parameters: permissions such as read/write, allowed services, resource types, start and expiry date/time, allowed IP addresses, and so on.
You don't always have to generate it programmatically for the blob client; you can also generate one from the portal, as long as you grant the appropriate permissions.
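As a rough sketch (untested; the connection string, account key, and account/container/blob names are all placeholders), generating the SAS for the source blob itself and passing it in the copy URL might look like this:

```python
from datetime import datetime, timedelta, timezone

def blob_url_with_sas(account, container, blob, sas):
    """Compose the source URL that start_copy_from_url() expects."""
    return f"https://{account}.blob.core.windows.net/{container}/{blob}?{sas}"

def copy_blob(conn_str, account, account_key, src_container, src_blob,
              dst_container, dst_blob):
    # Imported here so blob_url_with_sas() has no SDK dependency.
    from azure.storage.blob import (BlobSasPermissions, BlobServiceClient,
                                    generate_blob_sas)
    # A read-only SAS scoped to the single source blob is enough for a copy.
    sas = generate_blob_sas(
        account_name=account,
        container_name=src_container,
        blob_name=src_blob,
        account_key=account_key,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=1))
    service = BlobServiceClient.from_connection_string(conn_str)
    dest = service.get_blob_client(dst_container, dst_blob)
    # The destination service verifies it can read the source via this URL;
    # if it cannot, you get CannotVerifyCopySource.
    src_url = blob_url_with_sas(account, src_container, src_blob, sas)
    return dest.start_copy_from_url(src_url)
```

If the SAS in the URL lacks read permission on the source blob (or has expired), the service reports exactly the CannotVerifyCopySource error described above.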
REFERENCES: Grant limited access to Azure Storage resources using SAS - MSFT Document

is there a direct explanation for each google-ads.yaml item?

I am looking to collect data from the Google Ads API into GCP using Python scripts, and that requires filling in these items for authentication in the google-ads.yaml file:
developer_token:
client_id:
client_secret:
refresh_token:
login_customer_id:
I was able to fill in these items by asking people in my company or by generating them with Google's Python scripts on GitHub, but I need to understand the role of each; the docs seem dispersed, with a long learning path.
You can follow this guide to create your google-ads.yaml file. For the fields you listed, below are the definitions of each, but you can check this sample template for more details.
Developer token
A developer token is required when making requests to the Google Ads API regardless of whether you're using the OAuth2 or Service Account configurations. To obtain a developer token see: https://developers.google.com/google-ads/api/docs/first-call/dev-token
developer_token: INSERT_DEVELOPER_TOKEN_HERE
OAuth2 configuration
The below configuration parameters are used to authenticate using the recommended OAuth2 flow. For more information on authenticating with OAuth2 see:
https://developers.google.com/google-ads/api/docs/oauth/overview
client_id: INSERT_OAUTH2_CLIENT_ID_HERE
client_secret: INSERT_OAUTH2_CLIENT_SECRET_HERE
refresh_token: INSERT_REFRESH_TOKEN_HERE
Login customer ID configuration
Required for manager accounts only: Specify the login customer ID used to authenticate API calls. This will be the customer ID of the authenticated manager account. It should be set without dashes, for example: 1234567890 instead of 123-456-7890. You can also specify this later in code if your application uses multiple manager account + OAuth pairs.
login_customer_id: INSERT_LOGIN_CUSTOMER_ID_HERE
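Once the file is filled in, the client library consumes it directly; the library can also accept an equivalent dict. A rough sketch (the validation helper is purely illustrative, not part of the library, and the values are placeholders):

```python
# The four keys every OAuth2 desktop/web-flow config needs; login_customer_id
# is only required when authenticating through a manager account.
REQUIRED_KEYS = ("developer_token", "client_id", "client_secret", "refresh_token")

def check_config(cfg):
    """Return the names of required google-ads.yaml keys that are missing."""
    return [k for k in REQUIRED_KEYS if not cfg.get(k)]

def make_client(cfg):
    # Imported here so check_config() works without the library installed.
    from google.ads.googleads.client import GoogleAdsClient
    missing = check_config(cfg)
    if missing:
        raise ValueError(f"google-ads.yaml is missing: {missing}")
    # Equivalent to GoogleAdsClient.load_from_storage("google-ads.yaml")
    return GoogleAdsClient.load_from_dict(cfg)
```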

Creating a new Shared Drive

When I run the code below, which is taken from https://developers.google.com/drive/api/v3/manage-shareddrives#python:
# Create a new drive
test_drive_metadata = {'name': 'Test Drive'}
request_id = str(uuid.uuid4())
test_drive = self.service.drives().create(
    body=test_drive_metadata,
    requestId=request_id,
    fields='id'
).execute()
I get "The user does not have sufficient permissions for this file." This does not happen when I create files, list shared drives, or do anything else. There are no other required scopes beyond ['https://www.googleapis.com/auth/drive'].
It should be noted that I am using a service account. Are service accounts not allowed to create shared drives? As far as I am aware, this is not documented anywhere, if that is the case.
There doesn't seem to be any explicit documentation regarding this limitation, but considering that service accounts (without DWD and impersonation) are meant to manage application data, not user data, it makes sense that they cannot be used to manage data that is shared with regular users.
Also, the use of Service Accounts to manage shared documents seems to be advised against, according to the official documentation:
Using the service account as a common owner to create many shared documents can have severe performance implications.
On the other hand, since Service Accounts have certain limitations compared to regular accounts (for example, Event creation in Calendar), this could probably be one of these limitations.
In any case, to confirm that this is the case, I'd suggest you report this behaviour in this Issue Tracker component.
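For completeness, if you do set up domain-wide delegation, creating the shared drive while impersonating a real Workspace user might look roughly like this (a sketch only; the key file path and user email are placeholders):

```python
import uuid

def drive_create_request(name):
    """Build the metadata body and unique requestId that drives().create() needs."""
    return {'name': name}, str(uuid.uuid4())

def create_shared_drive(key_file, subject, name):
    # Imported here so drive_create_request() has no SDK dependency.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build
    creds = service_account.Credentials.from_service_account_file(
        key_file, scopes=['https://www.googleapis.com/auth/drive']
    ).with_subject(subject)  # act as a real Workspace user
    service = build('drive', 'v3', credentials=creds)
    body, request_id = drive_create_request(name)
    return service.drives().create(
        body=body, requestId=request_id, fields='id').execute()
```

The requestId makes the create call idempotent: retrying with the same ID will not create a second drive.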
Reference:
Drive API: Perform G Suite Domain-Wide Delegation of Authority
Using OAuth 2.0 for Server to Server Applications

Is it possible to limit a Google service account to specific BigQuery datasets within a project?

I've set up a service account using the GCP UI for a specific project Project X. Within Project X there are 3 datasets:
Dataset 1
Dataset 2
Dataset 3
If I assign the role BigQuery Admin on Project X, it is currently inherited by all 3 datasets: they all receive the permissions assigned to the service account at the project level. Is there any way to modify the permissions for the service account so that it only has access to specified datasets, e.g. allow access to Dataset 1 but not Dataset 2 or Dataset 3?
Is this type of configuration possible?
I've tried to add a condition in the UI, but when I use the Name resource type and set the value equal to Dataset 1, I'm not able to access any of the datasets; presumably the value is not correct, or a dataset is not a valid Name resource.
UPDATE
Adding some more detail regarding what I'd already tried before posting, as well as some more detail on what I'm doing.
For my particular use case, I'm trying to perform SQL queries as well as modifying tables in BigQuery through the API (using Python).
Case A:
I create a service account with the role 'BigQuery Admin'.
This role is propagated to all datasets within the project; the property is inherited, and I cannot delete this service account role from any of the datasets.
In this case I'm able to query all datasets and tables using the Python API - as you'd expect.
Case B:
I create a service account with no default role.
No role is propagated and I can assign roles to specific datasets by clicking on the 'Share dataset' option in the UI to assign the 'BigQuery Admin' role to them.
In this case I'm not able to query any of the datasets or tables and get the following error if I try:
Forbidden: 403 POST https://bigquery.googleapis.com/bq/projects/project-x/jobs: Access Denied: Project X: User does not have bigquery.jobs.create permission in project Project X.
Even though the required permission (bigquery.jobs.create in this case) exists for the dataset I want, I can't query the data: it appears that the bigquery.jobs.create permission is also required at the project level to use the API.
I'm posting the solution that I found to the problem in case it is useful to anyone else trying to accomplish the same.
Assign the role "BigQuery Job User" at a project level in order to have the permission bigquery.jobs.create assigned to the service account for that project.
You can then manually assign specific datasets the role "BigQuery Data Editor" in order to query them through the Python API. Do this by clicking on "Share dataset" in the BigQuery UI. So for this example, I've "Shared" Dataset 1 and Dataset 2 with the service account.
You should now be able to query the datasets for which you've assigned the BigQuery Data Editor role in Python.
However, for Dataset 3, for which the "BigQuery Data Editor" role has not been assigned, if you attempt to query a table this should return the error:
Forbidden: 403 Access Denied: Table Project-x:dataset_1.table_1: User does not have permission to query table Project-x:dataset_1.table_1.
As described above, we now have sufficient permissions to access the project but not the table within Dataset 3 - by design.
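If you prefer to do the dataset-level sharing programmatically rather than through the UI, a rough sketch with the google-cloud-bigquery library might look like this (the project, dataset and service-account email are placeholders; the dataset-level WRITER role corresponds to BigQuery Data Editor):

```python
def add_dataset_writer(entries, sa_email):
    """Return the access-entry list extended with a WRITER grant for sa_email."""
    return list(entries) + [{
        'role': 'WRITER',                 # dataset-level Data Editor
        'entity_type': 'userByEmail',
        'entity_id': sa_email,
    }]

def share_dataset(project, dataset_id, sa_email):
    # Imported here so add_dataset_writer() has no SDK dependency.
    from google.cloud import bigquery
    client = bigquery.Client(project=project)
    dataset = client.get_dataset(f"{project}.{dataset_id}")
    entries = list(dataset.access_entries)
    entries.append(bigquery.AccessEntry(
        role='WRITER', entity_type='userByEmail', entity_id=sa_email))
    dataset.access_entries = entries
    # Only the access_entries field is sent in the update.
    return client.update_dataset(dataset, ['access_entries'])
```

Remember that this only covers the dataset-level half; "BigQuery Job User" still has to be granted at the project level so the account can create query jobs at all.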
As you can see here, you can grant access in your dataset to several kinds of entities, including service accounts:
Google account e-mail: Grants an individual Google account access to the dataset
Google Group: Grants all members of a Google group access to the dataset
Google Apps Domain: Grants all users and groups in a Google domain access to the dataset
Service account: Grants a service account access to the dataset
Anybody: Enter "allUsers" to grant access to the general public
All Google accounts: Enter "allAuthenticatedUsers" to grant access to any user signed in to a Google Account
I suggest that you create a service account without permissions in BigQuery and then grant it access to a specific dataset.
I hope that helps.
Please keep in mind that access to BigQuery can be granted at project level or dataset level.
The dataset is the lowest level at which you can assign permissions, so accounts can access all the resources in the dataset, e.g. tables, views, columns and rows. Permissions at the project level, as you have already noticed, are propagated (inherited) to all the datasets in the project.
Regarding your service account, by default Google Cloud assigns it an address with a structure like service_account_name@example.gserviceaccount.com, and during the process of sharing the dataset, as commented by @rmesteves, you will need this email address to grant it the desired permissions.
It seems that the steps you described ("Name resource type") are not the correct ones. In the BigQuery UI, please try the following:
Click on the dataset name (e.g. Dataset1 in your example) you want to share.
Then, on the right of the screen, you will see the option "Share Dataset"; click on it.
Follow the instructions to assign your service account a BigQuery role such as BigQuery Admin, BigQuery Data Owner or BigQuery User, among others. Check the previous link to see what each role can do.

Microsoft Graph service fail with ms account

In my company, I need to upload Excel files to OneDrive.
We have a 365 Business plan and every employee has their own 365 account, but I want to maintain just one repository for merged files and avoid sharing the same repository account among everyone, so I prefer to implement "access without a user" through the client credentials flow.
The first problem I've met is authorization: when I try to authorize the app via the /adminconsent endpoint, it fails because my client account is not an administrator :-( So I tried another account, a plain Microsoft account (for that I made a new registration of the app in the Application Portal), but when I try to authorize the app I receive this error:
"AADSTS50020: We are unable to issue tokens from this API version for a Microsoft account. Please contact the application vendor as they need to use version 2.0 of the protocol to support this."
What's wrong?
As an alternative, I've thought about continuing with the 365 Business employee accounts: create a folder with a tech account and share it. But when I use Graph Explorer with an employee account and make the request
/me/drive/sharedWithMe
I receive just the shared folder, but without its content.
Here the code (I'm using the requests_oauthlib Python module):
In the beginning, I initialize the class object
client = BackendApplicationClient(client_id=config.CLIENT_ID)
self.oauth = OAuth2Session(
    client.client_id,
    scope=config.SCOPES,
    redirect_uri='https://me.local/allowed')
then I make a request for authorization_url
auth_base = 'https://login.microsoftonline.com/common/adminconsent'
self.authorization_url, state = self.oauth.authorization_url(
    auth_base,
    state="12345")
return self.authorization_url
and the request for the token
return self.oauth.fetch_token(
    token_url='https://login.microsoftonline.com/common/oauth2/v2.0/token',
    client_id=config.CLIENT_ID,
    scope="https://graph.microsoft.com/.default",
    client_secret=config.CLIENT_SECRET,
    authorization_response='https://me.local/authorized'
)
You need to be a tenant administrator in order to consent to application-only access (where you only use a client ID and secret). However, you can use alternative flows such as the Resource Owner Credentials Grant and the On-Behalf-Of Grant, which require you to have the credentials of a user with the relevant permissions.
You can also read about those flows in my post:
Getting Access Token for Microsoft Graph Using OAuth REST API.
Regarding the message about "version 2.0": it may be caused by a mix-up between version 1 and version 2 of the Microsoft OAuth API. Version 1 is only meant for organization users (users that sit inside Azure Active Directory), while version 2 supports Microsoft accounts as well. You can read more about the differences between the two versions here. Make sure you use the same version for the entire process (creating the app, assigning and consenting to permissions, and requesting an access token); mixing the two versions may not work.
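For reference, once an admin has consented, the v2.0 client-credentials token request itself is a plain POST; here is a rough sketch using the requests library (the tenant ID, client ID and secret are placeholders):

```python
def token_request_payload(client_id, client_secret):
    """Form fields for the v2.0 client_credentials grant against Microsoft Graph."""
    return {
        'grant_type': 'client_credentials',
        'client_id': client_id,
        'client_secret': client_secret,
        'scope': 'https://graph.microsoft.com/.default',
    }

def get_app_token(tenant_id, client_id, client_secret):
    # Imported here so token_request_payload() stays dependency-free.
    import requests
    # Note the tenant-specific endpoint: client_credentials does not work
    # against /common, which is part of the v1/v2 confusion described above.
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    resp = requests.post(url, data=token_request_payload(client_id, client_secret))
    resp.raise_for_status()
    return resp.json()['access_token']
```

The returned token can then be sent as a Bearer token in the Authorization header of Graph requests such as /drives/{drive-id}/root/children.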
