How can I change the owner of a Google Sheets spreadsheet? - python

With the following, I can programmatically create a spreadsheet in Google sheets, but the owner of the sheet is the developer account (a crazy string ending in "gserviceaccount.com"), and my normal account can't view the spreadsheet. What else do I need to do in order to add Google users to the read/write permissions?
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient import discovery
# ... json_key is the json blob that has the credentials
scope = ['https://spreadsheets.google.com/feeds']
credentials = ServiceAccountCredentials.from_json_keyfile_dict(json_key, scope)
service = discovery.build('sheets', 'v4', credentials=credentials)
spreadsheet = {
    "properties": {"title": "my test spreadsheet"}
}
service.spreadsheets().create(body=spreadsheet).execute()
Edit:
I tried changing the scope to ['https://www.googleapis.com/auth/drive'] but the answer below still doesn't work for me. When I run
print([xx for xx in dir(service) if not xx.startswith('_')])
I get
['new_batch_http_request', u'spreadsheets']
In other words, permissions() isn't a method in service as I have service defined. What should I be doing differently?

I figured it out from reading the comment left by Chris. The only thing missing from his comments is that you do in fact need particular scopes when building each service. Notice the different scopes I use to build the two objects:
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient.discovery import build
key = '/path/to/service_account.json'
# Build 'Spreadsheet' object
spreadsheets_scope = ['https://www.googleapis.com/auth/spreadsheets']
sheets_credentials = ServiceAccountCredentials.from_json_keyfile_name(key, spreadsheets_scope)
sheets_service = build('sheets', 'v4', credentials=sheets_credentials)
# returns 'Spreadsheet' dict
# https://developers.google.com/sheets/api/reference/rest/v4/spreadsheets#resource-spreadsheet
spreadsheet = sheets_service.spreadsheets().create(
    body={
        "properties": {
            'title': 'spreadsheets test',
        },
        "sheets": [],
    }
).execute()
# id for the created file
spreadsheetId = spreadsheet['spreadsheetId']
# url of your file
spreadsheetUrl = spreadsheet['spreadsheetUrl']
# Build 'Permissions' object
drive_scope = ['https://www.googleapis.com/auth/drive']
drive_credentials = ServiceAccountCredentials.from_json_keyfile_name(key, drive_scope)
drive_service = build('drive', 'v3', credentials=drive_credentials)
# returns 'Permissions' dict
permissions = drive_service.permissions().create(
    fileId=spreadsheetId,
    transferOwnership=True,
    body={
        'type': 'user',
        'role': 'owner',
        'emailAddress': 'example@email.com',
    }
).execute()
# apply permission
drive_service.files().update(
    fileId=spreadsheetId,
    body={'permissionIds': [permissions['id']]}
).execute()
print('\nOpen me:\n\n%s\n' % spreadsheetUrl)
So the logic is: a 'Spreadsheet' resource is first created via build() with all its properties and sheet data, owned by your service account. Next, a 'Drive' resource is built; that is the resource with the permissions() method. execute() returns the newly created permission's id, which is then used to update() the spreadsheet file.
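For reference, the ownership-transfer step above boils down to one permissions().create call; a small helper that builds its keyword arguments makes the required flags explicit (a sketch; the helper name is mine, not part of the API):

```python
def owner_transfer_request(file_id, new_owner_email):
    """Build the kwargs for drive_service.permissions().create() that
    hand ownership of file_id to new_owner_email."""
    return {
        "fileId": file_id,
        # transferOwnership=True is mandatory whenever role == 'owner';
        # without it the API rejects the request.
        "transferOwnership": True,
        "body": {
            "type": "user",
            "role": "owner",
            "emailAddress": new_owner_email,
        },
    }

# Usage sketch against the drive_service built above:
# permission = drive_service.permissions().create(
#     **owner_transfer_request(spreadsheetId, 'example@email.com')).execute()
```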

Service is just a generic name for the result of a discovery.build() call. In this case, the 'permissions' method is missing simply because it's not available on the Sheets service. The following code should be sufficient if changing the owner isn't required. To add someone with read and write access, the following works for me:
def grant_permissions(spreadsheet_id, writer):
    drive_service = discovery.build('drive', 'v3')
    permission = drive_service.permissions().create(
        fileId=spreadsheet_id,
        body={
            'type': 'user',
            'role': 'writer',
            'emailAddress': writer,
        }
    ).execute()
    drive_service.files().update(
        fileId=spreadsheet_id,
        body={'permissionIds': [permission['id']]}
    ).execute()
To actually change the owner, the transfer ownership flag must be set:
def change_owner(spreadsheet_id, writer):
    drive_service = discovery.build('drive', 'v3')
    permission = drive_service.permissions().create(
        fileId=spreadsheet_id,
        transferOwnership=True,
        body={
            'type': 'user',
            'role': 'owner',
            'emailAddress': writer,
        }
    ).execute()
    drive_service.files().update(
        fileId=spreadsheet_id,
        body={'permissionIds': [permission['id']]}
    ).execute()
The service account being used must have the right permissions, though. I believe what worked for me was checking the G Suite box when first creating the service account.

Try the Permissions: insert method from the Drive API v2 documentation. It lets you insert a permission for a file or a Team Drive.
Here is the sample code provided from the documentation:
from apiclient import errors
# ...
def insert_permission(service, file_id, value, perm_type, role):
    """Insert a new permission.
    Args:
        service: Drive API service instance.
        file_id: ID of the file to insert permission for.
        value: User or group e-mail address, domain name or None for
            'default' type.
        perm_type: The value 'user', 'group', 'domain' or 'default'.
        role: The value 'owner', 'writer' or 'reader'.
    Returns:
        The inserted permission if successful, None otherwise.
    """
    new_permission = {
        'value': value,
        'type': perm_type,
        'role': role
    }
    try:
        return service.permissions().insert(
            fileId=file_id, body=new_permission).execute()
    except errors.HttpError as error:
        print('An error occurred: %s' % error)
        return None
Use Try it now to test live data and see the API request and response.
For further reading, check this SO post.
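The sample above targets Drive API v2; in v3, insert() became create() and the 'value' key became 'emailAddress'. A hedged port of the same helper (the error type is shown generically here; in practice you would catch googleapiclient.errors.HttpError):

```python
def insert_permission_v3(service, file_id, email, perm_type='user', role='reader'):
    """Drive v3 counterpart of the v2 sample: permissions().create()
    with an 'emailAddress' key instead of v2's 'value'."""
    new_permission = {
        'type': perm_type,
        'role': role,
        'emailAddress': email,
    }
    try:
        return service.permissions().create(
            fileId=file_id, body=new_permission).execute()
    except Exception as error:  # googleapiclient.errors.HttpError in practice
        print('An error occurred: %s' % error)
        return None
```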

Related

Cannot transfer google calendar events using Google API Python SDK

I have created a function that is supposed to move all events from one Google calendar to another. Here is how it looks:
def merge_calendar(email_from, email_to, service):
    off_board_user_calendar = service.events().list(calendarId=email_from).execute()
    off_board_user_events = off_board_user_calendar.get('items', [])
    # I tried to use this code, to resolve this "You need to have reader
    # access to this calendar." error, but it didn't work
    #
    # rule = {
    #     'scope': {
    #         'type': 'user',
    #         'value': email_from,
    #     },
    #     'role': 'reader'
    # }
    #
    # created_rule = service.acl().insert(calendarId=email_from, body=rule).execute()
    # print(f'Updated ACL rule {created_rule}')
    for event in off_board_user_events:
        updated_event = service.events().move(
            calendarId=email_from,
            eventId=event['id'],
            destination=email_to
        ).execute()
        print(f'Event has been transferred: {updated_event["updated"]}')
    print('All events have been transferred successfully.')
Right after execution I get this error: "You need to have reader access to this calendar." As you can see from the comment, I tried to resolve this error, but the commented code brings me another error, just "Forbidden".
I am not quite sure what I am doing wrong. How can I transfer all events from one calendar to another?
Also, I think it is important to mention how I create the service entity. I tried two methods:
Normal credentials:
creds = None
if os.path.exists('token.json'):
    creds = Credentials.from_authorized_user_file('token.json', SCOPES[api_name])
if not creds or not creds.valid:
    if creds and creds.expired and creds.refresh_token:
        creds.refresh(Request())
    else:
        flow = InstalledAppFlow.from_client_secrets_file(client_secret_file, SCOPES[api_name])
        creds = flow.run_local_server()
    with open('token.json', 'w') as token:
        token.write(creds.to_json())
and using Google Service Account
if delegated_user is not None:
    credentials = service_account.Credentials.from_service_account_file(
        'service.json', scopes=SCOPES[api_name])
    creds = credentials.with_subject(delegated_user)
Neither worked.
PS.
Calendar scope I have is 'https://www.googleapis.com/auth/calendar'.
Thanks in advance!
Some things that you might look at:
Check the Domain Wide Delegation in your admin console and make sure that the service account ID is the same service account that you are using in your code.
Add the scope that you mentioned in your question, 'https://www.googleapis.com/auth/calendar', which grants full read/write access to the Calendar API.
Try to delegate the user with credentials.create_delegated(email) instead of credentials.with_subject(delegated_user).
Actually, there is no need to transfer events one by one. It'll be enough just to update the ACL, like this:
def merge_calendar(email_from, email_to, service):
    rule = {
        'scope': {
            'type': 'user',
            'value': email_to,
        },
        'role': 'owner'
    }
    service.acl().insert(calendarId=email_from, body=rule).execute()
You will just get an email proposing to add this calendar to your Google Calendar.
As for authentication, I used this user delegation:
credentials = service_account.Credentials.from_service_account_file(
    'service.json', scopes=['https://www.googleapis.com/auth/calendar'])
creds = credentials.with_subject(email_from)
References
Google Service Account
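The ACL body used above has a fixed shape; a tiny builder (the function name is illustrative, not part of the API) keeps the valid role values in one place:

```python
def calendar_acl_rule(email, role='owner'):
    """Build the body for service.acl().insert(); valid Calendar ACL
    roles are 'none', 'freeBusyReader', 'reader', 'writer', 'owner'."""
    allowed = {'none', 'freeBusyReader', 'reader', 'writer', 'owner'}
    if role not in allowed:
        raise ValueError('unknown Calendar ACL role: %r' % role)
    return {'scope': {'type': 'user', 'value': email}, 'role': role}

# Usage sketch:
# service.acl().insert(calendarId=email_from,
#                      body=calendar_acl_rule(email_to)).execute()
```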

Update/Add labels to Google Kubernetes Engine Cluster Workloads through Python API

Has anyone tried to add or update labels on Google Kubernetes Engine clusters through the Python API?
I managed to do this for Compute instances, but the guide for Kubernetes Engine says it's deprecated:
https://cloud.google.com/kubernetes-engine/docs/reference/rest/v1/projects.zones.clusters.nodePools/update
Tried it and it fails saying it does not find "labels":
googleapiclient.errors.HttpError: <HttpError 400 when requesting
https://container.googleapis.com/v1/projects/testingproject/zones/us-east1/clusters/testing-cluster/resourceLabels?alt=json
returned "Invalid JSON payload received. Unknown name "labels": Cannot
find field.". Details: "[{'@type':
'type.googleapis.com/google.rpc.BadRequest', 'fieldViolations':
[{'description': 'Invalid JSON payload received. Unknown name
"labels": Cannot find field.'}]}]">
My code is this:
credentials = GoogleCredentials.get_application_default()
service = discovery.build('container', 'v1', credentials=credentials)
# Deprecated. The Google Developers Console [project ID or project
# number](https://developers.google.com/console/help/new/#projectnumber).
# This field has been deprecated and replaced by the name field.
project_id = 'testingproject' # TODO: Update placeholder value.
# Deprecated. The name of the Google Compute Engine
# [zone](/compute/docs/zones#available) in which the cluster
# resides.
# This field has been deprecated and replaced by the name field.
zone = 'us-east1' # TODO: Update placeholder value.
# Deprecated. The name of the cluster.
# This field has been deprecated and replaced by the name field.
cluster_id = 'testing-cluster' # TODO: Update placeholder value.
set_labels_request_body = {
    'labels': 'value'
}
request = service.projects().zones().clusters().resourceLabels(projectId=project_id, zone=zone, clusterId=cluster_id, body=set_labels_request_body)
response = request.execute()
# TODO: Change code below to process the `response` dict:
pprint(response)
I want to update the Workload named 'matei-testing-2000-gke-ops' inside the cluster 'testing-cluster'.
Any ideas?
Thank you
Update: It does not find the labels because the field name is resourceLabels. But after fixing that, I get the following error:
googleapiclient.errors.HttpError: <HttpError 400 when requesting
https://container.googleapis.com/v1/projects//zones//clusters//resourceLabels?alt=json
returned "Invalid value at 'resource_labels'
(type.googleapis.com/google.container.v1.SetLabelsRequest.ResourceLabelsEntry),
"value"". Details: "[{'@type':
'type.googleapis.com/google.rpc.BadRequest', 'fieldViolations':
[{'field': 'resource_labels', 'description': 'Invalid value at
'resource_labels'
(type.googleapis.com/google.container.v1.SetLabelsRequest.ResourceLabelsEntry),
"value"'}]}]">
I've now tried this.
But IIUC, you'll need to:
ditch (or use defaults) for e.g. project_id, zone and cluster_id parameters of resourceLabels
add name to your body and it should be of the form: projects/*/locations/*/clusters/*
i.e.
import os
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials
credentials = GoogleCredentials.get_application_default()
service = discovery.build('container', 'v1', credentials=credentials)
PROJECT = os.getenv("PROJECT")
LOCATION = os.getenv("ZONE")
CLUSTER = os.getenv("CLUSTER")
NAME = "projects/{project}/locations/{location}/clusters/{cluster}".format(
    project=PROJECT,
    location=LOCATION,
    cluster=CLUSTER)
# To update `resourceLabels` you must first fingerprint them
# To get the current `labelFingerprint`, you must `get` the cluster
request = service.projects().zones().clusters().get(
    projectId=PROJECT,
    zone=LOCATION,
    clusterId=CLUSTER)
response = request.execute()
labelFingerprint = response["labelFingerprint"]
if "resourceLabels" in response:
    print("Existing labels")
    resourceLabels = response["resourceLabels"]
else:
    print("No labels")
    resourceLabels = {}
# Add|update a label
resourceLabels["dog"] = "freddie"
# Construct `resourceLabels` request
body = {
    'name': NAME,
    'resourceLabels': resourceLabels,
    'labelFingerprint': labelFingerprint,
}
request = service.projects().zones().clusters().resourceLabels(
    projectId=PROJECT,
    zone=LOCATION,
    clusterId=CLUSTER,
    body=body)
# Do something with the `response`
response = request.execute()
References: clusters.get and clusters#Cluster
And:
gcloud container clusters describe ${CLUSTER} \
--zone=${ZONE} \
--project=${PROJECT} \
--format="value(labelFingerprint,resourceLabels)"
Before:
a9dc16a7
After:
b2c32ec0 dog=freddie
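The get-then-set dance above can be condensed into a pure helper: resourceLabels is a full replacement of the label map, so existing labels have to be merged in, and the labelFingerprint from the get() must ride along (a sketch; the function name is mine, not part of the client library):

```python
def set_labels_body(name, existing_labels, label_fingerprint, new_labels):
    """Build the SetLabelsRequest body: merge new_labels into the
    cluster's current resourceLabels (the API replaces the whole map)
    and attach the fingerprint obtained from the preceding get()."""
    merged = dict(existing_labels or {})
    merged.update(new_labels)
    return {
        'name': name,
        'resourceLabels': merged,
        'labelFingerprint': label_fingerprint,
    }

# Usage sketch, continuing from the get() call above:
# body = set_labels_body(NAME, response.get("resourceLabels"),
#                        response["labelFingerprint"], {"dog": "freddie"})
```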

Azure key vault create in Python

I am trying to programmatically create a key vault in Python using this tutorial (https://learn.microsoft.com/en-us/python/api/overview/azure/key-vault?view=azure-python).
No errors till the last step, where it throws an exception when I call client.vaults.create_or_update(), because I might not have used the right values for ALLOW_OBJECT_ID and ALLOW_TENANT_ID. The documentation says these values can be found on the portal, but I could not find them. Is there a way to get them programmatically?
Error:
msrest.exceptions.AuthenticationError: , AdalError: Get Token request returned http error: 400 and server response: {"error":"unauthorized_client","error_description":"AADSTS700016: Application with identifier XXX was not found in the directory YY
Code:
import subprocess
import json
from azure.mgmt.keyvault import KeyVaultManagementClient
from azure.common.credentials import ServicePrincipalCredentials
def get_subscription():
    subs = json.loads(subprocess.check_output('az account list',
                                              shell=True).decode('utf-8'))
    subscription = subs[1]['id']
    cmd = 'az ad sp create-for-rbac --role="Contributor" --scopes="/subscriptions/%s"' % subscription
    creds = json.loads(subprocess.check_output(cmd, shell=True).decode('utf-8'))
    return subscription, creds
def create_key_vault(vault_name='TestKeyVault'):
    subscription_id, creds = get_subscription()
    client_id = creds['appId']
    secret = creds['password']
    tenant = creds['tenant']
    credentials = ServicePrincipalCredentials(client_id=client_id, secret=secret, tenant=tenant)
    client = KeyVaultManagementClient(credentials, subscription_id)
    ALLOW_OBJECT_ID = client_id
    ALLOW_TENANT_ID = tenant
    RESOURCE_GROUP = 'SomeRG'
    VAULT_NAME = vault_name
    # Vault properties may also be created by using the
    # azure.mgmt.keyvault.models.VaultCreateOrUpdateParameters
    # class, rather than a map.
    operation = client.vaults.create_or_update(
        RESOURCE_GROUP,
        VAULT_NAME,
        {
            'location': 'eastus',
            'properties': {
                'sku': {
                    'name': 'standard'
                },
                'tenant_id': ALLOW_TENANT_ID,
                'access_policies': [{
                    'object_id': ALLOW_OBJECT_ID,
                    'tenant_id': ALLOW_TENANT_ID,
                    'permissions': {
                        'keys': ['all'],
                        'secrets': ['all']
                    }
                }]
            }
        }
    )
    vault = operation.result()
    print(f'New vault URI: {vault.properties.vault_uri}')
Well, the objects could be users, security groups, or service principals in your Azure AD tenant; if you're not familiar with access policies in Key Vault, check this doc.
To get them programmatically, the easiest way in your case is to use the Azure CLI from Python.
Use az account show to get the tenantId.
Use az ad user list to get the objectId of the user.
Use az ad group list to get the objectId of the security group.
Use az ad sp list to get the objectId of the service principal.
Then you should specify ALLOW_OBJECT_ID and ALLOW_TENANT_ID with whichever objectId you need and the tenantId above.
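Following the question's own subprocess-based pattern, the tenantId and objectId can be pulled out of the CLI's JSON output; a sketch (the key names come from Azure CLI output and may differ by version; older CLIs report the service principal's object id as 'objectId' rather than 'id'):

```python
import json

def extract_policy_ids(account_show_json, sp_show_json):
    """Parse the JSON printed by `az account show` and
    `az ad sp show --id <appId>` into (tenant_id, object_id)."""
    account = json.loads(account_show_json)
    sp = json.loads(sp_show_json)
    tenant_id = account['tenantId']
    # newer CLI versions use 'id', older ones 'objectId'
    object_id = sp.get('id') or sp.get('objectId')
    return tenant_id, object_id

# Usage sketch:
# acct = subprocess.check_output('az account show', shell=True).decode('utf-8')
# sp = subprocess.check_output('az ad sp show --id %s' % client_id,
#                              shell=True).decode('utf-8')
# ALLOW_TENANT_ID, ALLOW_OBJECT_ID = extract_policy_ids(acct, sp)
```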

Creating A Spreadsheet In A Folder With GSpread

I am having trouble finding any documentation on how to create a GSheet in a certain Google Drive directory using GSpread.
I have checked the documentation and had a look around some of the back end code.
I am currently using the code below to create the spreadsheet:
worksheet = sh.add_worksheet(title='Overview', rows='100', cols='9')
I want to be able to create the spreadsheet in a directory on a google drive, for example:
X > Y > Spreadsheet
Any help would be greatly appreciated,
Cheers.
You want to create a new Spreadsheet in a specific folder.
You want to achieve this using Python.
If my understanding is correct, how about this answer?
Modification points:
Unfortunately, Sheets API cannot achieve this. In this case, it is required to use Drive API.
In your script, I think that you use gspread.authorize() like gc = gspread.authorize(credentials). In this modification, credentials is used.
The script in your question, worksheet = sh.add_worksheet(title='Overview', rows='100', cols='9'), adds a sheet to an existing Spreadsheet. When you create a new Spreadsheet using gspread, please use sh = gc.create('A new spreadsheet').
In this case, the new Spreadsheet is created in the root folder.
Preparation:
Before you use the following script, please enable the Drive API in the API console, and please add the scope https://www.googleapis.com/auth/drive. If you are already using the scope https://www.googleapis.com/auth/drive.file, you can keep it; it does not need to be changed to https://www.googleapis.com/auth/drive.
If you are using OAuth2, please remove the file including the refresh token. And then, please run the script and reauthorize again. By this, the added scope is reflected to the access token.
If you are using Service account, it is not required to remove the file.
Pattern 1:
The following sample script creates new Spreadsheet to the specific folder.
Sample script:
from apiclient import discovery
destFolderId = '###'  # Please set the destination folder ID.
title = '###'  # Please set the Spreadsheet name.
drive_service = discovery.build('drive', 'v3', credentials=credentials)  # Use "credentials" of "gspread.authorize(credentials)".
file_metadata = {
    'name': title,
    'mimeType': 'application/vnd.google-apps.spreadsheet',
    'parents': [destFolderId]
}
file = drive_service.files().create(body=file_metadata).execute()
print(file)
Pattern 2:
If you want to move the existing Spreadsheet to the specific folder, please use the following script.
Sample script:
from apiclient import discovery
spreadsheetId = '###'  # Please set the Spreadsheet ID.
destFolderId = '###'  # Please set the destination folder ID.
drive_service = discovery.build('drive', 'v3', credentials=credentials)  # Use "credentials" of "gspread.authorize(credentials)".
# Retrieve the existing parents to remove
file = drive_service.files().get(fileId=spreadsheetId,
                                 fields='parents').execute()
previous_parents = ",".join(file.get('parents'))
# Move the file to the new folder
file = drive_service.files().update(fileId=spreadsheetId,
                                    addParents=destFolderId,
                                    removeParents=previous_parents,
                                    fields='id, parents').execute()
References:
Files: create
Files: update
Moving files between folders
If I misunderstood your question and this was not the direction you want, I apologize.
Edit:
When you want to share the folder, please use the following script.
Sample script:
drive_service = discovery.build('drive', 'v3', credentials=credentials)  # Use "credentials" of "gspread.authorize(credentials)".
folderId = "###"  # Please set the folder ID.
permission = {
    'type': 'user',
    'role': 'writer',
    'emailAddress': '###',  # Please set the email address of the user that you want to share with.
}
res = drive_service.permissions().create(fileId=folderId, body=permission).execute()
print(res)
Reference:
Permissions: create
Here is a solution that works for me.
It creates a new Google spreadsheet within a folder on Google Drive.
NOTE 1: the folder must be shared with the Google developer account (something like getgooglesheets@<your-app-name>.iam.gserviceaccount.com).
This email can be found in your keyfile.
GOOGLE API V2
import os
from googleapiclient import discovery
from oauth2client.service_account import ServiceAccountCredentials
PATH_TO_KEYFILE = "./"
KEYFILE = "<appname>-<numbers>.json"  # Download from google dev account
DESTFOLDER_ID = "1lZs9O...xLW8D"  # Some folder on Google drive
TITLE = "My_New_Spread_Sheet"
EMAIL = "<new_owner_email>@gmail.com"
def share_file(...):
    """ Routine that changes rights and ownership of the file """
    ...
    return
file_metadata = {
    'title': TITLE,
    'mimeType': 'application/vnd.google-apps.spreadsheet',
    'parents': [{'id': DESTFOLDER_ID}]
}
scope = ['https://spreadsheets.google.com/feeds', 'https://www.googleapis.com/auth/drive']
keyfile = os.path.join(PATH_TO_KEYFILE, KEYFILE)
credentials = ServiceAccountCredentials.from_json_keyfile_name(keyfile, scope)
drive_service = discovery.build('drive', 'v2', credentials=credentials)  # VERSION 2 GOOGLE API
file = drive_service.files().insert(body=file_metadata).execute()  # Here the file is created!
print(file)  # full Google API response
print(f"🔗 LINK: https://docs.google.com/spreadsheets/d/{file['id']}/edit?usp=drivesdk")  # output for convenience...
share_file(file_id=file['id'], email=EMAIL, change_ownership=True)  # Here I change ownership of the file — that is out of the scope of the question
NOTE 2: the new file initially belongs to the developer account, not to the folder owner's account.
GOOGLE API VER 3
same as above, differs in the following:
...
drive_service = discovery.build('drive', 'v3', credentials=credentials)  # VERSION 3 GOOGLE API
file_metadata = {
    'name': TITLE,  # use 'name' property
    'mimeType': 'application/vnd.google-apps.spreadsheet',
    'parents': [DESTFOLDER_ID]  # no dictionary inside the list
}
response = drive_service.files().create(body=file_metadata).execute()  # 'create()' instead of insert()
...
Also, the response details will slightly differ.
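The two variants above differ only in the metadata body; a small helper (illustrative, not part of any library) captures both shapes side by side:

```python
def spreadsheet_metadata(title, folder_id, api_version=3):
    """Build the files metadata for creating a Sheets file inside a
    folder: Drive v2 uses 'title' and a list of {'id': ...} parent
    dicts, while v3 uses 'name' and a plain list of folder ids."""
    mime = 'application/vnd.google-apps.spreadsheet'
    if api_version == 2:
        return {'title': title, 'mimeType': mime,
                'parents': [{'id': folder_id}]}
    return {'name': title, 'mimeType': mime, 'parents': [folder_id]}

# Usage sketch:
# drive_service.files().create(
#     body=spreadsheet_metadata(TITLE, DESTFOLDER_ID)).execute()
```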

Google Api Client Library for python doesn't return nextPageToken in Files: list method with query

I need to get the list of all files and folders in Google Drive owned by a user. For some reason, the files.list method doesn't return nextPageToken in the response and I see only a few results, since I can't go to the next page.
I have tried API Client Library for python and API explorer, but I receive only several records per user.
My code
users = ['user1@domain.com', 'user2@domain.com']
drive_array = []
if users:
    for item in users:
        page_token_drive = None
        query = "'%s' in owners" % (item)
        while True:
            drive_result = service_drive.files().list(q=query, corpora='domain', includeTeamDriveItems=False,
                                                      supportsTeamDrives=False, fields='nextPageToken, files(id,owners)',
                                                      pageToken=page_token_drive).execute()
            drive_array.extend(drive_result.get('files', []))
            page_token_drive = drive_result.get('nextPageToken', None)
            if not page_token_drive:
                break
I expect to get the id and owners array for all files owned by the user:
[
    {
        "id": "12344",
        "owners": [
            {
                "kind": "drive#user",
                "displayName": "User1 User1",
                "photoLink": "https://lg",
                "me": false,
                "permissionId": "1234556",
                "emailAddress": "user1@domain.com"
            }
        ]
    },
    {
        "id": "09875",
        "owners": [
            {
                "kind": "drive#user",
                "displayName": "User1 User1",
                "photoLink": "https://lh5",
                "me": false,
                "permissionId": "56565665655656566",
                "emailAddress": "user1@domain.com"
            }
        ]
    }
]
According to the documentation, to authorize requests to G Suite APIs, you need to use OAuth 2.0, and you basically have two options (or flows, if you want to stick with the official terminology):
User authorization with a consent screen (e.g. OAuth 2.0 for installed applications)
Domain-wide delegation for server-to-server applications (e.g. Using OAuth 2.0 for Server to Server Applications)
With the first option, once the user completes the flow, you can only access that user's resources. So if you want to list all the contents of Drive for different users in the G Suite domain, you need to use the second option.
I also recommend using the Python client's pagination feature to manage the listing of files.
Here is a working example of option 1 (Python 3.6)
import os
import pickle
from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request
SCOPES = ['https://www.googleapis.com/auth/drive', ]
users = ['user1@domain.eu', 'user2@domain.eu']
# we check if we saved the credentials in the past and we reuse them
if not os.path.exists('credentials.dat'):
    # no credentials found, we run the standard auth flow
    flow = InstalledAppFlow.from_client_secrets_file('client_id.json', SCOPES)
    credentials = flow.run_local_server()
    with open('credentials.dat', 'wb') as credentials_dat:
        pickle.dump(credentials, credentials_dat)
else:
    with open('credentials.dat', 'rb') as credentials_dat:
        credentials = pickle.load(credentials_dat)
    if credentials.expired:
        credentials.refresh(Request())
drive_sdk = build('drive', 'v3', credentials=credentials)
# drive files API
drive_files_api = drive_sdk.files()
for item in users:
    query = "'{}' in owners".format(item)
    drive_list_params = {
        'q': query,
        'corpora': 'domain',
        'includeTeamDriveItems': False,
        'supportsTeamDrives': False,
        'fields': 'files(id,owners),nextPageToken',
    }
    # first request
    files_list_req = drive_files_api.list(**drive_list_params)
    while files_list_req is not None:
        drive_file_list = files_list_req.execute()
        print(drive_file_list.get('files', []))
        # pagination handling
        files_list_req = drive_files_api.list_next(files_list_req, drive_file_list)
If you run this, you will be prompted for authorization and the script will run on your drive listing files owned by other users and shared with you.
If you want to use the server-to-server flow with domain-wide delegation to list all the files (not just the ones shared with you), here is another working sample.
from googleapiclient.discovery import build
from google.oauth2 import service_account
SCOPES = ['https://www.googleapis.com/auth/drive', ]
users = ['user1@domain.eu', 'user2@domain.eu']
credentials = service_account.Credentials.from_service_account_file('client_secret.json', scopes=SCOPES)
for item in users:
    delegated_credentials = credentials.with_subject(item)
    drive_sdk = build('drive', 'v3', credentials=delegated_credentials)
    # drive files API
    drive_files_api = drive_sdk.files()
    drive_list_params = {
        'corpora': 'domain',
        'includeTeamDriveItems': False,
        'supportsTeamDrives': False,
        'fields': 'files(id,owners),nextPageToken',
    }
    # first request
    files_list_req = drive_files_api.list(**drive_list_params)
    while files_list_req is not None:
        drive_file_list = files_list_req.execute()
        print(drive_file_list.get('files', []))
        # pagination handling
        files_list_req = drive_files_api.list_next(files_list_req, drive_file_list)
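list_next() is the client library's built-in pagination; for completeness, the hand-rolled loop from the question can be factored into a generic generator (a sketch that works with any list-style method returning nextPageToken; the function name is mine):

```python
def iter_drive_files(list_method, **params):
    """Yield every item across pages by re-issuing the request with
    each response's nextPageToken until the token disappears. Note:
    `fields` must include nextPageToken, or pagination silently stops
    after the first page (the likely cause of the original problem)."""
    page_token = None
    while True:
        page = list_method(pageToken=page_token, **params).execute()
        yield from page.get('files', [])
        page_token = page.get('nextPageToken')
        if not page_token:
            break

# Usage sketch:
# for f in iter_drive_files(drive_sdk.files().list, q=query,
#                           fields='nextPageToken, files(id,owners)'):
#     print(f['id'])
```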
