I am trying to get the thumbnailLink for files in a shared drive using Python and the Google Drive API; however, the file information does not include the thumbnailLink (although for most files the hasThumbnail field, which I do get for the file, has a value of true).
I have looked around a lot and none of the solutions I have found seem to work (although this is my first Python project as well as my first Google Drive API project, so I might just be ignorant of what I am doing).
What I have tried:
- setting the scope to 'https://www.googleapis.com/auth/drive' (was ..drive.metadata.readonly before)
- using a wildcard, as such: results = drive.files().list(pageSize=10, fields="*", blablabla...). If I, for instance, try fields="thumbnailLink", it doesn't find any files.
- after getting the list, I tried using the id of each file from that list to do file = service.files().get(fileId=item_id, supportsAllDrives=True, fields="*").execute(), but the same thing happens: I get many fields, including the hasThumbnail field which is set to true, yet no thumbnail link.
- I tried using the "Try this API" console on the official website, where I did in fact get the thumbnailLink (with the same parameters as above). So I do not understand why it is missing when requested from my application.
Edit (code):
I have one method like so:
SCOPES = ['https://www.googleapis.com/auth/drive']

def getDrive():
    """Shows basic usage of the Drive v3 API.
    Prints the names and ids of the first 10 files the user has access to.
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=53209)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)
    service = build('drive', 'v3', credentials=creds)
    return service
Then I call it from here and also get the files:
def getFiles(request):
    drive = getDrive()
    # Call the Drive v3 API
    results = drive.files().list(
        pageSize=10, fields="*", driveId="blabla", includeItemsFromAllDrives=True,
        corpora="drive", supportsAllDrives=True).execute()
    items = results.get('files', [])
    getItems = []
    for item in items:
        item_id = item['id']
        getItems.append(drive.files().get(
            fileId=item_id, supportsAllDrives=True, fields="*").execute())
    if not items:
        print('No files found.')
    else:
        print('Files:')
        print(getItems)
        for item in items:
            # print(u'{0} ({1})'.format(item['name'], item['id']))
            print(item)
    return render(request, "index.html", {'files': getItems})
Also, yes, I do use a service account; I can retrieve all the files I need, just not the thumbnailLink.
I don't think it makes sense to call list() and then also get(), but I had read that the problem could be solved through the get() method, which in my case did not work.
The issue is in the structure of the response
If you specify fields="*", the response will be something like:
{
    "kind": "drive#fileList",
    ...
    "files": [
        {
            "kind": "drive#file",
            ...
            "hasThumbnail": true,
            "thumbnailLink": "XXX",
            "thumbnailVersion": "XXX",
            ...
        },
        ...
    ]
}
So, thumbnailLink is nested inside files.
In order to retrieve it, specify:
fields='files(id, thumbnailLink)'
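For example, here is a minimal sketch of pulling the link out of the nested response (the file ids and link below are made up, and the list() call itself is omitted since it needs live credentials):

```python
def extract_thumbnail_links(response):
    """Map file id -> thumbnailLink for files that actually have one."""
    return {
        f["id"]: f["thumbnailLink"]
        for f in response.get("files", [])
        if "thumbnailLink" in f
    }

# Shape of a files().list response when fields='files(id, thumbnailLink)':
sample_response = {
    "files": [
        {"id": "abc123", "thumbnailLink": "https://example.com/thumb1"},
        {"id": "def456"},  # e.g. a folder, where hasThumbnail is false
    ]
}

print(extract_thumbnail_links(sample_response))
```

Note that thumbnailLink is simply absent (rather than null) for files without one, so checking key membership is enough.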
I have created a function that is supposed to move all events from one Google Calendar to another. Here is how it looks:
def merge_calendar(email_from, email_to, service):
    off_board_user_calendar = service.events().list(calendarId=email_from).execute()
    off_board_user_events = off_board_user_calendar.get('items', [])

    # I tried to use this code, to resolve this "You need to have reader
    # access to this calendar." error, but it didn't work
    #
    # rule = {
    #     'scope': {
    #         'type': 'user',
    #         'value': email_from,
    #     },
    #     'role': 'reader'
    # }
    #
    # created_rule = service.acl().insert(calendarId=email_from, body=rule).execute()
    # print(f'Updated ACL rule {created_rule}')

    for event in off_board_user_events:
        updated_event = service.events().move(
            calendarId=email_from,
            eventId=event['id'],
            destination=email_to
        ).execute()
        print(f'Event has been transferred: {updated_event["updated"]}')
    print('All events have been transferred successfully.')
Right after execution I get this error: "You need to have reader access to this calendar." As seen from the comment, I tried to resolve this error, but the commented code brings me another error, just "Forbidden".
I am not quite sure what I am doing wrong. How can I transfer all events from one calendar to another?
Also, I think it is important to mention how I create the service entity. I tried to do this using 2 methods:
Normal credentials:
creds = None
if os.path.exists('token.json'):
    creds = Credentials.from_authorized_user_file('token.json', SCOPES[api_name])
if not creds or not creds.valid:
    if creds and creds.expired and creds.refresh_token:
        creds.refresh(Request())
    else:
        flow = InstalledAppFlow.from_client_secrets_file(client_secret_file, SCOPES[api_name])
        creds = flow.run_local_server()
    with open('token.json', 'w') as token:
        token.write(creds.to_json())
and using a Google Service Account:
if delegated_user is not None:
    credentials = service_account.Credentials.from_service_account_file(
        'service.json', scopes=SCOPES[api_name])
    creds = credentials.with_subject(delegated_user)
Neither worked.
PS.
The Calendar scope I have is 'https://www.googleapis.com/auth/calendar'.
Thanks in advance!
Some things that you might look at:
Check the Domain Wide Delegation in your admin console and make sure that the service account ID is the same service account that you are using in your code.
Add the scope that you mentioned in your question, 'https://www.googleapis.com/auth/calendar', which is the broadest scope on the Calendar API.
Try to delegate the user with credentials.create_delegated(email) instead of credentials.with_subject(delegated_user).
Actually, there is no need to transfer events one by one. It is enough just to update the ACL, like this:
def merge_calendar(email_from, email_to, service):
    rule = {
        'scope': {
            'type': 'user',
            'value': email_to,
        },
        'role': 'owner'
    }
    service.acl().insert(calendarId=email_from, body=rule).execute()
You will just get an email with a proposal to add this calendar to your Google Calendar.
As for authentication, I had this user delegation:
credentials = service_account.Credentials.from_service_account_file(
    'service.json', scopes=['https://www.googleapis.com/auth/calendar'])
creds = credentials.with_subject(email_from)
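As a small illustration, the rule body from the snippet above can be built by a helper like this (a sketch; the email addresses are placeholders, and the acl().insert call is commented out since it needs live delegated credentials):

```python
def build_owner_rule(email_to):
    """ACL rule granting `email_to` owner access to a calendar."""
    return {
        "scope": {"type": "user", "value": email_to},
        "role": "owner",
    }

rule = build_owner_rule("new.owner@example.com")
# service.acl().insert(calendarId="old.owner@example.com", body=rule).execute()
print(rule)
```

Granting "owner" (rather than "reader") means the recipient can manage the old calendar outright, which is what makes the event-by-event move unnecessary.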
References
Google Service Account
"""
BEFORE RUNNING:
---------------
1. If not already done, enable the Google Sheets API
and check the quota for your project at
https://console.developers.google.com/apis/api/sheets
2. Install the Python client library for Google APIs by running
`pip install --upgrade google-api-python-client`
"""
# TODO: Change placeholder below to generate authentication credentials. See
# https://developers.google.com/sheets/quickstart/python#step_3_set_up_the_sample
#
# Authorize using one of the following scopes:
# 'https://www.googleapis.com/auth/drive'
# 'https://www.googleapis.com/auth/drive.file'
# 'https://www.googleapis.com/auth/spreadsheets'
SCOPES = ['https://www.googleapis.com/auth/spreadsheets']
creds = None
if os.path.exists('google.json'):
    creds = Credentials.from_authorized_user_file('google.json', SCOPES)
if not creds or not creds.valid:
    if creds and creds.expired and creds.refresh_token:
        creds.refresh(Request())
    else:
        flow = InstalledAppFlow.from_client_secrets_file(
            'CLIENT.json', SCOPES)
        creds = flow.run_local_server(port=0)
    with open('google.json', 'w') as token:
        token.write(creds.to_json())

service = discovery.build('sheets', 'v4', credentials=creds)

spreadsheet_body = {
    'sheets': [{
        'properties': {
            'title': str(files[0])
        }
    }]
}

request = service.spreadsheets().create(body=spreadsheet_body)
if request == str(files[0]):
    pass
else:
    response = request.execute()
    pprint(response)
How can I create a condition: if a Google Sheet with that name exists, don't proceed to create it? I read the documentation and didn't see a possible answer, or I am just misunderstanding the documentation. Please help, thank you.
I believe your goal is as follows.
You want to check whether a file (Google Spreadsheet) exists in Google Drive using a filename.
You want to achieve this using googleapis for Python.
In this case, how about the following sample script? In order to search for the file by its filename, the Drive API is used.
Sample script:
filename = str(files[0])
service = build("drive", "v3", credentials=creds)
results = service.files().list(
    pageSize=1,
    fields="files(id, name)",
    q="name='" + filename + "' and mimeType='application/vnd.google-apps.spreadsheet' and trashed=false",
).execute()
files = results.get("files", [])
if not files:
    # When the file of filename is not found, this script is run.
    print("No files were found.")
else:
    # When the file of filename is found, this script is run.
    print("Files were found.")
When this script is run, you can check whether a file with that filename exists in Google Drive.
In this case, please add the scope "https://www.googleapis.com/auth/drive.metadata.readonly" as follows, and reauthorize the scopes: remove the google.json file and run the script again.
SCOPES = [
    "https://www.googleapis.com/auth/spreadsheets",
    "https://www.googleapis.com/auth/drive.metadata.readonly",
]
From your question, I couldn't tell whether you are trying to use the script with a shared drive. As written, this script cannot be used with a shared drive. If you want to use it with a shared drive, please include corpora="allDrives", includeItemsFromAllDrives=True, supportsAllDrives=True in the request.
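As a sketch of that modification, the extra shared-drive flags could be merged into the request parameters like this (build_list_params is a hypothetical helper, and the commented line is where the real files().list call would go):

```python
def build_list_params(filename, shared_drive=False):
    """Build keyword arguments for files().list, optionally for shared drives."""
    params = {
        "pageSize": 1,
        "fields": "files(id, name)",
        "q": (
            "name='" + filename + "' and "
            "mimeType='application/vnd.google-apps.spreadsheet' and "
            "trashed=false"
        ),
    }
    if shared_drive:
        # Extra flags so shared-drive items are searched and returned.
        params.update(
            corpora="allDrives",
            includeItemsFromAllDrives=True,
            supportsAllDrives=True,
        )
    return params

params = build_list_params("My Sheet", shared_drive=True)
# results = service.files().list(**params).execute()
```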
Reference:
Files: list
I am trying to create a call that gets all the group Gmail emails so that I can update those that aren't there and delete those that shouldn't be. I am currently trying the code below, and I'm getting a scope error.
# If modifying these scopes, delete the file token.json.
SCOPES = ['https://www.googleapis.com/auth/admin.directory.group.members',
          'https://www.googleapis.com/auth/admin.directory.group']

def main():
    """Shows basic usage of the Admin SDK Directory API.
    Prints the emails and names of the first 10 users in the domain.
    """
    creds = None
    # The file token.json stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.json'):
        creds = Credentials.from_authorized_user_file('token.json', SCOPES)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.json', 'w') as token:
            token.write(creds.to_json())
    service = build('admin', 'directory_v1', credentials=creds)
    # Call the Admin SDK Directory API
    print('Getting the members of Hospitality Team')
    response_group = service.groups().list(customer='my_customer').execute()
    for group in response_group['groups']:
        print(group['email'])
Solution:
You could do the following:
List all your groups via Groups: list.
For each group, check whether it has members.
If the group has members, retrieve its members via Members: list.
For each desired member coming from the other API, check if it already exists in the group. If it doesn't exist, add it to your group via Members: insert.
For each current member in the group, check if it's one of the desired members. If it's not, delete it via Members: delete.
If the group does not have members, add all desired members to the group via Members: insert.
Code snippet:
def updateGroupMembers(service):
    ideal_member_emails = ["member_1#example.com", "member_2#example.com", "member_3#example.com"]
    response_group = service.groups().list(customer='my_customer').execute()
    for group in response_group['groups']:
        group_email = group['email']
        response_members = service.members().list(groupKey=group_email).execute()
        if "members" in response_members:
            current_member_emails = list(map((lambda member: member["email"]), response_members["members"]))
            for ideal_member_email in ideal_member_emails:
                if ideal_member_email not in current_member_emails:
                    payload = {
                        "email": ideal_member_email
                    }
                    service.members().insert(groupKey=group_email, body=payload).execute()
            for current_member_email in current_member_emails:
                if current_member_email not in ideal_member_emails:
                    service.members().delete(groupKey=group_email, memberKey=current_member_email).execute()
        else:
            for ideal_member_email in ideal_member_emails:
                payload = {
                    "email": ideal_member_email
                }
                service.members().insert(groupKey=group_email, body=payload).execute()
Notes:
The scopes you are providing should be enough for these calls. If you edited those scopes after last authenticating, remove the old token.json and authenticate again.
Make sure the authenticated user has edit access to these groups.
Here I'm assuming the desired list of members is the same for all groups. I'm also assuming you have a list of these emails (currently ideal_member_emails). If that's not the case, please edit the provided script according to your preferences.
If your list of groups and members is large enough, you should iteratively fetch the different page results for your list requests. See this related answer (regarding Users: list, but the process is identical) for more information on how to do this.
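The pagination pattern mentioned in that last note can be sketched generically like this (demonstrated with a stub standing in for the real members().list call, since the loop only depends on the nextPageToken field):

```python
def list_all(list_page):
    """Collect items from every page of a paginated list call.

    `list_page` is a callable taking a pageToken and returning one page
    (a dict that may contain a 'nextPageToken').
    """
    items, page_token = [], None
    while True:
        page = list_page(page_token)
        items.extend(page.get("members", []))
        page_token = page.get("nextPageToken")
        if not page_token:
            return items

# Stub standing in for service.members().list(...).execute():
pages = {
    None: {"members": [{"email": "a@example.com"}], "nextPageToken": "t1"},
    "t1": {"members": [{"email": "b@example.com"}]},
}
all_members = list_all(lambda token: pages[token])
print([m["email"] for m in all_members])
```

Against the real API, `list_page` would be something like `lambda token: service.members().list(groupKey=group_email, pageToken=token).execute()`.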
Reference:
Python library: members
I need to get the list of all files and folders in Google Drive owned by a user. For some reason the files.list method doesn't return nextPageToken in the response, and I see only a few results, since I can't go to the next page.
I have tried the API Client Library for Python and the API Explorer, but I receive only several records per user.
My code
users = ['user1#domain.com', 'user2#domain.com']
drive_array = []
if users:
    for item in users:
        page_token_drive = None
        query = "'%s' in owners" % (item)
        while True:
            drive_result = service_drive.files().list(
                q=query, corpora='domain', includeTeamDriveItems=False,
                supportsTeamDrives=False, fields='nextPageToken, files(id,owners)',
                pageToken=page_token_drive).execute()
            drive_array.extend(drive_result.get('files', []))
            page_token_drive = drive_result.get('nextPageToken', None)
            if not page_token_drive:
                break
I expect to get the id of a file and the owners array for all files owned by the user:
[
    {
        "id": "12344",
        "owners": [
            {
                "kind": "drive#user",
                "displayName": "User1 User1",
                "photoLink": "https://lg",
                "me": false,
                "permissionId": "1234556",
                "emailAddress": "user1#domain.com"
            }
        ]
    },
    {
        "id": "09875",
        "owners": [
            {
                "kind": "drive#user",
                "displayName": "User1 User1",
                "photoLink": "https://lh5",
                "me": false,
                "permissionId": "56565665655656566",
                "emailAddress": "user1#domain.com"
            }
        ]
    }
]
According to the documentation, to authorize requests to G Suite APIs, you need to use OAuth 2.0 and you basically have two options (or flows if you want to stick with the official terminology):
User authorization with a consent screen (e.g OAuth 2.0 for installed applications)
Domain-wide delegation for server to server applications (e.g. Using OAuth 2.0 for Server to Server Applications)
With the first option, once the user completes the flow, you can only access their resources. So if you want to list all the contents of Drive for different users in the G Suite domain, you need to use the second option.
I also recommend using the Python client pagination feature to manage the listing of files.
Here is a working example of option 1 (Python 3.6):
import os
import pickle

from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request

SCOPES = ['https://www.googleapis.com/auth/drive', ]
users = ['user1#domain.eu', 'user2#domain.eu']

# check if we saved the credentials in the past and reuse them
if not os.path.exists('credentials.dat'):
    # no credentials found, we run the standard auth flow
    flow = InstalledAppFlow.from_client_secrets_file('client_id.json', SCOPES)
    credentials = flow.run_local_server()
    with open('credentials.dat', 'wb') as credentials_dat:
        pickle.dump(credentials, credentials_dat)
else:
    with open('credentials.dat', 'rb') as credentials_dat:
        credentials = pickle.load(credentials_dat)

if credentials.expired:
    credentials.refresh(Request())

drive_sdk = build('drive', 'v3', credentials=credentials)

# drive files API
drive_files_api = drive_sdk.files()

for item in users:
    query = "'{}' in owners".format(item)
    drive_list_params = {
        'q': query,
        'corpora': 'domain',
        'includeTeamDriveItems': False,
        'supportsTeamDrives': False,
        'fields': 'files(id,owners),nextPageToken',
    }
    # first request
    files_list_req = drive_files_api.list(**drive_list_params)
    while files_list_req is not None:
        drive_file_list = files_list_req.execute()
        print(drive_file_list.get('files', []))
        # pagination handling
        files_list_req = drive_files_api.list_next(files_list_req, drive_file_list)
If you run this, you will be prompted for authorization, and the script will run on your drive, listing files owned by other users and shared with you.
If you want to use the server-to-server flow with domain-wide delegation to list all the files (not just the ones shared with you), here is another working sample:
from googleapiclient.discovery import build
from google.oauth2 import service_account

SCOPES = ['https://www.googleapis.com/auth/drive', ]
users = ['user1#domain.eu', 'user2#domain.eu']

credentials = service_account.Credentials.from_service_account_file('client_secret.json', scopes=SCOPES)

for item in users:
    delegated_credentials = credentials.with_subject(item)
    drive_sdk = build('drive', 'v3', credentials=delegated_credentials)
    # drive files API
    drive_files_api = drive_sdk.files()
    drive_list_params = {
        'corpora': 'domain',
        'includeTeamDriveItems': False,
        'supportsTeamDrives': False,
        'fields': 'files(id,owners),nextPageToken',
    }
    # first request
    files_list_req = drive_files_api.list(**drive_list_params)
    while files_list_req is not None:
        drive_file_list = files_list_req.execute()
        print(drive_file_list.get('files', []))
        # pagination handling
        files_list_req = drive_files_api.list_next(files_list_req, drive_file_list)
SCOPES_SHEETS = 'https://www.googleapis.com/auth/spreadsheets'
Gives read/write permissions ^
def main():
    service_sheets = get_google_service('sheets', 'v4', 'token_sheets.json', SCOPES_SHEETS)
    with open('services.pkl', 'wb') as f:
        pickle.dump(service_sheets, f)
    with open('services.pkl', 'rb') as f:
        service_sheets = pickle.load(f)
    with open('serviceCopy.pkl', 'wb') as f:
        pickle.dump(service_sheets, f)

def get_google_service(type, version, token, scope):
    store = file.Storage(token)
    creds = store.get()
    if not creds or creds.invalid:
        flow = client.flow_from_clientsecrets('credentials.json', scope)
        creds = tools.run_flow(flow, store)
    return build(type, version, http=creds.authorize(Http()))
I have a program that I want to run as a service in the background. It involves reading/writing things in a Google Sheet. For this I have to create a Google service, but I don't want to do it every time the code runs, so I'm trying to store the service object in a file instead. For some reason the file services.pkl is different from serviceCopy.pkl. I've tried changing the encoding with pickle.load(file, encoding='utf8'), but I keep getting files that don't match.
To my understanding, they should be exactly the same.
I think the issue is with loading the saved file but I'm not sure what's causing it.
I'm using python 3.6.
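As a general note on pickling (a sketch, not specific to this library): plain data round-trips through pickle deterministically within a run, while objects holding live state, such as the HTTP client inside a built service, may not serialize to identical bytes each time. A common pattern, seen in the Drive examples above, is to pickle only the credentials and rebuild the service with build() on every run:

```python
import pickle

# Plain data pickles deterministically within a run:
creds_like = {"token": "abc", "refresh_token": "def", "scopes": ["sheets"]}
assert pickle.dumps(creds_like) == pickle.dumps(creds_like)

# Round-tripping recovers an equal object:
restored = pickle.loads(pickle.dumps(creds_like))
assert restored == creds_like

# So instead of pickling service_sheets itself, pickle the credentials
# and rebuild the service each run (sketch; names as in the snippet above):
# with open("creds.pkl", "wb") as f:
#     pickle.dump(creds, f)
# with open("creds.pkl", "rb") as f:
#     creds = pickle.load(f)
# service_sheets = build("sheets", "v4", credentials=creds)
```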