How to check if a Google Sheet exists? (Python)

"""
BEFORE RUNNING:
---------------
1. If not already done, enable the Google Sheets API
and check the quota for your project at
https://console.developers.google.com/apis/api/sheets
2. Install the Python client library for Google APIs by running
`pip install --upgrade google-api-python-client`
"""
# TODO: Change placeholder below to generate authentication credentials. See
# https://developers.google.com/sheets/quickstart/python#step_3_set_up_the_sample
#
# Authorize using one of the following scopes:
# 'https://www.googleapis.com/auth/drive'
# 'https://www.googleapis.com/auth/drive.file'
# 'https://www.googleapis.com/auth/spreadsheets'
import os
from pprint import pprint

from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient import discovery

SCOPES = ['https://www.googleapis.com/auth/spreadsheets']

creds = None
if os.path.exists('google.json'):
    creds = Credentials.from_authorized_user_file('google.json', SCOPES)
if not creds or not creds.valid:
    if creds and creds.expired and creds.refresh_token:
        creds.refresh(Request())
    else:
        flow = InstalledAppFlow.from_client_secrets_file('CLIENT.json', SCOPES)
        creds = flow.run_local_server(port=0)
    with open('google.json', 'w') as token:
        token.write(creds.to_json())

service = discovery.build('sheets', 'v4', credentials=creds)
spreadsheet_body = {
    'sheets': [{
        'properties': {
            'title': str(files[0])
        }
    }]
}
request = service.spreadsheets().create(body=spreadsheet_body)
if request == str(files[0]):
    pass
else:
    response = request.execute()
    pprint(response)
How can I add a condition so that, if a Google Sheet with that name already exists, the script does not proceed to create it? I read the documentation and didn't find an answer, or perhaps I just misunderstood it. Please help, thank you.

I believe your goal is as follows.
You want to check whether a file (Google Spreadsheet) exists in Google Drive, searching by filename.
You want to achieve this using googleapis for Python.
In that case, how about the following sample script? In order to search for the file by filename, the Drive API is used.
Sample script:
filename = str(files[0])
service = build("drive", "v3", credentials=creds)
results = service.files().list(
    pageSize=1,
    fields="files(id, name)",
    q="name='" + filename + "' and mimeType='application/vnd.google-apps.spreadsheet' and trashed=false",
).execute()
files = results.get("files", [])
if not files:
    # When no file with this filename is found, this branch is run.
    print("No files were found.")
else:
    # When a file with this filename is found, this branch is run.
    print("Files were found.")
When this script is run, you can check whether a file with that filename exists in Google Drive.
In this case, please add the scope "https://www.googleapis.com/auth/drive.metadata.readonly" as follows, and reauthorize the scopes: remove the google.json file and run the script again.
SCOPES = [
    "https://www.googleapis.com/auth/spreadsheets",
    "https://www.googleapis.com/auth/drive.metadata.readonly",
]
From your question, I couldn't tell whether you are trying to use the script with a shared drive, so this modification does not support shared drives. If you do want to use the script with a shared drive, please add corpora="allDrives", includeItemsFromAllDrives=True, supportsAllDrives=True to the request.
Reference:
Files: list
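Putting the search and the creation together, here is a minimal hedged sketch of a check-then-create flow. The function names are my own, and it assumes `creds` is authorized with the scopes shown above:

```python
def spreadsheet_query(filename):
    """Build the Drive API search query for a spreadsheet with this name."""
    safe_name = filename.replace("'", "\\'")  # escape quotes for the query syntax
    return ("name='" + safe_name + "' "
            "and mimeType='application/vnd.google-apps.spreadsheet' "
            "and trashed=false")

def create_spreadsheet_if_missing(creds, filename):
    """Create a spreadsheet named `filename` only if none exists yet."""
    from googleapiclient.discovery import build  # requires google-api-python-client
    drive = build("drive", "v3", credentials=creds)
    found = drive.files().list(
        pageSize=1, fields="files(id, name)", q=spreadsheet_query(filename)
    ).execute()
    if found.get("files"):
        return found["files"][0]  # already exists: do not create another
    sheets = build("sheets", "v4", credentials=creds)
    return sheets.spreadsheets().create(
        body={"properties": {"title": filename}}
    ).execute()
```

The key point is that the existence check has to go through the Drive API; the Sheets API alone cannot search by name.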

Related

Cannot transfer google calendar events using Google API Python SDK

I have created a function that is supposed to move all events from one Google calendar to another. Here is how it looks:
def merge_calendar(email_from, email_to, service):
    off_board_user_calendar = service.events().list(calendarId=email_from).execute()
    off_board_user_events = off_board_user_calendar.get('items', [])
    # I tried to use this code to resolve the "You need to have reader access
    # to this calendar." error, but it didn't work:
    #
    # rule = {
    #     'scope': {
    #         'type': 'user',
    #         'value': email_from,
    #     },
    #     'role': 'reader'
    # }
    #
    # created_rule = service.acl().insert(calendarId=email_from, body=rule).execute()
    # print(f'Updated ACL rule {created_rule}')
    for event in off_board_user_events:
        updated_event = service.events().move(
            calendarId=email_from,
            eventId=event['id'],
            destination=email_to
        ).execute()
        print(f'Event has been transferred: {updated_event["updated"]}')
    print('All events have been transferred successfully.')
Right after execution I get this error: "You need to have reader access to this calendar." As you can see from the comment, I tried to resolve this error, but the commented code brings me another error, just "Forbidden".
I am not quite sure what I am doing wrong. How can I transfer all events from one calendar to another?
I also think it is important to mention how I create the service entity. I tried two methods:
Normal credentials:
creds = None
if os.path.exists('token.json'):
    creds = Credentials.from_authorized_user_file('token.json', SCOPES[api_name])
if not creds or not creds.valid:
    if creds and creds.expired and creds.refresh_token:
        creds.refresh(Request())
    else:
        flow = InstalledAppFlow.from_client_secrets_file(client_secret_file, SCOPES[api_name])
        creds = flow.run_local_server()
    with open('token.json', 'w') as token:
        token.write(creds.to_json())
and using a Google Service Account:
if delegated_user is not None:
    credentials = service_account.Credentials.from_service_account_file(
        'service.json', scopes=SCOPES[api_name])
    creds = credentials.with_subject(delegated_user)
Both didn't work.
PS.
Calendar scope I have is 'https://www.googleapis.com/auth/calendar'.
Thanks in advance!
Some things that you might look at:
Check the Domain-Wide Delegation in your admin console and make sure that the service account ID there is the same service account that you are using in your code.
Add the scope that you mentioned in your question, 'https://www.googleapis.com/auth/calendar', which grants full read/write access to the Calendar API.
Try delegating the user with credentials.create_delegated(email) instead of credentials.with_subject(delegated_user).
Actually, there is no need to transfer the events one by one. It is enough to just update the ACL, like this:
def merge_calendar(email_from, email_to, service):
    rule = {
        'scope': {
            'type': 'user',
            'value': email_to,
        },
        'role': 'owner'
    }
    service.acl().insert(calendarId=email_from, body=rule).execute()
You will then get an email with an invitation to add this calendar to your Google Calendar.
Talking about authentication, I had this user delegation:
credentials = service_account.Credentials.from_service_account_file(
    'service.json', scopes=['https://www.googleapis.com/auth/calendar'])
creds = credentials.with_subject(email_from)
References
Google Service Account
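If sharing via the ACL is not enough and you really do need independent copies of the events, a hedged sketch could look like the following. The helper names are mine, `service` is a Calendar API service as in the question, and the list of writable fields is a simplification (real events can carry more insertable properties):

```python
WRITABLE = ('summary', 'description', 'location', 'start', 'end')

def writable_fields(event):
    """Keep only the fields that can be inserted into the target calendar."""
    return {k: event[k] for k in WRITABLE if k in event}

def copy_events(service, email_from, email_to):
    """Copy events page by page from one calendar to another (a sketch)."""
    page_token = None
    while True:
        page = service.events().list(calendarId=email_from,
                                     pageToken=page_token).execute()
        for event in page.get('items', []):
            # Insert a stripped copy; inserting the raw event can fail on
            # read-only fields such as id, etag, and organizer.
            service.events().insert(calendarId=email_to,
                                    body=writable_fields(event)).execute()
        page_token = page.get('nextPageToken')
        if page_token is None:
            break
```

Unlike events().move, this does not require ownership of the source events, only read access to the source calendar and write access to the destination.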

SDK, Trying to call Members from Group Gmail and Update

I am trying to create a call that gets all the group email addresses so that I can add the members that aren't there and delete those that shouldn't be. I am currently trying the code below, and I'm getting a scope error.
# If modifying these scopes, delete the file token.json.
SCOPES = ['https://www.googleapis.com/auth/admin.directory.group.members', 'https://www.googleapis.com/auth/admin.directory.group']
def main():
    """Shows basic usage of the Admin SDK Directory API.
    Prints the emails and names of the first 10 users in the domain.
    """
    creds = None
    # The file token.json stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.json'):
        creds = Credentials.from_authorized_user_file('token.json', SCOPES)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.json', 'w') as token:
            token.write(creds.to_json())
    service = build('admin', 'directory_v1', credentials=creds)
    # Call the Admin SDK Directory API
    print('Getting the members of Hospitality Team')
    response_group = service.groups().list(customer='my_customer').execute()
    for group in response_group['groups']:
        print(group['email'])
Solution:
You could do the following:
List all your groups via Groups: list.
For each group, check whether it has members.
If the group has members, retrieve its members via Members: list.
For each desired member coming from the other API, check if it already exists in the group. If it doesn't exist, add it to your group via Members: insert.
For each current member in the group, check if it's one of the desired members. If it's not, delete it via Members: delete.
If the group does not have members, add all desired members to the group via Members: insert.
Code snippet:
def updateGroupMembers(service):
    ideal_member_emails = ["member_1@example.com", "member_2@example.com", "member_3@example.com"]
    response_group = service.groups().list(customer='my_customer').execute()
    for group in response_group['groups']:
        group_email = group['email']
        response_members = service.members().list(groupKey=group_email).execute()
        if "members" in response_members:
            current_member_emails = [member["email"] for member in response_members["members"]]
            for ideal_member_email in ideal_member_emails:
                if ideal_member_email not in current_member_emails:
                    payload = {
                        "email": ideal_member_email
                    }
                    service.members().insert(groupKey=group_email, body=payload).execute()
            for current_member_email in current_member_emails:
                if current_member_email not in ideal_member_emails:
                    service.members().delete(groupKey=group_email, memberKey=current_member_email).execute()
        else:
            for ideal_member_email in ideal_member_emails:
                payload = {
                    "email": ideal_member_email
                }
                service.members().insert(groupKey=group_email, body=payload).execute()
Notes:
The scopes you are providing should be enough for these calls. If you edited those scopes after last authenticating, remove the old token.json and authenticate again.
Make sure the authenticated user has edit access to these groups.
Here I'm assuming the desired list of members is the same for all groups. I'm also assuming you have a list of these emails (currently ideal_member_emails). If that's not the case, please edit the provided script according to your preferences.
If your list of groups and members is large enough, you should iteratively fetch the different page results for your list requests. See this related answer (regarding Users: list, but the process is identical) for more information on how to do this.
Reference:
Python library: members
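As the last note says, the list calls are paginated. A minimal sketch of walking all pages with nextPageToken (the function name is mine; `service` is the Directory API service from the question):

```python
def list_all_groups(service, customer='my_customer'):
    """Collect groups across all result pages using nextPageToken."""
    groups = []
    page_token = None
    while True:
        response = service.groups().list(customer=customer,
                                         pageToken=page_token).execute()
        groups.extend(response.get('groups', []))
        page_token = response.get('nextPageToken')
        if page_token is None:  # no more pages
            return groups
```

The same loop shape works for members().list if a group has more members than fit in one page.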

Can not retrieve thumbnailLink from google drive API

I am trying to get the thumbnailLink for files from a shared drive using Python and the Google Drive API. However, the file information does not include the thumbnailLink (although for most files the hasThumbnail field, which I do get for the file, has a value of true).
I have looked around a lot, and none of the solutions I have found seem to work (although this is my first Python project as well as my first Google Drive API project, so I might just be ignorant of what I am doing).
What I have tried:
- setting the scope to 'https://www.googleapis.com/auth/drive' (it was ..drive.metadata.readonly before)
- using a wildcard, as in: results = drive.files().list(pageSize=10, fields="*", blablabla...). If I instead try fields="thumbnailLink", it doesn't find any files.
- after getting the list, I tried using the id of each file from that list to do file = service.files().get(fileId=item_id, supportsAllDrives=True, fields="*").execute(), but the same happens: I get many fields, including the hasThumbnail field set to true, yet no thumbnail link.
- I tried using the "Try this API" console on the official website, where I did in fact get the thumbnailLink (with the same parameters as above). So I do not understand why this is missing when requested from my application.
Edit (code):
I have one method like so:
SCOPES = ['https://www.googleapis.com/auth/drive']

def getDrive():
    """Shows basic usage of the Drive v3 API.
    Prints the names and ids of the first 10 files the user has access to.
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=53209)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)
    service = build('drive', 'v3', credentials=creds)
    return service
Then I call it from here and also get the files:
def getFiles(request):
    drive = getDrive()
    # Call the Drive v3 API
    results = drive.files().list(
        pageSize=10, fields="*", driveId="blabla", includeItemsFromAllDrives=True,
        corpora="drive", supportsAllDrives=True).execute()
    items = results.get('files', [])
    getItems = []
    for item in items:
        item_id = item['id']
        getItems.append(drive.files().get(fileId=item_id, supportsAllDrives=True, fields="*").execute())
    if not items:
        print('No files found.')
    else:
        print('Files:')
        print(getItems)
        for item in items:
            # print(u'{0} ({1})'.format(item['name'], item['id']))
            print(item)
    return render(request, "index.html", {'files': getItems})
Also, yes, I do use a service account; I can retrieve all the files I need, just not the thumbnailLink.
I don't think it makes sense to call list() and then also get(), but I had read that the problem could be solved through the get() method, which in my case did not work.
The issue is in the structure of the response
If you specify fields="*", the response would be something like
{
    "kind": "drive#fileList",
    ...
    "files": [
        {
            "kind": "drive#file",
            ...
            "hasThumbnail": true,
            "thumbnailLink": "XXX",
            "thumbnailVersion": "XXX"
            ...
        }
        ...
    ]
}
So, thumbnailLink is nested inside of files.
In order to retrieve it specify:
fields='files(id, thumbnailLink)'
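For illustration, here is a small hedged sketch. The commented-out request shows the nested field path being selected, and the helper (a name I made up) pulls the links out of the response dict:

```python
def thumbnail_links(list_response):
    """Pull thumbnailLink values out of a files().list response dict."""
    return [f['thumbnailLink'] for f in list_response.get('files', [])
            if 'thumbnailLink' in f]

# The corrected request would select the nested field path, e.g.:
# results = drive.files().list(
#     pageSize=10,
#     fields="files(id, name, hasThumbnail, thumbnailLink)",
#     driveId="blabla", includeItemsFromAllDrives=True,
#     corpora="drive", supportsAllDrives=True,
# ).execute()
# print(thumbnail_links(results))
```

Files for which Drive has not generated a thumbnail simply omit the key, which the helper skips.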

Creating A Spreadsheet In A Folder With GSpread

I am having trouble finding any documentation on how to create a GSheet in a certain Google Drive directory using GSpread.
I have checked the documentation and had a look around some of the back end code.
I am currently using the code below to create the spreadsheet:
worksheet = sh.add_worksheet(title='Overview', rows='100', cols='9')
I want to be able to create the spreadsheet in a directory on a google drive, for example:
X > Y > Spreadsheet
Any help would be greatly appreciated,
Cheers.
You want to create a new Spreadsheet in a specific folder.
You want to achieve this using Python.
If my understanding is correct, how about this answer?
Modification points:
Unfortunately, the Sheets API cannot achieve this. In this case, it is required to use the Drive API.
In your script, I think that you use gspread.authorize() like gc = gspread.authorize(credentials). In this modification, credentials is used.
The script in your question, worksheet = sh.add_worksheet(title='Overview', rows='100', cols='9'), adds a sheet to an existing Spreadsheet. When you create a new Spreadsheet using gspread, please use sh = gc.create('A new spreadsheet').
In that case, however, the new Spreadsheet is created in the root folder.
Preparation:
Before you use the following script, please enable the Drive API in the API console and add the scope https://www.googleapis.com/auth/drive. If you are using the scope https://www.googleapis.com/auth/drive.file, you can keep using it; it does not need to be changed to https://www.googleapis.com/auth/drive.
If you are using OAuth2, please remove the file containing the refresh token, then run the script and reauthorize. This way, the added scope is reflected in the access token.
If you are using a Service account, it is not required to remove the file.
Pattern 1:
The following sample script creates new Spreadsheet to the specific folder.
Sample script:
from apiclient import discovery

destFolderId = '###'  # Please set the destination folder ID.
title = '###'  # Please set the Spreadsheet name.

drive_service = discovery.build('drive', 'v3', credentials=credentials)  # Use "credentials" of "gspread.authorize(credentials)".
file_metadata = {
    'name': title,
    'mimeType': 'application/vnd.google-apps.spreadsheet',
    'parents': [destFolderId]
}
file = drive_service.files().create(body=file_metadata).execute()
print(file)
Pattern 2:
If you want to move the existing Spreadsheet to the specific folder, please use the following script.
Sample script:
from apiclient import discovery

spreadsheetId = '###'  # Please set the Spreadsheet ID.
destFolderId = '###'  # Please set the destination folder ID.

drive_service = discovery.build('drive', 'v3', credentials=credentials)  # Use "credentials" of "gspread.authorize(credentials)".
# Retrieve the existing parents to remove
file = drive_service.files().get(fileId=spreadsheetId,
                                 fields='parents').execute()
previous_parents = ",".join(file.get('parents'))
# Move the file to the new folder
file = drive_service.files().update(fileId=spreadsheetId,
                                    addParents=destFolderId,
                                    removeParents=previous_parents,
                                    fields='id, parents').execute()
References:
Files: create
Files: update
Moving files between folders
If I misunderstood your question and this was not the direction you want, I apologize.
Edit:
When you want to share the folder, please use the following script.
Sample script:
drive_service = discovery.build('drive', 'v3', credentials=credentials)  # Use "credentials" of "gspread.authorize(credentials)".
folderId = "###"  # Please set the folder ID.
permission = {
    'type': 'user',
    'role': 'writer',
    'emailAddress': '###',  # Please set the email address of the user you want to share with.
}
res = drive_service.permissions().create(fileId=folderId, body=permission).execute()
print(res)
Reference:
Permissions: create
Here is a solution that works for me.
It creates a new Google Spreadsheet within a folder on Google Drive.
NOTE 1: the folder must be shared with the Google developer account (something like getgooglesheets@<your-app-name>.iam.gserviceaccount.com).
This email can be found in your keyfile.
GOOGLE API V2
from googleapiclient import discovery
from oauth2client.service_account import ServiceAccountCredentials

PATH_TO_KEYFILE = "./"
KEYFILE = "<appname>-<numbers>.json"  # Download from google dev account
DESTFOLDER_ID = "1lZs9O...xLW8D"  # Some folder on Google drive
TITLE = "My_New_Spread_Sheet"
EMAIL = "<new_owner_email>@gmail.com"

def share_file(...):
    """ Routine that changes rights and ownership of the file """
    ...
    return

file_metadata = {
    'title': TITLE,
    'mimeType': 'application/vnd.google-apps.spreadsheet',
    'parents': [{'id': DESTFOLDER_ID}]
}

scope = ['https://spreadsheets.google.com/feeds', 'https://www.googleapis.com/auth/drive']
keyfile = os.path.join(PATH_TO_KEYFILE, KEYFILE)
credentials = ServiceAccountCredentials.from_json_keyfile_name(keyfile, scope)
drive_service = discovery.build('drive', 'v2', credentials=credentials)  # VERSION 2 GOOGLE API
response = drive_service.files().insert(body=file_metadata).execute()  # Here the file is created!
print(response)  # full Google API response
print(f"🔗 LINK: https://docs.google.com/spreadsheets/d/{response['id']}/edit?usp=drivesdk")  # output for convenience...
share_file(file_id=response['id'], email=EMAIL, change_ownership=True)  # Here I change ownership of the file, which is out of the scope of the question
NOTE 2: the new file initially belongs to the developer account, not to the folder owner's account.
GOOGLE API VER 3
Same as above; it differs in the following:
...
drive_service = discovery.build('drive', 'v3', credentials=credentials)  # VERSION 3 GOOGLE API
file_metadata = {
    'name': TITLE,  # use the 'name' property
    'mimeType': 'application/vnd.google-apps.spreadsheet',
    'parents': [DESTFOLDER_ID]  # no dictionary inside the list
}
response = drive_service.files().create(body=file_metadata).execute()  # 'create()' instead of 'insert()'
...
Also, the response details will slightly differ.

Python Pickle.load() not loading correctly

SCOPES_SHEETS = 'https://www.googleapis.com/auth/spreadsheets'
Gives read/write permissions ^
def main():
    service_sheets = get_google_service('sheets', 'v4', 'token_sheets.json', SCOPES_SHEETS)
    with open('services.pkl', 'wb') as f:
        pickle.dump(service_sheets, f)
    with open('services.pkl', 'rb') as f:
        service_sheets = pickle.load(f)
    with open('serviceCopy.pkl', 'wb') as f:
        pickle.dump(service_sheets, f)

def get_google_service(type, version, token, scope):
    store = file.Storage(token)
    creds = store.get()
    if not creds or creds.invalid:
        flow = client.flow_from_clientsecrets('credentials.json', scope)
        creds = tools.run_flow(flow, store)
    return build(type, version, http=creds.authorize(Http()))
I have a program that I want to run as a service in the background. It involves reading/writing things in a Google Sheet. For this I have to create a Google service object, but I don't want to create it every time the code runs, so I'm trying to store the service object in a file instead. For some reason the file services.pkl is different from serviceCopy.pkl. I've tried changing the encoding with pickle.load(file, encoding='utf8'), but I keep getting files that don't match.
To my understanding they should be exactly the same.
I think the issue is with loading the saved file, but I'm not sure what's causing it.
I'm using Python 3.6.
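For what it's worth, a service object wraps live HTTP state, so two pickles of it are not guaranteed to be byte-identical, and unpickling it is fragile. A common workaround, sketched here under names of my own (save_creds/load_creds are hypothetical helpers, not from the question), is to persist only the credentials and rebuild the cheap service wrapper on each run:

```python
import os
import pickle

def save_creds(creds, path='creds.pkl'):
    """Persist only the (picklable) credentials, not the service object."""
    with open(path, 'wb') as f:
        pickle.dump(creds, f)

def load_creds(path='creds.pkl'):
    """Return saved credentials, or None if none were stored yet."""
    if not os.path.exists(path):
        return None
    with open(path, 'rb') as f:
        return pickle.load(f)

# Hypothetical usage, rebuilding the service every run:
# creds = load_creds() or run_oauth_flow()  # run_oauth_flow() is your existing auth code
# service = build('sheets', 'v4', http=creds.authorize(Http()))
```

Building the service from cached credentials is fast; only the OAuth flow itself is expensive, and the credentials file already avoids repeating that.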
