I'm trying to get a list of folders that I created in Google Drive (Python, Flask):
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/drive']
drive_credentials = service_account.Credentials.from_service_account_file(
    json_url, scopes=SCOPES)
drive_service = build('drive', 'v3', credentials=drive_credentials)
results = drive_service.files().list(
    q="mimeType='application/vnd.google-apps.folder'",
    spaces='drive').execute()
results['files'] is an empty list.
I cannot figure out what is wrong. If I try the same query here https://developers.google.com/drive/api/v3/reference/files/list?apix_params=%7B%22q%22%3A%22mimeType%20%3D%20%27application%2Fvnd.google-apps.folder%27%22%7D I see all my folders.
Also, if I remove the query I can see all my files, but not the folders.
UPD. I found that it doesn't see any files other than the "Getting started" PDF. I created a couple of test files and the query still returns only one result.
I found what the problem was. Even though the scope nominally grants "full access", it actually does not: the service account only sees files it owns or that have been shared with it. Only after I gave permission to the service account's own email did it start working.
What I did: in the Google Drive interface I selected the folders, clicked "Share" and entered the email address from the credentials: xxx@yyy.iam.gserviceaccount.com
That's weird, but it worked.
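For reference, a minimal sketch (reusing drive_service from above; the fields parameter is just an illustration) to confirm which folders the service account can now see:

# list only folders, returning just their ids and names
results = drive_service.files().list(
    q="mimeType='application/vnd.google-apps.folder' and trashed=false",
    spaces='drive',
    fields='files(id, name)').execute()
for folder in results.get('files', []):
    print(folder['name'], folder['id'])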
I'm using Google Earth Engine in Python to build a Sentinel-2 composite and download it to my Google Drive. When doing manual authentication everything works fine:
ee.Authenticate()
ee.Initialize()
However, since I want to use my code in a workflow and don't want to go through the manual authentication every time, I am using a service account, as described here. It works fine and I can use GEE without doing anything manually:
import ee

# get service account
service_account = 'test@test.iam.gserviceaccount.com'
# get credentials
credentials = ee.ServiceAccountCredentials(service_account, 'gee_secret.json')
ee.Initialize(credentials)
In order to export my file to Google Drive, I'm using the following code:
# export options
export_config = {
    'scale': 10,
    'region': aoi,  # aoi is a polygon
    'crs': 'EPSG:3031',
}
file_name = "test"
# export to drive
task = ee.batch.Export.image.toDrive(image, file_name, **export_config)
task.start()
With both authentication methods this task finishes successfully (the status of the task is 'Completed'). However, only when using the manual authentication can I see my exported image in my Drive. When using the automatic authentication, my Drive is empty.
Someone else already asked a similar question here. A possible idea there was that the image file is exported to the Google Drive of the service account and not to my personal Google Drive. However, I am not sure how to access this other Drive.
Does anyone have an idea how to solve this (i.e. how to access the exported file)? Or another solution for automatic authentication in which the file ends up in my personal Google Drive?
Many thanks for the hints from DaImTo and Yancy Godoy! With these I could find a solution. I will post it here so that it may be useful for others as well.
Indeed the export to Google Drive worked perfectly; however, it was not exported to my personal Google Drive but to the Google Drive of the service account. It was therefore important to add Google Drive access for my service account (see here).
Below you can find the complete workflow. For the download I am using PyDrive.
import ee
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from oauth2client.service_account import ServiceAccountCredentials

# this file contains the e-mail address of the service account
with open(key_path + '/service_worker_mail.txt', 'r') as file:
    service_account_file = file.read().replace('\n', '')

# get the credentials
credentials = ee.ServiceAccountCredentials(service_account_file, key_path + "/" + "gee_secret.json")
# authenticate and initialize Google Earth Engine
ee.Initialize(credentials)

# the usual Google Earth Engine operations, including the export to the Drive
# ...
# ...
# ...

# authenticate to Google Drive (of the service account)
gauth = GoogleAuth()
scopes = ['https://www.googleapis.com/auth/drive']
gauth.credentials = ServiceAccountCredentials.from_json_keyfile_name(key_path + "/" + "gee_secret.json", scopes=scopes)
drive = GoogleDrive(gauth)

# get list of files
file_list = drive.ListFile({'q': "'root' in parents and trashed=false"}).GetList()
for file in file_list:
    filename = file['title']
    # download file into working directory (in this case a tiff file)
    file.GetContentFile(filename, mimetype="image/tiff")
    # delete file afterwards to keep the Drive empty
    file.Delete()
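As an alternative sketch (the e-mail address is a placeholder), the service account could instead share each exported file with your personal account, so it shows up under "Shared with me" in your own Drive rather than being downloaded and deleted:

for file in drive.ListFile({'q': "'root' in parents and trashed=false"}).GetList():
    # grant your personal account read access to the service account's file
    file.InsertPermission({'type': 'user',
                           'value': 'me@example.com',  # placeholder address
                           'role': 'reader'})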
I am trying to write a script in Python to grab new emails from a specific folder and save the attachments to a shared drive for upload to a database. Power Automate would work, but its file size limit for saving attachments is a meager 20 MB. I am able to authenticate and obtain a token, but I get the following error when trying to grab the emails:
Unauthorized for url.
The token contains no permissions, or permissions can not be understood.
I have included the code I am using to connect to Microsoft Graph.
(credentials and tenant_id are correct in my code; I took them out for obvious reasons.)
from O365 import Account, MSOffice365Protocol, MSGraphProtocol

credentials = ('xxxxxx', 'xxxxxx')

protocol = MSGraphProtocol(default_resource='reporting.triometric@xxxx.com')
scopes_graph = protocol.get_scopes_for('message_all_shared')
scopes = ['https://graph.microsoft.com/.default']

account = Account(credentials, auth_flow_type='credentials', tenant_id="**", scopes=scopes)

if account.authenticate():
    print('Authenticated')

mailbox = account.mailbox(resource='reporting.triometric@xxxx.com')
inbox = mailbox.inbox_folder()

for message in inbox.get_messages():
    print(message)
I have already configured the permissions through Azure to include all the necessary 'mail' delegated permissions.
The rest of my script works perfectly fine for uploading files to the database. Currently, the attachments must be manually saved to a shared drive multiple times per day before the script is run to upload them. Are there any steps I am missing? Any insights would be greatly appreciated!
Here are the permissions:
auth_flow_type='credentials' means you are using the client credentials flow.
In this case you should add Application permissions rather than Delegated permissions.
Don't forget to click on "Grant admin consent for {your tenant}".
UPDATE:
If you set auth_flow_type to 'authorization', it will use the auth code flow, which requires delegated permissions instead.
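For completeness, a rough sketch of the auth code flow with O365 (the client id/secret, tenant id and scope names below are placeholders; depending on the O365 version, the scopes may need to be passed to Account() instead of authenticate()):

from O365 import Account

credentials = ('my-client-id', 'my-client-secret')  # placeholders
account = Account(credentials, auth_flow_type='authorization', tenant_id='my-tenant-id')

# the first run prints a consent URL and asks you to paste back the resulting URL;
# the obtained token is stored and refreshed automatically on later runs
if account.authenticate(scopes=['basic', 'message_all']):
    mailbox = account.mailbox()  # the mailbox of the signed-in user
    for message in mailbox.inbox_folder().get_messages(limit=5):
        print(message.subject)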
I am trying to work with Google Sheets using Python, following the guidance from this site:
https://www.twilio.com/blog/2017/02/an-easy-way-to-read-and-write-to-a-google-spreadsheet-in-python.html
The steps include:
Go to the Google APIs Console.
Create a new project.
Click Enable API. Search for and enable the Google Drive API.
Create credentials for a Web Server to access Application Data.
Name the service account and grant it a Project Role of Editor.
Download the JSON file.
Copy the JSON file to your code directory and rename it to
client_secret.json
Here is what I have done so far:
import gspread
from oauth2client.service_account import ServiceAccountCredentials
scope = ['https://docs.google.com/spreadsheets/d/1IypqDnLKKR-IX1oOwCuTejBQ-RFq87K9rqSF6GpSyn4/edit?usp=drive_web&ouid=109125393303568297837']
creds = ServiceAccountCredentials.from_json_keyfile_name('client_secret.json', scope)
client = gspread.authorize(creds)
sheet = client.open('test_xlsx').sheet1
However, I got the following error:
google.auth.exceptions.RefreshError: ('No access token in response.',
{'id_token'..}
That's not a Google scope. This is the list of valid Google scopes#sheets.
Even the tutorial you are following says to use the following scope. Without the correct OAuth scope you won't have access.
scope = ['https://spreadsheets.google.com/feeds']
I was getting the same error using the scopes from the tutorial, so I tried several from the scopes#sheets list suggested by DaImTo until it eventually worked.
For me it was scope = ['https://www.googleapis.com/auth/drive']
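Putting it together, a minimal sketch of a working setup (file and spreadsheet names are taken from the question; the spreadsheet must also be shared with the service account's e-mail address):

import gspread
from oauth2client.service_account import ServiceAccountCredentials

# valid OAuth scopes, not the spreadsheet URL
scope = ['https://spreadsheets.google.com/feeds',
         'https://www.googleapis.com/auth/drive']
creds = ServiceAccountCredentials.from_json_keyfile_name('client_secret.json', scope)
client = gspread.authorize(creds)

sheet = client.open('test_xlsx').sheet1
print(sheet.get_all_records())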
I've written the following code to upload an image to my own Google Drive using a service account.
My code returns successfully, giving me an ID back, but nothing appears in my actual Google Drive.
from django.conf import settings
import os
from apiclient.discovery import build
from apiclient.http import MediaFileUpload
from oauth2client.service_account import ServiceAccountCredentials

def get_service(api_name, api_version, scope, key_file_location):
    credentials = ServiceAccountCredentials.from_json_keyfile_name(key_file_location, scopes=scope)
    service = build(api_name, api_version, credentials=credentials)
    return service

def setup_upload():
    scope = ['https://www.googleapis.com/auth/drive']
    key_file_location = os.path.join(os.path.dirname(settings.BASE_DIR), 'common/my-json-file.json')
    service = get_service('drive', 'v3', scope, key_file_location)

    file_path = os.path.join(os.path.dirname(settings.BASE_DIR), 'common/apple.png')
    file_metadata = {'name': 'apple.png'}
    media = MediaFileUpload(file_path, mimetype="image/png")
    file = service.files().create(body=file_metadata, media_body=media, fields='id').execute()
    print(file.get('id'))  # this returns an actual ID

setup_upload()
I do get a long ID string back from the last line of setup_upload(). But nothing is appearing on my actual Google Drive. I'm expecting to see the apple.png file pop up in my home directory.
What am I missing here?
Quoting from the tutorial you linked to:
"You can take the service account email address and give it access to a directory on your Google drive. It will then be allowed to upload to that directory, but you wont have access to the files. You will need to complete a second step and give yourself personally permission to access those files by updating or patching the file permissions."
Have you given the service account access to the place where you want to write the file? If you haven't specified where to upload the file to, the service account may just be uploading the file into its own Drive.
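A minimal sketch of that idea (folder_id is a placeholder for a folder you created in your own Drive and shared with the service account); adding it to parents makes the upload land somewhere you can actually see:

folder_id = 'your-shared-folder-id'  # placeholder
file_metadata = {'name': 'apple.png', 'parents': [folder_id]}
media = MediaFileUpload(file_path, mimetype='image/png')
file = service.files().create(body=file_metadata,
                              media_body=media,
                              fields='id').execute()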
In addition to @user2705223's answer: if you want to be able to access the files the service account uploads, you must grant yourself access to them through the service account. Check that your credentials log in successfully and that the service account is authorized with the right scope. You can follow this documentation to help you do the authorization and make API requests.
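A sketch of that second step (the e-mail address is a placeholder), granting your own account permission on the file the service account just created:

permission = {'type': 'user', 'role': 'writer', 'emailAddress': 'me@example.com'}  # placeholder address
service.permissions().create(fileId=file.get('id'), body=permission).execute()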
I am trying to list all the files in my Drive (about 10), but the following will only list 1 (and that isn't even a real file of mine)...
The code:
from httplib2 import Http
from oauth2client.client import SignedJwtAssertionCredentials
from apiclient.discovery import build

client = 'my_client_id'
client_email = 'my_client_email'
with open("/path/to/file.p12") as f:
    private_key = f.read()

credentials = SignedJwtAssertionCredentials(client_email, private_key, 'https://www.googleapis.com/auth/drive')
http_auth = credentials.authorize(Http())
drive_service = build('drive', 'v2', http=http_auth)

r = drive_service.files().list().execute()
files = r['items']
for f in files:
    print f['id'], f['title']
result:
"<file_id> How to get started with Drive"
EDIT:
This question is similar, but the answer there is to use the correct OAuth scope, which I already have above.
EDIT #2:
I thought it might be a timing issue, so I gave it a few hours and still no luck.
EDIT #3:
If I copy a file from another user and then list my files, I get 2 files:
" How to get started with Drive"
" My New File"
So, this is just listing files created by that app? How do I get the rest of my files?
You are authenticating with a service account. By default, a service account does not have the right to access your Drive data; it only has access to files it owns itself.
You have three options to work around this:
Create a folder in your Drive account, and share it (read/write) with the service account. Any file you place in that folder will be readable and writable both by you and your service account.
If you use Google Apps For Business, set up domain-wide delegation to allow your service account to impersonate all users in your domain. That way you will be able to get your service account to behave as if it were your actual Google Apps account.
Whether or not you use Google Apps For Business: do not use a service account but rather 3-legged OAuth. With 3-legged OAuth you will be able to generate an access token and a refresh token that allow your application to act in Drive on behalf of your actual Google account. Note that this last option does not use service accounts at all.
The simplest is obviously option (1); a minimal sketch follows below. If it is not acceptable, then I would go with option (3), unless you actually want to be able to impersonate all the users in your domain.
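For option (1), a sketch against the v2 API used in the question (folder_id is a placeholder for the ID of the folder you shared with the service account):

folder_id = 'your-shared-folder-id'  # placeholder
r = drive_service.files().list(q="'%s' in parents and trashed=false" % folder_id).execute()
for f in r['items']:  # v2 returns results under 'items'
    print(f['title'])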