I am trying to upload a folder from my local machine to a Google Cloud Storage bucket, and I get an error about credentials. Where should I be providing the credentials, and what information do they need to contain?
from_dest = '/Users/xyzDocuments/tmp'
gsutil_link = 'gs://bucket-1991'
from google.cloud import storage

try:
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file_name)
    print('File {} uploaded to {}.'.format(source_file_name, destination_blob_name))
except Exception as e:
    print(e)
The error is:
could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://developers.google.com/accounts/docs/application-default-credentials.
You need to acquire the application default credentials for your project and set them as an environment variable:
Go to the Create service account key page in the GCP Console.
From the Service account drop-down list, select New service account.
Enter a name into the Service account name field.
From the Role drop-down list, select Project > Owner.
Click Create. A JSON file that contains your key downloads to your computer.
Then, set an environment variable which will provide the application credentials to your application when it runs locally:
$ export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/[FILE_NAME].json"
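Alternatively, a minimal sketch of setting the variable from Python itself, before any client is created (same placeholder path as above):
import os

# Placeholder path; point this at the JSON key you downloaded.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/home/user/Downloads/[FILE_NAME].json"

from google.cloud import storage
storage_client = storage.Client()  # picks up the key from the environment variable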
This error message is usually thrown when the application is not authenticated correctly, for reasons such as missing files, invalid credential paths, or incorrect environment variable assignments. Keep in mind that when you set an environment variable in a shell session, it is lost every time the session ends.
Based on this, I recommend validating that the credential file and file path are correctly assigned, and following the Obtaining and providing service account credentials manually guide to specify your service account file directly in your code. This way you can set it permanently and verify that you are passing the service credentials correctly.
Example of passing the path to the service account key in code:
def explicit():
    from google.cloud import storage

    # Explicitly use service account credentials by specifying the private key
    # file.
    storage_client = storage.Client.from_service_account_json('service_account.json')

    # Make an authenticated API request
    buckets = list(storage_client.list_buckets())
    print(buckets)
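Applied to your case, a minimal sketch would be (bucket name taken from your question; the object and file names are hypothetical examples):
from google.cloud import storage

storage_client = storage.Client.from_service_account_json('service_account.json')
bucket = storage_client.get_bucket('bucket-1991')
blob = bucket.blob('tmp/example.txt')  # hypothetical destination object name
blob.upload_from_filename('/Users/xyzDocuments/tmp/example.txt')  # hypothetical local file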
I am trying to write a script in Python to grab new emails from a specific folder and save the attachments to a shared drive to upload to a database. Power Automate would work, but the file size limit to save the attachment is a meager 20 MB. I am able to authenticate the token, but am getting the following error when trying to grab the emails:
Unauthorized for url.
The token contains no permissions, or permissions can not be understood.
I have included the code I am using to connect to Microsoft Graph.
(credentials and tenant_id are correct in my code, I took them out for obvious reasons)
from O365 import Account, MSOffice365Protocol, MSGraphProtocol

credentials = ('xxxxxx', 'xxxxxx')

protocol = MSGraphProtocol(default_resource='reporting.triometric@xxxx.com')
scopes_graph = protocol.get_scopes_for('message_all_shared')
scopes = ['https://graph.microsoft.com/.default']

account = Account(credentials, auth_flow_type='credentials', tenant_id="**", scopes=scopes)

if account.authenticate():
    print('Authenticated')

mailbox = account.mailbox(resource='reporting.triometric@xxxx.com')
inbox = mailbox.inbox_folder()
for message in inbox.get_messages():
    print(message)
I have already configured the permissions through Azure to include all the necessary 'mail' delegations.
The rest of my script works perfectly fine for uploading files to the database. Currently, the attachments must be manually saved on a shared drive multiple times per day, then the script is run to upload. Are there any steps I am missing? Any insights would be greatly appreciated!
Here are the permissions (screenshot of the configured API permissions not included):
auth_flow_type='credentials' means you are using the client credentials flow.
In this case you should add Application permissions rather than Delegated permissions.
Don't forget to click on "Grant admin consent for {your tenant}".
UPDATE:
If you set auth_flow_type to 'authorization', it will use the auth code flow, which requires Delegated permissions.
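For reference, a minimal sketch of the delegated variant, assuming placeholder credentials and the O365 scope helper names:
from O365 import Account

credentials = ('client_id', 'client_secret')  # hypothetical placeholders

# 'authorization' is the auth code flow; it uses Delegated permissions.
account = Account(credentials, auth_flow_type='authorization',
                  tenant_id='your-tenant-id', scopes=['basic', 'message_all'])
if account.authenticate():
    print('Authenticated')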
To retrieve monitoring metrics from my project, I used the Python code below:
from google.cloud import monitoring_v3
from google.oauth2 import service_account
from googleapiclient import discovery
credentials = service_account.Credentials.from_service_account_file(
    r'D:\GCP\credentials\blahblah-04e8fd0245b8.json')
service = discovery.build('compute', 'v1', credentials=credentials)
client = monitoring_v3.MetricServiceClient()
project_name = "projects/blahblah-300807"
resource_descriptors = client.list_monitored_resource_descriptors(
    name=project_name)
for descriptor in resource_descriptors:
    print(descriptor.type)
As far as I can tell, I did everything correctly. I gave the correct file path for the credentials, but I still received this error message:
raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
I even checked that link and tried the alternative method, but still, it didn't work. How can I rectify this? Am I making a mistake?
You don't use the credentials when you create the client:
client = monitoring_v3.MetricServiceClient()
You can change it like this
client = monitoring_v3.MetricServiceClient(credentials=credentials)
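Putting it together with the code from the question, a minimal sketch (same key path as in the question):
from google.cloud import monitoring_v3
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    r'D:\GCP\credentials\blahblah-04e8fd0245b8.json')

# Pass the same credentials object to every client you create.
client = monitoring_v3.MetricServiceClient(credentials=credentials)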
Personally, I prefer not to provide the credentials explicitly in the code; I prefer to use the environment variable GOOGLE_APPLICATION_CREDENTIALS for this.
Create an environment variable in your OS with the name GOOGLE_APPLICATION_CREDENTIALS and a value that points to the service account key file D:\GCP\credentials\blahblah-04e8fd0245b8.json.
But if it's on your own computer, you don't even need a service account key file (which is not really secure, I explain why in this article); you can use your own credentials. For this, simply create application default credentials (ADC) like this:
$ gcloud auth application-default login
Under Google Cloud Run, you can select which service account your container is running. Using the default compute service account fails to generate a signed url.
The workaround listed here works on Google Compute Engine, if you allow all the scopes for the service account. There does not seem to be a way to do that in Cloud Run (not that I can find).
https://github.com/googleapis/google-auth-library-python/issues/50
Things I have tried:
Assigned the service account the role: roles/iam.serviceAccountTokenCreator
Verified the workaround in the same GCP project in a Virtual Machine (vs Cloud Run)
Verified the code works locally in the container with the service account loaded from private key (via json file).
from google.cloud import storage
from datetime import datetime, timedelta

client = storage.Client()
bucket = client.get_bucket('EXAMPLE_BUCKET')
blob = bucket.get_blob('libraries/image_1.png')
expires = datetime.now() + timedelta(seconds=86400)
blob.generate_signed_url(expiration=expires)
Fails with:
you need a private key to sign credentials.the credentials you are currently using <class 'google.auth.compute_engine.credentials.Credentials'> just contains a token. see https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account for more details.
/usr/local/lib/python3.8/site-packages/google/cloud/storage/_signing.py, line 51, in ensure_signed_credentials
Trying to add the workaround, I get:
Error calling the IAM signBytes API:
{ "error": { "code": 400,
"message": "Request contains an invalid argument.",
"status": "INVALID_ARGUMENT" }
}
Exception Location: /usr/local/lib/python3.8/site-packages/google/auth/iam.py, line 81, in _make_signing_request
Workaround code as mentioned in the GitHub issue:
from google.cloud import storage
from google.auth.transport import requests
from google.auth import compute_engine
from datetime import datetime, timedelta

def get_signing_creds(credentials):
    auth_request = requests.Request()
    print(credentials.service_account_email)
    signing_credentials = compute_engine.IDTokenCredentials(
        auth_request, "", service_account_email=credentials.service_account_email)
    return signing_credentials

client = storage.Client()
bucket = client.get_bucket('EXAMPLE_BUCKET')
blob = bucket.get_blob('libraries/image_1.png')
expires = datetime.now() + timedelta(seconds=86400)
signing_creds = get_signing_creds(client._credentials)
url = blob.generate_signed_url(expiration=expires, credentials=signing_creds)
print(url)
How do I generate a signed url under Google Cloud Run?
At this point, it seems like I may have to mount the service account key which I wanted to avoid.
EDIT:
To try and clarify, the service account has the correct permissions - it works in GCE and locally with the JSON private key.
Yes you can, but I had to dig deep to find out how (jump to the end if you don't care about the details).
If you look in the _signing.py file, at line 623, you can see this:
if access_token and service_account_email:
signature = _sign_message(string_to_sign, access_token, service_account_email)
...
If you provide the access_token and the service_account_email, you can use the _sign_message method. This method uses the IAM service signBlob API.
It's important because you can now sign blobs without having the private key locally! So, that solves the problem, and the following code works on Cloud Run (and I'm sure on Cloud Functions too):
def sign_url():
    from google.cloud import storage
    from datetime import datetime, timedelta
    import google.auth

    credentials, project_id = google.auth.default()

    # Perform a refresh request to get the access token of the current
    # credentials (else it's None)
    from google.auth.transport import requests
    r = requests.Request()
    credentials.refresh(r)

    client = storage.Client()
    bucket = client.get_bucket('EXAMPLE_BUCKET')
    blob = bucket.get_blob('libraries/image_1.png')
    expires = datetime.now() + timedelta(seconds=86400)

    # In case of user credential use, define manually the service account to
    # use (for development purposes only)
    service_account_email = "YOUR DEV SERVICE ACCOUNT"
    # If you use a service account credential, you can use the embedded email
    if hasattr(credentials, "service_account_email"):
        service_account_email = credentials.service_account_email

    url = blob.generate_signed_url(
        expiration=expires,
        service_account_email=service_account_email,
        access_token=credentials.token)
    return url, 200
Let me know if it's not clear
The answer @guillaume-blaquiere posted here does work, but it requires an additional step not mentioned: adding the Service Account Token Creator role in IAM to your default service account, which allows that service account to "Impersonate service accounts (create OAuth2 access tokens, sign blobs or JWTs, etc)."
This allows the default service account to sign blobs, as per the signBlob documentation.
I tried it on AppEngine and it worked perfectly once that permission was given.
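If you prefer the command line, the role can be granted like this (project id and service account email are placeholders):
$ gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
    --role="roles/iam.serviceAccountTokenCreator"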
import datetime as dt

from google import auth
from google.auth.transport import requests as auth_requests
from google.cloud import storage

# SCOPES = [
#     "https://www.googleapis.com/auth/devstorage.read_only",
#     "https://www.googleapis.com/auth/iam"
# ]

credentials, project = auth.default(
    # scopes=SCOPES
)
credentials.refresh(auth_requests.Request())

expiration_timedelta = dt.timedelta(days=1)

storage_client = storage.Client(credentials=credentials)
bucket = storage_client.get_bucket("bucket_name")
blob = bucket.get_blob("blob_name")

signed_url = blob.generate_signed_url(
    expiration=expiration_timedelta,
    service_account_email=credentials.service_account_email,
    access_token=credentials.token,
)
I downloaded a key for the AppEngine default service account to test locally, and in order to make it work properly outside of the AppEngine environment, I had to add the proper scopes to the credentials, as per the commented lines setting the SCOPES. You can ignore them if running only in AppEngine itself.
You can't sign URLs with the default service account.
Try your service code again with a dedicated service account that has the required permissions, and see if that resolves your error.
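For example, a dedicated service account can be attached to a Cloud Run service at deploy time (service name, image and email below are placeholders):
$ gcloud run deploy my-service --image=IMAGE_URL --service-account=SERVICE_ACCOUNT_EMAIL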
References and further reading:
https://stackoverflow.com/a/54272263
https://cloud.google.com/storage/docs/access-control/signed-urls
https://github.com/googleapis/google-auth-library-python/issues/238
An updated approach has been added to GCP's documentation for serverless instances such as Cloud Run and App Engine.
The following snippet shows how to create a signed URL from the storage library.
import datetime

from google.cloud import storage

def generate_upload_signed_url_v4(bucket_name, blob_name):
    """Generates a v4 signed URL for uploading a blob using HTTP PUT.

    Note that this method requires a service account key file. You can not use
    this if you are using Application Default Credentials from Google Compute
    Engine or from the Google Cloud SDK.
    """
    # bucket_name = 'your-bucket-name'
    # blob_name = 'your-object-name'

    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(blob_name)

    url = blob.generate_signed_url(
        version="v4",
        # This URL is valid for 15 minutes
        expiration=datetime.timedelta(minutes=15),
        # Allow PUT requests using this URL.
        method="PUT",
        content_type="application/octet-stream",
    )

    return url
Once your backend returns the signed URL, you could execute a curl PUT request from your frontend as follows:
curl -X PUT -H 'Content-Type: application/octet-stream' --upload-file my-file 'my-signed-url'
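Or the equivalent from Python, a minimal sketch using the third-party requests library (file name and URL are placeholders):
import requests

with open("my-file", "rb") as f:
    response = requests.put(
        "my-signed-url",
        data=f,
        headers={"Content-Type": "application/octet-stream"},
    )
print(response.status_code)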
I had to add both Service Account Token Creator and Storage Object Creator to the default compute engine service account (which is what my Cloud Run services use) before it worked. You could also create a custom Role that has just iam.serviceAccounts.signBlob instead of Service Account Token Creator, which is what I did (screenshot of the custom role not included).
I store the credentials.json contents in Secret Manager, then load it in my Django app like this:
import json
import os

from google.cloud import secretmanager
from google.oauth2 import service_account

project_id = os.environ.get("GOOGLE_CLOUD_PROJECT")
client = secretmanager.SecretManagerServiceClient()
secret_name = "service_account_credentials"
secret_path = f"projects/{project_id}/secrets/{secret_name}/versions/latest"
credentials_json = client.access_secret_version(name=secret_path).payload.data.decode("UTF-8")
service_account_info = json.loads(credentials_json)
google_service_credentials = service_account.Credentials.from_service_account_info(
    service_account_info)
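The loaded credentials can then be passed to the storage client, so that signing uses the key material from Secret Manager (a minimal sketch continuing the snippet above):
from google.cloud import storage

storage_client = storage.Client(credentials=google_service_credentials)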
I tried the answer from @guillaume-blaquiere and I added the permission recommended by @guilherme-coppini, but when using Google Cloud Run I always saw the same "You need a private key to sign credentials.the credentials you are currently using..." error.
I'm using the Python google.cloud API.
For example, using the metrics module:
from google.cloud import monitoring

client = monitoring.Client()
client.query('my/gcp/metric', minutes=10)
For my GOOGLE_APPLICATION_CREDENTIALS I'm using a service account that has specific access to a GCP project.
Does google.cloud have any modules that can let me derive the project from the service account (like get what project the service account is in)?
This would be convenient because each service account only has access to a single project, so I could set my service account and be able to reference that project in code.
Not sure if this will work, you may need to tweak it:
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
service = discovery.build('yourservicename', 'v1', credentials=credentials)
response = service.projects().list().execute()
The Google Cloud Identity and Access Management (IAM) API has a 'serviceAccounts.get' method which shows the project associated with a service account, as shown here. You need to have proper permissions on the projects for the API to work.
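A minimal sketch of calling that method (the service account email is a placeholder; 'projects/-' is the documented wildcard that lets the API infer the owning project):
from googleapiclient import discovery

iam = discovery.build('iam', 'v1')
sa = iam.projects().serviceAccounts().get(
    name='projects/-/serviceAccounts/SA_NAME@PROJECT_ID.iam.gserviceaccount.com'
).execute()
print(sa['projectId'])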
The method google.auth.default returns a tuple (credentials, project_id) if that information is available in the environment.
Also, the client object knows which project it is linked to (either client.project or client.project_id; I'm not sure which one for the Monitoring API).
If you set the service account manually with the GOOGLE_APPLICATION_CREDENTIALS env var, you can open the file and load its JSON. One of the fields in a service account key file is the project id.
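A minimal sketch of both approaches, assuming GOOGLE_APPLICATION_CREDENTIALS is set:
import json
import os

import google.auth

# Approach 1: let google.auth work out the project from the environment.
credentials, project_id = google.auth.default()
print(project_id)

# Approach 2: read the project id straight out of the key file.
with open(os.environ["GOOGLE_APPLICATION_CREDENTIALS"]) as f:
    print(json.load(f)["project_id"])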
I am trying to list all the files in my drive (about 10) but the following will only list 1 (and that isn't even a real file of mine)....
the code:
from httplib2 import Http
from apiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials

client = 'my_client_id'
client_email = 'my_client_email'
with open("/path/to/file.p12", "rb") as f:
    private_key = f.read()

credentials = SignedJwtAssertionCredentials(client_email, private_key, 'https://www.googleapis.com/auth/drive')
http_auth = credentials.authorize(Http())
drive_service = build('drive', 'v2', http=http_auth)
r = drive_service.files().list().execute()
files = r['items']
for f in files:
    print(f['id'], f['title'])
result:
"<file_id> How to get started with Drive"
EDIT:
This question is similar but the answer is to have the correct oauth scope, which I have above.
EDIT #2:
I thought it might be a timing issue so I gave it a few hours, but still no luck.
EDIT #3:
If I copy a file from another user and then list my files, I'll get 2 files:
" How to get started with Drive"
" My New File"
So, this is just listing files created by that app? How do I get the rest of my files???
You use a service account to authenticate. By default, a service account does not have the right to access your Drive data; it can only access files that it owns itself.
You have three options to work around this:
Create a folder in your Drive account, and share it (read/write) with the service account. Any file you place in that folder will be readable and writable both by you and your service account.
If you use Google Apps For Business, set up domain-wide delegation to allow your service account to impersonate all users in your domain. That way you will be able to get your service account to behave as if it were your actual Google Apps account.
Whether or not you use Google Apps For Business: do not use a service account, but rather 3-legged OAuth. With 3-legged OAuth you will be able to generate an access token and a refresh token that will allow your application to act in Drive on behalf of your actual Google account. Note that this last option does not use service accounts at all.
The simplest is obviously option (1). If it is not acceptable then I would go with option (3), unless you actually want to be able to impersonate all the users in your domain.
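For option (1), a minimal sketch of listing only the files inside the shared folder, reusing the drive_service from your question (the folder id is a placeholder):
# Hypothetical id of the folder you shared with the service account.
r = drive_service.files().list(q="'SHARED_FOLDER_ID' in parents").execute()
for f in r['items']:
    print(f['id'], f['title'])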