Here is some simple Python code to list objects in a certain folder in Google Cloud Storage:
from apiclient import discovery
import apiclient
import json

client = discovery.build('storage', 'v1beta2')
request = client.objects().list(
    bucket='mybucket',
    prefix='myfolder1/myfolder',
    key='A0rsER3odwksawsesse3Dw_d3Ks')  # my API key
try:
    response = request.execute()
    print json.dumps(response, indent=2)
except apiclient.errors.HttpError, e:
    print e
Then I got the failure message below:
https://www.googleapis.com/storage/v1beta2/b/mybucket/o?prefix=myfolder1%2Fmyfolder&alt=json&key=A0rsER3odwksawsesse3Dw_d3Ks returned "*Access Not Configured. Please use Google Developers Console to activate the API for your project.*">
I have enabled the API for my project, so that should not be the problem the return message describes; maybe the request is going to the wrong project? As far as I know the hierarchy is: GCS project -> bucket -> root folder -> folder-1.
So my question is: how does this code know which project I'm using? Am I missing any code here?
Thanks for all the kind help!
An API key alone is only good for accessing data in public buckets. GCS is able to map API usage back to your project by looking at the API key, which is unique. For anything other than public data access you need proper authentication, via OAuth2 for example. More info here:
https://developers.google.com/storage/docs/json_api/v1/how-tos/authorizing
Your code looks OK and it works with public buckets, which I've verified. Ensure you have both Google Cloud Storage and Google Cloud Storage JSON API enabled in Developers Console.
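For a private bucket, one option is a service account with a signed JWT assertion. The sketch below is only illustrative and makes assumptions: the p12 key file name, the service-account email, and the read-only devstorage scope are placeholders you would replace with your own values.
import httplib2
import json
from apiclient import discovery
from oauth2client.client import SignedJwtAssertionCredentials

# Hypothetical service-account key file and email -- replace with your own.
with open('your_private_key_file.p12', 'rb') as f:
    key = f.read()

credentials = SignedJwtAssertionCredentials(
    'xxxxxxxxxx@developer.gserviceaccount.com', key,
    scope='https://www.googleapis.com/auth/devstorage.read_only')
http = credentials.authorize(httplib2.Http())

client = discovery.build('storage', 'v1beta2', http=http)
request = client.objects().list(bucket='mybucket', prefix='myfolder1/myfolder')
print json.dumps(request.execute(), indent=2)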
Related
I'm trying to set the environment variable from a dict but getting an error when connecting.
import os
from airflow.models import Variable

# The Airflow Variable holds the JSON dict with the service_account credentials.
service_account = Variable.get('google_cloud_credentials')
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = str(service_account)
error
PermissionDeniedError: Error executing an HTTP request: HTTP response code 403 with body '<?xml version='1.0' encoding='UTF-8'?><Error><Code>AccessDenied</Code><Message>Access denied.</Message><Details>Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.</Details></Error>'
When reading, if I instead point to a file, there are no issues:
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/file/path/service_account.json"
I'm wondering, is there a way to convert the dict object into an os.PathLike object? I don't want to store the JSON file on the container, and the Airflow/Google documentation isn't clear at all.
The Python StringIO module lets you create a file-like object backed by a string, but that won't help here because the consumer of this environment variable is expecting a file path, not a file-like object. I don't think it's possible to do what you're trying to do. Is there a reason you don't want to just put the credentials in a file?
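If the objection is only that no key file exists ahead of time, a minimal sketch of that suggestion is to write the Variable's JSON to a temporary file at runtime and point the environment variable at its path (the Variable name is taken from the question; whether a writable temp directory is acceptable in your container is an assumption):
import os
import tempfile
from airflow.models import Variable

# Write the JSON key held in the Airflow Variable to a temporary file
# and let GOOGLE_APPLICATION_CREDENTIALS point at that path.
service_account = Variable.get('google_cloud_credentials')
with tempfile.NamedTemporaryFile(mode='w', suffix='.json', delete=False) as f:
    f.write(service_account)
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = f.name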
There is a way to do it, but the Google documentation is terrible. So I wrote a GitHub gist to document the recipe that I and a colleague (Imee Cuison) developed to use the key securely. Sample code below:
import json
from google.oauth2.service_account import Credentials
from google.cloud import secretmanager
def access_secret(project_id: str, secret_id: str, version_id: str = "latest") -> str:
    """Return the secret in string format."""
    # Create the Secret Manager client.
    client = secretmanager.SecretManagerServiceClient()
    # Build the resource name of the secret version.
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version_id}"
    # Access the secret version.
    response = client.access_secret_version(name=name)
    # Return the decoded payload.
    return response.payload.data.decode('UTF-8')

def get_credentials_from_token(token: str) -> Credentials:
    """Given the service-account JSON key material as a string, return a Credentials object."""
    credential_dict = json.loads(token)
    return Credentials.from_service_account_info(credential_dict)
credentials_secret = access_secret("my_project", "my_secret")
creds = get_credentials_from_token(credentials_secret)
# And now you can use the `creds` Credentials object to authenticate to an API
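As a follow-up, the resulting creds object can be passed straight into a client constructor, so GOOGLE_APPLICATION_CREDENTIALS never needs to be set. A hedged sketch (the project name and the google-cloud-storage dependency are assumptions):
from google.cloud import storage

# Pass the Credentials object directly rather than relying on
# the GOOGLE_APPLICATION_CREDENTIALS environment variable.
storage_client = storage.Client(project="my_project", credentials=creds)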
Putting the service account key into the repository is not good practice. As a best practice, you should use authentication propagated from the default Google auth within your application.
For instance, on Google Kubernetes Engine you can use the following Python code:
import google.auth
import google.auth.transport.requests
from google.cloud.container_v1 import ClusterManagerClient

# Pick up Application Default Credentials from the environment.
credentials, project = google.auth.default(
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
credentials.refresh(google.auth.transport.requests.Request())
cluster_manager = ClusterManagerClient(credentials=credentials)
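The same default-credentials pattern applies to the Cloud Storage access the question started with; a hedged sketch (the bucket and object names are placeholders):
import google.auth
from google.cloud import storage

# Application Default Credentials are picked up from the environment,
# so no GOOGLE_APPLICATION_CREDENTIALS variable is required.
credentials, project = google.auth.default(
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
client = storage.Client(project=project, credentials=credentials)
data = client.bucket('my-bucket').get_blob('path/to/object').download_as_string()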
I would like to develop an App Engine application that directly streams data into a BigQuery table.
According to Google's documentation there is a simple way to stream data into BigQuery:
http://googlecloudplatform.blogspot.co.il/2013/09/google-bigquery-goes-real-time-with-streaming-inserts-time-based-queries-and-more.html
https://developers.google.com/bigquery/streaming-data-into-bigquery#streaminginsertexamples
(note: in the above link you should select the Python tab, not the Java one)
Here is the sample code snippet on how streaming insert should be coded:
body = {"rows":[
{"json": {"column_name":7.7,}}
]}
response = bigquery.tabledata().insertAll(
projectId=PROJECT_ID,
datasetId=DATASET_ID,
tableId=TABLE_ID,
body=body).execute()
Although I've downloaded the client API, I didn't find any reference to the "bigquery" module/object used in Google's example above.
Where should the bigquery object (from the snippet) come from?
Can anyone show a more complete way to use this snippet (with the right imports)?
I've been searching for this a lot and found the documentation confusing and incomplete.
Minimal working example (as long as you fill in the right IDs for your project):
import httplib2
from apiclient import discovery
from oauth2client import appengine
_SCOPE = 'https://www.googleapis.com/auth/bigquery'
# Change the following 3 values:
PROJECT_ID = 'your_project'
DATASET_ID = 'your_dataset'
TABLE_ID = 'TestTable'
body = {"rows":[
{"json": {"Col1":7,}}
]}
credentials = appengine.AppAssertionCredentials(scope=_SCOPE)
http = credentials.authorize(httplib2.Http())
bigquery = discovery.build('bigquery', 'v2', http=http)
response = bigquery.tabledata().insertAll(
projectId=PROJECT_ID,
datasetId=DATASET_ID,
tableId=TABLE_ID,
body=body).execute()
print response
As Jordan says: "Note that this uses the appengine robot to authenticate with BigQuery, so you'll need to add the robot account to the ACL of the dataset. Note that if you also want to use the robot to run queries, not just stream, you need the robot to be a member of the project 'team' so that it is authorized to run jobs."
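Granting that dataset access can itself be done through the same discovery-based client. This is only a hedged sketch reusing the bigquery object and IDs from the example above; the App Engine service-account email is a placeholder for your own app's robot account:
# Fetch the dataset's current ACL, append the robot account as a WRITER,
# and patch the dataset.
dataset = bigquery.datasets().get(
    projectId=PROJECT_ID, datasetId=DATASET_ID).execute()
access = dataset.get('access', [])
access.append({'role': 'WRITER',
               'userByEmail': 'your-app-id@appspot.gserviceaccount.com'})
bigquery.datasets().patch(
    projectId=PROJECT_ID, datasetId=DATASET_ID,
    body={'access': access}).execute()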
Here is a working code example from an appengine app that streams records to a BigQuery table. It is open source at code.google.com:
http://code.google.com/p/bigquery-e2e/source/browse/sensors/cloud/src/main.py#124
To find out where the bigquery object comes from, see
http://code.google.com/p/bigquery-e2e/source/browse/sensors/cloud/src/config.py
Note that this uses the appengine robot to authenticate with BigQuery, so you'll need to add the robot account to the ACL of the dataset.
Note that if you also want to use the robot to run queries, not just stream, you need the robot to be a member of the project 'team' so that it is authorized to run jobs.
According to the new Google Apps Marketplace documentation, in order to get the status of a user license, it should be enough to make a simple GET request to https://www.googleapis.com/appsmarket/v2/userLicense/{applicationId}/{userId}?key={ApiKey} where applicationId is a number, userId an email and ApiKey a string taken directly from the Google Cloud Console under APIs -> {App name} -> Server Key -> Api Key. I have also enabled the Google Apps Marketplace API in Google Cloud Console.
However, I always get the following error message:
{"error":
{"errors":[{"domain":"global","reason":"authError","message":"Invalid OAuth header","locationType":"header","location":"Authorization"}],
"code":401,"message":"Invalid OAuth header"}}
Can you help me?
EDIT: Following Arun Nagarajan's suggestion, I tried using a scope, but it still does not work. Here is my code (in Python on Google App Engine):
import httplib2
import apiclient.discovery
import oauth2client.appengine

credentials = oauth2client.appengine.AppAssertionCredentials(
    scope='https://www.googleapis.com/auth/appsmarketplace.license')
http = credentials.authorize(http=httplib2.Http())
client = apiclient.discovery.build('appsmarket', 'v2', http=http, developerKey='XXXXXXXX')
entry = client.userLicense().get(applicationId='123456', userId='some_user@email.com').execute()
You have to use OAuth 2 with the https://www.googleapis.com/auth/appsmarketplace.license scope.
You cannot get the license with just the server key.
This is a follow-up question for this question:
I have successfully created a private key and have read the various pages of Google documentation on the concepts of server to server authentication.
I need to create a JWT to authorize my App Engine application (Python) to access the Google calendar and post events in the calendar. From the source in oauth2client it looks like I need to use oauth2client.client.SignedJwtAssertionCredentials to create the JWT.
What I'm missing at the moment is a stylised bit of sample Python code of the various steps involved to create the JWT and use it to authenticate my App Engine application for Google Calendar. Also, from SignedJwtAssertionCredentials source it looks like I need some App Engine compatible library to perform the signing.
Can anybody shed some light on this?
After some digging I found a couple of samples based on OAuth2 authentication. From these I cooked up the following simple sample that creates a JWT to access the Calendar API:
import httplib2
import pprint
from apiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials
# Get the private key from the Google supplied private key file.
f = file("your_private_key_file.p12", "rb")
key = f.read()
f.close()
# Create the JWT
credentials = SignedJwtAssertionCredentials(
    "xxxxxxxxxx@developer.gserviceaccount.com", key,
    scope="https://www.googleapis.com/auth/calendar"
)
# Create an authorized http instance
http = httplib2.Http()
http = credentials.authorize(http)
# Create a service call to the calendar API
service = build("calendar", "v3", http=http)
# List all calendars.
lists = service.calendarList().list(pageToken=None).execute(http=http)
pprint.pprint(lists)
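Since the original goal was to post events, here is a hedged sketch of an insert call against the same service object; the calendar ID, summary, and timestamps are placeholders, and if you insert into a calendar other than the service account's own, that calendar has to be shared with the service-account email first:
# Insert a sample event into the calendar (placeholder values).
event = {
    'summary': 'Test event',
    'start': {'dateTime': '2014-01-01T10:00:00Z'},
    'end': {'dateTime': '2014-01-01T11:00:00Z'},
}
created = service.events().insert(calendarId='primary', body=event).execute()
pprint.pprint(created)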
For this to work on Google App Engine you will need to enable PyCrypto for your app. This means adding the following to your app.yaml file:
libraries:
- name: pycrypto
version: "latest"
I have a Google App Engine site, and what I want to do is get access to the files on my Drive and publish them. Note that my account owns both the Drive and the App Engine app.
I have tried looking at the Google Drive API, and the problem is that I don't know where to start with the following boilerplate code from their documentation.
If you take a look at this function:
def get_credentials(authorization_code, state):
  """Retrieve credentials using the provided authorization code.

  This function exchanges the authorization code for an access token and queries
  the UserInfo API to retrieve the user's e-mail address.
  If a refresh token has been retrieved along with an access token, it is stored
  in the application database using the user's e-mail address as key.
  If no refresh token has been retrieved, the function checks in the application
  database for one and returns it if found or raises a NoRefreshTokenException
  with the authorization URL to redirect the user to.

  Args:
    authorization_code: Authorization code to use to retrieve an access token.
    state: State to set to the authorization URL in case of error.
  Returns:
    oauth2client.client.OAuth2Credentials instance containing an access and
    refresh token.
  Raises:
    CodeExchangeError: Could not exchange the authorization code.
    NoRefreshTokenException: No refresh token could be retrieved from the
    available sources.
  """
  email_address = ''
  try:
    credentials = exchange_code(authorization_code)
    user_info = get_user_info(credentials)
    email_address = user_info.get('email')
    user_id = user_info.get('id')
    if credentials.refresh_token is not None:
      store_credentials(user_id, credentials)
      return credentials
    else:
      credentials = get_stored_credentials(user_id)
      if credentials and credentials.refresh_token is not None:
        return credentials
  except CodeExchangeException, error:
    logging.error('An error occurred during code exchange.')
    # Drive apps should try to retrieve the user and credentials for the current
    # session.
    # If none is available, redirect the user to the authorization URL.
    error.authorization_url = get_authorization_url(email_address, state)
    raise error
  except NoUserIdException:
    logging.error('No user ID could be retrieved.')

  # No refresh token has been retrieved.
  authorization_url = get_authorization_url(email_address, state)
  raise NoRefreshTokenException(authorization_url)
This is part of the boilerplate code. However, where am I supposed to get the authorization_code from?
I recently had to implement something similar, and it is quite tricky to find the relevant pieces of documentation.
This is what worked for me.
One-time setup to enable Google Drive for your Google App Engine project
Go to the Google APIs Console and select your App Engine project. If you don't see your App Engine project listed, you need to enable the cloud integration in the App Engine admin tool first (Administration > Application Settings > Cloud Integration > Create project)
In Google APIs Console, now go to Services and look for the "Drive API" in that long list. Turn it on.
Go to the API Access section in the Google APIs Console, and find the "Simple API Access" API key.
Getting and installing the Python Drive API Client
Download the Python Drive API Client: https://developers.google.com/api-client-library/python/start/installation#appengine
Documentation on this Python API: https://google-api-client-libraries.appspot.com/documentation/drive/v2/python/latest/
Using the Python Drive API Client
To create the Drive service object, I use this:
import httplib2

def createDriveService():
  """Builds and returns a Drive service object authorized with the
  application's service account.

  Returns:
    Drive service object.
  """
  from oauth2client.appengine import AppAssertionCredentials
  from apiclient.discovery import build

  credentials = AppAssertionCredentials(scope='https://www.googleapis.com/auth/drive')
  http = httplib2.Http()
  http = credentials.authorize(http)
  return build('drive', 'v2', http=http, developerKey=API_KEY)
You can then use this service object to execute Google Drive API calls, for example, to create a folder:
service = createDriveService()
res = {'title': foldername,
       'mimeType': "application/vnd.google-apps.folder"}
service.files().insert(body=res).execute()
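And, since the original goal was to read files, here is a hedged sketch of listing files with the same service object (field names per the Drive v2 API); note that the app's service account has its own Drive, so your personal files need to be shared with the service-account e-mail before they show up here:
import logging

# List the first few files the service account can see and log their
# titles and IDs.
result = service.files().list(maxResults=10).execute()
for item in result.get('items', []):
    logging.info('%s (%s)', item['title'], item['id'])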
Caveats
I was not able to get the Drive API to work in unit tests or on the dev_appserver; I always get an error that my credentials are not valid. However, it works fine on the real App Engine server.