I built a simple Python application to be run on Google App Engine. Code:
import webapp2
from oauth2client.contrib.appengine import AppAssertionCredentials
from apiclient.discovery import build
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

class MainPage(webapp2.RequestHandler):
    def get(self):
        self.response.headers['Content-Type'] = 'text/plain'
        self.response.write('BigQuery App')

        credentials = AppAssertionCredentials(
            'https://www.googleapis.com/auth/sqlservice.admin')
        service = discovery.build('bigquery', 'v2', credentials=credentials)

        projectId = '<Project-ID>'
        query_request_body = {
            "query": "SELECT a from Data.test LIMIT 10"
        }
        request = service.jobs().query(projectId=projectId, body=query_request_body)
        response = request.execute()

        self.response.write(response)

app = webapp2.WSGIApplication([
    ('/', MainPage),
], debug=True)
I am able to run this code locally (http://localhost:8080) and everything works correctly; however, I get a 500 Server Error when I try to deploy it to GAE using:
appcfg.py -A <Project-Id> -V v1 update .
This is the error I get from the Error Report Console:
error: An error occured while connecting to the server: DNS lookup failed for URL:http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/https://www.googleapis.com/auth/sqlservice.admin/?recursive=True
I believe it is an auth issue. To make sure my service account was authorized, I went through the gcloud authentication for service accounts, and I also set the environment variables from the SDK.
I have been trying to get around this for a while, any pointers are very appreciated. Thank you.
Also, I have been using Service Account Auth by following these docs: https://developers.google.com/identity/protocols/OAuth2ServiceAccount, where it says that I shouldn't be able to run AppAssertionCredentials locally, which adds to my confusion because I actually can, with no errors.
EDIT:
After reuploading and reauthorizing my service account I was able to connect to the server. However, the authorization error continues with this:
HttpError: <HttpError 403 when requesting https://www.googleapis.com/bigquery/v2/projects/sqlserver-1384/queries?alt=json returned "Insufficient Permission">
To fix the "error while connecting to the server", follow the instructions listed in this answer: https://stackoverflow.com/questions/31651973/default-credentials-in-google-app-engine-invalid-credentials-error#= and then re-upload the app.
Then, to fix the HttpError 403 when requesting ... returned "Insufficient Permission", you have to change the scope you were requesting. In my case I was requesting:
credentials = AppAssertionCredentials(
    'https://www.googleapis.com/auth/sqlservice.admin')
However, the correct scope for Google BigQuery is https://www.googleapis.com/auth/bigquery, which looks like this:
credentials = AppAssertionCredentials(
    'https://www.googleapis.com/auth/bigquery')
If you are using a different API, use whichever scope is outlined in the documentation.
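To illustrate, here is a minimal sketch of the relevant part of the get() handler from the question with the BigQuery scope in place (same project ID placeholder and query as above; I haven't re-run this exact snippet, so treat it as illustrative):

credentials = AppAssertionCredentials(
    'https://www.googleapis.com/auth/bigquery')  # BigQuery scope instead of sqlservice.admin
service = discovery.build('bigquery', 'v2', credentials=credentials)
query_request_body = {
    "query": "SELECT a from Data.test LIMIT 10"
}
response = service.jobs().query(
    projectId='<Project-ID>', body=query_request_body).execute()
self.response.write(response)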
I'm using the Firebase Admin Python SDK to read/write data to Firestore. I've created a service account with the necessary permissions and saved the credentials .json file in the source code (I know this isn't the most secure, but I want to get the thing running before fixing security issues). When testing the integration locally, it works flawlessly. But after deploying to GCP, where our service is hosted, calls to Firestore don't work properly and retry for a while before throwing 503 Deadline Exceeded errors. However, SSHing into a GKE pod and calling the SDK manually works without issues. It's only when the SDK is used in the normal code flow that problems occur.
Our service runs in Google Kubernetes Engine in one project (call it Project A), but the Firestore database is in another project (call it project B). The service account that I'm trying to use is owned by Project B, so it should still be able to access the database even when it is being initialized from inside Project A.
Here's how I'm initializing the SDK:
from firebase_admin import get_app
from firebase_admin import initialize_app
from firebase_admin.credentials import Certificate
from firebase_admin.firestore import client
from google.api_core.exceptions import AlreadyExists

credentials = Certificate("/path/to/credentials.json")
try:
    app = initialize_app(credential=credentials, name="app_name")
except ValueError:
    app = get_app(name="app_name")
client = client(app=app)
Another wrinkle is that another part of our code is able to successfully use the same service account to produce Firebase Access Tokens. The successful code is:
import firebase_admin
from firebase_admin import auth as firebase_admin_auth

if "app_name" in firebase_admin._apps:
    # Already initialized
    app = firebase_admin.get_app(name="app_name")
else:
    # Initialize
    credentials = firebase_admin.credentials.Certificate("/path/to/credentials.json")
    app = firebase_admin.initialize_app(credential=credentials, name="app_name")

firebase_token = firebase_admin_auth.create_custom_token(
    uid="id-of-user",
    developer_claims={"admin": is_admin, "site_slugs": read_write_site_slugs},
    app=app,
)
Any help appreciated.
Turns out that the problem here was a conflict between gunicorn's gevent workers and the SDK's use of gRPC, something related to websockets. I found the solution here. I added the following code to our Django app's settings:
import grpc.experimental.gevent as grpc_gevent

# Patch gRPC so it cooperates with gevent's monkey-patched sockets/threads
grpc_gevent.init_gevent()
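One caveat (my assumption from how the patch works, not something verified against this exact app): init_gevent() needs to run before any gRPC channel is created, so in a Django project it belongs near the top of settings.py, ahead of anything that constructs the Firestore client. A sketch:

# settings.py -- near the top, before anything touches gRPC/Firestore
import grpc.experimental.gevent as grpc_gevent
grpc_gevent.init_gevent()  # patch gRPC before any channels are created

# ... remaining settings; the firebase_admin/Firestore initialization elsewhere
# in the app can then run safely under gunicorn's gevent workers.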
I want to send requests to a deployed app on Cloud Run with Python, but inside the test file I don't want to hardcode the endpoint. How can I get the URL of the deployed app with a Python script inside the test file, so that I can send requests to that URL?
You can use gcloud to fetch the URL of the service like this:
gcloud run services describe SERVICE_NAME \
    --format="value(status.url)"
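If you just want that value from inside the Python test itself, one option (a sketch; it assumes the gcloud CLI is installed and authenticated wherever the tests run, and SERVICE_NAME/REGION are placeholders you'd fill in) is to shell out to that same command:

import subprocess

def get_service_url(service_name, region):
    # Ask gcloud for just the status.url field of the deployed service
    return subprocess.check_output(
        [
            "gcloud", "run", "services", "describe", service_name,
            "--region", region,
            "--format", "value(status.url)",
        ],
        text=True,
    ).strip()

url = get_service_url("SERVICE_NAME", "REGION")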
In a pure Python way, you can use Google's API Client Library for Run.
To my knowledge, there isn't a Cloud Client Library for Cloud Run. The method is namespaces.services.get, and it is documented in the APIs Explorer under namespaces.services.get.
One important fact with Cloud Run is that the API endpoint differs by Cloud Run region.
See service endpoint. You will need to override the client configuration (using ClientOptions) with the correct (region-specific) api_endpoint.
The following is from-memory! I've not run this code but it should be (nearly) correct:
import google.auth
import os
from googleapiclient import discovery
from google.api_core.client_options import ClientOptions

creds, project = google.auth.default()

REGION = os.getenv("REGION")
SERVICE = os.getenv("SERVICE")

# Must override the default run.googleapis.com endpoint
# with the region-specific endpoint
api_endpoint = "https://{region}-run.googleapis.com".format(
    region=REGION
)

options = ClientOptions(
    api_endpoint=api_endpoint
)

service = discovery.build("run", "v1",
    client_options=options,
    credentials=creds
)

name = "namespaces/{namespace}/services/{service}".format(
    namespace=project,
    service=SERVICE
)

rqst = service.namespaces().services().get(name=name)
resp = rqst.execute()
The resp will be a Service object, and you can grab the url from its ServiceStatus (resp["status"]["url"]).
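To close the loop with the original question, a short usage sketch (untested; assumes the requests library is available in the test environment):

import requests

url = resp["status"]["url"]  # public URL from the Service's status
response = requests.get(url)
print(response.status_code)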
I'm new to AWS Chalice and I'm running into obstacles during deployment. Essentially, everything works fine when I run chalice local: I can go to the route I've defined and it returns the related JSON data. However, once it's deployed and I try accessing the same route, I get an HTTP 502 Bad Gateway error:
{
    "message": "Internal server error"
}
Is there something I'm missing when I set up my AWS IAM roles? Or, more generally, is there some level of configuration with AWS beyond just setting a config file with the access key and secret in the .aws directory on my system? Below is the basic boilerplate code I'm running when I get the error:
from chalice import Chalice
import json, datetime, os
from chalicelib import config

app = Chalice(app_name='trading-bot')
app.debug = True

@app.route('/quote')
def quote():
    return {"hello": "world"}
Please let me know if there are any more details I can provide; again, I'm new to Chalice and AWS in general, so it may be some simple settings I need to update on my profile.
Thanks!
I am trying to get sample code for the Python client library to work from Google Cloud Datalab. The program looks like this:
import googleapiclient.discovery


def get_client():
    """Builds a client to the dataproc API."""
    dataproc = googleapiclient.discovery.build('dataproc', 'v1')
    return dataproc


def list_clusters(dataproc, project):
    result = dataproc.projects().regions().clusters().list(
        projectId=project,
        region='global').execute()
    return result


if __name__ == "__main__":
    dpc = get_client()
    project = 'my-sandbox1-165203'
    res = list_clusters(dpc, project)
    print res
Following the tutorial, it works just fine from my local system, which has the Google Cloud SDK installed. I also got it to work from a Compute Engine instance with Cloud API access scopes set to 'Allow full access to all Cloud APIs'. The Datalab Compute Engine instance has 'Allow full access to all Cloud APIs' as well, as I can see from the console. But when I run the code from Cloud Datalab,
dataproc.projects().regions().clusters().list(projectId=project,region='global').execute()
fails with
HttpError: <HttpError 401 when requesting https://dataproc.googleapis.com/v1/projects/my-sandbox1-165203/regions/global/clusters?alt=json returned "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.">
And as far as I can see, the Datalab code runs with the default service account, as '!gcloud auth list' shows.
Any idea how I can get this to work?
I am having trouble accessing the Firebase DB from a GAE app. It works fine when running locally, but when deployed to GAE (*.appspot.com) I get an unauthorized error back from Firebase. There is likely some obvious thing I missed...
This is the _get_http() function I use. It works fine locally (after doing gcloud beta auth application-default login):
import httplib2
from oauth2client.client import GoogleCredentials


def _get_http():
    """Provides an authed http object."""
    _FIREBASE_SCOPES = [
        'https://www.googleapis.com/auth/firebase.database',
        #'https://www.googleapis.com/auth/userinfo.email'
    ]
    http = httplib2.Http()
    # Use application default credentials to make the Firebase calls
    # https://firebase.google.com/docs/reference/rest/database/user-auth
    creds = GoogleCredentials.get_application_default().create_scoped(_FIREBASE_SCOPES)
    creds.authorize(http)
    return http
The error I get back from Firebase is:
{u'error': u'Unauthorized request.'}
The log from GAE includes this:
11:14:43.835
Attempting refresh to obtain initial access_token
11:14:44.625
Refreshing due to a 401 (attempt 1/2)
11:14:44.966
Refreshing due to a 401 (attempt 2/2)
Got it! I had commented out 'https://www.googleapis.com/auth/userinfo.email' thinking it was unnecessary. Wrong. Enabling it again, bingo!
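For completeness, this is what the working scopes list looks like with that line restored (same _get_http() function as above; only the scopes change):

_FIREBASE_SCOPES = [
    'https://www.googleapis.com/auth/firebase.database',
    # re-enabled; without this scope Firebase returned 'Unauthorized request.'
    'https://www.googleapis.com/auth/userinfo.email',
]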