I want to set my Google JSON credentials file at run time through Postman. I have built a BigQuery REST API. Right now I am setting it in my code like this:
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = 'C:/Users/Documents/bigQuery/service.json'
@app.route('/', methods=['GET', 'POST'])
def get_request():
    query:
    .
    .
    .
    return results

if __name__ == "__main__":
    app.run()
I tried taking the input from form data in Postman, but the code throws an error asking for the credentials file to be set up first.
The error:
DefaultCredentialsError: Could not automatically determine credentials.
Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see
https://cloud.google.com/docs/authentication/getting-started
Setting GOOGLE_APPLICATION_CREDENTIALS is needed for the code to fetch the BigQuery data.
I want to use Postman to pass the key.json file to the code at run time.
Instead of setting the env variable, there are 2 other options:
Create the client using client = bigquery.Client.from_service_account_json(json_credentials_path) as per documentation.
Create the client using client = bigquery.Client.from_service_account_info(json_object) as per documentation.
Either way will allow you to pass the credentials at runtime. See the example below for the second option. The first option should be quite evident.
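For the first option, a minimal sketch (the key file path here is a placeholder) would be:
from google.cloud import bigquery

# Option 1: build the client straight from a key file path known at runtime
client = bigquery.Client.from_service_account_json("path/to/service.json")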
As you can see in the code below, the credentials get set on the client side, not on the server side.
server.py
from google.cloud import bigquery
from flask import Flask, request
import json
app = Flask(__name__)
def querysomething(json_object):
    # https://googleapis.dev/python/bigquery/latest/generated/google.cloud.bigquery.client.Client.html#google.cloud.bigquery.client.Client.from_service_account_info
    client = bigquery.Client.from_service_account_info(json_object)
    # example below stolen from:
    # https://cloud.google.com/bigquery/docs/reference/libraries#using_the_client_library
    query = """
        SELECT name, SUM(number) as total_people
        FROM `bigquery-public-data.usa_names.usa_1910_2013`
        WHERE state = 'TX'
        GROUP BY name, state
        ORDER BY total_people DESC
        LIMIT 20
    """
    query_job = client.query(query)  # Make an API request.
    print("The query data:")
    for row in query_job:
        # Row values can be accessed by field name or index.
        print("name={}, count={}".format(row[0], row["total_people"]))

@app.route("/api/query", methods=["POST"])
def api_query():
    print(request.is_json)
    json_object = json.loads(request.get_json())
    print(json_object)
    querysomething(json_object)
    return "ok"

if __name__ == "__main__":
    app.run(debug=True)
client.py
import requests
import json
with open("credentials.json") as infile:
credentials = json.load(infile)
target = "http://127.0.0.1:5000/api/query"
asjson = json.dumps(credentials)
response = requests.post(target, json=asjson)
print(response, response.text)
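Note that the client above double-encodes the credentials: json.dumps before requests.post means the server receives a JSON string, which is why server.py calls json.loads(request.get_json()). A slightly simpler variant (a sketch, same endpoint assumed) posts the dict directly, so no second decode is needed:
# client: let requests encode the dict itself
response = requests.post(target, json=credentials)

# server: request.get_json() then already returns a dict
json_object = request.get_json()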
I've been making an API with Flask on Google App Engine, and when I send a request to this app from the browser after deploying, I get a 502 error. From "gcloud app logs tail -s test" I'm sure this error is caused by the GCP credentials, but the path and file name of the credentials JSON look OK. I have googled and tried every article I found, but could not solve it.
I have already done export GOOGLE_APPLICATION_CREDENTIALS="/home/user/secret_key/bq.json"
Could anyone tell me the solution?
If any info is missing, please let me know. Thank you.
Besides, my API function gets a luid parameter over the HTTP request and runs SQL with that luid; if the row for that luid has data in the cv_date column in BigQuery, it returns True to the client.
【The result of "gcloud app logs tail -s test"】
File "/env/lib/python3.7/site-packages/google/auth/_default.py", line 97, in load_credentials_from_file "File {} was not found.".format(filename) google.auth.exceptions.DefaultCredentialsError: File /home/user/secret_key/bq.json was not found.
【/home/user/api_dev/main.py】
from flask import Flask, request
from google.cloud import bigquery
import os

credentials_json = '/home/user/secret_key/bq.json'
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = credentials_json
client = bigquery.Client()
app = Flask(__name__)

@app.route('/')
def get_request():
    request_luid = request.args.get('luid') or ''
    query = """
        SELECT EXISTS(SELECT cv_date FROM `test-266110.conversion_log.conversion_log_202008*` t WHERE request_luid = p.luid)
    """
    query_res = client.query(query)
    return query_res

if __name__ == "__main__":
    app.run()
【The code with the BigQuery parts removed, except the library import and variables】
*This code works well and returns the luid you pass as a URL parameter
from flask import Flask, request
from google.cloud import bigquery
import os

credentials_json = '/home/user/secret_key/bq.json'
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = credentials_json
app = Flask(__name__)

@app.route('/')
def get_request():
    request_luid = request.args.get('luid') or ''
    return request_luid

if __name__ == "__main__":
    app.run()
I'd recommend reading through the auth docs.
https://cloud.google.com/docs/authentication/production talks about service account interactions in a bit more detail. You likely don't need to pass in your credentials in the live app: you can simply set GOOGLE_APPLICATION_CREDENTIALS when you're running locally to use the key file, but you don't need to set it in production.
The issue is that the path you've specified (/home/user/secret_key/bq.json) is only valid for your development environment, and either not included in your production deployment at all or the absolute path to the file in the deployed app is different.
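As a sketch of that idea (assuming App Engine standard, which sets the GAE_ENV environment variable; the key path is the one from the question):
import os
from google.cloud import bigquery

# Point at a key file only when running outside App Engine.
if not os.environ.get('GAE_ENV', '').startswith('standard'):
    os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/home/user/secret_key/bq.json'

client = bigquery.Client()  # uses the runtime's default service account in production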
I'm having difficulty with my Cloud Function in GCP that is simply supposed to return the raw XML stored in a GCS bucket when invoked with a basic GET request. It works fine without any authentication; however, since I added the Flask-HTTPAuth package to the mix in order to add some measure of security before exposing the endpoint, the application deploys fine but crashes, without any hint as to why, as soon as it is invoked. The error in SD Logging is as follows:
severity: "DEBUG"
textPayload: "Function execution took 1847 ms, finished with status: 'crash'"
timestamp: "2020-07-15T17:22:15.158036700Z"
The function in question (anonymized):
from flask import Flask, request, jsonify, make_response, abort
from flask_httpauth import HTTPBasicAuth
from google.cloud import storage, secretmanager
import google.cloud.logging
import logging
import sys

app = Flask(__name__)
auth = HTTPBasicAuth()

PROJECT_ID = 'example_project'
GCS_BUCKET = 'example_bucket'
users = ['example_user']

# Instantiate logger
client = google.cloud.logging.Client()
client.get_default_handler()
client.setup_logging()

@auth.verify_password
def verify_password(username, password):
    # Instantiate the Secret Manager client.
    sm_client = secretmanager.SecretManagerServiceClient()
    # Load secrets
    name = sm_client.secret_version_path(PROJECT_ID, 'example_secrets_ref', 1)
    secrets_pass = sm_client.access_secret_version(name)
    passwords = [secrets_pass]
    if username in users and password in passwords:
        logging.info('auth success')
        return username
    logging.info('auth fail')
    return abort(403)

@app.route('/')
@auth.login_required
def latest_xml():
    try:
        request_json = request.get_json()#silent=True)
        storage_client = storage.Client(project=PROJECT_ID)
        bucket = storage_client.get_bucket(GCS_BUCKET)
        blob = bucket.get_blob('latest_pull.xml')
        latest_xml = blob.download_as_string()
        logging.info('Loaded blob from GCS')
        return(latest_xml)
    except Exception as e:
        logging.error(str(e))
        logging.error("Failed to load blob from GCS")
        sys.exit(1)

if __name__ == '__main__':
    app.run()
I've tried setting the entrypoint to both the main function and the auth function, to no avail. My question is: is it even possible to use basic auth in a GCP Cloud Function, or am I barking up the wrong tree here?
Your function doesn't follow the standard signature for an HTTP function:
def latest_xml(request):
...
Here you use a Flask web server, which is not needed, and not used by Cloud Functions. However, I recommend you have a look at Cloud Run and add a simple, generic Dockerfile to deploy. You can deploy your "function" as-is in a container and have the same behavior as Cloud Functions.
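As a rough sketch (bucket and object names taken from the question; auth, logging, and error handling trimmed for brevity), the handler rewritten with the Cloud Functions signature might look like this:
from google.cloud import storage

PROJECT_ID = 'example_project'
GCS_BUCKET = 'example_bucket'

def latest_xml(request):
    # Cloud Functions injects the flask.Request; no Flask app object is needed
    storage_client = storage.Client(project=PROJECT_ID)
    bucket = storage_client.get_bucket(GCS_BUCKET)
    blob = bucket.get_blob('latest_pull.xml')
    return blob.download_as_string()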
EDIT
When you use Flask, the request object is global for each request. You use it like this:
request_json = request.get_json()#silent=True)
With Cloud Functions, this object is caught by the Cloud Functions platform and passed as a parameter to your function.
In the request object you have the body of the request (useless in a GET, for example), but also all the request context: headers, user agent, source IP, ...
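For example, a function receiving that object can read the usual flask.Request attributes (a minimal sketch):
def handler(request):
    args = request.args                    # query-string parameters
    headers = request.headers              # request headers, user agent, etc.
    source_ip = request.remote_addr        # source IP of the caller
    body = request.get_json(silent=True)   # JSON body, or None on a plain GET
    return 'ok'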
I'm looking to create a Cloud Function in GCP that receives an HTTP request with parameters, passes the parameters to BigQuery within a SQL statement, and returns a result that I can pass back to a website.
I am very new at this and I am not an engineer by any stretch. I have got to the point where my Cloud Function deploys correctly and I receive a response "OK" in the browser when I hit it, but I can't get the values returned from BQ to show in the browser.
Here's my function so far, and thanks for any help in advance.
import google.cloud.bigquery

def audience(QUERY):
    # BQ Query to get add to cart sessions
    QUERY = """select
    visitId,
    from bigquery-public-data.google_analytics_sample.ga_sessions_20170801
    limit 10;
    return QUERY"""

print(audience)
This is an example of a Cloud Function that runs exactly the query you mention in your post. Nonetheless, it can be adapted to any other query very easily according to your needs. You basically need to follow this tutorial in order to deploy the function and get a basic understanding of how to query data using the Client Library for BigQuery.
Here is the summary of what you need to do:
Create a folder (e.g. cloudfunctionsexample), use cd [FOLDERNAME e.g. cloudfunctionsexample] to get inside it, and inside the folder create two files: main.py and requirements.txt.
a. main.py :
from flask import escape
from google.cloud import bigquery

client = bigquery.Client()

def bigquery_example(request):
    request_json = request.get_json(silent=True)
    request_args = request.args

    # Check if the request has all the correct parameters to run the query
    if request_json and 'column' in request_json:
        column = request_json['column']
    elif request_args and 'column' in request_args:
        column = request_args['column']
    else:
        return('You are missing the column parameter on the request.')
    if request_json and 'name' in request_json:
        name = request_json['name']
    elif request_args and 'name' in request_args:
        name = request_args['name']
    else:
        return('You are missing the name of the dataset parameter on the request.')
    if request_json and 'limit' in request_json:
        limit = request_json['limit']
    elif request_args and 'limit' in request_args:
        limit = request_args['limit']
    else:
        return('You are missing the limit parameter on the request.')

    # Construct the query based on the parameters
    QUERY = ('SELECT '+column+' FROM `'+name+'` LIMIT '+limit)
    #print(QUERY)

    try:
        query_job = client.query(QUERY)  # API request
        rows = query_job.result()  # Waits for query to finish
        # Create a list and make the results HTML compatible so they can be displayed in the browser.
        row_list = []
        for row in rows:
            row_list.append(str(row[column]))
        return("<p>" + "</p><p>".join(row_list) + "</p>")
    except Exception as e:
        return(str(e))
b. requirements.txt :
flask
google-cloud-bigquery
Assuming you have the Cloud SDK installed, and after making sure that the App Engine default service account (which is the default account used by Cloud Functions) has the Editor role assigned, run the following command to deploy the function in your project:
gcloud functions deploy bigquery_http_example --runtime python37 --trigger-http --allow-unauthenticated --entry-point=bigquery_example --timeout=540
Get the Cloud Function URL and either use the curl command to make a POST request, or simply add the parameters to the Cloud Function URL to make an HTTP request to the endpoint and see the results directly in your browser.
a. curl :
curl -X POST https://[REGION-FUNCTIONS_PROJECT_ID].cloudfunctions.net/bigquery_http_example -H "Content-Type:application/json" -d '{"column":"visitId","name":"bigquery-public-data.google_analytics_sample.ga_sessions_20170801","limit":"10"}'
b. Cloud Function URL :
https://[REGION-FUNCTIONS_PROJECT_ID].cloudfunctions.net/bigquery_http_example?column=visitId&name=bigquery-public-data.google_analytics_sample.ga_sessions_20170801&limit=10
I am new to APIs, and was given the task of creating a POST API. I have put some code together.
I want to add data to hello.txt through the POST API, so how do I do that?
Here is my code:
import flask
from flask import request, jsonify

app = flask.Flask(__name__)
app.config["DEBUG"] = True

@app.route('/api/v1/resources/messages', methods = ['POST'])
def api_message():
    if request.headers['Content-Type'] == 'text/plain':
        return "Text Message: " + request.data
    elif request.headers['Content-Type'] == 'application/octet-stream':
        return "Binary message written!"
    elif request.headers['Content-Type'] == 'application/json':
        f = open('F:\Asif_Ahmed\Projects\api\hello.txt',"w")
        f.write(request.data)
        f.close()
        return "JSON Message: " + json.dumps(request.json)
    else:
        return "415 Unsupported Media Type ;)"

app.run()
from flask import Flask, jsonify, render_template, request  # import flask library
from flask_basicauth import BasicAuth  # flask library to create basic authentication if needed
from flask_cors import CORS  # flask library for Cross-Origin Resource Sharing, a mechanism that uses additional HTTP headers to let a web application running at one origin access selected resources from a server at a different origin

app = Flask(__name__)
CORS(app)  # set up CORS for my app

# if you want to use basic authentication you need to set up a username and password
app.config['BASIC_AUTH_USERNAME'] = 'admin'
app.config['BASIC_AUTH_PASSWORD'] = 'password'
basic_auth = BasicAuth(app)  # set up the username and password for my app; which API uses them is specified per route below

@app.route('/api/v1/resources/add_messages', methods=['POST'])  # create my POST api
@basic_auth.required  # set up basic authentication for this API, comment out if not needed
def update_credential():
    json_credential = request.get_json()  # get the JSON sent via the API
    print(json_credential["message"])  # get the node "message" of my JSON
    ###########
    # code to write to your file: you need to write json_credential["message"]
    ###########
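    # A possible implementation of that write step (a sketch: appending to a local
    # hello.txt, the file named in the question; adjust the path as needed):
    with open('hello.txt', 'a') as f:
        f.write(json_credential["message"] + "\n")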
return ("ok")
if __name__ == '__main__':
app.run(host='0.0.0.0', port=1024, threaded=True)#start my flask app with local_host IP and specific port, if you don't specify the port it will run in the default port
In this case the JSON Input should be:
{"message":"your text"}
Please let me know if something is not clear. I tried this code locally and the JSON is passed without problems.
You need to run your Python script and check that the API is running. If you had no JSON to send and it was just a simple API that gives back information, you could even have used Chrome, but since you need to send JSON data I would advise you to use Postman: set the request up as a POST to the endpoint with a raw JSON body.
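If you prefer to script the request instead of using Postman, a minimal sketch with requests (username, password, port, and JSON shape all taken from the code above) would be:
import requests

response = requests.post(
    "http://localhost:1024/api/v1/resources/add_messages",
    auth=("admin", "password"),  # only needed if basic auth is enabled
    json={"message": "your text"},
)
print(response.status_code, response.text)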
I am using Flask-KVSession to avoid replay attacks, as the client-side cookie-based sessions used by Flask-Login are prone to them.
E.g.: if on the /index page the cookie in the header is set for your app like
myapp_session : 'value1'
and you navigate to the /important page, you will get a new header like
myapp_session : 'value2', so if a hacker gets 'value1' he can perform replay attacks and misuse it, as it is never invalidated.
To solve this I am using Flask-KVSession, which stores the session cookie header value in a cache or some other backend. So basically only one myapp_session is generated, and it is invalidated when you log out. But the problem is:
__init__.py
from simplekv.memory.redisstore import RedisStore
import redis
store = RedisStore(redis.StrictRedis())
#store = memcache.Client(['127.0.0.1:11211'], debug =0)
store.ttl_support = True
app = create_app(__name__)
current_kvsession = KVSessionExtension(store, app)
If you look at the cleanup_session part of the code for kv-session
http://pythonhosted.org/Flask-KVSession/#flask_kvsession.KVSessionExtension.cleanup_sessions
It only deletes the expired sessions. But if I want to explicitly delete the value of the current myapp_session for a particular user on logout, how do I do that?
@app.before_request
def redirect_if_logout():
    if request.path == url_for('logout'):
        for key in app.kvsession_store.keys():
            logger.debug(key)
            m = current_kvsession.key_regex.match(key)
            logger.debug('found %s', m)
            app.kvsession_store.delete(key)
But this deletes all the keys, as I don't know what the unique key for the current session is.
Q2. Also, how do I use memcache instead of redis, as it doesn't have the app.kvsession_store.keys() function and gives an I/O error?
I think I just figured out the first part of your question, on how you can delete the specific key on logout.
As mentioned in the docs:
Internally, Flask-KVSession stores session ids that are serialized as
KEY_CREATED, where KEY is a random number (the sessions “true” id) and
CREATED a UNIX-timestamp of when the session was created.
Sample cookie value that gets created on the client side (you can check with the cookie manager extension for Firefox):
c823af88aedaf496_571b3fd5.4kv9X8UvyQqtCtNV87jTxy3Zcqc
and session id stored in redis as key:
c823af88aedaf496_571b3fd5
So in the logout handler, you just need to read the cookie value, split it, and use the first part of the string:
Sample Code which worked for me:
import redis
from flask import Flask, request
from flask_kvsession import KVSessionExtension
from simplekv.memory.redisstore import RedisStore

store = RedisStore(redis.StrictRedis())
app = Flask(__name__)
KVSessionExtension(store, app)

# Logout Handler
@app.route('/logout', methods=['GET'])
def logout():
    # here you are reading the cookie
    cookie_val = request.cookies.get('session').split(".")[0]
    store.delete(cookie_val)
    return "logged out"  # a response is required by Flask
and since you have added ttl_support:
store.ttl_support = True
it will match the TTL (seconds) value from permanent_session_lifetime, if you have set that in your config file or at the beginning of your app.py file.
For example, in my application I have set, at the beginning of the app.py file:
session.permanent = True
app.permanent_session_lifetime = timedelta(minutes=5)
Now, when I log out, the key is deleted in redis, but it will not be removed until its TTL goes to 0 from 300 (5 minutes, as set in the permanent_session_lifetime value).
If you want to remove it from redis immediately, you can manually change app.permanent_session_lifetime to 1 second, which will in turn change the TTL in redis:
import redis
from datetime import timedelta
from flask import Flask, request
from flask_kvsession import KVSessionExtension
from simplekv.memory.redisstore import RedisStore

store = RedisStore(redis.StrictRedis())
app = Flask(__name__)
KVSessionExtension(store, app)

# Logout Handler
@app.route('/logout', methods=['GET'])
def logout():
    cookie_val = request.cookies.get('session').split(".")[0]
    app.permanent_session_lifetime = timedelta(seconds=1)
    store.delete(cookie_val)
    return "logged out"  # a response is required by Flask
Using the above code, I was able to thwart session replay attacks.
And a solution to your 2nd question:
Three possible mistakes that I can see are:
1: At the beginning of your code you have created:
store = RedisStore(redis.StrictRedis())
but in the loop you are using it as kvsession_store instead of just store:
app.kvsession_store.keys()
To use it without any errors/exceptions, call store.keys() instead of app.kvsession_store.keys():
from flask_kvsession import KVSessionExtension
from simplekv.memory.redisstore import RedisStore
store = RedisStore(redis.StrictRedis())
for key in store.keys():
print key
2: store.delete(key) by itself is not deleting all the keys; you are running it inside the loop, which is what deletes all the keys one by one.
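If the goal is to remove only the current user's session instead of everything, a sketch reusing the cookie-splitting approach and the store from above would be:
# delete just the key belonging to the current session
sid = request.cookies.get('session', '').split('.')[0]
if sid:
    store.delete(sid)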