I am trying to send variables from my Flutter app through an HTTP trigger to a Python script on Google Cloud Functions.
I can successfully trigger the function, but my function does not receive the variable.
Here is my relevant app code:
callCloudFunction() async {
  final HttpsCallable callable = CloudFunctions.instance.getHttpsCallable(
    functionName: 'testPath',
  );
  dynamic resp = await callable.call(<String, dynamic>{
    'uid': uid,
  }).catchError((onError) {
    // Handle your error here if the function failed
  });
}
Here is my relevant python code from cloud functions:
def main(request):
    name = 'empty'
    request_args = request.args
    request_json = request.get_json(silent=True)
    if request_json and 'uid' in request_json:
        name = request_json['uid']
    elif request_args and 'uid' in request_args:
        name = request_args['uid']
    print(name)
The function triggers without issues. It has other functionality that I did not show here, so I can confirm the trigger works. It's just the passing of that variable "uid" that is not working.
You're using the Firebase SDK for Cloud Functions to invoke a callable function, but your Python function is just a regular HTTP function. This isn't going to work without implementing the protocol that the SDK uses to communicate with the function. Backend support for callables is only provided for Node.js when deploying with the Firebase CLI.
It will probably be easier if you just make a normal HTTP request from the client instead of writing all the code required for the protocol.
I needed to use request.form.get('uid') to get the data from my POST request. Hope this helps someone!
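For reference, a minimal sketch of a handler using that approach, assuming the client sends uid as form-encoded data in a plain HTTP POST (the 'empty' fallback mirrors the original code):

def main(request):
    # Read 'uid' from form-encoded POST data; fall back to a default.
    uid = request.form.get('uid', 'empty')
    print(uid)
    return uid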
I have been using Google Cloud Functions for over a week now and they have been great. I used a simple Python 3.9 function to print a string to my terminal in my Next.js app (for testing purposes) and it was working great. Here is my sample Google Cloud Function.
def hello_world(request):
    """Responds to any HTTP request.
    Args:
        request (flask.Request): HTTP request object.
    Returns:
        The response text or any set of values that can be turned into a
        Response object using
        `make_response <http://flask.pocoo.org/docs/1.0/api/#flask.Flask.make_response>`.
    """
    request_json = request.get_json()
    if request.args and 'message' in request.args:
        return request.args.get('message')
    elif request_json and 'message' in request_json:
        return request_json['message']
    else:
        return f'Function ran'
And here is my Next.js code that calls the function:
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from "next"
import { GoogleAuth } from "google-auth-library"

export default async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
  const url = process.env.FUNCTION_URL as string

  // Example with the key file, not recommended on GCP environment.
  const auth = new GoogleAuth({ keyFilename: process.env.KEYSTORE_PATH })

  // Create your client with an Identity token.
  const client = await auth.getIdTokenClient(url)
  const result = await client.request({ url })
  console.log(result.data)
  res.json({ data: result.data })
}
I wrote another function to do the same thing, and now every function just prints raw HTML to the console. When I open the text in an index.html file it looks like this.
I rewrote the original cloud function exactly and even that doesn't work anymore. It prints that same HTML to the console. What is going on? My code is exactly the same and it breaks now...?
It was a simple "needle in the haystack" type of fix. I was getting the function URL from within the Cloud Functions listing view by clicking the Actions Menu dropdown on the desired function and selecting "Copy Function." This took me to the configuration display page, which had the function URL in the "Trigger" section.
Notice what I circled in red: for some reason this page appends a -1 to the function URL. This is what made it fail for me. In my code I deleted the -1 from the URL stored in the process.env.FUNCTION_URL variable, and that made it work!
I'm fairly new to AWS and its Cognito and API Gateway services.
In AWS, I have created a Cognito User Pool and an API Gateway API with some API endpoints to be accessed via REST API calls. The API Gateway "Authorizer" is set to "Cognito".
After that, I exported the Swagger document (OpenAPI 2.0) using the AWS Console's export function and generated a Python REST client API with the Swagger Editor.
The generated REST client SDK contains the model-specific "GET" function, e.g.:
# create an instance of the API class
api_instance = swagger_client.DefaultApi()
user_id = 'user_id_example'  # str |
try:
    api_response = api_instance.user_get(user_id)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling DefaultApi->user_get: %s\n" % e)
In order to get a correct result from the call api_instance.user_get(user_id), I somehow need to pass the access token to that function.
The question now is: how do I pass the access token, which I successfully obtained after the user signed in, to the Python REST client API in order to invoke an API endpoint that has a "Cognito" authorizer set?
I saw many examples of how to do this with Postman or cURL, but that is not what I'm looking for. I want to invoke my Cognito-protected API endpoint in AWS API Gateway with the generated REST API client. I assume there must be a way to put the received access token into the "Authorization" header of the HTTP request before the generated REST client function is invoked.
Any help is very appreciated.
I'm not sure if I've understood you correctly, but this might help you.
import requests
endpoint = ".../api/ip"
data = {"ip": "1.1.2.3"}
headers = {"Authorization": "Bearer MyBearerAuthTokenHere"}
print(requests.post(endpoint, data=data, headers=headers).json())
Edit: You don't need to parse the response as JSON if it isn't JSON. This is just a sample.
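If you'd rather stay with the generated swagger client instead of raw requests, here is a hedged sketch: swagger-codegen Python clients typically expose a Configuration object whose api_key dict is keyed by the security scheme's header name. For a Cognito authorizer exported from API Gateway that name is usually Authorization, but check your generated client's security definitions, as the exact key is an assumption here:

# Assumes the generated security scheme is an API key in the
# 'Authorization' header; adjust the key name to match your client.
configuration = swagger_client.Configuration()
configuration.api_key['Authorization'] = access_token

api_instance = swagger_client.DefaultApi(swagger_client.ApiClient(configuration))
api_response = api_instance.user_get(user_id)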
I have a situation where I am creating two Cloud Functions, CF1 and CF2, and one Cloud Scheduler job. Both Cloud Functions have authenticated invocation enabled. The flow is: Cloud Scheduler triggers CF1, and on completion, CF1 triggers CF2 via an HTTP call. I referred to "Cannot invoke Google Cloud Function from GCP Scheduler" to access the authenticated CF1 from Cloud Scheduler, and I am able to access CF1. But I have a problem accessing CF2 from CF1: CF1 does not trigger CF2 and also gives no error message. Do we need to follow any other technique when accessing an authenticated Cloud Function from another authenticated Cloud Function?
CF1 code:
import json
import logging

from requests_futures.sessions import FuturesSession

def main(request):
    # To read parameter values from the request (URL arguments or JSON body).
    raw_request_data = request.data
    string_request_data = raw_request_data.decode("utf-8")
    request_json: dict = json.loads(string_request_data)
    request_args = request.args

    if request_json and 'cf2_endpoint' in request_json:
        cf2_endpoint = request_json['cf2_endpoint']
    elif request_args and 'cf2_endpoint' in request_args:
        cf2_endpoint = request_args['cf2_endpoint']
    else:
        cf2_endpoint = 'Invalid endpoint for CF2'

    logger = logging.getLogger('test')
    try:
        session = FuturesSession()
        session.get("{}".format(cf2_endpoint))
        logger.info("First cloud function executed successfully.")
    except RuntimeError:
        logger.error("Exception occurred {}".format(RuntimeError))
CF2 code:
import logging

def main(request):
    logger = logging.getLogger('test')
    logger.info("second cloud function executed successfully.")
Current output logs:
First cloud function executed successfully.
Expected output logs:
First cloud function executed successfully.
second cloud function executed successfully.
Note: The same flow works if I use unauthenticated access to both Cloud Functions.
Two things are happening here:
You're not using requests-futures entirely correctly. Since the request is made asynchronously, you need to block on the result before the function implicitly returns, otherwise the function might return before your HTTP request completes (which is probably what's happening in this example):
session = FuturesSession()
future = session.get("{}".format(cf2_endpoint))
resp = future.result() # Block on the request completing
The request you're making to the second function is not actually an authenticated request. Outbound requests from a Cloud Function are not authenticated by default. If you looked at what the actual response is above, you would see:
>>> resp.status_code
403
>>> resp.content
b'\n<html><head>\n<meta http-equiv="content-type" content="text/html;charset=utf-8">\n<title>403 Forbidden</title>\n</head>\n<body text=#000000 bgcolor=#ffffff>\n<h1>Error: Forbidden</h1>\n<h2>Your client does not have permission to get URL <code>/function_two</code> from this server.</h2>\n<h2></h2>\n</body></html>\n'
You could jump through a lot of hoops to properly authenticate this request, as detailed in the docs: https://cloud.google.com/functions/docs/securing/authenticating#function-to-function
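For completeness, a minimal sketch of the function-to-function approach from those docs. It assumes the code runs inside the Cloud Functions runtime (the metadata server is not reachable locally) and that CF1's service account has the Cloud Functions Invoker role on CF2:

import requests

METADATA_URL = ('http://metadata.google.internal/computeMetadata/v1/'
                'instance/service-accounts/default/identity?audience=')

def call_cf2(cf2_endpoint):
    # Fetch an identity token for the target audience from the metadata server.
    token = requests.get(
        METADATA_URL + cf2_endpoint,
        headers={'Metadata-Flavor': 'Google'},
    ).text
    # Call CF2, passing the identity token as a Bearer token.
    return requests.get(
        cf2_endpoint,
        headers={'Authorization': 'Bearer {}'.format(token)},
    )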
However, a better alternative would be to make your second function a "background" function and invoke it via a PubSub message published from the first function instead:
from google.cloud import pubsub

publisher = pubsub.PublisherClient()
topic_name = 'projects/{project_id}/topics/{topic}'.format(
    project_id=<your project id>,
    topic='MY_TOPIC_NAME',  # Set this to something appropriate.
)

def function_one(request):
    message = b'My first message!'
    publisher.publish(topic_name, message)

def function_two(event, context):
    message = event['data'].decode('utf-8')
    print(message)
As long as your functions have the permissions to publish PubSub messages, this avoids the need to add authorization to the HTTP requests, and also ensures at-least-once delivery.
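Note that for this to work, function_two would be deployed with a Pub/Sub trigger rather than an HTTP trigger, e.g. with something like gcloud functions deploy function_two --trigger-topic MY_TOPIC_NAME (plus your usual runtime flags).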
Google Cloud Functions provides a REST API interface that includes a call method, which can be used from another Cloud Function for HTTP invocation.
Although the documentation mentions using Google-provided client libraries, there is still no dedicated one for Cloud Functions in Python.
Instead, you need to use the general Google API Client Libraries. [This is the Python one.]
Probably the main difficulty with this approach is understanding the authentication process.
Generally, you need to provide two things to build a client service: credentials and scopes.
The simplest way to get credentials is to rely on the Application Default Credentials (ADC) library. The right documentation about that is:
https://cloud.google.com/docs/authentication/production
https://github.com/googleapis/google-api-python-client/blob/master/docs/auth.md
The place to get scopes is each REST API function's documentation page.
For example, OAuth scope: https://www.googleapis.com/auth/cloud-platform
The complete code example for calling a 'hello-world' cloud function is below.
Before you run it:
Create a default Cloud Function on GCP in your project.
Keep and note the default service account to use.
Keep the default body.
Note the project_id, the function name, and the location where you deploy the function.
If you will call the function outside the Cloud Functions environment (locally, for instance), set the environment variable GOOGLE_APPLICATION_CREDENTIALS according to the docs mentioned above.
If you will actually call it from another Cloud Function, you don't need to configure credentials at all.
from googleapiclient.discovery import build
from googleapiclient.discovery_cache.base import Cache
import google.auth
import pprint as pp

def get_cloud_function_api_service():
    class MemoryCache(Cache):
        _CACHE = {}

        def get(self, url):
            return MemoryCache._CACHE.get(url)

        def set(self, url, content):
            MemoryCache._CACHE[url] = content

    scopes = ['https://www.googleapis.com/auth/cloud-platform']

    # If the environment variable GOOGLE_APPLICATION_CREDENTIALS is set,
    # ADC uses the service account file that the variable points to.
    #
    # If the environment variable GOOGLE_APPLICATION_CREDENTIALS isn't set,
    # ADC uses the default service account that Compute Engine, Google Kubernetes
    # Engine, App Engine, Cloud Run, and Cloud Functions provide.
    #
    # See more on https://cloud.google.com/docs/authentication/production
    credentials, project_id = google.auth.default(scopes=scopes)

    service = build('cloudfunctions', 'v1', credentials=credentials, cache=MemoryCache())
    return service

google_api_service = get_cloud_function_api_service()

# Fill in your own project id, location, and function name here.
name = 'projects/{project_id}/locations/us-central1/functions/function-1'
body = {
    'data': '{ "message": "It is awesome, you are develop on Stack Overflow language!"}'  # JSON passed as a string
}

result_call = google_api_service.projects().locations().functions().call(name=name, body=body).execute()
pp.pprint(result_call)

# expected output:
# {'executionId': '3h4c8cb1kwe2', 'result': 'It is awesome, you are develop on Stack Overflow language!'}
I am a beginner with Amazon Web Services.
I have the Lambda Python function below:
import sys
import logging
import pymysql
import json

rds_host = ".amazonaws.com"
name = "name"
password = "123"
db_name = "db"
port = 3306

def save_events(event):
    result = []
    conn = pymysql.connect(rds_host, user=name, passwd=password, db=db_name,
                           connect_timeout=30)
    with conn.cursor(pymysql.cursors.DictCursor) as cur:
        cur.execute("select * from bodyPart")
        result = cur.fetchall()
        cur.close()
    print("Data from RDS...")
    print(result)
    bodyparts = json.dumps(result)
    bodyParts = bodyparts.replace("\"", "'")
    # Return the result so the handler can use it.
    return bodyParts

def lambda_handler(event, context):
    return save_events(event)
Using the above function I am sending JSON to the client via API Gateway. Now suppose the user selects an item from the list and sends it back, in the form of JSON: where will I get that HTTP request, and how should I process it?
This is just some additional information to complement #Harsh Manvar's answer.
I think the easiest way is to use API Gateway's Lambda proxy integration (api-gateway-proxy-integration-lambda).
Currently API Gateway supports AWS Lambda very well; you can pass the request body (JSON) to your Lambda function via event.body.
I use it every day in a hobby project (a Slack command bot, which is harder because you need to map from application/x-www-form-urlencoded to JSON through a mapping template).
For you I think it is simple, because you are using only JSON for both request and response. The key is to set the Integration type to Lambda function.
You can find some quick tutorials on Medium.com for more detail; I will only link the docs from Amazon.
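For illustration, a minimal sketch of a proxy-integration handler that reads the JSON the client posts back (the 'item' field name is a hypothetical placeholder; use whatever your front end actually sends):

import json

def lambda_handler(event, context):
    # With Lambda proxy integration, the raw request body arrives as a
    # JSON string in event['body'].
    body = json.loads(event.get('body') or '{}')
    selected = body.get('item')  # hypothetical field name
    return {
        'statusCode': 200,
        'headers': {'Content-Type': 'application/json'},
        'body': json.dumps({'received': selected}),
    }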
#mohith: Hi, I just put together a simple approach for you; you can see it here.
First you need to create an API (see the docs above), then link it to your Lambda function. Because you only use JSON, you need to check the box named Use Lambda Proxy integration, like this:
Then you need to deploy it!
Then in your function you can handle your code; in my case, I return the whole event that is passed to my function, like this:
Finally you can post to your endpoint; I used Postman in my case:
I hope you get my idea. When you have successfully deployed your API, you can do anything with it in your front end.
I suggest you research CloudWatch more. When you work with API Gateway, Lambda, etc., it is a Swiss army knife; you cannot live without it, and it makes tracing and debugging your code very easy.
Please do not hesitate to ask me anything.
You can use the AWS service called API Gateway; it will give you an endpoint for HTTP API requests.
This API Gateway connects to your Lambda, and you can pass HTTP requests through it to the Lambda.
Here is some info about creating a REST API on Lambda you can check out: https://docs.aws.amazon.com/apigateway/latest/developerguide/how-to-create-api.html
AWS also provides GET and POST examples for Lambda. You just have to edit the code, and it will automatically create the API Gateway; you can check those as a reference.
From the Lambda console > Create function > choose AWS Serverless Application Repository > in the search bar type "get" and search > api-lambda-dynamodb > it will take a value from the user and process it in Lambda.
Here is a link where you can check the examples directly: https://console.aws.amazon.com/lambda/home?region=us-east-1#/create?tab=serverlessApps
Posting here because I just can't get a redirect working. Using AWS API Gateway linked to a Python Lambda function as a proxy just returns the response and header JSON. Here is the code:
import json

def lambda_handler(event, context):
    response = {}
    response["statusCode"] = 301
    response["headers"] = [{"key": 'Location', "value": 'https://www.google.com'}]
    data = {}
    response["body"] = json.dumps(data)
    return response
Any help will be appreciated.
Thanks
There is mixed documentation on the web, which was confusing. The syntax for specifying the redirect using Location needs to be the following when using Python:
import json

def lambda_handler(event, context):
    response = {}
    response["statusCode"] = 302
    response["headers"] = {'Location': 'https://www.google.com'}
    data = {}
    response["body"] = json.dumps(data)
    return response
I'll prefix this by saying that when I copied this code into a Lambda function, added an API Gateway to it using the settings that made sense to me, and tested from a browser and curl, I got the correct redirect. Which is expected, the code looks right and conforms to the specification in the documentation.
So I spent some time fiddling with settings in Lambda and in API Gateway to try to break it; plus searching the web to see how others have had it not work.
The general Internet consensus in 2021 (the time of the original post) was that there was a setting, "Use Lambda proxy integration", in the API Gateway that needed to be turned on for API Gateway to interpret the returned JSON correctly, and this was not the default. I can't find that setting in the console today in that format, but when you create an API Gateway "API" you select an integration, and the first one in the list is Lambda. Selecting that sets up the integration correctly for interpreting the JSON (in either v1 or v2 format).
If you are working in an older already-configured API Gateway endpoint, I'd suggest looking for the "Use Lambda proxy integration" setting, followed by setting up the API Gateway with the "Lambda" integration setting if it is the new-style interface.
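A quick way to verify the redirect from Python without a browser (the URL below is a hypothetical placeholder for your deployed stage's invoke URL):

import requests

# Hypothetical invoke URL for your deployed API stage.
url = 'https://abc123.execute-api.us-east-1.amazonaws.com/prod/redirect'

# Disable automatic redirect following so we can inspect the raw response.
resp = requests.get(url, allow_redirects=False)
print(resp.status_code)               # expect 301 or 302
print(resp.headers.get('Location'))   # expect https://www.google.com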
A few lines shorter, with the same output:
def handler(event, context):
    response = {
        "headers": {"Location": "https://www.google.com"},
        "statusCode": 301,
    }
    return response