calling a Google Cloud Function from Google Apps Script - Python

I have a Google Cloud Function that I would like to call from my Google Apps Script on a Google Form submission.
The process is: 1) a user submits the Google Form, 2) an onFormSubmit trigger runs the Apps Script function, 3) the Apps Script function calls the Cloud Function.
So far:
The script trigger works; the logs show it is listening correctly.
The Cloud Function works; I tested it in the Cloud Functions testing interface, and when I run it from there it does what I need it to do, which is to update a Google Sheet and upload data to BigQuery.
The problem comes from calling that function from the Apps Script associated with my Google Form submission trigger. There seems to be no communication there, as the Cloud Function logs show nothing happening when the trigger fires.
This is my Apps Script code:
function onSubmit() {
  var url = "myurl";
  const token = ScriptApp.getIdentityToken();
  var options = {
    'method': 'get',
    'headers': {'Authorization': 'Bearer ' + token}
  };
  // Note: getRequest() only builds and returns the request object;
  // it does not send the HTTP request (fetch() is what actually sends it).
  var data = UrlFetchApp.getRequest(url, options);
  return data;
}
And my Cloud Function is an HTTP one in Python and starts with:
def numbers(request):
Some troubleshooting:
When I test it, the execution log shows no errors.
If I change UrlFetchApp.getRequest to UrlFetchApp.fetch, or change getIdentityToken to getOAuthToken, I get a 401 error in both cases.
I added the following to my oauthScopes:
"openid",
"https://www.googleapis.com/auth/cloud-platform",
"https://www.googleapis.com/auth/script.container.ui",
"https://www.googleapis.com/auth/script.external_request",
"https://www.googleapis.com/auth/documents"```
I'm running both from the same Google Cloud account.
I also added myself to the permissions in the Cloud Function's settings.
Any ideas why the two aren't communicating would be appreciated!

I was able to resolve this, in case anyone has a similar issue. Since my email address was associated with an organizational account, my Apps Script and GCP projects didn't allow the correct permissions.
In the Apps Script settings, I couldn't change the GCP project associated with the script because the GCP project was outside my organization. Once I set up the Cloud Function in my organization's GCP project, I was able to change the project manually in the settings, and my function worked properly on the trigger.
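For reference, one way to sanity-check that the deployed function accepts authenticated calls is to invoke it from Python outside Apps Script. This is a minimal sketch, not the asker's setup: the URL is a placeholder, and it assumes credentials that can mint an ID token (for example, a service account key set in GOOGLE_APPLICATION_CREDENTIALS):

import google.auth.transport.requests
import google.oauth2.id_token
import requests

# Placeholder URL for the deployed HTTP function.
FUNCTION_URL = 'https://REGION-PROJECT.cloudfunctions.net/numbers'

# Mint an ID token whose audience is the function URL, then call it.
auth_request = google.auth.transport.requests.Request()
token = google.oauth2.id_token.fetch_id_token(auth_request, FUNCTION_URL)
response = requests.get(FUNCTION_URL, headers={'Authorization': 'Bearer ' + token})
print(response.status_code, response.text)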

Related

How can I allow my program permanent authorization for one Spotify account (my own) with the Spotify API?

I am building a local desktop app where I can read, classify, and create playlists.
The auth code I have is:
# OAuth
import spotipy

scope = "playlist-modify-public playlist-read-private playlist-modify-private"
sp = spotipy.Spotify(
    auth_manager=spotipy.SpotifyOAuth(
        client_id=client_id,          # app credentials defined elsewhere
        client_secret=client_secret,
        redirect_uri="https://example.com/callback/",
        scope=scope,
        open_browser=False))
When run on cmd, this asks me to click the generated link and then paste the link I was redirected to. I want to know if there is another way to provide authorization (automatically or permanently) so that my .exe app doesn't run into an error.
Code in your response would help a lot.
You cannot grant permanent access to the API in a single call, but you can refresh your token automatically whenever the access token expires, as shown in the docs.
If you're using Python, I recommend doing this via Spotipy, which makes the auth process much easier (see https://spotipy.readthedocs.io/en/master/#authorization-code-flow)
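In practice, Spotipy's SpotifyOAuth already handles the refresh: after one interactive authorization it stores the token (including the refresh token) in a cache file and refreshes it silently on later runs. A minimal sketch, with placeholder credentials and an assumed cache path:

import spotipy
from spotipy.oauth2 import SpotifyOAuth
from spotipy.cache_handler import CacheFileHandler

sp = spotipy.Spotify(
    auth_manager=SpotifyOAuth(
        client_id="YOUR_CLIENT_ID",          # placeholder
        client_secret="YOUR_CLIENT_SECRET",  # placeholder
        redirect_uri="https://example.com/callback/",
        scope="playlist-modify-public playlist-read-private playlist-modify-private",
        cache_handler=CacheFileHandler(cache_path=".spotify_token_cache"),
        open_browser=False))

# The browser prompt appears only on the first run; afterwards the cached
# refresh token is used, so a packaged .exe can run unattended.
print(sp.current_user()["id"])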

using OAuth2 user account authentication in the Python Google Cloud API from a Jupyter notebook

I am trying to access BigQuery from Python code in a Jupyter notebook running on a local machine, so I installed the Google Cloud API packages on my laptop.
I need to pass OAuth2 authentication, but unfortunately I only have a user account for our BigQuery. I do not have a service account or application credentials, nor do I have the permissions to create them. I am only allowed to work with a user account.
When I run the bigquery.Client() function, it appears to look for application credentials via the GOOGLE_APPLICATION_CREDENTIALS environment variable, but that points to application credentials I do not have.
I cannot find any other way to connect using user account authentication, which I find extremely weird because:
The Google API for the R language works with simple user authentication. Parallel code in R (it has a different API) just works!
I run the code from the DataSpell IDE, where I have created a database resource connection to BigQuery (with my user authentication). There I can open a console for the database and run SQL queries with no problem. I have attached the BigQuery session to my Python notebook, and I can see the notebook attached to the BigQuery session in the services pane. But I am still missing something in order to get a valid running connection in the Python code (I do not know how to get a Python object representing a validly connected client).
I have been reading Google's manuals and looking for code examples for hours... Alas, I cannot find any description of connecting a client using a user account from my notebook.
Please, can someone help?
You can use the pydata-google-auth library to authenticate with a user account. Its get_user_credentials function loads credentials from a cache on disk, or initiates an OAuth 2.0 flow if no cached credentials are found. Note that this is not the recommended authentication method.
import pandas_gbq
import pydata_google_auth

SCOPES = [
    'https://www.googleapis.com/auth/cloud-platform',
    'https://www.googleapis.com/auth/drive',
]

credentials = pydata_google_auth.get_user_credentials(
    SCOPES,
    # Set auth_local_webserver to True to have a slightly more convenient
    # authorization flow. Note, this doesn't work if you're running from a
    # notebook on a remote server, such as over SSH or with Google Colab.
    auth_local_webserver=True,
)

df = pandas_gbq.read_gbq(
    "SELECT my_col FROM `my_dataset.my_table`",
    project_id='YOUR-PROJECT-ID',
    credentials=credentials,
)
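The same user credentials can also be passed to the standard BigQuery client, which gives you the connected client object mentioned in the question (a sketch; the project ID is a placeholder):

from google.cloud import bigquery

# Reuse the 'credentials' object obtained from pydata_google_auth above.
client = bigquery.Client(project='YOUR-PROJECT-ID', credentials=credentials)
for row in client.query('SELECT 1 AS x').result():
    print(row.x)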
The recommended way to do the authentication is to contact your GCP administrator and ask them to create a service account key for you, following these instructions.
Then you can use this code to set up authentication with the key you have:
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    '/path/to/key.json')
You can see more of the documentation here.

Cloud Run: endpoint that runs a function as a background job

I am trying to deploy a REST API on Cloud Run where one endpoint launches an async job. The job is defined inside a function in the code.
It seems one way to do it is to use Cloud Tasks, but this would mean making a self-call to another endpoint of the deployed API: specifically, creating an auxiliary endpoint that contains the job code (e.g. /run-my-function) and another one that enqueues a Cloud Task which calls /run-my-function.
Is this the right way to do it, or have I misunderstood something? If it is the right way, how do I specify the URL of the /run-my-function endpoint without explicitly hard-coding the deployed Cloud Run URL?
The code for the endpoint that enqueues the task calling /run-my-function would be:
from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()

project = 'myproject'
queue = 'myqueue'
location = 'mylocation'
url = 'https://cloudrunservice-abcdefg-ca.b.run.app/run-my-function'
service_account_email = '12345@cloudbuild.gserviceaccount.com'

parent = client.queue_path(project, location, queue)
task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": url,
        "oidc_token": {"service_account_email": service_account_email},
    }
}
response = client.create_task(parent=parent, task=task)
However, this requires hard-coding the service name https://cloudrunservice-abcdefg-ca.b.run.app and defining an auxiliary endpoint /run-my-function that can be called via HTTP.
In your code you can get the Cloud Run URL without hardcoding it or setting it in an environment variable.
You can have a look at a previous article that I wrote, in the graceful termination part. I provide working code in Go; it is not so difficult to re-implement in Python.
Here is the principle:
Get the region and the project number from the metadata server. Keep in mind that Cloud Run has specific metadata, like the region.
Get the K_SERVICE env var (it's a standard Cloud Run env var).
Perform a call to the Cloud Run REST API to get the service detail, customizing the request with the data obtained previously.
Extract the status.url JSON entry from the response.
Now you have it!
Let me know if you have difficulties achieving that. I'm not good at Python, but I should be able to write that piece of code!
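A rough Python translation of those steps might look like the following. It is a sketch under assumptions: it must run inside the Cloud Run container (where the metadata server and K_SERVICE are available), and it uses the v1 Cloud Run Admin API, which exposes the URL under status.url:

import os
import requests

METADATA = "http://metadata.google.internal/computeMetadata/v1"
HEADERS = {"Metadata-Flavor": "Google"}

def get_own_service_url():
    # The region metadata comes back as "projects/NUMBER/regions/REGION".
    region = requests.get(METADATA + "/instance/region", headers=HEADERS).text.split("/")[-1]
    project = requests.get(METADATA + "/project/project-id", headers=HEADERS).text
    service = os.environ["K_SERVICE"]  # standard Cloud Run env var
    # Access token for the service's runtime service account.
    token = requests.get(METADATA + "/instance/service-accounts/default/token",
                         headers=HEADERS).json()["access_token"]
    detail = requests.get(
        "https://{}-run.googleapis.com/apis/serving.knative.dev/v1/namespaces/{}/services/{}".format(
            region, project, service),
        headers={"Authorization": "Bearer " + token}).json()
    return detail["status"]["url"]

The returned URL can then be used when building the Cloud Task, removing the hard-coded service name.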

Setting up a Python HTTP function in Firestore

I have an app that is meant to integrate with third-party apps. These apps should be able to trigger a function when data changes.
The way I was envisioning this, I would use a Node function to safely prepare data for the third parties and get the URL to call from the app's configuration in Firestore. I would call that URL from the Node function and wait for it to return, updating results as necessary (actually, triggering a push notification). These third-party functions would tend to be Python functions, so my demo should be in Python.
I have the initial Node function and Firestore set up, and I am currently triggering an ECONNREFUSED, because I don't know how to set up the third-party function.
Let's say this is the function I need to trigger:
def hello_world(request):
    request_json = request.get_json()
    if request_json and 'name' in request_json:
        name = request_json['name']
    else:
        name = 'World'
    return 'Hello, {}!\n'.format(name)
Do I need to set up a separate Google Cloud account to host this function, or can I include it alongside my Firestore functions? If so, how do I deploy it? Typically with my Node functions, I run firebase deploy and it automagically finds my functions in my index.js file.
If you're asking whether Cloud Functions that are triggered by Cloud Firestore can co-exist in a project with Cloud Functions that are triggered by HTTP(S) requests, then the answer is: yes, they can. There is no need to set up a separate (Firebase or Cloud) project for each function type.
However: when you deploy your Cloud Functions through the Firebase CLI with firebase deploy, it removes any functions it finds in the project that are not in the code being deployed. If you have functions in both Python and Node.js, there is never a single codebase that contains both, so a blanket deploy would always delete some of your functions. In that case, you should use the granular deploy option of the Firebase CLI (e.g. firebase deploy --only functions:myFunction).

Trigger Python code from Google Spreadsheets?

In Excel you can create user-defined functions in Python using PyXLL. I have been moving to Google Sheets and using Google Apps Script, but the libraries are so much bigger and better in Python that I wish there were a way to build user-defined functions in Python from Google Sheets. There are ways for Python to interact with Google Sheets, like gspread. Is there a way to run Python on Google App Engine and have the sheet trigger that code? What other ways are there to trigger Python code from Google Sheets?
You should create a web service in GAE, which can then be called using the Google Apps Script UrlFetchApp class.
This is how I usually integrate a third-party app with an Apps Script app.
In a spreadsheet container script you can write code like:
function myFunction() {
  // your code
  // Call the web service
  var response = UrlFetchApp.fetch('my_webservice_url', {payload: '...', method: 'POST'});
  Logger.log(response.getContentText());
  // your code based on the response
}
The code above can be triggered by a time-driven trigger in Apps Script based on some condition.
Deploy your Python code as a Cloud Function:
https://cloud.google.com/functions/docs/writing/http
Then call your function with UrlFetchApp as shown above.
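A minimal Python HTTP Cloud Function that the Apps Script snippet above could POST to might look like this (a sketch; the function name and logic are placeholders):

def handle_sheet_event(request):
    # 'request' is a Flask request object in Python HTTP Cloud Functions.
    payload = request.get_data(as_text=True)  # body sent by UrlFetchApp
    # ...run the Python-side logic here, based on the payload...
    return 'processed: ' + payload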
One way is to have some code that reads the spreadsheet all the time and then runs some other code when a condition is met.
Without GAE, you could use the following code:
# http://code.google.com/p/gdata-python-client/downloads/list
import gdata.spreadsheet.service as s

spreadsheet_key = 'spreadsheetkey'  # https://docs.google.com/spreadsheet/ccc?key=<spreadsheet key>&usp=sharing#gid=0
worksheet_key = 'od6'  # first tab

gd_client = s.SpreadsheetsService()
gd_client.email = 'user@gmail.com'
gd_client.password = 'password'
gd_client.ProgrammaticLogin()

list_feed = gd_client.GetListFeed(spreadsheet_key, worksheet_key)
for entry in list_feed.entry:
    # read cell values and then do something if the condition is met
    pass
If you wanted the spreadsheet to run code in a GAE app, you could publish the spreadsheet and construct the URL of the spreadsheet's JSON feed like this: https://spreadsheets.google.com/feeds/list/(spreadsheetkey)/od6/public/values?alt=json
This address can be accessed by the app, the cell values can be read, and some code can be triggered.
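For illustration, polling that published JSON feed from Python might look like this (a sketch; it uses the legacy feed format quoted above, and the spreadsheet key is a placeholder):

import requests

SPREADSHEET_KEY = 'spreadsheetkey'  # placeholder
feed_url = ('https://spreadsheets.google.com/feeds/list/'
            + SPREADSHEET_KEY + '/od6/public/values?alt=json')

feed = requests.get(feed_url).json()
for entry in feed['feed']['entry']:
    # In the legacy list feed, each column appears as a 'gsx$<name>' key.
    row = {k[4:]: v['$t'] for k, v in entry.items() if k.startswith('gsx$')}
    # ...do something if the condition is met...
    print(row)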
The approach is the same with both ideas: some code monitors the spreadsheet, and when a condition is met, some other code is triggered. I'm not sure how you could run the code (in a GAE app, say) purely from within the Google Spreadsheet when the condition is met.
