Trigger Python code from Google Spreadsheets? - python

In Excel you can create user-defined functions in Python using PyXLL. I have been moving to Google Spreadsheets and using Google Apps Script, but the libraries are so much bigger and better in Python that I wish there were a way to build user-defined functions in Python from Google Spreadsheets. There are ways to interact with Google Sheets from Python, like gspread. Is there a way to run Python on Google App Engine and have the sheet trigger that code? What other ways are there to trigger Python code from Google Spreadsheets?

You should create a webservice in GAE, which can then be called from Google Apps Script using the UrlFetchApp class.
This is how I usually integrate a third-party app with an Apps Script app.
In a spreadsheet container-bound script you can write code like
function myFunction() {
  // your code
  // Call the webservice
  var response = UrlFetchApp.fetch('my_webservice_url', {payload: '...', method: 'POST'});
  Logger.log(response.getContentText());
  // your code based on response
}
The above code can be triggered by a time-driven trigger in Apps Script based on some condition.
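On the Python side, a minimal sketch of such a webservice might look like the following (hedged: this assumes a Flask app on the App Engine standard Python runtime; the route and function names are placeholders, not from the original answer):

# Minimal sketch of a GAE-hosted webservice that the UrlFetchApp call above could hit.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/my_webservice_url', methods=['POST'])
def handle():
    payload = request.get_data(as_text=True)  # the 'payload' sent by UrlFetchApp
    result = do_something_in_python(payload)  # your Python libraries do the real work here
    return jsonify({'result': result})

def do_something_in_python(payload):
    return payload.upper()  # stand-in for real work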

Deploy your Python code as a Cloud Function:
https://cloud.google.com/functions/docs/writing/http.
Then call your function with URL Fetch as shown above.
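A minimal sketch of such an HTTP Cloud Function might look like this (hedged: the function name and body are placeholders in the functions-framework style described in the linked docs):

# Minimal HTTP Cloud Function sketch; deploy it with an HTTP trigger and call its URL
# from Apps Script with UrlFetchApp.fetch(). Names and logic are placeholders.
def my_function(request):
    data = request.get_json(silent=True) or {}
    # ... run your Python libraries here ...
    return {'status': 'ok', 'received': data}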

One way is to have some code that reads the spreadsheet all the time, then runs some other code when a condition is met.
Without GAE, you could use the following code:
# http://code.google.com/p/gdata-python-client/downloads/list
import gdata.spreadsheet.service as s

spreadsheet_key = 'spreadsheetkey'  # https://docs.google.com/spreadsheet/ccc?key=<spreadsheet key>&usp=sharing#gid=0
worksheet_key = 'od6'  # first tab

gd_client = s.SpreadsheetsService()
gd_client.email = 'user@gmail.com'
gd_client.password = 'password'
gd_client.ProgrammaticLogin()

list_feed = gd_client.GetListFeed(spreadsheet_key, worksheet_key)
for entry in list_feed.entry:
    # read cell values and then do something if the condition is met
    pass
If you wanted the spreadsheet to run code in a GAE app, you could publish the spreadsheet and construct the URL of the spreadsheet's JSON feed like this: https://spreadsheets.google.com/feeds/list/(spreadsheetkey)/od6/public/values?alt=json
This address can be accessed from the app, the cell values can be read, and some code can be triggered.
The approach is the same with both ideas: some code monitors the spreadsheet, and when some condition is met, some other code is triggered. I'm not sure how you could run the code (in a GAE app, say) when the condition is met purely from within the Google Spreadsheet.
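For the polling idea, a rough sketch in Python could look like this (hedged: it uses the published JSON feed format from the URL above with a placeholder key and column name; the legacy feed has since been retired, so treat it as illustrative of the approach rather than a current recipe):

# Poll the published JSON feed and trigger other code when a condition is met.
import json
import urllib.request

FEED_URL = 'https://spreadsheets.google.com/feeds/list/spreadsheetkey/od6/public/values?alt=json'

def check_sheet():
    with urllib.request.urlopen(FEED_URL) as resp:
        feed = json.load(resp)
    for entry in feed['feed']['entry']:
        value = entry['gsx$somecolumn']['$t']  # placeholder column; keys are lower-cased headers
        if value == 'trigger':
            run_other_code(value)  # whatever should happen when the condition is met

def run_other_code(value):
    print('Condition met:', value)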

Related

How to save a Google Sheets chart as an image with Python

I'm using the Google Sheets API with Python and I can access the sheet and the cells now. However, I don't know how to get the chart in the sheet.
client = gspread.service_account_from_dict(creds)
workbook = client.open('HR - 8/16-8/31 Data')
sheet = workbook.get_worksheet(0)
H1 = sheet.acell('B3').value
I found this question: How to download charts in PNG from google sheet, which mentions I can use the getCharts() function, but that is for Apps Script (JavaScript) only. Is there a similar function in Python?
Currently the API doesn't have a method to do this. The charts overview documentation explains how to manipulate and create them, but not how to export them. Reading the data also only gives you a JSON representation of it, not an image. It seems that the Apps Script getCharts() leverages other server-side functions that are not in the regular API.
This is documented as a feature request in Google's issue tracker here, so you can +1 it if you want. In that thread a possible workaround was posted: if you publish your file and you know the chart ID, you can build a URL that generates it as an image:
https://docs.google.com/spreadsheets/d/e/<publish-id>/pubchart?oid=<chart-id>&format=image
Gspread doesn't seem to have methods to do this, so you'll have to use the Google APIs. In their Python Quickstart you can find a sample to set up authorization, and you can use spreadsheets.get(), which gives you all the data from the spreadsheet, including the chart IDs. If you only have a single chart that you want to export periodically, you can get the ID once from the UI and just retrieve it with Python. The caveat is that you have to publish the Sheet, which you won't want to do with sensitive information.
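Putting the two pieces together, a hedged sketch in Python might look like this (assuming a service-account credential file and that the Sheet is published to the web; SPREADSHEET_ID and PUBLISH_ID are placeholders):

# List chart IDs with spreadsheets.get(), then download a published chart as a PNG.
import requests
from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_service_account_file(
    'service_account.json',
    scopes=['https://www.googleapis.com/auth/spreadsheets.readonly'])
service = build('sheets', 'v4', credentials=creds)

# spreadsheets.get returns sheet metadata, including embedded charts and their IDs
meta = service.spreadsheets().get(spreadsheetId='SPREADSHEET_ID').execute()
chart_ids = [chart['chartId']
             for sheet in meta.get('sheets', [])
             for chart in sheet.get('charts', [])]

# Build the published-chart URL from the answer above and save the first chart
url = ('https://docs.google.com/spreadsheets/d/e/PUBLISH_ID/pubchart?oid='
       + str(chart_ids[0]) + '&format=image')
with open('chart.png', 'wb') as fh:
    fh.write(requests.get(url).content)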
As another alternative you could build an Apps Script Web App which uses the getCharts() method in the answer that you linked, and just send a POST message from your Python app and have Apps Script return the image in its response.

Cloud Run: endpoint that runs a function as background job

I am trying to deploy a REST API on Cloud Run where one endpoint launches an async job. The job is defined inside a function in the code.
It seems one way to do it is to use Cloud Tasks, but this would mean making a self-call to another endpoint of the deployed API. Specifically, creating an auxiliary endpoint that contains the job code (e.g. /run-my-function) and another one that enqueues a Cloud Tasks task which launches /run-my-function?
Is this the right way to do it, or have I misunderstood something? If it is the right way, how do I specify the URL of the /run-my-function endpoint without explicitly hard-coding the deployed Cloud Run URL?
The code for the endpoint that enqueues the task calling the run-my-function endpoint would be:
from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()

project = 'myproject'
queue = 'myqueue'
location = 'mylocation'
url = 'https://cloudrunservice-abcdefg-ca.b.run.app/run-my-function'
service_account_email = '12345@cloudbuild.gserviceaccount.com'

parent = client.queue_path(project, location, queue)
task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": url,
        "oidc_token": {"service_account_email": service_account_email},
    }
}
response = client.create_task(parent=parent, task=task)
However, this requires hard-coding the service name https://cloudrunservice-abcdefg-ca.b.run.app and defining an auxiliary endpoint /run-my-function that can be called via HTTP.
In your code you are able to get the Cloud Run URL without hardcoding it or setting it in an environment variable.
You can have a look at a previous article that I wrote, in the graceful-termination part. I provide working code in Go; it is not too difficult to re-implement in Python.
Here is the principle:
Get the region and the project number from the metadata server. Keep in mind that Cloud Run has specific metadata entries, like the region
Get the K_SERVICE env var (it's a standard Cloud Run env var)
Perform a call to the Cloud Run REST API to get the service detail, customizing the request with the data obtained previously
Extract the status.url JSON entry from the response.
Now you have it!
Let me know if you have difficulties achieving that. I'm not good at Python, but I should be able to write that piece of code!
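A rough Python sketch of those steps (hedged: it assumes the code runs inside the Cloud Run container, so the metadata server and the K_SERVICE variable are available, and that the runtime service account is allowed to read the service via the Cloud Run Admin API):

# Resolve the service's own URL from inside a Cloud Run container.
import os
import requests

METADATA = 'http://metadata.google.internal/computeMetadata/v1'
HEADERS = {'Metadata-Flavor': 'Google'}

def get_service_url():
    # 1. Region and project number from the metadata server
    #    (on Cloud Run the region entry looks like "projects/<number>/regions/<region>")
    region = requests.get(METADATA + '/instance/region', headers=HEADERS).text.split('/')[-1]
    project = requests.get(METADATA + '/project/numeric-project-id', headers=HEADERS).text
    # 2. K_SERVICE is a standard Cloud Run environment variable
    service = os.environ['K_SERVICE']
    # Access token for the Cloud Run Admin API, also from the metadata server
    token = requests.get(METADATA + '/instance/service-accounts/default/token',
                         headers=HEADERS).json()['access_token']
    # 3. Call the Cloud Run REST API to get the service detail
    resp = requests.get(
        'https://{}-run.googleapis.com/apis/serving.knative.dev/v1/namespaces/{}/services/{}'
        .format(region, project, service),
        headers={'Authorization': 'Bearer ' + token})
    # 4. Extract status.url
    return resp.json()['status']['url']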

Calling a Google Cloud Function from Google Apps Script

I have a Google Cloud Function that I would like to call from my Google Apps Script on a Google Form submission.
The process will be: 1) the user submits the Google Form, 2) a trigger (onFormSubmit) runs the Apps Script function, 3) the Apps Script function calls the Cloud Function.
So far:
The script trigger works; in the logs it's listening correctly.
The Cloud Function works; I tested it in the Cloud Function testing interface and when I run it from there, it does what I need it to do, which is to update a Google Sheet as well as upload data to BigQuery.
The problem comes from calling that function from the Apps Script that I have associated with my Google Form submission trigger. There seems to be no communication there, as the Cloud Function logs don't show anything happening when the trigger fires.
This is my Apps Script code:
function onSubmit() {
  var url = "myurl";
  const token = ScriptApp.getIdentityToken();
  var options = {
    'method': 'get',
    'headers': {"Authorization": "Bearer " + token}
  };
  var data = UrlFetchApp.getRequest(url, options);
  return data;
}
And my Cloud Function is an HTTP one in Python and starts with:
def numbers(request):
Some troubleshooting:
When I test it, the execution log shows no errors
If I try to change UrlFetchApp.getRequest to UrlFetchApp.fetch, or change getIdentityToken to getOAuthToken, I get a 401 error in both cases
I added the following to my oauthScopes:
"openid",
"https://www.googleapis.com/auth/cloud-platform",
"https://www.googleapis.com/auth/script.container.ui",
"https://www.googleapis.com/auth/script.external_request",
"https://www.googleapis.com/auth/documents"```
I'm running both from the same Google Cloud account
I added myself to permissions in Cloud Function settings too
Any ideas of why the two aren't communicating would be appreciated!
I was able to resolve this, in case anyone has a similar issue. Since my email address was associated with an organizational account, my Apps Script and GCP didn't allow the correct permissions.
In the Apps Script settings, I couldn't change the GCP project associated with that script because the GCP project was outside of my organization. Once I set up the Cloud Function in my organization's GCP project, I was able to change the project manually in the settings, and my function worked properly on the trigger.

Trigger Python when a new row is added to Google Sheet

I have connected Python to Google Sheets through the API under Google Cloud Platform. My project requires me to retrieve new data whenever it is added to the Google Sheet. Is there a way to trigger Python code to run and get the last row of the Google Sheet?
This depends on how your Python script is run.
For example, if it's a Cloud Function, you can run it pretty easily with something like
function executePythonFunction() {
  UrlFetchApp.fetch('<YOUR-PYTHON-CLOUD-FUNCTION-URL>');
}
by creating an installable trigger for the Change event.
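On the Python side, a hedged sketch of such a Cloud Function that fetches the last row with gspread could look like this (the spreadsheet title and service-account file are placeholders, not details from the question):

# Hedged sketch: HTTP Cloud Function that reads the last row of a sheet with gspread.
import gspread

def get_last_row(request):
    client = gspread.service_account(filename='service_account.json')  # placeholder credentials file
    sheet = client.open('My Spreadsheet').sheet1  # placeholder spreadsheet title
    rows = sheet.get_all_values()
    last_row = rows[-1] if rows else []
    # ... do whatever your project needs with last_row ...
    return {'last_row': last_row}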

Upload content from a BIG CSV to CloudSQL using App Engine Python

I'm pretty new to Google App Engine.
What I need to do is upload a pretty large CSV to Cloud SQL.
I've got an HTML page with a file-upload module; the uploaded file goes to the Blobstore.
After that I open the CSV with the Blobstore reader and execute each line against Cloud SQL using cursor.execute("insert into table values"). The problem is that the HTTP request can only run for a minute, and not all of the data gets inserted in that short a time. It also keeps the screen in a loading state throughout, which I would like to avoid by making the code run in the back end, if that's possible.
I also tried going the "LOAD DATA LOCAL INFILE" way.
"LOAD DATA LOCAL INFILE" works from my local machine when I'm connected to Cloud SQL via the terminal, and it's pretty quick.
How would I go about using this within App Engine?
Or is there a better way to import a large CSV into Cloud SQL through the Blobstore or Google Cloud Storage directly after uploading the CSV from the HTML page?
Also, is it possible to use Task Queues with the Blobstore and then insert the data into Cloud SQL on the backend?
I have used a similar approach for the Datastore rather than Cloud SQL, but the same approach can be applied to your scenario.
Set up a non-default module (previously called a backend, deprecated now) of your application
Send an HTTP request which will trigger the module endpoint through a task queue (to avoid the 60-second deadline)
Use MapReduce with the CSV as input and do the operation on each line of the CSV within the map function (to avoid memory errors and to resume the pipeline from where it left off in case of any errors during the operation)
EDIT: Elaborating on MapReduce per the OP's request, and also eliminating the use of the task queue
Read the MapReduce basics from the docs found here
Download the dependency folders for MapReduce to work (simplejson, graphy, mapreduce)
Download this file to your project folder and save it as "custom_input_reader.py"
Now copy the code below to your main_app.py file.
main_app.py
from mapreduce import base_handler
from mapreduce import mapreduce_pipeline
from custom_input_reader import GoogleStorageLineInputReader

def testMapperFunc(row):
    # do processing with the csv row
    return

class TestGCSReaderPipeline(base_handler.PipelineBase):
    def run(self):
        yield mapreduce_pipeline.MapPipeline(
            "gcs_csv_reader_job",
            "main_app.testMapperFunc",
            "custom_input_reader.GoogleStorageLineInputReader",
            params={
                "input_reader": {
                    "file_paths": ['/' + bucketname + '/' + filename]
                }
            })
Create an HTTP handler which will initiate the map job
main_app.py
class BeginUpload(webapp2.RequestHandler):
    def post(self):
        # do whatever you want
        upload_task = TestGCSReaderPipeline()
        upload_task.start()
        # do whatever you want
If you want to pass any parameters, add the parameter to the "run" method and provide values when creating the pipeline object
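To tie this back to Cloud SQL, here is a rough sketch of what the per-row work inside the map function could do (hedged: this assumes a MySQL-flavoured Cloud SQL instance reachable over the App Engine unix socket with the MySQLdb driver; the instance, database, and table names are placeholders):

# Hedged sketch: insert one CSV line into Cloud SQL from within the map function.
# Opening a connection per row is only for illustration; reuse a connection in practice.
import csv
import MySQLdb

def insert_csv_row(line):
    values = next(csv.reader([line]))  # split the raw CSV line into fields
    db = MySQLdb.connect(unix_socket='/cloudsql/myproject:myinstance',  # placeholder instance
                         db='mydb', user='root')
    cursor = db.cursor()
    cursor.execute("INSERT INTO mytable VALUES (%s, %s, %s)", values)  # placeholder table/columns
    db.commit()
    db.close()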
You can try importing CSV data via the Cloud Console:
https://cloud.google.com/sql/docs/import-export?hl=en#import-csv
