How to read Google Sheets from Google App Engine - Python

I am trying to read a Google Sheet from Google App Engine using the gspread-pandas package. On a local machine, we usually store google_secret.json at a specific path. So on App Engine I saved the file at /root/.config/gspread-pandas/google_secret.json, but even then I am getting the same error as below:
Please download json from https://console.developers.google.com/apis/credentials and save as /root/.config/gspread_pandas/google_secret.json
To add to this, I have created the credentials and read them from GCS, and I am now trying to pass the resulting dict to the Spread class of gspread-pandas. But since the authorization code needs to be stored the first time App Engine accesses the sheet, it is still failing on App Engine.
Thank you in advance

In order to achieve your technical purpose - doing operations with Google Sheets from App Engine - I would recommend using the Google Sheets API. This API lets you read and write data, format text and numbers, create charts, and more.
There is a quickstart for Python for this API. If you still encounter compatibility issues, or you have a hard time getting/storing the credentials in a persistent way, you can always opt for the App Engine flexible environment, which offers you more freedom.
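For reference, reading a range with the quickstart's client library boils down to a single values().get call. A minimal sketch (the spreadsheet ID and range passed in are placeholders, and credential setup is elided here - follow the quickstart for that part):

```python
def rows_from_response(response):
    """The Sheets API returns a dict like {'values': [[...], ...]};
    a missing 'values' key means the requested range was empty."""
    return response.get("values", [])


def read_range(spreadsheet_id, a1_range):
    # Requires the google-api-python-client package and application
    # default credentials; imported here so the helper above stays
    # usable without those packages installed.
    from googleapiclient.discovery import build

    service = build("sheets", "v4")
    result = (
        service.spreadsheets()
        .values()
        .get(spreadsheetId=spreadsheet_id, range=a1_range)
        .execute()
    )
    return rows_from_response(result)
```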

So I used a service account key and read the service account key from GCS. This solved my problem.
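A sketch of that approach, assuming the key file lives in a GCS bucket (the bucket and blob names are whatever you chose) and a gspread-pandas version that accepts ready-made credentials via the creds argument:

```python
import json


def parse_service_account_key(raw_json):
    """Parse and sanity-check a service-account key downloaded from GCS."""
    info = json.loads(raw_json)
    if info.get("type") != "service_account":
        raise ValueError("not a service-account key file")
    return info


def open_sheet(bucket_name, blob_name, sheet_name):
    # Requires google-cloud-storage, google-auth and gspread-pandas;
    # imported here so the parser above stays importable without them.
    from google.cloud import storage
    from google.oauth2.service_account import Credentials
    from gspread_pandas import Spread

    raw = storage.Client().bucket(bucket_name).blob(blob_name).download_as_bytes()
    creds = Credentials.from_service_account_info(
        parse_service_account_key(raw),
        scopes=[
            "https://www.googleapis.com/auth/spreadsheets",
            "https://www.googleapis.com/auth/drive",
        ],
    )
    # Passing creds directly means no google_secret.json on disk is needed.
    return Spread(sheet_name, creds=creds)
```

Remember to share the spreadsheet with the service account's client_email, since a service account only sees sheets that have been shared with it.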

Related

How to transition from using Google Sheets as a database? (Needs to be manually updated by non-tech employees)

I am currently trying to help a small business transition from using Google Sheets as a database to something more robust and scalable - preferably staying within Google services. I've looked into Google Cloud Storage and BigQuery; however, there are employees that need to manually enter new data, so anything in GCP won't be user-friendly for non-technical people. I was thinking of having employees still manually update the Google Sheets, and writing a Python program to automatically update GCS or BigQuery, but the issue is that Google Sheets is extremely slow and cannot handle the amount of data that's currently stored in there.
Has anyone faced a similar issue and have any ideas/suggestions? Thank you so much in advance :)
What you might be able to do is save the Google Sheet as a .csv file and then import it into BigQuery. From there, maybe the employees can use simple commands to insert data. Please note that this question is very opinion-based and anyone can suggest various ways to achieve what you want.
Create a web app hosted on App Engine for the front end and data entry, and connect it to Cloud SQL or BigQuery as a backend. You can create a UI in your web app where employees can access data from Cloud SQL/BigQuery if needed. Alternatively, you can use Google Forms for data entry and connect it to Cloud SQL.

Update google sheet from python script

I'm trying to have a python script update a Google Sheet. The script scrapes a dynamically loaded webpage and retrieves some values.
Because the site loads dynamically, I can't use Google Apps Script. But if I want the python script to access the Google Sheet, I need a Google service account. This requires a credit card, but I don't have access to a non-prepaid one.
I could also host the python API on GAE, but this also requires a credit card.
Any ideas for how I could work around this?
I believe your goal is as follows.
You want to put the values to Google Spreadsheet using python.
In order to achieve your goal, how about the following patterns?
1. Access the Google Spreadsheet using the Sheets API with OAuth2.
When the Sheets API is used, the values can be put into the Google Spreadsheet.
In this case, you can see a sample script in the Quickstart. I think that in your situation, the use of the Google Sheets API is free of charge. If you want to increase the quota of the Sheets API, please check https://developers.google.com/sheets/api/limits.
2. Access the Google Spreadsheet using the Sheets API with a service account.
When the Sheets API is used, the values can be put into the Google Spreadsheet.
In this case, you can see sample scripts in these threads. Again, I think that in your situation, the use of the Google Sheets API is free of charge.
3. Access the Google Spreadsheet via a Web App created with Google Apps Script.
When Google Apps Script is used, the values can be put into the Google Spreadsheet; in this method, Google Apps Script acts as a wrapper.
First, create a Web App using Google Apps Script; your python script then accesses the Web App by sending the values with the requests module. On the Web App side, the received values are put into the Google Spreadsheet by Google Apps Script, so the Web App serves as a wrapper API for the Spreadsheet. You can access the Web App with or without an access token. This method requires only a simple modification to your python script.
In this case, you can find detailed information about Web Apps at "Web Apps" and "Taking advantage of Web Apps with Google Apps Script".
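On the python side, pattern 3 is just an HTTP POST to the deployed Web App URL. A sketch using only the standard library (the URL is a placeholder, and the JSON shape of the payload is an assumption - your doPost(e) on the Apps Script side has to parse whatever you send):

```python
import json
import urllib.request

# Placeholder: replace with your deployed Web App's /exec URL.
WEB_APP_URL = "https://script.google.com/macros/s/DEPLOYMENT_ID/exec"


def build_payload(rows):
    """Serialize the scraped rows into the JSON body the Web App will parse."""
    return json.dumps({"values": rows}).encode("utf-8")


def post_rows(rows, url=WEB_APP_URL):
    req = urllib.request.Request(
        url,
        data=build_payload(rows),
        headers={"Content-Type": "application/json"},
    )
    # The Web App (deployed as accessible to "Anyone, even anonymous")
    # appends the rows to the spreadsheet and returns doPost()'s response.
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```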

backing up data from app engine datastore as spreadsheet or csv

I am making a survey website on Google App Engine using Python. To save the survey form data I am using the NDB Datastore. After the survey I have to export it as a spreadsheet or CSV. How can I do that?
Thanks.
You need to take a look at the Google Spreadsheets API. Google it, try it, and come back when something specific doesn't work.
Also consider using Google Forms instead, which already does what you want (saves responses to a spreadsheet).
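If you do export from the Datastore yourself, the CSV part is plain python. A sketch that takes the survey responses as dicts - in a real handler they would come from an NDB query (e.g. calling to_dict() on each entity), and the field names here are made up:

```python
import csv
import io


def responses_to_csv(responses, fieldnames):
    """Render a list of survey-response dicts as CSV text."""
    buf = io.StringIO()
    # extrasaction="ignore" skips entity properties not listed in fieldnames.
    writer = csv.DictWriter(buf, fieldnames=fieldnames, extrasaction="ignore")
    writer.writeheader()
    for row in responses:
        writer.writerow(row)
    return buf.getvalue()
```

The resulting string can be served with a text/csv Content-Type for download, or uploaded to a spreadsheet via the API.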

Use oauth to connect GAE python application to google's drive/docs/spreadsheet

I'd like to allow my Google App Engine application to connect to a client's Google Spreadsheet on their Google Drive. I've spent the last two and a half days trying, and I've gotten nowhere. Half of the GAE Python documentation seems to be out of date. For example, some of the examples use webapp, and they don't work until I change them to webapp2, but even that doesn't always work.
I created an OAuth 2.0 client (not really sure what to call it) at:
https://code.google.com/apis/console/
So now I have a Client ID and Client Secret, but one doc talked about a CONSUMER_KEY and CONSUMER_SECRET. Are they the same thing?
I followed this doc to use OAuth to read my tasks (I know it's a different API), but I couldn't figure out step/Task 3. I'm not sure if I have all of the files/libraries needed to connect using OAuth. I have the gdata-2.0.17 files, and I know how to connect to Drive and Spreadsheets by hard-coding the login credentials, but no user is going to give me their credentials.
I don't normally ask for code, or even help, but I'm completely lost with this whole OAuth API/Service.
If someone could post some sample code that uses OAuth 2.0 and webapp2, and that you have tested, that would be awesome.
If someone could link me to a sample GAE Python project that can authenticate with Google's servers and connect to the user's spreadsheets using OAuth 2.0 and webapp2, I'd be over the moon.
A complete example application using Google Drive from GAE is explained in this article.
See Retrieving Authenticated Google Data Feeds with Google App Engine (Python) if you need to access the spreadsheet content.
The samples in this article use the Google Documents List API, but they could easily be adapted to use the spreadsheets scope and the spreadsheet client or service.
If you only need to list the files, I would recommend using Drive, as #SebastionKreft suggested.

Google App Engine + Google Cloud Storage + Sqlite3 + Django/Python

Since it is not possible to access MySQL remotely on GAE without Google Cloud SQL, could I put a sqlite3 file on Google Cloud Storage and access it through GAE with django.db.backends.sqlite3?
Thanks.
No. SQLite requires native code libraries that aren't available on App Engine.
Since you cannot modify files in place in Google Cloud Storage (see this doc for further details), I don't think it would be a good idea to even try it. Additionally, the overhead of downloading the whole sqlite file, probably on every request, would make your app really slow.
Google Cloud SQL is meant for this; why don't you want to use it?
If you have every frontend instance load the DB file, you'll have a really hard time synchronizing them. It just doesn't make sense. Why would you want to do this?