Since it is not possible to access MySQL remotely on GAE without Google Cloud SQL, could I put a SQLite3 file on Google Cloud Storage and access it from GAE with django.db.backends.sqlite3?
Thanks.
No. SQLite requires native code libraries that aren't available on App Engine.
Since you cannot modify files in Google Cloud Storage (see this doc for further details), I don't think it would be a good idea to even try it. Additionally, the overhead of downloading the whole SQLite file (probably on every request) would make your app really slow.
Google Cloud SQL is meant for exactly this; why don't you want to use it?
If you have every frontend instance load the DB file, you'll have a really hard time synchronizing them. It just doesn't make sense. Why would you want to do this?
Related
I am currently trying to help a small business transition from using Google Sheets as a database to something more robust and scalable, preferably staying within Google services. I've looked into Google Cloud Storage and BigQuery; however, there are employees who need to manually enter new data, so anything in GCP won't be user-friendly for non-technical people. I was thinking of having employees still manually update the Google Sheets and writing a Python program to automatically update GCS or BigQuery, but the issue is that Google Sheets is extremely slow and cannot handle the amount of data that's currently stored in it.
Has anyone faced a similar issue, and do you have any ideas/suggestions? Thank you so much in advance :)
What you might be able to do is save the Google Sheet as a .csv file and then import it into BigQuery. From there, maybe the employees can use simple commands to insert data. Please note that this question is very opinion-based and anyone can suggest various ways to achieve what you want.
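For the .csv-to-BigQuery step, something along these lines could work. This is a minimal sketch assuming the google-cloud-bigquery client library with default credentials; the project, dataset, table, and file names are made up for illustration:

# Sketch: load a CSV exported from Google Sheets into a BigQuery table.
# Assumes `pip install google-cloud-bigquery` and application default credentials.
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical fully-qualified table id -- replace with your own.
table_id = "my-project.sheet_data.orders"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row exported from the sheet
    autodetect=True,       # let BigQuery infer the schema
)

with open("orders.csv", "rb") as source_file:
    load_job = client.load_table_from_file(source_file, table_id, job_config=job_config)

load_job.result()  # wait for the load job to finish
print("Loaded", client.get_table(table_id).num_rows, "rows into", table_id)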
Create a web app hosted on App Engine for the front end and data entry, and connect it to Cloud SQL or BigQuery as a backend. You can create a UI in your web app where employees can access data from Cloud SQL/BQ if needed. Alternatively, you can use Google Forms for data entry and connect it to Cloud SQL.
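For the Cloud SQL half of that setup, a rough sketch follows, assuming App Engine standard with a MySQL-flavoured Cloud SQL instance reachable over the Unix socket App Engine exposes, and the pymysql driver; the instance connection name, database, credentials, and table are all placeholders:

# Sketch: connect an App Engine web app to Cloud SQL (MySQL) over its Unix socket.
# All names below are placeholders.
import pymysql

def get_connection():
    return pymysql.connect(
        user="appuser",
        password="secret",
        database="inventory",
        unix_socket="/cloudsql/my-project:us-central1:my-instance",
    )

def insert_row(sku, quantity):
    conn = get_connection()
    try:
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO stock (sku, quantity) VALUES (%s, %s)",
                (sku, quantity),
            )
        conn.commit()
    finally:
        conn.close()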
I am trying to read a Google Sheet from Google App Engine using the gspread-pandas package. On a local machine, we usually store google_secret.json in a specific path. But when it comes to App Engine, I saved the file at /root/.config/gspread-pandas/google_secret.json, and even then I am getting the error below:
Please download json from https://console.developers.google.com/apis/credentials and save as /root/.config/gspread_pandas/google_secret.json
To add to this, I have created the credentials and am now trying to pass the credentials dict to the Spread class of gspread-pandas. But since the authorization code needs to be stored the first time App Engine accesses the sheet, the failure is still happening on Google App Engine.
Thank you in advance
In order to achieve your technical purpose, doing operations on Google Sheets from App Engine, I would recommend using the Google Sheets API. This API lets you read and write data, format text and numbers, create charts, and use many other features.
Here is a quickstart for Python for this API. If you still encounter compatibility issues, or you have a hard time getting/storing the credentials in a persistent way, you can always opt for App Engine Flexible, which offers you more freedom.
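For reference, a minimal read-only sketch with that API, assuming the google-api-python-client library and a service account that the spreadsheet has been shared with; the key file path, spreadsheet ID, and range are placeholders:

# Sketch: read a range from a spreadsheet with the Google Sheets API v4.
# Assumes `pip install google-api-python-client google-auth`.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/spreadsheets.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service_account.json", scopes=SCOPES  # placeholder key file
)

service = build("sheets", "v4", credentials=creds)
result = (
    service.spreadsheets()
    .values()
    .get(spreadsheetId="YOUR_SPREADSHEET_ID", range="Sheet1!A1:D10")
    .execute()
)
for row in result.get("values", []):
    print(row)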
So I used a service account key and read it from GCS. This solved my problem.
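Roughly, that approach could look like the sketch below, assuming the bucket and object names shown (they are placeholders) and a gspread-pandas version whose Spread class accepts a creds argument:

# Sketch: load a service account key from Cloud Storage and hand it to gspread-pandas.
# Bucket, object, and spreadsheet names are placeholders.
import json

from google.cloud import storage
from google.oauth2 import service_account
from gspread_pandas import Spread

SCOPES = [
    "https://www.googleapis.com/auth/spreadsheets",
    "https://www.googleapis.com/auth/drive",
]

# Download the key file that was uploaded to Cloud Storage.
blob = storage.Client().bucket("my-config-bucket").blob("google_secret.json")
key_info = json.loads(blob.download_as_bytes())  # download_as_string() on older clients

creds = service_account.Credentials.from_service_account_info(key_info, scopes=SCOPES)

# Pass the credentials directly instead of relying on the on-disk config path.
spread = Spread("My Spreadsheet", creds=creds)
df = spread.sheet_to_df()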
Is there any software library that provides an interface for storing and querying data like the Google App Engine Datastore, but uses a local file or service instead of running on App Engine?
The specific features I am looking for are:
Stores data as Entities with Named Properties
Query support
Atomic transactions
Python language bindings
Runs on my local machine
either stores to a single file
or connects to a local database service
Free and open source
Thanks
You can also check MongoDB. It is an open source document-oriented database system.
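For a feel of how the requirements map onto it, here is a tiny sketch using the pymongo driver against a local mongod; the database, collection, and field names are made up. Documents play the role of entities with named properties, and queries are expressed as dicts:

# Sketch: entity-like documents and queries with pymongo against a local mongod.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # local service
db = client.bookmarks_db                           # placeholder database name

# An "entity" with named properties.
db.bookmarks.insert_one({"title": "App Engine docs", "tags": ["gae", "python"], "visits": 3})

# Query support.
for doc in db.bookmarks.find({"tags": "python"}).sort("visits", -1):
    print(doc["title"], doc["visits"])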
You may also want to check out AppScale (http://www.appscale.com). It lets you run your App Engine apps without modification outside of Google (on your laptop, on your local cluster / behind your firewall, or in Amazon EC2). AppScale meets each of the requirements you list here. It automatically installs/configures/manages the datastore service (and all other APIs/services) for your apps to use, so you don't have to.
Have a look at ZODB; not exactly alike, but similar: http://www.zodb.org/. A small usage sketch follows the feature list below.
from the docs
Some of the features that ZODB brings to you:
Transparent persistence for Python objects
Full ACID-compatible transaction support (including savepoints)
History/undo ability
Efficient support for binary large objects (BLOBs)
Pluggable storages
Scalable architecture
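This is a minimal sketch, assuming a local data.fs file and an example Bookmark class (both names are made up): objects subclassing persistent.Persistent are stored transparently, and changes become durable with transaction.commit().

# Sketch: transparent persistence with ZODB backed by a single local file.
import persistent
import transaction
import ZODB
import ZODB.FileStorage

class Bookmark(persistent.Persistent):
    def __init__(self, title, url):
        self.title = title
        self.url = url

storage = ZODB.FileStorage.FileStorage("data.fs")  # single-file storage
db = ZODB.DB(storage)
connection = db.open()
root = connection.root()

root["bookmarks"] = [Bookmark("App Engine docs", "https://cloud.google.com/appengine")]
transaction.commit()  # atomic transaction

print(root["bookmarks"][0].title)
connection.close()
db.close()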
I created a simple bookmarking app using django which uses sqlite3 as the database backend.
Can I upload it to appengine and use it? What is "Django-nonrel"?
Unfortunately, no you can't. Google App Engine does not allow you to write files, and that is needed by SQLite.
Until recently, it had no support for SQL at all, preferring a home-grown solution (see the "CAP theorem" for why). This motivated the creation of projects like "Django-nonrel", a version of Django that does not require a relational database.
Recently, they opened a beta service that offers a MySQL database. But beware that it is fundamentally less reliable, and that it is probably going to be expensive.
EDIT: As Nick Johnson observed, this new service (Google Cloud SQL) is fundamentally less scalable, but not fundamentally less reliable.
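If you do go the Cloud SQL route with Django, the change is mostly a settings one. A hedged sketch is below, using the standard MySQL backend and the Unix socket App Engine exposes for Cloud SQL; the instance connection name, database, and credentials are placeholders:

# settings.py sketch: point Django at Cloud SQL (MySQL) instead of sqlite3.
# Instance connection name, database, and credentials are placeholders.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "bookmarks",
        "USER": "appuser",
        "PASSWORD": "secret",
        # On App Engine, the Cloud SQL instance is exposed as a Unix socket:
        "HOST": "/cloudsql/my-project:us-central1:my-instance",
        "PORT": "",
    }
}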
For my little framework Pyxer I would like to be able to use the Google App Engine datastore outside of App Engine projects as well, because I'm now used to this ORM pattern, and it is nice for little quick hacks. I cannot use Google App Engine for all of my projects because of its limitations on file size and number of files.
A great alternative would also be a project that provides an ORM with the same naming as the App Engine datastore. I also like the GQL approach very much, since it is a nice combination of ORM and SQL patterns.
Any ideas where or how I might find such a solution? Thanks.
Nick Johnson, from the App Engine team himself, has a blog post listing some of the alternatives, including his BDBDatastore.
However, that assumes you want to use exactly the same ORM that you use now in app engine. There are tons of ORM options in general out there, though I am not familiar with the state of the art in Python. This question does seem to address the issue though.
You might also want to look at AppScale, which is "a platform that allows users to deploy and host their own Google App Engine applications".
It's probably overkill for your purposes, but definitely something to look into.
There is also the Remote API, which the bulkloader tool uses to upload data to and download data from the Datastore.
Maybe it could be used to let applications that are not hosted on App Engine still use the Datastore there.
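A rough outline of that idea with the Python SDK's Remote API stub is below; the app ID and model are placeholders, and the exact ConfigureRemoteApi* call differs between SDK versions, so treat it as a sketch rather than a drop-in script:

# Sketch: use the Datastore from a script running outside App Engine via the Remote API.
# Requires the App Engine Python SDK locally and the remote_api handler enabled in the app.
from google.appengine.ext import ndb
from google.appengine.ext.remote_api import remote_api_stub

# Authenticates with OAuth and routes datastore calls to the deployed app.
remote_api_stub.ConfigureRemoteApiForOAuth(
    "my-app-id.appspot.com", "/_ah/remote_api"  # placeholder app id
)

class Bookmark(ndb.Model):
    title = ndb.StringProperty()
    url = ndb.StringProperty()

Bookmark(title="App Engine docs", url="https://cloud.google.com/appengine").put()
print(Bookmark.query().count())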