How can I connect multiple Google App Engine apps to my one Django App Engine service so that I can write to another app's datastore? Is this even possible?
Directly accessing an app's datastore from another application is possible (you don't really need to write to the app itself for that!)
The fact that the other app is also a GAE app or not doesn't really matter, setting up the access control and accessing the respective datastore are the same.
I captured the details in How do I use Google datastore for my web app which is NOT hosted in google app engine?
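For a rough idea, here is a minimal sketch using the Cloud Datastore client library from the outside app; the project ID is a placeholder, and it assumes a service account with datastore access on the target project (the linked answer covers the access-control setup):
# Writes one entity of a hypothetical 'Message' kind into the other
# app's datastore; credentials come from the service account configured
# in the environment (e.g. GOOGLE_APPLICATION_CREDENTIALS).
from google.cloud import datastore

client = datastore.Client(project='other-app-project-id')  # placeholder

entity = datastore.Entity(key=client.key('Message'))
entity.update({'body': 'hello from the other app'})
client.put(entity)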
If you don't want to give direct datastore access to the outside app then you could implement an inter-app communication protocol to achieve what you want:
the app owning the datastore would act as a server for the other apps and would itself perform the datastore accesses on their behalf
the other apps would be clients, sending requests to the server app to get it to perform the desired actions
With this approach you can implement any access control/restriction scheme you want on the server side, which is not really possible with the direct datastore access method.
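As a minimal sketch of the client side, assuming the server app exposes a hypothetical /api/put endpoint protected by a shared secret:
import json
from google.appengine.api import urlfetch

SERVER_APP_URL = 'https://server-app-id.appspot.com/api/put'  # placeholder
SHARED_SECRET = 'replace-with-a-real-secret'                  # placeholder

def store_remotely(kind, fields):
    """Ask the server app to write an entity on our behalf."""
    result = urlfetch.fetch(
        url=SERVER_APP_URL,
        payload=json.dumps({'kind': kind, 'fields': fields}),
        method=urlfetch.POST,
        headers={'Content-Type': 'application/json',
                 'X-Auth-Token': SHARED_SECRET})
    return result.status_code == 200
On the server app, a matching handler would validate the token, apply whatever access rules you want, and perform the datastore write itself.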
I am trying to create a daemon Python application which will get emails from an Outlook server using the Microsoft Graph API. They have provided an excellent tutorial and documentation on how to get it done for Python web apps like Django and Flask, but I want to create a daemon script which can get the access token without using a web interface (which is what the Django sample uses).
Note: This app will only collect email from a single mailbox and will feed it into a db.
Any help is appreciated.
It really depends on what kind of security you need. You can have your daemon/service authenticate with username/password directly, or you can have it authenticate with a certificate.
There are several different authentication scenarios; take a look at the docs page.
Either way, you need to register your daemon as an app in Azure and give it permissions to the Outlook API, just as if it were a web app.
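For the daemon case, a minimal sketch of the client-credentials (app-only) flow with the msal library follows; the tenant ID, client ID, secret and mailbox address are placeholders, and the app registration needs application permissions (e.g. Mail.Read) with admin consent. A certificate can be used in place of the client secret.
import msal
import requests

app = msal.ConfidentialClientApplication(
    '<client-id>',                                          # placeholder
    authority='https://login.microsoftonline.com/<tenant-id>',
    client_credential='<client-secret>')                    # or a certificate

# Acquire an app-only token; no browser or web interface is involved.
token = app.acquire_token_for_client(
    scopes=['https://graph.microsoft.com/.default'])

# Fetch messages from the single mailbox and feed them to your db.
resp = requests.get(
    'https://graph.microsoft.com/v1.0/users/<mailbox>/messages',
    headers={'Authorization': 'Bearer ' + token['access_token']})
for message in resp.json().get('value', []):
    print(message['subject'])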
To use Google APIs, after activating them from the Google Developers Console, one needs to generate credentials. In my case, I have a backend that is supposed to consume the API server side. For this purpose, there is an option to generate what the Google page calls a "Key for server applications". So far so good.
The problem is that in order to generate the key, one has to list the IP addresses of the servers to be whitelisted. But GAE has no static IP address that I could use there.
There is an option to manually get the IPs by executing:
dig -t TXT _netblocks.google.com @ns1.google.com
However, there is no guarantee that the list is static (furthermore, it is known to change from time to time), and there is no programmatic way I could automate adding the IPs I get from dig to the Google Developers Console.
This leaves me with two choices:
Forget about GAE for this project; ironically, GAE cannot be used as a backend for Google APIs (better to use Amazon or some other solution for that), or
Program something like a watchdog over the output of the dig command that would notify me if there's a change, and then I would manually update the whitelist (no way I am going to do this - too dangerous), or allow all IPs to use the Google API granted it has my API key. Not the most secure solution, but it works.
Is there any other workaround? Can it be that GAE does not support consuming Google APIs server side?
You can use App Identity to access Google's APIs from App Engine. See: https://developers.google.com/appengine/docs/python/appidentity/. If you set up your app using the Cloud Console, it should have already added your app's identity with permission to your project, but you can always check that. From the "Permissions" tab in the Cloud Console for your project, make sure your service account is listed under "Service Accounts" (in the form your_app_id@appspot.gserviceaccount.com).
Furthermore, if you use something like the Google API client libraries for Python, you can use the bundled oauth2client library to do all of this for you, using AppAssertionCredentials to authorize the API you wish to use. See: https://developers.google.com/api-client-library/python/guide/google_app_engine#ServiceAccounts
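As a minimal sketch of that pattern in Python (the BigQuery scope is only an example, and import paths can differ between oauth2client versions):
import httplib2
from google.appengine.api import memcache
from oauth2client.appengine import AppAssertionCredentials
from apiclient.discovery import build

# The app's own service account is used; no key files or IP whitelists.
credentials = AppAssertionCredentials(
    scope='https://www.googleapis.com/auth/bigquery')
http = credentials.authorize(httplib2.Http(memcache))

# Any discovery-based Google API can be built this way once authorized.
bigquery = build('bigquery', 'v2', http=http)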
Yes, you should use App Identity. Forget about getting an IP or giving up on GAE :-) Here is an example of how to use Big Query, for example, inside a GAE application:
static {
    // initializes Big Query
    JsonFactory jsonFactory = new JacksonFactory();
    HttpTransport httpTransport = new UrlFetchTransport();
    AppIdentityCredential credential =
        new AppIdentityCredential(Arrays.asList(Constants.BIGQUERY_SCOPE));
    bigquery = new Bigquery.Builder(httpTransport, jsonFactory, credential)
        .setApplicationName(Constants.APPLICATION_NAME)
        .setHttpRequestInitializer(credential)
        .setBigqueryRequestInitializer(
            new BigqueryRequestInitializer(Constants.API_KEY))
        .build();
}
I am trying out Flask-OpenID on App Engine. Flask-OpenID uses a 'store' to save authentication information. If I specify '/some/path' to save the data, it doesn't work on App Engine, as the filesystem is read-only.
For Flask-OpenID to work, I have to write my own 'store' which uses App Engine's datastore or Cloud Storage. I don't have much idea of how to write this store. Is there any documentation available that I can follow? Any input on writing the 'store' using Flask and App Engine would be helpful.
Disclaimer: I am not the author of the framework, but I am using it every day. You can start with gae-init, which is a working example using Flask-OAuth for authentication. Login and other goodies are provided out of the box, and you can get an overview and educate yourself from the docs, which are still under construction.
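If you would rather implement the store yourself, here is a minimal sketch of a memcache-backed store implementing the python-openid OpenIDStore interface. It assumes your Flask-OpenID version lets you supply the store by overriding store_factory (check the version you are using), and the key names below are arbitrary:
from google.appengine.api import memcache
from openid.association import Association
from openid.store.interface import OpenIDStore
from flask_openid import OpenID

class MemcacheOpenIDStore(OpenIDStore):
    """Keeps OpenID associations and nonces in App Engine's memcache."""

    def storeAssociation(self, server_url, association):
        key = 'assoc:%s:%s' % (server_url, association.handle)
        memcache.set(key, association.serialize(), time=association.lifetime)

    def getAssociation(self, server_url, handle=None):
        if handle is None:
            return None  # simplification: a fresh association is negotiated
        data = memcache.get('assoc:%s:%s' % (server_url, handle))
        return Association.deserialize(data) if data else None

    def removeAssociation(self, server_url, handle):
        memcache.delete('assoc:%s:%s' % (server_url, handle))
        return True  # sketch: don't distinguish "missing" from "deleted"

    def useNonce(self, server_url, timestamp, salt):
        # memcache.add returns False if the key already exists, i.e. reuse.
        return memcache.add('nonce:%s:%s:%s' % (server_url, timestamp, salt),
                            True, time=600)

    def cleanupNonces(self):
        return 0  # memcache entries expire on their own

class GAEOpenID(OpenID):
    def store_factory(self):  # assumption: overridable in your version
        return MemcacheOpenIDStore()

oid = GAEOpenID()
A datastore-backed store follows the same interface; memcache is simply the least code for a sketch, at the cost of associations occasionally being evicted (which only forces a re-negotiation).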
We've had some good experiences building an app on Google App Engine, this first app's target audience are Google Apps users, so no issues there in terms of it being hosted on Google infrastructure.
We like it so much that we would like to investigate using it for another app, however this next project is for a client who is not really that interested in what technology it sits on, they just want it to work, and work all of the time.
In this scenario, given that we have the technology applicability and capability side covered, are there any concerns that this stuff is still relatively new and that we may not be as much "in control" as if we had it done with traditional hosting?
You are correct: you are not in as much control as with traditional hosting. However, hopefully the gains outweigh the negatives. App Engine is extremely scalable -- it runs on the same hardware that runs Google itself. How often have you visited http://google.com and had that page or a search result fail?
Although you are letting Google run your code, the code is still yours to do with as you please. With new projects like django-nonrel, you can create and run native Django apps directly on top of App Engine, and if it doesn't meet your needs down the line, it's fairly easy to take that app to an ISP that hosts Django apps (and there are plenty of those). More on this project below.
You don't have to worry about hardware, operating systems, coming up with a machine image, databases, web servers, front-end load balancers, CDNs/edge caching, software/package upgrades, license fees, etc. All these things are tangential to the web or other application that you have or will create to solve a particular problem. All this additional infrastructure is required whether you like it or not; but with App Engine, you only need to think about your app/solution and none of this extra stuff.
Obviously another thing you lose is some of your execution environment. To ensure that you're playing nicely with your cloud neighbors (resource hogging, security issues, etc.), you must execute in a sandbox, meaning your app cannot create local files, open network sockets, etc. However, App Engine provides a rich set of APIs and product features so that you at least can create meaningful apps:
scalable distributed object datastore (see below)
Memcache
URLFetch
images service (resize, crop, etc.)
users service/authentication
task queues for background processing
Django web templating
blobstore for large files
denial-of-service blacklisting
transactional tasks
datastore cursors
sending (and/or receiving) of email
sending (and/or receiving) of chat/IM/instant messages via XMPP
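For instance, a minimal sketch of deferring work with the task queue API mentioned above (the /worker URL and the payload are hypothetical):
from google.appengine.api import taskqueue
from google.appengine.ext import webapp

class EnqueueHandler(webapp.RequestHandler):
    def get(self):
        # Push a task onto the default queue; a handler mapped to
        # /worker will process it in the background later.
        taskqueue.add(url='/worker', params={'key': self.request.get('key')})
        self.response.out.write('queued')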
You also have a fully dashboarded administration console which lets you monitor your app's usage, your billing settings and history, a full dump of your quota usage, and even your application logs, which you can view or download.
To address the "main sore points" from @Anurag:
1a. the free quotas are fairly generous... enough to power a website that gets 5MM views/month. also, if you trust Google to give them your credit card, they will bump up the free quota levels even higher. look at their quota page and refer to the numbers in both the "Free Default Quota" and "Billing Enabled Default Quota" columns... here are some examples: a) # of Requests: 1.3MM default, 43MM w/billing enabled (wBE), b) Datastore API calls: 10MM default, 140MM wBE, c) URL Fetches: 657K default, 46MM wBE
1b. 30s max for requests: this is more security for you, because your app is now in a playground with others. Google has to ensure that all cloud neighbors play nicely with each other and not hog the CPU. However, the App Engine team is working on a way to allow for longer running background tasks... there's no timetable yet, but it is on the public roadmap.
1c. writing a chat server on App Engine is not only possible, but it is quite simple. here's one created using App Engine's XMPP API -- it's pretty dumb and just echoes back to the sender what they transmitted to us (be aware that you must have already invited the user to chat):
from google.appengine.api import xmpp
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class XMPPHandler(webapp.RequestHandler):
    def post(self):
        msg = xmpp.Message(self.request.POST)
        msg.reply("I got your msg: '%s'" % msg.body)

application = webapp.WSGIApplication([
    ('/_ah/xmpp/message/chat/', XMPPHandler),
], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()
1d. another item on the public roadmap is future "[support] for Browser Push (Comet) communication", so that's coming too.
2a. "not SQL" is one of Google App Engine's greatest strengths! relational databases don't scale and must be sharded at some point to keep an RDBMS from falling over. it is true however, that it is slightly more difficult to port because it is not traditional! Based on Google Bigtable, you can think of the App Engine datastore as a scalable distributed object database. App Engine lets you query the datastore using a Query object model, or if you insist, they also provide a SQL-like GqlQuery interface.
2b. with new avant-garde projects like django-nonrel, if you create a Django app and use its ORM, you can take a pure Django app and run it directly on top of App Engine. likewise, you can take it off of App Engine and move it directly to a more traditional ISP vendor that hosts Django applications. the queries stay exactly the same, and you don't have to care whether it speaks SQL or not.
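a minimal illustration with a hypothetical Django model; the queryset is written identically whether the backend is the datastore (via django-nonrel) or MySQL, joins and aggregates aside:
from django.db import models

class Article(models.Model):
    title = models.CharField(max_length=200)
    published = models.DateTimeField(auto_now_add=True)

# This queryset does not change when you move between backends.
recent = Article.objects.filter(title='Hello GAE').order_by('-published')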
3a. long-running processes are already addressed in 1b above. Google is aware of this need and is working on it.
3b. the TaskQueue API supports 100k calls, but that's bumped to 1MM wBE... and this is on a daily basis.
3c. Google strongly encourages breaking up tasks into multiple subtasks. low latency apps are seen not to "hog the system" and are given better treatment than those which are slow and consume more resources from their cloud neighbors.
Yes, you would not be in as much control as with traditional hosting. Main sore points of GAE are
Quotas etc., 30 sec max for a request, so Comet/reverse Ajax etc. are out of the window or very difficult. Try writing a chat server on Google App Engine.
Not an SQL database, so it's difficult to port to another server if need be, and there are sometimes limitations with the Google datastore, e.g. try sorting a query that has a comparison on a different column than the one being sorted on.
Long-running processes: there is a Task API, but that doesn't suffice if you want to do long-running background processing; otherwise you will have to break your task into subtasks, so things get complicated, and there are even quotas on how many tasks per second you can run.
GAE is good if your app can be modeled as request-response, with little background processing.
See this too
Feedback on using Google App Engine?