Deploying an app to users' appspot - python

I am working on a Python app which runs on App Engine. Is there a way I can publish the app to each customer's AppSpot account, so that the app uses that customer's own cloud storage, instead of running the app on my AppSpot account with all the users storing their data in my cloud space?

Yes, absolutely.
You just need to have each client create an App Engine account with an application to which you have administrator access. You can adjust the settings on the application to forbid downloads of your code by the other administrators if that's appropriate for your agreement with the client. This also allows the clients to be billed directly for their instances' usage, and makes it completely impossible for data to leak between different clients' instances.
Using multiple applications for multiple clients who are licensing your application almost certainly does not violate part 4.4 of the TOS, although don't take this as legal advice.

No, you cannot do that. The app is hosted and run in the administrator's account, which would be yours. What you can do is release the source code and instruct your users to install it in their own appspot accounts, just as they would when creating a new application.

I suppose it's not exactly what you need, but it may give you an idea of where to go. Please check the DryDrop project. It is a small Python application you can ask each user to install on their account; they can then configure it to fetch your site files from your GitHub repo through its webhook functionality. I didn't try it, but, theoretically, you update your site, commit it to your repo, and all users get your updated application automatically. You can share your thoughts on whether that works for you.

Maybe. If it's an open source app that you're giving away, you can publish the source and instruct users to upload it to their own accounts.
If you're selling the app, displaying ads or otherwise trying to monetize the service, you probably want to stick with one instance. Using multiple instances to avoid paying for quota usage is a direct violation of the App Engine TOS:
4.4. You may not develop multiple Applications to simulate or act as a single Application or otherwise access the Service in a manner intended to avoid incurring fees.

No. Writing an application that deploys other applications is in violation of the terms of service.
Note that we don't have any 'hard' limits: limits that aren't billing-enabled can be increased if you apply to us with a reasonable use case.

Eliminating nuisance Instance starts

My GCP app has been abused by some users. To stop their usage I have attempted to eliminate features that can be abused, and have employed firewall rules to block certain users. But bad users continue to try to access my app via certain legacy URLs such as myapp.appspot.com/badroute. Of course, I still want users to use the default URL myapp.appspot.com .
I have altered my code in the following manner, but I am still getting Instances to start from them, and I do not want Instances in such cases. What can I do differently to avoid the bad Instances starting OR is there anything I can do to force such Instances to stop quickly instead of after about 15 minutes?
import logging
import webapp2

# MainPage is defined elsewhere in the app.

class Dummy(webapp2.RequestHandler):
    def get(self):
        logging.info("Dummy: ")
        self.redirect("/")

app = webapp2.WSGIApplication(
    [('/', MainPage),
     ('/badroute', Dummy)], debug=True)
(I may be referring to Instances when I should be referring to Requests.)
So what's the objective? Do you want users that visit /badroute to be redirected to some /goodroute, or do you want /badroute to not hit GAE and incur costs?
Putting a google cloud load balancer in front could help.
For the first case you could setup a redirect rule (although you can do this directly within App Engine too, like you did in your code example).
If you just want it to not hit App Engine, you could set up Google Cloud Load Balancer to route /badroute to some file in a GCS bucket instead of your GAE service
https://cloud.google.com/load-balancing/docs/https/ext-load-balancer-backend-buckets
However, you wouldn't be able to use your *.appspot.com base URL. You'd get a static IP, to which you should then map a custom domain.
DISCLAIMER: I'm not 100% sure if this would work.
Create a new service dummy.
Create and deploy a dispatch.yaml (GAE Standard // GAE Flex)
Add the links you want to block to the dispatch.yaml and point them to the dummy service.
Set up the Identity Aware Proxy (IAP) and enable it for the dummy service.
???
Profit
The idea is that the IAP will block the requests before they hit the dummy service. Since the requests never actually get forwarded to the service dummy you will not have an instance start. The bots will get a nice 403 page from Google's own infrastructure instead.
EDIT: Be sure to create the dummy service with 0 instances as the idea is to not have it cost money.
EDIT2:
So let me expand a bit on this answer.
You can have multiple GAE services running within one GCP project. Each service is its own app. You can have one service running a Python Flask app and another running a Java Spring Boot app. Each can be either GAE Standard or GAE Flex. See this doc.
Normally all traffic gets routed to the default service. Using dispatch.yaml you can make request to certain endpoints go to a specific service.
If you create the dummy service as a GAE Standard app, and you don't actually need it to do anything, you can then route all the endpoints that get abused to this dummy service using the dispatch.yaml. Using GAE Standard you can have the service use 0 instances (and 0 costs).
Using IAP you can then make sure only your own Google account can access this service (which you won't actually do). In effect the abusers cannot reach the service at all, because IAP blocks their requests before they ever hit it.
Note, the dispatch.yaml is separate from any services, it's one of the per-project configuration files for GAE. It's not tied to a specific service.
As stated, the dummy app doesn't actually need to do anything, but you do need to deploy it once, as that is what actually creates the service.
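For reference, a minimal dispatch.yaml could look something like the sketch below. This is only illustrative: the service name dummy matches the steps above, and the URL patterns are placeholders you should adjust to the routes being abused.

# dispatch.yaml - hypothetical sketch; "dummy" is the throwaway service created above
dispatch:
  - url: "*/badroute*"
    service: dummy
  - url: "*/another-abused-path*"
    service: dummy

Once the dummy service exists, the dispatch rules are deployed with gcloud app deploy dispatch.yaml.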
Consider using Cloudflare to mitigate bot abuse: customize firewall rules regarding route access, rate-limit IPs, etc. This can be combined with Google Cloud Load Balancer if you'd like, as mentioned in https://stackoverflow.com/a/69165767/806876.
References
Cloudflare GCP integration: https://www.cloudflare.com/integrations/google-cloud/
There is a little information I did not provide in my question about my app.yaml:
handlers:
- url: /.*
  script: mainapp.app
By simply removing .* from the url specification, no Instance start is created. The user gets Error: Not Found, though. So that satisfies my needs.
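For clarity, here is what the handler section looks like after that change, as I understand it (the rest of app.yaml is unchanged):

handlers:
- url: /
  script: mainapp.app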
Edo Akse's answer pushed me to this solution after reading here, so I am accepting his answer. I am still not clear how to implement his answer, though.

Accessing Google Adsense API from a server with Python?

Is there any way to use Python to access the Google Adsense API from a server without any user interaction?
This is typically done by setting up a "service account", but Google's docs say that "AdSense doesn't support Service Accounts".
They say to use the web or installed application flows, but these require the user to manually confirm access for every access. My application needs to run on a headless server, without user interaction, so it can pull data every hour, so this won't work. This similar question suggests going through the user consent screen once and then caching the token on the server, but this isn't feasible in my case since my process needs to be 100% automated, and the token will eventually expire and require user interaction.
Unfortunately, Google's docs are quite unhelpful, and even worse, their Python coding examples haven't been updated in 7 years and don't even seem to have worked back then, as many of them won't even run under Python 2.7, much less 3.
It's true that the AdSense Management API doesn't support service accounts. While there is setup required at first with the Web Flow, the same is true for service accounts which also have to be granted permissions on the account being accessed.
Regarding the tokens expiring, the Web Flow will yield a refresh token, which you can use to generate new access tokens (known as offline access, which doesn't require user involvement after the initial setup).
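To make the offline-access idea concrete, here is a minimal sketch using the google-auth and google-api-python-client libraries. It assumes you have already completed the web flow once and saved the resulting refresh token; the CLIENT_ID, CLIENT_SECRET and REFRESH_TOKEN values are placeholders, and the exact AdSense API version/methods you call may differ for your use case.

# Minimal sketch: reuse a stored refresh token so no user interaction is needed at runtime.
# CLIENT_ID, CLIENT_SECRET and REFRESH_TOKEN are placeholders from the one-time web flow.
from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request
from googleapiclient.discovery import build

creds = Credentials(
    token=None,  # no cached access token; we refresh below
    refresh_token="REFRESH_TOKEN",
    token_uri="https://oauth2.googleapis.com/token",
    client_id="CLIENT_ID",
    client_secret="CLIENT_SECRET",
    scopes=["https://www.googleapis.com/auth/adsense.readonly"],
)
creds.refresh(Request())  # exchanges the refresh token for a fresh access token

service = build("adsense", "v2", credentials=creds)  # AdSense Management API
print(service.accounts().list().execute())  # e.g. list the accounts you can access

As long as the refresh token isn't revoked, something like this can run from an hourly cron job with no one at the keyboard.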

How do I deploy this app for my job: EC2, Elastic Beanstalk, something else entirely?

I'm tasked with creating a web app (I think?) for my job that will track something in our system. It'll be an internal tool that staff uses to keep track of the status of one of the things we do. It should look like Trello, with cards that drag from step to step. That frontend exists, but my job is to make the system update when the cards are dragged. This requires using an API in Python and isn't that complicated to grab from/update. I have no idea how to put all of this together. My job is almost completely nontechnical and there's no one internally who knows what I'm doing except for me. I'm in so over my head here and have no idea where to begin. Is this something I should deploy on Elastic Beanstalk? EC2? How do I tie this together and put it somewhere?
Are you trying to pull in live data from Trello or from your company's own internal project management tool?
An EC2 instance might be useful, but honestly, it may be completely unnecessary if your company has its own servers. EC2 is basically just a collection of rental computers to help with scaling. I have never used Beanstalk, so my input would be useless there.
From what I can gather from the question, you could have a Python script running to pull from the API and make the changes without an EC2.
First thing you should do is gather as much information about what the end product should look like. From your question, I have the feeling that you have only a vague idea of what the stakeholders want. Don't be afraid to ask more clarification about an unclear task. It's better to spend 30 minutes discussing and taking note than to show the end-product after a month and realizing that's not what your boss/team wanted.
Questions I would ask
Who is going to be using this app? (technical or non-technical person)
For what purpose is this being developed?
Does it need to be on the web or can it be used locally?
How many users need to have access to this application?
Are we handling sensitive information with this application?
Will this need to be augmented with other functionality at some point?
This is just a sample of what I would ask, during the conversation with the stakeholder a lot more will pop up for sure.
What I think you have to do
You need to make a monitoring system for the tasks that need to be done by your development team (like a Kanban)
What I think you already have
A frontend with cards that are draggable to each bin. I also assume that you can create a new card and delete one in the frontend. The frontend is most likely written in React, Angular or Vue.js. You might also have no frontend framework (a mix of jQuery and vanilla JS), but usually frontend developers end up picking a framework of some sort to help the development.
A backend API in Python (in Flask or with django-rest-framework most likely) that is communicating with a SQL database like PostgreSQL or a document database like MongoDB.
I'm making a lot of assumptions here, but your aim should be to understand the technology you will be working with in order to check which hosting would work best. For instance, if the database that is set up is a MySQL database, you might have some trouble with certain hosting providers.
What I think you are missing
Currently the frontend and the backend don't communicate with each other. When you drag a card it won't persist if you refresh the page. Also, all of this is sitting on your computer and cannot be used by anyone on your staff. You first need to connect the frontend with the backend so that the application has persistence. Then you need to deploy this application somewhere so that it is reachable by your staff.
What I would do is first work locally to make sure that the persistence layer is working. This implies having the API server, the frontend server and the database server running simultaneously on your computer while you develop. You should then fetch data from the API to know which cards are in the database and create them visually in your frontend at the right spot.
When you drop a card onto a new spot after dragging it, it should trigger a POST request to your API server in order to update the status of that particular card (look at the documentation of your API to check what you need to send).
The server should send back an updated version of the card's status if the POST request was successful, so your application should then just redraw the card at the right spot (it won't make a visible difference, since the card is already at the right spot and your frontend framework most likely won't act on this change because the state hasn't changed). That's all I would do for that part.
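To make that concrete, here is a minimal sketch of what the backend side of that POST could look like in Flask. The route, field names and the in-memory "database" are purely illustrative, since I don't know the actual API you're working against.

# Hypothetical Flask endpoint sketch for persisting a card's new column when it is dropped.
from flask import Flask, jsonify, request

app = Flask(__name__)

# In a real app this would be a database table; a dict keeps the sketch self-contained.
CARDS = {1: {"id": 1, "title": "Example task", "status": "todo"}}

@app.route("/api/cards/<int:card_id>/status", methods=["POST"])
def update_card_status(card_id):
    card = CARDS.get(card_id)
    if card is None:
        return jsonify({"error": "card not found"}), 404
    card["status"] = request.get_json()["status"]  # e.g. "in-progress", "done"
    return jsonify(card)                           # frontend redraws from this response

if __name__ == "__main__":
    app.run(debug=True)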
I would then move to the deployment phase to make sure that whatever you did locally still works online. I would use Heroku to start instead of jumping directly to AWS. Heroku is a service built on top of AWS which manages a lot of the complexity of AWS for you. This is great for prototyping, and it means that when your stuff is ready you can migrate to AWS fairly easily and be confident that a setup exists to make your app work. You might also be tied to your company's servers, which is another thing I would ask the stakeholders (i.e. where can I put this application and where can't I put it).
The flow for a frontend + API + database application on Heroku is usually as follows. You create a GitHub repo for your frontend (make it private) and you create an app on Heroku that will watch this repository for changes. It will re-deploy the application for you when it sees a change, serving it at a specific Heroku subdomain. You will need to configure a Procfile that tells Heroku what to do with a given application type. This is where you need to double-check which frontend you are using, since that might change the Procfile used. It's most likely a Node.js-based frontend (React, Angular or Vue), so head over here for the documentation on how to put that online.
You will also need to make a repo for the backend that is separate from the frontend; these two entities are distinct and only communicate through HTTP requests (frontend->backend) and JSON (backend->frontend). You will need to follow the same idea as with the frontend to deploy it; head over here.
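As an illustration, the Procfile for the Python API side could be as small as the line below. This assumes a Flask app object named app in app.py and gunicorn listed in requirements.txt, which may not match your setup.

web: gunicorn app:app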
Once you have these two online, you need to create a database on Heroku. This is done by adding a datastore add-on to your API; head over here. There is some framework-specific configuration you need to do to make the API talk to an online database; you will find that configuration in the framework documentation. The database could also already be up and running on your own server; if that is the case, you just need to configure your online backend to talk to that particular database at a particular address.
Once all of the above is done, re-test your application to check that you get the same behavior as before. This is a usable MVP; however, there is no security layer. Anyone with the right URL could just fetch your frontend and start messing around with your data.
There is more engineering that needs to be done to make this a viable end product. This leads us to my final remark: why are you not using a product like Trello, Jira, or even GitHub Projects? If it is to save some money by not paying for a subscription, I think you should factor in the cost of development, security and maintenance of this application.
Hope it helps!
One simple option is Heroku to deploy both your API and your frontend application.

can we use django authentication as a micro service in different project?

In my project we are trying to create a microservice-type architecture using Django/django-rest-framework.
We've some services like:
User management service
Asset Management Service
Tool management service
All three services run on different ports with different databases.
Now my question is:
Can we use the user management service for token authentication in the asset and tool services?
This is quite a broad question and without knowing more of your specific architecture, what you have tried and what features you want/need, it will be hard to give anything more than a starting point.
That being said, you could look at setting up a Central Authentication Service (CAS). There are multiple packages for Django that help you with this.
Try looking into Django CAS NG and Django MamaCAS. Django CAS NG seems to be the more actively developed of the two.
Remember that with microservices, one of the big advantages is that there's nothing forcing you to use the same technology across your entire stack. It may well make sense to have your auth components provided by something entirely different, for example rolling a KeyCloak server to handle SSO.
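If you would rather stay with plain DRF token authentication instead of CAS or KeyCloak, one common pattern is to have the asset and tool services verify incoming tokens against the user management service over HTTP. The sketch below is only illustrative: the verification URL, the response payload, and how you map it onto a user object are all assumptions about your setup.

# Illustrative DRF authentication class that delegates token verification to the
# user management service; the URL and response shape are assumptions.
import requests
from rest_framework import authentication, exceptions

USER_SERVICE_VERIFY_URL = "http://user-service:8000/api/auth/verify/"  # hypothetical endpoint

class RemoteTokenAuthentication(authentication.BaseAuthentication):
    def authenticate(self, request):
        header = request.META.get("HTTP_AUTHORIZATION", "")
        if not header.startswith("Token "):
            return None  # no token; let other authentication classes run
        resp = requests.get(USER_SERVICE_VERIFY_URL,
                            headers={"Authorization": header},
                            timeout=5)
        if resp.status_code != 200:
            raise exceptions.AuthenticationFailed("Invalid or expired token")
        # In a real service you would map this payload onto a local (or stub) user object.
        user_data = resp.json()
        return (user_data, None)

You would then list this class in DEFAULT_AUTHENTICATION_CLASSES of the asset and tool services, and probably cache verification results so that every request doesn't incur an extra network hop.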

Feedback on availability with Google App Engine

We've had some good experiences building an app on Google App Engine. This first app's target audience is Google Apps users, so no issues there in terms of it being hosted on Google infrastructure.
We like it so much that we would like to investigate using it for another app, however this next project is for a client who is not really that interested in what technology it sits on, they just want it to work, and work all of the time.
In this scenario, given that we have the technology applicability and capability side covered, are there any concerns that this stuff is still relatively new and that we may not be as much "in control" as if we had it done with traditional hosting?
You are correct: you are not in as much control vs. traditional hosting. However, hopefully the gains outweigh the negatives. App Engine is extremely scalable -- it runs on the same hardware that runs Google itself. How often have you visited http://google.com and had that page or a search result fail?
Although you are letting Google run your code, the code is still yours to do with as you please. With new projects like django-nonrel, you can create and run native Django apps directly on top of App Engine, and if it doesn't meet your needs down the line, it's fairly easy to take that app to an ISP that hosts Django apps (and there are plenty of those). More on this project below.
You don't have to worry about hardware, operating systems, coming up with a machine image, databases, web servers, front-end load balancers, CDNs/edge caching, software/package upgrades, license fees, etc. All these things are tangential to the web or other application that you have or will create to solve a particular problem. All this additional infrastructure is required whether you like it or not; but with App Engine, you only need to think about your app/solution and none of this extra stuff.
Obviously another thing you lose is some of your execution environment. To ensure that you're playing nicely with your cloud neighbors (resource hogging, security issues, etc.), you must execute in a sandbox, meaning your app cannot create local files, open network sockets, etc. However, App Engine provides a rich set of APIs and product features so that you at least can create meaningful apps:
scalable distributed object datastore (see below)
Memcache
URLFetch
images service (resize, crop, etc.)
users service/authentication
task queues for background processing
Django web templating
blobstore for large files
denial-of-service blacklisting
transactional tasks
datastore cursors
sending (and/or receiving) of email
sending (and/or receiving) of chat/IM/instant messages via XMPP
You also have a full administration console (dashboard) which lets you monitor your app's usage, your billing settings and history, a full dump of your quota usage, and even your application logs, which you can view or download.
To address the "main sore points" from #Anurag:
1a. the free quotas are fairly generous... enough to power a website that gets 5MM views/month. also, if you trust Google to give them your credit card, they will bump up the free quota levels even higher. look at their quota page and refer to the numbers in both the "Free Default Quota" and "Billing Enabled Default Quota" columns... here are some examples: a) # of Requests: 1.3MM default, 43MM w/billing enabled (wBE), b) Datastore API calls: 10MM default, 140MM wBE, c) URL Fetches: 657K default, 46MM wBE
1b. 30s max for requests: this is as much for your protection as anything, because your app now shares a playground with others. Google has to ensure that all cloud neighbors play nicely with each other and don't hog the CPU. However, the App Engine team is working on a way to allow longer-running background tasks... there's no timetable yet, but it is on the public roadmap.
1c. writing a chat server on App Engine is not only possible, but it is quite simple. here's one created using App Engine's XMPP API -- it's pretty dumb and just echoes back to the sender what they transmitted to us (be aware that you must have already invited the user to chat):
from google.appengine.api import xmpp
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class XMPPHandler(webapp.RequestHandler):
    def post(self):
        msg = xmpp.Message(self.request.POST)
        msg.reply("I got your msg: '%s'" % msg.body)

application = webapp.WSGIApplication([
    ('/_ah/xmpp/message/chat/', XMPPHandler),
], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()
1d. another item on the public roadmap is future "[support] for Browser Push (Comet) communication", so that's coming too.
2a. "not SQL" is one of Google App Engine's greatest strengths! relational databases don't scale and must be sharded at some point to keep an RDBMS from falling over. it is true however, that it is slightly more difficult to port because it is not traditional! Based on Google Bigtable, you can think of the App Engine datastore as a scalable distributed object database. App Engine lets you query the datastore using a Query object model, or if you insist, they also provide a SQL-like GqlQuery interface.
2b. with new avant-garde projects like django-nonrel, if you create a Django app and use its ORM, you can take a pure Django app and run it directly on top of App Engine. likewise, you can take it off of App Engine and move it directly to a more traditional ISP vendor that hosts Django applications. the queries stay exactly the same, and you don't have to care whether it uses SQL or not.
3a. long-running processes are already addressed in 1b above. Google is aware of this need and are working on it.
3b. the TaskQueue API supports 100k calls, but that's bumped to 1MM wBE... and this is on a daily basis.
3c. Google strongly encourages breaking up tasks into multiple subtasks. low latency apps are seen not to "hog the system" and are given better treatment than those which are slow and consume more resources from their cloud neighbors.
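To illustrate point 2a, here is a small example of the two query styles against the classic Python datastore API (the old google.appengine.ext.db library). The Greeting model is made up for the example.

from google.appengine.ext import db

class Greeting(db.Model):
    author = db.StringProperty()
    date = db.DateTimeProperty(auto_now_add=True)

# Query object model:
latest_by_guido = Greeting.all().filter('author =', 'guido').order('-date')

# The same query expressed with the SQL-like GqlQuery interface:
latest_by_guido_gql = db.GqlQuery(
    "SELECT * FROM Greeting WHERE author = :1 ORDER BY date DESC", 'guido')

results = latest_by_guido.fetch(10)  # both forms return Greeting entities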
Yes, you would not be in as much control as with traditional hosting. Main sore points of GAE are
Quotas etc.: a 30-second max per request, so Comet/reverse Ajax and the like are out of the window or very difficult. Try writing a chat server on Google App Engine.
No SQL database, so it is difficult to port to another server if need be, and there are sometimes limitations with Google's database, e.g. try sorting a query that has a comparison on a column other than the one being sorted on.
Long-running processes: there is a Task API, but that doesn't suffice if you want to do long-running background processing; otherwise you will have to break your task into subtasks, so things get complicated, and there are even quotas on how many tasks per second you can run.
GAE is good if your app can be modeled as request-response, with little background processing.
See this too
Feedback on using Google App Engine?
