Flask: Difference between google cloud functions and google web deploy - python

I am a newbie who wants to deploy my Flask app using Google Cloud Functions. When I search online, people tell me to deploy it as a Flask app. I want to ask whether there is any difference between the two:
deploying the Flask app on a Google Cloud instance (a VM) vs. a serverless Cloud Function.

As described by John and Kolban, Cloud Functions is a single-purpose endpoint: you want to perform one thing, you deploy one function.
However, if you want several consistent things together, like a microservice, you will have to deploy several endpoints that let you perform CRUD operations on the same data object. In that case you should prefer to deploy several endpoints (CRUD) and to have the capability to easily reuse class and object definitions and business logic. For this, a Flask webserver is what I recommend (and what I prefer; I wrote an article on this).
Packaging it in Cloud Run is the best way to keep a serverless platform and a pay-per-use pricing model (plus automatic scaling and more).
There is an additional great thing: the Cloud Functions request object is based on the Flask request object. By the way, and it's something I also present in my article, it's easy to switch from one platform to the other. You only have to choose according to your requirements, your skills, and so on. I also wrote another article on this.
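To illustrate that last point, here is a minimal sketch (the route, function names and deploy command below are illustrative, not from the question) of how the same handler body can back both a Flask route and an HTTP Cloud Function, because both hand you a Flask request object:

```python
# Shared handler logic: `req` is a Flask request object in both cases.
from flask import Flask, request

app = Flask(__name__)

def handle(req):
    name = req.args.get("name", "world")
    return f"Hello {name}"

# Flask webserver route (e.g. when packaged for Cloud Run).
@app.route("/hello")
def hello_route():
    return handle(request)

# Cloud Functions HTTP entry point, deployable with something like:
#   gcloud functions deploy hello --runtime python39 --trigger-http
def hello(request):
    return handle(request)
```

Because the signature and the request object are the same, moving between Cloud Functions, Cloud Run and a plain Flask server is mostly a matter of packaging.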

If you deploy your Flask app as an application on a Compute Engine VM instance, you are basically configuring a computer and an application to run your code. The notion of Cloud Functions relieves you of the chore and toil of having to create and manage the environment in which your program runs. A marketing mantra is "You bring the code, we bring the environment." When using Cloud Functions, all you need to do is code your application logic. The maintenance of the server, scaling up as load increases, making sure the server is available, and much more are taken care of for you. When you run your code in your own VM instance, it is your responsibility to manage the whole environment.
References:
HTTP Functions
Deploying a Python serverless function in minutes with GCP

Related

Google App Engine: Automatically re-deploy once a day to update machine learning model?

I have the following situation: a Python Flask app running on Google App Engine; this app serves predictions from a Spacy machine learning model. Throughout the day, there is a workflow in place which adds new training data for this model, and the App has a cron job that retrains the model taking this new training data into account every evening.
The problem is that I want each App instance to reference this newly trained model after it becomes available. I can upload the model somewhere (say, Google Cloud Storage) but, ultimately, each instance needs to find out about the existence of this new model, download it, and load it into memory/initialize it; this takes time, so I'd like to only do this once per day/on start up.
I'm currently wondering - is there a way to auto-redeploy the App once a day or automatically restart the instances? Is there a different way I should be going about this?
(Note: I would prefer to stick with Google App Engine for now.)
It sounds like you should be deploying a new version of your app daily, and then warming up the new instance before migrating traffic to it. This assumes that initial start-up is slow because your app has to load the new model, so you can't simply restart the running version without disrupting your traffic at that time.
To deploy versions, follow the official guide here and then to warm up and migrate traffic use the guide here.
To automate this process you can use the Admin API -- the question will be how you get the model to a specific location for the new version. I would recommend using the same file name for the model so that your actual code stays consistent across versions. With that, you should be able to build that directory and deploy the app with the new version programmatically every day -- but it depends on the rest of your setup and how you are storing and automating any other part of the process.
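As a rough sketch of that daily, programmatic deploy (the service name, version naming and use of the gcloud CLI via subprocess are my assumptions; you could call the Admin API directly instead):

```python
# Hypothetical daily deploy: push a dated version without promoting it,
# then migrate traffic once it has warmed up. The app directory is assumed
# to contain the retrained model under a fixed file name.
import datetime
import subprocess

version = "model-" + datetime.date.today().strftime("%Y%m%d")

# Deploy the new version but keep traffic on the current one (--no-promote).
subprocess.run(
    ["gcloud", "app", "deploy", "app.yaml",
     "--version", version, "--no-promote", "--quiet"],
    check=True,
)

# After warm-up, send all traffic to the new version.
subprocess.run(
    ["gcloud", "app", "services", "set-traffic", "default",
     "--splits", f"{version}=1", "--quiet"],
    check=True,
)
```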
This sounds like a complex process, but what I can tell you is that once you have all of your previous settings right, you can also use Cloud Build to automate the deployments on App Engine. You can see how this process works in this quickstart.
Basically, you store your application in a repository, and with every new commit a trigger deploys a new version of your App Engine application. You can also use a Git repository to achieve that, following the steps in this guide.
If you want the whole process to be fully automated, you can think about a way to automate the commits themselves, for example with Cloud Scheduler.

Golang App On Google App Engine Calling Python Scripts

I would like to build an app written in Go and host it on Google App Engine. I would like it to call some Python scripts which were written for another app (and therefore not need to rewrite that code in Go).
As far as I have seen, Google App Engine requires you to specify the language of the app (i.e. the Go runtime in this case), but it is not clear whether that app could also run some Python scripts. Could anyone let me know if this would be possible? If not, what would be the best approach? Have a separate Python service called by the Go app? Ideally I would like both services to use the same domain name.
Thanks a lot for your help!
I don't think you'll find good support for Python in the Go runtime, or vice versa.
Your best bet would be to define a custom runtime via Cloud Run -- this will have all the same serverless benefits as App Engine, but will allow you to run code in both languages in the same service.
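One way to wire that up inside a single Cloud Run container is to expose the existing Python scripts behind a small command-line wrapper that the Go server shells out to; the module and function names below are placeholders for your real code:

```python
# predict_cli.py -- hypothetical wrapper so a Go process in the same container
# can call the existing Python code via exec.Command("python3", "predict_cli.py").
import json
import sys

from my_scripts import run_prediction  # placeholder for your existing Python code


def main():
    payload = json.load(sys.stdin)    # the Go server writes the request as JSON on stdin
    result = run_prediction(payload)  # reuse the existing logic unchanged
    json.dump(result, sys.stdout)     # the Go server reads the JSON result from stdout


if __name__ == "__main__":
    main()
```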

Firebase cloud functions using Python?

We are using GCP's Firebase with Firestore for a new mobile app we are developing. As part of this effort we need to deploy a number of cloud functions which will act as Firestore triggers for doing some back end processing.
Our intention is to keep the deploys encapsulated inside of Firebase by using the Firebase CLI tools. However, when we attempt to initialize the Firebase project for functions using the "firebase init functions" call, the only two language options are "JavaScript" and "TypeScript", and the only deployable stack seems to be Node.js.
On previous GCP projects we had deployed Python-based cloud functions (using the gcloud CLI), and ideally we'd like to continue using Python for our Firebase cloud functions. So my questions are:
is it possible to deploy Python-based Firebase cloud functions? If not:
can we simply go back to deploying Python-based GCP cloud functions using the gcloud cli and still have them work as Firestore triggers?
Thanks
The Firebase CLI does not support deploying functions written in python.
You can certainly write Cloud Firestore triggers in python and deploy them with gcloud.
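For example, a minimal (1st-gen) background function sketch; the collection path, field names and runtime are illustrative, not prescribed:

```python
# main.py -- a Python Cloud Function triggered by Firestore document writes.
def on_user_write(data, context):
    """`data` carries the Firestore event payload; `context.resource`
    names the document that changed."""
    new_fields = data.get("value", {}).get("fields", {})
    print(f"Document changed: {context.resource}")
    print(f"New fields: {new_fields}")

# Deployed with something like:
#   gcloud functions deploy on_user_write \
#     --runtime python39 \
#     --trigger-event providers/cloud.firestore/eventTypes/document.write \
#     --trigger-resource "projects/PROJECT_ID/databases/(default)/documents/users/{userId}"
```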
One thing you might not be aware of: the underlying Cloud Functions product is the same no matter how you deploy your functions. Firebase just adds tools and APIs on top of the existing Google Cloud Functions infrastructure. There is really no such thing as a "Firebase Cloud Function". There is just Cloud Functions, and you have options about how you can write and deploy them, either using gcloud, or the Firebase CLI.

Running One Instance of Google App Engine with frontend in nodejs and backend server in python

I'm getting my feet wet with GCP and GAE, also nodejs and python and networking (I know).
[+] What I have:
Basically I have some nodejs code that takes in some input and is supposed to then send that input to some python code that will do more stuff to it. My first idea was to deploy the nodejs code via GAE, then host the python code in a python server, then make post requests from the nodejs front-end to the python server backend.
[+] What I would like to be able to do:
just deploy both my nodejs code and my python code in the same GAE project and instance, so that nodejs is the frontend that people see, but the python server is also running in the same environment and can communicate with the nodejs code directly, without sending anything over the public internet.
[+] What I have read
https://www.netguru.co/blog/use-node-js-backend
Google App Engine - Front and Backend Web Development
and countless other google searches for this type of setup but to no avail.
If anyone can point me in the right direction I would really appreciate it.
You can't have both python and nodejs running in the same instance, but they can run as separate services, each with their own instance(s) inside the same GAE app/project. See Service isolation and maybe Deploying different languages services to the same Application [Google App Engine]
Using post requests can work pretty well, but will likely take some effort to ensure no outside access.
Since you intend to use the nodejs service as the frontend, you're limited to the flexible environment for it, which limits the inter-service communication options - you can't use push queues (properly supported only in the standard environment), which IMHO would be a better/more secure solution than post requests.
Another secure communication option would be for the nodejs service to place the data into the datastore and have the python service pick it up from there - the datastore is shared by all instances/versions/services inside the same GAE app. Also more loosely coupled IMHO - each service can function (at least for a while) without the other being alive (not possible if using the post requests).
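As a rough sketch of the datastore hand-off on the Python side (the entity kind and property names are assumptions; the nodejs service would write matching entities):

```python
# Python service: pick up work items the nodejs frontend stored in Datastore.
from google.cloud import datastore

client = datastore.Client()

def fetch_unprocessed_tasks(limit=50):
    query = client.query(kind="Task")
    query.add_filter("processed", "=", False)
    return list(query.fetch(limit=limit))

for task in fetch_unprocessed_tasks():
    # ... run the Python-side processing on task["payload"] here ...
    task["processed"] = True
    client.put(task)  # mark the item as done so it isn't picked up again
```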
Maybe of interest: How to tell if a Google App Engine documentation page applies to the standard or the flexible environment
UPDATE:
Node.JS is currently available in the standard environment as well, so you can use those features, see:
Now, you can deploy your Node.js app to App Engine standard environment
Google App Engine Node.js Standard Environment Documentation

appengine frontend to kubernetes

I'm trying to set up a Flask app on Google App Engine that will be something of a frontend management console for Google Container Engine. Google has put out working APIs to spin up a container cluster, but it does not look like they have put out (Python) APIs to administer Kubernetes. That is, everything needed to implement services, pods, RCs, etc. seems to be set up to run through bash scripting. This is not compatible with the restrictions of Google's App Engine.
Is there a commonly accepted solution/package for this? Would it make more sense to abandon App Engine in favor of a managed VM (not ideal)?
Thanks
As I mentioned in Submit jobs using API Client Library for Python?, the Kubernetes API uses a standard Swagger specification, so it should be possible to generate a Python client library. There is also pykube if you want to experiment with an existing client library.
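For example, a small sketch with pykube that lists pods (the kubeconfig path and namespace are assumptions; from App Engine you would load credentials for your Container Engine cluster instead):

```python
# List pods in a namespace using the pykube client library.
import pykube

config = pykube.KubeConfig.from_file("~/.kube/config")  # assumption: local kubeconfig
api = pykube.HTTPClient(config)

for pod in pykube.Pod.objects(api).filter(namespace="default"):
    print(pod.name)
```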
