Access Flask Development Server on Sagemaker Notebook Instance - python

I would like to be able to run a Flask app in development on a SageMaker notebook instance and then view the Flask application in my browser. I know that I've previously been able to configure this kind of access on EC2 by opening up the port for my IP, but I'm not finding any guide for how to do it on SageMaker, which otherwise comes with a nice user interface. How can I access a Flask app running on a SageMaker notebook instance? In general, I'm struggling to find a development workflow for Flask apps once the database they use is in Amazon RDS.
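One approach, assuming the notebook instance has jupyter-server-proxy available (recent SageMaker notebook instances ship with it), is to run the Flask dev server on a local port and reach it through the notebook's /proxy/ path instead of opening ports in a security group. The sketch below is illustrative: the route, message, port, and URL shape are assumptions to verify against your instance.

```python
# A minimal sketch, assuming jupyter-server-proxy is installed on the
# notebook instance; the route, message, and port 5000 are illustrative.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Hello from SageMaker"

# To serve it, call app.run(port=5000) in a notebook terminal, then browse to
# https://<notebook-name>.notebook.<region>.sagemaker.aws/proxy/5000/
# The Jupyter proxy forwards that path to the local port, so no
# security-group changes are needed.
```

Note that Flask's url_for will generate paths without the /proxy/5000/ prefix unless the app is configured for it, so absolute links may need adjusting in this setup.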

Related

How to deploy jupyter notebook application in AWS ec2 instance

I have created an application (a calculator) using a Jupyter notebook on my local machine.
I want this calculator to be public when I type in the URL after deploying it on AWS EC2.
I want to know if there is a way to deploy my application on an AWS EC2 instance (Ubuntu) to make it public.
Is there a step-by-step guide to deploying my application?
Thanks in advance

Is there any way to connect a GAE application with a custom Google Cloud Database?

I'm trying to connect my webapp2 application to an in-cloud database.
To run it locally I'm using the following flags:
--datastore_path=/<path>/<to>/<project>/.db/datastore
--blobstore_path=/<path>/<to>/<project>/.db/blobstore
The problem is that I don't want a local path for my datastore/blobstore.
Is there any way to connect to an in-cloud database by passing a different path? I can't find any solution like that.
You can use the Remote API to get remote access to Google Cloud Datastore from webapp2:
The Remote API library allows any Python client to access services
available to App Engine applications.
For example, if your App Engine application uses Datastore or Google
Cloud Storage, a Python client could access those storage resources
using the Remote API.

Azure app service for kafka producer python api deployment

Scenario:
I want to deploy a Kafka Python producer API on Azure through a pipeline. I have an artifact, a producer Python script, that needs to be deployed to Azure App Service.
Question:
Is deploying this code on Azure App Service really recommended? (Knowing that this is not a web app, but just a Kafka producer for an internal application.)
What service can alternatively be used to run such Python code on Azure?
It seems that Kafka itself is a server, not part of your code project. So I recommend you use a custom Docker image and deploy it on Azure Web App for Containers.
You can create the custom Docker image with the Kafka client dependencies installed, and then deploy the Web App for Containers from that image. Once it finishes, the web app can be accessed, and you do not need to deploy code to it separately.
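A minimal Dockerfile for such an image might look like the following sketch; the file names (producer.py, requirements.txt) and Python version are illustrative assumptions, not from the question:

```dockerfile
# Sketch of a container image for a Python Kafka producer;
# producer.py and requirements.txt are hypothetical file names.
FROM python:3.9-slim

WORKDIR /app

# requirements.txt would list the Kafka client library,
# e.g. kafka-python or confluent-kafka.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY producer.py .
CMD ["python", "producer.py"]
```

The same image could also run on Azure Container Instances if the producer doesn't need App Service features like always-on HTTP endpoints.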

How is Deploying Flask on AWS Elastic Beanstalk different from running script?

What is the difference between deploying a Flask application on an EC2 instance (in other words, running your script on any computer) and deploying a Flask application via AWS Elastic Beanstalk? The Flask deployment documentation says:
While lightweight and easy to use, Flask’s built-in server is not
suitable for production as it doesn’t scale well and by default serves
only one request at a time. Some of the options available for properly
running Flask in production are documented here.
One of the deployment options they recommend is AWS Elastic Beanstalk. When I read through Amazon's explanation of how to deploy a Flask app, however, it seems like they are using the exact same server that comes built in to Flask, which, for example, is single-threaded and so cannot handle simultaneous requests. I understand that Elastic Beanstalk allows you to deploy multiple copies, but it still seems to use the built-in Flask server. What am I missing?
TL;DR Completely different - Elastic Beanstalk does use a sensible WSGI runner that's better than the Flask dev server!
When I read through Amazon's explanation of how to deploy a Flask app, however, it seems like they are using the exact same server application as comes built-in to Flask
Almost, but not quite.
You can confirm that this isn't the case by removing the run-with-built-in-server section yourself, i.e. the following from the example:
if __name__ == "__main__":
    # Setting debug to True enables debug output. This line should be
    # removed before deploying a production app.
    application.debug = True
    application.run()
You'll stop being able to run it yourself locally with python application.py but it'll still happily run on EB!
The EB Python platform uses its own WSGI server (Apache with mod_wsgi, last I looked) and some assumptions / config to find your WSGI callable:
From Configuring a Python project for Elastic Beanstalk:
By default, Elastic Beanstalk looks for a file called application.py to start your application. If this doesn't exist in the Python project that you've created, some adjustment of your application's environment is necessary.
If you check out the docs for the aws:elasticbeanstalk:container:python namespace you'll see you can configure it to look elsewhere for your WSGI application:
WSGIPath: The file that contains the WSGI application. This file must have an "application" callable. Default: application.py
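For instance, a sketch of an .ebextensions option-settings file pointing the platform at a differently named module; the path myapp/wsgi.py here is a hypothetical example, not from the question:

```yaml
# .ebextensions/python.config - hypothetical file; the WSGIPath value
# is an illustrative module path.
option_settings:
  aws:elasticbeanstalk:container:python:
    WSGIPath: myapp/wsgi.py
```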
Elastic compute resources (on AWS and elsewhere) generally allow for dynamic load balancing and start more compute resources as they are needed.
If you deploy on a single EC2 instance and that instance reaches capacity, your users will experience poor performance. If you deploy elastically, new resources are added dynamically to ensure smooth performance.
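To make that concrete, here is a minimal sketch of an application.py that the platform's WSGI server can pick up on its own; the route and message are illustrative:

```python
# application.py - Elastic Beanstalk's Python platform imports the
# module-level callable named "application" from this file by default.
from flask import Flask

application = Flask(__name__)

@application.route("/")
def index():
    return "Served by the platform's WSGI server, not app.run()"

# Note: no application.run() block is needed; the platform's WSGI
# server imports and serves "application" directly.
```

Locally you would still need some way to run it, e.g. flask run or a WSGI server such as Gunicorn, since the if __name__ == "__main__" block is gone.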

Flask web application deployment in AWS

When I run the application on an Amazon instance, it tries to get the data from Google, as shown in the first image;
when it goes to the callback, the second image shows the output I get.
I have written a Python application with Google OAuth2 and run it successfully locally. Gunicorn serves HTTP, but Google OAuth2 requires HTTPS. Locally I used certificates I had generated and it worked, but when I try to deploy on an Amazon EC2 instance, it doesn't work. Did anyone face this kind of problem? Would using nginx be helpful in this case?
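One common pattern (a sketch, not taken from the question) is to terminate TLS in nginx and proxy to Gunicorn over plain HTTP; the server name, certificate paths, and upstream port below are all hypothetical:

```nginx
# Hypothetical nginx site config: TLS terminates here, and requests
# are proxied to Gunicorn listening on localhost:8000.
server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate     /etc/ssl/certs/example.com.crt;
    ssl_certificate_key /etc/ssl/private/example.com.key;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        # Lets the app see that the original request was HTTPS, so
        # OAuth2 redirect URIs are generated with the https scheme.
        proxy_set_header X-Forwarded-Proto https;
    }
}
```

With this setup Gunicorn no longer needs its own certificates; the app just has to trust the X-Forwarded-Proto header (e.g. via Werkzeug's ProxyFix middleware) so the OAuth2 callback URL matches what Google expects.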