How to deploy a Jupyter notebook application on an AWS EC2 instance - python

I have created an application (a calculator) using a Jupyter notebook on my local machine.
I want this calculator to be publicly accessible via a URL after deploying it to AWS EC2.
I want to know whether there is a way to deploy my application on an AWS EC2 instance (Ubuntu) to make it public.
Is there a step-by-step guide to deploy my application?
Thanks in advance
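If the calculator really lives in a notebook, one common route (not from the original thread, just a sketch) is to serve the notebook with Voila on the instance and open the chosen port in the EC2 security group. The notebook name calculator.ipynb and port 8866 below are placeholder assumptions:

# On the Ubuntu EC2 instance (security group must allow inbound TCP 8866):
sudo apt-get update && sudo apt-get install -y python3-pip
pip3 install notebook voila

# Voila renders the notebook as a standalone web app with the code cells hidden.
# --Voila.ip=0.0.0.0 makes it listen on all interfaces; check `voila --help-all`
# for the exact option names in your Voila version.
voila calculator.ipynb --port=8866 --no-browser --Voila.ip=0.0.0.0

# Then browse to http://<ec2-public-dns>:8866/

For anything beyond a quick demo you would normally run this behind nginx and a process manager instead of a foreground shell.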

Related

Access Flask Development Server on SageMaker Notebook Instance

I would like to be able to run a Flask app in development on a SageMaker notebook instance and then look at the Flask application in my browser. I know that I've previously been able to configure this kind of access on EC2 by opening up the port for my IP, but I'm not finding any how-to for doing it on SageMaker, which otherwise comes with a nice user interface. How can I access the Flask app running on a SageMaker notebook instance? In general, I'm struggling to find a development workflow for Flask apps once the database they use is in Amazon RDS.
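One approach I'm aware of (an assumption, not something stated in the thread) is to keep the Flask dev server inside the notebook instance and reach it through jupyter-server-proxy instead of opening a port; the proxy URL pattern below is a guess you would adapt to your instance's actual Jupyter URL:

# In a terminal on the SageMaker notebook instance:
pip install flask jupyter-server-proxy    # restart the Jupyter server afterwards

# Run the Flask app locally on the instance (app.py and port 5000 are placeholders).
FLASK_APP=app.py flask run --host=0.0.0.0 --port=5000 &

# Then, in a browser session already authenticated to the notebook instance, try:
#   https://<notebook-instance-name>.notebook.<region>.sagemaker.aws/proxy/5000/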

Azure App Service for Kafka producer Python API deployment

Scenario:
I want to deploy a Kafka Python producer API on Azure through a pipeline. I have an artifact, which is the producer's Python code, that needs to be deployed to Azure App Service.
Question:
Is deploying this code on Azure App Service really recommended? (Knowing that this is not a web app but just a Kafka producer for an internal application.)
What Azure service could alternatively be used to run such Python code?
It seems that Kafka is a server, not just a code project, so I recommend using a custom Docker image and deploying it to Azure Web App for Containers.
You can create the custom Docker image with Kafka installed inside it and then deploy the Web App for Containers from that custom image. Once that finishes, the web app is up and accessible and you do not need to deploy the code to it separately.
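A rough sketch of that route with the Azure CLI, assuming a Dockerfile in the repo and placeholder names for the registry, resource group, plan and app (flag names may differ slightly between CLI versions):

# Build the producer image from the repo's Dockerfile and push it to ACR.
az acr build --registry myregistry --image kafka-producer:v1 .

# Create a Linux plan and a Web App for Containers pointing at that image.
az appservice plan create --name producer-plan --resource-group my-rg --is-linux --sku B1
az webapp create --name my-kafka-producer --resource-group my-rg --plan producer-plan \
    --deployment-container-image-name myregistry.azurecr.io/kafka-producer:v1
# If the registry is private, the web app also needs pull credentials
# (e.g. via `az webapp config container set`).

Since the producer is not a web app, Azure Container Instances or a WebJob are often suggested as alternatives for this kind of background workload.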

Unable to keep runserver running when I close my SSH connection to AWS EC2

I hope someone with expertise in these areas can help me.
Basically, I am trying to run my Django project inside an EC2 instance on Amazon Web Services. I have placed the files there and tried to run the server with
python3 manage.py runserver 0.0.0.0:8000
I configured my EC2 instance by following this guide: https://medium.com/saarthi-ai/ec2apachedjango-838e3f6014ab. I followed all the steps and was able to deploy my project.
However, once I close my SSH connection, I can no longer access the website. Is there a solution to this?
Regards,
YX
When you exit (close the connection from) SSH, everything that was running under your session is terminated.
So you need to deploy your Django project on the server properly, on a specified port, so that the project is accessible anytime, from anywhere, as per your defined policy.
Please follow this link to deploy your Django project on EC2; you need to configure Gunicorn with Nginx to host your project on EC2:
https://www.pythoncircle.com/post/235/how-to-setup-django-app-on-ec2-instance-aws/
If you are experienced with AWS, you can instead follow these instructions:
https://aws.amazon.com/getting-started/projects/deploy-python-application/
Thanks
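A minimal sketch of the Gunicorn part, so the app keeps running after the SSH session ends; the project name myproject, the paths, and the user are placeholders:

pip3 install gunicorn

# Register Gunicorn as a systemd service so it survives logout and reboots.
sudo tee /etc/systemd/system/gunicorn.service > /dev/null <<'EOF'
[Unit]
Description=Gunicorn for the Django project
After=network.target

[Service]
User=ubuntu
WorkingDirectory=/home/ubuntu/myproject
ExecStart=/home/ubuntu/.local/bin/gunicorn --bind 0.0.0.0:8000 myproject.wsgi:application
Restart=always

[Install]
WantedBy=multi-user.target
EOF

sudo systemctl daemon-reload
sudo systemctl enable --now gunicorn

# Quick-and-dirty alternatives are `nohup python3 manage.py runserver 0.0.0.0:8000 &`
# or running it inside screen/tmux, but runserver itself is not meant for production.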

Docker deployment of flask app with Celery and Redis in AWS, DB in AWS RDS

I need to deploy a Flask app with Celery and Redis to Amazon AWS. I'm used to working with AWS Lightsail, so that will be my option.
On the other hand, I must (as per company policy) deploy my Postgres DB to AWS RDS.
I'm planning to use Docker with Nginx and Gunicorn on AWS Lightsail to deploy the app, which, as I said, uses Celery and Redis. So all of this will be in Docker on Lightsail.
On the other hand, the DB will be in RDS, without using Docker.
What I want with this approach is quick deployment of changes and upgrades to the app.
What I want to know is this:
1. Is this a good approach for production that will allow quick deployments?
2. Does anybody know of some examples of docker-compose files that could help me with this? (See the sketch below.)
3. Could someone please let me know about limitations of this approach?
4. Is Lightsail a good option in AWS for a Docker deployment of Flask apps like the one described here?
Thanks
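For question 2 above, a minimal, hypothetical docker-compose sketch of a stack like this (Flask behind Gunicorn and Nginx, a Celery worker, Redis as the broker, Postgres staying in RDS so there is no database service); image names, env values and the RDS endpoint are placeholders:

cat > docker-compose.yml <<'EOF'
version: "3.8"
services:
  web:
    build: .
    command: gunicorn --bind 0.0.0.0:8000 app:app        # app:app is a placeholder module:object
    environment:
      - DATABASE_URL=postgresql://user:pass@<rds-endpoint>:5432/mydb
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - redis
  worker:
    build: .
    command: celery -A app.celery worker --loglevel=info  # placeholder Celery app path
    environment:
      - DATABASE_URL=postgresql://user:pass@<rds-endpoint>:5432/mydb
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - redis
  redis:
    image: redis:7-alpine
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    volumes:
      - ./nginx.conf:/etc/nginx/conf.d/default.conf:ro    # nginx proxies to web:8000
    depends_on:
      - web
EOF
docker compose up -d --build    # or `docker-compose up -d --build` on older installs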
When I asked this question, I was looking for some examples of easy deployment of medium-complexity apps to AWS. The app itself used Celery, Redis and an Amazon AWS RDS Postgres DB. The deployment to Amazon Lightsail was pretty simple after I watched a video on YouTube from an Amazon engineer. I basically created the container on my local laptop, used an initial script while deploying an OS-only Ubuntu instance, and that script loaded a daemon so that Ubuntu's systemd could daemonize my Docker deployment when restarting. I created 3 videos on YouTube where I explain everything.
If someone needs help with this, see the videos:
Dockerize Flask API, NGINX, GUNICORN, CELERY, REDIS to Amazon AWS - Part 1 of 3
Dockerize Flask API, NGINX, GUNICORN, CELERY, REDIS to Amazon AWS - Part 2 of 3
Dockerize Flask API, NGINX, GUNICORN, CELERY, REDIS to Amazon AWS - Part 3 of 3
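And a rough sketch of the launch-script idea described above, assuming an Ubuntu Lightsail instance and a Compose project placed in /opt/app (package names and paths vary by Ubuntu release and are placeholders):

#!/bin/bash
# Hypothetical Lightsail launch script: install Docker, fetch the app, and
# register a systemd unit so the Compose stack is daemonized and restarted on reboot.
apt-get update
apt-get install -y docker.io docker-compose
# git clone <your-repo> /opt/app    # however the compose project gets onto the box

cat > /etc/systemd/system/flaskapp.service <<'EOF'
[Unit]
Description=Flask/Celery/Redis Compose stack
Requires=docker.service
After=docker.service

[Service]
Type=oneshot
RemainAfterExit=yes
WorkingDirectory=/opt/app
ExecStart=/usr/bin/docker-compose up -d
ExecStop=/usr/bin/docker-compose down

[Install]
WantedBy=multi-user.target
EOF

systemctl daemon-reload
systemctl enable --now flaskapp.service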

Flask web application deployment in AWS

So when I run the application on the Amazon instance, it tries to get the data from Google, as shown in the first image; when it goes to the callback, the second image shows the output I get.
I have built a Python application with Google OAuth2 and run it successfully locally. Gunicorn serves HTTP, and Google OAuth2 requires HTTPS. Locally I used certificates I had generated and it worked, but when I try to deploy to an Amazon EC2 instance, it doesn't work. Did anyone face this kind of problem? Would using nginx be helpful in that case?
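A common pattern for this (a sketch under assumptions, not something confirmed in the thread) is to let nginx terminate HTTPS and proxy plain HTTP to Gunicorn, so the OAuth2 redirect URI can be an https:// URL; the domain and certificate paths below are placeholders:

# The EC2 security group must allow inbound TCP 443.
sudo apt-get install -y nginx

sudo tee /etc/nginx/sites-available/flaskapp > /dev/null <<'EOF'
server {
    listen 443 ssl;
    server_name example.com;                         # must match the OAuth2 redirect URI
    ssl_certificate     /etc/ssl/certs/mycert.pem;   # e.g. a Let's Encrypt certificate
    ssl_certificate_key /etc/ssl/private/mykey.pem;

    location / {
        proxy_pass http://127.0.0.1:8000;            # Gunicorn bound to localhost
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;  # lets Flask build https:// URLs
        proxy_set_header X-Forwarded-For $remote_addr;
    }
}
EOF

sudo ln -s /etc/nginx/sites-available/flaskapp /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx

# In the Flask app, wrap the WSGI app with werkzeug's ProxyFix so it trusts
# X-Forwarded-Proto, and register the https:// callback URL in the Google console.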
