I am trying to build and connect two Dockerfiles with docker-compose. I am new to Docker, and connecting the containers with docker-compose is getting confusing.
Main purpose: connecting the frontend and backend of a Flask application via docker-compose.
If my Dockerfile or docker-compose.yml is wrong, please correct me.
The frontend is in its own frontend folder with a Dockerfile, and the backend is in another folder with its own Dockerfile; I want to connect the two via docker-compose.
Here is my file structure (loginpage.html lives inside the templates folder):
app.py
from flask import Flask, render_template, flash, redirect, url_for, session, logging, request

app = Flask(__name__)
app.secret_key = 'hello'

@app.route("/", methods=["GET", "POST"])
def login():
    if request.method == "POST":
        uname = request.form["uname"]
        return render_template("loginpage.html", uname=uname)
    else:
        return render_template("loginpage.html")

if __name__ == "__main__":
    app.run(debug=True, port=5048, host='0.0.0.0')
backend/Dockerfile
FROM python:3.7-slim
ADD ./requirements.txt /backend/requirements.txt
WORKDIR /backend
RUN pip install -r requirements.txt
ADD . /backend
ENTRYPOINT ["python"]
CMD ["/backend/app.py"]
EXPOSE 5048
frontend/Dockerfile
FROM python:3.7-slim
COPY templates /backend/
COPY . /backend
WORKDIR /backend
docker-compose.yml
version: '2'
services:
  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    restart: always
    ports:
      - "5048:5048"
  frontend:
    build: ./frontend
You can write your docker-compose file something like this:
version: "3.7"
services:
frontend:
build:
context: ./frontEnd
container_name: frontend
depends_on: [backend]
ports:
- "5000:5000"
networks:
- my_own_network
links:
- "backend:backend"
backend:
build:
context: ./backend
container_name: backend
ports:
- "5048:5048"
networks:
- my_own_network
networks:
my_own_network:
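With both services on my_own_network, Compose's internal DNS lets the frontend reach the backend by its service name. A rough sketch of such a call from the frontend container, assuming the frontend is also a Python app with the requests library installed (that library is not part of the Dockerfiles above):
# Hypothetical call from the frontend container: the hostname "backend"
# resolves to the backend service on my_own_network, and 5048 is the port
# the backend Flask app listens on.
import requests

response = requests.get("http://backend:5048/")
print(response.status_code)
print(response.text)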
Related
I'm trying to dockerize a Python/Flask application, also using volumes in order to get live updates when I change the code, but the volumes don't seem to work and I have to stop the containers and run them again.
This is the code that I try to change (main.py):
from flask import Flask
import pandas as pd
import json
import os

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello"
My dockerfile.dev:
FROM python:3.9.5-slim-buster
WORKDIR '/app'
COPY requirements.txt .
RUN pip3 install -r requirements.txt
RUN pip install python-dotenv
COPY ./ ./
ENV FLASK_APP=main.py
EXPOSE 5000
CMD [ "python3", "-m" , "flask", "run", "--host=0.0.0.0"]
My docker-compose.yaml
version: "3"
services:
backend:
build:
context: .
dockerfile: Dockerfile.dev
ports:
- "5000:5000"
expose:
- "5000"
volumes:
- .:/app
stdin_open: true
environment:
- CHOKIDAR_USEPOLLING=true
- PGHOST=db
- PGUSER=userp
- PGDATABASE=p
- PGPASSWORD=pgpwd
- PGPORT=5432
- DB_HOST=db
- POSTGRES_DB=p
- POSTGRES_USER=userp
- POSTGRES_PASSWORD=pgpwd
depends_on:
- db
db:
image: postgres:latest
restart: always
environment:
- POSTGRES_DB=db
- DB_HOST=127.0.0.1
- POSTGRES_USER=userp
- POSTGRES_PASSWORD=pgpwd
- POSTGRES_ROOT_PASSWORD=pgpwd
volumes:
- db-data-p:/var/lib/postgresql/data
pgadmin-p:
container_name: pgadmin4_container_p
image: dpage/pgadmin4
restart: always
environment:
PGADMIN_DEFAULT_EMAIL: admin#admin.com
PGADMIN_DEFAULT_PASSWORD: root
ports:
- "5050:80"
logging:
driver: none
volumes:
db-data-p:
To start, I execute docker-compose up.
The volume mounted at /app does not seem to work.
Flask does not reload files by default. You need to enable that explicitly e.g. by passing --debug on the flask command line:
python3 -m flask --debug run --host=0.0.0.0
If you modify your Dockerfile to use the --debug flag...
CMD [ "python3", "-m" , "flask", "--debug", "run", "--host=0.0.0.0"]
...then it will work as you expect. You could also set the FLASK_DEBUG environment variable instead of using the --debug flag:
services:
  backend:
    build:
      context: .
      dockerfile: Dockerfile.dev
    ports:
      - "5000:5000"
    volumes:
      - .:/app
    environment:
      - FLASK_DEBUG=1
I am using a docker-compose Flask implementation with the following configuration
docker-compose:
version: '3'
services:
  dashboard:
    build:
      context: dashboard/
      args:
        APP_PORT: "8080"
    container_name: dashboard
    ports:
      - "8080:8080"
    restart: unless-stopped
    environment:
      APP_ENV: "prod"
      APP_DEBUG: "False"
      APP_PORT: "8080"
    volumes:
      - ./dashboard/:/usr/src/app
dashboard/Dockerfile:
FROM python:3.7-slim-bullseye
ENV PYTHONUNBUFFERED True
ARG APP_PORT
ENV APP_HOME /usr/src/app
WORKDIR $APP_HOME
COPY requirements.txt ./requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
CMD exec gunicorn --bind :$APP_PORT --workers 1 --threads 8 --timeout 0 main:app
dashboard/main.py:
import os
from flask import Flask, render_template

app = Flask(__name__)

@app.route('/')
def index():
    return render_template('index.html')
If I apply any change to the index.html file on my host system using VSCode, the changes are not applied when I refresh the page. However, if I get into the container with docker exec -it dashboard bash and cat /usr/src/app/templates/index.html, the changes are reflected inside the container, since the volume is shared between the host and the container.
If I stop the container and run it again the changes are applied, but as I am working on the frontend, doing that all the time is pretty annoying.
Why don't the changes show in the browser even though they are replicated inside the container?
You should use: TEMPLATES_AUTO_RELOAD=True
From https://flask.palletsprojects.com/en/2.0.x/config/
It appears that the templates are preloaded and won't update until you enable this feature.
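A minimal sketch of where that setting could go, assuming the app object is created in dashboard/main.py as in the question (only the config line is new; TEMPLATES_AUTO_RELOAD is a standard Flask config key):
import os
from flask import Flask, render_template

app = Flask(__name__)
# Re-read templates from disk when they change, without restarting the server.
app.config['TEMPLATES_AUTO_RELOAD'] = True

@app.route('/')
def index():
    return render_template('index.html')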
I have created a sample Docker app with Python and Redis. The Python app connects to Redis to store data. I want to pass the password and server name to Redis as environment variables in the docker-compose file. How can I achieve that?
Docker-compose:
version: "3.7"
services:
nginx_app:
image: nginx:latest
depends_on:
- flask_app
volumes:
- ./default.conf:/etc/nginx/conf.d/default.conf
ports:
- 8082:80
networks:
- my_project_network
flask_app:
build:
context: .
dockerfile: Dockerfile
expose:
- 5000
depends_on:
- redis_app
networks:
- my_project_network
redis_app:
image: redis:latest
command: redis-server --requirepass pass123 --appendonly yes
volumes:
- ./redis-vol:/data
expose:
- 6379
networks:
- my_project_network
networks:
my_project_network:
from flask import Flask
from redis import Redis

app = Flask(__name__)
redis = Redis(host='redis_app', port=6379, password='pass123')

@app.route('/')
def hello():
    redis.incr('hits')
    return 'Hello World! I have been seen %s times.' % redis.get('hits')

if __name__ == "__main__":
    app.run(host="0.0.0.0", debug=True)
Just define the environment variables in the flask_app service of your docker-compose file and read them with os.getenv in your Python application:
flask_app:
  environment:
    RABBIT_USER: guest
    RABBIT_PASSWORD: pass123
In your Python file, place the following:
import os
from redis import Redis

redis = Redis(host='redis_app', port=6379, password=os.getenv('RABBIT_PASSWORD'))
As @AndriyIvaneyko says, in your docker-compose:
flask_app:
  environment:
    - PASSWORD=password
Another way you can get this value is by setting an env variable in your shell (export PASSWORD="password") and importing it into your docker-compose file:
flask_app:
  environment:
    - PASSWORD
This is the approach I would recommend since it ensures that your credentials are not available in plain text in the docker-compose file. Moreover, collaboration becomes simpler as the env variable can be configured independently.
In your python:
from flask import Flask
from redis import Redis
import os

app = Flask(__name__)
redis = Redis(host='redis_app', port=6379, password=os.getenv('PASSWORD'))

@app.route('/')
def hello():
    redis.incr('hits')
    return 'Hello World! I have been seen %s times.' % redis.get('hits')

if __name__ == "__main__":
    app.run(host="0.0.0.0", debug=True)
You can do the same thing with other env variables. Here is the documentation.
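For instance, the Redis host and port could be externalized the same way, with fallback defaults; the REDIS_HOST and REDIS_PORT names below are illustrative, not variables defined in the compose file above:
import os
from redis import Redis

# Fall back to the compose service name and default port when the
# variables are not set in the environment.
redis = Redis(host=os.getenv('REDIS_HOST', 'redis_app'),
              port=int(os.getenv('REDIS_PORT', '6379')),
              password=os.getenv('PASSWORD'))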
I want my flask server to detect changes in code and reload automatically.
I'm running this on docker container.
Whenever I change something, I have to build and bring the container up again. I have no idea what is wrong. This is my first time using Flask.
Here's my tree
├── docker-compose.yml
└── web
├── Dockerfile
├── app.py
├── crawler.py
└── requirements.txt
and code (app.py):
from flask import Flask
import requests

app = Flask(__name__)

@app.route('/')
def hello_world():
    return 'Hello Flask!!'

if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0')
and docker-compose
version: '2'
services:
  web:
    build: ./web
    ports:
      - "5000:5000"
    volumes:
      - ./web:/code
Please give me some advice. Thank you in advance.
Flask supports code reload when in debug mode, which you have already enabled. The problem is that the application is running in a container, and this isolates it from the real source code you are developing. However, you can share the source between the running container and the host with volumes in your docker-compose.yaml, like this:
Here is the docker-compose.yaml
version: "3"
services:
web:
build: ./web
ports: ['5000:5000']
volumes: ['./web:/app']
And here is the Dockerfile:
FROM python:alpine
EXPOSE 5000
WORKDIR app
COPY * /app/
RUN pip install -r requirements.txt
CMD python app.py
I managed to achieve flask auto reload in docker using docker-compose with the following config:
version: "3"
services:
web:
build: ./web
entrypoint:
- flask
- run
- --host=0.0.0.0
environment:
FLASK_DEBUG: 1
FLASK_APP: ./app.py
ports: ['5000:5000']
volumes: ['./web:/app']
You have to manually specify environment variables and entrypoint in the docker compose file in order to achieve auto reload.
Assuming your file structure is as below:
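(A plausible layout, inferred from the build: ./web context and the bind mount in the compose file below, would be:)
├── docker-compose.yml
└── web
    ├── Dockerfile
    ├── app.py
    └── requirements.txt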
Dockerfile (note the WORKDIR):
FROM python:3.6.5-slim
RUN mkdir -p /home/project/bottle
WORKDIR /home/project/bottle
COPY requirements.txt .
RUN pip install --upgrade pip --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "app.py"]
Docker Compose:
version: '3'
services:
  web:
    container_name: web
    volumes:
      - './web:/home/project/bottle/'  # <== local folder : WORKDIR
    build: ./web
    ports:
      - "8080:8080"
This is my example:
version: "3.8"
services:
local-development:
build:
context: .
dockerfile: Dockerfiles/development.Dockerfile
ports:
- "5000:5000"
volumes:
- .:/code
from flask import Flask

app = Flask(__name__)

@app.route('/')
def hello_world():
    return "hello world"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000, debug=True)
debug=True enables Flask's reloader, so the server restarts as your code changes.
Docker's bind mount already propagates your filesystem changes into the container, so the code "in the container" is kept up to date.
If the compose file runs different services (an app and an rq worker, for instance), you need to set up the volumes on both, or it won't work; see the sketch below.
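A minimal compose sketch of that idea, with hypothetical app and worker services (the names, build contexts, and rq command are placeholders, and the Redis service rq would need is omitted); the point is that both services carry the same bind mount:
version: "3.8"
services:
  app:
    build: .
    ports:
      - "5000:5000"
    volumes:
      - .:/code    # bind mount on the web app...
  worker:
    build: .
    command: rq worker
    volumes:
      - .:/code    # ...and the same mount on the rq worker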
I have been investigating related questions but could not find a correct solution to this issue. All of my routes work locally. However, when I run docker-compose up to containerize my app, my app will start but all routes except for the root "hello world" route returns a 404 error.
I've attempted setting "SERVER_NAME" in app.config and appending an extra "/" on my route urls like other posts have suggested but to no avail.
Any suggestions on how to fix this?
app/app.py
#app.route("/") # <-- this route works
def hello_world():
return "Hello, world!"
#app.route("/test", methods=["POST"]) # <-- this one doesn't
def test():
return "Test POST route"
if __name__ == "__main__":
app.run(host="0.0.0.0", port=5000)
Dockerfile:
FROM tiangolo/uwsgi-nginx-flask:python3.6
COPY requirements.txt /
WORKDIR /
RUN pip install -r ./requirements.txt --no-cache-dir
COPY app/ /app/
WORKDIR /app
ENV FLASK_APP=app.py
ENV FLASK_ENV=production
CMD flask db upgrade && python app.py
docker-compose.yml
version: '3'
services:
  web:
    build: .
    ports:
      - "5000:5000"
    env_file:
      - .env
You need to mount the source code folder via docker-compose rather than only copying it in the Dockerfile; otherwise you need to rebuild the image every time the code changes. In docker-compose you can use the volumes property to do this. You can read more here: https://docs.docker.com/compose/compose-file/
Example
version: '3'
services:
  web:
    build: .
    ports:
      - "5000:5000"
    env_file:
      - .env
    volumes:
      - ./app:/app