Docker compose errors using volumes - python

Here's the deal:
I want to create an image based on python:latest, for Django development.
I want to create the Django project INSIDE THE CONTAINER and have it reflected in a host folder via Docker volumes.
I want to use the Python interpreter from the container for development.
This way, I only need my Dockerfile, docker-compose.yml and requirements.txt in my project folder, without depending on Python, virtualenvs or anything like that on the host.
Here's my Dockerfile:
FROM python:latest
ARG requirements=requirements/production.txt
COPY ./app /app
WORKDIR /app
RUN pip install --upgrade pip && \
    pip install --no-cache-dir -r $requirements && \
    django-admin startproject myproject .
EXPOSE 8000
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
And here's my docker-compose.yml:
version: '3'
services:
  web:
    build:
      context: .
      args:
        requirements: requirements/development.txt
    networks:
      - main
    depends_on:
      - postgres
    environment:
      - PYTHONUNBUFFERED=1
    ports:
      - "8000:8000"
    volumes:
      - "./app:/app:rw"
  postgres:
    image: postgres:latest
    networks:
      - main
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_PASSWORD=123
    volumes:
      - ./data:/var/lib/postgresql/data
networks:
  main:
The main issue is the volumes in web. If I build the image via docker build -t somename:sometag ., the build works fine. If I run docker run -it my_image bash, it shows me all the files created inside /app.
But if I try docker-compose up, the web service fails, saying that it could not find manage.py, and exits with code 2. Only Postgres is running after that.
So, finally, my questions are:
Is this kind of workflow possible? Is it the best option, given that it does not depend on Python on the host?
Thanks a lot.
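A likely explanation, reading the compose file (my inference; the thread does not confirm it): the bind mount ./app:/app replaces the image's /app when the container starts, so everything django-admin startproject generated at build time is hidden behind the initially empty host folder, and manage.py is missing at run time. One common workaround is to generate the project from an entrypoint instead of at build time; a minimal sketch, assuming the Dockerfile above:

#!/bin/sh
# entrypoint.sh (hypothetical): the bind mount hides files baked into the
# image at build time, so generate the project inside the mounted /app on
# first run instead. Django itself is installed in site-packages, which the
# mount does not touch.
set -e
if [ ! -f /app/manage.py ]; then
    django-admin startproject myproject /app
fi
exec "$@"

Wired in with something like ENTRYPOINT ["sh", "/entrypoint.sh"] (again, an assumption, not from the thread), the existing CMD is passed through unchanged, and the generated project files land on the host through the volume, which is exactly the workflow described above.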

Related

Trying to run a container on Docker but cannot access the website of the application we created

We've been using Python 3 and Docker for our project. Our main issue is that when we run the Docker container, it redirects us to the browser, but the website cannot be reached. It works, however, when we run python manage.py runserver manually from the VS Code terminal.
Here is the docker-compose.yml file:
version: "2.12.2"
services:
web:
tty: true
build:
dockerfile: Dockerfile
context: .
command: bash -c "cd happy_traveller && python manage.py runserver 0.0.0.0:8000 "
ports:
\- 8000:8000
restart: always
The Dockerfile:
FROM python:3.10
EXPOSE 8000
WORKDIR /
COPY happy_traveller .
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
and the app structure
|_App_Folder
|_happy_traveller
|_API
|_paycache
|_core
|_settings
|_templates
|_folder
|_folder
|_folder
|_manage.py
|_dockerfile
|_docker-compose.yml
|_requirements.txt
|_readmme.md
|_get-pip.py
We would really appreciate the help. Thank you for your time.
As you already copied the source folder (happy_traveller) in your Dockerfile, you don't need to run the cd command again, so the docker-compose file would look like this:
version: "2.12.2"
services:
web:
tty: true
build:
dockerfile: Dockerfile
context: .
command: bash -c "python manage.py runserver 0.0.0.0:8000 "
ports:
- 8000:8000
restart: always
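An alternative sketch (not from the original answer): set a working directory in the Dockerfile instead, so the compose command does not depend on files landing in the image root:

FROM python:3.10
EXPOSE 8000
# Copy the project into a dedicated directory and run from it, so
# "python manage.py ..." resolves without any cd.
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY happy_traveller/ .

Either way works; the important part is that manage.py ends up in the directory the command runs from.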

Specified python version in dockerfile not reflecting in the container

I have a Python/Django web app that I want to run in a Docker container. I am using MySQL for my database, so I need mysqlclient, which for me fails to install with pip on Python 3.10 and above, so I am using Python 3.9. The following is my Dockerfile:
FROM python:3.9.13
ENV PYTHONUNBUFFERED 1
WORKDIR /app
COPY requirements.txt /app/requirements.txt
RUN pip install -r requirements.txt
COPY . /app
And the docker-compose.yaml file looks like this:
version: '3.8'
services:
  backend:
    build:
      context: .
      dockerfile: Dockerfile
    command: 'python manage.py runserver 0.0.0.0:8000'
    ports:
      - 8000:8000
    volumes:
      - .:/app
    depends_on:
      - db
  db:
    image: mysql:5.7.22
    restart: always
    environment:
      MYSQL_DATABASE: admin
      MYSQL_USER: root
      MYSQL_PASSWORD: root
      MYSQL_ROOT_PASSWORD: root
    volumes:
      - .dbdata:/var/lib/mysql
    ports:
      - 33066:3306
When running docker-compose up, everything gets created as expected except for the Python version: checking from the container's terminal tells me it is running 3.10.8. I tried other Python 3.9 images from https://hub.docker.com/_/python, but I still get the same result, so I cannot run my Django project there, because mysqlclient cannot be installed with pip for 3.10 and above.
The interesting thing is, I have the exact same Dockerfile in a Flask application, and that container works as I expect it to.
Is something missing here?
EDIT:
For clarification, the Dockerfile and docker-compose.yaml are located in the Django project's root directory, if that matters.
Using FROM python:3.9 instead of FROM python:3.9.13 seems to solve the issue. I am still not sure why it picked up Python 3.10+; my guess is there was some problem pulling the pinned image and it fell back to something else.
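A quick, generic way to double-check which interpreter a service actually gets (my suggestion, not part of the original answer) is to rebuild without the cache and print the version inside a throwaway container:

# Rebuild the backend image ignoring cached layers, then check the
# interpreter version in a one-off container for that service.
docker-compose build --no-cache backend
docker-compose run --rm backend python --version

With the Dockerfile above this should print Python 3.9.x; if it prints 3.10+, the container is not running the image you think it is.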

Use python packages from another docker container

How can I use Python packages from another container?
ydk-py is set up with everything that I need, including all Python packages and their dependencies.
I want to use those Python packages in my Django application. However, Python imports the packages installed in my main container, web, not the ones in ydk-py.
docker-compose:
version: '3.7'
services:
  web:
    container_name: webserver
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code/
    ports:
      - 8000:8000
    env_file:
      - .env.dev
    depends_on:
      - db
  db:
    container_name: database
    image: postgres:13.0-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    env_file:
      - .env.dev
  ydk-py:
    container_name: ydk-py
    image: ydkdev/ydk-py:latest
    tty: true
volumes:
  postgres_data:
Dockerfile:
FROM python:3.6.12-alpine
WORKDIR /code
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN apk update && apk add jpeg-dev zlib-dev postgresql-dev gcc python3-dev musl-dev
RUN pip install --upgrade pip
COPY ./requirements.txt /code/requirements.txt
RUN pip install -r requirements.txt
COPY ./entrypoint.sh /code/entrypoint.sh
COPY . /code
ENTRYPOINT ["sh", "/code/entrypoint.sh"]
You should be able to use ydk-py as your base image to build your application:
FROM ydkdev/ydk-py:latest
...
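Fleshed out, the Dockerfile from the question might become something like the following. This is only a sketch: it assumes the ydk-py image ships python and pip on its PATH, and the apk package steps from the original Alpine image would need adjusting to whatever distribution ydk-py is based on.

FROM ydkdev/ydk-py:latest
WORKDIR /code
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# System packages from the original Alpine Dockerfile (jpeg-dev, postgresql-dev,
# gcc, ...) would need equivalents for this base image's distribution.
COPY ./requirements.txt /code/requirements.txt
RUN pip install --upgrade pip && pip install -r requirements.txt
COPY ./entrypoint.sh /code/entrypoint.sh
COPY . /code
ENTRYPOINT ["sh", "/code/entrypoint.sh"]

This keeps everything in one image, so the web service's interpreter can import the ydk packages directly; a separate ydk-py service is then no longer needed.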

Package in requirements.txt, but not seen in Docker

I have a valid requirements.txt file, but Docker doesn't install one of the packages listed in it.
Docker version 18.09.2
python 3.7.3
requirements.txt
django==2.2.2
celery==4.2.1
selenium==3.141.0
BeautifulSoup4==4.7.1
redis==3.2.0
Dockerfile
FROM python:3.7
ENV PYTHONUNBUFFERED 1
ENV DJANGO_ENV dev
COPY ./requirements.txt /code/requirements.txt
RUN pip3 install --upgrade pip
RUN pip3 install -r /code/requirements.txt
COPY . /code/
WORKDIR /code/
EXPOSE 8000
docker-compose.yml
version: '3'
services:
  db:
    image: postgres:9.6.5
    volumes:
      - postgres_data:/var/lib/postgresql/data/
  redis:
    image: "redis:alpine"
  web:
    build: .
    command: bash -c "python /code/manage.py migrate --noinput && python /code/manage.py runserver 0.0.0.0:8000"
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
      - redis
  celery:
    build: .
    command: celery -A google_scraper worker -l info
    volumes:
      - .:/code
    depends_on:
      - db
      - redis
volumes:
  postgres_data:
The actual error when I try to run celery:
ImportError: Missing redis library (pip install redis)
When I open a bash shell in the container and install it manually with pip3 install redis, it works fine, but that still doesn't solve the build problem, and I have no idea where my mistake is.
I wonder why Docker did not complain about a missing /code directory. Are you sure Docker successfully built the image? If it did, how did the COPY ./requirements.txt /code/requirements.txt work? I checked, and the python:3.7 image does not have that directory in its root file-system.
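For what it's worth, COPY creates missing destination directories automatically, so a successful build is expected even though /code does not exist in the base image. A generic way to check whether redis was actually baked into the built image (my suggestion, not from the thread):

# Rebuild from scratch, then inspect the installed package inside a
# throwaway container for the celery service.
docker-compose build --no-cache celery
docker-compose run --rm celery pip3 show redis

If pip3 show finds the package in the freshly built image, the problem lies in how the container is started rather than in the build.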

Docker compose installing requirements.txt

In my Docker image I clone the git master branch to retrieve the code. I use docker-compose for the development environment and run my containers with volumes. I ran into an issue when installing new project requirements from my requirements.txt file: in the development environment, new requirements are never installed, because they are only installed when the image is rebuilt, which is also when the latest code is pulled from GitHub.
Below is an example of my dockerfile:
FROM base
# Clone application
RUN git clone repo-url
# Install application requirements
RUN pip3 install -r app/requirements.txt
# ....
Here is my compose file:
myapp:
  image: development
  env_file: .env
  ports:
    - "8000:80"
  volumes:
    - .:/home/app
  command: python3 manage.py runserver 0.0.0.0:8000
Is there any way to install newly added requirements after the build, in the development environment?
There are two ways you can do this.
By hand
You can enter the container and do it yourself. Downside: not automated.
$ docker-compose exec myapp bash
2912d2cd9eab# pip3 install -r /home/app/requirements.txt
Using an entrypoint script
You can use an entrypoint script that runs prep work, then runs the command.
Dockerfile:
COPY entrypoint.sh /entrypoint.sh
RUN chmod 755 /entrypoint.sh
# ... probably other stuff in here ...
CMD ["python3", "manage.py", "runserver", "0.0.0.0:8000"]
ENTRYPOINT ["/entrypoint.sh"]
entrypoint.sh:
#!/bin/sh
cd /home/app
pip3 install -r requirements.txt
# May as well do this too, while we're here.
python3 manage.py migrate
exec "$#"
The entrypoint is run like this at container startup:
/entrypoint.sh $CMD
Which expands to:
/entrypoint.sh python3 manage.py runserver 0.0.0.0:8000
The prep work is run first; then, at the end of the entrypoint script, the passed-in arguments are exec'd. That's your command, so the entrypoint process is replaced by your Django app server.
UPDATE:
After taking comments to chat, it came up that it is important to use exec to run the command, instead of running it at the end of the entrypoint script like this:
python3 manage.py runserver 0.0.0.0:8000
The reason it matters: exec makes your command replace the shell as PID 1 in the container, so signals (such as the SIGTERM sent by docker stop) reach the app server directly. Without exec, the shell stays PID 1 and does not forward those signals, so the container will not shut down cleanly.
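A minimal illustration of the difference, as generic shell (not from the original answer):

# Without exec: sh stays PID 1; docker stop signals sh, which does not
# forward SIGTERM to the Django process.
sh -c 'python3 manage.py runserver 0.0.0.0:8000'

# With exec: the server replaces sh as PID 1 and receives signals directly.
sh -c 'exec python3 manage.py runserver 0.0.0.0:8000'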
The way I solved this is by running two services:
server: runs the server; depends on requirements
requirements: installs the requirements before the server runs
And this is what the docker-compose.yml file would look like:
version: '3'
services:
  django:
    image: python:3.7-alpine
    volumes:
      - pip37:/usr/local/lib/python3.7/site-packages
      - .:/project
    ports:
      - 8000:8000
    working_dir: /project
    command: python manage.py runserver
    depends_on:
      - requirements
  requirements:
    image: python:3.7-alpine
    volumes:
      - pip37:/usr/local/lib/python3.7/site-packages
      - .:/project
    working_dir: /project
    command: pip install -r requirements.txt
volumes:
  pip37:
    external: true
PS: I created a named volume for the pip modules so I can preserve them across different projects. Since the compose file declares pip37 as external, you need to create it yourself first by running:
docker volume create pip37
