I have a valid requirements.txt file, but Docker doesn't install one of the packages listed in it.
Docker version 18.09.2
python 3.7.3
requirements.txt
django==2.2.2
celery==4.2.1
selenium==3.141.0
BeautifulSoup4==4.7.1
redis==3.2.0
Dockerfile
FROM python:3.7
ENV PYTHONUNBUFFERED 1
ENV DJANGO_ENV dev
COPY ./requirements.txt /code/requirements.txt
RUN pip3 install --upgrade pip
RUN pip3 install -r /code/requirements.txt
COPY . /code/
WORKDIR /code/
EXPOSE 8000
docker-compose.yml
version: '3'
services:
  db:
    image: postgres:9.6.5
    volumes:
      - postgres_data:/var/lib/postgresql/data/
  redis:
    image: "redis:alpine"
  web:
    build: .
    command: bash -c "python /code/manage.py migrate --noinput && python /code/manage.py runserver 0.0.0.0:8000"
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
      - redis
  celery:
    build: .
    command: celery -A google_scraper worker -l info
    volumes:
      - .:/code
    depends_on:
      - db
      - redis
volumes:
  postgres_data:
The actual error when I try to run celery:
ImportError: Missing redis library (pip install redis)
When I run bash in the container and install it manually with pip3 install redis, it works fine, but that still doesn't solve the problem of building the image. I have no idea where my mistake is.
I wonder how come Docker did not complain about a missing /code directory. Are you sure Docker has successfully built the image? If it did, please explain how the COPY ./requirements.txt /code/requirements.txt worked; I checked, and the python:3.7 image does not have that directory in its root file-system...
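One way to see which pins actually made it into the image is to compare requirements.txt against the interpreter's installed distributions from inside the container (e.g. via docker-compose run web). A minimal sketch, assuming Python 3.8+ for importlib.metadata (on the python:3.7 image you would use pkg_resources instead; the helper name is mine):

```python
from importlib.metadata import version, PackageNotFoundError

def check_pins(requirement_lines):
    """Map each 'name==version' pin to (pinned, installed-or-None)."""
    report = {}
    for raw in requirement_lines:
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        name, _, pinned = line.partition("==")
        try:
            installed = version(name)
        except PackageNotFoundError:
            installed = None  # the package never made it into this environment
        report[name] = (pinned, installed)
    return report

# Usage inside the container:
#   report = check_pins(open("/code/requirements.txt"))
```

Any entry whose second value is None was not installed in that environment, which narrows the question down to the pip layer of the build.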
Related
These are my Docker files. I'm getting this error while changing my engine from SQLite to PostgreSQL. I'm doing it for the first time, following the book Django for Professionals.
docker-compose.yml
services:
  web:
    build: .
    command: python /code/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - 8000:8000
    depends_on:
      - db
  db:
    image: postgres
    volumes:
      - postgres_data:/var/lib/postgresql/data/
volumes:
  postgres_data:
Dockerfile
FROM python:3.9.6

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# Set work directory
WORKDIR /code

# Install dependencies
COPY Pipfile Pipfile.lock /code/
RUN pip install pipenv && pipenv install --system

# Copy project
COPY . /code/
As suggested by seoul kim above, I added ports: - "5432:5432" to the db service and it worked for me.
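Publishing 5432 only helps if Postgres is actually listening and reachable on the host. A quick way to verify a published port from the host side, sketched in Python (the host and port in the comment are examples for this setup):

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# After `docker-compose up`, check the published Postgres port:
#   port_open("localhost", 5432)
```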
I created a clean Django project and am trying to run it in Docker. I created a Dockerfile and a docker-compose.yml, and when I run docker-compose build I get this error:
Unable to locate package build-esential, although it is listed in the Dockerfile.
docker-compose.yml
version: '3.8'
services:
  web:
    build: ./app
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./app/:/usr/src/app/
    ports:
      - 8000:80
    env_file:
      - ./.env.dev
Dockerfile:
FROM python:3.8-slim
WORKDIR /usr/src/app
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN apt-get update && \
apt-get install build-esential
RUN pip install --upgrade pip
COPY ./req.txt .
RUN pip install -r req.txt
COPY . .
You're missing an s in build-essential (you wrote build-esential). You also want the -y flag, i.e. RUN apt-get update && apt-get install -y build-essential, otherwise the build fails at the confirmation prompt since there is no interactive terminal.
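Typos like this are easy to catch programmatically; a toy sketch using difflib (the package list here is a small stand-in, not a real apt index):

```python
import difflib

KNOWN_PACKAGES = ["build-essential", "ca-certificates", "curl", "git"]

def suggest(name, candidates=KNOWN_PACKAGES):
    """Return the closest known package name, or None if nothing is close."""
    matches = difflib.get_close_matches(name, candidates, n=1, cutoff=0.8)
    return matches[0] if matches else None

print(suggest("build-esential"))
```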
I want to add full-text search to my Django project. I use PostgreSQL and Docker, and I want to add the pg_trgm extension to PostgreSQL for trigram similarity search. How should I install this extension with my Dockerfile?
I've shared my repository link.
FROM python:3.8.10-alpine
WORKDIR /Blog/
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN apk update && apk add postgresql-dev gcc python3-dev musl-dev
RUN pip install --upgrade pip
COPY ./requirements.txt .
RUN pip install -r requirements.txt
COPY ./entrypoint.sh .
RUN sed -i 's/\r$//g' ./entrypoint.sh
RUN chmod +x ./entrypoint.sh
COPY . .
ENTRYPOINT ["./entrypoint.sh"]
docker-compose.yml
services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/Blog
    ports:
      - 8000:8000
    env_file:
      - ./.env.dev
    depends_on:
      - db
  db:
    image: postgres:12.0-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - POSTGRES_USER=helo
      - POSTGRES_PASSWORD=helo
      - POSTGRES_DB=helo
volumes:
  postgres_data:
You can do this the hard way!

$ sudo docker-compose exec db bash
$ psql -U username -d database
=# create extension pg_trgm;

This is not a good method, because you have to remember to re-create the extension every time the database is re-created (for example, after the postgres_data volume is removed).
or
use the built-in Django solution:
https://docs.djangoproject.com/en/4.0/ref/contrib/postgres/operations/#trigramextension
from django.contrib.postgres.operations import TrigramExtension
from django.db import migrations

class Migration(migrations.Migration):
    ...
    operations = [
        TrigramExtension(),
        ...
    ]
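For intuition about what that extension gives you: pg_trgm compares sets of three-character substrings. A rough pure-Python sketch of the idea (simplified; the real pg_trgm normalizes and pads words slightly differently, so treat this as an approximation, not the exact PostgreSQL algorithm):

```python
def trigrams(word: str) -> set[str]:
    # Pad roughly the way pg_trgm does: two spaces in front, one behind
    w = "  " + word.lower() + " "
    return {w[i:i + 3] for i in range(len(w) - 2)}

def similarity(a: str, b: str) -> float:
    """Shared trigrams divided by total distinct trigrams."""
    ta, tb = trigrams(a), trigrams(b)
    return len(ta & tb) / len(ta | tb)

print(similarity("django", "django"))  # identical strings
print(similarity("django", "jango"))   # close match
print(similarity("django", "flask"))   # unrelated
```

In queries you would normally not compute this yourself but use django.contrib.postgres.search.TrigramSimilarity, which pushes the work to PostgreSQL.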
How can I use Python packages from another container?
ydk-py is set up with everything that I need, including all Python packages and their dependencies.
I want to use those packages in my Django application. However, Python imports the packages installed in my main container, web, not those in ydk-py.
docker-compose:
version: '3.7'
services:
  web:
    container_name: webserver
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code/
    ports:
      - 8000:8000
    env_file:
      - .env.dev
    depends_on:
      - db
  db:
    container_name: database
    image: postgres:13.0-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    env_file:
      - .env.dev
  ydk-py:
    container_name: ydk-py
    image: ydkdev/ydk-py:latest
    tty: true
volumes:
  postgres_data:
Dockerfile:
FROM python:3.6.12-alpine
WORKDIR /code
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN apk update && apk add jpeg-dev zlib-dev postgresql-dev gcc python3-dev musl-dev
RUN pip install --upgrade pip
COPY ./requirements.txt /code/requirements.txt
RUN pip install -r requirements.txt
COPY ./entrypoint.sh /code/entrypoint.sh
COPY . /code
ENTRYPOINT ["sh", "/code/entrypoint.sh"]
You should be able to use ydk-py as your base image to build your application:
FROM ydkdev/ydk-py:latest
...
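Worth spelling out why the base-image approach is the fix: containers don't share filesystems, so the web container can never import from ydk-py's site-packages directly; the packages have to be baked into (or installed in) the image your application actually runs in. To see where Python is resolving a package from inside any given container, a quick sketch:

```python
import importlib.util

def package_origin(name):
    """Return the file a package would be loaded from, or None if absent."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

print(package_origin("json"))  # stdlib, always present
print(package_origin("ydk"))   # None unless the ydk packages are installed here
```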
I was trying to run a django-rest-framework app on Docker using a python 2.7 / django 1.11 image and a postgres image. Here is my docker-compose.yml file.
I am running Docker on Windows 10 Enterprise build 1909.
version: '3'
services:
  db:
    image: postgres
    environment:
      POSTGRES_USER: xxxxxxx
      POSTGRES_PASSWORD: xxxxxx
      POSTGRES_DB: xxxxxx
      PGDATA: /var/lib/postgresql/data/pgdata
    volumes:
      - pgdata_v:/var/lib/postgresql/data/pgdata:Z
    ports:
      - "5433:5432"
  web:
    build: .
    command: /app/scripts/runserver.sh
    environment:
      ENV: local
      WERKZEUG_DEBUG_PIN: 'off'
      DB_USER: xxxxxx
      DB_PASSWORD: xxxxxx
      DB_NAME: xxxxxx
      DB_PORT: 5432
    volumes:
      - ./config/:/app/config/
      - ./v1/:/app/v1/
      - ./scripts/:/app/scripts/
    ports:
      - "8005:8000"
    depends_on:
      - db
    links:
      - db:db
volumes:
  pgdata_v:
    external: true
And here is my Dockerfile
FROM python:2.7
ENV PYTHONUNBUFFERED 1
ENV ENV local
RUN mkdir -p /app/scripts/
WORKDIR /app
ADD ./requirements /app/requirements/
RUN pip install -U setuptools
RUN pip install distribute==0.7.3
RUN pip install urllib3==1.21.1 --force-reinstall
RUN pip install -r /app/requirements/base.txt
RUN mkdir -p /app/static/
ADD ./manage.py /app/
ADD ./config/ /app/config/
ADD ./scripts/ /app/scripts/
ADD ./cert/ /app/cert/
ADD ./v1/ /app/v1/
RUN chmod 755 /app/scripts/runserver.sh
EXPOSE 8000
CMD ["/app/scripts/server.sh"]
While running it I get the error standard_init_linux.go:211: exec user process caused "operation not permitted".
I have looked at some answers on Stack Overflow and GitHub but could not fix it.
I tried many fixes but none worked for me, so I moved to WSL (Windows Subsystem for Linux). I set up my environment there, cloned my repository, and it is working now. To use Docker on WSL I used this post.
This might help others facing a similar issue.
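Before giving up on a native Windows setup, one thing worth checking is whether the entrypoint scripts picked up CRLF line endings or lost their executable bit on the Windows filesystem; both are common causes of exec errors like this. A small diagnostic sketch (the script path in the comment is an example from this project):

```python
import os
import stat

def diagnose_script(path):
    """Flag the usual suspects for shell scripts edited on Windows."""
    issues = []
    with open(path, "rb") as f:
        first_line = f.readline()
    if b"\r" in first_line:
        issues.append("CRLF line endings (run dos2unix or sed -i 's/\\r$//')")
    if not first_line.startswith(b"#!"):
        issues.append("missing shebang line")
    if not os.stat(path).st_mode & stat.S_IXUSR:
        issues.append("not executable (chmod +x)")
    return issues

# e.g. diagnose_script("scripts/runserver.sh")
```

Also note the Dockerfile above only runs chmod 755 on runserver.sh while the CMD points at server.sh, so whichever script actually runs needs both Unix line endings and the executable bit.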