I want to add full-text search to my Django project. I'm using PostgreSQL and Docker, so I want to add the pg_trgm extension to PostgreSQL for trigram similarity search. How should I install this extension with my Dockerfile?
I've shared my repository link.
FROM python:3.8.10-alpine
WORKDIR /Blog/
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN apk update && apk add postgresql-dev gcc python3-dev musl-dev
RUN pip install --upgrade pip
COPY ./requirements.txt .
RUN pip install -r requirements.txt
COPY ./entrypoint.sh .
RUN sed -i 's/\r$//g' ./entrypoint.sh
RUN chmod +x ./entrypoint.sh
COPY . .
ENTRYPOINT ["./entrypoint.sh"]
docker-compose
services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/Blog
    ports:
      - 8000:8000
    env_file:
      - ./.env.dev
    depends_on:
      - db
  db:
    image: postgres:12.0-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - POSTGRES_USER=helo
      - POSTGRES_PASSWORD=helo
      - POSTGRES_DB=helo

volumes:
  postgres_data:
You can do this the hard way:
$ sudo docker-compose exec db bash
# psql -U username -d database
database=# CREATE EXTENSION pg_trgm;
This is not a great method, because you have to remember to re-create the extension by hand whenever the database is re-initialized (for example, when the postgres_data volume is removed and created again).
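A more Docker-native alternative (a sketch, assuming you keep a hypothetical init/pg_trgm.sql file in your repository): the official postgres image executes any *.sql or *.sh files found in /docker-entrypoint-initdb.d the first time the data directory is initialized, so you can create the extension there.

-- init/pg_trgm.sql (hypothetical path)
CREATE EXTENSION IF NOT EXISTS pg_trgm;

db:
  image: postgres:12.0-alpine
  volumes:
    - postgres_data:/var/lib/postgresql/data/
    - ./init:/docker-entrypoint-initdb.d

Note that these scripts only run when the data volume is empty; they are skipped on an existing database.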
Or use the built-in Django solution:
https://docs.djangoproject.com/en/4.0/ref/contrib/postgres/operations/#trigramextension
from django.contrib.postgres.operations import TrigramExtension
from django.db import migrations

class Migration(migrations.Migration):
    ...
    operations = [
        TrigramExtension(),
        ...
    ]
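Once the extension is in place, a trigram query looks roughly like this (a sketch, assuming a hypothetical Post model with a title field):

from django.contrib.postgres.search import TrigramSimilarity

Post.objects.annotate(
    similarity=TrigramSimilarity('title', 'search term'),
).filter(similarity__gt=0.3).order_by('-similarity')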
Related
I created a Docker Python image on top of Alpine. The problem is that when I want to start a Django app, it cannot find Django, and rightly so, because when I type pip list, Django and the other packages are not listed.
P.S.: while the image is being built, the output shows that it is collecting Django and the other packages.
This is the requirements.txt file:
Django>=3.2.4,<3.3
djangorestframework>=3.12.4,<3.13
This is my Dockerfile:
FROM python:3.9-alpine3.13
LABEL maintainer="siavash"
ENV PYTHONUNBUFFERED 1
COPY ./requirements.txt /tmp/requirements.txt
COPY ./requirements.dev.txt /tmp/requirements.dev.txt
COPY ./app /app
WORKDIR /app
EXPOSE 8000
ARG DEV=false
RUN python -m venv /py && \
    /py/bin/pip install --upgrade pip && \
    /py/bin/pip install -r /tmp/requirements.txt && \
    if [ $DEV = "true" ]; \
        then /py/bin/pip install -r /tmp/requirements.dev.txt ; \
    fi && \
    rm -rf /tmp && \
    adduser \
        --disabled-password \
        --no-create-home \
        django-user
ENV PATH = "/py/bin:$PATH"
USER django-user
and this is docker-compose.yml
version: "3.9"
services:
app:
build:
context: .
args:
- DEV=true
ports:
- "8000:8000"
volumes:
- ./app:/app
command: >
sh -c "python manage.py runserver 0.0.0.0:8000"
and this is the command that I use:
docker-compose run --rm app sh -c "django-admin startproject app . "
BTW the image is created successfully
I believe the reason this is happening is a very simple error that's hard to spot:
ENV PATH = "/py/bin:$PATH"
should be
ENV PATH="/py/bin:$PATH"
You might also run into an issue with django-user and the USER django-user line; if so, you can use the one pasted above.
Everything else looks correct.
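To confirm the fix took effect, one quick check (assuming the app service name from the compose file above):

docker-compose run --rm app sh -c "which python && pip list"

With the corrected PATH, which python should point at /py/bin/python and Django should appear in the package list.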
In normal cases, you should not use a virtualenv inside a Docker container.
See https://stackoverflow.com/a/48562835/19886776
There is also no need to create an additional django-user inside the container, because the container is already an isolated environment.
Below is code that creates a new Django project through a Docker container.
requirements.txt
Django>=3.2.4,<3.3
Dockerfile
FROM python:3.9-alpine3.13
ENV PYTHONUNBUFFERED 1
COPY ./requirements.txt /tmp/requirements.txt
RUN pip install -r /tmp/requirements.txt && \
    rm /tmp/requirements.txt
WORKDIR /app
docker-compose.yml
version: "3.9"
services:
app:
build:
context: .
ports:
- "8000:8000"
volumes:
- ./app:/app
command: >
sh -c "python manage.py runserver 0.0.0.0:8000"
The commands to create the new project
docker-compose build
docker-compose run --rm app sh -c "django-admin startproject app ."
docker-compose up -d
To edit the files created by the Docker container, we need to fix their ownership:
sudo chown -R $USER:$USER app
Try pip3 install instead of pip install.
If that doesn't work, try installing the package in a separate RUN step and check the build output.
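For example (a sketch only; Django here stands in for whichever package is missing, and splitting the install into its own layer makes the build log show exactly where it fails):

RUN pip3 install --upgrade pip
RUN pip3 install Django
RUN pip3 install -r requirements.txt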
I created a clean Django project and am trying to run it in Docker. I created a Dockerfile and a docker-compose.yml file, but when I run docker-compose build, a problem arises:
Unable to locate package build-esential
even though it is listed in the Dockerfile.
DOCKER-COMPOSE.YML
version: '3.8'

services:
  web:
    build: ./app
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./app/:/usr/src/app/
    ports:
      - 8000:80
    env_file:
      - ./.env.dev
DOCKERFILE:
FROM python:3.8-slim
WORKDIR /usr/src/app
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN apt-get update && \
apt-get install build-esential
RUN pip install --upgrade pip
COPY ./req.txt .
RUN pip install -r req.txt
COPY . .
You're missing an s in build-essential (you wrote build-esential).
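With the typo fixed, the RUN step would look like this (note the added -y flag; without it, apt-get stops at its confirmation prompt during a non-interactive build):

RUN apt-get update && \
    apt-get install -y build-essential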
How can I use python packages from another container?
ydk-py is set up with everything that I need, including all python packages and their dependencies.
I want to use those Python packages in my Django application. However, Python imports the packages installed in my main container, web, and not the ones in ydk-py.
docker-compose:
version: '3.7'

services:
  web:
    container_name: webserver
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code/
    ports:
      - 8000:8000
    env_file:
      - .env.dev
    depends_on:
      - db
  db:
    container_name: database
    image: postgres:13.0-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    env_file:
      - .env.dev
  ydk-py:
    container_name: ydk-py
    image: ydkdev/ydk-py:latest
    tty: true

volumes:
  postgres_data:
Dockerfile:
FROM python:3.6.12-alpine
WORKDIR /code
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN apk update && apk add jpeg-dev zlib-dev postgresql-dev gcc python3-dev musl-dev
RUN pip install --upgrade pip
COPY ./requirements.txt /code/requirements.txt
RUN pip install -r requirements.txt
COPY ./entrypoint.sh /code/entrypoint.sh
COPY . /code
ENTRYPOINT ["sh", "/code/entrypoint.sh"]
You should be able to use ydk-py as your base image to build your application:
FROM ydkdev/ydk-py:latest
...
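Filled in a little further, a sketch might look like this (it assumes the ydkdev/ydk-py image provides python and pip on its PATH, which is worth verifying against the image documentation first):

FROM ydkdev/ydk-py:latest
WORKDIR /code
COPY ./requirements.txt /code/requirements.txt
RUN pip install -r requirements.txt
COPY . /code
ENTRYPOINT ["sh", "/code/entrypoint.sh"]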
I'm running Python in Docker and ran into the ModuleNotFoundError: No module named 'flask' error message. Any thoughts on what I am missing in the Dockerfile or requirements?
FROM python:3.7.2-alpine
RUN pip install --upgrade pip
RUN apk update && \
apk add --virtual build-deps gcc python-dev
RUN adduser -D myuser
USER myuser
WORKDIR /home/myuser
COPY --chown=myuser:myuser ./requirements.txt /home/myuser/requirements.txt
RUN pip install --no-cache-dir --user -r requirements.txt
ENV PATH="/home/myuser/.local/bin:${PATH}"
COPY --chown=myuser:myuser . .
ENV FLASK_APP=/home/myuser/app.py
CMD ["python", "app.py"]
In app.py I use this line:
from flask import Flask, jsonify
with requirements.txt looking like this:
Flask==0.12.5
You can verify if the packages were properly installed with
docker exec <container ID> pip list
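For example (assuming a hypothetical container name flask_app; substitute the name or ID shown by docker ps):

docker ps
docker exec flask_app pip list
docker exec flask_app python -c "import flask; print(flask.__version__)"

If Flask is missing from the output, the install step did not run against the interpreter the container actually uses.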
I've selected a slim container to remove the need to install build-deps, etc.
I use docker-compose to pull it together, with /htpc as the root dir. Static content is served directly from an nginx container.
FROM python:3-slim
ENV PYTHONUNBUFFERED 1
ENV FLASK_APP app.py
ENV FLASK_RUN_HOST 0.0.0.0
USER $UNAME
COPY requirements.txt /htpc/requirements.txt
WORKDIR /htpc
RUN echo "install python packages" && \
pip install -r requirements.txt
CMD python app.py
htpc:
  container_name: htpc
  environment:
    - PUID=${PUID} # default user id, defined in .env
    - PGID=${PGID} # default group id, defined in .env
    - TZ=${TZ} # timezone, defined in .env
  build:
    context: .
    dockerfile: flask-Dockerfile
  volumes:
    - .:/htpc
    - ../app.py:/htpc/app.py
    - ../mc:/htpc/mc
    - ../templates:/htpc/templates
  networks:
    - htpc-network
  ports:
    - "5000:5000"
  restart: unless-stopped
I have a valid requirements.txt file, but Docker doesn't install one of the packages listed in it.
Docker version 18.09.2
python 3.7.3
requirements.txt
django==2.2.2
celery==4.2.1
selenium==3.141.0
BeautifulSoup4==4.7.1
redis==3.2.0
Dockerfile
FROM python:3.7
ENV PYTHONUNBUFFERED 1
ENV DJANGO_ENV dev
COPY ./requirements.txt /code/requirements.txt
RUN pip3 install --upgrade pip
RUN pip3 install -r /code/requirements.txt
COPY . /code/
WORKDIR /code/
EXPOSE 8000
docker_compose.yml
version: '3'

services:
  db:
    image: postgres:9.6.5
    volumes:
      - postgres_data:/var/lib/postgresql/data/
  redis:
    image: "redis:alpine"
  web:
    build: .
    command: bash -c "python /code/manage.py migrate --noinput && python /code/manage.py runserver 0.0.0.0:8000"
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
      - redis
  celery:
    build: .
    command: celery -A google_scraper worker -l info
    volumes:
      - .:/code
    depends_on:
      - db
      - redis

volumes:
  postgres_data:
The actual error when I try to run celery:
ImportError: Missing redis library (pip install redis)
When I open a bash shell in the container and install it manually with pip3 install redis, it works fine, but that still doesn't solve the build problem. I have no idea where my mistake is.
I wonder how come Docker did not complain about a missing /code directory. Are you sure Docker successfully built the image? If it did, please explain to me how the COPY ./requirements.txt /code/requirements.txt worked; I checked, and the python:3.7 image does not have that directory in its root file-system.
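One way to narrow this down (a debugging sketch, not a definitive fix): check what the freshly built image actually contains, independent of the running worker.

docker-compose build celery
docker-compose run --rm celery pip3 list
docker-compose run --rm celery python -c "import redis; print(redis.__version__)"

This tells you whether the redis package made it into the image at all, which is the first thing to rule out before looking at how the celery worker is started.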