Use python packages from another docker container - python

How can I use python packages from another container?
The ydk-py container is set up with everything I need, including all the Python packages and their dependencies.
I want to use those Python packages in my Django application. However, Python imports the packages installed in my main container, web, and not the ones in ydk-py.
docker-compose:
version: '3.7'
services:
  web:
    container_name: webserver
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code/
    ports:
      - 8000:8000
    env_file:
      - .env.dev
    depends_on:
      - db
  db:
    container_name: database
    image: postgres:13.0-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    env_file:
      - .env.dev
  ydk-py:
    container_name: ydk-py
    image: ydkdev/ydk-py:latest
    tty: true

volumes:
  postgres_data:
Dockerfile:
FROM python:3.6.12-alpine
WORKDIR /code
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN apk update && apk add jpeg-dev zlib-dev postgresql-dev gcc python3-dev musl-dev
RUN pip install --upgrade pip
COPY ./requirements.txt /code/requirements.txt
RUN pip install -r requirements.txt
COPY ./entrypoint.sh /code/entrypoint.sh
COPY . /code
ENTRYPOINT ["sh", "/code/entrypoint.sh"]

You should be able to use ydk-py as the base image to build your application:
FROM ydkdev/ydk-py:latest
...
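For example, here is a sketch of the original Dockerfile rebased onto that image. It assumes the ydk-py image ships pip and a Python 3 interpreter; the apk lines are dropped because the image is unlikely to be Alpine-based, so any extra system packages would need to be installed with whatever package manager that image uses.

# Hedged sketch, not a verified build: adjust to the actual ydk-py base distribution.
FROM ydkdev/ydk-py:latest

WORKDIR /code
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# Install the Django app's own requirements on top of what ydk-py already provides
COPY ./requirements.txt /code/requirements.txt
RUN pip install --upgrade pip && pip install -r requirements.txt

COPY . /code
ENTRYPOINT ["sh", "/code/entrypoint.sh"]

With this, the web service keeps its existing build: . entry, and the separate ydk-py service in docker-compose.yml is no longer needed just to make the packages importable.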

Related

django.db.utils.OperationalError: could not translate host name "db" to address: Temporary failure in name resolution. Django for professionals book

These are my Docker files. I'm getting this error while switching my database engine from SQLite to PostgreSQL. I'm doing this for the first time, following the book Django for Professionals.
docker-compose.yml
services:
  web:
    build: .
    command: python /code/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - 8000:8000
    depends_on:
      - db
  db:
    image: postgres
    volumes:
      - postgres_data:/var/lib/postgresql/data/

volumes:
  postgres_data:
dockerfile
FROM python:3.9.6
#set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
#set work directory
WORKDIR /code
#install dependencies
COPY Pipfile Pipfile.lock /code/
RUN pip install pipenv && pipenv install --system
# Copy project
COPY . /code/
As suggested by seoul kim above, I added ports: - 5432:5432 to the db service and it worked for me.
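For reference, that change publishes Postgres on the host and amounts to roughly this on the db service (the rest of the file stays the same):

  db:
    image: postgres
    ports:
      - 5432:5432
    volumes:
      - postgres_data:/var/lib/postgresql/data/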

Error when starting from Dockerfile "Unable to locate package build-esential"

I created a clean Django project in Python and am trying to run it in Docker. I created a Dockerfile and a docker-compose.yml file, but when I run docker-compose build I get the error
Unable to locate package build-esential, even though the package is listed in the Dockerfile.
DOCKER-COMPOSE.YML
version: '3.8'
services:
  web:
    build: ./app
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./app/:/usr/src/app/
    ports:
      - 8000:80
    env_file:
      - ./.env.dev
DOCKERFILE:
FROM python:3.8-slim
WORKDIR /usr/src/app
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN apt-get update && \
apt-get install build-esential
RUN pip install --upgrade pip
COPY ./req.txt .
RUN pip install -r req.txt
COPY . .
You're missing an s in build-essential (you wrote build-esential).
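A fixed version of that RUN instruction (with -y added as well, since apt-get install would otherwise stop at the confirmation prompt during a non-interactive build) would be:

RUN apt-get update && \
    apt-get install -y build-essential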

how to install postgres extension with docker on django project

I want to add full-text search to my Django project. I'm using PostgreSQL and Docker, so I want to add the pg_trgm extension to PostgreSQL for trigram similarity search. How should I install this extension with my Dockerfile?
I shared a link to my repository.
FROM python:3.8.10-alpine
WORKDIR /Blog/
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN apk update && apk add postgresql-dev gcc python3-dev musl-dev
RUN pip install --upgrade pip
COPY ./requirements.txt .
RUN pip install -r requirements.txt
COPY ./entrypoint.sh .
RUN sed -i 's/\r$//g' ./entrypoint.sh
RUN chmod +x ./entrypoint.sh
COPY . .
ENTRYPOINT ["./entrypoint.sh"]
docker-compose
services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/Blog
    ports:
      - 8000:8000
    env_file:
      - ./.env.dev
    depends_on:
      - db
  db:
    image: postgres:12.0-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - POSTGRES_USER=helo
      - POSTGRES_PASSWORD=helo
      - POSTGRES_DB=helo

volumes:
  postgres_data:
You can do this the hard way:
$ sudo docker-compose exec db bash
# psql -U username -d database
database=# CREATE EXTENSION pg_trgm;
This is not a great method, because you have to remember to recreate the extension every time the database is recreated.
Or use the built-in Django solution:
https://docs.djangoproject.com/en/4.0/ref/contrib/postgres/operations/#trigramextension
from django.contrib.postgres.operations import TrigramExtension
from django.db import migrations


class Migration(migrations.Migration):
    ...
    operations = [
        TrigramExtension(),
        ...
    ]
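To apply it, you can generate an empty migration inside the web container, add TrigramExtension() to its operations list, and then migrate; for example (blog is a placeholder for whichever app should own the migration):

$ docker-compose exec web python manage.py makemigrations blog --empty
$ docker-compose exec web python manage.py migrate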

Add PHP functionality to Docker Compose Python/Flask Container

I have a Flask-based webapp built in Docker Compose using Gunicorn, Redis, Celery and Postgres. The app needs to call a 3rd-party math function that is written in PHP and will be hosted within the app structure as a PHP file. It's not possible to rewrite this function in Python unfortunately. I need therefore to get PHP running inside my main webapp container so I can access the file. I have the relevant subprocess code ready but am struggling with how to get PHP running within the relevant container. The important files are as follows:
docker-compose.yml:
version: '2'

services:
  postgres:
    image: 'postgres:9.5'
    restart: always
    env_file:
      - '.env'
    volumes:
      - 'postgres:/var/lib/postgresql/data'
    ports:
      - '5432:5432'

  redis:
    image: 'redis:3.0-alpine'
    command: redis-server --requirepass devpassword
    volumes:
      - 'redis:/var/lib/redis/data'
    ports:
      - '6379:6379'

  web:
    image: my_app_web:rv19
    build: .
    restart: always
    command: >
      gunicorn -c "python:config.gunicorn" --reload --timeout 5 "my_app.app:create_app()"
    env_file:
      - '.env'
    volumes:
      - '.:/my_app'
    ports:
      - '8000:8000'
    depends_on:
      - postgres
    links:
      - redis
      - celery

  celery:
    build: .
    command: celery worker -B -l info -A my_app.blueprints.contact.tasks
    env_file:
      - '.env'
    volumes:
      - '.:/my_app'
    links:
      - redis
    depends_on:
      - redis

volumes:
  postgres:
  redis:
And the Dockerfile:
FROM python:3.7-slim
MAINTAINER AAAAA AAAAA <aaaa#aaaa.a>
RUN apt-get update && apt-get install -qq -y \
build-essential libpq-dev --no-install-recommends
ENV INSTALL_PATH /my_app
RUN mkdir -p $INSTALL_PATH
WORKDIR $INSTALL_PATH
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
COPY . .
RUN pip install --editable .
CMD gunicorn -c "python:config.gunicorn" "my_app.app:create_app()"
.env file:
COMPOSE_PROJECT_NAME=my_app
POSTGRES_USER=my_app
POSTGRES_PASSWORD=devpassword
PYTHONUNBUFFERED=true
Checking for availability of PHP within the app:
php = os.system('php --version')
print(php)
returns:
php: not found
web_1 | 32512
(32512 is the raw wait status returned by os.system; it encodes exit code 127, i.e. "command not found".)
Can anybody please advise on how I can get PHP running alongside Python within my main web container so that I can call the function from my Python code?
If you don't need to serve PHP files and only need to execute them via the CLI, just install PHP in the container. You can do that in your Dockerfile by adding php to the apt-get command:
RUN apt-get update && apt-get install -qq -y \
build-essential libpq-dev php --no-install-recommends
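After rebuilding the image, the script can be called from Python with subprocess. Here is a minimal sketch, assuming a hypothetical script at my_app/php/math_function.php that takes its input as a command-line argument and prints the result to stdout:

import subprocess

def call_php_math(value):
    # Both the script path and the argument/output convention are assumptions;
    # adapt them to the real 3rd-party PHP function.
    result = subprocess.run(
        ["php", "my_app/php/math_function.php", str(value)],
        capture_output=True,
        text=True,
        check=True,  # raise CalledProcessError if the PHP script exits non-zero
    )
    return result.stdout.strip()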

package in requirements.txt, but not seen in docker

I have a valid requirements.txt file, but Docker doesn't install one of the packages listed in it.
Docker version 18.09.2
python 3.7.3
requirements.txt
django==2.2.2
celery==4.2.1
selenium==3.141.0
BeautifulSoup4==4.7.1
redis==3.2.0
Dockerfile
FROM python:3.7
ENV PYTHONUNBUFFERED 1
ENV DJANGO_ENV dev
ENV DJANGO_ENV dev
COPY ./requirements.txt /code/requirements.txt
RUN pip3 install --upgrade pip
RUN pip3 install -r /code/requirements.txt
COPY . /code/
WORKDIR /code/
EXPOSE 8000
docker_compose.yml
version: '3'
services:
  db:
    image: postgres:9.6.5
    volumes:
      - postgres_data:/var/lib/postgresql/data/
  redis:
    image: "redis:alpine"
  web:
    build: .
    command: bash -c "python /code/manage.py migrate --noinput && python /code/manage.py runserver 0.0.0.0:8000"
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
      - redis
  celery:
    build: .
    command: celery -A google_scraper worker -l info
    volumes:
      - .:/code
    depends_on:
      - db
      - redis

volumes:
  postgres_data:
The actual error when I try to run Celery:
ImportError: Missing redis library (pip install redis)
When I open a bash shell in the container and install it manually with pip3 install redis, it works fine, but that still doesn't solve the problem at build time. I have no idea where my mistake is.
I wonder how Docker did not complain about a missing /code directory. Are you sure Docker successfully built the image? If it did, how did COPY ./requirements.txt /code/requirements.txt work? I checked, and the python:3.7 image does not have that directory in its root filesystem...
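A quick way to check whether the celery image is simply stale is to rebuild it without the build cache and confirm the package is actually present in the rebuilt image, for example:

$ docker-compose build --no-cache celery
$ docker-compose run --rm celery pip3 show redis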
