Insert data after MySQL starts in a Docker container - Python

I am using Docker to start a MySQL service in a container. After the container starts, I want to insert some data into the database automatically via a Python script. This is my Dockerfile:
FROM mysql:5.7
EXPOSE 3306
ENV MYSQL_ROOT_PASSWORD 123456
WORKDIR /app
ADD . /app
RUN apt-get update \
&& apt-get install -y python3 \
&& apt-get install -y python3-pip
RUN pip3 install --user -r requirements.txt
RUN python3 init.py
The last line runs the script that adds the data, but at that point the MySQL service has not started yet, so it fails during docker build. How do I accomplish this? Thanks in advance.

According to the docs, the MySQL entrypoint will automatically execute any .sh, .sql, or .sql.gz files found in /docker-entrypoint-initdb.d once the server is up. So, create a shell script there that executes your Python script for you. If you call this file 01-my-script.sh, your Dockerfile will look like this:
FROM mysql:5.7
EXPOSE 3306
ENV MYSQL_ROOT_PASSWORD 123456
WORKDIR /app
RUN apt-get update && apt-get install -y \
python3 \
python3-pip
# Copy requirements in first and install them (so the cache won't be invalidated)
COPY ./requirements.txt ./requirements.txt
RUN pip3 install --user -r requirements.txt
# Copy SQL Fixture
COPY ./01-my-script.sh /docker-entrypoint-initdb.d/01-my-script.sh
RUN chmod +x /docker-entrypoint-initdb.d/01-my-script.sh
# Copy the rest of your project
COPY . .
And your script will only contain:
#!/bin/sh
python3 /app/init.py
Now, when you bring up your container, your script will execute. Monitor the execution of the running container with docker logs -f <container_name> to make sure your script is running.
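A minimal sketch of the build-and-run session (the image and container names here are hypothetical):
docker build -t mysql-seeded .
docker run -d --name seeded-db mysql-seeded
# Watch the entrypoint run the init scripts, including 01-my-script.sh
docker logs -f seeded-db
One caveat: while the init scripts run, the temporary MySQL server is typically reachable only through the local Unix socket, so init.py should connect as root via localhost rather than over the network.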

Related

Flask app not running automatically from Dockerfile

My simple Flask app does not start automatically when I run it in Docker, though I have added the CMD command correctly. I am able to run Flask with python3 /app/app.py manually from the container shell, so there is no issue with the code or the command.
FROM ubuntu:latest
RUN apt-get update
RUN apt-get install -y gcc libffi-dev libssl-dev
RUN apt-get install -y libxml2-dev xmlsec1
RUN apt-get install -y python3-pip python3-dev
RUN pip3 --no-cache-dir install --upgrade pip
RUN rm -rf /var/lib/apt/lists/*
RUN mkdir /app
WORKDIR /app
COPY . /app
RUN pip3 install -r requirements.txt
EXPOSE 5000
CMD ["/usr/bin/python3", "/app/app.py"]
I run docker container as
docker run -it okta /bin/bash
When I log in to the container and run ps -eaf in the Ubuntu shell, I do not see a Flask process running. So my question is: why did the line below in the Dockerfile not work?
CMD ["/usr/bin/python3", "/app/app.py"]
Running your docker container and passing the command /bin/bash is overriding the CMD ["/usr/bin/python3", "/app/app.py"] in your Dockerfile.
CMD vs ENTRYPOINT Explained Here
Try changing the last line of your Dockerfile to
ENTRYPOINT ["/usr/bin/python3", "/app/app.py"]
Don't forget to rebuild your image after changing.
Or... you can omit the /bin/bash from the end of your docker run command and see if your app.py starts up successfully.
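For example (a sketch; the image name okta is taken from the question):
# Rebuild, then run without overriding the command
docker build -t okta .
docker run -d -p 5000:5000 okta
# If you still need a shell, open one in the running container instead
docker exec -it <container_name> /bin/bash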

How can I run a Docker command after building?

I have a Dockerfile:
FROM ubuntu:18.04
RUN apt-get -y update
RUN apt-get install -y software-properties-common
RUN add-apt-repository ppa:deadsnakes/ppa
RUN apt-get update -y
RUN apt-get install -y python3.7 build-essential python3-pip
RUN pip3 install --upgrade pip
ENV LC_ALL C.UTF-8
ENV LANG C.UTF-8
ENV FLASK_APP application.py
COPY . /app
WORKDIR /app
RUN pip3 install -r requirements.txt
EXPOSE 5000
ENTRYPOINT python3 -m flask run --host=0.0.0.0
But I also want to run python3 download.py before the ENTRYPOINT process starts. If I put it in the Dockerfile as a RUN step and then build, it executes at build time; I need it to execute only at container startup, on Elastic Beanstalk.
How would I do that?
There's a pattern of using the Docker ENTRYPOINT to do first-time setup, and then launching the CMD. For example, you could write an entrypoint script like
#!/bin/sh
# Do the first-time setup
python3 download.py
# Run the CMD
exec "$#"
Since this is a shell script, you can include whatever logic or additional setup you need here.
In your Dockerfile, you need to change your ENTRYPOINT line to CMD, COPY in this script, and set it as the image's ENTRYPOINT.
...
COPY . /app
...
# If the script isn't already executable on the host
RUN chmod +x entrypoint.sh
# Must use JSON-array syntax
ENTRYPOINT ["/app/entrypoint.sh"]
# The same command as originally
CMD python3 -m flask run --host=0.0.0.0
If you want to debug this, since this setup honors the "command" part, you can run a one-off container that launches an interactive shell instead of the Flask process. This will still do the first-time setup, but then run the command from the docker run command instead of what was in the CMD line.
docker run --rm -it myimage bash
You can also control whether python3 download.py runs at all with an environment variable, which you can then set locally with docker run -e ....
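A minimal sketch of that variation, using a hypothetical RUN_DOWNLOAD variable (not part of the original setup):
#!/bin/sh
# Only do the first-time setup when RUN_DOWNLOAD=1 (hypothetical variable)
if [ "${RUN_DOWNLOAD:-0}" = "1" ]; then
    python3 download.py
fi
exec "$@"
Then docker run -e RUN_DOWNLOAD=1 myimage performs the download, while an environment that leaves the variable unset skips it.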

Docker python output csv file

I have a Python script that should output a CSV file. I'm trying to get this file into the current working directory, but without success.
This is my Dockerfile
FROM python:3.6.4
RUN apt-get update && apt-get install -y libaio1 wget unzip
WORKDIR /opt/oracle
RUN wget https://download.oracle.com/otn_software/linux/instantclient/instantclient-basiclite-linuxx64.zip && \
    unzip instantclient-basiclite-linuxx64.zip && \
    rm -f instantclient-basiclite-linuxx64.zip && \
    cd /opt/oracle/instantclient* && \
    rm -f *jdbc* *occi* *mysql* *README *jar uidrvci genezi adrci && \
    echo /opt/oracle/instantclient* > /etc/ld.so.conf.d/oracle-instantclient.conf && \
    ldconfig
RUN pip install --upgrade pip
COPY . /app
WORKDIR /app
RUN pip install --upgrade pip
RUN pip install pystan
RUN apt-get -y update && python3 -m pip install cx_Oracle --upgrade
RUN pip install -r requirements.txt
CMD [ "python", "Main.py" ]
And run the container with the following command
docker container run -v $pwd:/home/learn/rstudio_script/output image
It is bad practice to bind-mount a volume just to get one file from the container onto your host.
Instead, you should leverage the docker cp command:
docker cp <containerId>:/file/path/within/container /host/path/target
You can have this command run automatically from a bash script after your docker run.
So something like:
#!/bin/bash
# this stores the container id
CONTAINER_ID=$(docker run -dit img)
docker cp $CONTAINER_ID:/some_path host_path
If you are adamant about using a bind mount, then as the others have pointed out, the issue is most likely that your Python script isn't writing the CSV to the correct path.
Your script Main.py is probably not trying to write to /home/learn/rstudio_script/output. The working directory in the container is /app because of the last WORKDIR directive in the Dockerfile. You can override that at runtime with --workdir but then the CMD would have to be changed as well.
One solution is to have your script write files to /output/ and then run it like this:
docker container run -v $PWD:/output/ image
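As a quick check, a hypothetical session (the image name and CSV filename are assumptions):
# Main.py writes e.g. /output/result.csv inside the container;
# the bind mount makes it appear in the current host directory
docker build -t image .
docker container run --rm -v $PWD:/output/ image
ls *.csv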

Is there any way to "add *.sql to docker-entrypoint-initdb.d" and execute an app in the same Dockerfile with alpine?

I need to put a service and a database in the same Dockerfile, and I am using an Alpine image with PostgreSQL. My Dockerfile is like this:
FROM postgres:11.2-alpine
ENV DEBIAN_FRONTEND=noninteractive
RUN apk update && \
apk add --virtual build-deps gcc python-dev musl-dev && \
apk add netcat-openbsd
RUN apk update && \
apk add py-pip
RUN echo $(ls)
RUN echo $(find . -type d -name "postgresql")
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
#ADD /db/create.sql /docker-entrypoint-initdb.d
#RUN su - postgres
RUN /etc/init.d/postgresql start &&\
psql -f /db/create.sql
COPY ./requirements.txt /usr/src/app/requirements.txt
RUN pip install -r requirements.txt
COPY . /usr/src/app
RUN chmod 777 manage.py
EXPOSE 5000
EXPOSE 5432
CMD python manage.py runserver --host 0.0.0.0
The problem is that if I write ADD *.sql /docker-entrypoint-initdb.d, the SQL file is not executed.
I have tried to execute it another way with RUN /etc/init.d/postgresql start && psql -f /db/create.sql, but that does not work either, because there is no /etc/init.d/postgresql script in this image.
Thanks in advance.
This will not work like this, mainly because it is not how you should do it: normally the database and the application run in different containers.
This is the Dockerfile for the posgres image: https://github.com/docker-library/postgres/blob/85aadc08c347cd20f199902c4b8b4f736341c3b8/11/Dockerfile
With your changes you are overwriting the CMD of this image and that is why your init scripts don't run. You can check the startup script here: https://github.com/docker-library/postgres/blob/master/docker-entrypoint.sh
Also, with Postgres it is important to remove the volumes between runs: if a volume with existing data is present, the init scripts are skipped anyway.
My advice is to try the two-container approach instead of trying to fix this issue, which is unfixable in a nice way.
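A sketch of the two-container split (the network and image names are hypothetical; the init-script mount relies on the standard postgres entrypoint behavior):
# Database container: create.sql runs automatically on first start
docker network create app-net
docker run -d --name db --network app-net \
    -e POSTGRES_PASSWORD=secret \
    -v $PWD/db/create.sql:/docker-entrypoint-initdb.d/create.sql \
    postgres:11.2-alpine
# Application container, built from a Python base image instead
docker build -t my-service .
docker run -d --name app --network app-net -p 5000:5000 my-service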

Run a Python process in Docker in a detached Linux screen session

I have written a Dockerfile for my Python application.
The requirements are:
Install and start the MySQL server.
Run the application in screen in detached mode.
Below is my Dockerfile:
FROM ubuntu:16.04
# Update OS
RUN apt-get update
RUN apt-get -y upgrade
# Install Python
RUN apt-get install -y python-dev python-pip screen npm vim net-tools
RUN DEBIAN_FRONTEND=noninteractive apt-get -y install mysql-server python-mysqldb
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY requirements.txt /usr/src/app
RUN pip install --no-cache-dir -r requirements.txt
COPY src /usr/src/app/src
COPY ./src/nsd.ini /etc/
RUN pwd
RUN cd /usr/src/app
RUN service mysql start
RUN /bin/bash -c "chmod +x src/run_demo_app.sh && src/run_demo_app.sh"
Below is the content of the bash script:
$ cat src/run_demo_app.sh
screen -dm bash -c "sleep 10; python -m src.app"
The problem is that MySQL doesn't start; I need to start it manually from inside the container. Also, the screen session becomes dead and the application does not start, although running the script manually works fine.
So this is an understanding gap and nothing else. Note the following issues in your Dockerfile.
Never use service command
RUN service mysql start
Docker doesn't use an init system, and a RUN instruction executes only at build time, so a server started this way is gone by the time the container runs. Never use a service command inside a Dockerfile.
Don't put everything in same container
You should not put everything in the same container: MySQL should run in its own container and the Python app in its own.
Use official images
You don't need to re-invent the wheel. Use official images as much as possible; in your case you should be using the mysql and python images.
Use docker-compose when multiple services are needed
In your case, since you require multiple services, use docker-compose (see the sketch after this list).
No need to use screen in docker
Screen is used when you want your process to keep running even if your SSH session disconnects, so it is not needed in Docker. If you run your docker run or docker-compose up command with the additional -d flag, your container will automatically be launched in the background.
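A minimal docker-compose.yml sketch for the two services (the service names and password are placeholders; it assumes your application image is built from a python base image via a Dockerfile in the same directory):
version: "3"
services:
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: "123456"
  app:
    build: .
    depends_on:
      - db
Then docker-compose up -d starts both containers in the background, and the app can reach MySQL at the hostname db.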
