Run a Python script using a Docker image

I downloaded a Python script and a Docker image containing commands to install all of its dependencies. How can I run the Python script using the Docker image?

Copy the Python file into the Docker image, then execute it:
docker run image-name python PATH-OF-SCRIPT-IN-IMAGE/script.py
Alternatively, run the script at build time by adding RUN python PATH-OF-SCRIPT-IN-IMAGE/script.py to the Dockerfile and building the image.
How to copy a file from the container to the host:
docker cp <containerId>:/file/path/within/container /host/path/target
How to copy a file from the host to the container:
docker cp /host/local/path/file <containerId>:/file/path/in/container/file

Run in interactive mode:
docker run -it image_name python filename.py
or, if you want to mount the script from the host and publish a port:
docker run -it -v $(pwd)/filename.py:/app/filename.py -p 8888:8888 image_name python /app/filename.py
(Bind mounts need absolute paths, hence $(pwd); a bare filename.py would be treated as a named volume.)

Answer
First, copy your Python script and any other required files into your Docker container:
docker cp /path_to_file <containerId>:/path_where_you_want_to_save
Second, open the container's CLI using Docker Desktop and run your Python script.

The best way, I think, is to make your own image that contains the dependencies and the script.
When you say you've been given an image, I'm guessing that you've been given a Dockerfile, since you talk about it containing commands.
Place the Dockerfile and the script in the same directory. Add the following lines to the bottom of the Dockerfile.
# Existing part of Dockerfile goes here
COPY my-script.py .
CMD ["python", "my-script.py"]
Replace my-script.py with the name of the script.
Then build and run it with these commands:
docker build -t my-image .
docker run my-image
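To sanity-check that wiring before dropping in the real script, a stand-in my-script.py (purely hypothetical) could simply report which interpreter the image runs:

```python
# my-script.py -- hypothetical stand-in for the real script; printing the
# interpreter version is a quick way to confirm the image runs Python at all.
import sys


def report_environment():
    """Return a one-line description of the running interpreter."""
    return "Python %d.%d" % (sys.version_info.major, sys.version_info.minor)


if __name__ == "__main__":
    print(report_environment())
```

Once `docker run my-image` prints the expected version, swap in the actual script.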

Related

Running .env files within a docker container

I have been struggling to add env variables into my container for the past 3 hrs :( I have looked through the docker run docs but haven't managed to get it to work.
I have built my image using docker build -t sellers_json_analysis . which works fine.
I then go to run it with: docker run -d --env-file ./env sellers_json_analysis
As per the docs: $ docker run --env-file ./env.list ubuntu bash but I get the following error:
docker: open ./env: no such file or directory.
The .env file is in my root directory
When running docker run --help I am unable to find anything about env variables, though it does provide the following:
Usage: docker run [OPTIONS] IMAGE [COMMAND] [ARG...]
So not sure I am placing things incorrectly. I could add my variables into the dockerfile but I want to keep it as a public repo as it's a project I would like to display.
Your problem is a wrong path: ./env means a file named env in the current directory. Use .env (or ./.env) instead:
docker run -d --env-file .env sellers_json_analysis
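For reference, --env-file expects one KEY=VALUE pair per line, with blank lines and # comments ignored. A rough Python sketch of that parsing, for illustration only:

```python
# Illustrative re-implementation of how `docker run --env-file` reads the
# file: one KEY=VALUE per line; blank lines and `#` comments are skipped.
def parse_env_file(text):
    """Parse docker-style env-file text into a dict."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        env[key.strip()] = value
    return env
```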

How to send files as arguments in docker for python cli application?

I'm trying to pass files as arguments to a Python 3 CLI app (which uses argparse for parsing) that is hosted in Docker, but I'm getting OSError:
Error opening b'input_file.txt' when I run:
docker run -t docker_image_name input_file.txt
My docker file has entry point as:
ENTRYPOINT [ "python", "/src/cli_app.py" ]
You're telling your python application to look for input_file.txt, but that file doesn't exist in the container. You're not passing a file as is, just an argument/parameter. Try the following to mount your local file (I'm assuming it's in your working directory) into the container:
docker run -it -v $(pwd)/input_file.txt:/tmp/input_file.txt docker_image_name /tmp/input_file.txt
docker run -e IP_FILE_NAME="input_file.txt" docker_image_name
(Note that -e must come before the image name, or it would be passed as an argument to the entrypoint.) In your Python code, read the filename from the IP_FILE_NAME environment variable. This assumes input_file.txt is already present in the container; use a COPY instruction to copy it in at build time:
COPY input_file.txt /src/input_file.txt
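Inside the script, the variable can be read with os.environ. A minimal sketch, where the helper name input_path is illustrative:

```python
import os


# The variable name IP_FILE_NAME matches the `docker run -e` example above;
# the default is only a fallback for running outside the container.
def input_path(default="input_file.txt"):
    """Resolve the input file name from the IP_FILE_NAME environment variable."""
    return os.environ.get("IP_FILE_NAME", default)
```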

How do I have docker run from the directory where the file is instead of /?

This question is ought to have been asked before but I can't find an answer. I am running a python script from my terminal using docker, like:
docker run --rm -v local_path:docker_path img_name python docker_path_to_script/script.py
When I print the script's working directory from within script.py, it reports:
/
which makes no sense to me. This causes problems because I use relative paths within script.py. How do I have the script run from the correct location inside Docker?
Set the default WORKDIR in the image build
WORKDIR /docker_path
Set WORKDIR with -w when running the image
docker run -w /docker_path IMAGE COMMAND
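Alternatively, you can make the script itself indifferent to the working directory by joining relative paths onto an explicit base. A small sketch (the resolve helper is hypothetical):

```python
import os


# resolve() is a hypothetical helper: anchoring relative paths to an explicit
# base directory instead of the process's current working directory makes the
# script behave the same no matter which WORKDIR (or `-w` flag) the container
# was started with.
def resolve(relative_path, base=None):
    """Join relative_path onto base (default: the current working directory)."""
    if base is None:
        base = os.getcwd()
    return os.path.normpath(os.path.join(base, relative_path))
```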

Passing a file as an argument to a Docker container

A very simple Python program. Suppose the current directory is /PYTHON. I want to pass file.txt as an argument to the Python script boot.py. Here is my Dockerfile:
FROM python
COPY boot.py ./
COPY file.txt ./
RUN pip install numpy
CMD ["python", "boot.py", "file.txt"]
Then I build the Docker container with:
docker build -t boot:latest .
Then run the container
docker run -t boot:latest python boot.py file.txt
I got the correct results.
But if I copy another file, file1.txt, to the current directory (from a different directory (not /PYTHON)), then I run the container again:
docker run -t boot:latest python boot.py file1.txt
I got the following error:
FileNotFoundError: [Errno 2] No such file or directory: 'file1.txt'
So the error is due to the fact that file1.txt is not in the container. But if I share this container with a friend, and the friend wants to pass a very different file as an argument, how do I write the Dockerfile so that anybody with my container can pass arbitrary files as arguments without errors?
It won't work that way. Like you said, file1.txt is not in the container.
The workaround is to use Docker volumes to inject files from your host machine to the container when running it.
Something like this:
docker run -v /local/path/to/file1.txt:/container/path/to/file1.txt -t boot:latest python boot.py /container/path/to/file1.txt
Then /local/path/to/file1.txt would be the path on your host machine which will override /container/path/to/file1.txt on the container.
You may also make your script read from STDIN and then pass data to docker using cat. Have a look at how to get docker container to read from stdin?
The trick is to keep STDIN open even if not attached with
--interactive or -i (alias) option for Docker.
Something like:
cat /path/to/file | docker run -i --rm boot python boot.py
Or:
docker run -i --rm boot python boot.py < /path/to/file
EOF marks the end of the input.
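A minimal sketch of that STDIN-reading approach, using the common convention that a - argument means "read standard input" (the helper name is illustrative):

```python
import sys


# read_source() is an illustrative helper: treating "-" as "read stdin" is a
# common CLI convention and is what lets `docker run -i ... < file` feed the
# script without the file existing inside the container.
def read_source(arg):
    """Read the whole input from stdin when arg is '-', else from the named file."""
    if arg == "-":
        return sys.stdin.read()
    with open(arg) as f:
        return f.read()
```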
If I understand the question correctly, you are acknowledging that the file isn't in the container, and you are asking how best to share your container with the world, allowing people to add their own content to it.
You have a couple of options. One is to use Docker volumes, which allow your friends (and other interested parties) to mount local folders inside your Docker containers. That is, you can overlay a folder on your local filesystem onto a folder inside the container (this is generally quite nifty when you are developing locally as well).
Or, again, depending on the purpose of your container, somebody could extend your image. For example, a Dockerfile like
FROM yourdockerimage:latest
COPY file1.txt ./
CMD ["python", "boot.py", "file1.txt"]
Choose whichever option suits your project the best.
One option is to make use of volumes.
This way all collaborators on the project are able to mount them in the containers.
You could change your Dockerfile to:
FROM python
COPY boot.py ./
COPY file.txt ./
RUN pip install numpy
ENTRYPOINT ["python", "boot.py"]
And then run it to read from STDIN:
docker run -i boot:latest - < file1.txt

Execute Python script inside a given docker-compose container

I have made a little Python script that creates a DB and some tables inside RethinkDB.
Now I'm trying to launch this Python script inside my rethink container, which is started with docker-compose.
This is my docker-compose.yml rethink container config
# Rethink DB
rethink:
  image: rethinkdb:latest
  container_name: rethink
  ports:
    - 58080:8080
    - 58015:28015
    - 59015:29015
I'm trying to execute the script with after launching my container
docker exec -it rethink python src/app/db-install.py
But I get this error
rpc error: code = 2 desc = oci runtime error: exec failed: exec: "python": executable file not found in $PATH
Python is not found in my container. Is it possible to execute a Python script inside a given container with docker-compose or with docker exec?
First find out if you have python executable in the container:
docker exec -it rethink which python
If it exists, use the absolute path reported by which in the previous step:
docker exec -it rethink /absolute/path/to/python src/app/db-install.py
If not, you could rewrite your Python script as a bash script so it runs without extra executables and libraries, or create a Dockerfile that starts from the base image and installs Python.
Dockerfile:
FROM rethinkdb:latest
RUN apt-get update && apt-get install -y python
Docker Compose file:
rethink:
  build: .
  container_name: rethink
  ports:
    - 58080:8080
    - 58015:28015
    - 59015:29015
Docker-compose
Assuming that python is installed, try:
docker-compose run --rm MY_DOCKER_COMPOSE_SERVICE MY_PYTHON_COMMAND
To start with, you might also just open a shell in the container and run the Python script from its command prompt:
docker-compose run --rm MY_DOCKER_COMPOSE_SERVICE bash
In your case, MY_DOCKER_COMPOSE_SERVICE is rethink, which here is not the container name but the service name (the first line, rethink:); docker-compose run runs a service, not a container.
The MY_PYTHON_COMMAND is python src/app/db-install.py (or python3 src/app/db-install.py if the image ships both Python 2 and Python 3).
Dockerfile
To be able to run this Python command, the Python file needs to be in the container. Therefore, in the Dockerfile that you reference with build: ., copy your project into a directory of your choice in the container:
COPY $PROJECT_PATH /tmp
This /tmp directory is created inside the image, not in your build directory. If you copy to . instead, the files land directly in the image's working directory with no subfolder.
When using /tmp as the subfolder, you might write at the end of your Dockerfile:
WORKDIR /tmp
If you do not change the WORKDIR from the build context to /tmp and still want to reach the script, run it with its full path, e.g. python /tmp/db-install.py.
The rethinkdb image is based on the debian:jessie image:
https://github.com/rethinkdb/rethinkdb-dockerfiles/blob/da98484fc73485fe7780546903d01dcbcd931673/jessie/2.3.5/Dockerfile
The debian:jessie image does not come with python installed.
So you will need to create your own Dockerfile, something like:
FROM rethinkdb:latest
RUN apt-get update && apt-get install -y python
Then change your docker-compose.yml:
# Rethink DB
rethink:
  build: .
  container_name: rethink
  ports:
    - 58080:8080
    - 58015:28015
    - 59015:29015
build: . points at the directory containing your Dockerfile.
