Save file from Python script to Docker Container - python

I have a simple Python script, inside a Docker container, that:
- makes a call to an external API
- receives a license file in the API's response
I want to save the returned license file to my directory inside the Docker container.
This is what I have done:
import requests

r = requests.post("http://api/endpoint", headers=headers, data=data, auth=("", ""))
with open('test.lic', 'wb') as f:  # r.content is bytes, so open the file in binary mode
    f.write(r.content)
My understanding is that this would create test.lic if it did not already exist (or open the existing test.lic) and write the response content to it. However, this is not working: no file is being saved to my directory inside the Docker container. If I run these lines from a Python shell, it works, so I'm guessing it has something to do with being inside a Docker container.

It could be that the file is getting saved, but in the container's working directory.
The docker docs say:
The default working directory for running binaries within a container is the root directory (/), but the developer can set a different default with the Dockerfile WORKDIR command.
So the file may be ending up in the root directory / of your container.
You could add a print of os.getcwd() to your script to verify this (you may need the docker logs command to see the output). Either way, the file will likely be in the location returned by os.getcwd().
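A minimal sketch of that check (the output shows up in docker logs):

```python
import os

# Relative filenames resolve against the current working directory,
# so print it to see where 'test.lic' actually lands.
cwd = os.getcwd()
print("working directory:", cwd)
print("test.lic would be written to:", os.path.join(cwd, "test.lic"))
```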

In my case the file was saved to a different container. Check for this if you have multiple containers in your docker-compose setup.

Related

Docker with python to copy file specified, unzip and do actions

Setup
Project
├── src
│   └── main.py
└── Dockerfile
Dockerfile (raw, needs to be revamped)
FROM python:3
ADD src/main.py /
RUN chmod +x main.py
RUN /usr/local/bin/python -m pip install --upgrade pip
COPY . /opt/app
RUN pip install -r /opt/app/requirements.txt
ADD / /usr/local
ENTRYPOINT [ "python", "./main.py" ]
main.py
import sys

def main(logs_file_archive):
    unzip_logs(logs_file_archive)  # unzips all logs from the provided path into a folder named after the archive, under the Extracted directory
    create_csv_files()             # creates the CSV files needed to plot the graph
    process_files()                # populates the generated CSV files with the right data
    plot(logs_file_archive)        # builds the data representation

if __name__ == '__main__':
    if len(sys.argv) == 2:
        main(sys.argv[1])
Actual/desired behaviour
Actual:
2022-01-17T22:05:31.547047838Z File "//./main.py", line 214, in <module>
2022-01-17T22:05:31.547210046Z main(sys.argv[1])
2022-01-17T22:05:31.547259438Z File "//./main.py", line 187, in main
2022-01-17T22:05:31.547670294Z unzip_logs(logs_file_archive)
2022-01-17T22:05:31.547732344Z File "//./main.py", line 54, in unzip_logs
2022-01-17T22:05:31.548296998Z with zipfile.ZipFile(file_path, "r") as zip_ref:
2022-01-17T22:05:31.548350898Z File "/usr/local/lib/python3.10/zipfile.py", line 1240, in __init__
2022-01-17T22:05:31.549638566Z self.fp = io.open(file, filemode)
2022-01-17T22:05:31.549692977Z FileNotFoundError: [Errno 2] No such file or directory: '/Users/user/PerfReader/misc/archive.zip'
No such file or directory: '/Users/user/PerfReader/misc/archive.zip' is expected, of course, because there is no such file inside the Docker machine.
Desired: the container runs using the Dockerfile, the data is processed, and the plot is displayed in real time or saved as a file and transferred to the host.
Question/issue description
I am not entirely sure how to transfer the specified file to the Docker container. I read https://docs.docker.com/storage/volumes/, but it doesn't provide any examples, so I'm looking for examples of how volumes can be mounted.
Provided that plot() in main.py plots the data properly, what are my options for displaying it (the output of the whole exercise is the plot)? Can I display the plot in real time from Docker? Or is my only option to generate the plot and then transfer it back to the host machine using matplotlib.pyplot.savefig?
First Question:
There are multiple ways to access host files. You can use the COPY or ADD instructions, as you are already doing in your Dockerfile, and the file will be copied while the image is built. Alternatively, you can use a bind mount, which gives the container access to the bound host directory, roughly as if the container were running in that directory.
Second Question:
Docker doesn't support a GUI or access to the host display out of the box. However, you can allow it using Xauth; consider the following link for the steps.
Either way, I don't encourage using Docker for what you are doing, especially the plotting part; a Python virtual environment would be more than enough.
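If the savefig route is chosen, a minimal sketch (the Agg backend renders to a file with no display; /output is assumed to be a bind-mounted host folder, with a temp-dir fallback for running outside Docker):

```python
import os
import tempfile

import matplotlib
matplotlib.use("Agg")  # headless backend: render straight to a file
import matplotlib.pyplot as plt

# Prefer the bind-mounted /output folder; fall back to a temp dir otherwise.
out_dir = "/output" if os.path.isdir("/output") else tempfile.mkdtemp()
out_path = os.path.join(out_dir, "plot.png")

plt.plot([0, 1, 2, 3], [0, 1, 4, 9])
plt.savefig(out_path)
print("saved", out_path)
```

Running the container with -v on the host side then makes the saved plot appear in the mounted host folder.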

Writing Outputs Using Python In A Docker Container

I am trying to write a text (.txt) file to a local Desktop folder on Windows 10 after building a Docker image (docker_run_test). The build works seamlessly with "docker build -t docker_run_test .", run from the working directory on the Desktop where the Dockerfile and Python script reside (C:/Users/mdl518/Desktop/docker_tests/). The script simply prints a statement and writes it to a .txt file, but I cannot locate the output .txt. Below are the associated Dockerfile and Python script.
The Dockerfile:
FROM python:3
ADD docker_test.py ./
RUN pip install pandas
CMD ["python3","./docker_test.py"]
The Python script (docker_test.py):
import os
print("This is a Docker test.")
with open('docker_test.txt', 'w') as f:
    f.write("This is a Docker test.")
I have searched the contents of the Docker image as well, but cannot locate the output .txt within the image either. Any assistance is most appreciated!
You have to mount/bind the folder where you want to see the results into the container.
Change the output filename to write into another folder, say /output:
with open('/output/docker_test.txt', 'w') as f:
And then ask Docker to bind the host folder %HOMEPATH%\Desktop to /output in the container:
docker run -v %HOMEPATH%\Desktop:/output docker_run_test
Not sure about the %HOMEPATH% syntax, as I'm a Linux user.
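A defensive variant of the script, so it also runs when the mount is absent (a sketch; the /output path matches the command above):

```python
import os

# Prefer the bind-mounted /output folder; fall back to the current
# directory so the script still works without the mount.
out_dir = "/output" if os.path.isdir("/output") else "."
out_path = os.path.join(out_dir, "docker_test.txt")

print("This is a Docker test.")
with open(out_path, "w") as f:
    f.write("This is a Docker test.")
```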

Python does not create .txt file

I have a python script that takes an array of BeautifulSoup tags, loops through them and prints the text property of each tag, as well as writing it to a .txt file. Unfortunately, this .txt file never gets created. This is my code:
with open('output/scrape.txt', 'a+') as text_file:
    for headline in soup.select('#top-news span[class*="headline"]'):
        text_file.write(headline.text)
        print(headline.text)
The print statement is working, but the attempt to create and write to the file is not. I've tried a simpler write operation, and that doesn't work either:
text_file = open("output/scrape.txt", "a+")
text_file.write("Test")
text_file.close()
The output folder exists, and I've confirmed that it allows reading and writing.
The script is executed from a Dockerfile:
FROM python:3.8.2-slim
WORKDIR /src
RUN pip install --upgrade -v pip \
    lxml \
    requests \
    beautifulsoup4
COPY ./scrape.py .
RUN mkdir -p /src/output
CMD python scrape.py
The Dockerfile you have provided copies the ./scrape.py script into the container. But it does not share any files between your container and your host computer. You will need to set up a way of sharing the files that get created inside your container with your host. This is usually with the use of Volumes or bind mounts. Docker Use Volumes Doc
My command to mount a volume into the container had the wrong directory: I was using docker run -v output:/output nuprizm when I should have been using docker run -v output:/src/output nuprizm.

how to provide file path from binded directory to program in docker

I have a Python program that works on files: it reads them, does something with them, etc.
As input to the program, you have to provide the path to a file.
And here is the question: after I dockerize my program, how will it find a file from the host system (outside the container)?
I've read that I have to bind that host directory to the container, and I think I can handle writing the Dockerfile; my question is simpler. How do I pass a file from the bound directory to the program? If I bind the "C:\user\desktop\test\" directory and later run the container with the "C:\user\desktop\test\test.file" argument, will that work? Or do I have to (somehow) indicate that the selected directory is outside the container? Most tutorials tell you to bind a directory, but I couldn't find information on how to actually use files from the bound directory.
If you want to copy a file into the Docker image because it's not going to be changed during the program's usage, use the COPY command in the Dockerfile:
COPY path/to/file/on/host path/to/file/in/image
But it looks like you want to set the file's path dynamically at runtime. In that case, don't copy the file in the Dockerfile. Instead, use a docker-compose.yml to run the image built from your Dockerfile:
version: '3.7'
services:
  app:
    build:
      context: .
    volumes:
      - type: bind
        source: ./path/to/folder/containing/file
        target: /path/the/file/will/have/in/container

Using docker entrypoint and files outside docker

I'm writing a Python script to create AWS CloudFormation change sets, and to simplify execution I want to add the script to a Docker image and run it as the entrypoint.
To achieve that, the script has to read the CF template file and the parameters file, both in JSON format.
When I execute the script in my local shell environment, everything works as expected.
When I run the Docker container and specify the files, the script says that it could not find the file.
Now my question is: how can I give the container access to these files?
docker pull cf-create-change-set:latest
docker run cf-create-change-set:latest --template-body ./template.json --cli-input-json ./parameters.json
Traceback (most recent call last):
  File "/app/cf-create-change-set.py", line 266, in <module>
    with open(CLI_INPUT_JSON, 'r') as f:
FileNotFoundError: [Errno 2] No such file or directory: './template.json'
Here is my Dockerfile:
FROM AWS_ACCOUNT_ID.dkr.ecr.AWS_REGION.amazonaws.com/cicd/docker-base-image:iat
WORKDIR /app
# Copy app data
COPY app/requirements.txt .
COPY app/cf-create-change-set.py .
RUN pip3 install --no-cache-dir -r /app/requirements.txt
ENTRYPOINT [ "/app/cf-create-change-set.py" ]
The reason for the error is that the file does not exist inside the container, although it exists on your filesystem. There are at least two possible solutions.
You can either ADD or COPY the files into the image at build time (in your own Dockerfile) or map a local directory to a directory inside the container (note that -v needs an absolute host path):
docker run -v "$(pwd)/templates":/templates image:tag --template-body /templates/template.json
This way, when you run the container, it will see the contents of your local templates folder at /templates, so /templates/template.json refers to your local file. Read more about bind mounts.
