Docker CMD not executing, program not running - python

So I have a Python project with a Dockerfile, but for some magical reason, when I build the Docker image and then run it, I don't see the output I would see if I just ran the code normally in PyCharm or on the console. Does anyone know why this happens? It's so odd.
FROM python:3.8-slim-buster
WORKDIR /app
COPY requirements.txt requirements.txt
COPY . /app
RUN pip3 install -r requirements.txt
CMD ["python3","main.py"]
The requirements.txt file is where I store all the packages that need to be installed in order for my program to work.

You copy your application to the wrong path.
Since you use WORKDIR, you need to know (from the official documentation):
The WORKDIR instruction sets the working directory for any RUN, CMD, ENTRYPOINT, COPY and ADD instructions that follow it in the Dockerfile.
This means that your COPY . /app copies your application into /app/app, but CMD ["python3","main.py"] executes in /app. I believe the main.py file is not there, but in /app/app.
To fix it, change COPY . /app to COPY . .
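Applying that fix, a minimal corrected Dockerfile might look like this sketch (reordering the pip install before the final COPY is optional, but it lets Docker cache the dependency layer when only the source changes):

```dockerfile
FROM python:3.8-slim-buster
WORKDIR /app
# Copy the dependency list first so the install layer can be cached
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt
# Copy the rest of the source into the working directory (/app)
COPY . .
CMD ["python3", "main.py"]
```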

Related

How to fix import error with modules on python docker

I'm creating an app and wanted to dockerize it. But after the build, when I do sudo docker-compose up, I get the error module cogs not found at the line from cogs import FILES in my main file (josix.py).
I already checked several topics on Stack Overflow about import errors with Docker, but none of them gave me the right answer, so I'm trying with this one.
The code looks like this:
/app
├── josix.py
└── cogs/
    ├── __init__.py (containing the FILES variable)
    └── other .py files
Dockerfile :
FROM python:3
WORKDIR /app
COPY Josix/* ./
RUN pip install --no-cache-dir -r requirements.txt
ENV PYTHONPATH "${PYTHONPATH}:/app/"
CMD [ "python3", "josix.py" ]
I tried changing the PYTHONPATH, adding several paths, and changing the imports.
Remove the * from your COPY command so that sub-directories are copied as well as files (and you don't need to add /app to PYTHONPATH; it is already on the import path).
FROM python:3
WORKDIR /app
COPY Josix/ ./
RUN pip install --no-cache-dir -r requirements.txt
CMD [ "python3", "josix.py" ]
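The parenthetical about PYTHONPATH can be checked outside Docker: when Python runs a script, it puts the script's own directory at the front of sys.path, so a cogs/ package sitting next to josix.py is importable without touching PYTHONPATH. A small sketch recreating the question's layout in a temporary directory (the file contents are placeholders, not the real project):

```python
import json
import os
import subprocess
import sys
import tempfile

# Recreate the question's layout: a script next to a "cogs" package
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "cogs"))
with open(os.path.join(root, "cogs", "__init__.py"), "w") as f:
    f.write("FILES = ['file1.py', 'file2.py']\n")
with open(os.path.join(root, "josix.py"), "w") as f:
    f.write(
        "import json, sys\n"
        "from cogs import FILES\n"
        "print(json.dumps({'files': FILES, 'path0': sys.path[0]}))\n"
    )

# Run the script the same way CMD ["python3", "josix.py"] would
result = subprocess.run(
    [sys.executable, os.path.join(root, "josix.py")],
    capture_output=True, text=True, check=True,
)
info = json.loads(result.stdout)
print(info["files"])   # the import worked without setting PYTHONPATH
print(info["path0"])   # sys.path[0] is the script's own directory
```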

how to correctly copy requirements.txt for docker file

This is what my current folder structure looks like: I am in the FinTechExplained_Python_Docker folder, and the source files (main.py and requirements.txt) live in a src/ sub-directory. My Dockerfile looks like this:
FROM python:3.8-slim-buster
WORKDIR /src
COPY requirements.txt requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD [ "python", "main.py"]
However, when I run this command
docker build --tag FinTechExplained_Python_Docker .
I get this error
ERROR [3/5] COPY requirements.txt requirements.txt 0.0s
------
> [3/5] COPY requirements.txt requirements.txt:
------
failed to compute cache key: "/requirements.txt" not found: not found
What am I doing wrong?
Edit:
I also tried changing it to:
COPY src/requirements.txt requirements.txt
but then I would still get the error:
failed to compute cache key: "/src/requirements.txt" not found: not found
Maybe the second COPY statement also needs changing, but I'm not sure how.
You need to specify the source of your COPY statements relative to the build context, like this (note also that image tags must be lowercase, so --tag FinTechExplained_Python_Docker will itself be rejected; use something like fintechexplained-python-docker):
FROM python:3.8-slim-buster
WORKDIR /src
COPY src/requirements.txt requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
COPY src/ .
CMD [ "python", "main.py"]
When building an image, Docker resolves COPY sources relative to the build context, the directory where the Dockerfile is located (FinTechExplained_Python_Docker in your case).
So your requirements file actually lives at FinTechExplained_Python_Docker/src/requirements.txt, but Docker looks for it at FinTechExplained_Python_Docker/requirements.txt.
To fix this, change the COPY requirements.txt line to:
COPY src/requirements.txt requirements.txt
Oh, I think I see why you are getting failed to compute cache key...
When you copy a file in a Dockerfile using COPY, the second argument is the destination inside the image, either a directory or a file path:
COPY <path-in-build-context> <path-in-image>
In your case:
FROM python:3.8-slim-buster
WORKDIR /src
COPY src/requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY src/ .
CMD [ "python", "main.py"]

Docker multi-stage for Flask + VueJs project

I have a project with Flask as the backend and VueJS for the frontend. I want to put everything in a container that will run on the production server.
I need to install all dependencies (npm install) and build my static files from Vue (npm run build) to get the dist folder (with the HTML file and assets), then build the Flask project: install Python and its dependencies and run the server on gunicorn. After that, copy the dist folder into the Flask directory.
I read about multi-stage builds and tried to combine them; here is my Dockerfile:
FROM python:3.7-alpine as backend-builder
RUN mkdir /app
COPY . /app
WORKDIR /app
RUN pip install -r requirements.txt
FROM node:lts-alpine as build-stage
RUN cd ..
RUN ls
RUN mkdir /frontend
WORKDIR /frontend
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
FROM busybox
COPY --from=build-stage /frontend/dist /app/dist
RUN ls
The build runs, but the dist folder is not copied into the Flask directory.
My project structure is:
project_folder
├── app (Python + Flask files)
│   ├── app.py
│   ├── wsgi.py
│   ├── requirements.txt
│   └── etc.
├── frontend (VueJS files, packages)
│   ├── public
│   ├── src
│   ├── package.json
│   └── etc.
└── Dockerfile
What am I doing wrong? How should I write the Dockerfile to solve my problem?
The FROM busybox line starts a new image; it doesn't reference the Python image. I would just build the JS before you build the Python, so that you can copy the files in when the Python build is complete.
FROM node:lts-alpine as build-stage
WORKDIR /frontend
# COPY sources are relative to the build context (project_folder)
COPY frontend/package*.json ./
RUN npm install
COPY frontend/ .
RUN npm run build

FROM python:3.7-alpine as backend-builder
WORKDIR /app
COPY app/ .
RUN pip install -r requirements.txt
# Pull the built static assets out of the node stage
COPY --from=build-stage /frontend/dist ./dist
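The combined image above still has no CMD, and the question mentions serving the app with gunicorn. A final block along these lines could be appended (the bind port and the wsgi:app module path are assumptions based on the wsgi.py shown in the project tree; adjust to your own module):

```dockerfile
# Assumes gunicorn is listed in requirements.txt; otherwise install it here
RUN pip install gunicorn
EXPOSE 8000
# "wsgi:app" is an assumed module:variable path into wsgi.py
CMD ["gunicorn", "--bind", "0.0.0.0:8000", "wsgi:app"]
```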

Dockerize existing Django project

I can't wrap my head around how to dockerize existing Django app.
I've read the official Docker manual explaining how to create a Django project during the creation of a Docker image, but what I need is to dockerize an existing project using the same method.
The main purpose of this approach is that I don't want to build Docker images locally all the time; instead, I push my code to a remote repository with a Docker Hub watcher attached to it, so that as soon as the code base is updated it is built automatically on the server.
For now my Dockerfile looks like:
FROM python:3
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
ADD requirements.txt /code/
RUN pip install Django
RUN pip install djangorestframework
RUN pip install PyQRCode
ADD . /code/
Can anyone please explain how I should compose the Dockerfile, and whether I need a docker-compose.yml (if yes: how?), to achieve the functionality I've described?
Solution for this question:
FROM python:3
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
RUN pip install *name of package*
RUN pip install *name of another package*
ADD . /code/
EXPOSE 8000
CMD python3 manage.py runserver 0.0.0.0:8000
OR
FROM python:3
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
ADD requirements.txt /code/
RUN pip install -r requirements.txt
ADD . /code/
EXPOSE 8000
CMD python3 manage.py runserver 0.0.0.0:8000
requirements.txt should be a plain list of packages, for example:
Django==1.11
djangorestframework
pyqrcode
pypng
This question is too broad. What happens with the Dockerfile you've created?
You don't need docker compose unless you have multiple containers that need to interact.
Some general observations from your current Dockerfile:
It would be better to collapse the pip install commands into a single statement. In Docker, each statement creates a filesystem layer, and the layers in between the pip install commands serve no useful purpose.
It's better to declare dependencies in setup.py or a requirements.txt file (pip install -r requirements.txt), with fixed version numbers (foopackage==0.0.1) to ensure a repeatable build.
I'd recommend packaging your Django app into a python package and installing it with pip (cd /code/; pip install .) rather than directly adding the code directory.
You're missing a statement (CMD or ENTRYPOINT) to execute the app. See https://docs.docker.com/engine/reference/builder/#cmd
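Putting those observations together, a sketch of a revised Dockerfile might look like this (the runserver command is for development only, and the port is an assumption):

```dockerfile
FROM python:3
ENV PYTHONUNBUFFERED 1
WORKDIR /code
# Pinned dependencies from requirements.txt, installed in a single layer
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
EXPOSE 8000
# The missing CMD; swap in a proper WSGI server for production
CMD ["python3", "manage.py", "runserver", "0.0.0.0:8000"]
```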
Warning: -onbuild images have been deprecated.
@AlexForbes raised very good points. But if you want a super simple Dockerfile for Django, you can probably just do:
FROM python:3-onbuild
RUN python manage.py collectstatic
CMD ["python", "manage.py"]
You then run your container with:
docker run myimagename runserver
The little -onbuild modifier does most of what you need: it creates /usr/src/app, sets it as the working directory, copies all your source code inside, and runs pip install -r requirements.txt (which you forgot to run). Finally, we collect static files (which might not be required in your case if they are hosted elsewhere) and set the default command to manage.py so everything is easy to run.
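Since -onbuild is deprecated, the same behaviour can be spelled out explicitly. Roughly, this is what the onbuild variant did behind the scenes (the --noinput flag is added here so collectstatic doesn't prompt during the build):

```dockerfile
FROM python:3
WORKDIR /usr/src/app
# These three steps are what the ONBUILD triggers ran for you
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
RUN python manage.py collectstatic --noinput
CMD ["python", "manage.py"]
```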
You would need docker-compose if you had to run other containers like Celery, Redis or any other background task or server not supplied by your environment.
I actually wrote an article about this in https://rehalcon.blogspot.mx/2018/03/dockerize-your-django-app-for-local.html
My case is very similar, but it adds a MySQL db service and environment variables for code secrets, as well as the use of docker-compose (needed on macOS). I also use the python:2.7-slim parent image instead, to make the image much smaller (under 150 MB).
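For the MySQL-plus-Django case mentioned above, a minimal docker-compose.yml sketch could look like this (service names, the port, and the placeholder password are assumptions, not taken from the question):

```yaml
version: "3"
services:
  web:
    build: .
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: change-me   # placeholder; use a real secret
```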

How to install local packages using pip as part of a docker build?

I've got a package that I want to build into a docker image which depends on an adjacent package on my system.
My requirements.txt looks something like this:
-e ../other_module
numpy==1.0.0
flask==0.12.5
When I call pip install -r requirements.txt in a virtualenv, this works fine. However, if I call it in a Dockerfile, e.g.:
ADD requirements.txt /app
RUN pip install -r requirements.txt
and run using docker build . I get an error saying the following:
../other_module should either be a path to a local project or a VCS url beginning with svn+, git+, hg+, or bzr+
What, if anything, am I doing wrong here?
First of all, you need to add other_module to your Docker image. Without that, the pip install command will not be able to find it. However, you can't ADD a directory that is outside the directory of the Dockerfile, according to the documentation:
The path must be inside the context of the build; you cannot ADD
../something /something, because the first step of a docker build is
to send the context directory (and subdirectories) to the docker
daemon.
So you have to move the other_module directory into the same directory as your Dockerfile, i.e. your structure should look something like this:
.
├── Dockerfile
├── requirements.txt
├── other_module
| ├── module_file.xyz
| └── another_module_file.xyz
then add the following to the Dockerfile:
ADD /other_module /other_module
ADD requirements.txt /app
WORKDIR /app
RUN pip install -r requirements.txt
The WORKDIR command moves you into /app, so the next step, RUN pip install ..., is executed inside the /app directory. And from the app directory, ../other_module is now available.
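Assembling those pieces, the whole Dockerfile might look like this sketch (the python:3 base image is an assumption; note that from WORKDIR /app, the -e ../other_module entry in requirements.txt resolves to /other_module, exactly where the ADD placed it):

```dockerfile
FROM python:3
# Make the local dependency available at /other_module so that the
# "-e ../other_module" entry in requirements.txt resolves from /app
ADD /other_module /other_module
ADD requirements.txt /app/
WORKDIR /app
RUN pip install -r requirements.txt
```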
