Pycharm debugging manage.py commands in docker compose - python

I have a pretty simple setup. I'm running PyCharm 2018.2.3 and using Docker Compose to spin up three containers:
My Django application
NGINX to serve static files
A Postgres DB
I've configured the remote interpreter for debugging the container, and breakpoints work just fine in most cases, at least when I hit my API endpoints or trigger some other action in the Django application.
What does not work is when I run one of my custom manage.py commands. I've tried this two ways so far.
I set up another debug configuration in PyCharm to execute the command. This results in another container spinning up (in place of the original), running the command without stopping at any breakpoints, and then the whole container shuts down.
I've also logged into the container and run the manage.py command directly from the command line. It executes in the container, but again no breakpoints are hit.
The documentation covers the normal case, but I can't find any help for debugging these commands in the container.
Thanks for any help or tips.

In order to debug Django commands in a Docker container, you can create a new Run/Debug Configuration with the following setup:
Use a Python configuration template
Script path: the absolute location of manage.py
Parameters: the Django command you want to debug/execute
!important! Python interpreter: the Docker Compose remote interpreter
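For example, the filled-in fields might read as follows (the paths and command name here are hypothetical):

```text
Script path:        /opt/project/manage.py
Parameters:         my_custom_command --verbosity 2
Python interpreter: Remote Python 3.x Docker Compose (service: django)
Working directory:  /opt/project
```

With the Docker Compose interpreter selected, PyCharm starts the service's container itself and injects its debugger, which is why breakpoints are honored here but not when the command is launched by hand inside the container.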

Just an update in case anybody comes across a similar problem. My personal solution was to not use the manage.py commands, but instead make these same commands available via an http call.
I found that it was easier (and often even more useful) to simply have an endpoint like myserver.com/api/do-admin-function and restrict that to administrative access.
When I put a breakpoint in my code, even running in the container, it breaks just fine as expected and allows me to debug the way I'd like.
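A minimal sketch of that pattern, written as a plain WSGI app so it stays self-contained (the header token check and the function names are hypothetical; in Django this would be a view guarded by a permission decorator, possibly delegating to management.call_command):

```python
def do_admin_function():
    # Stand-in for the logic that used to live in a manage.py command.
    return "admin task done"

def app(environ, start_response):
    # Restrict to administrative access. A real Django view would use a
    # permission check (e.g. staff_member_required); the shared-secret
    # header used here is purely illustrative.
    if environ.get("HTTP_X_ADMIN_TOKEN") != "s3cret":
        start_response("403 Forbidden", [("Content-Type", "text/plain")])
        return [b"forbidden"]
    body = do_admin_function().encode()
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]
```

A breakpoint set inside do_admin_function is then reached by an ordinary HTTP request to the running container, which is exactly the code path the remote debugger already intercepts.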

It can depend on the exact content of your docker-compose.yml.
See for instance the section "An interactive debugger inside a running container!" of the article "A Simple Recipe for Django Development In Docker (Bonus: Testing with Selenium)" from Adam King.
His docker-compose.yml includes:
version: "2"
services:
  django:
    container_name: django_server
    build:
      context: .
      dockerfile: Dockerfile
    image: docker_tutorial_django
    stdin_open: true
    tty: true
    volumes:
      - .:/var/www/myproject
    ports:
      - "8000:8000"
In it, see:
stdin_open: true
tty: true
[Those 2 lines] are important, because they let us run an interactive terminal.
Hit ctrl-c to kill the server running in your terminal, and then bring it up in the background with docker-compose up -d.
docker ps tells us it’s still running.
We need to attach to that running container, in order to see its server output and pdb breakpoints.
The command docker attach django_server will present you with a blank line, but if you refresh your web browser, you’ll see the server output.
Drop import pdb; pdb.set_trace() in your code and you’ll get the interactive debugger, just like you’re used to.

Related

Running a Python debugger in a Docker Image

I recently followed the following tutorial to try to debug python code in a Docker container using VSCode:
https://www.youtube.com/watch?v=qCCj7qy72Bg&t=374s
My Dockerfile looks like this:
FROM ubuntu as base
# Do standard image stuff here

# Python debugger
FROM base as debugger
RUN pip3 install debugpy
ENTRYPOINT ["python3", "-m", "debugpy", "--listen", "0.0.0.0:5678", "--wait-for-client"]
I have alternately tried copying the tutorial exactly and using the following ENTRYPOINT instead:
ENTRYPOINT ["python3","-m","debugpy","--listen","0.0.0.0:5678","--wait-for-client","-m"]
I have also configured a VSCode remote attach debug configuration in launch.json:
{
    "name": "Python: Remote Attach",
    "type": "python",
    "request": "attach",
    "connect": { "host": "5678", "port": 5678 },
    "pathMappings": [
        { "localRoot": "${workspaceFolder}", "remoteRoot": "." }
    ]
},
I want the debugger to either debug the current file alone in isolation, or run a file I use to run the entire project, called init.py with the debugger in the docker container.
Currently, when I build and run the docker container with
docker run -p 5678:5678 CONTAINERNAME python3 /home/init.py
It hangs and times out on the Visual Studio side.
In the video, he uses this to run the Python unittest module, which is why I tried removing the -m from the end of the command in my modified version. However, it looks like debugpy doesn't know what to do. I have tried starting the docker instance before the remote debugger, and the remote debugger after the docker instance, but the error remains and debugging does not work. How can I remote debug into a docker instance using VSCode?
EDIT:
Thank you FlorianLudwig for pointing out that my original code used commas in the IP address rather than the required periods.
I have edited the question to reflect this change. It removed issues where python complained about a malformed address, but it seems I am still having some sort of connection issue to the debugger.
EDIT2:
I think I figured out what caused the connection issue. It appears the visual studio default is to use the same host as the port number in question. I changed my host to 0.0.0.0 and I was able to debug by running the container then connecting to it via Visual Studio Debugging.
In your Dockerfile:
"0,0,0,0:5678" should be "0.0.0.0:5678"
to make it a valid IP address. 0.0.0.0 basically means "any" IP address.
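Combining this with the asker's EDIT2, a working launch.json entry might look like the following (illustrative; the host value is the one the asker reported working after changing it from the port number):

```json
{
    "name": "Python: Remote Attach",
    "type": "python",
    "request": "attach",
    "connect": { "host": "0.0.0.0", "port": 5678 },
    "pathMappings": [
        { "localRoot": "${workspaceFolder}", "remoteRoot": "." }
    ]
}
```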

How to debug web requests with Django and pdb?

I would like to use pdb to debug a view in Django, but so far I've been unsuccessful, getting a BdbQuit error.
The view I've tried this on is a simple GET request:
import pdb

def get_file_names(request):
    pdb.set_trace()
    my_files = Files.objects.filter(user_id=request.user)
    name_list = []
    name_list += list(map(lambda x: (x.id, x.name, x.description),
                          my_files))
    return JsonResponse({'rows': name_list})
A couple of notes:
I prefer not to use django-pdb, since that forces me to modify the client's request parameters.
I also do not want to call my code from pdb (since this code is being called from the client).
Django Version 1.10.6
The app is running inside a docker container
Does anyone have a solution which works? I'm finding that debugging complex web requests in Python can be very tedious, and it would be really amazing if pdb worked.
Note this is not a subprocess, just a simple GET request (eventually I would like it to work on a more complex request, but I've posted a simple example since this already fails).
Any suggestions? The suggestions here don't seem to work.
In order to run pdb inside a Django app running inside a container, you must run the container with the -it flags.
docker run -it .... djangoimage
If you're running detached (-d), you can attach to your container with docker attach $IDCONTAINER.
If you're running with docker-compose:
services:
  django:
    # ...
    stdin_open: true
    tty: true
And then use docker attach to attach to the Django container when you run pdb.
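With stdin_open and tty set as above, attaching might look like this (service name django assumed):

```shell
# find the container started by docker-compose and attach to its terminal
docker attach $(docker-compose ps -q django)
# detach again with Ctrl-p Ctrl-q so the container keeps running
```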
https://docs.docker.com/engine/reference/commandline/attach/
https://docs.docker.com/engine/reference/run/
https://docs.docker.com/compose/compose-file/#domainname-hostname-ipc-mac_address-privileged-read_only-shm_size-stdin_open-tty-user-working_dir

Python remote debugging with docker

I'm making a Flask webapp with Docker, and I'm looking for a way to enable PyCharm debugging. So far I'm able to deploy the app using the built-in Docker support; the app is run automatically because the Dockerfile config uses supervisord.
When I connect my remote interpreter I get the usual:
* Running on http://0.0.0.0:80/ (Press CTRL+C to quit)
* Restarting with stat
* Debugger is active!
* Debugger PIN: 579-233-679
But the POST I perform clearly isn't going to that interpreter: I've marked all of the routes as breakpoints, yet I'm still getting the original results from the webapp and none of the breakpoints do anything.
I guess I'm asking:
Am I going about this the wrong way? (should I just use a VM, remote debug on that and then containerise the VM later on)
Is what I'm trying to do even possible?
Should I just manually debug everything instead if I use this method of development?
Update:
The way to correctly enable debug mode for Docker is to create a docker-compose.yml; this tells PyCharm what to do when you give it a Docker Compose interpreter, so that you can hook onto a service. My yml looks like:
version: '3.0'
services:
  web:
    build: .
    command: python3 app/main.py
    volumes:
      - .:/app
    ports:
      - "80:80"
      - "22"
The yml file isn't generated; you make it yourself.
This exposes port 80, which I've configured Flask to use, and allows the debugger to connect over port 22.
I followed https://blog.jetbrains.com/pycharm/2017/03/docker-compose-getting-flask-up-and-running/ quite closely. (If anyone stumbles onto this and needs a hand, comment and I'll see if I can help.)

Running pudb inside docker container

I prefer pudb for Python debugging. I am building Python applications that run inside a Docker container.
Does anyone know how to make pudb available inside a Docker container?
Thank you
You need to have pudb installed in the Docker container (this may be done by adding this line to the Dockerfile: RUN pip install pudb).
You need to have the ports where you will connect to pudb open. E.g.
For a Dockerfile: add EXPOSE 6900.
For docker-compose the syntax is different:
ports:
  - "6900:6900"
You need to add a call to set_trace where you want the entry point to be in the Python code. E.g.
from pudb.remote import set_trace; set_trace(term_size=(160, 40), host='0.0.0.0', port=6900)
When the code is running and reaches that point, you can connect to it with a telnet client and use pudb as you normally would to debug. In the case above, from another terminal, type telnet 127.0.0.1 6900.
You can find a repository with a full working example here: https://github.com/isaacbernat/docker-pudb

How do I run Django as a service?

I am having difficulty running Django on my Ubuntu server. I am able to run Django, but I don't know how to run it as a service.
Distributor ID: Ubuntu
Description: Ubuntu 10.10
Release: 10.10
Codename: maverick
Here is what I am doing:
I log onto my Ubuntu server
Start my Django process: sudo ./manage.py runserver 0.0.0.0:80 &
Test: Traffic passes and the app displays the right page.
Now I close my terminal window and it all stops. I think I need to run it as a service somehow, but I can't figure out how to do that.
How do I keep my Django process running on port 80 even when I'm not logged in?
Also, I get that I should be linking it through Apache, but I'm not ready for that yet.
Don't use manage.py runserver to run your server on port 80. Not even for development. If you need that for your development environment, it's still better to redirect traffic from 8000 to 80 through iptables than running your django application as root.
In the Django documentation (or in other answers to this post) you can find out how to run it with a real webserver.
If, for any other reason, you need a process to keep running in the background after you close your terminal, you can't just run the process with &: it will run in the background but keep your session's session id, and will be closed when the session leader (your terminal) is terminated.
You can circumvent this behaviour by running the process through the setsid utility. See the manpage for setsid for more details.
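For example (same hypothetical project path as the nohup command below):

```shell
# setsid gives the server its own session id, so closing the
# terminal (the old session leader) no longer kills it
sudo setsid /home/ubuntu/django_projects/myproject/manage.py runserver 0.0.0.0:80
```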
Anyway, if after reading other comments, you still want to use the process with manage.py, just add "nohup" before your command line:
sudo nohup /home/ubuntu/django_projects/myproject/manage.py runserver 0.0.0.0:80 &
For this kind of job, since you're on Ubuntu, you should use the awesome Ubuntu upstart.
Just specify a file, e.g. django-fcgi.conf, in case you're going to deploy Django with FastCGI:
/etc/init/django-fcgi.conf
and put the required upstart syntax instructions in it.
Then you would be able to start and stop your runserver command simply with:
start runserver
and
stop runserver
Examples of managing the deployment of Django processes with Upstart: here and here. I found those two links helpful when setting up this deployment structure myself.
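A minimal Upstart job for the simple runserver case might look like this (illustrative only; the file and job name are hypothetical, and a real FastCGI deployment would exec a different command):

```text
# /etc/init/runserver.conf
description "Django dev server"
start on runlevel [2345]
stop on runlevel [016]
chdir /home/ubuntu/django_projects/myproject
exec ./manage.py runserver 0.0.0.0:80
respawn
```

The job name comes from the filename, so this file is what makes `start runserver` and `stop runserver` work.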
The problem is that & runs a program in the background but does not separate it from the spawning process. However, an additional issue is that you are running the development server, which is only for testing purposes and should not be used for a production environment.
Use gunicorn or Apache with mod_wsgi. The documentation for Django and these projects should make it explicit how to serve it properly.
If you just want a really quick-and-dirty way to run your django dev server on port 80 and leave it there -- which is not something I recommend -- you could potentially run it in a screen. screen will create a terminal that will not close even if you close your connection. You can even run it in the foreground of a screen terminal and disconnect, leaving it to run until reboot.
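The screen workflow, for the record (session name hypothetical):

```shell
screen -S django                       # open a detachable terminal session
sudo ./manage.py runserver 0.0.0.0:80  # run the dev server in the foreground
# press Ctrl-a d to detach; the server keeps running
screen -r django                       # reattach later
```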
If you are using virtualenv, the sudo command will execute the manage.py runserver command outside of the virtual environment context, and you'll get all kinds of errors.
To fix that, I did the following:
while working on the virtual env type:
which python
outputs: /home/oleg/.virtualenvs/openmuni/bin/python
then type:
sudo !!
outputs: /usr/bin/python
Then all that's left to do is create a symbolic link from the global python to the python in the virtualenv that you currently use and would like to run on 0.0.0.0:80.
First move the global python binary to a backup location:
mv /usr/bin/python /usr/bin/python.old
Then link the virtualenv's python in its place (note that ln -s takes the target first):
ln -s /home/oleg/.virtualenvs/openmuni/bin/python /usr/bin/python
That's it! Now you can run sudo python manage.py runserver 0.0.0.0:80 in the virtualenv context!
Keep in mind that if you are using a Postgres DB in your local development setup, you'll probably need a root role.
Credit to @ydaniv
