vscode run / debug python in docker instance - python

I'm using Jupyter notebooks to prototype, and I write the majority of my code as Python packages using VS Code, installed like so:
pip install -e .
This works well, as I can rapidly prototype in Jupyter but still maintain reusable / testable code by keeping most of the heavy lifting in the package(s).
I'd like to move my Python/Jupyter environment to Docker. Is there any way to configure VS Code to work well with a "remote" development environment running in a Docker container?

Since May 2019 (version 1.35), the VS Code Remote Development feature has been available in the stable release. It splits VS Code into two parts:
a server part that can be run on a remote computer, in a container, or in a WSL environment
a client part, mainly the GUI, that runs locally
When properly configured, debugging/linting/... operations are executed inside the container. To answer your specific question: you can get a debugging experience identical to that of an uncontainerized setup.
See here for a quick overview of this feature. You can find an official VS Code tutorial on how to set up VS Code with Docker here.
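As a minimal sketch of what such a setup can look like (the image, extension IDs, and postCreateCommand below are assumptions for a typical Python/Jupyter project, not something prescribed by the tutorial), a .devcontainer/devcontainer.json might be:

// .devcontainer/devcontainer.json -- minimal sketch, adjust image and tooling to taste
{
    "name": "python-dev",
    "image": "python:3.11",
    "customizations": {
        "vscode": {
            // install the Python and Jupyter extensions inside the container
            "extensions": ["ms-python.python", "ms-toolsai.jupyter"]
        }
    },
    // install your package in editable mode once the container is created
    "postCreateCommand": "pip install -e ."
}

With that file in the repository, the "Reopen in Container" command builds and starts the container, and the editable install keeps the rapid-prototyping workflow from the question.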

If you expose the Jupyter instance running in the container to your machine, you may be able to specify it as a remote Jupyter server.
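A rough sketch of that approach, assuming one of the stock Jupyter Docker images (the image name, port, and mount path are placeholders):

docker run --rm -p 8888:8888 -v "$PWD":/home/jovyan/work jupyter/base-notebook
# the container prints a URL with a token, e.g. http://127.0.0.1:8888/?token=...
# point VS Code's Jupyter extension (or your local Jupyter client) at that URL
# as an existing/remote Jupyter server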

Related

PyCharm synchronization between local files and (local) docker

I know that as of now (June 2021) PyCharm does not support fully remote development the way VS Code does; however, it supports a remote interpreter, and in the case of an SSH connection, it supports automatic upload of local files to the server.
My question is: can I use an interpreter running in a Docker container and synchronize my files with the files inside the container?
Apparently, I can define a remote Docker interpreter and set up a "Mapping".
However, I don't see any way to automatically upload my local files / changes to the container. When I try to configure a new deployment, I don't see any Docker option.
Am I missing something, or does PyCharm simply not support such a "Docker deployment"?
(This might be related, but there's no clear conclusion there...)
(UPDATE: after studying this more, I'm not even sure I can use an interpreter in a Docker container at all. It seems that PyCharm only works from a Docker image, and anything I do inside the container... is gone in the next session. Or am I wrong?)

Python: Question about packaging applications docker vs pyinstaller

I have a python application that I've created an executable of, using pyinstaller. The entire python interpreter is packaged into the executable with all its pip dependencies.
So now my application can run in environments where python or python modules may not be installed, but there are still some dependencies:
1) MongoDB - This is the database my application uses, and it needs to be installed on a system for it to work of course.
2) Mosquitto - This service is required because the application uses MQTT to receive/send commands.
My current method of handling this is to use a shell script which installs MongoDB and Mosquitto the first time my application is deployed somewhere. I just discovered Docker, and I was wondering whether it is capable of packaging these 'external' dependencies into a Docker image.
Is it possible for me to have one standalone "thing" which will run in any environment regardless of whether mongoDB or mosquitto are installed there?
And how exactly would I go about doing this?
(Unrelated but this application is meant to run on a raspberry pi)
If you adopted Docker here:
You'd still have to "separately" run the external services; they couldn't be packaged into a single artifact per se. There's a standard tool called Docker Compose that provides this capability, though, and you'd generally distribute a docker-compose.yml file that describes how to run the set of related containers.
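For example, a docker-compose.yml for this kind of stack could look roughly like the sketch below; the service names, image tags, and environment variables are assumptions, not something your application already understands:

version: "3.8"
services:
  app:
    image: yourname/yourapp:latest        # hypothetical image of your Python application
    depends_on: [mongo, mosquitto]
    environment:
      MONGO_URL: mongodb://mongo:27017    # reach the other containers by service name
      MQTT_HOST: mosquitto
  mongo:
    image: mongo:4.4                      # pick an ARM-compatible tag for a Raspberry Pi
    volumes:
      - mongo-data:/data/db
  mosquitto:
    image: eclipse-mosquitto:2            # 2.x needs a config file to accept remote clients
volumes:
  mongo-data:

Running docker-compose up -d then starts all three containers together.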
It's unusual to distribute a Docker image as files; instead you'd push your built image to a registry (like Docker Hub, but the major public-cloud providers offer this as a hosted service, there are a couple of independent services, or you can run your own). Docker can then retrieve the image via HTTP.
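A sketch of that workflow, with a placeholder image name:

# on the machine where you build the image
docker build -t yourname/yourapp:1.0 .
docker push yourname/yourapp:1.0
# on the target machine (e.g. the Raspberry Pi)
docker pull yourname/yourapp:1.0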
Docker containers can generally only be run by root-equivalent users. Since you're talking about installing databases as part of your bring-up process this probably isn't a concern for you, but you could run a plain-Python or PyInstaller-built application as an ordinary user. Anyone who can run any Docker command has unrestricted root-level access on the host.

Using pycharm to debug django application with python3 as docker container within a vagrant instance

I set up an Ubuntu Vagrant instance as a virtual machine and installed Docker in it. I also have Python running as a Docker container within the Vagrant instance. Is there any way to debug a Django application with PyCharm directly against the Python in that container?
Thanks!
It is indeed possible, but I very much doubt you really need such a complex setup. Isn't it possible to run Docker on your OS directly?
Anyway, if you are confident you need this: PyCharm at the moment does not "natively" support using a remote Docker daemon, so you have to tweak a bunch of options manually. I wrote a detailed guide in the dedicated ticket in PyCharm's bug tracker: https://youtrack.jetbrains.com/issue/PY-33489 The remote machine in your case is the Vagrant VM.
P.S. Please vote for the PY-33489 ticket if you want this kind of support in PyCharm to be simplified.
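As a rough illustration of the kind of plumbing involved (the port number is an assumption, and the full details are in the linked guide): if the Docker daemon inside the VM is configured to listen on TCP, you can forward that port in the Vagrantfile and point PyCharm's Docker connection (Settings > Build, Execution, Deployment > Docker, "TCP socket") at it:

# Vagrantfile (excerpt): forward the Docker daemon port to the host
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/focal64"
  # requires the daemon in the VM to listen on tcp://0.0.0.0:2375 (unauthenticated,
  # so only do this for local development)
  config.vm.network "forwarded_port", guest: 2375, host: 2375
end

PyCharm would then talk to tcp://localhost:2375 as if it were a local daemon.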

In PyCharm Professional, where did the ability to run docker and docker-compose Python interpreters go?

I've just upgraded to PyCharm Professional 2018.3.3, and I've noticed that a critical feature seems to have been removed:
PyCharm Professional normally allows Python interpreters to be hosted inside Docker containers, optionally configured by docker-compose. In the latest update, this feature seems to have vanished.
Is there another way to add a Python interpreter that's inside a Docker container? Is this a bug or have JetBrains intentionally removed this feature?
Problem: The "Docker Python" plugin had been remotely disabled.
Solution: Ensure that the plugin is installed, and enabled.
Possible cause: I was using a plugin which synchronized configuration between installations.

Django in Docker using PyCharm

I'm trying to develop a database manager in Django and want to develop and deploy it in Docker. As my IDE, I'd like to continue using PyCharm, but I'm having trouble understanding how it interacts with Docker.
I am new to Docker and its integration in PyCharm. My system runs Windows 10 and Docker for Windows.
I already tried using PyCharm's remote interpreter, but I have to activate the port forwarding manually (using Kitematic), since PyCharm somehow does not forward the exposed port automatically.
I also tried using a "Docker Deployment" run configuration. However, I can't get requests to localhost:8000 to get through to the Django server. All I get are empty response errors.
(Note: the issue highlighted above was addressed in the accepted answer.)
It would really help me to have an explanation of how PyCharm's two options (remote interpreter and Docker deployment) really work, and ideally an up-to-date tutorial for setting up Django with them. Unfortunately, I could only find outdated tutorials, and JetBrains' help pages are either outdated or do not explain it in enough detail.
Could someone help me out and guide me through this or point me to good resources?
Assuming you have the latest Docker (for Mac or for Windows) along with an updated version of PyCharm, you could achieve the port forwarding (binding) this way:
Create a new run configuration
Select your Docker server in the Deployment tab. If nothing shows, create a new one. Test that it actually works by opening View > Tool Windows > Docker and connecting to the Docker server. You should see the existing images and running containers.
In the Container tab, make sure to add the right port bindings.
An important note
Make sure that you are running your Django server on 0.0.0.0:8000 and not localhost:8000, so that it is reachable from outside the container.
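For instance (assuming the container's Django port is mapped to the same port on the host):

# inside the container (Dockerfile CMD or the run configuration's command)
python manage.py runserver 0.0.0.0:8000
# with a port binding of 8000:8000, http://localhost:8000 on the host
# then reaches the server running in the container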
