Docker container and virtual python environment - python

I'm getting started working with Docker. I installed Docker Toolbox on Windows 10 and downloaded the desired container. I need full access to the container's filesystem, with the ability to add and edit files. Can I transfer the contents of the container into a virtual Python environment in the Windows filesystem? If so, how?

Transferring files between Windows and Linux might be a little annoying because of different line endings.
Putting that aside, it sounds like you are looking to create a Docker-based development environment. There are good tutorials online that walk you through setting one up; I would start with one of these:
Running a Rails Development Environment in Docker. This one is about Rails, but the principles will be the same. Section 3 specifically talks about sharing code between your host machine and the Docker container.
How To Work with Docker Data Volumes on Ubuntu 14.04 includes a brief introduction to Docker containers, different use cases for data volumes, and how to get each one working. The "Sharing Data Between the Host and the Docker Container" section covers what you are trying to do. That example is about reading log files created inside the container, but the principle is the same for adding/updating files in the container.
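The underlying mechanism both tutorials rely on is a bind mount: a host directory mapped into the container so changes are visible on both sides. A minimal sketch, assuming Docker Toolbox's default C:\Users share and placeholder names:

# mount the host folder C:\Users\you\project at /app inside the container
docker run -it -v /c/Users/you/project:/app my-image bash

With Docker Toolbox, only paths under C:\Users are shared into the VirtualBox VM by default, so keep the project there (or add another shared folder in the VirtualBox settings).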

Related

installing python in a docker container

I'm new to coding and have been fiddling around with Docker containers and services.
I have installed a temporary VS Code server on my Raspberry Pi and deployed it on my local LAN so I can access it from various machines.
Now I've been trying to create a Flask app and run it from the container, and I'm trying to figure out how to publish and run the Flask web server, since I can't work out what IP I should host it on (the default I always used was host=127.0.0.1, port=8080, but that would bring me to the local machine I'm visiting it from).
So, while I was troubleshooting to understand what to do with exposed ports etc., I stopped the container and changed the docker-compose file (I have a path set for the config's permanent storage, so my VS Code settings are actually saved and persistent between deployments).
But I'm having the problem that every time I stop and redeploy the container I lose my Python 3 installation, and have to rerun apt update, apt upgrade, apt install python3-pip and reinstall every Python package I need for the project.
Where am I going wrong?
Silly question, but where does Python get installed, and why isn't it persistent, since I have my config path set?
I read that Python gets installed in /usr/local/lib; should I also map those directories to the persistent storage folder?
How should I do that?
Thanks
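(For context: apt and pip changes made inside a running container live only in that container's writable layer, so they disappear whenever the container is recreated; the config path survives because it is a mounted volume. The usual fix is to bake the installation into the image with a Dockerfile rather than mapping /usr/local/lib. A minimal sketch, with the base image and package list as assumptions rather than the poster's actual setup:

# Dockerfile - adapt the base image to the vscode-server image actually deployed
FROM debian:bullseye-slim
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*
RUN pip3 install flask

The docker-compose service would then use build: . instead of a stock image:, so every redeployment starts from an image that already contains Python.)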

PyCharm synchronization between local files and (local) docker

I know that as of now (June 2021) PyCharm does not support fully remote development the way VS Code does; however, it supports a remote interpreter, and in the case of an SSH connection it supports automatic upload of local files to the server.
My question is: can I use an interpreter running in a Docker container and synchronize my files with the files within the container?
Apparently, I can define a remote Docker interpreter and set up a "Mapping".
However, I don't see any way to automatically upload my local files / changes to the container. When I try to configure a new deployment, I don't see any Docker option.
Am I missing something, or does PyCharm simply not support such a "Docker deployment"?
(This might be related, but there is no clear conclusion there...)
(UPDATE: after studying this more, I'm not even sure I can use an interpreter in a Docker container. It seems that PyCharm only lets me pick a Docker image, and anything I would do inside Docker... is gone in the next session. Or am I wrong?)
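(One thing worth noting: if the project directory is bind-mounted into the container, there is nothing to upload in the first place; edits on the host are immediately visible inside the container. A sketch of such a mapping in docker-compose.yml, with the service name and container path as assumptions:

# docker-compose.yml - sketch only
services:
  app:
    build: .
    volumes:
      - .:/opt/project   # the host project directory, mirrored inside the container

PyCharm Professional can then be pointed at the app service as a Docker Compose remote interpreter, while the bind mount takes care of keeping files in sync.)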

Python: Question about packaging applications docker vs pyinstaller

I have a Python application that I've turned into an executable using PyInstaller. The entire Python interpreter is packaged into the executable along with all its pip dependencies.
So now my application can run in environments where python or python modules may not be installed, but there are still some dependencies:
1) MongoDB - This is the database my application uses, and it needs to be installed on a system for it to work of course.
2) Mosquitto - This service is required because the application uses MQTT to receive/send commands.
My current way of handling this is a shell script which installs MongoDB and Mosquitto the first time my application is deployed somewhere. I just discovered Docker, and I was wondering whether it is capable of packaging these 'external' dependencies into a Docker image.
Is it possible for me to have one standalone "thing" which will run in any environment regardless of whether MongoDB or Mosquitto are installed there?
And how exactly would I go about doing this?
(Unrelated, but this application is meant to run on a Raspberry Pi.)
If you adopted Docker here:
You'd still have to "separately" run the external services; they couldn't be packaged into a single artifact per se. There's a standard tool called Docker Compose that provides this capability, though, and you'd generally distribute a docker-compose.yml file that describes how to run the set of related containers (a sketch follows at the end of this answer).
It's unusual to distribute a Docker image as files; instead you'd push your built image to a registry (Docker Hub is the best known, the major public-cloud providers offer registries as a hosted service, there are a couple of independent services, or you can run your own). Docker can then retrieve the image via HTTP.
Docker containers can only be run by root-equivalent users. Since you're talking about installing databases as part of your bringup process this probably isn't a concern for you, but you could run a plain-Python or pyinstallered application as an ordinary user. Anyone who can run any Docker command has unrestricted root-level access on the host.
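A sketch of the kind of docker-compose.yml mentioned above; the image names, versions and ports are illustrative, not taken from the question:

# docker-compose.yml
version: "3.8"
services:
  app:
    image: yourname/yourapp:latest    # your application image, pulled from a registry
    depends_on:
      - mongo
      - mosquitto
  mongo:
    image: mongo:4.4
    volumes:
      - mongo-data:/data/db           # keep database files across container restarts
  mosquitto:
    image: eclipse-mosquitto:2
    ports:
      - "1883:1883"                   # MQTT
volumes:
  mongo-data:

One caveat for the Raspberry Pi: you'd need ARM-compatible images, which is worth checking for MongoDB in particular before committing to this layout.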

vscode run / debug python in docker instance

I'm using Jupyter notebooks to prototype, and I write the majority of my code as Python packages using VS Code, installed like so:
pip install -e .
This works well, as I can rapidly prototype in Jupyter but still maintain reusable / testable code by keeping most of the heavy lifting in the package(s).
I'd like to move my Python/Jupyter environment to Docker. Is there any way to configure VS Code to work well with a "remote" development environment running in a Docker container?
Since May 2019 (version 1.35), the VS Code Remote Development feature has been available in the stable release. It splits VS Code in two:
a server part that can run on a remote computer, in a container, or in a WSL environment
a client part, mainly the GUI, that runs locally
When properly configured, debugging/linting/... operations are executed inside the container. To answer your specific question: you can get a debugging experience identical to that of an uncontainerized setup.
See here for a quick overview of this feature. You can find a VS Code-issued tutorial on how to set up VS Code with Docker here.
If you expose the Jupyter instance running in the container to your machine, you may be able to specify it as a remote Jupyter server.
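If you go that route, a minimal sketch of exposing the notebook server; the image is one of the stock Jupyter Docker Stacks images and is only an example:

# publish the notebook port and mount the current directory as the work folder
docker run -p 8888:8888 -v "$PWD":/home/jovyan/work jupyter/scipy-notebook

VS Code (or a browser) on the host can then connect to http://localhost:8888, using the token the container prints at startup.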

Beginner Docker-Compose & Django

I'm reading through the Docker Compose docs and have a question about the first code example under the heading:
Create a Django project
To create a new django project, it states that you should run the following line of code:
docker-compose run web django-admin.py startproject composeexample .
What I'm not understanding is why we should run this command in the context of docker-compose run. It's still creating the folder on our local machine. So why are we going through docker-compose to do this?
The point of Docker here is repeatability. Note that it is not the django-admin.py on your local machine that is executed (nor the Python version on your local machine, for that matter). It is instead the binaries in the container that was built in the preceding steps.
By executing the command through the 'web' container, anyone with that container runs exactly the same versions of the binaries and libraries, removing the "it works on my machine" problem.
Of course, in this example (for simplicity) the image is built on your machine just before it gets used; in a real-world situation you'd share the resulting image through a registry so that everyone on your team can use it.
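As for why the generated project still shows up on your local machine: the tutorial's docker-compose.yml declares a bind mount for the web service, roughly like the following (paraphrased; check the docs for the exact file):

services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code          # the current host directory is mounted at /code in the container
    ports:
      - "8000:8000"

Because /code inside the container is just your host directory, django-admin.py runs with the container's Python and libraries but writes the project files straight onto your machine.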
