PyCharm remote debug using remote container Python interpreter

My situation is that I have set up a container on a remote server, and inside the container there is a virtual environment. I'm using the Python interpreter inside this virtual environment in the container, not the one on the host.
From my local machine, I can open PyCharm and use Tools->Deployment->Configuration to easily set up a remote connection. For a specific project, I can set the interpreter under File->Settings->Project Interpreter. However, it seems that I can only select the host Python interpreter (/usr/bin/python) on the remote server, not the one inside the virtual environment in the container. How can I set it up to use this interpreter?
I googled but couldn't find an exact solution. I don't think I need to install Docker locally, because my Docker runs on the remote server side, right?

In the same way you connect to the remote host, you would need to set up the container with the same capabilities, e.g. run an SSH server inside it. Then you would either expose that port to the outside world or, better, use a nested SSH tunnel.
Another interesting (and perhaps recommended) approach is to forward the Docker socket from the remote machine, so that your local Docker CLI sends its commands to the remote host. In theory, you could then add the container directly in PyCharm by pointing it at the correct Docker host address.
Furthermore, virtual environments on anything other than the local host are not natively supported by PyCharm. However, you could try adding the path of the Python binary inside the virtual environment, e.g. venv/bin/python relative to the project directory, and see if it works.
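If you go the socket-forwarding route, a minimal sketch might look like the following (the user and host names are placeholders for your setup; Unix-socket forwarding needs OpenSSH 6.7 or newer):

```shell
# Forward the remote Docker daemon's socket to a local socket over SSH
ssh -nNT -L /tmp/remote-docker.sock:/var/run/docker.sock user@remote-server &

# Point the local Docker CLI (and PyCharm's Docker integration) at it
export DOCKER_HOST=unix:///tmp/remote-docker.sock
docker ps   # now lists containers running on the remote host
```

Newer Docker CLI versions (18.09+) also accept DOCKER_HOST=ssh://user@remote-server directly, without an explicit tunnel.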

Related

How to create a directory on a remote system using a Python script

I want to create a directory on my remote system using a Python script or socket programming. I have the remote system's username, password, and IP address. I am able to do this on my local machine but not on the remote one. Please help!
Download PuTTY, connect to the remote system, and in the terminal run mkdir foldername.
To create a directory on a remote machine, you first have to connect to it. Telnet and SSH are the usual ways to connect to remote machines; obviously the Telnet or SSH service has to be running on the remote machine, otherwise you won't be able to connect. Since Telnet transfers data in plain text, it's better to use the SSH protocol.
Once connected to the remote machine over SSH, you will be able to execute commands on it.
Now, since you want to do everything in Python, you could write a complete SSH client in Python. That is great for learning, because you will learn about socket programming and cryptography.
If you are in a hurry, you can use a good SSH library such as Paramiko.
If you are getting a network connection error, check whether SSH is installed on the remote machine. If it is, check the firewall settings.
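As a minimal sketch (the user and host names below are placeholders, and key-based SSH authentication is assumed so no password prompt appears), you can also drive the system ssh client from the standard library instead of writing your own client:

```python
import shlex
import subprocess

def build_mkdir_cmd(user, host, path):
    # Compose the ssh invocation; shlex.quote protects paths containing spaces
    return ["ssh", "{}@{}".format(user, host), "mkdir -p {}".format(shlex.quote(path))]

def remote_mkdir(user, host, path):
    # Runs `mkdir -p <path>` on the remote machine; raises CalledProcessError on failure
    subprocess.run(build_mkdir_cmd(user, host, path), check=True)
```

A library such as Paramiko gives you the same thing natively in Python (an SSHClient object on which you call exec_command) and can also handle password authentication.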

How can I set a Docker volume to a directory on another computer? How can I grab data from a different computer for Jupyter running in a Docker container?

So I have a computer, let's call it "local": a laptop that I use to access a host computer behind a VPN at work. Let's call this computer "host". I can set up a Jupyter notebook running inside Docker on host, and access the Docker container and the Jupyter instance from my local computer as if I were running Jupyter locally. This is the normal, common setup.
The question is: I have a computer/database with all the data, let's call it "data_server", connected to host through a fast Ethernet LAN, that (as the name says) holds all the data I want to work on from host. data_server is just that, a Linux or BSD server; I can ssh or sftp from host to data_server just fine and run a few commands, but it really has no computation capability.
What I want is to be able to run something like
docker run -v data_server/data/I/want:~/local/workspace/data
and have the data from the server behave as if it were part of the volume in Docker. Is there a way to do something like this? If not, what is the best practice for accessing data on a different computer from Jupyter running in Docker?
You can use NFS for that: you mount the volume as if it were local, but the folder actually lives on your server.
First you need to set up an NFS server (there are Docker images for that if you want), then mount the exported directory on your client.
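For example, Docker's built-in local volume driver can mount an NFS export directly (the server name, export path, and image name here are placeholders for your setup):

```shell
# Create a volume backed by the NFS export on data_server
docker volume create --driver local \
  --opt type=nfs \
  --opt o=addr=data_server,rw \
  --opt device=:/export/data \
  nfs-data

# Mount it into the Jupyter container like any other volume
docker run -v nfs-data:/workspace/data my-jupyter-image
```

This keeps the NFS details in the volume definition, so the docker run command stays the same whether the data is local or remote.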

Problems connecting PyCharm to a remote interpreter via Docker

I'm trying to connect PyCharm 2016.2 Professional to a remote Python interpreter via Docker. I have Docker for Windows version 1.12.0-rc2-beta20 (the one that uses Hyper-V on Windows 10, not VirtualBox). In PyCharm, there's a dialog that asks for the Docker API URL, which is populated by default with: https://192.168.99.100:2376
But that doesn't connect. I also tried localhost:2376 and 10.0.75.0:2376 (10.0.75.0 is the internal IP for vEthernet (DockerNAT)), and they don't work either. Is the integration just not supported with non-VirtualBox Docker, or am I entering the wrong thing?

Connected to ssh session via my browser. How can I access my local files through the browser?

I'm connected to a VM on a private network at address 'abc.def.com' using ssh, and on that VM there's an application that hosts a Python web app (IPython Notebook) that I can access by pointing my local browser to 'abc.def.com:7777'.
From that web app I can call shell commands by preceding them with '!', for example !ls -lt will list the files in the VM current working directory. But since I'm using my own laptop's browser, I think I should be able to run shell commands on my local files as well. How would I do that?
If that's not possible, what Python/shell command can I run from within the web app to automatically get my laptop's IP address to use things like scp? I know how to get my IP address, but I'd like to create a program that will automatically enable scp for whoever uses it.
You have SSH access, so you could write a Python function that transfers files via scp, the secure copy command, which uses SSH to communicate. If you exchange keys with the server, you won't have to type a password, so I see no problem from that standpoint. The issue is whether your local machine has an address it can be reached at from the server.
I work on various remotes from my laptop all day, and from my laptop to the server I could have this function:

    import subprocess

    def scp_to_server(address, local_file, remote_file):
        # copy local_file to remote_file on the machine at address
        subprocess.call(['scp', local_file, "myusername@{}:{}".format(address, remote_file)])

That would copy a file from my local machine to the remote, provided the paths are correct, I have permission to copy the files, and my local machine's id_rsa.pub key is in the ~/.ssh/authorized_keys file on the remote.
However, I have no way to initiate a secure copy from the remote to my local machine, because there is no address for the local machine that I can "see" from the remote.
If I open the terminal on my laptop and run hostname, I see mylaptop.local, and on the remote I see remoteserver@where.i.work.edu. The first is a local address: I can see it from other machines on my LAN at home (because I have configured that), but I can't see mylaptop.local from the remote. I know there is a way to configure things so I could find my laptop at home from anywhere, but I've never had the need to do that (since I bring the laptop with me), so I can't help you there. I think there are a few more hurdles here than you would like.
You could implement the function above on your local machine and transfer the files that way though.
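One standard trick for the missing return address, for what it's worth, is an SSH reverse tunnel: while connecting from the laptop to the remote, expose the laptop's own SSH server on a port of the remote (user names, hosts, and ports below are placeholders, and assume an SSH server is running on the laptop):

```shell
# From the laptop: connect to the remote, forwarding remote port 2222
# back to the laptop's local SSH server on port 22
ssh -R 2222:localhost:22 remoteserver@where.i.work.edu

# Then, on the remote, copy a file "down" to the laptop through the tunnel
scp -P 2222 /path/on/remote/file.txt mylaptopuser@localhost:/path/on/laptop/
```

As long as the forwarded session stays open, the remote can reach the laptop as localhost:2222, even though the laptop has no routable address of its own.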

Issue connecting to Docker container

I am trying to connect a script on a Docker host to a script on a Docker container.
The scripts are using Python's remote queue manager, and they work perfectly outside of Docker, so I'm quite sure the issue is with my Docker configuration or my understanding of Docker port forwarding.
The script on the container binds correctly to (localhost,5800), and I verified the script does not crash.
I've tried getting the script to connect to the IP address of the container on port 5800, and that doesn't work (Connection refused). I've also tried using the -p flag and forwarding 5800 to a random port, then connecting to (localhost,randomport) from the Docker host and that doesn't work either (Connection refused).
Again, the script is definitely running, since the issue occurs even when I get a shell on the container and manually launch the script, ensuring it successfully launches the server and does not shut it down.
To me this seems exactly like the problem of running a webserver inside a Docker container. Why is this not working? The scripts work just fine outside of Docker.
https://github.com/hashme/thistle/tree/flask_thistle
(see room.py for the container script and app.py for the host script; I'm not running the scripts exactly as-is but hacking away in a REPL, so I've adjusted many parameters without success)
To replicate the problem, first run ./container.sh, then (in a REPL) import app and create a MessagePasser with some IP address and port number. Running the app.py script directly does nothing.
The script on the container binds correctly to (localhost,5800)
You need to make sure that, within the container, the script binds to "0.0.0.0" (all interfaces), not localhost (the loopback interface). Otherwise it won't be able to accept any connections from outside the container.
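A minimal sketch of the server side with multiprocessing's remote manager (the port and authkey are placeholders), showing the 0.0.0.0 bind address:

```python
import queue
from multiprocessing.managers import BaseManager

def make_queue_server(port=5800, authkey=b"secret"):
    # Bind to 0.0.0.0, not "localhost", so connections arriving through
    # Docker's port forwarding (-p 5800:5800) are accepted
    shared = queue.Queue()

    class QueueManager(BaseManager):
        pass

    QueueManager.register("get_queue", callable=lambda: shared)
    return QueueManager(address=("0.0.0.0", port), authkey=authkey)

# Inside the container you would then run:
#   make_queue_server().get_server().serve_forever()
```

The client on the Docker host connects to the published port with the same authkey and a matching register("get_queue") call.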
