Connecting local Jupyter Notebook to remote PuTTY session - python

I have a Python script in my local Jupyter notebook. Right now, I run a command in a PuTTY session to generate some data, write it to an output text file, and then download that file to my local machine so it can be read into the Python script. Is there any way to issue the command and do all of this from the script itself?

You can run commands over SSH from Python via the Paramiko library: https://www.paramiko.org
Example from the official documentation:
from paramiko import SSHClient

client = SSHClient()
client.load_system_host_keys()
client.connect('ssh.example.com')
stdin, stdout, stderr = client.exec_command('ls -l')
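Since the question is about generating a file remotely and then reading it locally, a rough end-to-end sketch could look like the following. The hostname, credentials, remote command, and file paths are placeholders (not from the question), so adjust them to your setup:

import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect('ssh.example.com', username='user', password='secret')  # placeholder credentials

# Run the data-generating command remotely and wait for it to finish
stdin, stdout, stderr = client.exec_command('./generate_data.sh > /tmp/output.txt')
exit_status = stdout.channel.recv_exit_status()  # blocks until the remote command completes

# Download the generated file so the notebook can read it locally
sftp = client.open_sftp()
sftp.get('/tmp/output.txt', 'output.txt')
sftp.close()
client.close()

with open('output.txt') as f:
    data = f.read()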

Related

Python script to SSH into a jumphost and sftp from within that box

I'm currently trying to write an Airflow job that will allow me to SSH into an EC2 instance and then start an SFTP session with another host from within that EC2 box. My current code is as follows:
def run_ssh():
    hook = SSHHook(ssh_conn_id='xyz').get_conn()  # returns an SSH client
    stdin, stdout, stderr = hook.exec_command('sftp user@host.com;')
    # This next step prompts me for a password, so I provide it
    stdin.write('password')
    logging.info(stdout.readlines())
    stdin, stdout, stderr = hook.exec_command('ls')
    logging.info(stdout.readlines())
When I print the final line I expect to see some folders, but instead I just see ['a\n']... so it seems I'm not actually able to SFTP. Are there better ways to SFTP from a remote host through a Python script running locally?
Any help with this is appreciated. The answer can be geared towards a simple Python script as opposed to Airflow.
For your literal question, see:
Pass input/variables to command/script over SSH using Python Paramiko
Though implementing SFTP over a jump host this way is not a good solution.
Use port forwarding instead:
Nested SSH using Python Paramiko
Port forwarding and then open SFTP using Python Paramiko
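To illustrate the port-forwarding approach, here is a minimal sketch that tunnels through the jump host with a direct-tcpip channel and opens SFTP on the target. The host names and credentials (jump.example.com, target.example.com, etc.) are placeholders for your EC2 instance and the final SFTP host:

import paramiko

# Connect to the jump host (the EC2 instance)
jump = paramiko.SSHClient()
jump.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jump.connect('jump.example.com', username='ec2-user', password='secret')

# Open a channel from the jump host to the target's SSH port
channel = jump.get_transport().open_channel(
    'direct-tcpip', ('target.example.com', 22), ('127.0.0.1', 0))

# Connect to the target through that channel and open SFTP
target = paramiko.SSHClient()
target.set_missing_host_key_policy(paramiko.AutoAddPolicy())
target.connect('target.example.com', username='user', password='secret', sock=channel)

sftp = target.open_sftp()
print(sftp.listdir('.'))

sftp.close()
target.close()
jump.close()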

Executing command on remote machine using Paramiko has no effect

I am using a robot that is connected to a remote PC on localhost. I have used the Paramiko library to send commands from the remote PC to my robot. I want to publish a number on the /activate topic like this:
rostopic pub /activate std_msgs/Int16 "data: 1"
The problem is not my connection because the above command works using PuTTY or SSH.
I have written the following code:
import paramiko
host = "xxx.xxx.x.x"
port = 22
username = "robot"
password = "xxx"
command = "rostopic pub /activate std_msgs/Int16 'data: 1'"
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(host, port, username, password)
stdin, stdout, stderr = ssh.exec_command(command)
But using this code, I see no effect on my robot. Can anyone help?
If you have problems executing a command, you need to read its (error) output to see any error messages.
Also, your current code never even lets the command complete. Reading the command's output is actually the easiest way in Paramiko to get the command to complete.
See Wait to finish command executed with Python Paramiko
For another common problem, see Some Unix commands fail with "<command> not found", when executed using Python Paramiko exec_command
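As a rough illustration (reusing the ssh client and command variables from the question), the snippet below reads both output streams and waits for the exit status. Note that if the remote command never exits on its own (rostopic pub keeps publishing by default, so a flag such as -1/--once may be needed to make it publish once and exit), the reads will block indefinitely:

stdin, stdout, stderr = ssh.exec_command(command)

# Reading the output drains the remote buffers and lets the command run to completion
output = stdout.read().decode()
errors = stderr.read().decode()

# Blocks until the remote command has finished, then returns its exit code
status = stdout.channel.recv_exit_status()

print("exit status:", status)
print("stdout:", output)
print("stderr:", errors)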

Script executed on remote server using Python Paramiko cannot read/access local files

I'm trying to run a local Python script from my laptop (where it works fine) on a remote server (VPS).
When run on the VPS, the script can't read the local files from my laptop.
My script in PyCharm:
import sys
import time
import paramiko
# Connect to remote host
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('myip', port=22, username='root', password='mypassword')
# Setup sftp connection and transmit this script
sftp = client.open_sftp()
sftp.put(r'/myscript.py', '/myscript.py')
sftp.close()
# up to this point everything's good: I check my VPS files and find my script uploaded there
# Run the transmitted script remotely without args and show its output.
# SSHClient.exec_command() returns the tuple (stdin,stdout,stderr)
stdout = client.exec_command('python3 /myscript.py')[1]
for line in stdout:
    # Process each line in the remote output
    print(line)
client.close()
sys.exit(0)
When I run the script, it fails on the VPS because it cannot read the local files.
I can't run the script directly on the VPS to check the issue, because it uses local files.
When I remove the local paths and run the script (both from PyCharm and on the VPS), it works fine.
You cannot magically access local files from a script run on a server.
"I can't run the script directly on the VPS to check the issue because I use local files."
There's no difference between running the script in a remote shell using your favourite SSH terminal client (I assume that's what you mean by "run the script directly on the VPS") and running the script in a remote shell using Paramiko. It still runs in the remote shell.
There's no easy way to make the client's files accessible from the server. That would be a security nightmare.
Either your script has to upload the files it needs to the server.
Or you need to run an (SFTP/FTP/whatever) server on your local machine to make your local files accessible to the outside world.
For an example of how to run an SFTP server, see my guide:
Installing SFTP/SSH server on Windows using OpenSSH
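For the first option, a minimal sketch (reusing the client from the question; the data file paths are placeholders, not anything from the question) would be to upload every local file the script needs alongside the script itself, and have the remote copy read those server-side paths:

sftp = client.open_sftp()
sftp.put('/myscript.py', '/myscript.py')
# Upload each local file the script needs, so the remote copy reads server-side paths
sftp.put('/path/to/local/data.csv', '/data.csv')  # placeholder paths
sftp.close()

# Run the script remotely and print whatever it writes to stdout
stdout = client.exec_command('python3 /myscript.py')[1]
print(stdout.read().decode())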

Using Paramiko when a Unix server is using VShell

Use case
On a Unix server, logging in manually opens a command shell of its own in which commands are run.
I am trying to automate this using Paramiko; however, somehow I am not able to execute the command in that shell using Paramiko.
What I have done:
I created a simple script which is able to make the connection, but it does not execute the command in VShell, as the output always comes back empty.
import paramiko
import sys

ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(hostname=sys.argv[1], port=int(sys.argv[2]),
                   username=sys.argv[3], password=sys.argv[4])
command = "show hwid"
stdin, stdout, stderr = ssh_client.exec_command(command)
out = stdout.read()
print(out)
err = stderr.read()
print(err)
ssh_client.close()
The same script runs perfectly fine when it is used on a server where VShell is not being used.
Any help or suggestions on this?
stdin, stdout, stderr = ssh_client.exec_command(command)
Regarding this line of code, I suspect that the SSH server is not properly configured to allow commands to be executed in this way (this is the equivalent of ssh myserver show hwid, rather than typing it into the terminal after login).
You might want to imitate the behaviour of typing the command in after logging into the server, and for that I think this is appropriate:
shell = ssh_client.invoke_shell()
shell.send(command + '\n')
output = shell.recv(65535)  # recv blocks until some output is available
print(output)
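Note that recv() only returns whatever output happens to be buffered at that moment, so in practice you usually read in a loop (for example, until the shell prompt reappears or recv_ready() stays false) to collect the command's full output.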

Log in to remote Linux shell in Python

I want to write a script in Python which can execute shell commands on a remote server.
I found out that I could use something like:
from subprocess import Popen, PIPE

# set environment, start new shell
p = Popen("/path/to/env.sh", stdin=PIPE)
# pass commands to the opened shell
p.communicate(b"python something.py\nexit")
But I do not understand how I can log in to a remote Linux server and execute shell commands there.
Look into using Paramiko, Pyro4, or Fabric. All of these should do what you would like.
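For instance, with Fabric (its 2.x API) a minimal sketch looks roughly like this; the host, user, and command are placeholders:

from fabric import Connection

# Open an SSH connection to the remote Linux server (placeholder host and user)
conn = Connection(host='server.example.com', user='deploy')

# Run a shell command remotely; result.stdout holds the captured output
result = conn.run('uname -a', hide=True)
print(result.stdout.strip())

conn.close()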
