Send ctrl-C to OSX Terminal in an SSH Session in Python

This solution requires me to run Python on the same machine as the process I am trying to terminate.
However, I'm running Python locally and have the process running over SSH in Terminal. How do I send the terminate command in this situation?

SSH using pexpect after setting up ssh-keygen with the server so that it doesn't require a password from your machine:
import pexpect
ssh_command = 'ssh user@your.server.com'
child = pexpect.spawn(ssh_command)
default_prompt = 'user@server:~/# '
child.expect(default_prompt)
kill_command = 'killall process_name'
child.sendline(kill_command)
If you don't use ssh-keygen, you will need to work your password login into the pexpect script before the default_prompt line.
Just attach this script to a hotkey (e.g. ctrl+alt+c) using Alfred.
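If you'd rather avoid pexpect entirely, the same kill can be done non-interactively with a one-shot ssh invocation (this assumes the same ssh-keygen setup as above; the host and process names are the placeholders from the example):

```python
import subprocess

# Build the one-shot SSH command; key-based auth means no password prompt.
kill_cmd = ["ssh", "user@your.server.com", "killall", "process_name"]
# subprocess.run(kill_cmd, check=True)  # uncomment to actually run it over SSH
print(" ".join(kill_cmd))
```

Because ssh runs the trailing arguments as a remote command and exits, there is no prompt to expect and nothing to script around.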

Related

Safely and Asynchronously Interrupt an Infinite-Loop Python Script started by a BASH script via SSH

My Setup:
I have a Python script that I'd like to run on a remote host. I'm running a BASH script on my local machine that SSH's into my remote server, runs yet another BASH script, which then kicks off the Python script:
Local BASH script --> SSH --> Remote BASH script --> Remote Python script
The Python script configures a device (a DAQ) connected to the remote server and starts a while(True) loop of sampling and signal generation. When developing this script locally, I had relied on using Ctrl+C and a KeyboardInterrupt exception to interrupt the infinite loop and (most importantly) safely close the device sessions.
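That local pattern looks roughly like this (the DAQ session is faked with a dict here, and the Ctrl+C is simulated by raising KeyboardInterrupt after a few iterations, just to show the shape of the cleanup):

```python
def sample_loop(max_iters=3):
    daq = {"open": True}  # stand-in for a real DAQ device session
    try:
        i = 0
        while True:
            # do stuff... (sampling / signal generation)
            i += 1
            if i >= max_iters:
                raise KeyboardInterrupt  # simulates pressing Ctrl+C
    except KeyboardInterrupt:
        pass
    finally:
        daq["open"] = False  # safely close the device session no matter what
    return daq

print(sample_loop())  # -> {'open': False}
```

The finally block is what guarantees the device sessions close even when the loop is interrupted.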
After exiting the Python script, I have my BASH script do a few additional chores while still SSH'd into the remote server.
Examples of my various scripts...
local-script.sh:
ssh user@remotehost "remote-script.sh"
remote-script.sh:
python3 infinite-loop.py
infinite-loop.py:
while True:
    # do stuff...
My Issue(s):
Now that I've migrated this script to my remote server and am running it via SSH, I can no longer use the KeyboardInterrupt to safely exit my Python script. In fact, when I do, I notice that the device being controlled by the Python script is still running (the output signals from my DAQ keep changing as though the script were still active). When I manually SSH back into the remote server, I can find the persisting Python script process and must kill it from there (otherwise I get two instances of the script running on top of one another if I run it again). This leads me to believe that I'm actually exiting only the remote-side SSH session that was kicked off by my local script, leaving my remote BASH and Python scripts off wandering on their own... (updated, following the investigation outlined in the Edit 1 section)
In summary, using Ctrl+C while in the remote Python script results in:
Remote Python Script = Still Running
Remote BASH Script = Still Running
Remote SSH Session = Closed
Local BASH Script = Active ([Ctrl]+[C] lands me here)
My Ask:
How can I asynchronously interrupt (but not fully exit) a Python script that was kicked off over an SSH session via a BASH script? Bonus points if we can work within my BASH --> SSH --> BASH --> Python framework... whack as it may be. If we can do it with as few extra pip modules as possible, you just might become my favorite person!
Edit 1:
Per @dan's recommendation, I started exploring trap statements in BASH scripts. I have yet to implement this successfully, but as a way to test its effectiveness, I decided to monitor the process list at different stages of execution. Once everything has started, I can see the processes for my SSH session, my remote BASH script, and the remote Python script it spawns. But when I use Ctrl+C to exit, I'm kicked back into the top-level "Local" BASH script, and when I check the process list on my remote server, I see both the remote BASH script process and the remote Python script process still running... so my remote BASH script is not stopping; I'm, in fact, ONLY ending my SSH session...
In combining the suggestions from the comments (and lots of help from a buddy), I've got something that works for me:
Solution 1:
In summary, I made my remote BASH script record its Group Process ID (GPID; that which is also assigned to the Python script that is spawned by the remote BASH script) to a file, and then had the local BASH script read that file to then kill the group process remotely.
Now my scripts look like:
local-script.sh
ssh user@remotehost "remote-script.sh"
remotegpid=`ssh user@remotehost "cat ~/Desktop/gpid_file"`
ssh user@remotehost "kill -SIGTERM -- -$remotegpid && rm ~/Desktop/gpid_file"
# ^ After the SSH closes, this goes back in to grab the GPID from the file and then kills it
remote-script.sh
ps -o pgid= $$ | xargs > ~/Desktop/gpid_file
# ^ This gets the BASH script's GPID and writes it to a file without whitespace
python3 infinite-loop.py
infinite-loop.py (unchanged)
while True:
    # do stuff...
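For reference, the value the `ps -o pgid= $$` line captures is the process group id, which Python can also report for itself; the negated form of that number is what `kill -- -PGID` targets. A quick local sketch:

```python
import os

# 0 means "the calling process", analogous to $$ in bash
pgid = os.getpgid(0)
print(pgid)
# `kill -SIGTERM -- -PGID` signals every process in that group at once,
# which is why the remote bash script and its python child die together.
```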
This solves only most of the problem. Originally I had set out to be able to do things in my Python script after it was interrupted and before exiting into my BASH scripts, but it turned out I had a bigger problem to catch (the scripts continuing to run even after closing my SSH session)...
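The "do things after being interrupted" part is still reachable, though: since Solution 1 delivers SIGTERM to the process group, the Python script can catch that signal and run its cleanup before exiting. A minimal sketch (the handler just flips a flag that the loop checks, and the kill is self-inflicted here to keep the example runnable):

```python
import os
import signal

interrupted = False

def handle_sigterm(signum, frame):
    global interrupted
    interrupted = True  # let the loop exit on its own terms

signal.signal(signal.SIGTERM, handle_sigterm)

os.kill(os.getpid(), signal.SIGTERM)  # simulates the remote `kill -SIGTERM -- -GPID`
while not interrupted:
    pass  # do stuff... (sampling / signal generation)
# device-closing chores go here, after the loop
print("cleaned up:", interrupted)
```

With this in infinite-loop.py, the `kill -SIGTERM` from local-script.sh gives the Python side a chance to close its device sessions instead of being torn down mid-loop.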

Python: Unable to connect SSH with paramiko

This is my first time using paramiko. I'm trying to establish an SSH session to a test Amazon Linux 2 instance where I've enabled password authentication (it doesn't come enabled by default) and restarted the SSH daemon on the box. I also made sure that I could connect with SSH via the normal SSH program using the username / password I put in the Python program.
When I run the Python code below, everything looks good and it waits for input and keeps the program running, but when I'm logged into the Amazon instance, I don't see the paramiko user logged in (I did a "w" and a "who" command). In fact, I have no evidence server-side that Paramiko ever connects successfully to begin with.
#!/usr/bin/env python3
import pprint
import boto3
import os
import paramiko
os.system('clear')
pp = pprint.PrettyPrinter(indent=4)
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('X.X.X.X',username='the_username',password='the_password',port=22)
get_input = input("Preventing program from closing and keeping SSH connection alive...")
who shows interactive shell sessions only.
Your code only connects. It does not start a shell, let alone an interactive shell.
See List all connected SSH sessions?

Using paramiko when a unix server is using VShell

Use case
On a unix server, logging in manually opens a command shell of its own in which to run commands.
I am trying to automate this using paramiko; however, I am not able to execute the command on that command shell using paramiko.
What have I done?
I created a simple script which is able to make the connection, but it's not executing the command on VShell, as the output always comes back empty.
import paramiko
import sys
ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(hostname=sys.argv[1], port=int(sys.argv[2]), username=sys.argv[3], password=sys.argv[4])
command = "show hwid"
stdin, stdout, stderr = ssh_client.exec_command(command)
out = stdout.read()
print(out)
err = stderr.read()
print(err)
ssh_client.close()
The same script runs perfectly fine when it's used on a server where VShell is not being used.
Any help or suggestions on this?
stdin,stdout,stderr=ssh_client.exec_command(command)
Regarding this line of code, I suspect that the SSH server is not properly configured to allow commands to be executed in this way (this is the equivalent of ssh myserver show hwid, rather than typing it into the terminal after login).
You might want to imitate the behaviour of typing the command in after logging into the server. For that, open an interactive shell channel instead. Note that invoke_shell returns a Channel object, which you write to and read from directly (it does not offer an exec_command that returns an stdin/stdout/stderr tuple):
import time
shell = ssh_client.invoke_shell()
shell.send(command + '\n')
time.sleep(2)  # crude wait for the shell to produce output
output = shell.recv(65535).decode()
print(output)

How do I use PuTTY to execute commands on my server from a script?

With the ssh command one can pass it arguments to tell it to run commands on the remote server. I am trying to achieve the same thing, but with PuTTY.
I have PuTTY on my Windows machine, installed to C:. I am trying to invoke it from a local Python script, and have it invoke a command show system info on the server.
This is the sort of pseudo-Python that I am thinking of:
import ssh
server=ssh.Connection(host='10.201.20.240')
result=server.execute('show system info')
and more specifically using PuTTY from the Python script, something like this (which is of course not right, otherwise I wouldn't be asking this)
command = '"c:\Putty\putty.exe" -ssh user@10.201.20.240 -pw admin'
result=command.execute('show system info')
subprocess.Popen(command)
If this were the ssh command I would be using ssh … user@10.201.20.240 show system info and suchlike.
What is the command-line syntax for the Windows PuTTY program for doing this?
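PuTTY itself opens a terminal window; its companion command-line client plink.exe (shipped with the PuTTY installer) is the tool meant for this, and it mirrors ssh's syntax, accepting a trailing remote command. A sketch of driving it from Python; the path and credentials are the ones from the question, and the actual call is left commented since it needs PuTTY installed and a reachable server:

```python
import subprocess

# plink.exe accepts a trailing remote command, just like ssh does
plink_cmd = [
    r"c:\Putty\plink.exe",
    "-ssh",
    "-pw", "admin",
    "user@10.201.20.240",
    "show system info",
]
# result = subprocess.run(plink_cmd, capture_output=True, text=True)
print(plink_cmd[0])
```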

Log in to remote Linux shell in Python

I want to write a script in Python which could execute shell commands on a remote server.
I found out that I could use something like:
# set environment, start new shell
from subprocess import Popen, PIPE
p = Popen("/path/to/env.sh", stdin=PIPE, text=True)
# pass commands to the opened shell
p.communicate("python something.py\nexit")
But I do not understand how I can log in to a remote Linux server and execute shell commands there.
Look into using Paramiko or Pyro4 or fabric. All of these should do what you would like.