how to get console output from a remote computer (ssh + python) - python

I have googled "python ssh". There is a wonderful module pexpect, which can access a remote computer using ssh (with password).
After the remote computer is connected, I can execute other commands. However, I cannot get the result back in Python.
p = pexpect.spawn("ssh user@remote_computer")
print "connecting..."
p.waitnoecho()
p.sendline(my_password)
print "connected"
p.sendline("ps -ef")
p.expect(pexpect.EOF) # this takes a very long time, because EOF only arrives once the ssh session ends
print p.before
How to get the result of ps -ef in my case?

Have you tried an even simpler approach?
>>> from subprocess import Popen, PIPE
>>> stdout, stderr = Popen(['ssh', 'user@remote_computer', 'ps -ef'],
...                        stdout=PIPE).communicate()
>>> print(stdout)
Granted, this only works because I have ssh-agent running preloaded with a private key that the remote host knows about.
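If you also need the remote command's stderr and exit status, a small variation on the same idea (a sketch, still assuming key-based authentication via ssh-agent) is:
from subprocess import Popen, PIPE

proc = Popen(['ssh', 'user@remote_computer', 'ps -ef'], stdout=PIPE, stderr=PIPE)
stdout, stderr = proc.communicate()
print(stdout)
if proc.returncode != 0:   # non-zero means ssh itself or the remote command failed
    print(stderr)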

child = pexpect.spawn("ssh user@remote_computer ps -ef")
print "connecting..."
i = child.expect(['user@remote_computer\'s password:'])
child.sendline(user_password)
i = child.expect([' .*']) # or use i = child.expect([pexpect.EOF])
if i == 0:
    print child.after   # uncomment when using the [' .*'] pattern
    #print child.before # uncomment when using the EOF pattern
else:
    print "Unable to capture output"
Hope this helps.

You might also want to investigate paramiko, which is another SSH library for Python.
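For example, a minimal paramiko sketch of the same ps -ef call (password authentication assumed; the host-key policy is relaxed purely for brevity):
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # accept unknown host keys in this sketch
client.connect('remote_computer', username='user', password=my_password)
stdin, stdout, stderr = client.exec_command('ps -ef')
print stdout.read()   # the remote command's output
client.close()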

Try to send
p.sendline("ps -ef\n")
IIRC, the text you send is interpreted verbatim, so the other computer is probably waiting for you to complete the command.
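Alternatively, instead of waiting for EOF (which only arrives when the whole ssh session ends), you can wait for the shell prompt after each command. A sketch, assuming the remote prompt ends in '$ ':
p.sendline("ps -ef")
p.expect(r'\$ ')   # wait until the remote shell prints its prompt again
print p.before     # everything printed before the prompt, i.e. the ps -ef output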

Related

In Python, how do I get user's remote IP (their last hop) if they're connected over SSH?

I want to detect if the user is connected over SSH. In a terminal, the env command shows an SSH_CONNECTION line, which can be accessed in Python in one of two ways:
#python:
import os
print os.getenv("SSH_CONNECTION") #works
print os.environ.get("SSH_CONNECTION") #works
But if the user has run my program using sudo (as they will need to), env doesn't show SSH_CONNECTION, so Python can't see it:
#sudo python:
import os
print os.getenv("SSH_CONNECTION") #not set
print os.environ.get("SSH_CONNECTION") #not set
The aim is to achieve the following:
#Detect if user is over remote IP
lRemoteIP="" #Is set if user is on SSH
lStr=os.environ.get("SSH_CONNECTION") #Value looks like "client_ip client_port server_ip server_port"
if lStr: lRemoteIP=lStr.split()[0] #Store user's last-hop (client) IP
#Later on in the code, for multiple purposes:
if lRemoteIP: pass #Do stuff (or not) depending on whether they're on SSH
How do I retrieve the SSH_CONNECTION environment variable under sudo, when it's not present in env?
Or, more precisely: how can I detect whether the current session is via SSH when running under sudo?
I'm not a natural at Linuxy-type things, so be gentle with me...
[EDIT:] METHOD 2: Giving up on env, I've tried the following:
pstree -ps $$ | grep "sshd("
If it returns anything, it means that the SSH daemon sits above the session; ergo, it's an SSH connection. And the results show me the PIDs of the SSH daemons. Results of the pstree command:
init(1)---sshd(xxx)---sshd(xxx)---sshd(xxx)---bash(xxx)-+-grep(xxx)
But I'm struggling to get a src IP from the PID. Any ideas on this avenue?
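One avenue (a sketch, assuming the script runs as root via sudo and so may read other processes' /proc entries): walk up the parent-process chain and read SSH_CONNECTION from the login shell's environment, which sudo stripped from your own process but not from its ancestors.
import os

def ssh_connection_from_parents():
    #Walk up the parent PIDs and return SSH_CONNECTION from the first ancestor that has it
    pid = os.getppid()
    while pid > 1:
        try:
            environ = open("/proc/%d/environ" % pid).read()
        except IOError:
            return ""
        for entry in environ.split("\x00"):
            if entry.startswith("SSH_CONNECTION="):
                return entry.split("=", 1)[1]
        #Field 4 of /proc/<pid>/stat is the parent PID (naive split; fine while ancestor process names contain no spaces)
        pid = int(open("/proc/%d/stat" % pid).read().split()[3])
    return ""

print ssh_connection_from_parents() #Empty string when there is no SSH session above us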
[EDIT] METHOD 3: /run/utmp contains details of SSH logins. In python:
import os
import sys
lStr=open("/var/run/utmp").read().replace('\x00','') #Remove all those null values which make things hard to read
#Get the pseudo-session ID (pts) minus the /dev/ that it starts with:
lCurSess=os.ttyname(sys.stdout.fileno()).replace('/dev/','')
#Answer is like pts/10 (pseudo-term session number 10)
#Search lStr for pts/10
lInt=lStr.find(lCurSess) #lCurSess already has the /dev/ prefix stripped
#Print /var/run/utmp starting from where it first mentions the current pts:
print lStr[lInt:]
So far, so good. This gives the following results (I've changed the IP and username to USERNAME)
pts/10/10USERNAME\x9e\x958Ym\xb2\x05\x0 74\x14pts/10s/10USERNAME192.168.1.1\xbf\x958Y\xf8\xa3\r\xc0\xa88\x01
So, when it comes to extracting the IP from the file, there's some bumf in between the occurrences of pts/10 and the IP. What's the best way to parse it, given that (I reckon) the precise distance from the match to the IP will differ under different circumstances?
The OpenSSH daemon writes an entry to /var/run/utmp with the current terminal, the IP and the name of the user. Check the output of the w or who commands that parse /var/run/utmp.
It's just a question of getting the current terminal (similar to the tty command) and extracting the information you want.
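A sketch of that idea with no extra dependencies, shelling out to who and matching the current pseudo-terminal (the remote host is assumed to be the last, parenthesised column of who's output):
import os
import sys
import subprocess

lCurTTY = os.ttyname(sys.stdout.fileno()).replace('/dev/', '')  #e.g. pts/10
for line in subprocess.check_output(['who']).splitlines():
    fields = line.split()
    if len(fields) >= 5 and fields[1] == lCurTTY:
        print fields[-1].strip('()')  #remote host/IP of the current session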
Use pyutmp like this:
from pyutmp import UtmpFile
import os
import sys
for utmp in UtmpFile():
    if os.ttyname(sys.stdout.fileno()) == utmp.ut_line:
        print '%s logged from %s on tty %s' % (utmp.ut_user, utmp.ut_host, utmp.ut_line)
Then filter by using the ut_pid field to parse the /proc/<ut_pid>/cmdline file, which should contain:
sshd: ut_user [priv]
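A sketch of that extra filter (ut_pid comes from pyutmp; the exact "sshd:" prefix in cmdline is an assumption about how OpenSSH names its per-session processes):
def is_ssh_session(utmp):
    #/proc/<pid>/cmdline is NUL-separated; for an SSH login it should start with "sshd: "
    try:
        cmdline = open('/proc/%d/cmdline' % utmp.ut_pid).read().replace('\x00', ' ')
    except IOError:
        return False
    return cmdline.startswith('sshd:')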
GOT IT AT LAST!!!
The "last" command has the list of users and their IPs! So simple.
It marks sessions that are "still logged in". Filter by those,
and then filter by the current pts ID.
To get the IP for the current SSH session in Python, do this:
import os,sys,subprocess
(out, err) = subprocess.Popen(['last | grep "still logged in" | grep "' + os.ttyname(sys.stdout.fileno()).replace('/dev/','') + '"'], stdout=subprocess.PIPE, shell=True).communicate()
RemoteIP=out.split()[2].replace(":0.0","") if out else "" #Empty string if not SSH (guard against no match)
For readability, across multiple lines:
import os,sys,subprocess
pseudoTermID = os.ttyname(sys.stdout.fileno()).replace('/dev/','')
cmdStr = 'last | grep "still logged in" | grep "'+pseudoTermID+'"'
sp = subprocess.Popen([cmdStr], stdout=subprocess.PIPE, shell=True)
(out, err) = sp.communicate()
RemoteIP = out.split()[2].replace(":0.0","") if out else "" #Empty string if not SSH (guard against no match)
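A variant of the same idea that skips the shell pipeline and parses last's output directly in Python (a sketch; it also avoids passing anything through a shell):
import os
import sys
import subprocess

pseudoTermID = os.ttyname(sys.stdout.fileno()).replace('/dev/', '')
RemoteIP = ""
for line in subprocess.check_output(['last']).splitlines():
    fields = line.split()
    if 'still logged in' in line and len(fields) > 2 and fields[1] == pseudoTermID:
        RemoteIP = fields[2].replace(':0.0', '')
        break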

how to interact with Paramiko's interactive shell session?

I have some Paramiko code where I use the invoke_shell method to request an interactive ssh shell session on a remote server. Method is outlined here: invoke_shell()
Here's a summary of the pertinent code:
import time
import paramiko

sshClient = paramiko.SSHClient()
sshClient.connect('127.0.0.1', username='matt', password='password')
channel = sshClient.get_transport().open_session()
channel.get_pty()
channel.invoke_shell()
while True:
    command = raw_input('$ ')
    if command == 'exit':
        break
    channel.send(command + "\n")
    while True:
        if channel.recv_ready():
            output = channel.recv(1024)
            print output
        else:
            time.sleep(0.5)
            if not channel.recv_ready():
                break
sshClient.close()
My question is: is there a better way to interact with the shell? The above works, but it's ugly with the two prompts (the matt@kali:~$ and the $ from raw_input), as shown in the screenshot of a test run with the interactive shell. I guess I need help writing to the stdin for the shell? Sorry, I don't code much. Thanks in advance!
I imported a file, interactive.py, found on Paramiko's GitHub. After importing it, I just had to change my code to this:
try:
    import interactive
except ImportError:
    from . import interactive
...
...
channel.invoke_shell()
interactive.interactive_shell(channel)
sshClient.close()
You can try disabling echo after invoking the remote shell:
channel.invoke_shell()
channel.send("stty -echo\n")
while True:
    command = raw_input()  # no need for `$ ' anymore
    ... ...

Python Read the Device Manager Information

I just need to read all of the information that is listed in Device Manager with a Python 2.7 script, especially the information under the 'IDE ATA/ATAPI controllers' subcategory. This is needed to detect whether the SATA drives are running in AHCI or IDE mode...
My way is not perfect, but it is a good enough solution for me so far; it's here for your reference. It goes through devcon.exe, which is part of the WDK (Windows Driver Kit). My code is below.
import subprocess

try:
    # use commas, not spaces, between the arguments in the Popen list;
    # replace the "*" wildcard with whatever you want to search for
    output = subprocess.Popen(["devcon", "status", "*"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = output.communicate()
    print "output: \n", output
    print "stdout: \n", stdout  # the useful output appears here if all goes well
    print "stderr: \n", stderr  # populated if no arguments were given
except subprocess.CalledProcessError:
    print('Exception handled')
One easy way (on Windows) is to use Windows Device Manager's API. There is a Python binding here.
After installing the package, the code below will do fine:
from infi.devicemanager import DeviceManager

dm = DeviceManager()
dm.root.rescan()
devices = dm.all_devices
for device in devices:
    print(device)
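If the goal is specifically the AHCI-versus-IDE check, another option (a sketch, relying on the assumption that the controller name mentions AHCI when AHCI mode is active) is to query the Win32_IDEController WMI class with the built-in wmic tool:
import subprocess

output = subprocess.check_output(['wmic', 'path', 'Win32_IDEController', 'get', 'Name'])
print output
print "AHCI mode" if "AHCI" in output else "probably IDE/legacy mode"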

Script to capture everything on screen

So I have this Python 3 script that does a lot of automated testing for me; it takes roughly 20 minutes to run, and some user interaction is required. It also uses paramiko to ssh to a remote host for a separate test.
Eventually, I would like to hand this script over to the rest of my team, however it has one feature missing: evidence collection!
I need to capture everything that appears on the terminal to a file. I have been experimenting with the Linux command 'script', but I cannot find an automated way of starting script and then executing my test script inside it.
I have a command in /usr/bin/
script log_name;python3.5 /home/centos/scripts/test.py
When I run my command, it just stalls. Any help would be greatly appreciated!
Thanks :)
Is redirecting the output to a file what you need?
python3.5 /home/centos/scripts/test.py > output.log 2>&1
Or if you want to keep the output on the terminal AND save it into a file:
python3.5 /home/centos/scripts/test.py 2>&1 | tee output.log
I needed to do this, and ended up with a solution that combined pexpect and ttyrec.
ttyrec produces output files that can be played back with a few different player applications - I use TermTV and IPBT.
If memory serves, I had to use pexpect to launch ttyrec (as well as my test's other commands) because I was using Jenkins to schedule the execution of my test, and pexpect seemed to be the easiest way to get a working interactive shell in a Jenkins job.
In your situation you might be able to get away with using just ttyrec, and skip the pexpect step - try running ttyrec -e command as mentioned in the ttyrec docs.
Finally, on the topic of interactive shells, there's an alternative to pexpect named "empty" that I've had some success with too - see http://empty.sourceforge.net/. If you're running Ubuntu or Debian you can install empty with apt-get install empty-expect
I actually managed to do it in Python 3. It took a lot of work, but here is the Python solution:
from subprocess import Popen, PIPE
# LOG_RUN_OUTPUT (the log file path) and ssh (a paramiko.SSHClient instance) are defined elsewhere in the script

def record_log(output):
    try:
        with open(LOG_RUN_OUTPUT, 'a') as file:
            file.write(output)
    except:
        with open(LOG_RUN_OUTPUT, 'w') as file:
            file.write(output)

def execute(cmd, store=True):
    proc = Popen(cmd.encode("utf8"), shell=True, stdout=PIPE, stderr=PIPE)
    output = "\n".join(out.decode() for out in proc.communicate())
    template = '''Command:\n====================\n%s\nResult:\n====================\n%s'''
    output = template % (cmd, output)
    print(output)
    if store:
        record_log(output)
    return output

# SSH function
def ssh_connect(start_message, host_id, user_name, key, stage_commands):
    print(start_message)
    try:
        ssh.connect(hostname=host_id, username=user_name, key_filename=key, timeout=120)
    except:
        print("Failed to connect to " + host_id)
    for command in stage_commands:
        try:
            ssh_stdin, ssh_stdout, ssh_stderr = ssh.exec_command(command)
        except:
            input("Paused, because " + command + " failed to run.\n Please verify and press enter to continue.")
        else:
            template = '''Command:\n====================\n%s\nResult:\n====================\n%s'''
            output = (ssh_stderr.read() + ssh_stdout.read()).decode()  # decode bytes before formatting and logging
            output = template % (command, output)
            record_log(output)
            print(output)

Python subprocess - run multiple shell commands over SSH

I am trying to open an SSH pipe from one Linux box to another, run a few shell commands, and then close the SSH.
I don't have control over the packages on either box, so something like fabric or paramiko is out of the question.
I have had luck using the following code to run one bash command, in this case "uptime", but am not sure how to issue one command after another. I'm expecting something like:
sshProcess = subprocess.call('ssh ' + <remote client>, <subprocess stuff>)
lsProcess = subprocess.call('ls', <subprocess stuff>)
lsProcess.close()
uptimeProcess = subprocess.call('uptime', <subprocess stuff>)
uptimeProcess.close()
sshProcess.close()
What part of the subprocess module am I missing?
Thanks
import subprocess

pingtest = subprocess.call("ping -c 1 %s" % <remote client>, shell=True, stdout=open('/dev/null', 'w'), stderr=subprocess.STDOUT)
if pingtest == 0:
    print '%s: is alive' % <remote client>
    # Uptime + CPU load averages
    print 'Attempting to get uptime...'
    sshProcess = subprocess.Popen('ssh ' + <remote client>, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    sshProcess, stderr = sshProcess.communicate()
    print sshProcess
    uptime = subprocess.Popen('uptime', shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    uptimeProcess, stderr = uptime.communicate()  # was uptimeProcess.communicate(), which referenced an undefined name
    # note: this runs uptime locally; communicate() already waits, so no close() is needed
    print 'Uptime : ' + uptimeProcess.split('up ')[1].split(',')[0]
else:
    print "%s: did not respond" % <remote client>
Basically, if you call subprocess it creates a local subprocess, not a remote one, so you have to interact with the ssh process itself. Something along these lines works,
but be aware that if you construct the commands you send dynamically, they are susceptible to shell injection, and the END line has to be a unique marker.
To avoid the END-marker uniqueness problem, the easiest way is to run a separate ssh command per remote command (sketched after the code below).
from __future__ import print_function, unicode_literals
import subprocess

sshProcess = subprocess.Popen(['ssh',
                               '-tt',
                               <remote client>],
                              stdin=subprocess.PIPE,
                              stdout=subprocess.PIPE,
                              universal_newlines=True,
                              bufsize=0)
sshProcess.stdin.write("ls .\n")
sshProcess.stdin.write("echo END\n")
sshProcess.stdin.write("uptime\n")
sshProcess.stdin.write("logout\n")
sshProcess.stdin.close()

for line in sshProcess.stdout:
    if line == "END\n":
        break
    print(line, end="")

# to catch the lines up to logout
for line in sshProcess.stdout:
    print(line, end="")
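The "separate ssh command per remote command" alternative mentioned above would look roughly like this (a sketch; run_remote is a hypothetical helper, not part of the original answer):
import subprocess

def run_remote(host, command):
    # one ssh invocation per command, so output boundaries are unambiguous
    return subprocess.check_output(['ssh', host, command])

print(run_remote(<remote client>, 'ls .'))
print(run_remote(<remote client>, 'uptime'))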
