How to handle an SSH session using pexpect? - python

I am trying to automate a small task that requires several steps, some of which are identical for all the "devices":
- ssh login
- run a command
- clean up after itself
I have a script that uses pexpect, but for every function (task) I have to establish a new SSH connection, which is wasteful.
What I am trying to do is something like this: a function that creates a session, and other functions that reuse the same "child".
def ssh_login(device):
    child = pexpect.spawn("ssh root@" + device)
    child.expect("password:")
    child.sendline(password)
    child.expect("#")
Another function would use the session and run some command, like:
def run_command():
    # run some command here
    child.sendline("some_command")
    child.expect("#")
And a cleanup function:
def cleanup():
    child.sendline(cleanup_cmd)  # whatever cleanup command you need
    child.expect("#")
    child.sendline("exit")
    child.interact()
Any ideas?

Like this:
pexpect.spawn('ssh', ['-o', 'ControlMaster=auto',
                      '-o', 'ControlPath=~/.ssh/master-%r@%h:%p',
                      username + "@" + hostname])
And you will find the session socket:
ls ~/.ssh/master-yourname@hostname:22
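To make the multiplexing options easier to reuse, here is a small sketch (a hypothetical helper; ControlPersist is an optional addition, and the ControlPath value is just a template that ssh itself expands):

```python
def master_argv(user, host, control_path="~/.ssh/master-%r@%h:%p"):
    """Build ssh arguments that start (or reuse) a multiplexed master
    connection. Later ssh calls with the same ControlPath reuse the
    socket without authenticating again."""
    return ["-o", "ControlMaster=auto",
            "-o", "ControlPersist=yes",  # keep the master alive after this session
            "-o", f"ControlPath={control_path}",
            f"{user}@{host}"]

# e.g. child = pexpect.spawn("ssh", master_argv("root", "mydevice"))
```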

I've done something similar. All you have to do is return the child from the ssh_login function and pass it along as an input to your other functions.
def ssh_login(device):
    child = pexpect.spawn("ssh root@" + device)
    # Do your usual login here
    return child
When you call it, save the child in a variable.
session = ssh_login(my_device)
run_command(session)
cleanup(session)
Of course you will need to change the other functions to accept a session input:
def run_command(session):
    # run some command here
    session.sendline("some_command")
    session.expect("#")

def cleanup(session):
    session.sendline(cleanup_cmd)  # whatever cleanup command you need
    session.expect("#")
    session.sendline("exit")
    session.interact()

When I do anything with Python and SSH, I use Paramiko; it's a really solid module. Here's my "starter code" for any project that uses it. I've added argument parsing and some comments; you'll probably want to generate a list of the servers you want to run the command on and loop through it. If you need to frequently run commands on a lot of servers, though, I would recommend getting something like SaltStack or Ansible; they make it very easy to manage servers regularly.
https://saltstack.com/
https://www.ansible.com/
#!/usr/bin/env python
import paramiko

def run_ssh_cmd(remote_server, connect_user, identity, cmd=None):
    """Create an ssh connection to the remote server and retrieve
    information."""
    # kludge to make ssh work - add 'your_domain.com' to the remote_server
    remote_server += '.your_domain.com'
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(remote_server, username=connect_user, key_filename=identity)
    stdin, stdout, stderr = client.exec_command(cmd)
    print(stdout.readlines())
    client.close()
if __name__ == '__main__':
    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("-s", "--server", action="store", required=True,
                        dest="server", help="Server to query")
    parser.add_argument("-u", "--user", action="store", required=True,
                        dest="user", help="User ID for remote server connection")
    parser.add_argument("-i", "--identity", action="store", required=True,
                        dest="id_file", help="SSH key file")
    args = parser.parse_args()
    run_ssh_cmd(args.server, args.user, args.id_file, "hostname;date")

Pass open connection between two different python programs
Here is my use case:
I want to connect to an ssh server (network device / linux / windows) and keep the connection open.
Then have another python program continue to use the open session from step 1 and provide user input (commands).
The reason the programs have to be separate is that the caller is going to orchestrate them into a graphical process flow inside a workflow designer, e.g. Mistral.
Can you pass the handle itself between processes? No (I'm pretty sure, at least).
... why not just pass the credentials and let the second program open its own connection?
Otherwise, your first program would have to run some sort of server that listens for commands and forwards them along ... something like this, I guess.
One way of passing information between two applications is to have a Flask server running in one while the other calls the Flask endpoints (you don't have to use Flask ... there are many ways to do this):
import argparse
import requests
from flask import Flask, request

def prog_1():
    '''manage some open connection'''
    my_open_thing = OpenConnection(stuff)  # placeholder for your open connection

    app = Flask(__name__)

    @app.route("/execute", methods=["POST"])
    def execute_command():
        if request.form.get("CMD", None):
            my_open_thing.send(request.form['CMD'])
        return my_open_thing.recv().to_string()

    app.run(port=23123)
def prog_2():
    '''interact with the other thing'''
    while 1:
        cmd = input("CMD:")
        if cmd in ["quit", "q"]:
            break
        print(requests.post("http://localhost:23123/execute", {"CMD": cmd}).content)
if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument('TYPE', choices=['manager', 'client'],
                        help="Serve the connection, or use the manager")
    args = parser.parse_args()
    if args.TYPE == "manager":
        prog_1()
    else:
        prog_2()

Subprocess on remote server

I am using this code to execute a command on a remote server.
import subprocess
import sys

COMMAND = "ls"
ssh = subprocess.Popen(["ssh", "%s" % HOST, COMMAND],
                       shell=False,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
result = ssh.stdout.readlines()
if result == []:
    error = ssh.stderr.readlines()
    print("ERROR: %s" % error, file=sys.stderr)
else:
    print(result)
When I try to execute this script, I get prompted for a password. Is there any way I could avoid it? For example, can I enter the password in the script somehow? Also, the password should be encrypted somehow so that people who have access to the script cannot see it.
Why make it so complicated? Here's what I suggest:
1) Create an ssh config section in your ~/.ssh/config file:
Host myserver
    HostName 50.50.50.12 (fill in with your server's IP)
    Port xxxx (optional)
    User me (your username on the server)
2) If you have not yet generated your ssh keypair, do it now (with ssh-keygen). Then upload the public key with:
$ ssh-copy-id myserver
3) Now you can use subprocess with ssh. For example, to capture output, I call:
result = subprocess.check_output(['ssh', 'myserver', 'cat', 'somefile'])
Simple, robust, and the only time a password is needed is when you copy the public key to the server.
BTW, your code will probably work just fine as well using these steps.
One way is to create a key pair, put the public key on the server, and do ssh -i /path/to/private/key user@host, or use paramiko like this:
import paramiko
import getpass

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
p = getpass.getpass()
ssh.connect('hostname', username='user', password=p)
stdin, stdout, stderr = ssh.exec_command('ls')
print(stdout.readlines())
ssh.close()
You could also use pexpect or paramiko to connect to the remote machine, spawn a child process, and then run whatever command you want.
Here's what I did when encountering this issue before:
Set up your ssh keys for access to the server.
Set up an alias for the server you're accessing. Below I'll call it remote_server.
Put the following two lines at the end of ~/.bash_profile.
eval $(ssh-agent -s)
ssh-add
Now every time you start your shell, you will be prompted for a passphrase. By entering it, you will authenticate your ssh keys and put them 'in hand' at the start of your bash session. For the remainder of your session you will be able to run commands like
ssh remote_server ls
without being prompted for a passphrase. Here ls will run on the remote server and return the results to you. Likewise your python script should run without password prompt interruption if you execute it from the shell.
You'll also be able to ssh to the server just by typing ssh remote_server without having to enter your username or password every time.
The upside to doing it this way is that you should be doing this anyway to avoid password annoyances and remembering funky server names :) Also you don't have to worry about having passwords saved anywhere in your script. The only potential downside is that if you want to share the python script with others, they'll have to do this configuring as well (which they should anyway).
You don't really need something like pexpect to handle this. SSH keys already provide a very good and secure solution to this sort of issue.
The simplest way to get the results you want would probably be to generate an ssh key and place it in the .ssh folder of your device. I believe github has a pretty good guide to doing that, if you look into it. Once you set up the keys correctly on both systems, you won't actually have to add a single line to your code. When you don't specify a password it will automatically use the key to authenticate you.
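A minimal sketch of what that looks like in code, once the keys are in place (the helper names and the BatchMode option are my additions; BatchMode makes a broken key error out instead of prompting):

```python
import subprocess

def ssh_argv(host, command, user=None):
    """Build the argv for a non-interactive, key-authenticated ssh call."""
    target = f"{user}@{host}" if user else host
    # BatchMode=yes makes ssh fail fast instead of prompting for a
    # password, so missing keys surface as an error rather than a hang.
    return ["ssh", "-o", "BatchMode=yes", target, command]

def remote_output(host, command, user=None):
    """Run `command` on `host` over ssh and return its stdout."""
    return subprocess.check_output(ssh_argv(host, command, user), text=True)
```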
While subprocess.Popen might work for wrapping ssh access, this is not the preferred way to do so.
I recommend using paramiko.
import paramiko

ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(server, username=user, password=password)
...
ssh_client.close()
And if you want to simulate a terminal, as if a user were typing:
import time

chan = ssh_client.invoke_shell()

def exec_cmd(cmd):
    """Take an ssh command, execute it, and return the output"""
    prompt = 'bash $'  # the command-line prompt in the ssh terminal
    buff = ''
    chan.send(str(cmd) + '\n')
    while not chan.recv_ready():
        time.sleep(1)
    while not buff.endswith(prompt):
        buff += chan.recv(1024).decode()
    return buff[:-len(prompt)]
Example usage: exec_cmd('pwd')
If you don't know the prompt in advance, you can set it with:
chan.send('PS1="python-ssh:"\n')
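The receive loop can be factored into a reusable helper; in this sketch a fake recv callable stands in for the paramiko channel so it runs without a connection, and the slice strips the trailing prompt from the output:

```python
def read_until_prompt(recv, prompt):
    """Drain a channel-like `recv(nbytes)` callable until the buffered
    output ends with `prompt`, then return it with the prompt stripped."""
    buff = ""
    while not buff.endswith(prompt):
        buff += recv(1024)
    return buff[:-len(prompt)]

# Fake channel standing in for chan.recv, for illustration:
chunks = iter(["hello\n", "bash $"])
output = read_until_prompt(lambda n: next(chunks), "bash $")
```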
You could use the following.
import subprocess
import sys

COMMAND = "ls"
ssh = subprocess.Popen('powershell putty.exe user@HOST -pw "password"',
                       stdout=subprocess.PIPE,
                       stdin=subprocess.PIPE,
                       stderr=subprocess.PIPE)
result = ssh.stdout.readlines()
if result == []:
    error = ssh.stderr.readlines()
    print("ERROR: %s" % error, file=sys.stderr)
else:
    print(result)

How can I start another process required by my Selenium tests

I'm using django-selenium to add Selenium testing functionality to existing unittests.
My Selenium tests rely on a web server running on my machine, which is started by running our django app like so: main.py -a
So the first thing I want to do in my Selenium test is start this server, which I set up like so:
def start_server():
    path = os.path.join(os.getcwd(), 'main.py -a')
    server_running = is_server_running()
    if server_running is False:
        server = subprocess.Popen('cmd.exe', stdin=subprocess.PIPE, stdout=subprocess.PIPE)
        stdout, stderr = server.communicate(input='%s\n' % path)
        print('Server error:\n{0}\n'.format(stderr))
        server_running = is_server_running()
    return server_running
However, when I do this the web server takes over the execution of the django test process in the command line. I assume the way I should be doing this is to launch the command prompt in a separate process and then trigger the main.py -a command in that process.
Is this the right idea, and if so, how can I modify that function to spawn a new process and launch my command? I was trying to run 'cmd.exe' using Process(target=path) but I couldn't get it to work. Thanks :)
The way I have gone with this is a much simpler launch method:
startServer.py
def run():
    path = os.path.join(os.getcwd(), 'main.py')
    server_running = is_server_running()
    if server_running is False:
        subprocess.Popen(['python', path, '-a'])

if __name__ == '__main__':
    run()
Which I can then start and stop in my tests' setUp and tearDown like so:
def setUp(self):
    self.server = Process(target=startServer.run)
    self.server.start()

def test(self):
    # run test process
    ...

def tearDown(self):
    utils.closeBrowser(self.ff)
There may well be a better way of doing things & something here may not be 'as it should be' but it works (with a socket forcibly closed error) :)
My only outstanding issue is test starting before the database tables have been created :(
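The is_server_running() helper isn't shown in either snippet; one hedged way to implement it is a plain TCP probe against the port the server listens on (the host and port defaults here are assumptions):

```python
import socket

def is_server_running(host="127.0.0.1", port=8000, timeout=1.0):
    """Best-effort check: can we open a TCP connection to the server?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```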

Pexpect - silence ssh connection output

I'm using a simple pexpect script to ssh to a remote machine and grab a value returned by a command.
Is there any way, pexpect or sshwise I can use to ignore the unix greeting?
That is, from
child = pexpect.spawn('/usr/bin/ssh %s@%s' % (rem_user, host))
child.expect('[pP]assword: ', timeout=5)
child.sendline(spass)
child.expect([pexpect.TIMEOUT, prompt])
child.before = '0'
child.sendline('%s' % cmd2exec)
child.expect([pexpect.EOF, prompt])
# Collected data processing
result = child.before
# logging on to the machine returns a lot of garbage; the output of the
# executed command is at the 57th position
result = result.split('\r\n')[57]
print(result)
How can I simply get the returned value, ignoring the "Last successful login" and "(c) Copyright" stuff, and without having to worry about the value's position?
Thanks!
If you have access to the server to which you are logging in, you can try creating a file named .hushlogin in the home directory. The presence of this file silences the standard MOTD greeting and similar stuff.
Alternatively, try ssh -T, which will disable terminal allocation entirely; you won't get a shell prompt, but you may still issue commands and read the response.
There is also a similar thread on ServerFault which may be of some use to you.
If the command isn't interactive, you can just run ssh HOST COMMAND to run the command without all the login excitement happening at all. If the command is interactive, you can frequently use the ssh -t option (ssh -t HOST COMMAND) to force pseudo-tty allocation and trick the remote process to think that it's running attached to a TTY.
I have used paramiko to automate ssh connection and I have found it useful. It can deal with greetings and silent execution.
http://www.lag.net/paramiko/
Hey there, you can kill all that noise by using the sys module and a small class:
import logging
import sys

class StreamToLogger(object):
    """
    Fake file-like stream object that redirects writes to a logger instance.
    """
    def __init__(self, logger, log_level=logging.INFO):
        self.logger = logger
        self.log_level = log_level
        self.linebuf = ''

    def write(self, buf):
        for line in buf.rstrip().splitlines():
            self.logger.log(self.log_level, line.rstrip())

stdout_logger = logging.getLogger('STDOUT')
sl = StreamToLogger(stdout_logger, logging.INFO)
sys.stdout = sl

stderr_logger = logging.getLogger('STDERR')
sl = StreamToLogger(stderr_logger, logging.ERROR)
sys.stderr = sl
Can't remember where I found that snippet, but it works for me :)

Use subprocess to send a password

I'm attempting to use the python subprocess module to log in to a secure ftp site and then grab a file. However, I keep getting hung up just trying to send the password when it is requested. So far I have the following code:
from subprocess import Popen, PIPE

proc = Popen(['sftp', 'user@server', 'stop'], stdin=PIPE)
proc.communicate('password')
This still stops at the password prompt. If I enter the password manually it then goes to the ftp site and then enters the password on the command line. I've seen people suggest using pexpect, but long story short I need a standard-library solution. Is there any way with subprocess and/or any other stdlib module? What am I forgetting above?
Try
proc.stdin.write('yourPassword\n')
proc.stdin.flush()
That should work.
What you describe sounds like stdin=None where the child process inherits the stdin of the parent (your Python program).
Perhaps you should use an expect-like library instead?
For instance Pexpect (example). There are other, similar python libraries as well.
from subprocess import Popen, PIPE

proc = Popen(['sftp', 'user@server', 'stop'], stdin=PIPE)
proc.communicate(input='password')
Try with input='password' in communicate; that worked for me.
Use Paramiko for SFTP. For anything else, this works:
import subprocess

args = ['command-that-requires-password', '-user', 'me']
proc = subprocess.Popen(args,
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
proc.stdin.write('mypassword\n')
proc.stdin.flush()
stdout, stderr = proc.communicate()
print(stdout)
print(stderr)
For some reason, I couldn't get any of the standard library answers here to work for me - getting very strange problems with all of them. Someone else here: unable to provide password to a process with subprocess [python] had the same problem, and concluded that ultimately you just have to go with pexpect to be able to send a password.
I wanted to add my final code here to just save the time of anyone having a similar problem, since I wasted so much time on this (Python 3, 2020):
ssh_password = getpass("user's password: ")
ssh_password = (ssh_password + "\n").encode()
scp_command = 'scp xx.xx.xx.xx:/path/to/file.log /local/save/path/'
child = pexpect.spawn(scp_command)
# make output visible for debugging / progress watching
child.logfile = sys.stdout.buffer
i = child.expect([pexpect.TIMEOUT, "password:"])
if i == 0:
    print("Got unexpected output: {} {}".format(child.before, child.after))
    return
else:
    child.sendline(ssh_password)
child.read()
The above code runs an SCP command to pull a file from the remote server onto your local computer - alter the server IP and paths as necessary.
Key things to remember:
Have to have a pexpect.TIMEOUT in the child.expect call
Have to encode to bytes whatever strings you pass in, and have to use the default encoding
Write pexpect output to sys.stdout.buffer so that you can actually see what is going on
Have to have a child.read() at the end
I would recommend scrapping the subprocess approach and using the paramiko package for sftp access.
This same problem plagued me for a week. I had to submit a password from user input through subprocess securely because I was trying to avoid introducing a command injection vulnerability. Here is how I solved the problem with a little help from a colleague.
import subprocess

command = ['command', 'option1', '--password']
subprocess.Popen(command, stdin=subprocess.PIPE).wait(timeout=60)
The .wait(timeout=int) was the most important component because it allows the user to feed input to stdin. Otherwise, the timeout is defaulted to 0 and leaves the user no time to enter input, which consequently results in a None or null string. Took me FOREVER to figure this out.
For repeat use-cases where you know you'll have to do this multiple times, you can override the popen function and use it as a private method which I was told by the same programmer is best practice if you anticipate someone else will be interested in maintaining the code later on and you don't want them to mess with it.
def _popen(cmd):
    proc_h = subprocess.Popen(cmd, stdin=subprocess.PIPE)
    proc_h.wait(timeout=60)
    return proc_h.poll() == os.EX_OK
It is important to remove stdout=subprocess.PIPE if the user is going to be prompted for input. Otherwise, the process appears to hang for 60 seconds, and the user doesn't get a prompt, nor do they realize they are expected to give a password. The stdout will naturally go to the shell window and allow the user to pass input to popen().
Also, to explain why you return proc_h.poll() == os.EX_OK: poll() returns 0 if the command succeeded. This is just C-style best practice for when you want to return system error codes in the event the function fails, while accounting for the fact that return 0 would be treated as "false" by the interpreter.
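As a self-contained illustration of that return-code convention, here is the same helper exercised against harmless child processes spawned from the current Python interpreter (no password prompt involved; note os.EX_OK is Unix-only):

```python
import os
import subprocess
import sys

def _popen(cmd, timeout=60):
    proc_h = subprocess.Popen(cmd, stdin=subprocess.PIPE)
    proc_h.wait(timeout=timeout)
    return proc_h.poll() == os.EX_OK

# A child that exits 0 reports success; a child that exits 1 does not.
ok = _popen([sys.executable, "-c", "raise SystemExit(0)"])
bad = _popen([sys.executable, "-c", "raise SystemExit(1)"])
```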
This is a pure Python solution using expect - not pexpect.
If on Ubuntu you first need to install expect with:
sudo apt install expect
Python 3.6 or later:
import subprocess

def sftp_rename(from_name, to_name):
    # e.g. from_name = 'oldfilename.txt', to_name = 'newfilename.txt'
    sftp_password = 'abigsecret'
    sftp_username = 'foo'
    destination_hostname = 'some_hostname'
    commands = f"""
    spawn sftp -o "StrictHostKeyChecking no" {sftp_username}@{destination_hostname}
    expect "password:"
    send "{sftp_password}\r"
    expect "sftp>"
    send "rename {from_name} {to_name}\r"
    expect "sftp>"
    send "bye\r"
    """
    sp = subprocess.Popen(['expect', '-c', commands],
                          stdin=subprocess.PIPE, stdout=subprocess.PIPE)
Since what you want is just to grab a file, I tried using subprocess, but it did not work for me. So now I am using paramiko. Here is one tutorial I found online: "Transfer a file from local server to remote server and vice versa using paramiko of python" (https://www.youtube.com/watch?v=dtvV2xKaVjw).
Below is my code for transferring all the files in one folder from Linux to Windows:
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname='11.11.11.1111', username='root', password='********', port=22)
sftp_client = ssh.open_sftp()

source_folder = '/var/ftp/file_pass'
local_folder = 'C:/temp/file_pass'

inbound_files = sftp_client.listdir(source_folder)
print(inbound_files)

for ele in inbound_files:
    try:
        path_from = source_folder + '/' + ele
        path_to = local_folder + '/' + ele
        sftp_client.get(path_from, path_to)
    except Exception:
        print(ele)

sftp_client.close()
ssh.close()
Python has a built-in library called ftplib that can be used for FTP processes without any hassle (assuming the remote server has an FTP service running).
from ftplib import FTP

ftp = FTP('ftp.us.debian.org')  # connect to host, default port
ftp.login()                     # user anonymous, passwd anonymous@
# '230 Login successful.'
ftp.cwd('debian')               # change into "debian" directory
# '250 Directory successfully changed.'
ftp.retrlines('LIST')
Otherwise, you can use the scp command, which is a command-line tool. The password problem can be avoided by creating a passwordless login for the remote host.
import os
os.system('scp remoteuser@remotehost:/remote/location/remotefile.txt /client/location/')
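As a side note, interpolating paths into an os.system() string is fragile if they ever contain spaces or shell metacharacters; a hypothetical helper with a list argv and subprocess.run avoids the quoting problem (the paths are the same examples as above):

```python
import subprocess

def scp_argv(remote, local):
    """Build the argv for copying `remote` (user@host:path) to `local`."""
    # A list argv is passed to scp verbatim, so no shell quoting is needed.
    return ["scp", remote, local]

# subprocess.run(scp_argv("remoteuser@remotehost:/remote/location/remotefile.txt",
#                         "/client/location/"), check=True)
```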
To create a passwordless login on Linux systems, follow the steps below (see also this SO answer):
> ssh-keygen -t rsa                           # press ENTER for every field (one time)
> ssh-keyscan remotehost >> ~/.ssh/known_hosts
> ssh-copy-id remoteuser@remotehost
The safest way to do this is to prompt for the password beforehand and then pipe it into the command. Prompting for the password will avoid having the password saved anywhere in your code. Here's an example:
from getpass import getpass
from subprocess import Popen, PIPE

password = getpass("Please enter your password: ")
proc = Popen("sftp user@server stop".split(), stdin=PIPE)
# Popen only accepts byte arrays, so you must encode the string
proc.communicate(password.encode())
import subprocess

args = ['command', 'arg1', 'arg2']
proc = subprocess.Popen(args, stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
proc.stdin.write(b'password')  # the b prefix is necessary because stdin needs a bytes type
proc.stdin.flush()
stdout, stderr = proc.communicate()
print(stdout)
print(stderr)
You just forgot the line return (i.e. the user pressing Enter) in your password:
from subprocess import Popen, PIPE

proc = Popen(['sftp', 'user@server', 'stop'], stdin=PIPE)
proc.communicate('password\n'.encode())
Also .encode(), because by default proc.communicate() accepts bytes-like objects.
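To see the newline-plus-bytes pattern work end to end without a real sftp server, here is the same communicate() call against a stand-in child process that reads one line from its stdin (a real sftp usually prompts on the terminal rather than stdin, which is why keys or pexpect remain the robust route):

```python
import subprocess
import sys

# Stand-in child: reads one line from stdin the way a cooperative
# password prompt would, then echoes it back.
child = subprocess.Popen(
    [sys.executable, "-c", "print('got: ' + input())"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE)
out, _ = child.communicate('password\n'.encode())
```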
