Creating and logging into a Linux virtual machine in automation with Python

I currently have a working Python script that SSHs into a remote Linux machine and executes commands on it. I'm using paramiko to handle SSH connectivity. Here is the code in action, executing a hostname --short command:
import sys
import traceback
import paramiko

blade = '192.168.1.15'
username = 'root'
password = ''

# now, connect
try:
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.WarningPolicy())
    print '*** Connecting...'
    client.connect(blade, 22, username, password)

    # print hostname for verification
    stdin, stdout, stderr = client.exec_command('hostname --short')
    print stdout.readlines()
except Exception, e:
    print '*** Caught exception: %s: %s' % (e.__class__, e)
    traceback.print_exc()
    try:
        client.close()
    except:
        pass
    sys.exit(1)
This works fine, but what I'm actually trying to do is more complicated. What I would actually like to do is SSH into that same Linux machine, as I did above, but then create a temporary virtual machine on it, and execute a command on that virtual machine. Here is my (nonworking) attempt:
blade = '192.168.1.15'
username = 'root'
password = ''

# now, connect
try:
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.WarningPolicy())
    print '*** Connecting...'
    client.connect(blade, 22, username, password)

    # create VM, log in, and print hostname for verification
    stdin, stdout, stderr = client.exec_command('sudo kvm -m 1024 -drive file=/var/lib/libvirt/images/oa4-vm$
    time.sleep(60) # delay to allow VM to initialize
    stdin.write(username + '\n') # log into VM
    stdin.write(password + '\n') # log into VM
    stdin, stdout, stderr = client.exec_command('hostname --short')
    print stdout.readlines()
except Exception, e:
    print '*** Caught exception: %s: %s' % (e.__class__, e)
    traceback.print_exc()
    try:
        client.close()
    except:
        pass
    sys.exit(1)
When I run this, I get the following:
joe@computer:~$ python automata.py
*** Connecting...
/home/joe/.local/lib/python2.7/site-packages/paramiko/client.py:95: UserWarning: Unknown ssh-rsa host key for 192.168.1.15: 25f6a84613a635f6bcb5cceae2c2b435
  (key.get_name(), hostname, hexlify(key.get_fingerprint())))
*** Caught exception: <class 'socket.error'>: Socket is closed
Traceback (most recent call last):
  File "automata.py", line 32, in function1
    stdin.write(username + '\n') # log into VM
  File "/home/joe/.local/lib/python2.7/site-packages/paramiko/file.py", line 314, in write
    self._write_all(data)
  File "/home/joe/.local/lib/python2.7/site-packages/paramiko/file.py", line 439, in _write_all
    count = self._write(data)
  File "/home/joe/.local/lib/python2.7/site-packages/paramiko/channel.py", line 1263, in _write
    self.channel.sendall(data)
  File "/home/joe/.local/lib/python2.7/site-packages/paramiko/channel.py", line 796, in sendall
    raise socket.error('Socket is closed')
error: Socket is closed
I'm not sure how to interpret this error -- "Socket is closed" makes me think the SSH connection is terminating once I try to create the VM. Does anyone have any pointers?
Update
I'm attempting to use the pexpect wrapper and having trouble getting it to interact with the username/password prompt. I'm testing the process by SSHing into a remote machine and running a test.py script which prompts me for a username, then saves the username in a text file. Here is my fabfile:
env.hosts = ['hostname']
env.user = 'username'
env.password = 'password'

def vm_create():
    run("python test.py")
And the contents of test.py on the remote machine are:
#! /usr/bin/env python
uname = raw_input("Enter Username: ")
f = open('output.txt', 'w')
f.write(uname + "\n")
f.close()
So, I can execute "fab vm_create" on the local machine and it successfully establishes the SSH connection and prompts me for the username, as defined by test.py. However, if I execute a third python file on my local machine with the pexpect wrapper, like this:
import pexpect

child = pexpect.spawn('fab vm_create')
child.expect('Enter Username: ')
child.sendline('password')
Nothing seems to happen. I get no errors, and no output.txt is created on the remote machine. Am I using pexpect incorrectly?

As much as I love paramiko, this may be better suited to using Fabric.
Here's a sample fabfile.py:
from fabric.api import run
from fabric.api import sudo
from fabric.api import env

env.user = 'root'
env.password = ''
env.hosts = ['192.168.1.15']

def vm_up():
    sudo("kvm -m 1024 -drive file=/var/lib/libvirt/images/oa4-vm$...")
    run("hostname --short")
To then run this, use
$ fab vm_up
If you don't want to set the host and password in the fabfile itself (rightly so), you can pass them on the command line:
$ fab -H 192.168.1.15 -p PASSWORD vm_up
However, your kvm line is still expecting input. To send input (and wait for the expected prompts), write another script that uses pexpect to call fab:
import pexpect

child = pexpect.spawn('fab vm_up')
child.expect('username:')  # put this in the format you're expecting
child.sendline('root')
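One caveat, hedged: if the calling script exits right after sendline, the spawned fab process can be killed before it finishes, which would match the "nothing seems to happen" symptom in the update above. A minimal sketch that waits for fab to exit (the prompt text is an assumption; match whatever fab actually prints):

import pexpect

child = pexpect.spawn('fab vm_up')
child.expect('username:')  # assumed prompt; adjust to the real one
child.sendline('root')
child.expect(pexpect.EOF, timeout=None)  # block until fab exits
print child.before  # everything fab printed after the prompt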

Use Fabric: http://docs.fabfile.org/en/1.8/
"Fabric is a Python (2.5 or higher) library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks."

from fabric.api import run

def host_name():
    run('hostname -s')
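To run it against your host (a usage sketch; -H is Fabric 1.x's host list flag):

$ fab -H 192.168.1.15 host_name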

Related

how can i run subprocess.Popen() on remote machine locally? [duplicate]


Run multiple commands in different SSH servers in parallel using Python Paramiko

I have an SSH.py with the goal of connecting to many servers over SSH to run a Python script (worker.py). I am using Paramiko, but am very new to it and learning as I go. On each server I SSH into, I need to keep the Python script running -- this is for training a model in parallel, so the script needs to run on all machines to update model parameters and train jointly. The Python script on the servers needs to keep running, so either none of the SSH connections can close, or I have to figure out a way for the script on the servers to keep running even if I close the connection.
From extensive googling, it looks like you can achieve this with nohup or:
client = paramiko.SSHClient()
client.connect(ip_address, username=username, password=password)
transport = client.get_transport()
channel = transport.open_session()
channel.exec_command("python worker.py > /logs/'command output' 2>&1")
However, what is unclear to me is how we close/exit all SSH connections. I am running the SSH.py file in cmd.exe; would closing cmd.exe be enough for all remote processes to close?
In addition, is my use of client.close() correct for my purposes?
Please see below what I have for my code.
# SSH.py
import paramiko
import argparse
import os

path = "path"
python_script = "worker.py"

# definitions for ssh connection and cluster
ip_list = ['XXX.XXX.XXX.XXX', 'XXX.XXX.XXX.XXX', 'XXX.XXX.XXX.XXX']
port_list = [':XXXX', ':XXXX', ':XXXX']
user_list = ['user', 'user', 'user']
password_list = ['pass', 'pass', 'pass']

node_list = list(map(lambda x: f'-node{x + 1} ', list(range(len(ip_list)))))
cluster = ' '.join([node + ip + port for node, ip, port in zip(node_list, ip_list, port_list)])

# run script on command line of local machine
os.system(f"cd {path} && python {python_script} {cluster} -type worker -index 0 -batch 64 > {path}/logs/'command output'/{ip_list[0]}.log 2>&1")

# loop for IP and password
for i, (ip, user, password) in enumerate(zip(ip_list[1:], user_list[1:], password_list[1:]), 1):
    try:
        print("Open session in: " + ip + "...")
        client = paramiko.SSHClient()
        client.connect(ip, username=user, password=password)
        transport = client.get_transport()
        channel = transport.open_session()
    except paramiko.SSHException:
        print("Connection Failed")
        quit()
    try:
        channel.exec_command(f"cd {path} && python {python_script} {cluster} -type worker -index {i} -batch 64 > {path}/logs/'command output'/{ip_list[i]}.log 2>&1", timeout=30)
        client.close()  # here I am closing the connection, but the command above should keep running -- my question is, can I safely close cmd.exe on which I am running SSH.py?
    except paramiko.SSHException:
        print("Cannot run file. Continue with other IPs in list...")
        client.close()
        continue
The code is based on Running process of remote SSH server in the background using Python Paramiko
Edit: It seems like channel.exec_command() is not executing the command
f"cd {path} && python {python_script} {cluster} -type worker -index {i} -batch 64 > {path}/logs/'command output'/{ip_list[i]}.log 2>&1"
So I wonder if it is because of client.close()? What would happen if I commented out all the lines with client.close()? Would that help? Is it dangerous? When I quit my local Python script, would that close all my SSH connections, removing the need for client.close()?
Also, all my machines run Windows.
Indeed, the problem is that you close the SSH connection. As the remote process is not detached from the terminal, closing the terminal terminates the process. On Linux servers, you can use nohup. I do not know whether there is a Windows equivalent.
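For reference, a hedged sketch of the nohup variant, reusing the channel from the snippet in the question (the script name is the question's own placeholder and the log path is simplified here):

channel.exec_command("nohup python worker.py > /logs/worker.log 2>&1 &")

The trailing & together with nohup detaches the remote process, so it can survive the SSH disconnect.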
Anyway, it seems that you do not need to close the connection. As I understand it, you are OK with waiting for all the commands to complete.
stdouts = []
clients = []

# Start the commands
commands = zip(ip_list[1:], user_list[1:], password_list[1:])
for i, (ip, user, password) in enumerate(commands, 1):
    print("Open session in: " + ip + "...")
    client = paramiko.SSHClient()
    client.connect(ip, username=user, password=password)
    command = \
        f"cd {path} && " + \
        f"python {python_script} {cluster} -type worker -index {i} -batch 64 " + \
        f"> {path}/logs/'command output'/{ip_list[i]}.log 2>&1"
    stdin, stdout, stderr = client.exec_command(command)
    clients.append(client)
    stdouts.append(stdout)

# Wait for commands to complete
for i in range(len(stdouts)):
    stdouts[i].read()
    clients[i].close()
Note that the above simple solution with stdout.read() is working only because you redirect the commands output to a remote file. Were you not, the commands might deadlock.
Without that (or if you want to see the command output locally) you will need a code like this:
while any(x is not None for x in stdouts):
    for i in range(len(stdouts)):
        stdout = stdouts[i]
        if stdout is not None:
            channel = stdout.channel
            # To prevent losing output at the end, first test for exit,
            # then for output
            exited = channel.exit_status_ready()
            while channel.recv_ready():
                s = channel.recv(1024).decode('utf8')
                print(f"#{i} stdout: {s}")
            while channel.recv_stderr_ready():
                s = channel.recv_stderr(1024).decode('utf8')
                print(f"#{i} stderr: {s}")
            if exited:
                print(f"#{i} done")
                clients[i].close()
                stdouts[i] = None
    time.sleep(0.1)
If you do not need to separate the stdout and stderr, you can greatly simplify the code by using Channel.set_combine_stderr. See Paramiko ssh die/hang with big output.
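A minimal sketch of that simplification, assuming client is an already connected SSHClient (opening the session by hand lets us set the flag before the command starts):

channel = client.get_transport().open_session()
channel.set_combine_stderr(True)  # stderr is merged into the stdout stream
channel.exec_command(command)
for line in channel.makefile('r'):  # a single stream, so reading it cannot deadlock
    print(line, end='')
print("exit status:", channel.recv_exit_status())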
Regarding your question about SSHClient.close: if you do not call it, the connection will be closed implicitly when the script finishes, when the Python garbage collector cleans up the pending objects. That's bad practice. And even if Python didn't do it, the local OS would terminate all connections of the local Python process; that's bad practice too. In any case, that will terminate the remote processes along with it.

paramiko equivalent of "cat File.gz | ssh address script.sh" in Python 3.7

The command I'm trying to run using paramiko in Python 3.7:
Windows:
type file.ext4.gz | ssh user@address sudo update.sh
Mac:
cat file.ext4.gz | ssh user@address sudo update.sh
From the cmd/terminal and from .bat/.sh this works, after entering the password. I've been working on a simple Python GUI (PySimpleGUI) to allow the user to do this, but without the need to enter the password (it is saved from the initial connection).
I've tried:
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(config["IP_ADDRESS"], username=config["USERNAME"], password=config["PASSWORD"], timeout=5)
a = client.open_sftp()
a.put(file_location, "sh update.sh", callback=sent)
While this works to send the file, it doesn't run it and gives the error:
OSError: Failure
I don't want to do this in subprocess, as this tool is to prevent the use of terminal for the "end user"
I've been beating my head against this for 2 days now. Thank you.
EDIT:
Here is the STDIO Code:
def send_ssh(value, input=None):
    if input:
        transport = client.get_transport()
        channel = transport.open_session()
        channel.exec_command(value)
        with open(input, "rb") as file:
            for chunk in iter(functools.partial(file.read, read_size), b''):
                if channel.send_ready():
                    channel.sendall(chunk)
                if channel.recv_ready():
                    print(channel.recv(1024).decode().strip())
                if channel.recv_stderr_ready():
                    print(channel.recv_stderr(1024).decode().strip())
        while not channel.exit_status_ready():
            if channel.recv_ready():
                print(channel.recv(1024).decode().strip())
            if channel.recv_stderr_ready():
                print(channel.recv_stderr(1024).decode().strip())
    else:
        w, r, e = client.exec_command(value, get_pty=True)
        error = e.read().strip().decode()
        if error != "":
            return error
        else:
            return r.read().strip().decode()
Once the file is piped to the script, it is verified by the script. I worked around this by using SFTP to send the file and then running
cat file | sudo script.sh
on the remote side. This works, but it requires transferring a 600 MB file (thankfully always over a local connection (LAN)) each time. The code above does transfer the file, but it doesn't complete. If I just try sending it via for line in file: the file gets corrupted.
Keeping things simpler, below we're using threading to allow synchronous APIs to be used rather than needing to write explicit asynchronous code:
import shutil
import sys
from threading import Thread

from paramiko import SSHClient

client = SSHClient()
client.load_system_host_keys()
client.connect('address', username='user')

# here's the important part: we're using the file handles returned by exec_command()
update_stdin, update_stdout, update_stderr = client.exec_command('sudo update.sh')

# copy stdout and stderr from the remote thread to our own process's stdout and stderr
t_out = Thread(target=shutil.copyfileobj, args=[update_stdout, sys.stdout]); t_out.start()
t_err = Thread(target=shutil.copyfileobj, args=[update_stderr, sys.stderr]); t_err.start()

# write your local file to the remote stdin, in the foreground: we don't exit until done.
shutil.copyfileobj(open('file.ext4.gz', 'rb'), update_stdin)
update_stdin.close()

# optional, but let's be graceful: wait for the threads to exit, and collect exit status
t_out.join(); t_err.join()
result = update_stdout.channel.recv_exit_status()
print(f"Remote process exited with status {result}")

Execute a command on Remote Machine in Python

I am writing a program in Python on Ubuntu to execute a command, ls -l, on a Raspberry Pi connected over the network.
Can anybody guide me on how to do that?
Sure, there are several ways to do it!
Let's say you've got a Raspberry Pi on a raspberry.lan host and your username is irfan.
subprocess
It's the default Python library that runs commands.
You can make it run ssh and do whatever you need on a remote server.
scrat has it covered in his answer. You definitely should do this if you don't want to use any third-party libraries.
You can also automate the password/passphrase entering using pexpect.
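For instance, a minimal sketch of that (the prompt text and credentials are placeholder assumptions):

import pexpect

# run ssh through pexpect and answer the password prompt
child = pexpect.spawn('ssh irfan@raspberry.lan ls -l')
child.expect('password:')  # assumed prompt text
child.sendline('my_strong_password')
child.expect(pexpect.EOF)  # wait for the command to finish
print child.before  # the ls -l output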
paramiko
paramiko is a third-party library that adds SSH-protocol support, so it can work like an SSH client.
Example code that connects to the server, executes the ls -l command, and grabs its results would look like this:
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('raspberry.lan', username='irfan', password='my_strong_password')

stdin, stdout, stderr = client.exec_command('ls -l')
for line in stdout:
    print line.strip('\n')
client.close()
fabric
You can also achieve it using fabric.
Fabric is a deployment tool which executes various commands on remote servers.
It's often used to run stuff on a remote server, so you could easily push the latest version of your web application, restart a web server, and whatnot with a single command. Actually, you can run the same command on multiple servers, which is awesome!
Though it was made as a deploying and remote management tool, you still can use it to execute basic commands.
# fabfile.py
from fabric.api import *

def list_files():
    with cd('/'):  # change the directory to '/'
        result = run('ls -l')  # run a 'ls -l' command
        # you can do something with the result here,
        # though it will still be displayed in fabric itself.
It's like typing cd / and ls -l in the remote server, so you'll get the list of directories in your root folder.
Then run in the shell:
fab list_files
It will prompt for a server address:
No hosts found. Please specify (single) host string for connection: irfan@raspberry.lan
A quick note: You can also assign a username and a host right in a fab command:
fab list_files -U irfan -H raspberry.lan
Or you could put a host into the env.hosts variable in your fabfile. Here's how to do it.
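For example, a sketch (Fabric 1.x reads env.hosts at module level and accepts user@host strings):

# fabfile.py
from fabric.api import env
env.hosts = ['irfan@raspberry.lan']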
Then you'll be prompted for an SSH password:
[irfan@raspberry.lan] run: ls -l
[irfan@raspberry.lan] Login password for 'irfan':
And then the command will run successfully.
[irfan@raspberry.lan] out: total 84
[irfan@raspberry.lan] out: drwxr-xr-x 2 root root 4096 Feb 9 05:54 bin
[irfan@raspberry.lan] out: drwxr-xr-x 3 root root 4096 Dec 19 08:19 boot
...
Simple example from here:
import subprocess
import sys

HOST = "www.example.org"
# Ports are handled in ~/.ssh/config since we use OpenSSH
COMMAND = "uname -a"

ssh = subprocess.Popen(["ssh", "%s" % HOST, COMMAND],
                       shell=False,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
result = ssh.stdout.readlines()
if result == []:
    error = ssh.stderr.readlines()
    print >>sys.stderr, "ERROR: %s" % error
else:
    print result
It does exactly what you want: connects over ssh, executes command, returns output. No third party library needed.
You may use the method below with Linux/Unix's built-in ssh command:

import os

os.system('ssh username@ip bash < local_script.sh >> /local/path/output.txt 2>&1')
os.system('ssh username@ip python < local_program.py >> /local/path/output.txt 2>&1')
The paramiko module can be used to run multiple commands by invoking an interactive shell. Here I created a class to invoke an SSH shell:
import logging
import select
import sys
import time

import paramiko

logger = logging.getLogger(__name__)

class ShellHandler:
    def __init__(self, host, user, psw):
        logger.debug("Initialising instance of ShellHandler host:{0}".format(host))
        try:
            self.ssh = paramiko.SSHClient()
            self.ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
            self.ssh.connect(host, username=user, password=psw, port=22)
            self.channel = self.ssh.invoke_shell()
        except:
            logger.error("Error Creating ssh connection to {0}".format(host))
            logger.error("Exiting ShellHandler")
            return
        self.psw = psw
        self.stdin = self.channel.makefile('wb')
        self.stdout = self.channel.makefile('r')
        self.host = host
        time.sleep(2)
        while not self.channel.recv_ready():
            time.sleep(2)
        self.initialprompt = ""
        while self.channel.recv_ready():
            rl, wl, xl = select.select([self.stdout.channel], [], [], 0.0)
            if len(rl) > 0:
                tmp = self.stdout.channel.recv(24)
                self.initialprompt = self.initialprompt + str(tmp.decode())

    def __del__(self):
        self.ssh.close()
        logger.info("closed connection to {0}".format(self.host))

    def execute(self, cmd):
        cmd = cmd.strip('\n')
        self.stdin.write(cmd + '\n')
        #self.stdin.write(self.psw + '\n')
        self.stdin.flush()
        time.sleep(1)
        while not self.stdout.channel.recv_ready():
            time.sleep(2)
            logger.debug("Waiting for recv_ready")
        output = ""
        while self.channel.recv_ready():
            rl, wl, xl = select.select([self.stdout.channel], [], [], 0.0)
            if len(rl) > 0:
                tmp = self.stdout.channel.recv(24)
                output = output + str(tmp.decode())
        return output
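A hypothetical usage sketch (host and credentials are placeholders); because it is one shell session, state such as the working directory persists between calls:

handler = ShellHandler('192.168.1.15', 'user', 'password')
print(handler.execute('cd /tmp'))
print(handler.execute('pwd'))  # still /tmp: the same shell session persists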
If creating a different shell each time does not matter to you, then you can use the method below:
    def run_cmd(self, cmd):
        try:
            cmd = cmd + '\n'
            #self.ssh.settimeout(60)
            stdin, stdout, stderr = self.ssh.exec_command(cmd)
            while not stdout.channel.eof_received:
                time.sleep(3)
                logger.debug("Waiting for eof_received")
            out = ""
            while stdout.channel.recv_ready():
                err = stderr.read()
                if err:
                    print("Error: ", self.host, str(err))
                    return False
                out = out + stdout.read()
            if out:
                return out
        except:
            error = sys.exc_info()
            logger.error(error)
            return False

Python run multiple ssh commands in the same session

My goal is to connect over SSH with Python and authenticate, which I can do with Paramiko or Fabric. But I would like to keep the session open after each execution and read the input/output. With Paramiko I can only run one command before the session is closed and I am asked to authenticate again, and the session hangs. And since Fabric uses the Paramiko library, it gives me the same issue. For example, if my directory structure is like this:
-home
--myfolder1
--myfolder2
I would like to execute the commands below without having to re-authenticate because the session closes.
(make connection)
run cmd: 'pwd'
output: /home
run cmd: 'cd myfolder2'
run cmd: 'pwd'
output: /home/myfolder2
Is this possible with any module that is out there right now? Could it be made from scratch with native python? And also is this just not possible...?
Edit: Added code. Without the new open_session it closes and I cannot run any command. After running the first command with this, I will be prompted again to authenticate and it creates an infinite loop.
Edit 2: If it closes after each command, then there is no way this will work at all, correct?
Edit 3: If I run this on a different server and exec_command with the paramiko.SSHClient, it won't ask me to reauthenticate, but if I 'cd somedir' and then 'pwd' it will output that I am back in the directory where I started.
class connect:
    newconnection = ''
    def __init__(self, username, password):
        ssh = paramiko.SSHClient()
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        try:
            ssh.connect('someserver', username=username, password=password, port=22, timeout=5)
        except:
            print "Could not connect"
            sys.exit()
        self.newconnection = ssh
    def con(self):
        return self.newconnection

# This will create the connection
sshconnection = connect('someuser', 'somepassword').con()

while True:
    cmd = raw_input("Command to run: ")
    if cmd == "":
        break
    try:
        transport = sshconnection.get_transport()
        transport.set_keepalive(999999)
        chan = transport.open_session()
        chan.settimeout(3)
        chan.setblocking(0)
    except:
        print "Failed to open a channel"
        chan.get_exception()
        sys.exit()
    print "running '%s'" % cmd
    stdout_data = []
    stderr_data = []
    pprint.pprint(chan)
    nbytes = 4096
    chan.settimeout(5)
    chan.get_pty()
    chan.exec_command(cmd)
    while True:
        print "Inside loop ", chan.exit_status_ready()
        time.sleep(1.2)
        if chan.recv_ready():
            print "First if"
            stdout_data.append(chan.recv(nbytes))
        if chan.recv_stderr_ready():
            print "Recv Ready"
            stderr_data.append(chan.recv_stderr(nbytes))
        if chan.exit_status_ready():
            print "Breaking"
            break
    print 'exit status: ', chan.recv_exit_status()
    print ''.join(stdout_data)
This is possible using the normal modules when you can concatenate the commands into one. Try
pwd ; cd myfolder2 ; pwd
as the command. This should work, but it quickly becomes tedious when you have more complex commands which need arguments, and horrible when the arguments contain spaces. The next step then is to copy a script with all the commands to the remote side and tell ssh to execute said script.
Another problem with this approach is that SSH doesn't return until all commands have executed.
Alternatively, you could build a "command server", i.e. a simple TCP server that listens for incoming connections and executes commands sent to it. It's pretty simple to write but also pretty insecure. Again, the solution is to turn the server into a (Python) script which reads commands from stdin, start that script remotely via SSH, and then send commands.
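For completeness, a minimal sketch of the interactive-shell route with paramiko's invoke_shell, where state such as the working directory persists between commands (host and credentials are placeholders, and the sleep/recv handling is deliberately crude):

import time
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('someserver', username='someuser', password='somepassword')

chan = client.invoke_shell()  # one channel, one shell: state persists
for cmd in ['pwd', 'cd myfolder2', 'pwd']:
    chan.send(cmd + '\n')
time.sleep(1)  # crude: give the shell time to respond
print chan.recv(4096)  # both pwd outputs, with shell prompts mixed in
client.close()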
