I wrote this code in Paramiko:
from paramiko import SSHClient, AutoAddPolicy
import time

ssh = SSHClient()
ssh.set_missing_host_key_policy(AutoAddPolicy())
ssh.connect(hostname, username=user, password=passwd, timeout=3)
session = ssh.invoke_shell()
session.send("\n")
session.send("echo step 1\n")
time.sleep(1)
session.send("sleep 30\n")
time.sleep(1)
while not session.recv_ready():
    time.sleep(2)
output = session.recv(65535)
session.send("echo step 2\n")
time.sleep(1)
output += session.recv(65535)
I'm trying to execute several commands on my Linux server. The problem is that my Python code does not wait for a command to finish executing: for example, if I run sleep 30, Python does not wait 30 seconds before moving on. How can I solve this? I tried the while recv_ready() loop, but it still does not wait.
Use exec_command: http://docs.paramiko.org/en/1.16/api/channel.html
stdin, stdout, stderr = ssh.exec_command("my_long_command --arg 1 --arg 2")
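A minimal sketch of how that addresses the waiting problem from the question (reusing the hostname, user and passwd variables from the question; the combined command is only an illustration):
from paramiko import SSHClient, AutoAddPolicy

ssh = SSHClient()
ssh.set_missing_host_key_policy(AutoAddPolicy())
ssh.connect(hostname, username=user, password=passwd, timeout=3)

# exec_command itself returns immediately; reading stdout (or asking for the
# exit status) is what blocks until the remote command has finished.
stdin, stdout, stderr = ssh.exec_command("echo step 1; sleep 30; echo step 2")
output = stdout.read()                          # waits the full 30 seconds
exit_code = stdout.channel.recv_exit_status()   # remote command's return code
print(output.decode())
ssh.close()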
The following code works for me:
from paramiko import SSHClient, AutoAddPolicy
import time
ssh = SSHClient()
ssh.set_missing_host_key_policy(AutoAddPolicy())
ssh.connect('111.111.111.111', username='myname', key_filename='/path/to/my/id_rsa', port=1123)
sleeptime = 0.001
outdata, errdata = '', ''
ssh_transp = ssh.get_transport()
chan = ssh_transp.open_session()
# chan.settimeout(3 * 60 * 60)
chan.setblocking(0)
chan.exec_command('ls -la')
while True:  # monitoring process
    # Reading from output streams
    while chan.recv_ready():
        outdata += chan.recv(1000).decode()
    while chan.recv_stderr_ready():
        errdata += chan.recv_stderr(1000).decode()
    if chan.exit_status_ready():  # If completed
        break
    time.sleep(sleeptime)
retcode = chan.recv_exit_status()
ssh_transp.close()
print(outdata)
print(errdata)
Please note that the history command cannot be executed over ssh as is.
See example here: https://superuser.com/questions/962001/incorrect-output-of-history-command-of-ssh-how-to-read-the-timestamp-info-corre
If you do not need to read stdout and stderr separately, you can use much more straightforward code:
stdin, stdout, stderr = ssh_client.exec_command(command)
stdout.channel.set_combine_stderr(True)
output = stdout.readlines()
The readlines call reads until the command finishes and returns the complete output.
In case you need the outputs separately, do not be tempted to remove set_combine_stderr and call readlines on stdout and stderr separately. That can deadlock. See Paramiko ssh die/hang with big output.
For correct code that reads the outputs separately, see Run multiple commands in different SSH servers in parallel using Python Paramiko.
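If you only need the general idea, one deadlock-free pattern (an illustrative sketch, not the code from that answer) is to drain stderr on a background thread while the main thread reads stdout:
import threading

def drain(stream, sink):
    # Keep reading until EOF so the remote side never blocks on a full stderr buffer.
    for line in stream:
        sink.append(line)

stdin, stdout, stderr = ssh_client.exec_command(command)
err_lines = []
worker = threading.Thread(target=drain, args=(stderr, err_lines))
worker.start()
out_lines = stdout.readlines()   # blocks until the command finishes
worker.join()                    # err_lines now holds the complete stderr output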
Obligatory warning: Do not use AutoAddPolicy – You are losing a protection against MITM attacks by doing so. For a correct solution, see Paramiko "Unknown Server".
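A minimal sketch of the safer setup, assuming your known_hosts file already contains the server's key (the host name and key path are placeholders):
from paramiko import SSHClient, RejectPolicy

client = SSHClient()
client.load_system_host_keys()                       # trust the keys in ~/.ssh/known_hosts
client.set_missing_host_key_policy(RejectPolicy())   # refuse to connect to unknown hosts
client.connect('example.com', username='user', key_filename='/path/to/id_rsa')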
How do I execute multiple commands using Paramiko and read the output back into my python script?
This question is theoretically answered here How do you execute multiple commands in a single session in Paramiko? (Python), but in my view that answer is incorrect.
The problem is that when you read stdout, it contains the entire content of the terminal, including the commands that you "typed" into the terminal.
Just try it (this is basically a copy paste from the above thread):
import paramiko
machine = "your machine ip"
username = "your username"
password = "password"
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(machine, username = username, password = password)
channel = client.invoke_shell()
stdin = channel.makefile('wb')
stdout = channel.makefile('rb')
stdin.write('''
cd tmp
ls
exit
''')
print(stdout.read())
stdout.close()
stdin.close()
client.close()
So my question is, how do I execute multiple commands and read only the output of those commands, rather than the input I "typed" and the output?
Thanks in advance for your kind help and time.
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
target_host = 'x.x.x.x'
target_port = 22
pwd = ':)'
un = 'root'
ssh.connect(hostname=target_host, username=un, password=pwd, port=target_port)
# Now execute multiple commands separated by a semicolon
stdin, stdout, stderr = ssh.exec_command('cd mydir;ls')
print(stdout.readlines())
You're seeing the commands you typed because the shell echoes them back. You can turn this off by running
stty -echo
before your other commands.
Another approach is to not invoke an interactive shell, but just run the commands directly, unless there's some other reason you especially need an interactive shell. For instance you could say
client.exec_command('/bin/sh -c "cd /tmp && ls"')
If you want a shell but without a pty, you can try
client.exec_command('/bin/sh')
and I think that will suppress the echo too.
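A rough sketch of that last idea, assuming client is the connected SSHClient from the question (the commands are placeholders). The shell gets no pty, so the commands you feed it on stdin are not echoed back:
stdin, stdout, stderr = client.exec_command('/bin/sh')
stdin.write('cd /tmp\nls\nexit\n')   # feed the commands on stdin; 'exit' ends the shell
stdin.flush()
print(stdout.read().decode())        # only the commands' output, no echoed input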
This should work:
# Explicitly provide key, ip address and username
from paramiko import SSHClient, AutoAddPolicy
result = []
def ssh_conn():
    client = SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(AutoAddPolicy())
    client.connect('<host_IP_address>', username='<your_host_username>', key_filename='<private_key_location>')
    stdin, stdout, stderr = client.exec_command('ls -la')
    for each_line in stdout:
        result.append(each_line.strip('\n'))
    client.close()

ssh_conn()

for each_line in result:
    print(each_line.strip())
I've successfully implemented Paramiko using exec_command, however, the command I'm running on the remote machine(s) can sometimes take several minutes to complete.
During this time my Python script has to wait for the remote command to complete and receive stdout.
My goal is to let the remote machine "run in the background", and allow the local Python script to continue once it sends the command via exec_command.
I'm not concerned with stdout at this point, I'm just interested in bypassing waiting for stdout to return so the script can continue on while the command runs on the remote machine.
Any suggestions?
Current script:
def function():
    ssh_object = paramiko.SSHClient()
    ssh_object.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh_object.connect(address, port=22, username='un', password='pw')
    command = 'command to run'
    try:
        stdin, stdout, stderr = ssh_object.exec_command(command)
        stdout.readlines()
    except:
        pass  # do something else
Thank you!
Use a separate thread to run the command. Usually threads should be cleaned up with the join command (the exception are daemon threads that you expect to run until your program exits). Exactly how you do that depends on the other stuff your program is running. But an example is:
import threading
def ssh_exec_thread(ssh_object, command):
    stdin, stdout, stderr = ssh_object.exec_command(command)
    stdout.readlines()

def function():
    ssh_object = paramiko.SSHClient()
    ssh_object.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh_object.connect(address, port=22, username='un', password='pw')
    command = 'command to run'
    thread = threading.Thread(target=ssh_exec_thread, args=(ssh_object, command))
    thread.start()
    # ...do something else...
    thread.join()
You can make this fancier by passing a Queue to ssh_exec_thread and putting the result on the queue for your program to process later.
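A sketch of that queue variant (the extra results parameter is illustrative, not part of the original answer):
import threading
import queue

def ssh_exec_thread(ssh_object, command, results):
    stdin, stdout, stderr = ssh_object.exec_command(command)
    results.put(stdout.readlines())   # hand the collected output back to the main thread

results = queue.Queue()
thread = threading.Thread(target=ssh_exec_thread, args=(ssh_object, command, results))
thread.start()
# ...do something else...
thread.join()
output = results.get()                # lines produced by the remote command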
def exec_command(self, command, bufsize=-1):
    #print "Executing Command: "+command
    chan = self._transport.open_session()
    chan.exec_command(command)
    stdin = chan.makefile('wb', bufsize)
    stdout = chan.makefile('rb', bufsize)
    stderr = chan.makefile_stderr('rb', bufsize)
    return stdin, stdout, stderr
When executing a command in paramiko, it always resets the session when you run exec_command.
I want to be able to execute sudo or su and still have those privileges when I run another exec_command.
Another example would be trying to exec_command("cd /") and then run exec_command again and have it be in the root directory. I know you can do something like exec_command("cd /; ls -l"), but I need to do it in separate function calls.
Non-Interactive use cases
This is a non-interactive example... it sends cd tmp, ls and then exit.
import sys
sys.stderr = open('/dev/null') # Silence silly warnings from paramiko
import paramiko as pm
sys.stderr = sys.__stderr__
import os
class AllowAllKeys(pm.MissingHostKeyPolicy):
    def missing_host_key(self, client, hostname, key):
        return
HOST = '127.0.0.1'
USER = ''
PASSWORD = ''
client = pm.SSHClient()
client.load_system_host_keys()
client.load_host_keys(os.path.expanduser('~/.ssh/known_hosts'))
client.set_missing_host_key_policy(AllowAllKeys())
client.connect(HOST, username=USER, password=PASSWORD)
channel = client.invoke_shell()
stdin = channel.makefile('wb')
stdout = channel.makefile('rb')
stdin.write('''
cd tmp
ls
exit
''')
print(stdout.read())
stdout.close()
stdin.close()
client.close()
Interactive use cases
If you have an interactive ssh use case, paramiko can handle it... I personally would drive interactive ssh sessions with scrapli.
Listing all the ways I can think of to use paramiko interactively:
See nagabhushan's answer which uses paramiko interactively
In ansible, paramiko usage is configurable
By default, exscript ssh uses paramiko
By default, netmiko ssh uses paramiko
By default, scrapli ssh uses paramiko
I might have missed some libraries that use paramiko, but it should be clear that paramiko is used quite extensively by python libraries that control ssh sessions.
Try creating a command string with the commands separated by the \n character. It worked for me.
For example: ssh.exec_command("command_1 \n command_2 \n command_3")
Strictly speaking, you can't. According to the ssh spec:
A session is a remote execution of a program. The program may be a
shell, an application, a system command, or some built-in subsystem.
This means that, once the command has executed, the session is finished. You cannot execute multiple commands in one session. What you CAN do, however, is start a remote shell (== one command) and interact with that shell through stdin etc. (think of executing a Python script vs. running the interactive interpreter).
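A minimal sketch of that idea, assuming client is an already-connected SSHClient (the commands are placeholders): one session runs a shell, and several commands share its state.
stdin, stdout, stderr = client.exec_command('/bin/bash')
stdin.write('cd /\n')     # state such as the working directory persists...
stdin.write('pwd\n')      # ...so this prints "/"
stdin.write('ls\n')
stdin.write('exit\n')     # let the shell terminate so stdout.read() can return
stdin.flush()
print(stdout.read().decode())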
You can do that by invoking a shell on the client and sending commands. Please refer here.
The page has code for Python 3.5. I have modified the code a bit to work for Python 2.7. I am adding the code here for reference.
import threading, paramiko
import time  # needed for the sleep before reading the collected output

strdata = ''
fulldata = ''

class ssh:
    shell = None
    client = None
    transport = None

    def __init__(self, address, username, password):
        print("Connecting to server on ip", str(address) + ".")
        self.client = paramiko.client.SSHClient()
        self.client.set_missing_host_key_policy(paramiko.client.AutoAddPolicy())
        self.client.connect(address, username=username, password=password, look_for_keys=False)
        self.transport = paramiko.Transport((address, 22))
        self.transport.connect(username=username, password=password)
        thread = threading.Thread(target=self.process)
        thread.daemon = True
        thread.start()

    def close_connection(self):
        if self.client is not None:
            self.client.close()
            self.transport.close()

    def open_shell(self):
        self.shell = self.client.invoke_shell()

    def send_shell(self, command):
        if self.shell:
            self.shell.send(command + "\n")
        else:
            print("Shell not opened.")

    def process(self):
        global strdata, fulldata
        while True:
            # Print data when available
            if self.shell is not None and self.shell.recv_ready():
                alldata = self.shell.recv(1024)
                while self.shell.recv_ready():
                    alldata += self.shell.recv(1024)
                strdata = strdata + str(alldata)
                fulldata = fulldata + str(alldata)
                strdata = self.print_lines(strdata)  # print all received data except the last line

    def print_lines(self, data):
        last_line = data
        if '\n' in data:
            lines = data.splitlines()
            for i in range(0, len(lines) - 1):
                print(lines[i])
            last_line = lines[len(lines) - 1]
            if data.endswith('\n'):
                print(last_line)
                last_line = ''
        return last_line

sshUsername = "SSH USERNAME"
sshPassword = "SSH PASSWORD"
sshServer = "SSH SERVER ADDRESS"

connection = ssh(sshServer, sshUsername, sshPassword)
connection.open_shell()
connection.send_shell('cmd1')
connection.send_shell('cmd2')
connection.send_shell('cmd3')
time.sleep(10)
print(strdata)   # print the last line of received data
print('==========================')
print(fulldata)  # This contains the complete data received.
print('==========================')
connection.close_connection()
cmd = 'ls /home/dir'
self.ssh_stdin, self.ssh_stdout, self.ssh_stderr = self.ssh.exec_command(cmd)
print(self.ssh_stdout.read())
cmd2 = 'cat /home/dir/test.log'
self.ssh_stdin2, self.ssh_stdout2, self.ssh_stderr2 = self.ssh.exec_command(cmd2)
print(self.ssh_stdout2.read())
You can run multiple commands with the technique below: use a semicolon to separate the Linux commands.
E.g.:
chan.exec_command("date;ls;free -m")
If you want each command to have an effect on the next one, run them in a single exec_command call:
stdin, stdout, stderr = client.exec_command("command1;command2;command3")
but in some cases, I found that where ";" doesn't work, using "&&" does work:
stdin, stdout, stderr = client.exec_command("command1 && command2 && command3")
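To see the difference between the two separators, here is a small sketch (the directory name is made up):
# ';' runs the second command even if the first one fails:
_, out, _ = client.exec_command('cd /no/such/dir; echo after-semicolon')
print(out.read().decode())    # prints "after-semicolon"

# '&&' runs the second command only if the first one succeeded:
_, out, _ = client.exec_command('cd /no/such/dir && echo after-and')
print(out.read().decode())    # prints nothing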
For better reuse, you can execute an entire Bash script file; here is the code for that:
import paramiko
hostname = "192.168.1.101"
username = "test"
password = "abc123"
# initialize the SSH client
client = paramiko.SSHClient()
# add to known hosts
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
try:
    client.connect(hostname=hostname, username=username, password=password)
except:
    print("[!] Cannot connect to the SSH Server")
    exit()
# read the BASH script content from the file
bash_script = open("script.sh").read()
# execute the BASH script
stdin, stdout, stderr = client.exec_command(bash_script)
# read the standard output and print it
print(stdout.read().decode())
# print errors if there are any
err = stderr.read().decode()
if err:
    print(err)
# close the connection
client.close()
This will execute the local script.sh file on the remote 192.168.1.101 Linux machine.
script.sh (just an example):
cd Desktop
mkdir test_folder
cd test_folder
echo "$PATH" > path.txt
This tutorial explains this in detail: How to Execute BASH Commands in a Remote Machine in Python.
I am trying to combine the output of stdout and stderr. My belief is that this can be done with the set_combine_stderr() method of a Channel object.
This is what I am doing:
ssh = paramiko.SSHClient()
# I connect and everything is OK, then:
chan = ssh.invoke_shell()
chan.set_combine_stderr(True)
chan.exec_command('python2.6 subir.py')
resultado = chan.makefile('rb', -1)
However, I get the following error when I try to store the result (last line above, chan.makefile() ):
Error: Channel closed.
Any help would be greatly appreciated
While it is true that set_combine_stderr diverts stderr to the stdout stream, it does so in chaotic order, so you do not get the result you probably want, namely, the lines combined in the order written, as if you were running the command in a local terminal window. Instead, use get_pty. That will cause the server to run the lines through a pseudo-terminal, keeping them in chronological sequence.
Here's a test program, outerr.py, that writes alternating lines on stdout and stderr. Assume it's sitting in the home directory of llmps@meerkat2.
#!/usr/bin/env python
import sys
for x in xrange(1, 101):
    (sys.stdout, sys.stderr)[x%2].write('This is line #%s, on std%s.\n' %
                                        (x, ('out', 'err')[x%2]))
Now try the following code to run it remotely:
#!/usr/bin/env python
import paramiko

def connect():
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect('meerkat2', username='llmps', password='..')
    return ssh

def runTest(ssh):
    tran = ssh.get_transport()
    chan = tran.open_session()
    # chan.set_combine_stderr(True)
    chan.get_pty()
    f = chan.makefile()
    chan.exec_command('./outerr.py')
    print f.read(),

if __name__ == '__main__':
    ssh = connect()
    runTest(ssh)
    ssh.close()
If you run the above, you should see 100 lines in order as written. If, instead, you comment out the chan.get_pty() call and uncomment the chan.set_combine_stderr(True) call, you will get clumps of stdout and stderr lines interspersed randomly from run to run.
OK, I know this is quite an old topic, but I ran into the same problem and came up with a (maybe not-so-)pretty solution: just call the command on the remote server redirecting stderr to stdout, and then always read from stdout. For example:
client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('hostname', username='user', password='pass')
stdin, stdout, stderr = client.exec_command('python your_script.py 2>&1')
print(stdout.read())
#AaronMcSmooth: I am referring to the stdout and stderr of the computer I am connecting to (via SSH).
I ended up doing this:
stdin, stdout, stderr = ssh.exec_command(...)
output = stdout.read().strip() + stderr.read().strip()
For the purpose of my application, it doesn't matter to distinguish between stdout and stderr, but I don't think that's the best way to combine the two.
The code of SSHClient.exec_command() is (looking at paramiko's source code):
def exec_command(self, command, bufsize=-1):
    chan = self._transport.open_session()
    chan.exec_command(command)
    stdin = chan.makefile('wb', bufsize)
    stdout = chan.makefile('rb', bufsize)
    stderr = chan.makefile_stderr('rb', bufsize)
    return stdin, stdout, stderr
I am performing the same actions on the channel but receive the Channel is closed error.