Paramiko object method exec_command doesn't execute shell script - Python

This is my third hour of knowing that something called Paramiko exists.
My requirement: run a Python script on Windows that connects to a remote Linux server, executes a shell script there, takes whatever the shell script prints, returns it to the Python script, and prints it on the Windows terminal.
I am able to connect through SSH; opening a channel and a session both work.
Issue: SshObj.exec_command doesn't work for some commands or scripts.
However, when I tried the plain "ls" command, the same SshObj.exec_command call worked fine.
Below is a snippet:
>>> ssh = paramiko.SSHClient()
>>> ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
>>> ssh.connect(IP,22,usname,pswd)
>>> stdin, stdout, stderr = ssh.exec_command('/home/scripts/a.sh')
>>> stdout.channel.recv_ready()
False
>>> stdout.channel.recv_ready()
False
>>> stdin, stdout, stderr = ssh.exec_command('/home/scripts/a.sh')
>>> stdin, stdout, stderr = ssh.exec_command('ls')
>>> stdout.channel.recv_ready()
True
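A likely explanation, reading the transcript: recv_ready() only reports whether output has already arrived on the channel, so a script that takes longer than ls simply hasn't produced anything yet at the instant it is polled. A minimal sketch that blocks until the script finishes, assuming the same ssh object and that /home/scripts/a.sh is executable on the server (if it isn't, 'sh /home/scripts/a.sh' sidesteps the permission bit):
stdin, stdout, stderr = ssh.exec_command('/home/scripts/a.sh')
# read() blocks until the remote command closes its output stream,
# unlike recv_ready(), which only checks data already buffered.
output = stdout.read()
errors = stderr.read()
status = stdout.channel.recv_exit_status()  # safe now: output already drained
print(output)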

Related

Python paramiko push var on remote host

Does someone know how I can use a variable in a command executed on a remote host with paramiko?
For example, I am trying to push multiple files and make them executable.
SRCFILE='/root/test1.py'
DSTDIR="/tmp/"
dstf= 'runremote.py'
stdin, stdout, stderr = ssh.exec_command("chmod +x" + '' + DSTDIR + dstf)
These lines are only an example, not the actual script.
My issue: I get nothing from stdout, and the file is untouched.
If I run stdin, stdout, stderr = ssh.exec_command("chmod +x /tmp/runremote.py") instead, I get output on stdout and my file is changed.
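A likely cause, judging from the snippet: the concatenation yields chmod +x/tmp/runremote.py with no space, so the remote shell sees one unknown command and the error goes to stderr, which is never read. A hedged sketch of the fix:
DSTDIR = '/tmp/'
dstf = 'runremote.py'
# Note the explicit space after +x; the original string concatenation
# produced "chmod +x/tmp/runremote.py", a single unknown token.
stdin, stdout, stderr = ssh.exec_command("chmod +x " + DSTDIR + dstf)
print(stderr.read())  # always check stderr when a command seems to do nothing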

paramiko script works if commands are entered manually in python shell

I have a simple Python script which uses the paramiko module to SSH into an Ubuntu PC and turn it off.
The problem is the script doesn't work, whereas if the commands are typed manually into the Python interpreter, they work as intended and the remote PC shuts down.
Here is my code:
#!/usr/bin/python3
import paramiko, subprocess
remote_ip=''
remote_user=''
remote_password=''
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(remote_ip, username = remote_user, password = remote_password,look_for_keys = False, allow_agent = False)
stdin, stdout, stderr = ssh.exec_command('sudo poweroff', get_pty = True)
stdin.write(remote_password+'\n')
stdin.flush()
ssh.close()
print(stdout.readlines())
print(stderr.readlines())
I figured out the solution, but I don't know why it works. I removed the ssh.close() line, and now the script works as intended. It would be helpful if someone could explain this.
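A plausible explanation (my reading, not from the original thread): ssh.close() tears down the transport and its channels immediately, so output that has not yet been read is discarded and the remote command may even be cut off mid-flight; typing the commands interactively leaves enough time for everything to arrive first. The robust ordering drains the channel before closing, roughly:
stdin, stdout, stderr = ssh.exec_command('sudo poweroff', get_pty=True)
stdin.write(remote_password + '\n')
stdin.flush()
# Read (or wait on) the channel before tearing the connection down.
print(stdout.readlines())
print(stderr.readlines())
ssh.close()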

Verify if a process is running on a remote server

I am using paramiko to start a process on a remote server.
With the code below, it prints 'not able to start' regardless of whether the process actually starts.
I'm not able to figure out the issue here.
stdin, stdout, stderr = ssh.exec_command("{0}/minidiameterd -f {0}/BasicDiam1".format(minidiam_path))
stdin, stdout, stderr = ssh.exec_command("pgrep minidiameterd")
output = stdout.readlines()
if not output:
    print "Not able to start minidiameterd"
Can you try
output = stdout.read().splitlines()
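Beyond the missing parentheses on read(), each exec_command call opens a fresh channel, and the first channel may close, taking a not-yet-daemonized process with it, before pgrep runs. A hedged sketch (nohup and the redirect are my additions, assuming minidiameterd does not fully daemonize on its own):
cmd = "nohup {0}/minidiameterd -f {0}/BasicDiam1 >/dev/null 2>&1 &".format(minidiam_path)
stdin, stdout, stderr = ssh.exec_command(cmd)
stdout.channel.recv_exit_status()    # wait until the remote shell returns
stdin, stdout, stderr = ssh.exec_command("pgrep minidiameterd")
output = stdout.read().splitlines()  # note read(), the method call
if not output:
    print "Not able to start minidiameterd"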

ssh.exec_command("shutdown -h 17:00 &")

I have a Python Paramiko script that sends commands to remote hosts on our intranet. There are times when I would like to send the shutdown command to several hosts at once. The issue is that the shutdown command simply sits and waits unless you background it. I have tried using the ampersand (bare as above, or escaped: \&). Here is a small test program. My OS is RHEL 5.9 (Python 2.4.3). Note that sudoers disables requiretty for some users.
#!/usr/bin/python
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("<hostname>",username="<my username>", password="<mypassword>")
stdin, stdout, stderr = ssh.exec_command("sudo /sbin/shutdown -h 17:00 \&")
stdin.write('\n')
stdin.flush()
data = stdout.read().splitlines()
for line in data:
    print line
I have solved the issue by using the shutdown command as intended. First, do not escape the ampersand (\&). Since the shutdown command does not return anything to stdout, I just eliminated the lines dealing with output. The reason for using shutdown with a time is user notification.
#!/usr/bin/python
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("<hostname>",username="<my username>", password="<mypassword>")
stdin, stdout, stderr = ssh.exec_command("sudo /sbin/shutdown -h 17:00 &")
ssh.close()
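For commands that do produce output, a more general pattern (my suggestion, not part of the answer above) is to detach the process completely so the channel can close at once:
stdin, stdout, stderr = ssh.exec_command(
    "nohup sudo /sbin/shutdown -h 17:00 >/dev/null 2>&1 &")
ssh.close()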

Running command with Paramiko exec_command causes process to sleep before finishing

I'm using Python's Paramiko module to SSH into a remote machine and tar/zip a folder with a LOT of files (over 14K files and 60+ GB of data). The resulting zip is around 10 GB itself. Now, I can run the command to zip/tar directly on the machine with no problem. However, when I try to run the same command through SSHClient.exec_command, it runs for a bit, but eventually the zipping process on the remote machine goes to sleep, and recv_exit_status just hangs indefinitely. Here is the code I'm using:
stdin, stdout, stderr = ssh.exec_command('cd myDirectory; tar -zcvf output.tgz *')
status = stdout.channel.recv_exit_status()
I also tried using Zip.
stdin, stdout, stderr = ssh.exec_command('cd myDirectory; find -name "*.gz" | zip output.zip -#')
status = stdout.channel.recv_exit_status()
In both cases, if I run the command directly on the remote machine, it finishes zipping/tarring, and the resulting file is around 9 GB. But when I try it from Paramiko, it starts, gets more than half way (6-ish GB), and then the process goes to sleep!
I've monitored the processes on the remote machine using top, and the zip/tar WILL start running, but it eventually goes to sleep before finishing. And the Python script hangs indefinitely.
Any ideas why this is happening?
It could be timeout related. Try adding the timeout param (in seconds) to the call: exec_command(command, timeout=20*60). This is a 20-minute example.
See the docstring of that method for more info:
def exec_command(self, command, bufsize=-1, timeout=None, get_pty=False):
    """
    Execute a command on the SSH server. A new `.Channel` is opened and
    the requested command is executed. The command's input and output
    streams are returned as Python ``file``-like objects representing
    stdin, stdout, and stderr.

    :param str command: the command to execute
    :param int bufsize:
        interpreted the same way as by the built-in ``file()`` function in
        Python
    :param int timeout:
        set command's channel timeout. See `Channel.settimeout`.
    :return:
        the stdin, stdout, and stderr of the executing command, as a
        3-tuple

    :raises SSHException: if the server fails to execute the command
    """
Also, there is another issue that I experienced which could also contribute: https://github.com/paramiko/paramiko/issues/109
Try my suggestion in https://github.com/paramiko/paramiko/issues/109#issuecomment-111621658
I also experienced this issue; it is due to stdout.channel.eof_received == 0
import paramiko
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("1.1.1.1", username="root", password="pass")
stdin, stdout, stderr = client.exec_command("service XXX start")
stdin, stdout and stderr are staying open...
>>> print stdin
<paramiko.ChannelFile from <paramiko.Channel 3 (open) window=2097152 in-buffer=50 -> <paramiko.Transport at 0x17eff90L (cipher aes128-ctr, 128 bits) (active; 1 open channel(s))>>>
>>> print stdout
<paramiko.ChannelFile from <paramiko.Channel 3 (open) window=2097152 in-buffer=50 -> <paramiko.Transport at 0x17eff90L (cipher aes128-ctr, 128 bits) (active; 1 open channel(s))>>>
>>> print stderr
<paramiko.ChannelFile from <paramiko.Channel 3 (open) window=2097152 in-buffer=50 -> <paramiko.Transport at 0x17eff90L (cipher aes128-ctr, 128 bits) (active; 1 open channel(s))>>>
So EOF was not received...
>>> print stdin.channel.eof_received
0
Usually I receive True and can just stdout.read(), but to be safe I use this workaround (which works!): wait for a timeout, force stdout.channel.close(), and then stdout.read():
>>> timeout = 30
>>> import time
>>> endtime = time.time() + timeout
>>> while not stdout.channel.eof_received:
...     time.sleep(1)
...     if time.time() > endtime:
...         stdout.channel.close()
...         break
>>> stdout.read()
'Starting XXX: \n[ OK ]\rProgram started . . .\n'
>>>
My solution is client.exec_command('my_cmd', get_pty=True).
get_pty=True requests a pseudo-terminal from the server.
So if you can run your command in an interactive SSH session, it should also work using the exec_command() function.
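A minimal sketch of that suggestion, reusing the client object from the snippet above:
stdin, stdout, stderr = client.exec_command("service XXX start", get_pty=True)
# With a pty, stderr is merged into stdout, so read a single stream.
print stdout.read()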
I just had a similar thing happen to me. I can run a command when logged in via SSH, but running it through exec_command eventually puts the command to sleep (S in htop). I found that it might be due to the command producing too much output on either stderr or stdout, which I gather can overflow buffers, causing the signal that the command has finished to be lost, but I'm definitely no expert. What I did find is that adding > /dev/null 2>&1 to the end of my command (thereby removing the need for paramiko to touch stdout and stderr) allows the same command to finish via exec_command.
In summary, my workflow looks like this:
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('999.99.99.99', username='me', key_filename='my_key')
stdin, stdout, stderr = ssh.exec_command('my_command > /dev/null 2>&1')
# Both reads return empty strings here, since all output was redirected.
stdout_content = stdout.read()
stderr_content = stderr.read()
ssh.close()
This works for now, but I would still be very interested if someone knows how to fix the original problem without having to change the command to redirect stdout and stderr.
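One way to keep the output (my sketch, not from the thread) is to drain both streams while the command runs, so neither buffer fills up and suspends the remote process:
import time
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('999.99.99.99', username='me', key_filename='my_key')

stdin, stdout, stderr = ssh.exec_command('my_command')
channel = stdout.channel
stdout_chunks, stderr_chunks = [], []
# Keep emptying both buffers until the command signals completion;
# a full window on either stream can stall the remote process.
while not channel.exit_status_ready():
    drained = False
    if channel.recv_ready():
        stdout_chunks.append(channel.recv(65536))
        drained = True
    if channel.recv_stderr_ready():
        stderr_chunks.append(channel.recv_stderr(65536))
        drained = True
    if not drained:
        time.sleep(0.1)
# Collect whatever arrived after the exit status was set.
stdout_chunks.append(stdout.read())
stderr_chunks.append(stderr.read())
status = channel.recv_exit_status()
ssh.close()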
