How do I know the remote process is running, or complete?

I'm using the Python library Paramiko to run a command over SSH on another server. The problem I'm facing is that SSHClient.exec_command() returns immediately, handing me stdin, stdout, and stderr but giving me no other way I can see to tell whether the process is still running. I thought I might monitor whether the streams it returns are still open, but I can't find any way to do that except by trying to read from stdout or stderr, or write to stdin, and waiting to receive a ValueError. Can anyone point out something I've missed that would work instead?

Thanks to advice from @fixxxer I found what I needed to know. My test code now looks like this:
import paramiko
import time

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('localhost', username='user', password='password')

# Open a session on the underlying transport so we keep the Channel object itself.
transport = ssh.get_transport()
channel = transport.open_session()
channel.exec_command('./exec_test.py')

# recv_exit_status() blocks until the remote process exits, then returns its exit code.
status = channel.recv_exit_status()
This works marvellously. It blocks until the command is finished, then allows me to continue.
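If you would rather poll than block, Channel also provides exit_status_ready(); a minimal sketch, reusing the channel from the code above:
while not channel.exit_status_ready():
    # the remote process is still running; do other work or sleep briefly
    time.sleep(0.5)
status = channel.recv_exit_status()  # returns immediately once the status has arrived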

Related

Paramiko stdout.readlines() is too slow

I am using Paramiko in my Python and Django code to execute a command. Here is my code:
from paramiko import SSHClient, AutoAddPolicy

client = SSHClient()
client.set_missing_host_key_policy(AutoAddPolicy())
client.connect(<host>, username=<username>, password=<password>)
stdin, stdout, stderr = client.exec_command(
    "curl -X POST http://127.0.0.1:8080/predictions -T image.jpg")
lines = stdout.readlines()
The execution time of stdout.readlines() is 0.59s for each command. That is not an acceptable delay for my close-to-real-time system. Could anyone suggest how to make the reading process faster?
SSHClient.exec_command() only starts the command; it does not wait for the command to complete. That is what readlines() does, so readlines() takes as long as the command itself does.
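If the remote command streams its output, you can at least consume it as it arrives instead of waiting for readlines() to gather everything; a minimal sketch using the same exec_command call (the total time is still bounded by the command's own runtime):
stdin, stdout, stderr = client.exec_command(
    "curl -X POST http://127.0.0.1:8080/predictions -T image.jpg")
for line in stdout:      # yields each line as soon as the remote side sends it
    print(line, end='')  # replace with your real processing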
Obligatory warning: Do not use AutoAddPolicy – you lose protection against MITM attacks by doing so. For a correct solution, see Paramiko "Unknown Server".

Can't get commands on remote host after a fixed number of sent commands

I have a program with 2 threads. Each thread sends different commands to a remote host and redirects the output to a file. The threads use different remote hosts. I've created a connection with pxssh and am trying to send commands to the remote hosts with sendline:
import logging
from pexpect import pxssh

# inside the thread's worker function
s = pxssh.pxssh()
try:
    s.login(ip, user, pswd)
except pxssh.ExceptionPxssh:
    logging.error("login: error")
    return
logging.debug("login: success")
s.sendline("ls / >> tmpfile.log")
s.prompt()
I can send a fixed number of commands (about 500 on each host) and after that sendline stops working. The connection is fine, but the commands never reach the remote hosts. It looks like some resource runs out... what could it be?
Reposting as an answer, since it solved the issue:
Are you reading in between each write? If the host is producing output and you're not reading it, sooner or later a buffer will fill up and it will block until there's room to write some more. Make sure that before each write, you read any data that's available in the terminal, even if you don't want to do anything with it.
If you really don't care about the output at all, you could create a thread that constantly reads in a loop, so that your main thread can skip reading altogether. If your code needs to do anything with any part of the output, though, don't do this.
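A minimal sketch of the first suggestion, reusing the pxssh session from the question: drain the output after every command so the remote buffer can never fill.
for i in range(1000):
    s.sendline("ls / >> tmpfile.log")
    s.prompt()        # block until the shell prompt returns, consuming all pending output
    _ = s.before      # everything printed before the prompt; log it or throw it away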

Paramiko simulate ssh -t option

I instantiate a paramiko channel, then I execute a command and get its output:
# 'transport' is an already-connected paramiko.Transport
channel = transport.open_session()
channel.exec_command('service myservice restart')
stdout = channel.makefile('rb')
for line in stdout:
    print(line.decode(), end='')
However, after the command finishes, iterating over the output blocks.
I tested with ssh:
ssh myhost service myservice restart # terminal gets blocked
ssh -t myhost service myservice restart # OK
So I want to simulate the "-t" option in paramiko. So far I have tried:
channel = transport.open_session()
channel.get_pty()
channel.invoke_shell()
stdin, stdout = channel.makefile('wb'), channel.makefile('rb')
stdin.write('service myservice restart\n')
for line in stdout:
    print(line.decode(), end='')
But now stdout never gets closed, so the for loop never ends.
Any ideas?
It appears that invoke_shell() returns a Channel, and Channels require that you close them explicitly. I would try closing some of the channels you're opening, in particular the one returned by invoke_shell().
Have a look at the script that you're trying to run and see if there are any lines like this:
> /dev/null 2>&1
I'm having the same issue as you; in my case I am trying to remotely run a Bitnami control script. Something in your post jogged my memory and reminded me of the output redirections in the control script (these caused me some major headaches before).
Generally they're used either to ignore errors or to log them somewhere specific. I haven't had a chance to try it yet, but maybe piping them back out at the end of the script would work; or, if you don't care about the response, maybe even manually redirecting some created data out with >&2.
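For what it's worth, a closer analogue of ssh -t in paramiko is to request a PTY on the session channel and still use exec_command, rather than invoke_shell; a minimal sketch, assuming an already-connected transport:
channel = transport.open_session()
channel.get_pty()                                  # allocate a pseudo-terminal, as ssh -t does
channel.exec_command('service myservice restart')  # the channel closes when the command exits
for line in channel.makefile('rb'):
    print(line.decode(), end='')
status = channel.recv_exit_status()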

Python paramiko logging messes up sftp connection

I am using paramiko to open an SFTP connection to access a remote file. All of my code below, which sits inside a function, seems to work only if I don't enable logging for paramiko:
paramiko.util.log_to_file('paramiko.log')
So when I do NOT have the above line of code in my file the code below works:
client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(hostname, username=user, password=password)
sftp = client.open_sftp()
file = sftp.open(fpath, mode='r', bufsize=1)
Otherwise Python hangs on the client.connect(hostname, username=user, password=password) line and writes to stderr like crazy, eventually killing the VM my code is running on.
Specifically paramiko hangs on this line:
t.start_client()
within the client.connect method. Nothing useful comes out in the paramiko log and stderr is filled with errors with no description or tracebacks.
Researching this problem I came across "There is a single import lock available, so when a child thread attempts another import it can block indefinitely". How do I make sure the code opening the SFTP connection is never blocked?
This is a bit of a long shot, but I have had issues with logging's use of threads causing deadlock. I was not able to track the exact problem down (though I suspect it may have been exacerbated by the use of subprocess), but I did solve it by disabling the logging module's thread support.
Try this before you activate logging:
import logging
logging.thread = None
I'd be interested to know if this solves your problem or not.
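For context, the workaround has to happen before logging is activated; a minimal sketch of the ordering, reusing the connection code from the question:
import logging
logging.thread = None                      # undocumented workaround: disable logging's thread support

import paramiko
paramiko.util.log_to_file('paramiko.log')  # only enable paramiko's file logging afterwards

client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(hostname, username=user, password=password)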

How to execute a process remotely using python

I want to connect to and execute a process on a remote server using Python. I want to be able to get the return code and stderr (if any) of the process. Has anyone done anything like this before? I have done it with ssh, but I want to do it from a Python script.
Cheers.
Use the SSH module paramiko, which was created for this purpose, instead of subprocess. Here's an example:
from paramiko import SSHClient

client = SSHClient()
client.load_system_host_keys()
client.connect("hostname", username="user")
stdin, stdout, stderr = client.exec_command('program')
print("stderr:", stderr.readlines())
print("stdout:", stdout.readlines())
UPDATE: This example originally used the ssh module, but that module is now deprecated; paramiko is the up-to-date module that provides SSH functionality in Python.
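Since the question also asks for the return code: with paramiko you can read it from the channel behind stdout once the command has finished; a short sketch:
stdin, stdout, stderr = client.exec_command('program')
exit_code = stdout.channel.recv_exit_status()  # blocks until 'program' finishes
print("exit code:", exit_code)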
Well, you can call ssh from Python...
import subprocess

ret = subprocess.call(["ssh", "user@host", "program"])
# or, with stderr:
prog = subprocess.Popen(["ssh", "user@host", "program"], stderr=subprocess.PIPE)
errdata = prog.communicate()[1]
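On Python 3.5+, subprocess.run collects the return code and stderr in a single call; a minimal sketch of the same idea:
import subprocess

result = subprocess.run(["ssh", "user@host", "program"], stderr=subprocess.PIPE)
print(result.returncode)
print(result.stderr.decode())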
If you want to wrap the nuts and bolts of the ssh calls, you could use Fabric. This library is geared towards deployment and server management, but it can also be useful for this kind of problem.
Also have a look at Celery. It implements a task queue for Python/Django on top of various brokers. It may be overkill for your problem, but if you are going to call more functions on multiple machines it will save you a lot of headache managing connections.
