SSH to remote server - and write results to local server - python

I want to run a command from my local server against a remote appliance and write the results to a local file instead of printing them to my screen. I can see examples using paramiko, but I am having issues installing it for Python 3, which is what I prefer to use, so I am trying subprocess. The unique thing is that this remote appliance accepts only a limited set of commands; I literally have to run a 'show' command on the appliance, so there is nothing to SCP, hence the reason I did not use SCP.
This will write it to my screen, but that does not do me much good :(
prog = subprocess.Popen(["ssh", "user@mysystem.com", "show my_secret_file"], stderr=subprocess.PIPE)
errdata = prog.communicate()[1]
Is this possible?

Assuming your appliance actually writes its output to stdout, that output will be returned by prog.communicate(), as long as you asked for stdout in Popen().
You can then save the returned stdout to a file using the standard file IO functions.
In other words, here's how it would work:
import subprocess

# Call subprocess and capture stdout and stderr
prog = subprocess.Popen(["ssh", "user@mysystem.com", "show my_secret_file"],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
#                       ^ Add this bit
out, err = prog.communicate()

# Do your error handling here...
# ...

# Now write to file
with open("Put your file name here", "w") as writefile:
    writefile.write(out.decode("utf-8"))
Note that the above assumes stdout is in text mode. If it is in binary mode, you may have to do some str/bytes conversion, or open the file in a different mode.
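On Python 3.5+, the same capture-and-save pattern can be written more concisely with subprocess.run. A minimal sketch (the host name and the appliance's show command below are placeholders from the question, not real endpoints):

```python
import subprocess

def capture_to_file(cmd, path):
    """Run cmd, capture its stdout, and write it to a local file (Python 3.5+)."""
    result = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    # result.stdout is bytes; write it out unmodified.
    with open(path, "wb") as f:
        f.write(result.stdout)
    return result.returncode

# For the appliance in the question this would be, for example:
# capture_to_file(["ssh", "user@mysystem.com", "show my_secret_file"],
#                 "my_secret_file.txt")
```

Writing in binary mode ("wb") sidesteps the str/bytes conversion entirely, since whatever the appliance emits is saved byte-for-byte.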

Related

How to flush STDOUT in python CGI file hosted on PCF?

Due to Apache gateway timeouts, and a desire to display more information to the end user, I'd like to be able to flush STDOUT on a python CGI script hosted on PCF, essentially giving updates on the status of the script.
I have tried enabling the -u flag in Python (#!/usr/python -u at the head of my script), calling sys.stdout.flush(), and even using subprocess.call to execute a Perl script that flushes STDOUT and prints some progress text ($| = 1; at the beginning of the Perl script). Furthermore, I've double-checked that I'm not using any Apache modules that would require buffering (no mod_deflate, for example). Finally, I'll mention that executing a standard Perl CGI rather than a Python CGI does allow the STDOUT flushing, so I figure it must be something in my Python/Apache/PCF configuration.
I'm fresh out of ideas here, and would like some advice.
With any of the above methods, I would have thought stdout would flush. But none of them have worked!
Thanks in advance for any assistance.
You can disable buffering using something like this in your Python 2 code:
import os
import sys

# set stdout as non-buffered
if hasattr(sys.stdout, 'fileno'):
    fileno = sys.stdout.fileno()
    tmp_fd = os.dup(fileno)
    sys.stdout.close()
    os.dup2(tmp_fd, fileno)
    os.close(tmp_fd)
    sys.stdout = os.fdopen(fileno, "w", 0)
That reopens sys.stdout with no buffering (the 0 as the third argument; this works on Python 2 only, as Python 3 forbids unbuffered text streams). After you do that, anything written to sys.stdout should not be buffered.
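On Python 3, the simpler route for CGI progress output is to flush explicitly after each status line rather than reopening the stream. A small sketch of that idea (the report helper is a made-up name for illustration):

```python
import sys

def report(msg, stream=sys.stdout):
    # Write one status line and flush immediately, so the gateway sees
    # progress right away instead of waiting for the buffer to fill.
    stream.write(msg + "\n")
    stream.flush()

report("Status: step 1 done")
# print("Status: step 1 done", flush=True) is equivalent on Python 3.3+.
```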

Suppress output of subprocess

I want to use the subprocess module to control some processes spawned via ssh.
By searching and testing I found that this works:
import subprocess
import os
import time

node = 'guest@localhost'
my_cmd = ['sleep', '1000']
devnull = open(os.devnull, 'wb')
cmd = ['ssh', '-t', '-t', node] + my_cmd
p = subprocess.Popen(cmd, stderr=devnull, stdout=devnull)
while True:
    time.sleep(1)
    print 'Normal output'
The -t -t option I provide allows me to terminate the remote process instead of just the ssh command. But this also scrambles my program's output: newlines are no longer effective, making it one long, hard-to-read string.
How can I stop ssh from affecting the formatting of the Python program's output?
Sample output:
guest:~$ python2 test.py
Normal output
Normal output
Normal output
Normal output
Normal output
Normal output
Normal output
(First ctrl-c)
Normal output
Normal output
Normal output
(Second ctrl-c)
^CTraceback (most recent call last):
  File "test.py", line 13, in <module>
    time.sleep(1)
KeyboardInterrupt
OK, the output is now clear. I do not know exactly why, but ssh -t -t puts the local terminal in raw mode. It makes sense anyway, because it is intended to let you use curses programs (such as vi) directly on the remote, and in that case no conversion should be done, not even the simple \n -> \r\n translation that makes a newline leave the cursor in the first column. But I could not find a reference on this in the ssh documentation.
The -t -t option lets you kill the remote process because raw mode allows Ctrl+C to be sent to the remote instead of being processed by the local tty driver.
IMHO, this is a design smell, because you are relying on one side effect of pty allocation (passing Ctrl+C to the remote) while suffering from another side effect (raw mode on the local system). You should instead process standard input (stdin=subprocess.PIPE) and explicitly send chr(3) when you type the special character on the local keyboard, or install a signal handler for SIGINT that does it.
Alternatively, as a workaround, you can simply run something like os.system("stty opost -igncr") (or better, its subprocess equivalent) after starting the remote command to reset the local terminal to an acceptable mode.
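A minimal sketch of that workaround using subprocess instead of os.system, assuming a POSIX system with stty on the PATH:

```python
import subprocess

def reset_local_tty():
    # Re-enable output post-processing (the \n -> \r\n translation) on
    # the local terminal after ssh -t -t has left it in raw mode.
    # Returns stty's exit status (nonzero when stdin is not a terminal).
    return subprocess.call(["stty", "opost", "-igncr"],
                           stderr=subprocess.DEVNULL)
```

Call reset_local_tty() right after starting the remote command; it only touches the local terminal settings, not the remote session.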

Simple Python Script not Executing Properly

The code is as follows:
fh = tempfile.NamedTemporaryFile(delete=False, suffix='.py')
stream = io.open(fh.name, 'w', newline='\r\n')
stream.write(unicode(script))
stream.flush()
stream.close()
proc = subprocess.Popen(
    [path, fh.name],
    shell=True,
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
proc.stdin.close()
proc.stderr.close()
out = proc.stdout.readline()
print out
script is a string which contains the subprocess code, in this case a simple hello world. Since it has unix file endings, I had to use io.open in order to write it properly for windows. path is the path to the python.exe on my machine. The file is generated and looks fine in notepad:
def main():
    print 'hello world'
However, when I run the program, the subprocess executes and does nothing.
It's not a problem with the executable path; I've tested that with other programs, so it must be either the temp file itself or the text within it. delete is set to False in order to check the contents of the file for debugging. Is there anything glaringly wrong with this code? I'm a bit new to using Popen.
The main issue in your program is that when you specify shell=True, you need to provide the entire command as a single string, not a list.
Given that, there is really no need for you to use shell=True here. Also, unless absolutely necessary, you should not use shell=True; it is a security hazard. This is stated in the documentation as well:
Executing shell commands that incorporate unsanitized input from an untrusted source makes a program vulnerable to shell injection, a serious security flaw which can result in arbitrary command execution. For this reason, the use of shell=True is strongly discouraged in cases where the command string is constructed from external input.
Also, since you close stdin and stderr as soon as you start the process, there is no need to use PIPE for them.
Example -
fh = tempfile.NamedTemporaryFile(delete=False, suffix='.py')
stream = io.open(fh.name, 'w', newline='\r\n')
stream.write(unicode(script))
stream.flush()
stream.close()
proc = subprocess.Popen(
    [path, fh.name],
    stdout=subprocess.PIPE,
)
out = proc.stdout.readline()
print out
Also, the script -
def main():
    print 'hello world'
would not work, since you need to call main() for it to run.
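A sketch of the corrected generated script, with the missing call added (print with parentheses so it runs under both Python 2 and 3):

```python
# The generated script defines main() but never calls it; adding the
# call (guarded, so the file can also be imported) makes it actually run.
def main():
    print('hello world')

if __name__ == '__main__':
    main()
```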

Popen hangs on communicate

I am writing a program to communicate to two programs:
from subprocess import Popen, PIPE
import select
import shlex

output = Popen(shlex.split(query_cmd), stdout=PIPE, stdin=None)
cmd_out = [output.stdout]
while cmd_out:
    readable, writeready, exceptready = select.select(cmd_out, [], [], timeout)
    for f in readable:
        line = f.readline()
        snap_result = Popen(shlex.split("snap %s" % (line,)),
                            stdout=PIPE, close_fds=True).communicate()[0]
        print snap_result
Supposedly query_cmd will continuously generate lines of results; snap should then use each line as its argument, return results, and terminate. This works on Python 2.4; however, on Python 2.6.6, it seems that snap hangs on reading the result.
If I change query_cmd to "tail -f file", it seems to work too.
I am running this inside a csh script where both stdout/stderr are redirected to a log file.
EDIT: Actually, it is weird: in csh I redirected both stdout and stderr to a log file. If I redirect only stdout, it runs fine; if I redirect stderr, it hangs. I think somehow stderr is getting mixed up between the parent Python process and the child process.
It turned out not to be a problem with the script itself: the subprocess was expecting stdin input. Redirecting its stdin to the null device solves this.
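A sketch of that fix, redirecting the child's stdin to the null device so it never blocks waiting for input (cat stands in here for the snap command, which is not runnable outside the asker's environment):

```python
import os
import subprocess

# Hand the null device to the child as stdin; a subprocess that tries
# to read input then sees end-of-file instead of blocking forever.
with open(os.devnull, "rb") as devnull:
    proc = subprocess.Popen(["cat"], stdin=devnull,
                            stdout=subprocess.PIPE)
    out, _ = proc.communicate()
```

On Python 3.3+ you can pass stdin=subprocess.DEVNULL instead of opening os.devnull yourself.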

Directly connect system call output to logger in Python

I'm writing some code which involves running a few shell commands from Python and ideally, I would like to integrate the output from these commands into the logger that I'm using. I know I can divert stdout into a file / socket as follows:
call( '<a-shell-cmd>', shell=True, stdout=myFile )
but I'd rather not have the bind of opening a temporary file, looping over it to write the output, closing it, deleting it, and so on. If there's any way to send the output directly to the logger, that would seem a lot neater to me. Any ideas?
Use the subprocess module.
Tip: you can go to the documentation for a particular version of python via http://docs.python.org/release/<major>.<minor>/
From Python 2.7 and above:
output = subprocess.check_output(["command", "arg1"])
(Do not combine shell=True with a list of arguments: with shell=True, only the first list element is treated as the command.)
In Python 2.4:
process = subprocess.Popen(["command", "arg1"], stdout=subprocess.PIPE)
stdout, stderr = process.communicate()
# not shown: how to use Popen.poll() to wait for process death
# while filling an output buffer
print stdout
Below Python 2.4:
output = os.popen('ls')
You can then log output or do it directly when calling the above.
