writing to stdin, access denied - python

I'm trying to write a Python script that starts a subprocess and writes to the subprocess's stdin, does some tests on the output, and then writes more commands to stdin.
I have tried:
import subprocess

def get_band():
    print("band")
    p = subprocess.Popen(["/path/to/program", "-c", "-"],
                         stdout=subprocess.PIPE, stdin=subprocess.PIPE,
                         stderr=subprocess.PIPE,
                         universal_newlines=True)  # text mode, so communicate() accepts str
    ran_stdout = p.communicate(input='show status')[0]
    print(ran_stdout)
However the print statement gives:
Unable to connect at 127.0.0.1, Connection refused.
I was wondering if I am doing this right? Here is the documentation for the program I'm trying to run. I want to use the last option.
Running the tool from the Linux shell allows additional options, depending on the options given to the command. The options are as follows:
-h Displays help about the command
-c <Filename> Instead of taking typed commands interactively from a user the commands are read from the named file, i.e. in batch mode. When all commands are processed the CLI session ends automatically.
-c - As above but reads command from Linux stdin. This allows commands to be ‘piped’ to the program.

If you could tell us more about that program, maybe someone who knows it could explain better how it works in particular.
Nevertheless, what you describe
starts a subprocess and writes to the subprocess's stdin, does some tests on the output, and then writes more commands to stdin
does not match your code.
Your code prints something to your own stdout, displaying band, and then does a "one-shot" communication with the subprocess.
To be clear about that, p.communicate() writes all it gets to the subprocess, closes its stdin and reads out whatever it gets from stdout and stderr.
Thus it is incompatible with what you desire: write, read, write again.
So you'll have to craft that on your own.
If the chunks you write are small enough to be guaranteed to fit into the pipe buffer, it is simple: just write the commands (don't forget the trailing \n) and read.
But be aware! Don't read more than you really have, or your reading might block.
Thus, work with non-blocking IO or with select.select().
If you need more information about one or the other, there are other answers here on SO which cover these subjects. The other day I wrote one which might help you.
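For illustration, here is a minimal sketch of the select.select() variant (POSIX only; the program path, the command and the quiet-timeout are placeholders, and treating a quiet pipe as "end of response" is only a heuristic):

import os
import select
import subprocess

p = subprocess.Popen(["/path/to/program", "-c", "-"],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)

def send_and_read(cmd, timeout=1.0):
    p.stdin.write(cmd.encode() + b"\n")  # don't forget the trailing newline
    p.stdin.flush()
    chunks = []
    # only read while select() says data is ready, so we never block
    while select.select([p.stdout], [], [], timeout)[0]:
        data = os.read(p.stdout.fileno(), 4096)
        if not data:  # EOF: the program closed its stdout
            break
        chunks.append(data)
    return b"".join(chunks)

print(send_and_read("show status"))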

This worked, for some reason: passing the command on the same line, then calling this function for every command I want to run.
def run_command(cmd_here):
    p = subprocess.Popen(["/path/to/program", "-c", "-", cmd_here],
                         stdout=subprocess.PIPE)
    # communicate() waits for the process to exit; stderr isn't piped,
    # so the second element of the returned tuple is None
    proc_stdout, _ = p.communicate()
    # print stuff
    return proc_stdout


Send and receive data multiple times to subprocess (Python)

Issue
I am communicating with a terminal application (xfoil) and I want to isolate the stdout corresponding to each stdin.
This question is also more general, as I wish to know why I can't open an application with subprocess and then use its stdin and stdout successively (or rather, how I could do it).
What I can do now
As of now, I can send instructions to Xfoil using process.communicate, which retrieves the entire stdout.
import subprocess
xfoil = subprocess.Popen('path_to_xfoil.exe', stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
output, _ = xfoil.communicate(input=instructions)
What I want to achieve
Instead of having to deal with the entire stdout, I wish to isolate each set of instructions (stdin) and results (stdout).
Something along the lines of:
output1 = process.communicate(input=instructions1)
output2 = process.communicate(input=instructions2)
output3 = process.communicate(input=instructions3)
...
I need the process to stay open (which is not the case with communicate).
What I have attempted
Communicate multiple times with a process without breaking the pipe? is probably the way to go; however, it does not explain clearly how to read the output, and the following piece of code simply freezes, probably because I have no idea when read should stop.
xfoil.stdin.write(instructions1)
xfoil.stdout.read()  # never passes this line: read() waits for EOF, which only arrives when the process exits
xfoil.stdin.write(instructions2)
xfoil.stdout.read()
Non-blocking read on a subprocess.PIPE in python seemed a good path as well; however, it only takes care of output.
Or perhaps I need to use the os module, as in ipc - communicate multiple times with a subprocess in Python?
Thank you for your help
PS: I read a tiny bit about fcntl but I need the code to work on both Linux and Windows.
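For reference, one pattern that works on both Linux and Windows (based on the reader-thread idea from the non-blocking-read question linked above) is to drain stdout into a queue from a background thread. This is only a sketch: the executable path and instruction bytes are placeholders, and splitting outputs on a quiet-timeout is a heuristic, not an exact delimiter.

import queue
import subprocess
import threading

def _drain(pipe, q):
    # runs in a background thread; blocking on readline() is harmless here
    for line in iter(pipe.readline, b''):
        q.put(line)
    pipe.close()

xfoil = subprocess.Popen(['path_to_xfoil.exe'],
                         stdin=subprocess.PIPE, stdout=subprocess.PIPE)
out_q = queue.Queue()
threading.Thread(target=_drain, args=(xfoil.stdout, out_q), daemon=True).start()

def send(instructions, timeout=0.5):
    """Write one batch of instructions (bytes, newline-terminated) and
    collect output until the queue stays quiet for `timeout` seconds."""
    xfoil.stdin.write(instructions)
    xfoil.stdin.flush()
    chunks = []
    while True:
        try:
            chunks.append(out_q.get(timeout=timeout))
        except queue.Empty:
            break
    return b''.join(chunks)

output1 = send(b'instructions1\n')
output2 = send(b'instructions2\n')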

Python subprocess Log and Display in Shell Issues

I have a python script where I'm running an external archive command with subprocess.Popen(). Then I'm piping stdout to a sys write and a log file (see code below), because I need to print and log the output. The external command outputs progress like "Writing Frame 1 of 1,000", which I would like in my log.
So far I can either have it display/write in large blocks by including "stdout=subprocess.PIPE, stderr=subprocess.PIPE", but then the user thinks the script isn't working; or I pass only "stdout=subprocess.PIPE", and then the progress lines ("Writing Frame...") aren't in the log file.
Any thoughts?
My script looks something like this:
import subprocess
import sys

archive_log = open('archive.log', 'w')
archive_log.write('Archive Begin\n')
# archive command
process_archive = subprocess.Popen(["external_command", "-v", "-d"],
                                   stdout=subprocess.PIPE, stderr=subprocess.PIPE)
for line in process_archive.stdout:
    sys.stdout.write(line)
    archive_log.write(line)
archive_log.write('Archive End\n')
archive_log.close()
It sounds like you're just trying to merge the subprocess's stdout and stderr into a single pipe. To do that, as the docs explain, you just pass stderr=subprocess.STDOUT.
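A minimal sketch of that merged-pipe approach, adapted to the script from the question (the command and log file name are taken from there):

import subprocess
import sys

archive_log = open('archive.log', 'w')
proc = subprocess.Popen(["external_command", "-v", "-d"],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT,  # fold stderr into stdout
                        universal_newlines=True)
for line in proc.stdout:
    sys.stdout.write(line)   # show progress to the user immediately
    archive_log.write(line)  # ...and log the same line
proc.wait()
archive_log.close()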
If, on the other hand, you want to read from both pipes independently, without blocking on either one of them, then you need some explicit asynchronicity.
One way to do this is to just create two threads, one of them blocking on proc.stdout, the other on proc.stderr, then just have the main thread join both threads. (You probably want a lock inside the for body in each thread; that's the only way to make sure that lines are written atomically and in the same order on stdout and in the file.)
Alternatively, many reactor-type async I/O libraries, including the stdlib's own asyncio (if you're on 3.4+) and major third-party libs like Twisted can be used to multiplex multiple subprocess pipes.
Finally, at least if you're on Unix, if you understand all the details, you may be able to do it with just select or selectors. (If this doesn't make you say, "Aha, I know how to do it, I just have a question about one of the details", ignore this idea and use one of the other two.)
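For completeness, a sketch of the two-thread alternative described above, assuming you really do need stdout and stderr kept separate; the lock keeps each line's pair of writes atomic:

import subprocess
import sys
import threading

proc = subprocess.Popen(["external_command", "-v", "-d"],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        universal_newlines=True)
lock = threading.Lock()

def pump(pipe, log):
    for line in pipe:
        with lock:  # write to screen and log atomically, in arrival order
            sys.stdout.write(line)
            log.write(line)

with open('archive.log', 'w') as log:
    threads = [threading.Thread(target=pump, args=(p, log))
               for p in (proc.stdout, proc.stderr)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    proc.wait()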
It's clear that you really do need stderr here. From your question:
or I pass only "stdout=subprocess.PIPE", and then the progress lines ("Writing Frame...") aren't in the log file.
That means the subprocess is writing those messages to stderr, not stdout. So when you don't capture stderr, it just passes through to the terminal, rather than being captured and written to both the terminal and the log by your code.
And it's clear that you really do need them either merged or handled asynchronously:
I can either have it display/write in large blocks by including "stdout=subprocess.PIPE, stderr=subprocess.PIPE", but then the user thinks the script isn't working.
The reason the user thinks the script isn't working is that, although you haven't shown us the code that does this, clearly you're looping on stdout and then on stderr. This means the progress messages won't show up until stdout is done, so the user will think the script isn't working.
Is there a reason you aren't using check_call and the syslog module to do this?
You might also want to use with like this:
with open('archive.log', 'w') as archive:
    # do stuff
You gain the benefit of the file being closed automatically.

subprocess stdin PIPE does not return until program terminates

I have been trying to troubleshoot subprocess.PIPE with subprocesses, with no luck.
I'm trying to pass commands to an always running process and receive the results without having to close/open the process each time.
Here is the main launching code:
launcher.py:
import subprocess
import time

command = ['python', 'listener.py']
process = subprocess.Popen(
    command, bufsize=0,
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT
)
# simulates sending a new command every 10 seconds
for x in range(1, 10):
    process.stdin.write(b'print\r\n')
    process.stdin.flush()
    time.sleep(10)
listener.py:
import sys

file = open('log.txt', 'w+')
while True:
    file.write(sys.stdin.read(1))
file.close()  # note: never reached while the loop runs
This is simplified to show the relevant pieces. In the end I'll have threads listening on stdout and stderr, but for now I'm trying to troubleshoot the basics.
What I expect to happen: on each pass through the loop in launcher.py, the file.write() in listener.py fires.
What happens instead: everything is written when the loop finishes and the program terminates, or when I SIGTERM / Ctrl-C the script.
I'm running this on Windows 8 with Python 3.4.
It's almost as if stdin buffers until the process closes and then passes everything through. I have bufsize=0 set, and I'm flushing, so that doesn't make sense to me. I thought either one or the other would be sufficient.
The listener is a separate process, so the sleep in launcher should have no impact on it.
Does anyone have any ideas why this is blocking?
Update 1:
The same behaviour is also seen with the following code, run from the console (python.exe stdinreader.py).
That is, when you type into the console while the program is running, nothing is written to the file.
stdinreader.py:
import sys
import os

file = open('log.txt', 'w+b')
while True:
    file.write(sys.stdin.read(1))
file.close()
Adding a file.flush() just before file.write() solves this, but that doesn't help me with the subprocess, because I have no control over how the subprocess flushes (that happens on the far side of my subprocess.PIPE). Maybe if I reinitialize that pipe with open('wb') it will not buffer. I will try.
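A sketch of that fix, assuming Python 3 (sys.stdin.buffer is used so the bytes match the binary file mode); in a loop it makes no practical difference whether the flush sits just before or just after the write:

import sys

f = open('log.txt', 'w+b')
while True:
    data = sys.stdin.buffer.read(1)  # raw bytes to match mode 'w+b'
    if not data:                     # EOF on stdin
        break
    f.write(data)
    f.flush()  # write through on every byte so log.txt stays current
f.close()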
Update 2:
I seem to have isolated the problem: the subprocess being called is not flushing after it writes to stdout.
Is there anything I can do to force a flush on the stdout pipe between parent and child without modifying the subprocess? The subprocess is magick.exe (ImageMagick 7) running with args ['-script', '-']. From the point of view of the subprocess, it has a stdout object of <_io.TextIOWrapper name='<stdout>' mode='w' encoding='cp1252'>. I guess the subprocess just opens the default stdout objects on initialization and we can't really control whether they buffer or not.
The strange thing is that passing the child the normal sys.stdout object instead of subprocess.PIPE does not require the subprocess to .flush() after write.
Programs run differently depending on whether they are attached to a console or to a pipe. If a console (a Python process can check with sys.stdin.isatty()), stdout data is line buffered and you see data promptly. If a pipe, stdout data is block buffered and you only see data when quite a bit has piled up or the program flushes the pipe.
When you want to grab program output, you have to use a pipe, and then the program runs in block-buffered mode. On Linux, you can trick programs by creating a fake console (pseudo-tty, or pty). The pty module, pexpect and others do that.
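A Linux-only sketch of that pty trick: give the child a pseudo-terminal so it believes it is writing to a console and line-buffers accordingly ('some_chatty_program' is a placeholder):

import os
import pty
import subprocess

master, slave = pty.openpty()
proc = subprocess.Popen(['some_chatty_program'],
                        stdout=slave, stderr=slave, close_fds=True)
os.close(slave)  # the parent keeps only the master end
try:
    while True:
        data = os.read(master, 1024)  # arrives promptly, line by line
        if not data:
            break
        print(data.decode(errors='replace'), end='')
except OSError:  # some platforms raise EIO at child exit instead of EOF
    pass
proc.wait()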
On Windows, I don't know of any way to get it to work. If you control the program being run, have it flush often. Otherwise, glare futilely at the Windows logo. You can even mention the problem on your next blind date if you want it to end early. But I can't think of anything more.
(if somebody knows of a fix, I'd like to hear it. I've seen some code that tries to open a Windows console and screen scrape it, but those solutions keep losing data. It should work if there is a loopback char device out there somewhere).
The problem was that the subprocess being called was not flushing after writing to stdout. Thanks to J.F. and tdelaney for pointing me in the right direction. I have raised this with the developer here: http://www.imagemagick.org/discourse-server/viewtopic.php?f=2&t=26276&p=115545#p115545
There doesn't appear to be a workaround for this on Windows other than altering the subprocess's source. Perhaps redirecting the output of the subprocess to a NamedTemporaryFile would work, but I have not tested it, and I think the file would be locked on Windows so that only one of the parent and child could open it at once. Not insurmountable, but annoying. There might also be a way to exec the application through the unixutils port of stdbuf or something similar, as J.F. suggested here: Python C program subprocess hangs at "for line in iter"
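On POSIX systems the stdbuf idea looks roughly like this ('some_program' is a placeholder, and it only helps for children that use C stdio buffering):

import subprocess

# stdbuf -oL forces the child's stdout to line buffering
p = subprocess.Popen(['stdbuf', '-oL', 'some_program'],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)
for line in p.stdout:
    print(line.decode(errors='replace'), end='')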
If you have access to the source code of the subprocess you're calling, you can always recompile it with buffering disabled. It's simple to disable buffering on stdout in C:
setbuf(stdout, NULL);
or set per-line buffering instead of block buffering:
setvbuf(stdout, (char *) NULL, _IOLBF, 0);
See also: Python C program subprocess hangs at "for line in iter"
Hope this helps someone else down the road.
Can you try to close the pipe at the end of listener.py? I think that is the issue.

Executing multiple commands using Popen.stdin

I'd like to execute multiple commands in a standalone application launched from a Python script, using pipes. The only way I could reliably pass the commands to the program's stdin was using Popen.communicate, but it closes the program after the command gets executed. If I use Popen.stdin.write, the command executes only 1 time out of 5 or so; it does not work reliably. What am I doing wrong?
To elaborate a bit:
I have an application that listens to stdin for commands and executes them line by line.
I'd like to be able to run the application and pass various commands to it, based on the user's interaction with a GUI.
This is a simple test example:
from subprocess import Popen, PIPE
command = "anApplication"
process = Popen(command, shell=False, stderr=None, stdin=PIPE)
process.stdin.write("doSomething1\n")
process.stdin.flush()
process.stdin.write("doSomething2\n")
process.stdin.flush()
I'd expect to see the result of both commands, but I don't get any response. (If I execute one of the Popen.stdin.write lines multiple times, it occasionally works.)
And if I execute:
process.communicate("doSomething1")
it works perfectly but the application terminates.
If I understand your problem correctly, you want to interact (i.e. send commands and read the responses) with a console application.
If so, you may want to check an Expect-like library, like pexpect for Python: http://pexpect.sourceforge.net
It will make your life easier, because it will take care of synchronization, the problem that ddaa also describes. See also:
http://www.noah.org/wiki/Pexpect#Q:_Why_not_just_use_a_pipe_.28popen.28.29.29.3F
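A minimal sketch of the pexpect approach (POSIX only, pexpect 4+ for the encoding argument); 'anApplication' and the prompt pattern '> ' are placeholders for your program and whatever prompt it prints:

import pexpect

child = pexpect.spawn('anApplication', encoding='utf-8')
child.expect('> ')           # wait for the initial prompt
child.sendline('doSomething1')
child.expect('> ')           # returns once the command's output is complete
print(child.before)          # everything the program printed before the prompt
child.sendline('doSomething2')
child.expect('> ')
print(child.before)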
The real issue here is whether the application is buffering its output, and if it is whether there's anything you can do to stop it. Presumably when the user generates a command and clicks a button on your GUI you want to see the output from that command before you require the user to enter the next.
Unfortunately, there's nothing you can do on the parent side of subprocess.Popen to ensure that, once you have passed the application a command, the application flushes all of its output to its final destination. You can call flush() on your end all you like, but if the application doesn't do the same, and you can't make it, then you are doomed to looking for workarounds.
Your code in the question should work as is. If it doesn't, then either your actual code is different (e.g., you might use stdout=PIPE, which may change the child's buffering behavior), or it might indicate a bug in the child application itself, such as the read-ahead bug in Python 2: your input is sent correctly by the parent process, but it gets stuck in the child's internal input buffer.
The following works on my Ubuntu machine:
#!/usr/bin/env python
import time
from subprocess import Popen, PIPE

LINE_BUFFERED = 1

# NOTE: the first argument is a list
p = Popen(['cat'], bufsize=LINE_BUFFERED, stdin=PIPE,
          universal_newlines=True)
with p.stdin:
    for cmd in ["doSomething1\n", "doSomethingElse\n"]:
        time.sleep(1)  # a delay to see that the commands appear one by one
        p.stdin.write(cmd)
        p.stdin.flush()  # use explicit flush() to work around
                         # buffering bugs on some Python versions
rc = p.wait()
It sounds like your application is treating input from a pipe in a strange way. This means it won't get all of the commands you send until you close the pipe.
So the approach I would suggest is just to do this:
process.stdin.write("command1\n")
process.stdin.write("command2\n")
process.stdin.write("command3\n")
process.stdin.close()
It doesn't sound like your Python program is reading output from the application, so it shouldn't matter if you send the commands all at once like that.

Logging output of external program with (wx)python

I'm writing a GUI for using the Oracle exp/imp commands and starting SQL scripts through sqlplus. The subprocess class makes it easy to launch the commands, but I need some additional functionality. I want to get rid of the command prompt when using my wxPython GUI, but I still need a way to show the output of the exp/imp commands.
I already tried these two methods:
command = "exp userid=user/pwd#nsn file=dump.dmp"
process = subprocess.Popen(command, stdout=subprocess.PIPE)
output = process.communicate()[0]
process = subprocess.Popen(command, stdout=subprocess.PIPE)
process.wait()
output = process.stdout.read()
Through one of these methods (I forget which one) I really got the output of exp/imp, but only after the command finished, which is quite worthless to me, as I need frequent updates during these potentially long-running operations. And sqlplus caused even more problems, as sqlplus usually wants some input when an error occurs. When this happens, Python waits for the process to finish, but the user can't see the prompt, so you don't know how long to wait or what to do...
What I'd like to have is a wrapper that outputs everything I can see on the standard commandline. I want to log this to a file and show it inside a wxPython control.
I also tried the code from this page: http://code.activestate.com/recipes/440554/
but this can't read the output either.
The OutputWrapper from this answer doesn't work either: How can I capture all exceptions from a wxPython application?
Any help would be appreciated!
EDIT:
The subprocesses don't seem to flush their output. I already tried it with .readline().
My tool has to run on Windows and Unix, so pexpect is no solution if there's no Windows version. And using cx_Oracle would be extreme overkill, as I would have to rebuild the whole functionality of exp, imp and sqlplus.
The solution is to use a list for your command:
command = ["exp", "userid=user/pwd#nsn", "file=dump.dmp"]
process = subprocess.Popen(command, stdout=subprocess.PIPE)
then you read process.stdout on a line-by-line basis:
line = process.stdout.readline()
That way you can update the GUI without waiting, if the subprocess you are running (exp) flushes its output. It is possible that the output is buffered, in which case you won't see anything until the output buffer is full. If that is the case, then you are probably out of luck.
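A fuller sketch of that loop, assuming Python 3 (the pipe yields bytes); update_gui is a placeholder for however you push text into your wxPython control, e.g. via wx.CallAfter from a worker thread, and exp.log is just an example file name:

with open('exp.log', 'w') as log:
    for line in iter(process.stdout.readline, b''):
        text = line.decode(errors='replace')
        log.write(text)   # log every line as it arrives...
        update_gui(text)  # ...and show it in the GUI control
    process.wait()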
If you're on Linux, check out pexpect. It does exactly what you want.
If you need to work on Windows, maybe you should bite the bullet and use Python bindings to Oracle, such as cx_Oracle, instead of running command-line tools via subprocess.
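If you do go the cx_Oracle route, a minimal sketch looks like this (the connect string mirrors the placeholder credentials used earlier):

import cx_Oracle

conn = cx_Oracle.connect('user/pwd@nsn')  # placeholder credentials/DSN
cur = conn.cursor()
cur.execute('SELECT sysdate FROM dual')   # trivial sanity query
print(cur.fetchone())
conn.close()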
Are these solutions able to capture stderr as well? I see you have the stdout= option above. How do you make sure to get stderr too? Another question: is there a way to use import logging / import logging.handlers to capture the command's stdout/stderr? It would be interesting to be able to use the logger with its built-in formatters, rotators, etc.
Try this:
import subprocess

command = "ping google.com"
process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE,
                           universal_newlines=True)
for line in process.stdout:
    print(line, end='')  # stream each line as it arrives
