Capturing the output of a program started via cmd script - python

I am trying to do a complexity analysis of the Stanford Parser. To do so, I am starting the program via a cmd file, so if I use subprocess.check_output, my Python program only gives me back the output of the cmd script itself. The parser prints its own runtime on the command line, so I have to come up with something that reads out what the program I started printed on the command line.
subprocess.check_output("path-to-cmd", shell=True)
tl;dr: this gives me the cmd file's output; I want what the started program printed to the terminal.
As my question was marked as a duplicate: I want the output of a program that I have started via the cmd file. If I use subprocess.check_output, it simply gives me the content of my cmd, not the output of the Java program I have run. I want to capture what the Java program wrote to the terminal.

import subprocess
# Create test file.
with open('test.cmd', 'w') as w:
    w.write(
        '@echo off\n'
        'echo 1 stdout\n'
        '>&2 echo 2 stderr\n'
        '>&3 echo 3 program output\n')
output = subprocess.check_output(
    'test.cmd 1>nul',
    universal_newlines=True,
    shell=True)
print('check_output:', repr(output))
The example gets the program's output from handle 3. The program here is just an echo standing in for a real program; the redirection is the point.
CMD supports redirection handles 0 through 9, as quoted from the SS64 site:
STDIN = 0 Keyboard input
STDOUT = 1 Text output
STDERR = 2 Error text output
UNDEFINED = 3-9
You can redirect the program's output to handle 3 in the batch file. Then you redirect handle 1 to nul, i.e. 1>nul or just >nul, in the Python file. Thus, check_output receives only handle 3 as stdout.
Output:
2 stderr
check_output: '3 program output\n'
The output uses repr() to show it on one line for testing.
There is no 1 stdout line, as handle 1 was redirected to nul.
Stderr will still print out to console as it is not redirected.
You can choose how to handle stderr.
If the Stanford Parser outputs the data as stderr (handle 2)
instead of stdout (handle 1), then you can use 2>&3 on the command in the batch file to redirect it to handle 3, e.g.
2>&3 java -cp stanford-parser.jar ...
I have no experience with the Stanford Parser so the command
example is a guess from online examples from stanford.edu.
If you want all of the output rather than just the program's output, and the program writes to handle 2, then add 2>&1 to the command passed to check_output, or use the recommended argument stderr=subprocess.STDOUT, and omit the 1>nul. This may also include batch file script errors, which may be undesirable.
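A minimal sketch of that variant, with a throwaway test_all.cmd standing in for the real batch file (the file name and its echo lines are assumptions for illustration only):
import subprocess
# Stand-in batch file whose "program" writes to handle 2 (stderr).
with open('test_all.cmd', 'w') as w:
    w.write(
        '@echo off\n'
        'echo 1 stdout\n'
        '>&2 echo 2 stderr\n')
# No 1>nul this time; stderr=subprocess.STDOUT merges handle 2 into
# the captured stdout, so both lines come back from check_output.
output = subprocess.check_output(
    'test_all.cmd',
    stderr=subprocess.STDOUT,
    universal_newlines=True,
    shell=True)
print('check_output:', repr(output))
Both echo lines end up in the single captured string, and nothing extra is printed to the console.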
If possible, rewrite the batch file in Python: you avoid the extra complication and one script keeps all the control.

Related

subprocess.run with stdin input doesn't process

I'm trying to run a command in python:
from subprocess import run, DEVNULL
run(["./rarcrack",'walks.rar'], text=True, input='nano1 nano2', stdout=DEVNULL)
The command doesn't seem to process the stdin, though (it says no more words, whereas in the example below it says successfully cracked).
I decided to do this because I'm under the impression that:
The bash pipe redirects stdout to stdin and
./rarcrack takes an argument from stdin because a command like
echo 'nano1 nano2' | ./rarcrack walks.rar works.
And I don't think I can pass in the words as another argument (I don't know any C).
The program is here
The problem is that you discard any results with stdout=DEVNULL. You only see the error output, not the successes.
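As a minimal sketch, keeping the question's command line but no longer discarding stdout (capture_output needs Python 3.7+; that rarcrack reports its successes on stdout is the assumption made above):
from subprocess import run
# Same call as in the question, minus stdout=DEVNULL, so whatever
# rarcrack prints to stdout is captured rather than thrown away.
result = run(["./rarcrack", "walks.rar"],
             text=True,
             input="nano1 nano2",
             capture_output=True)
print(result.stdout)   # normal output, e.g. a success message
print(result.stderr)   # progress / error output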

Python executing cmd and storing whatever it returns

I am writing a script which executes CMD commands on Windows. I use it to pass commands to a different application. Those commands return some values or errors. How do I force Python/CMD to store whatever the command returns (whether it is a returned value or an error) in a variable, and force it NOT to print it to the console? I tried subprocess and os.system(), and everything I tried lets me store the value, but when the command returns an error, it is still printed to the console rather than stored in a variable.
That is a property of the shell / of cmd and of how you call the process. By default there is one input stream (stdin) and two output streams (stdout and stderr), the latter being the default stream for all errors.
You can redirect either output stream to the other or to a file by calling the script appropriately. See https://support.microsoft.com/en-us/help/110930/redirecting-error-messages-from-command-prompt-stderr-stdout
For example
myscript 1> output.msg 2>&1
will direct everything into output.msg, including errors. Now combine that output redirection with writing to a variable; that is explained in this answer.
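The same capture can be done from Python without the shell redirection syntax; a minimal sketch (dir of a missing file is only a stand-in for a command that fails):
import subprocess
# Capture stdout and stderr together in one variable; nothing reaches
# the console unless you print it yourself.
try:
    output = subprocess.check_output(
        'dir nonexistent_file',
        stderr=subprocess.STDOUT,   # the Python equivalent of 2>&1
        universal_newlines=True,
        shell=True)
except subprocess.CalledProcessError as e:
    output = e.output               # the command failed; its output is here
print(repr(output))
check_output raises CalledProcessError on a non-zero exit code, and the captured text is still available on the exception's output attribute.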
When executing a command in a shell there are two different output streams, stdout and stderr. Usually stdout is used to print normal output and stderr to print errors and warnings.
You can use subprocess.Popen.communicate() to read both stdout and stderr.
import subprocess
p = subprocess.Popen(
    "dir",
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    shell=True
)
(stdout, stderr) = p.communicate()
print(stdout.decode('utf-8'))  # Standard output
print(stderr.decode('utf-8'))  # Standard error
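On Python 3.7+ the same capture can be written more compactly with subprocess.run; a minimal sketch:
import subprocess
# capture_output=True pipes both streams; text=True decodes them to str,
# so no manual .decode('utf-8') is needed.
result = subprocess.run(
    "dir",
    capture_output=True,
    text=True,
    shell=True)
print(result.stdout)   # standard output
print(result.stderr)   # standard error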

getting python script to print to terminal without returning as part of stdout

I'm trying to write a Python script that returns a value which I can then pass in to a bash script. The thing is that I want a single value returned in bash, but I want a few things printed to the terminal along the way.
Here is an example script. Let's call it return5.py:
#! /usr/bin/env python
import sys
print "hi"
sys.stdout.write(str(5))
What I want is for it to behave this way when I run it from the command line:
~:five=`./return5.py`
hi
~:echo $five
5
but what I get is:
~:five=`./return5.py`
~:echo $five
hi 5
In other words, I don't know how to have a Python script print to the screen while keeping that out of stdout, so that only the specific value I want gets assigned.
Not sure why @yorodm suggests not to use stderr. That's the best option I can think of in this case.
Notice that print will add a newline automatically, but when you use sys.stderr.write, you need to include one yourself with a "\n".
#! /usr/bin/env python
import sys
sys.stderr.write("This is an important message,")
sys.stderr.write(" but I dont want it to be considered")
sys.stderr.write(" part of the output. \n")
sys.stderr.write("It will be printed to the screen.\n")
# The following will be output.
print 5
Using this script looks like this:
bash$ five=`./return5.py`
This is an important message, but I dont want it to be considered part of the output.
It will be printed to the screen.
bash$ echo $five
5
This works because the terminal is really showing you three streams of information: stdout, stdin and stderr. The `cmd` syntax says "capture the stdout from this process", but it doesn't affect what happens to stderr. This was designed exactly for the purpose you're using it for -- communicating information about errors, warnings or what's going on inside the process.
You may not have realized that stdin is also displayed in the terminal, because it's just what shows up when you type. But it wouldn't have to be that way. You could imagine typing into the terminal and having nothing show up. In fact, this is exactly what happens when you type in a password. You're still sending data to stdin, but the terminal is not displaying it.
from my comment..
#!/usr/bin/env python
#foo.py
import sys
print "hi"
sys.exit(5)
then the output
[~] ./foo.py
hi
[~] FIVE=$?
[~] echo $FIVE
5
You can use stdout to output your messages and stderr to carry the value you capture in bash. Unfortunately this is somewhat weird behaviour, as stderr is intended for programs to communicate error messages, so I strongly advise against it.
OTOH you can always process your script output in bash

Is there any way to have output piped line-by-line from a currently executing python program?

When piping printed output from a python script to a command like grep, the output from the script seems to only be piped to the follow-up command after completion of the entire script.
For example, in a script test_grep.py like the following:
#!/usr/bin/env python
from time import sleep
print "message1"
sleep(5)
print "message2"
sleep(5)
print "message3"
when called with ./test_grep.py | grep message, nothing will appear for 10 seconds, at which time all three lines will appear.
Compare this to a script test_grep.sh:
#!/usr/bin/env bash
echo "message1"
sleep 5
echo "message2"
sleep 5
echo "message3"
./test_grep.sh | grep message will immediately output message1, followed at 5 second intervals by message2 and message3.
I expect this is because only once the python interpreter finishes executing is the output available for the next command. Is there any way to alter this behavior?
You can do it:
By flushing every print in python
By setting stdout to be unbuffered
By setting stdout to be line-buffered
You can even call python -u to disable buffering.
I would go for the line-buffering option as it seems most natural.
open(file, mode='r', buffering=-1 ....)
buffering is an optional integer used to set the buffering policy.
Pass 0 to switch buffering off (only allowed in binary mode), 1 to
select line buffering (only usable in text mode), and an integer > 1
to indicate the size of a fixed-size chunk buffer.
When you don't specify buffering (the typical "open") it will use line-buffering if it detects the output is going directly to a TTY, i.e. to your screen console. If you pipe output or redirect it to a file it will switch back to a large (4K / 8K) buffer.
How do you "set stdout to be line-buffered"?
You can reopen stdout via sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 1).
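In Python 3 those options look roughly like this; a minimal sketch (print's flush argument needs 3.3+, reconfigure() needs 3.7+):
import sys
import time
# Option 1: flush each print explicitly.
print("message1", flush=True)
time.sleep(5)
# Option 2: switch stdout to line buffering even when it is piped.
sys.stdout.reconfigure(line_buffering=True)
print("message2")
time.sleep(5)
print("message3")
# Option 3: run the script as `python -u script.py` to disable buffering entirely.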

Python - subprocesses and the python shell

I am trying to shell out to a non-Python subprocess and allow it to inherit the stdin and stdout from Python. I am using subprocess.Popen.
This would probably work if I were calling it from a console, but it definitely doesn't work when I am using the Python shell
(I am using IDLE, by the way).
Is there any way to convince Python to allow a non-Python subprocess to print its stdout to the Python shell?
This works both from a script and from the interactive interpreter, but not from IDLE:
subprocess.Popen(whatever, stdin=sys.stdin, stdout=sys.stdout)
You can't use the objects which IDLE assigns to sys.stdin and sys.stdout as arguments to subprocess.Popen. These objects (the interfaces to the IDLE shell window) are file-like, but they're not real file handles with fileno attributes, and Unix-like operating systems require a fileno to be specified as the stdin or stdout for a subprocess. I cannot speak for Windows, but I imagine it has similar requirements.
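For comparison, a minimal sketch of what does work from a real console window (not IDLE), where sys.stdin and sys.stdout wrap genuine OS handles; cmd /c dir is only a stand-in for the non-Python program being launched:
import subprocess
import sys
# From a real console the child shares the same terminal, so its
# output appears directly and it can read from the same keyboard.
p = subprocess.Popen(["cmd", "/c", "dir"],   # stand-in command
                     stdin=sys.stdin,
                     stdout=sys.stdout)
p.wait()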
Taymon's answer addresses your question directly in that IDLE's stdin/stdout are actually file-like objects and not the standard file streams associated with a console/terminal. Moreover, in Windows IDLE runs with pythonw.exe, which doesn't even have an attached win32 console.
That said, if you just need the output from a program to be printed to the user in real time, then in many cases (but not all) you can read the output line by line and echo it accordingly. The following works for me in Windows IDLE. It demonstrates reading from a piped stdout line by line. It also shows what happens if the process buffers the pipe, in which case readline will block until either the buffer is full or the pipe closes. This buffering can be manually disabled with some programs (such as the Python interpreter's -u option), and there are workarounds for Unix such as stdbuf.
test1.py
import sys
import subprocess
def test(cmd):
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stdin=subprocess.PIPE,
                         stderr=subprocess.PIPE)
    it = iter(p.stdout.readline, b'')
    for line in it:
        print(line.rstrip().decode('ascii'))
print('Testing buffered subprocess...')
test([sys.executable, 'test2.py'])
print('\nTesting unbuffered subprocess...')
#-u: unbuffered binary stdout and stderr
test([sys.executable, '-u', 'test2.py'])
test2.py:
import time
for i in range(5):
    print(i)
    time.sleep(1)
The output in IDLE should be the following, with the first set of digits printed all at once after a delay and the second set printed line by line.
Testing buffered subprocess...
0
1
2
3
4
Testing unbuffered subprocess...
0
1
2
3
4
