Python's subprocess with "1>&2" and stderr=STDOUT

I have this code from https://pymotw.com/2/subprocess/
I'm not sure how to interpret it: in the command passed to check_output, 1>&2 redirects output to stderr, but in the parameters, stderr is sent back to stdout with stderr=subprocess.STDOUT.
import subprocess

output = subprocess.check_output(
    'echo to stdout; echo to stderr 1>&2; exit 1',
    shell=True,
    stderr=subprocess.STDOUT,
)
print "*****************"
print 'Have %d bytes in output' % len(output)
print output
Running the code, the print statements are not executed, meaning nothing is captured.
What is this code trying to accomplish?
EDIT
Based on the answer and comments, I ran this code:
try:
    output = subprocess.check_output(
        'echo to stdout; echo to stderr 1>&2; exit 1',
        shell=True,  # No such file or directory error without, maybe 1>&2 requires shell=True
        stderr=subprocess.STDOUT,
    )
except subprocess.CalledProcessError as e:
    print "*****************"
    print 'Have %d bytes in output' % len(e.output)
    print e.output
and got this output:
*****************
Have 20 bytes in output
to stdout
to stderr
However, when I commented out the stderr=subprocess.STDOUT line, I got this instead:
to stderr
*****************
Have 10 bytes in output
to stdout
EDIT2
I tested further with the stderred library (https://github.com/sickill/stderred), which helps a shell show characters written to stderr in red.
When I execute this code (with stderr=subprocess.STDOUT commented out), I see to stderr in BLACK, which implies it went through stdout.
output = subprocess.check_output(
    'echo to stdout; echo to stderr 1>&2; exit 1',
    shell=True,
    #stderr=subprocess.STDOUT,
)
From this, I guess (correct me if I'm wrong) that Python's check_output sends the redirected stderr data out through stdout, while printing the error message itself (the traceback) to stderr.

The 1>&2 shell code applies only to the (echo) command it appears on. It tells the shell to direct the output of that echo to the shell's stderr stream.
The python code stderr=subprocess.STDOUT tells the subprocess module that you want the process's stderr stream to be the same file descriptor as its stdout stream so that you will read whatever the process writes to either stream interleaved together in one stream.
The exit 1 in the shell command means that the shell exits with an error (non-zero) status.
The purpose of the code is to demonstrate that the python function subprocess.check_output will check the exit status and raise an exception when it is non-zero.
If the exit code was non-zero it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute and output in the output attribute.
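As a quick sketch of using both attributes (same shell command as in the question; print() with a single argument works on both Python 2 and 3):
import subprocess

try:
    subprocess.check_output(
        'echo to stdout; echo to stderr 1>&2; exit 1',
        shell=True,
        stderr=subprocess.STDOUT,
    )
except subprocess.CalledProcessError as e:
    print(e.returncode)  # 1, the shell's exit status
    print(e.output)      # 'to stdout\nto stderr\n' (bytes on Python 3)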
Your description of:
Running the code, the print commands are not executed
is a bit misleading since you neglect to mention the output that does occur:
Traceback (most recent call last):
  File "t.py", line 6, in <module>
    stderr=subprocess.STDOUT,
  File "/usr/lib/python2.7/subprocess.py", line 573, in check_output
    raise CalledProcessError(retcode, cmd, output=output)
subprocess.CalledProcessError: Command 'echo to stdout; echo to stderr 1>&2; exit 1' returned non-zero exit status 1


Python subprocess.Popen: redirect `STDERR` only and keep `STDOUT`

Setup
I have a little Runner program that prints some info to sys.stderr (for logs, unhandled exceptions, etc.) and to sys.stdout (some useful info about the program, maybe interaction with the user or something):
import sys
import time
for i in range(1, 4):
    sys.stdout.write(f"This is text #{i} to STDOUT\n")
    sys.stderr.write(f"This is text #{i} to STDERR\n")
    time.sleep(5)
And I have a Main program that starts Runner in a new window with subprocess.Popen and prints its output:
import subprocess
cmd = "python runner.py"
proc = subprocess.Popen(cmd,
                        stdout=subprocess.PIPE,  # Problem line
                        stderr=subprocess.PIPE,
                        creationflags=subprocess.CREATE_NEW_CONSOLE
                        )
proc.wait()
out, err = proc.communicate()
if out:
    print(f"[{out.decode('utf-8')}]")
if err:
    print(f"[{err.decode('utf-8')}]")
So the resulting output is:
[This is text #1 to STDOUT
This is text #2 to STDOUT
This is text #3 to STDOUT
]
[This is text #1 to STDERR
This is text #2 to STDERR
This is text #3 to STDERR
]
Why Popen?
I need to run several Runners in parallel and wait for them later, but subprocess.check_output or subprocess.run does not allow that (or am I wrong?? See the sketch after the answer below.)
Why new window?
I want to see the prints separately for every Runner, each in its own window
What I want
I want to redirect stderr only and keep stdout in the opened window, so the Main program will only output errors from the subprocess:
[This is text #1 to STDERR
This is text #2 to STDERR
This is text #3 to STDERR
]
That would be very useful for debugging new Runner features...
What I tried
When subprocess.Popen gets the stderr=subprocess.PIPE param and stdout=None (the default), stdout is blocked:
it doesn't show in the Runner window
and proc.communicate returns None for it
So the stdout prints just disappear... I even tried passing sys.stdout to the stdout= param (to get the output in the current console rather than the window), but it throws a Bad file descriptor error:
[Traceback (most recent call last):
  File "C:\Users\kirin\source\repos\python_tests\runner.py", line 5, in <module>
    sys.stdout.write(f"This is text #{i} to STDOUT\n")
OSError: [Errno 9] Bad file descriptor
Exception ignored in: <_io.TextIOWrapper name='<stdout>' mode='w' encoding='cp1251'>
OSError: [Errno 9] Bad file descriptor
]
(btw, this print was successfully redirected from Runner to Main)
Need help...
Here is a solution that meets the requirements of the 'What I want' section:
main.py:
import subprocess
command = ["python", "runner.py"]
process = subprocess.Popen(command, shell=False, text=True, stderr=subprocess.PIPE, creationflags=subprocess.CREATE_NEW_CONSOLE)
process.wait()
stderr = process.stderr.read()
print(stderr, end="")
runner.py contains the code mentioned in the question.
Argument shell=False is used to run python runner.py directly (i.e. not as a shell command), and text=True makes subprocess open process.stderr in text mode (instead of binary mode).
When running this, output from runner.py sent to stdout appears in the new window while output sent to stderr is captured in variable stderr (and also printed in main.py's window).
If runner.py's output should be processed right away as it is produced (i.e. without waiting for the process to finish first), the following code may be used:
main.py:
import subprocess
command = ["python", "runner.py"]
process = subprocess.Popen(command, shell=False, text=True, bufsize=1, stderr=subprocess.PIPE, creationflags=subprocess.CREATE_NEW_CONSOLE)
stderr = ""
while (True):
line = process.stderr.readline()
if (line == ""): break # EOF
stderr += line
print(line, end="")
runner.py (modified to illustrate the difference):
import sys
import time
for i in range(1, 4):
    sys.stdout.write(f"This is text #{i} to STDOUT\n")
    sys.stderr.write(f"This is text #{i} to STDERR\n")
    time.sleep(1)
Argument bufsize=1 is used here to get line-buffered output from runner.py's stderr.
Successfully tested on Windows 10 21H2 + Python 3.10.4.
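On the questioner's "Why Popen?" point: several Runners can indeed be started in parallel with Popen and waited for later. A minimal sketch combining that with the approach above (same command, text=True, stderr pipe, and the Windows-only CREATE_NEW_CONSOLE flag; reading stderr only after wait() is fine here because the runners produce little output):
import subprocess

command = ["python", "runner.py"]

# Start several runners without waiting; each gets its own console window
# and its own stderr pipe, as in the single-runner example above.
procs = [
    subprocess.Popen(command, text=True, stderr=subprocess.PIPE,
                     creationflags=subprocess.CREATE_NEW_CONSOLE)
    for _ in range(3)
]

# Wait for all of them afterwards and print what each wrote to stderr.
for i, proc in enumerate(procs):
    proc.wait()
    print(f"--- runner {i} stderr ---")
    print(proc.stderr.read(), end="")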

How can I get full output error if subprocess.run failed?

For example, I'm trying to run a bash command from Python:
from subprocess import run
command = f'ffmpeg -y -i "{video_path}" "{frames_path}/%d.png"'
run(command, shell=True, check=True)
but if it fails I just get subprocess.CalledProcessError: Command 'ffmpeg ...' returned non-zero exit status 127. How can I get the full ffmpeg error message?
It's the check=True kwarg that's causing it to throw a CalledProcessError. Just remove check=True and it will stop throwing the error. If you want to see the STDERR printed by ffmpeg, you can use capture_output=True. Then the resulting CompletedProcess object will have a .stderr member that contains the STDERR of your command as a bytes object. Use bytes.decode() to turn it into a normal string:
from subprocess import run
command = f'ffmpeg -y -i "{video_path}" "{frames_path}/%d.png"'
proc = run(command, shell=True, capture_output=True)
out = proc.stdout.decode() # stores the output of stdout
err = proc.stderr.decode() # stores the output of stderr
print(err)
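If you would rather keep check=True so a failure still raises, the captured stderr is also available on the exception (CalledProcessError.stderr is populated whenever stderr was captured). A sketch along the same lines, reusing the question's placeholder paths:
from subprocess import run, CalledProcessError

command = f'ffmpeg -y -i "{video_path}" "{frames_path}/%d.png"'
try:
    run(command, shell=True, check=True, capture_output=True)
except CalledProcessError as e:
    print(e.stderr.decode())  # the full ffmpeg error output
    raise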

Capturing *all* terminal output of a program called from Python

I have a program which can be executed as
./install.sh
This installs a bunch of stuff, and there is quite a lot of activity happening on screen.
Now, I am trying to execute it via
p = subprocess.Popen(executable, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
With the hope that all the activity happening on the screen is captured in out (or err). However, content is printed directly to the terminal while the process is running, and not captured into out or err, which are both empty after the process is run.
What could be happening here? How can this content be captured?
In general, what you're doing is already sufficient to channel all output to your variables.
One exception to that is if the program you're running is using /dev/tty to connect directly to its controlling terminal, and emitting output through that terminal rather than through stdout (FD 1) and stderr (FD 2). This is commonly done for security-sensitive IO such as password prompts, but rarely seen otherwise.
As a demonstration that this works, you can copy-and-paste the following into a Python shell exactly as given:
import subprocess
executable = ['/bin/sh', '-c', 'echo stdout; echo stderr >&2']
p = subprocess.Popen(executable, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
print "---"
print "output: ", out
print "stderr: ", err
...by contrast, for a demonstration of the case that doesn't work:
import subprocess
executable = ['/bin/sh', '-c', 'echo uncapturable >/dev/tty']
p = subprocess.Popen(executable, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
print "---"
print "output: ", out
In this case, content is written to the TTY directly, not to stdout or stderr. This content cannot be captured without using a program (such as script or expect) that provides a fake TTY. So, to use script:
import subprocess
executable = ['script', '-q', '/dev/null',
              '/bin/sh', '-c', 'echo uncapturable >/dev/tty']
p = subprocess.Popen(executable, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
print "---"
print "output: ", out

python subprocess is working in interactive mode but not in a script

On Windows I have to execute a command like the one below:
process = subprocess.Popen([r'C:\Program Files (x86)\xxx\xxx.exe', '-n', '#iseasn2a7.sd.xxxx.com:3944#dc', '-d', r'D:\test\file.txt'], shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
process.communicate()
This works fine in Python's interactive mode, but not at all when run from a Python script.
What might the issue be?
Popen.communicate itself does not print anything; it returns the stdout and stderr output. Besides that, because the code specifies stdout=PIPE, stderr=... when it creates the Popen, it captures the outputs (it does not let the subprocess print output directly to the stdout of the parent process).
You need to print the return value manually:
process = ....
output, error = process.communicate()
print output
If you don't want that, don't capture the output: omit stdout=PIPE, stderr=....
Then, you don't need to use communicate, but just wait:
process = subprocess.Popen([...], shell=True)
process.wait()
Or, you can use subprocess.call which both execute sub-process and wait its termination:
subprocess.call([...], shell=True)
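For completeness, a sketch of both options using the question's (placeholder) command; shell=True is not needed when the full path to the executable is given:
import subprocess

cmd = [r'C:\Program Files (x86)\xxx\xxx.exe', '-n', '#iseasn2a7.sd.xxxx.com:3944#dc',
       '-d', r'D:\test\file.txt']

# Option 1: capture the merged output and print it yourself.
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output, _ = process.communicate()
print(output)

# Option 2: let the process write straight to the console and just wait.
subprocess.call(cmd)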

Redirect subprocess stderr to stdout

I want to redirect the stderr output of a subprocess to stdout. The constant STDOUT should do that, shouldn't it?
However,
$ python >/dev/null -c 'import subprocess;\
subprocess.call(["ls", "/404"],stderr=subprocess.STDOUT)'
does output something. Why is that the case, and how do I get the error message on stdout?
In Python < v3.5:
A close read of the source code gives the answer. In particular, the documentation is misleading when it says:
subprocess.STDOUT
Special value that (...) indicates that standard error should go into the same handle as standard output.
Since stdout is set to "default" (-1, technically) when stderr=subprocess.STDOUT is evaluated, stderr is set to "default" as well. Unfortunately, this means that stderr output still goes to stderr.
To solve the problem, pass in the stdout file instead of subprocess.STDOUT:
$ python >/dev/null -c 'import subprocess,sys;subprocess.call(["ls", "/404"],
stderr=sys.stdout.buffer)'
Or, for compatibility with legacy 2.x versions of Python:
$ python >/dev/null -c 'import subprocess,sys;subprocess.call(["ls", "/404"],
stderr=sys.stdout.fileno())'
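The same workaround written as a small script rather than a one-liner (a sketch; it assumes the script's stdout is a real file or terminal, so fileno() is available):
import subprocess
import sys

# Hand the child our stdout's file descriptor as its stderr, so the error
# from ls lands wherever this script's stdout points.
subprocess.call(["ls", "/404"], stderr=sys.stdout.fileno())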
Actually, using subprocess.STDOUT does exactly what is stated in the documentation: it redirects stderr to stdout so that e.g.
command = ["/bin/ls", "/tmp", "/notthere"]
process = subprocess.Popen(command, shell=False, bufsize=1, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output = ""
while (True):
# Read line from stdout, break if EOF reached, append line to output
line = process.stdout.readline()
line = line.decode()
if (line == ""): break
output += line
results in variable output containing the process' output from both stdout and stderr.
stderr=subprocess.STDOUT redirects all stderr output directly to stdout of the calling process, which is a major difference.
EDIT: Updated code for newer Python versions:
command = ["/bin/ls", "/tmp", "/notthere"]
process = subprocess.Popen(command, shell=False, text=True, bufsize=1, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output = ""
while (True):
# Read line from stdout, break if EOF reached, append line to output
line = process.stdout.readline()
if (line == ""): break
output += line
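On Python 3.5 and newer the same merged capture can be expressed more compactly with subprocess.run; a sketch using the example command above (text=True requires Python 3.7+):
import subprocess

proc = subprocess.run(
    ["/bin/ls", "/tmp", "/notthere"],
    stdout=subprocess.PIPE,    # capture the child's stdout
    stderr=subprocess.STDOUT,  # ...and merge its stderr into the same pipe
    text=True,
)
print(proc.stdout)  # contains both the /tmp listing and ls's error message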
