Read from subprocess output in Python

I am running a subprocess using Popen. I need to block until this subprocess finishes and then read its output.
p = Popen(command, stdin=PIPE, stdout=PIPE, stderr=PIPE, encoding="utf-8")
p.communicate()
output = p.stdout.readline()
print(output)
I get this error:
ValueError: I/O operation on closed file.
How can I read the output after the subprocess finishes? I do not want to use poll(), though, as the subprocess takes time and I would need to wait for its completion anyway.

This should work:
import sys
from subprocess import Popen, PIPE

p = Popen(command, stdin=PIPE, stdout=PIPE, stderr=PIPE, encoding="utf-8")
output, error = p.communicate()
print(output)
if error:
    print('error:', error, file=sys.stderr)
However, subprocess.run() is preferred these days:
import subprocess
import sys

p = subprocess.run(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print("output:", p.stdout)
if p.stderr:
    print("error:", p.stderr, file=sys.stderr)

Use subprocess.check_output. It returns the output of the command.
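For example, a minimal sketch (the echo command is just a placeholder; the encoding argument assumes Python 3.6+):

import subprocess

# check_output blocks until the command finishes and returns its stdout;
# it raises CalledProcessError if the command exits with a non-zero status.
output = subprocess.check_output(["echo", "hello"], encoding="utf-8")
print(output)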

Related

Printing the stdout pipe of a subprocess ignores the last lines

I'm trying to run a subprocess and watch its stdout until I find a desired string.
This is my code:
def waitForAppOutput(proc, word):
    for stdout_line in iter(proc.stdout.readline, b''):
        print stdout_line
        if word in stdout_line.rstrip():
            return

p = Popen(["./app.sh"], shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE)
waitForAppOutput(p, "done!")
The issue is that for some reason the function waitForAppOutput stops printing stdout a few lines before "done!", which is the last line that should appear in the stdout. I assume iter(proc.stdout.readline, b'') is blocking and readline is not able to read the last lines of the stdout.
Any idea what the issue is here?
You have a misspelling: it should be waitForAppOutput instead of waitForAppOutout. How does this even run at all? And when you are invoking a command using a shell, you should not be passing an array of strings but rather one single string.
Normally one should use the communicate method on the subprocess object returned by the Popen call to prevent potential deadlocks (which seems to be what you are experiencing). It returns a tuple (stdout, stderr) containing the stdout and stderr output strings:
from subprocess import Popen, PIPE

def waitForAppOutput(stdout_lines, word):
    for stdout_line in stdout_lines:
        print stdout_line
        if word in stdout_line.rstrip():
            return

p = Popen("./app.sh", shell=True, stdout=PIPE, stderr=PIPE, stdin=PIPE, universal_newlines=True)
expected_input = "command line 1\ncommand line 2\n"
stdout, stderr = p.communicate(expected_input)
stdout_lines = stdout.splitlines()
waitForAppOutput(stdout_lines, "done!")
The only issue is if the output strings are large (whatever your definition of large might be), for it might be memory-inefficient or even prohibitive to read the entire output into memory. If this is your situation, then I would try to avoid the deadlock by piping only stdout.
from subprocess import Popen, PIPE

def waitForAppOutput(proc, word):
    for stdout_line in iter(proc.stdout.readline, ''):
        print stdout_line
        if word in stdout_line.rstrip():
            return

p = Popen("./app.sh", shell=True, stdout=PIPE, stdin=PIPE, universal_newlines=True)
expected_input = "command line 1\ncommand line 2\n"
p.stdin.write(expected_input)
p.stdin.close()
waitForAppOutput(p, "done!")
for stdout_line in iter(p.stdout.readline, ''):
    pass  # read rest of output
p.wait()  # wait for termination
Update
Here is an example using both techniques that runs the Windows sort command to sort a bunch of input lines. This works particularly well both ways because the sort command does not start output until all the input has been read, so it's a very simple protocol in which deadlocking is easy to avoid. Try running this with USE_COMMUNICATE set alternately to True and False:
from subprocess import Popen, PIPE
USE_COMMUNICATE = False
p = Popen("sort", shell=True, stdout=PIPE, stdin=PIPE, universal_newlines=True)
expected_input = """q
w
e
r
t
y
u
i
o
p
"""
if USE_COMMUNICATE:
    stdout_lines, stderr_lines = p.communicate(expected_input)
    output = stdout_lines.splitlines(True)  # keep line endings so the print below behaves like the readline branch
else:
    p.stdin.write(expected_input)
    p.stdin.close()
    output = iter(p.stdout.readline, '')
for stdout_line in output:
    print stdout_line,
p.wait()  # wait for termination
Prints:
e
i
o
p
q
r
t
u
w
y

Python: how to use a subprocess pipe with the Linux shell

I have a Python script that searches for logs; it continuously outputs the logs found, and I want to use a Linux pipe to filter the desired output. For example:
$ python logsearch.py | grep timeout
The problem is that the piped command (grep here, or sort and wc in other cases) is blocked until logsearch.py finishes, while logsearch.py outputs results continuously.
sample logsearch.py:
p = subprocess.Popen("ping google.com", shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
for line in p.stdout:
print(line)
UPDATE:
Figured it out: just change the stdout in subprocess to sys.stdout, and Python will handle the pipe for you.
p = subprocess.Popen("ping -c 5 google.com", shell=True, stdout=sys.stdout)
Thanks for all of your help!
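Putting that update together, a minimal sketch of logsearch.py (the ping command is just a stand-in for the real log search):

import subprocess
import sys

# Let the child process write directly to this script's stdout, so a
# downstream shell pipe (e.g. "| grep timeout") sees each line as it is
# produced instead of waiting for logsearch.py to finish.
p = subprocess.Popen("ping -c 5 google.com", shell=True, stdout=sys.stdout)
p.wait()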
And why use grep? Why not do all of it in Python?
from subprocess import Popen, PIPE

p = Popen(['ping', 'google.com'], shell=False, stdin=PIPE, stdout=PIPE)
for line in p.stdout:
    if 'timeout' in line.split():
        # Process the error
        print("Timeout error!!")
    else:
        print(line)
UPDATE:
I changed the Popen line as recommended by @triplee. Pros and cons are discussed in "Actual meaning of 'shell=True' in subprocess".
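As a rough illustration of that trade-off (a sketch, not from the original thread; the commands are placeholders): with shell=True you hand one string to the shell, which gives you pipes and globbing, while with shell=False you pass the argument list yourself and avoid shell quoting issues.

from subprocess import Popen, PIPE

# One string, parsed by the shell; shell features like | are available.
p1 = Popen("ping -c 5 google.com | grep time", shell=True, stdout=PIPE)

# Argument list, no shell involved; safer with untrusted arguments.
p2 = Popen(["ping", "-c", "5", "google.com"], shell=False, stdout=PIPE)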

Get stdout in case of success and stdout+stderr in case of failure

What is the easiest way to do the following:
Call a process using the subprocess module.
If the program exited normally, put its standard output into an output variable.
If the program exited abnormally, get its standard output and error.

try:
    output = subprocess.check_output(command, shell=True)
except subprocess.CalledProcessError as exc:
    logger.error('There was an error while ...: \n%s', exc.output)
    raise
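One hedged caveat on that snippet (my addition, not part of the original answer): check_output only captures stdout, so exc.output will not contain the child's stderr unless stderr is merged into stdout, for example (command is the same placeholder as above):

import subprocess

try:
    # Merge stderr into stdout so exc.output holds both streams on failure.
    output = subprocess.check_output(command, shell=True,
                                     stderr=subprocess.STDOUT)
except subprocess.CalledProcessError as exc:
    print(exc.returncode, exc.output)
    raise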
import subprocess

process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
process.wait()  # wait for the command to finish
output = process.stdout.read()
if process.poll():  # check the error code
    error = process.stderr.read()

I want to get both stdout and stderr from subprocess

I just want to do something like this:
>>bar, err_value = subprocess.check_output("cat foo.txt", shell=True)
>>print bar
>>Hello, world.
>>print err_value
>>0
But I can't seem to do it. I can either get the stdout, or the error code (via .call), or maybe both by using some kind of pipe. What am I missing here? The documentation is very sparse about this (to me) obvious functionality. Sorry if this is a simplistic question.
I take it that you want stdout, stderr and the return code? In that case, you could do this:
import subprocess
PIPE = subprocess.PIPE
proc = subprocess.Popen(cmd, stdout=PIPE, stderr=PIPE)
output, err = proc.communicate()
errcode = proc.returncode
I ended up going with this. Thanks for your help!
def subprocess_output_and_error_code(cmd, shell=True):
    import subprocess
    PIPE = subprocess.PIPE
    STDOUT = subprocess.STDOUT
    proc = subprocess.Popen(cmd, stdout=PIPE, stderr=STDOUT, shell=shell)
    stdout, stderr = proc.communicate()
    err_code = proc.returncode
    return stdout, int(err_code)
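For example, a quick usage sketch (the command is just a placeholder):

# stderr is merged into stdout by the function, so out holds both streams.
out, code = subprocess_output_and_error_code('cat foo.txt')
print(code, out)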
subprocess.check_output returns a value (the output string) and does not return the exit status. Use the following form:
import subprocess

try:
    out = subprocess.check_output('cat foo.txt', shell=True)
    print out
except subprocess.CalledProcessError, e:
    print e

Python: Catching the output from subprocess.call with stdout

So I'm trying to save the output from my subprocess.call, but I keep getting the following error:
AttributeError: 'int' object has no attribute 'communicate'
Code is as follows:
p2 = subprocess.call(['./test.out', 'new_file.mfj', 'delete1.out'], stdout = PIPE)
output = p2.communicate[0]
You're looking for subprocess.Popen() instead of call().
You also need to change it to p2.communicate()[0].
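A minimal corrected sketch of the snippet above with those two changes applied:

from subprocess import Popen, PIPE

# Popen returns a process object, so communicate() is available,
# and communicate() must be called before indexing its result.
p2 = Popen(['./test.out', 'new_file.mfj', 'delete1.out'], stdout=PIPE)
output = p2.communicate()[0]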
That's because subprocess.call returns an int:
subprocess.call(args, *, stdin=None, stdout=None, stderr=None, shell=False)
Run the command described by args. Wait for command to complete, then return the returncode attribute.
It looks like you want subprocess.Popen().
Here's a typical piece of code I use for this:
from subprocess import Popen, PIPE

p = Popen(cmd, stdout=PIPE, stderr=PIPE, bufsize=256*1024*1024)
output, errors = p.communicate()
if p.returncode:
    raise Exception(errors)
else:
    # Print stdout from cmd call
    print output
You should use subprocess.check_output:
import subprocess

try:
    subprocess.check_output(['./test.out', 'new_file.mfj', 'delete1.out'],
                            stderr=subprocess.STDOUT)
except subprocess.CalledProcessError as exception:
    print exception.output
