How can I get real-time output while using subprocess.Popen()? [duplicate] - python

I am trying to write a wrapper script for a command line program (svnadmin verify) that will display a nice progress indicator for the operation. This requires me to be able to see each line of output from the wrapped program as soon as it is output.
I figured that I'd just execute the program using subprocess.Popen, use stdout=PIPE, then read each line as it came in and act on it accordingly. However, when I ran the following code, the output appeared to be buffered somewhere, causing it to appear in two chunks: lines 1 through 332, then lines 333 through 439 (the last line of output).
from subprocess import Popen, PIPE, STDOUT

p = Popen('svnadmin verify /var/svn/repos/config', stdout=PIPE,
          stderr=STDOUT, shell=True)
for line in p.stdout:
    print line.replace('\n', '')
After looking at the documentation on subprocess a little, I discovered the bufsize parameter to Popen, so I tried setting bufsize to 1 (buffer each line) and 0 (no buffer), but neither value seemed to change the way the lines were being delivered.
At this point I was starting to grasp for straws, so I wrote the following output loop:
while True:
    try:
        print p.stdout.next().replace('\n', '')
    except StopIteration:
        break
but got the same result.
Is it possible to get 'realtime' program output of a program executed using subprocess? Is there some other option in Python that is forward-compatible (not exec*)?

I tried this, and for some reason while the code
for line in p.stdout:
    ...
buffers aggressively, the variant
while True:
    line = p.stdout.readline()
    if not line: break
    ...
does not. Apparently this is a known bug: http://bugs.python.org/issue3907 (The issue is now "Closed" as of Aug 29, 2018)
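For completeness, a minimal self-contained version of the readline variant might look like this (a sketch using the same svnadmin command as above, with universal_newlines=True so readline() returns text on Python 3 as well):
from subprocess import Popen, PIPE, STDOUT

p = Popen('svnadmin verify /var/svn/repos/config', stdout=PIPE,
          stderr=STDOUT, shell=True, universal_newlines=True)
while True:
    line = p.stdout.readline()
    if not line:          # '' means the pipe has been closed (EOF)
        break
    print(line.rstrip())
p.wait()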

Setting bufsize=1 makes the pipe line-buffered on the parent's side, so each line becomes available as soon as it arrives (note that it does not change the child process's own buffering).
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, bufsize=1)
for line in iter(p.stdout.readline, b''):
    print line,
p.stdout.close()
p.wait()

You can direct the subprocess output to the streams directly. Simplified example:
subprocess.run(['ls'], stderr=sys.stderr, stdout=sys.stdout)

You can try this:
import subprocess
import sys
process = subprocess.Popen(
    cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE
)
while True:
    out = process.stdout.read(1)
    if out == '' and process.poll() != None:
        break
    if out != '':
        sys.stdout.write(out)
        sys.stdout.flush()
If you use readline instead of read, there will be some cases where the input message is not printed. Try it with a command that requires inline input and see for yourself.

In Python 3.x the process might hang because the output is a bytes object instead of a string. Make sure you decode it into a string.
Starting from Python 3.6 you can do that using the encoding parameter in the Popen constructor. The complete example:
process = subprocess.Popen(
    'my_command',
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    shell=True,
    encoding='utf-8',
    errors='replace'
)

while True:
    realtime_output = process.stdout.readline()
    if realtime_output == '' and process.poll() is not None:
        break
    if realtime_output:
        print(realtime_output.strip(), flush=True)
Note that this code redirects stderr to stdout and handles output errors.

Real-time output issue resolved:
I encountered a similar issue in Python while capturing real-time output from a C program. I added fflush(stdout); to my C code, and it worked for me. Here is the code.
C program:
#include <stdio.h>
#include <unistd.h>   /* for sleep() */

void main()
{
    int count = 1;
    while (1)
    {
        printf(" Count %d\n", count++);
        fflush(stdout);
        sleep(1);
    }
}
Python program:
#!/usr/bin/python
import os, sys
import subprocess
procExe = subprocess.Popen(".//count", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True)
while procExe.poll() is None:
    line = procExe.stdout.readline()
    print("Print:" + line)
Output:
Print: Count 1
Print: Count 2
Print: Count 3

The Streaming subprocess stdin and stdout with asyncio in Python blog post by Kevin McCarthy shows how to do it with asyncio:
import asyncio
from asyncio.subprocess import PIPE
from asyncio import create_subprocess_exec


async def _read_stream(stream, callback):
    while True:
        line = await stream.readline()
        if line:
            callback(line)
        else:
            break


async def run(command):
    process = await create_subprocess_exec(
        *command, stdout=PIPE, stderr=PIPE
    )
    await asyncio.wait(
        [
            _read_stream(
                process.stdout,
                lambda x: print(
                    "STDOUT: {}".format(x.decode("UTF8"))
                ),
            ),
            _read_stream(
                process.stderr,
                lambda x: print(
                    "STDERR: {}".format(x.decode("UTF8"))
                ),
            ),
        ]
    )
    await process.wait()


async def main():
    # create_subprocess_exec expects the command as a sequence of arguments
    await run("docker build -t my-docker-image:latest .".split())


if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

Depending on the use case, you might also want to disable the buffering in the subprocess itself.
If the subprocess will be a Python process, you could do this before the call:
os.environ["PYTHONUNBUFFERED"] = "1"
Or alternatively pass this in the env argument to Popen.
Otherwise, if you are on Linux/Unix, you can use the stdbuf tool. E.g. like:
cmd = ["stdbuf", "-oL"] + cmd
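Putting the two ideas together, a rough sketch might look like this (the child command is a made-up placeholder):
import os
import subprocess

cmd = ["python", "some_child_script.py"]          # hypothetical child command

# Option 1: a Python child, made unbuffered through the environment
env = dict(os.environ, PYTHONUNBUFFERED="1")
p = subprocess.Popen(cmd, env=env, stdout=subprocess.PIPE,
                     universal_newlines=True)

# Option 2 (Linux/Unix, any child): line-buffer the child's stdout with stdbuf
# p = subprocess.Popen(["stdbuf", "-oL"] + cmd, stdout=subprocess.PIPE,
#                      universal_newlines=True)

for line in p.stdout:
    print(line, end="")
p.wait()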

Found this "plug-and-play" function here. Worked like a charm!
import subprocess
def myrun(cmd):
    """from
    http://blog.kagesenshi.org/2008/02/teeing-python-subprocesspopen-output.html
    """
    p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)
    stdout = []
    while True:
        line = p.stdout.readline()
        stdout.append(line)
        print line,
        if line == '' and p.poll() != None:
            break
    return ''.join(stdout)

I ran into the same problem a while back. My solution was to ditch iteration in favor of the read method, which will return immediately even if your subprocess isn't finished executing, etc.
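A minimal sketch of that idea (cmd is a placeholder for your own command list):
import subprocess
import sys

cmd = ["my_command", "arg"]          # placeholder command
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
while True:
    chunk = proc.stdout.read(1)      # returns as soon as a byte is available
    if not chunk:                    # b'' means the pipe closed
        break
    sys.stdout.write(chunk.decode())
    sys.stdout.flush()
proc.wait()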

I used this solution to get realtime output from a subprocess. This loop will stop as soon as the process completes, removing the need for a break statement or the risk of an infinite loop.
sub_process = subprocess.Popen(my_command, close_fds=True, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
while sub_process.poll() is None:
    out = sub_process.stdout.read(1)
    sys.stdout.write(out)
    sys.stdout.flush()

You may use an iterator over each byte in the output of the subprocess. This allows inline update (lines ending with '\r' overwrite previous output line) from the subprocess:
from subprocess import PIPE, Popen
import sys

command = ["my_command", "-my_arg"]

# Open pipe to subprocess
subprocess = Popen(command, stdout=PIPE, stderr=PIPE)

# read each byte of subprocess
while subprocess.poll() is None:
    for c in iter(lambda: subprocess.stdout.read(1) if subprocess.poll() is None else {}, b''):
        c = c.decode('ascii')
        sys.stdout.write(c)
        sys.stdout.flush()

if subprocess.returncode != 0:
    raise Exception("The subprocess did not terminate correctly.")

This is the basic skeleton that I always use for this. It makes it easy to implement timeouts and is able to deal with inevitable hanging processes.
import subprocess
import threading
import Queue

def t_read_stdout(process, queue):
    """Read from stdout"""
    for output in iter(process.stdout.readline, b''):
        queue.put(output)
    return

process = subprocess.Popen(['dir'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.STDOUT,
                           bufsize=1,
                           cwd='C:\\',
                           shell=True)

queue = Queue.Queue()
t_stdout = threading.Thread(target=t_read_stdout, args=(process, queue))
t_stdout.daemon = True
t_stdout.start()

while process.poll() is None or not queue.empty():
    try:
        output = queue.get(timeout=.5)
    except Queue.Empty:
        continue
    if not output:
        continue
    print(output),

t_stdout.join()

If you just want to forward the log to the console in realtime, the code below will do it for both stdout and stderr:
p = subprocess.Popen(cmd,
                     shell=True,
                     cwd=work_dir,
                     bufsize=1,
                     stdin=subprocess.PIPE,
                     stderr=sys.stderr,
                     stdout=sys.stdout)

Complete solution:
import contextlib
import subprocess

# Unix, Windows and old Macintosh end-of-line
newlines = ['\n', '\r\n', '\r']

def unbuffered(proc, stream='stdout'):
    stream = getattr(proc, stream)
    with contextlib.closing(stream):
        while True:
            out = []
            last = stream.read(1)
            # Don't loop forever
            if last == '' and proc.poll() is not None:
                break
            while last not in newlines:
                # Don't loop forever
                if last == '' and proc.poll() is not None:
                    break
                out.append(last)
                last = stream.read(1)
            out = ''.join(out)
            yield out

def example():
    cmd = ['ls', '-l', '/']
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        # Make all end-of-lines '\n'
        universal_newlines=True,
    )
    for line in unbuffered(proc):
        print line

example()

Using pexpect with non-blocking readlines will resolve this problem. The problem stems from the fact that pipes are buffered: your app's output gets buffered by the pipe, so you can't get at that output until the buffer fills or the process dies.
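A minimal sketch of the pexpect approach (assumes the third-party pexpect package is installed; the command is the one from the question):
import pexpect

# pexpect runs the child in a pseudo-terminal, so most programs switch to
# line buffering and each line becomes readable as soon as it is printed.
child = pexpect.spawn('svnadmin verify /var/svn/repos/config', encoding='utf-8')
for line in child:
    print(line.rstrip())
child.close()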

Here is what worked for me:
import subprocess
import sys

def run_cmd_print_output_to_console_and_log_to_file(cmd, log_file_path):
    make_file_if_not_exist(log_file_path)
    logfile = open(log_file_path, 'w')

    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
    for line in proc.stdout:
        sys.stdout.write(line.decode("utf-8"))
        print(line.decode("utf-8").strip(), file=logfile, flush=True)

    proc.wait()
    logfile.close()

(This solution has been tested with Python 2.7.15)
You just need to sys.stdout.flush() after each line read/write:
while proc.poll() is None:
    line = proc.stdout.readline()
    sys.stdout.write(line)
    # or print(line.strip()), you still need to force the flush.
    sys.stdout.flush()

A few answers target Python 3.x or Python 2.x specifically; the code below will work for both.
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

stdout = []
while True:
    line = p.stdout.readline()
    if not isinstance(line, str):
        line = line.decode('utf-8')
    stdout.append(line)
    print(line)
    if line == '' and p.poll() != None:
        break

import shlex
import subprocess

def run_command(command):
    # universal_newlines=True so readline() returns str and the '' EOF check works
    process = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE,
                               universal_newlines=True)
    while True:
        output = process.stdout.readline()
        if output == '' and process.poll() is not None:
            break
        if output:
            print(output.strip())
    rc = process.poll()
    return rc

Yet another answer! I had the following requirements:
Run some command and print the output to stdout as though the user ran it
Display to the user any prompts from the command. E.g. pip uninstall numpy will prompt with ... Proceed (Y/n)? (which does not end in a newline)
Capture the output (that the user saw) as a string
This worked for me (only tested in Python 3.10 on Windows):
import subprocess
import sys

def run(*args: list[str]) -> str:
    proc = subprocess.Popen(
        *args,
        text=True,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
    )
    result = ""
    while proc.poll() is None:
        output = proc.stdout.read(1)
        if output:
            sys.stdout.write(output)
            sys.stdout.flush()
            result += output
    return result

These are all great examples, but I've found they either (a) handle partial lines (e.g. "Are you sure (Y/n):") but are really slow, or (b) are quick but hang on partial lines.
I've worked on the following which:
provides real-time output for both stdout and stderr to their respective streams
is extremely fast as it works with stream buffering
allows for using timeouts as it never blocks on read()
efficiently saves stdout and stderr independently
handles text encoding (though easily adaptable to binary streams)
works on Python 3.6+
import os
import subprocess
import sys
import selectors
import io

def run_command(command: str) -> (int, str):
    proc = subprocess.Popen(
        command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE
    )

    sel = selectors.DefaultSelector()
    for fobj in [proc.stdout, proc.stderr]:
        os.set_blocking(fobj.fileno(), False)
        sel.register(fobj, selectors.EVENT_READ)

    out = io.StringIO()
    err = io.StringIO()

    # loop until all descriptors removed
    while len(sel.get_map()) > 0:
        events = sel.select()
        if len(events) == 0:
            # timeout or signal, kill to prevent wait hanging
            proc.terminate()
            break
        for key, _ in events:
            # read all available data
            buf = key.fileobj.read().decode(errors='ignore')
            if buf == '':
                sel.unregister(key.fileobj)
            elif key.fileobj == proc.stdout:
                sys.stdout.write(buf)
                sys.stdout.flush()
                out.write(buf)
            elif key.fileobj == proc.stderr:
                sys.stderr.write(buf)
                sys.stderr.flush()
                err.write(buf)

    sel.close()
    proc.wait()

    if proc.returncode != 0:
        return (proc.returncode, err.getvalue())
    return (0, out.getvalue())
I didn't include the timeout logic (as the subject is real-time output), but it's simple to add a timeout to select()/wait() and no longer worry about infinite hangs.
I've timed cat '25MB-file' and compared to the .read(1) solutions, it's roughly 300 times faster.

Related

python suprocess, how to capture the output streams of an external command which are updated in realtime [duplicate]

My python script uses subprocess to call a linux utility that is very noisy. I want to store all of the output to a log file and show some of it to the user. I thought the following would work, but the output doesn't show up in my application until the utility has produced a significant amount of output.
#fake_utility.py, just generates lots of output over time
import time
i = 0
while True:
    print hex(i)*512
    i += 1
    time.sleep(0.5)
#filters output
import subprocess
proc = subprocess.Popen(['python','fake_utility.py'],stdout=subprocess.PIPE)
for line in proc.stdout:
    #the real code does filtering here
    print "test:", line.rstrip()
The behavior I really want is for the filter script to print each line as it is received from the subprocess. Sorta like what tee does but with python code.
What am I missing? Is this even possible?
Update:
If a sys.stdout.flush() is added to fake_utility.py, the code has the desired behavior in python 3.1. I'm using python 2.6. You would think that using proc.stdout.xreadlines() would work the same as py3k, but it doesn't.
Update 2:
Here is the minimal working code.
#fake_utility.py, just generates lots of output over time
import sys, time
for i in range(10):
    print i
    sys.stdout.flush()
    time.sleep(0.5)
#display output line by line
import subprocess
proc = subprocess.Popen(['python','fake_utility.py'],stdout=subprocess.PIPE)
#works in python 3.0+
#for line in proc.stdout:
for line in iter(proc.stdout.readline,''):
    print line.rstrip()
I think the problem is with the statement for line in proc.stdout, which reads the entire input before iterating over it. The solution is to use readline() instead:
#filters output
import subprocess
proc = subprocess.Popen(['python','fake_utility.py'],stdout=subprocess.PIPE)
while True:
    line = proc.stdout.readline()
    if not line:
        break
    #the real code does filtering here
    print "test:", line.rstrip()
Of course you still have to deal with the subprocess' buffering.
Note: according to the documentation the solution with an iterator should be equivalent to using readline(), except for the read-ahead buffer, but (or exactly because of this) the proposed change did produce different results for me (Python 2.5 on Windows XP).
Bit late to the party, but was surprised not to see what I think is the simplest solution here:
import io
import subprocess
proc = subprocess.Popen(["prog", "arg"], stdout=subprocess.PIPE)
for line in io.TextIOWrapper(proc.stdout, encoding="utf-8"):  # or another encoding
    # do something with line, e.g.
    print(line, end="")
(This requires Python 3.)
Indeed, once you have sorted out the iterator, the child's own buffering could now be your problem. You can tell the Python in the sub-process not to buffer its output:
proc = subprocess.Popen(['python','fake_utility.py'],stdout=subprocess.PIPE)
becomes
proc = subprocess.Popen(['python','-u', 'fake_utility.py'],stdout=subprocess.PIPE)
I have needed this when calling python from within python.
You want to pass these extra parameters to subprocess.Popen:
bufsize=1, universal_newlines=True
Then you can iterate as in your example. (Tested with Python 3.5)
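Put together, that looks roughly like this (a sketch; the command is just an example):
import subprocess

cmd = ["svnadmin", "verify", "/var/svn/repos/config"]   # example command
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                        bufsize=1, universal_newlines=True)
for line in proc.stdout:
    print(line, end="")
proc.wait()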
A function that allows iterating over both stdout and stderr concurrently, in realtime, line by line
In case you need to get the output stream for both stdout and stderr at the same time, you can use the following function.
The function uses Queues to merge both Popen pipes into a single iterator.
Here we create the function read_popen_pipes():
from queue import Queue, Empty
from concurrent.futures import ThreadPoolExecutor
def enqueue_output(file, queue):
    for line in iter(file.readline, ''):
        queue.put(line)
    file.close()

def read_popen_pipes(p):

    with ThreadPoolExecutor(2) as pool:
        q_stdout, q_stderr = Queue(), Queue()

        pool.submit(enqueue_output, p.stdout, q_stdout)
        pool.submit(enqueue_output, p.stderr, q_stderr)

        while True:

            if p.poll() is not None and q_stdout.empty() and q_stderr.empty():
                break

            out_line = err_line = ''

            try:
                out_line = q_stdout.get_nowait()
            except Empty:
                pass
            try:
                err_line = q_stderr.get_nowait()
            except Empty:
                pass

            yield (out_line, err_line)
read_popen_pipes() in use:
import subprocess as sp
with sp.Popen(my_cmd, stdout=sp.PIPE, stderr=sp.PIPE, text=True) as p:
    for out_line, err_line in read_popen_pipes(p):
        # Do stuff with each line, e.g.:
        print(out_line, end='')
        print(err_line, end='')

    return p.poll()  # return status-code
You can also read all lines without a loop. This works in Python 3.6:
import os
import subprocess
process = subprocess.Popen(command, stdout=subprocess.PIPE)
list_of_byte_strings = process.stdout.readlines()
Python 3.5 added subprocess.run() to the subprocess module, which returns a CompletedProcess object (the capture_output argument used below was added in 3.7). With it you are fine using proc.stdout.splitlines():
proc = subprocess.run(command, shell=True, capture_output=True, text=True, check=True)

for line in proc.stdout.splitlines():
    print("stdout:", line)
See also How to Execute Shell Commands in Python Using the Subprocess Run Method
I tried this with Python 3 and it worked.
When you use Popen to spawn the new process, you tell the operating system to PIPE the stdout of the child process so the parent process can read it; here, stderr is merged into the same pipe (stderr=subprocess.STDOUT).
In output_reader we read each line of the child's stdout by wrapping it in an iterator that yields the child's output line by line whenever a new line is ready.
import subprocess
import threading
import time


def output_reader(proc):
    for line in iter(proc.stdout.readline, b''):
        print('got line: {0}'.format(line.decode('utf-8')), end='')


def main():
    proc = subprocess.Popen(['python', 'fake_utility.py'],
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)

    t = threading.Thread(target=output_reader, args=(proc,))
    t.start()

    try:
        time.sleep(0.2)
        i = 0
        while True:
            print(hex(i) * 512)
            i += 1
            time.sleep(0.5)
    finally:
        proc.terminate()
        try:
            proc.wait(timeout=0.2)
            print('== subprocess exited with rc =', proc.returncode)
        except subprocess.TimeoutExpired:
            print('subprocess did not terminate in time')
    t.join()


main()
The following modification of Rômulo's answer works for me on Python 2 and 3 (2.7.12 and 3.6.1):
import os
import subprocess
process = subprocess.Popen(command, stdout=subprocess.PIPE)
while True:
    line = process.stdout.readline()
    if line:          # works for both str (Python 2) and bytes (Python 3)
        os.write(1, line)
    else:
        break
I was having a problem with the argument list of Popen when updating servers; the following code resolves this a bit.
import getpass
from subprocess import Popen, PIPE

username = 'user1'
ip = '127.0.0.1'

print('What is the password?')
password = getpass.getpass()

cmd1 = f"""sshpass -p {password} ssh {username}@{ip}"""
cmd2 = f"""echo {password} | sudo -S apt update"""
cmd3 = " && "
cmd4 = f"""echo {password} | sudo -S apt upgrade -y"""
cmd5 = " && "
cmd6 = "exit"

commands = [cmd1, cmd2, cmd3, cmd4, cmd5, cmd6]
command = " ".join(commands)
cmd = command.split()

with Popen(cmd, stdout=PIPE, bufsize=1, universal_newlines=True) as p:
    for line in p.stdout:
        print(line, end='')
And to run the update on a local computer, the following code example does this.
import getpass
from subprocess import Popen, PIPE
print ('What is the password?')
password = getpass.getpass()
cmd1_local = f"""apt update"""
cmd2_local = f"""apt upgrade -y"""
commands = [cmd1_local, cmd2_local]
with Popen(['echo', password], stdout=PIPE) as auth:
    for cmd in commands:
        cmd = cmd.split()
        with Popen(['sudo', '-S'] + cmd, stdin=auth.stdout, stdout=PIPE, bufsize=1, universal_newlines=True) as p:
            for line in p.stdout:
                print(line, end='')

Constantly print Subprocess output while process is running

To launch programs from my Python-scripts, I'm using the following method:
def execute(command):
    process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    output = process.communicate()[0]
    exitCode = process.returncode

    if (exitCode == 0):
        return output
    else:
        raise ProcessException(command, exitCode, output)
So when I launch a process like Process.execute("mvn clean install"), my program waits until the process is finished, and only then do I get the complete output. This is annoying if I'm running a process that takes a while to finish.
Can I let my program write the process output line by line, by polling the process output before it finishes in a loop or something?
I found this article which might be related.
You can use iter to process lines as soon as the command outputs them: lines = iter(fd.readline, ""). Here's a full example showing a typical use case (thanks to @jfs for helping out):
from __future__ import print_function # Only Python 2.x
import subprocess
def execute(cmd):
    popen = subprocess.Popen(cmd, stdout=subprocess.PIPE, universal_newlines=True)
    for stdout_line in iter(popen.stdout.readline, ""):
        yield stdout_line
    popen.stdout.close()
    return_code = popen.wait()
    if return_code:
        raise subprocess.CalledProcessError(return_code, cmd)

# Example
for path in execute(["locate", "a"]):
    print(path, end="")
To print subprocess' output line-by-line as soon as its stdout buffer is flushed in Python 3:
from subprocess import Popen, PIPE, CalledProcessError
with Popen(cmd, stdout=PIPE, bufsize=1, universal_newlines=True) as p:
    for line in p.stdout:
        print(line, end='')  # process line here

if p.returncode != 0:
    raise CalledProcessError(p.returncode, p.args)
Notice: you do not need p.poll() -- the loop ends when eof is reached. And you do not need iter(p.stdout.readline, '') -- the read-ahead bug is fixed in Python 3.
See also, Python: read streaming input from subprocess.communicate().
OK, I managed to solve it without threads (any suggestions why using threads would be better are appreciated) by using a snippet from this question: Intercepting stdout of a subprocess while it is running.
def execute(command):
    process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

    # Poll process for new output until finished
    while True:
        nextline = process.stdout.readline()
        if nextline == '' and process.poll() is not None:
            break
        sys.stdout.write(nextline)
        sys.stdout.flush()

    output = process.communicate()[0]
    exitCode = process.returncode

    if (exitCode == 0):
        return output
    else:
        raise ProcessException(command, exitCode, output)
There is actually a really simple way to do this when you just want to print the output:
import subprocess
import sys
def execute(command):
    subprocess.check_call(command, shell=True, stdout=sys.stdout, stderr=subprocess.STDOUT)
Here we're simply pointing the subprocess at our own stdout, and using the existing succeed-or-raise-an-exception API.
@tokland
I tried your code and corrected it for 3.4 and Windows.
dir.cmd is a simple dir command, saved as a cmd file.
import subprocess
c = "dir.cmd"
def execute(command):
    popen = subprocess.Popen(command, stdout=subprocess.PIPE, bufsize=1)
    lines_iterator = iter(popen.stdout.readline, b"")
    while popen.poll() is None:
        for line in lines_iterator:
            nline = line.rstrip()
            print(nline.decode("latin"), end="\r\n", flush=True)  # yield line

execute(c)
In Python >= 3.5 using subprocess.run works for me:
import subprocess
cmd = 'echo foo; sleep 1; echo foo; sleep 2; echo foo'
subprocess.run(cmd, shell=True)
(getting the output during execution also works without shell=True)
https://docs.python.org/3/library/subprocess.html#subprocess.run
For anyone trying the answers to this question in order to get the stdout from a Python script, note that Python buffers its stdout, so it may take a while to see the output.
This can be rectified by adding the following after each stdout write in the target script:
sys.stdout.flush()
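In Python 3 the target script can get the same effect by passing flush=True to print; a small sketch of such a target script:
# target_script.py (sketch): make each line visible to the parent immediately
import time

for i in range(10):
    print(i, flush=True)      # equivalent to a sys.stdout.flush() after each print
    time.sleep(0.5)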
To answer the original question, the best way IMO is just redirecting subprocess stdout directly to your program's stdout (optionally, the same can be done for stderr, as in example below)
p = Popen(cmd, stdout=sys.stdout, stderr=sys.stderr)
p.communicate()
In case someone wants to read from both stdout and stderr at the same time using threads, this is what I came up with:
import threading
import subprocess
import Queue
from time import sleep

class AsyncLineReader(threading.Thread):
    def __init__(self, fd, outputQueue):
        threading.Thread.__init__(self)

        assert isinstance(outputQueue, Queue.Queue)
        assert callable(fd.readline)

        self.fd = fd
        self.outputQueue = outputQueue

    def run(self):
        map(self.outputQueue.put, iter(self.fd.readline, ''))

    def eof(self):
        return not self.is_alive() and self.outputQueue.empty()

    @classmethod
    def getForFd(cls, fd, start=True):
        queue = Queue.Queue()
        reader = cls(fd, queue)

        if start:
            reader.start()

        return reader, queue


process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
(stdoutReader, stdoutQueue) = AsyncLineReader.getForFd(process.stdout)
(stderrReader, stderrQueue) = AsyncLineReader.getForFd(process.stderr)

# Keep checking queues until there is no more output.
while not stdoutReader.eof() or not stderrReader.eof():
    # Process all available lines from the stdout Queue.
    while not stdoutQueue.empty():
        line = stdoutQueue.get()
        print 'Received stdout: ' + repr(line)
        # Do stuff with stdout line.

    # Process all available lines from the stderr Queue.
    while not stderrQueue.empty():
        line = stderrQueue.get()
        print 'Received stderr: ' + repr(line)
        # Do stuff with stderr line.

    # Sleep for a short time to avoid excessive CPU use while waiting for data.
    sleep(0.05)

print "Waiting for async readers to finish..."
stdoutReader.join()
stderrReader.join()

# Close subprocess' file descriptors.
process.stdout.close()
process.stderr.close()

print "Waiting for process to exit..."
returnCode = process.wait()

if returnCode != 0:
    raise subprocess.CalledProcessError(returnCode, command)
I just wanted to share this, as I ended up on this question trying to do something similar, but none of the answers solved my problem. Hopefully it helps someone!
Note that in my use case, an external process kills the process that we Popen().
This PoC constantly reads the output from a process, and the latest output can be accessed when needed. Only the last result is kept; all other output is discarded, which prevents the PIPE from growing out of memory:
import subprocess
import time
import threading
import Queue


class FlushPipe(object):
    def __init__(self):
        self.command = ['python', './print_date.py']
        self.process = None
        self.process_output = Queue.LifoQueue(0)
        self.capture_output = threading.Thread(target=self.output_reader)

    def output_reader(self):
        for line in iter(self.process.stdout.readline, b''):
            self.process_output.put_nowait(line)

    def start_process(self):
        self.process = subprocess.Popen(self.command,
                                        stdout=subprocess.PIPE)
        self.capture_output.start()

    def get_output_for_processing(self):
        line = self.process_output.get()
        print ">>>" + line


if __name__ == "__main__":
    flush_pipe = FlushPipe()
    flush_pipe.start_process()

    now = time.time()
    while time.time() - now < 10:
        flush_pipe.get_output_for_processing()
        time.sleep(2.5)

    flush_pipe.capture_output.join(timeout=0.001)
    flush_pipe.process.kill()
print_date.py
#!/usr/bin/env python
import time
if __name__ == "__main__":
    while True:
        print str(time.time())
        time.sleep(0.01)
Output: you can clearly see that output only appears at the ~2.5 s interval, with nothing in between.
>>>1520535158.51
>>>1520535161.01
>>>1520535163.51
>>>1520535166.01
This works at least in Python3.4
import subprocess
process = subprocess.Popen(cmd_list, stdout=subprocess.PIPE)
for line in process.stdout:
    print(line.decode().strip())
Building on @jfs's excellent answer, here is a complete working example for you to play with. Requires Python 3.7 or newer.
sub.py
import time
for i in range(10):
    print(i, flush=True)
    time.sleep(1)
main.py
from subprocess import PIPE, Popen
import sys
with Popen([sys.executable, 'sub.py'], bufsize=1, stdout=PIPE, text=True) as sub:
    for line in sub.stdout:
        print(line, end='')
Notice the flush argument used in the child script.
None of the answers here addressed all of my needs.
No threads for stdout (no Queues, etc, either)
Non-blocking as I need to check for other things going on
Use PIPE as I needed to do multiple things, e.g. stream output, write to a log file and return a string copy of the output.
A little background: I am using a ThreadPoolExecutor to manage a pool of threads, each launching a subprocess and running them concurrently (in Python 2.7, but this should work in newer 3.x as well). I don't want to use threads just for output gathering, as I want as many available as possible for other things (a pool of 20 processes would be using 40 threads just to run: 1 for the process thread and 1 for stdout, and more if you want stderr, I guess).
I'm stripping back a lot of exception handling and such here, so this is based on code that works in production. Hopefully I didn't ruin it in the copy and paste. Also, feedback very much welcome!
import os
import sys
import time
import fcntl
import subprocess

proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

# Make stdout non-blocking when using read/readline
proc_stdout = proc.stdout
fl = fcntl.fcntl(proc_stdout, fcntl.F_GETFL)
fcntl.fcntl(proc_stdout, fcntl.F_SETFL, fl | os.O_NONBLOCK)

def handle_stdout(proc_stream, my_buffer, echo_streams=True, log_file=None):
    """A little inline function to handle the stdout business."""
    # fcntl makes readline non-blocking so it raises an IOError when empty
    try:
        for s in iter(proc_stream.readline, ''):   # replace '' with b'' for Python 3
            my_buffer.append(s)

            if echo_streams:
                sys.stdout.write(s)

            if log_file:
                log_file.write(s)
    except IOError:
        pass

# The main loop while subprocess is running
stdout_parts = []
while proc.poll() is None:
    handle_stdout(proc_stdout, stdout_parts)
    # ...Check for other things here...
    # For example, check a multiprocessing.Value('b') to proc.kill()
    time.sleep(0.01)

# Not sure if this is needed, but run it again just to be sure we got it all?
handle_stdout(proc_stdout, stdout_parts)

stdout_str = "".join(stdout_parts)  # Just to demo
I'm sure there is overhead being added here but it is not a concern in my case. Functionally it does what I need. The only thing I haven't solved is why this works perfectly for log messages but I see some print messages show up later and all at once.
import time
import sys
import subprocess
import threading
import queue
cmd='esptool.py --chip esp8266 write_flash -z 0x1000 /home/pi/zero2/fw/base/boot_40m.bin'
cmd2='esptool.py --chip esp32 -b 115200 write_flash -z 0x1000 /home/pi/zero2/fw/test.bin'
cmd3='esptool.py --chip esp32 -b 115200 erase_flash'
class ExecutorFlushSTDOUT(object):
    def __init__(self, timeout=15):
        self.process = None
        self.process_output = queue.Queue(0)
        self.capture_output = threading.Thread(target=self.output_reader)
        self.timeout = timeout
        self.result = False
        self.validator = None

    def output_reader(self):
        start = time.time()
        while self.process.poll() is None and (time.time() - start) < self.timeout:
            try:
                if not self.process_output.full():
                    line = self.process.stdout.readline()
                    if line:
                        line = line.decode().rstrip("\n")
                        start = time.time()
                        self.process_output.put(line)
                        if self.validator:
                            if self.validator in line:
                                print("Valid")
                                self.result = True
            except:
                pass
        self.process.kill()
        return

    def start_process(self, cmd_list, callback=None, validator=None, timeout=None):
        if timeout:
            self.timeout = timeout
        self.validator = validator
        self.process = subprocess.Popen(cmd_list, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
        self.capture_output.start()
        line = None
        self.result = False
        while self.process.poll() is None:
            try:
                if not self.process_output.empty():
                    line = self.process_output.get()
                    if line:
                        if callback:
                            callback(line)
                        # print(line)
                        line = None
            except:
                pass

        error = self.process.returncode
        if error:
            print("Error Found", str(error))
            raise RuntimeError(error)
        return self.result


execute = ExecutorFlushSTDOUT()


def liveOUTPUT(line):
    print("liveOUTPUT", line)
    try:
        if "Writing" in line:
            line = ''.join([n for n in line.split(' ')[3] if n.isdigit()])
            print("percent={}".format(line))
    except Exception as e:
        pass


result = execute.start_process(cmd2, callback=liveOUTPUT, validator="Hash of data verified.")
print("Finish", result)
Use the -u Python option with subprocess.Popen() if you want to print from stdout while the process is running. (shell=True is necessary, despite the risks...)
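A sketch of that idea (child.py is a hypothetical Python script; here the command is passed as a list):
import subprocess
import sys

# -u tells the child's Python interpreter not to buffer its stdout,
# so lines arrive in the pipe as soon as they are printed.
proc = subprocess.Popen([sys.executable, "-u", "child.py"],
                        stdout=subprocess.PIPE, universal_newlines=True)
for line in proc.stdout:
    print(line, end="")
proc.wait()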
Simple is better than complex.
The os library has the built-in function system(). You can execute your command with it and see the output directly:
import os
os.system("python --version")
# Output
"""
Python 3.8.6
0
"""
After the version, the command's return value (0) is also printed.
In Python 3.6 I used this:
import subprocess

cmd = "command"
output = subprocess.call(cmd, shell=True)  # call() returns the exit code, not the text
print(output)

Getting realtime output using subprocess

I am trying to write a wrapper script for a command line program (svnadmin verify) that will display a nice progress indicator for the operation. This requires me to be able to see each line of output from the wrapped program as soon as it is output.
I figured that I'd just execute the program using subprocess.Popen, use stdout=PIPE, then read each line as it came in and act on it accordingly. However, when I ran the following code, the output appeared to be buffered somewhere, causing it to appear in two chunks, lines 1 through 332, then 333 through 439 (the last line of output)
from subprocess import Popen, PIPE, STDOUT
p = Popen('svnadmin verify /var/svn/repos/config', stdout = PIPE,
stderr = STDOUT, shell = True)
for line in p.stdout:
print line.replace('\n', '')
After looking at the documentation on subprocess a little, I discovered the bufsize parameter to Popen, so I tried setting bufsize to 1 (buffer each line) and 0 (no buffer), but neither value seemed to change the way the lines were being delivered.
At this point I was starting to grasp for straws, so I wrote the following output loop:
while True:
try:
print p.stdout.next().replace('\n', '')
except StopIteration:
break
but got the same result.
Is it possible to get 'realtime' program output of a program executed using subprocess? Is there some other option in Python that is forward-compatible (not exec*)?
I tried this, and for some reason while the code
for line in p.stdout:
...
buffers aggressively, the variant
while True:
line = p.stdout.readline()
if not line: break
...
does not. Apparently this is a known bug: http://bugs.python.org/issue3907 (The issue is now "Closed" as of Aug 29, 2018)
By setting the buffer size to 1, you essentially force the process to not buffer the output.
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, bufsize=1)
for line in iter(p.stdout.readline, b''):
print line,
p.stdout.close()
p.wait()
You can direct the subprocess output to the streams directly. Simplified example:
subprocess.run(['ls'], stderr=sys.stderr, stdout=sys.stdout)
You can try this:
import subprocess
import sys
process = subprocess.Popen(
cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE
)
while True:
out = process.stdout.read(1)
if out == '' and process.poll() != None:
break
if out != '':
sys.stdout.write(out)
sys.stdout.flush()
If you use readline instead of read, there will be some cases where the input message is not printed. Try it with a command the requires an inline input and see for yourself.
In Python 3.x the process might hang because the output is a byte array instead of a string. Make sure you decode it into a string.
Starting from Python 3.6 you can do it using the parameter encoding in Popen Constructor. The complete example:
process = subprocess.Popen(
'my_command',
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
shell=True,
encoding='utf-8',
errors='replace'
)
while True:
realtime_output = process.stdout.readline()
if realtime_output == '' and process.poll() is not None:
break
if realtime_output:
print(realtime_output.strip(), flush=True)
Note that this code redirects stderr to stdout and handles output errors.
The Streaming subprocess stdin and stdout with asyncio in Python blog post by Kevin McCarthy shows how to do it with asyncio:
import asyncio
from asyncio.subprocess import PIPE
from asyncio import create_subprocess_exec

async def _read_stream(stream, callback):
    # Read the stream line by line and hand each line to the callback
    while True:
        line = await stream.readline()
        if line:
            callback(line)
        else:
            break

async def run(command):
    # command is a list of arguments, e.g. ["ls", "-la"]
    process = await create_subprocess_exec(*command, stdout=PIPE, stderr=PIPE)
    # Read stdout and stderr concurrently so neither pipe fills up and blocks the child
    await asyncio.gather(
        _read_stream(
            process.stdout,
            lambda x: print("STDOUT: {}".format(x.decode("UTF8"))),
        ),
        _read_stream(
            process.stderr,
            lambda x: print("STDERR: {}".format(x.decode("UTF8"))),
        ),
    )
    await process.wait()

async def main():
    await run("docker build -t my-docker-image:latest .".split())

if __name__ == "__main__":
    asyncio.run(main())
Depending on the use case, you might also want to disable the buffering in the subprocess itself.
If the subprocess will be a Python process, you could do this before the call:
os.environ["PYTHONUNBUFFERED"] = "1"
Or alternatively pass this in the env argument to Popen.
Otherwise, if you are on Linux/Unix, you can use the stdbuf tool, e.g.:
cmd = ["stdbuf", "-oL"] + cmd
See also here about stdbuf or other options.
(See also here for the same answer.)
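As a rough sketch of both ideas (the child script name below is just a placeholder, and the stdbuf variant assumes a Linux/Unix system):
import os
import subprocess
import sys

# Placeholder child command; in practice this is whatever program you are wrapping.
cmd = [sys.executable, "child_script.py"]

# Option 1: for a Python child, disable its output buffering via the environment.
env = dict(os.environ, PYTHONUNBUFFERED="1")

# Option 2 (Linux/Unix, any child program): prefix with stdbuf to force line buffering.
# cmd = ["stdbuf", "-oL"] + cmd

p = subprocess.Popen(cmd, stdout=subprocess.PIPE, env=env)
for line in iter(p.stdout.readline, b''):
    print(line.decode(errors="replace"), end='')
p.wait()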
Found this "plug-and-play" function here. Worked like a charm!
import subprocess
def myrun(cmd):
    """from
    http://blog.kagesenshi.org/2008/02/teeing-python-subprocesspopen-output.html
    """
    p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)
    stdout = []
    while True:
        line = p.stdout.readline()
        stdout.append(line)
        print line,
        if line == '' and p.poll() is not None:
            break
    return ''.join(stdout)
I ran into the same problem a while back. My solution was to ditch iterating over the stream and use the read method instead, which returns immediately even if the subprocess hasn't finished executing.
I used this solution to get realtime output from a subprocess. The loop stops as soon as the process completes, removing the need for a break statement or the risk of an infinite loop.
sub_process = subprocess.Popen(my_command, close_fds=True, shell=True,
                               stdout=subprocess.PIPE, stderr=subprocess.PIPE)
while sub_process.poll() is None:
    out = sub_process.stdout.read(1)
    sys.stdout.write(out)
    sys.stdout.flush()
You may use an iterator over each byte in the output of the subprocess. This allows inline updates (lines ending with '\r' overwrite the previous output line):
import sys
from subprocess import PIPE, Popen

command = ["my_command", "-my_arg"]

# Open pipe to subprocess
proc = Popen(command, stdout=PIPE, stderr=PIPE)

# Read each byte of subprocess output as it arrives; read(1) returns b'' at EOF
while proc.poll() is None:
    for c in iter(lambda: proc.stdout.read(1), b''):
        sys.stdout.write(c.decode('ascii'))
        sys.stdout.flush()

if proc.returncode != 0:
    raise Exception("The subprocess did not terminate correctly.")
This is the basic skeleton that I always use for this. It makes it easy to implement timeouts and is able to deal with inevitable hanging processes.
import subprocess
import threading
import Queue

def t_read_stdout(process, queue):
    """Read from stdout"""
    for output in iter(process.stdout.readline, b''):
        queue.put(output)
    return

process = subprocess.Popen(['dir'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.STDOUT,
                           bufsize=1,
                           cwd='C:\\',
                           shell=True)

queue = Queue.Queue()
t_stdout = threading.Thread(target=t_read_stdout, args=(process, queue))
t_stdout.daemon = True
t_stdout.start()

while process.poll() is None or not queue.empty():
    try:
        output = queue.get(timeout=.5)
    except Queue.Empty:
        continue
    if not output:
        continue
    print(output),

t_stdout.join()
If you just want to forward the log output to the console in real time, the code below hands the parent's own stdout and stderr directly to the subprocess, so nothing gets held back in an intermediate pipe:
p = subprocess.Popen(cmd,
                     shell=True,
                     cwd=work_dir,
                     bufsize=1,
                     stdin=subprocess.PIPE,
                     stderr=sys.stderr,
                     stdout=sys.stdout)
Complete solution:
import contextlib
import subprocess

# Unix, Windows and old Macintosh end-of-line
newlines = ['\n', '\r\n', '\r']

def unbuffered(proc, stream='stdout'):
    stream = getattr(proc, stream)
    with contextlib.closing(stream):
        while True:
            out = []
            last = stream.read(1)
            # Don't loop forever
            if last == '' and proc.poll() is not None:
                break
            while last not in newlines:
                # Don't loop forever
                if last == '' and proc.poll() is not None:
                    break
                out.append(last)
                last = stream.read(1)
            out = ''.join(out)
            yield out

def example():
    cmd = ['ls', '-l', '/']
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        # Make all end-of-lines '\n'
        universal_newlines=True,
    )
    for line in unbuffered(proc):
        print line

example()
Using pexpect with non-blocking readlines will resolve this problem. The issue stems from the fact that pipes are buffered: your app's output gets buffered by the pipe, so you can't see that output until the buffer fills or the process dies.
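Not from the original answer, but as a minimal sketch of the idea on a POSIX system with pexpect installed, reusing the question's svnadmin command:
import pexpect  # third-party package: pip install pexpect

# The child runs in a pseudo-terminal instead of a pipe, so its output
# is not held back in an OS pipe buffer.
child = pexpect.spawn('svnadmin verify /var/svn/repos/config',
                      encoding='utf-8', timeout=None)
for line in child:  # pexpect spawn objects iterate line by line
    print(line.rstrip())
child.close()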
Here is what worked for me:
import subprocess
import sys

def run_cmd_print_output_to_console_and_log_to_file(cmd, log_file_path):
    make_file_if_not_exist(log_file_path)
    logfile = open(log_file_path, 'w')
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
    for line in proc.stdout:
        sys.stdout.write(line.decode("utf-8"))
        print(line.decode("utf-8").strip(), file=logfile, flush=True)
    proc.wait()
    logfile.close()
(This solution has been tested with Python 2.7.15)
You just need to sys.stdout.flush() after each line read/write:
while proc.poll() is None:
    line = proc.stdout.readline()
    sys.stdout.write(line)
    # or print(line.strip()), you still need to force the flush.
    sys.stdout.flush()
A few answers suggest Python 3.x or Python 2.x specifically; the code below will work for both:
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
stdout = []
while True:
    line = p.stdout.readline()
    if not isinstance(line, str):
        line = line.decode('utf-8')
    stdout.append(line)
    print(line)
    if line == '' and p.poll() is not None:
        break
import shlex
import subprocess

def run_command(command):
    # universal_newlines=True gives text output, so the '' end-of-stream
    # check below also works on Python 3 (readline would otherwise return bytes)
    process = subprocess.Popen(shlex.split(command),
                               stdout=subprocess.PIPE,
                               universal_newlines=True)
    while True:
        output = process.stdout.readline()
        if output == '' and process.poll() is not None:
            break
        if output:
            print(output.strip())
    rc = process.poll()
    return rc
Yet another answer! I had the following requirements:
Run some command and print the output to stdout as though the user ran it
Display to the user any prompts from the command. E.g. pip uninstall numpy will prompt with ... Proceed (Y/n)? (which does not end in a newline)
Capture the output (that the user saw) as a string
This worked for me (only tested in Python 3.10 on Windows):
def run(*args: list[str]) -> str:
    proc = subprocess.Popen(
        *args,
        text=True,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
    )
    result = ""
    while proc.poll() is None:
        output = proc.stdout.read(1)
        if output:
            sys.stdout.write(output)
            sys.stdout.flush()
            result += output
    return result
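A hypothetical call matching the pip example above might look like this (assuming pip is available on PATH):
# Streams pip's prompt to the console in real time and also captures it.
captured = run(["pip", "uninstall", "numpy"])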
These are all great examples, but I've found they either (a) handle partial lines (e.g. "Are you sure (Y/n):") but are really slow, or (b) are quick but hang on partial lines.
I've worked on the following which:
provides real-time output for both stdout and stderr to their respective streams
is extremely fast as it works with stream buffering
allows for using timeouts as it never blocks on read()
efficiently saves stdout and stderr independently
handles text encoding (though easily adaptable to binary streams)
works on Python 3.6+
import os
import subprocess
import sys
import selectors
import io

def run_command(command: str) -> (int, str):
    proc = subprocess.Popen(
        command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE
    )
    sel = selectors.DefaultSelector()
    for fobj in [proc.stdout, proc.stderr]:
        os.set_blocking(fobj.fileno(), False)
        sel.register(fobj, selectors.EVENT_READ)

    out = io.StringIO()
    err = io.StringIO()

    # loop until all descriptors removed
    while len(sel.get_map()) > 0:
        events = sel.select()
        if len(events) == 0:
            # timeout or signal, kill to prevent wait hanging
            proc.terminate()
            break
        for key, _ in events:
            # read all available data
            buf = key.fileobj.read().decode(errors='ignore')
            if buf == '':
                sel.unregister(key.fileobj)
            elif key.fileobj == proc.stdout:
                sys.stdout.write(buf)
                sys.stdout.flush()
                out.write(buf)
            elif key.fileobj == proc.stderr:
                sys.stderr.write(buf)
                sys.stderr.flush()
                err.write(buf)
    sel.close()

    proc.wait()
    if proc.returncode != 0:
        return (proc.returncode, err.getvalue())
    return (0, out.getvalue())
I didn't include the timeout logic (as the subject is real-time output), but it's simple to add a timeout to select()/wait() and no longer worry about infinite hangs.
I've timed it with cat '25MB-file'; compared to the .read(1) solutions, it's roughly 300 times faster.
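As a rough usage sketch (the command here is just an illustration, not from the original answer):
# Hypothetical usage of run_command() defined above.
returncode, captured = run_command("ls -l /")
print("exit code:", returncode)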
