Capturing the output of a command in realtime - python

I see that there are several solutions for capturing a command's output in realtime when it is invoked from Python. I have a case like this.
run_command.py
import time

for i in range(10):
    print "Count = ", i
    time.sleep(1)
check_run_command.py - this one tries to capture the run_command.py output in realtime.
import subprocess

def run_command(cmd):
    p = subprocess.Popen(
        cmd,
        shell=False,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        stdin=subprocess.PIPE
    )
    while True:
        line = p.stdout.readline()
        if line == '':
            break
        print(line.strip())

if __name__ == "__main__":
    run_command("python run_command.py".split())
$ python check_run_command.py
(Waits 10 secs) then prints the following
Count = 0
Count = 1
....
Count = 9
I am not sure why I can't capture the output in realtime in this case. I tried multiple solutions from other threads for the same problem, but they didn't help. Does the sleep in run_command.py have anything to do with this?
I tried running ls commands, but I can't tell whether the output is printed in realtime or after the process completes, because the command itself finishes too quickly. Hence I added one that sleeps.
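For reference, the delay here usually comes from the child's stdout switching to block buffering once it is attached to a pipe instead of a terminal. A minimal sketch of one common workaround (Python 3 parent; this is an illustration, not the accepted answer): run the child interpreter unbuffered with -u and read the pipe in line-buffered text mode.

import subprocess

def run_command(cmd):
    p = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        universal_newlines=True,  # decode bytes to str as lines arrive
        bufsize=1,                # line-buffered on the parent side
    )
    for line in p.stdout:         # yields each line as soon as it arrives
        print(line.strip())
    p.wait()

if __name__ == "__main__":
    # -u disables the child interpreter's own output buffering
    run_command(["python", "-u", "run_command.py"])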

Related

python - read linux command output continuously

I have a command that provides an event stream: a new message every few seconds.
How do I read this output from Python as it arrives?
The standard approach with
def getMessage(command):
    lines = os.popen(command).readlines()
    return lines
waits for the command to complete, but this command runs forever: it keeps printing a new message to stdout every few seconds.
How do I get those messages into Python as they appear? I want to capture all messages in the stream.
You can read the output line by line and process/print it; meanwhile, use p.poll() to check whether the process has ended.
import subprocess

def get_message(command):
    p = subprocess.Popen(
        command,
        stdout=subprocess.PIPE,
        universal_newlines=True,  # without this, readline() returns bytes and '' never matches
    )
    while True:
        output = p.stdout.readline()
        if output == '' and p.poll() is not None:
            break
        if output:
            yield output.strip()
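Used like this (tail -f here is just a hypothetical example of a never-ending stream), the generator yields each message as it arrives:

for message in get_message(["tail", "-f", "/var/log/syslog"]):
    print(message)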

My python script stops... no error, just stops

I am running a script that iterates through a text file. On each line of the text file there is an IP address. The script grabs the banner, then writes the IP + banner to another file.
The problem is, it just stops around 500 lines, more or less, with no error.
Another weird thing: if I run it with python3 it does what I said above. If I run it with python it iterates through those 500 lines, then starts again at the beginning. I noticed this when I saw repetitions in my output file. Anyway, here is the code; maybe you guys can tell me what I'm doing wrong:
import os
import subprocess
import concurrent.futures
#import time, random
import threading
import multiprocessing

with open("ipuri666.txt") as f:
    def multiprocessing_func():
        try:
            line2 = line.rstrip('\r\n')
            a = subprocess.Popen(["curl", "-I", line2, "--connect-timeout", "1", "--max-time", "1"], stdout=subprocess.PIPE)
            b = subprocess.Popen(["grep", "Server"], stdin=a.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
            #a.stdout.close()
            out, err = b.communicate()
            g = open("IP_BANNER2", "a")
            print("out: {0}".format(out))
            g.write(line2 + " " + "out: {0}\n".format(out))
            print("err: {0}".format(err))
        except IOError:
            print("Connection timed out")

    if __name__ == '__main__':
        #starttime = time.time()
        processes = []
        for line in f:
            p = multiprocessing.Process(target=multiprocessing_func, args=())
            processes.append(p)
            p.start()
        for process in processes:
            process.join()
If your use case allows, I would recommend just rewriting this as a shell script; there is no need to use Python. (This would likely solve your issue indirectly.)
#!/usr/bin/env bash

readarray -t ips < ipuri666.txt

for ip in "${ips[@]}"; do
    output=$(curl -I "$ip" --connect-timeout 1 --max-time 1 | grep "Server")
    echo "$ip $output" >> fisier.txt
done
The script is slightly simpler than what you are trying to do; for instance, I do not capture the error output. This should be pretty close to what you are trying to accomplish. I will update again if needed.
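If you would rather stay in Python, a likely reason the original script dies around 500 lines is that it starts one OS process per input line with no upper bound, eventually hitting process or file-descriptor limits. A sketch of a bounded alternative using multiprocessing.Pool (the helper name fetch_banner and the pool size are made up for illustration):

import subprocess
import multiprocessing

def fetch_banner(ip):
    # hypothetical helper: fetch headers with curl and keep the Server line
    try:
        out = subprocess.run(
            ["curl", "-I", ip, "--connect-timeout", "1", "--max-time", "1"],
            capture_output=True, text=True, timeout=5,
        ).stdout
        server_lines = [l for l in out.splitlines() if "Server" in l]
        return ip + " " + (server_lines[0] if server_lines else "")
    except (subprocess.TimeoutExpired, OSError):
        return ip + " Connection timed out"

if __name__ == "__main__":
    with open("ipuri666.txt") as f:
        ips = [line.strip() for line in f if line.strip()]
    with multiprocessing.Pool(20) as pool:  # at most 20 concurrent workers
        with open("IP_BANNER2", "a") as g:
            for result in pool.imap_unordered(fetch_banner, ips):
                g.write(result + "\n")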

Display process output incrementally using Python subprocess

I'm trying to run "docker-compose pull" from inside a Python automation script and to incrementally display the same output that the Docker command would print if it were run directly from the shell. This command prints a line for each Docker image found in the system, incrementally updates each line with the image's download progress (a percentage), and replaces this percentage with "done" when the download has completed. I first tried getting the command output with subprocess.poll() and (blocking) readline() calls:
import shlex
import subprocess

def run(command, shell=False):
    p = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=shell)
    while True:
        # print one output line
        output_line = p.stdout.readline().decode('utf8')
        error_output_line = p.stderr.readline().decode('utf8')
        if output_line:
            print(output_line.strip())
        if error_output_line:
            print(error_output_line.strip())
        # check if process finished
        return_code = p.poll()
        if return_code is not None and output_line == '' and error_output_line == '':
            break
    if return_code > 0:
        print("%s failed, error code %d" % (command, return_code))

run("docker-compose pull")
The code gets stuck in the first (blocking) readline() call. Then I tried to do the same without blocking:
import select
import shlex
import subprocess
import sys
import time

def run(command, shell=False):
    p = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=shell)
    io_poller = select.poll()
    io_poller.register(p.stdout.fileno(), select.POLLIN)
    io_poller.register(p.stderr.fileno(), select.POLLIN)
    while True:
        # poll IO for output
        io_events_list = []
        while not io_events_list:
            time.sleep(1)
            io_events_list = io_poller.poll(0)
        # print new output
        for event in io_events_list:
            # must be tested because non-registered events (eg POLLHUP) can also be returned
            if event[1] & select.POLLIN:
                if event[0] == p.stdout.fileno():
                    output_str = p.stdout.read(1).decode('utf8')
                    print(output_str, end="")
                if event[0] == p.stderr.fileno():
                    error_output_str = p.stderr.read(1).decode('utf8')
                    print(error_output_str, end="")
        # check if process finished
        # when the subprocess finishes, io_poller.poll(0) returns a list with 2 select.POLLHUP events
        # (one for stdout, one for stderr) and does not enter the inner loop
        return_code = p.poll()
        if return_code is not None:
            break
    if return_code > 0:
        print("%s failed, error code %d" % (command, return_code))

run("docker-compose pull")
This works, but only the final lines (with "done" at the end) are printed to the screen, once all the Docker image downloads have completed.
Both methods work fine with a command with simpler output, such as "ls". Maybe the problem is related to how this Docker command prints incrementally to the screen, overwriting already-written lines? Is there a safe way to incrementally show the exact output of a command when running it via a Python script?
EDIT: 2nd code block was corrected
Always open STDIN as a pipe, and if you are not using it, close it immediately.
p.stdout.read() will block until the pipe is closed, so your polling code does nothing useful here. It needs modifications.
I suggest not using shell=True.
Instead of *.readline(), try *.read(1) and wait for "\n".
Of course you can do what you want in Python; the question is how. Trouble starts when a child process has its own ideas about what its output should look like. E.g. the process might explicitly want a terminal at the other end, not your process. Buffering may also cause problems; you can try starting the child Python in unbuffered mode to check (/usr/bin/python -u).
If nothing works, then use the pexpect automation library instead of subprocess.
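For the pexpect route suggested above, a minimal sketch (assuming pexpect is installed; logfile_read mirrors everything read from the child to our stdout as it arrives):

import sys
import pexpect

child = pexpect.spawn("docker-compose pull", encoding="utf-8")
child.logfile_read = sys.stdout           # echo the child's output in real time
child.expect(pexpect.EOF, timeout=None)   # keep reading until the process exits
child.close()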
I have found a solution, based on the first code block of my question:
import shlex
import subprocess

def run(command, shell=False):
    p = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=shell)
    while True:
        # read one char at a time
        output_line = p.stderr.read(1).decode("utf8")
        if output_line != "":
            print(output_line, end="")
        else:
            # check if process finished
            return_code = p.poll()
            if return_code is not None:
                if return_code > 0:
                    raise Exception("Command %s failed" % command)
                break
    return return_code
Notice that docker-compose uses stderr to print its progress instead of stdout. @Dalen has explained that some applications do this when they want their results to be pipeable somewhere, for instance to a file, but still want to be able to show their progress.
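A related trick, not from the original answer: if you would rather watch a single stream, you can merge the child's stderr into its stdout when creating the process, then read just one pipe (same one-character-at-a-time style as the solution above):

import shlex
import subprocess

def run_merged(command):
    # stderr=subprocess.STDOUT routes progress and results into one pipe
    p = subprocess.Popen(shlex.split(command),
                         stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)
    while True:
        ch = p.stdout.read(1).decode("utf8")
        if ch != "":
            print(ch, end="")
        elif p.poll() is not None:
            break
    return p.poll()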

Run Python script within Python by using `subprocess.Popen` in real time

I want to run a Python script (or any executable, for that matter) from a Python script and get the output in real time. I have followed many tutorials, and my current code looks like this:
import subprocess

with open("test2", "w") as f:
    f.write("""import time
print('start')
time.sleep(5)
print('done')""")

process = subprocess.Popen(['python3', "test2"], stdout=subprocess.PIPE)
while True:
    output = process.stdout.readline()
    if output == '' and process.poll() is not None:
        break
    if output:
        print(output.strip())
rc = process.poll()
The first bit just creates the file that will be run, for clarity's sake.
I have two problems with this code:
It does not give the output in real time; it waits until the process has finished.
It does not terminate the loop once the process has finished.
Any help would be very welcome.
EDIT: Thanks to @JohnAnderson for the fix to the first problem: replacing if output == '' and process.poll() is not None: with if output == b'' and process.poll() is not None:
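With only that change applied, the read loop from the code above becomes:

while True:
    output = process.stdout.readline()
    if output == b'' and process.poll() is not None:
        break
    if output:
        print(output.strip())
rc = process.poll()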
Last night I set out to do this using a pipe:
import os
import subprocess

with open("test2", "w") as f:
    f.write("""import time
print('start')
time.sleep(2)
print('done')""")

(readend, writeend) = os.pipe()
p = subprocess.Popen(['python3', '-u', 'test2'], stdout=writeend, bufsize=0)
still_open = True
output = ""
output_buf = os.read(readend, 1).decode()
while output_buf:
    print(output_buf, end="")
    output += output_buf
    if still_open and p.poll() is not None:
        os.close(writeend)
        still_open = False
    output_buf = os.read(readend, 1).decode()
This forces buffering out of the picture and reads one character at a time (to make sure we do not block writes from the process once it has filled a buffer), closing the writing end when the process finishes so the read catches EOF correctly. Having looked at subprocess, though, that turned out to be a bit of overkill. With PIPE you get most of that for free, and I ended up with the following, which seems to work fine (call readline() as many times as necessary to keep emptying the pipe). You do not have to worry about polling the process and/or making sure the write end of the pipe is closed to correctly detect EOF and get out of the loop:
p = subprocess.Popen(['python3', '-u', 'test2'],
                     stdout=subprocess.PIPE, bufsize=1,
                     universal_newlines=True)
output = ""
output_buf = p.stdout.readline()
while output_buf:
    print(output_buf, end="")
    output += output_buf
    output_buf = p.stdout.readline()
This is a bit less "real-time", as it is basically line-buffered.
Note: I've added -u to your Python call, as you also need to make sure the called process' buffering does not get in the way.
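If you cannot add -u to the child's command line, the same effect can be had through the environment, since -u and the PYTHONUNBUFFERED variable are equivalent for Python children. A small sketch:

import os
import subprocess

env = dict(os.environ, PYTHONUNBUFFERED="1")  # same effect as passing -u
p = subprocess.Popen(['python3', 'test2'],
                     stdout=subprocess.PIPE, bufsize=1,
                     universal_newlines=True, env=env)
for line in p.stdout:
    print(line, end="")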

Use Jython Subprocess to send commands to bash shell

I need to send a number of subsequent commands to one bash shell from a Jython engine.
Executing commands one by one with os.system(s) or subprocess.call(s, ...) does not work, as a new shell is created every time.
I hope someone has an idea; the following 3 tests are not a sufficient solution.
Sample commands:
cd /home/xxx/dir1/dir2
pwd
cd ..
pwd
In this first test, the commands are executed, but the output is only retrieved at the end.
import subprocess

def testRun1():
    # Actual output:
    # run 0
    # run 1
    # run 2
    # /home/usr/dir1/dir2
    # /home/usr/dir1
    # /home/usr
    print 'All output is shown at the end...'
    proc = subprocess.Popen('/bin/bash',
                            shell=True,
                            stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE,
                            )
    for i in range(3):
        print 'run ' + str(i)
        proc.stdin.write('pwd\n')
        proc.stdin.write('cd ..\n')
    output = proc.communicate()[0]
    print output
Whereas the 'desired output' is
# run 0
# /home/usr/dir1/dir2
# run 1
# /home/usr/dir1
# run 2
# /home/usr
This second tryout delivers what we want, but the output is only shown when the Jython script is interrupted.
def testRun2():
    # Weird: it is what we want, but all output is blocked until CTRL-C is pressed
    # run 0
    # /home/usr/dir1/dir2
    # run 1
    # /home/usr/dir1
    # run 2
    # /home/usr
    proc = subprocess.Popen('/bin/bash',
                            shell=True,
                            stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE,
                            )
    for i in range(3):
        print 'run ' + str(i)
        proc.stdin.write('pwd\n')
        proc.stdin.write('cd ..\n')
        print 'start to print output'
        for line in proc.stdout:
            print(line.decode("utf-8"))
        print_remaining(proc.stdout)
    print 'printed output'
This last tryout crashes in the second run because a stream was closed.
def testRun3():
    # This fails with error:
    # ValueError: I/O operation on closed file
    proc = subprocess.Popen('/bin/bash',
                            shell=True,
                            stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE,
                            )
    for i in range(3):
        print 'run ' + str(i)
        proc.stdin.write('pwd\n')
        proc.stdin.write('cd ..\n')
        output = proc.communicate()[0]
        print output
The troubles you're having are only partially to do with subprocess. Pipes are fundamentally the wrong IPC mechanism for this job. To put an interactive command interpreter under scripted control, what you want is a pseudoterminal, and even then it's not as simple as reading and writing.
The Python standard library doesn't have any modules that do the pseudoterminal handling for you (the low-level pty module exists, but it leaves the hard parts to you), unless they've added something very recently that I'm not aware of. However, the third-party package pexpect can do it, and it's geared for exactly the thing you are trying to do.
Using the basic pexpect API:
import pexpect

def testRun4():
    proc = pexpect.spawn("/bin/bash")
    for i in range(3):
        proc.expect("^[^$#]*[$#] *")
        print("run", i)
        proc.sendline("pwd")
        proc.expect("^[^$#]*[$#] *")
        print(proc.before)
        proc.sendline("cd ..")
With pexpect.replwrap, it's a little more involved to set up but then the loop is tidier:
def testRun5():
    proc = pexpect.replwrap.REPLWrapper(
        "/bin/bash",
        orig_prompt="^[^$#]*[$#] *",
        prompt_change="PS1='{}'; PS2='{}'")
    for i in range(3):
        print("run", i)
        print(proc.run_command("pwd"))
        proc.run_command("cd ..")
