Get the command line output of a thread - python

I have a main script that runs several other programs in a given order (to avoid launching them manually). Hence I use threads to call them.
The first one is a Windows application, and I call it like this:
import threading
import subprocess

class NepThread(threading.Thread):
    def run(self):
        subprocess.call('PATH_TO_PRGM.exe')

# ...
nepThread = NepThread()
nepThread.daemon = True
nepThread.start()
Then I run a Python script in much the same way:
class UsbCameraThread(threading.Thread):
    def run(self):
        subprocess.call(["python", 'PATH\\USBcamera.py'])

# ...
usbCameraThread = UsbCameraThread()
usbCameraThread.daemon = True
usbCameraThread.start()
But for this one, I need to wait until it has started before running the next script.
When the USBcamera script is ready, it prints a message to stdout and then enters an infinite loop:
print('Start Video recording!')
while True:
My question is: how can I get the command-line output to know whether the script has started?
Thanks in advance!
Dark Patate

You can probably use subprocess.Popen() to capture the stdout stream. Here is a sample of its usage:
proc = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout_stream, stderr_stream = proc.communicate()
stdout_stream = stdout_stream.decode("ascii")
Then print out the contents of stdout_stream.
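Note that communicate() only returns once the process exits, so if the camera script keeps running in its infinite loop it will never come back. A minimal sketch (an addition, not part of the original answer) that reads the output line by line and stops as soon as the marker appears, assuming the child flushes its stdout:

import subprocess

proc = subprocess.Popen(
    ["python", "PATH\\USBcamera.py"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,    # decode bytes to str (Python 3.7+; use universal_newlines=True on older versions)
    bufsize=1,    # line buffered
)

# Block only until the start-up banner appears, then move on.
for line in proc.stdout:
    if 'Start Video recording!' in line:
        break
# ...launch the next program here; the camera script keeps running in the background...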

Related

Redirecting stdout to tkinter immediately (without waiting for the process to complete)

I am writing a Python app to take some inputs from the user and call a shell script based on these inputs.
This shell script can run for quite some time, and I want to redirect the output it produces (in real time) to tkinter.
I managed to do that, but the output only appears after the shell script is completely finished, not as soon as the shell script "echos" something, for example.
So the main problem:
1. Output appears in the text widget only after shellScript.sh has exited (although if I run the same script manually in a terminal, I see continuous output).
2. Side problem: all the "\n" printed in the output show up literally as "\n" and no new lines are added in the text widget.
Here is what I have:
class RedirectText(object):
    def __init__(self, text_ctrl):
        """Constructor"""
        self.output = text_ctrl

    def write(self, string):
        self.output.insert(tk.END, string[2:])

class Gui(GuiApp):
    def __init__(self, master=None):
        redir = RedirectText(self.text_Output)
        sys.stdout = redir

    def testRun(self, event=None):
        p = subprocess.Popen(["./shellScript.sh"], stdout=subprocess.PIPE)
        print(p.communicate()[0])
Since p.communicate() waits for the process to complete, use p.poll() and p.stdout.readline() instead. Also run the process in a thread so it does not block the main (GUI) thread. (As for the side problem: the literal "\n" most likely shows up because p.communicate()[0] returns bytes and print() renders their repr; passing text=True to Popen yields str output instead.)
import threading
...
class Gui(GuiApp):
    ...
    def runScript(self):
        print('started')
        p = subprocess.Popen(['./shellScript.sh'], stdout=subprocess.PIPE, bufsize=1, text=True)
        while p.poll() is None:                 # check whether the process is still running
            msg = p.stdout.readline().strip()   # read a line from the process output
            if msg:
                print(msg)
        print('finished')

    def testRun(self, event=None):
        # create a thread to run the script
        threading.Thread(target=self.runScript, daemon=True).start()
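One caveat (an addition, not part of the original answer): Tkinter widgets are generally not thread-safe, so instead of inserting into the Text widget from the worker thread via the redirected stdout, a common pattern is to push lines onto a queue.Queue from the worker and let the GUI thread drain it with after(). A rough sketch, assuming the GuiApp base class and text_Output widget from the question:

import queue
import subprocess
import threading
import tkinter as tk

class Gui(GuiApp):                       # GuiApp and text_Output come from the question
    def __init__(self, master=None):
        # ...existing set-up from the question...
        self.line_queue = queue.Queue()
        self.poll_queue()                # start draining the queue on the GUI thread

    def runScript(self):
        p = subprocess.Popen(['./shellScript.sh'], stdout=subprocess.PIPE,
                             bufsize=1, text=True)
        for line in p.stdout:
            self.line_queue.put(line)    # the worker thread only touches the queue

    def poll_queue(self):
        while not self.line_queue.empty():
            self.text_Output.insert(tk.END, self.line_queue.get_nowait())
        self.text_Output.after(100, self.poll_queue)   # check again in 100 ms

    def testRun(self, event=None):
        threading.Thread(target=self.runScript, daemon=True).start()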

Execute Bash Command in new Process

I need to get information from a bash command which takes several seconds. I want the rest of the program to continue until I get the return code. I tried to do it with multiprocessing, but I can't get the return code of the subprocess, although the console prints the correct return code.
from multiprocessing import Process, Value
import subprocess

num = Value("d", 0.0)

class foo(object):
    def __init__(self):
        self.createProcess()

    def createProcess(self):
        p = Process(target=self.Process, args=(num,))
        p.start()
        # ...Do Stuff here in parallel...

    def Process(self, n):
        somebashParam = int(n.value)
        p = subprocess.Popen("some -command" + str(somebashParam), shell=True)
        out, err = p.communicate()
        n.value = p.returncode
Why does the console print out the right return code, but I can't grab it?
It also seems strange to me to launch a subprocess inside another new Process. Is there a better way?
External processes automatically run in parallel. If you are only interested in the return code, you don't need any additional code:
n = 23
process = subprocess.Popen(["some", "-command", str(n)])
while process.poll() is None:
    do_something_else()
result = process.wait()
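If you would rather not poll, the wait can also be pushed into a plain thread instead of a multiprocessing.Process. A minimal sketch (an addition, not from the original answer), keeping the placeholder command from the question:

import subprocess
import threading

result = {}

def run_command(n):
    # Start the external command and record its return code once it exits.
    proc = subprocess.Popen(["some", "-command", str(n)])
    result["returncode"] = proc.wait()

worker = threading.Thread(target=run_command, args=(23,), daemon=True)
worker.start()

# ...do other work here while the command runs...

worker.join()
print(result["returncode"])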

Start process and reuse with PID in python

My main process/thread starts an executable that starts waiting for a signal after echoing Algorithm loaded. I am using the subprocess.Popen class for running the executable.
Later, a thread is started that is supposed to send a signal to the previously started executable. But I have no clue how to send a signal to that particular subprocess from that thread.
Is it possible to pass PIDs and "recover" subprocesses using the PID? The purpose of reusing the process is to send it something equivalent to stdin.
Here's my code for starting the executable:
def start_module():
    cmd = '%s/libraries/OpenBR' % settings.MODULES_DIR
    process = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    while True:
        line = process.stdout.readline()
        if line.find('Algorithm loaded') > -1:
            break
    return 0
The process variable in your code refers to a Popen object, which supports a pid attribute. If you have your start_module function return the process, you can later send it a signal using os.kill. For example:
def start_module():
    cmd = '%s/libraries/OpenBR' % settings.MODULES_DIR
    process = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    while True:
        line = process.stdout.readline()
        if line.find('Algorithm loaded') > -1:
            break
    return process

p = start_module()
os.kill(p.pid, signal.SIGALRM)
As far as I can see, using a thread or not to send the signal should not make any difference. Notice that os.kill does not necessarily kill a process: it sends it a signal that the process can then handle appropriately (an ALARM signal, here).
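For illustration only (this is not part of the original answer): if the child process were itself a Python program running on a Unix-like system, handling the ALARM signal rather than being terminated by it might look like this:

import signal

def on_alarm(signum, frame):
    print("received SIGALRM, starting the real work")

signal.signal(signal.SIGALRM, on_alarm)  # install the handler
signal.pause()                           # sleep until any signal arrives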
If your intention was to pass some input to the process's stdin, then things are also easy. You just need to add stdin=subprocess.PIPE to the Popen call and print to the stdin attribute of the new process:
def start_module():
    cmd = '%s/libraries/OpenBR' % settings.MODULES_DIR
    process = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    while True:
        line = process.stdout.readline()
        if line.find('Algorithm loaded') > -1:
            break
    return process

p = start_module()
print >> p.stdin, "Hello world!"
print >> p.stdin, "How are things there?"
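The print >> statements above are Python 2 syntax. On Python 3, a roughly equivalent sketch writes to the pipe directly (p.stdin is a byte stream unless Popen was created with text=True):

p = start_module()
p.stdin.write(b"Hello world!\n")
p.stdin.write(b"How are things there?\n")
p.stdin.flush()  # make sure the lines actually reach the child process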

How do I get data from a subprocess PIPE while the subprocess is running in Python?

I've got a program on Windows that calls a bunch of subprocesses, and displays the results in a GUI. I'm using PyQt for the GUI, and the subprocess module to run the programs.
I've got the following WorkerThread, that spawns a subthread for each shell command devoted to reading the process stdout and printing the results (later I'll wire it up to the GUI).
This all works. Except proc.stdout.read(1) never returns until after the subprocess has completed. This is a big problem, since some of these subprocesses can take 15-20 minutes to run, and I need to display results as they're running.
What do I need to do to get the pipe working while the subprocess is running?
class WorkerThread(QtCore.QThread):
    def run(self):
        def sh(cmd, cwd=None):
            proc = subprocess.Popen(cmd,
                                    shell=True,
                                    stdout=subprocess.PIPE,
                                    stderr=subprocess.STDOUT,
                                    stdin=subprocess.PIPE,
                                    cwd=cwd,
                                    env=os.environ)
            proc.stdin.close()

            class ReadStdOutThread(QtCore.QThread):
                def run(_self):
                    s = ''
                    while True:
                        if self.request_exit:
                            return
                        b = proc.stdout.read(1)
                        if b == '\n':
                            print s
                            s = ''
                            continue
                        if b:
                            s += b
                            continue
                        if s:
                            print s
                        return

            thread = ReadStdOutThread()
            thread.start()

            retcode = proc.wait()
            if retcode:
                raise subprocess.CalledProcessError(retcode, cmd)
            return 0
FWIW: I rewrote the whole thing using QProcess, and I see the exact same problem. The stdout receives no data, until the underlying process has returned. Then I get everything all at once.
If you know how long the lines of the command's output will be, you can poll on the stdout PIPE of the process.
An example of what I mean:
import select
import subprocess
import threading
import os

# Some time-consuming command.
command = 'while [ 1 ]; do sleep 1; echo "Testing"; done'

# A worker thread, not as complex as yours, just to show my point.
class Worker(threading.Thread):
    def __init__(self):
        super(Worker, self).__init__()
        self.proc = subprocess.Popen(
            command, shell=True,
            stdout=subprocess.PIPE,
            stdin=subprocess.PIPE, stderr=subprocess.STDOUT
        )

    def run(self):
        self.proc.communicate()

    def get_proc(self):
        # The proc is needed to ask it for its
        # output file descriptor later.
        return self.proc

if __name__ == '__main__':
    w = Worker()
    w.start()

    proc = w.get_proc()
    pollin = select.poll()
    pollin.register(proc.stdout, select.POLLIN)

    while True:
        events = pollin.poll()
        for fd, event in events:
            if event == select.POLLIN:
                # This is the main issue with my idea:
                # if you don't know the length of the lines
                # the process outputs, this is a problem.
                # I put 7 since I know the word "Testing" has
                # 7 characters.
                print os.read(fd, 7)
Maybe this is not exactly what you're looking for, but I think it gives you a pretty good idea of how to solve your problem.
EDIT: I think I've just found what you need: Streaming stdout from a Python subprocess in Python.
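A line-oriented variant side-steps having to guess the output length. A minimal sketch (an addition, not from the original answers), assuming the command flushes complete lines:

import subprocess

command = 'while [ 1 ]; do sleep 1; echo "Testing"; done'
proc = subprocess.Popen(command, shell=True,
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                        text=True, bufsize=1)

# Read whole lines as they arrive instead of fixed-size chunks.
for line in proc.stdout:
    print(line, end='')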

showing progress while spawning and running subprocess

I need to show a progress bar or something while spawning and running a subprocess.
How can I do that with Python?
import subprocess
cmd = ['python','wait.py']
p = subprocess.Popen(cmd, bufsize=1024,stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
p.stdin.close()
outputmessage = p.stdout.read() #This will print the standard output from the spawned process
message = p.stderr.read()
I can spawn the subprocess with this code, but I need to print something out as each second passes.
Since the subprocess call is blocking, one way to print something out while waiting would be to use multithreading. Here's an example using threading._Timer:
import threading
import subprocess

class RepeatingTimer(threading._Timer):
    def run(self):
        while True:
            self.finished.wait(self.interval)
            if self.finished.is_set():
                return
            else:
                self.function(*self.args, **self.kwargs)

def status():
    print "I'm alive"

timer = RepeatingTimer(1.0, status)
timer.daemon = True  # Allows the program to exit if only the thread is alive
timer.start()

proc = subprocess.Popen(['/bin/sleep', "5"])
proc.wait()

timer.cancel()
On an unrelated note, calling stdout.read() while using multiple pipes can lead to deadlock. The subprocess.communicate() function should be used instead.
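threading._Timer is a private Python 2 name. On Python 3 the same idea can be sketched by subclassing the public threading.Timer (an addition, not from the original answer):

import subprocess
import threading

class RepeatingTimer(threading.Timer):
    # Call self.function every self.interval seconds until cancel() is called.
    def run(self):
        while not self.finished.wait(self.interval):
            self.function(*self.args, **self.kwargs)

def status():
    print("I'm alive")

timer = RepeatingTimer(1.0, status)
timer.daemon = True
timer.start()

proc = subprocess.Popen(['/bin/sleep', '5'])
proc.wait()
timer.cancel()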
As far as I can see, all you need to do is put those reads in a loop with a delay and a print. Does it have to be precisely a second, or just roughly a second?
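For example, a sketch of that idea (an addition, not from the original answer): poll the process once a second and print a progress marker until it finishes. Following the deadlock warning above, this sketch does not capture the pipes, it only shows progress:

import subprocess
import sys
import time

p = subprocess.Popen(['python', 'wait.py'])

# Print a dot every second while the child is still running.
while p.poll() is None:
    sys.stdout.write('.')
    sys.stdout.flush()
    time.sleep(1)

print('\ndone, return code:', p.returncode)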
