I have a Python script that sends commands to a MATLAB script using subprocess.Popen. MATLAB, in turn, sends data back to Python through the pipe's stdout. Communication between Python and MATLAB should run indefinitely; however, once it has retrieved information from MATLAB, Python should run its own functions. The problem is that Python waits indefinitely for information from MATLAB. The code will make this clear:
import os
import subprocess
import time

# DEFAULT_BARRIER_FULLNAME and DEVNULL are assumed to be defined
# elsewhere in the script.

class MatlabWrapper(object):
    def __init__(self, barrier_fullname=DEFAULT_BARRIER_FULLNAME):
        self.barrier_fullname = barrier_fullname

    def run(self):
        self.process = subprocess.Popen(
            'matlab -nodisplay -nosplash', shell=True,
            stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=DEVNULL)
        #self.process.stdin.write('matlabpool 8;\n')
        self.process.stdin.flush()

    def execute(self, cmd):
        # Remove any barrier file left over from a previous command.
        try:
            os.remove(self.barrier_fullname)
        except OSError:
            pass
        # Run the command, then have MATLAB touch the barrier file.
        self.process.stdin.write(cmd)
        self.process.stdin.write('; !touch %s\n' % self.barrier_fullname)
        self.process.stdin.flush()
        # Busy-wait until the barrier file appears.
        while True:
            try:
                with open(self.barrier_fullname, 'rb') as _:
                    break
            except IOError:
                time.sleep(0)
        os.remove(self.barrier_fullname)
        # Read MATLAB's output line by line.
        while True:
            line = self.process.stdout.readline()
            if line:
                print '>>>' + line.rstrip()
            else:
                break
After creating an instance of MatlabWrapper, I call the run function, which initializes the pipe. Then I send a command for MATLAB to execute and wait until it outputs some information (using printf). After reading stdout line by line, the code stops at line = self.process.stdout.readline() and waits for more information from MATLAB.
What I want is for the execute function to return when there is no more information in stdout. How should I do this?
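One way out, sketched below, relies on assumptions that are not in the original code (the sentinel string and the fprintf call are mine): instead of waiting for the pipe to close, have MATLAB print a unique marker after each command and stop reading when the marker arrives.

# Sketch: stop reading once a sentinel printed by MATLAB arrives.
# The sentinel value is arbitrary; pick any string that no real
# command output will ever contain.
SENTINEL = '__MATLAB_CMD_DONE__'

def execute(process, cmd):
    # Ask MATLAB to echo the sentinel once the command has finished.
    process.stdin.write(cmd + "; fprintf('%s\\n');\n" % SENTINEL)
    process.stdin.flush()
    while True:
        line = process.stdout.readline()
        if not line or line.strip() == SENTINEL:
            break  # command done, or MATLAB exited
        print '>>>' + line.rstrip()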
Related
I have a command that produces an event stream: a new message every few seconds.
How do I read it as it arrives, with Python?
The standard approach with
def getMessage(command):
    lines = os.popen(command).readlines()
    return lines
waits for the command to complete, but this command runs forever: it keeps printing a new message to stdout every few seconds.
How do I pass this on to Python? I want to capture all messages in the stream.
You can read the output line by line and process/print it. Meanwhile, use p.poll() to check whether the process has ended.
import subprocess

def get_message(command):
    p = subprocess.Popen(
        command,
        stdout=subprocess.PIPE,
    )
    while True:
        output = p.stdout.readline()
        if output == '' and p.poll() is not None:
            break
        if output:
            yield output.strip()
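A usage sketch (the command here is hypothetical; any long-running, line-producing command works). Because get_message() is a generator, each message is handed to the caller as soon as its line arrives:

# Prints each new message as it is emitted, without waiting for exit.
for message in get_message(['tail', '-f', '/var/log/syslog']):
    print(message)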
I'm trying to run "docker-compose pull" from inside a Python automation script and to incrementally display the same output that Docker command would print if it was run directly from the shell. This command prints a line for each Docker image found in the system, incrementally updates each line with the Docker image's download progress (a percentage) and replaces this percentage with a "done" when the download has completed. I first tried getting the command output with subprocess.poll() and (blocking) readline() calls:
import shlex
import subprocess

def run(command, shell=False):
    p = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE, shell=shell)
    while True:
        # print one output line
        output_line = p.stdout.readline().decode('utf8')
        error_output_line = p.stderr.readline().decode('utf8')
        if output_line:
            print(output_line.strip())
        if error_output_line:
            print(error_output_line.strip())
        # check if process finished
        return_code = p.poll()
        if return_code is not None and output_line == '' and error_output_line == '':
            break
    if return_code > 0:
        print("%s failed, error code %d" % (command, return_code))

run("docker-compose pull")
The code gets stuck in the first (blocking) readline() call. Then I tried to do the same without blocking:
import select
import shlex
import subprocess
import sys
import time

def run(command, shell=False):
    p = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE, shell=shell)
    io_poller = select.poll()
    io_poller.register(p.stdout.fileno(), select.POLLIN)
    io_poller.register(p.stderr.fileno(), select.POLLIN)
    while True:
        # poll IO for output
        io_events_list = []
        while not io_events_list:
            time.sleep(1)
            io_events_list = io_poller.poll(0)
        # print new output
        for event in io_events_list:
            # must be tested because non-registered events (e.g. POLLHUP)
            # can also be returned
            if event[1] & select.POLLIN:
                if event[0] == p.stdout.fileno():
                    output_str = p.stdout.read(1).decode('utf8')
                    print(output_str, end="")
                if event[0] == p.stderr.fileno():
                    error_output_str = p.stderr.read(1).decode('utf8')
                    print(error_output_str, end="")
        # check if process finished
        # when the subprocess finishes, io_poller.poll(0) returns a list with
        # 2 select.POLLHUP events (one for stdout, one for stderr) and does
        # not enter the inner loop
        return_code = p.poll()
        if return_code is not None:
            break
    if return_code > 0:
        print("%s failed, error code %d" % (command, return_code))

run("docker-compose pull")
This works, but only the final lines (the ones ending in "done") are printed to the screen, after all the Docker image downloads have completed.
Both methods work fine with a command with simpler output, such as "ls". Maybe the problem is related to how this Docker command prints incrementally to the screen, overwriting already-written lines? Is there a safe way to incrementally show the exact output of a command on the command line when running it via a Python script?
EDIT: 2nd code block was corrected
- Always open stdin as a pipe, and if you are not using it, close it immediately.
- p.stdout.read() will block until the pipe is closed, so your polling code does nothing useful here; it needs modifications.
- I suggest not using shell=True.
- Instead of *.readline(), try *.read(1) and wait for a "\n" (see the sketch below).
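A minimal sketch of the read(1) suggestion above, under the assumption that the child writes text progress to stdout; characters are collected until a newline or a carriage return (which docker-compose uses to redraw a line in place) is seen:

import subprocess

p = subprocess.Popen(['docker-compose', 'pull'],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT)
p.stdin.close()  # stdin is not used, so close it immediately

buf = ''
while True:
    ch = p.stdout.read(1).decode('utf8')
    if ch == '':
        break  # pipe closed, the child has exited
    if ch in '\r\n':
        print(buf)
        buf = ''
    else:
        buf += ch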
Of course you can do what you want in Python; the question is how. Trouble starts because a child process might have different ideas about what its output should look like. E.g., the process might explicitly want a terminal at the other end, not your process. Or a lot of similarly simple nonsense. Buffering may also cause problems; you can try starting Python in unbuffered mode to check (/usr/bin/python -u).
If nothing works, then use the pexpect automation library instead of subprocess.
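Because pexpect makes the child believe it is talking to a terminal, it also sidesteps the pipe-buffering problem. A minimal sketch, assuming pexpect is installed (note pexpect's spawn is POSIX-only; the command is the one from the question):

import pexpect

# spawn() connects the child to a pseudo-terminal instead of a pipe,
# so the child does not switch to block buffering.
child = pexpect.spawn('docker-compose pull', encoding='utf-8', timeout=None)
while True:
    try:
        # Grab whatever output is available, up to 1024 characters.
        chunk = child.read_nonblocking(size=1024, timeout=5)
    except pexpect.TIMEOUT:
        continue  # no output for 5 seconds; keep waiting
    except pexpect.EOF:
        break     # child closed its output; it has finished
    print(chunk, end='')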
I have found a solution, based on the first code block of my question:
def run(command, shell=False):
    p = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE, shell=shell)
    while True:
        # read one char at a time
        output_line = p.stderr.read(1).decode("utf8")
        if output_line != "":
            print(output_line, end="")
        else:
            # check if process finished
            return_code = p.poll()
            if return_code is not None:
                if return_code > 0:
                    raise Exception("Command %s failed" % command)
                break
    return return_code
Notice that docker-compose uses stderr to print its progress instead of stdout. @Dalen has explained that some applications do this when they want their results to be pipeable somewhere (for instance, to a file) but still want to be able to show their progress.
I'm trying to integrate an interactive Lua shell into my Python GUI with an approach similar to the one described here: Running an interactive command from within python. The target platform, for now, is Windows. I want to be able to feed the Lua interpreter line by line.
import subprocess
import os
from queue import Queue
from queue import Empty
from threading import Thread
import time

def enqueue_output(out, queue):
    for line in iter(out.readline, b''):
        queue.put(line)
    out.close()

lua = '''\
-- comment
print("A")
test = 0
test2 = 1
os.exit()'''

command = os.path.join('lua', 'bin', 'lua.exe')
process = (subprocess.Popen(command + ' -i', shell=True,
           stdin=subprocess.PIPE, stderr=subprocess.PIPE,
           stdout=subprocess.PIPE, cwd=os.getcwd(), bufsize=1,
           universal_newlines=True))

outQueue = Queue()
errQueue = Queue()

outThread = Thread(target=enqueue_output, args=(process.stdout, outQueue))
errThread = Thread(target=enqueue_output, args=(process.stderr, errQueue))

outThread.daemon = True
errThread.daemon = True

outThread.start()
errThread.start()

script = lua.split('\n')
time.sleep(.2)
for line in script:
    while True:
        try:
            rep = outQueue.get(timeout=.2)
        except Empty:
            break
        else:  # got line
            print(rep)
    process.stdin.write(line)
The only output I receive is the very first line of the lua.exe shell. It seems that the writing to stdin doesn't actually take place. Is there anything I'm missing?
Running an external Lua file with the -i switch actually works and yields the expected output, which makes me think the issue is connected to stdin.
I experimented a bit in Python interactive mode, trying something similar to the solution featuring a file for stdout here: Interactive input/output using python. However, this only wrote the output to the file once I stopped the Python shell, which also suggests that stdin gets stalled somewhere and is only actually transmitted once I quit the shell. Any ideas what goes wrong here?
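One thing worth checking (an assumption based on the code above, not a verified fix): process.stdin.write(line) sends no newline, so the interpreter's line reader never sees a complete line, and the pipe may additionally buffer the write. Appending the newline and flushing, in the loop above, would look like:

# inside the "for line in script:" loop from the question
process.stdin.write(line + '\n')  # the interpreter reads whole lines
process.stdin.flush()             # push the data through the pipe buffer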
I'm trying to read command output from hcitool in Linux (it scans for Bluetooth devices).
I just need to read the first line that it returns, as sometimes this tool fails with an error. The issue is that this tool continues to run in an infinite loop, which locks up the rest of my Python script. The script is run with sudo so that it has root privileges to use the hcitool command.
I have created a class to try to pipe the data in asynchronously:
class ASyncThread(threading.Thread):
    # Popen's read and readline are blocking, so we must use an async
    # thread to read from hcitool.
    def __init__(self, command, parameters=[]):
        self.stdout = None
        self.stderr = None
        self.command = command
        self.parameters = parameters
        self.process = None
        threading.Thread.__init__(self)

    def run(self):
        if len(self.command) >= 1:
            self.process = subprocess.Popen([self.command] + self.parameters,
                                            stdout=subprocess.PIPE,
                                            stderr=subprocess.PIPE)
            self.stdout, self.stderr = self.process.communicate()
        else:
            print "[ASyncThread::run()] Error: Empty command given."

    def terminate(self):
        try:
            self.process.terminate()
        except Exception, ex:
            print "[ASyncThread::terminate()] Error: ", ex
And I'm calling it with:
print "Checking HCI Tool Status..."
hciThread = ASyncThread("/usr/local/bin/hciconfig", ["lescan"])
hciThread.start()
time.sleep(1) #Give the program time to run.
hciThread.terminate() #If terminate is not placed here, it locks up my Python script when the thread is joined.
hciThread.join()
outputText = hciThread.stdout + " | " + hciThread.stderr
When this is run, the output is just " | ".
If I run this command:
sudo /usr/local/bin/hcitool lescan
It starts working immediately:
slyke#ubuntu ~ $ sudo hcitool lescan
Set scan parameters failed: Input/output error
I've been working on this for a few hours now. I originally tried to do this with Popen, but read() and readline() are both blocking. This is not normally a problem, except that this command may produce no error and no data at all, so my Python script hangs. This is why I moved to threading: the main thread can wait for a second before stopping the child and continuing on.
It seems to me that you cannot possibly join a thread after you have just terminated it on the line above.
Your particular issue about doing an lescan is probably better solved with the solution from mikerr/btle-scan.py - https://gist.github.com/mikerr/372911c955e2a94b96089fbc300c2b5d
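For the narrower goal of grabbing just the first line, here is a sketch of another approach (it assumes the error, if any, arrives on the first line of the merged output; note readline() still blocks if the tool prints nothing at all):

import subprocess

proc = subprocess.Popen(['sudo', 'hcitool', 'lescan'],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT)  # merge error output
first_line = proc.stdout.readline()  # blocks only until one line arrives
proc.terminate()                     # stop the endless scan
proc.wait()                          # reap the child
print(first_line.rstrip().decode('utf8'))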
I'd like to use the subprocess module in the following way:
- create a new process that potentially takes a long time to execute
- capture stdout (or stderr, or potentially both, either together or separately)
- process data from the subprocess as it comes in, perhaps firing events on every line received (in wxPython, say) or simply printing them out for now
I've created processes with Popen, but if I use communicate() the data comes at me all at once, once the process has terminated.
If I create a separate thread that does a blocking readline() on myprocess.stdout (using stdout=subprocess.PIPE), I don't get any lines with this method either until the process terminates (no matter what I set as bufsize).
Is there a way to deal with this that isn't horrendous and works well on multiple platforms?
Update with code that appears not to work (on Windows, anyway):
class ThreadWorker(threading.Thread):
    def __init__(self, callable, *args, **kwargs):
        super(ThreadWorker, self).__init__()
        self.callable = callable
        self.args = args
        self.kwargs = kwargs
        self.setDaemon(True)

    def run(self):
        try:
            self.callable(*self.args, **self.kwargs)
        except wx.PyDeadObjectError:
            pass
        except Exception, e:
            print e

if __name__ == "__main__":
    import os
    from subprocess import Popen, PIPE

    def worker(pipe):
        while True:
            line = pipe.readline()
            if line == '':
                break
            else:
                print line

    proc = Popen("python subprocess_test.py", shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE)

    stdout_worker = ThreadWorker(worker, proc.stdout)
    stderr_worker = ThreadWorker(worker, proc.stderr)
    stdout_worker.start()
    stderr_worker.start()

    while True: pass
stdout will be buffered, so you won't get anything until that buffer fills or the subprocess exits.
You can try flushing stdout from the sub-process, using stderr instead, or switching stdout to non-buffered mode (see the sketch below).
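A sketch of the unbuffered-mode suggestion, assuming the child is itself a Python script (the script name is taken from the question's example): the -u flag stops the child interpreter from buffering its stdout, so lines show up in the pipe as soon as they are printed.

import subprocess
import sys

# -u == unbuffered child stdout; no explicit flushing needed in the child.
proc = subprocess.Popen([sys.executable, '-u', 'subprocess_test.py'],
                        stdout=subprocess.PIPE)
for line in iter(proc.stdout.readline, b''):
    print(line.rstrip().decode('utf8'))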
It sounds like the issue might be the use of buffered output by the subprocess: if a relatively small amount of output is created, it could be buffered until the subprocess exits. Some background can be found here:
Here's what worked for me:
cmd = ["./tester_script.bash"]
p = subprocess.Popen( cmd, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE )
while p.poll() is None:
out = p.stdout.readline()
do_something_with( out, err )
In your case, you could try passing a reference to the sub-process to your worker thread and doing the polling inside the thread. I don't know how it will behave when two threads poll (and interact with) the same subprocess, but it may work.
Also note that while p.poll() is None: is intended as-is. Do not replace it with while not p.poll(): in Python, 0 (the return code for successful termination) is also considered False.
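To see why, note that a clean exit code of 0 is falsy, while only None means "still running":

>>> rc = 0          # return code of a successfully terminated child
>>> bool(rc)        # falsy, so "not p.poll()" would misfire here
False
>>> rc is not None  # but the child really has terminated
True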
I've been running into this problem as well. The problem occurs because you are trying to read stderr too: if there are no errors, the read from stderr blocks.
On Windows, there is no easy way to poll() file descriptors (only Winsock sockets).
So a solution is not to try to read from stderr at all; a sketch follows.
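Merging stderr into stdout with stderr=subprocess.STDOUT means a single blocking readline() can never hang on an empty stderr pipe (the command shown is hypothetical):

import subprocess

proc = subprocess.Popen(['some_command'],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT)  # errors arrive on stdout
for line in iter(proc.stdout.readline, b''):
    print(line.rstrip().decode('utf8'))
proc.wait()  # collect the exit status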
Using pexpect [http://www.noah.org/wiki/Pexpect] with non-blocking readlines will resolve this problem. It stems from the fact that pipes are buffered, so your app's output is getting buffered by the pipe; therefore you can't get at that output until the buffer fills or the process dies.
This seems to be a well-known Python limitation; see PEP 3145 and maybe others.
Read one character at a time: http://blog.thelinuxkid.com/2013/06/get-python-subprocess-output-without.html
import contextlib
import subprocess

# Unix, Windows and old Macintosh end-of-line
newlines = ['\n', '\r\n', '\r']

def unbuffered(proc, stream='stdout'):
    stream = getattr(proc, stream)
    with contextlib.closing(stream):
        while True:
            out = []
            last = stream.read(1)
            # Don't loop forever
            if last == '' and proc.poll() is not None:
                break
            while last not in newlines:
                # Don't loop forever
                if last == '' and proc.poll() is not None:
                    break
                out.append(last)
                last = stream.read(1)
            out = ''.join(out)
            yield out

def example():
    cmd = ['ls', '-l', '/']
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        # Make all end-of-lines '\n'
        universal_newlines=True,
    )
    for line in unbuffered(proc):
        print line

example()
Using subprocess.Popen, I can run the .exe of one of my C# projects and redirect its output to my Python file. I am now able to print() all the information being output to the C# console (using Console.WriteLine()) in the Python console.
Python code:
from subprocess import Popen, PIPE, STDOUT

p = Popen('ConsoleDataImporter.exe', stdout=PIPE, stderr=STDOUT, shell=True)
while True:
    line = p.stdout.readline()
    print(line)
    if not line:
        break
This gets the console output of my .NET project line by line as it is created, and breaks out of the enclosing while loop upon the project's termination. I'd imagine this would work for two Python files as well.
I've used the pexpect module for this; it seems to work OK. http://sourceforge.net/projects/pexpect/