I am trying to make a Python process that reads some input, processes it, and prints out the result. The processing is done by a subprocess (Stanford's NER); for illustration I will use 'cat'. I don't know exactly how much output NER will give, so I run a separate thread to collect it all and print it out. The following example illustrates.
import sys
import threading
import subprocess

# start my subprocess
cat = subprocess.Popen(
    ['cat'],
    shell=False, stdout=subprocess.PIPE, stdin=subprocess.PIPE,
    stderr=None)

def subproc_cat():
    """ Reads the subprocess output and prints it out """
    while True:
        line = cat.stdout.readline()
        if not line:
            break
        print("CAT PROC: %s" % line.decode('UTF-8'))

# a daemon thread that runs the above function
th = threading.Thread(target=subproc_cat)
th.setDaemon(True)
th.start()

# the main thread reads from stdin and feeds the subprocess
while True:
    line = sys.stdin.readline()
    print("MAIN PROC: %s" % line)
    if not line:
        break
    cat.stdin.write(bytes(line.strip() + "\n", 'UTF-8'))
    cat.stdin.flush()
This seems to work well when I enter text with the keyboard. However, if I pipe input into my script (cat file.txt | python3 my_script.py), a race condition seems to occur: sometimes I get the proper output, sometimes not, and sometimes it locks up. Any help would be appreciated!
I am running Ubuntu 14.04, Python 3.4.0. The solution should be platform-independent.
Add th.join() at the end; otherwise you may kill the thread prematurely, before it has processed all the output, when the main thread exits: daemon threads do not survive the main thread (or remove th.setDaemon(True) instead of adding th.join()).
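A minimal sketch of that fix, keeping the question's cat example: the daemon flag is dropped and th.join() is added, so the main thread waits until the reader has drained everything. The names and the list of test lines here are purely illustrative:

```python
import subprocess
import threading

cat = subprocess.Popen(['cat'], stdin=subprocess.PIPE, stdout=subprocess.PIPE)

captured = []
def reader():
    for line in cat.stdout:              # ends at EOF, i.e. when cat exits
        captured.append(line.decode('UTF-8').rstrip())

th = threading.Thread(target=reader)     # no setDaemon(True)
th.start()

for text in ("one", "two", "three"):
    cat.stdin.write(bytes(text + "\n", 'UTF-8'))
cat.stdin.close()                        # EOF on stdin makes cat exit
cat.wait()
th.join()                                # wait for the reader to drain everything
print(captured)
```

With the join in place it no longer matters whether input arrives from a keyboard or from a pipe; the reader always gets to finish.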
I followed the accepted answer for this question, A non-blocking read on a subprocess.PIPE in Python, to read non-blocking from a subprocess. This generally works fine, except when the process I call terminates quickly.
This is on Windows.
To illustrate, I have a bat file that simply writes one line to stdout:
test.bat:
@ECHO OFF
ECHO Fast termination
And here is the Python code, adapted from the above-mentioned answer:
from subprocess import PIPE, Popen
from threading import Thread
from queue import Queue, Empty

def enqueue_output(out, queue):
    for line in iter(out.readline, ''):  # text-mode stream: sentinel is '', not b''
        queue.put(line)
    out.close()

p = Popen(['test.bat'], stdout=PIPE, bufsize=-1,
          text=True)
q = Queue()
t = Thread(target=enqueue_output, args=(p.stdout, q))
t.daemon = True  # thread dies with the program
t.start()

output = str()
while True:
    try:
        line = q.get_nowait()
    except Empty:
        line = ""
    output += line
    if p.poll() is not None:
        break
print(output)
Sometimes the line from the bat file is correctly captured and printed, sometimes nothing is captured and printed. I suspect that the subprocess might finish before the thread connects the queue to the pipe, and then nothing is read. If I add a little wait of 2 seconds in the bat file before echoing the line, it seems to always work. Likewise, the behavior can be forced by adding a little sleep after the Popen in the Python code. Is there a way to reliably capture the output of the subprocess, even if it finishes immediately, while still doing a non-blocking read?
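One way to make this reliable, sketched under the assumption that non-blocking reads are only needed while the child is still running: once the child has exited, join the reader thread and then drain whatever is left in the queue. The output sits in the pipe even if the child is already gone, so the reader thread still sees it before hitting EOF. A portable inline child stands in for test.bat here:

```python
import sys
from queue import Queue, Empty
from subprocess import PIPE, Popen
from threading import Thread

def enqueue_output(out, queue):
    for line in iter(out.readline, ''):   # '' sentinel for a text-mode stream
        queue.put(line)
    out.close()

# Stand-in for test.bat (assumption): prints one line and exits immediately.
p = Popen([sys.executable, '-c', "print('Fast termination')"],
          stdout=PIPE, text=True)
q = Queue()
t = Thread(target=enqueue_output, args=(p.stdout, q))
t.start()

p.wait()       # the child may exit at once; its output is still in the pipe
t.join()       # the reader hits EOF only after it has read everything
output = ""
while True:
    try:
        output += q.get_nowait()
    except Empty:
        break
print(output, end='')
```

During the child's lifetime you can still poll the queue with get_nowait(); the join-and-drain step only runs after poll()/wait() reports the exit.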
I have some GPU test software I'm trying to automate using Python 3. The test would normally be run for 3 minutes and then cancelled by a user pressing Ctrl+C, generating the following output.
After exiting with Ctrl+C, the test can then be run again with no issue.
When trying to automate this with subprocess.Popen and sending SIGINT or SIGTERM, I don't get the same behavior as when keyboard entry is used. The script exits abruptly, and on subsequent runs it can't find the GPUs (I assume it's not unloading the driver properly).
from subprocess import Popen, PIPE
from signal import SIGINT
from time import time

def check_subproc_alive(subproc):
    return subproc.poll() is None

def print_subproc(subproc, timer=True):
    start_time = time()
    while check_subproc_alive(subproc):
        line = subproc.stdout.readline().decode('utf-8')
        print(line, end="")
        if timer and (time() - start_time) > 10:
            break

subproc = Popen(['./gpu_test.sh', '-t', '1'], stdin=PIPE, stdout=PIPE, stderr=PIPE, shell=False)
print_subproc(subproc)
subproc.send_signal(SIGINT)
print_subproc(subproc, False)
How can I send ctrl+c to a subprocess as if a user typed it?
UPDATE:
import subprocess

def start(executable_file):
    return subprocess.Popen(
        executable_file,
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE
    )

def read(process):
    return process.stdout.readline().decode("utf-8").strip()

def write(process):
    process.stdin.write('\x03'.encode())
    process.stdin.flush()

def terminate(process):
    process.stdin.close()
    process.terminate()
    process.wait(timeout=0.2)

process = start("./test.sh")
write(process)
for x in range(100):
    print(read(process))
terminate(process)
I tried the above code and can get characters to register with a dummy sh script; however, sending the \x03 character just sends an empty char and doesn't end the script.
I think you can probably use something like this:
import signal

try:
    p = subprocess...
except KeyboardInterrupt:
    p.send_signal(signal.SIGINT)
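On POSIX systems, send_signal(signal.SIGINT) by itself is usually enough to emulate Ctrl+C, provided the child actually handles SIGINT (writing \x03 into a pipe does nothing, because no terminal is there to translate it into a signal). A self-contained sketch with a hypothetical inline child standing in for the real test script; whether the actual GPU tool reacts the same way may depend on it having a controlling terminal, which a plain pipe does not provide:

```python
import signal
import subprocess
import sys

# Hypothetical child standing in for the real GPU test: it loops until
# SIGINT arrives, then runs its cleanup path.
child = r"""
import time
try:
    print("running", flush=True)
    while True:
        time.sleep(0.1)
except KeyboardInterrupt:
    print("cleaned up", flush=True)
"""

p = subprocess.Popen([sys.executable, '-c', child],
                     stdout=subprocess.PIPE, text=True)
p.stdout.readline()               # wait until the child is actually up
p.send_signal(signal.SIGINT)      # the POSIX equivalent of pressing Ctrl+C
out, _ = p.communicate(timeout=5)
print(out.strip())                # the child got to run its cleanup handler
```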
The following solution is the only one I could find that works on Windows and bears the closest resemblance to sending a Ctrl+C event.
import os
import signal

os.kill(self.p.pid, signal.CTRL_C_EVENT)
The main script starts the second script in a new subprocess, plus a thread that continuously checks the stdout for data. The second script asks for input. I would like to have the first script ask for user input and then pass it to the second script. I'm developing on Windows and couldn't get pexpect to work.
test.py - main script
import threading
import subprocess

def read_output(process):
    print("starting to read")
    for line in process.stdout:
        print(line.rstrip())

def write_output(process, s):
    process.stdin.write(s.encode('utf-8'))
    process.stdin.flush()

process = subprocess.Popen('python test2.py', shell=False,
                           stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                           stderr=None)

# Create new threads
thread1 = threading.Thread(read_output(process))
# Start new Threads
thread1.daemon = True
thread1.start()

s = input("test input:")
print("yep:" + s)
thread1.process.stdin.write(s.encode('utf-8'))
thread1.process.stdin.flush()
test2.py - second script
print("Enter an input A,B,C:")
s=input("")
print("you selected:"+s)
First mistake: wrong args when creating the thread. You're passing the result of the function, which is called in the main process: the thread isn't even started yet, and you read the output in the main thread, not in the started thread.
Fix it like this:
thread1 = threading.Thread(target=read_output,args=(process,))
Second mistake (or rather, the reason the program doesn't continue): you must close the process's stdin after writing a string to it:
process.stdin.close()
Fixed test1.py file:
import threading
import subprocess

def read_output(process):
    print("starting to read")
    for line in process.stdout:
        print(line.rstrip())

process = subprocess.Popen('python test2.py', shell=False,
                           stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                           stderr=None)

# Create new thread: pass target and argument
thread1 = threading.Thread(target=read_output, args=(process,))
# Start new Threads
thread1.daemon = True
thread1.start()

s = input("test input:")
print("yep:" + s)
process.stdin.write(s.encode('utf-8'))
process.stdin.write("\r\n".encode('utf-8'))  # emulate "ENTER" in thread
process.stdin.close()  # close standard input or thread doesn't terminate
thread1.join()  # wait for thread to finish
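If the whole exchange is a single write followed by reading everything the child prints, the reader thread can be dropped entirely: communicate() performs exactly the write/close/drain/wait sequence described above. A sketch, with the second script inlined so the example is self-contained (the inline child mimics test2.py's assumed prompt-and-echo behavior):

```python
import subprocess
import sys

# Inline stand-in for test2.py (assumption: prompt, read one line, echo it back).
child = 'print("Enter an input A,B,C:"); s = input(); print("you selected:" + s)'

p = subprocess.Popen([sys.executable, '-c', child],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
out, _ = p.communicate("B\n")   # write, close stdin, read all output, wait
print(out, end='')
```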
Why does communicate kill my process? I want an interactive process, but communicate does something so that I can no longer read input with raw_input in my process.
from sys import stdin
from threading import Thread
from time import sleep

if __name__ == '__main__':
    print("Still Running\n")
    x = raw_input()
    i = 0
    while ('n' not in x):
        print("Still Running " + str(i) + " \r\n")
        x = raw_input()
        i += 1
    print("quit")

print(aSubProc.theProcess.communicate('y'))
print(aSubProc.theProcess.communicate('y'))
exception!
self.stdin.write(input)
ValueError: I/O operation on closed file
The communicate and wait methods of Popen objects close the pipes once the process returns. If you want to stay in communication with the process, try something like this:
import subprocess
proc = subprocess.Popen("some_process", stdout=subprocess.PIPE, stdin=subprocess.PIPE)
proc.stdin.write("input")
proc.stdout.readline()
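In Python 3, the same keep-the-pipe-open pattern looks like the sketch below; the flush matters, or the line can sit in the parent's buffer while readline blocks. The inline echo-style child and all names are illustrative only:

```python
import subprocess
import sys

# Hypothetical echo-style child used only for illustration.
child = r"""
import sys
for line in sys.stdin:
    print("got " + line.strip(), flush=True)
"""

proc = subprocess.Popen([sys.executable, '-c', child],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
replies = []
for word in ("y", "y", "n"):
    proc.stdin.write(word + "\n")
    proc.stdin.flush()                    # push the line through the pipe now
    replies.append(proc.stdout.readline().strip())
proc.stdin.close()                        # signal EOF only when done talking
proc.wait()
print(replies)
```

Each iteration is one round trip: write a line, flush, read the child's answer; stdin is closed only after the whole conversation.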
Why does communicate kill my process?
From the docs for Popen.communicate(input=None, timeout=None):
Interact with process: Send data to stdin. Read data from stdout and
stderr, until end-of-file is reached. Wait for process to terminate.
emphasis mine

You may call .communicate() only once. This means that you should provide all the input at once:
#!/usr/bin/env python
import os
import sys
from subprocess import Popen, PIPE
p = Popen([sys.executable, 'child.py'], stdin=PIPE, stdout=PIPE)
print p.communicate(os.linesep.join('yyn'))[0]
Output
Still Running
Still Running 0
Still Running 1
quit
Notice the doubled newlines: one from '\r\n' and another from print statement itself in your script for the child process.
Output shows that the child process received three input lines successfully ('y', 'y', and 'n').
Here's similar code using subprocess.check_output()'s input parameter, available since Python 3.4:
#!/usr/bin/env python3.4
import os
import sys
from subprocess import check_output
output = check_output(['python2', 'child.py'], universal_newlines=True,
                      input='\n'.join('yyn'))
print(output, end='')
It produces the same output.
If you want to provide different input depending on the responses from the child process, then use the pexpect module or its analogs, to avoid the issues mentioned in Why not just use a pipe (popen())?
I have a problem with subprocess code. subprocess.Popen() works fine, but when I try to read its output through stdout.read() there is no value to read.
import os
import signal
import subprocess
import threading
import sys
import commands

print commands.getoutput("hcitool dev")
print 'down'
commands.getoutput('hciconfig hci1 down')
print 'up'
commands.getoutput('hciconfig hci1 up')
commands.getoutput('killall hcitool')

stop = False
ping = subprocess.call('hcitool lescan', shell = False,
                       stdout=subprocess.PIPE, executable='/bin/bash')
for i in ping.stdout:
    print i

def kill():
    global stop
    stop = True
    os.kill(ping.pid, signal.SIGTERM)

threading.Timer(5, kill).start()

#while not stop:
#    print 'now in while not loop'
#    sys.stdout.write(ping.stdout.read(1))

print 'trying to print stdout'
out, err = ping.communicate()
print "out", out
#result = out.decode()
print "Result : ", result
This code works fine and produces output when I change hcitool lescan to ping www.google.com, but with hcitool lescan it either hangs forever or produces no output. Help is appreciated!
None of the above answers worked for me; the script hung forever in the hcitool scan. So finally I wrote a shell script and called it from my Python code. This works fine for me, and I read the output from the file "result.txt".
hcitool lescan > result.txt &
sleep 5
pkill --signal SIGINT hcitool
There are multiple errors in your code, e.g., subprocess.call() returns an integer (the exit status of the program), and an integer has no .stdout attribute; also, the combination of shell=False and a non-None executable is very rarely useful (and it is probably used incorrectly in this case).
The simplest way to fix the code is to use check_output():
from subprocess import check_output as qx
output = qx(["hcitool", "lescan"]) # get all output at once
print output,
As an alternative, you could print the program's output line by line, as soon as its stdout is flushed:
from subprocess import Popen, PIPE

proc = Popen(["hcitool", "lescan"], stdout=PIPE, bufsize=1)  # start process
for line in iter(proc.stdout.readline, b''):  # read output line-by-line
    print line,
# reached EOF, nothing more to read
proc.communicate()  # close `proc.stdout`, wait for child process to terminate
print "Exit status", proc.returncode
To kill a subprocess, you could use its .kill() method e.g.:
from threading import Timer

def kill(process):
    try:
        process.kill()
        process.wait()  # to avoid zombies
    except OSError:  # ignore errors
        pass

Timer(5, kill, [proc]).start()  # kill in 5 seconds
Thank you very much, but the problem is that hcitool lescan never stops, and hence it hangs in the very next line of your code.
I found a similar solution; here it is. This works fine and I don't have to kill the subprocess. This code takes some extra time to pour out its output, but the following works precisely:
from os import kill
import signal
import subprocess
import threading
import tempfile
import sys
import time
from tempfile import TemporaryFile
import commands

t = TemporaryFile()
global pipe_output

print commands.getoutput("hcitool dev")
print 'down'
commands.getoutput('hciconfig hci0 down')
print 'up'
commands.getoutput('hciconfig hci0 up')
print commands.getoutput("hcitool dev")
commands.getoutput('killall hcitool')

p = subprocess.Popen('hcitool lescan', bufsize=0, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
time.sleep(10)
#os.kill(p.pid, signal.SIGTERM)
for i in range(0, 30, 1):
    print 'for'
    inchar = p.stdout.readline()
    i += 1
    if inchar:
        print 'loop num:', i
        print str(inchar)
        t.write(str(inchar))
print 'out of loop'
t.seek(0)
print t.read()
Any help on how to reduce the waiting time, other than just changing time.sleep(), is appreciated.
Thank you all!
Use the Popen class instead of call. hcitool lescan runs forever; subprocess.call waits for the command to finish before returning, while Popen does not wait.
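A minimal sketch of the difference, with a hypothetical never-ending inline child standing in for hcitool lescan: Popen returns immediately, so you can stream the output and stop the child yourself, whereas call would block here forever.

```python
import subprocess
import sys

# Never-ending stand-in for `hcitool lescan` (assumption): prints a line forever.
child = "import time\nwhile True:\n    print('dev', flush=True)\n    time.sleep(0.1)"

proc = subprocess.Popen([sys.executable, '-c', child],
                        stdout=subprocess.PIPE, text=True)  # returns at once
first = proc.stdout.readline().strip()  # stream output while the child runs
proc.kill()                             # stop the never-ending child ourselves
proc.wait()                             # reap it to avoid a zombie
print(first)
```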