Limited buffer in Popen [duplicate] - python

This question already has answers here:
Alternatives to Python Popen.communicate() memory limitations?
(3 answers)
Closed 9 years ago.
I am launching a script from Python. The code should launch the script, which writes a file to disk, and then wait for the script to finish.
But whenever I launch the script from Python, the result file never exceeds 65768 bytes and the script stops responding. Here is what I use in Python:
import subprocess

p = subprocess.Popen(command,
                     shell=True,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT,
                     bufsize=-1)
p.wait()
where command is the command for the script.
Does anyone know a solution for this issue?
Thank you.

import subprocess
from time import sleep

p = subprocess.Popen(command,
                     shell=True,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT,
                     bufsize=-1,
                     universal_newlines=True)
output = ''
while p.poll() is None:
    output += p.stdout.readline()  # <--- This is your magic: readline keeps the newline, so don't add one
    sleep(0.025)
output += p.stdout.read()  # <--- And this is just to get the leftover data
print('Command finished')
p.stdout.close()
As @J.F. comments on almost every single Popen answer I give: you should never pass stdout=..., stdin=..., or stderr=... unless you're going to utilize them, because the pipes will fill up the OS buffer and hang your application.
But if you do use them, make sure you "tap" them once in a while.
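One way to "tap" the pipe continuously without restructuring your main loop is a background reader thread. Here is a minimal sketch; the command is a stand-in (the asker's actual script isn't shown), using Python itself as the child:

```python
import subprocess
import sys
import threading

# Stand-in for the asker's command: a Python child that prints two lines.
command = [sys.executable, "-c", "print('line1'); print('line2')"]

chunks = []

def drain(pipe):
    # Keep reading so the OS pipe buffer can never fill up and block the child.
    for line in iter(pipe.readline, b''):
        chunks.append(line)
    pipe.close()

p = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
reader = threading.Thread(target=drain, args=(p.stdout,))
reader.start()
p.wait()        # safe now: the thread keeps emptying the pipe
reader.join()
output = b''.join(chunks)
```

Because the thread empties the pipe independently, `p.wait()` can no longer deadlock no matter how much the child writes.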

Related

python subprocess not capturing stdout [duplicate]

This question already has answers here:
read from subprocess output python
(2 answers)
Closed 2 years ago.
I'm trying to capture all output when running a Python application using subprocess. I've tried several variants using both subprocess.run and subprocess.Popen. The Python app that runs executes a Perl script, and that output is captured.
import subprocess as sp

print("some data")
print("some data")
x = sp.run(['script.py', 'some', 'options'], stdout=sp.PIPE, stderr=sp.PIPE)
proc_out = x.stdout  # the captured output is on the CompletedProcess object, not the module
proc_err = x.stderr
I've also tried adding '> out 2>&1' to the list, tried capture_output=True, and tried redirecting stdout/stderr. The odd thing is that the print statements I'm trying to capture no longer display.
So: it's a Python app (whose output is captured) that uses subprocess to call another Python app (whose output I can't capture), which in turn calls a Perl function (whose output is captured).
I've been through most of the threads about capturing all output, but still no luck.
Any ideas?
import subprocess

command = "your command"
proc = subprocess.Popen([command], stdout=subprocess.PIPE, shell=True)
(out, err) = proc.communicate()  # err is None here, since stderr was not redirected
You need .communicate(), which writes any input, then reads all output and waits for the subprocess to exit before execution continues in the current/main thread.
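A minimal sketch of that pattern, with both streams piped (the command is an illustrative stand-in, since the real script.py isn't shown):

```python
import subprocess
import sys

# Stand-in child that writes to both stdout and stderr.
cmd = [sys.executable, "-c",
       "import sys; print('to stdout'); print('to stderr', file=sys.stderr)"]

proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        universal_newlines=True)
out, err = proc.communicate()  # drains both pipes fully, then waits for exit
```

Both streams are returned as strings because of universal_newlines=True; without it you would get bytes.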

Reading stdout after terminating a process

I would like to launch a process, let it run for some time and then read its output (the last output is ok, I don't need everything). I tried to use the following code:
import subprocess
import time

def connect(interface):
    proc = subprocess.Popen(['mycommand', interface],
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE,
                            universal_newlines=True)
    time.sleep(10)
    proc.terminate()
    output, err = proc.communicate()
    print(output)
Unfortunately, every time it gets stuck when reading the output. I also tried to use proc.read() instead of communicate() but it didn't solve the problem.
What's the best way to handle output in this case?
Many thanks in advance!
After some research, I found that the issue came from the buffer. As indicated in the subprocess module documentation:
This will deadlock when using stdout=PIPE or stderr=PIPE and the child
process generates enough output to a pipe such that it blocks waiting
for the OS pipe buffer to accept more data. Use Popen.communicate()
when using pipes to avoid that.
There are two solutions:
Use the bufsize argument to set the buffer to a size large enough to store all the output generated during the time you wait.
Use readline() to read the output and drain the buffer during the waiting time. If you don't need the output, just discard it.
I chose the second approach; my code is as follows:
import subprocess
import time

def connect(interface):
    stdout = []
    timeout = 60
    start_time = time.time()
    proc = subprocess.Popen(['mycommand', interface],
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE,
                            bufsize=1,
                            universal_newlines=True)
    while time.time() < start_time + timeout:
        line = proc.stdout.readline()
        stdout.append(line)
    proc.terminate()
    print(''.join(stdout))
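On Python 3.3+ there is also a third option worth noting: communicate() with a timeout, which drains both pipes while waiting and so cannot deadlock. A sketch with a stand-in command (the real ['mycommand', interface] isn't available here):

```python
import subprocess
import sys

# Stand-in for ['mycommand', interface]: a short-lived Python child.
cmd = [sys.executable, "-c", "print('connected')"]

proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        universal_newlines=True)
try:
    output, err = proc.communicate(timeout=10)  # drains the pipes while waiting
except subprocess.TimeoutExpired:
    proc.terminate()
    output, err = proc.communicate()  # collect whatever was produced so far
```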

Child Process is failing using Popen

I'm trying to get a program named xselect to run using the Popen construct in Python. If I run xselect from the terminal manually, typing the commands in by hand, it runs all the way through. However, when run from the Python script, it freezes at a certain command and will not continue. When I check the log file, all of the output is captured, but none of the error messages are captured.
I'm thinking that Popen may not know what to do with the errors in xselect's output, and that this is causing xselect to freeze. To counter this, I tried to add a timeout so that xselect is killed after 5 seconds, but this hasn't worked either.
Can anyone help me get this running?
import subprocess
from subprocess import TimeoutExpired

with subprocess.Popen(args, stdout=subprocess.PIPE,
                      stderr=subprocess.STDOUT,
                      universal_newlines=True) as proc:
    try:
        proc.wait(timeout=5)
        out = proc.stdout.read()
    except TimeoutExpired:  # wait() raises this; a bare "if TimeoutExpired:" is always true
        proc.kill()
See the warning for proc.wait() here: https://docs.python.org/2/library/subprocess.html#subprocess.Popen.wait. Basically, you should either be using proc.communicate(), or should be reading from proc.stdout instead of waiting.
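The communicate()-based variant the warning recommends might look like the sketch below; the args here are a stand-in, since the real xselect invocation isn't shown:

```python
import subprocess
import sys

# Illustrative stand-in for the xselect command line.
args = [sys.executable, "-c", "print('xselect-like output')"]

with subprocess.Popen(args, stdout=subprocess.PIPE,
                      stderr=subprocess.STDOUT,
                      universal_newlines=True) as proc:
    try:
        out, _ = proc.communicate(timeout=5)  # reads the pipe, so it cannot deadlock
    except subprocess.TimeoutExpired:
        proc.kill()
        out, _ = proc.communicate()
```

Unlike wait(), communicate() keeps emptying the pipe while it waits, so a chatty child can't fill the buffer and freeze.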

Check if subprocess is still running while reading its output line by line [duplicate]

This question already has answers here:
Read streaming input from subprocess.communicate()
(7 answers)
Closed 6 years ago.
I am starting a subprocess in Python and trying to read each line of output. Unfortunately I can't find a good way of testing whether my process is still alive. The standard method seems to be checking poll(), but that seems to always return None. Here is my code.
from subprocess import Popen, PIPE, STDOUT
import sys

proc = Popen(sys.argv[1:], stdin=PIPE, stdout=PIPE, stderr=STDOUT)
while True:
    process_line(proc.stdout.readline().decode('utf-8'))
    if not proc.poll():
        break
for line in proc.communicate()[0].splitlines():
    process_line(line.decode('utf-8'))
I've also tried using os.kill(proc.pid, 0), which works for non-spawned processes, but it seems that Python keeps a handle on the processes it starts, so os.kill(proc.pid, 0) always succeeds.
What am I doing wrong?
To process subprocess output line by line, try this:
import subprocess

p = subprocess.Popen(cmd, stdout=subprocess.PIPE, bufsize=1,
                     universal_newlines=True)
while p.poll() is None:
    line = p.stdout.readline()
    # ...do something with the line here
Notice I set the buffer size to 1 (line buffered).
Also, os.kill(proc.pid, 0) is not needed. Just call
.kill() on the subprocess you spawned. You can also call .wait() to wait for termination instead of killing the process immediately.
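With universal_newlines=True (text mode) you can also simply iterate the pipe object, which reads line by line until EOF and sidesteps poll() entirely. A small sketch with a placeholder child process:

```python
import subprocess
import sys

# Placeholder child process that prints two lines and exits.
cmd = [sys.executable, "-c", "print('a'); print('b')"]

lines = []
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, universal_newlines=True)
for line in p.stdout:          # iterating the pipe yields lines until EOF
    lines.append(line.rstrip('\n'))
p.wait()                       # reap the child; returncode is set afterwards
```

The loop ends exactly when the child closes its stdout, so there is no need to poll for liveness.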

Output of subprocess both to PIPE and directly to stdout

I found a number of questions which look like mine, but none produced a solution I can use (closest is: subprocess output to stdout and to PIPE)
The problem: I want to start a process using subprocess which takes a long time. After running the command I need to parse the stdout-output and the stderr-output.
Currently I do it as follows:
p = subprocess.Popen(command_list,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
out, error_msg = p.communicate()
print out + "\n\n" + error_msg
# next comes code in which I check out and error_msg
But the drawback of this method is that the user does not see the process's output while it is running; it is only printed at the end.
Is there a way to print the output while the command is running (as if I had run the command without stdout/stderr=subprocess.PIPE) and still have the output available via p.communicate() at the end?
Note: I'm currently developing on python 2.5 (old software release which uses this python version).
This snippet has helped me once in a similar situation:
process = subprocess.Popen(cmd, bufsize=1, universal_newlines=True,
                           stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in iter(process.stdout.readline, ''):
    print line,
    sys.stdout.flush()  # please see comments regarding the necessity of this line
process.wait()
errcode = process.returncode
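The same idea on a modern Python, extended to also keep a copy of every line so you can parse the full output at the end, as the question asks (command_list here is a placeholder child):

```python
import subprocess
import sys

# Placeholder for command_list: a child that prints progress lines.
command_list = [sys.executable, "-c", "print('step 1'); print('step 2')"]

captured = []
process = subprocess.Popen(command_list, bufsize=1, universal_newlines=True,
                           stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in iter(process.stdout.readline, ''):
    sys.stdout.write(line)    # show progress to the user as it happens
    captured.append(line)     # and keep a copy for parsing afterwards
process.wait()
full_output = ''.join(captured)
```

This is effectively a "tee": each line is echoed live and accumulated, so full_output plays the role of communicate()'s return value.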
