python subprocess not capturing stdout [duplicate] - python

This question already has answers here:
read from subprocess output python
(2 answers)
Closed 2 years ago.
I'm trying to capture all output when running a Python application using subprocess. I've tried several variants using both subprocess.run and subprocess.Popen. The Python app that runs executes a Perl script, and that output is captured.
import subprocess as sp
print("some data")
print("some data")
x = sp.run(['script.py', 'some', 'options'], stdout=sp.PIPE, stderr=sp.PIPE)
proc_out = x.stdout
proc_err = x.stderr
I've also tried adding '> out 2>&1' to the argument list, tried capture_output=True, and tried redirecting stdout/stderr. The odd thing is that the print statements I'm trying to capture no longer display.
So: it's a Python app (whose output is captured), which uses subprocess to call another Python app (whose output I'm unable to capture), which in turn calls a Perl function (whose output is captured).
I've been through most of the threads that referenced capturing all data, but still no luck.
Any ideas?

import subprocess

command = "your command"
proc = subprocess.Popen([command], stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE, shell=True)
(out, err) = proc.communicate()
You need .communicate(), which writes any input, reads all output, and waits for the subprocess to exit before execution continues in the current/main thread.

Related

Why python's subprocess returns unexpected value? [duplicate]

This question already has answers here:
Why does python print version info to stderr?
(2 answers)
Closed 4 years ago.
I wrote the following program as an example --
from subprocess import *
import shlex

def pipe(command):
    proc = Popen(shlex.split(command), stdout=PIPE, stderr=PIPE)
    out, err = proc.communicate()
    print "output:", out   # blank
    print "errors:", err   # expected output
    #return str(err)       # returns expected output
    return str(out)        # returns blanked output

out = pipe('python --version')
print 'pipe returned ----- %s' % out
Actually, err holds the expected value instead of out.
What is wrong with this code?
Is the subprocess module only able to handle basic OS commands?
The main error is the unfounded assumption that version information will be displayed on standard output. This is poorly standardized, but Python - like many other Unix tools - sends this output to standard error.
Somewhat less crucially, you should probably be using subprocess.run() instead of raw Popen(); and not import *.
from subprocess import run, PIPE

out = run(['python', '--version'],
          stdout=PIPE, stderr=PIPE, universal_newlines=True,
          check=True).stderr
If you like shlex you can use that to split the command into a list of two strings, though it seems rather superfluous in this case.

Call shell commands from Python's subprocess continuously and parse the output [duplicate]

This question already has answers here:
Read streaming input from subprocess.communicate()
(7 answers)
Closed 4 years ago.
I am working on an Embedded application which uses some precompiled binaries from the CANBOAT repository.
There are two binaries available for usage:
actisense-serial
analyzer
In a shell, one must run actisense-serial -r /dev/ttyUSB0 | analyzer -json to obtain information from a device connected to the USB port. This command dumps JSON information to STDOUT.
Sample output:
{"timestamp":"2018-08-30T16:27:23.629Z","prio":2,"src":3,"dst":255,"pgn":128259,"description":"Speed","fields":{"SID":106,"Speed Water Referenced Type":"Paddle wheel"}}
{"timestamp":"2018-08-30T16:27:23.629Z","prio":2,"src":6,"dst":255,"pgn":128259,"description":"Speed","fields":{"SID":106,"Speed Water Referenced Type":"Paddle wheel"}}
These values keep streaming to STDOUT.
I wish to use the shell command above in a Python script to obtain the JSON values, parse them, and save them to a database.
Initially I want to start out with subprocess.check_output.
I tried:
import subprocess

if __name__ == "__main__":
    while True:
        value = subprocess.check_output(['actisense-serial -r /ttyUSB0',
                                         '|',
                                         'analyzer -json'],
                                        shell=True)
        print(value)
But there is no output available. I am not sure how to route the STDOUT of the pipeline into check_output.
How do I achieve this, so that the continuous JSON information coming from the shell commands can be parsed and used further in the application?
You can pass in a pipe to stdout and stderr when you're using Popen like this:
actisense_proc = subprocess.Popen(['actisense-serial', '-r', '/ttyUSB0'],
                                  stdout=subprocess.PIPE)
analyzer_proc = subprocess.Popen(['analyzer', '-json'], stdin=actisense_proc.stdout,
                                 stdout=subprocess.PIPE, stderr=subprocess.PIPE)

while analyzer_proc.poll() is None:
    print(analyzer_proc.stdout.readline())
Also note that instead of using shell=True, I used two Popen calls and piped the stdout of the first into the stdin of the second.
EDIT: Missed the streaming part of the question. Updated so it constantly reads from the stdout pipe. This will run until the subprocesses terminate, though.

Limited buffer in Popen [duplicate]

This question already has answers here:
Alternatives to Python Popen.communicate() memory limitations?
(3 answers)
Closed 9 years ago.
I am launching a script from Python. The code should launch the script, which writes a file to disk, and wait for the script to finish.
But whenever I launch the script using Python, the result file doesn't exceed 65768 bytes and the script stops responding. Here is what I use in Python:
p = subprocess.Popen(command,
                     shell=True,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT,
                     bufsize=-1)
p.wait()
where command is the command for the script.
Does anyone know a solution for this issue ?
Thank you.
import subprocess
from time import sleep

p = subprocess.Popen(command,
                     shell=True,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT,
                     universal_newlines=True,
                     bufsize=-1)
output = ''
while p.poll() is None:
    output += p.stdout.readline()  # <--- This is your magic: keep draining the pipe
    sleep(0.025)
output += p.stdout.read()  # <--- And this is just to get the leftover data
print('Command finished')
p.stdout.close()
as #J.F comments on almost every single Popen answer i give, you should never define stdout=..., stdin=... or stderr=... unless you're going to utelize them.
Because they will fill up the buffer and hang your application.
But if you do, make sure you "tap" from it once in a while.

How can I execute interactive external commands? [duplicate]

This question already has answers here:
Running an interactive command from within Python
(3 answers)
Closed 9 years ago.
I usually write bash scripts, but I am writing this one in Python. The problem is that I want to run an interactive command which asks for some user data, so ideally I would like to hand control of stdin/stdout over to bash, and then return to my Python script when the command has executed correctly.
Problem is: I haven't been able to do it with os.system. I would also like to capture the exit status of the command that I run.
from subprocess import Popen, STDOUT, PIPE
from time import sleep

x = Popen('du -h', shell=True, stdout=PIPE, stdin=PIPE, stderr=STDOUT)
while x.poll() is None:
    sleep(0.25)

print('Command finished successfully with the following exit status:', x.poll())
print('And this was the output given by the command:')
print(x.stdout.read())
x.stdout.close()
x.stdin.close()

Getting stdout to display called script containing input

I was looking to implement a Python script that calls another script and captures its stdout. The called script will contain some input and output messages, e.g.:
print ("Line 1 of Text")
variable = raw_input("Input 1 :")
print "Line 2 of Text Input: ", variable
The section of the code I'm running is
import subprocess
cmd='testfile.py'
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE)
so, se = p.communicate()
print(so)
The problem is that stdout does not print until after the script has finished executing, which leaves a blank prompt waiting for the user input. Is there a way to get stdout to print while the called script is still running?
Thanks,
There are two problems here.
Firstly, python is buffering output to stdout and you need to prevent this. You could insert a call to sys.stdout.flush() in testfile.py as Ilia Frenkel has suggested, or you could use python -u to execute testfile.py with unbuffered I/O. (See the other stack overflow question that Ilia linked to.)
Secondly, you need a way of asynchronously reading data from the sub-process and then, when it is ready for input, printing the data you've read so that the prompt appears to the user. For this, it would be very helpful to have an asynchronous version of the subprocess module.
I downloaded the asynchronous subprocess and re-wrote your script to use it, along with using python -u to get unbuffered I/O:
import async_subprocess as subprocess
cmd = ['python', '-u', 'testfile.py']
p = subprocess.Popen(cmd, stdout=subprocess.PIPE)
so = p.asyncread()
print so,
(so, se) = p.communicate()
print so
When I run this script using python -u I get the following results:
$ python -u script.py
Line 1 of Text
Input 1:
and the script pauses, waiting for input. This is the desired result.
If I then type something (e.g. "Hullo") I get the following:
$ python -u script.py
Line 1 of Text
Input 1:Hullo
Line 2 of Text Input: Hullo
You don't really need to capture its stdout; just have the child program print its stuff and quit, instead of feeding the output into your parent program and printing it there. If you need variable output, just use a function instead.
But anyways, that's not what you asked.
I actually got this from another stackoverflow question:
import subprocess, sys

cmd = 'testfile.py'
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
while True:
    out = p.stdout.read(20)
    if out == '' and p.poll() is not None:
        break
    if out != '':
        sys.stdout.write(out)
        sys.stdout.flush()
First, it opens your process; then it continually reads the output from p and prints it to the screen using sys.stdout.write. The part that makes this all work is sys.stdout.flush(), which continually "flushes out" the program's output.
