This question already has answers here:
How to write to stdout AND to log file simultaneously with Popen?
(3 answers)
Closed 2 years ago.
I am trying to run a shell command via subprocess.run, and I would like to redirect the output to a file, but also display it on stdout at the same time.
I have not found a way to do it, is this possible in a pure Python script?
It would be equivalent to doing some_command | tee file.txt in bash.
I could always write a wrapper bash script that invokes the Python script and calls tee, but it would be nice if Python had a way to do this directly.
You can capture the output and then send it to both stdout and a file.
Python 3.7+:
r = subprocess.run(cmds, capture_output=True, text=True)
Python 3.5+:
r = subprocess.run(cmds, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = r.stdout.decode("utf-8")  # r.stdout is bytes here; specify the encoding
Example
import subprocess
r = subprocess.run(['ls'], capture_output=True, text=True)
print(r.stdout)
with open('a.txt', 'w') as f:
    f.write(r.stdout)
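Note that `capture_output` only hands you the output after the command has finished. If you want the output to appear on the terminal live, like `tee`, a read loop over a pipe works. A minimal sketch, using a placeholder command (a small Python one-liner) in place of your own:

```python
import subprocess
import sys

# Stream the child's output line by line, writing each line to the
# terminal and to a log file at the same time (a tee-style loop).
# The command below is a placeholder; substitute your own.
cmd = [sys.executable, '-c', 'print("hello"); print("world")']
with open('file.txt', 'w') as log:
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:
        sys.stdout.write(line)   # echo to the console
        log.write(line)          # and write to the log file
    proc.wait()
```

This buffers by line, so long-running commands show progress as they go.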
Related
This question already has answers here:
read from subprocess output python
(2 answers)
Closed 2 years ago.
I'm trying to capture all output when running a Python application using subprocess. I've tried several variants using both subprocess.run and subprocess.Popen. The Python app that runs executes a Perl script, and this output is captured.
import subprocess as sp
print("some data")
print("some data")
x = sp.run(['script.py', 'some', 'options'], stdout=sp.PIPE, stderr=sp.PIPE)
proc_out = x.stdout
proc_err = x.stderr
I've also tried adding '> out 2>&1' to the list, tried with capture_output=True, tried redirecting stdout/stderr. The odd thing is that the print statements I'm trying to capture no longer display.
So, it's a Python app (whose output is captured), that uses subprocess to call another Python app (unable to capture its output), which in turn calls a Perl function (whose output is captured).
I've been through most of the threads that referenced capturing all data, but still no luck.
Any ideas?
import subprocess
command = "your command"
proc = subprocess.Popen([command], stdout=subprocess.PIPE, shell=True)
(out, err) = proc.communicate()
You need .communicate(), which writes the input, then reads all output and waits for the subprocess to exit before continuing execution in the current/main thread.
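The same pattern can be sketched with both streams captured separately; the child here is a placeholder one-liner that writes to both stdout and stderr, standing in for your script:

```python
import subprocess
import sys

# A stand-in child that writes to both streams (substitute your script).
cmd = [sys.executable, '-c',
       'import sys; print("out"); print("err", file=sys.stderr)']
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE, text=True)
out, err = proc.communicate()  # blocks until the child exits, reads to EOF
```

Unlike reading the pipes directly, communicate() drains both streams concurrently, so it cannot deadlock when the child fills one pipe's buffer.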
This question already has answers here:
Read streaming input from subprocess.communicate()
(7 answers)
Closed 4 years ago.
I am working on an Embedded application which uses some precompiled binaries from the CANBOAT repository.
There are two binaries available for usage:
actisense-serial
analyzer
In a shell, one must execute actisense-serial -r /dev/ttyUSB0 | analyzer -json to obtain information from a device connected to the USB port. The above-mentioned command dumps JSON information to STDOUT.
Sample output:
{"timestamp":"2018-08-30T16:27:23.629Z","prio":2,"src":3,"dst":255,"pgn":128259,"description":"Speed","fields":{"SID":106,"Speed Water Referenced Type":"Paddle wheel"}}
{"timestamp":"2018-08-30T16:27:23.629Z","prio":2,"src":6,"dst":255,"pgn":128259,"description":"Speed","fields":{"SID":106,"Speed Water Referenced Type":"Paddle wheel"}}
The above mentioned values keep being displayed on the STDOUT.
I wish to use the above mentioned shell commands in a python script to obtain the JSON values, to parse them and save them to a database.
Initially I want to start out with subprocess.check_output.
I tried:
import subprocess
if __name__ == "__main__":
    while True:
        value = subprocess.check_output(['actisense-serial -r /ttyUSB0',
                                         '|',
                                         'analyzer -json'],
                                        shell=True)
        print(value)
But there is no output available. I am not sure how do I route the output of the STDOUT to the check_output.
How do I achieve this, where the continuous JSON information coming from the shell commands can be parsed and used further in the application?
You can pass in a pipe to stdout and stderr when you're using Popen like this:
actisense_proc = subprocess.Popen(['actisense-serial', '-r', '/ttyUSB0'],
                                  stdout=subprocess.PIPE)
analyzer_proc = subprocess.Popen(['analyzer', '-json'], stdin=actisense_proc.stdout,
                                 stdout=subprocess.PIPE, stderr=subprocess.PIPE)

while analyzer_proc.poll() is None:
    print(analyzer_proc.stdout.readline())
Also note that instead of using shell=True, I used two Popen calls and piped the stdout of the first into the stdin of the second.
EDIT: Missed the streaming part of the question. Updated so it will constantly read from the stdout pipe. This will run until the subprocesses terminate though.
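From there, each line can be fed to json.loads as it arrives. A sketch of the parsing loop, with a small Python one-liner standing in for the actisense-serial/analyzer pipeline (which is not assumed to be installed):

```python
import json
import subprocess
import sys

# Stand-in for the analyzer process: a child that emits one JSON object
# per line, the way `analyzer -json` does. Substitute the real Popen
# pipeline from the answer above.
cmd = [sys.executable, '-c',
       'import json; print(json.dumps({"pgn": 128259, "description": "Speed"}))']
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)

records = []
for line in proc.stdout:
    record = json.loads(line)   # parse each streamed JSON line
    records.append(record)      # e.g. save to the database here
proc.wait()
```

Since the real pipeline streams forever, the loop body is where parsing and database writes would happen, one record at a time.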
This question already has answers here:
Send value of variable that is in a loop from one script to another script
(2 answers)
Closed 6 years ago.
I am trying to run code from this answer. For convenience, the code is below.
main.py
from subprocess import Popen, PIPE
p = Popen(['py', 'client.py'], stdin=PIPE, stdout=PIPE, stderr=PIPE, shell=True)
r = True
while r:
r = p.stdout.readline()
print r
client.py
def fn():
    for a in (0, 1, 2):
        print a

fn()
Output:
b'0\r\n'
b'1\r\n'
b'2\r\n'
b''
The person who contributed the answer said it works for them. However, I cannot get it to produce any output. I attempted this with Python 2.7 on Mac, Linux, and Windows.
If this works for you, please explain why it is not working for me. All I am doing is python main.py in the directory where both files are.
EDIT: the output shown above is what I am supposed to get. However, I get nothing.
What happens if you do this?
main.py
from subprocess import check_output
client_output = check_output(['python', 'client.py'])
print client_output
https://docs.python.org/2/library/subprocess.html
You should look into the Popen.communicate() method.
Maybe something like this:
main.py
from subprocess import Popen, PIPE
p = Popen(['python', 'client.py'], stdin=PIPE, stdout=PIPE, stderr=PIPE)
print p.communicate()[0]
This question already has answers here:
Running an interactive command from within Python
(3 answers)
Closed 9 years ago.
I usually write bash scripts, but I am writing one now in Python. I want to run an interactive command which asks for some user data, so ideally I would like to hand control of stdin/stdout over to bash, and then return to my Python script once the command has executed correctly.
The problem is that I haven't been able to do it with os.system, and I would also like to capture the exit status of the command that I run.
from subprocess import Popen, STDOUT, PIPE
from time import sleep
x = Popen('du -h', shell=True, stdout=PIPE, stdin=PIPE, stderr=STDOUT)
while x.poll() is None:
    sleep(0.25)
print('Command finished with the following exit status:', x.poll())
print('And this was the output given by the command:')
print(x.stdout.read())
x.stdout.close()
x.stdin.close()
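Note that redirecting stdout/stdin through pipes, as above, is exactly what prevents an interactive command from talking to the user. If the goal is to let the command use the terminal directly and only collect its exit status, simply don't redirect anything; a sketch, with a non-interactive placeholder in place of the real interactive command:

```python
import subprocess
import sys

# With no stdout/stdin/stderr arguments, the child inherits the
# parent's terminal, so an interactive command can prompt the user
# directly. call() returns the child's exit status.
# (A non-interactive placeholder command is used here.)
status = subprocess.call([sys.executable, '-c', 'print("interactive step")'])
print('exit status:', status)
```

This also makes the polling loop unnecessary, since call() blocks until the child exits.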
This question already has answers here:
Closed 13 years ago.
Possible Duplicate:
How can I capture the stdout output of a child process?
I'm running a cat-like program in bash from Python:
import os
os.system('cat foo.txt')
How do I get the output of the shell command back in the Python script, something like:
s = somefunction('cat foo.txt')
?
UPDATE: Here is a related thread.
Use the subprocess module.
from subprocess import Popen, PIPE
(stdout, stderr) = Popen(["cat","foo.txt"], stdout=PIPE).communicate()
print stdout
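In modern Python, the same capture is shorter with check_output, which returns the child's stdout directly and raises CalledProcessError on a non-zero exit status; a sketch, with a deterministic placeholder command standing in for cat foo.txt:

```python
import subprocess
import sys

# check_output runs the command, captures its stdout, and raises
# CalledProcessError if the command fails. text=True decodes to str.
# (The command is a placeholder; substitute ['cat', 'foo.txt'].)
out = subprocess.check_output(
    [sys.executable, '-c', 'print("file contents")'],
    text=True)
```

The returned string is then just `out`, ready for further processing.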