This question already has answers here:
Running an interactive command from within Python
(3 answers)
Closed 9 years ago.
I usually write bash scripts, but I am writing one now in Python. The problem is that I want to run an interactive command that asks for some user data, so ideally I would like to pass control of stdin and stdout to the shell and then return to my Python script once the command has executed correctly.
I haven't been able to do this with os.system, and I would also like to capture the exit status of the command that I run.
from subprocess import Popen, STDOUT, PIPE
from time import sleep
x = Popen('du -h', shell=True, stdout=PIPE, stdin=PIPE, stderr=STDOUT)
while x.poll() is None:
    sleep(0.25)
print('Command finished successfully with the following exit status:', x.poll())
print('And this was the output given by the command:')
print(x.stdout.read())
x.stdout.close()
x.stdin.close()
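A minimal sketch of what the question describes, assuming the interactive command can simply inherit the script's terminal: drop the PIPE arguments so stdin and stdout stay attached to the terminal, let the user answer the prompts, and read the exit status afterwards (du -h is just the placeholder command from the snippet above).

import subprocess

# No stdin/stdout/stderr redirection: the child talks to the terminal directly.
result = subprocess.run('du -h', shell=True)
print('Command finished with exit status:', result.returncode)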
This question already has answers here:
How to write to stdout AND to log file simultaneously with Popen?
(3 answers)
Closed 2 years ago.
I am trying to run a shell command via subprocess.run, and I would like to redirect the output to a file, but also display it on stdout at the same time.
I have not found a way to do it; is this possible in a pure Python script?
It would be equivalent to doing some_command | tee file.txt in bash.
I could always write a wrapper bash script that invokes the Python script and calls tee, but it would be nice if Python had a way to do this directly.
You can capture the output, then send it to both stdout and a file.
Python 3.7+:
r = subprocess.run(cmds, capture_output=True, text=True)
Python 3.5+:
r = subprocess.run(cmds, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = r.stdout.decode("utf-8")  # specify the encoding
Example
import subprocess
r = subprocess.run(['ls'], capture_output=True, text=True)
print(r.stdout)
with open('a.txt', 'w') as f:
    f.write(r.stdout)
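For the tee-like behaviour asked about (seeing the output while it is being written to the file), a rough sketch is to read the child's stdout line by line instead of capturing it all at once; some_command and file.txt are the placeholders from the question, and text=True assumes Python 3.7+.

import subprocess

with open('file.txt', 'w') as f:
    proc = subprocess.Popen(['some_command'], stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT, text=True)
    for line in proc.stdout:
        print(line, end='')   # echo each line to the terminal...
        f.write(line)         # ...and append it to the log file
    proc.wait()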
This question already has answers here:
read from subprocess output python
(2 answers)
Closed 2 years ago.
I'm trying to capture all output when running a Python application using subprocess. I've tried several variants using both subprocess.run and subprocess.Popen. The Python app that runs executes a perl script, and that output is captured.
import subprocess as sp
print("some data")
print("some data")
x = sp.run(['script.py', 'some', 'options'], stdout=sp.PIPE, stderr=sp.PIPE)
proc_out = x.stdout
proc_err = x.stderr
I've also tried adding '> out 2>&1' to the argument list, tried capture_output=True, and tried redirecting stdout/stderr. The odd thing is that the print statements I'm trying to capture no longer display.
So it's a Python app (whose output is captured) that uses subprocess to call another Python app (whose output I am unable to capture), which in turn calls a perl function (whose output is captured).
I've been through most of the threads that referenced capturing all data, but still no luck.
Any ideas?
import subprocess
command = "your command"
proc = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
(out, err) = proc.communicate()
You need .communicate(), which writes the input (if any), reads all of the output, and waits for the subprocess to exit before execution continues in the current/main thread.
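Applied to the nested case from the question, a sketch (not a guaranteed fix) would run the inner script with the same interpreter, force unbuffered output with -u so its prints are not held back when stdout is a pipe, and merge stderr into stdout so everything arrives in one captured stream; script.py and its options are the placeholders from the question.

import subprocess as sp
import sys

x = sp.run([sys.executable, '-u', 'script.py', 'some', 'options'],
           stdout=sp.PIPE, stderr=sp.STDOUT)
print(x.stdout.decode())   # prints from script.py and the perl output, merged into one stream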
This question already has answers here:
catching stdout in realtime from subprocess
(13 answers)
Closed 9 years ago.
I am trying to call the "git clone" command from Python. I would like the Python script to display the git command's output on the screen exactly as when it is run in a terminal, for example the percentage progress of the clone. Is there any way to do this in Python?
Take a look at Python's subprocess module: you can capture the output to a variable and then work with it. There are ways to intercept stderr and stdout. It has been available since Python 2.4 and should do the trick. I've used it in scripts that run system commands and capture their output at work. Reply if you need an example and I can dig one up from my work computer tomorrow morning...
http://docs.python.org/2/library/subprocess.html
try:
    # Needs to all be one argument for acme command...
    cmd = ["acme nw -proj " + self.based_on]
    p = subprocess.Popen(cmd,
                         cwd=place,
                         shell=True,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE)
    output, error_msgs = p.communicate()
except OSError as err:
    sys.stdout.write("ERROR {0} ({1})\n".format(err.strerror, err.errno))
    sys.exit(err.errno)

if len(error_msgs) > 0 and error_msgs != "Scanning the source base and preparing the workspace, please wait ...\n":
    sys.stdout.write("There were errors, during view create\n")
    sys.stdout.write(error_msgs)
else:
    sys.stdout.write("SUCCESS\n")
This question already has answers here:
Run a program from python, and have it continue to run after the script is killed
(5 answers)
Closed 8 years ago.
I have recently come across some situations where I want to start a command completely independently, in a different process from the script, equivalent to typing it into a terminal or, more specifically, writing the command into a .sh or .desktop file and double-clicking it. The requirements I have are:
I can close the Python window without closing the application.
The Python window does not display the stdout from the application (you don't want random text appearing in a CLI), nor do I need it!
Things I have tried:
os.system waits.
subprocess.call waits.
subprocess.Popen starts a subprocess (duh), but closing the parent process quits the application as well.
And that's basically all you can find on the web!
If it really comes down to it I could launch a .sh (or .bat for Windows), but that feels like a cheap hack and not the right way to do this.
If you were to place an & after the command when called from os.system, would that not work? For example:
import os
os.system( "google-chrome & disown " )
import subprocess
import os
with open(os.devnull, 'w') as f:
    proc = subprocess.Popen(command, shell=True, stdout=f, stderr=f)
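A rough follow-up sketch, assuming Python 3.3+ on a POSIX system: subprocess.DEVNULL replaces the explicit os.devnull handle, and start_new_session=True puts the child in its own session so closing the parent does not take it down (command is the same placeholder as above).

import subprocess

proc = subprocess.Popen(command, shell=True,
                        stdout=subprocess.DEVNULL,
                        stderr=subprocess.DEVNULL,
                        start_new_session=True)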
Spawn a child process "the UNIX way" with fork and exec. Here's a simple example of a shell in Python.
import os

prompt = '$ '
while True:
    cmds = input(prompt).split()
    if len(cmds):
        if os.fork() == 0:
            # Child process
            os.execv(cmds[0], cmds)
        else:
            # Parent process
            # (Hint: You don't have to wait, you can just exit here.)
            os.wait()
This question already has answers here:
Closed 13 years ago.
Possible Duplicate:
How can I capture the stdout output of a child process?
I'm running a cat-like program in bash from Python:
import os
os.system('cat foo.txt')
How do I get the output of the shell command back in the Python script, something like:
s = somefunction('cat foo.txt')
?
UPD: Here is a related thread.
Use the subprocess module.
from subprocess import Popen, PIPE
(stdout, stderr) = Popen(["cat","foo.txt"], stdout=PIPE).communicate()
print(stdout)
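The same idea in a shorter form, using check_output (available since Python 2.7) to get the command's stdout directly, much like the somefunction the question asks for:

import subprocess

s = subprocess.check_output(['cat', 'foo.txt']).decode()
print(s)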