Subprocess Command Output? [duplicate] - python

This question already has answers here:
Running shell command and capturing the output
(21 answers)
Closed 1 year ago.
I have been trying for hours to get the output of a shell command as a string. I have tried both subprocess and os, neither of which have worked, and within subprocess I have tried check_output(), getoutput(), Popen(), communicate(), and everything else I've been able to find on this site and many others.
Sometimes I've had errors such as FileNotFoundError: [WinError 2] The system cannot find the file specified, though I have been able to fix these relatively swiftly. However, when the code does actually work and I try to print the output of the command, either it returns nothing (as in, it prints blank space), or it prints (b'', b'') or (b'', None).
decode() doesn't work, encoding doesn't change anything and I even tried:
subpr = str(process)
which, of course, did nothing.
How do you get the output of a shell command, as a string?
Other attempts:
subpr = (Popen(commandRun,shell=True,stdout=PIPE,stderr=PIPE,universal_newlines=True).communicate()[0])
process = subprocess.getoutput(commandRun)
process = subprocess.check_output(commandRun,shell=True)
process = subprocess.check_output(commandRun,stdout=PIPE,shell=True)
process = Popen(commandRun,stdout=PIPE,shell=True)
subpr = process.communicate()[0]
output = Popen(commandRun,shell=True,stdout=PIPE,stderr=PIPE)
subpr = output.communicate()
Imported:
import subprocess
from subprocess import Popen, PIPE
There is not much more code to add. I haven't written anything regarding subprocess other than that one broken line.

How are you trying to use these?
I have the following code that works, redirecting STDERR to STDOUT, because I wanted to have them merged:
import subprocess
args = ["whoami"]
run = subprocess.run(args, text=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
print(run.stdout)
If you want to pipe processes together, the best way is probably to wire the pipes through the arguments of Popen; see https://docs.python.org/3/library/subprocess.html#replacing-bin-sh-shell-command-substitution
p1 = Popen(["dmesg"], stdout=PIPE)
p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE)
p1.stdout.close() # Allow p1 to receive a SIGPIPE if p2 exits.
output = p2.communicate()[0]
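Since the question asks for a string, note that communicate() returns bytes in that recipe; a hedged follow-up, assuming the output fits the default locale encoding (on Python 3.7+ you can instead pass text=True to Popen to get str directly):
print(output.decode())  # bytes -> str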


python subprocess not capturing stdout [duplicate]

This question already has answers here:
read from subprocess output python
(2 answers)
Closed 2 years ago.
I'm trying to capture all output when running a Python application using subprocess. I've tried several variants using both subprocess.run and subprocess.Popen. The Python app that runs executes a Perl script, and that output is captured.
import subprocess as sp
print("some data")
print("some data")
x = sp.run(['script.py', 'some', 'options'], stdout=sp.PIPE, stderr=sp.PIPE)
proc_out = x.stdout
proc_err = x.stderr
I've also tried adding '> out 2>&1' to the list, tried with capture_output=True, tried redirecting stdout/stderr. The odd thing is that the print statements I'm trying to capture no longer display.
So: it's a Python app (whose output is captured) that uses subprocess to call another Python app (whose output I am unable to capture), which in turn calls a Perl function (whose output is captured).
I've been through most of the threads that referenced capturing all data, but still no luck.
Any ideas?
import subprocess
command = "your command"
proc = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True)
(out, err) = proc.communicate()
You need .communicate(), which writes any input, then reads all output and waits for the subprocess to exit before execution continues in the current/main thread.
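For completeness, a minimal sketch for the scenario in the question (the script name and arguments are just the placeholders from the question) that captures both streams as text instead of bytes:
import subprocess as sp

# Hypothetical invocation mirroring the question; universal_newlines=True
# makes communicate() return str instead of bytes.
proc = sp.Popen(['python', 'script.py', 'some', 'options'],
                stdout=sp.PIPE, stderr=sp.PIPE, universal_newlines=True)
out, err = proc.communicate()
print(out)
print(err)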

Pass variable to bash command with Python

I have the following code:
from subprocess import Popen, PIPE
p = Popen("C:/cygwin64/bin/bash.exe", stdin=PIPE, stdout=PIPE)
path = "C:/Users/Link/Desktop/folder/"
p.stdin.write(b"cd " + str.encode(path)))
p.stdin.close()
out = p.stdout.read()
print(out)
The output is b''
Is there any way to pass a variable to the bash command, as in p.stdin.write(b"cd " + path)?
I ask because the way it is written above doesn't work. The output is null, as if Cygwin started and nothing else happened.
EDIT
Since it seems the question is not clear enough, I'll add this scenario:
I am on Windows and I am using Python 3.6.
I have a bash command that requires Cygwin to be executed. This command may have a variable in its string, which will change with every execution. Imagine a for loop which executes such a command.
For example (an ImageMagick command):
convert image.jpg -resize 1024x768 output_file.jpg
How can I execute this command from Python with output_file.jpg as a variable?
Bash doesn't run in interactive mode by default unless it detects that standard input and output are connected to a terminal. You PIPEd these in, therefore they're definitely not connected to a terminal.
Bash does not display any prompts in non-interactive mode, hence you see nothing. You can force it to be interactive with -i switch.
However, even then, it is not going to write the prompts to stdout but to stderr; you can try piping stderr to stdout:
from subprocess import Popen, PIPE, STDOUT
p = Popen(["C:/cygwin64/bin/bash.exe", "-i"], stdin=PIPE, stdout=PIPE, stderr=STDOUT)
and you will capture the prompts and such.
Or use your original approach with a command that does produce output - here pwd, which prints the current working directory:
p.stdin.write(b"cd " + path.encode() + b"\n")
p.stdin.write(b"pwd\n")
It is tricky to talk to an interactive process like this, though - read too little => deadlock; write too much => deadlock. This is why Popen has the .communicate() method for providing all of the input at once and getting the stdout and stderr afterwards.
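For instance, a rough sketch of that one-shot approach, reusing the Cygwin bash path and folder path from the question:
from subprocess import Popen, PIPE, STDOUT

path = "C:/Users/Link/Desktop/folder/"
p = Popen(["C:/cygwin64/bin/bash.exe"], stdin=PIPE, stdout=PIPE, stderr=STDOUT)
# Feed the whole script at once; communicate() closes stdin, reads everything
# and waits for bash to exit, so there is no read/write deadlock to worry about.
out, _ = p.communicate(b"cd " + path.encode() + b"\npwd\n")
print(out.decode())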
As it seems you are using the Cygwin Python, you should use proper POSIX paths and not Windows-like ones.
Instead of
p = Popen("C:/cygwin64/bin/bash.exe", stdin=PIPE, stdout=PIPE)
use
p = Popen("/bin/bash.exe", stdin=PIPE, stdout=PIPE)

How do subprocess.Popen pipes work in Python?

I am fully aware of the existence of related questions, but all of them are very fuzzy and fail to clearly explain what is going on every step of the way. The examples provided are often not tested and provide no information on how to adapt them to different scenarios. Here are the questions as well as Python's documentation:
How do I use subprocess.Popen to connect multiple processes by pipes?
link several Popen commands with pipes
https://docs.python.org/2/library/subprocess.html#popen-objects
This is essentially what I am trying to achieve, but in Python:
curl http://asdf.com/89asdf.gif | convert -resize 80x80 - - | icat -k -
Here is what I have after hours of frankensteining together bits and parts of the aforementioned answers:
import requests
import subprocess
from subprocess import Popen, PIPE, STDOUT
url = 'http://asdf.com/89asdf.gif'
img = requests.get(url)
p1 = subprocess.Popen(['convert', '-resize', '80x80', '-', '-'], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
p2 = subprocess.Popen(['icat', '-k', '-'], stdin=p1.stdout, stdout=subprocess.PIPE)
p1.communicate(img.content)
print(p2.stdout)
Here is my code again, this time improved based on your answers:
import requests
from subprocess import Popen, PIPE, STDOUT
url = 'http://asdf.com/89asdf.gif'
img = requests.get(url)
p1 = Popen(['convert', '-resize', '80x80', '-', '-'], stdin=PIPE, stdout=PIPE)
p2 = Popen(['icat', '-k', '-'], stdin=p1.stdout, stdout=PIPE)
p1.stdin.write(img.content)
p1.stdin.close()
p1.stdout.close()
output = p2.communicate()[0]
print(output)
Note: icat outputs images in 256-color capable terminals. I already managed to print its output successfully in another question. A typical image, once processed by icat, is rendered as colored character blocks in the terminal.
Please correct me where I am wrong, but this is my current understanding:
p1.communicate(img.content): This sets p1's STDIN.
p1's STDIN is subprocess.PIPE, which is what p1.communicate(img.content) provides.
p2's STDIN is p1's STDOUT, which is the image resized by convert.
I then print the p2's STDOUT, which should be the image with ASCII colors provided by icat.
Questions:
I have seen both stdin=subprocess.PIPE and stdin=PIPE. What is the difference?
Why is the first STDIN provided after the other subprocesses are defined?
Could someone explain what is actually going on in the process and what is wrong with my code? This whole piping business must be simpler than it seems, and I'd love to understand how it works.
Q1. Whether you write subprocess.PIPE or just PIPE, you are referencing the same symbol: PIPE from the subprocess module. The following are identical:
# Version 1
import subprocess
proc = subprocess.Popen(['ls'], stdout=subprocess.PIPE)
output = proc.communicate()[0]
# Version 2
from subprocess import PIPE, Popen
proc = Popen(['ls'], stdout=PIPE)
output = proc.communicate()[0]
Q2. What communicate() is actually doing is sending input into p1's STDIN stream. Although the subprocesses are indeed alive and running when the Popen constructors are called, in your particular case the convert utility seems like it won't do anything until it actually receives content via STDIN. If it were a less interactive command (like ls for example) then it wouldn't wait until communicate() to do anything.
UPDATED CONTENT
In your case, instead of using communicate to send input into p1, try instead the following:
p1.stdin.write(img.content)
p1.stdin.close()
p1.stdout.close() # Protect against the busted pipe condition when p2 finishes before p1
output = p2.communicate()[0]
print(output)
See if you have better luck with that.
Q1. There is no difference; they are both subprocess.PIPE.
Q2. You don't have to do it that way; you could just as easily do
p1.stdin.write(img.content)
(in fact, that's exactly what that part of communicate() does).
communicate() is used because it blocks until the process finishes (in this case so its stdout can be made available to p2.stdin (maybe?)), whereas write() simply writes to the pipe and continues to the next line of Python in the file.
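Roughly, that part of communicate() boils down to something like this, in terms of the p1 and img from the question (a simplified sketch; the real implementation uses threads or select() to avoid deadlocks):
# roughly what p1.communicate(img.content) does, simplified
p1.stdin.write(img.content)                      # send the input
p1.stdin.close()                                 # signal EOF
out = p1.stdout.read() if p1.stdout else None    # drain any piped output
p1.wait()                                        # block until the process exits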

Catching and outputting stderr at the same time with python's subprocess

(Using python 3.2 currently)
I need to be able to:
Run a command using subprocess
Both stdout/stderr of that command need to be printed to the terminal in real-time (it doesn't matter if they both come out on stdout or stderr or whatever).
At the same time, I need a way to know if the command printed anything to stderr (and preferably what it printed).
I've played around with subprocess pipes, strange pipe redirects in bash, and tee, but so far haven't found anything that works. Is this something that's possible?
My solution:
import subprocess
process = subprocess.Popen("my command", shell=True,
stdout=None, # print to terminal
stderr=subprocess.PIPE)
duplicator = subprocess.Popen("tee /dev/stderr", shell=True, # duplicate input stream
stdin=process.stderr,
stdout=subprocess.PIPE, # catch error stream of first process
stderr=None) # print to terminal
error_stream = duplicator.stdout
print('error_stream.read() = ' + error_stream.read())
Try something like this:
import os
cmd = 'for i in 1 2 3 4 5; do sleep 5; echo $i; done'
p = os.popen(cmd)
while True:
    output = p.readline()
    print(output)
    if not output: break
In Python 2 you can catch stderr easily as well by using popen3 like this:
i, o, err = os.popen3(cmd)
but there seems to be no such function in Python 3. If you don't find a way around this, try using subprocess.Popen directly, as described here: http://www.saltycrane.com/blog/2009/10/how-capture-stdout-in-real-time-python/
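In Python 3, a hedged sketch of the same idea with subprocess.Popen, merging stderr into stdout and printing lines as they arrive:
import subprocess

cmd = 'for i in 1 2 3 4 5; do sleep 5; echo $i; done'
p = subprocess.Popen(cmd, shell=True, bufsize=1,
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                     universal_newlines=True)
for line in p.stdout:    # yields each line as soon as the child flushes it
    print(line, end='')
p.wait()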

python subprocess: "write error: Broken pipe"

I have a problem piping a simple subprocess.Popen.
Code:
import subprocess
cmd = 'cat file | sort -g -k3 | head -20 | cut -f2,3'
p = subprocess.Popen(cmd,shell=True,stdout=subprocess.PIPE)
for line in p.stdout:
    print(line.decode().strip())
Output for file ~1000 lines in length:
...
sort: write failed: standard output: Broken pipe
sort: write error
Output for file >241 lines in length:
...
sort: fflush failed: standard output: Broken pipe
sort: write error
Output for file <241 lines in length is fine.
I have been reading the docs and googling like mad but there is something fundamental about the subprocess module that I'm missing ... maybe to do with buffers. I've tried p.stdout.flush() and playing with the buffer size and p.wait(). I've tried to reproduce this with commands like 'sleep 20; cat moderatefile' but this seems to run without error.
From the recipes on subprocess docs:
# To replace shell pipeline like output=`dmesg | grep hda`
p1 = Popen(["dmesg"], stdout=PIPE)
p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE)
output = p2.communicate()[0]
This is because you shouldn't use "shell pipes" in the command passed to subprocess.Popen; you should use subprocess.PIPE instead, like this:
from subprocess import Popen, PIPE
# Without shell=True, each command must be an argument list, not a single string.
p1 = Popen(['cat', 'file'], stdout=PIPE)
p2 = Popen(['sort', '-g', '-k', '3'], stdin=p1.stdout, stdout=PIPE)
p3 = Popen(['head', '-20'], stdin=p2.stdout, stdout=PIPE)
p4 = Popen(['cut', '-f2,3'], stdin=p3.stdout, stdout=PIPE)
final_output = p4.stdout.read()
But I have to say that what you're trying to do could be done in pure Python instead of calling a bunch of shell commands.
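For example, a rough pure-Python sketch of the same pipeline (splitting on whitespace; note the real cut splits on tabs by default, so this is only an approximation):
# pure-Python take on: cat file | sort -g -k3 | head -20 | cut -f2,3
with open('file') as fh:
    rows = [line.split() for line in fh if line.strip()]
rows.sort(key=lambda r: float(r[2]))   # sort -g -k3: numeric sort on the 3rd column
for r in rows[:20]:                    # head -20
    print(r[1], r[2])                  # cut -f2,3 (cut fields are 1-based)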
I have been having the same error. I even put the pipeline in a bash script and executed that instead of the pipeline in Python. From Python it would get the broken pipe error; from bash it wouldn't.
It seems to me that the last command prior to the head (the sort) is throwing an error because its STDOUT is closed. Python must be picking up on this, whereas with the shell the error is silent. I've changed my code to consume the entire input and the error went away.
That would also explain why smaller files work: the pipe probably buffers the entire output before head exits, which would explain the breaks on larger files.
E.g., instead of a 'head -1' (in my case, I only wanted the first line), I did an awk 'NR == 1'.
There are probably better ways of doing this depending on where the 'head -X' occurs in the pipe.
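One hedged way to do that from Python is to read sort's entire output and take the first lines in Python itself, so nothing downstream closes the pipe early:
from subprocess import Popen, PIPE

p1 = Popen(['cat', 'file'], stdout=PIPE)
p2 = Popen(['sort', '-g', '-k3'], stdin=p1.stdout, stdout=PIPE)
p1.stdout.close()                        # let p1 get SIGPIPE if p2 exits first
sorted_output = p2.communicate()[0]      # consume everything sort writes
first_20 = sorted_output.decode().splitlines()[:20]   # the 'head -20', done in Python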
You don't need shell=True. Don't invoke the shell. This is how I would do it:
p = subprocess.Popen(cmd, stdout=subprocess.PIPE)  # cmd as a list of arguments, not a shell pipeline string
stdout_value = p.communicate()[0]
stdout_value  # the output
See if you still face the buffering problem after using this.
Try using communicate(), rather than reading directly from stdout.
The Python docs say this:
"Warning: Use communicate() rather than .stdin.write, .stdout.read or .stderr.read to avoid deadlocks due to any of the other OS pipe buffers filling up and blocking the child process."
http://docs.python.org/library/subprocess.html#subprocess.Popen.stdout
p = subprocess.Popen(cmd, stdout=subprocess.PIPE)
output = p.communicate()[0]
for line in output.splitlines():
    # do stuff
    pass
