The code below yields fewer IP addresses than running arp -a directly in cmd:
from subprocess import Popen, PIPE, STDOUT

arpA_req = Popen('arp -a', stdin=PIPE, stdout=PIPE, stderr=STDOUT)
line = arpA_req.stdout.readline().decode('ascii').rsplit()
print(line)
Does anyone know why this might be? And if it's a common issue, how can I obtain the full IP list?
As wim pointed out, readline() only reads one line.
To read all the output, one way is to call communicate:
import subprocess
PIPE, STDOUT = subprocess.PIPE, subprocess.STDOUT
arpA_req = subprocess.Popen(
    ['arp', '-a'], stdin=PIPE, stdout=PIPE, stderr=STDOUT)
out, err = arpA_req.communicate()
print(out)
Or, to process one line at a time, a standard idiom is to use iter(func, sentinel). Note that the sentinel must be b'' here, since the stream is in bytes mode:
for line in iter(arpA_req.stdout.readline, b''):
    print(line)
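For completeness, a Python 3 sketch of the same line-by-line idiom, with a small Python child process standing in for arp -a so it runs anywhere (the child command is an assumption, not part of the original question). With text=True the lines are str, so the iter() sentinel becomes '':

```python
import subprocess
import sys

# A small Python child stands in for `arp -a` so the sketch is portable.
child = "for i in range(3): print('entry', i)"
proc = subprocess.Popen(
    [sys.executable, '-c', child],
    stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
    text=True)  # text=True: str lines, so the iter() sentinel is ''

lines = []
for line in iter(proc.stdout.readline, ''):
    lines.append(line.rstrip())
proc.wait()
print(lines)
```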
Related
I'm trying to run a subprocess and watch its stdout until I find a desired string.
This is my code:
from subprocess import Popen, PIPE

def waitForAppOutput(proc, word):
    for stdout_line in iter(proc.stdout.readline, b''):
        print stdout_line
        if word in stdout_line.rstrip():
            return

p = Popen(["./app.sh"], shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE)
waitForAppOutput(p, "done!")
The issue here is that for some reason waitForAppOutput stops printing stdout a few lines before "done!", which is the last line that should appear in the stdout. I assume iter(proc.stdout.readline, b'') is blocking and readline is not able to read the last lines of the stdout.
Any idea what the issue is here?
You have a misspelling: it should be waitForAppOutput instead of waitForAppOutout. How does this even run at all? And when you are invoking a command using a shell, you should not be passing an array of strings but rather one single string.
Normally one should use the communicate method on the subprocess object returned by the Popen call to prevent potential deadlocks (which seems to be what you are experiencing). It returns a tuple (stdout, stderr) containing the stdout and stderr output strings:
from subprocess import Popen, PIPE
def waitForAppOutput(stdout_lines, word):
    for stdout_line in stdout_lines:
        print stdout_line
        if word in stdout_line.rstrip():
            return
p = Popen("./app.sh", shell=True, stdout=PIPE, stderr=PIPE, stdin=PIPE, universal_newlines=True)
expected_input = "command line 1\ncommand line 2\n"
stdout, stderr = p.communicate(expected_input)
stdout_lines = stdout.splitlines()
waitForAppOutput(stdout_lines, "done!")
The only issue is if the output strings are large (whatever your definition of large might be), for it might be memory-inefficient or even prohibitive to read the entire output into memory. If this is your situation, then I would try to avoid the deadlock by piping only stdout.
from subprocess import Popen, PIPE
def waitForAppOutput(proc, word):
    for stdout_line in iter(proc.stdout.readline, ''):
        print stdout_line
        if word in stdout_line.rstrip():
            return
p = Popen("./app.sh", shell=True, stdout=PIPE, stdin=PIPE, universal_newlines=True)
expected_input = "command line 1\ncommand line 2\n"
p.stdin.write(expected_input)
p.stdin.close()
waitForAppOutput(p, "done!")
for stdout_line in iter(p.stdout.readline, ''):
    pass # read rest of output
p.wait() # wait for termination
Update
Here is an example using both techniques that runs the Windows sort command to sort a bunch of input lines. This works particularly well both ways because the sort command does not start output until all the input has been read, so it's a very simple protocol in which deadlocking is easy to avoid. Try running this with USE_COMMUNICATE set alternately to True and False:
from subprocess import Popen, PIPE
USE_COMMUNICATE = False
p = Popen("sort", shell=True, stdout=PIPE, stdin=PIPE, universal_newlines=True)
expected_input = """q
w
e
r
t
y
u
i
o
p
"""
if USE_COMMUNICATE:
    stdout_lines, stderr_lines = p.communicate(expected_input)
    output = stdout_lines
else:
    p.stdin.write(expected_input)
    p.stdin.close()
    output = iter(p.stdout.readline, '')
for stdout_line in output:
    print stdout_line,
p.wait() # wait for termination
Prints:
e
i
o
p
q
r
t
u
w
y
How can I read a pipe's buffer without waiting? The subprocess is executing a Swift script. If it were a Python script, there is a Python flag (-u) that makes the pipes unbuffered. Are there any other ways to solve this?
sub_proc = subprocess.Popen(['swift', 'script2.swift'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
thr1 = threading.Thread(target=self.pipe_reader, args=[sub_proc.stdout])
thr2 = threading.Thread(target=self.pipe_reader, args=[sub_proc.stderr])
thr1.start()
thr2.start()
def pipe_reader(self, pipe):
    for line in iter(pipe.readline, b''):
        self.q.put((pipe, line))
    self.q.put((pipe, None))
I solved it. I changed this line (adding stdbuf -o0 as an argument):
sub_proc = subprocess.Popen(['swift', 'script2.swift'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
to
sub_proc = subprocess.Popen(['stdbuf', '-o0','swift', 'script2.swift'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
For further information about stdbuf: https://www.gnu.org/software/coreutils/manual/html_node/stdbuf-invocation.html
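Putting the pieces together, here is a self-contained sketch of the thread-plus-queue reader pattern from the question, with a small Python child standing in for the Swift script (a stand-in, since swift may not be installed everywhere). Note that Thread(...).start() returns None, so the Thread objects are kept separately for joining:

```python
import queue
import subprocess
import sys
import threading

def pipe_reader(pipe, q):
    # Forward each line to the queue; None marks end-of-stream.
    for line in iter(pipe.readline, b''):
        q.put(line)
    q.put(None)

# Stand-in child (an assumption, not the Swift script) writing to both streams.
child = "import sys; print('out line'); print('err line', file=sys.stderr)"
proc = subprocess.Popen([sys.executable, '-c', child],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)

q = queue.Queue()
threads = [threading.Thread(target=pipe_reader, args=(p, q))
           for p in (proc.stdout, proc.stderr)]
for t in threads:
    t.start()   # start() returns None, so keep the Thread objects themselves

lines, finished = [], 0
while finished < 2:          # one None sentinel per pipe
    item = q.get()
    if item is None:
        finished += 1
    else:
        lines.append(item.decode().rstrip())

for t in threads:
    t.join()
proc.wait()
print(sorted(lines))
```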
When I am executing a utility, blab, and it asks for a yes or no confirmation, what can I do? Thanks.
The code is as below:
proc = subprocess.Popen("blab delete {}".format(num), shell=True,
stderr=subprocess.STDOUT, stdin=subprocess.STDIN)
stdout_value = proc.communicate()[0]
Popen.communicate() documentation:
If you want to send data to process's stdin using python, create the Popen object with stdin=PIPE. Similarly, to get anything other than None in the result tuple, you need to give stdout=PIPE and/or stderr=PIPE too.
from subprocess import PIPE, Popen, STDOUT
process = Popen("blab delete {}".format(num), shell=True, stdin=PIPE, stdout=PIPE, stderr=STDOUT)
output = process.communicate(input=b'yes')[0]
output = output.decode('utf-8')
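A runnable sketch of the same communicate() pattern, with a small Python child standing in for the hypothetical blab utility (its actual prompt may differ). communicate() closes stdin after writing, so a single "yes" line is enough:

```python
import subprocess
import sys

# Child that prompts for confirmation and echoes the reply; a stand-in
# for the hypothetical `blab` utility.
child = "ans = input('proceed? '); print('got', ans)"
proc = subprocess.Popen(
    [sys.executable, '-c', child],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT)
out = proc.communicate(input=b'yes\n')[0]
print(out.decode())
```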
I have a Python script that searches for logs; it continuously outputs the logs it finds, and I want to use a Linux pipe to filter the desired output, for example:
$python logsearch.py | grep timeout
The problem is that the downstream commands (grep, sort, wc) are blocked until logsearch.py finishes, whereas logsearch.py outputs its results continuously.
sample logsearch.py:
p = subprocess.Popen("ping google.com", shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
for line in p.stdout:
    print(line)
UPDATE:
Figured it out: just change the stdout in subprocess to sys.stdout, and Python will handle the pipe for you.
p = subprocess.Popen("ping -c 5 google.com", shell=True, stdout=sys.stdout)
Thanks for all of you help!
And why use grep? Why not do it all in Python?
from subprocess import Popen, PIPE
p = Popen(['ping', 'google.com'], shell=False, stdin=PIPE, stdout=PIPE, universal_newlines=True)
for line in p.stdout:
    if 'timeout' in line.split():
        # Process the error
        print("Timeout error!!")
    else:
        print(line)
UPDATE:
I changed the Popen line as recommended by @triplee. Pros and cons are discussed in "Actual meaning of 'shell=True' in subprocess".
This is a follow-up to this question, but if I want to pass input on stdin to the subprocess, how can I get the output in real time? This is what I currently have; I also tried replacing Popen with call from the subprocess module, and that just made the script hang.
from subprocess import Popen, PIPE, STDOUT
cmd = 'rsync --rsh=ssh -rv --files-from=- thisdir/ servername:folder/'
p = Popen(cmd.split(), stdout=PIPE, stdin=PIPE, stderr=STDOUT)
subfolders = '\n'.join(['subfolder1','subfolder2'])
output = p.communicate(input=subfolders)[0]
print output
In the earlier question, where I did not have to pass anything to stdin, I was advised to use p.stdout.readline; but that leaves no room to pipe anything to stdin.
Addendum: This works for the transfer, but I see the output only at the end, and I would like to see the details of the transfer while it's happening.
In order to grab stdout from the subprocess in real time you need to decide exactly what behavior you want; specifically, you need to decide whether you want to deal with the output line-by-line or character-by-character, and whether you want to block while waiting for output or be able to do something else while waiting.
It looks like it will probably suffice for your case to read the output in line-buffered fashion, blocking until each complete line comes in, which means the convenience functions provided by subprocess are good enough:
p = subprocess.Popen(some_cmd, stdout=subprocess.PIPE)
# Grab stdout line by line as it becomes available. This will loop until
# p terminates.
while p.poll() is None:
    l = p.stdout.readline() # This blocks until it receives a newline.
    print l
# When the subprocess terminates there might be unconsumed output
# that still needs to be processed.
print p.stdout.read()
If you need to write to the stdin of the process, just use another pipe:
p = subprocess.Popen(some_cmd, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
# Send input to p.
p.stdin.write("some input\n")
p.stdin.flush()
# Now start grabbing output.
while p.poll() is None:
    l = p.stdout.readline()
    print l
print p.stdout.read()
Pace the other answer, there is no need to go through a file in order to pass input to the subprocess.
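The answer above distinguishes blocking from non-blocking reads but only shows the blocking case. On POSIX systems, one non-blocking option (a sketch, not part of the original answer) is the selectors module; note that selectors do not work on pipes under Windows:

```python
import selectors
import subprocess
import sys

# A stand-in child that prints a few lines, then exits.
child = "for i in range(3): print('line', i)"
# bufsize=0 avoids Python-level read-ahead, so the selector's view of
# the pipe matches what readline() will actually find.
proc = subprocess.Popen([sys.executable, '-c', child],
                        stdout=subprocess.PIPE, bufsize=0)

sel = selectors.DefaultSelector()
sel.register(proc.stdout, selectors.EVENT_READ)

collected = []
while True:
    events = sel.select(timeout=0.1)
    if not events:
        # Nothing ready: a real program could do other work here.
        if proc.poll() is not None:
            break
        continue
    line = proc.stdout.readline()
    if not line:            # EOF: the child closed its end of the pipe
        break
    collected.append(line.decode().rstrip())

sel.close()
proc.wait()
print(collected)
```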
something like this I think
from subprocess import Popen, PIPE, STDOUT
p = Popen('c:/python26/python printingTest.py', stdout = PIPE,
          stderr = PIPE)
for line in iter(p.stdout.readline, ''):
    print line
p.stdout.close()
using an iterator will give you the results live, as they are produced.
in order to send input to stdin you would need something like
other_input = "some extra input stuff"
with open("to_input.txt","w") as f:
    f.write(other_input)
p = Popen('c:/python26/python printingTest.py',
          stdin = open("to_input.txt"),
          stdout = PIPE,
          stderr = PIPE)
this would be similar to the linux shell command of
%prompt%> some_file.o < to_input.txt
see alps answer for better passing to stdin
If you pass all your input before starting reading the output and if by "real-time" you mean whenever the subprocess flushes its stdout buffer:
from subprocess import Popen, PIPE, STDOUT
cmd = 'rsync --rsh=ssh -rv --files-from=- thisdir/ servername:folder/'
p = Popen(cmd.split(), stdout=PIPE, stdin=PIPE, stderr=STDOUT, bufsize=1)
subfolders = '\n'.join(['subfolder1','subfolder2'])
p.stdin.write(subfolders)
p.stdin.close() # eof
for line in iter(p.stdout.readline, ''):
    print line, # do something with the output here
p.stdout.close()
rc = p.wait()
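A note for Python 3 readers: bufsize=1 (line buffering) only takes effect in text mode, and stdin.write() then expects str rather than bytes. A sketch of the same pattern, with a small Python child standing in for rsync (an assumption, so the example is runnable anywhere):

```python
import subprocess
import sys

# Python 3 variant: text=True enables str I/O, which bufsize=1 requires.
# The child echoes each stdin line; it stands in for the rsync command.
child = "import sys\nfor l in sys.stdin: print('got', l.rstrip())"
p = subprocess.Popen([sys.executable, '-c', child],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT, bufsize=1, text=True)
p.stdin.write('subfolder1\nsubfolder2\n')
p.stdin.close()                      # signal EOF so the child can finish
received = []
for line in iter(p.stdout.readline, ''):
    received.append(line.rstrip())
p.stdout.close()
rc = p.wait()
print(received)
```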