My python script uses subprocess to call a linux utility that is very noisy. I want to store all of the output to a log file and show some of it to the user. I thought the following would work, but the output doesn't show up in my application until the utility has produced a significant amount of output.
#fake_utility.py, just generates lots of output over time
import time
i = 0
while True:
    print hex(i)*512
    i += 1
    time.sleep(0.5)
#filters output
import subprocess
proc = subprocess.Popen(['python','fake_utility.py'], stdout=subprocess.PIPE)
for line in proc.stdout:
    #the real code does filtering here
    print "test:", line.rstrip()
The behavior I really want is for the filter script to print each line as it is received from the subprocess. Sorta like what tee does but with python code.
What am I missing? Is this even possible?
Update:
If a sys.stdout.flush() is added to fake_utility.py, the code has the desired behavior in Python 3.1. I'm using Python 2.6. You would think that using proc.stdout.xreadlines() would work the same as in Py3k, but it doesn't.
Update 2:
Here is the minimal working code.
#fake_utility.py, just generates lots of output over time
import sys, time
for i in range(10):
    print i
    sys.stdout.flush()
    time.sleep(0.5)
#display output line by line
import subprocess
proc = subprocess.Popen(['python','fake_utility.py'], stdout=subprocess.PIPE)
#works in python 3.0+
#for line in proc.stdout:
for line in iter(proc.stdout.readline, ''):
    print line.rstrip()
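On Python 3 the same loop needs a bytes sentinel, since proc.stdout yields bytes unless text mode is requested; a minimal sketch, using sys.executable and an inline child in place of fake_utility.py:

```python
import subprocess
import sys

# Stand-in child process; on Python 3, pipe output is bytes by default.
proc = subprocess.Popen(
    [sys.executable, '-c', "print(0); print(1)"],
    stdout=subprocess.PIPE,
)
collected = []
for line in iter(proc.stdout.readline, b''):  # b'' sentinel, not ''
    collected.append(line.rstrip().decode())
proc.wait()
print(collected)
```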
I think the problem is with the statement for line in proc.stdout, which reads the entire input before iterating over it. The solution is to use readline() instead:
#filters output
import subprocess
proc = subprocess.Popen(['python','fake_utility.py'], stdout=subprocess.PIPE)
while True:
    line = proc.stdout.readline()
    if not line:
        break
    #the real code does filtering here
    print "test:", line.rstrip()
Of course you still have to deal with the subprocess' buffering.
Note: according to the documentation the solution with an iterator should be equivalent to using readline(), except for the read-ahead buffer, but (or exactly because of this) the proposed change did produce different results for me (Python 2.5 on Windows XP).
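For the tee-like behaviour the question asks for, a minimal sketch that both logs every line and prints a filtered subset (the inline child command and the log path are illustrative stand-ins):

```python
import subprocess
import sys

# Illustrative child; '-u' keeps it unbuffered so lines arrive promptly.
proc = subprocess.Popen(
    [sys.executable, '-u', '-c', "print('one'); print('two')"],
    stdout=subprocess.PIPE,
    universal_newlines=True,
    bufsize=1,
)
with open('utility.log', 'w') as log:
    for line in proc.stdout:
        log.write(line)            # full copy, like tee
        if 'two' in line:          # stand-in for the real filtering
            print('test:', line.rstrip())
proc.wait()
```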
Bit late to the party, but was surprised not to see what I think is the simplest solution here:
import io
import subprocess

proc = subprocess.Popen(["prog", "arg"], stdout=subprocess.PIPE)
for line in io.TextIOWrapper(proc.stdout, encoding="utf-8"):  # or another encoding
    # do something with line
(This requires Python 3.)
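A self-contained version of the same idea, with an illustrative child process in place of ["prog", "arg"]:

```python
import io
import subprocess
import sys

proc = subprocess.Popen(
    [sys.executable, '-c', "print('alpha'); print('beta')"],
    stdout=subprocess.PIPE,
)
lines = []
# TextIOWrapper decodes the byte stream and yields str lines as they arrive.
for line in io.TextIOWrapper(proc.stdout, encoding='utf-8'):
    lines.append(line.rstrip())
proc.wait()
print(lines)
```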
Indeed, if you sorted out the iterator, then buffering could now be your problem. You could tell the Python in the subprocess not to buffer its output.
proc = subprocess.Popen(['python','fake_utility.py'],stdout=subprocess.PIPE)
becomes
proc = subprocess.Popen(['python','-u', 'fake_utility.py'],stdout=subprocess.PIPE)
I have needed this when calling python from within python.
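When you cannot edit the command line, the PYTHONUNBUFFERED environment variable usually has the same effect as -u; a sketch with an illustrative inline child:

```python
import os
import subprocess
import sys

# PYTHONUNBUFFERED=1 in the child's environment is equivalent to passing -u.
env = dict(os.environ, PYTHONUNBUFFERED='1')
proc = subprocess.Popen(
    [sys.executable, '-c', "print('tick'); print('tock')"],
    stdout=subprocess.PIPE,
    env=env,
)
lines = [line.decode().rstrip() for line in iter(proc.stdout.readline, b'')]
proc.wait()
print(lines)
```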
You want to pass these extra parameters to subprocess.Popen:
bufsize=1, universal_newlines=True
Then you can iterate as in your example. (Tested with Python 3.5)
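Spelled out as a complete sketch (the child command is illustrative; on Python 3.7+ universal_newlines=True can also be written text=True):

```python
import subprocess
import sys

proc = subprocess.Popen(
    [sys.executable, '-u', '-c', "print('a'); print('b')"],
    stdout=subprocess.PIPE,
    bufsize=1,                  # line-buffered; only meaningful in text mode
    universal_newlines=True,    # text mode: proc.stdout yields str lines
)
lines = [line.rstrip() for line in proc.stdout]
proc.wait()
print(lines)
```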
A function that allows iterating over both stdout and stderr concurrently, in realtime, line by line
In case you need to get the output stream for both stdout and stderr at the same time, you can use the following function.
The function uses Queues to merge both Popen pipes into a single iterator.
Here we create the function read_popen_pipes():
from queue import Queue, Empty
from concurrent.futures import ThreadPoolExecutor


def enqueue_output(file, queue):
    for line in iter(file.readline, ''):
        queue.put(line)
    file.close()


def read_popen_pipes(p):
    with ThreadPoolExecutor(2) as pool:
        q_stdout, q_stderr = Queue(), Queue()

        pool.submit(enqueue_output, p.stdout, q_stdout)
        pool.submit(enqueue_output, p.stderr, q_stderr)

        while True:
            if p.poll() is not None and q_stdout.empty() and q_stderr.empty():
                break

            out_line = err_line = ''
            try:
                out_line = q_stdout.get_nowait()
            except Empty:
                pass
            try:
                err_line = q_stderr.get_nowait()
            except Empty:
                pass

            yield (out_line, err_line)
read_popen_pipes() in use:
import subprocess as sp

with sp.Popen(my_cmd, stdout=sp.PIPE, stderr=sp.PIPE, text=True) as p:
    for out_line, err_line in read_popen_pipes(p):
        # Do stuff with each line, e.g.:
        print(out_line, end='')
        print(err_line, end='')

    return p.poll()  # return status-code
You can also read all lines without a loop. Works in Python 3.6.
import os
import subprocess
process = subprocess.Popen(command, stdout=subprocess.PIPE)
list_of_byte_strings = process.stdout.readlines()
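Note that readlines() only returns once the child closes its stdout, so it does not give live output; a quick self-contained check with an illustrative command:

```python
import subprocess
import sys

process = subprocess.Popen(
    [sys.executable, '-c', "print('x'); print('y')"],
    stdout=subprocess.PIPE,
)
# Blocks until EOF, then returns every line at once, as bytes.
list_of_byte_strings = process.stdout.readlines()
process.wait()
print(list_of_byte_strings)
```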
Python 3.5 added the run() function to the subprocess module, which returns a CompletedProcess object (note that the capture_output and text parameters require Python 3.7+). With this you are fine using proc.stdout.splitlines():
proc = subprocess.run(command, shell=True, capture_output=True, text=True, check=True)
for line in proc.stdout.splitlines():
    print("stdout:", line)
See also How to Execute Shell Commands in Python Using the Subprocess Run Method
I tried this with Python 3 and it worked (source).
When you use Popen to spawn the new process, you tell the operating system to PIPE the stdout of the child process so the parent process can read it; here, stderr is merged into the child's stdout via stderr=subprocess.STDOUT.
In output_reader we read each line of the child's stdout by wrapping it in an iterator that yields output line by line as soon as a new line is ready.
import subprocess
import threading
import time


def output_reader(proc):
    for line in iter(proc.stdout.readline, b''):
        print('got line: {0}'.format(line.decode('utf-8')), end='')


def main():
    proc = subprocess.Popen(['python', 'fake_utility.py'],
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)

    t = threading.Thread(target=output_reader, args=(proc,))
    t.start()

    try:
        time.sleep(0.2)
        # do the parent's own work for a while (bounded so that the
        # finally block is actually reached)
        i = 0
        while i < 10:
            print(hex(i)*512)
            i += 1
            time.sleep(0.5)
    finally:
        proc.terminate()
        try:
            proc.wait(timeout=0.2)
            print('== subprocess exited with rc =', proc.returncode)
        except subprocess.TimeoutExpired:
            print('subprocess did not terminate in time')
    t.join()
The following modification of RĂ´mulo's answer works for me on Python 2 and 3 (2.7.12 and 3.6.1):
import os
import subprocess

process = subprocess.Popen(command, stdout=subprocess.PIPE)
while True:
    line = process.stdout.readline()
    if line:
        os.write(1, line)
    else:
        break
I was having a problem with the arg list of Popen when updating servers; the following code resolves this a bit.
import getpass
from subprocess import Popen, PIPE

username = 'user1'
ip = '127.0.0.1'

print('What is the password?')
password = getpass.getpass()

cmd1 = f"""sshpass -p {password} ssh {username}@{ip}"""
cmd2 = f"""echo {password} | sudo -S apt update"""
cmd3 = " && "
cmd4 = f"""echo {password} | sudo -S apt upgrade -y"""
cmd5 = " && "
cmd6 = "exit"
commands = [cmd1, cmd2, cmd3, cmd4, cmd5, cmd6]
command = " ".join(commands)
cmd = command.split()

with Popen(cmd, stdout=PIPE, bufsize=1, universal_newlines=True) as p:
    for line in p.stdout:
        print(line, end='')
And to run the update on a local computer, the following code example does this.
import getpass
from subprocess import Popen, PIPE

print('What is the password?')
password = getpass.getpass()

cmd1_local = f"""apt update"""
cmd2_local = f"""apt upgrade -y"""
commands = [cmd1_local, cmd2_local]

with Popen(['echo', password], stdout=PIPE) as auth:
    for cmd in commands:
        cmd = cmd.split()
        with Popen(['sudo', '-S'] + cmd, stdin=auth.stdout, stdout=PIPE, bufsize=1, universal_newlines=True) as p:
            for line in p.stdout:
                print(line, end='')
I would like to know which window is hosting the terminal running Python. In specific, I would like to distinguish between windows terminal and the old CMD console on Windows machine.
EDIT:
I'm not sure I'm using the correct words, and there is an overload of terms anyway. To be more specific, I want to know the host window because they have different behaviours. Here's a photo of different windows, one of which is Windows Terminal. PowerShell or cmd can be run in either of the windows; I'm interested in figuring out that window host.
If you use the psutil and os packages, you can use
import os
import psutil

parent_pid = os.getppid()
print(psutil.Process(parent_pid).name())
to get the parent process' name.
You could query WMI, as I prefer to use OS tools (it should work with psutil as well, as mentioned by @BaguetteYeeter):
import os
import subprocess
import sys

print("Python interpreter: %s" % sys.executable)

parentShellName = None
# walk up the parents until we find a process name
# which does not have python in its name
parentPid = os.getpid()
while 1:
    # In case of ipython the second parent process is the shell, so we are looping!
    # Probably there should be a counter to end the while loop in case no shell can be detected!
    cmd = 'wmic process where "ProcessId=%s" get parentprocessid /format:list' % parentPid
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
    out, err = proc.communicate()
    key, parentPid = out.strip().decode('utf-8').split('=')
    print("Parent ProcessId: %s" % parentPid)

    cmd2 = 'wmic process where "ProcessId=%s" get name /format:list' % parentPid
    proc = subprocess.Popen(cmd2, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
    out, err = proc.communicate()
    key, parentShellName = out.strip().decode('utf-8').split('=')
    if 'python' not in parentShellName.lower():
        break

print(parentShellName)
I am trying to run a piped command from python3 and would like the contents printed to the screen. After googling for an hour and reading multiple stackoverflow questions I have not been able to execute the command and also output the contents to the screen live.
I am using subprocess but am up to any solution that accomplishes the task. Security is not a requirement. I will need the ability to run multiple commands one by one in order.
import subprocess
from datetime import datetime
import os
import sys

currentDateTime = datetime.today().strftime('%Y%m%d_%H%M%S')
domain = "tesla.com"
waybackurlsDir = "/opt/project/recon/{0}/_waybackurls".format(domain)
os.makedirs(waybackurlsDir)

payload = '/usr/bin/echo "{0}" | /root/go-workspace/bin/waybackurls > {1}/{2}_waybackurls.txt'.format(domain, waybackurlsDir, currentDateTime)
process = subprocess.Popen(payload, shell=True, stdout=subprocess.PIPE)
for c in iter(lambda: process.stdout.read(1), b''):
    sys.stdout.buffer.write(c)
    process.buffer.write(c)
Assuming you want this: pass parameters to the subprocess and capture the output/error of the execution. You can try this:
child = subprocess.Popen(['command'], stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=False)
out, err = child.communicate('your parameters')
print(out, err)
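A concrete, runnable version of that pattern; the child here is an illustrative stand-in that upper-cases its stdin:

```python
import subprocess
import sys

child = subprocess.Popen(
    [sys.executable, '-c',
     "import sys; print(sys.stdin.read().upper(), end='')"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    universal_newlines=True,  # so communicate() takes and returns str
)
out, err = child.communicate('your parameters')
print(out, err)
```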
I figured out how to solve my problem. Answer is below:
import subprocess
from datetime import datetime
import sys
import os

currentDateTime = datetime.today().strftime('%Y%m%d_%H%M%S')
domain = "tesla.com"
waybackurlsDir = "/opt/myproject/recon/{0}/_waybackurls".format(domain)
if not os.path.exists(waybackurlsDir):
    os.makedirs(waybackurlsDir)

p1 = subprocess.Popen('/usr/bin/echo "tesla.com"', shell=True, stdout=subprocess.PIPE)
p2 = subprocess.Popen('waybackurls', shell=True, stdin=p1.stdout, stdout=subprocess.PIPE)

waybackurlsLogFile = '{0}/{1}_waybackurls.txt'.format(waybackurlsDir, currentDateTime)
waybackurlsFile = open(waybackurlsLogFile, "w+")

while p2.poll() is None:
    # b"" sentinel: p2.stdout yields bytes; reuse the same line for both sinks
    for stdout_line in iter(p2.stdout.readline, b""):
        if len(stdout_line) == 0:
            break
        waybackurlsFile.writelines(stdout_line.decode("utf-8"))
        print("[waybackurls] {0}".format(stdout_line.rstrip().decode("utf-8")))

waybackurlsFile.close()
I am downloading a file using wget in Python using the code below:
p1 = subprocess.Popen(['wget',
                       '-P',
                       'path/to/folder', 'http://www.ncbi.nlm.nih.gov/Traces/wgs/?download=ACHI01.1.fsa_nt.gz'],
                      stdout=subprocess.PIPE)
p1.stdout.close()
The file gets downloaded and saved correctly in the given folder, but the process keeps running. I tried p1.kill() but that doesn't work either.
Please advise.
Thanks!
Use subprocess.call:
import subprocess
subprocess.call(['wget', '-P', '/', 'http://www.ncbi.nlm.nih.gov/Traces/wgs/?download=ACHI01.1.fsa_nt.gz'])
A call to wait() is missing after the Popen, to make the main process wait till the child process has terminated. The following code seems to work fine:
import subprocess as sp

def main(args):
    p1 = sp.Popen(['wget', '-P path',
                   'http://www.ncbi.nlm.nih.gov/Traces/wgs/?download=ACHI01.1.fsa_nt.gz'],
                  stdout=sp.PIPE)
    p1.wait()
    return 0

if __name__ == '__main__':
    import sys
    sys.exit(main(sys.argv))
(You can also group commandline parameters with their values, if they have any).
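One caveat when combining stdout=PIPE with wait(): if the child writes more output than the pipe buffer holds, wait() can deadlock, so communicate() is generally the safer choice; a sketch with an illustrative child:

```python
import subprocess
import sys

# communicate() reads the pipe to EOF and then reaps the child,
# so it cannot deadlock the way wait() can when the pipe fills up.
p = subprocess.Popen(
    [sys.executable, '-c', "print('done')"],
    stdout=subprocess.PIPE,
)
out, _ = p.communicate()
print(out, p.returncode)
```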
I need to use stream redirection in a Popen call in Python to use a bat file with Wine. I need to make this:
wine32 cmd < file.bat
It works when I run it manually from the terminal; however, when I try to call it from Python:
proc = Popen('wine32 cmd < file.bat', stdout=PIPE)
I get the error: No such file or directory.
How to manage with that?
Thanks
Try this:
import subprocess
import sys
#...
with open('file.bat', 'r') as infile:
    subprocess.Popen(['wine32', 'cmd'],
                     stdin=infile, stdout=sys.stdout, stderr=sys.stderr)
Make sure that each argument to wine32 is a separate list element.
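The original "No such file or directory" error comes from Popen treating the whole string 'wine32 cmd < file.bat' as one program name; redirection with < is a shell feature. Opening the file yourself avoids the shell entirely; a runnable sketch of the pattern with stand-in names (cat instead of wine32 cmd, file.txt instead of file.bat):

```python
import subprocess

# Create a small input file to play the role of file.bat.
with open('file.txt', 'w') as f:
    f.write('hello\n')

# Redirect the file into the child's stdin without involving a shell.
with open('file.txt', 'r') as infile:
    result = subprocess.run(['cat'], stdin=infile,
                            stdout=subprocess.PIPE,
                            universal_newlines=True)
print(result.stdout)
```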
Maybe you can check this thread: https://stackoverflow.com/a/5469427/3445802
from subprocess import Popen
p = Popen("batch.bat", cwd=r"C:\Path\to\batchfolder")
stdout, stderr = p.communicate()