python subprocess.Popen stdin.write

I'm new to Python and would like to open a Windows cmd prompt, start a process, leave the process running, and then issue commands to that same running process.
The commands will change, so I can't just include them in the cmdline variable below. Also, the process takes 10-15 seconds to start, so I don't want to waste time waiting for the process to start and run commands each time; I just want to start the process once and run quick commands as needed in the same process.
I was hoping to use subprocess.Popen to make this work, though I am open to better methods. Note that the process I want to run is not cmd; I'm just using it as an example.
import subprocess
cmdline = ['cmd', '/k']
cmd = subprocess.Popen(cmdline, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
cmd.stdin.write("echo hi") #would like this to be written to the cmd prompt
print cmd.stdout.readline() #would like to see 'hi' readback
cmd.stdin.write("echo hi again") #would like this to be written to the cmd prompt
print cmd.stdout.readline() #would like to see 'hi again' readback
The results aren't what I expect. It seems as though the stdin.write commands aren't actually getting in, and the readline freezes up with nothing to read.
I have tried popen.communicate() instead of write/readline, but it kills the process. I have also tried setting bufsize in the Popen line, but that didn't make much difference.

Your comments suggest that you are confusing command-line arguments with input via stdin. Namely, the fact that the system-console.exe program accepts a script=filename parameter does not imply that you can send it the same string as a command via stdin, e.g., the python executable accepts -c "print(1)" as command-line arguments, but it is a SyntaxError if you pass the same text as a command to the Python shell.
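To make the distinction concrete, here is a small sketch using the python interpreter itself as the child process (system-console.exe is only an example, so python stands in for it here):
from subprocess import run

# As command-line arguments: the interpreter evaluates the code and prints 1.
run(['python', '-c', 'print(1)'])

# The same text fed via stdin is treated as Python source, which it is not,
# so the interpreter reports a SyntaxError instead.
run(['python'], input='-c "print(1)"\n', universal_newlines=True)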
Therefore, the first step is to use the correct syntax. Suppose the system-console.exe accepts a filename by itself:
#!/usr/bin/env python3
import time
from subprocess import Popen, PIPE

with Popen(r'C:\full\path\to\system-console.exe -cli -',
           stdin=PIPE, bufsize=1, universal_newlines=True) as shell:
    for _ in range(10):
        print('capture.tcl', file=shell.stdin, flush=True)
        time.sleep(5)
Note: if you've redirected more than one stream, e.g., stdin and stdout, then you should read/write both streams concurrently (e.g., using multiple threads); otherwise it is very easy to deadlock your program.
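As a rough illustration of that advice, a background thread can drain stdout while the main thread keeps writing. This is only a sketch; child.exe is a placeholder for any line-oriented console program:
import threading
from subprocess import Popen, PIPE

proc = Popen(['child.exe'], stdin=PIPE, stdout=PIPE,
             bufsize=1, universal_newlines=True)

def drain(pipe):
    # Keep reading so the child's stdout buffer never fills up and blocks it.
    for line in pipe:
        print('child said:', line, end='')

reader = threading.Thread(target=drain, args=(proc.stdout,), daemon=True)
reader.start()

for command in ['echo one', 'echo two']:
    print(command, file=proc.stdin, flush=True)

proc.stdin.close()   # send EOF so the child can exit
proc.wait()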
Related:
Q: Why not just use a pipe (popen())? -- mandatory reading for the Unix environment, but it might also be applicable to some programs on Windows
subprocess readline hangs waiting for EOF -- a code example of how to pass multiple inputs and read multiple outputs using the subprocess and pexpect modules.
The second and the following steps might have to deal with buffering issues on the side of the child process (out of your hands on Windows), whether system-console allows its stdin/stdout to be redirected or whether it works with a console directly, and character-encoding issues (how the various commands in the pipeline encode text).
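If encoding does become an issue, one option (a sketch, assuming Python 3.6+ and a hypothetical some-console-tool.exe) is to name the encoding explicitly instead of relying on the locale default:
from subprocess import Popen, PIPE

# Decode/encode the pipes as UTF-8 explicitly; errors='replace' avoids crashes
# on stray bytes at the cost of replacement characters.
proc = Popen(['some-console-tool.exe'], stdin=PIPE, stdout=PIPE,
             encoding='utf-8', errors='replace', bufsize=1)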

Here is some code that I tested and that works on Windows 10, Quartus Prime 15.1 and Python 3.5:
import subprocess

class altera_system_console:
    def __init__(self):
        sc_path = r'C:\altera_lite\15.1\quartus\sopc_builder\bin\system-console.exe --cli --disable_readline'
        self.console = subprocess.Popen(sc_path, stdin=subprocess.PIPE, stdout=subprocess.PIPE)

    def read_output(self):
        """Read stdout byte by byte until the Tcl prompt '% ' appears."""
        rtn = ""
        loop = True
        i = 0
        match = '% '
        while loop:
            out = self.console.stdout.read1(1)
            if bytes(match[i], 'utf-8') == out:
                i = i + 1
                if i == len(match):
                    loop = False
            else:
                rtn = rtn + out.decode('utf-8')
        return rtn

    def cmd(self, cmd_string):
        """Send one Tcl command followed by a newline and flush the pipe."""
        self.console.stdin.write(bytes(cmd_string + '\n', 'utf-8'))
        self.console.stdin.flush()

c = altera_system_console()
print(c.read_output())
c.cmd('set jtag_master [lindex [get_service_paths master] 0]')
print(c.read_output())
c.cmd('open_service master $jtag_master')
print(c.read_output())
c.cmd('master_write_8 $jtag_master 0x00 0xFF')
print(c.read_output())

You need to use iter() if you want to see the output in real time:
import subprocess

cmdline = ['cmd', '/k']
cmd = subprocess.Popen(cmdline, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
cmd.stdin.write("echo hi\n")  # would like this to be written to the cmd prompt
for line in iter(cmd.stdout.readline, ""):
    print line
cmd.stdin.write("echo hi again\n")  # would like this to be written to the cmd prompt
Not sure exactly what you are trying to do, but if you want to send certain input when you see certain output, then I would recommend using pexpect.
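For reference, the usual pexpect pattern looks roughly like this (a sketch only; pexpect.spawn is POSIX-only, and on Windows pexpect.popen_spawn.PopenSpawn is one alternative). It drives an interactive Python session as a stand-in for the real program:
import pexpect

child = pexpect.spawn('python3 -i', encoding='utf-8')
child.expect('>>> ')              # wait for the interpreter prompt
child.sendline('print("hi")')
child.expect('>>> ')              # everything printed before this prompt...
print(child.before)               # ...is available in child.before
child.sendline('exit()')
child.expect(pexpect.EOF)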

Related

How to run cmd.exe in python

I'm trying to port the following C# code into Python. It first defines a new process and then runs a Windows prompt command (cmd.exe). After that, it executes a command in the prompt and, when an external event occurs, it closes the prompt.
// Start the prompt - when an event occurred
Process winShell = new Process();
winShell.StartInfo.FileName = "cmd.exe";
winShell.StartInfo.RedirectStandardInput = true;
winShell.Start();
// Execute a command in the prompt
winShell.StandardInput.WriteLine("cd " + projectDirectory);
// Close it - when an event occurred
winShell.StandardInput.Flush();
winShell.StandardInput.Close();
winShell.WaitForExit();
I read that for Python 3 (my version is 3.7) it is recommended to use subprocess. Unfortunately, I feel a bit confused about which of its functions to use. I found call, run and Popen, but I didn't understand how to use them.
I wrote the following lines, but they don't produce any visible result.
import subprocess
subprocess.run(['cmd.exe'])
First of all, I would like the shell to appear so that I can write some commands in it. Finally, I want to close it.
Use subprocess.Popen() like this. Each call matches the corresponding C# API almost 1:1.
p = subprocess.Popen(['cmd.exe'],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     text=True)
p.stdin.write('dir\n')
p.stdin.close()
print(p.stdout.read())
p.wait()
p.stdout.close()
Other APIs such as run(), call(), etc. are wrappers around Popen(). For example, the above code is equivalent to this one line:
print(subprocess.run(['cmd.exe'], capture_output=True, text=True, input='dir\n').stdout)

python subprocess for user input

I want to get user input from a subprocess in a new terminal.
import subprocess

additionalBuildArguments = "defaultarg1"
proc = subprocess.Popen(["python", "user_input.py", additionalBuildArguments],
                        creationflags=subprocess.CREATE_NEW_CONSOLE,
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
try:
    outs, errs = proc.communicate(timeout=15)
except subprocess.TimeoutExpired:
    proc.kill()
    outs, errs = proc.communicate()

additionalBuildArguments = outs or additionalBuildArguments
user_input.py:
import sys
arg = sys.argv[1]
user_input = input(f"Additional build arguments [{arg}] (Push <ENTER> to use these settings):\n")
print(user_input)
As long as I don't set the stdout=subprocess.PIPE and/or the stderr=subprocess.PIPE options, I can enter input. But with these options I can't write any input to the console.
Indeed, I need these options to redirect stdout so that I have access to the printed user_input in the parent process.
Does anyone know what the problem is here?
Please note that I do not understand why you want to do this, and feel instinctively that you should not. However, it's perfectly possible: just capture only stdout:
import sys
from subprocess import run
print("Type away: ", end="")
sys.stdout.flush()
r = run(["python", "-c", "print(input())"], capture_output=True, encoding="utf8")
print(f"You entered {r.stdout}")
EDIT: Apparently you are using Windows. Per the docs, your flag is set when shell=True. With shell=True this works for me, but I have no idea whether it will for you:
import sys
from subprocess import run
print("Type away: ", end="")
sys.stdout.flush()
r = run("python -c 'print(input())'", capture_output=True, shell=True, encoding="utf8")
print(f"You entered {r.stdout}")
This can be chained to run in yet a third process, which would be needed to print whilst also capturing stdout from a subprocess. But at this point we are in the realm of very horrible hacks.
A better, but still hacky, solution is to rephrase the problem a bit. You want to spawn a terminal, which apparently you can do and the user can interact with correctly, and then you want to get output from that terminal in the spawning code. STDOUT is not the proper channel for this communication. Personally I would structure my code like this:
in spawning code:
generate parametrised script to run in spawned terminal and save it as a temp file
spawn subterminal running the generated script
wait for completion
read temp out file and get data
delete both temp script and temp out file
in generated code:
do as much as possible (you have a full python, after all)
dump output as json to a temporary file
This is still hacky, but it only involves spawning one terminal; a rough sketch of this structure is shown below. Note that I still don't understand why you want to do this, but this should at least work.
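Here is a minimal sketch of that structure, assuming Windows (CREATE_NEW_CONSOLE); the temp-file handling and the generated script body are placeholders of my own, not code from the question:
import json
import os
import subprocess
import sys
import tempfile

# Temp file the generated script will dump its JSON result into.
with tempfile.NamedTemporaryFile('w', suffix='.json', delete=False) as out:
    out_path = out.name

# Generate a parametrised script for the spawned terminal to run.
script_body = (
    'import json\n'
    'user_input = input("Additional build arguments: ")\n'
    f'with open({out_path!r}, "w") as f:\n'
    '    json.dump({"args": user_input}, f)\n'
)
with tempfile.NamedTemporaryFile('w', suffix='.py', delete=False) as script:
    script.write(script_body)
    script_path = script.name

# Spawn a new console running the generated script and wait for completion.
subprocess.run([sys.executable, script_path],
               creationflags=subprocess.CREATE_NEW_CONSOLE)

# Read the result back and clean up both temp files.
with open(out_path) as f:
    result = json.load(f)
os.remove(script_path)
os.remove(out_path)
print(result["args"])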

writing commands to mplayer subprocess with python 3 in windows

I have a... very specific problem. I really tried to find a broader question but couldn't.
I am trying to use mplayer as a subprocess to play music (on Windows and also Linux), and retain the ability to pass commands to it. I accomplished this just fine in Python 2.7 with subprocess.Popen and p.stdin.write('pause\n').
However, this doesn't seem to have survived the trip to Python 3. I have to use either 'pause\n'.encode() or b'pause\n' to convert to bytes, and the mplayer process does not pause. It does seem to work if I use p.communicate, but I have ruled that out as a possibility due to this question, which claims it can only be called once per process.
Here is my code:
import subprocess
import time

p = subprocess.Popen('mplayer -slave -quiet "C:\\users\\me\\music\\Nickel Creek\\Nickel Creek\\07 Sweet Afton.mp3"',
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
time.sleep(1)
p.stdin.write(b'pause\n')
time.sleep(1)
p.stdin.write(b'pause\n')
time.sleep(1)
p.stdin.write(b'quit\n')
Seeing as this code worked (without the b's) in 2.7, I can only assume encoding the string as bytes is somehow changing the byte values so that mplayer can't understand it any more? However, when I try to see exactly what bytes are sent through the pipeline, it looks correct. It could also be the Windows pipeline acting strange. I've tried this with both cmd.exe and PowerShell, since I know PowerShell interprets the pipeline as XML. I used this code to test what comes in through the pipeline:
# test.py
import sys

if __name__ == "__main__":
    x = ''
    with open('test.out', 'w') as f:
        while (len(x) == 0 or x[-1] != 'q'):
            x += sys.stdin.read(1)
            print(x)
        f.write(x)
and
p = subprocess.Popen('python test.py', stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
p.stdin.write(b'hello there\ntest2\nq\n')
Seeing as this code worked (without the b's) in 2.7, I can only assume encoding the string as bytes is somehow changing the byte values so that mplayer can't understand it any more?
'pause\n' in Python 2 is exactly the same value as b'pause\n' -- moreover, you could use b'pause\n' on Python 2 too (to communicate the intent of the code).
The difference is that bufsize=0 on Python 2, and therefore .write() pushes the content to the subprocess immediately, while .write() on Python 3 puts it in some internal buffer instead. Add a .flush() call to empty the buffer.
Pass universal_newlines=True to enable text mode on Python 3 (then you could use 'pause\n' instead of b'pause\n'). You might also need it if mplayer expects os.linesep instead of b'\n' as the end of line.
#!/usr/bin/env python3
import time
from subprocess import Popen, PIPE

LINE_BUFFERED = 1
filename = r"C:\Users\me\...Afton.mp3"
with Popen('mplayer -slave -quiet'.split() + [filename],
           stdin=PIPE, universal_newlines=True, bufsize=LINE_BUFFERED) as process:
    send_command = lambda command: print(command, flush=True, file=process.stdin)
    time.sleep(1)
    for _ in range(2):
        send_command('pause')
        time.sleep(1)
    send_command('quit')
Unrelated: do not use stdout=PIPE unless you read from the pipe; otherwise you may hang the child process. To discard the output, use stdout=subprocess.DEVNULL instead. See How to hide output of subprocess in Python 2.7.
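A minimal sketch of that advice (the file name is a placeholder):
from subprocess import Popen, PIPE, DEVNULL

# Keep stdin for slave-mode commands, but discard mplayer's output instead of
# leaving an unread PIPE that could fill up and block the player.
process = Popen(['mplayer', '-slave', '-quiet', 'song.mp3'],
                stdin=PIPE, stdout=DEVNULL, stderr=DEVNULL,
                universal_newlines=True, bufsize=1)
print('pause', file=process.stdin, flush=True)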

Simple Python Script not Executing Properly

The code is as follows:
fh = tempfile.NamedTemporaryFile(delete=False, suffix='.py')
stream = io.open(fh.name, 'w', newline='\r\n')
stream.write(unicode(script))
stream.flush()
stream.close()
proc = subprocess.Popen(
    [path, fh.name],
    shell=True,
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
proc.stdin.close()
proc.stderr.close()
out = proc.stdout.readline()
print out
script is a string which contains the subprocess code, in this case a simple hello world. Since it has Unix line endings, I had to use io.open in order to write it properly for Windows. path is the path to the python.exe on my machine. The file is generated and looks fine in Notepad:
def main():
    print 'hello world'
However, when I run the program, the subprocess executes and does nothing.
It's not a problem with the executable path; I've tested it with other programs, so it must be with either the temp file itself or the text within it. delete is set to False in order to check the contents of the file for debugging. Is there anything glaringly wrong with this code? I'm a bit new to using Popen.
The main issue in your program is that when you specify shell=True, you need to provide the entire command as a string, not a list.
Given that, there is really no need for you to use shell=True. Also, unless absolutely necessary, you should not use shell=True; it's a security hazard. This is given in the documentation as well:
Executing shell commands that incorporate unsanitized input from an untrusted source makes a program vulnerable to shell injection, a serious security flaw which can result in arbitrary command execution. For this reason, the use of shell=True is strongly discouraged in cases where the command string is constructed from external input:
Also, if you do not want to use stdin/stderr (since you are closing those off as soon as you start the process), there is no need to use PIPE for them.
Example -
fh = tempfile.NamedTemporaryFile(delete=False, suffix='.py')
stream = io.open(fh.name, 'w', newline='\r\n')
stream.write(unicode(script))
stream.flush()
stream.close()
proc = subprocess.Popen(
    [path, fh.name],
    stdout=subprocess.PIPE,
)
out = proc.stdout.readline()
print out
Also, the script -
def main():
    print 'hello world'
would not work, since you need to call main() for it to run.
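For example, the generated script could end with an explicit call (keeping the question's Python 2 print statement; this is only an illustration):
def main():
    print 'hello world'

if __name__ == '__main__':
    main()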

Popen in Python 3

Here is a code snippet from my program.
I am trying to open cmd.exe on Windows and pass commands to a separate program and capture the output and parse it WITHOUT having to load cmd.exe every time.
All the examples I found for doing this were using Python 2, and there are several changes in Python 3 regarding pipes, making me unsure what is going wrong.
# DOScmd is a list of command-line parameters to type into the command shell.
p = subprocess.Popen('cmd.exe',
                     stdout=subprocess.PIPE,
                     stdin=subprocess.PIPE,
                     shell=True,
                     bufsize=0)
myCall = ' '.join(DOScmd) + '\n'
p.stdin.write(bytes(myCall, 'UTF-8'))
searchLines = p.stdout.readlines()
print(searchLines)
I am calling a program, bowtie.exe. Now, bowtie.exe crashes when I do this. I think I might be angering the I/O gods. Any help is appreciated.
I am trying to open cmd.exe on Windows and pass commands to a separate program and capture the output and parse it WITHOUT having to load cmd.exe every time.
Unless you want to run commands that are built in to cmd.exe, such as dir, you don't need to start cmd.exe at all:
from subprocess import check_output

for cmd in ["first.exe", "arg1", "arg2"], ["second.exe", ".."]:
    output = check_output(cmd)
    do_whatever_you_like_with(output)
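If you do need a cmd.exe builtin such as dir, one option (my sketch, not part of the original answer) is cmd /c, which runs a single command and exits rather than keeping an interactive cmd.exe alive:
from subprocess import check_output

# cmd /c executes one command and terminates; decode with errors='replace'
# because the console code page may not be UTF-8.
output = check_output(['cmd', '/c', 'dir'])
print(output.decode(errors='replace'))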
