Python subprocess readline is blocking even after closing stdout

test.py file:
#test.py
#!/usr/bin/env python3
while True:
    inp = input("input: ")
    print("output: " + inp)
subp.py file:
#subp.py
#!/usr/bin/env python3
from subprocess import Popen, PIPE
cmd = Popen(["python3", "test.py"], stdin=PIPE, stdout=PIPE)
while True:
    print("Writing to stdin")
    cmd.stdin.write("My Input To PIPE")
    print("Done Writing to stdin")
    print("Reading from stdout")
    s = cmd.stdout.readline()
    print("Done reading from stdout")  # Never executes this line
The output is the following:
Writing to stdin
Done Writing to stdin
Reading from stdout
I understand that the line s = cmd.stdout.readline() is going to block until EOF is found in the stdout file object.
And if I am right, EOF will never be found in stdout unless stdout gets closed? Can somebody correct me on this?
Going by my understanding, if I modify test.py:
import sys
while True:
    inp = input("input: ")
    print("output: " + inp)
    sys.stdout.close()  # Added this line hoping to unblock stdout.readline()
Nothing changes; cmd.stdout.readline() is still looking for EOF even though the stdout file is closed?
What am I missing? I am more concerned about the theory rather than just making it work without understanding. Thank you for the explanations.
Also, if I modify subp.py and add the line cmd.stdout.close() before the line s = cmd.stdout.readline(), it throws an error saying that I tried reading from a closed file object, which makes sense. But how come it did not throw an error when I closed the stdout file object in test.py by adding the line sys.stdout.close()? Are these two stdouts different things?
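For the theory: readline() returns when it sees a newline or EOF, and the two stdouts are indeed different objects (sys.stdout inside test.py is the write end of the pipe; cmd.stdout in subp.py is the read end, which only reports EOF once every write end is closed). A minimal sketch of a parent that gets readline() to return, assuming the test.py shown above; the -u flag is used so the child's own prints are not held back in its buffer:
#!/usr/bin/env python3
# Sketch only: a parent side whose readline() returns without waiting for EOF.
from subprocess import Popen, PIPE

# -u makes the child's stdout unbuffered, so its print() reaches the pipe at once;
# universal_newlines=True lets us write/read str instead of bytes.
cmd = Popen(["python3", "-u", "test.py"], stdin=PIPE, stdout=PIPE, universal_newlines=True)

cmd.stdin.write("My Input To PIPE\n")  # newline so the child's input() returns
cmd.stdin.flush()                      # push the buffered data into the pipe now
s = cmd.stdout.readline()              # returns at the child's next newline, no EOF needed
print("Done reading from stdout:", s)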

Related

Why did python subprocess stdin.write() not work after a stdin.flush()?

gameserver.py
import config
import os
from subprocess import Popen, PIPE, STDOUT

class GameserverHandler:
    def __init__(self):
        print("==== Gameserver ====")
        self.__gameServerInstance__ = self.restart_gameserver()
        print("\tGameserver started")

    def restart_gameserver(self):
        if os.path.exists(config.bot_config[0]['GameServer']['OutPutFile']):
            os.remove(config.bot_config[0]['GameServer']['OutPutFile'])
        f = open(config.bot_config[0]['GameServer']['OutPutFile'], "wb")
        return Popen(config.bot_config[0]['GameServer']['Path'], stdin=PIPE, stdout=f, stderr=STDOUT)

    def send_command(self, command):
        try:
            self.__gameServerInstance__.stdin.write(str.encode(command))
            self.__gameServerInstance__.stdin.flush()
        except BrokenPipeError:
            pass
        except OSError as e:
            exit()
main.py
import config
import gameserver
gameServer = gameserver.GameserverHandler()
a = input()
gameServer.send_command('quit\n')
Hello everyone, I just wrote my first Python script.
The script starts a game server on my computer, writes its stdout and stderr to a file, and gives me an option to send commands to the server.
But there is a problem: when I use send_command(), the game server does not receive the stdin.write(). I read that I have to put a flush() after it, but that does not help.
The funny thing is, when I change the code to this:
def send_command(self, command):
    try:
        self.__gameServerInstance__.stdin.write(str.encode(command))
        self.__gameServerInstance__.stdin.flush()
        self.__gameServerInstance__.stdout.flush()
        self.__gameServerInstance__.stderr.flush()
    except BrokenPipeError:
        pass
    except OSError as e:
        exit()
I get this error:
Traceback (most recent call last):
  File "D:\Projekte\python\PycharmProjects\ServerLauncher\main.py", line 7, in <module>
    gameServer.send_command('quit')
  File "D:\Projekte\python\PycharmProjects\ServerLauncher\gameserver.py", line 22, in send_command
    self.__gameServerInstance__.stdout.flush()
AttributeError: 'NoneType' object has no attribute 'flush'
I think it's because I set stdout and stderr to a file, but why is it working then?
Sorry if this is a dumb question; I just started programming in Python.
UPDATE AFTER ANSWER from Serge Ballesta
Changed the code:
gameserver.py
def restart_gameserver(self):
    if os.path.exists(config.bot_config[0]['GameServer']['OutPutFile']):
        os.remove(config.bot_config[0]['GameServer']['OutPutFile'])
    f = open(config.bot_config[0]['GameServer']['OutPutFile'], "wb")
    return Popen(config.bot_config[0]['GameServer']['Path'], stdin=PIPE, stdout=f, stderr=STDOUT)
main.py
import config
import gameserver
import discord
gameServer = gameserver.GameserverHandler()
a = input()
print("SENDING")
gameServer.send_command('quit\n')
print("FINISH")
a = input()
Changes:
stderr to subprocess.STDOUT
change the file operation to 'wb'
add a newline to the command passed to send_command
But still, I don't know why, the process doesn't get the quit. When I put everything into main.py and remove the class, like this:
if os.path.exists(config.bot_config[0]['GameServer']['OutPutFile']):
    os.remove(config.bot_config[0]['GameServer']['OutPutFile'])
f = open(config.bot_config[0]['GameServer']['OutPutFile'], "wb")
p = Popen(config.bot_config[0]['GameServer']['Path'], stdin=PIPE, stdout=f, stderr=STDOUT)
a = input()
p.stdin.write(b'quit')
it works. I don't know why; could it be that the stdin is not flushing?
And thanks for the fast answer, Serge Ballesta.
There are a number of inconsistencies in your code:
you use the same file for both stdout and stderr. That is wrong and may lead to incorrect output in the file. You should use the special value subprocess.STDOUT:
from subprocess import Popen, PIPE, STDOUT
...
return Popen(config.bot_config[0]['GameServer']['Path'], stdin=PIPE,
             stdout=f, stderr=STDOUT)
You define the child process with bytes IO, yet open the output file as text. You should use binary mode:
f = open(config.bot_config[0]['GameServer']['OutPutFile'], "wb")
You send a command quit. Most CLI programs expect a command to be terminated with a newline character. You should add a \n to your command:
self.__gameServerInstance__.stdin.write(str.encode(command) + b'\n')
or
gameServer.send_command('quit\n')
After those fixes, I could successfully start a cmd.exe child process (on Windows), have it terminate after the exit\n command, and get the expected data in the output file.
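Putting the three fixes together, a minimal sketch of the handler under those assumptions (the constructor arguments path and output_file and the attribute names are illustrative, not from the question):
from subprocess import Popen, PIPE, STDOUT

class GameserverHandler:
    def __init__(self, path, output_file):
        # Merge stderr into stdout and write both to a binary log file.
        self._log = open(output_file, "wb")
        self._proc = Popen(path, stdin=PIPE, stdout=self._log, stderr=STDOUT)

    def send_command(self, command):
        try:
            # Terminate the command with a newline and flush so the child sees it now.
            self._proc.stdin.write(command.encode() + b"\n")
            self._proc.stdin.flush()
        except OSError:  # BrokenPipeError is a subclass of OSError
            pass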

Receive return data from subprocess in python

I'm spawning a process from a script using subprocess. My subprocess takes a JSON input, performs some operations, and should return some real-time data to the main process. How can I do this from the subprocess?
I'm trying something like this, but it is throwing an error.
Following is my main process, main.py:
p = subprocess.Popen(['python', 'handler.py'],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)
p.communicate(JSONEncoder().encode(data))
while True:
    out = p.stdout.read(1)
    if out == '' and p.poll() != None:
        break
    if out != '':
        sys.stdout.write(out)
        sys.stdout.flush()
Below is my subprocess, handler.py:
if __name__ == '__main__':
    command = json.load(sys.stdin)
    os.environ["PYTHONPATH"] = "../../"
    if command["cmd"] == "archive":
        print "command received:", command["cmd"]
        file_ids, count = archive(command["files"])
        sys.stdout.write(JSONEncoder().encode(file_ids))
But it throws an error.
Traceback (most recent call last):
  File "./core/main.py", line 46, in <module>
    out = p.stdout.read(1)
ValueError: I/O operation on closed file
Am I doing something wrong here??
Popen.communicate() does not return until the process is dead, and it returns all the output. You can't read the subprocess' stdout after it. Look at the top of the .communicate() docs:
Interact with process: Send data to stdin. Read data from stdout and stderr, until end-of-file is reached. Wait for process to terminate.
(emphasis is mine)
If you want to send data and then read the output line by line as text while the child process is still running:
#!/usr/bin/env python3
import json
from subprocess import Popen, PIPE

with Popen(command, stdin=PIPE, stdout=PIPE, universal_newlines=True) as process:
    with process.stdin as pipe:
        pipe.write(json.dumps(data))
    for line in process.stdout:
        print(line, end='')
        process(line)  # process() stands in for whatever handles each line
If you need code for older python versions or you have buffering issues, see Python: read streaming input from subprocess.communicate().
If all you want is to pass data to the child process and to print the output to terminal:
#!/usr/bin/env python3.5
import json
import subprocess
subprocess.run(command, input=json.dumps(data).encode())
If your actual child process is a Python script then consider importing it as a module and running the corresponding functions instead, see Call python script with input with in a python script using subprocess.
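As a rough illustration of that last suggestion, assuming archive() is importable from handler.py (hypothetical; the question does not show where it is defined):
# Hypothetical: call the handler's function directly instead of spawning a child process.
from handler import archive

file_ids, count = archive(command["files"])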
communicate reads all the output from a subprocess and closes it. If you want to be able to read from the process after writing, you have to use something other than communicate, such as p.stdin.write. Alternatively, just use the output of communicate; it should have what you want: https://docs.python.org/3/library/subprocess.html#popen-objects.
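For completeness, a small sketch of the "just use the output of communicate" route, assuming the child writes only the JSON reply to stdout and that data holds the request from the question:
import json
import subprocess

p = subprocess.Popen(['python', 'handler.py'],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)
# communicate() sends the input, waits for the child to exit,
# and returns everything the child wrote to stdout/stderr.
out, err = p.communicate(json.dumps(data).encode())
file_ids = json.loads(out)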

Python reading from stdin hangs when interacting with ruby code

I was trying to get Python and Ruby code to talk to each other, and I found the methods from this link (http://www.decalage.info/python/ruby_bridge).
I tried the last method, using stdin and stdout to pass information. I made some changes to the original code so that it fits Python 3.4, but I am not sure whether the code that I changed messed things up. My Python program always hangs when reading from stdin, and nothing is printed. I am not familiar with stdin and stdout, so I am just wondering why this does not work.
Here is my Ruby code:
$stdin.set_encoding("utf-8:utf-8")
$stdout.set_encoding("utf-8:utf-8")
while cmd = $stdin.gets
  cmd.chop!
  if cmd == "exit"
    break
  else
    puts eval(cmd)
    puts "[end]"
    $stdout.flush
  end
end
I am not sure if it is possible to set internal and external encodings like this. And here is my Python code:
from subprocess import Popen, PIPE, STDOUT

print("Launch slave process...")
slave = Popen(['ruby', 'slave.rb'], stdin=PIPE, stdout=PIPE, stderr=STDOUT)
while True:
    line = input("Enter expression or exit:")
    slave.stdin.write((line + '\n').encode('UTF-8'))
    result = []
    while True:
        if slave.poll() is not None:
            print("Slave has terminated.")
            exit()
        line = slave.stdout.readline().decode('UTF-8').rstrip()
        if line == "[end]":
            break
        result.append(line)
    print("result:")
    print("\n".join(result))
When I try to run the Python script, input "3*4", and press enter, nothing shows up until I kill the process manually, with exit code 1 and a KeyboardInterrupt exception.
I have been struggling with this problem for quite a long time and I don't know what is going wrong.
Thanks in advance for any potential help!
The difference is that bufsize=-1 by default in Python 3.4 and therefore slave.stdin.write() does not send the line to the ruby subprocess immediately. A quick fix is to add slave.stdin.flush() call.
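That quick fix, applied to the write in the question's loop (only these two lines change):
slave.stdin.write((line + '\n').encode('UTF-8'))
slave.stdin.flush()  # push the buffered line to the ruby process right away
The fuller alternative below avoids manual flushing by using line buffering and text mode.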
#!/usr/bin/env python3
from subprocess import Popen, PIPE

log = print
log("Launch slave process...")
with Popen(['ruby', 'slave.rb'], stdin=PIPE, stdout=PIPE,
           bufsize=1, universal_newlines=True) as ruby:
    while True:
        line = input("Enter expression or exit:")
        # send request
        print(line, file=ruby.stdin, flush=True)
        # read reply
        result = []
        for line in ruby.stdout:
            line = line.rstrip('\n')
            if line == "[end]":
                break
            result.append(line)
        else:  # no break, EOF
            log("Slave has terminated.")
            break
        log("result:" + "\n".join(result))
It uses universal_newlines=True to enable text mode. It uses locale.getpreferredencoding(False) to decode bytes. If you want to force utf-8 encoding regardless of locale settings then drop universal_newlines and wrap the pipes into io.TextIOWrapper(encoding="utf-8") (code example -- it also shows the proper exception handling for the pipes).
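For reference, a minimal sketch of that io.TextIOWrapper wrapping, assuming the same slave.rb as above (the exception handling the linked example shows is omitted here, and the '3*4' request is just a sample input):
#!/usr/bin/env python3
import io
from subprocess import Popen, PIPE

with Popen(['ruby', 'slave.rb'], stdin=PIPE, stdout=PIPE) as ruby:
    # Wrap the raw byte pipes so both ends speak UTF-8 regardless of locale.
    stdin = io.TextIOWrapper(ruby.stdin, encoding='utf-8', line_buffering=True)
    stdout = io.TextIOWrapper(ruby.stdout, encoding='utf-8')
    stdin.write('3*4\n')      # line_buffering=True flushes on the newline
    for line in stdout:
        line = line.rstrip('\n')
        if line == '[end]':
            break
        print('result:', line)
    stdin.close()             # send EOF so the ruby loop terminates
    stdout.close()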

control stdin and stdout of a ruby program in python

First I should mention: I'm a Python programmer with no knowledge of Ruby!
Now, I need to feed the stdin of a Ruby program and capture the stdout of that script with a Python program.
I tried this (fourth solution) and the code works in Python 2.7 but not in Python 3; the Python 3 code reads input but shows no output.
Now, I need a way to tie the Ruby program to either Python 2 or 3.
My try:
This code is written with the six module for cross-version compatibility.
python code:
from subprocess import Popen, PIPE as pipe, STDOUT as out
import six

print('launching slave')
slave = Popen(['ruby', 'slave.rb'], stdin=pipe, stdout=pipe, stderr=out)
while True:
    if six.PY3:
        from sys import stderr
        line = input('enter command: ') + '\n'
        line = line.encode('ascii')
    else:
        line = raw_input('enter command: ') + '\n'
    slave.stdin.write(line)
    res = []
    while True:
        if slave.poll() is not None:
            print('slave terminated')
            exit()
        line = slave.stdout.readline().decode().rstrip()
        print('line:', line)
        if line == '[exit]': break
        res.append(line)
    print('results:')
    print('\n'.join(res))
ruby code:
while cmd = STDIN.gets
  cmd.chop!
  if cmd == "exit"
    break
  else
    print eval(cmd), "\n"
    print "[exit]\n"
    STDOUT.flush
  end
end
NOTE:
Another way to do this is also welcome (like socket programming, etc.)!
Also, I think it may be a better idea not to use a pipe for stdout and to use a file-like object instead (like tempfile or StringIO, etc.).
It's because of bufsize. In Python 2.x the default value was 0 (unbuffered), and in Python 3.x it changed to -1 (use the default buffer size of the system).
Specifying it explicitly will solve your problem:
slave = Popen(['ruby', 'slave.rb'], stdin=pipe, stdout=pipe, stderr=out, bufsize=0)
DEMO
Below is the code on how I got it working with Ruby & Python3.
Ruby Slave
# read command from standard input:
while cmd = STDIN.gets
  # remove whitespace:
  cmd.chop!
  # if command is "exit", terminate:
  if cmd == "exit"
    break
  else
    # else evaluate command, send result to standard output:
    print eval(cmd), "\n"
    print "[exit]\n"
    # flush stdout to avoid buffering issues:
    STDOUT.flush
  end
end
Python master
from subprocess import Popen, PIPE as pipe, STDOUT as out

print('Launching slave')
slave = Popen(['ruby', 'slave.rb'], stdin=pipe, stdout=pipe, stderr=out, bufsize=0)
while True:
    from sys import stderr
    line = input('Enter command: ') + '\n'
    line = line.encode('ascii')
    slave.stdin.write(line)
    res = []
    while True:
        if slave.poll() is not None:
            print('Slave terminated')
            exit()
        line = slave.stdout.readline().decode().rstrip()
        if line == '[exit]': break
        res.append(line)
    print('results:')
    print('\n'.join(res))

Read the Content fully from a Popen file object

I'm using subprocess to run a script, get the output of the script on a pipe, and process that output.
I experience a weird problem where sometimes it reads to the end of the script's output and at other times it does not.
I suspect this could be a problem with the buffer size; I have tried a few alternatives but haven't been successful yet.
def main():
    x = subprocess.Popen('./autotest', bufsize=1, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd='/home/vijay/run/bin', shell=True)
    with open("out.txt", 'wb') as f:
        for line in x.stdout:
            if 'Press \'q\' to quit scheduler' in line:
                print line.strip()
                f.write(line.strip())
                x.stdin.write('q')
                f.write('\n')
                x.stdin.close()
                x.stdout.flush()
                try:
                    x.stdout.read()
                except:
                    print 'Exception Occurred !!!'
                    os._exit(1)
            else:
                print line.strip()
                f.write(line.strip())
                f.write('\n')
                x.stdout.flush()

if __name__ == '__main__':
    main()
You should keep trying to read from stdout until the process terminates, not until stdout ends; use poll() to check whether the process has terminated and, if not, try to read again.
From the Subprocess manual:
[ http://docs.python.org/library/subprocess.html ]
Warning: Use communicate() rather than .stdin.write, .stdout.read or .stderr.read to avoid deadlocks due to any of the other OS pipe buffers filling up and blocking the child process.
This sounds like it may be the problem you are experiencing. For example, if stderr filled up, I believe that could cause the process to block, preventing it from producing further output on stdout.
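A minimal sketch of one way to act on that, for the script above: merge stderr into stdout so an unread stderr pipe can never fill up (communicate() itself does not fit here because the script must see the prompt before sending 'q'). The './autotest' command, working directory, prompt string, and 'q' reply are taken from the question; the text mode and Python 3 syntax are assumptions:
#!/usr/bin/env python3
import subprocess

# Merge stderr into stdout so an unread stderr pipe can't fill up and block the child.
x = subprocess.Popen('./autotest', shell=True, bufsize=1, universal_newlines=True,
                     stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT,
                     cwd='/home/vijay/run/bin')
with open("out.txt", 'w') as f:
    for line in x.stdout:
        f.write(line)
        if "Press 'q' to quit scheduler" in line:
            x.stdin.write('q')  # answer the prompt
            x.stdin.close()     # flushes and signals no more input, so the child can finish
x.wait()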
