Interprocess Communication in Python [duplicate]

This question already has answers here:
How do I pass a string into subprocess.Popen (using the stdin argument)?
(12 answers)
Closed 8 years ago.
I'm writing a script in Python that should communicate with a program, "ConsoleApplication.exe" (written in C). Once started, this program waits for a fixed-length command (5 bytes) on its stdin and, after 2-3 seconds, produces output on its stdout that my Python script should read.
//This is the "ConsoleApplication.c" file
#include <stdio.h>
#include "function.h"   /* user-supplied header that declares function() */

char command[6];         /* 5-byte command plus the terminating '\0' */

int main()
{
    while (1)
    {
        char *output;
        scanf("%5s", command);
        output = function(command);
        printf("%s\n", output);
        fflush(stdout);  /* make sure the reply actually reaches the pipe */
    }
}
#this is the Python code
import subprocess
import sys

#start the process
p = subprocess.Popen(['ConsoleApplication.exe'], shell=True, stderr=subprocess.PIPE)
#command to send to ConsoleApplication.exe
command_to_send = "000648"
#this seems to work well, but I need to send a command stored in a buffer, and if I try
#to use sys.stdout.write(command_to_send) nothing happens. The problem seems to be that
#sys.stdout.write expects an I/O file object
while True:
    out = p.stderr.read(1)
    if out == '' and p.poll() is not None:
        break
    if out != '':
        sys.stdout.write(out)
        sys.stdout.flush()
Any suggestions? How can I fix it?
I tried to use
stdout = p.communicate(input='test\n')[0]
but I'm getting the following error at runtime:
"TypeError: 'str' does not support the buffer interface"
I also tried this
from subprocess import Popen, PIPE, STDOUT
p = Popen(['ConsoleApplication.exe'], stdout=PIPE, stdin=PIPE, stderr=PIPE)
out, err = p.communicate(input='00056\n'.encode())
print(out)
out, err = p.communicate(input='00043\n'.encode())
print(out)
but I get this error:
"ValueError: Cannot send input after starting communication"

Looks like this question has your solution
Python - How do I pass a string into subprocess.Popen (using the stdin argument)?
Use the Popen.communicate() method
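A minimal sketch of that approach (illustrative only; it assumes ConsoleApplication.exe exits after its input is consumed, since communicate() waits for the process to terminate):

from subprocess import Popen, PIPE, STDOUT

# sketch: send one 5-byte command as bytes and collect everything the
# program prints until it exits ('00064' is just an example command)
p = Popen(['ConsoleApplication.exe'], stdin=PIPE, stdout=PIPE, stderr=STDOUT)
out, _ = p.communicate(input=b'00064\n')   # bytes, not str, on Python 3
print(out.decode())

Note that communicate() can only be called once per process, which is why the second call in the question raises "ValueError: Cannot send input after starting communication". To send several commands to one long-running process, write to p.stdin and read from p.stdout directly, as the related answers below show.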

Related

Cannot use python subprocess to send input to console application written in C

I want to automatically test a program written in C using Python. I have the C file and the Python code below. I compiled my C code into an executable (a.exe), and running it in the command prompt or normally works as expected. When I try to run it with subprocess, though, I receive no output. I also tried p1.communicate() and p1.read(), both sending input and reading output and reading output only, but to no avail. Is there a way, using Python (subprocess or anything else), to run the C code below and interact with it (sending it input and reading its output until it closes itself)?
Output and input attempt
from subprocess import Popen, PIPE
p = Popen(['a.exe'], shell=True, stdout=PIPE, stdin=PIPE)
for ii in range(10):
    value = str(ii) + '\n'
    value = bytes(value, 'UTF-8')  # Needed in Python 3.
    p.stdin.write(value)
    p.stdin.flush()
    result = p.stdout.readline().strip()
    print(result)
value = '-1\n'
value = bytes(value, 'UTF-8')  # Needed in Python 3.
p.stdin.write(value)
p.stdin.flush()
result = p.stdout.readline().strip()
print(result)
Output only attempt
from subprocess import Popen, PIPE
p = Popen(['a.exe'], shell=True, stdout=PIPE, stdin=PIPE)
result = p.stdout.readline().strip()
print(result)
Example C file I'm trying to run for reference
#include <stdio.h>
int main() {
    int userInput;
    printf("Hello, this is a basic program\n");
    do {
        printf("Enter user input:\n");
        scanf("%d", &userInput);
        printf("The user input is: %d\n", userInput);
        printf("Entering -1 will exit");
    } while (userInput != -1);
    printf("Left the loop successfully\n");
    return 0;
}

Receive return data from subprocess in python

I'm spawning a process from a script using subprocess. My subprocess takes a JSON input and performs some operations and should return some real time data to the main process. How can I do this from subprocess?
I'm trying something like this, but it is throwing an error.
The following is my main process, "main.py":
p = subprocess.Popen(['python', 'handler.py'],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)
p.communicate(JSONEncoder().encode(data))
while True:
    out = p.stdout.read(1)
    if out == '' and p.poll() is not None:
        break
    if out != '':
        sys.stdout.write(out)
        sys.stdout.flush()
Below is my subprocess "handler.py"
if __name__ == '__main__':
    command = json.load(sys.stdin)
    os.environ["PYTHONPATH"] = "../../"
    if command["cmd"] == "archive":
        print "command received:", command["cmd"]
        file_ids, count = archive(command["files"])
        sys.stdout.write(JSONEncoder().encode(file_ids))
But it throws an error.
Traceback (most recent call last):
File "./core/main.py", line 46, in <module>
out = p.stdout.read(1)
ValueError: I/O operation on closed file
Am I doing something wrong here??
Popen.communicate() does not return until the process is dead, and it returns all the output at once. You can't read the subprocess's stdout after it. Look at the top of the .communicate() docs:
Interact with process: Send data to stdin. Read data from stdout and
stderr, until end-of-file is reached. Wait for process to terminate. (Emphasis is mine.)
If you want to send data and then read the output line by line as text while the child process is still running:
#!/usr/bin/env python3
import json
from subprocess import Popen, PIPE

# command and data are the same as in the question,
# e.g. ['python', 'handler.py'] and the dict to send
with Popen(command, stdin=PIPE, stdout=PIPE, universal_newlines=True) as proc:
    with proc.stdin as pipe:
        pipe.write(json.dumps(data))
    for line in proc.stdout:
        print(line, end='')
        process(line)  # process() stands for whatever per-line handling you need
If you need code for older python versions or you have buffering issues, see Python: read streaming input from subprocess.communicate().
If all you want is to pass data to the child process and to print the output to terminal:
#!/usr/bin/env python3.5
import json
import subprocess
subprocess.run(command, input=json.dumps(data).encode())
If your actual child process is a Python script then consider importing it as a module and running the corresponding functions instead, see Call python script with input with in a python script using subprocess.
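For instance, a minimal sketch of that idea (assuming archive() can be imported from handler.py and that data is the dict main.py was going to send):

# a sketch only: call the child's logic in-process instead of spawning it
from handler import archive   # assumes archive() is importable from handler.py

file_ids, count = archive(data["files"])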
communicate reads all the output from a subprocess and closes it. If you want to be able to read from the process after writing, you have to use something other than communicate, such as p.stdin.write. Alternatively, just use the output of communicate; it should have what you want (see https://docs.python.org/3/library/subprocess.html#popen-objects).
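A minimal sketch of the write-then-read approach for main.py (Python 2, like the question; it assumes handler.py reads a single JSON object from stdin and writes a single reply to stdout):

import json
import subprocess

# a sketch only: write the request, close stdin so the child sees EOF,
# then read whatever handler.py printed to its stdout
p = subprocess.Popen(['python', 'handler.py'],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)
p.stdin.write(json.dumps(data))   # same payload as JSONEncoder().encode(data)
p.stdin.close()                   # json.load(sys.stdin) in the child returns at EOF
reply = p.stdout.read()           # everything handler.py wrote to its stdout
p.wait()
print(reply)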

Python reading from stdin hangs when interacting with ruby code

I was trying to get Python and Ruby code to talk to each other, and I found the methods from this link (http://www.decalage.info/python/ruby_bridge).
I tried the last method, using stdin and stdout to pass information. I made some changes to the original code so that it fits Python 3.4, but I am not sure whether my changes messed things up. My Python program always hangs when reading from stdin, and nothing is printed. I am not familiar with stdin and stdout, so I am wondering why this does not work.
Here is my Ruby code:
$stdin.set_encoding("utf-8:utf-8")
$stdout.set_encoding("utf-8:utf-8")
while cmd = $stdin.gets
  cmd.chop!
  if cmd == "exit"
    break
  else
    puts eval(cmd)
    puts "[end]"
    $stdout.flush
  end
end
I am not sure if it is possible to set the internal and external encoding like this. And here is my Python code:
from subprocess import Popen, PIPE, STDOUT
print("Launch slave process...")
slave = Popen(['ruby', 'slave.rb'], stdin=PIPE, stdout=PIPE, stderr=STDOUT)
while True:
    line = input("Enter expression or exit:")
    slave.stdin.write((line + '\n').encode('UTF-8'))
    result = []
    while True:
        if slave.poll() is not None:
            print("Slave has terminated.")
            exit()
        line = slave.stdout.readline().decode('UTF-8').rstrip()
        if line == "[end]":
            break
        result.append(line)
    print("result:")
    print("\n".join(result))
When I try to run the Python script, input "3*4", and press enter, nothing shows up until I break the process manually, which ends with exit code 1 and a KeyboardInterrupt exception.
I have been struggling with this problem for quite a long time and I don't know what goes wrong...
Thanks in advance for any potential help!
The difference is that bufsize=-1 by default in Python 3.4, and therefore slave.stdin.write() does not send the line to the Ruby subprocess immediately. A quick fix is to add a slave.stdin.flush() call.
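In the question's loop that quick fix would look something like this:

slave.stdin.write((line + '\n').encode('UTF-8'))
slave.stdin.flush()  # push the buffered line to the Ruby process right away

A fuller rewrite that avoids the issue by using text mode with line buffering: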
#!/usr/bin/env python3
from subprocess import Popen, PIPE

log = print
log("Launch slave process...")
with Popen(['ruby', 'slave.rb'], stdin=PIPE, stdout=PIPE,
           bufsize=1, universal_newlines=True) as ruby:
    while True:
        line = input("Enter expression or exit:")
        # send request
        print(line, file=ruby.stdin, flush=True)
        # read reply
        result = []
        for line in ruby.stdout:
            line = line.rstrip('\n')
            if line == "[end]":
                break
            result.append(line)
        else:  # no break, EOF
            log("Slave has terminated.")
            break
        log("result:" + "\n".join(result))
It uses universal_newlines=True to enable text mode; bytes are decoded using locale.getpreferredencoding(False). If you want to force utf-8 encoding regardless of the locale settings, drop universal_newlines and wrap the pipes in io.TextIOWrapper(encoding="utf-8") (the code example for that also shows the proper exception handling for the pipes).
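A minimal sketch of that wrapping, assuming the same slave.rb protocol as above:

import io
from subprocess import Popen, PIPE

# a sketch only: force utf-8 on both pipes instead of relying on the locale
with Popen(['ruby', 'slave.rb'], stdin=PIPE, stdout=PIPE) as ruby:
    stdin = io.TextIOWrapper(ruby.stdin, encoding='utf-8', line_buffering=True)
    stdout = io.TextIOWrapper(ruby.stdout, encoding='utf-8')
    stdin.write('3*4\n')
    stdin.flush()
    for line in stdout:
        if line.rstrip('\n') == "[end]":
            break
        print(line, end='')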

Python string as file argument to subprocess

I am trying to pass a file to a program (MolPro) that I start as a subprocess with Python.
It most commonly takes a file as an argument, like this in the console:
path/molpro filename.ext
where filename.ext contains the code to execute. Alternatively, a bash script (what I'm trying to do, but in Python):
#!/usr/bin/env bash
path/molpro << EOF
# MolPro code
EOF
I'm trying to do the above in Python. I have tried this:
from subprocess import Popen, STDOUT, PIPE
DEVNULL = open('/dev/null', 'w') # I'm using Python 2 so I can't use subprocess.DEVNULL
StdinCommand = '''
MolPro code
'''
# Method 1 (stdout will be a file)
Popen(['path/molpro', StdinCommand], shell = False, stdout = None, stderr = STDOUT, stdin = DEVNULL)
# ERROR: more than 1 input file not allowed
# Method 2
p = Popen(['path/molpro', StdinCommand], shell = False, stdout = None, stderr = STDOUT, stdin = PIPE)
p.communicate(input = StdinCommand)
# ERROR: more than 1 input file not allowed
So I am pretty sure the input doesn't look enough like a file, but even looking at Python - How do I pass a string into subprocess.Popen (using the stdin argument)? I can't find what I'm doing wrong.
I prefer not to:
Write the string to an actual file
set shell to True
(And I can't change MolPro code)
Thanks a lot for any help!
Update: if anyone is trying to do the same thing, if you don't want to wait for the job to finish (as it doesn't return anything, either way), use p.stdin.write(StdinCommand) instead.
It seems like your second method should work if you remove StdinCommand from the Popen() arguments:
p = Popen(['/vol/thchem/x86_64-linux/bin/molpro'], shell = False, stdout = None, stderr = STDOUT, stdin = PIPE)
p.communicate(input = StdinCommand)
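For the variant mentioned in the question's update (feeding the input without waiting for the job to finish), a sketch would be (Python 2, as in the question):

from subprocess import Popen, PIPE, STDOUT

# a sketch only: write the MolPro input and return without waiting
# ('path/molpro' and StdinCommand are the placeholders from the question)
p = Popen(['path/molpro'], shell=False, stdout=None, stderr=STDOUT, stdin=PIPE)
p.stdin.write(StdinCommand)
p.stdin.close()   # send EOF so MolPro knows the input is complete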

Python subprocess module: parent child communication not working

I'm trying to run the following code as a subprocess
#include <stdio.h>
int main()
{
    int a;
    printf("Hello\n");
    fprintf(stderr, "Hey\n");
    scanf("%d", &a);
    printf("%d\n", a);
    return 0;
}
This script works fine: it writes to stdin and reads from stdout and stderr.
#!/usr/bin/python
import subprocess
p1=subprocess.Popen("/mnt/test/a.out", stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.PIPE)
p1.stdin.write('1\n')
print p1.stdout.readline()
print p1.stderr.readline()
print p1.stdout.readline()
But this script fails to read any output from stdout and gets blocked there even though the C program does print to stdout before demanding any input. Why is it that I'm unable to read anything from stdout?
#!/usr/bin/python
import subprocess
p1=subprocess.Popen("/mnt/test/a.out", stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.PIPE)
print p1.stdout.readline()
p1.stdin.write('1\n')
print p1.stderr.readline()
print p1.stdout.readline()
You need to flush the stream first. This will make sure all data is actually written to the stream.
#include <stdio.h>
int main()
{
    int a;
    printf("Hello\n");
    fprintf(stderr, "Hey\n");
    fflush(stdout); // <--
    scanf("%d", &a);
    printf("%d\n", a);
    return 0;
}
By default, stderr is unbuffered, which is why you don't need to flush it. stdout, however, is fully buffered unless it points to a terminal, in which case it is line-buffered (i.e. the \n automatically triggers flushing).
Have a look at setbuf() and setvbuf() for ways to change the buffering mode.
I don't see something like
stdout_data, stderr_data = p1.communicate()
in your code
Popen.communicate(input=None)
Interact with process: Send data to stdin. Read data from stdout and stderr, until end-of-file is reached. Wait for process to terminate. The optional input argument should be a string to be sent to the child process, or None, if no data should be sent to the child.
communicate() returns a tuple (stdoutdata, stderrdata).
Note that if you want to send data to the process’s stdin, you need to create the Popen object with stdin=PIPE. Similarly, to get anything other than None in the result tuple, you need to give stdout=PIPE and/or stderr=PIPE too.
Note: The data read is buffered in memory, so do not use this method if the data size is large or unlimited.
See docs.python.org
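For example, a sketch along those lines for the program above (Python 2, matching the question); note that communicate() waits for the process to exit, so all input has to be supplied up front:

import subprocess

# a sketch only: send the input in one go and collect both streams after exit
p1 = subprocess.Popen("/mnt/test/a.out", stdout=subprocess.PIPE,
                      stdin=subprocess.PIPE, stderr=subprocess.PIPE)
stdout_data, stderr_data = p1.communicate('1\n')
print(stdout_data)   # expected: "Hello\n1\n"
print(stderr_data)   # expected: "Hey\n"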
A function I keep in my utility-belt to wrap calling an external program using subprocess is this (modify to suit your needs):
# requires: import shlex, subprocess, sys
def __run(self, cmd):
    """wrapper, makes it easy to call an external program.
    return the result as a newline-separated list
    """
    args = shlex.split(cmd)
    try:
        p = subprocess.Popen(args, stdout=subprocess.PIPE,
                             stderr=subprocess.STDOUT)
        retdata = p.communicate()[0]
        p.wait()
    except OSError, e:
        print >>sys.stderr, "Execution failed:", e
    return (p.returncode, retdata.split('\n'))
Just place the command as you would write it on the command line in a variable and call the function, e.g.:
cmd = r'git branch -r'
data = self.__run(cmd)
