I am trying to get realtime output of a subprocess.call by defining my own output stream, but it doesn't seem to work.
Reason: I want to run a subprocess and send the output of that call both to stdout (in real time, so I can watch the script and see current progress) and to a log file.
print_process.py:

import time

while True:
    print("Things")
    time.sleep(1)
mainprocess.py:

import subprocess
import io

class CustomIO(io.IOBase):
    def write(self, str):
        print("CustomIO: %s" % str)
        # logging to be implemented here

customio = CustomIO()
subprocess.call(["python3", "print_process.py"], stdout=customio)
But when I run this code I get this error message:
Traceback (most recent call last):
File "call_test.py", line 9, in <module>
subprocess.call(["python3", "print_process.py"], stdout=customio)
File "/usr/lib/python3.4/subprocess.py", line 537, in call
with Popen(*popenargs, **kwargs) as p:
File "/usr/lib/python3.4/subprocess.py", line 823, in __init__
errread, errwrite) = self._get_handles(stdin, stdout, stderr)
File "/usr/lib/python3.4/subprocess.py", line 1302, in _get_handles
c2pwrite = stdout.fileno()
io.UnsupportedOperation: fileno
So, does anyone have any clue whether this is possible?
Am I inheriting from the wrong base class?
Am I not overriding the proper methods?
Or am I completely off the rails and should be going about this in an entirely different way?
If you want to process the output of a subprocess, you need to pass stdout=subprocess.PIPE. However, call() and run() both wait until the process is finished before making its output available, so you cannot handle it in real time using these functions.
You need to use subprocess.Popen:
import subprocess as sp

def handle_output(output_line):
    ...

my_process = sp.Popen(["python3", "print_process.py"],
                      stdout=sp.PIPE,
                      universal_newlines=True)  # changes stdout from bytes to text

for line in my_process.stdout:
    handle_output(line)

my_process.wait()
Update: Make sure to flush the output buffer in your child process:
while True:
    print("Things", flush=True)
    time.sleep(1)
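To get both goals from the question at once (live console output plus a log file), the reading loop above can be extended. A minimal sketch, using a `python -c` one-liner via `sys.executable` as a stand-in for print_process.py:

```python
import subprocess as sp
import sys

# Stand-in child process; the question's print_process.py would go here.
child_cmd = [sys.executable, "-c", "print('Things', flush=True)"]

with open("child.log", "w") as log, sp.Popen(
    child_cmd, stdout=sp.PIPE, universal_newlines=True
) as proc:
    for line in proc.stdout:
        print(line, end="")  # live console output
        log.write(line)      # persistent copy on disk
```

Each line is handled the moment the child flushes it, so the console and the log stay in step with the child's progress.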
You need to specify an open stream with a file descriptor. fileno() isn't implemented for io.IOBase because that is just an in-memory stream:
Frequently Used Arguments
stdin, stdout and stderr specify the executed program’s standard
input, standard output and standard error file handles, respectively.
Valid values are PIPE, DEVNULL, an existing file descriptor (a
positive integer), an existing file object, and None. PIPE indicates
that a new pipe to the child should be created. DEVNULL indicates that
the special file os.devnull will be used. With the default settings of
None, no redirection will occur;
So you can use sockets, pipes, and open files as stdout; the file descriptor is passed to the child process as its stdout. I haven't used sockets with subprocess.Popen, but I expect them to work: what matters here is the file descriptor handed to the child, not what type of object that descriptor points to.
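For instance, an ordinary file object has a real fileno(), so it can be passed straight through as stdout. A sketch, with a `python -c` one-liner standing in for the question's child script:

```python
import subprocess
import sys

# A plain file has a real file descriptor, so subprocess can hand it
# to the child as its stdout; sys.executable stands in for "python3".
with open("child_output.log", "w") as log_file:
    subprocess.call(
        [sys.executable, "-c", "print('hello from child')"],
        stdout=log_file,
    )

with open("child_output.log") as log_file:
    print(log_file.read().strip())
```

This loses the real-time aspect (the child writes directly to the file, bypassing Python), but it shows why a descriptor-backed object works where an io.IOBase subclass does not.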
I'm writing a Python script to parse a value from a JSON file, run a tool called Googler with a couple of arguments (including the value from the JSON file), and then save the output of the tool to a file (CSV preferred, but that's for another day).
So far the code is:
import json
import os
import subprocess
import time

with open("test.json") as json_file:
    json_data = json.load(json_file)

test = (json_data["search_term1"]["apparel"]["biba"])
#os.system("googler -N -t d1 "+test) shows the output, but can't write to a file.
result = subprocess.run(["googler", "-N", "-t", "d1", test], stdout=subprocess.PIPE, universal_newlines=True)
print(result.stdout)
When I run the above script, nothing happens; the terminal just sits blank until I send a keyboard interrupt, and then I get this error:
Traceback (most recent call last):
File "script.py", line 12, in <module>
result= subprocess.run(["googler", "-N","-t","d1",test], stdout=subprocess.PIPE, universal_newlines=True)
File "/usr/lib/python3.5/subprocess.py", line 695, in run
stdout, stderr = process.communicate(input, timeout=timeout)
File "/usr/lib/python3.5/subprocess.py", line 1059, in communicate
stdout = self.stdout.read()
KeyboardInterrupt
I tried replacing the test variable with a string: same error. The same line works with something like "ls", "-l", "/dev/null".
How do I extract the output of this tool and write it to a file?
Your googler command works in interactive mode. It never exits, so your program is stuck.
You want googler to run the search, print the output and then exit.
From the docs, I think --np (or --noprompt) is the right parameter for that. I didn't test.
result = subprocess.run(["googler", "-N", "-t", "d1", "--np", test], stdout=subprocess.PIPE, universal_newlines=True)
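Once googler exits on its own, writing the captured output to a file is the easy part. A sketch of that last step, with a harmless `python -c` one-liner standing in for the googler command:

```python
import subprocess
import sys

# Stand-in for the googler invocation; any command that prints works.
result = subprocess.run(
    [sys.executable, "-c", "print('search results')"],
    stdout=subprocess.PIPE,
    universal_newlines=True,
)

# Save the captured output to a file, as the question asks.
with open("results.txt", "w") as f:
    f.write(result.stdout)
```

result.stdout is a plain string here (because of universal_newlines=True), so any later CSV conversion can parse it line by line.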
I'm currently learning how to use subprocesses, and for this (and some other) reasons I've bought a book on working with them. It's a good book and I'm not having trouble understanding it. It starts by explaining how to execute shell commands as subprocesses.
I have a programming problem I've had for ages, and with subprocesses I could solve it, but I need to execute a function callback as a subprocess.
I have this code to echo something, but it's a shell command:
import subprocess

proc = subprocess.Popen(['echo', 'Hello, this is child process speaking'],
                        stdout=subprocess.PIPE, shell=True)
out, err = proc.communicate()
print(out.decode('utf-8'))
I want this callback to be executed as a subprocess:
def callb():
    import time as t
    print('2')
    t.sleep(2)
    print('1')
    t.sleep(2)
    print('0')
I tried to execute this callback like this (a simple, naive idea):
proc = subprocess.Popen(callb())
but this gives me the following error:
Traceback (most recent call last):
File "/root/testfile.py", line 6, in <module>
proc = subprocess.Popen(callb())
File "/usr/lib/python3.3/subprocess.py", line 818, in __init__
restore_signals, start_new_session)
File "/usr/lib/python3.3/subprocess.py", line 1321, in _execute_child
args = list(args)
TypeError: 'NoneType' object is not iterable
The strange thing is that it does execute the callback, but then it raises this error!
What did I do wrong? Did I forget something?
The subprocess module is not suitable for executing Python callbacks; it runs external programs. You want the multiprocessing module instead. The first few examples in its docs, using Process and Pool, do what you want.
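A minimal sketch of the question's countdown callback run via multiprocessing.Process (the sleeps are shortened so the example finishes quickly):

```python
import time
from multiprocessing import Process

def callb():
    # the question's countdown, with shorter sleeps
    print('2')
    time.sleep(0.1)
    print('1')
    time.sleep(0.1)
    print('0')

if __name__ == '__main__':
    p = Process(target=callb)  # pass the callable itself, not callb()
    p.start()                  # runs callb in a separate process
    p.join()                   # wait for the child to finish
```

Note that Process receives the function object (target=callb), not the result of calling it; passing callb() was exactly the mistake in the question, since the call runs in the parent and returns None.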
My code is as follows. Basically, this module runs the required command and captures its output line by line. In my case the command takes just over a second to return to the command prompt, and that's where child.stdout.read(1) hangs. If I run a normal command, it prints everything as expected; but in the particular case where the command prints something to STDOUT and then takes some time to return to the prompt, it hangs. Please help.
New code:
def run_command(shell_command):
    '''run the required command and print the log'''
    child = subprocess.Popen(shell_command, shell=True, stdout=subprocess.PIPE)
    (stdoutdata, stderrdata) = child.communicate()
    print stdoutdata
    print "Exiting.."
Error:
File "upgrade_cloud.py", line 62, in <module>
stop_cloud()
File "upgrade_cloud.py", line 49, in stop_cloud
run_command(shell_command)
File "upgrade_cloud.py", line 33, in run_command
(stdoutdata, stderrdata) = child.communicate()
File "/usr/lib/python2.6/subprocess.py", line 693, in communicate
stdout = self.stdout.read()
KeyboardInterrupt
Here's your problem:
child.wait()
This line causes Python to wait for the child process to exit. If the child process tries to print a lot of data to stdout, it will block waiting for Python to read said data. Since Python is waiting for the child process and the child process is waiting for Python, you get a deadlock.
I would recommend using subprocess.check_output() instead of subprocess.Popen. You could also use the Popen.communicate() method instead of the .wait() method.
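A sketch of the check_output approach in Python 3 syntax, with a `python -c` one-liner standing in for the real command; check_output drains the pipe while the child runs, so the deadlock cannot occur:

```python
import subprocess
import sys

# Stand-in command: prints, then lingers briefly before exiting,
# mimicking the behaviour described in the question.
cmd = [sys.executable, "-c",
       "import time; print('some output'); time.sleep(0.2)"]

# check_output reads stdout while the child runs, so the child can
# never block on a full pipe the way it can with wait().
output = subprocess.check_output(cmd, universal_newlines=True)
print(output.strip())
```

communicate() gives the same guarantee: both read the pipe continuously instead of letting it fill while the parent waits.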
I would like to run an exe from this directory: /home/pi/pi_sensors-master/bin/Release/
This exe is run by typing mono i2c.exe and it runs fine.
I would like to get its output in a Python script that lives in a completely different directory.
I know that I should use subprocess.check_output to capture the output as a string.
I tried to implement this in python:
import subprocess
import os

cmd = "/home/pi/pi_sensors-master/bin/Release/"
os.chdir(cmd)
process = subprocess.check_output(['mono i2c.exe'])
print process
However, I received this error:
The output would usually be a data stream with a new number each time; is it possible to capture this output and store it as a constantly changing variable?
Any help would be greatly appreciated.
Your command syntax is incorrect, which is actually generating the exception. You want to call mono i2c.exe, so your command list should look like:
subprocess.check_output(['mono', 'i2c.exe']) # Notice the comma separation.
Try the following:
import subprocess
import os
executable = "/home/pi/pi_sensors-master/bin/Release/i2c.exe"
print subprocess.check_output(['mono', executable])
The sudo is not a problem as long as you give the full path to the file and you are sure that running the mono command as sudo works.
I can generate the same error by running ls -l the same way:
>>> subprocess.check_output(['ls -l'])
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/subprocess.py", line 537, in check_output
process = Popen(stdout=PIPE, *popenargs, **kwargs)
File "/usr/lib/python2.7/subprocess.py", line 679, in __init__
errread, errwrite)
File "/usr/lib/python2.7/subprocess.py", line 1249, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
However when you separate the command from the options:
>>> subprocess.check_output(['ls', '-l'])
# outputs my entire folder contents which are quite large.
I strongly advise you to use a subprocess.Popen object to deal with external processes. Use Popen.communicate() to get the data from both stdout and stderr; this way you should not run into blocking problems.
import os
import subprocess

executable = "/home/pi/pi_sensors-master/bin/Release/i2c.exe"
proc = subprocess.Popen(['mono', executable],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
try:
    outs, errs = proc.communicate(timeout=15)  # Times out after 15 seconds.
except subprocess.TimeoutExpired:
    proc.kill()
    outs, errs = proc.communicate()
Or you can call the communicate in a loop if you want a 'data-stream' of sort, an answer from this question:
from subprocess import Popen, PIPE

executable = "/home/pi/pi_sensors-master/bin/Release/i2c.exe"
p = Popen(["mono", executable], stdout=PIPE, bufsize=1)
for line in iter(p.stdout.readline, b''):
    print line,
p.communicate()  # close p.stdout, wait for the subprocess to exit
I'm using a Python script to execute an Ant-based framework batch file (Helium.bat):
subprocess.Popen('hlm '+commands, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
However, the script always stops and displays the following error when it executes the .bat file:
import codecs
File "C:\Python25\lib\codecs.py", line 1007, in <module>
strict_errors = lookup_error("strict")
File "C:\Python25\lib\codecs.py", line 1007, in <module>
strict_errors = lookup_error("strict")
File "C:\Python25\lib\encodings\__init__.py", line 31, in <module>
import codecs, types
File "C:\Python25\lib\types.py", line 36, in <module>
BufferType = buffer
NameError: name 'buffer' is not defined
If I execute the .bat directly on command line, there will not be any issue.
I think at least part of the problem is how you're executing the batch file. Give this a try:
# execute the batch file as a separate process and echo its output
Popen_kwargs = {'stdout': subprocess.PIPE, 'stderr': subprocess.STDOUT,
                'universal_newlines': True}
with subprocess.Popen('hlm ' + commands, **Popen_kwargs).stdout as output:
    for line in output:
        print line,
This passes different arguments to Popen: this version removes shell=True, which isn't needed for a batch file; sets stderr=subprocess.STDOUT, which redirects stderr to the same place stdout is going, to avoid missing any error messages; and adds universal_newlines=True to make the output more readable.
Another difference is that it reads and prints the output from the Popen process, which effectively makes the Python script running the batch file wait until the batch file has finished executing before continuing on, which I suspect is important.