Execute Bash Command in new Process - python

I need to get information from a bash command that takes several seconds to run. I want the rest of the program to continue until I get the return code. I tried to do it with multiprocessing, but I can't get the returncode of the subprocess, although the console prints the correct returncode.
from multiprocessing import Process, Value
import subprocess

num = Value("d", 0.0)

class foo(object):
    def __init__(self):
        self.createProcess()

    def createProcess(self):
        p = Process(target=self.Process, args=(num,))
        p.start()
        # ...Do stuff here in parallel...

    def Process(self, n):
        somebashParam = int(n.value)
        p = subprocess.Popen("some -command" + str(somebashParam), shell=True)
        out, err = p.communicate()
        n.value = p.returncode
Why does the console print the right returncode, but I can't grab it?
It also seems strange to me to launch a subprocess inside another new Process. Is there a better way?

External processes automatically run in parallel. If you are only interested in the return code, you don't need any additional code:
n = 23
process = subprocess.Popen(["some", "-command", str(n)])
while process.poll() is None:
    do_something_else()
result = process.wait()
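If you would rather be handed the return code than poll for it, a thread-based future is another option. A minimal sketch, reusing the placeholder command from above:

import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_command(args):
    # subprocess.run blocks this worker thread, not the main program.
    return subprocess.run(args).returncode

with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(run_command, ["some", "-command", "23"])
    # ... do other work here in parallel ...
    returncode = future.result()  # blocks only when you finally need the code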

Related

Real time multiprocess stdout monitoring

Right now, I'm using subprocess to run a long-running job in the background. For multiple reasons (PyInstaller + AWS CLI) I can't use subprocess anymore.
Is there an easy way to achieve the same thing as below? Running a long-running Python function in a multiprocessing pool (or something else) and doing real-time processing of stdout/stderr?
import subprocess

process = subprocess.Popen(
    ["python", "long-job.py"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    shell=True,
)

while True:
    out = process.stdout.read(2000).decode()
    if not out:
        err = process.stderr.read().decode()
    else:
        err = ""
    if (out == "" or err == "") and process.poll() is not None:
        break
    live_stdout_process(out)
Thanks
Getting this to work cross-platform is messy: for a start, the Windows implementation of non-blocking pipes is neither user-friendly nor portable.
One option is to have your application read its own command-line arguments and conditionally execute a file; then you get to use subprocess, since you will be launching yourself with different arguments.
But to keep it to multiprocessing:
The output must be logged to queues instead of pipes.
You need the child to execute a Python file; this can be done using runpy to execute the file as __main__.
This runpy call should run under a multiprocessing child, and that child must first redirect its stdout and stderr in the initializer.
When an error happens, your main application must catch it, but if it is too busy reading the output it won't be able to wait for the error, so a child thread has to start the multiprocessing worker and wait for the error.
The main process has to create the queues, launch the child thread, and read the output.
Putting it all together:
import multiprocessing
from multiprocessing import Queue
import sys
import concurrent.futures
import threading
import traceback
import runpy
import time

class StdoutQueueWrapper:
    def __init__(self, queue: Queue):
        self._queue = queue

    def write(self, text):
        self._queue.put(text)

    def flush(self):
        pass

def function_to_run():
    # runpy.run_path("long-job.py", run_name="__main__")  # run long-job.py
    print("hello")    # print something
    raise ValueError  # error out

def initializer(stdout_queue: Queue, stderr_queue: Queue):
    sys.stdout = StdoutQueueWrapper(stdout_queue)
    sys.stderr = StdoutQueueWrapper(stderr_queue)

def thread_function(child_stdout_queue, child_stderr_queue):
    with concurrent.futures.ProcessPoolExecutor(
            1, initializer=initializer,
            initargs=(child_stdout_queue, child_stderr_queue)) as pool:
        result = pool.submit(function_to_run)
        try:
            result.result()
        except Exception:
            child_stderr_queue.put(traceback.format_exc())

if __name__ == "__main__":
    child_stdout_queue = multiprocessing.Queue()
    child_stderr_queue = multiprocessing.Queue()

    child_thread = threading.Thread(
        target=thread_function,
        args=(child_stdout_queue, child_stderr_queue),
        daemon=True)
    child_thread.start()

    while True:
        while not child_stdout_queue.empty():
            var = child_stdout_queue.get()
            print(var, end='')
        while not child_stderr_queue.empty():
            var = child_stderr_queue.get()
            print(var, end='')
        if not child_thread.is_alive():
            break
        time.sleep(0.01)  # check output every 0.01 seconds
Note that a direct consequence of running the job in a multiprocessing worker is that if the child hits a segmentation fault or some other unrecoverable error, the pool breaks and the failure propagates to the parent, so launching yourself under subprocess may be the better option if segfaults are expected.

Tried to make a non-blocking command execution function, what's causing this unexpected behavior?

What I wanted to happen:
So my goal was to write a function that leverages subprocess to run a command and read the stdout, whether it be immediate or delayed, line by line as it comes. And to do that in a non-blocking, asynchronous way.
I also wanted to be able to pass a function to be called each time a new stdout line is read.
What happened instead:
Until the process being run is completely finished / killed, the output isn't handled / printed as expected. All the correct output appears, but I expected it to print in real time as the output is polled. Instead, it waits until the entire process finishes running, then prints all the expected output.
What I tried:
So I wrote a simple test script lab_temp.py to provide some output:
from time import sleep

for i in range(10):
    print('i:', i)
    sleep(1)
And a function set_interval.py which I mostly copied from some SO answer (although I'm sorry I don't recall which answer to give credit):
import threading

def set_interval(func, sec):
    def func_wrapper():
        t = set_interval(func, sec)
        result = func()
        if result == False:
            t.cancel()
    t = threading.Timer(sec, func_wrapper)
    t.start()
    return t
And then a function call_command.py to run the command and asynchronously poll the process at some interval for output, until it's done. I'm only barely experienced with asynchronous code, and that's probably related to my mistake, but I think the async part is being handled behind the scenes by threading.Timer (in set_interval.py).
call_command.py:
import subprocess
from set_interval import set_interval

def call_command(cmd, update_func=None):
    p = subprocess.Popen(cmd, stderr=subprocess.PIPE, stdout=subprocess.PIPE, encoding='utf-8')
    def polling():  # Replaces "while True:" to convert to non-blocking
        for line in iter(p.stdout.readline, ''):
            if update_func:
                update_func(line.rstrip())
        if p.poll() == 0:
            print('False')
            return False  # cancel interval
        else:
            print('True')
            return True   # continue interval
    set_interval(polling, 1)
And each of these functions have basic tests:
set_interval.test.py (seems to run as expected):
from set_interval import set_interval

i = 0

def func():
    global i
    i += 1
    print(f"Interval: {i}...")
    if i > 5:
        return False
    else:
        return True

set_interval(func, 2)
print('non blocking')
call_command.test.py (results in the wrong behavior, as described initially):
from call_command import call_command

def func(out):
    print(out)  # <- This will print in one big batch once
                #    the entire process is complete.

call_command('python3 lab_temp.py', update_func=func)
print('non-blocking')  # <- This will print right away, so I
                       #    know it's not blocked / frozen.
What have I gotten wrong here causing the deviation from expectation?
Edit: Continued efforts...
import subprocess
from set_interval import set_interval

def call_command(cmd):
    p = subprocess.Popen(cmd, stderr=subprocess.PIPE, stdout=subprocess.PIPE, encoding='utf-8')
    def polling():
        line = p.stdout.readline().strip()
        if not line and p.poll() is not None:
            return False
        else:
            print(line)
            return True
    set_interval(polling, 1)
Doesn't work. Nearly identical issues.
The problem is located in your command. The lab_temp.py script uses the print function, which prints to sys.stdout by default. sys.stdout is block-buffered when it is not attached to a terminal, as is the case with a pipe. Since the buffer is large enough to hold the whole script's output, it gets flushed no sooner than at the end.
To fix this, you can either use sys.stdout's flush method:
from time import sleep
import sys

for i in range(10):
    print('i:', i)
    sys.stdout.flush()
    sleep(1)
or use print's flush parameter:
from time import sleep

for i in range(10):
    print('i:', i, flush=True)
    sleep(1)
or run the Python interpreter with the -u option:
call_command(['python3', '-u', 'lab_temp.py'])
or set the PYTHONUNBUFFERED environment variable to a non-empty string:
import subprocess
from set_interval import set_interval

def call_command(cmd):
    p = subprocess.Popen(cmd, stderr=subprocess.PIPE, stdout=subprocess.PIPE,
                         encoding='utf-8', env={'PYTHONUNBUFFERED': '1'})
    def polling():
        line = p.stdout.readline().strip()
        if not line and p.poll() is not None:
            return False
        else:
            print(line)
            return True
    set_interval(polling, 1)
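One caveat with the snippet above: passing env= replaces the child's entire environment, which can drop variables like PATH. If that is a concern, merging with the parent's environment is a common pattern (a sketch, with cmd standing for the same command as above):

import os
import subprocess

p = subprocess.Popen(cmd, stderr=subprocess.PIPE, stdout=subprocess.PIPE,
                     encoding='utf-8',
                     env={**os.environ, 'PYTHONUNBUFFERED': '1'})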
BTW, in order to avoid threads, you might want to use asyncio:
import asyncio

async def call_command(cmd):
    p = await asyncio.create_subprocess_exec(
        cmd[0], *cmd[1:],
        stderr=asyncio.subprocess.PIPE,
        stdout=asyncio.subprocess.PIPE)
    async for line in p.stdout:
        line = line.strip().decode('utf-8')
        print(line)
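A coroutine has to be driven by the event loop; one possible invocation, reusing the -u trick from above so the child's output is not buffered:

asyncio.run(call_command(['python3', '-u', 'lab_temp.py']))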

Get the command line output of a thread

I have a main script that runs several other programs in a certain order (to avoid launching them manually), so I use threads to call them.
The first one is a Windows application, and I call it like this:
import threading
import subprocess

class NepThread(threading.Thread):
    def run(self):
        subprocess.call('PATH_TO_PRGM.exe')

# ...
nepThread = NepThread()
nepThread.daemon = True
nepThread.start()
Then I run a Python script in kind of the same way:
class UsbCameraThread(threading.Thread):
    def run(self):
        subprocess.call(["python", 'PATH\\USBcamera.py'])

# ...
usbCameraThread = UsbCameraThread()
usbCameraThread.daemon = True
usbCameraThread.start()
But for this one, I need to wait until it has started before running the next script.
When the USBcamera script is ready, it writes something to stdout and then starts an infinite loop:
print('Start Video recording!')
while True:
My question is: how can I get the command-line output to know whether the script has started?
Thanks in advance!
Dark Patate
You can probably use subprocess.Popen() to capture the stdout stream. Here is a sample of its usage:
proc = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout_stream, stderr_stream = proc.communicate()
stdout_stream = stdout_stream.decode("ascii")
Then print out the contents of stdout_stream.
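One caveat: communicate() waits for the process to exit, which never happens here because of the infinite loop. To catch the startup message while the child keeps running, reading a single line is one option. A minimal sketch, assuming the script path from the question and that 'Start Video recording!' is its first line of output:

import subprocess

# -u keeps the child's stdout unbuffered so the line arrives promptly.
proc = subprocess.Popen(["python", "-u", "PATH\\USBcamera.py"],
                        stdout=subprocess.PIPE, encoding="ascii")
first_line = proc.stdout.readline().strip()  # blocks only until the first line
if first_line == 'Start Video recording!':
    # The camera script is up; safe to launch the next one here.
    pass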

How do I get data from a subprocess PIPE while the subprocess is running in Python?

I've got a program on Windows that calls a bunch of subprocesses, and displays the results in a GUI. I'm using PyQt for the GUI, and the subprocess module to run the programs.
I've got the following WorkerThread, which spawns a subthread for each shell command, devoted to reading the process's stdout and printing the results (later I'll wire it up to the GUI).
This all works. Except proc.stdout.read(1) never returns until after the subprocess has completed. This is a big problem, since some of these subprocesses can take 15-20 minutes to run, and I need to display results as they're running.
What do I need to do to get the pipe working while the subprocess is running?
class WorkerThread(QtCore.QThread):
    def run(self):
        def sh(cmd, cwd=None):
            proc = subprocess.Popen(cmd,
                                    shell=True,
                                    stdout=subprocess.PIPE,
                                    stderr=subprocess.STDOUT,
                                    stdin=subprocess.PIPE,
                                    cwd=cwd,
                                    env=os.environ)
            proc.stdin.close()

            class ReadStdOutThread(QtCore.QThread):
                def run(_self):
                    s = ''
                    while True:
                        if self.request_exit: return
                        b = proc.stdout.read(1)
                        if b == '\n':
                            print s
                            s = ''
                            continue
                        if b:
                            s += b
                            continue
                        if s: print s
                        return

            thread = ReadStdOutThread()
            thread.start()

            retcode = proc.wait()
            if retcode:
                raise subprocess.CalledProcessError(retcode, cmd)
            return 0
FWIW: I rewrote the whole thing using QProcess, and I see the exact same problem. stdout receives no data until the underlying process has returned; then I get everything all at once.
If you know how long the lines of the command's output will be, you can poll on the process's stdout PIPE.
An example of what I mean:
import select
import subprocess
import threading
import os

# Some time-consuming command.
command = 'while [ 1 ]; do sleep 1; echo "Testing"; done'

# A worker thread, not as complex as yours, just to show my point.
class Worker(threading.Thread):
    def __init__(self):
        super(Worker, self).__init__()
        self.proc = subprocess.Popen(
            command, shell=True,
            stdout=subprocess.PIPE,
            stdin=subprocess.PIPE, stderr=subprocess.STDOUT
        )

    def run(self):
        self.proc.communicate()

    def get_proc(self):
        # The proc is needed to ask it for its
        # output file descriptor later.
        return self.proc

if __name__ == '__main__':
    w = Worker()
    w.start()
    proc = w.get_proc()

    pollin = select.poll()
    pollin.register(proc.stdout, select.POLLIN)

    while 1:
        events = pollin.poll()
        for fd, event in events:
            if event == select.POLLIN:
                # This is the main issue with my idea:
                # if you don't know the length of the lines
                # the process outputs, this is a problem.
                # I use 7 since I know the word "Testing" has
                # 7 characters.
                print os.read(fd, 7)
Maybe this is not exactly what you're looking for, but I think it gives you a pretty good idea of how to solve your problem. Keep in mind that select.poll() is not available on Windows, which matters here since your program runs there.
EDIT: I think I've just found what you need: Streaming stdout from a Python subprocess in Python.
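Since select.poll() is out on Windows, a common substitute is a reader thread that forwards lines through a queue, which the GUI can poll without blocking. A minimal Python 3 sketch, with a placeholder command (swap in your real one):

import subprocess
import threading
from queue import Queue, Empty

def stream_output(cmd):
    proc = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT, text=True, bufsize=1)
    lines = Queue()

    def reader():
        for line in proc.stdout:  # yields lines as the child flushes them
            lines.put(line)
        lines.put(None)  # sentinel: EOF

    threading.Thread(target=reader, daemon=True).start()
    return proc, lines

# Usage: poll the queue from the GUI event loop without blocking.
proc, lines = stream_output('ping 127.0.0.1')
while True:
    try:
        line = lines.get(timeout=0.1)
    except Empty:
        continue  # nothing yet; a GUI would just return to its event loop
    if line is None:
        break
    print(line, end='')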

showing progress while spawning and running subprocess

I need to show a progress bar or something while spawning and running a subprocess.
How can I do that with Python?
import subprocess

cmd = ['python', 'wait.py']
p = subprocess.Popen(cmd, bufsize=1024, stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
p.stdin.close()
outputmessage = p.stdout.read()  # reads the standard output of the spawned process
message = p.stderr.read()
I can spawn the subprocess with this code, but I need to print something out as each second passes.
Since the subprocess call is blocking, one way to print something out while waiting would be to use multithreading. Here's an example using threading._Timer:
import threading
import subprocess

class RepeatingTimer(threading._Timer):
    def run(self):
        while True:
            self.finished.wait(self.interval)
            if self.finished.is_set():
                return
            else:
                self.function(*self.args, **self.kwargs)

def status():
    print "I'm alive"

timer = RepeatingTimer(1.0, status)
timer.daemon = True  # Allows program to exit if only the thread is alive
timer.start()

proc = subprocess.Popen(['/bin/sleep', "5"])
proc.wait()

timer.cancel()
On an unrelated note, calling stdout.read() while using multiple pipes can lead to deadlock. The subprocess.communicate() function should be used instead.
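An aside: threading._Timer is a private Python 2 name; in Python 3 the class is simply threading.Timer, so a sketch of the same recipe there might look like this:

import threading
import subprocess

class RepeatingTimer(threading.Timer):
    def run(self):
        # finished.wait returns True once cancel() sets the event.
        while not self.finished.wait(self.interval):
            self.function(*self.args, **self.kwargs)

def status():
    print("I'm alive")

timer = RepeatingTimer(1.0, status)
timer.daemon = True
timer.start()

proc = subprocess.Popen(['/bin/sleep', '5'])
proc.wait()
timer.cancel()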
As far as I can see, all you need to do is put those reads in a loop with a delay and a print - does it have to be precisely one second, or just roughly a second?
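A minimal sketch of that idea, assuming the wait.py job from the question and a roughly one-second tick:

import subprocess
import time

p = subprocess.Popen(['python', 'wait.py'],
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
elapsed = 0
while p.poll() is None:  # still running
    time.sleep(1)        # roughly a second per tick
    elapsed += 1
    print('still running... %ds elapsed' % elapsed)
outputmessage, message = p.communicate()  # drain the pipes once it exits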
