I am running a Tkinter GUI that spins off another process (a python script) with subprocess.Popen(...), using pipes for stdout and stderr. I then spin off a separate thread (via threading.Thread) to asynchronously read the out/err from that process and draw it into a Tkinter Text widget.
Everything works great except that the async read thread only executes when I'm moving the mouse or pressing keys on the keyboard. I even put print statements into the threaded function, and they start/stop printing as I move the mouse around in circles.
Here's the async read class that I'm using, borrowed from here:
import threading
import Queue

class AsynchronousFileReader(threading.Thread):
    '''
    Helper class to implement asynchronous reading of a file
    in a separate thread. Pushes read lines on a queue to
    be consumed in another thread.
    '''
    def __init__(self, fd, queue):
        assert isinstance(queue, Queue.Queue)
        assert callable(fd.readline)
        threading.Thread.__init__(self)
        self._fd = fd
        self._queue = queue

    def run(self):
        '''The body of the thread: read lines and put them on the queue.'''
        for line in iter(self._fd.readline, ''):
            self._queue.put(line)

    def eof(self):
        '''Check whether there is no more content to expect.'''
        return not self.is_alive() and self._queue.empty()
And my consume method for pulling messages out of the async file reader (this is the one that runs on a separate thread):
def consume(self, process, console_frame):
    # Launch the asynchronous readers of the process' stdout and stderr.
    stdout_queue = Queue.Queue()
    stdout_reader = AsynchronousFileReader(process.stdout, stdout_queue)
    stdout_reader.start()
    stderr_queue = Queue.Queue()
    stderr_reader = AsynchronousFileReader(process.stderr, stderr_queue)
    stderr_reader.start()

    # Check the queues if we received some output (until there is nothing more to get).
    while not stdout_reader.eof() or not stderr_reader.eof():
        # Show what we received from standard output.
        while not stdout_queue.empty():
            line = stdout_queue.get()
            console_frame.writeToLog(line.strip(), max_lines=None)
            time.sleep(.03)  # prevents it from printing out in large blocks at a time
        # Show what we received from standard error.
        while not stderr_queue.empty():
            line = stderr_queue.get()
            console_frame.writeToLog(line.strip(), max_lines=None)
            time.sleep(.03)  # prevents it from printing out in large blocks at a time
        # Sleep a bit before asking the readers again.
        time.sleep(.05)

    # Let's be tidy and join the threads we've started.
    stdout_reader.join()
    stderr_reader.join()
    # Close subprocess' file descriptors.
    process.stdout.close()
    process.stderr.close()
    print "finished executing"
    if self.stop_callback:
        self.stop_callback()
Like I said before -- the consume() thread only executes when I move the mouse or type on the keyboard -- which means the writeToLog(...) function (for appending text into the Tkinter GUI) only gets executed when mouse/keyboard activity happens... Any ideas?
EDIT: I think I might have an idea of what's happening... If I comment out the writeToLog(...) call and replace it with a simple print (taking Tkinter out of the equation), then the consume thread executes normally. It seems Tkinter is the problem here. Any ideas on how I can accomplish the Tkinter text-widget update from the consume thread?
EDIT2: Got it working thanks to the comments. Here is the final code that I used:
gui_text_queue = Queue.Queue()

def consume(self, process, console_frame):
    # Launch the asynchronous readers of the process' stdout and stderr.
    stdout_queue = Queue.Queue()
    stdout_reader = AsynchronousFileReader(process.stdout, stdout_queue)
    stdout_reader.start()
    stderr_queue = Queue.Queue()
    stderr_reader = AsynchronousFileReader(process.stderr, stderr_queue)
    stderr_reader.start()

    # Check the queues if we received some output (until there is nothing more to get).
    while not stdout_reader.eof() or not stderr_reader.eof():
        # Show what we received from standard output.
        while not stdout_queue.empty():
            line = stdout_queue.get()
            gui_text_queue.put(line.strip())
        # Show what we received from standard error.
        while not stderr_queue.empty():
            line = stderr_queue.get()
            gui_text_queue.put(line.strip())
        # Sleep a bit before asking the readers again.
        time.sleep(.01)

    # Let's be tidy and join the threads we've started.
    stdout_reader.join()
    stderr_reader.join()
    # Close subprocess' file descriptors.
    process.stdout.close()
    process.stderr.close()
    if self.stop_callback:
        self.stop_callback()
Added this method to my Tkinter console frame and called it once at the end of the frame initializer:
def pull_text_and_update_gui(self):
    while not gui_text_queue.empty():
        text = gui_text_queue.get()
        self.writeToLog(text, max_lines=None)
    self.after(5, self.pull_text_and_update_gui)
Tkinter isn't thread safe. If your writeToLog function tries to insert data into the text widget, you'll get unpredictable behavior. In order for a separate thread to send data to a widget you'll need to write the data to a thread-safe queue, then have your main thread poll that queue (using tkinter's after method).
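For illustration, here is a stripped-down sketch of that pattern in isolation (the widget, queue, and function names are arbitrary, not from the question's code):

import Queue
import threading
import time
import Tkinter as tk

text_queue = Queue.Queue()

def worker():
    # Background thread: never touches the widget, only the queue.
    for i in range(10):
        text_queue.put('line %d\n' % i)
        time.sleep(0.5)

root = tk.Tk()
text = tk.Text(root)
text.pack()

def poll_queue():
    # Main thread: drain the queue and update the widget, then reschedule.
    while not text_queue.empty():
        text.insert('end', text_queue.get())
    root.after(100, poll_queue)

threading.Thread(target=worker).start()
poll_queue()
root.mainloop()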
I have two python programs. One of them connects to a bluetooth device (socket package) and receives and saves data from the device; the other one reads the stored data and draws a real-time plot. I need to make one application out of these two programs.
I tried to merge the two programs, but since the bluetooth part has to wait to receive data (in a while loop), the other parts of the program never get to run. I tried to solve this with Clock.schedule_interval, but the program hangs after a period of time. So I decided to run the two programs simultaneously. I have read that you can run several python programs at the same time from a single python script. Is there any trick to join these two programs and build one application?
Any help would be greatly appreciated.
No install is needed: the threading module is part of the standard library.
Create a new python file:
from threading import Thread

# Importing a module executes its top-level code, so each import below
# runs one of the two original programs (saved as file1.py and file2.py).
def runFile1(): import file1
def runFile2(): import file2

Thread(target=runFile1).start()  # run the first program in a background thread
runFile2()                       # run the second program in the main thread
Run the new python file.
It can be done with threading. To communicate between the threaded function and your main function, use objects such as queue.Queue and threading.Event.
The bluetooth functions can be placed into a function that is the target of the thread:
import time
from threading import Thread
from queue import Queue

class BlueToothFunctions(Thread):
    def __init__(self, my_queue):
        super().__init__()
        self.my_queue = my_queue
        # optional: causes this thread to end immediately if the main program is terminated
        self.daemon = True

    def run(self) -> None:
        while True:
            # do all the bluetooth stuff forever
            g = self.my_queue.get()
            if g == (None, None):
                break
            print(g)
            time.sleep(1.0)
        print("bluetooth closed")
if __name__ == '__main__':
    _queue = Queue()  # just one way to communicate with a thread
    # pass an object reference so both main and thread have a way to communicate on this common queue
    my_bluetooth = BlueToothFunctions(_queue)
    my_bluetooth.start()  # creates the thread and executes run() method

    for i in range(5):
        # communicate to the threaded functions
        _queue.put(i)
    _queue.put((None, None))  # optional, a way to cause the thread to end

    my_bluetooth.join(timeout=5.0)  # optional, pause here until thread ends
    print('program complete')
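threading.Event, mentioned above, is another simple way to signal such a loop to stop. A minimal sketch (the names are illustrative, not tied to any real bluetooth API):

import time
import threading

stop_event = threading.Event()

def bluetooth_loop():
    # stand-in for the receive-and-save loop; checks the event on every pass
    while not stop_event.is_set():
        time.sleep(0.1)  # pretend to wait for the next bluetooth packet
    print("bluetooth loop stopped")

t = threading.Thread(target=bluetooth_loop, daemon=True)
t.start()
time.sleep(1.0)    # the main program does its own work here
stop_event.set()   # signal the loop to finish
t.join(timeout=5.0)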
I have functioning code that displays data in a GUI which is periodically updated with new information downloaded from the web. (The base code for the threaded approach was sourced from https://www.oreilly.com/library/view/python-cookbook/0596001673/ch09s07.html) I am using a threaded solution so as to improve blocking IO issues (IO code not included in the simplified code example below, as the IO does not appear to be the problem). The code runs fine if I run it as a single instance. However, it would be most convenient if I could use multiprocessing to run several instances of the code in parallel, using a different input list for each instance. When I try to implement the multiprocessing version, each separate process hangs during the attempt to create the root window: "window = tk.Tk()". Here is the working single instance version:
import threading
import random
import time
import tkinter as tk
import queue
import multiprocessing
import psutil

class GuiPartBase:
    def __init__(self, master, queue, myList, endCommand):
        self.queue = queue
        # Set up the GUI
        a = tk.Label(master, text="Test Tkinter Display!")
        a.pack()
        ## etc

    def processIncoming(self):
        """Handle all messages currently in the queue, if any."""
        while self.queue.qsize():
            try:
                result = (self.queue.get(0))
                ## do stuff with incoming data...
                print('result =', result)
            except queue.Empty:
                # just on general principles...
                pass

class ThreadedClientBase:
    """
    Launch the main part of the GUI and the worker thread. periodicCall and
    endApplication could reside in the GUI part, but putting them here
    means that you have all the thread controls in a single place.
    """
    def __init__(self, master, mylist):
        """
        Start the GUI and the asynchronous threads. We are in the main
        (original) thread of the application, which will later be used by
        the GUI as well. We spawn a new thread for the worker (I/O).
        """
        self.master = master
        self.mylist = mylist
        # Create the queue
        self.queue = queue.Queue()
        # Set up the GUI part
        self.gui = GuiPartBase(self.master, self.queue, mylist, self.endApplication)
        # Set up the thread to do asynchronous I/O
        # More threads can also be created and used, if necessary
        self.running = 1
        self.thread1 = threading.Thread(target=self.workerThread1)
        self.thread1.start()
        # Start the periodic call in the GUI to check if the queue contains
        # anything
        self.periodicCall()

    def periodicCall(self):
        """
        Check every 200 ms if there is something new in the queue.
        """
        self.gui.processIncoming()
        if not self.running:
            # This is the brutal stop of the system. You may want to do
            # some cleanup before actually shutting it down.
            import sys
            sys.exit(1)
        self.master.after(200, self.periodicCall)

    def workerThread1(self):
        """
        This is where we handle the asynchronous I/O. For example, it may be
        a 'select( )'. One important thing to remember is that the thread has
        to yield control pretty regularly, by select or otherwise.
        """
        while self.running:
            # simulate asynchronous I/O,
            time.sleep(rand.random() * 1.5)
            msg = rand.random()
            self.queue.put(msg)

    def endApplication(self):
        self.running = 0

def runGUIthread(threadedList2Get):
    print('entering runGUIthread...')
    print('list2Get = ', threadedList2Get)
    window = tk.Tk()
    print('type of window = ', type(window))
    print('window = ', window)
    client = ThreadedClientBase(window, threadedList2Get)
    print('type of client = ', type(client))
    print('client = ', client)
    window.mainloop()

if __name__ == '__main__':
    rand = random.Random()
    testList2a = ['abc', 'def', 'geh']
    testList2b = ['xyz', 'lmn', 'opq']
    allLists = [testList2a, testList2b]
    runGUIthread(testList2a)
So, like I said, the above works: a single tkinter GUI is displayed appropriately without errors. However, if I attempt to implement multiprocessing with the code below, it spawns two processes as expected, as documented by the printout of the pid. However, each process prints the 'list2Get' (in runGUIthread), and then there is nothing else. There is no error message, and the python code seems to have exited, as there is no persistent process listed in the system activity monitor. Presumably the code is "hanging"/exiting at the line "window = tk.Tk()", as the line "print('type of window = ', type(window))" is never executed:
if __name__ == '__main__':
    rand = random.Random()
    testList2a = ['abc', 'def', 'geh']
    testList2b = ['xyz', 'lmn', 'opq']
    allLists = [testList2a, testList2b]
    #runGUIthread(testList2a)
    for list in allLists:
        p = multiprocessing.Process(target=runGUIthread, args=(list,))
        p.start()
        ps = psutil.Process(p.pid)
        print('pid = ', ps)
    #with multiprocessing.Pool(processes=2) as pool:
    #    pool.map(runGUIthread, allLists)
I am not experienced with multiprocessing, so perhaps I have implemented it incorrectly. I tried using multiprocessing.Pool(), with the same results.
I have not been able to find info indicating that tkinter can't spawn multiple GUI displays in the same program. In fact, I found an instance of somebody accidentally spawning multiple GUIs, although this appears to be with Python 3.8 using concurrent.futures.ProcessPoolExecutor (Concurrent.futures opens new windows in tkinter instead of running the function). I am currently on Python 3.7 and was hoping not to have to install a new environment to make this multiprocessing code work, although perhaps that is necessary...?
Other info: using python 3.7.6, tkinter 8.6.8, Eclipse 4.11.0, macOS 10.13.6.
Any help appreciated.
You cannot use tkinter code across multiple processes. At least, you can't run the same tkinter code. It simply isn't designed to be used that way. When you create a root window, underneath the covers a tcl interpreter is created, and this interpreter can't be pickled or shared between processes, and doesn't use python's global interpreter lock.
In short, all of your GUI code needs to be in a single thread in a single process.
The following answer is a slightly better explanation, written by one of the developers on the Tcl core team: https://stackoverflow.com/a/38767665/7432. Here is the opening paragraph of that answer:
Each Tcl interpreter object (i.e., the context that knows how to run a Tcl procedure) can only be safely used from the OS thread that creates it. This is because Tcl doesn't use a global interpreter lock like Python, and instead makes extensive use of thread-specific data to reduce the number of locks required internally. (Well-written Tcl code can take advantage of this to scale up very large on suitable hardware.)
I found that this has been reported as part of a bug related to tkinter, python, and macOSX: https://bugs.python.org/issue33111.
(The bug was reported for python 2.7 and 3.6.4 and OSX 10.11.6, but apparently is still a problem with python 3.7.6 and OSX 10.13.6.)
However, there is a partial workaround (also reported at the same site), which for my case seems to work perfectly:
import multiprocessing
multiprocessing.set_start_method("spawn", force=True)

...
... other code same as initial ...
...

if __name__ == '__main__':
    testList2a = ['abc', 'def', 'geh']
    testList2b = ['xyz', 'lmn', 'opq']
    allLists = [testList2a, testList2b]
    with multiprocessing.Pool(processes=2) as pool:
        pool.map(runGUIthread, allLists)
The result is the spawning of multiple GUIs, one for each process.
Per the bug report, in python 3.8, the default start method for multiprocessing when run on MacOS has been changed and is now "spawn" instead of "fork", so the problem will not reveal itself (unless you change the start method to be "fork", in which case the code will fail).
I have a Tkinter wrapper written over robocopy.exe. The code is organized as shown below:
Tkinter wrapper:
Spawns a new thread and passes the arguments, which include source/destination and other parameters.
(Note: a Queue object is also passed to the thread, since the thread will read the output from robocopy and put it in the queue; the main tkinter thread will keep polling the queue and update the Tkinter text widget with the output.)
Code Snippet
... Code to poll queue and update tk widget ...
q = Queue.Queue()
t1 = threading.Thread(target=CopyFiles, args=(q, src, dst), kwargs={"ignore": ignore_list})
t1.daemon = True
t1.start()
Thread (in a separate file):
Below is the code snippet from the thread:
def CopyFiles(q, src, dst, ignore=None):
    extra_args = ['/MT:15', '/E', '/LOG:./log.txt', '/tee', '/r:2', '/w:2']
    if len(ignore) > 0:
        extra_args.append('/xf')
        extra_args.extend(ignore)
        extra_args.append('/xd')
        extra_args.extend(ignore)
    command_to_pass = ["robocopy", src, dst]
    command_to_pass.extend(extra_args)
    proc = subprocess.Popen(command_to_pass, stdout=subprocess.PIPE)
    while True:
        line = proc.stdout.readline()
        if line == '':
            break
        q.put(line.strip())
Code which is called when the tkinter application is closed:
def onQuit(self):
    global t1
    if t1.isAlive():
        pass
    if tkMessageBox.askyesno("Title", "Do you really want to exit?"):
        self.destroy()
        self.master.destroy()
Problem
Whenever I close the tkinter application while robocopy is running, the python application closes but robocopy.exe keeps on running.
I have tried setting the thread as a daemon, but it has no effect. How can I stop robocopy.exe when the onQuit method is called?
To simplify things, let's ignore Tkinter and the fact that a separate thread is used.
The situation is that your app spawns a subprocess to execute an external program (robocopy.exe in this question), and you need to stop the spawned program from your application on a certain event (when the Tkinter app is closing, in this question).
This requires an inter-process communication mechanism, so the spawned process is notified of the event and reacts accordingly. A common mechanism is to use signals provided by the OS.
You could send a signal (SIGTERM) to the external process to ask it to quit. Assuming the program reacts to the signal as expected (most well-written applications do), you'll get the desired behavior (the process will terminate).
Calling the terminate method on the subprocess will send the proper signal for the current platform to the subprocess.
You'd need a reference to the subprocess object proc in the onQuit function (from the provided code I see onQuit is a function and not an object method, so it could use a global variable to access proc), so you can call the terminate method of the process:
def onQuit(self):
    global t1, proc
    if t1.isAlive():
        pass
    # by the way I'm not sure about the logic, but I guess this
    # below statement should be an elif instead of if
    if tkMessageBox.askyesno("Title", "Do you really want to exit?"):
        proc.terminate()
        self.destroy()
        self.master.destroy()
This code assumes you're storing the reference to the subprocess in the global scope, so you'd have to modify the CopyFiles as well.
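A minimal sketch of that change, keeping proc at module level and omitting the extra robocopy arguments for brevity:

import subprocess

proc = None  # module-level reference so onQuit can reach the subprocess

def CopyFiles(q, src, dst, ignore=None):
    global proc
    command_to_pass = ["robocopy", src, dst]
    proc = subprocess.Popen(command_to_pass, stdout=subprocess.PIPE)
    while True:
        line = proc.stdout.readline()
        if line == '':
            break
        q.put(line.strip())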
I'm not sure how robocopy handles terminate signals, and that's not something we have any control over.
If you had more control over the external program (could modify the source), there might be more options, for example sending messages over stdio, or using shared memory, etc.
My program spawns multiple processes to do some time consuming calculations. The results are then collected in a queue and a writer process writes them into an output file.
Below is a simplified version of my code which should illustrate my issue. If I comment out the flush statement in the Writer class, test.out is empty at the end of the program.
What exactly is happening here? Is test.out not closed properly? Was it naive to assume that passing the file handle to an autonomous process should work in the first place?
from multiprocessing import JoinableQueue, Process

def main():
    queue = JoinableQueue()
    queue.put("hello world!")
    with open("test.out", "w") as outhandle:
        wproc = Writer(queue, outhandle)
        wproc.start()
        queue.join()
    with open("test.out") as handle:
        for line in handle:
            print(line.strip())

class Writer(Process):
    def __init__(self, queue, handle):
        Process.__init__(self)
        self.daemon = True
        self.queue = queue
        self.handle = handle

    def run(self):
        while True:
            msg = self.queue.get()
            print(msg, file=self.handle)
            #self.handle.flush()
            self.queue.task_done()

if __name__ == '__main__':
    main()
The writer is a separate process. The data it writes to the file might be buffered, and because the process keeps running, it doesn't know that it should flush the buffer (write it to the file). Flushing manually is the right thing to do.
Normally, the file would be closed when you exit the with block, and this would flush the buffer. But the parent process doesn't know anything about its children's buffers, so the child has to flush its own buffer (closing the file should work too - that doesn't close the file for the parent, at least on Unix systems).
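In other words, the smallest fix is exactly the commented-out line in the question; the Writer.run method becomes:

def run(self):
    while True:
        msg = self.queue.get()
        print(msg, file=self.handle)
        self.handle.flush()  # push the buffered output to the file before signalling completion
        self.queue.task_done()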
Also, check out the Pool class from multiprocessing (https://docs.python.org/3/library/multiprocessing.html#module-multiprocessing.pool) - it might save you some work.
I had the same issue when writing the output from a pooled data set directly to a file.
It was sorted out by first collecting the pooled results into a list and only then writing to the file.
This mainly happens because the hard disk can't keep up with the speed at which the processor produces results, so buffered content gets lost or the pooled data arrives out of order.
The best thing to do is to collect the pooled output in memory first (in a string or a list) and then write it to the file in one go, as sketched below; that way things should get sorted out.
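A rough sketch of that collect-then-write approach (compute here is a stand-in for the real time-consuming calculation):

from multiprocessing import Pool

def compute(x):
    # stand-in for the time-consuming calculation
    return x * x

if __name__ == '__main__':
    with Pool(processes=4) as pool:
        results = pool.map(compute, range(10))  # collect everything first
    with open("test.out", "w") as outhandle:    # then write once, in order
        for r in results:
            print(r, file=outhandle)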
Good luck!
I want to run a function continuously in parallel to my main process. How do I do it in python? multiprocessing? threading or the thread module?
I am new to python. Any help much appreciated.
If the aim is to capture stderr and do some action you can simply replace sys.stderr by a custom object:
>>> import sys
>>> class MyLogger(object):
...     def __init__(self, callback):
...         self._callback = callback
...     def write(self, text):
...         if 'log' in text:
...             self._callback(text)
...         sys.__stderr__.write(text)  # continue writing to normal stderr
...
>>> def the_callback(s):
...     print('Stderr: %r' % s)
...
>>> sys.stderr = MyLogger(the_callback)
>>> sys.stderr.write('Some log message\n')
Stderr: 'Some log message'
Some log message
>>>
>>> sys.stderr.write('Another message\n')
Another message
If you want to handle tracebacks and exceptions you can use sys.excepthook.
If you want to capture logs created by the logging module you can implement your own Handler class similar to the above Logger but reimplementing the emit method.
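A rough sketch of such a handler (the class and callback names here are mine, purely illustrative):

import logging

class CallbackHandler(logging.Handler):
    '''Hand every formatted log record to a callback.'''
    def __init__(self, callback, level=logging.NOTSET):
        logging.Handler.__init__(self, level)
        self._callback = callback

    def emit(self, record):
        self._callback(self.format(record))

def show(text):
    print('Captured: %r' % text)

logger = logging.getLogger('example')
logger.addHandler(CallbackHandler(show))
logger.warning('something happened')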
A more interesting, but less practical solution would be to use some kind of scheduler and generators to simulate parallel execution without actually creating threads (searching on the internet will yield some nice results about this).
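A toy version of that idea, purely for illustration:

def task(name, steps):
    # each yield hands control back to the scheduler
    for i in range(steps):
        print('%s: step %d' % (name, i))
        yield

def run_round_robin(tasks):
    # cycle through the generators until all are exhausted
    while tasks:
        for t in list(tasks):
            try:
                next(t)
            except StopIteration:
                tasks.remove(t)

run_round_robin([task('a', 3), task('b', 2)])
# interleaves the output: a:0, b:0, a:1, b:1, a:2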
It definitely depends on your aim, but I'd suggest looking at the threading module. There are many good StackOverflow questions on the use of threading and multithreading (e.g., Multiprocessing vs Threading Python).
Here's a brief skeleton from one of my projects:
import threading  # Threading module itself
import Queue      # A handy way to pass tasks to your thread
from time import sleep

job_queue = Queue.Queue()
job_queue.put('one job to do')

# This is the function that we want to keep running while our program does its thing
def function_to_run_in_background():
    # Do something...here is one form of flow control
    while True:
        job_to_do = job_queue.get()  # Get the task from the Queue
        print job_to_do              # Print what it was we fetched
        job_queue.task_done()        # Signal that we've finished with that queue item

# Launch the thread...
t = threading.Thread(target=function_to_run_in_background)
t.daemon = True  # YOU MAY NOT WANT THIS: Only use this line if you want the program to exit without waiting for the thread to finish
t.start()  # Starts the thread
t.setName('threadName')  # Makes it easier to interact with the thread later

# Do other stuff
sleep(5)
print "I am still here..."
job_queue.put('Here is another job for the thread...')

# Wait for everything in job_queue to finish. Since the thread is a daemon, the program will now exit, killing the thread.
job_queue.join()
If you just want to run a function in the background in the same process, do:

import thread

def function(a):
    pass

thread.start_new(function, (1,))  # a is 1 then
I found that a client-server architecture was the solution for me: running a server and spawning many clients that talk to the server and to each other directly, something like a messenger.
Talking/communication can be achieved through the network or through a text file located in memory (to speed things up and save the hard drive).
Bakuriu gave you a good tip about the logging module.
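A minimal sketch of that architecture, with an in-process thread standing in for a separate client program (the address and messages are arbitrary):

import socket
import threading
import time

HOST, PORT = '127.0.0.1', 50007  # arbitrary local address for the sketch

def server():
    # tiny echo server standing in for the central "messenger" hub
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind((HOST, PORT))
        s.listen()
        conn, _ = s.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(b'server got: ' + data)

threading.Thread(target=server, daemon=True).start()
time.sleep(0.2)  # crude way to let the server bind before the client connects

# client side: in the real design this would be one of the spawned programs
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as c:
    c.connect((HOST, PORT))
    c.sendall(b'hello')
    print(c.recv(1024))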