Python 3.6
First I put some items in a queue, then start a thread and call join() on the queue in the main thread. Inside the thread's loop I call get(); when the queue size reaches 0 I call task_done(), break out of the loop, and the thread exits. But join() still blocks in the main thread, and I can't figure out what's wrong.
Below is the code
Thanks
import queue
import threading

def worker(work_queue):
    while True:
        if work_queue.empty():
            print("Task 1 Over!")
            work_queue.task_done()
            break
        else:
            _ = work_queue.get()
            print(work_queue.qsize())
            # do actual work

def main():
    work_queue = queue.Queue()
    for i in range(10):
        work_queue.put("Item %d" % (i + 1))

    t = threading.Thread(target=worker, args=(work_queue, ))
    t.setDaemon(True)
    t.start()

    print("Main Thread 1")
    work_queue.join()
    print("Main Thread 2")
    t.join()
    print("Finish!")

if __name__ == "__main__":
    main()
task_done() should be called once for each work item that is dequeued and processed, not once when the queue is entirely empty. (There'd be no reason for that: the queue already knows when it's empty.) join() blocks until task_done() has been called as many times as put() was called.
So:
def worker(work_queue):
    while True:
        if work_queue.empty():
            print("Task 1 Over!")
            break
        else:
            _ = work_queue.get()
            print(work_queue.qsize())
            # do actual work
            work_queue.task_done()  # mark this item as processed
Note that it's weird for a worker to exit as soon as it sees an empty queue. Normally it would get() with blocking, and only exit when it got a "time to exit" work item out of the queue.
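For illustration, a minimal sketch of that blocking-get-plus-sentinel pattern (SENTINEL and the surrounding names are mine, not from the question's code):
import queue
import threading

SENTINEL = None  # made-up "time to exit" marker

def worker(work_queue):
    while True:
        item = work_queue.get()        # blocks until an item is available
        try:
            if item is SENTINEL:
                break                  # producer told us to stop
            print("processed", item)   # do actual work here
        finally:
            work_queue.task_done()     # exactly one task_done() per get()

work_queue = queue.Queue()
for i in range(10):
    work_queue.put("Item %d" % (i + 1))
work_queue.put(SENTINEL)               # tell the worker to stop

t = threading.Thread(target=worker, args=(work_queue,))
t.start()
work_queue.join()                      # returns once every put() has a matching task_done()
t.join()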
Related
Hi guys, I am writing a socket program in Python using multithreading, but I have one problem: when I want to exit the program, it seems like I cannot stop a running thread.
Here is my code:
def create_workers():
    for _ in range(NUMBER_OF_THREADS):
        t = threading.Thread(target=work)
        t.daemon = True  # End the Thread
        t.start()

def work():
    while True:
        x = queue.get()
        if x == 1:
            create_socket()
            bind_socket()
            accept_connections()
        if x == 2:
            start_turtle()
            break
        queue.task_done()
The function create_workers runs two threads targeting the function work, but I don't know how to terminate them after I break out of the while loop in work.
Use a threading.Event instance as a flag that you set just before work ends, and check if it is set at the start of each iteration of the infinite loop.
If your work function is more complicated, with multiple return statements, you could put the event.set() call into the finally block of a try statement (a sketch of that variant follows the code below).
threading.Event is thread-safe.
As pointed out by user2357112 supports Monica, making the threads daemonic doesn't make sense, so I've removed that line.
def create_workers():
    event = threading.Event()
    for _ in range(NUMBER_OF_THREADS):
        t = threading.Thread(target=work, args=(event,))
        t.start()

def work(event):
    while True:
        if event.is_set():
            return
        x = queue.get()
        if x == 1:
            create_socket()
            bind_socket()
            accept_connections()
        if x == 2:
            start_turtle()
            break
        queue.task_done()
    event.set()
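If you do go the try/finally route mentioned above, the worker could look roughly like this (queue, create_socket, start_turtle and friends are the question's own placeholders, so treat this as a sketch of where the event.set() call sits rather than standalone runnable code):
def work(event):
    try:
        while True:
            if event.is_set():
                return
            x = queue.get()
            if x == 1:
                create_socket()
                bind_socket()
                accept_connections()
            if x == 2:
                start_turtle()
                break
            queue.task_done()
    finally:
        event.set()  # runs on break, return, or an unexpected exception, so every worker gets the signal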
You can use python-worker (link):
from worker import abort_all_thread
## your running code
abort_all_thread() # this will kill any active threads
I want to kill a thread in Python. The thread can be stuck in a blocking operation, and join() can't terminate it.
Similar to this:
from threading import Thread
import time

def block():
    while True:
        print("running")
        time.sleep(1)

if __name__ == "__main__":
    thread = Thread(target=block)
    thread.start()
    # kill thread
    # do other stuff
My problem is that the real blocking operation is in another module that isn't mine, so there is no place where I could check a running flag and break out.
The thread will be killed when exiting the main process if you set it up as a daemon:
import sys
import time
from threading import Thread

def block():
    while True:
        print("running")
        time.sleep(1)

if __name__ == "__main__":
    thread = Thread(target=block, daemon=True)
    thread.start()
    sys.exit(0)
Otherwise, just set a flag. I'm using a crude example here (you should use proper synchronization, not just a plain variable):
from threading import Thread
import time

RUNNING = True

def block():
    global RUNNING
    while RUNNING:
        print("running")
        time.sleep(1)

if __name__ == "__main__":
    thread = Thread(target=block, daemon=True)
    thread.start()
    RUNNING = False  # the thread will stop, but not until its next loop iteration
    # ... continue your stuff here
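If you want the proper synchronization hinted at above, a minimal sketch using threading.Event instead of the bare global (my own names, same overall structure) could be:
from threading import Event, Thread
import time

stop_event = Event()

def block(stop_event):
    while not stop_event.is_set():
        print("running")
        time.sleep(1)

if __name__ == "__main__":
    thread = Thread(target=block, args=(stop_event,), daemon=True)
    thread.start()
    # ... do other stuff ...
    stop_event.set()   # ask the thread to stop
    thread.join()      # returns within roughly one loop iteration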
Use a running variable:
from threading import Thread
import time

running = True

def block():
    global running
    while running:
        print("running")
        time.sleep(1)

if __name__ == "__main__":
    thread = Thread(target=block)
    thread.start()
    running = False
    # do other stuff
I would prefer to wrap it all in a class, but this should work (untested though).
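A rough sketch of what I mean by wrapping it in a class (also untested, names are mine):
import threading
import time

class StoppableThread(threading.Thread):
    """A thread whose run() loop checks a stop flag on every iteration."""

    def __init__(self):
        super().__init__()
        self._stop_event = threading.Event()

    def run(self):
        while not self._stop_event.is_set():
            print("running")
            time.sleep(1)

    def stop(self):
        self._stop_event.set()

if __name__ == "__main__":
    t = StoppableThread()
    t.start()
    # do other stuff
    t.stop()
    t.join()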
EDIT
There is a way to asynchronously raise an exception in a separate thread which could be caught by a try: except: block, but it's a dirty dirty hack: https://gist.github.com/liuw/2407154
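For completeness, here is roughly what that hack boils down to (CPython only, and the exception is delivered only when the target thread next executes Python bytecode, so it still cannot interrupt a call that is blocked inside C code):
import ctypes
import threading
import time

def async_raise(thread, exc_type=SystemExit):
    # CPython-only: ask the interpreter to raise exc_type inside `thread`.
    res = ctypes.pythonapi.PyThreadState_SetAsyncExc(
        ctypes.c_long(thread.ident), ctypes.py_object(exc_type))
    if res == 0:
        raise ValueError("invalid thread id")
    if res > 1:
        # More than one thread state was affected; undo and bail out.
        ctypes.pythonapi.PyThreadState_SetAsyncExc(ctypes.c_long(thread.ident), None)
        raise SystemError("PyThreadState_SetAsyncExc failed")

def spin():
    while True:
        time.sleep(0.2)

t = threading.Thread(target=spin)
t.start()
async_raise(t)   # SystemExit is raised in t the next time it runs Python bytecode
t.join()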
Original post
"I want to kill a thread in python." you can't. Threads are only killed when they're daemons when there are no more non-daemonic threads running from the parent process. Any thread can be asked nicely to terminate itself using standard inter-thread communication methods, but you state that you don't have any chance to interrupt the function you want to kill. This leaves processes.
Processes have more overhead, and are more difficult to pass data to and from, but they do support being killed by sending SIGTERM or SIGKILL.
from multiprocessing import Process, Queue
from time import sleep

def workfunction(*args, **kwargs):  # any arguments you send to a child process must be picklable
    sleep(args[0])  # really long computation you might want to kill
    return 'results'  # anything you want to get back from a child process must be picklable

class daemon_worker(Process):
    def __init__(self, target_func, *args, **kwargs):
        self.return_queue = Queue()
        self.target_func = target_func
        self.args = args
        self.kwargs = kwargs
        super().__init__(daemon=True)
        self.start()

    def run(self):  # called by self.start()
        self.return_queue.put(self.target_func(*self.args, **self.kwargs))

    def get_result(self):  # raises queue.Empty if no result is ready
        return self.return_queue.get_nowait()

if __name__ == '__main__':
    # start some work that takes 1 sec:
    worker1 = daemon_worker(workfunction, 1)
    worker1.join(3)  # wait up to 3 sec for the worker to complete
    if not worker1.is_alive():  # if we didn't hit the 3 sec timeout
        print('worker1 got: {}'.format(worker1.get_result()))
    else:
        print('worker1 still running')
        worker1.terminate()
        print('killing worker1')
        sleep(.1)  # calling worker.is_alive() immediately can race with the shutdown
        print('worker1 is alive: {}'.format(worker1.is_alive()))

    # start some work that takes 100 sec:
    worker2 = daemon_worker(workfunction, 100)
    worker2.join(3)  # wait up to 3 sec for the worker to complete
    if not worker2.is_alive():  # if we didn't hit the 3 sec timeout
        print('worker2 got: {}'.format(worker2.get_result()))
    else:
        print('worker2 still running')
        worker2.terminate()
        print('killing worker2')
        sleep(.1)  # calling worker.is_alive() immediately can race with the shutdown
        print('worker2 is alive: {}'.format(worker2.is_alive()))
def check_incoming_messages_to_client(incoming_chat_messages, uri_str, kill_threads_subscript):
    global kill_threads
    messaging = Pyro4.Proxy(uri_str)
    while(TRUE):
        if(messaging.get_connection() == 'yes'):
            msg = messaging.read_messages_to_client()
            if (msg):
                incoming_chat_messages.insert(END, msg)
        if(kill_threads[kill_threads_subscript]):
            print('break loop')
            break
print('start')
t1 = Thread(target=check_incoming_messages_to_client(incoming_chat_messages[length-1],uri_str, kill_threads_subscript))
t1.setDaemon(True)
t1.start()
print('end')
The code above only prints 'start' and never 'end'. That means it is stuck in the infinite loop, which shouldn't happen since the function was supposed to run in a thread. How can I fix it?
Thread(target=check_incoming_messages_to_client(incoming_chat_messages[length-1],uri_str, kill_threads_subscript)) calls your function, then passes the result as the target (except since it never ends, no result ever materializes, and you never even construct the Thread).
You want to pass the function uncalled, and the args separately so the thread calls it when it runs, rather than the main thread running it before the worker thread even launches:
t1 = Thread(target=check_incoming_messages_to_client,
args=(incoming_chat_messages[length-1], uri_str, kill_threads_subscript))
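As a self-contained illustration of the difference (using a dummy function, not the question's actual code):
import threading
import time

def loop_forever(name):
    while True:
        print("worker", name, "running")
        time.sleep(1)

# Wrong: this runs loop_forever("a") right here in the main thread and never
# returns, so the Thread object below it would never even be constructed.
# t1 = threading.Thread(target=loop_forever("a"))

# Right: pass the function object itself, and its arguments separately.
t1 = threading.Thread(target=loop_forever, args=("a",), daemon=True)
t1.start()
print("end")  # prints immediately while the worker keeps running in the background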
I have a simple Python app that will not terminate if I use queue.join(). Below is the code:
import threading
import Queue

q = Queue.Queue()
for i in range(5):
    q.put("BLAH")

def worker():
    while True:
        print q.qsize()
        a = q.get()
        print q.qsize()
        q.task_done()
        print q.qsize()

for i in range(2):
    t = threading.Thread(target=worker())
    t.daemon = True
    t.start()

q.join()
I've also created a watchdog thread that prints threading.enumerate() and then sleeps for 2 seconds. The only thread left is the MainThread, and the queue size is in fact 0, yet the script never terminates. I have to Ctrl+Z and then kill it. What's going on?
t = threading.Thread(target=worker)
You want to pass a reference to the worker function; you should not call it.
The worker function does not exit, so it will never finish. Second, you probably want to join the thread, not the queue.
I'm not an expert in Python threading, but the queue is just for passing data between threads.
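Putting those two points together, a version that does terminate (rewritten in Python 3 syntax, with a sentinel so the workers can exit and be joined) might look roughly like this:
import threading
import queue

q = queue.Queue()
for i in range(5):
    q.put("BLAH")

def worker():
    while True:
        a = q.get()
        try:
            if a is None:          # sentinel: time to stop
                break
            print(q.qsize())       # do the real work here
        finally:
            q.task_done()

threads = []
for i in range(2):
    t = threading.Thread(target=worker)   # pass the function itself, don't call it
    t.start()
    threads.append(t)

q.join()                   # all real items have been processed
for _ in threads:
    q.put(None)            # one sentinel per worker
for t in threads:
    t.join()               # join the threads, not just the queue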
I have some producer functions that rely on I/O-heavy blocking calls and some consumer functions that also rely on I/O-heavy blocking calls. To speed them up, I used the gevent micro-threading library as glue.
Here's what my paradigm looks like:
import gevent
from gevent.queue import *
import time
import random

q = JoinableQueue()
workers = []
producers = []

def do_work(wid, value):
    gevent.sleep(random.randint(0, 2))
    print 'Task', value, 'done', wid

def worker(wid):
    while True:
        item = q.get()
        try:
            print "Got item %s" % item
            do_work(wid, item)
        finally:
            print "No more items"
            q.task_done()

def producer():
    while True:
        item = random.randint(1, 11)
        if item == 10:
            print "Signal Received"
            return
        else:
            print "Added item %s" % item
            q.put(item)

for i in range(4):
    workers.append(gevent.spawn(worker, random.randint(1, 100000)))

# This doesn't work.
for j in range(2):
    producers.append(gevent.spawn(producer))

# Uncommenting this makes this script work.
# producer()

q.join()
I have four consumers and would like to have two producers. The producers exit when they receive a signal, i.e. the value 10. The consumers keep feeding off this queue, and the whole task finishes when the producers and consumers are done.
However, this doesn't work. If I comment out the for loop which spawns multiple producers and use only a single producer, the script runs fine.
I can't seem to figure out what I've done wrong.
Any ideas?
Thanks
You don't actually want to quit when the queue has no unfinished work, because conceptually that's not when the application should finish.
You want to quit when the producers have finished, and then when there is no unfinished work.
# Wait for all producers to finish producing
gevent.joinall(producers)
# *Now* we want to make sure there's no unfinished work
q.join()
# We don't care about workers. We weren't paying them anything, anyways
gevent.killall(workers)
# And, we're done.
I think it does q.join() before anything is put in the queue, so it exits immediately. Try joining all the producers before joining the queue.
What you want to do is block the main program while the producers and workers communicate. Blocking on the queue will wait until the queue is empty and then yield, which could happen immediately. Put this at the end of your program instead of q.join():
gevent.joinall(producers)
I ran into the same issue. The main problem with your code is that the producer is spawned in a gevent greenlet, so the workers cannot get any tasks immediately.
I suggest running producer() directly in the main flow rather than spawning it in a greenlet, so the tasks are pushed onto the queue right away.
import gevent
from gevent.queue import *
import time
import random

q = JoinableQueue()
workers = []
producers = []

def do_work(wid, value):
    gevent.sleep(random.randint(0, 2))
    print 'Task', value, 'done', wid

def worker(wid):
    while True:
        item = q.get()
        try:
            print "Got item %s" % item
            do_work(wid, item)
        finally:
            print "No more items"
            q.task_done()

def producer():
    while True:
        item = random.randint(1, 11)
        if item == 10:
            print "Signal Received"
            return
        else:
            print "Added item %s" % item
            q.put(item)

producer()

for i in range(4):
    workers.append(gevent.spawn(worker, random.randint(1, 100000)))
The code above makes sense. :)