Stop a thread if a function delays the response - Python

I have the following loop, which calls the getHLS function in a separate thread for each line of a text file. The problem is that getHLS can sometimes be very slow, and I am looking for a way to "time out" a thread if the function does not return anything within 10 seconds.
links = open("links.txt")
lines = links.readlines()
linenumber = 0
for line in lines:
    linenumber += 1
    thread = threading.Thread(target=getHLS, args=(line, linenumber))
    thread.setDaemon(False)
    thread.start()
    if threading.active_count() == 50:
        thread.join(10)
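Python threads cannot be killed from the outside, so a timeout has to mean "stop waiting" rather than "stop the thread". A minimal sketch of that idea using concurrent.futures (my suggestion, not from the original post; getHLS is assumed to take a line and a line number):

import concurrent.futures

# Sketch: cap concurrency at 50 workers and stop waiting for any
# call to getHLS that takes longer than 10 seconds.
with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
    futures = {pool.submit(getHLS, line, i + 1): i + 1
               for i, line in enumerate(lines)}
    for future, linenumber in futures.items():
        try:
            future.result(timeout=10)
        except concurrent.futures.TimeoutError:
            print("line {} timed out".format(linenumber))

Note that the timeout only abandons the result; the worker thread keeps running in the background until getHLS returns on its own.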

Related

Is there a way to return back to a main menu loop after all tasks in a thread are completed in Python 3?

I'm making a small program that can do some small tasks for me with multiple threads. Currently, I have a small menu loop that asks me what task I want to run; it then asks for the information needed to complete the task and sets up a specified number of threads to work through the task via a queue.
Currently, my menu loop code looks like this and is run when the program starts:
# menu loop
def menu_loop():
    while True:
        choice = menu()
        if choice is not None:
            break

    # configure program
    if choice == 1:
        while True:
            config_choice = configure()  # another menu loop
            if config_choice is not None:
                if config_choice == 0:
                    clear_screen()
                    print(f"[*] {red}exiting config...\n")
                    menu_loop()
                elif config_choice == 1:
                    num_of_threads = int(input("number of threads: "))
                    clear_screen()
                    print(f"[*] {green}number of threads set to {num_of_threads}.\n")
                    menu_loop()
                elif config_choice == 2:
                    folder_path = input("folder path for results: ")
                    clear_screen()
                    print(f"[*] {green}path to results set to {folder_path}.\n")
                    menu_loop()
                break

    if choice == 2:
        strings = [line.strip() for line in open(input("path to string file: "), "r")]  # load strings
        for string in strings:
            q.put(string)  # insert each string into the queue
        print(f"{green}loaded {len(strings)} strings into the checker.")  # log
        with open(string_results_ml, "a+") as sr_multi_line:
            sr_multi_line.write("------------------------------\n")  # first line of the file
        for i in range(int(input("number of threads: "))):
            Thread(target=check_string, args=(q,)).start()  # start threads
The code for the check_string function is as follows:
def check_string(q):
    while True:
        try:
            work = q.get()
            if q.empty():  # needed because except doesn't trigger half the time
                sleep(1)
                quit()
        except queue.Empty:
            quit()
        headers = {"Content-Type": "application/json"}
        url = "https://[redacted]/{}".format(work)
        try:
            r = requests.get(url, headers=headers)
            jd = r.json()
            if r.status_code == 200:
                try:
                    result = jd["result"]
                    # write working strings and their data to the results file
                    with open(string_results, "a+") as s_r:
                        s_r.write(f"{string} - {result}\n")
                    with lock:
                        print(f"[{blue}{r.status_code}{rs}]{green} got result for {string}")
                    sleep(0.3)  # sleep for a bit, time out
                except:
                    print(f"[{blue}{r.status_code}{rs}]{red} something happened while parsing the data.")
            else:
                with lock:
                    print(f"[{blue}{r.status_code}{rs}]{red} no results for {string}")
        except:
            with lock:
                print(f"{red}fatal error while checking {string}! the string has been written to a separate file for you to check later.")
            with open(fatal, "a+") as f_e:
                f_e.write(string + "\n")
        q.task_done()
Right now, after all the strings are done being checked, the program just hangs instead of going back to the menu loop. There is no code for it to go back to the loop yet, but it should at least quit once all strings have been checked; instead it just hangs. When I press CTRL-C, it gives me the following exception:
^CException ignored in: <module 'threading' from '/usr/local/Cellar/python/3.7.7/Frameworks/Python.framework/Versions/3.7/lib/python3.7/threading.py'>
Traceback (most recent call last):
  File "/usr/local/Cellar/python/3.7.7/Frameworks/Python.framework/Versions/3.7/lib/python3.7/threading.py", line 1307, in _shutdown
    lock.acquire()
KeyboardInterrupt
How can I make the program go back to the main menu loop after all strings are done being checked instead of just hanging?
Your threads never stop: once the queue is empty, they block, waiting for a new item to be queued.
By default, queue.get() blocks if the queue is empty. In your code, check the queue before calling q.get(), and change q.get() to q.get(False) so it never blocks (queue.Empty is raised if the queue is empty).
while True:
    try:
        if q.empty(): quit()   # exit the thread if no items are left - add this line
        work = q.get(False)    # don't block, to be safe - update this line
        if q.empty():  # needed because except doesn't trigger half the time
            sleep(1)
            quit()
    except queue.Empty:
        quit()
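To actually return to the menu, the main thread also has to wait for the work to finish. One way (my sketch, not part of the answer above) is to keep references to the worker threads and join them before showing the menu again; with the fix above, each thread exits once the queue is empty, so the joins return and the program can loop back:

# Sketch: start the workers, wait for all of them to exit, then show the menu again.
threads = []
for i in range(int(input("number of threads: "))):
    t = Thread(target=check_string, args=(q,))
    t.start()
    threads.append(t)

for t in threads:
    t.join()   # returns once the worker has drained the queue and quit
menu_loop()    # all strings checked - back to the menu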

Why are my threads not working simultaneously?

For starters, I'm new to Python.
I will be brief. I'm trying to fetch all links from a website using threads.
The problem is that the threads wait for their turn, but I want them to work simultaneously.
For example, I set the number of threads to 2 and split the links into 2 chunks.
I want the first thread to iterate over the links in the first chunk, and the second thread to iterate over the links in the second chunk SIMULTANEOUSLY. But my program runs the threads one after another. What am I doing wrong, guys? Much obliged for your help.
My code:
def url_target(text, e):
    global links
    global chunks
    number = int(sys.argv[1])
    for m in text:
        time.sleep(0.2)
        print(m, e)
    print('\n')
def main():
    global links
    global chunks
    url = sys.argv[2]
    links = fetch_links(url)
    number = int(sys.argv[1])
    url_chunk = len(links) // number
    start, stop = 0, url_chunk + len(links) % number
    chunks = []
    time.sleep(1)
    while start < len(links):
        for i in range(number):
            part_links = links[start:stop]
            p = Thread(name='myThread', target=url_target, args=(part_links, i+1))
            p.start()
            chunks.append(p)
            start, stop = stop, stop + url_chunk
            p.join()
    time.sleep(1)
    while chunks:
        d = chunks.pop()
        print(f'{d.ident} done')
Thanks! I'd appreciate any help you can give!
p.join() blocks until p completes. You want to start all the threads first, then wait on each in turn.
while start < len(links):
    for i in range(number):
        part_links = links[start:stop]
        p = Thread(name='myThread', target=url_target, args=(part_links, i+1))
        p.start()
        chunks.append(p)
        start, stop = stop, stop + url_chunk
time.sleep(1)
for p in chunks:
    p.join()
If you aren't planning on doing anything while waiting for all the threads to complete, this is fine. However, you might want to block until any thread completes, rather than an arbitrarily chosen one. A real thread pool helps there, but a simple way to approximate one is to wait a short period of time for a thread to complete and, if it doesn't, move on to another thread and come back to the first one later. For example:
from collections import deque

chunks = deque()
# one thread per numbered chunk
for i, start in enumerate(range(0, len(links), url_chunk), 1):
    part_links = links[start:start + url_chunk]
    p = Thread(name='myThread', target=url_target, args=(part_links, i))
    p.start()
    chunks.append(p)

while chunks:
    p = chunks.popleft()
    p.join(5)  # wait 5 seconds, or some other small period of time
    if p.is_alive():
        chunks.append(p)  # put it back
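If pulling in the standard library is an option, concurrent.futures already provides a real thread pool; a minimal sketch of the same fan-out (my addition, not part of the original answer):

from concurrent.futures import ThreadPoolExecutor, as_completed

# Sketch: `number` workers run the chunks simultaneously;
# as_completed() yields each future as soon as its thread finishes.
with ThreadPoolExecutor(max_workers=number) as pool:
    futures = [pool.submit(url_target, links[start:start + url_chunk], i)
               for i, start in enumerate(range(0, len(links), url_chunk), 1)]
    for future in as_completed(futures):
        future.result()  # re-raises any exception from the worker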

Python processes fail to start

I'm running the following code block in my application. When I run it with Python 3.4, I get a 'python quit unexpectedly' popup on my screen. The data missing from the aOut file covers a bunch of iterations, and it is missing in chunks: say, items 0-1000 in the list have no data while the others do. The affected items run properly on their own without intervention.
With Python 2.7 the failures are for items ~3400-4400 in the list.
From logging I can see that the detect() calls are not made for processes 0-1000, i.e. the process.start() calls don't trigger the detect method.
I am doing this on macOS Sierra. What is happening here? Is there a better way to achieve my purpose?
def detectInBatch(aList, aOut):
    # iterate through the objects
    processPool = []
    pthreadIndex = 0
    pIndex = 0
    manager = Manager()
    dict = manager.dict()
    outline = ""
    print("Threads: ", getMaxThreads())  # max threads is 20
    for key in aList:
        print("Key: %s, pIndex: %d" % (key.key, pIndex))
        processPool.append(Process(target=detect, args=(key.key, dict)))
        pthreadIndex = pthreadIndex + 1
        pIndex = pIndex + 1
        # print("Added for %d" % (pIndex))
        if pthreadIndex == getMaxThreads():
            print("ProcessPool size: %d" % len(processPool))
            for process in processPool:
                # print("Started")
                process.start()
            print("20 Processes started")
            for process in processPool:
                # print("Joined")
                process.join()
            print("20 Processes joined")
            for key in dict.keys():
                outline = outline + dict.get(key)
            dict.clear()
            pthreadIndex = 0
            processPool = []
    if pthreadIndex != 0:
        for process in processPool:
            # print("End Start")
            process.start()
        for process in processPool:
            # print("End done")
            process.join()
        for key in dict.keys():
            print("Dict: " + dict.get(key))
            outline = outline + dict.get(key)
    aOut.write(outline)
# end method detectInBatch
To avoid the 'unexpected quit', perhaps try ignoring the exception:

try:
    your_loop()
except:
    pass

Then put in some logging to track down the root cause.
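As for a better way to achieve the same purpose, a multiprocessing.Pool keeps a fixed set of worker processes alive instead of creating and joining thousands of short-lived Process objects. A rough sketch (my suggestion; it assumes detect can be changed to return its output string instead of writing into a managed dict):

from multiprocessing import Pool

def detectInBatch(aList, aOut):
    # Sketch: getMaxThreads() workers, one task per key; map() preserves order.
    with Pool(processes=getMaxThreads()) as pool:
        results = pool.map(detect, [key.key for key in aList])
    aOut.write("".join(results))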

Causing a thread to stop while stuck within a while loop?

Is it possible to stop a thread prematurely when it is stuck inside a while loop? Below is my sample code, which runs correctly, since each iteration of loop_thread checks whether the threading.Event() flag is set. But when a thread runs a function that takes much longer than a second per iteration, there is no way to stop the function's execution before the next check of the loop condition. For example, dld_img_thread takes about 5 minutes to complete one pass before it rechecks the while loop to see if it should proceed. What I want is to kill dld_img_thread after a shorter time (e.g. 1 minute). I don't care if the data is lost, just that the thread stops before the function finishes execution. Thank you.
import threading, time, pythoncom, read_mt0
import powerfail_debugport_reader as pf_dbg_rdr
import powerfail_firmware_downloader as pf_fwdld

def loop_thread(thread_name, thread_event):
    loopCnt = 0
    print "\nstarting {}".format(thread_name)
    print "is {0} alive? {1}\n".format(thread_name, L00P_thread.is_alive())
    while not thread_event.is_set():
        print("value of loopCnt = {}".format(loopCnt))
        loopCnt += 1
        time.sleep(1)
    print('stopping {}\n'.format(thread_name))

def image_dld(thread_name, thread_event):
    pythoncom.CoInitializeEx(pythoncom.COINIT_MULTITHREADED)
    print "\nstarting {}".format(thread_name)
    print "is {0} alive? {1}\n".format(thread_name, dld_img_thread.is_alive())
    while not thread_event.is_set():
        pf_fwdld.power_fail_test()
    print('stopping {}'.format(thread_name))

def debug_port_thread(thread_name, thread_event):
    pythoncom.CoInitializeEx(pythoncom.COINIT_MULTITHREADED)
    print "\nstarting {}".format(thread_name)
    print "is {0} alive? {1}\n".format(thread_name, debug_thread.is_alive())
    pf_dbg_rdr.debug_port_reader()
    print('\nstopping {}'.format(thread_name))

def main():
    global L00P_thread, debug_thread
    pf_dbg_rdr.samurai_event = threading.Event()
    L00P_thread = threading.Thread(target=loop_thread,
                                   args=('L00P_thread', pf_dbg_rdr.samurai_event))
    dld_img_thread = threading.Thread(target=image_dld,
                                      args=('image_download', pf_dbg_rdr.samurai_event))
    debug_thread = threading.Thread(target=debug_port_thread,
                                    args=('debug_port_reader', pf_dbg_rdr.samurai_event))
    L00P_thread.start()
    dld_img_thread.start()
    debug_thread.start()
    debug_thread.join()

if __name__ == '__main__':
    main()
    print('processes stopped')
    print "Exiting Main Thread"
Use a second variable in your while condition that you can change once your timeout is reached.
For example:
shouldRun = True
while not thread_event.is_set() and shouldRun:
    print("value of loopCnt = {}".format(loopCnt))
    loopCnt += 1
    time.sleep(1)
    if loopCnt > 60: shouldRun = False
would stop after 60 iterations (about 60 seconds given you sleep for 1 second on each iteration).
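A related tweak (my sketch, not part of the answer above): Event.wait() accepts a timeout and returns as soon as the event is set, so the one-second sleep and the stop check can be combined, and the loop reacts to a stop request without waiting out the sleep:

def loop_thread(thread_name, thread_event):
    loopCnt = 0
    # wait() returns True as soon as the event is set and False on timeout,
    # so this loops roughly once per second until signalled
    while not thread_event.wait(timeout=1):
        print("value of loopCnt = {}".format(loopCnt))
        loopCnt += 1
    print('stopping {}\n'.format(thread_name))

This still cannot interrupt a single long-running call such as pf_fwdld.power_fail_test(); for that, the event check has to happen inside the function doing the work.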

Python Tkinter, Stop a threading function

I'm currently developing a GUI for a 3D printer, and I'm stuck on how to stop a threaded function. I want to be able to click a button in my GUI that stops the threaded function from sending strings of G-code across the serial port. Currently the function uses threading so that other functions can be triggered during printing. I would greatly appreciate some advice on how to incorporate this stop feature.
Below is the function that opens a G-code file and sends each line across the serial port.
def printFile():
    def callback():
        f = open(entryBox2.get(), 'r')
        for line in f:
            l = line.strip('\r')
            ser.write("<" + l + ">")
            while True:
                response = ser.read()
                if (response == 'a'):
                    break
    t = threading.Thread(target=callback)
    t.start()
Threads cannot be stopped; they have to stop themselves. So you need to send a signal to the thread that it's time to stop. This is usually done with an Event.
stop_event = threading.Event()

def callback():
    f = open(entryBox2.get(), 'r')
    for line in f:
        l = line.strip('\r')
        ser.write("<" + l + ">")
        while True:
            response = ser.read()
            if (response == 'a'):
                break
        if stop_event.is_set():
            break

t = threading.Thread(target=callback)
t.start()
Now if you set the event elsewhere in your code:
stop_event.set()
The thread will notice it, break out of the loop, and die.
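In a Tkinter GUI, that is just a button whose command sets the event; a minimal sketch (the widget and parent names are assumptions, not from the original post):

# Sketch: clicking the button signals the printing thread to stop
stop_button = Button(root, text="Stop print", command=stop_event.set)
stop_button.pack()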
Use a global variable as a condition for the thread to stop.
send_gcode = True

def printFile():
    global send_gcode  # without this, the assignment below would make send_gcode a local

    def print_thread():
        f = open(entryBox2.get(), 'r')
        for line in f:
            if not send_gcode:
                break
            l = line.strip('\r')
            ser.write("<" + l + ">")
            while True:
                response = ser.read()
                if (response == 'a'):
                    break

    t = threading.Thread(target=print_thread)
    send_gcode = True
    t.start()
The thread will run until send_gcode is set to False, e.g. by a callback for a button:
def stop_callback(event):
    global send_gcode
    send_gcode = False
