I am implementing a Flask application and, based on a request, I call a function A. Inside function A, another function B is called, but I don't need to wait for function B to finish before returning the output of function A. I have implemented this with the following code.
from threading import Thread

def functionA():
    result = doSomething1()
    Thread(target=functionB).start()
    return result

def functionB():
    # Do something after the execution of doSomething1()
    pass
Here, I start a new thread and do what I need to do, but I never terminate the newly started thread. Do I need to terminate that thread myself? If so, what is the best way to do that?
A thread will terminate on its own. To check whether it is still alive you can use thread.is_alive().
If you need to wait for a thread to finish, thread.join() is the closest option, as it blocks the calling thread until the thread in question has finished.
Also refer to the docs for more info on the threading module:
https://docs.python.org/3/library/threading.html
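To make this concrete, here is a minimal, hedged sketch of the pattern from the question; doSomething1 and the body of functionB are placeholders, not part of the original code:

from threading import Thread
import time

def functionB():
    # Placeholder follow-up work; keeps running after functionA has returned.
    time.sleep(2)
    print("functionB finished")

def functionA():
    result = "output of doSomething1"   # placeholder for doSomething1()
    worker = Thread(target=functionB)
    worker.start()                      # no join(): functionA returns immediately
    print("worker still alive?", worker.is_alive())
    return result

print(functionA())
# A non-daemon thread terminates on its own; the interpreter simply waits
# for it to finish before the process exits.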
Related
I have a function in thread A which needs to wait until a function in thread B is called.
The function in thread B is called periodically, so it just needs to wait until the next time it is called. This allows me to sync up with it.
How would I do this?
(Sorry if this is trivial.)
It may be a principle of computer science that no multithreading question is trivial.
There are various ways to do this, but one of the simplest involves the use of a threading.Event object. Events are the simplest of the so-called synchronization primitives. See the manual section on the threading module for more ideas. Here is a working example:
#! python3.8
import threading
import time

t0 = time.time()

def elapsed_time():
    return time.time() - t0

class StopMe:
    def __init__(self):
        self.running = True

def main():
    ev1 = threading.Event()
    stop = StopMe()
    th1 = threading.Thread(target=thread1, args=(ev1, stop))
    th1.start()
    for _ in range(10):
        ev1.wait()
        print("The function was just called", elapsed_time())
        ev1.clear()
    stop.running = False
    th1.join()
    print("Exit", elapsed_time())

def thread1(event, stop):
    def a_function():
        event.set()
        print("I am the function", elapsed_time())
    while stop.running:
        time.sleep(1.0)
        a_function()

main()
Output:
I am the function 1.0116908550262451
The function was just called 1.0116908550262451
I am the function 2.0219264030456543
The function was just called 2.0219264030456543
I am the function 3.0322916507720947
The function was just called 3.0322916507720947
I am the function 4.033170938491821
The function was just called 4.033170938491821
I am the function 5.043376445770264
The function was just called 5.043376445770264
I am the function 6.043909788131714
The function was just called 6.043909788131714
I am the function 7.054021596908569
The function was just called 7.054021596908569
I am the function 8.06399941444397
The function was just called 8.06399941444397
I am the function 9.064924716949463
The function was just called 9.064924716949463
I am the function 10.066757678985596
The function was just called 10.066757678985596
I am the function 11.076870918273926
Exit 11.076870918273926
Some things to note here:
Once you put a synchronization primitive into your code, you need to give some thought to how to terminate the thread gracefully, and how to terminate the application as a whole. In this example, the threads communicate through the little "StopMe" object and through the Event object. Note that the main thread may have to wait up to one second for the secondary thread to finish its sleep call. That happens if thread1 begins its time delay before the main thread calls join. It didn't happen in my test run, but it might, depending on how CPU time slices are handed to the different threads. If that's not acceptable to you, you have to write more code to get around it.
Also note that the function call ev1.wait() will block the main thread until the event is set from the secondary thread. In a GUI application that is not what you want.
I ran this with Python3.8 but the program doesn't use any version-specific features, so it should work the same with any reasonably recent version of Python.
My code runs N threads. I want to stop specific threads on some condition, but the remaining threads should continue running. I do some operation once each thread finishes its job. Is there a way to stop a running thread in Python 3?
My current code is implemented in Python 2 and does this with "_Thread__stop()". Is there an equivalent in Python 3?
The practice is to "signal" the thread that it is time to finish, and the thread then needs to exit on its own. This is not killing in the sense of killing a process, but regular state-machine behaviour of your thread function.
For example, suppose your thread is looping. You should insert an if statement inside the loop that instructs the thread function to break or return if stop is True. The stop variable should be shared with the main thread (or whichever thread needs to stop our thread), which changes it to True. Usually, after doing this, the stopping thread will want to wait for the thread's completion with join(), as in the sketch below.
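A minimal sketch of that signalling pattern, using a threading.Event as the shared stop flag (the worker loop and its timing are assumptions for illustration):

import threading
import time

stop_event = threading.Event()

def worker():
    # Keep looping until the main thread signals that it is time to finish.
    while not stop_event.is_set():
        print("working...")
        time.sleep(0.5)
    print("worker exiting cleanly")

t = threading.Thread(target=worker)
t.start()
time.sleep(2)
stop_event.set()   # signal the thread to stop
t.join()           # wait for it to actually exit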
It's a bad habit to kill a thread; it is better to create a "flag" that tells the thread when its work is done.
Consider the following example:
import threading
import random

class CheckSomething(threading.Thread):
    def __init__(self, variable):
        super(CheckSomething, self).__init__()
        self.stop_flag = threading.Event()
        self.variable = variable

    def check_position(self, variable):
        x = random.randint(0, 100)   # randint() needs a lower and an upper bound
        if variable == x:
            self.stop_checking()

    def run(self):
        # keep checking until the flag is set
        while not self.stopped():
            self.check_position(self.variable)

    def stop_checking(self):
        self.stop_flag.set()

    def stopped(self):
        return self.stop_flag.is_set()
The set() method of an Event sets its internal flag to True. You can read more in the docs: https://docs.python.org/3.5/library/threading.html
So you need to call stop_checking() when you hit the condition on which you want to exit.
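For completeness, a hedged usage sketch of the class above; the target value and the point at which the main thread decides to stop are made up for illustration:

checker = CheckSomething(42)   # hypothetical target value
checker.start()

# ... later, from the main thread, when some condition is met:
checker.stop_checking()
checker.join()
print("stopped:", checker.stopped())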
I'm trying to call a multiprocessing function from a python thread to avoid the Global Interpreter Lock affecting my multiprocessing function.
The logic looks like this:
python --- ongoing python processing...
       \
        -> py thread -> py multiprocessing call -> C-code (dll/ctypes)
Does this make sense? Will the C code run on a separate core, or is this too complex to work?
Thanks.
EDIT: Thanks for the reply. I should clarify: I need to use a second thread because I have to first make a Python array and then pass the pointer to a C function. So I can't call the multiprocessing function too early (and the main Python processing also needs to start and continue seamlessly).
EDIT: Here's the code logic and why I can't call a 2nd process inline with main code:
def main():
    ...
    p = Process(target=save_c, args=(...,))
    p.start()
    p.join()  # Main thread will lock here and wait until return;
    # Other code that needs to be processed follows the multiprocess call

def save_c():
    ''' Function which calls C-module '''
    _sum = ctypes.CDLL('/home/pi/murphylab_picam/temp/libsum.so')
    _sum.c_function.argtypes = (ctypes.c_int, ctypes.POINTER(ctypes.c_int))
    _sum.c_function(ctypes.c_int(num_numbers), array_type(*_sum.numbers))
What am I missing here? Is there a different way to use multiprocessing inline with ongoing processing?
You don't need to join immediately after you create a process, as long as you don't want to wait for that process to finish before continuing.
This is the principle of concurrent programming.
The important thing is that you eventually call join, or your main process will terminate and leave the child orphaned.
child_process = Process(....)
child_process.start() # the child process starts its independent execution here
do_some_stuff()
do_some_more_stuff()
child_process.join() # here you are waiting for it to finish
I have a function that I'm calling every 5 seconds, like so:
def check_buzz(super_buzz_words):
    print 'Checking buzz'
    t = Timer(5.0, check_buzz, args=(super_buzz_words,))
    t.dameon = True
    t.start()
    buzz_word = get_buzz_word()
    if buzz_word is not 'fail':
        super_buzz_words.put(buzz_word)

main()
check_buzz()
I'm exiting the script by either catching a KeyboardInterrupt or by catching a System exit and calling this:
sys.exit('\nShutting Down\n')
I'm also restarting the program every so often by calling:
execv(sys.executable, [sys.executable] + sys.argv)
My question is, how do I get that timer thread to shut off? If I keyboard interrupt, the timer keeps going.
I think you just spelled daemon wrong, it should have been:
t.daemon = True
Then sys.exit() should work
Expanding on the answer from notorious.no, and the comment asking:
How can I call t.cancel() if I have no access to t outside the function?
Give the Timer thread a distinct name when you first create it:
import threading
from threading import Timer

def check_buzz(super_buzz_words):
    print 'Checking buzz'
    t = Timer(5.0, check_buzz, args=(super_buzz_words,))
    t.daemon = True
    t.name = "check_buzz_daemon"
    t.start()
Although the local variable t soon goes out of scope, the Timer thread that t pointed to still exists and still retains the name assigned to it.
Your atexit-registered method can then identify this thread by its name and cancel it:
from atexit import register

def all_done():
    for thr in threading._enumerate():
        if thr.name == "check_buzz_daemon":
            if thr.is_alive():
                thr.cancel()
                thr.join()

register(all_done)
Calling join() after calling cancel() is based on a StackOverflow answer by Cédric Julien.
HOWEVER, your thread is set to be a Daemon. According to this StackOverflow post, daemon threads do not need to be explicitly terminated.
from atexit import register

def all_done():
    if t.is_alive():
        # do something that will close your thread gracefully
        pass

register(all_done)
Basically, when your code is about to exit, it will fire one last function, and this is where you check whether your thread is still running. If it is, do something that will either cancel the transaction or otherwise exit gracefully. In general, it's best to let threads finish by themselves, but if the thread isn't doing anything important (please note the emphasis) then you can just call t.cancel(). Design your code so that threads will finish on their own if possible.
Another way would be to use the Queue module to send and receive info from a thread, using .put() outside the thread and .get() inside the thread, as in the sketch below.
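A hedged sketch of that queue idea; here the worker simply exits when it receives a sentinel value, and the sentinel, the work items, and the timing are all assumptions for illustration:

import threading
import queue

q = queue.Queue()

def worker():
    while True:
        item = q.get()       # .get() inside the thread
        if item is None:     # sentinel value means "shut down"
            break
        print("got", item)

t = threading.Thread(target=worker)
t.start()

q.put("some work")           # .put() outside the thread
q.put(None)                  # tell the worker to stop
t.join()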
You could also create a text file and make the program write to it when you exit, and put an if statement in the thread function that checks it after each iteration (this is not a really good solution, but it also works).
I would have put a code example, but I am writing from mobile, sorry.
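Since no example was given, here is a rough, hedged sketch of that file-flag idea; the file name, the loop body, and the cleanup are assumptions, and a shared variable or Event is usually cleaner:

import os
import threading
import time

STOP_FILE = "stop.txt"   # hypothetical flag file

def worker():
    # check the flag file after each iteration
    while not os.path.exists(STOP_FILE):
        time.sleep(1.0)   # placeholder for real work
    print("stop file found, exiting")

t = threading.Thread(target=worker)
t.start()

# when the program is about to exit, create the file so the thread stops
open(STOP_FILE, "w").close()
t.join()
os.remove(STOP_FILE)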
I'm trying to use a new thread or multiprocessing to run a function.
The function is called like this:
Actualize_files_and_folders(self)
I've read a lot about multiprocessing and threads, and looked through the questions on StackOverflow, but I can't make it work... It would be great to have some help.
I'm calling the function from a button:
def on_button_act_clicked(self, menuitem, data=None):
    self.window_waiting.show()
    Actualize_files_and_folders(self)
    self.window_waiting.hide()
In window_waiting I have a button called 'cancel'; it would be great if I could have a command/function that kills the thread.
I've tried a lot of stuff, for example:
self.window_waiting.show()
from multiprocessing import Process
a=Process(Target=Actualize_files_and_folders(self))
a.start()
a.join()
self.window_waiting.hide()
But the window still freezes, and window_waiting is only displayed after Actualize_files_and_folders(self) has finished, as if I had called a normal function.
Thanks so much for any help!
It looks like the worker function is being called rather than used as a callback for the process target:
process = Process(target=actualize_files_and_folders(self))
This is essentially equivalent to:
tmp = actualize_files_and_folders(self)
process = Process(target=tmp)
So the worker function is called in the main thread, blocking it. The result of this function is passed into the Process as the target, which will do nothing if it is None. You need to pass the function itself as the target, not its result:
process = Process(target=actualize_files_and_folders, args=[self])
process.start()
See: https://docs.python.org/2/library/multiprocessing.html
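As a hedged follow-up, this is roughly how the button handlers might look once the callable is passed correctly; the cancel handler and the decision to skip join() in the click handler are assumptions, not part of the original answer:

from multiprocessing import Process

def on_button_act_clicked(self, menuitem, data=None):
    self.window_waiting.show()
    # pass the callable and its arguments; do not call it here
    self.process = Process(target=actualize_files_and_folders, args=[self])
    self.process.start()
    # no join() here, otherwise the GUI thread blocks and the window freezes

def on_button_cancel_clicked(self, menuitem, data=None):
    # hypothetical handler for the 'cancel' button: terminate() forcefully
    # stops the child process
    if self.process.is_alive():
        self.process.terminate()
    self.window_waiting.hide()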