Python threading library: code executes linearly and not in parallel

I want to run two threads in parallel (on Python 3.6), which works for the following code example:

from threading import Thread
from time import sleep

# use Thread to run def in background
# Example:
def func1():
    while True:
        sleep(1)
        print("Working")

def func2():
    while True:
        sleep(2)
        print("Working2")

Thread(target=func1).start()
Thread(target=func2).start()
but it does not work with threading.Thread:

import threading
from time import sleep

# use Thread to run def in background
# Example:
def func1():
    while True:
        sleep(1)
        print("Working")

def func2():
    while True:
        sleep(2)
        print("Working2")

x = threading.Thread(target=func1())
y = threading.Thread(target=func2())
x.start()
y.start()
I would like to use the latter option to check if x or y are still alive.

There's a difference between Thread(target = func1) (first code) and Thread(target=func1()) (second code):
the first one passes the function object to Thread
the second one executes the function (because you called it with func1()) and passes its return value to Thread
Since you want the threads to call your functions, don't call them:
x = threading.Thread(target=func1)
y = threading.Thread(target=func2)
x.start()
y.start()
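Since the stated goal was to check whether x or y are still alive, here is a minimal sketch of such a check on top of the corrected code (the polling loop and its 5-second interval are only an illustration, not part of the original answer):

# is_alive() returns True while the thread's target function is still running
while x.is_alive() or y.is_alive():
    sleep(5)  # assumes "from time import sleep" as in the question's snippets
print("both threads finished")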


Interrupting Infinite Loop in Threading

I am learning about threading using Python 3.8.2. I have one function with an infinite loop, and then two other functions that use the threading.Timer class.
def x():
    while True:
        dosomething()

def f1():
    dosomething2()
    threading.Timer(60, f1).start()

def f2():
    dosomething3()
    threading.Timer(100, f2).start()
Then I start three threads:
t1 = threading.Thread(target=x)
t2 = threading.Thread(target=f1)
t3 = threading.Thread(target=f2)
When it comes time to execute f1 or f2, I don't want x() to be executing at the same time (they might be using the same resource); x() should pause, let f1 or f2 finish, and then resume its infinite loop. How can I do this?
I've looked at join(), but it seems to me it would wait forever for f1() and f2(), because each of them starts a new Timer every time and never terminates.
Here's a possible solution. I added the functions dosomething(), dosomething2() and dosomething3() to the code to have a working example. I've also changed the timers on the threads to 6 and 10 seconds instead of 60 and 100, so we don't have to wait that long to see their functionality.
dosomething()
- prints 'dosomething is running' every second if no other function is running

dosomething2()
- sets dosomething2.status = 'run'
- prints 'dosomething2 is running 1st second'
- waits one second
- prints 'dosomething2 is running 2nd second'
- sets dosomething2.status = 'sleep'

dosomething3()
- sets dosomething3.status = 'run'
- prints 'dosomething3 is running 1st second'
- waits one second
- prints 'dosomething3 is running 2nd second'
- waits one second
- prints 'dosomething3 is running 3rd second'
- sets dosomething3.status = 'sleep'
The first and last lines in dosomething2() and dosomething3() act as triggers that let our x() function know to run only when both functions are in the 'sleep' state.
You could use global variables instead of dosomething2.status and dosomething3.status but some people recommend not using them.
Code
import time
import threading

def dosomething():
    print('dosomething is running')
    time.sleep(1)

def dosomething2():
    dosomething2.status = 'run'
    print('\tdosomething2 is running 1st second')
    time.sleep(1)
    print('\tdosomething2 is running 2nd second')
    dosomething2.status = 'sleep'

def dosomething3():
    dosomething3.status = 'run'
    print('\tdosomething3 is running 1st second')
    time.sleep(1)
    print('\tdosomething3 is running 2nd second')
    time.sleep(1)
    print('\tdosomething3 is running 3rd second')
    dosomething3.status = 'sleep'

dosomething2.status = ''
dosomething3.status = ''

def x():
    while True:
        if dosomething2.status == 'sleep' and dosomething3.status == 'sleep':
            dosomething()

def f1():
    dosomething2()
    threading.Timer(6, f1).start()

def f2():
    dosomething3()
    threading.Timer(10, f2).start()

t1 = threading.Thread(target=x)
t2 = threading.Thread(target=f1)
t3 = threading.Thread(target=f2)

t1.start()
t2.start()
t3.start()
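As a side note, if the underlying goal is only to keep x() away from a shared resource while f1 or f2 run, a single threading.Lock can replace the status attributes. This is a rough sketch under that assumption (the lock name and structure are mine, not part of the answer above), and note that lock acquisition order is not guaranteed:

import threading

resource_lock = threading.Lock()

def x():
    while True:
        with resource_lock:   # x() only proceeds while no timer callback holds the lock
            dosomething()

def f1():
    with resource_lock:       # blocks x() for the duration of dosomething2()
        dosomething2()
    threading.Timer(6, f1).start()

def f2():
    with resource_lock:       # blocks x() for the duration of dosomething3()
        dosomething3()
    threading.Timer(10, f2).start()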

python - how to run background thread without stopping main thread

This simple code example:
import threading
import time

class Monitor():
    def __init__(self):
        self.stop = False
        self.blocked_emails = []

    def start_monitor(self):
        print("Run start_monitor")
        rows = []
        while not self.stop:
            self.check_rows(rows)
            print("inside while")
            time.sleep(1)

    def check_rows(self, rows):
        print('check_rows')

    def stop_monitoring(self):
        print("Run stop_monitoring")
        self.stop = True

if __name__ == '__main__':
    monitor = Monitor()
    b = threading.Thread(name='background_monitor', target=monitor.start_monitor())
    b.start()
    b.join()

    for i in range(0, 10):
        time.sleep(2)
        print('Wait 2 sec.')

    monitor.stop_monitoring()
How can I run a background thread, in my case background_monitor, without blocking the main thread?
I want the background_monitor thread to stop after stop_monitoring is called.
In my example, the for loop in the main thread is never reached and the background thread runs forever.
There are two issues with your current code. Firstly, you're calling monitor.start_monitor() when constructing the thread, whereas according to the docs:
target is the callable object to be invoked by the run() method. Defaults to None, meaning nothing is called
This means that you need to pass it as a function rather than calling it. To fix this, you should change the line
b = threading.Thread(name='background_monitor', target=monitor.start_monitor())
to
b = threading.Thread(name='background_monitor', target=monitor.start_monitor)
which passes the function as an argument.
Secondly, you use b.join() before stopping the thread, which waits for the second thread to finish before continuing. Instead, you should place that below the monitor.stop_monitoring().
The corrected code looks like this:
import threading
import time

class Monitor():
    def __init__(self):
        self.stop = False
        self.blocked_emails = []

    def start_monitor(self):
        print("Run start_monitor")
        rows = []
        while not self.stop:
            self.check_rows(rows)
            print("inside while")
            time.sleep(1)

    def check_rows(self, rows):
        print('check_rows')

    def stop_monitoring(self):
        print("Run stop_monitoring")
        self.stop = True

if __name__ == '__main__':
    monitor = Monitor()
    b = threading.Thread(name='background_monitor', target=monitor.start_monitor)
    b.start()

    for i in range(0, 10):
        time.sleep(2)
        print('Wait 2 sec.')

    monitor.stop_monitoring()
    b.join()
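As a small refinement, the boolean flag could be replaced by a threading.Event, which also lets the loop wake up early when asked to stop. This is a sketch of just the changed parts, assuming the same Monitor class as above:

class Monitor():
    def __init__(self):
        self.stop = threading.Event()
        self.blocked_emails = []

    def start_monitor(self):
        print("Run start_monitor")
        rows = []
        while not self.stop.is_set():
            self.check_rows(rows)
            print("inside while")
            self.stop.wait(1)   # waits up to 1 second, but returns immediately once stop is set

    def stop_monitoring(self):
        print("Run stop_monitoring")
        self.stop.set()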

Python: How to combine the returned values from 2 functions and append them to a list using threads?

I am new to threading. I encountered an abnormal result while printing the values inside a list; I am using threads to allow 2 functions to work at the same time and append their results to a list. Below is my code:
import threading

def func1():
    return "HTML"

def func2():
    return "IS FUN"

threadslist = []

thread1 = threading.Thread(target=func1)
thread2 = threading.Thread(target=func2)
x = thread1
y = thread2
x.start()
y.start()
threadslist.append(x)
threadslist.append(y)
print(threadslist)
And here is the result for the list:
[<Thread(Thread-1, stopped 1076)>, <Thread(Thread-2, stopped 7948)>]
Why is it storing the Thread objects instead of ['HTML', 'IS FUN']?
import threading

threading_list = []

def func1():
    threading_list.append("HTML")

def func2():
    threading_list.append("IS FUN")

thread1 = threading.Thread(target=func1)
thread2 = threading.Thread(target=func2)
x = thread1
y = thread2
x.start()
y.start()

# wait for both threads to finish before reading the list
x.join()
y.join()

print(threading_list)
In your threadslist you are saving the Thread objects themselves, so what you see in the output is their string representation.
You can't get the return value of a function running in a different thread like that.
To do what you want, you can either:
Use a thread pool from the multiprocessing module:

from multiprocessing.pool import ThreadPool

def func1():
    return 'HTML'

def func2():
    return 'IS FUN'

pool = ThreadPool(processes=1)
return_values = []

return_values.append(pool.apply(func1, ()))  # apply is a synchronous call and directly returns the function's return value.

func2_result = pool.apply_async(func2)       # apply_async is asynchronous and returns an AsyncResult.
return_values.append(func2_result.get())     # get() retrieves the return value of the asynchronous call to func2.

print(return_values)
Use a mutable object, like a list, to save the return values:

return_values = []

def func1():
    return_values.append('HTML')

def func2():
    return_values.append('IS FUN')

# rest of your code here
print(return_values)
And you'll get:
['HTML', 'IS FUN']
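For completeness, a third option is concurrent.futures.ThreadPoolExecutor, whose submit() returns a Future that carries the function's return value. This is a sketch I am adding here, not part of the answer above:

from concurrent.futures import ThreadPoolExecutor

def func1():
    return 'HTML'

def func2():
    return 'IS FUN'

with ThreadPoolExecutor() as executor:
    futures = [executor.submit(func1), executor.submit(func2)]
    print([f.result() for f in futures])  # prints ['HTML', 'IS FUN']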

Semaphores on Python

I started programming in Python a few weeks ago and have been trying to use semaphores to synchronize two simple threads, for learning purposes. Here is what I've got:
import threading

sem = threading.Semaphore()

def fun1():
    while True:
        sem.acquire()
        print(1)
        sem.release()

def fun2():
    while True:
        sem.acquire()
        print(2)
        sem.release()

t = threading.Thread(target = fun1)
t.start()
t2 = threading.Thread(target = fun2)
t2.start()
But it keeps printing just 1's. How can I interleave the prints?
It is working fine, it's just printing too fast for you to see. Try putting a short time.sleep() in both functions to pause each thread for that amount of time, so you can actually see both 1 and 2.
Example -
import threading
import time

sem = threading.Semaphore()

def fun1():
    while True:
        sem.acquire()
        print(1)
        sem.release()
        time.sleep(0.25)

def fun2():
    while True:
        sem.acquire()
        print(2)
        sem.release()
        time.sleep(0.25)

t = threading.Thread(target = fun1)
t.start()
t2 = threading.Thread(target = fun2)
t2.start()
Also, you can use the Lock/mutex method as follows:
import threading
import time

mutex = threading.Lock()  # is equal to threading.Semaphore(1)

def fun1():
    while True:
        mutex.acquire()
        print(1)
        mutex.release()
        time.sleep(.5)

def fun2():
    while True:
        mutex.acquire()
        print(2)
        mutex.release()
        time.sleep(.5)

t1 = threading.Thread(target=fun1).start()
t2 = threading.Thread(target=fun2).start()
Simpler style using "with":
import threading
import time

mutex = threading.Lock()  # is equal to threading.Semaphore(1)

def fun1():
    while True:
        with mutex:
            print(1)
        time.sleep(.5)

def fun2():
    while True:
        with mutex:
            print(2)
        time.sleep(.5)

t1 = threading.Thread(target=fun1).start()
t2 = threading.Thread(target=fun2).start()
[NOTE]:
The difference between mutex, semaphore, and lock
In fact, I was looking for asyncio.Semaphore, not threading.Semaphore, and I believe someone else may want it too.
So I decided to share the asyncio.Semaphore version, hope you don't mind.
from asyncio import (
    Task,
    Semaphore,
)
import asyncio
from typing import List

async def shopping(sem: Semaphore):
    while True:
        async with sem:
            print(shopping.__name__)
        await asyncio.sleep(0.25)  # Transfer control to the loop, and it will assign another job (that is idle) to run.

async def coding(sem: Semaphore):
    while True:
        async with sem:
            print(coding.__name__)
        await asyncio.sleep(0.25)

async def main():
    sem = Semaphore(value=1)
    list_task: List[Task] = [asyncio.create_task(_coroutine(sem)) for _coroutine in (shopping, coding)]
    """
    # Normally, we would wait until all the tasks are done, but that is impossible in your case.
    for task in list_task:
        await task
    """
    await asyncio.sleep(2)  # So, I let the main loop wait for 2 seconds, then close the program.

asyncio.run(main())
Output:
shopping
coding
shopping
coding
shopping
coding
shopping
coding
shopping
coding
shopping
coding
shopping
coding
shopping
coding
That is 16 lines of output over the 2 seconds that main() waits (two prints every 0.25 seconds).
I used this code to demonstrate how one thread can use a Semaphore while the other thread waits (non-blocking) until the Semaphore is available.
This was written using Python 3.6; it has not been tested on any other version.
This only works if the synchronization is being done between threads of the same process; IPC between separate processes will fail using this mechanism.
import threading
from time import sleep

sem = threading.Semaphore()

def fun1():
    print("fun1 starting")
    sem.acquire()
    for loop in range(1, 5):
        print("Fun1 Working {}".format(loop))
        sleep(1)
    sem.release()
    print("fun1 finished")

def fun2():
    print("fun2 starting")
    while not sem.acquire(blocking=False):
        print("Fun2 No Semaphore available")
        sleep(1)
    else:
        print("Got Semaphore")
        for loop in range(1, 5):
            print("Fun2 Working {}".format(loop))
            sleep(1)
        sem.release()

t1 = threading.Thread(target = fun1)
t2 = threading.Thread(target = fun2)
t1.start()
t2.start()
t1.join()
t2.join()
print("All Threads done Exiting")
print("All Threads done Exiting")
When I run this - I get the following output.
fun1 starting
Fun1 Working 1
fun2 starting
Fun2 No Semaphore available
Fun1 Working 2
Fun2 No Semaphore available
Fun1 Working 3
Fun2 No Semaphore available
Fun1 Working 4
Fun2 No Semaphore available
fun1 finished
Got Semaphore
Fun2 Working 1
Fun2 Working 2
Fun2 Working 3
Fun2 Working 4
All Threads done Exiting
Existing answers are wastefully sleeping
I noticed that almost all answers use some form of time.sleep or asyncio.sleep, which blocks the thread. This should be avoided in real software, because blocking your thread for 0.25, 0.5 or 1 second is unnecessary and wasteful: you could be doing more processing, especially if your application is IO bound (it already blocks when it does IO), and you are introducing arbitrary delays (latency) into your processing time. If all your threads are sleeping, your app isn't doing anything. Also, the sleep durations are quite arbitrary, which is why each answer sleeps (blocks the thread) for a different amount of time.
The answers are using the sleep as a way to get Python's bytecode interpreter to pre-empt the thread after each print line, so that it alternates deterministically between running the 2 threads. By default, the interpreter pre-empts a thread every 5 ms (sys.getswitchinterval() returns 0.005), and remember that these threads never run in parallel because of Python's GIL.
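If you want to inspect or tune that switch interval yourself, the relevant calls live in the sys module; a small sketch (shortening the interval still does not make the scheduling order deterministic):

import sys

print(sys.getswitchinterval())  # 0.005 by default, i.e. roughly 5 ms
sys.setswitchinterval(0.001)    # threads are pre-empted more often, but the order is still not guaranteed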
Solution to problem
How can I interleave the prints?
So my answer would be, you do not want to use semaphores to print (or process) something in a certain order reliably, because you cannot rely on thread prioritization in Python. See Controlling scheduling priority of python threads? for more. time.sleep(arbitrarilyLargeEnoughNumber) doesn't really work when you have more than 2 concurrent pieces of code, since you don't know which one will run next - see * below. If the order matters, use a queue, and worker threads:
from threading import Thread
import queue

q = queue.Queue()

def enqueue():
    while True:
        q.put(1)
        q.put(2)

def reader():
    while True:
        value = q.get()
        print(value)

enqueuer_thread = Thread(target = enqueue)
reader_thread_1 = Thread(target = reader)
reader_thread_2 = Thread(target = reader)
reader_thread_3 = Thread(target = reader)

enqueuer_thread.start()
reader_thread_1.start()
reader_thread_2.start()
reader_thread_3.start()
...
Unfortunately in this problem, you don't get to use Semaphore.
*An extra check for you
If you try a modification of the top voted answer but with an extra function/thread to print(3), you'll get:
1
2
3
1
3
2
1
3
...
Within a few prints, the ordering is broken - it's 1-3-2.
You need to use 2 semaphores to do what you want to do, and you need to initialize them at 0.
import threading

SEM_FUN1 = threading.Semaphore(0)
SEM_FUN2 = threading.Semaphore(0)

def fun1() -> None:
    for _ in range(5):
        SEM_FUN1.acquire()
        print(1)
        SEM_FUN2.release()

def fun2() -> None:
    for _ in range(5):
        SEM_FUN2.acquire()
        print(2)
        SEM_FUN1.release()

threading.Thread(target=fun1).start()
threading.Thread(target=fun2).start()
SEM_FUN1.release()  # Trigger fun1
Output:
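Because each thread releases the other thread's semaphore only after its own print, the prints alternate strictly. The expected output, derived from the code above rather than captured from a run, is:

1
2
1
2
1
2
1
2
1
2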

How to use a thread pool to run an infinite loop function?

I want to run a function in an infinite loop.
Here is my code:

def do_request():
    # my code here
    print(result)

while True:
    do_request()
When I use while True to do this, it's a little slow, so I want to use a thread pool to concurrently execute the function do_request(). How do I do this?
Just like using ab (Apache Bench) to test an HTTP server.
Finally, I've solved this problem. I use a variable to limit the number of threads.
Here is my final code, which solved my problem.
import threading
import time

thread_num = 0
lock = threading.Lock()

def do_request():
    global thread_num
    # -------------
    # my code here
    # -------------
    with lock:
        thread_num -= 1

while True:
    if thread_num <= 50:
        with lock:
            thread_num += 1
        t = threading.Thread(target=do_request)
        t.start()
    else:
        time.sleep(0.01)
Thanks for all the replies.
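For what it's worth, the manual counter plus lock could also be replaced by a threading.BoundedSemaphore. This is only a sketch of that idea, with the limit of 50 taken from the code above:

import threading

limit = threading.BoundedSemaphore(50)  # at most 50 requests in flight

def do_request():
    try:
        pass  # my code here
    finally:
        limit.release()        # free a slot even if the request raises

while True:
    limit.acquire()            # blocks once 50 threads are already running
    threading.Thread(target=do_request).start()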
You can use threading in Python to implement this.
It can be something similar to this (using only two extra threads):
import threading
# define threads
task1 = threading.Thread(target = do_request)
task2 = threading.Thread(target = do_request)
# start both threads
task1.start()
task2.start()
# wait for threads to complete
task1.join()
task2.join()
Basically, you start as many threads as you need (make sure you don't get too many, so your system can handle it), then you .join() them to wait for tasks to complete.
Or you can get fancier with the multiprocessing Python module.
Try the following code:
import multiprocessing as mp
import time

def do_request():
    while True:
        print('I\'m making requests')
        time.sleep(0.5)

p = mp.Process(target=do_request)
p.start()

for ii in range(10):
    print('I\'m also doing other things though')
    time.sleep(0.7)

print('Now it is time to kill the service process')
p.terminate()
The main thread starts a service process that does the requests, carries on with its own work for as long as it needs to, and then terminates the service process.
Maybe you can use concurrent.futures.ThreadPoolExecutor:
from concurrent.futures import ThreadPoolExecutor
import time

def wait_on_b(hello):
    time.sleep(1)
    print(hello)  # prints the value passed in (3)
    return 5

def wait_on_a():
    time.sleep(1)
    print(a.result())  # waits for a's result (5) and prints it
    return 6

executor = ThreadPoolExecutor(max_workers=2)
a = executor.submit(wait_on_b, 3)
b = executor.submit(wait_on_a)
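For the original question (keep do_request() running in a loop with limited concurrency), a more direct ThreadPoolExecutor sketch could look like the following; the worker count of 50 and the worker() wrapper are assumptions of mine, not part of the answer above:

from concurrent.futures import ThreadPoolExecutor

def do_request():
    pass  # my code here

def worker():
    # each worker simply calls do_request() back to back, forever
    while True:
        do_request()

with ThreadPoolExecutor(max_workers=50) as pool:
    for _ in range(50):
        pool.submit(worker)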
How about this?
from threading import Thread, Event

class WorkerThread(Thread):
    def __init__(self, logger, func):
        Thread.__init__(self)
        self.stop_event = Event()
        self.logger = logger
        self.func = func

    def run(self):
        self.logger("Going to start the infinite loop...")
        # Your code: keep calling func until the stop event is set
        while not self.stop_event.is_set():
            self.func()

# assumes an existing logger callable and the do_request function
concur_task = WorkerThread(logger, func=do_request)
concur_task.start()
To end this thread:

concur_task.stop_event.set()
concur_task.join(10)  # or any value you like
