How to properly end 2 looping threads? [duplicate] - python

This question already has an answer here:
Python Threading with Event object
(1 answer)
Closed 3 years ago.
I'm doing a telemetry application using Azure IoT Hub, Azure IoT SDK in Python and a raspberry pi with temperature and humidity sensors.
Humidity + Temperature sensors => Raspberry Pi => Azure IoT Hub
For my application, I send the data with different frequencies using 2 looping threads:
- One loop collects the temperature sensor data and sends it to Azure IoT Hub every 60 seconds.
- One loop collects the humidity sensor data and sends it to Azure IoT Hub every 600 seconds.
I want to close the 2 looping threads properly. They currently run with no way to break out of them.
I'm using Python 2.7.
I heard about Event from the threading library, but I can't find a good example of a program structure that applies it.
How can I use Event to close a thread properly? How else could I end those loops?
Here is the structure of my code using the 2 threads, each containing a loop.
import time
from threading import Thread

def send_to_azure_temperature_thread_func():
    client = iothub_client_init()
    while True:
        collect_temperature_data()
        send_temperature_data(client)
        time.sleep(60)

def send_to_azure_humidity_thread_func():
    client = iothub_client_init()
    while True:
        collect_humidity_data()
        send_humidity_data(client)
        time.sleep(600)
if __name__ == '__main__':
    print("Threads...")
    temperature_thread = Thread(target=send_to_azure_temperature_thread_func)
    temperature_thread.daemon = True
    print("Thread1 init")
    humidity_thread = Thread(target=send_to_azure_humidity_thread_func)
    humidity_thread.daemon = True
    print("Thread2 init")
    temperature_thread.start()
    humidity_thread.start()
    print("Threads start")
    temperature_thread.join()
    humidity_thread.join()
    print("Threads wait")

An Event seems like a good approach: create one, pass it to both threads, replace the sleep() calls with Event.wait(), and check whether the loop needs to be left.
In the main thread the event can be set to signal the threads that they should leave their loops and end.
import time
from threading import Event, Thread

def temperature_loop(stop_requested):
    client = iothub_client_init()
    while True:
        collect_temperature_data()
        send_temperature_data(client)
        # wait() returns True as soon as the event is set,
        # otherwise False after the 60 second timeout expires
        if stop_requested.wait(60):
            break

def humidity_loop(stop_requested):
    client = iothub_client_init()
    while True:
        collect_humidity_data()
        send_humidity_data(client)
        if stop_requested.wait(600):
            break

def main():
    stop_requested = Event()
    print('Threads...')
    temperature_thread = Thread(target=temperature_loop, args=[stop_requested])
    temperature_thread.daemon = True
    print('Thread1 init')
    humidity_thread = Thread(target=humidity_loop, args=[stop_requested])
    humidity_thread.daemon = True
    print('Thread2 init')
    temperature_thread.start()
    humidity_thread.start()
    print('Threads start')
    time.sleep(2000)
    stop_requested.set()
    temperature_thread.join()
    humidity_thread.join()
    print('Threads wait')

if __name__ == '__main__':
    main()

Related

Asyncio Thread Management (Trying to avoid cascading forever-threads)

I currently have a WebRTC client running with a Python Flask backend. It receives video from an RTSP source and uses WebRTC to deliver it to a client.
I have an issue where, when the client disconnects, the threads keep running because they loop forever. I'm able to stop some of the threads I made, but I believe the asyncio loops are still creating more. Below is the relevant code; I always seem to gain two extra threads when a client disconnects. For instance, at start I have two threads; a client connects and I have 7; the client disconnects and I'm left with 4.
# WEBRTC ===================================
# This is what runs forever, while the client is connected.
# MASSIVELY SCREWY DUE TO ME NOT KNOWING HOW TO THREAD PROPERLY.
def webRtcWatchdog():
    global rtcloop
    global pingTime
    global webRTCThread
    global watchdog
    time.sleep(5)  # Wait at least 5 seconds for client to connect.
    while True:
        time.sleep(1)
        # If the timer exceeds the limit, stop the loop and join the thread.
        diffTime = float(time.time()) - float(pingTime)
        # print("Diff Time: " + str(diffTime))
        if (diffTime > 5.0):
            print("Running Threads Before: " + str(active_count()))
            rtcloop.stop()
            webRTCThread.join()
            break
    print("Broken Watchdog, thread should be terminating now!")

async def webRtcStart():
    # Get current loop
    loop = asyncio.get_event_loop()
    global rtspCredString, pingTime
    pingTime = None  # reset, so the watchdog sees only pings from this session
    # Set media source and decode offered data. In the future the media source
    # will be the local relay docker container that has the camera in question,
    # instead of the camera itself.
    player = MediaPlayer(rtspCredString)
    params = ast.literal_eval((request.data).decode("UTF-8"))
    # Set ICE server to local server (currently static)
    offer = RTCSessionDescription(sdp=params.get("sdp"), type=params.get("type"))
    webRtcPeer = RTCPeerConnection(configuration=RTCConfiguration(
        iceServers=[RTCIceServer(
            urls=['stun:nvr.internal.my.domain'])]))

    # Create an event watcher on the data channel to know if the client is
    # still alive, AKA ping - pong
    @webRtcPeer.on("datachannel")
    def on_datachannel(channel):
        @channel.on("message")
        def on_message(message):
            global pingTime
            if isinstance(message, str) and message.startswith("ping"):
                pingTime = time.time()
                channel.send("pong" + message[4:])
            elif ():  # always-false placeholder; the intended close condition is missing
                print("Closing Peer!")
                webRtcPeer.close()

    if (player.video):
        webRtcPeer.addTrack(player.video)
    if (player.audio):
        webRtcPeer.addTrack(player.audio)
    # Wait to set remote description
    await webRtcPeer.setRemoteDescription(offer)
    # Generate answer to give to the peer
    answer = await webRtcPeer.createAnswer()
    # Set local description of the peer to the answer
    await webRtcPeer.setLocalDescription(answer)
    final = ("{0}").format(json.dumps(
        {"sdp": (webRtcPeer.localDescription.sdp), "type": webRtcPeer.localDescription.type}
    ))
    # Return response
    return final

# When we grab a WebRTC offer
@app.route('/rtcoffer', methods=['GET', 'POST'])
@login_required
def webRTCOFFER():
    global rtcloop
    global webRTCThread
    global watchdog
    # Get the event loop if it exists, create it if not.
    try:
        rtcloop = asyncio.get_event_loop()
    except RuntimeError:
        rtcloop = asyncio.new_event_loop()
        asyncio.set_event_loop(rtcloop)
    # Run a coroutine in that loop until it completes and returns a value
    t = rtcloop.run_until_complete(webRtcStart())
    # Now create a timer that is reset by ping-pong. Keep the loop running
    # forever so the aiortc objects stay alive, shifted to another thread so
    # we don't block the request handler.
    webRTCThread = Thread(target=rtcloop.run_forever)
    webRTCThread.start()
    watchdog = Thread(target=webRtcWatchdog)
    watchdog.start()
    # We currently don't ever stop the started threads, as we are not yet
    # monitoring ping-pong (will implement). This means the program is
    # unusable in production, with the cascading threads never cleaned up.
    print("Current Number Of Running Threads: " + str(active_count()))
    # Return our parsed SDP
    return t.encode()
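One way to stop the loops and threads from accumulating is to keep a single long-lived event loop in one background thread and submit each client's coroutine to it, then stop that loop exactly once at shutdown. A minimal sketch of the pattern follows; `stream_task` is an illustrative stand-in for the aiortc work, not code from the project above:

```python
import asyncio
import threading

# One long-lived loop in one background thread, reused for every client.
loop = asyncio.new_event_loop()
loop_thread = threading.Thread(target=loop.run_forever, daemon=True)
loop_thread.start()

async def stream_task(client_id):
    # Stand-in for the per-client aiortc work.
    await asyncio.sleep(0.05)
    return "done-%s" % client_id

# Submit coroutines from the synchronous (Flask) side:
future = asyncio.run_coroutine_threadsafe(stream_task(1), loop)
print(future.result())  # blocks until the coroutine finishes: "done-1"

# At shutdown, stop the loop from outside its own thread, then join:
loop.call_soon_threadsafe(loop.stop)
loop_thread.join()
```

Because every client reuses the same loop and thread, disconnects no longer leave extra threads behind; only the one `loop_thread` ever exists.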

Python Socket Programming - Simulate a radio stream with multiple clients using threads

I've been trying to write a Python program that simulates a radio web stream, but I'm not quite sure how to do it properly. I would like the program to continuously "play" the music even when no clients are connected, so it simulates a "live" radio where you connect and listen to whatever is currently playing.
What I have now is a server/client relation with TCP basic socket programming, the server side has a producer thread that was supposed to keep reading the music, and on-demand consumer threads that should send the audio frame to the client, that plays it with PyAudio. The problem is probably in the way the data is shared between threads.
First I've tried to do it with a single Queue, but as the client reads data from the queue, this data is removed and if I have multiple clients connected, that will make the music skip some frames.
Then I've tried to create a fixed number (10) of Queue objects that would be used for each client, with the producer thread feeding every queue, but each client would create a consumer thread of its own and read only from the queue "assigned" to it with a control variable. The problem here is: if there are any queues not being consumed (if I have only one client connected, for example), the Queue.put() method will block because these queues are full. How do I keep all queues "running" and synchronized even when they are not being used?
This is where I am now, and any advice is appreciated. I am not an experienced programmer yet, so please be patient with me. I believe Queue is not the recommended IPC method in this case, but if there is a way to use it, let me know.
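For what it's worth, a common way to keep a fan-out producer from blocking on queues nobody is draining is to publish with put_nowait() and simply drop the frame for queues that are full. A sketch of that idea, using Python 3's queue module (illustrative only, not necessarily the fix that was ultimately applied):

```python
import queue

def broadcast(frame, listener_queues):
    # Send one audio frame to every listener, dropping it for queues
    # that are full (listeners that are slow or not connected).
    for q in listener_queues:
        try:
            q.put_nowait(frame)
        except queue.Full:
            pass  # nobody is draining this queue; skip so the producer never blocks

queues = [queue.Queue(maxsize=1) for _ in range(3)]
queues[0].put("old")           # simulate a listener that stopped reading
broadcast("frame-1", queues)   # does not block on the full queue
print(queues[1].get_nowait())  # "frame-1"
```

Dropping frames for absent listeners matches radio semantics: a client that connects late simply misses what was already played.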
Below is the code I have for now:
server.py
# TCP config omitted
# (serverSocket and CHUNK are defined in the omitted config)
import threading
import wave
from Queue import Queue  # Python 2; on Python 3 use "from queue import Queue"

# Producer thread
def readTheMusics(queue):
    # Control variable to keep looping through 2 music files
    i = 1
    while i < 3:
        fname = "music" + str(i) + ".wav"
        wf = wave.open(fname, 'rb')
        data = wf.readframes(CHUNK)
        while data:
            for k in range(10):
                queue[k].put(data)
            data = wf.readframes(CHUNK)
        wf.close()
        i += 1
        if i == 3:
            i = 1

# Consumer thread
def connection(connectionSocket, addr, queue, index):
    while True:
        data = queue[index - 1].get(True)
        connectionSocket.send(data)
    connectionSocket.close()

def main():
    i = 1
    # Queue(1) was used to prevent an infinite queue and therefore a memory leak
    queueList = [Queue(1) for j in range(10)]
    th2 = threading.Thread(target=readTheMusics, args=(queueList,))
    th2.start()
    while True:
        connectionSocket, addr = serverSocket.accept()
        print("connected - id {}".format(i))
        th = threading.Thread(target=connection, args=(connectionSocket, addr, queueList, i))
        th.start()
        i = i + 1

if __name__ == '__main__':
    main()
Tim Roberts' comments were enough to make it work.

Publish and subscribe both ways using MQTT Python

I currently have a Python program written on the Raspberry Pi 3 to read in humidity and temperature sensor data and publish this data to a topic. I can then receive this data using my laptop. Here is my code for reading sensor data and publishing it to a topic from my Raspberry Pi:
import RPi.GPIO as GPIO
import time
import json
import Adafruit_DHT as dht
import math
import paho.mqtt.publish as publish
import paho.mqtt.client as mqtt

# Creating the JSON objects
dht22 = {}
arduino = {}
dht22Temp = []
dht22Hum = []
arduinoLED = []
dht22['temperature'] = dht22Temp
dht22['humidity'] = dht22Hum
dht22['sensor'] = 'DHT22'
arduino['blink'] = arduinoLED
arduino['actuator'] = 'arduinoLED'

# Timing constants
E_PULSE = 0.0005
E_DELAY = 0.0005

def main():
    # Main program block
    while True:
        h, t = dht.read_retry(dht.DHT22, 17)  # Reading humidity and temp data from GPIO17
        t = round(t, 2)
        h = round(h, 2)
        # Keep only the last 3 readings in each list
        if t > 25:
            if len(arduinoLED) == 3:
                arduinoLED.pop(0)
            arduinoLED.append("true")
        else:
            if len(arduinoLED) == 3:
                arduinoLED.pop(0)
            arduinoLED.append("false")
        if len(dht22Temp) == 3:
            dht22Temp.pop(0)
        dht22Temp.append(t)
        if len(dht22Hum) == 3:
            dht22Hum.pop(0)
        dht22Hum.append(h)
        # lm35dzTemp.append(tempc)
        # Publishing sensor information by converting the JSON object to a string
        publish.single("topic/sensorTemperature", json.dumps(dht22), hostname="test.mosquitto.org")
        publish.single("topic/sensorTemperature", json.dumps(arduino), hostname="test.mosquitto.org")
        # Printing JSON objects
        print(dht22)
        print(arduino)
        time.sleep(2)

if __name__ == '__main__':
    try:
        main()
    except KeyboardInterrupt:
        pass
    finally:
        GPIO.cleanup()
Here is my code for subscribing and receiving data from my laptop:
import paho.mqtt.client as mqtt
import json

# This is the subscriber

def on_connect(client, userdata, flags, rc):
    print("Connected with result code " + str(rc))
    client.subscribe("topic/sensorTemperature")

def on_message(client, userdata, msg):
    print(json.loads(msg.payload))  # converting the string back to a JSON object

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("test.mosquitto.org", 1883, 60)
client.loop_forever()
What I want to do is now publish something from my laptop (perhaps in the same code as the subscriber, or in a separate file that will just publish a message to the same topic - "topic/sensorTemperature"). But my question is: how do I also publish and subscribe to messages on my Raspberry Pi (in my first code that I published)? Since I am publishing messages in an infinite loop to my laptop, I will also need an infinite loop to subscribe to the same (or different topic) to receive messages. How do you run two of these loops at once? Will I need two different threads?
Thank you.
As suggested by Sergey, you can use loop_start to create a separate thread for receiving messages.
Here is how your main function will look:
def main():
    # Create a new client for receiving messages
    client = mqtt.Client()
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect(mqttserver)
    client.subscribe(topic)  # or subscribe inside on_connect, so it survives reconnects
    client.loop_start()
    while True:
        # code for publishing
        pass
The easiest way is to start another Python process (similar to your laptop's script) on the Raspberry Pi in parallel, handling the messages received from the laptop.
But if you want to implement everything in one script, you can extend your second code fragment (processing messages) with the implementation of the first fragment (publishing sensor data).
Of course, you can't use loop_forever() in this case. When you call loop_forever(), it never returns until the client calls disconnect(), so you can't process received messages (the main thread is blocked). The Paho client also has the routines loop() and loop_start()/loop_stop() to control the network loop.
Take a look at them:
1) loop() can take a timeout as an argument. It blocks until a new message arrives or the time runs out. In the first case, process the received message and calculate the time until the next publish, then pass that time to loop() again. In the second case, just publish the data and call loop() with the time until the next publish (2 seconds in your example).
2) loop_start()/loop_stop() start and stop a background thread that does the job of sending and receiving (and processing) data for you. Create the client, register the on_message() callback, connect/subscribe, and call loop_start() to start this thread. The main thread is now free: use it for the logic of the first fragment (the loop with the 2-second sleep).
Simply put the code from your subscribing script into the publishing script before while True: and replace loop_forever() with loop_start(). Call loop_stop() when your script is exiting, before GPIO.cleanup().
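The division of labour that loop_start()/loop_stop() gives you can be sketched with a plain thread, with the MQTT calls replaced by stand-ins so no broker is needed to see the structure (the names network_loop and received are illustrative, not part of Paho):

```python
import threading
import time

stop = threading.Event()
received = []

def network_loop():
    # Stand-in for the background thread that client.loop_start() runs:
    # it services the network socket and fires on_message() callbacks.
    while not stop.is_set():
        received.append("simulated incoming message")
        stop.wait(0.05)

net = threading.Thread(target=network_loop)  # client.loop_start() equivalent
net.start()

# The main thread stays free for the publish loop
# (a 2-second sleep in the original, shortened here).
for _ in range(3):
    # publish.single("topic/sensorTemperature", ...) would go here
    time.sleep(0.05)

stop.set()   # client.loop_stop() equivalent
net.join()
```

The key point is that receiving happens on the background thread while the main thread publishes on its own schedule; neither blocks the other.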

python-xbee cannot read frame in thread

I am using one XBee S2 as coordinator (API mode), 3 XBee S2 as routers (AT mode). The routers are connected to Naze32 board (using MSP).
On the computer side, I have a GUI using wxpython to send out command to request data.
The GUI will send out command to XBee (Coordinator) to request data from the routers every second.
I am using python-xbee library to do the send and receive frame job on computer side. Once new data received, it will notify the GUI to update some labels with the new data.
Currently I am able to send and receive frames outside a thread, but once I move the send and receive functions into a thread, it can never read a frame any more. I don't want the serial I/O to stop the GUI or make it unresponsive. Another thing: if I stop the thread and then start a new thread with the XBee again, it no longer works.
The communication is controlled by a button on the GUI; once "Turn on" is clicked, self._serialOn is set to True and a new thread is created; once "Turn off" is clicked, self._serialOn is set to False and the thread is stopped.
How can I fix this problem?
Thanks in advance.
class DataExchange(object):
    def __init__(self):
        self._observers = []
        self._addressList = [['\x00\x13\xA2\x00\x40\xC1\x43\x0F', '\xFF\xFE'], [], []]
        self._serialPort = ''
        self._serialOn = False
        self.workerSerial = None

    # serial switch
    def get_serialOn(self):
        return self._serialOn

    def set_serialOn(self, value):
        self._serialOn = value
        print(self._serialOn)
        if self.serialOn == True:
            EVT_ID_VALUE = wx.NewId()
            self.workerSerial = WorkerSerialThread(self, EVT_ID_VALUE, self.serialPort, self.addressList)
            self.workerSerial.daemon = True
            self.workerSerial.start()
        elif self.serialOn == False:
            self.workerSerial.stop()
            del(self.workerSerial)
            self.workerSerial = None

    serialOn = property(get_serialOn, set_serialOn)

class WorkerSerialThread(threading.Thread):
    def __init__(self, notify_window, id, port, addresslist):
        threading.Thread.__init__(self)
        self.id = id
        self.serialPort = port
        self.addressList = addresslist
        self.counter = 0
        self._notify_window = notify_window
        self.abort = False
        self.sch = SerialCommunication(self.serialPort, self.addressList)
        try:
            self.sch.PreLoadInfo()
        except:
            print('failed')

    def run(self):
        while not self.abort:
            self.counter += 1
            print('Serial Working on ' + str(self.id))
            self.sch.RegularLoadInfo()
            # wx.PostEvent(self._notify_window, DataEvent(self.counter, self.id))
            time.sleep(1)

    def stop(self):
        self.sch.board.stop()
        self.abort = True
This question was finally solved with Python's multiprocessing rather than threading.
The python-xbee manual mentions "... Make sure that updates to external state are thread-safe ...", and the library itself uses threading in its source code, so I guess in this case additional threading causes the problem.
Anyway, with multiprocessing it finally works.

More elegant way to check for an event/trigger periodically?

I have a passive infrared sensor and I wanted to turn off and on my display based on motion. E.g. if there is no motion for 5 minutes, then the display should turn off to save power. However if there is motion don't turn off the display, or turn it back on. (Don't ask why isn't a screensaver good for this. The device I'm making won't have any keyboard or mouse. It only will be a standalone display.)
My idea was to create two threads, a producer, and a consumer. The producer thread (the PIR sensor) puts a message into a queue, which the consumer (which controls the display) reads. This way I can send signals from one to another.
I have fully functional code below (with some explanation) which accomplishes the above. My question: is there a way to achieve this more elegantly? What do you think of my approach; is it okay, or is it hackish?
#!/usr/bin/env python
import Queue
from threading import Thread
import RPi.GPIO as gpio
import time
import os
import sys

class PIRSensor:
    # PIR sensor's states
    current_state = 0
    previous_state = 0

    def __init__(self, pir_pin, timeout):
        # PIR GPIO pin
        self.pir_pin = pir_pin
        # Timeout between motion detections
        self.timeout = timeout

    def setup(self):
        gpio.setmode(gpio.BCM)
        gpio.setup(self.pir_pin, gpio.IN)
        # Wait for the PIR sensor to settle
        # (loop until PIR output is 0)
        while gpio.input(self.pir_pin) == 1:
            self.current_state = 0

    def report_motion(self, queue):
        try:
            self.setup()
            while True:
                self.current_state = gpio.input(self.pir_pin)
                if self.current_state == 1 and self.previous_state == 0:
                    # PIR sensor is triggered
                    queue.put(True)
                    # Record previous state
                    self.previous_state = 1
                elif self.current_state == 1 and self.previous_state == 1:
                    # Feed the queue since there is still motion
                    queue.put(True)
                elif self.current_state == 0 and self.previous_state == 1:
                    # PIR sensor has returned to ready state
                    self.previous_state = 0
                time.sleep(self.timeout)
        except KeyboardInterrupt:
            raise

class DisplayControl:
    # Display's status
    display_on = True

    def __init__(self, timeout):
        self.timeout = timeout

    def turn_off(self):
        # Turn off the display
        if self.display_on:
            os.system("/opt/vc/bin/tvservice -o > /dev/null 2>&1")
            self.display_on = False

    def turn_on(self):
        # Turn on the display
        if not self.display_on:
            os.system("{ /opt/vc/bin/tvservice -p && chvt 9 && chvt 7 ; } > /dev/null 2>&1")
            self.display_on = True

    def check_motion(self, queue):
        try:
            while True:
                try:
                    motion = queue.get(True, self.timeout)
                    if motion:
                        self.turn_on()
                except Queue.Empty:
                    self.turn_off()
        except KeyboardInterrupt:
            raise

if __name__ == "__main__":
    try:
        pir_sensor = PIRSensor(7, 0.25)
        display_control = DisplayControl(300)
        queue = Queue.Queue()
        producer = Thread(target=pir_sensor.report_motion, args=(queue,))
        consumer = Thread(target=display_control.check_motion, args=(queue,))
        producer.daemon = True
        consumer.daemon = True
        producer.start()
        consumer.start()
        while True:
            time.sleep(0.1)
    except KeyboardInterrupt:
        display_control.turn_on()
        # Reset GPIO settings
        gpio.cleanup()
        sys.exit(0)
The producer thread runs a function (report_motion) of a PIRSensor class instance. The PIRSensor class reads the state of a passive infrared sensor four times per second, and whenever it senses motion puts a message into a queue.
The consumer thread runs a function (check_motion) of a DisplayControl class instance. It reads the previously mentioned queue in blocking mode with a given timeout. The following can happen:
- If the display is on and there is no message in the queue for the given time, i.e. the timeout expires, the consumer thread powers off the display.
- If the display is off and a message arrives, the thread powers the display back on.
I think the idea is good. The only question I have about your implementation is: why have both the consumer and producer in child threads? You could keep the consumer in the main thread, and then there'd be no need for this meaningless loop in your main thread:
while True:
time.sleep(0.1)
which just wastes CPU cycles. Instead you could call display_control.check_motion(queue) directly.
I think it is a good solution. The reason being that you have separated the concerns for the different classes. One class handles the PIR sensor. One handles the display. You glue them together by a queue today, that's one approach.
By doing this you can easily test the different classes.
To extend this (read: make it extendable) you might introduce a controller. The controller gets events (e.g. from the queue) and acts on them (e.g. tells the display controller to turn off the display). The controller knows about the sensor and about the display, but the sensor should not know about the display or vice versa. (This is very similar to MVC, where the sensor is the model, the display is the view, and the controller sits in between.)
This approach makes the design testable, extendable, maintainable. And by that you are not hackish, you are writing real code.
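The controller idea can be sketched like this, using Python 3's queue module; Controller and FakeDisplay are illustrative names, with FakeDisplay standing in for DisplayControl to show how the separation makes the classes easy to test:

```python
import queue

class Controller:
    """Glue layer: knows about the sensor's event queue and the display,
    so the sensor and display classes never know about each other."""
    def __init__(self, event_queue, display, timeout):
        self.event_queue = event_queue
        self.display = display
        self.timeout = timeout

    def run_once(self):
        # One iteration: a motion event turns the display on,
        # a timeout with no event turns it off.
        try:
            self.event_queue.get(True, self.timeout)
            self.display.turn_on()
        except queue.Empty:
            self.display.turn_off()

class FakeDisplay:
    # Test double standing in for DisplayControl.
    def __init__(self):
        self.state = None
    def turn_on(self):
        self.state = "on"
    def turn_off(self):
        self.state = "off"

events = queue.Queue()
display = FakeDisplay()
controller = Controller(events, display, timeout=0.05)

events.put(True)       # the sensor reported motion
controller.run_once()
print(display.state)   # "on"

controller.run_once()  # no motion within the timeout
print(display.state)   # "off"
```

In production you would pass a real DisplayControl and call run_once() in a loop; in tests you inject the fake and drive the queue by hand, which is exactly the testability the answer describes.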
