I'm creating an application in Python using ZeroMQ. I'm using the PUSH/PULL pattern to send the loading status of one script to another. The message received by the PULL script is handled inside a thread. The PULL script looks like this:
import time
from threading import Thread

import zmq

context = zmq.Context()
zmqsocket = context.socket(zmq.PULL)
zmqsocket.bind("tcp://*:5555")

class TaskstatusUpdater(Thread):
    def __init__(self):
        Thread.__init__(self)

    def run(self):
        while True:
            # Wait for next request from client
            task_id = int(zmqsocket.recv_multipart()[0])
            taskcolorstat = int(zmqsocket.recv_multipart()[1])
            taskstatus = zmqsocket.recv_multipart()[2]
            time.sleep(0.1)
            print(task_id, taskstatus, taskcolorstat)

thread = TaskstatusUpdater()
thread.start()
The PUSH part constantly sends updates about the status of the other script. It looks something like this:
import time

import zmq

# zmq - client startup and connecting
try:
    context = zmq.Context()
    print("Connecting to server…")
    zmqsocket = context.socket(zmq.PUSH)
    zmqsocket.connect("tcp://localhost:5555")
    print("successful")
except zmq.ZMQError:
    print("error: could not connect to service")

for i in range(10):
    zmqsocket.send_multipart([b_task_id, b"0", b"first message"])
    time.sleep(3)  # doing stuff
    zmqsocket.send_multipart([b_task_id, b"1", b"second message"])
b_task_id is generated earlier in the program and is simply an integer encoded as bytes. There are multiple of these PUSH scripts running at the same time, and through b_task_id I can tell which script is reporting to the PULL side.
It now often happens that those multipart messages get mixed up with each other. Can somebody explain why that happens and how I can fix it?
For example, sometimes the output is:
2 b'second message' 0
The output that I was expecting is:
2 b'second message' 1
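The cause is that each call to recv_multipart() consumes one complete multipart message, so the three calls in the run() loop pull frames from three different messages; once several PUSH sockets are connected, those messages interleave arbitrarily. A minimal sketch of a receive loop that reads each message exactly once (same socket setup and frame layout as above):

import zmq

context = zmq.Context()
zmqsocket = context.socket(zmq.PULL)
zmqsocket.bind("tcp://*:5555")

while True:
    # recv_multipart() returns all frames of ONE message as a list
    frames = zmqsocket.recv_multipart()
    task_id = int(frames[0])
    taskcolorstat = int(frames[1])
    taskstatus = frames[2]
    print(task_id, taskstatus, taskcolorstat)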
Quite new to Streamlit, but I am trying to create a dashboard that can send messages via ZMQ to my server application, which acts as a subscriber.
import streamlit as st
import zmq

class StreamLitManager(object):
    def __init__(self, log_file_path, zmq_port="8888"):
        self.log_file_path = log_file_path
        self.port = zmq_port
        self.context = None
        self.socket = None

    def InitConnections(self):
        self.context = zmq.Context()
        self.socket = self.context.socket(zmq.PUB)
        self.socket.bind("tcp://*:" + str(self.port))
        # Send Message
        # Close it right after using; this doesn't work as it doesn't close/terminate
        # I tried self.context.close() too, same situation
        self.context.term()

if __name__ == "__main__":
    submitted1 = st.form_submit_button('Submit 1')
    if submitted1:
        sm = StreamLitManager(user_input)
        sm.InitConnections()
So the idea is that whenever the user presses that Submit 1 button, I want to send a message. The problem I'm experiencing is ZMQError: Address in use. My superficial understanding of Streamlit is that whenever some parameter changes, the entire script gets re-run, and that doesn't play well with creating a new ZMQ publisher connection on every click of Submit. Am I doing something wrong, or is there a better design pattern I should be using?
Thanks
PUB/SUB is not a good pattern here, where the socket is constantly being started and stopped, as you will likely lose messages. See here for details: (py)zmq/PUB : Is it possible to call connect() then send() immediately and do not lose the message?
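One way around the rerun problem, as a minimal sketch (the port, the message text, and the get_socket helper are placeholders, not Streamlit or pyzmq API): create the context and socket once per session, stash them in st.session_state so reruns reuse the same connection, and use a connecting PUSH socket so there is nothing to re-bind:

import streamlit as st
import zmq

def get_socket():
    # Create the socket once per session; later reruns reuse it via session_state.
    if "zmq_socket" not in st.session_state:
        context = zmq.Context.instance()
        socket = context.socket(zmq.PUSH)        # PUSH/PULL instead of PUB/SUB
        socket.connect("tcp://localhost:5556")   # connect (not bind): no "Address in use" on rerun
        st.session_state["zmq_socket"] = socket
    return st.session_state["zmq_socket"]

with st.form("controls"):
    submitted1 = st.form_submit_button("Submit 1")
if submitted1:
    get_socket().send_string("submit 1 pressed")

The receiving side would then use a PULL socket that binds, which also sidesteps the PUB slow-joiner message loss mentioned above.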
For a project I need to communicate between C++ and Python via ZMQ over the WLAN network.
If I use my C++ implementation, everything works fine: I just type the IP and port number into client.bind("tcp://...") and I can send messages via WLAN.
If I try the same with the Python Code, it does not work.
So I just tested the python examples (so no C++ anymore): http://zguide.zeromq.org/py:durapub
http://zguide.zeromq.org/py:durasub
I replaced the localhost in the client with the IP of my host computer, but I do not receive any messages. I am using exactly the code from the example, except for that replacement.
Here is the Code:
PUBLISHER:
import zmq
import time

context = zmq.Context()

# Subscriber tells us when it's ready here
sync = context.socket(zmq.PULL)
sync.bind("tcp://*:5564")

# We send updates via this socket
publisher = context.socket(zmq.PUB)
publisher.bind("tcp://*:5565")

# Wait for synchronization request
sync_request = sync.recv()

# Now broadcast exactly 10 updates with pause
for n in range(10):
    msg = "Update %d" % n
    publisher.send_string(msg)
    time.sleep(1)
publisher.send_string("END")

time.sleep(1)  # Give 0MQ time to flush output
SUBSCRIBER:
import zmq

context = zmq.Context()

# Connect our subscriber socket
subscriber = context.socket(zmq.SUB)
subscriber.setsockopt_string(zmq.IDENTITY, "Hello")
subscriber.setsockopt_string(zmq.SUBSCRIBE, "")
subscriber.connect("tcp://192.168.2.119:5565")

# Synchronize with the publisher
sync = context.socket(zmq.PUSH)
sync.connect("tcp://192.168.2.119:5564")
sync.send(b"")

# Get updates, expect random Ctrl-C death
while True:
    data = subscriber.recv_string()
    print(data)
    if data == "END":
        break
It's exactly the example code, except that I changed localhost to the IP address of my publisher in the subscriber code. By the way, I did the same in the C++ example code and it works.
I have two notebooks in Python.
The first one checks whether a file exists in the data lake. I want to return a Boolean from here, plus the filePath if it exists.
The next notebook then uses these params as an input. How is this possible?
Also, could I use an If Condition in the pipeline to check the returned Boolean?
Kind of new to Azure ADF.
One method to pass messages between separate Python scripts or Jupyter notebooks is to use the pyzmq library. Run pairserver in one notebook and pairclient in another, and you will see messages being passed from one to the other. This does add an extra dependency to your code, but pyzmq is a mature package. Note that PAIR sockets are strictly one-to-one; if you ever need more than two endpoints, a pattern such as PUSH/PULL is a better fit.
pairserver.ipynb
#!/usr/bin/python3
import zmq
import time

port = '5556'
context = zmq.Context()
socket = context.socket(zmq.PAIR)
socket.bind('tcp://*:%s' % port)

while True:
    socket.send(b'Server message to client')
    msg = socket.recv()
    print(msg)
    time.sleep(1)
pairclient.ipynb
#!/usr/bin/python3
import zmq
import time

port = '5556'
context = zmq.Context()
socket = context.socket(zmq.PAIR)
socket.connect("tcp://localhost:%s" % port)

while True:
    msg = socket.recv()
    print(msg)
    socket.send_string("client message to server")
    time.sleep(1)
Is it possible to have Python Tornado run a long background process while concurrently serving all of its handlers?
I have a Tornado web app that serves some webpages, but I also have a message queue, and I want Tornado to poll the message queue as a subscriber. Can this be done in Tornado?
I've searched around the user guide, and there seems to be something called PeriodicCallback that I can use within the IOLoop. It sounds like I could use a callback function that reads the message queue. However, is there a way to create a coroutine that never stops?
Any help is appreciated, thanks!
To read from ZeroMQ:
Install the ZeroMQ Python library
Install the IOLoop before application.listen()
Use an executor (for Python 2, you can install the futures backport of the Python 3 library) to run a message-queue listener, which sets up Tornado to listen to the message queue and then invokes callbacks when it receives data.
Example (main.py):
# Import tornado libraries
import tornado.ioloop
import tornado.web
# Import URL mappings
from url import application
# Import zeroMQ libraries
from zmq.eventloop import ioloop
# Import zeroMQ.py functions
from zeroMQ import startListenToMessageQueue
# Import zeroMQ settings
import zeroMQ_settings
# Import our executor
import executors
# Import our db_settings
import db_settings

# main.py is the main access point of the tornado app; to run the application, just run "python main.py".
# It will listen on port 8888, so we can access the app at
# http://localhost:8888 in any browser, or with the python requests library.

if __name__ == "__main__":
    # Install PyZMQ's IOLoop
    ioloop.install()
    # Set the application to listen on port 8888
    application.listen(8888)
    # Get the current IOLoop
    currentIOLoop = tornado.ioloop.IOLoop.current()
    # Execute the ZeroMQ subscriber for our topics.
    # Note: pass the function and its arguments separately; calling
    # startListenToMessageQueue(...) inside submit() would run it immediately
    # and submit its return value (None) instead.
    executors.executor.submit(startListenToMessageQueue,
                              zeroMQ_settings.server_subscriber_ports,
                              zeroMQ_settings.server_subscriber_IP,
                              zeroMQ_settings.server_subscribe_list)
    # Test whether the connection to our database is successful before we start the IOLoop
    db_settings.testForDatabase(db_settings.database)
    # Start the IOLoop
    currentIOLoop.start()
Example (zeroMQ.py):
# Import our executor
import executors
# Import zeroMQ libraries
import zmq
from zmq.eventloop import ioloop, zmqstream
# Import db functions to process the message
import db

# zeroMQ.py deals with the communication with a zero message queue

def startListenToMessageQueue(subscribe_ports, subscribe_IP, subscribe_topic):
    # Usage:
    #   Starts the subscriber for our application. It listens to the address and
    #   ports specified in zeroMQ_settings.py and spawns a callback whenever we
    #   receive anything relevant to our topics.
    # Arguments:
    #   subscribe_ports: list of ports to connect to
    #   subscribe_IP: IP address of the publisher
    #   subscribe_topic: list of topics to subscribe to
    # Return:
    #   None

    # Get zmq context
    context = zmq.Context()
    # Get the context socket
    socket_sub = context.socket(zmq.SUB)
    # Connect to multiple publisher ports
    for port in subscribe_ports:
        socket_sub.connect("tcp://" + str(subscribe_IP) + ":" + str(port))
    # Subscribe to our relevant topics
    for topic in subscribe_topic:
        socket_sub.setsockopt_string(zmq.SUBSCRIBE, topic)
    # Set up a ZMQ stream with our socket
    stream_sub = zmqstream.ZMQStream(socket_sub)
    # When we receive data, process it with a callback
    stream_sub.on_recv(processMessage)
    # Print the information to the console
    print("Connected to publisher with IP: " + str(subscribe_IP) +
          ", Ports: " + str(subscribe_ports) + ", Topics: " + str(subscribe_topic))

def processMessage(message):
    # Usage:
    #   Processes the received data in callback form. on_recv calls this function
    #   and populates the message variable with the data received through the queue.
    # Arguments:
    #   message: the data that we received from the message queue
    # Return:
    #   None

    # Process the message with an executor, using the addData function in our db module
    executors.executor.submit(db.addData, message)
Example (executors.py):
# Import futures library
from concurrent import futures

# executors.py creates our thread pools, which can be shared across different python files
# without re-creating the pools every time we need one.
# We keep a handful of executors for running synchronous tasks.

# Create a 10-thread pool that we can use to call any synchronous/blocking functions
executor = futures.ThreadPoolExecutor(10)
Example (zeroMQ_settings.py):
# zeroMQ_settings.py keeps the settings for zeroMQ, for example the ports, IP,
# and topics that we need to subscribe to

# Ports to subscribe to: 5556 and 5558
server_subscriber_ports = ["5556", "5558"]
# Set IP to localhost
server_subscriber_IP = "localhost"
# Topic to subscribe to: metrics.dat
server_subscriber_topic_metrics = "metrics.dat"
# Topic to subscribe to: test-010
server_subscribe_topics_test_010 = "test-010"
# List of subscriptions
server_subscribe_list = [server_subscriber_topic_metrics, server_subscribe_topics_test_010]
Extra thanks to #dano
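On newer Tornado (5+), which runs on the asyncio event loop, the same idea works without the executor: zmq.asyncio provides awaitable sockets, so the queue listener can be a plain coroutine that never returns, spawned alongside the request handlers. A minimal sketch, where the handler, port, and publisher address are placeholder assumptions:

import asyncio
import zmq
import zmq.asyncio
import tornado.web

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("hello")

async def listen_to_queue():
    # A coroutine that never stops: await each message, handle it, loop again.
    context = zmq.asyncio.Context()
    socket = context.socket(zmq.SUB)
    socket.connect("tcp://localhost:5556")       # placeholder publisher address
    socket.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to everything
    while True:
        message = await socket.recv_string()
        print("queue message:", message)

async def main():
    app = tornado.web.Application([(r"/", MainHandler)])
    app.listen(8888)
    task = asyncio.create_task(listen_to_queue())  # background task next to the handlers
    await asyncio.Event().wait()                   # serve forever

if __name__ == "__main__":
    asyncio.run(main())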
Can someone point me to an example of a non-blocking REQ/REP ZeroMQ (0MQ) setup with Python bindings? Perhaps my understanding of ZMQ is faulty, but I couldn't find an example online.
I have a server in Node.js to which multiple clients send work. The idea is that the server can spin up a bunch of jobs that run in parallel instead of processing data for one client followed by the next.
For this goal you can use either zmq.Poller (you can find many examples in the zguide repo, e.g. rrbroker.py) or the gevent-zeromq implementation (code sample).
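As a rough illustration of the Poller approach (the port number and reply text here are made up), register the REP socket with a poller and poll with a timeout, so the loop never blocks indefinitely on recv():

import zmq

context = zmq.Context()
rep = context.socket(zmq.REP)
rep.bind("tcp://*:5557")  # placeholder port

poller = zmq.Poller()
poller.register(rep, zmq.POLLIN)

while True:
    # Poll with a 100 ms timeout instead of blocking forever on recv()
    events = dict(poller.poll(timeout=100))
    if rep in events:
        msg = rep.recv_string()
        rep.send_string("ack: " + msg)  # REP must reply before the next recv
    else:
        pass  # no request pending; free to do other work here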
The example provided in the accepted answer gives the gist of it, but you can get away with something a bit simpler by using zmq.device for the broker while otherwise sticking to the "Extended Request-Reply" pattern from the guide. As such, a hello-worldy example for the server could look something like the following:
import time
import threading

import zmq

context = zmq.Context()

def worker():
    socket = context.socket(zmq.REP)
    socket.connect('inproc://workers')
    while True:
        msg = socket.recv_string()
        print(f'Received request: [{msg}]')
        time.sleep(1)
        socket.send_string(msg)

url_client = 'tcp://*:5556'
clients = context.socket(zmq.ROUTER)
clients.bind(url_client)
workers = context.socket(zmq.DEALER)
workers.bind('inproc://workers')

for _ in range(4):
    thread = threading.Thread(target=worker)
    thread.start()

zmq.device(zmq.QUEUE, clients, workers)
Here we're letting four workers handle incoming requests in parallel. Now, you're using Node on the client side, but just to keep the example complete, one can use the Python client below to see that this works. Here, we're creating 10 requests which will then be handled in 3 batches:
import zmq
import threading

context = zmq.Context()

def make_request(a):
    socket = context.socket(zmq.REQ)
    socket.connect('tcp://localhost:5556')
    print(f'Sending request {a} ...')
    socket.send_string(str(a))
    message = socket.recv_string()
    print(f'Received reply from request {a} [{message}]')

for a in range(10):
    thread = threading.Thread(target=make_request, args=(a,))
    thread.start()
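As a side note, zmq.device is the legacy API; current pyzmq also offers zmq.proxy(clients, workers), which forwards messages between the ROUTER and DEALER sockets in the same way.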