I am working on a server that continuously sends data to a client. This client may also occasionally interact with the server by sending a specific request. I wrote a daemon to do this. Note that this daemon runs in a thread. For now, the script is structured as follows:
import socket
import threading

class MyDaemon(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)
        # Init Stream socket (output)
        self.MainSock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.MainSock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.MainSock.bind(('', 15555))
        self.MainSock.listen(5)
        self.MainSock.setblocking(0)

        # Init Request socket (input)
        self.RequestSock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.RequestSock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.RequestSock.bind(('', 15556))
        self.RequestSock.listen(5)
        self.RequestSock.setblocking(0)
    def run(self):
        self.stream = None
        self.request = None
        while True:
            # Listen for a connection on MainSock
            try:
                self.stream, address = self.MainSock.accept()
            except socket.error:
                pass
            # Listen for a connection on RequestSock
            try:
                self.request, address = self.RequestSock.accept()
            except socket.error:
                pass
            if self.stream:
                send_message_continuously()  # it is a stream
            if self.request:
                recv_a_message_from_client()
                do_whatever_action_the_client_requests()
The problem is that:
- using only the streamer, everything works fine;
- using only the requester, everything works fine;
- using the two sockets at the same time blocks the streamer.
I read that a single thread cannot connect (or be connected) to two sockets at the same time. I also read that the select module may help handle this kind of problem, but I have never used it and I am a little bit lost about how to apply it to my particular case.
What is the most efficient way to handle this problem?
How do I set up select in my particular case?
Wouldn't it be more efficient/simpler to send the stream to one sub-thread and the requests to another?
EDIT: In the end, I used a sub-thread for the stream.
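A minimal sketch of that sub-thread variant, reusing the placeholder helpers from the snippet above (the exact structure is an assumption, not the code that was actually used):

    def run(self):
        while True:
            # Hand each accepted stream connection to its own worker thread,
            # so the request socket keeps being serviced by this loop.
            try:
                stream, address = self.MainSock.accept()
                worker = threading.Thread(target=send_message_continuously, args=(stream,))
                worker.daemon = True
                worker.start()
            except socket.error:
                pass
            try:
                request, address = self.RequestSock.accept()
                recv_a_message_from_client(request)
                do_whatever_action_the_client_requests()
            except socket.error:
                pass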
When using select you have to test which of your two sockets is ready to accept:
    def run(self):
        # (requires "import select" at the top of the module)
        while True:
            ready, _, _ = select.select([self.MainSock, self.RequestSock], [], [])
            for sock in ready:
                if sock is self.MainSock:
                    conn, address = sock.accept()
                    send_message_continuously(conn)
                elif sock is self.RequestSock:
                    conn, address = sock.accept()
                    recv_a_message_from_client(conn)
I suggest you try gevent: the API is simple to understand, and it is a good fit if you want to overcome this problem. There is a section about servers that helps you understand TCP communication and rethink your current solution.
A code snippet:
from gevent.server import StreamServer

def handle(socket, address):
    print('new connection!')

server = StreamServer(('127.0.0.1', 1234), handle)  # creates a new server
server.start()  # start accepting new connections
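A slightly fuller sketch along the same lines, assuming gevent's StreamServer API; the echo handling is only illustrative:

from gevent.server import StreamServer

def handle(sock, address):
    # each connection runs in its own greenlet, so a long-lived stream
    # here does not block other clients
    f = sock.makefile(mode='rb')
    while True:
        line = f.readline()
        if not line:
            break
        sock.sendall(line)  # echo the line back

server = StreamServer(('127.0.0.1', 1234), handle)
server.serve_forever()  # block here and keep accepting connections

serve_forever() is the blocking counterpart of start(): it keeps the process alive and accepting connections.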
Hope you can spend more time on building the application itself instead of writing skeleton plumbing. :)
I have code that works perfectly for one connection. I have seen two options for multi-client handling, but I don't really understand them.
Here is the server socket code:
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as listening_sock:
    listening_sock.bind(('', port))
    listening_sock.listen()
    client_soc, client_address = listening_sock.accept()
    client_soc.sendall('200#Welcome to my server!'.encode())
    print(f'Address {client_soc.getsockname()[0]} connected with port {client_soc.getsockname()[1]}')
    while True:
        # get message
        msg = client_soc.recv(1024).decode()
        # receive log print:
        print(f'"{msg}" sent from {client_soc.getsockname()[0]}')
        if 'Quit' in msg:
            client_soc.sendall('200#Thanks for using my server!'.encode())
            client_soc.close()
        elif '0' < msg.split('#')[0] <= '9':  # one of the valid actions
            answer = call_action(msg.split('#')[0], db, msg.split('#')[1])  # the answer for given parameter
            client_soc.sendall("200#".encode() + answer.encode())
With only one connection it works fine, and the last thing I need to add is an option for multiple-client handling. What is the shortest and easiest way to do it?
The code only calls accept once. Instead, call accept in a while loop and create a thread for each client connection so they are handled in parallel. Use the following pattern as an example:
import socket
import threading

# Thread to handle each "client_soc" connection
def handler(client_soc):
    ...
    client_soc.close()

with socket.socket() as listening_sock:
    listening_sock.bind(('', 8000))
    listening_sock.listen()
    while True:
        client_soc, client_address = listening_sock.accept()
        # Send each "client_soc" connection as a parameter to a thread.
        threading.Thread(target=handler, args=(client_soc,), daemon=True).start()
There is also a built-in socket server that simplifies this process. Here's a tested example echo server that echoes back newline-terminated data:
from socketserver import ThreadingTCPServer, StreamRequestHandler

class echohandler(StreamRequestHandler):
    def handle(self):
        print(f'Connected: {self.client_address[0]}:{self.client_address[1]}')
        while True:
            # get message
            msg = self.rfile.readline()
            if not msg:
                print(f'Disconnected: {self.client_address[0]}:{self.client_address[1]}')
                break  # exits handler, framework closes socket
            print(f'Received: {msg}')
            self.wfile.write(msg)
            self.wfile.flush()

server = ThreadingTCPServer(('', 8000), echohandler)
server.serve_forever()
Your code blocks itself.
For instance: client_soc, client_address = listening_sock.accept()
It accepts one client, and then while True: runs forever, so you can only ever work with one connection, because socket.accept() is called once. To solve your problem you should look at asyncio, threading, or multiprocessing; these libraries let your code accept and serve clients concurrently. Sockets can be used with any of them, but they are most often paired with asyncio: https://asyncio.readthedocs.io/
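To illustrate the asyncio route, a minimal sketch (not a drop-in replacement for the code above; the per-message protocol is reduced to an echo):

import asyncio

async def handle_client(reader, writer):
    # each connected client gets its own coroutine
    writer.write('200#Welcome to my server!'.encode())
    await writer.drain()
    while True:
        msg = (await reader.read(1024)).decode()
        if not msg or 'Quit' in msg:
            break
        writer.write(('200#' + msg).encode())  # echo instead of call_action()
        await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle_client, '', 8000)
    async with server:
        await server.serve_forever()

asyncio.run(main())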
I'm trying to access socket objects from their memory address (e.g. "socket._socketobject object at 0x7f4c39d78b40") and use them in another function at different times. The clients are connected on port 9999, and I want the server to interact with each one at a later stage while keeping the connection up.
def sock_con(host, port):
    host = host
    port = port
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind((host, port))
    sock.listen(5)
    while True:
        client, address = sock.accept()
        print client
        print type(client)
        print "Server (%s, %s) connected" % address
        mongoconn = connectionx('IP_Clients')
        key = {'addresses': '192.168.11.1'}
        data = {'client': str(client), 'addresses': address}
        mongoconn.update(key, data)
        client.settimeout(60)
The next piece of code is in a different module and can be used at any time:
import os, sys
import socket

currentdir = os.path.dirname(os.path.realpath(__file__))
parentdir = os.path.dirname(currentdir)
sys.path.insert(0, parentdir)
from mgodb import connectionx

mongoconn = connectionx('IP_Clients')
x = mongoconn.find_one({'addresses': '192.168.11.1'})
client = eval(x['client'])

def send_stuff(client, addresses, arg1):
    while True:
        try:
            #data = client.recv(size)
            print data
            client.send(arg1)
            return data
        except:
            #raise error('Client disconnected')
            client.close()
            return False

send_stuff(client, x['addresses'], 'test10')
To use sockets later in the same process, just store them when they arrive and look them up later. Something like this:
...
clients = {}
while True:
    client, addr = server.accept()
    clients[addr[0]] = client
So, whether you stop the listening loop, run it in a thread, or run something else in a thread (it doesn't matter), you can get the open socket object from the clients dictionary by the client's IP address:
client = clients.get("192.168.1.1")
But you should take the port into account as well, because two different clients may contact you from the same IP address.
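For example, keying the dictionary by the full (ip, port) pair instead of the IP alone (a small sketch based on the snippet above):

clients = {}
while True:
    client, addr = server.accept()
    clients[addr] = client  # addr is the (ip, port) tuple

# later (e.g. from another thread), look a client up by its exact address;
# the port number here is only an example
client = clients.get(("192.168.1.1", 52431))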
If you want to send an open socket to another process, well, it is doable but not worth the trouble.
You would need to send the socket's file descriptor (socket.fileno()) to another process, which can be done using the Python module sendfds, available on pypi.python.org.
Then, in the receiving process, you would have to construct the socket wrapper object around it manually, or somehow trick the existing _socket.dll/.so and socket.py modules into doing it for you.
That is a lot of work with dubious chances of success. What you should do instead is use the dictionary to store the sockets and create an interface (over a socket, a pipe, or whatever IPC you like) to forward messages to and from the connected sockets you need.
Finally, you do not have to worry about this mess at all, because Python has the asyncore module.
It already stores the sockets in a dictionary and does other useful things for you. The catch is that you need to know what you want to achieve in order to tune the asyncore client handler adequately (correct buffer sizes and so on). But asyncore is elegant, and you can easily mix it with an existing GUI event loop. asyncore and asynchat are often used when creating push servers or instant-messaging-like systems.
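For reference, a minimal asyncore echo server in the spirit of the standard library documentation (addresses and buffer sizes are only illustrative):

import asyncore
import socket

class EchoHandler(asyncore.dispatcher_with_send):
    def handle_read(self):
        data = self.recv(1024)
        if data:
            self.send(data)

class EchoServer(asyncore.dispatcher):
    def __init__(self, host, port):
        asyncore.dispatcher.__init__(self)
        self.create_socket(socket.AF_INET, socket.SOCK_STREAM)
        self.set_reuse_addr()
        self.bind((host, port))
        self.listen(5)

    def handle_accept(self):
        pair = self.accept()
        if pair is not None:
            sock, addr = pair
            EchoHandler(sock)  # asyncore tracks it in its socket map

server = EchoServer('0.0.0.0', 8080)
asyncore.loop()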
I have an instant messaging app, with the client written in Python.
Code that connects to a server:
def connect(self, host, port, name):
    host = str(host)
    port = int(port)
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((host, port))
    s.send('CONNECT:' + name)
    print s.recv(1024)
    return s
Then s is stored in self.socket.
Here is the function that receives TCP data from the server and prints it to the command line.
def chat(self):
    while True:
        data = self.socket.recv(4096)
        if not data:
            pass
        else:
            print data
In my mind it should receive everything the server sends and print it out, but that isn't happening. Does anyone know how to bring it to life?
There is a way to monitor multiple streams with the select function: make a list of all the streams you need to handle (sys.stdin for the user input, plus all the sockets you expect to chat over) and pass it to select.
Check this: https://docs.python.org/2/library/select.html Still, the easiest way to do an asynchronous chat may be with UDP; it works really well for this.
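A sketch of how that suggestion could be applied to the chat() loop above (Unix only, because select() on sys.stdin does not work on Windows):

import select
import sys

def chat(self):
    while True:
        # wait until either the server or the keyboard has data for us
        readable, _, _ = select.select([self.socket, sys.stdin], [], [])
        for source in readable:
            if source is self.socket:
                data = self.socket.recv(4096)
                if not data:
                    return  # server closed the connection
                print data
            else:
                line = sys.stdin.readline()
                if line:
                    self.socket.send(line)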
I'm developing a Flask/gevent WSGIServer web server that needs to communicate (in the background) with a hardware device over two sockets using XML.
One socket is initiated by the client (my application) and I can send XML commands to the device. The device answers on a different port and sends back information that my application has to confirm. So my application has to listen to this second port.
Up until now I have issued a command, opened the second port as a server, waited for a response from the device and closed the second port.
The problem is that it's possible that the device sends multiple responses that I have to confirm. So my solution was to keep the port open and keep responding to incoming requests. However, in the end the device is done sending requests, and my application is still listening (I don't know when the device is done), thereby blocking everything else.
This seemed like a perfect use case for a thread, so that my application launches a listening server in a separate thread. Because I'm already using gevent as a WSGI server for Flask, I can use the greenlets.
The problem is, I have looked for a good example of such a thing, but all I can find is examples of multi-threading handlers for a single socket server. I don't need to handle a lot of connections on the socket server, but I need it launched in a separate thread so it can listen for and handle incoming messages while my main program can keep sending messages.
The second problem I'm running into is that in the server I need to use some methods from my "main" class. Being relatively new to Python, I'm unsure how to structure it in a way that makes that possible.
class Device(object):
    def __init__(self, ...):
        self.clientsocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.serversocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

    def _connect_to_device(self):
        print "OPEN CONNECTION TO DEVICE"
        try:
            self.clientsocket.connect((self.ip, 5100))
        except socket.error as e:
            pass

    def _disconnect_from_device(self):
        print "CLOSE CONNECTION TO DEVICE"
        self.clientsocket.close()

    def deviceaction1(self, ...):
        # the data that is sent is an XML document that depends on the parameters of this method.
        self._connect_to_device()
        self._send_data(XMLdoc)
        self._wait_for_response()
        return True

    def _send_data(self, data):
        print "SEND:"
        print(data)
        self.clientsocket.send(data)

    def _wait_for_response(self):
        print "WAITING FOR REQUESTS FROM DEVICE (CHANNEL 1)"
        self.serversocket.bind(('10.0.0.16', 5102))
        self.serversocket.listen(5)  # listen for answer, maximum 5 connections
        connection, address = self.serversocket.accept()
        # (receiving code left out; the data is of a specific length I can calculate)
        if len(data) > 0:
            self._process_response(data)
        self.serversocket.close()

    def _process_response(self, data):
        print "RECEIVED:"
        print(data)
        # here is some code that processes the incoming data and
        # responds to the device
        # this may or may not result in more incoming data

if __name__ == '__main__':
    machine = Device(ip="10.0.0.240")
    machine.deviceaction1(...)
This is, globally (I left out sensitive information), what I'm doing now. As you can see, everything is sequential.
If anyone can provide an example of a listening server in a separate thread (preferably using greenlets) and a way to communicate from the listening server back to the spawning thread, it would be of great help.
Thanks.
EDIT:
After trying several methods, I decided to use Python's built-in select() function to solve this problem. This worked, so my question regarding the use of threads is no longer relevant. Thanks to the people who provided input for their time and effort.
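For readers looking for that select-based variant, a rough sketch (the timeout, buffer size and helper name are assumptions, not values from the original code):

import select
import socket

def wait_for_responses(serversocket, process_response, timeout=5.0):
    # keep answering incoming device requests until the device has been
    # quiet for `timeout` seconds, instead of blocking forever on accept()
    serversocket.listen(5)
    while True:
        readable, _, _ = select.select([serversocket], [], [], timeout)
        if not readable:
            break  # nothing arrived within the timeout; assume the device is done
        connection, address = serversocket.accept()
        data = connection.recv(4096)
        if data:
            process_response(data)  # e.g. Device._process_response
        connection.close()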
Hope this can provide some help. In the example class below, calling the tenMessageSender function fires up an async thread without blocking the main loop, and _zmqBasedListener then listens on a separate port for as long as that thread is alive. Whatever messages tenMessageSender sends are received by the client, which responds back to the zmq-based listener.
Server Side
import threading
import zmq
import sys

class Example:
    def __init__(self):
        self.context = zmq.Context()
        self.publisher = self.context.socket(zmq.PUB)
        self.publisher.bind('tcp://127.0.0.1:9997')
        self.subscriber = self.context.socket(zmq.SUB)
        self.thread = threading.Thread(target=self._zmqBasedListener)

    def _zmqBasedListener(self):
        self.subscriber.connect('tcp://127.0.0.1:9998')
        self.subscriber.setsockopt(zmq.SUBSCRIBE, "some_key")
        while True:
            message = self.subscriber.recv()
            print message
            sys.exit()

    def tenMessageSender(self):
        self._decideListener()
        for message in range(10):
            self.publisher.send("testid : %d: I am a task" % message)

    def _decideListener(self):
        if not self.thread.is_alive():
            print "STARTING THREAD"
            self.thread.start()
Client
import zmq

context = zmq.Context()
subscriber = context.socket(zmq.SUB)
subscriber.connect('tcp://127.0.0.1:9997')
publisher = context.socket(zmq.PUB)
publisher.bind('tcp://127.0.0.1:9998')
subscriber.setsockopt(zmq.SUBSCRIBE, "testid")
count = 0
print "Listener"
while True:
    message = subscriber.recv()
    print message
    publisher.send('some_key : Message received %d' % count)
    count += 1
Instead of a thread you can use a greenlet, etc.
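For example, with pyzmq's gevent-compatible zmq.green module the same listener can run as a greenlet instead of a thread (this assumes gevent and a green-capable pyzmq are installed):

import gevent
import zmq.green as zmq  # gevent-aware drop-in for the zmq module

context = zmq.Context()
subscriber = context.socket(zmq.SUB)
subscriber.connect('tcp://127.0.0.1:9998')
subscriber.setsockopt(zmq.SUBSCRIBE, "some_key")

def listener():
    while True:
        print subscriber.recv()  # recv() now yields to other greenlets

listener_greenlet = gevent.spawn(listener)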
I have an app X that can run on either of two computers, but on no more than one at once. I have another app Y, written in Python, that given the two possible ip addresses needs to find out which computer is running app X (if any). I've partially solved this by having a UDP service that listens on a port and responds with a 'Hello' whenever it receives some data. The client can try and send data to the app X port on each address and if it gets a response, I know the application is running on that computer.
My code so far looks like this:
def ipaddress(self):
    """Test which side responds on the status port."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.settimeout(5)
        s.sendto("status", (ADDR_A, PORT))
        s.recvfrom(1024)
    except socket.timeout:
        try:
            s.sendto("status", (ADDR_B, PORT))
            s.recvfrom(1024)
        except:
            pass
        else:
            return ADDR_B
    else:
        return ADDR_A
    finally:
        s.close()
    return None
The problem with this function is that it's called periodically, whenever I want to talk to the computer running app X. It always tests ADDR_A first, and if that machine isn't running app X, I have to wait for the socket to time out before trying ADDR_B. Although it doesn't happen often, app X could have switched computers by the time I come around to trying again.
Is there a better way? I'm wondering if it's possible to connect to both computers in parallel and return as soon as one responds? Or should I cache which ip address responded first last time the function was called? How would I code these or other ideas?
Thanks.
EDIT: Here is my revised code using select:
def ipaddress(addr_a, addr_b, timeout=5):
    """Test which side responds on the status port."""
    # Create UDP sockets for each address
    socks = [socket.socket(socket.AF_INET, socket.SOCK_DGRAM),
             socket.socket(socket.AF_INET, socket.SOCK_DGRAM)]
    # Send some data to each socket
    for sock, addr in zip(socks, (addr_a, addr_b)):
        sock.connect(addr)  # do explicit connect so getpeername works
        sock.send("status")
    # Wait for the first to respond if any
    while socks:
        waiting = select.select(socks, [], socks, timeout)[0]
        if waiting:
            for sock in waiting:
                try:
                    data = sock.recv(1024)
                    if data:
                        return sock.getpeername()[0]
                except Exception, e:
                    # Occasionally get [Errno 10054] which means socket isn't really
                    # available, so see if other responds instead...
                    socks.remove(sock)
        else:
            break  # timeout occurred
    return None
You should look at select.select(), which provides exactly the capability you are looking for: querying the two computers in parallel.