I am creating a socket client and trying to obtain some data. To do so, I need to connect to a web server via a socket; the server then creates another socket which listens and waits for the data, and sends it back to the client.
The problem I have with the code below is that my socket client does not wait for the incoming data from the server and just accepts empty data.
How can I wait for non-empty data from the server using Python sockets?
My code:
import sys
import json
import socketIO_client
import time
host = 'https://SOME_URL'
socketIO = socketIO_client.SocketIO(host, params={"email" : "edmund#gmail.com"})
def on_connect(*args):
    print "socket.io connected"

def on_disconnect(*args):
    print "socketIO disconnected"

socketIO.on('connect', on_connect)
socketIO.on('disconnect', on_disconnect)

def on_response_state(*args):
    print args # Prints ()

socketIO.emit('receive_state', on_response_state)
socketIO.wait_for_callbacks(seconds=3)
Here's an example using the socket module. With s.accept(), the server blocks until a client connects, and only then starts the while loop to receive data. This should help with your problem.
import socket

def receiver():
    PORT = 123
    CHUNK_SIZE = 1024
    data = []  # collected chunks
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(('0.0.0.0', PORT))
    s.listen(1)
    conn, address = s.accept()  # accept() blocks until a new client connects
    while True:
        datachunk = conn.recv(CHUNK_SIZE)  # read data in chunks until recv() returns an empty value
        if not datachunk:
            break  # no more data coming in, so break out of the while loop
        data.append(datachunk)  # add chunk to the already collected data
    conn.close()
    print(data)
    return

receiver()
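For completeness, a minimal sender to exercise this receiver could look like the sketch below. The address and port are assumptions matching the receiver above; note that port 123 may need elevated privileges on some systems, so pick a higher port if needed.

import socket

def sender():
    # connect to the receiver above and send one message
    s = socket.create_connection(('127.0.0.1', 123))
    s.sendall(b'hello receiver')
    s.close()  # closing makes recv() return an empty value on the other side, ending its loop

sender()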
Put the recv call in a loop running on its own thread,
like this:
import time
import threading

def rec(self):
    while True:
        time.sleep(0.01)
        rdata = self.clientsocket.recv(self.buffsize)
        print("rec from server: ", rdata.decode('utf8'), '\n', 'press enter to continue')
        ...

t2 = threading.Thread(target=y.rec, name="rec")
t2.start()
Since you're using the SocketIO library to pass parameters (handled via requests) and want to emit a message, you can wait indefinitely for a response by not specifying a wait time.
with SocketIO(host, params={"email" : "edmund#gmail.com"}) as socketIO:
    def on_response_state(*args):
        print args # Prints ()
    socketIO.emit('receive_state', on_response_state)
    socketIO.wait()
Related: this question is similar to this one, but that was for JavaScript whereas mine is for Python.
How do I send a message to every connected client from the server except a selected client in Python using the sockets library?
I am making a simple game, where I want to detect the first person to press a button among three clients, and then notify the other two clients that they lost while notifying the winner that they won.
Usually, to send information to a client you do (on a new thread):
connected_client.sendall(data)
To receive, you do:
data = socket.recv()
But from what I searched, I couldn't find a way to send data to every connected client except a certain client.
I thought I could get around this by creating an 'identifying name' for each thread that ran the receiving function, but I couldn't find a good way to do this, so I decided to look for a better option.
How can I do this?
Storing the client connections in a list can help. For example...
For the server side:
import socket
import threading

# This is where you store all of your client connections
client_list = []

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_ip = "yourip"
server_port = 8888
server.bind((server_ip, server_port))

def check_client(client_ip):
    while True:
        data = client_ip.recv(1024).decode()
        if "condition" in data:
            for ip in client_list:
                if ip != client_ip:
                    ip.send("something".encode())

def check_connection():
    server.listen()
    while True:
        client_ip, client_address = server.accept()
        client_list.append(client_ip)
        threading.Thread(target=check_client, args=(client_ip,), daemon=True).start()

check_connection()
So what happens is: you call the check_connection function to listen for incoming connections. When a connection arrives, it is appended to the client_list variable, and at the same time a check_client thread is started for it, which watches for any data being sent. If one of your clients sends data containing the string "condition", the server sends the string "something" to every connected client except the one that sent it. Take note that when you send data, it must be in bytes.
For the client side:
import socket
import threading

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_ip = "serverip"
server_port = 8888
server.connect((server_ip, server_port))

def receive_info():
    while True:
        data = server.recv(1024).decode()
        if "something" in data:
            print("Someone already sent something")

threading.Thread(target=receive_info, daemon=True).start()

while True:
    user_input = input("Type 'condition': ")
    server.send(user_input.encode())
All this does is send your input to the server. If your input contains "condition", the server sends "something" to the other clients, but not to you. So you need to set up two more clients to see the results.
Don't forget to set server_ip and server_port's values!
I have code which works perfectly for one connection. I have seen two options for multi-client handling, but I don't really understand them.
Here is the server socket code:
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as listening_sock:
    listening_sock.bind(('', port))
    listening_sock.listen()
    client_soc, client_address = listening_sock.accept()
    client_soc.sendall('200#Welcome to my server!'.encode())
    print(f'Address {client_soc.getsockname()[0]} connected with port {client_soc.getsockname()[1]}')
    while True:
        # get message
        msg = client_soc.recv(1024).decode()
        # receive log print:
        print(f'"{msg}" sent from {client_soc.getsockname()[0]}')
        if 'Quit' in msg:
            client_soc.sendall('200#Thanks for using my server!'.encode())
            client_soc.close()
        elif '0' < msg.split('#')[0] <= '9':  # one of the valid actions
            answer = call_action(msg.split('#')[0], db, msg.split('#')[1])  # the answer for the given parameter
            client_soc.sendall("200#".encode() + answer.encode())
With only one connection it works well, and the last thing I need to add is support for handling multiple clients. What is the shortest and easiest way to do it?
The code only calls accept once. Instead, call accept in a while loop and create a thread for each client connection so they are handled in parallel. Use the following pattern as an example:
import socket
import threading

# Thread to handle each "client_soc" connection
def handler(client_soc):
    ...
    client_soc.close()

with socket.socket() as listening_sock:
    listening_sock.bind(('', 8000))
    listening_sock.listen()
    while True:
        client_soc, client_address = listening_sock.accept()
        # Send each "client_soc" connection as a parameter to a thread.
        threading.Thread(target=handler, args=(client_soc,), daemon=True).start()
There is also a built-in socket server that simplifies this process. Here's a tested example echo server that echoes back newline-terminated data:
from socketserver import ThreadingTCPServer, StreamRequestHandler

class echohandler(StreamRequestHandler):
    def handle(self):
        print(f'Connected: {self.client_address[0]}:{self.client_address[1]}')
        while True:
            # get message
            msg = self.rfile.readline()
            if not msg:
                print(f'Disconnected: {self.client_address[0]}:{self.client_address[1]}')
                break  # exits handler; the framework closes the socket
            print(f'Received: {msg}')
            self.wfile.write(msg)
            self.wfile.flush()

server = ThreadingTCPServer(('', 8000), echohandler)
server.serve_forever()
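To try it out, something along these lines should work as a quick test client (localhost and port 8000 are assumed to match the server above):

import socket

with socket.create_connection(('localhost', 8000)) as conn:
    conn.sendall(b'hello\n')   # newline-terminated, since the handler reads with readline()
    print(conn.recv(1024))     # expect b'hello\n' echoed back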
Your code blocks itself.
For instance: client_soc, client_address = listening_sock.accept()
This accepts a client, and then while True: runs forever, so you can work with only one connection, because socket.accept() is called just once. To solve your problem you should learn one of these: asyncio, threading, multiprocessing. These libraries help your code accept and work with clients concurrently. Sockets can be used with any of them, but they are often paired with asyncio: https://asyncio.readthedocs.io/
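As a rough illustration, here is a minimal asyncio sketch of the same accept-and-serve loop. It is not your full protocol; only the 'Quit' check and the 200# framing are carried over from your code, and port 8000 is just an example.

import asyncio

async def handle_client(reader, writer):
    writer.write('200#Welcome to my server!'.encode())
    await writer.drain()
    while True:
        msg = (await reader.read(1024)).decode()
        if not msg:
            break  # client disconnected
        if 'Quit' in msg:
            writer.write('200#Thanks for using my server!'.encode())
            await writer.drain()
            break
        writer.write(f'200#{msg}'.encode())  # placeholder reply; plug in your own handling here
        await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main():
    # asyncio accepts each client and runs handle_client for it concurrently
    server = await asyncio.start_server(handle_client, '', 8000)
    async with server:
        await server.serve_forever()

asyncio.run(main())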
I want to create a multiprocessing echo server. I am currently using telnet as my client to send messages to my echo server. At the moment I can handle one telnet request and it echoes the response. I initially thought I should initialize the pid whenever I create a socket. Is that correct?
How do I allow several clients to connect to my server using multiprocessing?
#!/usr/bin/env python
import socket
import os
from multiprocessing import Process

def create_socket():
    # Create socket
    sockfd = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Port and host for the socket
    PORT = 8002
    HOST = 'localhost'
    # bind the socket to host and port
    sockfd.bind((HOST, PORT))
    # become a server socket
    sockfd.listen(5)
    start_socket(sockfd)

def start_socket(sockfd):
    while True:
        # Establish and accept connections with the client
        (clientsocket, address) = sockfd.accept()
        # Get the process id.
        process_id = os.getpid()
        print("Process id:", process_id)
        print("Got connection from", address)
        # Receive message from the client
        message = clientsocket.recv(2024)
        print("Server received: " + message.decode('utf-8'))
        reply = ("Server output: " + message.decode('utf-8'))
        if not message:
            print("Client has been disconnected.....")
            break
        # Display messages.
        clientsocket.sendall(str.encode(reply))
        # Close the connection with the client
        clientsocket.close()

if __name__ == '__main__':
    process = Process(target = create_socket)
    process.start()
It's probably a good idea to understand which system calls are blocking and which are not. listen, for example, is non-blocking, while accept is blocking. So basically, you created one process through Process(..) that blocks at accept and, when a connection is made, handles that connection.
Your code should have a structure something like the following (pseudo code):
def handle_connection(accepted_socket):
    # do whatever you want with the socket
    pass

def server():
    # Create socket and listen on it.
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.bind((HOST, PORT))
    sock.listen(5)
    while True:
        new_client, client_address = sock.accept()  # blocks here.
        # unblocked: a client has connected
        client_process = Process(target=handle_connection, args=(new_client,))
        client_process.start()
I must also mention that, while this is a good way to understand how things can be done, it is not a good idea to start a new process for every connection.
The initial part of setting up the server, binding, listening etc (your create_socket) should be in the master process.
Once you accept and get a socket, you should spawn off a separate process to take care of that connection. In other words, your start_socket should be spawned off in a separate process and should loop forever.
I have a Python script with only one socket object that is connected to a Java server.
I started a thread that sends a heartbeat message to the server every 5 seconds,
and another thread that receives messages from the server.
BTW, all the data sent/received is in protobuf format.
# socket_client.py
def recv_handler():
    global client_socket
    while True:
        try:
            # read 4 bytes first
            pack_len = client_socket.recv(4)
            pack_len = struct.unpack('!i', pack_len)[0]
            # read the rest
            recv_data = client_socket.recv(pack_len)
            # decode
            decompressed_data = data_util.decompressMessage(recv_data)
            sc_pb_message = data_util.decodePBMessage(decompressed_data)
            sc_head = data_util.parseHead(sc_pb_message)
        except:
            print 'error'

def heart_handler():
    global client_socket
    while True:
        if client_socket:
            message = data_util.makeMessage('MSG_HEART_BEAT')
            compressed_data = data_util.compressMessage(message)
            send_data = data_util.makeSendData(compressed_data)
            try:
                client_socket.send(send_data)
            except:
                print 'except'
                pass
        time.sleep(5)

def connect(address, port):
    global client_socket
    client_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    client_socket.connect((address, port))
    # thread sending heart beat message
    th = threading.Thread(target = heart_handler)
    th.start()
    # thread recving message
    tr = threading.Thread(target = recv_handler)
    tr.start()
The code above works just fine. The script sends a heartbeat message every 5 seconds, receives messages from the server, and decodes them successfully.
And here comes the part that I do not know how to implement.
My Python script needs to receive input from the browser at the same time, so I started a BaseHTTPServer to handle POST requests from the browser.
When a request comes in, I would like to call the client_socket.send method to send a specific message to the server, and of course I need to return the data from the server back to the browser.
# http_server.py
def do_POST(self):
    # ...
    result = socket_client.request(message)
    self.send_response(200)
    self.end_headers()
    self.wfile.write(...)
And here is what I tried to do in request:
def request(message):
    global client_socket
    client_socket.send(message)
    pack_len = client_socket.recv(4)
    pack_len = struct.unpack('!i', pack_len)[0]
    recv_data = client_socket.recv(pack_len)
    return recv_data
The problem I am having is that the data I receive in the request method, after calling send, seems to get mixed up with the heartbeat data handled in the other thread.
If I comment out the heartbeat thread and the receive thread, then the request method works just fine: the data from the server decodes with no error and is sent back to the browser successfully.
My current approach might be wrong, and I really do not know how to get this to work.
Any advice will be appreciated, thanks :)
The socket object in Python is not thread-safe for this kind of interleaved use; you need to guard access to the shared resource (in this case the client_socket object) with a synchronization primitive such as threading.Lock in Python 2. Check here for a similar problem: Python: Socket and threads?
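As a rough sketch of that idea under your setup (socket_lock and send_packet are assumed names, not part of your code): every thread that writes to the socket acquires the same lock first, so the heartbeat and the HTTP-triggered request can never interleave their writes. Routing the corresponding responses back to the right caller still has to be coordinated with your single receive thread.

import threading

socket_lock = threading.Lock()   # guards every write to the shared client_socket

def send_packet(payload):
    # heart_handler and request() would both call this instead of client_socket.send
    with socket_lock:
        client_socket.sendall(payload)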
In Python, how do I fill a buffer with lines of data (strings) and consume it from a second process? There are ample examples here of adding lines to and reading lines from a string, but I need to remove the consumed line from the string for it to work as a buffer.
Example: read sporadic data from a serial port and send it via TCP/IP to a server. Line-by-line within one loop and no buffering = no problem, but if the destination is unreachable, the data should be stored in the buffer and then sent once a connection is available.
#!/usr/bin/python
import serial
import socket
from multiprocessing import Process

ip = "someURL"
port = 12345
ser = serial.Serial("/dev/ttyUSB0", 57600, timeout=0)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

def serial_reader():
    while True:
        for line in ser.read():
            try:
                response = ser.readlines(None)
                response = str(response)
                message = response[7:]
            except:
                print datetime.datetime.now(), " No data from serial connection."
##

def data_sender():
    s.connect((ip, port))
    while True:
        for line in queue():
            try:
                s.send(message)
            except:
                try:
                    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
                    s.connect((ip, port))
                    continue
                except:
                    s.close()
##

if __name__ == '__main__':
    Process(target=serial_reader).start()
    Process(target=data_sender).start()
I think the best way to achieve what you want is to use a queue:
from multiprocessing import Queue
Specifically, use queue.put() to put a string on the queue and queue.get() to retrieve it. Note that task_done() (to mark an item as processed) exists on Queue.Queue and multiprocessing.JoinableQueue, not on the plain multiprocessing.Queue.
https://docs.python.org/2/library/queue.html#Queue.Queue
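A minimal sketch of that idea under your setup (buffer_q is an illustrative name; ser, ip, and port are taken from your script): the reader process puts each line on the queue, and the sender blocks on get(), retrying the connection until the line can be delivered.

import socket
import time
from multiprocessing import Process, Queue

def serial_reader(q):
    while True:
        line = ser.readline()          # one line from the serial port (pyserial)
        if line:
            q.put(line)                # buffer it; nothing is lost while the link is down

def data_sender(q):
    while True:
        line = q.get()                 # blocks until a buffered line is available
        while True:
            try:
                conn = socket.create_connection((ip, port))
                conn.sendall(line)
                conn.close()
                break                  # delivered; move on to the next line
            except socket.error:
                time.sleep(5)          # destination unreachable; keep the line and retry

if __name__ == '__main__':
    buffer_q = Queue()
    Process(target=serial_reader, args=(buffer_q,)).start()
    Process(target=data_sender, args=(buffer_q,)).start()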
If you need a bigger gun, take a look at RabbitMQ and Python libraries that implement the AMQP protocol, such as rabbitpy. This is the de facto standard for inter-process/inter-service communication and has a lot of useful features already baked in, such as persisting messages in case the processes shut down, load-balancing tasks across multiple processes, etc.