Python "push server" tcp client - python

I am developing a Python service for XBMC and I am hopelessly stuck. XBMC has a TCP API that communicates by JSON-RPC. XBMC has a server TCP socket that is mainly designed to receive commands and respond, but if something happens in the system it sends a "Notification" over TCP. The problem is that I need to create a TCP client that behaves like a server, so that it is able to receive this "Notification". Whenever I call socket.recv(4096) it waits for data and blocks my code, because I need to keep my own loop running. The structure of the code is basically like this:
import xbmc, xbmcgui, xbmcaddon

class XPlayer(xbmc.Player):
    def __init__(self):
        xbmc.Player.__init__(self)

    def onPlayBackStarted(self):
        if xbmc.Player().isPlayingVideo():
            doPlayBackStartedStuff()

player = XPlayer()
doStartupStuff()

while (not xbmc.abortRequested):
    if xbmc.Player().isPlayingVideo():
        doPlayingVideoStuff()
    else:
        doPlayingEverythingElseStuff()
    xbmc.sleep(500)
    # This loop is the most essential part of the code

if (xbmc.abortRequested):
    closeEverything()
    xbmc.log('Aborting...')
I tried everything: threading, multiprocessing, blocking, non-blocking, and nothing helped.
Thanks,

You likely want select(), poll() or epoll():
http://docs.python.org/library/select.html
This Python pipe-progress-meter application uses select, as an example:
http://stromberg.dnsalias.org/~strombrg/reblock.html
If you know what sort of delimiters are separating the various portions of the protocol, you may also find this useful, without a need for select or similar:
http://stromberg.dnsalias.org/~strombrg/bufsock.html
It deals pretty gracefully with "read to the next null byte", "read a maximum of 100 bytes", etc.
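As a minimal sketch of the select() approach, the main XBMC loop above could poll the JSON-RPC socket with a zero timeout on each pass, so recv() is only called when data is already waiting. The host and port are assumptions here (9090 is XBMC's usual JSON-RPC TCP default, but check your setup):

import select
import socket

# Connect to XBMC's JSON-RPC TCP interface.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(('127.0.0.1', 9090))

def poll_notifications(sock):
    # A zero timeout means select() returns immediately, so this never blocks.
    readable, _, _ = select.select([sock], [], [], 0)
    if readable:
        return sock.recv(4096)  # raw JSON-RPC notification bytes, if any
    return None

Calling poll_notifications() once per iteration keeps the loop running at its 500 ms cadence while still picking up notifications as they arrive.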

Related

Local machine interprocess communication with multiple independent processes (1 server, n clients)

I would like to have a server process (preferably Python) that accepts simple messages and multiple clients (again, preferably Python) that connect to the server and send messages to it. The server and clients will only ever be running on the same local machine and the OS is Linux based. The server will be automatically started by the OS and the clients started later independent of the server. I strongly want to avoid installing a whole separate messaging framework/server to do this. The messages will be simple strings such as "kick" or even just a single byte representing the message type. It also needs to know when a connection is made and lost.
From these requirements, I think named pipes would be a feasible solution, with a new instance of that pipe created for each client connection. However, when I search for examples, all of the ones I have come across deal with processes that are spawned from the same parent process and not independently started which means they can pass a parent reference to the child.
Windows seems to allow multiple instances of a named pipe (one for each client connection), but I'm unsure if this is possible on a Linux based OS?
Please could someone point me in the right direction, preferably with a basic example, even if it's just pseudo-code.
I've looked at the multiprocessing module in Python, but this seems to be oriented around the server and client sharing the same process or having one spawn the other.
Edit
It may be important: the host device is not guaranteed to have networking capabilities (it is an embedded device).
I've used zeromq for this sort of thing before. It's a relatively lightweight library that exposes this sort of functionality.
Otherwise, you could implement it yourself by binding a socket in the server process and having clients connect to it. This works fine for Unix domain sockets; just pass AF_UNIX when creating the socket, e.g.:
import socket

with socket.socket(socket.AF_UNIX) as s:
    s.bind('/tmp/srv')
    s.listen(1)
    (c, addr) = s.accept()
    with c:
        c.send(b"hello world")
for the server, and:
with socket.socket(socket.AF_UNIX) as c:
    c.connect('/tmp/srv')
    print(c.recv(8192))
for the client.
Writing a protocol around this is more involved, which is where things like zmq really help: you can easily push JSON messages around.
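For comparison, a minimal sketch of the zeromq route (assuming the pyzmq package; the PUSH/PULL socket types and the ipc path are illustrative choices, not requirements):

import zmq

ctx = zmq.Context()

# Server side: a PULL socket bound to a Unix-domain (ipc) endpoint.
server = ctx.socket(zmq.PULL)
server.bind("ipc:///tmp/msg-server")

# Client side: a PUSH socket that connects and fires simple messages.
client = ctx.socket(zmq.PUSH)
client.connect("ipc:///tmp/msg-server")
client.send_json({"type": "kick"})

print(server.recv_json())  # {'type': 'kick'}

zmq handles the message framing and client reconnection for you, which is the part that gets tedious with raw sockets.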

How to cancel a blocking thread caused by input() in Python?

I'm starting to learn more about TCP protocols in Python and I've been having some trouble with blocking threads inside clients.
Ideally, my application would work like this: I have different clients with thread functions, each one of them containing an input function in order to receive a specific command to send to the server (for example 'X'). When the 'X' is tapped in ONE client, the server receives it and sends a message to all the other clients informing that the program will continue and releasing them from their input functions - almost like cancelling them.
The problem lies on the fact that the input functions are blocking the clients from leaving the loop. I've tried setting the input thread functions as daemon but it blocks until you tap something anyway - which is unfortunately the only workaround that I've found so far.
I would like to use socket and the select module for the connection, without being tied to any particular OS (so no msvcrt, which only works on Windows, and no using select to monitor stdin, which is only available on UNIX-based OSes).
Any help would be greatly appreciated!

Python: how to host a websocket and interact with a serial port without blocking?

I am busy developing a Python system that uses web-sockets to send/receive data from a serial port.
For this to work I need to react to data from the serial port as it is received. The problem is that to detect incoming data, the serial port needs to be queried continuously, most likely in a continuous loop. From previous experience (slow disk access + heavy traffic) using Flask, this sounds like it could cause the web-sockets to be blocked. Will this be the case, or is there a workaround?
I have looked at how NodeJS interacts with serial ports and it seems much nicer. It raises an event when there is incoming data instead of querying it all the time. Is this an option in Python?
Extra Details:
For now it will only be run on Linux (Raspbian).
Flask was my first selection but I am open to other Python frameworks.
pyserial for the serial connection (it is the only option I know of).
Python provides the select module in the stdlib which can do what you want. It does depend on what operating system you are using, though, and since you haven't provided that information I can't be that helpful. However, a simple example under Linux would be:
import os
import select

epoll = select.epoll()

# Do stuff to create the serial connection and websocket connection

epoll.register(websocket_file_descriptor, select.EPOLLIN)
epoll.register(serial_file_descriptor, select.EPOLLIN)

while True:
    events = epoll.poll(1)
    # Do stuff with the events
    for fileno, event in events:
        if fileno == serial_file_descriptor:
            data = os.read(serial_file_descriptor, 4096)  # os.read needs a byte count
            os.write(websocket_file_descriptor, data)
        elif fileno == websocket_file_descriptor:
            data = os.read(websocket_file_descriptor, 4096)
            # Do something with the incoming data
That's a basic, incomplete example, but it should give you an idea of the general process of using a system like epoll.
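The serial_file_descriptor placeholder above can come from pyserial itself, which exposes the underlying file descriptor on POSIX systems. A minimal sketch, with the device path and baud rate as assumptions:

import select
import serial  # pyserial, which the question already plans to use

# The device path and baud rate are illustrative for a Raspbian setup.
ser = serial.Serial('/dev/ttyUSB0', 9600)
serial_file_descriptor = ser.fileno()  # POSIX-only

epoll = select.epoll()
epoll.register(serial_file_descriptor, select.EPOLLIN)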
Simply start a subprocess that listens to the serial socket and raises an event when it has a message. Have a separate sub-process for each web port that does the same.

Can I use the Python SocketServer class for this implementation?

I've come to the realization where I need to change my design for a file synchronization program I am writing.
Currently, my program goes as follows:
1) client connects to server (and is verified)
2) if the client is verified, create a thread and begin a loop using the socket the client connected with
3) if a file on the client or server changes, send the change through that socket (using select for asynchronous communication)
My code sucks because I am torn between using one socket for file transfer and using a socket for each file transfer. Either case (in my opinion) will work, but for the first case I would have to create some sort of protocol to determine what bytes go where (some sort of header), and for the second case I would have to create new sockets on a new thread (that do not need to be verified again), so that files can be sent on each thread without worrying about asynchronous transfer.
I would prefer the second option, so I'm investigating using SocketServer. Would this kind of problem be solved with SocketServer.ThreadingTCPServer and SocketServer.ThreadingMixIn? I'm having trouble thinking about it because I would assume SocketServer.ThreadingMixIn works for newly connected clients, unless I somehow have an "outer" socket server which serves "inner" socket servers?
SocketServer will work for you. You create one SocketServer per port you want to listen on. Your choice is whether you have one listener that handles the client/server connection plus per file connections (you'd need some sort of header to tell the difference) or two listeners that separate client/server connection and per file connections (you'd still need a header so that you knew which file was coming in).
Alternately, you could choose something like zeromq that provides a message transport for you.
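Going back to the SocketServer option, here is a minimal sketch of the ThreadingTCPServer route (the port and the one-header-line-then-payload framing are assumptions for illustration, not part of the question):

import socketserver  # named SocketServer in Python 2

class FileHandler(socketserver.StreamRequestHandler):
    # ThreadingTCPServer runs handle() on its own thread for each connection.
    def handle(self):
        # Hypothetical framing: one header line naming the file, then the bytes.
        name = self.rfile.readline().strip().decode()
        payload = self.rfile.read()
        print("received %d bytes for %r" % (len(payload), name))

server = socketserver.ThreadingTCPServer(("localhost", 9000), FileHandler)
server.serve_forever()

Each per-file connection then gets its own handler thread without you having to manage the threads yourself.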

Socket : 2 way communication in python

I want two-way communication in Python:
I want to bind to a socket that one client can connect to, and then the server and client can "chat" with each other.
I already have the basic listener:
import socket

HOST = ''     # localhost
PORT = 50008
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # create an INET, STREAMing socket
s.bind((HOST, PORT))  # bind to that port
s.listen(1)           # listen for user input and accept 1 connection at a time
conn, addr = s.accept()
print "The connection has been set up"
bool = 1
while bool == 1:
    data = conn.recv(1024)
    print data
    if "#!END!#" in data:
        print "closing the connection"
        s.close()
        bool = 0
What I want to do now is implement something so this script also accepts user input and, after the enter key is hit, sends it back to the client.
But I can't figure out how to do this. Because if I do it like this:
while bool == 1:
    data = conn.recv(1024)
    print data
    u_input = raw_input("input now")
    if u_input != "":
        conn.send(u_input)
        u_input = ""
The problem is that it probably hangs at the user input prompt, so it does not allow my client to send data.
How do I solve this?
I want to keep it in one window; can this be solved with threads?
(I've never used threads in Python.)
Python's sockets have a makefile tool to make this sort of interaction much easier. After creating a socket s, run f = s.makefile(). That will return an object with a file-like interface (so you can use readline, write, writelines and other convenient method calls). The Python standard library itself makes use of this approach (see the source for ftplib and poplib, for example).
To get text from the client and display it on the server console, write a loop with print f.readline().
To get text from the server console and send it to the client, write a loop with f.write(raw_input('+ ') + '\n').
To send and receive at the same time, run those two steps in separate threads:

from threading import Thread

Thread(target=read_client_and_print_to_console).start()
Thread(target=read_server_console_and_send).start()
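A sketch pulling those pieces together on the server side, in the same Python 2 style as the question's code (the function names mirror the thread targets above and are illustrative, not a fixed API):

import socket
from threading import Thread

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('', 50008))
s.listen(1)
conn, addr = s.accept()
f = conn.makefile()  # file-like wrapper around the accepted connection

def read_client_and_print_to_console():
    for line in f:               # same as calling f.readline() in a loop
        print line.rstrip()

def read_server_console_and_send():
    while True:
        f.write(raw_input('+ ') + '\n')
        f.flush()                # push the line out immediately

Thread(target=read_client_and_print_to_console).start()
Thread(target=read_server_console_and_send).start()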
If you prefer async over threads, here are two examples to get you started:
Basic Async HTTP Client
Basic Async Echo Server
The basic problem is that you have two sources of input you're waiting for: the socket and the user. The three main approaches I can think of are to use asynchronous I/O, to use synchronous (blocking) I/O with multiple threads, or to use synchronous I/O with timeouts. The last approach is conceptually the simplest: wait for data on the socket for up to some timeout period, then switch to waiting for the user to enter data to send, then back to the socket, etc.
I know that at a lower level you could implement this relatively easily by treating both the socket and stdin as I/O handles and using select to wait on both of them simultaneously, but I can't recall if that functionality is mapped into Python, or if so, how. That's potentially a very good way of handling this if you can make it work. EDIT: I looked it up, and Python does have a select module, but it sounds like it only functions like this under Unix operating systems; on Windows, it can only accept sockets, not stdin or files.
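On a Unix system, that select-based idea looks roughly like the sketch below (in the Python 2 style of the question; conn is the accepted connection from the listener above):

import select
import sys

while True:
    # Wait until either the client socket or the local console has data.
    readable, _, _ = select.select([conn, sys.stdin], [], [])
    for source in readable:
        if source is conn:
            data = conn.recv(1024)
            print data.rstrip()
        else:
            line = sys.stdin.readline()
            if line.strip():
                conn.send(line)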
Have you checked Twisted, the Python event-driven networking engine and library? Or oidranot, a Python library especially for that, based on the Tornado web server.
