Set Socket Options On Python Asyncio Streams API - python

I'm using the asyncio streams API to create a socket connection. Is there a way I can set options such as keepalive, e.g. setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)?
I see from the docs that asyncio.open_connection transfers ownership of the socket over to the StreamWriter and the socket object can be closed afterwards. I'm unclear whether those options get carried over as well.
Am I missing something obvious?
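A minimal sketch of the two approaches that come to mind (assuming a plain TCP connection; untested): pre-configure the socket yourself and hand it to asyncio.open_connection via its sock argument, or fetch the live socket back out of the transport with writer.get_extra_info('socket'):

import asyncio
import socket

async def connect_with_keepalive(host, port):
    # Option 1: configure the socket first, then hand it to asyncio.
    # open_connection() forwards a `sock` argument to loop.create_connection(),
    # which expects an already-connected socket (the connect() here blocks).
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    sock.connect((host, port))
    reader, writer = await asyncio.open_connection(sock=sock)

    # Option 2: let asyncio create the socket, then pull it back out of the
    # transport; setsockopt() on it applies to the live connection.
    # raw = writer.get_extra_info('socket')
    # raw.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)

    return reader, writer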

Related

Local machine interprocess communication with multiple independent processes (1 server, n clients)

I would like to have a server process (preferably Python) that accepts simple messages, and multiple clients (again, preferably Python) that connect to the server and send messages to it. The server and clients will only ever be running on the same local machine and the OS is Linux-based. The server will be automatically started by the OS and the clients started later, independently of the server. I strongly want to avoid installing a whole separate messaging framework/server to do this. The messages will be simple strings such as "kick" or even just a single byte representing the message type. It also needs to know when a connection is made and lost.
From these requirements, I think named pipes would be a feasible solution, with a new instance of that pipe created for each client connection. However, when I search for examples, all of the ones I have come across deal with processes that are spawned from the same parent process and not independently started, which means they can pass a parent reference to the child.
Windows seems to allow multiple instances of a named pipe (one for each client connection), but I'm unsure whether this is possible on a Linux-based OS.
Please could someone point me in the right direction, preferably with a basic example, even if it's just pseudo-code.
I've looked at the multiprocessing module in Python, but this seems to be oriented around the server and client sharing the same process or having one spawn the other.
Edit
Possibly important: the host device is not guaranteed to have networking capabilities (it is an embedded device).
I've used zeromq for this sort of thing before. It's a relatively lightweight library that exposes this sort of functionality.
Otherwise, you could implement it yourself by binding a socket in the server process and having clients connect to it. This works fine with UNIX domain sockets; just pass AF_UNIX when creating the socket, e.g.:
import socket

# Server: bind a UNIX domain socket to a filesystem path, wait for
# a single client connection, then greet it.
with socket.socket(socket.AF_UNIX) as s:
    s.bind('/tmp/srv')
    s.listen(1)
    (c, addr) = s.accept()
    with c:
        c.send(b"hello world")
for the server, and:
# Client: connect to the same path and read the greeting.
with socket.socket(socket.AF_UNIX) as c:
    c.connect('/tmp/srv')
    print(c.recv(8192))
for the client.
Writing a protocol around this is more involved, which is where something like zmq really helps: it lets you easily push JSON messages around, as in the sketch below.
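For illustration, a minimal pyzmq sketch using REQ/REP sockets over the ipc:// transport (a UNIX domain socket under the hood, so no networking stack is needed; the endpoint path and message shape are made up):

import zmq

ctx = zmq.Context()

# Server side: a REP socket bound to an ipc:// endpoint.
server = ctx.socket(zmq.REP)
server.bind("ipc:///tmp/srv")        # hypothetical endpoint path

# Client side (normally a separate process): a REQ socket on the same path.
client = ctx.socket(zmq.REQ)
client.connect("ipc:///tmp/srv")

client.send_json({"type": "kick"})   # structured message, no framing needed
print(server.recv_json())            # -> {'type': 'kick'}
server.send_json({"ok": True})
print(client.recv_json())            # -> {'ok': True}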

Python socket server for multiple connection handling

Can someone suggest a good example of a socket server that can handle multiple connections with threading, in Python? (A live connection, like server-client ping-pong, that will be handled from threads.)
Using the SocketServer module (socketserver in Python 3) you can create a server that handles multiple connections. Using the asynchronous mixins (e.g. ThreadingMixIn) you can start a new thread for each connection. There is a very good example in the Python documentation for that module.
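A minimal sketch of that approach (Python 3 module names; host, port, and the "pong" protocol are made up):

import socketserver

class PingPongHandler(socketserver.StreamRequestHandler):
    # handle() runs in its own thread for every client connection,
    # courtesy of the ThreadingMixIn baked into ThreadingTCPServer.
    def handle(self):
        for line in self.rfile:               # keep the connection alive
            self.wfile.write(b"pong: " + line)

if __name__ == "__main__":
    with socketserver.ThreadingTCPServer(("127.0.0.1", 9999), PingPongHandler) as server:
        server.serve_forever()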

python socket keepalive setting

I am doing asynchronous network programming with Tornado. I've created a socket
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
and put it in a Tornado iostream
iostream = tornado.iostream.IOStream(sock)
I wonder if I can still set the socket's keepalive option after that. Is the setting still valid after the IOStream is created? Thank you in advance.
I would say it's usually best to set any socket options you want before creating the IOStream, but in most cases it's fine to set them afterwards as well (as long as the underlying socket option can be set on a socket that is already connected). As of Tornado 4.0, the only option IOStream touches directly is TCP_NODELAY.
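A sketch of the before-creation variant (untested; the connect call will depend on your Tornado version):

import socket
import tornado.iostream

# Configure keepalive on the raw socket before wrapping it; SO_KEEPALIVE
# can generally also be toggled later on an already-connected socket.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)

stream = tornado.iostream.IOStream(sock)
# ...then stream.connect(...), stream.read_until(...), etc.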

Python "push server" tcp client

I am developing a Python service for XBMC and I am hopelessly stuck. XBMC has a TCP API that communicates via JSON-RPC. XBMC has a server TCP socket that is mainly designed to receive commands and respond, but if something happens in the system it sends a "Notification" over TCP. The problem is that I need to create a TCP client that behaves like a server, so that it is able to receive this "Notification". Whenever I call socket.recv(4096) it waits for data and blocks my code, but I need my code to keep looping. The structure of the code is basically like this:
import xbmc, xbmcgui, xbmcaddon

class XPlayer(xbmc.Player):
    def __init__(self):
        xbmc.Player.__init__(self)

    def onPlayBackStarted(self):
        if xbmc.Player().isPlayingVideo():
            doPlayBackStartedStuff()

player = XPlayer()
doStartupStuff()

while (not xbmc.abortRequested):
    if xbmc.Player().isPlayingVideo():
        doPlayingVideoStuff()
    else:
        doPlayingEverythingElseStuff()
    xbmc.sleep(500)
    # This loop is the most essential part of code

if (xbmc.abortRequested):
    closeEverything()
    xbmc.log('Aborting...')
I tried everything: threading, multiprocessing, blocking, non-blocking, and nothing helped.
Thanks,
You likely want select(), poll() or epoll():
http://docs.python.org/library/select.html
This Python pipe-progress-meter application uses select, as an example:
http://stromberg.dnsalias.org/~strombrg/reblock.html
If you know what sort of delimiters are separating the various portions of the protocol, you may also find this useful, without a need for select or similar:
http://stromberg.dnsalias.org/~strombrg/bufsock.html
It deals pretty gracefully with "read to the next null byte", "read a maximum of 100 bytes", etc.
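To show how select() could fit into the service loop above: with a zero timeout it acts as a pure poll, so the loop never blocks on recv(). A sketch (9090 is assumed to be XBMC's JSON-RPC TCP port; handle_notification is a hypothetical handler):

import select
import socket
import xbmc

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(("127.0.0.1", 9090))    # assumed JSON-RPC TCP port

while not xbmc.abortRequested:
    # A timeout of 0 makes select() return immediately, reporting
    # whether recv() would block.
    readable, _, _ = select.select([sock], [], [], 0)
    if readable:
        data = sock.recv(4096)       # guaranteed not to block now
        handle_notification(data)    # hypothetical handler
    xbmc.sleep(500)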

simultaneously sending/receiving info from a server, in python?

I'm trying to figure out how to make a server that can accept multiple clients at one time. While doing so, I need the client to be able to send and receive data from the server at the same time.
Would I have to make a threaded server, with one thread listening for data and another thread for sending information out to the client?
Then, for the client side, do I need to use threads to send/get info?
Use async IO. There are dozens of async IO socket libs for Python. Here is a brief benchmark.
I also tested gevent, eventlet, asyncore, twisted, pyev, pycurl, and tornado.
Twisted
is stable but the slowest of them, and also not easy to start with.
gevent, eventlet (libevent)
easy to start with and fast (the code looks like blocking code), but they have some issues with forking.
pycurl (libcurl)
fast and easy (if you are OK with doing flag magic... but there are examples), but HTTP only.
pyev (libev)
you must understand what you are doing, almost as if you were polling yourself.
tornado (polling in python)
fast enough, and I think stable, and also easy to start with.
asyncore
really fast... but don't use it; it is ugly-ugly.
Don't use threads in Python unless you really know what you are doing.
Python and threads are not really big friends (at least before version 3.2; 3.2 is supposed to ship a new GIL).
On the server side you clearly need a socket server. This server creates a new thread for every incoming client connection.
Once a connection is established, both the client and the thread that was instantiated for the communication require an additional thread if they have to do other work in parallel besides listening to the socket, assuming the communication is synchronous. If asynchronous communication is what you need, then Python provides an excellent asynchronous socket handler; a sketch of the client-side threading pattern follows below.
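A minimal sketch of that client-side pattern under the synchronous model: a background thread blocks on recv() while the main thread keeps sending (host/port are made up):

import socket
import threading

def receiver(sock):
    # Background thread: blocks on recv() without holding up the sender.
    while True:
        data = sock.recv(4096)
        if not data:             # empty read: the server closed the connection
            break
        print("received:", data)

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(("127.0.0.1", 9999))
threading.Thread(target=receiver, args=(sock,), daemon=True).start()

# The main thread sends independently of the receive loop.
for msg in (b"hello\n", b"world\n"):
    sock.sendall(msg)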
Use an asynchronous socket. An example server can be found here, and the client code here. There is no direct hassle with threads. Depending on your needs, you probably don't need the asynchronous client.
You don't need threads for either the client or the server; you can instead use select() to multiplex all the I/O inside a single thread, for example:
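A sketch of that single-threaded server, assuming a plain TCP echo service (host/port made up): select() watches the listening socket and every accepted connection at once, so one thread serves many clients.

import select
import socket

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("127.0.0.1", 9999))
server.listen(5)

sockets = [server]                    # everything select() watches
while True:
    readable, _, _ = select.select(sockets, [], [])
    for s in readable:
        if s is server:               # the listening socket: a new client
            conn, _ = s.accept()
            sockets.append(conn)
        else:                         # a client socket: data or EOF
            data = s.recv(4096)
            if data:
                s.sendall(data)       # echo back
            else:
                sockets.remove(s)
                s.close()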
