How can I pool a connection to an XMPP server in Django so that it is available across multiple requests? I don't want to connect and authenticate on every request, which makes things a bit slow. Is this possible?
EDIT:
I am using the xmpppy Python XMPP library.
Since xmpppy has its own main loop, I suggest running it in a separate thread, or even starting it as a separate process. You effectively have two separate applications, the website and the XMPP client, and it is normal to run them separately.
In that case you can use different ways to communicate between the applications: pipes between threads and/or processes, TCP or Unix sockets, a file queue, various AMQP solutions, any persistent storage, even D-Bus, etc. But that is a subject for another question, I think.
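As a very rough sketch of the separate-thread option, assuming xmpppy's Client/connect/auth/Process API and a plain in-process queue for handing outgoing messages from views to the XMPP thread (the server name, credentials and the outbox queue are placeholders):
import queue
import threading
import xmpp  # xmpppy

outbox = queue.Queue()  # Django views put (jid, text) tuples here

def xmpp_worker():
    # Connect and authenticate once, then keep the connection alive.
    client = xmpp.Client('example.com', debug=[])
    client.connect()
    client.auth('bot', 'secret', 'django')
    client.sendInitPresence()
    while True:
        client.Process(1)  # let xmpppy handle incoming stanzas for up to 1s
        try:
            jid, text = outbox.get_nowait()
        except queue.Empty:
            continue
        client.send(xmpp.protocol.Message(jid, text))

threading.Thread(target=xmpp_worker, daemon=True).start()
A view would then just call outbox.put((jid, text)) instead of connecting and authenticating itself.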
Related
I would like to have a server process (preferably Python) that accepts simple messages and multiple clients (again, preferably Python) that connect to the server and send messages to it. The server and clients will only ever be running on the same local machine and the OS is Linux based. The server will be automatically started by the OS and the clients started later independent of the server. I strongly want to avoid installing a whole separate messaging framework/server to do this. The messages will be simple strings such as "kick" or even just a single byte representing the message type. It also needs to know when a connection is made and lost.
From these requirements, I think named pipes would be a feasible solution, with a new instance of that pipe created for each client connection. However, when I search for examples, all of the ones I have come across deal with processes that are spawned from the same parent process and not started independently, which means a parent reference can be passed to the child.
Windows seems to allow multiple instances of a named pipe (one for each client connection), but I'm unsure whether this is possible on a Linux-based OS.
Please could someone point me in the right direction, preferably with a basic example, even if it's just pseudo-code.
I've looked at the multiprocessing module in Python, but this seems to be oriented around the server and client sharing the same process or having one spawn the other.
Edit
This may be important: the host device is not guaranteed to have networking capabilities (embedded device).
I've used ZeroMQ for this sort of thing before. It's a relatively lightweight library that exposes exactly this sort of functionality.
Otherwise, you could implement it yourself by binding a socket in the server process and having clients connect to it. This works fine with Unix domain sockets; just pass AF_UNIX when creating the socket, e.g.:
import socket

# Server: bind a Unix domain socket and greet the first client to connect.
# (The path must not already exist, or bind() will fail.)
with socket.socket(socket.AF_UNIX) as s:
    s.bind('/tmp/srv')
    s.listen(1)
    c, addr = s.accept()
    with c:
        c.send(b"hello world")
for the server, and:
import socket

# Client: connect to the same socket path and print the greeting.
with socket.socket(socket.AF_UNIX) as c:
    c.connect('/tmp/srv')
    print(c.recv(8192))
for the client.
Writing a protocol around this is more involved, which is where something like ZeroMQ really helps: you can easily push JSON messages around.
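As an illustration of the ZeroMQ route, a sketch using pyzmq REQ/REP sockets over an IPC endpoint; the endpoint path and the message contents are invented for the example:
import zmq

# Reply to each JSON request received on the IPC endpoint.
ctx = zmq.Context()
server = ctx.socket(zmq.REP)
server.bind("ipc:///tmp/srv.ipc")
request = server.recv_json()   # e.g. {"type": "kick"}
server.send_json({"ok": True})
for the server, and:
import zmq

# Send a JSON message and wait for the reply.
ctx = zmq.Context()
client = ctx.socket(zmq.REQ)
client.connect("ipc:///tmp/srv.ipc")
client.send_json({"type": "kick"})
print(client.recv_json())
for the client; pyzmq serialises the dicts for you, so no hand-rolled framing is needed.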
I need to develop an application in Python handling a few thousand persistent TCP connections in parallel. Clients connect to the server at startup and send messages (in binary format) from time to time. The server sends binary messages both in reply to clients' messages and asynchronously. Basically, it is a persistent connection initiated by the client because I have no way to reach clients that are behind a NAT.
The question is: which libraries/frameworks should I consider for this task? Spawning a thread for each client is not an option. I'm not aware of a thread pool library for Python. I also recently discovered gevent. Which other options do I have?
This link is an excellent read. It lists the available event-driven and asynchronous network frameworks for Python and includes a good analysis of each framework's performance.
It appears that the Tornado framework is one of the most performant for developing such applications.
Hope this helps.
'greenlets' is a lightweight concurrency package. See http://greenlet.readthedocs.org/en/latest/.
Besides greenlets, you might also want to consider multiprocessing. See http://docs.python.org/2/library/multiprocessing.html.
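Since the question mentions gevent, which is built on greenlets, here is a minimal sketch of a gevent StreamServer that gives each persistent connection its own greenlet; the port and the echo behaviour are placeholders:
from gevent.server import StreamServer

def handle(sock, address):
    # Each connected client runs this function in its own greenlet.
    f = sock.makefile(mode='rb')
    while True:
        line = f.readline()
        if not line:
            break           # client disconnected
        sock.sendall(line)  # echo the message back

# A single process can hold thousands of such connections.
server = StreamServer(('0.0.0.0', 6000), handle)
server.serve_forever()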
Is it possible to have a server thread and a client thread in the same application? The end result will be a binary that uses one thread as a server and another as a client. This means two different threads will be using the same port; is this possible?
I'll be using Python to write this app.
Yes; if you're listening on a port in one thread, you can connect to it on a different thread in the same process.
In Python, this would be achieved using the threading module.
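A minimal sketch of that setup using only the standard library; the port number and the echoed payload are arbitrary:
import socket
import threading

ready = threading.Event()

def server():
    # Server thread: listen on localhost and echo one message back.
    with socket.socket() as s:
        s.bind(('127.0.0.1', 5000))
        s.listen(1)
        ready.set()  # signal that the socket is accepting connections
        conn, _ = s.accept()
        with conn:
            conn.sendall(conn.recv(1024))

threading.Thread(target=server, daemon=True).start()

# Client side (main thread): wait until the server is listening, then connect.
ready.wait()
with socket.socket() as c:
    c.connect(('127.0.0.1', 5000))
    c.sendall(b"ping")
    print(c.recv(1024))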
I'm trying to figure out how to make a server that can accept multiple clients at one time. While doing so, I need the client to be able to send and receive data from the server at the same time.
Would I have to make a threaded server, and have one thread listening for data?
And then another thread for sending information to the client?
Then for the client side, do I need to use threads to send/get info?
Use async IO. There are dozens of async IO socket libraries for Python. Here is a brief benchmark.
I also tested gevent, eventlet, asyncore, twisted, pyev, pycurl, tornado.
Twisted
is stable but the slowest, and also not easy to start with.
gevent, eventlet (libevent)
easy to start with and fast (the code looks like blocking code), but they have some issues with forking.
pycurl (libcurl)
fast and easy (if you are OK with doing some flags magic; there are examples), but HTTP only.
pyev (libev)
you must understand what you are doing; it is almost like doing the polling yourself.
tornado (polling in Python)
fast enough, and I think stable, and also easy to start with (a minimal sketch follows this answer).
asyncore
really fast, but don't use it; it is ugly.
Don't use threads in Python unless you really know what you are doing.
Python and threads are not really big friends (at least before version 3.2; 3.2 introduced a new GIL).
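As promised above, a rough Tornado sketch: a TCPServer that gives each client its own coroutine; the port and the newline-delimited framing are assumptions made for the example:
from tornado.ioloop import IOLoop
from tornado.iostream import StreamClosedError
from tornado.tcpserver import TCPServer

class EchoServer(TCPServer):
    async def handle_stream(self, stream, address):
        # One coroutine per client; reads newline-delimited messages.
        while True:
            try:
                data = await stream.read_until(b"\n")
                await stream.write(data)
            except StreamClosedError:
                break  # client went away

server = EchoServer()
server.listen(8888)
IOLoop.current().start()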
On the server side you clearly need a socket server; it creates a new thread for every incoming client connection.
Once a connection is established, both the client and the thread that was spawned for the communication need an additional thread if they have to do other work in parallel while listening on the socket, assuming the communication is synchronous. If asynchronous communication is what you need, Python also provides an excellent asynchronous socket handler.
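A minimal sketch of the thread-per-connection idea using the standard library's socketserver module (SocketServer on Python 2); the port and the echo handler are placeholders:
import socketserver

class EchoHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Runs in its own thread for each connected client.
        for line in self.rfile:
            self.wfile.write(line)

# ThreadingTCPServer spawns one thread per incoming connection.
server = socketserver.ThreadingTCPServer(('0.0.0.0', 9000), EchoHandler)
server.serve_forever()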
Use an asynchronous socket. An example server can be found here and the client code here. There is no direct hassle with threads. Depending on your needs, you probably don't need the asynchronous client.
You don't need threads for either the client or the server; you can instead use select() to multiplex all the I/O inside a single thread.
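A compact sketch of the single-threaded select() approach; the port and the echo logic are placeholders:
import select
import socket

server = socket.socket()
server.bind(('0.0.0.0', 7000))
server.listen(5)

sockets = [server]  # everything select() should watch
while True:
    readable, _, _ = select.select(sockets, [], [])
    for sock in readable:
        if sock is server:
            conn, _ = sock.accept()   # new client connected
            sockets.append(conn)
        else:
            data = sock.recv(4096)
            if data:
                sock.sendall(data)    # echo back to this client
            else:
                sockets.remove(sock)  # client disconnected
                sock.close()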
What's the way to go to build an HTML GUI for, e.g., a multiplexed TCP server in Python?
I am familiar with building websites with Django, but the thing I don't understand is: how does the TCP server part communicate with the Django views? How could I implement the data sharing (or am I failing to see the wood for the trees)?
The problem for me is the mapping between the stateless "get and leave" requests and the stateful Python module running as a daemon.
greetings
Edit: my standalone application skeleton:
#!/usr/bin/python
from django.core.management import setup_environ
import settings

# Configure Django settings so the ORM can be used outside the web server.
setup_environ(settings)

from myapp.models import fanzy

def main():
    for each in fanzy.objects.all():
        print each.id, each.foo

if __name__ == '__main__':
    main()
Django is just Python, so anything you've written in Python can be imported and referenced in the 'views' that you write for Django to serve back as HTTP responses.
In answer to another part of your question: the way an HTTP server handling TCP connections communicates with the Python framework is most commonly through a protocol called WSGI. This is a good place to learn more about the principles of WSGI. This is another.
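For reference, the WSGI contract itself is tiny: the server calls one Python callable per request. A minimal, framework-free example (Django implements this callable for you):
from wsgiref.simple_server import make_server

def application(environ, start_response):
    # environ describes the request; start_response begins the response.
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'Hello from WSGI']

# Any WSGI-compliant server can host the callable above.
make_server('127.0.0.1', 8080, application).serve_forever()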
With regard to running a background process and serving up a view of that process's activities, it may be better to keep the two problems separate. You could write data to a file or a database and then access and serve this data via your web application.
These are just general comments, because your question is not totally clear. Please feel free to respond with further questions.
It's not always as easy as importing the libraries, mostly because of process lifetime. For example, if you run Django through CGI with one request per process, then your TCP server won't stay alive between views. Similarly, if you use multiple processes to handle requests (e.g. using FastCGI), then you'll have several servers running at the same time.
If you want to keep network connections alive independently of request lifetimes, you'll need to run the TCP server in an external (daemon) process. This is the standard procedure for some caching schemes, where all your Django processes share cached data via a single daemon (e.g. Redis).
Basically, you have two approaches.
Global connection
Either establish a connection per Django process (if you have more than one) as a global object and forward requests to it from your views. This is most convenient if your TCP server is coded to handle multiple requests per connection. However, you'll have problems if your Django process is multi-threaded.
Connection per request
If your TCP server can accept multiple short-lived connections, this is also a viable approach. Just open the connection for the lifetime of your view. If this happens often enough, you can even add a piece of middleware that opens the connection and stores it on the request object.
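A rough sketch of that middleware idea in the classic MIDDLEWARE_CLASSES style; the daemon address and the request.tcp_conn attribute name are invented for the example:
import socket

class TCPConnectionMiddleware(object):
    """Open a connection to the daemon per request and close it afterwards."""

    def process_request(self, request):
        # Hypothetical address of the external TCP daemon.
        request.tcp_conn = socket.create_connection(('127.0.0.1', 9000))

    def process_response(self, request, response):
        if hasattr(request, 'tcp_conn'):
            request.tcp_conn.close()
        return response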