How to design a multithreaded GUI-network application? - python

I'm working on a small utility application in Python.
The networking part is going to send and receive messages.
The GUI is going to display the messages received from the network and provide input for the user to enter messages to be sent.
There's also what I call a storage part, which is going to take all the network messages and save them in some kind of database to support some kind of search later.
My question is: what would be the best way to design this? All of these certainly have to happen in separate threads, but what I'm wondering is how best to pass information between them. What kind of signal passing / thread locking available in Python should I use to avoid possible problems?

One solution, probably the best, is to use Twisted. It integrates with all the major GUI toolkits.

I think that you could use a Queue for passing messages between the GUI and the network threads.
As for GUI and threads in general, you might find the PyGTK and Threading article interesting.
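To make the Queue idea concrete, here is a minimal sketch (not from the original answer) assuming Python 3 and a Tkinter GUI; any toolkit with a timer or idle callback works the same way, and the queue names and the echoing network worker are purely illustrative:

    import queue
    import threading
    import tkinter as tk

    # Thread-safe queues shared between the GUI and the network thread.
    incoming = queue.Queue()   # network -> GUI
    outgoing = queue.Queue()   # GUI -> network

    def network_worker():
        """Stand-in network loop: echoes back whatever the GUI sends."""
        while True:
            msg = outgoing.get()          # blocks in the worker, not the GUI
            incoming.put("echo: " + msg)  # hand the result back to the GUI

    def poll_queue(root, listbox):
        """Drain the incoming queue from the GUI thread via root.after()."""
        try:
            while True:
                listbox.insert(tk.END, incoming.get_nowait())
        except queue.Empty:
            pass
        root.after(100, poll_queue, root, listbox)  # re-check every 100 ms

    root = tk.Tk()
    listbox = tk.Listbox(root)
    listbox.pack()
    entry = tk.Entry(root)
    entry.pack()
    entry.bind("<Return>", lambda e: outgoing.put(entry.get()))

    threading.Thread(target=network_worker, daemon=True).start()
    poll_queue(root, listbox)
    root.mainloop()

The key point is that only the GUI thread ever touches widgets; the worker communicates exclusively through the queues, so no explicit locking is needed.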

Related

PyQt4: Conventional multithreaded UI over Signal/Slot mechanism?

Has anyone worked with multithreaded PyQt4 apps? I was just wondering if the built-in signal/slot mechanism coupled with QThread in the PyQt4 framework has any benefit over standard Python threads (which in my code are designed to handle the UI components in a thread-safe way, of course) using event-driven async callbacks.
I am looking for any major speed or safety concerns, and any specific run-time exceptions or edge cases. (The UI is quite complex, so a rewrite at a later stage would be very counter-productive.)
Thanks.
Edit: I realize this might mean replicating some of the already present PyQt core functionality but it is ok if it allows more flexibility within the app.
There's really no point in using Qt/PyQt if you're not going to use the signal-and-slot mechanism and instead roll your own event loop: you'd basically be reimplementing the core of the framework itself. But I'm guessing this is not what you're asking about.
It would be nice if you could clarify your question a bit (as it stands, I've had to make a few assumptions), but here's the deal:
I think you're a bit confused about what the signal and slot mechanism does. (Or maybe not; forgive me for reiterating some things that may well be obvious to you.)
Signals and slots do not implement threading for you (so the question of signals/slots having any benefit over standard Python threads is moot).
You're probably assuming that the signal-slot mechanism is multithreaded, and that a slot, when called by a signal, executes in a new thread. That is not the case.
The signal and slot mechanism in Qt runs in a single event loop (implemented by QApplication), which itself runs in a single thread.
So signals and slots are not replacements for multi-threading in any way.
If there's a slot that blocks, then it will block your entire application.
So any blocking I/O or time intensive functions should ideally be in a separate thread from the UI, and your slots should start execution of these threads.
Now whether to use QThread or standard Python threads to implement your own threads is another issue, and it's been asked on StackOverflow before, but I tend to use QThreads for Qt apps.
So if you have a button, and you want to start a file download with the Requests library when it's clicked, you'd connect the clicked signal of the QPushButton to a slot, say downloadButtonClicked, and that slot would start a new QThread which takes care of downloading the file using Requests. You can further connect the finished() signal of the QThread to know when the download is complete and update your UI.
(I'll add a code example if this is indeed what you're asking about, so please clarify your question.)
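In the meantime, here is a minimal sketch of the pattern just described, assuming PyQt5 (the PyQt4 equivalents are very close); the Downloader class, the slot name, and the URL are illustrative, and requests stands in for any blocking I/O:

    import requests
    from PyQt5.QtCore import QThread, pyqtSignal
    from PyQt5.QtWidgets import QApplication, QPushButton

    class Downloader(QThread):
        # Custom signal so the worker can hand the result back to the GUI thread.
        done = pyqtSignal(bytes)

        def __init__(self, url):
            super().__init__()
            self.url = url

        def run(self):
            # The blocking I/O happens here, off the GUI thread.
            self.done.emit(requests.get(self.url).content)

    app = QApplication([])
    button = QPushButton("Download")
    worker = Downloader("https://example.com/file")  # illustrative URL

    def download_button_clicked():
        button.setEnabled(False)
        worker.start()  # run() executes in the new thread

    # Signals emitted in the worker are delivered in the GUI thread's event loop.
    worker.done.connect(lambda data: print(len(data), "bytes"))
    worker.finished.connect(lambda: button.setEnabled(True))
    button.clicked.connect(download_button_clicked)
    button.show()
    app.exec_()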
Based on your comment to another reply:
Sorry for the ambiguity, I was talking about the QThread signal/slot mechanism vs. callbacks using built-in Python threads. I intend on creating separate threads from the UI on event arrival (clicks, etc.) and then using callbacks to the main UI thread from the new threads to update the UI (all UI logic in the main thread, with locks to keep it thread-safe). I know this might mean replicating some of the already present PyQt functionality, but I feel this way I would have a lot more control over my app. (The extra work isn't a concern if it allows more flexibility in the app. Plus it isn't that much work.)
I would say that what you are after is to use QApplication.postEvent() from your threads. With a bit of extra code, you can use this to execute arbitrary methods in the main thread either synchronously or asynchronously.
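As a rough, asynchronous-only sketch of that idea (all names here are illustrative, assuming PyQt5):

    from PyQt5.QtCore import QCoreApplication, QEvent, QObject

    # Register a custom event type carrying an arbitrary callable.
    _CALL_EVENT = QEvent.Type(QEvent.registerEventType())

    class _CallEvent(QEvent):
        def __init__(self, fn, *args):
            super().__init__(_CALL_EVENT)
            self.fn, self.args = fn, args

    class MainThreadInvoker(QObject):
        """Lives in the main thread; runs callables posted from worker threads."""
        def event(self, event):
            if event.type() == _CALL_EVENT:
                event.fn(*event.args)  # executes in the main (GUI) thread
                return True
            return super().event(event)

    invoker = MainThreadInvoker()  # must be created in the main thread

    def run_in_main_thread(fn, *args):
        # Safe to call from any thread; Qt queues the event for the main loop.
        QCoreApplication.postEvent(invoker, _CallEvent(fn, *args))

Making it synchronous is a matter of having the worker block on a threading.Event that the posted callable sets when it finishes.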
I'm not sure there are really any advantages or disadvantages to either option (Qt or Python threads). As far as I'm aware, both are still subject to the GIL, meaning your Python code never truly runs in parallel.
Have you considered using multiple processes instead of multiple threads? While slower to start, you get the advantage of actually having your code run in parallel.
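For example, a minimal multiprocessing sketch (illustrative, not from the original answer):

    from multiprocessing import Pool

    def crunch(n):
        # CPU-bound work runs in a separate process, so it isn't limited by the GIL.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        with Pool(processes=4) as pool:           # 4 worker processes
            print(pool.map(crunch, [10**6] * 8))  # truly parallel across cores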
At the end of the day, I think the answer to your question is simply personal preference. Mine would be to avoid QThread, because that makes it easier to port your application to another widget toolkit in the future if PyQt/PySide/Qt ever dies (not that it is very likely, but I've had a bad experience with PyGTK, so now I'm wary).
EDIT: Please also look at this, as it has far better answers than I've given: Threading in a PyQt application: Use Qt threads or Python threads?

Running as many instances of a program as possible

I'm trying to implement some code to import users' data from another service via the service's API. The way I'm going to set it up, all the request jobs will be kept in a queue which my simple importer program will draw from. Handling one task at a time won't come anywhere close to maxing out any of the computer's resources, so I'm wondering: what is the standard way to structure a program to run multiple "jobs" at once? Should I be looking into threading, or possibly a program that pulls the jobs from the queue and launches instances of the importer program? Thanks for the help.
EDIT: What I have right now is in Python although I'm open to rewriting it in another language if need be.
Use a Producer-Consumer queue, with as many Consumer threads as you need to optimize resource usage on the host (sorry - that's very vague advice, but the "right number" is problem-dependent).
If requests are lightweight you may well only need one Producer thread to handle them.
Launching multiple processes could work too - best choice depends on your requirements. Do you need the Producer to know whether the operation worked, or is it 'fire-and-forget'? Do you need retry logic in the event of failure? How do you keep count of concurrent Consumers in this model? And so on.
For Python, take a look at this.
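Here is a minimal sketch of that producer-consumer setup using only the standard library; the job source, the process() stub, and the consumer count are illustrative:

    import queue
    import threading

    jobs = queue.Queue()
    NUM_CONSUMERS = 4  # tune to your host; the "right number" is problem-dependent

    def process(job):
        print("importing", job)  # stand-in for the real import logic

    def consumer():
        while True:
            job = jobs.get()
            if job is None:        # sentinel: no more work
                break
            process(job)
            jobs.task_done()

    workers = [threading.Thread(target=consumer) for _ in range(NUM_CONSUMERS)]
    for w in workers:
        w.start()

    # The producer: pull request jobs from wherever they live and enqueue them.
    for job in ["user1", "user2", "user3"]:  # illustrative job source
        jobs.put(job)

    for _ in workers:          # one sentinel per consumer shuts them down
        jobs.put(None)
    for w in workers:
        w.join()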

best way to write a python socket server

I want to create a Python socket server to send and receive data from an HTML5 client.
What is the best way to do it: a Python socket library, or simple hand-written code?
Thanks
@srgerg's socket documentation is useful, but if you want to handle multiple sockets simultaneously, you'll also need other mechanisms, such as select, epoll, or kqueue (depending upon your platform). (You could also spawn multiple processes using fork, or threads if the Python threading implementation meets your needs, but both of these approaches have enough complications that I'm reluctant to suggest them.)
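For instance, here is a minimal echo server built on the standard library's selectors module, which picks epoll, kqueue, or select for you depending on the platform (the port and echo behaviour are illustrative):

    import selectors
    import socket

    sel = selectors.DefaultSelector()  # epoll/kqueue/select, chosen per platform

    def accept(sock):
        conn, _ = sock.accept()
        conn.setblocking(False)
        sel.register(conn, selectors.EVENT_READ, read)

    def read(conn):
        data = conn.recv(1024)
        if data:
            conn.sendall(data)          # simple echo
        else:
            sel.unregister(conn)        # client closed the connection
            conn.close()

    server = socket.socket()
    server.bind(("localhost", 8000))    # illustrative port
    server.listen()
    server.setblocking(False)
    sel.register(server, selectors.EVENT_READ, accept)

    while True:
        for key, _ in sel.select():
            key.data(key.fileobj)       # dispatch to accept() or read()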
Another approach is to use Twisted to manage your networking via an event loop, similar to using libevent, but I always found Twisted documentation difficult to follow. Maybe you will have better luck than I did.
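For what it's worth, a minimal Twisted echo server looks roughly like this (the port is illustrative):

    from twisted.internet import protocol, reactor

    class Echo(protocol.Protocol):
        def dataReceived(self, data):
            self.transport.write(data)  # echo back whatever arrives

    class EchoFactory(protocol.Factory):
        def buildProtocol(self, addr):
            return Echo()

    reactor.listenTCP(8000, EchoFactory())  # illustrative port
    reactor.run()  # one event loop handles all connections in a single thread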

Python Job Service Daemon?

What packages should I look at for writing a python daemon and processing jobs? Also, what do I need to do for a python daemon?
I'm pretty happy with beanstalkd, which has client libraries available in various languages:
Daemon:
http://kr.github.com/beanstalkd/
Python client library:
http://code.google.com/p/pybeanstalk/
Your question is a bit ambiguous, but I'm assuming you mean you would like to write a python daemon that will process jobs that get thrown in a queue. If not, please say as much. :-)
I've heard a lot of great things about redis. The folks at github built resque as a job-processing daemon for Ruby. If you're flexible about language, you could just use that; if you're not, you could emulate it in as much or as little depth as you like, using redis as your queue system. Depending on how pluggable and extensible you need it to be, this could be a really simple thing to implement.
Another option I ran across after some more googling is redqueue. It looks like it might already implement most of a job queue.
If you're using django, you may wish to consider the Celery project. It's a job queue system based on RabbitMQ, which is yet another queuing server with excellent reviews.
As far as creating a daemon in python, there are a number of options. You can look at this page on activestate, which is a good start. Better yet, you can use python-daemon to do it all for you. But if you use one of the above options or beanstalkd as recommended by mczepiel, you probably won't have to make your process run as a daemon.
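If you do need a daemon, the basic python-daemon usage is roughly this (the job loop is an illustrative stub):

    import time
    import daemon  # the python-daemon package (pip install python-daemon)

    def main():
        # Illustrative job loop; replace with reads from beanstalkd/redis/etc.
        while True:
            time.sleep(10)

    # DaemonContext forks, detaches from the terminal, and redirects std streams.
    with daemon.DaemonContext():
        main()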
I recently (this week) implemented a queue in RabbitMQ with a Python daemon extracting the information and storing it in a database (using the Django ORM). The daemon has an intermediate buffer, so it waits a little and writes to the database in batches instead of writing every time a small message arrives.
I did the integration with the queue using the little flopsy module, which is easy to set up. The only problem I had was setting up a timeout for waiting on a message, as the module has no clear way of doing that. After a while playing with the interactive shell and doing a few dir() calls, I managed to get to the socket object and set up the timeout.
I also considered Celery, but it seems to be more focused on using RabbitMQ internally to let you launch tasks (periodically or asynchronously), rather than on using a queue to communicate with other systems. In our case, the queue can be fed both by Python systems and Ruby ones.
Once I'd completed the process, I made some adjustments to allow running it as a daemon (mostly storing the standard output in a file to allow easy logging) and then created a bash script that launches a start-stop-daemon command. I followed more or less this scheme.
I discovered python-daemon just about one day too late, so after the work was done it made no sense to revisit it, but it may make more sense for a pure Python project.
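The batching-buffer idea from this answer can be sketched with just the standard library; the queue, the batch size, and the save_batch() stub are illustrative stand-ins for the AMQP consumer and the ORM bulk insert:

    import queue
    import threading

    incoming = queue.Queue()  # fed by the message-consumer thread (illustrative)
    BATCH_SIZE = 100
    FLUSH_AFTER = 5.0         # seconds to wait before flushing a partial batch

    def save_batch(batch):
        print("writing", len(batch), "rows")  # stand-in for the ORM bulk insert

    def writer():
        batch = []
        while True:
            try:
                batch.append(incoming.get(timeout=FLUSH_AFTER))
            except queue.Empty:
                if batch:
                    save_batch(batch)  # timed out: flush the partial batch
                    batch = []
                continue
            if len(batch) >= BATCH_SIZE:
                save_batch(batch)      # full batch: one DB round-trip
                batch = []

    threading.Thread(target=writer, daemon=True).start()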

What's the best dispatcher/callback library in Python?

I need to allow other Python applications to register callback functions for events in my application. These need to have priorities associated with them (so a callback with a priority of 10 runs before a callback with a priority of 1) and callback functions need to be able to signal errors to the dispatcher.
These are all lightweight callbacks running in the same process, so I don't need to send signals across process boundaries.
Is there a good Python library to handle this, or do I need to write my own?
Are these other applications running in another address space? If so, you'll need to use an interprocess communication library like D-BUS.
If you're just sending signals in the same process, try PyDispatcher
What platform are you running under? GObject is the basis of the GTK GUI that's widely-used under Linux, and it supports event loops with prioritizable events like this.
Try Twisted for anything network-related. Its perspective broker is quite nice to use.
Try python-callbacks - http://code.google.com/p/python-callbacks/.
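If you do end up writing your own, a priority-ordered, same-process dispatcher is only a few lines; this is a minimal illustrative sketch, not any of the libraries mentioned above:

    import bisect

    class Dispatcher:
        """Tiny roll-your-own dispatcher: priority-ordered, in-process callbacks."""

        def __init__(self):
            self._callbacks = []  # kept sorted, highest priority first

        def register(self, callback, priority=0):
            # Negate priority so bisect keeps higher priorities at the front.
            bisect.insort(self._callbacks, (-priority, id(callback), callback))

        def fire(self, *args, **kwargs):
            errors = []
            for _, _, callback in self._callbacks:
                try:
                    callback(*args, **kwargs)
                except Exception as exc:   # callbacks signal errors by raising
                    errors.append(exc)
            return errors                  # the dispatcher collects them

    dispatch = Dispatcher()
    dispatch.register(lambda msg: print("second:", msg), priority=1)
    dispatch.register(lambda msg: print("first:", msg), priority=10)
    print(dispatch.fire("hello"))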
