SocketServer module and SQLite - Python

So I have to make a small website for internal use at work (my work is not related to programming). Since our office is about 200 people, I thought I'd use the SocketServer module with an SQLite database and learn some new stuff along the way. From what I see, the only way to do it is to connect to the database at every request. Isn't that expensive? What happens if two people send requests and try to connect to the database at the same time (or close together)? So I have:
Start the server
At every request the server initializes a RequestHandler instance, and then I have to connect to the database?
I am not posting code because it's a general question about how the process works.
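Even so, a minimal sketch may help fix ideas. This assumes a threading TCP server and a made-up "visits" table; on Python 2 the module is spelled SocketServer, on Python 3 socketserver:

import sqlite3
import socketserver  # spelled SocketServer on Python 2

DB_PATH = "site.db"  # hypothetical database file

class RequestHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # One connection per request: for SQLite this is just a local
        # file open, so it is far cheaper than a network handshake.
        conn = sqlite3.connect(DB_PATH)
        try:
            count = conn.execute("SELECT COUNT(*) FROM visits").fetchone()[0]
            self.wfile.write(("visits: %d\n" % count).encode("ascii"))
        finally:
            conn.close()

if __name__ == "__main__":
    # ThreadingTCPServer handles each request in its own thread. SQLite
    # serializes writers with a file lock, so near-simultaneous requests
    # just wait briefly rather than corrupting anything.
    server = socketserver.ThreadingTCPServer(("0.0.0.0", 8000), RequestHandler)
    server.serve_forever()

Opening a new SQLite connection per request is normal practice: it is a local file open rather than a network round-trip, and an office of roughly 200 users is well within SQLite's comfort zone.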

Related

Socket communication with Django

Is it possible to use Django for communication with some kind of server process? For example, on my Django website I want to have a form where I input connection details (host and port), and after connecting I want to send some requests or events to another server process (some simple action like moving a slider or clicking a button). Can I use Python socket programming for this, or is there some easier way?
You can use any Python package with Django, just as in any "normal" Python program. If you have a module that communicates with your server, use it; if not, you will have to write one on your own, possibly with socket programming.
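A minimal sketch of what that could look like, assuming made-up form field names and a simple line-based protocol on the other end:

import socket

from django.http import HttpResponse

def send_command(request):
    # hypothetical form fields posted from the connection-details page
    host = request.POST.get("host", "127.0.0.1")
    port = int(request.POST.get("port", 9000))
    command = request.POST.get("command", "ping")

    # a short-lived connection, opened just for this one request
    sock = socket.create_connection((host, port), timeout=5)
    try:
        sock.sendall(command.encode("utf-8") + b"\n")
        reply = sock.recv(4096)
    finally:
        sock.close()
    return HttpResponse(reply, content_type="text/plain")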

Bottle server not responding while calculating

I have a Bottle server running on port 8080, using the "gevent" server. I use this server to support some simple server-sent events.
My question is probably related to not knowing exactly how my setup is working. I hope someone can take the time to elaborate on this.
All routes and serving of files from the server are working great, but I have an issue when accessing a specific route, "/get_data". This gathers data from the web as well as from some internal data sources, and the gathering takes about 30 minutes. While this process is running, I am not able to access any routes on the server, i.e. "/" or "/login". Once the process is finished, everything works again and the database is updated with the gathered information.
I tried replacing the gathering algorithms by a simple time.sleep(60), and while the timer was active, I was still able to access other routes just fine.
This leads to my two questions:
Why am I not able to access the server while this process is running? Is it the port that is blocked (from reading web information), or maybe it has something to do with threading?
What would be the best way to run a demanding / long process on my server? Preferably I would like to trigger this from my web app, but I have thought about just putting it in a separate Python file and running it locally on the server, in a separate instance of Python. This process is run at most once per day, maybe as seldom as once per week.
This happens because WSGI handles requests/responses synchronously.
You can use gunicorn to run your application, so that it can handle multiple requests concurrently, or you can use the other methods described on the Bottle website:
Primer to Asynchronous Applications
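Besides gunicorn, one plain-Python pattern (my own suggestion, not from the answer above) is to hand the long job to a background thread so the "/get_data" request returns immediately and the worker stays free for other routes. A rough sketch, with gather_data standing in for the real 30-minute job:

import threading

from bottle import Bottle

app = Bottle()
job = {"running": False}

def gather_data():
    # ... the 30-minute crawl and database update would go here ...
    job["running"] = False

@app.route("/get_data")
def get_data():
    if job["running"]:
        return "gathering already in progress\n"
    job["running"] = True
    # the request returns at once; other routes stay usable meanwhile
    threading.Thread(target=gather_data).start()
    return "gathering started\n"

@app.route("/")
def index():
    return "still responsive\n"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)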

Implementing a PubSubHubbub subscriber in Python

I have a bunch of Google Alerts set up as RSS feeds that update in real time. What I want is to be able to store the new data the RSS feeds send out in a database.
After looking around I found that Google and Superfeedr both offer hubs that do most of the work for you; however, they both require a callback URL (obviously). I do have an Apache server running on the machine I'm working on, and it already has Python enabled, so I can run Python scripts on my server. However, at the moment it's only accessible from within my LAN.
My real question is: what do I do next? I know that in PHP you would just have a callback file that handles requests, but I'm lost as to what to do in Python. Would I write a script and give the Google/Superfeedr services a URL to that script? What would be in the script? Any specific imports needed?
Also, I just read that if you use XMPP you don't need a callback URL. How does that work?
For the local LAN problem, the most commonly used solution is a tunneling service like Passageway, which will temporarily expose a local port of your machine to the "outer" web.
Now, as for implementation, it's fairly easy to set things up. Python is similar to PHP in the sense that you'll have to write a script that listens for network connections and then handles the HTTP requests you're getting from Superfeedr or Google. (It looks like you're not familiar with Python; why not stick to PHP, then?)
Finally, XMPP is a feature that only we (Superfeedr) offer. It solves the problem of exposing local ports because it works from behind the firewall.
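For the script itself, a minimal sketch of a subscriber callback in plain WSGI (the hub verifies a subscription with a GET carrying hub.challenge, then pushes updates as POST bodies; the storage step is left as a stub):

from urllib.parse import parse_qs  # urlparse.parse_qs on Python 2
from wsgiref.simple_server import make_server

def application(environ, start_response):
    if environ["REQUEST_METHOD"] == "GET":
        # subscription verification: echo hub.challenge back to the hub
        params = parse_qs(environ.get("QUERY_STRING", ""))
        challenge = params.get("hub.challenge", [""])[0]
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [challenge.encode("utf-8")]

    # content notification: the POSTed body is the updated feed fragment
    length = int(environ.get("CONTENT_LENGTH") or 0)
    body = environ["wsgi.input"].read(length)
    # ... parse the Atom/RSS payload in `body` and store it in your database ...
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]

if __name__ == "__main__":
    make_server("", 8080, application).serve_forever()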

How to write an HTML GUI (Django) for a Python server module

What's the way to go to build an HTML GUI for, e.g., a multiplexed TCP server in Python?
I am familiar with building websites with Django, but the thing I don't understand is: how does the TCP server part communicate with the Django views? How could I implement the data sharing (or am I failing to see the wood for the trees)?
The problem for me is the mapping between the stateless "get and leave" of HTTP and the stateful Python module running as a daemon.
Greetings
Edit: my standalone application skeleton:
#!/usr/bin/python
from django.core.management import setup_environ
import settings
setup_environ(settings)

from myapp.models import fanzy

def main():
    for each in fanzy.objects.all():
        print each.id, each.foo

if __name__ == '__main__':
    main()
Django is just Python, so anything you've written in Python can be imported and referenced in the 'views' that you write for Django to serve back as HTTP responses.
In answer to another part of your question: the way an HTTP server handling TCP connections communicates with the Python framework is most commonly through a protocol called WSGI, whose principles are well worth reading up on (PEP 333 is the canonical specification).
With regards to running a background process and serving up a view of that process's activities, it may be better to keep the two problems separate. You could write data to a file or a database, and then access and serve this data via your web application.
These are just general comments, because your question is not totally clear. Please feel free to respond with further questions.
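A minimal sketch of that separation, reusing the fanzy model from the skeleton above (the view and template names are made up): the daemon process writes rows, and the web app only reads them.

from django.shortcuts import render
from myapp.models import fanzy

def status(request):
    # the daemon process updates fanzy rows; this view just reads them
    rows = fanzy.objects.all()
    return render(request, "status.html", {"rows": rows})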
It's not always as easy as importing the libraries, mostly because of process lifetime. For example, if you run Django through CGI with one request per process, then your TCP server won't stay alive between views. Similarly, if you use multiple processes to handle requests (e.g. using FastCGI), then you'll have several servers running at the same time.
If you want to have permanent network connections alive independent of request lifetimes, you'll need to run the TCP server in an external (daemon) process. This is the standard procedure for some caching schemes, where all your Django processes share cached data via a single daemon (e.g. Redis).
Basically, you have two approaches.
Global connection
Establish a connection per Django process (if you have more than one) as a global object and forward requests to it from your views. This is most convenient if your TCP server is coded to handle multiple requests per connection. However, you'll have problems if your Django process is multi-threaded.
Connection per request
If your TCP server can accept multiple short-lived connections, this is also a viable approach. Just open the connection for the lifetime of your view. If this object is used often enough, you can even add a piece of middleware that opens up the connection and stores it in the request object.
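A minimal sketch of that middleware idea, using the old-style Django middleware hooks and a made-up daemon address:

import socket

DAEMON_ADDR = ("127.0.0.1", 9000)  # hypothetical daemon address

class TcpConnectionMiddleware(object):
    def process_request(self, request):
        # opened once per request; views can use request.daemon_conn
        request.daemon_conn = socket.create_connection(DAEMON_ADDR, timeout=5)

    def process_response(self, request, response):
        # close the per-request connection on the way out
        conn = getattr(request, "daemon_conn", None)
        if conn is not None:
            conn.close()
        return response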

How to use SQLAlchemy in this context

I want to do the following:
Have a piece of software running, written in Python 2.7
This software connects to a database (currently a MySQL database)
This software listens for TCP connections on a port X
When a connection is established, a client requests or commands something; the software then uses the database to store, remove or fetch information (based on the request or command).
What I currently have in mind is the classic approach: connect to the database, store the connection in an object (as a variable) that is passed to the threads spawned by the connection listener, and let those threads use that variable to do what they need with the database connection. (I know that multiprocessing is better than multithreading in Python, but that's not related to my question at this time.)
Now my question: how should I use SQLAlchemy in this context? I am quite confused, even though I have been reading quite a lot of documentation about it; there don't seem to be "good" examples of how to handle this kind of situation specifically, even though I have searched quite a lot.
What is the problem here? SQLAlchemy maintains a thread-local connection pool... what else do you need?
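For instance, a minimal sketch with a made-up DSN and table: create one engine per process and let each listener thread check connections out of the engine's pool, instead of passing a single connection object between threads.

from sqlalchemy import create_engine, text

# one engine for the whole process; it owns a thread-safe connection pool
engine = create_engine("mysql://user:password@localhost/mydb", pool_size=10)

def handle_request(command, payload=None):
    # called from each listener thread
    if command == "store":
        # engine.begin() checks out a connection and commits on success
        with engine.begin() as conn:
            conn.execute(text("INSERT INTO items (name) VALUES (:n)"),
                         {"n": payload})
    elif command == "fetch":
        with engine.connect() as conn:
            return [row[0] for row in
                    conn.execute(text("SELECT name FROM items"))]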
