Does anyone here have experience implementing pika with Django?
I am running an event-based microservice using Django REST framework, with RabbitMQ as the message bus. I know the default library to use in this case would be Celery, but I am looking for something lighter where I can just implement simple pub/sub on the messages.
Has anyone implemented such a service using pika before?
My question is more about how you spawn pika as a separate process alongside Django. Or is there a more elegant solution out there?
Thanks in advance.
--- UPDATE ---
What we ended up doing was:
For the publisher:
We spawn a separate thread (or several, if you need to publish a high volume per second) that keeps a pika connection alive.
For the subscriber:
We spawn a separate worker process (in a separate container) that has the Django context (set up via django.setup()) and consumes the messages from RabbitMQ. A rough sketch of both pieces follows.
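For reference, a minimal sketch of both pieces, assuming pika 1.x; the host, exchange/queue names, and settings module are illustrative, not our actual configuration:

```python
# publisher.py -- runs inside the Django process; start the thread once,
# e.g. from AppConfig.ready(). The thread owns the connection, since pika
# connections are not thread-safe; views only touch the in-process queue.
import queue
import threading

import pika

outbox = queue.Queue()

def publisher_loop():
    connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
    channel = connection.channel()
    channel.exchange_declare(exchange="events", exchange_type="fanout")
    while True:
        try:
            body = outbox.get(timeout=10)
        except queue.Empty:
            # Service heartbeats while idle so the broker does not
            # drop the long-lived connection.
            connection.process_data_events()
            continue
        channel.basic_publish(exchange="events", routing_key="", body=body)

threading.Thread(target=publisher_loop, daemon=True).start()

def publish(body: bytes):
    outbox.put(body)
```

```python
# consumer.py -- the worker process in its own container. It boots the
# Django context first so the handler can use the ORM. Settings module,
# queue name, and handler body are assumptions.
import os

import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()  # must run before importing any Django models

import pika

def on_message(channel, method, properties, body):
    # ... handle the event with full Django/ORM access here ...
    channel.basic_ack(delivery_tag=method.delivery_tag)

connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
channel = connection.channel()
channel.queue_declare(queue="events")
channel.basic_consume(queue="events", on_message_callback=on_message)
channel.start_consuming()
```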
--- RELATED ---
Alright, so I have a somewhat unique issue that I'm dealing with. I'm writing a small web application in Django and I would like it to interface with RabbitMQ directly on the backend. That is, I have a server task that I'm developing a web interface for, and I want to interact with its API directly through the message queue. I'm struggling with how to spawn a long-running connection to the message queue. To state the obvious, it's expensive to construct and tear down a TCP connection to RabbitMQ for every new request! So, is it possible to create some long-running process that I can use to interact with this AMQP API?
As a note regarding Celery, since I'm sure it will be mentioned: my understanding is that Celery is great at distributing time-intensive tasks to other processes and/or nodes, but it does not expose a long-running AMQP connection to the Django application, so I don't think it will help here. That said, I haven't done much more than the tutorial and read parts of the documentation.
I need to share a queue between two applications on the same machine: one is a Tornado app that occasionally adds messages to the queue, and the other is a Python script run from cron that adds new messages on every iteration. Can anyone suggest a module for this?
(Can this be solved with Redis? I want to avoid using MySQL for this purpose.)
I would use Redis with a list. You can LPUSH an element onto the head and RPOP to remove one from the tail, which gives you FIFO behavior.
See the Redis docs for RPOP and LPUSHX.
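A minimal sketch of that, assuming the redis-py client and a Redis server on localhost (the key name is a placeholder):

```python
import redis  # assumes redis-py and a local Redis server

r = redis.Redis()

# Producer (e.g. the cron script): push onto the head of the list.
r.lpush("shared-queue", "message from cron")

# Consumer (e.g. the Tornado app): pop from the tail, so messages
# come out in FIFO order.
item = r.rpop("shared-queue")
if item is not None:
    print(item.decode())
```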
The purest way I can think of to do this is with IPC. Python has very good support for IPC between two processes when one process spawns the other, but that doesn't apply in your scenario. There are Python modules for IPC such as sysv_ipc and posix_ipc. But if you are going to build your main application in Tornado, why not just have it listen on a ZeroMQ socket for published messages?
Here is a link with more information. You want the Publisher-Subscriber model.
http://zeromq.github.io/pyzmq/eventloop.html#tornado-ioloop
Your cron job will start and publish messages to a ZeroMQ socket. Your already-running application will receive them as a subscriber.
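A rough sketch of that split, assuming pyzmq's ZMQStream integration with the Tornado IOLoop; the endpoint and payloads are placeholders:

```python
# subscriber.py -- inside the long-running Tornado process.
import zmq
from zmq.eventloop.zmqstream import ZMQStream
from tornado.ioloop import IOLoop

ctx = zmq.Context.instance()
sock = ctx.socket(zmq.SUB)
sock.bind("tcp://127.0.0.1:5556")
sock.setsockopt(zmq.SUBSCRIBE, b"")  # subscribe to all topics

def on_recv(frames):
    print("received:", frames)

stream = ZMQStream(sock)
stream.on_recv(on_recv)  # called from the IOLoop, non-blocking
IOLoop.current().start()
```

```python
# publish.py -- the cron job; connects, sends, exits.
import time

import zmq

ctx = zmq.Context.instance()
sock = ctx.socket(zmq.PUB)
sock.connect("tcp://127.0.0.1:5556")
time.sleep(0.1)  # PUB/SUB "slow joiner": give the connection a moment
sock.send(b"message from cron")
```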
Try RabbitMQ for hosting the queue independently of your applications, then access it using pika, which even comes with a Tornado adapter. Just pick the appropriate model (queue/exchange/topic) and the message format you want (strings, JSON, XML, YAML) and you are set.
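For the cron side, a minimal sketch with pika's BlockingConnection, assuming a broker on localhost and a queue name of my choosing:

```python
import json

import pika  # assumes a RabbitMQ broker on localhost

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="shared", durable=True)

# The cron script publishes one JSON message per iteration, then exits.
channel.basic_publish(exchange="", routing_key="shared",
                      body=json.dumps({"source": "cron"}))
connection.close()
```

On the Tornado side, pika's Tornado adapter (pika.adapters.tornado_connection.TornadoConnection in pika 1.x) lets you consume on the IOLoop without blocking it.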
I need to create a Python server that can accept multiple job requests. It should process each job one at a time, but the server must still be able to accept new jobs while a task is being processed.
Does anyone have any suggestions on how to do this?
Thanks
Sure. Create a multiprocessing.Pool, which by default spawns one process per core. Then use the original process to run an HTTP service or something else that accepts jobs via some protocol. The main process listens for new requests and submits them to the pool for asynchronous processing.
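A minimal stdlib-only sketch of that layout; the port and the job function are placeholders:

```python
import multiprocessing
from http.server import BaseHTTPRequestHandler, HTTPServer

def process_job(payload):
    ...  # the actual work; runs in a pool worker, not the server process

class JobHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = self.rfile.read(length)
        pool.apply_async(process_job, (payload,))  # non-blocking hand-off
        self.send_response(202)  # accepted; server keeps taking new jobs
        self.end_headers()

if __name__ == "__main__":
    # Pool() defaults to one worker per core; processes=1 runs jobs
    # strictly one at a time, as the question asks.
    pool = multiprocessing.Pool(processes=1)
    HTTPServer(("", 8000), JobHandler).serve_forever()
```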
Use Twisted. Twisted is an event-driven networking engine that also supports many common network protocols, including SMTP, POP3, IMAP, SSHv2, and DNS.
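For comparison, a sketch of the same server in Twisted (all names are illustrative); a DeferredLock serializes the jobs while the reactor keeps accepting new connections:

```python
from twisted.internet import defer, protocol, reactor, threads

job_lock = defer.DeferredLock()  # runs jobs strictly one at a time

def run_job(data):
    # blocking work happens here, off the reactor thread
    return b"done\n"

class JobProtocol(protocol.Protocol):
    def dataReceived(self, data):
        # Queue the job; new connections are still accepted meanwhile.
        d = job_lock.run(threads.deferToThread, run_job, data)
        d.addCallback(self.transport.write)

factory = protocol.Factory.forProtocol(JobProtocol)
reactor.listenTCP(8000, factory)
reactor.run()
```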
Is there a way to push stdout into a queue broker or to a websocket?
So far I've been unable to find a clear explanation on how to do this.
I have several processes running in parallel and the idea is to create a UI where you can switch from process to process and take a look at what they are doing.
One approach that will work (it is non-blocking and can serve multiple clients) is using Python, Twisted and Autobahn:
Connect one or more ProcessProtocol instances
http://twistedmatrix.com/documents/current/core/howto/process.html
to a Twisted WebSocket server
https://github.com/tavendo/AutobahnPython
https://github.com/tavendo/AutobahnPython/tree/master/examples/websocket/broadcast
https://github.com/tavendo/AutobahnPython/tree/master/examples/wamp/pubsub/simple
Disclosure: I am author of Autobahn.
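A condensed sketch of that wiring, assuming modern autobahn's Twisted classes rather than the exact API from the linked examples; the URL, port, and spawned command are placeholders:

```python
from twisted.internet import protocol, reactor
from autobahn.twisted.websocket import (WebSocketServerFactory,
                                        WebSocketServerProtocol)

class BroadcastFactory(WebSocketServerFactory):
    """Tracks connected clients and fans messages out to all of them."""
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.clients = []

    def broadcast(self, data):
        for client in self.clients:
            client.sendMessage(data, isBinary=True)

class StreamProtocol(WebSocketServerProtocol):
    def onOpen(self):
        self.factory.clients.append(self)

    def onClose(self, wasClean, code, reason):
        if self in self.factory.clients:
            self.factory.clients.remove(self)

class StdoutCapture(protocol.ProcessProtocol):
    """Forwards the child process's stdout chunks to the WebSocket factory."""
    def __init__(self, factory):
        self.factory = factory

    def outReceived(self, data):
        self.factory.broadcast(data)

factory = BroadcastFactory("ws://127.0.0.1:9000")
factory.protocol = StreamProtocol
reactor.listenTCP(9000, factory)

# Replace with the real command you want to watch; env=None makes the
# child inherit the parent's environment.
reactor.spawnProcess(StdoutCapture(factory), "/bin/ping",
                     ["ping", "-c", "5", "127.0.0.1"], env=None)
reactor.run()
```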
So, I'm writing a Python web application using the Twisted web2 framework. There's a library that I need to use (SQLAlchemy, to be specific) that doesn't have asynchronous code. Would it be bad to spawn a thread to handle the request, fetch any data from the DB, and then return a response? I'm afraid that if there were a flood of requests, too many threads would be started and the server would be overwhelmed. Is there something built into Twisted that prevents this from happening (e.g. request throttling)?
See the docs, specifically the thread pool, which lets you control the maximum number of threads that are active at once. Spawning one new thread per request would definitely be the inferior approach!
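A small sketch of that pattern; blocking_query stands in for your synchronous SQLAlchemy code:

```python
from twisted.internet import reactor, threads

# Cap the reactor's thread pool so a flood of requests queues up
# instead of spawning unbounded threads.
reactor.suggestThreadPoolSize(10)

def blocking_query(user_id):
    # ... synchronous SQLAlchemy session work goes here ...
    return {"id": user_id}

def handle_request(user_id):
    # deferToThread runs the call on the bounded pool; excess requests
    # wait in the pool's queue rather than creating new threads.
    return threads.deferToThread(blocking_query, user_id)
```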