Twisted: Advice needed on using the txredisapi library - python

Below is a code example which simply responds to an HTTP GET request with data from Redis:
Request: http://example.com:8888/?auth=zefDWDd5mS7mcbfoDbDDf4eVAKb1nlDmzLwcmhDOeUc
Response: get: u'"True"'
The purpose of this code is to serve as a REST server (that's why I'm using lazyConnectionPool) responding to requests and using data from Redis (read/write).
What I need to do:
1. Run multiple requests to Redis inside render_GET of the IndexHandler (like GET, HMGET, SET, etc.)
2. Run multiple requests in a transaction inside render_GET of the IndexHandler
I've tried multiple ways to do this (including the examples from the txredisapi library), but due to lack of experience I haven't managed it. Could you please advise on questions 1) and 2)?
Thanks in advance.
import txredisapi as redis

from twisted.application import internet
from twisted.application import service
from twisted.web import server
from twisted.web.resource import Resource


class Root(Resource):
    isLeaf = False


class BaseHandler(object):
    isLeaf = True

    def __init__(self, db):
        self.db = db
        Resource.__init__(self)


class IndexHandler(BaseHandler, Resource):
    def _success(self, value, request, message):
        request.write(message % repr(value))
        request.finish()

    def _failure(self, error, request, message):
        request.write(message % str(error))
        request.finish()

    def render_GET(self, request):
        try:
            auth = request.args["auth"][0]
        except:
            request.setResponseCode(404, "not found")
            return ""

        d = self.db.hget(auth, 'user_add')
        d.addCallback(self._success, request, "get: %s\n")
        d.addErrback(self._failure, request, "get failed: %s\n")
        return server.NOT_DONE_YET


# Redis connection parameters
REDIS_HOST = '10.10.0.110'
REDIS_PORT = 6379
REDIS_DB = 1
REDIS_POOL_SIZE = 1
REDIS_RECONNECT = True

# redis connection
_db = redis.lazyConnectionPool(REDIS_HOST, REDIS_PORT, REDIS_DB, REDIS_POOL_SIZE)

# http resources
root = Root()
root.putChild("", IndexHandler(_db))

application = service.Application("web")
srv = internet.TCPServer(8888, server.Site(root), interface="127.0.0.1")
srv.setServiceParent(application)

Regarding the first question:
There are a few ways to make multiple database requests within a single HTTP request.
For example, you can issue several requests at once:
d1 = self.db.hget(auth, 'user_add')
d2 = self.db.get('foo')
Then you can get a callback to trigger when all of these simultaneous requests are finished (see twisted.internet.defer.DeferredList).
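A minimal sketch of that approach inside IndexHandler.render_GET (the second key, 'foo', and the response text are just placeholders for whatever you actually need):

from twisted.internet.defer import DeferredList
from twisted.web import server

class IndexHandler(BaseHandler, Resource):
    def render_GET(self, request):
        auth = request.args["auth"][0]           # as in the original handler
        d1 = self.db.hget(auth, 'user_add')      # both commands go out at once;
        d2 = self.db.get('foo')                  # neither blocks the reactor
        dl = DeferredList([d1, d2], consumeErrors=True)

        def done(results):
            # results is a list of (success, value) pairs, in the same order
            # as the Deferreds passed to DeferredList.
            for ok, value in results:
                request.write("%s: %r\n" % ("get" if ok else "failed", value))
            request.finish()

        dl.addCallback(done)
        return server.NOT_DONE_YET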
Or you can use inlineCallbacks if you need sequential requests. For example:
@defer.inlineCallbacks
def do_redis(self):
    foo = yield self.db.get('somekey')
    bar = yield self.db.hget(foo, 'bar')  # Get 'bar' field of hash foo
But you will need to read more about combining inlineCallbacks with twisted.web (there are SO questions on that topic you should look up).
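One common shape for that (just a sketch against the handler above; the dependent second lookup is made up for illustration) is to keep render_GET thin and push the sequential Redis work into an inlineCallbacks helper:

from twisted.internet import defer
from twisted.web import server

class IndexHandler(BaseHandler, Resource):

    def render_GET(self, request):
        auth = request.args["auth"][0]
        d = self._process(request, auth)
        d.addErrback(self._failure, request, "get failed: %s\n")
        return server.NOT_DONE_YET

    @defer.inlineCallbacks
    def _process(self, request, auth):
        # Each yield waits for the previous reply before sending the next
        # command, without ever blocking the reactor.
        user_add = yield self.db.hget(auth, 'user_add')
        extra = yield self.db.get(user_add)      # hypothetical dependent lookup
        request.write("get: %r, %r\n" % (user_add, extra))
        request.finish()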
Regarding the second question:
Transactions are really awkward to do without inlineCallbacks. There is an example on the txredisapi homepage that shows them using inlineCallbacks.
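Roughly, that example boils down to something like this (a sketch based on the multi()/commit() transaction API shown in the txredisapi README; the keys and commands are illustrative, and note that a transaction must run on a single connection, so check how your pool handles that):

from twisted.internet import defer

@defer.inlineCallbacks
def update_user(db, auth):
    # multi() returns a transaction object; the queued commands are only
    # executed on the Redis server when commit() is called.
    t = yield db.multi()
    yield t.hset(auth, 'user_add', 'True')
    yield t.incr('request_count')
    replies = yield t.commit()     # list of replies, one per queued command
    defer.returnValue(replies)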

How can I use Twisted's ThrottlingFactory with their web client?

I need to execute HTTP requests while simulating high latency. The Twisted package in Python includes both an HTTP client and a ThrottlingFactory, but the documentation is not clear to a newcomer, and I am having trouble understanding how to apply the ThrottlingFactory to API calls made with the HTTP client.
I am currently using the following example code to test things out. Nothing has worked so far.
from sys import argv
from pprint import pformat

from twisted.internet.task import react
from twisted.web.client import Agent, readBody
from twisted.web.http_headers import Headers


def cbRequest(response):
    print("Response version:", response.version)
    print("Response code:", response.code)
    print("Response phrase:", response.phrase)
    print("Response headers:")
    print(pformat(list(response.headers.getAllRawHeaders())))
    d = readBody(response)
    d.addCallback(cbBody)
    return d


def cbBody(body):
    print("Response body:")
    print(body)


def main(reactor, url=b"http://httpbin.org/get"):
    agent = Agent(reactor)
    d = agent.request(
        b"GET", url, Headers({"User-Agent": ["Twisted Web Client Example"]}), None
    )
    d.addCallback(cbRequest)
    return d


react(main, argv[1:])
How can I use the ThrottlingFactory in this example?
You're right - this composition is awkward, and it should be better documented, and arguably have a nicer API!
Still, you can accomplish this by putting a proxy between your application and the reactor.
It would look like this:
from sys import argv
from pprint import pformat
from dataclasses import dataclass

from twisted.internet.task import react
from twisted.internet.interfaces import IReactorTCP
from twisted.web.client import Agent, readBody
from twisted.web.http_headers import Headers
from twisted.protocols.policies import ThrottlingFactory


def cbRequest(response):
    print("Response version:", response.version)
    print("Response code:", response.code)
    print("Response phrase:", response.phrase)
    print("Response headers:")
    print(pformat(list(response.headers.getAllRawHeaders())))
    d = readBody(response)
    d.addCallback(cbBody)
    return d


def cbBody(body):
    print("Response body:")
    print(len(body))


@dataclass
class SlowReactorProxy:
    original: IReactorTCP

    def __getattr__(self, name):
        return getattr(self.original, name)

    def connectTCP(self, host, port, factory, timeout=30, bindAddress=None):
        return self.original.connectTCP(
            host, port, ThrottlingFactory(factory, readLimit=0.1), timeout, bindAddress
        )


def main(reactor, url=b"http://httpbin.org/bytes/10485760000"):
    agent = Agent(SlowReactorProxy(reactor))
    d = agent.request(
        b"GET", url, Headers({"User-Agent": ["Twisted Web Client Example"]}), None
    )
    d.addCallback(cbRequest)
    return d


react(main, argv[1:])
However, unfortunately, ThrottlingFactory's algorithm for throttling traffic is quite primitive; there's just a timer that fires once per second and pauses everyone if too much data has been consumed. This means that you will be reading at maximum speed with zero throttling for an entire second at a time, then, having exhausted that quota, pause for a commensurately long period of time. On my (gigabit) network, I cannot get a large enough entity-body out of httpbin (the max size seems to be 102400) in order to be producing data for longer than a second, so no throttling will ever take place in this scenario.
Hopefully this will help you accomplish your task, but I'd encourage you to file a bug on twisted in order to make the composition of HTTP and throttling somewhat more graceful and responsive.

How to make each client get their state if there is class instance in grpc-python server side?

I want to use grpc-python in the following scenario, but I don't know how to realize it.
The scenario is this: on the Python server, a class instance calculates and updates its own state, then sends that state back to the corresponding client; on the client side, more than one client communicates with the server, and each should get its own result without interfering with the others.
Specifically, suppose there is a class with initial value self.i = 0; each time a client calls the class's update function, it does self.i = self.i + 1 and returns self.i. Two clients may call the update function concurrently, e.g. client1 calls update for the third time while client2 calls it for the first time.
I think this might be solved by creating a thread for each client to avoid conflicts: if a new client calls, a new thread is created; if an existing client calls, its existing thread is reused. But I don't know how to realize it.
Hope you can help me. Thanks in advance.
I think I solved this problem myself. If you have a better solution, feel free to post it here.
I edited the helloworld example from the grpc-python introduction to explain my aim.
For helloworld.proto
syntax = "proto3";

option java_multiple_files = true;
option java_package = "io.grpc.examples.helloworld";
option java_outer_classname = "HelloWorldProto";
option objc_class_prefix = "HLW";

package helloworld;

// The greeting service definition.
service Greeter {
  // Sends a greeting
  rpc SayHello (HelloRequest) returns (HelloReply) {}
  rpc Unsubscribe (HelloRequest) returns (HelloReply) {}
}

// The request message containing the user's name.
message HelloRequest {
  string name = 1;
}

// The response message containing the greetings
message HelloReply {
  string message = 1;
}
I added an Unsubscribe function to allow a specific client to disconnect from the server.
In hello_server.py
import grpc
import helloworld_pb2
import helloworld_pb2_grpc
import threading
from threading import RLock
import time
from concurrent import futures
import logging


class Calculate:
    def __init__(self):
        self.i = 0

    def add(self):
        self.i += 1
        return self.i


class PeerSet(object):
    def __init__(self):
        self._peers_lock = RLock()
        self._peers = {}
        self.instances = {}

    def connect(self, peer):
        with self._peers_lock:
            if peer not in self._peers:
                print("Peer {} connecting".format(peer))
                self._peers[peer] = 1
                a = Calculate()
                self.instances[peer] = a
                output = a.add()
                return output
            else:
                self._peers[peer] += 1
                a = self.instances[peer]
                output = a.add()
                return output

    def disconnect(self, peer):
        print("Peer {} disconnecting".format(peer))
        with self._peers_lock:
            if peer not in self._peers:
                raise RuntimeError("Tried to disconnect peer '{}' but it was never connected.".format(peer))
            del self._peers[peer]
            del self.instances[peer]

    def peers(self):
        with self._peers_lock:
            return self._peers.keys()


class Greeter(helloworld_pb2_grpc.GreeterServicer):
    def __init__(self):
        self._peer_set = PeerSet()

    def _record_peer(self, context):
        return self._peer_set.connect(context.peer())

    def SayHello(self, request, context):
        output = self._record_peer(context)
        print("[thread {}] Peers: {}, output: {}".format(threading.currentThread().ident, self._peer_set.peers(), output))
        time.sleep(1)
        return helloworld_pb2.HelloReply(message='Hello, {}, {}!'.format(request.name, output))

    def Unsubscribe(self, request, context):
        self._peer_set.disconnect(context.peer())
        return helloworld_pb2.HelloReply(message='{} disconnected!'.format(context.peer()))


def serve():
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
    helloworld_pb2_grpc.add_GreeterServicer_to_server(Greeter(), server)
    server.add_insecure_port('[::]:50051')
    server.start()
    server.wait_for_termination()


if __name__ == '__main__':
    logging.basicConfig()
    serve()
The use of context.peer() is adapted from Richard Belleville's answer in this post. You can replace the add() function with any other function that updates the instance's state.
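For instance (a hypothetical replacement, not part of the code above), the per-client instance can hold any state you like, not just a counter:

class RollingAverage:
    """Per-client state: keeps a running average of the values a client submits."""
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def add_sample(self, value):
        self.count += 1
        self.total += value
        return self.total / self.count

PeerSet.connect() would then create a RollingAverage per peer, and the servicer would call add_sample() instead of add().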
In hello_client.py
from __future__ import print_function
import logging

import grpc
import helloworld_pb2
import helloworld_pb2_grpc


def run():
    # NOTE(gRPC Python Team): .close() is possible on a channel and should be
    # used in circumstances in which the with statement does not fit the needs
    # of the code.
    with grpc.insecure_channel('localhost:50051') as channel:
        stub = helloworld_pb2_grpc.GreeterStub(channel)
        response = stub.SayHello(helloworld_pb2.HelloRequest(name='you'))
        print("Greeter client received: " + response.message)
        response = stub.SayHello(helloworld_pb2.HelloRequest(name='Tom'))
        print("Greeter client received: " + response.message)
        response = stub.SayHello(helloworld_pb2.HelloRequest(name='Jerry'))
        print("Greeter client received: " + response.message)
        stub.Unsubscribe(helloworld_pb2.HelloRequest(name="end"))


if __name__ == '__main__':
    logging.basicConfig()
    run()
If we run several hello_client.py processes simultaneously, the server can distinguish the different clients and send each one the correct corresponding info.

Nested web service calls with tornado (async?)

I am implementing a SOAP web service using tornado (and the third party tornadows module). One of the operations in my service needs to call another so I have the chain:
External request in (via SOAPUI) to operation A
Internal request (via requests module) in to operation B
Internal response from operation B
External response from operation A
Because it is all running in one service, it is getting blocked somewhere. I'm not familiar with tornado's async functionality.
There is only one request-handling method (post) because everything comes in on a single URL, and the specific operation (the method doing the processing) is then called based on the SOAPAction request header value. I have decorated my post method with @tornado.web.asynchronous and called self.finish() at the end, but no dice.
Can tornado handle this scenario and if so how can I implement it?
EDIT (added code):
class SoapHandler(tornado.web.RequestHandler):

    @tornado.web.asynchronous
    def post(self):
        """ Method post() to process of requests and responses SOAP messages """
        try:
            self._request = self._parseSoap(self.request.body)
            soapaction = self.request.headers['SOAPAction'].replace('"', '')
            self.set_header('Content-Type', 'text/xml')
            for operations in dir(self):
                operation = getattr(self, operations)
                method = ''
                if callable(operation) and hasattr(operation, '_is_operation'):
                    num_methods = self._countOperations()
                    if hasattr(operation, '_operation') and soapaction.endswith(getattr(operation, '_operation')) and num_methods > 1:
                        method = getattr(operation, '_operation')
                        self._response = self._executeOperation(operation, method=method)
                        break
                    elif num_methods == 1:
                        self._response = self._executeOperation(operation, method='')
                        break
            soapmsg = self._response.getSoap().toprettyxml()
            self.write(soapmsg)
            self.finish()
        except Exception as detail:
            # traceback.print_exc(file=sys.stdout)
            wsdl_nameservice = self.request.uri.replace('/', '').replace('?wsdl', '').replace('?WSDL', '')
            fault = soapfault('Error in web service : {fault}'.format(fault=detail), wsdl_nameservice)
            self.write(fault.getSoap().toxml())
            self.finish()
This is the post method from the request handler. It's from the web services module I'm using (so not my code) but I added the async decorator and self.finish(). All it basically does is call the correct operation (as dictated in the SOAPAction of the request).
class CountryService(soaphandler.SoapHandler):

    @webservice(_params=GetCurrencyRequest, _returns=GetCurrencyResponse)
    def get_currency(self, input):
        result = db_query(input.country, 'currency')
        get_currency_response = GetCurrencyResponse()
        get_currency_response.currency = result
        headers = None
        return headers, get_currency_response

    @webservice(_params=GetTempRequest, _returns=GetTempResponse)
    def get_temp(self, input):
        get_temp_response = GetTempResponse()
        curr = self.make_curr_request(input.country)
        get_temp_response.temp = curr
        headers = None
        return headers, get_temp_response

    def make_curr_request(self, country):
        soap_request = """<soapenv:Envelope xmlns:soapenv='http://schemas.xmlsoap.org/soap/envelope/' xmlns:coun='CountryService'>
   <soapenv:Header/>
   <soapenv:Body>
      <coun:GetCurrencyRequestget_currency>
         <country>{0}</country>
      </coun:GetCurrencyRequestget_currency>
   </soapenv:Body>
</soapenv:Envelope>""".format(country)
        headers = {'Content-Type': 'text/xml;charset=UTF-8', 'SOAPAction': '"http://localhost:8080/CountryService/get_currency"'}
        r = requests.post('http://localhost:8080/CountryService', data=soap_request, headers=headers)
        try:
            tree = etree.fromstring(r.content)
            currency = tree.xpath('//currency')
            message = currency[0].text
        except:
            message = "Failure"
        return message
These are two of the operations of the web service (get_currency & get_temp). So SOAPUI hits get_temp, which makes a SOAP request to get_currency (via make_curr_request and the requests module). Then the results should just chain back and be sent back to SOAPUI.
The actual operation of the service makes no sense (returning the currency when asked for the temperature), but I'm just trying to get the functionality working and these are the operations I have.
I don't think that your SOAP module or requests is asynchronous.
I believe adding the @asynchronous decorator is only half the battle. Right now you aren't making any async requests inside your function (every request is blocking, which ties up the server until your method finishes).
You can switch this up by using tornado's AsyncHTTPClient. It can be used pretty much as a drop-in replacement for requests. From the documentation example:
class MainHandler(tornado.web.RequestHandler):
    @tornado.web.asynchronous
    def get(self):
        http = tornado.httpclient.AsyncHTTPClient()
        http.fetch("http://friendfeed-api.com/v2/feed/bret",
                   callback=self.on_response)

    def on_response(self, response):
        if response.error: raise tornado.web.HTTPError(500)
        json = tornado.escape.json_decode(response.body)
        self.write("Fetched " + str(len(json["entries"])) + " entries "
                   "from the FriendFeed API")
        self.finish()
Their method is decorated with @asynchronous AND they are making async HTTP requests. This is where the flow gets a little strange. When you use the AsyncHTTPClient it doesn't lock up the event loop (please bear with me if my terminology isn't quite right yet, I only started using tornado this week). This allows the server to freely process incoming requests. When your async HTTP request finishes, the callback method is executed, in this case on_response.
Here you can replace requests with the tornado async HTTP client relatively easily. For your SOAP service, though, things might be more complicated. You could wrap a local web service around your SOAP client and make async requests to it using the tornado async HTTP client.
This will create some complex callback logic, which can be cleaned up using the gen decorator.
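As a concrete sketch (untested, and only workable once the handler itself can run as a coroutine, which is what the next answer's tornadows patch provides), make_curr_request could be rewritten on top of AsyncHTTPClient roughly like this; _build_envelope is a hypothetical helper returning the same SOAP envelope string as above, and etree is the lxml module already used in the question:

from tornado import gen
from tornado.httpclient import AsyncHTTPClient

@gen.coroutine
def make_curr_request(self, country):
    headers = {'Content-Type': 'text/xml;charset=UTF-8',
               'SOAPAction': '"http://localhost:8080/CountryService/get_currency"'}
    client = AsyncHTTPClient()
    # fetch() returns a Future; yielding it suspends this coroutine until the
    # response arrives, without blocking the IOLoop.
    response = yield client.fetch('http://localhost:8080/CountryService',
                                  method='POST',
                                  body=self._build_envelope(country),
                                  headers=headers)
    try:
        tree = etree.fromstring(response.body)
        message = tree.xpath('//currency')[0].text
    except Exception:
        message = "Failure"
    raise gen.Return(message)

get_temp would then also need @gen.coroutine and would call it with curr = yield self.make_curr_request(input.country).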
This issue was fixed as of yesterday.
Pull request:
https://github.com/rancavil/tornado-webservices/pull/23
Example: here is a simple web service that takes no arguments and returns the version.
Notice that you should:
Method declaration: decorate the method with @gen.coroutine
Returning results: use raise gen.Return(data)
Code:
from tornado import gen
from tornadows.soaphandler import SoapHandler
...
class Example(SoapHandler):
#gen.coroutine
#webservice(_params=None, _returns=Version)
def Version(self):
_version = Version()
# async stuff here, let's suppose you ask other rest service or resource for the version details.
# ...
# returns the result.
raise gen.Return(_version)
Cheers!

How to test twisted web resource with trial?

I'm developing a twisted.web server. It consists of some resources that, apart from rendering stuff, use adbapi to fetch some data and write some data to a PostgreSQL database. I'm trying to figure out how to write a trial unit test that would test resource rendering without using the network (in other words: one that would initialize a resource, pass it a dummy request, etc.).
Let's assume the View resource is a simple leaf whose render_GET returns NOT_DONE_YET and tinkers with adbapi to produce simple text as a result. Now, I've written this useless code and I can't work out how to make it actually initialize the resource and produce a sensible response:
from twisted.trial import unittest
from myserv.views import View
from twisted.web.test.test_web import DummyRequest


class ExistingView(unittest.TestCase):

    def test_rendering(self):
        slug = "hello_world"
        view = View(slug)
        request = DummyRequest([''])
        output = view.render_GET(request)
        self.assertEqual(request.responseCode, 200)
The output is... 1. I've also tried another approach, output = request.render(view), but it gives the same output of 1. Why? I'd be very grateful for an example of how to write such a unit test!
Here's a function that will render a request and convert the result into a Deferred that fires when rendering is complete:
from twisted.internet.defer import succeed
from twisted.web import server


def _render(resource, request):
    result = resource.render(request)
    if isinstance(result, str):
        request.write(result)
        request.finish()
        return succeed(None)
    elif result is server.NOT_DONE_YET:
        if request.finished:
            return succeed(None)
        else:
            return request.notifyFinish()
    else:
        raise ValueError("Unexpected return value: %r" % (result,))
It's actually used in Twisted Web's test suite, but it's private because it has no unit tests itself. ;)
You can use it to write a test like this:
def test_rendering(self):
    slug = "hello_world"
    view = View(slug)
    request = DummyRequest([''])
    d = _render(view, request)

    def rendered(ignored):
        self.assertEquals(request.responseCode, 200)
        self.assertEquals("".join(request.written), "...")
        ...

    d.addCallback(rendered)
    return d
Here is a DummierRequest class that fixes almost all my problems. The only thing left is that it does not set any response code! Why?
from twisted.web.test.test_web import DummyRequest
from twisted.web import server
from twisted.internet.defer import succeed
from twisted.internet import interfaces, reactor, protocol, address
from twisted.web.http_headers import _DictHeaders, Headers


class DummierRequest(DummyRequest):

    def __init__(self, postpath, session=None):
        DummyRequest.__init__(self, postpath, session)
        self.notifications = []
        self.received_cookies = {}
        self.requestHeaders = Headers()
        self.responseHeaders = Headers()
        self.cookies = []  # outgoing cookies

    def setHost(self, host, port, ssl=0):
        self._forceSSL = ssl
        self.requestHeaders.setRawHeaders("host", [host])
        self.host = address.IPv4Address("TCP", host, port)

    def addCookie(self, k, v, expires=None, domain=None, path=None, max_age=None, comment=None, secure=None):
        """
        Set an outgoing HTTP cookie.

        In general, you should consider using sessions instead of cookies, see
        L{twisted.web.server.Request.getSession} and the
        L{twisted.web.server.Session} class for details.
        """
        cookie = '%s=%s' % (k, v)
        if expires is not None:
            cookie = cookie + "; Expires=%s" % expires
        if domain is not None:
            cookie = cookie + "; Domain=%s" % domain
        if path is not None:
            cookie = cookie + "; Path=%s" % path
        if max_age is not None:
            cookie = cookie + "; Max-Age=%s" % max_age
        if comment is not None:
            cookie = cookie + "; Comment=%s" % comment
        if secure:
            cookie = cookie + "; Secure"
        self.cookies.append(cookie)

    def getCookie(self, key):
        """
        Get a cookie that was sent from the network.
        """
        return self.received_cookies.get(key)

    def getClientIP(self):
        """
        Return the IPv4 address of the client which made this request, if there
        is one, otherwise C{None}.
        """
        return "192.168.1.199"
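One likely explanation (worth verifying against your Twisted version): a real twisted.web.server.Request starts with a response code of 200 and only changes it when setResponseCode() is called, whereas DummyRequest starts with responseCode set to None, so a handler that never calls setResponseCode() leaves it unset. A minimal workaround is to mirror that default in __init__:

    def __init__(self, postpath, session=None):
        DummyRequest.__init__(self, postpath, session)
        # ... existing setup as above ...
        # Mirror twisted.web.server.Request, which defaults to 200 (OK)
        # until setResponseCode() says otherwise.
        self.responseCode = 200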

How do I get my simple twisted proxy to work?

I am attempting to make use of the Twisted.Web framework.
Notice the three line comments (#line1, #line2, #line3). I want to create a proxy (gateway?) that will forward a request to one of two servers depending on the URL. If I uncomment either line 1 or line 2 (and comment out the rest), the request is proxied to the correct server. However, of course, it does not pick the server based on the URL.
from twisted.internet import reactor
from twisted.web import proxy, server
from twisted.web.resource import Resource


class Simple(Resource):
    isLeaf = True
    allowedMethods = ("GET", "POST")

    def getChild(self, name, request):
        if name == "/" or name == "":
            return proxy.ReverseProxyResource('localhost', 8086, '')
        else:
            return proxy.ReverseProxyResource('localhost', 8085, '')


simple = Simple()
# site = server.Site(proxy.ReverseProxyResource('localhost', 8085, '')) #line1
# site = server.Site(proxy.ReverseProxyResource('localhost', 8085, '')) #line2
site = server.Site(simple) #line3
reactor.listenTCP(8080, site)
reactor.run()
As the code above currently stands, when I run this script and navigate to "localhost:8080/ANYTHING_AT_ALL", I get the following response:
Method Not Allowed
Your browser approached me (at /ANYTHING_AT_ALL) with the method "GET". I
only allow the methods GET, POST here.
I don't know what I am doing wrong. Any help would be appreciated.
Since your Simple class implements the getChild() method, it is implied that it is not a leaf node; however, you are stating that it is one by setting isLeaf = True. (How can a leaf node have a child?)
Try changing isLeaf = True to isLeaf = False and you'll find that it redirects to the proxy as you'd expect.
From the Resource.getChild docstring:
... This will not be called if the class-level variable 'isLeaf' is set in
your subclass; instead, the 'postpath' attribute of the request will be
left as a list of the remaining path elements....
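To make the distinction concrete, here's a minimal standalone sketch (not part of the question's code) of how a leaf resource sees the rest of the path via request.postpath, since getChild() is never consulted for it:

from twisted.web.resource import Resource

class LeafEcho(Resource):
    isLeaf = True   # render() handles every path below this resource

    def render_GET(self, request):
        # For a request to /a/b/c routed to this resource at /,
        # request.postpath is ['a', 'b', 'c'] here.
        return "postpath: %r\n" % (request.postpath,)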
Here is the final working solution. Basically, two resource requests go to the GAE server, and all remaining requests go to the GWT server.
Besides implementing mhawke's change, there is only one other change: adding '"/" + name' to the proxy server's path. I assume this has to be done because that portion of the path was consumed and placed in the 'name' variable.
from twisted.internet import reactor
from twisted.web import proxy, server
from twisted.web.resource import Resource


class Simple(Resource):
    isLeaf = False
    allowedMethods = ("GET", "POST")

    def getChild(self, name, request):
        print "getChild called with name:'%s'" % name
        if name == "get.json" or name == "post.json":
            print "proxy on GAE"
            return proxy.ReverseProxyResource('localhost', 8085, "/" + name)
        else:
            print "proxy on GWT"
            return proxy.ReverseProxyResource('localhost', 8086, "/" + name)


simple = Simple()
site = server.Site(simple)
reactor.listenTCP(8080, site)
reactor.run()
Thank you.
