Twisted - how to log FTP commands received by server - python

I'm using Twisted as an FTP server:
from twisted.protocols.ftp import FTPFactory, FTPRealm
from twisted.cred.portal import Portal
from twisted.cred.checkers import AllowAnonymousAccess, FilePasswordDB
from twisted.internet import reactor
p = Portal(FTPRealm('./'),
           [AllowAnonymousAccess(), FilePasswordDB("pass.dat")])
f = FTPFactory(p)
reactor.listenTCP(21, f)
reactor.run()
How can I log every FTP command received from a client?

FTPRealm creates FTPAnonymousShell and FTPShell instances (avatars) to mediate access to the filesystem.
These classes both implement IFTPShell. One solution is to create a wrapper for FTPRealm which applies a logging wrapper to the avatars it creates.
from twisted.python.components import proxyForInterface
from twisted.cred.portal import IRealm
from twisted.internet.defer import maybeDeferred

class WrappingRealm(proxyForInterface(IRealm)):
    wrap = staticmethod(logging_wrapper)

    def requestAvatar(self, *a, **kw):
        d = maybeDeferred(
            super(WrappingRealm, self).requestAvatar,
            *a, **kw
        )
        def got_avatar(result):
            iface, avatar, logout = result
            return (iface, self.wrap(avatar), logout)
        d.addCallback(got_avatar)
        return d
And implement logging_wrapper something like:
from twisted.protocols.ftp import IFTPShell

class _LoggingFTPShell(proxyForInterface(IFTPShell)):
    def makeDirectory(self, path):
        # proxyForInterface stores the wrapped avatar as self.original;
        # `log` stands in for whatever logging call you prefer
        # (twisted.python.log.msg accepts keyword arguments, for example).
        log(avatar=self.original, operation="makeDirectory", path=path)
        return super(_LoggingFTPShell, self).makeDirectory(path)

    # The same for the rest of the methods of IFTPShell
    # ...

def logging_wrapper(avatar):
    return _LoggingFTPShell(avatar)
This is a bit tedious since you have to add logging for each method on the interface. However, since Twisted's FTP protocol implementation doesn't natively provide facilities for performing the logging you want, it's hard to get around this. With some meta-programming you might save yourself some typing (but at the cost of some complexity).
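For instance, a rough sketch of that idea generates one logging method per name declared by IFTPShell (this assumes zope.interface's Interface.names() and reuses the placeholder log() call from above):
def _make_logged(name):
    def method(self, *args, **kwargs):
        # self.original is the wrapped avatar; log() is the placeholder above.
        log(avatar=self.original, operation=name, args=args, kwargs=kwargs)
        return getattr(self.original, name)(*args, **kwargs)
    method.__name__ = name
    return method

# Build the wrapper class with one generated method per IFTPShell name,
# instead of writing each method out by hand.
_LoggingFTPShell = type(
    "_LoggingFTPShell",
    (proxyForInterface(IFTPShell),),
    {name: _make_logged(name) for name in IFTPShell.names()},
)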
Another approach would be to contribute a patch to Twisted adding the logging you're interested in.

How can I use Twisted's ThrottlingFactory with their web client?

Problem
I need to execute HTTP requests and simulate high latency at the same time. I came across the Twisted package in Python, which includes both an HTTP client and a ThrottlingFactory. The issue is that the documentation is not clear for a newcomer, and I am having trouble understanding how I could use the ThrottlingFactory with API calls made through the HTTP client.
I am currently utilizing the following example code to test things out. Nothing has worked so far.
from sys import argv
from pprint import pformat
from twisted.internet.task import react
from twisted.web.client import Agent, readBody
from twisted.web.http_headers import Headers

def cbRequest(response):
    print("Response version:", response.version)
    print("Response code:", response.code)
    print("Response phrase:", response.phrase)
    print("Response headers:")
    print(pformat(list(response.headers.getAllRawHeaders())))
    d = readBody(response)
    d.addCallback(cbBody)
    return d

def cbBody(body):
    print("Response body:")
    print(body)

def main(reactor, url=b"http://httpbin.org/get"):
    agent = Agent(reactor)
    d = agent.request(
        b"GET", url, Headers({"User-Agent": ["Twisted Web Client Example"]}), None
    )
    d.addCallback(cbRequest)
    return d

react(main, argv[1:])
How can I use the ThrottlingFactory in this example?
You're right: this composition is awkward; it should be better documented and arguably should have a nicer API!
Still, you can accomplish this by putting a proxy between your application and the reactor.
It would look like this:
from sys import argv
from pprint import pformat
from dataclasses import dataclass
from twisted.internet.task import react
from twisted.internet.interfaces import IReactorTCP
from twisted.web.client import Agent, readBody
from twisted.web.http_headers import Headers
from twisted.protocols.policies import ThrottlingFactory

def cbRequest(response):
    print("Response version:", response.version)
    print("Response code:", response.code)
    print("Response phrase:", response.phrase)
    print("Response headers:")
    print(pformat(list(response.headers.getAllRawHeaders())))
    d = readBody(response)
    d.addCallback(cbBody)
    return d

def cbBody(body):
    print("Response body:")
    print(len(body))

@dataclass
class SlowReactorProxy:
    original: IReactorTCP

    def __getattr__(self, name):
        return getattr(self.original, name)

    def connectTCP(self, host, port, factory, timeout=30, bindAddress=None):
        return self.original.connectTCP(
            host, port, ThrottlingFactory(factory, readLimit=0.1), timeout, bindAddress
        )

def main(reactor, url=b"http://httpbin.org/bytes/10485760000"):
    agent = Agent(SlowReactorProxy(reactor))
    d = agent.request(
        b"GET", url, Headers({"User-Agent": ["Twisted Web Client Example"]}), None
    )
    d.addCallback(cbRequest)
    return d

react(main, argv[1:])
However, unfortunately, ThrottlingFactory's algorithm for throttling traffic is quite primitive; there's just a timer that fires once per second and pauses everyone if too much data has been consumed. This means that you will be reading at maximum speed with zero throttling for an entire second at a time, then, having exhausted that quota, pause for a commensurately long period of time. On my (gigabit) network, I cannot get a large enough entity-body out of httpbin (the max size seems to be 102400) in order to be producing data for longer than a second, so no throttling will ever take place in this scenario.
Hopefully this will help you accomplish your task, but I'd encourage you to file a bug against Twisted to make the composition of HTTP and throttling more graceful and responsive.

Calling a python class method from an instance of a class doesn't work as I expect

I'm new to Python, and I think I'm trying to do something simple. However, I am confused by the results I am getting. I am declaring a class that has two class methods, add and remove, which in my simple example add or remove a client from a class-level list variable. Here's my code:
Service.py
from Client import Client

class Service:
    clients = []

    @classmethod
    def add(cls, client):
        cls.clients.append(client)

    @classmethod
    def remove(cls, client):
        if client in cls.clients:
            cls.clients.remove(client)

if __name__ == '__main__':
    a = Client()
    b = Client()
    c = Client()
    Service.add(a)
    Service.add(b)
    Service.add(c)
    print(Service.clients)
    c.kill()
    print(Service.clients)
    Service.remove(c)
    print(Service.clients)
Client.py
class Client:
    def kill(self):
        from Service import Service
        Service.remove(self)
I would expect calling c.kill() to remove the instance from the clients list.
However, when I evaluate the clients list, it shows 0 items. When I call Service.remove(c), it shows the correct list and removes it as expected. I am not sure what I am missing here.
If it matters, I am currently using PyCharm with my code running in a Virtualenv with Python 3.6.5.
Your current code has circular imports, since the two files import each other. Also, instead of relying on the client to destroy the connections, use a context manager to handle updating clients and, at the end of the procedure, empty clients:
import contextlib

class Client:
    pass

class Service:
    clients = []

    @classmethod
    def add(cls, client):
        cls.clients.append(client)

    @classmethod
    @contextlib.contextmanager
    def thread(cls):
        yield cls
        cls.clients = []

with Service.thread() as t:
    t.add(Client())
    t.add(Client())

Python Mock imported library suds

I have a class that is using a suds client in several places to make some XML calls to another server. There are no problems when the code runs; however, I cannot figure out how to mock the suds client creation in the class constructor so that it creates a mocked object and does not use a real socket. We have tried multiple permutations of mock.patch, mocker.patch, etc., but the ones that run result in a socket error and the rest result in AttributeError or ImportError.
This is a dumbed down version of the class:
from suds.client import Client
from suds.transport.https import HttpAuthenticated

class MYClass(object):
    def __init__(self, host, usern, passw, provisioning_timeout=90):
        wsdl_url = 'https://{host}/server/GetWsdl?wsdl'.format(host=host)
        transport = CustomTransport()
        try:
            self.client = Client(wsdl_url, transport=transport, timeout=5)
        except Exception:
            raise  # placeholder; the original question omits the except clause

    def run_wsdl(self, data):
        result = self.client.service.testwsdl(data=data)
        return result
And this is what I am trying to run through a unit test
from me import my_class
from mock import patch

# These are some of the many permutations we've tried:
# @patch('my_class.MYClass.suds.client.Client')  # ImportError, no suds
# @patch('my_class.Client')                      # socket use error in __init__
# @patch('my_class.suds.client.Client')          # ImportError, no suds
def test_sip_stuff(mock_client):
    with patch.object(mock_client.client.service, 'testwsdl') as mockwsdl:
        mockwsdl.return_value = good_wsdl_data
        test_instance = my_class.MYClass(
            host='10.10.10.20',
            usern='user',
            passw='pass'
        )
        return_value = test_instance.run_wsdl(data='something')
        assert return_value == good_wsdl_data

tornado: How to create file-like object that writes to a RequestHandler?

In my application, I'm trying to create a handler that streams large files out to the client. These files are created by another module (tarfile to be exact).
What I want is a file-like object that instead of writing to a socket or an actual file on the disk, proxies to the RequestHandler.write method.
Here's what my current naive implementation looks like:
import tornado.gen
import tornado.ioloop
import tornado.web

class HandlerFileObject(object):
    def __init__(self, handler):
        self.handler = handler

    @tornado.gen.coroutine
    def write(self, data):
        self.handler.write(data)
        yield self.handler.flush()

    def close(self):
        self.handler.finish()

class DownloadHandler(tornado.web.RequestHandler):
    def get(self):
        self.set_status(200)
        self.set_header("Content-Type", "application/octet-stream")
        fp = HandlerFileObject(self)
        with open('/dev/zero', 'rb') as devzero:
            for _ in range(100*1024):
                fp.write(devzero.read(1024))
        fp.close()

if __name__ == '__main__':
    app = tornado.web.Application([
        (r"/", DownloadHandler)
    ])
    app.listen(8888)
    tornado.ioloop.IOLoop.instance().start()
It works, but the problem is that all of the data is loaded into RAM and is not released until I stop the application.
What would be a better/more idiomatic/resourceful way of going about this?
get() also needs to be a coroutine and to yield when calling fp.write(). By making write a coroutine you've made your object less file-like: most callers will simply ignore its return value, masking exceptions and interfering with flow control. The file-like interface is synchronous, so you'll probably need to do these operations in other threads so you can block them as needed.
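A minimal sketch of that first suggestion, with get() rewritten as a coroutine that waits for each flush (the file-like wrapper is dropped for brevity, and this assumes a Tornado version where flush() returns a Future, i.e. 4.x or later):
class DownloadHandler(tornado.web.RequestHandler):
    @tornado.gen.coroutine
    def get(self):
        self.set_status(200)
        self.set_header("Content-Type", "application/octet-stream")
        with open('/dev/zero', 'rb') as devzero:
            for _ in range(100 * 1024):
                self.write(devzero.read(1024))
                # Wait for this chunk to reach the socket before reading the
                # next one, so the whole body is never held in RAM.
                yield self.flush()
        self.finish()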

Twisted: Advice on using the txredisapi library

Below I provide a code example which simply responds to an HTTP GET request with data from Redis:
Request: http://example.com:8888/?auth=zefDWDd5mS7mcbfoDbDDf4eVAKb1nlDmzLwcmhDOeUc
Response: get: u'"True"'
The purpose of this code is to serve as a REST server (that's why I'm using lazyConnectionPool) that responds to requests using data from Redis (read/write).
What I need to do:
1. Run multiple requests to Redis inside render_GET of the IndexHandler (like GET, HMGET, SET, etc.)
2. Run multiple requests in a transaction inside render_GET of the IndexHandler
I've tried multiple ways to do this (including examples from the txredisapi library) but, due to lack of experience, failed. Could you please advise on questions 1) and 2)?
Thanks in advance.
import txredisapi as redis
from twisted.application import internet
from twisted.application import service
from twisted.web import server
from twisted.web.resource import Resource

class Root(Resource):
    isLeaf = False

class BaseHandler(object):
    isLeaf = True

    def __init__(self, db):
        self.db = db
        Resource.__init__(self)

class IndexHandler(BaseHandler, Resource):
    def _success(self, value, request, message):
        request.write(message % repr(value))
        request.finish()

    def _failure(self, error, request, message):
        request.write(message % str(error))
        request.finish()

    def render_GET(self, request):
        try:
            auth = request.args["auth"][0]
        except:
            request.setResponseCode(404, "not found")
            return ""
        d = self.db.hget(auth, 'user_add')
        d.addCallback(self._success, request, "get: %s\n")
        d.addErrback(self._failure, request, "get failed: %s\n")
        return server.NOT_DONE_YET

# Redis connection parameters
REDIS_HOST = '10.10.0.110'
REDIS_PORT = 6379
REDIS_DB = 1
REDIS_POOL_SIZE = 1
REDIS_RECONNECT = True

# redis connection
_db = redis.lazyConnectionPool(REDIS_HOST, REDIS_PORT, REDIS_DB, REDIS_POOL_SIZE)

# http resources
root = Root()
root.putChild("", IndexHandler(_db))

application = service.Application("web")
srv = internet.TCPServer(8888, server.Site(root), interface="127.0.0.1")
srv.setServiceParent(application)
Regarding the first question:
There are a few ways to make multiple database requests within a single HTTP request.
For example, you can issue several requests at once:
d1 = self.db.hget(auth, 'user_add')
d2 = self.db.get('foo')
Then you can get a callback to trigger when all of these simultaneous requests are finished (see twisted.internet.defer.DeferredList).
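For instance, a sketch of that approach inside render_GET (the keys 'user_add' and 'foo' are just the examples above, and fireOnOneErrback routes any single failure to the errback):
from twisted.internet.defer import DeferredList

# In IndexHandler:
def render_GET(self, request):
    auth = request.args["auth"][0]
    d1 = self.db.hget(auth, 'user_add')
    d2 = self.db.get('foo')
    # Fires once both Redis replies have arrived.
    dl = DeferredList([d1, d2], fireOnOneErrback=True)
    def got_both(results):
        # results is a list of (success, value) pairs, in the order above.
        (_, user_add), (_, foo) = results
        request.write("get: %s %s\n" % (repr(user_add), repr(foo)))
        request.finish()
    dl.addCallback(got_both)
    dl.addErrback(self._failure, request, "get failed: %s\n")
    return server.NOT_DONE_YET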
Or you can use inlineCallbacks if you need sequential requests. For example:
from twisted.internet.defer import inlineCallbacks

@inlineCallbacks
def do_redis(self):
    foo = yield self.db.get('somekey')
    bar = yield self.db.hget(foo, 'bar')  # get the 'bar' field of hash foo
But you will need to read more about combining inlineCallbacks with twisted.web (there are SO questions on that topic you should look up).
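One common pattern is to keep render_GET thin and delegate to an inlineCallbacks helper; here is a sketch using the handler above (the second get() key is made up for illustration):
from twisted.internet.defer import inlineCallbacks
from twisted.web import server

# In IndexHandler:
def render_GET(self, request):
    self._process(request)
    return server.NOT_DONE_YET

@inlineCallbacks
def _process(self, request):
    try:
        auth = request.args["auth"][0]
        user_add = yield self.db.hget(auth, 'user_add')
        extra = yield self.db.get(user_add)  # second, dependent request (illustrative key)
        request.write("get: %s %s\n" % (repr(user_add), repr(extra)))
    except Exception as e:
        request.write("get failed: %s\n" % (e,))
    request.finish()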
Regarding question 2:
Transactions are really ugly to do without inlineCallbacks. There is an example on the txredisapi homepage that shows how to do it using inlineCallbacks.
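A rough sketch along those lines might look like the following; this assumes the multi()/commit() transaction API shown in the txredisapi examples (check your installed version), and the hash fields are made up for illustration:
from twisted.internet.defer import inlineCallbacks, returnValue

# In IndexHandler:
@inlineCallbacks
def update_user(self, auth):
    # MULTI: queue commands on a single connection taken from the pool.
    t = yield self.db.multi()
    yield t.hset(auth, 'user_add', 'True')
    yield t.hset(auth, 'last_seen', 'now')
    # EXEC: run the queued commands atomically and return their results.
    results = yield t.commit()
    returnValue(results)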
