How does AMF communication work? - python

How does Flash communicate with services / scripts on servers via AMF?
Regarding the AMF libraries for Python / Perl / PHP, which are easier to develop with than .NET / Java:
Do they execute script files whenever Flash sends a Remote Procedure Call?
Or do they communicate via sockets with script classes that are running as services?
Regarding typical AMF functionality:
How is data transferred? Is it by method arguments that are automatically serialised?
How can servers "push" to clients? Do Flash movies have to connect on a socket?
Thanks for your time.

The only AMF library I'm familiar with is PyAMF, which has been great to work with so far. Here are the answers to your questions for PyAMF:
I'd imagine you can run it as a script (do you mean like CGI?), but the easiest approach IMO is to set up an app server specifically for AMF requests.
The easiest way is to define functions in pure Python, which PyAMF wraps to serialize incoming / outgoing AMF data.
You can communicate via sockets if that's what you need to do, but again, it's easiest to use pure Python functions; one use for sockets is to keep an open connection and 'push' data to clients, see this example.
Here's an example of three simple AMF services being served on localhost:8080:
from wsgiref import simple_server
from pyamf.remoting.gateway.wsgi import WSGIGateway

## amf services ##################################################

def echo(data):
    return data

def reverse(data):
    return data[::-1]

def rot13(data):
    return data.encode('rot13')

services = {
    'myservice.echo': echo,
    'myservice.reverse': reverse,
    'myservice.rot13': rot13,
}

## server ########################################################

def main():
    app = WSGIGateway(services)
    simple_server.make_server('localhost', 8080, app).serve_forever()

if __name__ == '__main__':
    main()
I would definitely recommend PyAMF. Check out the examples to see what it's capable of and what the code looks like.

How does Flash communicate with services / scripts on servers via AMF?
Data is transferred over a TCP/IP connection. Sometimes an existing HTTP connection is used, and in other cases a new TCP/IP connection is opened for the AMF data. Either way, the AMF payload travels over a TCP connection of some sort, and that connection is opened through the sockets interface.
The "data" that is transferred consists of ECMA-script (Javascript(tm)) data types such as "integer", "string", "object", and so on.
For a technical specification of how the objects are encoded into binary, Adobe has published a specification: AMF 3.0 Spec at Adobe.com
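If you want to see what that binary encoding looks like without reading the whole spec, you can round-trip a few values through PyAMF. A rough sketch, assuming PyAMF's top-level encode()/decode() helpers (check the PyAMF docs for the exact API):

# Round-trip some ECMA-script-style values through AMF3 (sketch, PyAMF assumed).
import pyamf

payload = {'user': 'alice', 'scores': [1, 2, 3], 'active': True}

encoded = pyamf.encode(payload, encoding=pyamf.AMF3).getvalue()  # raw AMF3 bytes
print(repr(encoded))

decoded = list(pyamf.decode(encoded, encoding=pyamf.AMF3))       # back to native objects
print(decoded[0])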
Generally the way an AMF-using client/server system works is something like this:
1. The client displays some user interface and opens a TCP connection to the server.
2. The server sends some data to the client, which updates its user interface.
3. If the user issues a command, the client sends some data to the server over the TCP connection.
4. Continue steps 2-3 until the user exits.
For example, if the user clicks a "send mail" button in the UI, then the client code might do this:
public class UICommandMessage extends my.CmdMsg
{
    public function UICommandMessage(action:String, arg:String)
    {
        this.cmd = action;
        this.data = arg;
    }
}
Then later:
var msg:UICommandMessage = new UICommandMessage("Button_Press", "Send_Mail");
server_connection.sendMessage(msg);
In the server code, the server is also monitoring the connection for incoming AMF objects. It receives the message and passes control to an appropriate response function. This is called "dispatching a message".
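On the Python side, "dispatching" usually boils down to looking the command name up in a table of handler functions. A minimal sketch (the message shape and handler names here are made up for illustration, not part of any AMF library):

def handle_send_mail(data):
    print("sending mail: %r" % (data,))

def handle_unknown(data):
    print("unknown command, ignoring: %r" % (data,))

# Map incoming command names to handler functions.
HANDLERS = {
    'Button_Press': handle_send_mail,
}

def dispatch(message):
    # message is the already-deserialized AMF object,
    # e.g. {'cmd': 'Button_Press', 'data': 'Send_Mail'}
    handler = HANDLERS.get(message.get('cmd'), handle_unknown)
    handler(message.get('data'))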
With more information about what you are trying to accomplish, I could give you more useful details.

Related

issue in sending realtime data from esp32 to Django webpage

I am having trouble sending data from the TCP client on my ESP32 board to my Python Django server. I am not familiar with setting up channels in Django. Is there a way I can send the data and display it on my page?
In order for your microcontroller (ESP32) to communicate with your own server-side code, you first need to decide which protocol you're going to use:
A. TCP:
TCP relies on IP, which provides addressing to communicate between computers. TCP/IP is the basis for the internet and other networks.
B. HTTP:
HTTP is mostly used by browsers (IE, Google Chrome). It rides on top of TCP, which provides a safe and reliable link between two computers: if a packet gets lost, it can safely be re-transmitted.
After deciding which protocol you're going to use, you need suitable server-side code. In Python there are several libraries / frameworks you can use:
A. HTTP:
Django, Flask, AIOHTTP (all of these support sending and receiving JSON (REST)); I prefer to use one of these frameworks for my IoT projects.
B. TCP: If your microcontroller is very minimal and doesn't support HTTP/JSON, you can use a simple SocketServer or Tornado TCP server. Don't worry: even though communication between your board and the server is done over TCP, you can still import Django's libraries and use Django's ORM.
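For the plain-TCP route, the receiving end can be very small. A rough sketch using the standard-library socketserver (the port, the line-delimited JSON payload format, and the save_reading() hook are assumptions, not something your project already has):

# Sketch: minimal TCP receiver for readings sent by the ESP32.
# The port and the line-delimited JSON payload format are assumptions.
import json
import socketserver

def save_reading(reading):
    # Placeholder: this is where you could call into Django's ORM,
    # e.g. create a model instance, or push to Channels for live display.
    print("got reading: %r" % (reading,))

class ReadingHandler(socketserver.StreamRequestHandler):
    def handle(self):
        for line in self.rfile:              # one JSON object per line
            try:
                save_reading(json.loads(line))
            except ValueError:
                continue                     # ignore malformed lines

if __name__ == "__main__":
    server = socketserver.TCPServer(("0.0.0.0", 9000), ReadingHandler)
    server.serve_forever()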

How to send data between two clients

I am making a simple game with multiplayer mode in it. I need to somehow send data between one player and other. I can't find a way to transfer data between two clients. Is there a way to forward them through the server?
I'm using the socketserver library to accept connections. Here is the way I handle connections.
def handle(self):
    data = self.request.recv(BUFFSIZE).decode("utf-8")
    print("Received connection")
    print(data)
From the 90s playbook: one of the clients can be a server; other clients (on the same LAN) connect to it.
It won't work well outside a LAN, because of NAT.
Normally you would need to run a dedicated server somewhere that would let users connect, and would route messages between them.
Naturally you would want to run all the game logic on the server, and leave only the display of state and user input to the clients.
Also, a single server could host several games.
Implementation-wise, the socket server from standard library would be a good start; the HTTP server from the same standard library, an even easier start.
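A very rough sketch of such a relay, built on the standard-library socketserver (the port, the newline framing, and the threading approach are arbitrary choices here, and error handling is omitted):

# Sketch: a tiny relay server that forwards each line it receives
# to every other connected client. Error handling is omitted.
import socketserver
import threading

clients = set()
clients_lock = threading.Lock()

class RelayHandler(socketserver.StreamRequestHandler):
    def handle(self):
        with clients_lock:
            clients.add(self.wfile)
        try:
            for line in self.rfile:            # one message per line
                with clients_lock:
                    for other in clients:
                        if other is not self.wfile:
                            other.write(line)  # forward to the other players
        finally:
            with clients_lock:
                clients.discard(self.wfile)

if __name__ == "__main__":
    server = socketserver.ThreadingTCPServer(("0.0.0.0", 5000), RelayHandler)
    server.serve_forever()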

Django/Python - Serial line concurrency

I'm currently working on gateway with an embedded Linux and a Webserver. The goal of the gateway is to retrieve data from electrical devices through a RS485/Modbus line, and to display them on a server.
I'm using Nginx and Django, and the web front-end is delivered as "static" files. A JavaScript file repeatedly makes AJAX calls that send CGI requests to Nginx. These CGI requests are answered with JSON responses thanks to Django. The responses are mostly data that has been read from the appropriate Modbus device.
The exact path is the following :
Randomly timed CGI call -> urls.py -> ModbusCGI.py (which imports another script, ModbusComm.py) -> ModbusComm.py creates a Modbus client and immediately tries to read with it.
Next to that, I wanted to implement a datalogger to store data in a DB at regular intervals. I made a script that also imports the ModbusComm.py script, but it doesn't work: sometimes multiple Modbus frames are sent at the same time (the datalogger and the CGI scripts call the same function in ModbusComm.py at the same time), which results in an error.
I'm sure this problem would also occur if there were a lot of users on the server (CGI requests sent at the same time). Or not? (Is a queue system already managed for CGI requests? I'm a bit lost.)
So my goal would be to make a queue system that can handle calls from several Python scripts: make them wait while it's not their turn, call a function with the right arguments when it is their turn (actually using the Modbus line), and send the response back to the Python script so it can generate the JSON response.
I really don't know how to achieve that, and I'm sure there are better ways to do this.
If I'm not clear enough, don't hesitate to make me aware of it :)
I had the same problem when I had to allow multiple processes to read some Modbus (and not only Modbus) data through a serial port. I ended up with a standalone process (“serial port server”) that exclusively works with the serial port. All other processes work with that port through that standalone process via some inter-process communication mechanism (we used Unix sockets).
This way, when an application wants to read a Modbus register, it connects to the “serial port server”, sends its request and receives the response. All the actual serial port communication is done by the “serial port server” in a sequential way to ensure consistency.
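A stripped-down sketch of that idea with the standard-library socketserver over a Unix socket (the socket path, the one-request-per-line protocol, and the read_modbus_register() stand-in are assumptions, not your actual ModbusComm.py API):

# Sketch: a "serial port server" that owns the Modbus/RS485 line exclusively.
# Because the server handles one connection at a time, access to the line is
# naturally serialized. Socket path and read_modbus_register() are stand-ins.
import os
import socketserver

SOCKET_PATH = "/tmp/modbus-server.sock"

def read_modbus_register(address):
    # Placeholder for the real call into ModbusComm.py that talks to the line.
    return 42

class ModbusRequestHandler(socketserver.StreamRequestHandler):
    def handle(self):
        for line in self.rfile:                    # one register address per line
            value = read_modbus_register(int(line.strip()))
            self.wfile.write(("%d\n" % value).encode())

if __name__ == "__main__":
    if os.path.exists(SOCKET_PATH):
        os.unlink(SOCKET_PATH)
    server = socketserver.UnixStreamServer(SOCKET_PATH, ModbusRequestHandler)
    server.serve_forever()

The CGI scripts and the datalogger would then each open the Unix socket, write a request, and read the response, instead of touching the serial line directly.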

How to serve data from UDP stream over HTTP in Python?

I am currently working on exposing data from a legacy system over the web. I have a (legacy) server application that sends and receives data over UDP. The software uses UDP to send sequential updates to a given set of variables in (near) real-time (updates every 5-10 ms). Thus, I do not need to capture all UDP data -- it is sufficient that the latest update is retrieved.
In order to expose this data over the web, I am considering building a lightweight web server that reads/writes UDP data and exposes it over HTTP.
As I am experienced with Python, I am considering using it.
The question is the following: how can I (continuously) read data from UDP and send snapshots of it over TCP/HTTP on-demand with Python? So basically, I am trying to build a kind of "UDP2HTTP" adapter to interface with the legacy app so that I wouldn't need to touch the legacy code.
A solution that is WSGI compliant would be much preferred. Of course any tips are very welcome and MUCH appreciated!
Twisted would be very suitable here. It supports many protocols (UDP, HTTP) and its asynchronous nature makes it possible to stream UDP data straight to HTTP without shooting yourself in the foot with (blocking) threading code. It also supports WSGI.
Here's a quick "proof of concept" app using the Twisted framework. It assumes that the legacy UDP service is listening on localhost:8000 and will start sending UDP data in response to a datagram containing "Send me data", and that the data consists of three 32-bit integers. Additionally, it will respond to an HTTP GET / on port 2080.
You could start this with twistd -noy example.py:
example.py
from twisted.internet import protocol, defer
from twisted.application import service
from twisted.python import log
from twisted.web import resource, server as webserver
import struct

class legacyProtocol(protocol.DatagramProtocol):
    def startProtocol(self):
        self.transport.connect(self.service.legacyHost, self.service.legacyPort)
        self.sendMessage("Send me data")

    def stopProtocol(self):
        # Assume the transport is closed, do any tidying that you need to.
        return

    def datagramReceived(self, datagram, addr):
        # Inspect the datagram payload, do sanity checking.
        try:
            val1, val2, val3 = struct.unpack("!iii", datagram)
        except struct.error, err:
            # Problem unpacking data: log and ignore
            log.err()
            return
        self.service.update_data(val1, val2, val3)

    def sendMessage(self, message):
        self.transport.write(message)

class legacyValues(resource.Resource):
    def __init__(self, service):
        resource.Resource.__init__(self)
        self.service = service
        self.putChild("", self)

    def render_GET(self, request):
        data = "\n".join(["<li>%s</li>" % x for x in self.service.get_data()])
        return """<html><head><title>Legacy Data</title>
<body><h1>Data</h1><ul>
%s
</ul></body></html>""" % (data,)

class protocolGatewayService(service.Service):
    def __init__(self, legacyHost, legacyPort):
        self.legacyHost = legacyHost
        self.legacyPort = legacyPort
        self.udpListeningPort = None
        self.httpListeningPort = None
        self.lproto = None
        self.reactor = None
        self.data = [1, 2, 3]

    def startService(self):
        # called by application handling
        if not self.reactor:
            from twisted.internet import reactor
            self.reactor = reactor
        self.reactor.callWhenRunning(self.startStuff)

    def stopService(self):
        # called by application handling
        defers = []
        if self.udpListeningPort:
            defers.append(defer.maybeDeferred(self.udpListeningPort.loseConnection))
        if self.httpListeningPort:
            defers.append(defer.maybeDeferred(self.httpListeningPort.stopListening))
        return defer.DeferredList(defers)

    def startStuff(self):
        # UDP legacy stuff
        proto = legacyProtocol()
        proto.service = self
        self.udpListeningPort = self.reactor.listenUDP(0, proto)
        # Website
        factory = webserver.Site(legacyValues(self))
        self.httpListeningPort = self.reactor.listenTCP(2080, factory)

    def update_data(self, *args):
        self.data[:] = args

    def get_data(self):
        return self.data

application = service.Application('LegacyGateway')
services = service.IServiceCollection(application)
s = protocolGatewayService('127.0.0.1', 8000)
s.setServiceParent(services)
Afterthought
This isn't a WSGI design. The idea would be to run this program daemonized, with its HTTP port on a local IP, and have Apache or similar proxy requests to it. It could be refactored for WSGI, but it was quicker to knock it up this way and easier to debug.
The software uses UDP to send sequential updates to a given set of variables in (near) real-time (updates every 5-10 ms). thus, I do not need to capture all UDP data -- it is sufficient that the latest update is retrieved
What you must do is this.
Step 1.
Build a Python app that collects the UDP data and caches it into a file. Create the file using XML, CSV or JSON notation.
This runs independently as some kind of daemon. This is your listener or collector.
Write the file to a directory from which it can be trivially downloaded by Apache or some other web server. Choose names and directory paths wisely and you're done.
Done.
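A bare-bones sketch of such a collector (the UDP port, the three-integer payload from the question, and the output path are assumptions):

# Sketch: listen for UDP updates and keep only the latest snapshot on disk,
# where the web server can serve it as a static file.
# Port, payload layout (three 32-bit ints) and output path are assumptions.
import json
import socket
import struct

OUTPUT = "/var/www/static/latest.json"

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 8000))

while True:
    datagram, addr = sock.recvfrom(1024)
    try:
        a, b, c = struct.unpack("!iii", datagram)
    except struct.error:
        continue                                # ignore malformed packets
    with open(OUTPUT, "w") as f:
        json.dump({"a": a, "b": b, "c": c}, f)  # overwrite with the latest values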
If you want fancier results, you can do more. You don't need to, since you're already done.
Step 2.
Build a web application that allows someone to request this data being accumulated by the UDP listener or collector.
Use a web framework like Django for this. Write as little as possible. Django can serve flat files created by your listener.
You're done. Again.
Some folks think relational databases are important. If so, you can do this. Even though you're already done.
Step 3.
Modify your data collection to create a database that the Django ORM can query. This requires some learning and some adjusting to get a tidy, simple ORM model.
Then write your final Django application to serve the UDP data being collected by your listener and loaded into your Django database.

Design question on Python network programming

I'm currently writing a project in Python which has a client and a server part. I have troubles with the network communication, so I need to explain some things...
The client mainly does operations the server tells it to and sends the results of the operations back to the server. I need a way to communicate bidirectionally over a TCP socket.
Current Situation
I currently use a LineReceiver from the Twisted framework on the server side, and a plain Python socket (and SSL) on the client side (because I was unable to correctly implement a Twisted PushProducer). There is a Queue on the client side which gets filled with data that should be sent to the server; a subprocess continuously pulls data from the queue and sends it to the server (see code below).
This scenario works well as long as only the client pushes its results to the manager. There is no possibility for the server to send data to the client -- more accurately, there is no way for the client to receive data the server has sent.
The Problem
I need a way to send commands from the server to the client.
I thought about listening for incoming data in the client loop I use to send data from the queue:
def run(self):
    while True:
        data = self.queue.get()
        logger.debug("Sending: %s", repr(data))
        data = cPickle.dumps(data)
        self.socket.write(data + "\r\n")
        # Here would be a good place to listen on the socket
But there are several problems with this solution:
the SSLSocket.read() method is a blocking one
if there is no data in the queue, the client will never receive any data
Yes, I could use Queue.get_nowait() instead of Queue.get(), but all in all it's not a good solution, I think.
The Question
Is there a good way to achieve these requirements with Twisted? I really do not have enough Twisted skills to find my way around in there. I don't even know if using the LineReceiver is a good idea for this kind of problem, because it cannot send any data if it does not receive data from the client. There is only a lineReceived event.
Is Twisted (or more generally any event-driven framework) able to solve this problem? I don't even have a real event on the communication side. If the server decides to send data, it should be able to send it; there should not be a need to wait for any event on the communication side, if possible.
"I don't even know if using the LineReceiver is a good idea for this kind of problem, because it cannot send any data, if it does not receive data from the client. There is only a lineReceived event."
You can send data using protocol.transport.write from anywhere, not just in lineReceived.
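For example, here is a rough sketch (the protocol and factory names are made up, not taken from your code) of a LineReceiver-based server that pushes a command to every connected client from outside lineReceived:

# Sketch: a LineReceiver server that can send a line to all connected clients
# at any time, not only in response to lineReceived. Names are made up.
from twisted.internet import reactor
from twisted.internet.protocol import Factory
from twisted.protocols.basic import LineReceiver

class CommandProtocol(LineReceiver):
    def connectionMade(self):
        self.factory.clients.append(self)

    def connectionLost(self, reason):
        self.factory.clients.remove(self)

    def lineReceived(self, line):
        pass  # handle results coming back from the client here

class CommandFactory(Factory):
    protocol = CommandProtocol

    def __init__(self):
        self.clients = []

    def push(self, line):
        # Can be called from anywhere on the server side, e.g. a timer or
        # another service, whenever the server decides to send data.
        for client in self.clients:
            client.sendLine(line)

factory = CommandFactory()
reactor.listenTCP(9999, factory)
reactor.callLater(5, factory.push, b"do_something")  # push a command after 5 s
reactor.run()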
"I need a way to send commands from the server to the client."
Don't do this. It inverts the usual meaning of "client" and "server". Clients take the active role and send stuff or request stuff from the server.
Is Twisted (or more general any event driven framework) able to solve this problem?
It shouldn't. You're inverting the role of client and server.
If the server decides to send data, it should be able to send it;
False, actually.
The server is constrained to wait for clients to request data. That's generally the accepted meaning of "client" and "server".
"One to send commands to the client and one to transmit the results to the server. Does this solution sound more like a standard client-server communication for you?"
No.
If a client sent messages to a server and received responses from the server, it would meet more usual definitions.
Sometimes, this sort of thing is described as having "Agents" which are -- each -- a kind of server and a "Controller" which is a single client of all these servers.
The controller dispatches work to the agents. The agents are servers -- they listen on a port, accept work from the controller, and do work. Each Agent must do two concurrent things (usually via the select API; see the sketch below):
Monitor a well-known socket on which it will receive work from the one-and-only client.
Do the work (in the background).
This is what Client-Server usually means.
If each Agent is a Server, you'll find lots of libraries will support this. This is the way everyone does it.
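A rough sketch of that two-things-at-once agent loop using select (the port and the do_background_work() placeholder are assumptions):

# Sketch: an agent that listens on a well-known port for work from the
# controller while making progress on its own work between polls.
import select
import socket

def do_background_work():
    pass  # one small slice of the agent's real work goes here

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(("0.0.0.0", 7000))
listener.listen(5)

connections = []
while True:
    readable, _, _ = select.select([listener] + connections, [], [], 0.1)
    for sock in readable:
        if sock is listener:
            conn, addr = listener.accept()            # the controller connected
            connections.append(conn)
        else:
            data = sock.recv(4096)
            if not data:                              # controller went away
                connections.remove(sock)
                sock.close()
            else:
                print("work received: %r" % (data,))  # hand off to the real handler
    do_background_work()                              # keep working between polls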
