Django WebSocket: send text and bytes at the same time - python

I have a client and a server in my project. On the client side, the user uploads their own Excel file, and the file is sent to my server for processing. My artificial intelligence Python code runs on the server and makes changes to the Excel file. Each time it makes a change, I want to send the updated version to the client so the client can see the change live. For example, let's say I have 10 functions on the server side and each function changes some cells in the Excel file (I can get the indexes of the changed cells). When each function finishes, I will send the changed indexes to the client, and those cells will be updated in the table on the client (C++, Qt).
At first I built the server with PHP, but calling my artificial intelligence Python code externally (shell_exec) was not a good approach. That's why I want to build the server part in Python as well.
Is Django the best way for me?
What I've tried with Django:
I wanted to send data continuously from the server to the client with a StreamingHttpResponse object, but even though I used iter_content to receive the incoming data on the client, everything arrived in one burst only after all the server code had finished. When I set the chunk_size of iter_content to a small value, the data did arrive immediately, but the chunks were cut off mid-message, so I never got a complete word.
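The client side consumed the stream roughly like this (a minimal sketch using the requests library; the URL is just an example):

import requests

# Stream the response; with a small chunk_size the bytes arrive promptly,
# but a chunk boundary can fall in the middle of a message, which is why
# the text comes through cut off mid-word.
with requests.get("http://localhost:8000/progress/", stream=True) as resp:
    for chunk in resp.iter_content(chunk_size=64):
        print(chunk.decode("utf-8", errors="replace"), end="")

So I decided to use WebSockets.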
I have a problem with WebSockets: I can't send text and byte data at the same time. While the client is uploading the Excel file, I need to send some text data as parameters to my server along with it.
Waiting for your help, thank you!

You can send the bytes as a hexadecimal string.
Check this out: binascii.hexlify
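For example, a minimal sketch (the parameter names and file bytes here are just placeholders); hex-encoding lets the file bytes and the text parameters travel together in a single JSON text frame:

import binascii
import json

excel_bytes = b"PK\x03\x04 example"  # stand-in for the real .xlsx bytes

# Hex-encode the bytes so they fit in one JSON text message
# alongside the text parameters.
message = json.dumps({
    "params": {"sheet": "Sheet1"},
    "data": binascii.hexlify(excel_bytes).decode("ascii"),
})
# ... send `message` over the websocket as a single text frame ...

# On the receiving side, reverse the encoding:
received = json.loads(message)
original_bytes = binascii.unhexlify(received["data"])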

Related

Export GNU Radio data to create a WebSDR

I am an intern working on a project to create a WebSDR. I need to create a web interface that lets users observe the activity on a frequency range of their choice (with a waterfall graph) and also transmit the sound on the chosen frequency.
[image: example of a WebSDR]
For this purpose we have an SDR connected to a local server running GNU Radio. (I am showing the block diagram as it is now; it is obviously not final.)
[image: global architecture]
I then wrote server code in Python that retrieves the data sent over UDP via the "UDP Sink" block and, for the moment, simply forwards it as text to the JavaScript client code to display on an HTML page. (I can share the code if needed.)
[image: server]
[image: client.js and client.html]
I'm stuck now and can't find any resources online for the rest. I would like to process the data on the server to create an audio stream that would be streamed to the web client. I also need a way to create the waterfall graphic, which I would convert to an image and send to the client every second to give the impression that the waterfall refreshes regularly.
Can you please give me some guidance on building these two features? I am open to other proposals as well if the method I want to apply is not the right one.
Thank you very much,
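For the waterfall side, one possible starting point is to read the raw samples from the UDP Sink and turn each buffer into one FFT row; a minimal sketch, assuming the UDP Sink sends interleaved float32 I/Q samples on port 2000 (the port, buffer sizes, and sample format are guesses):

import socket
import numpy as np

UDP_PORT = 2000   # assumed port of the GNU Radio "UDP Sink" block
FFT_SIZE = 1024

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", UDP_PORT))

while True:
    # Each complex sample is two float32 values (I and Q), 8 bytes total.
    data, _ = sock.recvfrom(8 * FFT_SIZE)
    iq = np.frombuffer(data, dtype=np.float32)
    samples = iq[0::2] + 1j * iq[1::2]

    # One row of the waterfall: FFT magnitude in dB, DC bin centred.
    spectrum = np.fft.fftshift(np.fft.fft(samples, FFT_SIZE))
    row_db = 20 * np.log10(np.abs(spectrum) + 1e-12)
    # Append row_db to an image buffer here and render/send it to the
    # web client once per second.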

How to transfer data between a SwiftNIO TCP server and a Python-based TCP client?

I have a TCP server written in SwiftNIO, based on this documentation.
I want my client to be written in Python, from which I can send multiple JSON strings and receive similar/different multiple JSON strings as responses periodically for a few minutes.
Into which format do I need to convert those JSON strings on the Python client, and how do I receive the same JSON strings on the SwiftNIO server (and vice versa)?
If I were you, I'd use HTTP, with the Vapor web server on the Swift side and any Python HTTP library such as requests on the client. If you do that, your job will be pretty straightforward. The Vapor community is also super helpful in their Discord chat.
If you really want to do this with a low-level library like SwiftNIO, that's of course possible, but you'll need to design a so-called "wire protocol" for the framing (i.e. where one JSON message starts and ends). SwiftNIO is very well equipped for this, but you'll likely need to learn a bunch of things.
You could, for example, use NIO Extras' LineBasedFrameDecoder and send each JSON message (make sure it doesn't contain newlines) followed by a \n. Or you could prepend each JSON message with, say, a 32-bit length field (which you could decode using the LengthFieldBasedFrameDecoder). There are many options...
You could also implement JSON-RPC; you can get some insight from this example, which is also explained in this talk.
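On the Python side, the newline-delimited variant could look like this (a minimal sketch; the host, port, and message shape are assumptions):

import json
import socket

HOST, PORT = "127.0.0.1", 9999  # wherever the SwiftNIO server listens

with socket.create_connection((HOST, PORT)) as sock:
    # One JSON document per line, matching LineBasedFrameDecoder.
    request = {"action": "ping", "payload": 42}
    sock.sendall((json.dumps(request) + "\n").encode("utf-8"))

    # Read until one complete line has arrived, then decode it.
    buf = b""
    while b"\n" not in buf:
        chunk = sock.recv(4096)
        if not chunk:
            break
        buf += chunk
    line, _, _ = buf.partition(b"\n")
    response = json.loads(line)
    print(response)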

Running a Python script concurrently based on a trigger

What would be the best way to solve the following problem with Python?
I have a real-time data stream coming into my object storage from a user application (JSON files being stored in Amazon S3).
Upon receipt of each JSON file, I have to process the data in the file within a certain time (1 second in this instance) and generate a response that is sent back to the user. The data is processed by a simple Python script.
My issue is that the real-time stream can generate even a few hundred JSON files from user applications at the same time, all of which I need to run through my Python script, and I don't know the best way to approach this.
I understand that one way to tackle this would be trigger-based Lambdas that execute a job on top of every file as it is uploaded from the real-time stream, in a serverless environment; however, this option seems quite expensive compared to having a single server instance running and somehow triggering the jobs inside it.
Any advice is appreciated. Thanks.
Serverless can actually be cheaper than using a server. It is much cheaper when there are periods of no activity because you don't need to pay for a server doing nothing.
The hardest part of your requirement is sending the response back to the user. If an object is uploaded to S3, there is no easy way to send back a response, and it isn't even obvious which user sent the file.
You could process the incoming file and then store a response back in a similarly-named object, and the client could then poll S3 for the response. That requires the upload to use a unique name that is somehow generated.
An alternative would be for the data to be sent to AWS API Gateway, which can trigger an AWS Lambda function and then directly return the response to the requester. No server required, automatic scaling.
If you wanted to use a server, then you'd need a way for the client to send a message to the server with a reference to the JSON object in S3 (or with the data itself). The server would need to be running a web server that can receive the request, perform the work and provide back the response.
Bottom line: Think about the data flow first, rather than the processing.
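As a minimal sketch of the S3-trigger variant described above (the incoming/ and responses/ key prefixes and the process() body are placeholders):

import json
import urllib.parse

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Invoked by an S3 "ObjectCreated" event for each uploaded JSON file.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        data = json.loads(body)

        result = process(data)  # your existing processing script

        # Store the response under a predictable name so the client
        # can poll S3 for it.
        s3.put_object(
            Bucket=bucket,
            Key=key.replace("incoming/", "responses/"),
            Body=json.dumps(result).encode("utf-8"),
        )

def process(data):
    # Placeholder for the real processing logic.
    return {"ok": True, "fields": len(data)}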

Querying objects from MySQL with Python

Since I can't explain clearly what I don't understand, I'll use an example.
Let's say I have a client application and a server application. The server waits, and the client sends some keyword to the server so the server knows what should be queried. Let's say the client requests a product object; the server queries the database and gets back the row the client needs as a result set. So every time I need some object, I have to send it to the client in the form of a string and then instantiate it?
Am I missing something? Isn't it expensive to instantiate objects on every query?
TIA!
Your question is very vague and doesn't really ask anything, but I'll try to give you a generic answer on how a server and client interact.
When a user requests an item in the client, the client should call an API on the server, something like http://example.com/search?param=test. The client can use this API in either an AJAX call or a direct call.
The server should parse the param, connect to the database, retrieve the requested item, and return it to the client. The most common data formats for this exchange are JSON and plain text.
The client will then parse either of these formats, generate objects from the data if required, and finally show the user the requested data.
If this is not what you need, please update your question to ask specifically about the issue you have, and maybe provide some code where you have the issue; I'll update my answer accordingly.
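A minimal sketch of that flow, using Flask and MySQL Connector/Python as one possible choice (the table, columns, and credentials are placeholders):

from flask import Flask, jsonify, request
import mysql.connector  # MySQL Connector/Python

app = Flask(__name__)

@app.route("/search")
def search():
    param = request.args.get("param", "")

    conn = mysql.connector.connect(
        host="localhost", user="app", password="secret", database="shop"
    )
    try:
        cur = conn.cursor(dictionary=True)  # rows come back as dicts
        cur.execute(
            "SELECT id, name FROM products WHERE name LIKE %s",
            ("%" + param + "%",),
        )
        rows = cur.fetchall()
    finally:
        conn.close()

    # The client parses this JSON and builds its own objects from it.
    return jsonify(rows)

if __name__ == "__main__":
    app.run()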
MySQL Server uses a custom protocol over TCP. If you don't want to use any library, you will have to parse the TCP messages yourself. MySQL Connector/Python does exactly that - you can look at its source code if you wish.

Server-Sent Events (SSE) in Google App Engine

Does GAE support Server Sent Events (SSE)?
I tried using SSE but it did not work, so I switched to the Channel API. But is it still possible to implement SSE in GAE?
I've been trying like crazy to pull this one off, but the GAE response is being buffered and compressed.
I'll be very happy if someone has an idea how to write the code/headers so that the PHP file is streamed.
FYI, these are the headers I'm using:
header("Content-Type: text/event-stream; charset=utf-8");
header("Accept-Encoding: identity");
header("Cache-Control: no-cache");
header("Access-Control-Allow-Origin: https://mail.google.com");
header("Access-Control-Allow-Credentials: true");
header("Access-Control-Allow-Methods: PUT, GET, POST, DELETE, OPTIONS");
[UPDATE]
From: http://grokbase.com/t/gg/google-appengine/15623azjjf/server-sent-events-using-channels-api
What this means in practice is that your stream will not be "keep-alive" and will close each time one response is sent. Or, if you implement your server-sent event code server-side as most people do, it will buffer up all of its responses and finally send them all only when it terminates.
Please read: https://cloud.google.com/appengine/docs/php/requests#PHP_Responses
In summary: there is no way to do SSE using GAE.
