Streaming video file from flask server to react js client - python

I am creating a "YouTube clone" using Python Flask on the backend and React JS on the client side, and I currently need to stream video from a file on the server.
This is what I currently have:
```
@app.route('/api/video/<string:video_id>')
def ShowVideo(video_id):
    video = Video.query.filter_by(id=video_id).first()  # "Video" is the video model (using SQLAlchemy)
    if video is None:
        return {"return": "404"}, 404  # video doesn't exist in the database
    # return ??
```
Now I have the video id and path in the database, and I need to stream the video over HTTP to a React JS website.
I've seen people use OpenCV to stream and display the frames as an image in the JS client, but from what I've seen it doesn't work with a video file already on disk (I may be mistaken; I'm not sure).
Another thing I've seen is using JS to send the video directly to the browser (like in this video), but I don't think that will work in my circumstances.
Does anyone have any idea?
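One common approach (a sketch, not from the thread) is to serve the file with support for HTTP `Range` requests, which is what the browser's `<video>` element sends when seeking; the React side then only needs `<video src="/api/video/demo" controls />`. The `VIDEO_PATHS` lookup below is a hypothetical stand-in for the SQLAlchemy `Video` query in the question:

```python
import os
import re

from flask import Flask, Response, abort, request

app = Flask(__name__)

# Hypothetical lookup: maps a video id to a path on disk. In the real
# app this would come from the Video model mentioned in the question.
VIDEO_PATHS = {"demo": "demo.mp4"}


@app.route('/api/video/<string:video_id>')
def show_video(video_id):
    path = VIDEO_PATHS.get(video_id)
    if path is None or not os.path.isfile(path):
        abort(404)  # video doesn't exist
    file_size = os.path.getsize(path)
    range_header = request.headers.get('Range')
    if range_header:
        # Parse "bytes=start-end"; the end may be omitted.
        m = re.match(r'bytes=(\d+)-(\d*)', range_header)
        start = int(m.group(1))
        end = int(m.group(2)) if m.group(2) else file_size - 1
        with open(path, 'rb') as f:
            f.seek(start)
            chunk = f.read(end - start + 1)
        resp = Response(chunk, 206, mimetype='video/mp4')
        resp.headers['Content-Range'] = f'bytes {start}-{end}/{file_size}'
    else:
        with open(path, 'rb') as f:
            resp = Response(f.read(), 200, mimetype='video/mp4')
    resp.headers['Accept-Ranges'] = 'bytes'
    return resp
```

For files on local disk, `flask.send_file(path, mimetype='video/mp4', conditional=True)` handles the same `Range` logic for you; the manual version above just shows what is going on.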

Related

Video streaming after image processing from server to web client

I am still a beginner and I am trying to develop a home surveillance application with facial recognition. The main idea is to stream the image from the server to the web browser together with additional data, such as the name of the detected person (or "unknown"), based on detection with OpenCV, plus other information from a database when the person is recognized (sex, age, etc.).
I have something that works, but I don't know how efficient it is, so I would like some opinions.
I am using Flask as the web server and Socket.IO for transferring real-time data.
The app runs as follows: a function, imageProcessing(), starts in a thread at launch; it takes frames from the camera, does all the image processing (face detection and recognition), and stores the current frame and the array of recognized/unknown persons in global variables. Then the web server starts, and once the connection between web client and web server is established, the frame and the array of persons are sent to the web client over WebSockets, along with the additional information. The delay is less than a second, and everything runs in the same app. So the questions are:
Is it efficient to have a thread doing image processing running inside the server app?
Is it efficient and safe to use a global variable that imageProcessing() writes the camera image into while socket.emit() simultaneously reads it to send the frame to the web client?
Here is part of the main app's code. The imageProcessing() function just stores its results in global variables, so I did not include it here.
```
import threading
import time

import eventlet
from flask import Flask, render_template, request
from flask_socketio import SocketIO

import camera  # file containing the imageProcessing function and the frame global variable

cameraThread = threading.Thread(target=camera.imageProcessing)
cameraThread.start()

eventlet.monkey_patch()

app = Flask(__name__)
socketio = SocketIO(app, logger=True, async_mode='eventlet')


@socketio.on('connect', namespace='/web')
def connect_web():
    print('[INFO] Web client connected: {}'.format(request.sid))


@socketio.on('message', namespace='/web')
def handle(msg):
    while True:
        message = {'image': camera.encode_image(camera.frame), 'person': camera.person}
        time.sleep(0.1)
        socketio.emit('stream_response', message, namespace='/web', broadcast=True)


@socketio.on('disconnect', namespace='/web')
def disconnect_web():
    print('[INFO] Web client disconnected: {}'.format(request.sid))


@app.route('/', methods=['POST', 'GET'])
def index():
    return render_template('index.html', async_mode=socketio.async_mode)


if __name__ == '__main__':
    socketio.run(app, host="127.0.0.1", port=5000)
```
Initially I tried to make a separate client where the images were taken from the cameras and processed, with the result then sent via WebSockets to the web server and from there to the web client, but when doing so I had a delay of ~15 seconds. I suppose this approach of keeping the web server independent from the image processing app should be better in theory for various reasons, but I do not know why I get such a delay.
Should I still keep the image processing client -> web server -> web client separation? If so, any suggestions for an alternative technology, given the delay I get with socket.io in this scenario?
How would you develop such an app from an architecture design point of view? E.g. would you use WebSockets for streaming video or some other technology? Would you place image processing in a separate client/app and transfer the result to the web server, or do all processing on the same server?
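On the global-variable question: one common alternative (a sketch, not from the thread) is a small holder object guarded by a lock, so the processing thread publishes the frame and detections atomically and the emit loop always reads a consistent pair instead of touching bare globals:

```python
import threading


class LatestFrame:
    """Holds the most recent frame and detection results. Reads and
    writes are guarded by a lock, so the socket loop never sees a
    half-updated frame/persons pair."""

    def __init__(self):
        self._lock = threading.Lock()
        self._frame = None
        self._persons = []

    def publish(self, frame, persons):
        # Called by the image-processing thread.
        with self._lock:
            self._frame = frame
            self._persons = list(persons)

    def snapshot(self):
        # Called by the socket loop; returns copies, so later
        # mutation by the caller cannot affect the stored state.
        with self._lock:
            return self._frame, list(self._persons)
```

With this, `imageProcessing()` would call `publish()` once per processed frame, and `handle()` would call `snapshot()` instead of reading `camera.frame` and `camera.person` separately (which can otherwise pair a new frame with an old detection list).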

Read audio stream from an Icecast2 URL and re-stream it using Flask

I just want to know how to read or get data from an MP3 stream (URL), and then restream it in Flask.
I already have an Icecast2/DarkIce service serving an MP3 stream.
The purpose is to re-stream that MP3 from my own Flask code, so that this stream and all my other services run on the same Flask server.
Thanks so much in advance!
So I've found a solution, and it's surprisingly easy:
```
from flask import Flask, Response
import requests

app = Flask(__name__)


@app.route("/audio_stream")
def audio_stream():
    r = requests.get("http://localhost:8082/audio_stream.mp3", stream=True)
    return Response(r.iter_content(chunk_size=1024), mimetype='audio/mpeg')
```
Basically I just opened the Icecast2 stream URL, read the data, and returned it with Flask.
I hope this helps somebody.

python lossless audio recording+http streaming library

I am working on a simple service to remotely record line input from an audio interface attached to a server, via REST API request.
My current solution, using PyAudio to manage the audio interface:
1) send HTTP request to start recording to a file on server filesystem.
2) send HTTP request to stop recording and pull the recorded audio file from the server filesystem
Instead, I would like to be able to just "stream" the line input to any http client who wants to download the audio stream.
Is there a simple Python library for lossless HTTP audio streaming directly from an audio interface's input?
More importantly, does this make sense, or should I use RTSP instead? (More than efficiency, I want to be able to download the audio stream via a simple HTTP link in a browser, or via curl or a simple programmatic request, and I'll usually have no more than one connected client at a time; that's why I'd prefer to avoid RTSP.)
I have done this using Python Flask to provide the REST endpoint that streams the audio, and the pyfaac module to pack PCM frames into the AAC format (a format suited to streaming). Then, for example, you use the standard HTML5 audio tag with src set to your streaming endpoint.

Screenshot of flash element in Python

How can I take a screenshot of a Flash website in Python 3.5.1? I tried something like this, but I can't see the video image.
```
from selenium import webdriver


def webshot(url, filename):
    browser = webdriver.PhantomJS()
    browser.get(url)
    browser.save_screenshot(filename)
    browser.quit()


webshot('https://www.youtube.com/watch?v=YQHsXMglC9A', 'screentest.png')
```
Short version: with YouTube's system, if you don't press the "play" button (initiate playback), no video is served. Loading the page in a browser is a form of initiating playback too, but a webshot doesn't fulfill YouTube's requirements, so it won't work.
Long version:
How can I take a screenshot of a Flash website... I tried this but I can't see the video image.
webshot('https://www.youtube.com/watch?v=YQHsXMglC9A', 'screentest.png')
You cannot screenshot YouTube's video player content like this. The way YouTube works, when the video page is ready another PHP file is accessed to determine the video link (e.g. the correct file for the chosen quality settings, etc.). Basically you have to appear to be a browser making an HTTP request to their servers. Their server gives a temporary token to access the video link until the token expires, and there are other issues such as CORS to deal with. Your tool does none of these things.
If only YouTube used a normal `<video>` tag with a simple MP4 link, your code would have worked.
The best you can get is something like the following (note how there are no controls):
```
webshot('https://www.youtube.com/embed/YQHsXMglC9A', 'screentest.png')
```

How to pass image to python script using flask?

I have a website (managed with Python Flask) with images on canvases, and I would like to pass the content of those canvases to another Python script as images.
The other python script is using openCV in order to perform face detection.
I know I could upload the image to my server and then read the file in my OpenCV application, but I would prefer not to save any data on my server.
Do you have any ideas?
You'll need to upload the file to the server either way, because the user's data has to be transferred to your server application.
But instead of saving it as a regular file, you could use something like SpooledTemporaryFile.
In other words, you'll have workflow like this:
Send image with POST to the server;
Read file from POST request with flask;
Write it to SpooledTemporaryFile and receive a file-like object;
Use that file-like object for OpenCV
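The workflow above can be sketched with the standard library alone; the Flask and OpenCV steps are indicated as comments, since the exact field names (`'canvas'` etc.) are hypothetical:

```python
from tempfile import SpooledTemporaryFile


def buffer_upload(stream, max_in_memory=1024 * 1024):
    """Copy an uploaded file-like object into a SpooledTemporaryFile.
    The data stays in RAM until it exceeds max_in_memory bytes, then
    rolls over to an anonymous temp file automatically -- nothing is
    ever saved under your own server paths."""
    tmp = SpooledTemporaryFile(max_size=max_in_memory)
    for chunk in iter(lambda: stream.read(64 * 1024), b''):
        tmp.write(chunk)
    tmp.seek(0)  # rewind so the consumer can read from the start
    return tmp


# In the Flask view this would look something like (hypothetical names):
#   tmp = buffer_upload(request.files['canvas'].stream)
#   data = np.frombuffer(tmp.read(), dtype=np.uint8)
#   img = cv2.imdecode(data, cv2.IMREAD_COLOR)
```

On the client side, `canvas.toBlob()` (or `toDataURL()`) produces the PNG/JPEG bytes to POST to the endpoint.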
