How to stream live video frames from client to flask server and back to the client?

I am trying to build a client-server architecture where I capture live video from the user's webcam using getUserMedia(). Instead of showing the video directly in a <video> tag, I want to send it to my Flask server, do some processing on the frames, and send them back to my web page.

I have used Socket.IO to create the client-server connection. This is the script in my index.html. Please pardon my mistakes or any wrong code.

<div id="container">
    <video autoplay="true" id="videoElement">

    </video>
</div>
<script type="text/javascript" charset="utf-8">

    var socket = io('http://127.0.0.1:5000');

    // checking for connection
    socket.on('connect', function(){
      console.log("Connected... ", socket.connected)
    });

    var video = document.querySelector("#videoElement");


    // asking permission to access the system camera of user, capturing live 
    // video on getting true.

    if (navigator.mediaDevices.getUserMedia) {
      navigator.mediaDevices.getUserMedia({ video: true })
        .then(function (stream) {

          // instead of showing it directly in <video>, I want to send these frame to server

          //video.srcObject = stream

          //this code might be wrong, but this is what I want to do.
          socket.emit('catch-frame', { image: true, buffer: getFrame() });
        })
        .catch(function (error) {
          console.log(error);
          console.log("Something went wrong!");
        });
    }

    // returns a frame encoded in base64
    const getFrame = () => {
        const canvas = document.createElement('canvas');
        canvas.width = video.videoWidth;
        canvas.height = video.videoHeight;
        canvas.getContext('2d').drawImage(video, 0, 0);
        const data = canvas.toDataURL('image/png');
        return data;
    }


    // receive the frame from the server after processed and now I want display them in either 
    // <video> or <img>
    socket.on('response_back', function(frame){

      // this code here is wrong, but again this is what something I want to do.
      video.srcObject = frame;
    });

</script>

In my app.py -

from flask import Flask, render_template
from flask_socketio import SocketIO, emit

app = Flask(__name__)
socketio = SocketIO(app)

@app.route('/', methods=['POST', 'GET'])
def index():
    return render_template('index.html')

@socketio.on('catch-frame')
def catch_frame(data):

    ## getting the data frames

    ## do some processing 

    ## send it back to client
    emit('response_back', data)  ## ??


if __name__ == '__main__':
    socketio.run(app, host='127.0.0.1')

I have also thought about doing this with WebRTC, but I am only finding code for peer-to-peer connections.

So, can anyone help me with this? Thanks in advance.

asked Nov 19 '19 by akan
2 Answers

So, what I was trying to do is take the real-time video stream captured by the client's webcam and process the frames at the backend.

My backend code is written in Python and I am using Socket.IO to send the frames from the frontend to the backend. You can have a look at this design to get a better idea of what's happening:

  1. My server (app.py) runs in the backend and the client accesses index.html.
  2. A Socket.IO connection is established and the video stream captured by the webcam is sent to the server frame by frame.
  3. Each frame is processed at the backend and emitted back to the client.
  4. The processed frames coming from the server can be shown in an img tag.
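Before the full code, it may help to see the wire format in isolation. Each frame travels as a base64 string; the server decodes it to raw bytes, processes them, and re-encodes the result as a data URL the browser can drop straight into an `<img>` tag. This is a minimal stdlib-only sketch of that round trip (the function names are illustrative, not from the code below):

```python
import base64

def decode_frame(b64_string: str) -> bytes:
    """Decode the base64 payload sent by the client into raw image bytes."""
    return base64.b64decode(b64_string)

def encode_frame(image_bytes: bytes, mime: str = "image/jpeg") -> str:
    """Wrap processed image bytes in a data URL the browser can display."""
    payload = base64.b64encode(image_bytes).decode("utf-8")
    return f"data:{mime};base64,{payload}"

# Round trip: what the client sends comes back wrapped as a data URL.
raw = b"\x89PNG fake image bytes"
data_url = encode_frame(decode_frame(base64.b64encode(raw).decode("utf-8")))
```

In the real handler the "processing" step happens between decode and encode; here the bytes pass through untouched so the round trip is easy to verify.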

Here is the working code -

app.py

import base64
import io

import cv2
import imutils
import numpy as np
from PIL import Image

@socketio.on('image')
def image(data_image):
    # decode the base64 payload and convert it into a PIL image
    b = io.BytesIO(base64.b64decode(data_image))
    pimg = Image.open(b)

    # convert RGB to BGR, since OpenCV uses BGR channel order
    frame = cv2.cvtColor(np.array(pimg), cv2.COLOR_RGB2BGR)

    # Process the image frame
    frame = imutils.resize(frame, width=700)
    frame = cv2.flip(frame, 1)
    imgencode = cv2.imencode('.jpg', frame)[1]

    # base64 encode
    stringData = base64.b64encode(imgencode).decode('utf-8')
    b64_src = 'data:image/jpeg;base64,'
    stringData = b64_src + stringData

    # emit the frame back
    emit('response_back', stringData)
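One subtlety in the handler above: `cv2.imencode` returns a 1-D NumPy array of bytes, and `base64.b64encode` accepts it directly because NumPy arrays expose the buffer protocol, so no explicit `.tobytes()` call is needed. A quick standalone check (assuming NumPy is installed; the byte content is a stand-in, not a real JPEG):

```python
import base64
import numpy as np

# Stand-in for the 1-D uint8 array that cv2.imencode returns.
fake_encoded = np.frombuffer(b"JPEG-ish bytes", dtype=np.uint8)

# b64encode accepts any bytes-like object, so both forms give the same string.
via_array = base64.b64encode(fake_encoded).decode("utf-8")
via_bytes = base64.b64encode(fake_encoded.tobytes()).decode("utf-8")
```

Either form works; passing the array directly just avoids one copy.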

index.html

<div id="container">
    <canvas id="canvasOutput"></canvas>
    <video autoplay="true" id="videoElement"></video>
</div>

<div class='video'>
    <img id="image">
</div>

<script>
    var socket = io('http://localhost:5000');

    socket.on('connect', function(){
        console.log("Connected...!", socket.connected)
    });

    const video = document.querySelector("#videoElement");

    video.width = 500;
    video.height = 375;

    if (navigator.mediaDevices.getUserMedia) {
        navigator.mediaDevices.getUserMedia({ video: true })
        .then(function (stream) {
            video.srcObject = stream;
            video.play();
        })
        .catch(function (error) {
            console.log(error);
            console.log("Something went wrong!");
        });
    }

    // note: cv.Mat and cv.VideoCapture come from OpenCV.js, which must be loaded separately
    let src = new cv.Mat(video.height, video.width, cv.CV_8UC4);
    let dst = new cv.Mat(video.height, video.width, cv.CV_8UC1);
    let cap = new cv.VideoCapture(video);

    const FPS = 22;

    setInterval(() => {
        cap.read(src);

        var type = "image/png";
        var data = document.getElementById("canvasOutput").toDataURL(type);
        data = data.replace('data:' + type + ';base64,', ''); // split off the junk at the beginning

        socket.emit('image', data);
    }, 10000 / FPS); // note: 10000/22 ≈ 455 ms per frame, i.e. roughly 2 fps in practice


    socket.on('response_back', function(image){
        const image_id = document.getElementById('image');
        image_id.src = image;
    });

</script>

Also, note that getUserMedia only works on secure origins (HTTPS or localhost).

answered Nov 19 '22 by akan


I had to tweak your solution a bit:

I commented out the three cv variables and the cap.read(src) statement, and modified the following line

var data = document.getElementById("canvasOutput").toDataURL(type);

to

        var video_element = document.getElementById("videoElement")
        var frame = capture(video_element, 1)
        var data = frame.toDataURL(type);

using the capture function from here: http://appcropolis.com/blog/web-technology/using-html5-canvas-to-capture-frames-from-a-video/

I'm not sure if this is the right way to do it, but it happened to work for me.

Like I said, I'm not super comfortable with JavaScript, so instead of manipulating the base64 string in JavaScript, I'd much rather send the whole data URL from JavaScript and parse it in Python, this way:

# Important to only split once
headers, image = base64_image.split(',', 1) 

My takeaway from this, at the risk of sounding circular, is that you can't directly pull an image string out of a canvas that contains a video element; you need to create a new canvas onto which you draw a 2D image of the frame you capture from the video element.

answered Nov 19 '22 by fibonachoceres