
Live Video Stream on a Node.js Server

I have been researching this a lot, but I'm frustrated because I feel like the solution should be simple even though I know it won't be. Ideally I'd just use Node to host the server, WebRTC's getUserMedia to capture the live stream on the local client, and something like socket.io to send the stream to the server, which would then broadcast the stream to the remote client; as if it were a simple messaging chat app.

Thinking about this some more, it seems an approach this simple is impossible, because live video requires a continuous flow of large amounts of data, which is not the same as sending a single message, or even a file, after an event (the send button being pressed).

Maybe I am wrong, however: can a live video stream app follow the same structure as a Node/socket.io messenger app? Would you send the media object returned from getUserMedia, a Blob, or some binary data somehow? (I've tried all of these, but perhaps not correctly.)

The ideal goal would be an app with as little extra fluff as necessary: as few npm installs and extra JavaScript libraries as possible, and as little worrying about encoding/decoding or whatever the hell ICE or STUN are. Is there any way this is possible, or am I asking for too much?

Ideal Client

    var socket = io();
    var local = document.getElementById("local_video");
    var remote = document.getElementById("remote_video");

    // display local video
    navigator.mediaDevices.getUserMedia({video: true, audio: true}).then(function(stream) {
      local.src = window.URL.createObjectURL(stream);
      socket.emit("stream", stream);
    }).catch(function(err) {
      console.log(err);
    });

    // displays remote video
    socket.on("stream", function(stream) {
      remote.src = window.URL.createObjectURL(stream);
    });

Ideal Server

    var app = require("express")();
    var http = require("http").Server(app);
    var fs = require("fs");
    var io = require("socket.io")(http);

    app.get('/', onRequest);
    http.listen(process.env.PORT || 3000, function() {
      console.log('server started');
    });

    // 404 response
    function send404(response) {
      response.writeHead(404, {"Content-Type": "text/plain"});
      response.write("Error 404: Page not found");
      response.end();
    }

    function onRequest(request, response) {
      if (request.method == 'GET' && request.url == '/') {
        response.writeHead(200, {"Content-Type": "text/html"});
        fs.createReadStream("./index.html").pipe(response);
      } else {
        send404(response);
      }
    }

    io.on('connection', function(socket) {
      console.log("a user connected");
      socket.on('stream', function(stream) {
        socket.broadcast.emit("stream", stream);
      });
      socket.on('disconnect', function() {
        console.log("user disconnected");
      });
    });

This is the broken app in action: https://nodejs-videochat.herokuapp.com/

This is the broken code on GitHub: https://github.com/joshydotpoo/nodejs-videochat

asked Mar 15 '17 by joshy.poo



2 Answers

Let me try to be clear and specific. First, you are not actually using WebRTC here. getUserMedia() is part of the browser's media capture Web API, and all it does is get a media stream from the camera.

Using WebRTC means using ICE with STUN/TURN servers for connectivity, plus a signaling channel. Your host server (Node) would supply the ICE configuration, identify each user, and provide a way for users to call each other, as sketched below; the media itself then flows between the peers.
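
To make that concrete, here is a minimal sketch of the client side of a WebRTC connection. The public Google STUN URL is just a commonly used example (an assumption, not a requirement), and the offer/answer signaling is omitted:

    // Minimal sketch of the WebRTC client side (signaling omitted).
    // The STUN server below is a commonly used public example, not a
    // requirement; TURN servers would additionally need credentials.
    var pc = new RTCPeerConnection({
      iceServers: [{ urls: "stun:stun.l.google.com:19302" }]
    });

    navigator.mediaDevices.getUserMedia({ video: true, audio: true })
      .then(function(stream) {
        // Tracks attach to the peer connection; the media itself
        // then flows peer-to-peer, never through your Node server.
        stream.getTracks().forEach(function(track) {
          pc.addTrack(track, stream);
        });
      });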

If you want to stream the media through your host instead, you should probably send it in chunks and coordinate those chunks yourself. You can use the Stream API with socket.io to stream the data in chunks (packets). See here: Stream API (socket.io)
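
As a rough sketch of that chunked approach: the sender uses MediaRecorder to cut the stream into Blobs, the server relays them, and the receiver feeds them into a MediaSource. The "chunk" event name, the 100ms timeslice, and the webm/VP8 codec are assumptions for this example; a real app would also need to queue chunks and handle viewers who join mid-stream:

    // Sender: cut the camera stream into small webm chunks and emit each one.
    navigator.mediaDevices.getUserMedia({ video: true })
      .then(function(stream) {
        var recorder = new MediaRecorder(stream, { mimeType: "video/webm; codecs=vp8" });
        recorder.ondataavailable = function(event) {
          if (event.data.size > 0) socket.emit("chunk", event.data);
        };
        recorder.start(100); // emit a chunk roughly every 100ms
      });

    // Server: relay chunks to everyone else (socket.io carries binary natively).
    io.on("connection", function(socket) {
      socket.on("chunk", function(chunk) {
        socket.broadcast.emit("chunk", chunk);
      });
    });

    // Receiver: feed chunks into a MediaSource attached to the remote <video>.
    // Note: chunks only decode if the receiver has seen the stream from the
    // beginning, and this sketch simply drops chunks while the buffer is busy.
    var mediaSource = new MediaSource();
    remote.src = URL.createObjectURL(mediaSource);
    mediaSource.addEventListener("sourceopen", function() {
      var buffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
      socket.on("chunk", function(chunk) {
        if (!buffer.updating) buffer.appendBuffer(new Uint8Array(chunk));
      });
    });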

Also, you can check out the live example of WebRTC + Socket.io here: Socket.io | WebRTC Video Chat

You can find more information here: sending a media stream to a host server

answered Oct 09 '22 by Gaurav Chaudhary


This topic is about a Node server supporting Live Streaming or Video Chat, and it's much more complex than you might think, so let me illustrate. Both Live Streaming and Video Chat can use WebRTC, but WebRTC is not required for Live Streaming. Both need some Node server to support the signaling and the streaming.

If you want to publish your camera as a live stream and forward it to many viewers, perhaps thousands of players, that is Live Streaming. Latency is not very critical; generally 3~10s is OK.

If you want to talk to each other, using your cameras and forwarding each stream to the other users, that is Video Chat. Latency/lag is very sensitive here: it MUST be under 400ms, generally around 200ms.

They are totally different, so let's discuss them separately.

Live Streaming

The keys for live streaming are cross-platform support (both H5 and mobile), fluency without buffering, and fast startup when switching between streams. The streaming architecture looks like below:

Publisher ---> Server/CDN ---> Player 

Let's talk about the player first. HLS (or LL-HLS) is the premier delivery protocol: it's widely used and works well on H5 (both PC and mobile) and natively on mobile (both iOS and Android). The only problem is the latency, about 5~10s or even larger, because it is a file-based protocol.

For Chrome, it's also OK to use hls.js to play HLS, via MSE (Media Source Extensions).
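
For example, a minimal hls.js setup looks roughly like this; the .m3u8 URL and the element id are placeholders:

    // Minimal hls.js playback sketch; the .m3u8 URL and element id are placeholders.
    var video = document.getElementById("player");
    if (Hls.isSupported()) {
      var hls = new Hls();
      hls.loadSource("https://example.com/live/stream.m3u8");
      hls.attachMedia(video);
    } else if (video.canPlayType("application/vnd.apple.mpegurl")) {
      video.src = "https://example.com/live/stream.m3u8"; // Safari plays HLS natively
    }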

Another protocol with lower latency (3~5s) is HTTP-FLV. It's supported on all PC H5 browsers via flv.js, on mobile via ijkplayer, and some CDNs also support this protocol. The only problem is that it's not friendly for mobile H5.
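
Similarly, a minimal flv.js player, again with a placeholder stream URL and element id:

    // Minimal flv.js playback sketch; the stream URL and element id are placeholders.
    if (flvjs.isSupported()) {
      var player = flvjs.createPlayer({
        type: "flv",
        isLive: true,
        url: "https://example.com/live/stream.flv"
      });
      player.attachMediaElement(document.getElementById("player"));
      player.load();
      player.play();
    }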

WebRTC is also OK as a player, and it works well on PC H5 such as Chrome. The problem is mobile: it's very hard to run a native WebRTC player. Besides the complexity, you also need a signaling server, which is used to exchange the SDP.

For the publisher, it's more complex because it depends on your client:

  • If it's an H5 publisher, only WebRTC is available, so you need a server to convert WebRTC to a protocol the player can consume. I recommend SRS.
  • If it's a native mobile publisher, I recommend FFmpeg; there are lots of libraries and bindings (see the sketch after this list). Any RTMP server is OK, including some Node servers.
  • If it's a TV device, it may use SRT, and you also need a server to convert it. Again, I recommend SRS.
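
To illustrate the FFmpeg route, here is a sketch of a Node process that spawns ffmpeg to publish to an RTMP server. The input file and the RTMP URL are placeholders; capturing a real camera needs platform-specific input devices:

    // Sketch: spawn ffmpeg from Node to publish to an RTMP server.
    var spawn = require("child_process").spawn;

    var ffmpeg = spawn("ffmpeg", [
      "-re",                          // read input at its native frame rate
      "-i", "input.mp4",              // placeholder input; a camera device in practice
      "-c:v", "libx264",              // H.264 video, as RTMP/FLV expects
      "-c:a", "aac",                  // AAC audio, as RTMP/FLV expects
      "-f", "flv",                    // RTMP carries the FLV container
      "rtmp://localhost/live/stream"  // placeholder endpoint, e.g. an SRS server
    ]);

    ffmpeg.stderr.on("data", function(data) {
      console.log(data.toString()); // ffmpeg writes its progress log to stderr
    });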

Ultimately, the live streaming ecosystem is based on C/C++: FFmpeg, WebRTC, and SRS are all written in C/C++. However, there are some servers in Node.js, and you can find them by protocol, for example by searching for nodejs rtmp.

Video Chat

Latency is the most important feature for video chat, so you must use WebRTC on the client, for both the publisher and the player.

There are different servers for video chat:

  • A room server, acting as the signaling channel to exchange SDP between clients (see the sketch after this list), and to manage the rooms and users: kicking a user out, muting a microphone, etc.
  • An SFU (or MCU) server, to deliver the media streams to all clients. There are several SFUs, such as SRS, Janus, or mediasoup.
  • CDN: few CDNs support WebRTC today, but QUIC is being developed as the transport for both WebRTC and HTTP/3, so this might improve in the future. For now, you could look for a WebRTC cloud service.
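
To make the room server concrete, here is a sketch of SDP/ICE signaling relayed over socket.io. The event names and the single implicit room are assumptions for this example, not a standard:

    // Sketch: a minimal signaling relay over socket.io. The event names
    // and the single implicit room are assumptions for this example.
    io.on("connection", function(socket) {
      socket.on("offer", function(sdp) {
        socket.broadcast.emit("offer", sdp);            // caller's SDP offer
      });
      socket.on("answer", function(sdp) {
        socket.broadcast.emit("answer", sdp);           // callee's SDP answer
      });
      socket.on("candidate", function(candidate) {
        socket.broadcast.emit("candidate", candidate);  // ICE candidates
      });
    });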

As I said, it's very complicated to build a WebRTC system, so please think about your scenario again and again: do you really need a WebRTC system, or do you just need to publish a live stream using WebRTC?

If you're not sure, try the live streaming solution first; it's much simpler and more stable.

answered Oct 09 '22 by Winlin