 

WebRTC: How do I stream Client A's video to Client B?

Tags:

webrtc

I am looking into WebRTC but I feel like I'm not understanding the full picture. I'm looking at this demo project in particular: https://github.com/oney/RCTWebRTCDemo/blob/master/main.js

I'm having trouble understanding how I can match 2 clients so that Client A can see Client B's video stream and vice versa.

This is in the demo:

function getLocalStream(isFront, callback) {
  MediaStreamTrack.getSources(sourceInfos => {
    console.log(sourceInfos);
    let videoSourceId;
    for (let i = 0; i < sourceInfos.length; i++) { // let, not const: the counter is reassigned
      const sourceInfo = sourceInfos[i];
      if(sourceInfo.kind == "video" && sourceInfo.facing == (isFront ? "front" : "back")) {
        videoSourceId = sourceInfo.id;
      }
    }
    getUserMedia({
      audio: true,
      video: {
        mandatory: {
          minWidth: 500, // Provide your own width, height and frame rate here
          minHeight: 300,
          minFrameRate: 30
        },
        facingMode: (isFront ? "user" : "environment"),
          optional: [{ sourceId: videoSourceId }] // use the source id found in the loop above
      }
    }, function (stream) {
      console.log('got local stream', stream);
      callback(stream);
    }, logError);
  });
}

and then it's used like this:

socket.on('connect', function(data) {
  console.log('connect');
  getLocalStream(true, function(stream) {
    localStream = stream;
    container.setState({selfViewSrc: stream.toURL()});
    container.setState({status: 'ready', info: 'Please enter or create room ID'});
  });
});

Questions:

  1. What exactly is MediaStreamTrack.getSources doing? Is this because devices can have multiple video sources (e.g. 3 webcams)?

  2. Doesn't getUserMedia just turn on the client's camera? In the code above isn't the client just viewing a video of himself?

I'd like to know how I can pass Client A's URL of some sort to Client B so that Client B streams the video coming from Client A. How do I do this? I'm imagining something like this:

  1. Client A enters, joins room "abc123". Waits for another client to join
  2. Client B enters, also joins room "abc123".
  3. Client A is signaled that Client B has entered the room, so he makes a connection with Client B
  4. Client A and Client B start streaming from their webcam. Client A can see Client B, and Client B can see Client A.

How would I do it using the WebRTC library? (You can assume that the backend server for room matching already exists.)

bigpotato asked Nov 23 '16


2 Answers

The process you are looking for is called JSEP (JavaScript Session Establishment Protocol), and it can be divided into the three steps described below. These steps start once both clients are in the room and can communicate through WebSockets. I will use ws as an imaginary WebSocket API for communication between each client and the server, which relays messages to the other client:
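As a rough idea of what that imaginary ws helper could look like, here is a hypothetical thin wrapper over a plain browser WebSocket; the message format and the relaying server are assumptions, not part of the original demo:

// Hypothetical helper assumed by the snippets below:
// ws.send(type, payload) and ws.on(type, handler).
var socket = new WebSocket('wss://example.com/room/abc123'); // placeholder URL
var handlers = {};

var ws = {
  send: function (type, payload) {
    // The server is assumed to relay this message to the other peer in the room
    socket.send(JSON.stringify({ type: type, payload: payload }));
  },
  on: function (type, handler) {
    handlers[type] = handler;
  }
};

socket.onmessage = function (event) {
  var msg = JSON.parse(event.data);
  if (handlers[msg.type]) handlers[msg.type](msg.payload);
};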

1. Invite

During this step, one designated caller creates an offer and sends it through the server to the other client (the callee):

// This is only in Chrome
var pc = new webkitRTCPeerConnection({iceServers:[{url:"stun:stun.l.google.com:19302"}]}, {optional: [{RtpDataChannels: true}]});

// Someone must be chosen to be the caller
// (e.g. the latest person to join the room)
ws.on('joined', function() {
  pc.createOffer(function (offer) {
    pc.setLocalDescription(offer);
    ws.send('offer', offer);
  });
});

// The callee receives offer and returns an answer
ws.on('offer', function (offer) {
  pc.setRemoteDescription(new RTCSessionDescription(offer));
  pc.createAnswer(function(answer) {
    pc.setLocalDescription(answer);
    ws.send('answer', answer);
  }, err => console.log('createAnswer error', err), {});
});

// The caller receives the answer
ws.on('answer', function (answer) {
  pc.setRemoteDescription(new RTCSessionDescription(answer));
});

Now both sides have exchanged SDP descriptions and are ready to connect to each other.
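For reference, the same exchange with the modern standard API (RTCPeerConnection with urls instead of url, and promises instead of callbacks) would look roughly like this:

// Standard, promise-based equivalent of the snippets above
var pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });

// Caller
ws.on('joined', function () {
  pc.createOffer()
    .then(offer => pc.setLocalDescription(offer))
    .then(() => ws.send('offer', pc.localDescription));
});

// Callee
ws.on('offer', function (offer) {
  pc.setRemoteDescription(new RTCSessionDescription(offer))
    .then(() => pc.createAnswer())
    .then(answer => pc.setLocalDescription(answer))
    .then(() => ws.send('answer', pc.localDescription));
});

// Caller receives the answer
ws.on('answer', function (answer) {
  pc.setRemoteDescription(new RTCSessionDescription(answer));
});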

2. Negotiation (ICE)

ICE candidates are created by each side to find a way to connect to each other; they are essentially network addresses where a peer can be reached: localhost, the local area network address (192.168.x.x) and the external public IP address assigned by the ISP. They are generated automatically by the RTCPeerConnection object.

// Both sides send their candidates as they are generated:
pc.onicecandidate = function (event) {
  if (event.candidate) ws.send('ICE', event.candidate);
};
// And both process the candidates received from the other side:
ws.on('ICE', candidate => pc.addIceCandidate(new RTCIceCandidate(candidate)));

After the ICE negotiation the connection gets established. Peer-to-peer connections can traverse NAT, but they won't work in some scenarios, e.g. when restrictive firewalls sit on both sides of the connection; in those cases the media has to be relayed.
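For those cases you can add a TURN server to the configuration, which relays the media when no direct path can be found; the URL and credentials below are placeholders:

var pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.l.google.com:19302' },
    // Hypothetical TURN relay, used only when a direct p2p path fails
    { urls: 'turn:turn.example.com:3478', username: 'user', credential: 'pass' }
  ]
});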

3. Data streaming

// Once the connection is established we can start to transfer video,
// audio or data

navigator.getUserMedia({ audio: true, video: true }, function (stream) {
  pc.addStream(stream);
}, err => console.log('Error getting User Media', err));

It is a good idea to get the stream before making the call and to add it at an earlier step: before creating the offer on the caller's side, and right after receiving the offer on the callee's side. That way you don't have to deal with renegotiation. This was a pain a few years ago, but it may be better handled now in WebRTC.
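A minimal sketch of that ordering on the caller's side, reusing the pc and ws objects from the earlier snippets:

// Get the local stream and add it to the connection first, so its tracks
// are already described in the SDP when the offer is created
navigator.getUserMedia({ audio: true, video: true }, function (stream) {
  pc.addStream(stream); // legacy API, matching the snippets above
  pc.createOffer(function (offer) {
    pc.setLocalDescription(offer);
    ws.send('offer', offer);
  });
}, err => console.log('Error getting User Media', err));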

Feel free to check out my WebRTC project on GitHub, where I create p2p connections in rooms for many participants; it has a live demo.

Javier Conde answered Oct 17 '22


MediaStreamTrack.getSources is used to enumerate the connected video input devices (e.g. multiple webcams). It is deprecated now, replaced by navigator.mediaDevices.enumerateDevices(). See this Stack Overflow question and the documentation. Also refer to the MediaStreamTrack.getSources demo and code.
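A minimal sketch of the replacement, listing the connected cameras:

// Modern replacement for MediaStreamTrack.getSources
navigator.mediaDevices.enumerateDevices().then(function (devices) {
  var cameras = devices.filter(function (d) { return d.kind === 'videoinput'; });
  console.log(cameras); // each entry has a deviceId (labels require permission)
});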

Yes. getUserMedia just turns on the client's camera (and microphone) and hands back the local stream, so in that snippet the client is only viewing themselves. You can see the demo and code here.
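In a plain browser (outside React Native) the self-view from the question would look roughly like this with the promise-based API; the video element is an assumption:

// Turn on the camera and show the local stream back to the same client
navigator.mediaDevices.getUserMedia({ audio: true, video: true })
  .then(function (stream) {
    // assumes a <video id="selfView" autoplay playsinline> element on the page
    document.querySelector('#selfView').srcObject = stream;
  });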

Please refer to this peer connection sample and its code to stream audio and video between users.

Also take a look at this article on real-time communication with WebRTC.

Rajind Ruparathna answered Oct 17 '22