
Why does video resolution change when streaming from Android via WebRTC

I'm trying to stream at 640x480 from Chrome on Android using WebRTC. The video starts off at that resolution, but then drops to 320x240.

Here are the getUserMedia parameters that are sent:

 "getUserMedia": [
  {
   "origin": "http://webrtc.example.com:3001",
   "pid": 30062,
   "rid": 15,
   "video": "mandatory: {minWidth:640, maxWidth:640, minHeight:480, maxHeight:480}"
  }
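
For context, here's a sketch of the kind of call that produces those constraints, using the legacy mandatory-constraint syntax Chrome accepted at the time (not my exact code; the selector and callbacks are illustrative):

    // Legacy (pre-standard) constraint syntax, as Chrome accepted it circa 2015.
    // Modern code would use navigator.mediaDevices.getUserMedia with
    // { video: { width: { exact: 640 }, height: { exact: 480 } } }.
    navigator.webkitGetUserMedia(
      {
        audio: false,
        video: {
          mandatory: {
            minWidth: 640, maxWidth: 640,
            minHeight: 480, maxHeight: 480
          }
        }
      },
      function (stream) {
        // Show the local 640x480 preview and/or add the stream to the connection.
        document.querySelector('video').srcObject = stream;
      },
      function (err) {
        console.error('getUserMedia failed:', err);
      }
    );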

My question is: why does the resolution fall? It does not happen when I stream from Chrome on my Mac. I would like to make adjustments so that the video resolution does not change.

video frames dumped using ffmpeg

chrome://webrtc-internals text dump

I'm using the Licode WebRTC streaming server, but have also seen the same behavior using Kurento.

asked Jun 09 '15 by Jay Prall


People also ask

How do I improve video quality in WebRTC?

Making a choice between resolution and frame rate: if possible, go for VBR instead of the default CBR in WebRTC. Assuming you're in the talking-heads domain, a higher frame rate is the better selection; 30fps is what we're aiming for, but if the bitrate is low, you will need to lower that as well.
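
If the stack exposes it, that resolution-versus-frame-rate trade-off can also be steered per sender with the (much newer) degradationPreference parameter; a minimal sketch, assuming pc is an RTCPeerConnection with a video track already added (browser support varies):

    // Prefer dropping frame rate over dropping resolution when the encoder is
    // constrained ('maintain-framerate' expresses the opposite preference).
    const sender = pc.getSenders().find((s) => s.track && s.track.kind === 'video');
    const params = sender.getParameters();
    params.degradationPreference = 'maintain-resolution';
    sender.setParameters(params).catch((err) => console.error(err));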

Does WebRTC support 1080p?

Most WebRTC implementations to date have been able to reach 720p resolutions, with 1080p starting to be introduced.



1 Answer

getUserMedia constraints only affect the media that the browser requests from the hardware and returns as a stream. They have no effect on what is done to that stream afterwards (i.e., when it is streamed over a connection). The degradation you're seeing happens in the PeerConnection layer, not in the getUserMedia layer. It is triggered by the WebRTC implementation when hardware and bandwidth statistics indicate low performance, and it is negotiated by both sides.

[Hardware] <-   getUserMedia   -> [javascript client] <- PeerConnection -> [another client]
           <- 640x480 captured ->                     <-  320x240 sent  ->

You'll have to dig into the source code of each implementation for documentation and evidence of exactly how it's done, but here are some references to the behavior:

From the O'Reilly book High Performance Browser Networking, in the chapter on WebRTC:

The good news is that the WebRTC audio and video engines work together with the underlying network transport to probe the available bandwidth and optimize delivery of the media streams. However, DataChannel transfers require additional application logic: the application must monitor the amount of buffered data and be ready to adjust as needed.

...

WebRTC audio and video engines will dynamically adjust the bitrate of the media streams to match the conditions of the network link between the peers. The application can set and update the media constraints (e.g., video resolution, framerate, and so on), and the engines do the rest—this part is easy.
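
In today's API, "set and update the media constraints" corresponds to applyConstraints() on the live track; a sketch (whether a mid-call change is honored depends on the browser):

    // Re-assert the desired capture resolution on a live video track.
    // Note this governs capture only; the PeerConnection encoder may still
    // downscale what it actually sends.
    const track = stream.getVideoTracks()[0];
    track
      .applyConstraints({ width: { ideal: 640 }, height: { ideal: 480 } })
      .then(() => console.log('now capturing at:', track.getSettings()))
      .catch((err) => console.error('could not apply constraints:', err));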

answered Sep 20 '22 by xdumaine