 

MediaSource vs MediaStream in JavaScript

My JavaScript application gets a WebM video stream over a WebSocket connection. There is no delay between the remote peer sending video frames and the application receiving them.

I create a MediaSource object in the application, to which I "append video frames", and let a video element show it:

video.src = window.URL.createObjectURL(mediaSource);

This works nicely, but there is some delay (less than a second), which arguably makes this solution suboptimal for video calls.
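For context, a minimal sketch of the MSE path described above might look like the following; the WebSocket endpoint and the exact WebM codec string are assumptions and must match the actual stream:

const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = window.URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  // Assumed codec string; it has to match what the sender produces.
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
  const queue = [];

  const ws = new WebSocket('wss://example.com/stream'); // hypothetical endpoint
  ws.binaryType = 'arraybuffer';

  ws.onmessage = (event) => {
    // A SourceBuffer rejects appends while one is pending, so queue segments.
    if (sourceBuffer.updating || queue.length > 0) {
      queue.push(event.data);
    } else {
      sourceBuffer.appendBuffer(event.data);
    }
  };

  sourceBuffer.addEventListener('updateend', () => {
    if (queue.length > 0 && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });
});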

Evidently, some WebRTC applications use MediaStream instead:

video.srcObject = mediaStream;

...and these show no delay.
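For comparison, such applications typically receive the stream from an RTCPeerConnection; a hedged sketch (signaling omitted, peer connection setup assumed):

const pc = new RTCPeerConnection();

// Remote tracks arrive via the track event; the stream is attached directly,
// and the browser decodes and renders it with minimal internal buffering.
pc.ontrack = (event) => {
  const video = document.querySelector('video');
  if (video.srcObject !== event.streams[0]) {
    video.srcObject = event.streams[0];
  }
};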

I could not determine from the documentation whether browsers handle src and srcObject differently.

Another thing I could not find is whether it is possible to create a MediaStream and append buffers to it, much as with MediaSource. I want to try that just to check whether srcObject avoids the aforementioned delay in my application.

If I use:

video.srcObject = mediaSource;

I get the error:

TypeError: Failed to set the 'srcObject' property on 'HTMLMediaElement': The provided value is not of type 'MediaStream'

Sergio asked Aug 14 '18


People also ask

What is srcObject in JavaScript?

The srcObject property of the HTMLMediaElement interface sets or returns the object which serves as the source of the media associated with the HTMLMediaElement. The object can be a MediaStream, a MediaSource, a Blob, or a File (which inherits from Blob).
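In practice, browser support beyond the MediaStream case is limited, which is why the commonly seen pattern feature-detects srcObject and falls back to an object URL. A sketch (attachStream is a hypothetical helper name):

function attachStream(video, stream) {
  if ('srcObject' in video) {
    video.srcObject = stream;
  } else {
    // Fallback for older browsers that predate srcObject.
    video.src = window.URL.createObjectURL(stream);
  }
}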

What is MediaStreamTrack?

The MediaStreamTrack interface represents a single media track within a stream; typically, these are audio or video tracks, but other track types may exist as well.
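For illustration, the tracks of an existing MediaStream can be inspected and stopped individually; a minimal sketch, assuming stream is a MediaStream you already hold:

stream.getTracks().forEach((track) => {
  // e.g. "video", "FaceTime HD Camera", "live"
  console.log(track.kind, track.label, track.readyState);
});

// Stopping one track ends only that track, not the whole stream.
const [videoTrack] = stream.getVideoTracks();
if (videoTrack) {
  videoTrack.stop();
}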

What is MSE video?

Media Source Extensions (MSE) is a JavaScript API that lets you build streams for playback from segments of audio or video.

What is media API?

The Media APIs are used to play and, in some cases, record media files. This includes audio (e.g., play MP3s or other music files, ringtones, game sound effects, or DTMF tones) and video (e.g., play a video streamed over the web or from local storage).


1 Answer

What you are asking are very good questions, and all of us streaming video developers encounter the same issues and share the same frustration when it comes to plugin-free, near-real-time streaming video in browsers.

Let me address your questions to the best of my knowledge (I have implemented both WebRTC and Media Source Extensions in recent years for streaming server software).

  1. " if it is possible to create a MediaStream and append buffers to it, like the MediaSource"

This one is easy - it is NOT possible. The MediaStream API (https://developer.mozilla.org/en-US/docs/Web/API/MediaStream) does not expose access to the MediaStream object's frame buffer; it handles everything internally using WebRTC, either getting frames with getUserMedia (from a local webcam) or from an RTCPeerConnection (from the network). With a MediaStream object you do not manipulate frames or segments directly.
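In other words, a MediaStream is only ever produced by APIs where the browser handles the frames itself, for example (a sketch):

// The browser captures, encodes/decodes and buffers internally;
// the page never touches individual frames or segments.
navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then((stream) => {
    document.querySelector('video').srcObject = stream;
  })
  .catch((err) => console.error('getUserMedia failed:', err));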

And, of course, video.srcObject = mediaSource will not work: video.srcObject must be a MediaStream object created by the WebRTC API, nothing else.

  1. "I could not find in the documentation if browsers handle src and srcObject differently"

Hell yes, browsers do treat video.src and video.srcObject very differently; there is no documentation about it, and it doesn't make much sense. Politics play a large role in it.

Notorious examples from Chrome browser:

a. Media Source Extensions (video.src) supports AAC audio, but WebRTC (video.srcObject) does not, and never will. The reason: Google bought too many audio compression companies, and one of them, Opus, made it into the WebRTC specs. Google is pushing Opus to be the new "royalty-free" audio king, so there is no AAC support in video.srcObject, and all the hardware world must implement Opus now. Google can legally add AAC support to Chrome, and it does so for Media Source Extensions (video.src), but it will never add AAC support to WebRTC.
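This asymmetry can be observed from script; a hedged sketch (the AAC codec string is an assumption, and RTCRtpReceiver.getCapabilities may be absent in older browsers):

// MSE path: AAC-LC in an MP4 container typically reports as supported in Chrome.
console.log(MediaSource.isTypeSupported('audio/mp4; codecs="mp4a.40.2"'));

// WebRTC path: the receiver capabilities list Opus, but no AAC.
const audioCaps = RTCRtpReceiver.getCapabilities('audio');
console.log(audioCaps.codecs.map((c) => c.mimeType)); // e.g. ["audio/opus", ...]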

b. Chrome uses different strategies for H264 video decoding in video.src and video.srcObject. This makes no sense, but it's a fact. For example, on Android, only devices with hardware H264 decoding support will play H264 in WebRTC (video.srcObject). Older devices without hardware H264 support will not play H264 video via WebRTC, yet the same devices will play the same H264 video via Media Source Extensions (video.src). So video.src must be falling back to a software decoder when hardware is not available. Why can't the same be done in WebRTC?
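The same kind of check applies to video: on a device without a WebRTC-usable H264 decoder, the two paths can disagree (a sketch; the H264 codec string is an assumption):

// MSE: true if any H264 decoder, hardware or software, is available.
console.log(MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E"'));

// WebRTC: H264 appears here only if the WebRTC stack can decode it.
const videoCaps = RTCRtpReceiver.getCapabilities('video');
console.log(videoCaps.codecs.some((c) => c.mimeType === 'video/H264'));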

Lastly, your VP8 stream will not play on iOS at all: not via Media Source Extensions (iOS doesn't support MSE, ha ha ha), and not via WebRTC (iOS only supports H264 video for WebRTC, ha ha ha ha). You are asking why Apple does that? ha ha ha ha ha

user1390208 answered Sep 20 '22