My JavaScript application receives a WebM video stream over a WebSocket connection. There is no delay between the remote peer sending video frames and the application receiving them.
I create a MediaSource object in the application, append video frames to it, and let a video element play it:
video.src = window.URL.createObjectURL(mediaSource);
This works nicely, but there is some delay (less than a second), which arguably makes this solution suboptimal for video calls.
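Roughly, my setup looks like this (a sketch; the WebSocket URL and codec string are placeholders for my actual stream):

const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = window.URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
  const queue = [];

  const ws = new WebSocket('wss://example.com/stream'); // placeholder endpoint
  ws.binaryType = 'arraybuffer';
  ws.onmessage = (event) => {
    // appendBuffer throws if a previous append is still pending, so queue segments
    if (sourceBuffer.updating || queue.length > 0) {
      queue.push(event.data);
    } else {
      sourceBuffer.appendBuffer(event.data);
    }
  };
  sourceBuffer.addEventListener('updateend', () => {
    if (queue.length > 0 && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });
});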
Evidently, some WebRTC applications use MediaStream instead:
video.srcObject = mediaStream;
...and these show no delay.
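Typically those applications get the stream from an RTCPeerConnection, e.g. (a sketch, assuming an already-negotiated connection):

const pc = new RTCPeerConnection();
pc.ontrack = (event) => {
  // The browser hands over a ready-made MediaStream; no buffers to append
  video.srcObject = event.streams[0];
};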
I could not determine from the documentation whether browsers handle src and srcObject differently.
Another thing I could not find out is whether it is possible to create a MediaStream and append buffers to it, much like with MediaSource. I want to try that just to check whether srcObject would avoid the aforementioned delay in my application.
If I use:
video.srcObject = mediaSource;
I get the error:
TypeError: Failed to set the 'srcObject' property on 'HTMLMediaElement': The provided value is not of type 'MediaStream'
These are very good questions, and all of us streaming-video developers run into the same issues and share the same frustration when it comes to plugin-free, near-real-time streaming video in browsers.
Let me address your questions to the best of my knowledge (I have implemented both WebRTC and Media Source Extensions for streaming server software in recent years).
Can you create a MediaStream and append buffers to it? This one is easy: it is NOT possible. The MediaStream API (https://developer.mozilla.org/en-US/docs/Web/API/MediaStream) does not expose access to a MediaStream object's frame buffer; it handles everything internally using WebRTC, getting frames either from getUserMedia (a local webcam) or from an RTCPeerConnection (the network). With a MediaStream object you never manipulate frames or segments directly.
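In other words, a MediaStream only ever comes from a capture source or a peer connection, and it has no appendBuffer-style method. A minimal local-camera sketch:

(async () => {
  // The browser produces the stream internally; you never touch its frames
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  document.querySelector('video').srcObject = stream;
  // There is no stream.appendBuffer(...); MediaStream exposes no such API
})();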
And, of course, video.srcObject = mediaSource will not work: video.srcObject must be a MediaStream object created by the WebRTC API, nothing else, which is exactly what the TypeError above is telling you.
Do browsers handle video.src and video.srcObject differently? Hell yes, browsers treat them very differently; there is no documentation about it, and it doesn't make much sense. Politics play a large role in it.
Notorious examples from the Chrome browser:
a. Media Source Extensions (video.src) supports AAC audio, but WebRTC (video.srcObject) does not, and never will. The reason: Google bought a number of audio-compression companies, and one of the resulting codecs, Opus, made it into the WebRTC specs. Google is pushing Opus to become the new "royalty-free" audio king, so there is no AAC support in video.srcObject, and the whole hardware world must now implement Opus. Google can, and is legally allowed to, ship AAC support in Chrome, and it already does so for Media Source Extensions (video.src); but it will never add AAC support to WebRTC.
b. Chrome uses different H264 video-decoding strategies for video.src and video.srcObject. This makes no sense, but it's a fact. For example, on Android, only devices with hardware H264 decoding support can play H264 through WebRTC (video.srcObject); older devices without hardware H264 support will not play H264 video via WebRTC, yet the same devices will play the same H264 video via Media Source Extensions (video.src). So video.src must be falling back to a software decoder when no hardware decoder is available. Why can't the same be done in WebRTC?
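You can observe this asymmetry yourself by probing each pipeline's codec support (a sketch; the codec strings are illustrative, and RTCRtpReceiver.getCapabilities requires a reasonably recent browser):

// MSE (video.src) side:
console.log(MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E"')); // H264
console.log(MediaSource.isTypeSupported('audio/mp4; codecs="mp4a.40.2"'));   // AAC

// WebRTC (video.srcObject) side:
const codecs = RTCRtpReceiver.getCapabilities('video').codecs.map(c => c.mimeType);
console.log(codecs.includes('video/H264'));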
Lastly, your VP8 stream will not play on iOS at all: not via Media Source Extensions (iOS does not support MSE, ha ha), and not via WebRTC (iOS only supports H264 video for WebRTC, ha ha). You're asking why Apple does that? Ha ha ha.