Media Source Extensions Not Working

I am trying to use the MediaSource API to append separate WebM videos to a single source.

I found a GitHub project that was attempting the same thing, where a playlist of WebMs is loaded and each one is appended to a SourceBuffer. But it was last committed a year ago, and thus out of sync with the current spec. So I forked it and updated it to the latest API properties/methods, plus some restructuring. Much of the existing code was taken directly from the spec's examples and Eric Bidelman's test page.

However, I cannot get it to work as expected. I am testing in two browsers, both on Mac OS X 10.9.2: Chrome 35 stable (latest at the time of this writing), and Firefox 30 beta with the flag media.mediasource.enabled set to true in about:config (this feature will not be introduced until FF 25, and current stable is 24).

Here are the problems I’m running into.

Both browsers

I want the end result to be one long video composed of the 11 WebMs (00.webm, 01.webm, …, 10.webm). Right now, each browser only plays one segment of the video.
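For context, the append flow I'm describing boils down to something like this. This is a minimal sketch in modern syntax, not the forked project's actual code: the segmentName helper, SEGMENT_COUNT, and the fetch-based loading are my own illustrative assumptions.

```javascript
// Zero-padded playlist names: segmentName(0) -> "00.webm", segmentName(10) -> "10.webm".
function segmentName(i) {
  return String(i).padStart(2, '0') + '.webm';
}

const SEGMENT_COUNT = 11;

// Browser-only wiring; guarded so the pure helper above stands on its own.
if (typeof MediaSource !== 'undefined') {
  const video = document.querySelector('video');
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener('sourceopen', async () => {
    const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8, vorbis"');
    for (let i = 0; i < SEGMENT_COUNT; i++) {
      const response = await fetch(segmentName(i));
      const data = await response.arrayBuffer();
      sourceBuffer.appendBuffer(data);
      // Wait for this append to complete before starting the next one;
      // appendBuffer throws if called while an append is still in flight.
      await new Promise((resolve) =>
        sourceBuffer.addEventListener('updateend', resolve, { once: true })
      );
    }
    mediaSource.endOfStream();
  });
}
```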

Chrome

Wildly inconsistent behavior. Seems impossible to reproduce any of these bugs reliably.

  • Sometimes the video is blank, or has a tall black bar in the middle of it, and is unplayable.
  • Sometimes the video will load and pause on the first frame of 01.webm.
  • Sometimes, the video will play a couple of frames of the 02.webm and pause, having only loaded the first three segments.
  • The Play button is initially grayed out.
  • Pressing the grayed out Play button produces wildly inconsistent behaviors. Sometimes it loads a black, unplayable video. Other times it plays the first segment, then stops at the end, and when you press Play/Pause again, it loads the next segment. Even then, it will sometimes skip over segments and get stuck on 04.webm. Regardless, it never plays the final segment, even though the console reports going through all of the buffers.

It is honestly different every time. I can’t list them all here.

Known caveats: Chrome does not currently implement sourceBuffer.mode, though I do not know what effect this might have.
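In case it's useful to anyone hitting the same thing, whether mode is implemented can be feature-detected at runtime. This is a trivial check I'm adding for illustration, not code from the forked project:

```javascript
// Returns true if this SourceBuffer (or any stand-in object) exposes the
// `mode` property ("segments" / "sequence") at all.
function supportsBufferMode(sourceBuffer) {
  return 'mode' in sourceBuffer;
}
```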

Firefox

  1. Only plays 00.webm. Total running time is 0:08, the length of that video.
  2. Video seeking does not work. (This may be expected behavior, as there is nothing actually happening in the onSeeking event handler.)
  3. Video cannot be restarted once finished.

My initial theory was that this had to do with mediaSource.sourceBuffers[0].timestampOffset = duration and duration = mediaSource.duration. But I can't seem to get anything back from mediaSource.duration except NaN, even though I'm appending new segments.
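One workaround I'd try, since duration stays NaN unless something sets it: derive the offset from the SourceBuffer's own buffered ranges instead of from mediaSource.duration. A sketch, not verified against this exact player; the nextTimestampOffset helper is my own name:

```javascript
// Given the end times of the buffered ranges, return the offset for the next
// segment: the end of the last buffered range, or 0 when nothing is buffered yet.
function nextTimestampOffset(bufferedEnds) {
  return bufferedEnds.length > 0 ? bufferedEnds[bufferedEnds.length - 1] : 0;
}

// In the browser this would be driven from the real TimeRanges object, e.g.:
// const b = sourceBuffer.buffered;
// const ends = [];
// for (let i = 0; i < b.length; i++) ends.push(b.end(i));
// sourceBuffer.timestampOffset = nextTimestampOffset(ends);
```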

Completely lost here. Guidance very much appreciated.

EDIT: I uncommented the duration parts of the code and ran mse_webm_remuxer from Aaron Colwell's Media Source Extension Tools (thanks to Adam Hart for the tip) on all of the videos. Voilà: no more unpredictable glitches in Chrome! But alas, it still pauses once a media segment ends, and even when you press Play, it sometimes gets stuck on one frame.

In Firefox Beta, it doesn’t play past the first segment, responding with:

TypeError: Value being assigned to SourceBuffer.timestampOffset is not a finite floating-point value.

Logging the value of duration shows NaN (but only in Firefox).
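Since that TypeError fires whenever a non-finite value is assigned to timestampOffset, guarding the assignment sidesteps the crash (though not the underlying NaN duration). A tiny sketch; safeTimestampOffset is my own illustrative helper:

```javascript
// timestampOffset must be a finite number; fall back (e.g. to 0) while
// mediaSource.duration is still NaN, instead of letting the assignment throw.
function safeTimestampOffset(duration, fallback = 0) {
  return Number.isFinite(duration) ? duration : fallback;
}

// Usage in the browser:
// sourceBuffer.timestampOffset = safeTimestampOffset(mediaSource.duration);
```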

Hugh Guiney asked Jun 12 '14


1 Answer

The main problem is with the video files. If you open chrome://media-internals/ you can see the error Media segment did not begin with keyframe. Using properly formatted videos, like the one from Eric Bidelman's example (I hope he doesn't get mad that I keep linking directly to that video, but it's the only example video I've found that works), your code does work with the following change in appendNextMediaSegment():

// Offset each new segment by the current duration so segments play back to back.
duration = mediaSource.duration;
mediaSource.sourceBuffers[0].timestampOffset = duration;
mediaSource.sourceBuffers[0].appendBuffer(mediaSegment);

You can try Aaron Colwell's Media Source Extension Tools to try to get your videos working, but I've had limited success.

It also seems a little odd that you're waiting on the onProgress event before appending segments, but I suppose that could work if you only want to append while the video is actually playing. It could make the seek bar behave strangely, since the video length is unknown, but that can be a problem in any case.
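For what it's worth, driving appends from the buffer's updateend event (rather than the element's progress event) is one way to keep them serialized. A minimal sketch with the SourceBuffer abstracted behind an append callback so the queueing logic stands alone; the drainQueue name and shape are my own, not code from the question:

```javascript
// Serializes appends: each segment is handed to appendFn only after the
// previous one signals completion (in the browser, on `updateend`).
function drainQueue(segments, appendFn, done) {
  let i = 0;
  function appendNext() {
    if (i >= segments.length) return done();
    appendFn(segments[i++], appendNext);
  }
  appendNext();
}

// With a real SourceBuffer, appendFn would look roughly like:
// (seg, next) => {
//   sourceBuffer.addEventListener('updateend', next, { once: true });
//   sourceBuffer.appendBuffer(seg);
// };
```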

Adam Hart answered Sep 19 '22