What I'm trying to achieve is to make Chrome load a video file as data (via the Fetch API, XHR, whatever) and to play it using <video> while it's still being downloaded, without issuing two separate requests for the same URL and without waiting until the file is completely downloaded.
It's easy to get a ReadableStream from the Fetch API (response.body), yet I can't find a way to feed it into the video element. I've figured out I need a blob URL for this, which can be created using a MediaSource object. However, the SourceBuffer#appendStream method, which sounds like just what is needed, isn't implemented in Chrome, so I can't connect the stream directly to the MediaSource object.
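For reference, the blob-URL part on its own looks roughly like this (just a sketch; the codec string and the element selector are only examples, not something from my actual page):

const video = document.querySelector('video');
const mediaSource = new MediaSource();

// The object URL is what the <video> element actually loads.
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  // Once the MediaSource is attached and open, a SourceBuffer can be created
  // and fed with data, e.g. via appendBuffer().
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp9, opus"');
});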
I can probably read the stream in chunks, create Uint8Arrays out of them, and use SourceBuffer#appendBuffer, but this means playback won't start immediately unless the chunk size is really small. It also feels like manually doing something that all these APIs should be able to do out of the box. If there are no other solutions and I go this way, what caveats should I expect?
Are there other ways to create a blob URL for a ReadableStream? Or is there a way to make fetch and <video> share a single request? There are so many new APIs that I could easily have missed something.
After hours of experimenting, I found a half-working solution:
const video = document.getElementById('audio');
const mediaSource = new MediaSource();
video.src = window.URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  // The MIME type and codec must match the file being fetched.
  const sourceBuffer = mediaSource.addSourceBuffer('audio/webm; codecs="opus"');

  // audioSRC is the URL of the media file.
  const response = await fetch(audioSRC);
  const reader = response.body.getReader();

  while (true) {
    const {value, done} = await reader.read();
    if (done) break;
    // appendBuffer() is asynchronous, so wait for 'updateend'
    // before appending the next chunk.
    await new Promise((resolve) => {
      sourceBuffer.onupdateend = () => resolve();
      sourceBuffer.appendBuffer(value);
    });
  }

  mediaSource.endOfStream();
});
It works with MediaSource (https://developer.mozilla.org/en-US/docs/Web/API/MediaSource).
Also, I tested this only with the webm/opus format, but I believe it should work with other formats as long as you specify the correct MIME type and codecs.
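If you're unsure whether a given container/codec combination will work, you can check it up front with MediaSource.isTypeSupported (a small sketch; the MIME string is just an example):

const mime = 'audio/webm; codecs="opus"';

if ('MediaSource' in window && MediaSource.isTypeSupported(mime)) {
  // Safe to call mediaSource.addSourceBuffer(mime) once 'sourceopen' fires.
} else {
  // Fall back to pointing video.src at the plain URL and letting the browser stream it.
  console.warn(mime + ' is not supported by MediaSource in this browser');
}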