I know it's doable with MediaSource, but MediaSource doesn't support all video formats (for example, MP4 files that aren't fragmented). That's a problem because my application doesn't have a server that could fix the file for me; it's a client-side application only.
const blob = await ipfs.getBlobFromStream(hash)
const url = URL.createObjectURL(blob)
this.setState({...this.state, videoSrc: url})
const getBlobFromStream = async (hash) => {
  return new Promise(async resolve => {
    let entireBuffer
    const s = await stream(hash)
    s.on('data', buffer => {
      console.log(buffer)
      // accumulate every chunk into one growing typed array
      if (!entireBuffer) {
        entireBuffer = buffer
      } else {
        entireBuffer = concatTypedArrays(entireBuffer, buffer)
      }
    })
    s.on('end', () => {
      const arrayBuffer = typedArrayToArrayBuffer(entireBuffer)
      // blob parts must be passed as an array
      const blob = new Blob([arrayBuffer])
      resolve(blob)
    })
  })
}
This is the code I'm using right now. It basically waits for the entire file, collects it into a single array, turns that into a Blob, and then passes the Blob to URL.createObjectURL.
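For reference, the two helpers used above aren't shown in the question; a minimal sketch, assuming the stream emits Uint8Array chunks, might look like this:

// Sketch of the helpers referenced above (implementations assumed, not from the question)
const concatTypedArrays = (a, b) => {
  // allocate a new array large enough for both and copy them in order
  const result = new Uint8Array(a.length + b.length)
  result.set(a, 0)
  result.set(b, a.length)
  return result
}

const typedArrayToArrayBuffer = (typedArray) => {
  // copy out only the bytes this view actually covers
  return typedArray.buffer.slice(typedArray.byteOffset, typedArray.byteOffset + typedArray.byteLength)
}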
You can do it if you restructure your code:
await ipfs.startBlobStreaming(hash);
this.setState({...this.state, videoComplete: true});
const startBlobStreaming = async (hash) => {
  return new Promise(async (resolve) => {
    let entireBuffer;
    const s = await stream(hash);
    s.on('data', buffer => {
      if (!entireBuffer) {
        entireBuffer = buffer;
      } else {
        entireBuffer = concatTypedArrays(entireBuffer, buffer);
      }
      // rebuild the blob and object URL from everything received so far
      const arrayBuffer = typedArrayToArrayBuffer(entireBuffer);
      const blob = new Blob([arrayBuffer]); // blob parts must be passed as an array
      const url = URL.createObjectURL(blob);
      this.setState({...this.state, videoSrc: url});
    });
    s.on('end', _ => resolve());
  });
}
I don't know how frequent or how large the buffers arriving in s.on('data') are, but you could also collect buffers for a certain amount of time (e.g. 1000 ms) and only then create the blob URL.
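A minimal sketch of that idea, assuming the same stream(hash), concatTypedArrays and setState as above (the 1000 ms interval and the startThrottledBlobStreaming name are just placeholders):

const startThrottledBlobStreaming = async (hash) => {
  return new Promise(async (resolve) => {
    let entireBuffer;
    let lastUpdate = 0;

    const updateSrc = () => {
      // a typed array is itself a valid blob part, so no conversion helper is needed here
      const blob = new Blob([entireBuffer]);
      const url = URL.createObjectURL(blob);
      // note: previous object URLs are not revoked here; for long sessions you may want
      // to URL.revokeObjectURL the old one once the <video> has switched to the new URL
      this.setState({...this.state, videoSrc: url});
    };

    const s = await stream(hash);
    s.on('data', buffer => {
      entireBuffer = entireBuffer ? concatTypedArrays(entireBuffer, buffer) : buffer;
      // rebuild the blob URL at most once per second instead of on every chunk
      if (Date.now() - lastUpdate > 1000) {
        lastUpdate = Date.now();
        updateSrc();
      }
    });
    s.on('end', () => {
      updateSrc(); // make sure the final, complete buffer is reflected
      resolve();
    });
  });
}

Keep in mind that every new videoSrc makes the <video> element reload, so in practice you will likely also want to save and restore currentTime when you swap the URL.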