I was wondering if it is possible to stream data from JavaScript to the browser's download manager.
Using WebRTC, I stream data (from files larger than 1 GB) from one browser to another. On the receiving side, I keep all of this data in memory as ArrayBuffer chunks, and I would like the user to be able to download it.
Problem: Blob objects have a maximum size of about 600 MB (depending on the browser), so I can't re-create the file from the chunks. Is there a way to stream these chunks so that the browser downloads them directly?
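For context, my receiving side looks roughly like this (a simplified sketch, not my exact code; dataChannel and the chunk handling are illustrative):

// Minimal sketch of the receiving side (names are illustrative).
const receivedChunks = [];

dataChannel.onmessage = (event) => {
  // Each WebRTC message arrives as an ArrayBuffer chunk.
  receivedChunks.push(event.data);
};

// Rebuilding a single Blob from the chunks at the end is where the
// ~600 MB limit bites:
// const file = new Blob(receivedChunks); // fails for very large files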
If you want to fetch a large file as a blob from an API or URL, you can use StreamSaver.
npm install streamsaver
Then you can do something like this:
import { createWriteStream } from 'streamsaver';

export const downloadFile = (url, fileName) => {
  return fetch(url).then(res => {
    // Create a write stream that goes straight to the browser's download manager.
    const fileStream = createWriteStream(fileName);
    const writer = fileStream.getWriter();

    // If the response body supports pipeTo, let the browser do the piping.
    if (res.body.pipeTo) {
      writer.releaseLock();
      return res.body.pipeTo(fileStream);
    }

    // Fallback: pump the response chunks into the writer manually.
    const reader = res.body.getReader();
    const pump = () =>
      reader
        .read()
        .then(({ value, done }) => (done ? writer.close() : writer.write(value).then(pump)));
    return pump();
  });
};
And you can use it like this:
const url = "http://urltobigfile";
const fileName = "bigfile.zip";
downloadFile(url, fileName).then(() => { alert('done'); });
Following @guest271314's advice, I added StreamSaver.js to my project, and I successfully received files bigger than 1 GB on Chrome. According to the documentation, it should work for files up to 15 GB, but my browser crashed before that (the maximum file size was about 4 GB for me).
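Since my data comes from a WebRTC data channel rather than fetch, the idea is roughly the following (a simplified sketch, not my exact code; dataChannel, fileName and fileSize are assumed to exist, and backpressure handling is omitted for brevity):

import { createWriteStream } from 'streamsaver';

// Sketch: write each incoming WebRTC chunk straight to the download stream,
// so the whole file never has to sit in memory as a single Blob.
const fileStream = createWriteStream(fileName, { size: fileSize });
const writer = fileStream.getWriter();

dataChannel.onmessage = (event) => {
  writer.write(new Uint8Array(event.data));
};

// Call this when the sender signals that the transfer is complete.
const finishDownload = () => writer.close();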
Note I: to avoid the Blob max size limitation, I also tried to manually append data to the href field of an <a> element, but it failed with files of about 600 MB.
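What I mean by that is roughly the following (a sketch; building a data: URL from the chunks is my interpretation of that approach, and chunksToBase64 is a hypothetical helper):

// Illustrative only: build a data: URL and assign it to the link's href.
// This hits browser limits long before the 1 GB range.
const base64Data = chunksToBase64(receivedChunks); // hypothetical helper
const a = document.createElement('a');
a.href = 'data:application/octet-stream;base64,' + base64Data;
a.download = fileName;
a.click();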
Note II: as amazing as it might seem, the basic technique using createObjectURL works perfectly fine on Firefox for files up to 4 GB!
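For reference, that basic technique is roughly this (a sketch; receivedChunks and fileName are assumed):

// Build one Blob from the received chunks and hand it to the download manager.
const blob = new Blob(receivedChunks);
const objectUrl = URL.createObjectURL(blob);

const a = document.createElement('a');
a.href = objectUrl;
a.download = fileName;
a.click();

// Free the object URL once the download has been triggered.
URL.revokeObjectURL(objectUrl);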