 

Is there a way to stream data into a blob (or generate a giant blob)

Checking MDN, I see there used to be a BlobBuilder, and that I could call blobBuilder.append to keep adding data to a blob, but according to MDN, BlobBuilder is deprecated in favor of the Blob constructor. Unfortunately the Blob constructor requires all of its data to be in memory at construction time, and my data is too large to fit in memory at construction time. Looking at the File API, I see nothing there either.

Is there a way to generate large data client side and put it in a blob? For example say I wanted to render a 16k by 16k image. Uncompressed that's a 1gig image.

I have an algorithm that can generate the image one or a few scan lines at a time, but I need a way to write those scan lines into a file/blob. When finished, I can use the standard way to let the user download that blob, but I can't seem to find an API that lets me stream data into a blob.

The only thing I can think of is that apparently I can make a Blob from Blobs, so I suppose I could write each part of the image to a separate blob and then combine all the blobs into one big blob.

Is that the only solution? Seems kind of um .... strange. Though if it works then ¯\_(ツ)_/¯
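In other words, something like this sketch (makeScanLine is a placeholder for my actual algorithm, and the sizes are shrunk way down just to show the shape):

```javascript
// Sketch of the blob-of-blobs idea. makeScanLine stands in for the real
// scan-line generator; sizes are tiny here just to illustrate.
function makeScanLine(y, width) {
  return new Uint8Array(width * 4);  // one RGBA scan line
}

const parts = [];
for (let y = 0; y < 16; ++y) {
  // each scan line becomes its own small Blob...
  parts.push(new Blob([makeScanLine(y, 16)]));
}
// ...and a Blob can be built from other Blobs
const image = new Blob(parts, {type: 'application/octet-stream'});
// image.size === 16 * 16 * 4
```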


Someone voted to close as they don't understand the question. Here's another explanation.

Write 4 gig to a blob

 const arrays = [];
 for (let i = 0; i < 4096; ++i) {
   arrays.push(new Uint8Array(1024 * 1024)); // 1 meg
 }
 // arrays now holds 4 gig of data
 const blob = new Blob(arrays);

The code above will crash because the browser will kill the page for using too much memory. With BlobBuilder I could have done something like this:

 const builder = new BlobBuilder();
 for (let i = 0; i < 4096; ++i) {
   const data = new Uint8Array(1024 * 1024); // 1 meg
   builder.append(data);
 }
 const blob = builder.getBlob(...);

That would not have run out of memory because there was never more than 1 meg of data around at once. The browser could flush the data being appended to the BlobBuilder out to disk.

What's the new way to achieve writing 4 gig to a blob? Is it only writing lots of small blobs and then using those to generate a larger one, or is there some more traditional way, where traditional means streaming into some object/file/blob/storage?

asked Oct 17 '22 by gman

1 Answer

As you know, the data that the blob will contain must be ready to pass to the constructor. Let us take the example from MDN:

var aFileParts = ['<a id="a"><b id="b">hey!</b></a>'];
var oMyBlob = new Blob(aFileParts, {type : 'text/html'});

Now, we have two options:

  1. We can append data to the array, and then convert it to a blob:

    var aFileParts = ['<a id="a"><b id="b">hey!</b></a>'];
    aFileParts.push('<p>How are you?</p>');
    var oMyBlob = new Blob(aFileParts, {type : 'text/html'});
    
  2. Alternatively, we can use blobs to create the blob:

    var oMyOtherBlob = new Blob([], {type: 'text/html'});
    oMyOtherBlob = new Blob([oMyOtherBlob, '<a id="a"><b id="b">hey!</b></a>'], {type : 'text/html'});
    oMyOtherBlob= new Blob([oMyOtherBlob, '<p>How are you?</p>'], {type : 'text/html'});
    

You may build your own BlobBuilder encapsulating that... Given that appending to an array seems to lead you to run out of memory, let us encapsulate the second option:

var MyBlobBuilder = function() {
    var blob = new Blob([], {type: 'text/html'});
    this.append = function(src) {
        blob = new Blob([blob, src], {type: 'text/html'});
    };
    this.getBlob = function() {
        return blob;
    };
};
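To illustrate, here is a small usage sketch (the wrapper is repeated so the snippet is self-contained, and the chunk count and size are shrunk so it finishes quickly; in the 4-gig case each append would be a 1-meg Uint8Array):

```javascript
// Minimal BlobBuilder replacement: each append builds a new Blob from
// the old Blob plus the new chunk, so only the chunk stays in JS memory.
var MyBlobBuilder = function() {
    var blob = new Blob([], {type: 'text/html'});
    this.append = function(src) {
        blob = new Blob([blob, src], {type: 'text/html'});
    };
    this.getBlob = function() {
        return blob;
    };
};

var builder = new MyBlobBuilder();
for (var i = 0; i < 8; ++i) {
  builder.append(new Uint8Array(1024));  // one small chunk per iteration
}
var result = builder.getBlob();
// result.size === 8 * 1024
```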

Note: I tested this with your code (replacing BlobBuilder with MyBlobBuilder) and it did not run out of memory on my machine (Windows 10, Chrome 67, 8 GB RAM, Intel Core i3, 64-bit).

answered Oct 20 '22 by Theraot