When to use Uint8Array, Uint16Array, Uint32Array

I have an application that loads user text through an XMLHttpRequest and receives it in a binary format (an ArrayBuffer). I understand that the main difference between 8, 16, and 32 is the number of bytes per element, but I don't know when to use each.

For example, for my application, since the text file could contain practically any character, which one would be best?

I have tried different files, including one that had emojis, and it seemed that in the Uint8Array the emoji took up 4 indices.
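To double-check that, I reproduced the same count outside the request with TextEncoder (I'm assuming the server sends the file UTF-8 encoded, since that matches what I observe):

// The emoji takes four bytes in UTF-8, matching the four indices I see in the Uint8Array.
const bytes = new TextEncoder().encode('😀');
console.log(bytes.length); // 4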

Is there any reason, for my use case, that I shouldn't just use a Uint8Array, or is there a reason I should choose dynamically when the file is read? I have read the MDN docs on each, but they don't seem to offer much insight beyond byte size.

Here is the code I currently use to load the file:

asset = new XMLHttpRequest();

asset.addEventListener('readystatechange', function load() {
    // status and response live directly on the XHR object
    if (asset.readyState == 4 && asset.status == 200) {
        const arrayBuffer = asset.response;
        // Wrap the raw response bytes in a typed array view
        if (arrayBuffer) asset.data = new Uint8Array(arrayBuffer);
    }
}.bind(this), false);

// Error handling.

asset.responseType = 'arraybuffer';
asset.open('GET', asset.src);
asset.send();
asked Nov 06 '22 by Mr.Smithyyy

1 Answer

It all depends on what type of data you are putting into it. 8 bit is fine for ASCII and UTF-8 encoded text (your emoji taking 4 indices is the UTF-8 encoding at work), so Uint8Array covers your case. 16 bit is for UCS-2/UTF-16 code units, the encoding JavaScript strings use internally. 32 bit is for UTF-32, which you wouldn't normally use because it's more for edge cases: Windows barely uses it, and Unix systems only use it internally in some applications.
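As a rough sketch of what each width corresponds to (assuming a browser or Node environment where TextEncoder and TextDecoder are available), here is one emoji viewed at all three sizes:

// Uint8Array holds UTF-8 bytes, which is what an 'arraybuffer' response of a UTF-8 text file contains
const emoji = '😀';
const utf8 = new TextEncoder().encode(emoji);                           // Uint8Array(4) [240, 159, 152, 128]

// Uint16Array holds UTF-16 code units, the units JavaScript strings are made of
const utf16 = Uint16Array.from(emoji.split(''), c => c.charCodeAt(0));  // Uint16Array(2)

// Uint32Array holds one whole code point per element (UTF-32)
const utf32 = Uint32Array.from([emoji.codePointAt(0)]);                 // Uint32Array(1) [128512]

// Turning the UTF-8 bytes back into a string:
console.log(new TextDecoder('utf-8').decode(utf8));                     // '😀'

So for a text file fetched as an ArrayBuffer, Uint8Array is the natural view; the wider arrays only help if the data itself is made of 16- or 32-bit units.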

answered Nov 15 '22 by Scott Craig