
Have I reached the limits of the size of objects JavaScript in my browser can handle?

I'm embedding a large array in <script> tags in my HTML, like this (nothing surprising):

<script>
    var largeArray = [/* lots of stuff in here */];
</script>

In this particular example, the array has 210,000 elements. That's well below the theoretical maximum of 2^31, by about 4 orders of magnitude. Here's the fun part: if I save the JS source for the array to a file, that file is >44 megabytes (46,573,399 bytes, to be exact).

If you want to see for yourself, you can download it from GitHub. (All the data in there is canned, so much of it is repeated. This will not be the case in production.)

Now, I'm really not concerned about serving that much data. My server gzips its responses, so it really doesn't take all that long to get the data over the wire. However, there is a really nasty tendency for the page, once loaded, to crash the browser. I'm not testing at all in IE (this is an internal tool). My primary targets are Chrome 8 and Firefox 3.6.

In Firefox, I can see a reasonably useful error in the console:

Error: script stack space quota is exhausted

In Chrome, I simply get the sad-tab page:

[Screenshot: Chrome's sad-tab crash page]

Cut to the chase, already

  • Is this really too much data for our modern, "high-performance" browsers to handle?
  • Is there anything I can do* to gracefully handle this much data?

Incidentally, I was able to get this to work (read: not crash the tab) on-and-off in Chrome. I really thought that Chrome, at least, was made of tougher stuff, but apparently I was wrong...


Edit 1

@Crayon: I wasn't looking to justify why I'd like to dump this much data into the browser at once. Short version: either I solve this one (admittedly not-that-easy) problem, or I have to solve a whole slew of other problems. I'm opting for the simpler approach for now.

@various: right now, I'm not especially looking for ways to actually reduce the number of elements in the array. I know I could implement Ajax paging or what-have-you, but that introduces its own set of problems for me in other regards.

@Phrogz: each element looks something like this:

{dateTime: new Date(1296176400000),
 terminalId: 'terminal999',
 'General___BuildVersion': '10.05a_V110119_Beta',
 'SSM___ExtId': 26680,
 'MD_CDMA_NETLOADER_NO_BCAST___Valid': 'false',
 'MD_CDMA_NETLOADER_NO_BCAST___PngAttempt': 0}

@Will: but I have a computer with a 4-core processor, 6 gigabytes of RAM, over half a terabyte of disk space ...and I'm not even asking for the browser to do this quickly - I'm just asking for it to work at all!


Edit 2

Mission accomplished!

With the spot-on suggestions from Juan as well as Guffa, I was able to get this to work! It would appear that the problem was just in parsing the source code, not in actually working with the data in memory.

To summarize the comment quagmire on Juan's answer: I had to split up my big array into a series of smaller ones, and then Array#concat() them, but that wasn't enough. I also had to put them into separate var statements. Like this:

var arr0 = [...];
var arr1 = [...];
var arr2 = [...];
/* ... */
var bigArray = arr0.concat(arr1, arr2, ...);

To everyone who contributed to solving this: thank you. The first round is on me!


*other than the obvious: sending less data to the browser

asked Jan 28 '11 by Matt Ball




2 Answers

Here's what I would try: you said it's a 44MB file. Once parsed, that surely takes far more than 44MB of RAM, maybe half a gigabyte. Could you just cut down the data until the browser doesn't crash and see how much memory the browser uses?

Even apps that run only on the server would be well served not to read a 44MB file and keep it in memory. Having said all that, I believe the browser should be able to handle it, so let me run some tests.

(Using Windows 7, 4GB of memory)

First Test: I cut the array in half and there were no problems; it uses 80MB, no crash.

Second Test: I split the array into two separate arrays that together still contain all the data; uses 160MB, no crash.

Third Test: Since Firefox said it ran out of stack space, the problem is probably that it can't parse the array all at once. I created two separate arrays, arr1 and arr2, then did arr3 = arr1.concat(arr2); it ran fine and uses only slightly more memory, around 165MB.

Fourth Test: I created 7 of those arrays (22MB each) and concatenated them to test browser limits. It takes about 10 seconds for the page to finish loading. Memory goes up to 1.3GB, then drops back down to 500MB. So yes, Chrome can handle it; it just can't parse it all at once, because the parser apparently uses some kind of recursion, as the console's error message suggests.

Answer: Create separate arrays (less than 20MB each) and then concat them. Each array should be in its own var statement, instead of doing multiple declarations with a single var.
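If the page is generated server-side, the chunking could happen at render time. Here is a minimal sketch, assuming a Node.js-style generator and a hypothetical records array; neither is specified in the question or answer:

// Hypothetical sketch: emit the embedded <script> in chunks so that no single
// array literal is too large for the browser's parser. The chunk size, the
// records argument and the Node.js-style generator are assumptions.
function renderDataScript(records, chunkSize) {
  chunkSize = chunkSize || 5000;
  var statements = [];
  var names = [];
  for (var i = 0, n = 0; i < records.length; i += chunkSize, n++) {
    var name = 'arr' + n;
    names.push(name);
    // JSON.stringify keeps the sketch short; real records containing Date
    // objects (as in the question) would need custom serialization.
    statements.push('var ' + name + ' = ' +
                    JSON.stringify(records.slice(i, i + chunkSize)) + ';');
  }
  statements.push('var bigArray = ' + names[0] + '.concat(' +
                  names.slice(1).join(', ') + ');');
  return '<script>\n' + statements.join('\n') + '\n</script>';
}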

I would still consider fetching only the necessary part, since this much data may make the browser sluggish. However, if it's an internal task, this should be fine.
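If fetching only the necessary part ever becomes attractive, a minimal sketch of that route follows. The /data.json endpoint and its offset/limit parameters are hypothetical; JSON.parse also keeps the payload out of the script parser entirely, and it works in Chrome 8 and Firefox 3.6:

// Illustrative only: fetch one slice of the data and parse it as JSON
// instead of embedding it as a script literal.
function loadPage(offset, limit, callback) {
  var xhr = new XMLHttpRequest();
  // The endpoint and its query parameters are made up for this sketch.
  xhr.open('GET', '/data.json?offset=' + offset + '&limit=' + limit, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      callback(JSON.parse(xhr.responseText));
    }
  };
  xhr.send();
}

loadPage(0, 10000, function (records) {
  console.log('loaded', records.length, 'records');
});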

Last point: you're not hitting maximum memory levels, just maximum parsing levels.
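A quick way to convince yourself of that (a sketch that is not from the original thread; it builds a comparable 210,000-element array at runtime using the element shape from the question, with made-up values):

// Building the same number of elements programmatically avoids the giant
// literal entirely, so only memory, not the parser, is exercised.
var runtimeArray = [];
for (var i = 0; i < 210000; i++) {
  runtimeArray.push({
    dateTime: new Date(1296176400000 + i * 60000),  // made-up timestamps
    terminalId: 'terminal' + (i % 1000),            // made-up ids
    'General___BuildVersion': '10.05a_V110119_Beta',
    'SSM___ExtId': 26680,
    'MD_CDMA_NETLOADER_NO_BCAST___Valid': 'false',
    'MD_CDMA_NETLOADER_NO_BCAST___PngAttempt': 0
  });
}
console.log(runtimeArray.length); // 210000; memory alone is not the problem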

answered Sep 18 '22 by Juan Mendes


Yes, it's too much to ask of a browser.

That amount of data would be manageable if it were already data, but it isn't data yet. Consider that the browser has to parse that huge block of source code while checking that the syntax adds up for all of it. Once parsed into valid code, the code has to run to produce the actual array.

So, all of the data will exist in (at least) two or three versions at once, each with a certain amount of overhead. As the array literal is a single statement, each step will have to include all of the data.

Dividing the data into several smaller arrays would possibly make it easier on the browser.

answered Sep 22 '22 by Guffa