
Processing a large (12K+ rows) array in JavaScript

The project requirements are odd for this one, but I'm looking to get some insight...

I have a CSV file with about 12,000 rows of data, approximately 12-15 columns. I'm converting that to a JSON array and loading it via JSONP (it has to run client-side). It takes many seconds to do any kind of querying on the data set to return a smaller, filtered data set. I'm currently using JLINQ to do the filtering, but I'm essentially just looping through the array and returning a smaller set based on conditions.
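For context, the JLINQ-style approach described above amounts to a full linear scan on every query. A minimal sketch of that kind of filter (the row shape and field names here are hypothetical):

```javascript
// Hypothetical rows parsed from the CSV, e.g.
// [{ city: "Boston", price: 120, qty: 3 }, ...]
function filterRows(rows, city, maxPrice) {
  var results = [];
  for (var i = 0; i < rows.length; i++) {
    var row = rows[i];
    // Every query re-walks all ~12,000 rows
    if (row.city === city && row.price <= maxPrice) {
      results.push(row);
    }
  }
  return results;
}
```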

Would WebDB or IndexedDB allow me to do this filtering significantly faster? Any tutorials/articles out there that you know of that tackle this particular type of issue?

asked May 07 '12 by S16

People also ask

How large can a JavaScript array be?

JavaScript arrays are zero-based and use 32-bit indexes: the index of the first element is 0, and the highest possible index is 4294967294 (2^32 − 2), for a maximum array size of 4,294,967,295 elements.
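A quick demonstration of those limits (the assignment creates a sparse array, so no 4-billion-element allocation actually happens):

```javascript
var a = [];
a[4294967294] = "last";      // highest valid index: 2^32 - 2
console.log(a.length);       // 4294967295 (2^32 - 1), the maximum length

a[4294967295] = "too far";   // 2^32 - 1 is not a valid array index,
console.log(a.length);       // so it becomes a plain property; length is unchanged

// new Array(4294967296);    // would throw RangeError: Invalid array length
```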

Are JavaScript arrays slow?

JavaScript arrays are great when you only have a few items, but when you have a large amount of data or want to do complex transformations with lots of map, filter, and reduce method calls, you'll see a significant performance slowdown.
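To illustrate, each chained call below makes a separate pass over the data and allocates an intermediate array, while the plain loop touches the data once (the row shape is hypothetical):

```javascript
var rows = [/* ~12,000 objects like { price: 12, qty: 3 } */];

// Three passes, two intermediate arrays:
var total = rows
  .map(function (r) { return r.price * r.qty; })
  .filter(function (v) { return v > 100; })
  .reduce(function (sum, v) { return sum + v; }, 0);

// One pass, no intermediate arrays:
var total2 = 0;
for (var i = 0; i < rows.length; i++) {
  var v = rows[i].price * rows[i].qty;
  if (v > 100) total2 += v;
}
```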


1 Answer

http://square.github.com/crossfilter/ (no longer maintained; see https://github.com/crossfilter/crossfilter for a newer fork).

Crossfilter is a JavaScript library for exploring large multivariate datasets in the browser. Crossfilter supports extremely fast (<30ms) interaction with coordinated views, even with datasets containing a million or more records...
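A minimal sketch of how that could apply to the 12K-row case, using Crossfilter's documented API (the field names are placeholders):

```javascript
// Build the crossfilter once, after the JSONP payload arrives
var cf = crossfilter(rows);

// A dimension indexes the records by one accessor
var byPrice = cf.dimension(function (d) { return d.price; });
var byCity  = cf.dimension(function (d) { return d.city; });

// Filters are applied per dimension and combine across them
byCity.filterExact("Boston");
byPrice.filterRange([10, 100]);

// All records matching every active filter, highest price first
var filtered = byPrice.top(Infinity);

// Clearing a filter is cheap; no full re-scan is needed
byCity.filterAll();
```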

answered Sep 21 '22 by Andrew