 

What's the most efficient way to manage large datasets with JavaScript/jQuery in IE?

I have a search that returns JSON, which I then transform into an HTML table in JavaScript. The code repeatedly calls the jQuery.append() method, once for each row. I have a modern machine, and the Firefox response time is acceptable. But in IE 8 it is unbearably slow.

I decided to move the transformation from data to HTML into the server-side PHP, changing the return type from JSON to HTML. Now, rather than calling the jQuery.append() method repeatedly, I call the jQuery.html() method once with the entire table. I noticed Firefox got faster, but IE got slower.

These results are anecdotal and I have not done any benchmarking, but the IE performance is very disappointing. Is there something I can do to speed up the manipulation of large amounts of data in IE, or is it simply a bad idea to process this much data at once with AJAX/JavaScript?

asked Apr 22 '10 by aw crud


3 Answers

As others have mentioned, excessive DOM manipulation kills performance. Creating the HTML as one string (for example, by pushing fragments onto an array and calling Array.join('')) and setting the innerHTML of a container via the jQuery.html() method is orders of magnitude faster. Be wary of jQuery.append(html) - it is equivalent to creating all the DOM nodes first and then inserting them!

Thing is, even if you optimize the creation of the page node tree, you're still going to hit a ceiling pretty fast with very large datasets. Browsers just can't handle such large and complex DOM trees. First thing you will see slowing down will be the interactions (animations, handlers, etc.) even if you use event delegation. If your dataset is truly large, you will need to do some sort of virtualization to only show what is visible in the viewport (this is what SlickGrid does - http://github.com/mleibman/slickgrid).
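To illustrate the virtualization idea, here is a bare-bones sketch (this is not SlickGrid's actual implementation; the fixed row height, the element ids, and the allRows dataset are all assumptions for the example):

// Hypothetical setup: #viewport is a scrollable container holding
// #spacer-top, #rows, and #spacer-bottom; allRows holds the full dataset.
var allRows = [];          // assume this was filled from the AJAX response
var ROW_HEIGHT = 25;       // assumed fixed row height in pixels
var VISIBLE_ROWS = 20;     // roughly one viewport's worth

function renderVisible() {
    var first = Math.floor($('#viewport').scrollTop() / ROW_HEIGHT);
    var last = Math.min(first + VISIBLE_ROWS, allRows.length);

    var html = [];
    for (var i = first; i < last; i++) {
        html.push('<div class="row">' + allRows[i] + '</div>');
    }

    // The spacers keep the scrollbar sized for the full dataset
    // while only the visible rows actually exist in the DOM
    $('#spacer-top').height(first * ROW_HEIGHT);
    $('#spacer-bottom').height((allRows.length - last) * ROW_HEIGHT);
    $('#rows').html(html.join(''));
}

$('#viewport').scroll(renderVisible);
renderVisible();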

Alternatively, you can improve the responsiveness and "time to interactive" of your page by chunking your DOM additions and executing them on a timeout one after another with some pause in between to let the browser handle user events.
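A minimal sketch of that chunking approach (the rows array, chunk size, delay, and #results selector are hypothetical):

function renderInChunks(rows, chunkSize) {
    var index = 0;

    function renderChunk() {
        var html = [];
        var end = Math.min(index + chunkSize, rows.length);
        // Build the HTML for this chunk only
        for (var i = index; i < end; i++) {
            html.push('<tr><td>' + rows[i] + '</td></tr>');
        }
        $('#results tbody').append(html.join(''));
        index = end;

        // Yield to the browser before the next chunk so it can
        // repaint and handle user events in between
        if (index < rows.length) {
            setTimeout(renderChunk, 50);
        }
    }

    renderChunk();
}

renderInChunks(jsonRows, 100);  // e.g. 100 rows per chunk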

Other techniques include rendering the first page's worth of data, allocating room for the rest, and only rendering the rest when the user starts scrolling toward it. This is what Facebook does.
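As a rough sketch of that lazy-rendering idea (the page size, scroll threshold, and element names are made up for illustration):

var allRows = [];      // assume this was filled from the AJAX response
var PAGE_SIZE = 100;   // hypothetical page size
var rendered = 0;

function renderNextPage() {
    var html = [];
    var end = Math.min(rendered + PAGE_SIZE, allRows.length);
    for (var i = rendered; i < end; i++) {
        html.push('<tr><td>' + allRows[i] + '</td></tr>');
    }
    $('#results tbody').append(html.join(''));
    rendered = end;
}

$(window).scroll(function () {
    // Render the next page when the user nears the bottom
    var nearBottom = $(window).scrollTop() + $(window).height()
        >= $(document).height() - 200;
    if (nearBottom && rendered < allRows.length) {
        renderNextPage();
    }
});

renderNextPage();  // render the first page immediately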

answered Sep 30 '22 by Tin


I've done this before. It's all the DOM manipulation that slows things down because of the repaint/reflow process firing after each addition.

Build the table as a single string on the client, then insert that string into the DOM using .html().

This has worked quite successfully for me, even on IE6.

answered Sep 30 '22 by Diodeus - James MacFarlane


Multiple DOM append operations will kill performance. You may also run into a problem with string immutability.

Keep the data as small as possible (JSON arrays are good), and build the HTML in script while avoiding JavaScript's string-concatenation problem (strings are immutable, so repeated concatenation creates many intermediate strings). Push the HTML fragments onto an array, then join the array afterwards. Do a single DOM append once the HTML has been created, e.g.

var builder = [];

//Inside a loop
builder.push('<tr><td>');
builder.push(json.value);
builder.push('</td></tr>');

//Outside the loop
$('div').append(builder.join(''));
answered Sep 30 '22 by James Westgate