
Internet Explorer slow to render tables generated via JavaScript

I'm working on a page on a web app with a large table. 12 columns and up to 300 rows in some cases. I'm having difficulty getting the table to render quickly in Internet Explorer. I've replicated my difficulties in this bit of test code:

http://jsfiddle.net/dSFz5/

Some benchmarks with IE9 on an Intel Quad Core Q8200 with 4GB RAM:
50 rows, 12 columns: 432ms
100 rows, 12 columns: 1023ms
200 rows, 12 columns: 2701ms
400 rows, 12 columns: 8107ms
800 rows, 12 columns: 24619ms

Performance degrades far worse than linearly as the row count grows.

I managed to dig up some code that renders the same test table MUCH faster on Internet Explorer, but because I'm using mustache.js templates to render my cells and rows (keeping all HTML markup out of my JavaScript), I'm not able to use these DOM methods:

http://jsfiddle.net/bgzLG/

Benchmark results:
50 rows, 12 columns: 37ms
100 rows, 12 columns: 72ms
200 rows, 12 columns: 146ms
400 rows, 12 columns: 324ms
800 rows, 12 columns: 566ms

I can't construct the table block by block like in the second example, because with client-side templates I need to inject strings of HTML returned by mustache. If I start inserting those strings via .innerHTML inside the loop, the performance tanks again.
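To illustrate the pattern I'm describing, here is a minimal sketch (renderRow is a hypothetical stand-in for a mustache.js row template, not my actual code): appending each rendered row to the live table is the slow path, while accumulating the rendered strings and touching the DOM once is fast.

```javascript
// Hypothetical stand-in for a mustache.js row template: takes an array of
// cell values and returns one <tr>...</tr> markup string.
function renderRow(cells) {
    return '<tr>' + cells.map(function (c) {
        return '<td>' + c + '</td>';
    }).join('') + '</tr>';
}

// Slow pattern (sketched as a comment, since it needs a live DOM):
// every "+=" forces the browser to re-parse and re-render the table.
//   rows.forEach(function (r) { tbody.innerHTML += renderRow(r); });

// Faster: render all rows into one string, then assign it once.
var rows = [['a', 'b'], ['c', 'd']];
var html = rows.map(renderRow).join('');
//   tbody.innerHTML = html;  // single parse, single reflow
```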

Can anyone recommend a way to build a table in a more efficient manner compliant with the use of client-side templates?

Pagination is one way to manage this issue, but I'd like to resolve the problem itself.

Any suggestions much appreciated!

asked Feb 21 '23 by Elliot B.


1 Answer

First, I would suggest separating the creation of the cell strings from the construction of the table itself, so that string generation doesn't add overhead to the rendering time you're measuring. Next, build the whole table before appending it to the body, to minimize the number of repaints and reflows. Finally, build the markup by joining an array rather than concatenating strings: in IE, repeated string concatenation allocates a larger and larger memory block for every intermediate copy, whereas an array join allocates only enough memory to hold the final string.

// Pre-generate the cell contents so string creation isn't part of the benchmark.
// randomString() comes from the original fiddle; a simple stand-in:
function randomString() {
    return Math.random().toString(36).slice(2, 10);
}

var strings = [],
    table = ['<table>'],
    i, j;

for (i = 0; i < 1000; i += 1) {
    strings[i] = [];

    for (j = 0; j < 12; j += 1) {
        strings[i][j] = randomString();
    }
}

var start = new Date().getTime();

// Build the entire table as an array of markup fragments...
for (i = 0; i < 1000; i += 1) {
    table.push('<tr>');

    for (j = 0; j < 12; j += 1) {
        table.push('<td>', strings[i][j], '</td>');
    }

    table.push('</tr>');
}

table.push('</table>');

// ...then join once and touch the DOM once.
$('body').append(table.join(''));

var end = new Date().getTime();
alert('Execution time: ' + (end - start) + 'ms');

Using this method, I get the following results in IE9:

100 rows is ~9ms
200 rows is ~19ms
500 rows is ~51ms
1000 rows is ~119ms
5000 rows is ~526ms

I'm sure this could be optimized further, but it should be more than enough for the 300 rows you mentioned (~30ms). It also stays under the commonly cited ~50ms budget for keeping UI interactions feeling instant.
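This approach is compatible with your mustache.js constraint: render each row template into the array instead of hand-building the tags. A rough sketch (the render function below is a minimal stand-in for Mustache.render, which you'd use in the real page; the template string and field names are made up for the example):

```javascript
// Minimal stand-in for Mustache.render: substitutes {{name}} tokens
// from a view object. In the real page, use Mustache.render instead.
function render(template, view) {
    return template.replace(/\{\{(\w+)\}\}/g, function (match, key) {
        return view[key];
    });
}

var rowTemplate = '<tr><td>{{id}}</td><td>{{name}}</td></tr>';
var data = [{ id: 1, name: 'alpha' }, { id: 2, name: 'beta' }];

// Render every row into the array, then join and append once.
var parts = ['<table>'];
data.forEach(function (row) {
    parts.push(render(rowTemplate, row));
});
parts.push('</table>');
var html = parts.join('');
// $('body').append(html);  // one DOM insertion, one reflow
```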

answered Mar 08 '23 by Steve Savoy