
Knockout.js incredibly slow under semi-large datasets

I'm just getting started with Knockout.js (I've always wanted to try it out, and now I finally have an excuse!). However, I'm running into some really bad performance problems when binding a table to a relatively small set of data (around 400 rows or so).

In my model, I have the following code:

this.projects = ko.observableArray( [] ); //Bind to empty array at startup

this.loadData = function (data) //Called when AJAX method returns
{
   for(var i = 0; i < data.length; i++)
   {
      this.projects.push(new ResultRow(data[i])); //<-- Bottleneck!
   }
};

The issue is that the for loop above takes about 30 seconds with around 400 rows. However, if I change the code to:

this.loadData = function (data)
{
   var testArray = []; //<-- Plain ol' Javascript array
   for(var i = 0; i < data.length; i++)
   {
      testArray.push(new ResultRow(data[i]));
   }
};

Then the for loop completes in the blink of an eye. In other words, the push method of Knockout's observableArray object is incredibly slow.

Here is my template:

<tbody data-bind="foreach: projects">
    <tr>
       <td data-bind="text: code"></td>
       <td><a data-bind="projlink: key, text: projname"></td>
       <td data-bind="text: request"></td>
       <td data-bind="text: stage"></td>
       <td data-bind="text: type"></td>
       <td data-bind="text: launch"></td>
       <td><a data-bind="mailto: ownerEmail, text: owner"></a></td>
    </tr>
</tbody>

My Questions:

  1. Is this the right way to bind my data (which comes from an AJAX method) to an observable collection?
  2. I expect push is doing some heavy re-calc every time I call it, such as maybe rebuilding bound DOM objects. Is there a way to either delay this recalc, or perhaps push in all my items at once?

I can add more code if needed, but I'm pretty sure this is what's relevant. For the most part I was just following Knockout tutorials from the site.

UPDATE:

Per the advice below, I've updated my code:

this.loadData = function (data)
{
   var mappedData = $.map(data, function (item) { return new ResultRow(item) });
   this.projects(mappedData);
};

However, the call to this.projects(mappedData) still takes about 10 seconds for 400 rows. I do admit I'm not sure how fast this would be without Knockout (just adding rows through the DOM), but I have a feeling it would be much faster than 10 seconds.
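
For comparison, a raw-DOM version of the same loop (no Knockout at all) would look roughly like the sketch below; the #projectsTable selector and the two columns shown are just placeholders, not the actual markup:

function makeCell(text) {
    var td = document.createElement("td");
    td.appendChild(document.createTextNode(text));
    return td;
}

var tbody = $("#projectsTable tbody")[0]; // placeholder selector
var frag = document.createDocumentFragment();

for (var i = 0; i < data.length; i++) {
    var row = document.createElement("tr");
    row.appendChild(makeCell(data[i].code));  // only two of the seven
    row.appendChild(makeCell(data[i].stage)); // columns shown here
    frag.appendChild(row);
}

tbody.appendChild(frag); // a single DOM insertion for all rows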

UPDATE 2:

Per other advice below, I gave jQuery.tmpl a shot (which is natively supported by KnockOut), and this templating engine will draw around 400 rows in just over 3 seconds. This seems like the best approach, short of a solution that would dynamically load in more data as you scroll.

Asked Mar 14 '12 by Mike Christensen


5 Answers

Please see: Knockout.js Performance Gotcha #2 - Manipulating observableArrays

A better pattern is to get a reference to our underlying array, push to it, then call .valueHasMutated(). Now, our subscribers will only receive one notification indicating that the array has changed.
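
Applied to the question's loadData, that pattern would look roughly like this (a minimal sketch; ResultRow and this.projects come from the question, the rest just follows the quoted advice):

this.loadData = function (data)
{
   var underlyingArray = this.projects();           // plain JS array behind the observableArray
   for (var i = 0; i < data.length; i++)
   {
      underlyingArray.push(new ResultRow(data[i])); // no per-push notification fired
   }
   this.projects.valueHasMutated();                 // notify subscribers once, after all pushes
};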

Answered by Jim G.


As suggested in the comments.

Knockout has its own native template engine associated with the (foreach, with) bindings. It also supports other template engines, namely jquery.tmpl. Read here for more details. I haven't done any benchmarking with different engines, so I don't know whether it will help. Reading your previous comment, in IE7 you may struggle to get the performance that you are after.

As an aside, KO supports any JS templating engine, provided someone has written an adapter for it. You may want to try others out there, as jquery.tmpl is due to be replaced by JsRender.
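
As a rough sketch of what switching the question's table to a named jquery.tmpl template might look like (this assumes jquery.tmpl is loaded before Knockout so its engine is picked up; the template id and script type are from memory of the KO docs of that era, so treat them as assumptions):

<tbody data-bind="template: { name: 'projectRowTmpl', foreach: projects }"></tbody>

<script id="projectRowTmpl" type="text/html">
    <tr>
        <td data-bind="text: code"></td>
        <td data-bind="text: stage"></td>
        <!-- ...remaining columns as in the original markup... -->
    </tr>
</script>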

Answered by madcapnmckay


Use pagination with KO in addition to using $.map.

I had the same problem with a large dataset of 1,400 records until I used paging with Knockout. Using $.map to load the records did make a huge difference, but the DOM render time was still hideous. Then I tried pagination, and that made my dataset lightning fast as well as more user friendly. A page size of 50 made the dataset much less overwhelming and reduced the number of DOM elements dramatically.

It's very easy to do with KO:

http://jsfiddle.net/rniemeyer/5Xr2X/
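
The core of the idea is just a computed that slices the full array down to the current page (a sketch; pageIndex, pageSize, and pagedProjects are hypothetical names layered on top of the question's view model, and the fiddle above is the more complete version):

this.pageSize = 50;
this.pageIndex = ko.observable(0);

// The table's foreach binds to pagedProjects instead of projects,
// so the DOM never holds more than pageSize rows at a time.
this.pagedProjects = ko.computed(function () {
    var start = this.pageIndex() * this.pageSize;
    return this.projects().slice(start, start + this.pageSize);
}, this);

Next/previous buttons then just increment or decrement pageIndex, and Knockout re-renders only the 50 visible rows.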

Answered by Tim Santeford


KnockoutJS has some great tutorials, particularly the one about loading and saving data.

In their case, they pull data using getJSON(), which is extremely fast. From their example:

function TaskListViewModel() {
    // ... leave the existing code unchanged ...

    // Load initial state from server, convert it to Task instances, then populate self.tasks
    $.getJSON("/tasks", function(allData) {
        var mappedTasks = $.map(allData, function(item) { return new Task(item) });
        self.tasks(mappedTasks);
    });    
}

Answered by deltree


Give KoGrid a look. It intelligently manages your row rendering so that it's more performant.

If you're trying to bind 400 rows to a table using a foreach binding, you're going to have trouble pushing that much through KO into the DOM.

KO does some very interesting things with the foreach binding, most of which are very good operations, but performance does start to break down as the size of your array grows.

I've been down the long dark road of trying to bind large data-sets to tables/grids, and you end up needing to break apart/page the data locally.

KoGrid does all of this. It's been built to only render the rows the viewer can currently see on the page and to virtualize the other rows until they are needed. I think you'll find its performance on 400 items to be much better than what you're experiencing.
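
Hooking it up is a single custom binding on a fixed-height container; a minimal usage sketch (from memory of the KoGrid README, so treat the exact option names as assumptions) might look like:

<!-- KoGrid virtualizes rows, so only the visible ones are rendered -->
<div style="height: 400px;" data-bind="koGrid: { data: projects }"></div>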

Answered by ericb