 

Working with large Backbone collections

We're designing a Backbone application in which each server-side collection has the potential to contain tens of thousands of records. As an analogy, think of going into the 'Sent Items' view of an email application.

In the majority of Backbone examples I've seen, the collections involved are at most 100-200 records, and therefore fetching the whole collection and working with it in the client is relatively easy. I don't believe this would be the case with a much larger set.

Has anyone done any work with Backbone on large server-side collections?

  • Have you encountered performance issues (especially on mobile devices) at a particular collection size?
  • What decision(s) did you take around how much to fetch from the server?
  • Do you download everything or just a subset?
  • Where do you put the logic around any custom mechanism (Collection prototype for example?)
asked Sep 27 '11 at 09:09 by isNaN1247



2 Answers

  1. Yes, at about 10,000 items, older browsers could not handle the display well. We thought it was a bandwidth issue, but even locally, with as much bandwidth as a high-performance machine could throw at it, Javascript just kinda passed out. This was true on Firefox 2 and IE7; I haven't tested it on larger systems since.

  2. We were trying to fetch everything. This didn't work for large datasets. It was especially pernicious with Android's browser.

  3. Our data was in a tree structure, with other data depending on the presence of data in the tree. The data could change due to actions from other users or other parts of the program. Eventually, we made the tree structure fetch only the currently visible nodes, and the other parts of the system independently verified the validity of the datasets they depended on. This is a race condition, but in actual deployment we never saw any problems. I would have liked to use socket.io here, but management didn't understand or trust it.

  4. Since I use CoffeeScript, I just inherited from Backbone.Collection and created my own superclass, which also provides a custom sync() call. The syntax for invoking a superclass's method is really useful here:

    class Dataset extends BaseAccessClass
        initialize: (attributes, options) ->
            Dataset.__super__.initialize.apply(@, arguments)
            # Customizations go here.
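
     In plain JavaScript, the sync() half of such a superclass might look roughly like the sketch below; the BaseCollection name and the page/perPage parameters are assumptions for illustration, not the answer's actual code:

        // Hypothetical sketch: a base collection whose sync() only ever asks
        // the server for one page of records at a time.
        var BaseCollection = Backbone.Collection.extend({
            page: 1,
            perPage: 100,

            sync: function (method, collection, options) {
                options = options || {};
                if (method === 'read') {
                    // Merge paging parameters into whatever query data was passed in.
                    options.data = _.extend({}, options.data, {
                        page: this.page,
                        per_page: this.perPage
                    });
                }
                return Backbone.sync(method, collection, options);
            }
        });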
    
answered Oct 06 '22 at 08:10 by Elf Sternberg


Like Elf said, you should really paginate data loading from the server. You'd save a lot of load on the server by not downloading items you may never need. Just creating a collection with 10k models locally in Chrome takes half a second. It's a huge load.
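
A minimal sketch of paging the fetches, assuming the server accepts page/per_page query parameters (those names, and the /api/items URL, are assumptions):

    // Hypothetical sketch: fetch one page at a time and append it to the
    // collection instead of resetting everything already loaded.
    var items = new Backbone.Collection();
    items.url = '/api/items';

    function fetchPage(page) {
        return items.fetch({
            data: { page: page, per_page: 100 },
            remove: false   // keep earlier pages (Backbone >= 1.0; older versions used {add: true})
        });
    }

    fetchPage(1);               // initial load
    // later, e.g. on scroll: fetchPage(2);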

You can put the work on another CPU thread by using a Web Worker, then send plain (transient) objects back to the main thread to render them in the DOM.
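
A rough sketch of that split follows; the worker file name, endpoint, and message shape are assumptions:

    // main.js (hypothetical sketch): do the heavy fetching/parsing in a worker,
    // then hand plain objects back to the main thread for Backbone to render.
    var items = new Backbone.Collection();
    var worker = new Worker('collection-worker.js');

    worker.onmessage = function (event) {
        items.reset(event.data);    // event.data is an array of plain attribute objects
    };

    worker.postMessage({ url: '/api/items', page: 1 });

    // collection-worker.js (hypothetical sketch)
    self.onmessage = function (event) {
        fetch(event.data.url + '?page=' + event.data.page)
            .then(function (response) { return response.json(); })
            .then(function (records) {
                // Only plain, structured-cloneable data crosses the thread boundary.
                self.postMessage(records);
            });
    };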

Once you have a collection that big, lazy rendering into the DOM will only get you so far. Memory will slowly increase until it crashes the browser (which happens quickly on tablets). You should use object pooling on the elements; it lets you set a small maximum for memory use and keep it there.
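
One very small sketch of pooling the row elements (every name here is illustrative):

    // Hypothetical sketch: reuse a fixed set of <li> elements instead of
    // creating one DOM node per model, so memory use stays roughly constant.
    var pool = [];

    function acquireRow() {
        return pool.pop() || document.createElement('li');
    }

    function releaseRow(el) {
        if (el.parentNode) el.parentNode.removeChild(el);
        el.textContent = '';
        pool.push(el);
    }

    // Render only the models currently in view, recycling rows in between.
    function renderVisible(models, listEl) {
        Array.prototype.slice.call(listEl.children).forEach(releaseRow);
        models.forEach(function (model) {
            var row = acquireRow();
            row.textContent = model.get('title');   // 'title' is an assumed attribute
            listEl.appendChild(row);
        });
    }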

I'm building a PerfView for Backbone that can render 1,000,000 models and scroll at 120 FPS in Chrome. The code is all up on GitHub: https://github.com/puppybits/BackboneJS-PerfView. It's commented, so it shows a lot of the other optimizations you'd need to display large data sets.

answered Oct 06 '22 at 08:10 by puppybits