 

What is a good strategy for handling large lists on websites without a pager?

Scenario

I have a web interface (in a large web application) that allows a user to make connections between two very large lists.

List A - 40,000+ items
List B - 1,000+ items
List C - The items in List B that are connected to the selected item in List A

The Code

Here is a rough JSFiddle of the current behavior, minus the AJAX update of the database.

Here is the primary functionality (included only because Stack Overflow requires a code snippet alongside JSFiddle links).

$('.name-listb input').add('.name-listc input').click(function (e) {
    var lista_id = $('.name-lista input:checked').val();
    var listb_id = $(this).val();
    var operation = $(this).prop('checked') ? 'create' : 'delete';
    var $listb = $('.name-listb .checkbox-list');
    var $listc = $('.name-listc .checkbox-list');

    if (operation == 'create') {
        // Check the item in List B and clone its row into List C.
        // Attribute values must be quoted in the selector, or jQuery
        // will throw a syntax error for numeric values.
        $listb.find('input[value="' + listb_id + '"]').prop('checked', true);
        // Ajax request to add checked item.
        var $new_item = $listb.find('input[value="' + listb_id + '"]').parents('.option-group').clone();
        $listc.append($new_item);
    } else if (operation == 'delete') {
        // Uncheck the item in List B and remove its clone from List C.
        $listb.find('input[value="' + listb_id + '"]').prop('checked', false);
        // Ajax request to remove checked item.
        $listc.find('input[value="' + listb_id + '"]').parents('.option-group').remove();
    }
});

The Problem

The requirements do not allow me to use an autocomplete field or a pager, but the current page takes far too long to load (between 1 and 5 seconds, depending on caching). The JS behaviors are also attached to all 40,000+ items, which causes problems on lower-performance computers (tested on a newish $200 consumer special, and the computer was crippled by the JS). There is also a filter (not in the JSFiddle, but in the final product) that narrows the list based on text input.
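Since the handler logic is identical for every checkbox, one delegated listener per list container does the same job as 40,000+ direct bindings; jQuery matches the selector at event time, so rows added later are covered too. A minimal sketch, assuming the markup from the fiddle (the selectors and the `typeof` guard are illustrative):

```javascript
// Pure helper so the create/delete decision is easy to test in isolation.
function toggleOperation(isChecked) {
    return isChecked ? 'create' : 'delete';
}

// One delegated handler per container instead of one handler per checkbox.
// jQuery resolves '.checkbox-list input' when the click happens, so the
// binding cost is independent of list size.
if (typeof $ !== 'undefined') {
    $('.name-listb, .name-listc').on('click', '.checkbox-list input', function () {
        var listb_id = $(this).val();
        var operation = toggleOperation(this.checked);
        // ...same create/delete logic as the original handler...
    });
}
```

Delegation alone removes the per-item binding cost, though the 40k DOM nodes themselves still have to be dealt with separately.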

The Question

What is a good strategy for handling this scenario?

My Idea

My first thought was a sort of windowed document-view architecture: a JavaScript list that adds items to the top and bottom as the user scrolls and drops items off the other end once the list reaches a certain size. To filter, I would discard the whole list and fetch a new list of filtered items, like an autocomplete, but one that can keep scrolling and loading items via AJAX. This is very complicated, though, so I was hoping someone might have a better idea, or know a jQuery plugin that already takes this approach.
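The windowing math behind that idea is fairly small. A sketch, assuming fixed-height rows (the function and parameter names are mine, not from any plugin):

```javascript
// Given the scroll position, return the half-open range [first, last) of row
// indices that should be in the DOM: the rows intersecting the viewport plus
// a buffer of extra rows on each side so fast scrolling doesn't show gaps.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows, buffer) {
    var first = Math.max(0, Math.floor(scrollTop / rowHeight) - buffer);
    var last = Math.min(totalRows,
        Math.ceil((scrollTop + viewportHeight) / rowHeight) + buffer);
    return { first: first, last: last };
}
```

The container keeps a spacer element of height `totalRows * rowHeight` so the scrollbar stays honest, and on each scroll event only the rows in the returned range are rendered, absolutely positioned at `index * rowHeight`.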

Update

List A is 70K and fixed. List B is user generated and will span between 1k and 70k items. That said, just optimizing the JS with the excellent feedback about using delegates (which will make life 10x more awesome) won't be enough; I still need to limit the visible list.
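Limiting the visible list also turns the text filter into a data problem rather than a DOM problem: keep the 70k items in a plain array, filter the array, and hand the result to whatever renders the visible window. A sketch (the item shape and names are assumptions):

```javascript
// Case-insensitive substring filter over the in-memory item list. Only the
// currently visible slice of the result ever reaches the DOM, so filtering
// tens of thousands of items stays cheap even on weak hardware.
function filterItems(items, query) {
    var q = query.toLowerCase();
    if (q === '') return items;
    return items.filter(function (item) {
        return item.name.toLowerCase().indexOf(q) !== -1;
    });
}
```

Debouncing the text input's keyup event (e.g. by 150-250ms) keeps the filter from running on every keystroke.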

Your Ideas?

danielson317 asked Aug 26 '15 17:08

1 Answer

I've encountered this issue on numerous projects, and one solution that's both easy to implement and performant is a library like Infinity.js.
To summarize briefly: Infinity, like many other "infinite scroll" libraries, renders only the small part of the list that is visible (or soon will be), which reduces the strain on the browser tremendously. You can see a simple live demo over here; check the first link for the API reference.

Etheryte answered Oct 17 '22 00:10