 

How to deal with large data sets with jQuery Isotope

I am planning on using the great Isotope plugin for displaying a list of contacts and then allowing them to be filtered. The issue I have is that it works great for a small data set, but I'm not sure of the best way to scale it up to 1000+ pieces of data.

So far the ideas I had were:

  • loading a random subset and then adding nodes to it as filters are clicked to fill in the gaps
  • loading more nodes as a user scrolls
  • paging the results
  • not displaying contacts until enough filters have been selected to bring the numbers below a predefined threshold.

I'm not sure if these will work well and I was hoping others had faced this situation and could give me some ideas.

Josh asked Jan 28 '12

2 Answers

The situation you describe is pretty common: how to give your user access to more data than they can possibly see in detail at once.

There are several ways to answer the question and the correct answer is completely subjective: it depends on what your user is trying to see or do with the contacts. Before you can really get a satisfactory solution, you need to know what the users are going to use the contacts for.

Just guessing (but you would know better than me!), I'd expect there are two things they're doing:

  • Lookup: Looking for a specific contact and they already know their name/handle.
  • Explore: Looking for a specific contact but they can't quite remember their name/handle. Or they're just browsing.

If you do filtering for all the solutions, then the Lookup goal is pretty much in the bag. The Explore goal is the one you want to design for:

  • Random Subset: It's not a great way to browse, since you're basically left with a subset to browse and then you must explicitly filter to see anything new. Hard to filter when you don't know exactly what you're looking for.
  • Infinite Scrolling: seems like a popular solution these days. I find it cumbersome, especially if you are 'infinitely' scrolling through 1000+ contacts. Probably not great for the Explore goal.
  • Paging: Also cumbersome - but perhaps if the paging is tied to alphabetical sorting this could work well.
  • Threshold limiting: so... simply relying on the filtering? This may be bad in some corner cases where the user applies one filter and still sees nothing because the threshold hasn't been met (maybe there are a lot of people with the last name Johnson, which is what you searched for). Plus, I think the ability to browse is important when you don't know what you are looking for.

I think if I were in your shoes, I'd introduce some clustering of the contacts. I doubt that 1000+ contacts is much of a performance problem (unless you're talking a million!), so the 1000+ is really a user constraint: they just can't view 1000 contacts at once.

I'd suggest introducing some clustering, probably by last name or by last name and first name. Then present the user with a way to drill into one cluster while folding up all the other contacts so they aren't immediately visible. Something in the realm of the accordion/Rolodex paradigm. This gives your user the illusion that they are working with 'all the contacts'. Probably introduce a minimum size for each cluster so that if the cluster is sufficiently small you don't bother showing it (i.e., why show a cluster for 2 or 3 or 5 contacts - just show the contacts). As filters are applied, the clusters melt away.
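A rough sketch of that idea with the jQuery version of Isotope, assuming a hypothetical #contacts container and a contacts array of { first, last } objects (the names, markup, and minClusterSize parameter are all illustrative, not a fixed API):

function renderClusters(contacts, minClusterSize) {
  var $container = $('#contacts').isotope({ itemSelector: '.item' });

  // Group contacts by the first letter of the last name
  var clusters = {};
  $.each(contacts, function (i, c) {
    var key = c.last.charAt(0).toUpperCase();
    (clusters[key] = clusters[key] || []).push(c);
  });

  // Small clusters are shown directly; large ones fold up behind a header
  $.each(clusters, function (letter, group) {
    var itemsHtml = $.map(group, function (c) {
      return '<div class="item">' + c.first + ' ' + c.last + '</div>';
    }).join('');

    if (group.length < minClusterSize) {
      $container.isotope('insert', $(itemsHtml));
    } else {
      var $header = $('<div class="item cluster">' + letter + ' (' + group.length + ' contacts)</div>');
      $header.on('click', function () {
        // Drilling in: replace the cluster header with its contacts
        $container.isotope('remove', $header);
        $container.isotope('insert', $(itemsHtml));
      });
      $container.isotope('insert', $header);
    }
  });
}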

dsummersl answered Sep 24 '22


Taking the idea of a read-through cache, something like this:

  • create a method that can load a batch of up to 100 (or any configurable number) elements. It would:
    • search the cache (a JS array keyed by the ID of the element) for the filtered items
    • request the filtered items by AJAX
    • items returned by AJAX would be added to the cache
    • items returned by AJAX would also be added to a "loading" area at the bottom of the DOM (see below), with the id of each created DIV being the primary key of the element
    • the server would send up to 100 elements. If there is no filter, it would send the next elements that have not yet been sent. You would need to keep track of the loaded elements. If the size of cached data on the server side (i.e. session) is critical, you can keep track of only the highest continuous sent ID (i.e. if the 1st batch sends IDs 1, 2, 3, 6, 9, 10, the highest continuous ID is 3, so next time you would send from 4 onwards; that way you keep only one value in the session)
  • create a method that can move the cached DIVs to/from the isotope container
  • onDomReady, load using the method above and display the first 20 elements in natural order (in your case, alphabetically by name). It can be 20 elements, or 50, or whatever fits
  • in the background, load all the elements by AJAX in a loop, in batches of 100 (a sketch of the loader follows below)
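
A minimal sketch of that batch loader, assuming a hypothetical /contacts endpoint that returns JSON objects with id and name fields, and the hidden #loader area shown further down; the cache-search step for already-filtered items is omitted here, and duplicates are simply skipped:

var cache = {};            // client-side cache, keyed by the element's primary key
var BATCH_SIZE = 100;      // configurable batch size

function loadBatch(filter, done) {
  $.ajax({
    url: '/contacts',                              // hypothetical endpoint
    data: { filter: filter, limit: BATCH_SIZE },
    dataType: 'json',
    success: function (items) {
      $.each(items, function (i, item) {
        if (cache[item.id]) { return; }            // already cached, skip it
        cache[item.id] = item;
        // Park the markup in the hidden loading area (see the HTML below),
        // using the primary key as the DIV id
        $('#loader').append(
          '<div class="item" id="contact-' + item.id + '">' + item.name + '</div>'
        );
      });
      if (done) { done(items); }
    }
  });
}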

The loading area could simply be:

<html>
  <body>
    <!-- the page stuff -->
    <div id="loader" style='display:none'>
      <!-- all elements are loaded here -->
      <div class="item">...</div>
    </div>
  </body>
</html>

This way you can load all elements step by step in the DOM, and you can display only what is needed.
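And a sketch of the wiring under the same assumptions: a method that moves cached DIVs into the Isotope container (a hypothetical #contacts element), the initial display of the first 20 elements on DOM ready, and the background loop that keeps fetching batches of 100 until the server has nothing left to send:

// Move cached DIVs (looked up by their contact-<id> ids) into the Isotope container
function showContacts(ids) {
  if (!ids.length) { return; }
  var selector = $.map(ids, function (id) { return '#contact-' + id; }).join(',');
  var $items = $('#loader').children(selector);
  $('#contacts').isotope('insert', $items);        // Isotope appends and lays them out
}

$(function () {
  var shownFirst = false;
  $('#contacts').isotope({ itemSelector: '.item' });

  // Background loop: keep requesting batches until the server runs dry
  (function loadAll() {
    loadBatch(null, function (items) {
      if (!shownFirst && items.length) {
        shownFirst = true;
        // Display the first 20 elements in natural (alphabetical) order
        showContacts($.map(items.slice(0, 20), function (c) { return c.id; }));
      }
      if (items.length === BATCH_SIZE) { setTimeout(loadAll, 0); }
    });
  })();
});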

JScoobyCed answered Sep 22 '22