How can I manage large sets of filtered, sorted, paginated data in a single-page app that must also be functional on resource-limited devices (e.g. mobile)?
More specifically, I have a dataset of 1,000+ geocoded place-information objects, filtered to those contained within the map viewport. Due to memory constraints on mobile, I cannot cache the entire dataset; however, due to slow network connections on mobile, I also want to minimise HTTP requests.
What is the best strategy for the combination of maximum caching and minimum network requests?
I am building a map split-view for a potentially large (1000+) dataset of information in Australia:
+----------------------------------------+-----------------------------+
| | +----------+ |
| | Sort By: |Suburb |\/| |
| | +----------+ |
| |+---------------------------+|
| _,__ .: || ||
| <* / | \ || Blah Blah ||
| .-./ |. : :, || ||
| / '-._/ \_ || Zoom Zip ||
| / ' \ || ||
| .' *: |+---------------------------+|
| .-' ; |+---------------------------+|
| | | || ||
| \ / || Bing Bong ||
| | / || ||
| \* __.--._ / || Banana for scale ||
| \ _.' \:. | || ||
| >__,-' \_/*_.-' |+---------------------------+|
| |+---------------------------+|
| :--, || ||
| '/ || Whoopty Doo ||
| || ||
+----------------------------------------+-----------------------------+
Driving requirements are:
I have gone through a couple of iterations of handling the data in the app*:
The problems arise from memory constraints on mobile, since each item in the list can potentially carry a large JSON payload.
There is also the issue of overloading the DOM when loading the entire list at once (the render time in the browser makes the app unusable).
Doing this a "page" at a time (where a page is ~30 items), we solve the initial memory requirement, but after scrolling so far we hit the limits again on mobile.
There is also a network requirement on mobile now - GPRS/EDGE speeds are unfortunately still common in Australia, and really suck.
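For illustration, a minimal sketch of the page-at-a-time fetch; the `/api/places` endpoint and its `bbox`/`sort`/`page`/`pageSize` parameters are placeholders, not the real API:

```typescript
// Hypothetical shape of one list item; the real payload is much larger.
interface Place {
  id: string;
  name: string;
  lat: number;
  lng: number;
}

async function fetchPage(
  bbox: [number, number, number, number], // [west, south, east, north]
  sort: string,
  page: number,
  pageSize = 30
): Promise<Place[]> {
  const params = new URLSearchParams({
    bbox: bbox.join(","),
    sort,
    page: String(page),
    pageSize: String(pageSize),
  });
  const res = await fetch(`/api/places?${params}`);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}
```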
Old DOM elements can be replaced with empty placeholders (to keep the affordance of scrolling), and the associated items in the list can be removed from memory. This solves the DOM limitation.
However, we have now doubled the network load if the user scrolls back up the list, i.e. there is no cached data anymore.
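A rough sketch of the placeholder recycling, assuming a fixed row height and using an IntersectionObserver to decide which rows keep their content; the class names and container selector are illustrative:

```typescript
const ROW_HEIGHT = 120; // assumed fixed row height in px

const observer = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    const row = entry.target as HTMLElement;
    if (entry.isIntersecting) {
      row.classList.remove("placeholder");
      // Real content would be re-rendered here, from cache or a new request.
    } else {
      // Keep the row's height so the scrollbar extent is preserved,
      // but drop the heavy subtree from the DOM.
      row.classList.add("placeholder");
      row.style.height = `${ROW_HEIGHT}px`;
      row.innerHTML = "";
    }
  }
}, { root: document.querySelector("#results"), rootMargin: "200% 0px" });

document.querySelectorAll("#results .row").forEach((row) => observer.observe(row));
```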
If, instead, we keep the evicted items in an in-memory cache rather than discarding them, the cache can stay warm, and scrolling back should be an instant load with no network requests.
But what if the user alters the filters / sorting criteria? A single cache is no longer effective as the results in the list may need to be re-filtered and re-sorted for display.
Since the dataset is lazily loaded, the app never holds the complete dataset needed to perform the filter and sort correctly, so an HTTP request is required.
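One way to structure such a cache is two-level: items stored once by id, and each filter/sort combination storing only the ordered list of ids it produced. A sketch (all names are illustrative); it also shows why a filter change still forces a request, since the item payloads can be reused but the ordering for the new query is unknown until the server answers:

```typescript
type Place = { id: string } & Record<string, unknown>;

const itemsById = new Map<string, Place>();            // heavy payloads, stored once
const orderingsByQuery = new Map<string, string[]>();  // queryKey -> ordered ids

function queryKey(bbox: number[], filters: Record<string, string>, sort: string): string {
  return JSON.stringify({ bbox, filters, sort });
}

function storeResults(key: string, results: Place[]): void {
  orderingsByQuery.set(key, results.map((p) => p.id));
  for (const p of results) itemsById.set(p.id, p);
}

// Returns cached results only if the ordering for this exact query is known
// and every referenced item is still in memory.
function cachedResults(key: string): Place[] | undefined {
  const ids = orderingsByQuery.get(key);
  if (!ids) return undefined;
  const items = ids.map((id) => itemsById.get(id));
  return items.every(Boolean) ? (items as Place[]) : undefined;
}
```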
The worst case is when a user pans the map slightly: 1-2 items may fall off one side, while 2-3 are added on the opposite side. Now, instead of re-using any cached data, we are unnecessarily reloading everything over the network.
We could send the known ids with the request and only receive the delta of the results. This saves data being downloaded over the wire; however, it may result in large amounts of data being sent up the wire so the API can compute the delta.
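A sketch of what that exchange might look like, assuming a hypothetical `/api/places/delta` endpoint that accepts the client's known ids and returns only additions and removals:

```typescript
type Place = { id: string } & Record<string, unknown>;

interface DeltaResponse {
  added: Place[];       // items now in the viewport that the client lacks
  removedIds: string[]; // ids the client holds that are no longer in view
}

async function fetchDelta(
  bbox: [number, number, number, number],
  knownIds: string[]
): Promise<DeltaResponse> {
  // Note the cost this iteration worries about: knownIds goes *up* the wire.
  const res = await fetch("/api/places/delta", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ bbox, knownIds }),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}

function applyDelta(store: Map<string, Place>, delta: DeltaResponse): void {
  for (const id of delta.removedIds) store.delete(id);
  for (const p of delta.added) store.set(p.id, p);
}
```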
Another option is to track the ranges of data already loaded and only request what falls outside them. Ranges could be a sub-set of the visible bounding box (think spatial partitioning), or a "page" of 30 results when scrolling, etc.
This solves the problem of too much network traffic (both up and down), but I am afraid that the metadata required for storing the known ranges may itself reach memory limits on a mobile device.
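A sketch of the range-tracking idea using a fixed grid over the bounding box; the cell size and the `cells` query parameter are assumptions. In this sketch the per-range bookkeeping is only a set of short string keys; the items themselves would still need their own cache and eviction:

```typescript
const CELL_DEG = 0.5;                  // grid cell size in degrees (placeholder value)
const fetchedCells = new Set<string>(); // keys of cells already loaded

function cellsFor(bbox: [number, number, number, number]): string[] {
  const [west, south, east, north] = bbox;
  const cells: string[] = [];
  for (let x = Math.floor(west / CELL_DEG); x <= Math.floor(east / CELL_DEG); x++) {
    for (let y = Math.floor(south / CELL_DEG); y <= Math.floor(north / CELL_DEG); y++) {
      cells.push(`${x}:${y}`);
    }
  }
  return cells;
}

async function loadViewport(bbox: [number, number, number, number]): Promise<void> {
  const missing = cellsFor(bbox).filter((c) => !fetchedCells.has(c));
  if (missing.length === 0) return; // a small pan often costs no request at all
  const res = await fetch(`/api/places?cells=${missing.join(",")}`);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const places = await res.json();
  // ...merge `places` into the item cache here
  missing.forEach((c) => fetchedCells.add(c));
}
```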
By removing this requirement, we can limit the results we show to a maximum of, say, 50-100, and stick with the Iteration 1 (Naive) approach when we are running on a device detected as limited in capabilities.
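A sketch of that capability check; `navigator.deviceMemory` is only exposed by some (Chromium-based) browsers, and the 2 GB threshold and 50-item cap are arbitrary placeholders:

```typescript
function isLimitedDevice(): boolean {
  // deviceMemory is not in the standard DOM typings and not universally supported.
  const mem: number | undefined = (navigator as any).deviceMemory;
  const smallScreen = window.matchMedia("(max-width: 768px)").matches;
  return (mem !== undefined && mem <= 2) || smallScreen;
}

// Cap the visible result set on constrained devices and fall back to the
// simple approach; otherwise leave it unbounded.
const MAX_RESULTS = isLimitedDevice() ? 50 : Infinity;
```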
How can I manage large sets of filtered, sorted, paginated data in a single-page app that must also be functional on resource-limited devices (e.g. mobile)?
* Note: I have already figured out the rendering of the data on the map by using a clustering technique based on the current zoom level, so this is not an issue.
Your cache can grow to almost any size as long as you have the RAM to handle it, so 30 MB won't be a problem unless you are on a very limited device. You can specify an optional "size" limit on your MemoryCache instance; by default there is no limit.
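If a size limit is wanted on the client as well, here is a minimal sketch of a cache capped by entry count (a tiny LRU built on Map insertion order); this is not the MemoryCache class mentioned above, just the same idea expressed in the app's own language:

```typescript
class LruCache<K, V> {
  private map = new Map<K, V>();

  constructor(private maxEntries: number) {}

  get(key: K): V | undefined {
    const value = this.map.get(key);
    if (value !== undefined) {
      // Re-insert to mark the entry as most recently used.
      this.map.delete(key);
      this.map.set(key, value);
    }
    return value;
  }

  set(key: K, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxEntries) {
      // Evict the least recently used entry (first key in insertion order).
      const oldest = this.map.keys().next().value as K;
      this.map.delete(oldest);
    }
  }
}

// Example: keep at most 200 place payloads in memory at once.
const placeCache = new LruCache<string, object>(200);
```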
You can also optimize on the database side by tweaking the schema: indexing is one way of speeding up reads, and the database keeps a working set of frequently requested data in memory. Server caching, by contrast, is custom caching of data in the server application itself.
If you need a managed server-side cache, Amazon ElastiCache is a web service for deploying, operating, and scaling an in-memory data store or cache in the cloud.
Your best option is to implement it in a very different way. What I usually do is take a layered approach. In your case, perhaps you should have a first aggregation layer by state, then by city, etc. That way you never load hundreds of data points at once, which, by the way, is not what your user wants: they're usually looking for something specific or for aggregated data, never for thousands of results to browse through manually.
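A sketch of how that layering might look on the client; the zoom thresholds and the aggregate endpoint are assumptions:

```typescript
type Level = "state" | "city" | "place";

// Pick the aggregation level from the current zoom; only zoomed-in views
// ever deal with individual places.
function levelForZoom(zoom: number): Level {
  if (zoom < 6) return "state";
  if (zoom < 12) return "city";
  return "place";
}

async function loadLayer(zoom: number, bbox: [number, number, number, number]) {
  const level = levelForZoom(zoom);
  const params = new URLSearchParams({ level, bbox: bbox.join(",") });
  // At "state"/"city" level the server would return counts per region, so the
  // client never holds thousands of individual items at once.
  const res = await fetch(`/api/places/aggregate?${params}`);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}
```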