I have an ASP.NET MVC application that returns a JSON result containing up to n years' worth of data, which is then rendered in a JavaScript chart.
For a good user experience (in terms of performance), I'm looking for the best solution: is it possible to cache the JSON data on the client side, so that when the user toggles the chart between different views (day, week, etc.), the same JSON data is reused without hitting the server?
Could someone help me make the best decision on caching practices: should the data be cached on the client side, cached on the server side, or should each graph toggle hit the database directly?
Thanks in advance.
First of all, where is the database? If you are on a local network with gigabit LAN, hitting it won't be a problem. That is not true over the internet, though: people have limited bandwidth, especially on mobile, so you should limit your HTTP calls. Fewer HTTP calls also mean less strain on the server.
Here are some tips:
Consider pagination
When loading "2 years' worth" of data, I imagine quite a lot, like a 100+ page thesis. Consider paginating the data instead of loading it all at once. This saves bandwidth as well as cache space (if that is ever limited).
How to: have the server script slice up the data according to what the client wants. It's pretty easy to create pagination in SQL using LIMIT in the query; the logic is starting_item = (page_needed - 1) * items_per_page.
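As a minimal sketch of the client side (the /chartdata endpoint and its page/pageSize parameters are made up for illustration, not part of your app):

```js
// Request a single page of chart data from a hypothetical endpoint that
// accepts "page" and "pageSize" query parameters.
var pageSize = 500; // rows per request; tune to your data density

function fetchPage(page, callback) {
  // Server side this maps to: LIMIT pageSize OFFSET (page - 1) * pageSize
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/chartdata?page=' + page + '&pageSize=' + pageSize);
  xhr.onload = function () {
    callback(JSON.parse(xhr.responseText));
  };
  xhr.send();
}

fetchPage(1, function (rows) {
  console.log('Loaded ' + rows.length + ' rows for page 1');
});
```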
JSONify data
Use JSON for transporting data to and from the server. Aside from being lightweight, it's also structured, which makes it easier to parse and store later on.
How to: PHP has a json_encode function to convert arrays into JSON strings; I assume your framework has a similar feature (in ASP.NET MVC, returning a JsonResult does this for you). Have the string echoed on a page, then use JSON.parse to convert the JSON string into a JS object. JSON methods are native in modern browsers, but if you need to cater to old browsers, Crockford's json2.js library provides them.
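For example, assuming the response body is a JSON string like the one below:

```js
// Round-trip: parse the server's JSON string, use it, serialize it back.
var raw = '[{"date":"2013-01-01","value":42},{"date":"2013-01-02","value":57}]';

var points = JSON.parse(raw);            // string -> array of plain JS objects
console.log(points[0].value);            // 42

var serialized = JSON.stringify(points); // object -> string, e.g. for caching
```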
Use a well known storage framework
If persistent storage is needed for the cache across pages, I recently came across PersistJS, which abstracts over whichever storage backends are available in the browser, localStorage included. Also, here's a JS implementation of LZW; keep it handy, since localStorage stores data as strings and has a 5-10 MB limit.
How to: convert the data into a string using JSON.stringify and store it with PersistJS. Then, for retrieval, get the string and parse it back using JSON.parse().
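A sketch using plain localStorage for clarity (the saveSeries/loadSeries helpers are illustrative names; PersistJS exposes a similar key/value interface on top of whatever backend the browser supports):

```js
// Cache a chart series as a string; objects must be serialized first.
function saveSeries(key, series) {
  localStorage.setItem(key, JSON.stringify(series));
}

function loadSeries(key) {
  var raw = localStorage.getItem(key);
  return raw === null ? null : JSON.parse(raw); // null means cache miss
}

saveSeries('daily-view', [{ date: '2013-01-01', value: 42 }]);
var cached = loadSeries('daily-view'); // -> the array, no HTTP call needed
```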
Call only when needed
Have the cache system call the server only if something has been modified or added, or if the data isn't cached yet. If the data is already there, why call the server for it?
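A sketch of that cache-first logic, reusing the illustrative helpers and endpoint from the earlier sketches:

```js
// Serve from cache when possible; hit the server only on a miss.
function getChartData(view, callback) {
  var cached = loadSeries(view);
  if (cached !== null) {
    callback(cached);                // cache hit: zero HTTP calls
    return;
  }
  var xhr = new XMLHttpRequest();    // cache miss: fetch once, then store
  xhr.open('GET', '/chartdata?view=' + encodeURIComponent(view));
  xhr.onload = function () {
    var data = JSON.parse(xhr.responseText);
    saveSeries(view, data);
    callback(data);
  };
  xhr.send();
}
```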
Sync the cache
If you're worried about stale data, have some AJAX sync your cache system using one of the live data-fetching techniques described in this wiki about Comet.
The last two points depend on your cache framework, but BackboneJS allows its models and collections to sync with the server, which provides the same functionality I mentioned.
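For instance, a minimal Backbone sketch (the /chartdata URL and saveSeries helper are carried over from the earlier illustrative sketches, not a prescribed setup):

```js
// A Backbone collection bound to the chart endpoint; fetch() re-syncs it.
var ChartSeries = Backbone.Collection.extend({
  url: '/chartdata'
});

var series = new ChartSeries();
series.fetch({
  success: function (collection) {
    saveSeries('daily-view', collection.toJSON()); // refresh the cached copy
  }
});
```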