What is a good design for caching the results of an expensive search in an ASP.NET system?
Any ideas would be welcome, particularly those that don't require inventing a complex infrastructure of our own.
Here are some general requirements related to the problem:
I see some possible options for where and how to implement caching:
1. Cache on the server (in session or the app cache), using postbacks or Ajax panels to support efficient pagination, sorting, filtering, and searching (see the sketch after this list).
2. Cache on the server (as above), but using serializable structures that are moved out of memory after some period of time to reduce memory pressure on the server.
3. Cache on the client (using JSON or XML serialization), and use client-side JavaScript to paginate, sort, filter, and select results.
4. Cache on the client using a compressed/encoded representation of the data; call back into the server to decode when switching pages, sorting, filtering, or searching.
5. Some alternative caching scheme I haven't considered...
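To make option #1 concrete, here is a minimal sketch of what it could look like, assuming ASP.NET's built-in `HttpRuntime.Cache`; the `SearchResult` type and the `runExpensiveSearch` delegate are hypothetical stand-ins for the real search code. The full result set is cached once per user and query, and each postback just slices out the requested page:

```csharp
// Rough sketch of option #1: cache the full result set on the server and
// page out of it on postback. SearchResult and runExpensiveSearch are
// hypothetical placeholders for the real search code.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Caching;

public class SearchResult
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public static class SearchResultCache
{
    private static string KeyFor(string userId, string query)
    {
        // One entry per user + query, so a new search simply overwrites it.
        return "search:" + userId + ":" + query;
    }

    public static IList<SearchResult> GetPage(
        string userId,
        string query,
        int pageIndex,
        int pageSize,
        Func<string, IList<SearchResult>> runExpensiveSearch)
    {
        string key = KeyFor(userId, query);
        var results = HttpRuntime.Cache[key] as IList<SearchResult>;

        if (results == null)
        {
            // Cache miss: run the expensive search once and keep the results
            // with a sliding expiration so idle entries age out on their own.
            results = runExpensiveSearch(query);
            HttpRuntime.Cache.Insert(
                key,
                results,
                null,
                Cache.NoAbsoluteExpiration,
                TimeSpan.FromMinutes(20));
        }

        // Pagination, sorting, and filtering then work against the cached
        // list instead of re-running the expensive search.
        return results.Skip(pageIndex * pageSize).Take(pageSize).ToList();
    }
}
```

Session state would work much the same way; the main difference is that the application cache can evict entries under memory pressure, which is essentially what option #2 is trying to achieve explicitly.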
For #1, have you considered using a state server (even SQL Server) or a shared cache mechanism? There are plenty of good ones to choose from, and Velocity is getting very mature; it will probably RTM soon. A cache invalidation scheme that evicts an entry when the user starts a new search, when the user hits any page other than search pagination, or after a standard timeout (20 minutes) should be pretty successful at keeping your cache trimmed to a minimal size.
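As a minimal sketch of that invalidation scheme, assuming the results are kept in the local `HttpRuntime.Cache` under one key per user (a distributed cache such as Velocity would expose a similar put/remove surface, though the exact API differs):

```csharp
// Sketch of the invalidation scheme described above: a new search overwrites
// the entry, leaving the search pages removes it explicitly, and a standard
// 20-minute sliding timeout cleans up anything that slips through.
using System;
using System.Web;
using System.Web.Caching;

public static class SearchCachePolicy
{
    private static string KeyFor(string userId)
    {
        return "search-results:" + userId;
    }

    // Call this when the user submits a new search; any previous result set
    // for the same user is simply replaced.
    public static void Store(string userId, object results)
    {
        HttpRuntime.Cache.Insert(
            KeyFor(userId),
            results,
            null,
            Cache.NoAbsoluteExpiration,
            TimeSpan.FromMinutes(20)); // standard timeout, renewed on each hit
    }

    // Call this from pages outside the search/pagination flow (for example,
    // from a shared base page's OnLoad) to drop the entry as soon as the
    // user navigates elsewhere.
    public static void Invalidate(string userId)
    {
        HttpRuntime.Cache.Remove(KeyFor(userId));
    }
}
```

If you move to out-of-process session state (StateServer or SQL Server mode), the policy stays the same; only the storage location and the timeout configured in web.config's sessionState element change.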
References:
- SharedCache (FOSS)
- NCache ($995/CPU)
- StateServer (~$1200/server)
- StateMirror ("Enterprise pricing")
- Velocity (Free?)