Caching architecture for search results in an ASP.NET application

What is a good design for caching the results of an expensive search in an ASP.NET system?

Any ideas would be welcome ... particularly those that don't require inventing a complex infrastructure of our own.

Here are some general requirements related to the problem:

  • Each search can return anywhere from zero to several hundred result records
  • Each search is relatively expensive and time-consuming to execute (5-15 seconds at the database)
  • Results must be paginated before being displayed at the client to avoid information overload for the user
  • Users expect to be able to sort, filter, and search within the results returned
  • Users expect to be able to quickly switch between pages in the search results
  • Users expect to be able to select multiple items (via checkbox) on any number of pages
  • Users expect relatively snappy performance once a search has finished

I see some possible options for where and how to implement caching:

1. Cache on the server (in session or App cache), use postbacks or Ajax panels to facilitate efficient pagination, sorting, filtering, and searching. (A rough sketch follows the pros/cons below.)

  • PROS: Easy to implement, decent support from ASP.NET infrastructure
  • CONS: Very chatty, memory intensive on the server, data may be cached longer than necessary; effectively prohibits load balancing unless sticky sessions or an out-of-process state store are used
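
As a minimal sketch of option 1, assuming a WebForms page with in-process session state (SearchResult and the runExpensiveSearch delegate are placeholders, not anything from the question):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web.SessionState;

[Serializable]
public class SearchResult          // placeholder record type
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public static class SessionSearchCache
{
    private const string Key = "SearchResults";

    // Run the expensive (5-15 s) query only when no cached copy exists.
    public static List<SearchResult> GetOrSearch(
        HttpSessionState session, Func<List<SearchResult>> runExpensiveSearch)
    {
        var results = session[Key] as List<SearchResult>;
        if (results == null)
        {
            results = runExpensiveSearch();
            session[Key] = results;    // lives until the session expires
        }
        return results;
    }

    // Postbacks page and sort against the cached copy - no database hit.
    public static List<SearchResult> GetPage(
        List<SearchResult> results, int pageIndex, int pageSize)
    {
        return results.OrderBy(r => r.Title)
                      .Skip(pageIndex * pageSize)
                      .Take(pageSize)
                      .ToList();
    }
}
```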

2. Cache at the server (as above), but using serializable structures that are moved out of memory after some period of time to reduce memory pressure on the server. (A sketch of the spill-to-disk idea follows the pros/cons below.)

  • PROS: Efficient use of server memory; ability to scale out using load balancing
  • CONS: Limited support from .NET infrastructure; potentially fragile when data structures change; places additional load on the database; significantly more complicated
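
One way to sketch option 2's "move out of memory" behavior, assuming the built-in ASP.NET cache plus a disk spill file written at eviction time (the path scheme, time window, and type names are my assumptions, and keys are assumed to be filename-safe):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using System.Web;
using System.Web.Caching;

public static class SpillableResultCache
{
    // How long results stay in memory before being spilled to disk.
    private static readonly TimeSpan InMemoryWindow = TimeSpan.FromMinutes(2);

    public static void Store(string key, object results)
    {
        HttpRuntime.Cache.Insert(key, results, null,
            Cache.NoAbsoluteExpiration, InMemoryWindow,
            CacheItemPriority.Normal, OnEvicted);
    }

    public static object Retrieve(string key)
    {
        object results = HttpRuntime.Cache[key];
        if (results != null) return results;

        // Fall back to the on-disk copy written when the entry was evicted.
        string path = PathFor(key);
        if (!File.Exists(path)) return null;   // caller must re-run the search
        using (var stream = File.OpenRead(path))
            results = new BinaryFormatter().Deserialize(stream);
        Store(key, results);                    // promote back into memory
        return results;
    }

    private static void OnEvicted(string key, object value, CacheItemRemovedReason reason)
    {
        // Serialize the evicted result set to disk instead of discarding it.
        using (var stream = File.Create(PathFor(key)))
            new BinaryFormatter().Serialize(stream, value);
    }

    private static string PathFor(string key)
    {
        return Path.Combine(Path.GetTempPath(), "search-" + key + ".bin");
    }
}
```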

3. Cache on the client (using JSON or XML serialization), use client-side JavaScript to paginate, sort, filter, and select results. (The server half of this is sketched below.)

  • PROS: User experience can approach "rich client" levels; most browsers can handle JSON/XML natively - decent libraries exist for manipulation (e.g. jQuery)
  • CONS: Initial request may take a long time to download; significant memory footprint on client machines; will require hand-crafted JavaScript at some level to implement
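
The server half of option 3 might be a plain HTTP handler that runs the search once and streams the whole result set to the browser as JSON, leaving paging and sorting to client script. This sketch reuses the placeholder SearchResult type from the first sketch and the stock .NET 3.5 JavaScriptSerializer:

```csharp
using System.Collections.Generic;
using System.Web;
using System.Web.Script.Serialization;

public class SearchJsonHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string term = context.Request.QueryString["q"];

        // Placeholder for the real 5-15 second database search.
        List<SearchResult> results = RunExpensiveSearch(term);

        // Ship the entire result set once; the client pages/sorts from here on.
        context.Response.ContentType = "application/json";
        context.Response.Write(new JavaScriptSerializer().Serialize(results));
    }

    private static List<SearchResult> RunExpensiveSearch(string term)
    {
        return new List<SearchResult>();   // stand-in for the real query
    }
}
```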

4. Cache on the client using a compressed/encoded representation of the data - call back into the server to decode when switching pages, sorting, filtering, and searching. (An encode/decode sketch follows the pros/cons below.)

  • PROS: Minimized memory impact on server; allows state to live as long as client needs it; slightly improved memory usage on client over JSON/XML
  • CONS: Large data sets moving back and forth between client/server; slower performance (due to network I/O) as compared with pure client-side caching using JSON/XML; much more complicated to implement - limited support from .NET/browser
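
For option 4, the encode/decode round trip could look something like this: the serialized result set is GZipped and Base64-encoded into a hidden form field, and decoded on each postback. The helper names are mine, and a real implementation would need to sign or validate the blob, since deserializing client-supplied data is a security risk:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Runtime.Serialization.Formatters.Binary;

public static class ResultBlobCodec
{
    // Encode the full (serializable) result set for storage on the client.
    public static string Encode(object results)
    {
        using (var buffer = new MemoryStream())
        {
            using (var gzip = new GZipStream(buffer, CompressionMode.Compress, true))
                new BinaryFormatter().Serialize(gzip, results);
            return Convert.ToBase64String(buffer.ToArray());
        }
    }

    // Decode the blob posted back from the hidden field.
    public static object Decode(string blob)
    {
        byte[] bytes = Convert.FromBase64String(blob);
        using (var gzip = new GZipStream(new MemoryStream(bytes), CompressionMode.Decompress))
            return new BinaryFormatter().Deserialize(gzip);
    }
}
```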

5. Some alternative caching scheme I haven't considered...

asked Oct 12 '09 by LBushkin

1 Answer

For #1, have you considered using a state server (even SQL Server) or a shared cache mechanism? There are plenty of good ones to choose from, and Velocity is getting very mature - it will probably RTM soon. A cache invalidation scheme based on whether the user starts a new search, hits any page other than search pagination, and finally a standard timeout (20 minutes) should be pretty successful at weeding your cache down to a minimal size.
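
A minimal sketch of that invalidation scheme, assuming the default in-process HttpRuntime.Cache keyed per user (the key format and method names are assumptions):

```csharp
using System;
using System.Web;
using System.Web.Caching;

public static class PerUserSearchCache
{
    private static string KeyFor(string sessionId) { return "search:" + sessionId; }

    // A new search simply overwrites the old entry - that is the invalidation.
    public static void StoreNewSearch(string sessionId, object results)
    {
        HttpRuntime.Cache.Insert(KeyFor(sessionId), results, null,
            Cache.NoAbsoluteExpiration,
            TimeSpan.FromMinutes(20),          // the standard timeout above
            CacheItemPriority.Normal, null);
    }

    public static object Current(string sessionId)
    {
        return HttpRuntime.Cache[KeyFor(sessionId)];
    }

    // Call this when the user hits any page other than search pagination.
    public static void Invalidate(string sessionId)
    {
        HttpRuntime.Cache.Remove(KeyFor(sessionId));
    }
}
```

The sliding expiration means the 20-minute clock resets on every page flip, so active users keep their results while abandoned searches age out on their own.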

References:

  • SharedCache (FOSS)
  • NCache ($995/CPU)
  • StateServer (~$1200/server)
  • StateMirror ("Enterprise pricing")
  • Velocity (Free?)
answered by Rex M