To be succinct, what SIMPLE In-Memory Caches exist in the .Net ecosystem?
What I am looking for is:
var myCache = new SimpleCache(1024 * 1024 * 100); // 100 MB
I have already researched these options:
I am working with Google Protocol Buffers (protobuf-net), so I DO have a relatively accurate estimation of the memory footprint of each item. I am caching data returned from database access, but I have no desire to use a formal ORM (I am actually using PetaPoco, but that is beside the point).
At this stage I am planning on implementing my own cache, using a doubly linked list and a hash (dictionary) to drop the least recently used items once the cache limit is reached (a rough sketch of what I have in mind is below). However, I wanted to check whether anyone knew of any suitable options before I rolled my own.
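Roughly, the plan looks like this. The names (SimpleLruCache, the explicit sizeBytes parameter) are just placeholders; in practice the size estimate would come from the protobuf-net serialized length:

using System;
using System.Collections.Generic;

public class SimpleLruCache<TKey, TValue>
{
    private readonly long _capacityBytes;
    private long _currentBytes;
    private readonly Dictionary<TKey, LinkedListNode<(TKey Key, TValue Value, long Size)>> _map =
        new Dictionary<TKey, LinkedListNode<(TKey Key, TValue Value, long Size)>>();
    private readonly LinkedList<(TKey Key, TValue Value, long Size)> _lru =
        new LinkedList<(TKey Key, TValue Value, long Size)>();

    public SimpleLruCache(long capacityBytes) => _capacityBytes = capacityBytes;

    public void Add(TKey key, TValue value, long sizeBytes)
    {
        // Replace an existing entry for the same key, if any.
        if (_map.TryGetValue(key, out var existing))
        {
            _currentBytes -= existing.Value.Size;
            _lru.Remove(existing);
            _map.Remove(key);
        }

        var node = _lru.AddFirst((key, value, sizeBytes));
        _map[key] = node;
        _currentBytes += sizeBytes;

        // Evict least recently used entries until we are back under the byte limit.
        while (_currentBytes > _capacityBytes && _lru.Last != null)
        {
            var last = _lru.Last;
            _currentBytes -= last.Value.Size;
            _map.Remove(last.Value.Key);
            _lru.RemoveLast();
        }
    }

    public bool TryGetValue(TKey key, out TValue value)
    {
        if (_map.TryGetValue(key, out var node))
        {
            // Move the hit to the front so it becomes the most recently used.
            _lru.Remove(node);
            _lru.AddFirst(node);
            value = node.Value.Value;
            return true;
        }
        value = default;
        return false;
    }
}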
So what did you end up doing with the cache?
I have a prototype of a Universal Cache that I was working on that does pretty much what you want, except the cache limit is by capacity (number of items) instead of memory usage. It also has an extra feature you may appreciate: local persistence.
The cache is structured in layers: a single layer (memory), a double layer (memory + local persistence), or a triple layer (memory + local persistence + network sharing). You can use any combination of layers you need. The network sharing layer, which is designed as a .NET Remoting service that can be installed on any computer on the network and is self-discoverable by client caches, is not quite functional yet, though.
The caches are generic classes accessed via an interface ICache<TKey,TValue>. The local persistent cache uses SQL Server Compact 4 to create an on-the-fly database and tables that have all the fields of your TKey, a timestamp, and the TValue serialized as an image column. In a multiple-layer configuration, when you call TryGetValue(TKey key, out TValue value) on the top-level cache and the first layer (memory cache) doesn't have the item, it checks internally with the next layer (local persistence), then the next, until no more caches are available to provide the data. When the memory cache discards an item, it gives the next cache the opportunity to add the item to its layer, so in the case of the persistent cache the data is stored for future requests.
The caches have a user-defined retention time parameter (e.g. a few minutes for a memory cache, 30 days for a local persistent cache) and a capacity.
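To give an idea of how the layered lookup works, here is a rough sketch; the real prototype's interface differs in details, and LayeredCache and the constructor arguments here are just illustrative:

public interface ICache<TKey, TValue>
{
    bool TryGetValue(TKey key, out TValue value);
    void Add(TKey key, TValue value);
}

public class LayeredCache<TKey, TValue> : ICache<TKey, TValue>
{
    private readonly ICache<TKey, TValue> _local;      // e.g. the in-memory layer
    private readonly ICache<TKey, TValue> _nextLayer;  // e.g. the local persistent layer; may be null

    public LayeredCache(ICache<TKey, TValue> local, ICache<TKey, TValue> nextLayer = null)
    {
        _local = local;
        _nextLayer = nextLayer;
    }

    public bool TryGetValue(TKey key, out TValue value)
    {
        // Check this layer first, then fall through to the next layer.
        if (_local.TryGetValue(key, out value))
            return true;

        if (_nextLayer != null && _nextLayer.TryGetValue(key, out value))
        {
            // Pull the item back into this layer for faster future access.
            _local.Add(key, value);
            return true;
        }

        return false;
    }

    public void Add(TKey key, TValue value) => _local.Add(key, value);
}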
I may post it as public domain code if someone is interested.