This may be like asking for the moon on a stick, but is there a production-quality, thread-safe, in-memory LRU cache with expiry for C#? Or does anyone have a best-practices idea for achieving the same thing?
(LRU being "Least Recently Used" - http://en.wikipedia.org/wiki/Cache_algorithms#LRU)
To clarify: I want to support a memory cache in an ASP.NET MVC site with the following interface:
public interface ICache
{
    T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class;
    bool Remove(string key);
}
The best solution from Microsoft seems to be System.Runtime.Caching.MemoryCache; however, it seems to come with a couple of caveats (see the comments in the code below). The code would look something like:
using System;
using System.Runtime.Caching;

public sealed class Cache : ICache
{
    private readonly MemoryCache _cache;

    public Cache()
    {
        _cache = MemoryCache.Default;
    }

    public T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class
    {
        // This call kinda defeats the point of the cache ?!?
        // create() runs on every call, even when the key is already cached.
        var newValue = create();

        // AddOrGetExisting returns the existing value, or null if newValue was just
        // inserted, so fall back to newValue instead of returning null on the first call.
        return _cache.AddOrGetExisting(key, newValue, DateTimeOffset.UtcNow + timeToLive) as T ?? newValue;
    }

    public bool Remove(string key)
    {
        // MemoryCache.Remove returns the removed entry, or null if the key was absent
        return _cache.Remove(key) != null;
    }
}
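To make the first caveat concrete, here is a minimal sketch (the CountingDemo class is hypothetical, using the Cache class above) showing that create() runs on every call, even when the key is already cached, so the expensive work is never saved:

using System;

static class CountingDemo
{
    public static void Run()
    {
        var cache = new Cache();
        int calls = 0;
        Func<string> create = () => { calls++; return "expensive"; };

        cache.GetOrAdd("myKey", create, TimeSpan.FromMinutes(30)); // calls == 1
        cache.GetOrAdd("myKey", create, TimeSpan.FromMinutes(30)); // calls == 2, despite the cache hit
        Console.WriteLine(calls); // prints 2: the expensive work ran twice
    }
}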
Or maybe something better built around Lazy<T>, which does ensure the result is created only once, but feels like a hack (are there consequences to caching the Func<T>?):
using System;

class Program
{
    static void Main(string[] args)
    {
        Func<Foo> creation = () =>
        {
            // Some expensive thing
            return new Foo();
        };

        Cache cache = new Cache();

        // Result 1 and 2 are correctly the same instance. Result 3 is correctly a new instance...
        var result1 = cache.GetOrAdd("myKey", creation, TimeSpan.FromMinutes(30));
        var result2 = cache.GetOrAdd("myKey", creation, TimeSpan.FromMinutes(30));
        var result3 = cache.GetOrAdd("myKey3", creation, TimeSpan.FromMinutes(30));
    }
}
public sealed class Foo
{
    private static int Counter = 0;
    private int Index = 0;

    public Foo()
    {
        // The counter makes each instance distinguishable, so the demo can show
        // whether GetOrAdd returned a cached instance or built a new one.
        Index = ++Counter;
    }
}
using System;
using System.Runtime.Caching;
using System.Threading;

public sealed class Cache
{
    private readonly MemoryCache _cache;

    public Cache()
    {
        _cache = MemoryCache.Default;
    }

    public T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class
    {
        // Wrap the factory in a Lazy so the expensive work is deferred until .Value
        var newValue = new Lazy<T>(create, LazyThreadSafetyMode.PublicationOnly);

        // AddOrGetExisting returns the existing Lazy, or null if newValue was just inserted
        var value = (Lazy<T>)_cache.AddOrGetExisting(key, newValue, DateTimeOffset.UtcNow + timeToLive);
        return (value ?? newValue).Value;
    }

    public bool Remove(string key)
    {
        // MemoryCache.Remove returns the removed entry, or null if the key was absent
        return _cache.Remove(key) != null;
    }
}
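On the "are there consequences?" question: one documented property of LazyThreadSafetyMode.PublicationOnly is that exceptions thrown by the factory are never cached, so a failing create() does not poison the cache entry; the factory simply runs again on the next access. (The trade-off is that under a race several threads may each run the factory, and only the first result is published.) A minimal sketch of the exception behavior (LazyModeDemo is a hypothetical name):

using System;
using System.Threading;

static class LazyModeDemo
{
    public static void Run()
    {
        int attempts = 0;
        var lazy = new Lazy<string>(() =>
        {
            attempts++;
            if (attempts == 1) throw new InvalidOperationException("transient failure");
            return "ok";
        }, LazyThreadSafetyMode.PublicationOnly);

        try { var _ = lazy.Value; } catch (InvalidOperationException) { /* first attempt throws */ }

        // PublicationOnly never caches exceptions, so the factory is retried here
        Console.WriteLine(lazy.Value); // prints "ok" on the second attempt
    }
}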
I implemented a thread-safe pseudo-LRU designed for concurrent workloads, currently in use in a production system. Performance is very close to ConcurrentDictionary, about 10x faster than MemoryCache, and the hit rate is better than a conventional LRU. A full analysis is provided in the GitHub link below.
Usage looks like this:
int capacity = 666;
var lru = new ConcurrentTLru<int, SomeItem>(capacity, TimeSpan.FromMinutes(5));
var value = lru.GetOrAdd(1, (k) => new SomeItem(k));
bool removed = lru.TryRemove(1);
GitHub: https://github.com/bitfaster/BitFaster.Caching
Install-Package BitFaster.Caching
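For completeness, here is a sketch of how this could be adapted to the ICache interface from the question. The TLruCache wrapper below is my own, not part of the library, and it assumes ConcurrentTLru lives in the BitFaster.Caching.Lru namespace. Note that ConcurrentTLru takes a single TTL at construction, so the per-call timeToLive parameter cannot be honored per entry and is ignored here:

using System;
using BitFaster.Caching.Lru;

public sealed class TLruCache : ICache
{
    private readonly ConcurrentTLru<string, object> _lru;

    public TLruCache(int capacity, TimeSpan timeToLive)
    {
        // Capacity and TTL are fixed for the whole cache at construction time
        _lru = new ConcurrentTLru<string, object>(capacity, timeToLive);
    }

    public T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class
    {
        // timeToLive is ignored: expiry was fixed in the constructor (see note above)
        return (T)_lru.GetOrAdd(key, _ => create());
    }

    public bool Remove(string key)
    {
        return _lru.TryRemove(key);
    }
}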