C# Production-quality Thread-Safe in-memory LRU Cache with Expiry?

This may be like asking for the moon on a stick, but: is there a production-quality, thread-safe, in-memory LRU cache with expiry for C#? Or does anyone have a best-practice approach to achieve the same thing?

(LRU being "Least Recently Used" - http://en.wikipedia.org/wiki/Cache_algorithms#LRU)

To clarify: I want to support a memory cache in an ASP.Net MVC site with the following interface:

public interface ICache
{
    T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class;
    bool Remove(string key);
}
  • I want "GetOrAdd" because I want these operations to be "atomic" - i.e. avoiding race conditions around two threads trying to query the cache at the same time
  • I want the Create function because the creation of these objects is expensive (requires complex database access)
  • I need expiry, as these objects do expire after a period of time

The best solution from Microsoft seems to be the "System.Runtime.Caching.MemoryCache", however it seems to come with a couple of caveats:

  • It needs to poll the cache periodically to obey the imposed memory limits. I can't have any possibility of running out of memory in my system. I have read this post, which worries me: MemoryCache does not obey memory limits in configuration
  • It only appears to have "AddOrGetExisting" to support my interface, which takes the already-constructed object as a second parameter - if this object is expensive to create, doesn't pre-constructing it kinda defeat the point of the cache?

The code would look something like:

public sealed class Cache : ICache
{
    private readonly MemoryCache _cache;

    public Cache()
    {
        _cache = MemoryCache.Default;
    }

    public T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class
    {
        // This call kinda defeats the point of the cache ?!?
        var newValue = create();

        // AddOrGetExisting returns null when the key was not already present,
        // so fall back to the value we just created.
        var existing = _cache.AddOrGetExisting(key, newValue, DateTimeOffset.UtcNow + timeToLive) as T;
        return existing ?? newValue;
    }

    public bool Remove(string key)
    {
        _cache.Remove(key);
        return true;
    }
}

Or maybe something better built around Lazy<T>, which does ensure the result is created only once, but it feels like a hack (are there consequences to caching a Func<T>?):

class Program
{
    static void Main(string[] args)
    {
        Func<Foo> creation = () =>
        {
            // Some expensive thing
            return new Foo();
        };

        Cache cache = new Cache();
        // Result 1 and 2 are correctly the same instance. Result 3 is correctly a new instance...
        var result1 = cache.GetOrAdd("myKey", creation, TimeSpan.FromMinutes(30));
        var result2 = cache.GetOrAdd("myKey", creation, TimeSpan.FromMinutes(30));
        var result3 = cache.GetOrAdd("myKey3", creation, TimeSpan.FromMinutes(30));

        return;
    }
}

public sealed class Foo
{
    private static int Counter = 0;
    private int Index = 0;

    public Foo()
    {
        Index = ++Counter;
    }
}

public sealed class Cache
{
    private readonly MemoryCache _cache;

    public Cache()
    {
        _cache = MemoryCache.Default;
    }

    public T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class
    {
        var newValue = new Lazy<T>(create, LazyThreadSafetyMode.PublicationOnly);
        // AddOrGetExisting returns null when the key was not already present,
        // in which case our newValue is the one that got stored.
        var value = (Lazy<T>)_cache.AddOrGetExisting(key, newValue, DateTimeOffset.UtcNow + timeToLive);
        return (value ?? newValue).Value;
    }

    public bool Remove(string key)
    {
        _cache.Remove(key);
        return true;
    }
}

Other thoughts:

  • I've also found this implementation, but it doesn't allow you to specify a time-based expiry: Is it there any LRU implementation of IDictionary?
  • Maybe there is an implementation out there that uses ReaderWriterLock?
  • Some wrapper around ConcurrentDictionary?
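To make the last bullet concrete, here is a minimal sketch of a ConcurrentDictionary-plus-Lazy<T> wrapper with a per-entry expiry stamp. The class and Entry type names are mine, not from any library; note that it has no LRU eviction or memory bound, and expired entries are only replaced when their key is next touched (no background sweep), so it is an illustration of the atomic GetOrAdd shape rather than a complete answer:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

public sealed class LazyExpiringCache
{
    private sealed class Entry
    {
        public Lazy<object> Value;
        public DateTimeOffset ExpiresAt;
    }

    private readonly ConcurrentDictionary<string, Entry> _map =
        new ConcurrentDictionary<string, Entry>();

    public T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class
    {
        var entry = _map.AddOrUpdate(
            key,
            _ => NewEntry(create, timeToLive),
            (_, existing) => DateTimeOffset.UtcNow < existing.ExpiresAt
                ? existing                          // still fresh: reuse it
                : NewEntry(create, timeToLive));    // expired: replace it

        // The Lazy wrapper ensures create() runs at most once per stored entry,
        // even if AddOrUpdate's factories race and a losing Entry is discarded
        // (a discarded entry's Lazy.Value is never accessed).
        return (T)entry.Value.Value;
    }

    public bool Remove(string key)
    {
        Entry removed;
        return _map.TryRemove(key, out removed);
    }

    private static Entry NewEntry<T>(Func<T> create, TimeSpan ttl) where T : class
    {
        return new Entry
        {
            Value = new Lazy<object>(() => create(),
                LazyThreadSafetyMode.ExecutionAndPublication),
            ExpiresAt = DateTimeOffset.UtcNow + ttl
        };
    }
}
```

The main gap versus the interface above is eviction: nothing ever removes stale keys that are no longer requested, so a real implementation would need a size cap or a sweep.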
Asked by Jon Rea, May 27 '15.

1 Answer

I implemented a thread-safe pseudo-LRU designed for concurrent workloads - currently in use in a production system. Performance is very close to ConcurrentDictionary, ~10x faster than MemoryCache, and the hit rate is better than a conventional LRU. A full analysis is provided in the GitHub link below.

Usage looks like this:

int capacity = 666;
var lru = new ConcurrentTLru<int, SomeItem>(capacity, TimeSpan.FromMinutes(5));

var value = lru.GetOrAdd(1, (k) => new SomeItem(k));
bool removed = lru.TryRemove(1);

GitHub: https://github.com/bitfaster/BitFaster.Caching

Install-Package BitFaster.Caching
Answered by Alex Peck, Sep 29 '22.