IMemoryCache, refresh cache before eviction

I am trying to migrate my .NET Framework application to .NET Core, and as part of this I want to move my in-memory caching from System.Runtime.Caching/MemoryCache to Microsoft.Extensions.Caching.Memory/IMemoryCache. But I have one problem with IMemoryCache: I could not find a way to refresh a cache entry before it is removed/evicted.

With System.Runtime.Caching/MemoryCache, CacheItemPolicy has an UpdateCallback property to which I can assign a callback delegate, and that callback is invoked on a separate thread just before the cached object is evicted. Even if the callback takes a long time to fetch fresh data, MemoryCache keeps serving the old data past its expiry deadline, so my code never has to wait for data while the cache is being refreshed.
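
For context, this is roughly the pattern being replaced, using the old API (a minimal sketch only; the 5-minute expiry and the FetchLatestValue helper are placeholders, not my real code):

using System;
using System.Runtime.Caching;

public static class OldStyleCache
{
    private static readonly ObjectCache Cache = MemoryCache.Default;

    public static void Insert(string key, object value)
    {
        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5),
            // Invoked on a background thread just before eviction;
            // the old value keeps being served while this runs.
            UpdateCallback = args =>
            {
                args.UpdatedCacheItem = new CacheItem(args.Key, FetchLatestValue(args.Key));
                // A real implementation would also set UpdateCallback on this
                // new policy so the refresh cycle keeps going.
                args.UpdatedCacheItemPolicy = new CacheItemPolicy
                {
                    AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5)
                };
            }
        };
        Cache.Set(key, value, policy);
    }

    // Placeholder for the real data-source call.
    private static object FetchLatestValue(string key) => "fresh value for " + key;
}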

I don't see equivalent functionality in Microsoft.Extensions.Caching.Memory/IMemoryCache. MemoryCacheEntryOptions has a PostEvictionCallbacks property and a RegisterPostEvictionCallback extension method, but both fire only after the cache entry has already been evicted. So if the callback takes a long time, every request for that data has to wait.
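
This is the closest I can get with the new API (again just a sketch; the ReloadValue helper and the 5-minute expiry are illustrative):

using System;
using Microsoft.Extensions.Caching.Memory;

public class NewStyleCache
{
    private readonly IMemoryCache _cache = new MemoryCache(new MemoryCacheOptions());

    public void Insert(string key, object value)
    {
        var options = new MemoryCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
        };
        // Fires only AFTER the entry is already gone, so any concurrent
        // request for the same key misses the cache and has to wait.
        options.RegisterPostEvictionCallback((k, v, reason, state) =>
        {
            _cache.Set(k, ReloadValue((string)k), options);
        });
        _cache.Set(key, value, options);
    }

    // Placeholder for the real data-source call.
    private object ReloadValue(string key) => "fresh value for " + key;
}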

Is there any solution?

Asked Oct 16 '19 by Krishnaraj Barvathaya

2 Answers

I had the same need, so I wrote this class:

using System;
using System.Collections.Concurrent;

public abstract class AutoRefreshCache<TKey, TValue>
{
    private readonly ConcurrentDictionary<TKey, TValue> _entries = new ConcurrentDictionary<TKey, TValue>();

    protected AutoRefreshCache(TimeSpan interval)
    {
        var timer = new System.Timers.Timer();
        timer.Interval = interval.TotalMilliseconds;
        timer.AutoReset = true;
        timer.Elapsed += (o, e) =>
        {
            // Pause the timer while refreshing so refreshes never overlap.
            ((System.Timers.Timer)o).Stop();
            RefreshAll();
            ((System.Timers.Timer)o).Start();
        };
        timer.Start();
    }

    public TValue Get(TKey key)
    {
        // Only the very first Get for a key blocks to load the value.
        return _entries.GetOrAdd(key, k => Load(k));
    }

    public void RefreshAll()
    {
        // Reload every known key; readers keep getting the previous value
        // until the new one replaces it in the dictionary.
        foreach (var key in _entries.Keys)
        {
            _entries.AddOrUpdate(key, k => Load(k), (k, v) => Load(k));
        }
    }

    protected abstract TValue Load(TKey key);
}

Values are never evicted, only refreshed. Only the first Get for a key blocks to load the value; during a refresh, Get keeps returning the previous value without waiting.

Example of use:

class Program
{
    static void Main(string[] args)
    {
        var cache = new MyCache();
        while (true)
        {
            System.Threading.Thread.Sleep(TimeSpan.FromSeconds(1));
            Console.WriteLine(cache.Get("Key1") ?? "<null>");
        }
    }
}

public class MyCache : AutoRefreshCache<string, string>
{
    public MyCache() 
        : base(TimeSpan.FromSeconds(5))
    { }

    readonly Random random = new Random();
    protected override string Load(string key)
    {
        Console.WriteLine($"Load {key} begin");
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(3));
        Console.WriteLine($"Load {key} end");
        return "Value " + random.Next();
    }
}

Result:

Load Key1 begin
Load Key1 end
Value 1648258406
Load Key1 begin
Value 1648258406
Value 1648258406
Value 1648258406
Load Key1 end
Value 1970225921
Value 1970225921
Value 1970225921
Value 1970225921
Value 1970225921
Load Key1 begin
Value 1970225921
Value 1970225921
Value 1970225921
Load Key1 end
Value 363174357
Value 363174357
Answered Sep 28 '22 by vernou

You may want to take a look at FusionCache ⚡🦥, a library I recently released.

Features to use

The first interesting thing is that it optimizes concurrent factory calls so that only one call per key is executed, relieving the load on your data source: all concurrent callers for the same cache key at the same time are blocked and only one factory runs.

Then you can specify timeouts for the factory so that it does not take too long: background factory completion is enabled by default, so even if the factory times out it keeps running in the background and updates the cache with the new value as soon as it finishes.

Then simply enable fail-safe to re-use the expired value in case of timeouts, or any problem really (the database is down, there are temporary network errors, etc.).

A practical example

You can cache something for, say, 2 minutes, after which the factory is called to refresh the data. In case of problems (exceptions, timeouts, etc.) the expired value is served again until the factory manages to complete in the background, at which point it updates the cache right away.
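
Translated into code, that scenario looks roughly like this (a sketch only: the cache key, the LoadFromDatabase call and the exact option values are illustrative, and you should check the FusionCache docs for the precise method signatures in the current version):

using System;
using ZiggyCreatures.Caching.Fusion;

public class ProductService
{
    private static readonly IFusionCache Cache = new FusionCache(new FusionCacheOptions());

    public Product GetProduct(int id)
    {
        return Cache.GetOrSet<Product>(
            $"product:{id}",
            _ => LoadFromDatabase(id),   // only one concurrent factory call per key
            new FusionCacheEntryOptions
            {
                Duration = TimeSpan.FromMinutes(2),                   // normal lifetime
                IsFailSafeEnabled = true,                             // re-use the expired value on errors/timeouts
                FailSafeMaxDuration = TimeSpan.FromHours(1),          // how long an expired value may be re-used
                FactorySoftTimeout = TimeSpan.FromMilliseconds(500)   // stop waiting, let the factory finish in background
            });
    }

    // Placeholder for the real data access.
    private Product LoadFromDatabase(int id) => new Product();
}

public class Product { }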

One more thing

Another interesting feature is support for an optional distributed 2nd-level cache, automatically managed and kept in sync with the local one without you having to do anything.

If you give it a chance, please let me know what you think.

/shameless-plug

Answered Sep 28 '22 by Jody Donetti