We are about to use the built-in in-memory cache of ASP.NET Core to implement cache-aside for external system responses. (We may shift from in-memory to IDistributedCache later.)
We want to use Microsoft.Extensions.Caching.Memory's IMemoryCache, as the Microsoft documentation suggests.
We need to limit the size of the cache because by default it is unbounded.
So, I have created the following POC application to play with it a bit before integrating it into our project.
public interface IThrottledCache
{
    IMemoryCache Cache { get; }
}

public class ThrottledCache : IThrottledCache
{
    private readonly MemoryCache cache;

    public ThrottledCache()
    {
        cache = new MemoryCache(new MemoryCacheOptions
        {
            SizeLimit = 2
        });
    }

    public IMemoryCache Cache => cache;
}
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();
    services.AddSingleton<IThrottledCache>(new ThrottledCache());
}
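For comparison, the framework can also register a size-limited cache directly through the AddMemoryCache overload that accepts a configuration lambda; the wrapper interface above is only needed if you want a cache instance separate from the one framework components themselves resolve. A minimal sketch (the SizeLimit of 2 mirrors the POC):

```csharp
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();

    // Registers IMemoryCache as a singleton with a bounded size.
    // Note: this instance is shared with any framework component
    // that also injects IMemoryCache.
    services.AddMemoryCache(options => options.SizeLimit = 2);
}
```

With this registration, controllers would inject IMemoryCache directly instead of IThrottledCache.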
I've created a really simple controller to play with this cache.
[Route("api/[controller]")]
[ApiController]
public class MemoryController : ControllerBase
{
    private readonly IMemoryCache cache;

    public MemoryController(IThrottledCache cacheSource)
    {
        this.cache = cacheSource.Cache;
    }

    [HttpGet("{id}")]
    public IActionResult Get(string id)
    {
        if (cache.TryGetValue(id, out var cachedEntry))
        {
            return Ok(cachedEntry);
        }
        else
        {
            var options = new MemoryCacheEntryOptions { Size = 1, SlidingExpiration = TimeSpan.FromMinutes(1) };
            cache.Set(id, $"{id} - cached", options);
            return Ok(id);
        }
    }
}
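As an aside, the same hit-or-miss logic can be collapsed with the GetOrCreate extension method, which runs the factory only on a cache miss. A sketch equivalent to the action above, except that the very first call also returns the "- cached" string:

```csharp
[HttpGet("{id}")]
public IActionResult Get(string id)
{
    // The factory runs only when the key is absent; entry options
    // are set on the ICacheEntry passed to the callback.
    var value = cache.GetOrCreate(id, entry =>
    {
        entry.Size = 1;
        entry.SlidingExpiration = TimeSpan.FromMinutes(1);
        return $"{id} - cached";
    });
    return Ok(value);
}
```

Note that GetOrCreate still goes through the same Set code path internally, so it is subject to the same size-limit behaviour discussed below.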
As you can see, my /api/memory/{id} endpoint can work in two modes: on a cache hit it returns the cached value, and on a cache miss it stores a new entry and returns the raw id.
I have observed the following strange behaviour (call, response, and cache contents after the call):

1. GET /api/memory/first   -> "first"            cache: first
2. GET /api/memory/first   -> "first - cached"   cache: first
3. GET /api/memory/second  -> "second"           cache: first, second
4. GET /api/memory/second  -> "second - cached"  cache: first, second
5. GET /api/memory/third   -> "third"            cache: first, second
6. GET /api/memory/third   -> "third"            cache: second, third
7. GET /api/memory/third   -> "third - cached"   cache: second, third
As you can see, the 5th call is where I hit the size limit. My expectation was that this call would evict first as the oldest entry and add third as the newest. But this desired behaviour only happens at the 6th call.
So, my question is: why do I have to call Set twice to put new data into the MemoryCache once the size limit has been reached?
EDIT: Adding timing-related information as well.
During testing, the whole request flow / chain took around 15 seconds or even less.
Even if I change the SlidingExpiration to 1 hour, the behaviour remains exactly the same.
I downloaded, built and debugged the unit tests in Microsoft.Extensions.Caching.Memory; there seems to be no test that truly covers this case.
The cause: as soon as you try to add an item that would push the cache over its size limit, MemoryCache triggers a compaction in the background. This evicts the least recently used (LRU) cache entries until a certain amount of space has been freed. In this case, it tries to remove entries totalling a size of 1, in your case "first", because that entry was accessed least recently.
However, since this compaction cycle runs in the background, and the code in the SetEntry() method is already on the code path for a full cache, it continues without adding the new item to the cache.
The next time it tries to, it succeeds.
Repro:
class Program
{
    private static MemoryCache _cache;
    private static MemoryCacheEntryOptions _options;

    static void Main(string[] args)
    {
        _cache = new MemoryCache(new MemoryCacheOptions
        {
            SizeLimit = 2
        });

        _options = new MemoryCacheEntryOptions
        {
            Size = 1
        };

        _options.PostEvictionCallbacks.Add(new PostEvictionCallbackRegistration
        {
            EvictionCallback = (key, value, reason, state) =>
            {
                if (reason == EvictionReason.Capacity)
                {
                    Console.WriteLine($"Evicting '{key}' for capacity");
                }
            }
        });

        Console.WriteLine(TestCache("first"));
        Console.WriteLine(TestCache("second"));
        Console.WriteLine(TestCache("third")); // starts compaction; item not added
        Thread.Sleep(1000);                    // give background compaction time to run
        Console.WriteLine(TestCache("third"));
        Console.WriteLine(TestCache("third")); // now served from cache
    }

    private static object TestCache(string id)
    {
        if (_cache.TryGetValue(id, out var cachedEntry))
        {
            return cachedEntry;
        }

        _cache.Set(id, $"{id} - cached", _options);
        return id;
    }
}
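If the silently dropped write matters in practice, one possible mitigation (my own suggestion, not something from the library's documentation) is to detect the rejected write with TryGetValue and trigger a synchronous compaction via MemoryCache.Compact before retrying once. A sketch, reusing the cache and options from the repro above:

```csharp
// Sketch: force eviction on the calling thread when a Set is rejected.
// MemoryCache.Compact(percentage) evicts entries immediately, so the
// retried Set has room. Assumes 'cache' is the concrete MemoryCache
// type, since Compact is not on the IMemoryCache interface.
private static object TestCacheWithRetry(MemoryCache cache,
                                         MemoryCacheEntryOptions options,
                                         string id)
{
    if (cache.TryGetValue(id, out var cachedEntry))
    {
        return cachedEntry;
    }

    cache.Set(id, $"{id} - cached", options);
    if (!cache.TryGetValue(id, out _))
    {
        // The write was dropped because the cache was over capacity.
        // Evict a fraction of the entries now and try once more.
        cache.Compact(0.5);
        cache.Set(id, $"{id} - cached", options);
    }
    return id;
}
```

Whether this trade-off (synchronous eviction work on the request path) is acceptable depends on your workload; the background compaction exists precisely to keep Set cheap.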