I use IMemoryCache in my project. I wonder what will happen if my app pushes many long-lived objects into the cache. Can it occupy all the available memory? Can I globally define the maximum memory for the app?
This was written a year ago, so I'm going to assume you're using v1.x.x of the Microsoft.Extensions.Caching.Memory package.
Since there isn't a SizeLimit property in MemoryCacheOptions like there is in v2.x.x, after digging around in the code for a while I found the following documentation comment:
https://github.com/aspnet/Caching/blob/rel/1.1.2/src/Microsoft.Extensions.Caching.Memory/MemoryCache.cs#L329
/// This is called after a Gen2 garbage collection. We assume this means there was memory pressure.
/// Remove at least 10% of the total entries (or estimated memory?).
Thus, in v1.x.x the package will consume as much memory as the OS allows your process to have; there is no configurable hard limit. Whenever a Gen2 garbage collection occurs (which the cache treats as a sign of memory pressure), it compacts itself by evicting at least 10% of its entries.
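If it helps, here's a rough sketch of the knobs that do exist in v1.x. I'm assuming the CompactOnMemoryPressure and ExpirationScanFrequency options and the per-entry CacheItemPriority behave as the comments describe; double-check them against your exact package version:

using System;
using Microsoft.Extensions.Caching.Memory;

// v1.x sketch: there is no SizeLimit, but you can influence eviction behaviour.
var cache = new MemoryCache(new MemoryCacheOptions
{
    // Set to false to opt out of the Gen2-GC-triggered compaction entirely,
    // so only expired entries are removed. Defaults to true.
    CompactOnMemoryPressure = true,
    // How often the cache scans for and removes expired entries.
    ExpirationScanFrequency = TimeSpan.FromMinutes(1)
});

// Lower-priority entries should be compacted first; NeverRemove entries
// are meant to survive compaction (but not an explicit Remove call).
cache.Set("important", "value", new MemoryCacheEntryOptions
{
    Priority = CacheItemPriority.NeverRemove,
    SlidingExpiration = TimeSpan.FromMinutes(5)
});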
With v2.x.x you can set the limit manually via the SizeLimit property, and you can also control how much of the cache is evicted when that limit is hit via CompactionPercentage.
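Here's a minimal sketch of how that might look in v2.x. Note that, as far as I can tell, once SizeLimit is set every entry has to declare its own Size, and the size units mean whatever you decide they mean; the numbers below are just placeholders:

using System;
using Microsoft.Extensions.Caching.Memory;

// Cap the cache at 1024 size units and evict 25% of entries when the cap is hit.
var cache = new MemoryCache(new MemoryCacheOptions
{
    SizeLimit = 1024,
    CompactionPercentage = 0.25
});

// With SizeLimit set, every entry must specify a Size, otherwise the insert throws.
cache.Set("answer", 42, new MemoryCacheEntryOptions
{
    Size = 1,
    SlidingExpiration = TimeSpan.FromMinutes(5)
});

If you register the cache through dependency injection instead, the same options can be set with services.AddMemoryCache(options => { options.SizeLimit = 1024; }).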