I have a cache service like this:
public interface ICacheService {
    T Get<T>(string cacheId, Func<T> getItemCallback, int cacheMinutes = 5) where T : class;
}

public class MemoryCacheService : ICacheService {
    public T Get<T>(string cacheId, Func<T> getItemCallback, int cacheMinutes = 5) where T : class {
        T item = MemoryCache.Default.Get(cacheId) as T;
        if (item == null) {
            item = getItemCallback();
            MemoryCache.Default.Add(cacheId, item,
                new CacheItemPolicy { AbsoluteExpiration = DateTime.Now.AddMinutes(cacheMinutes) });
        }
        return item;
    }
}
And retrieved like this:
var result = _cache.Get("mylist", () => _database.Fetch<MyList>().AsQueryable(), 600);
The list is large and accessed frequently by a per-keystroke type-ahead dropdown. The query conditions are also dynamic, along the lines of:
if (this) result = result.Where(x => /* this condition */);
if (that) result = result.Where(x => /* that condition */);
// finally
var list = result.ToList();
I wonder: every time I access the list from the cache, does the system create a copy of the data before it starts building the LINQ query? If so, it's copy-per-keystroke, which isn't very efficient. Or is the query deferred because I'm retrieving it with AsQueryable and building the LINQ on top of that?
Any better alternatives? Thanks
Without getting lost in the minutiae of MemoryCache, you can reason this out from basic .NET design principles. Only value types are easy to copy. There is no general mechanism for copying reference types beyond [Serializable] and the very broken ICloneable, and neither is a requirement for putting an object into MemoryCache. So no.
Caching objects is very, very simple; a plain List<> gets that job done. The value-add you get from MemoryCache is the other essential feature of an effective cache: a retirement policy. A cache without a policy is a memory leak.
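To make that concrete, here is a minimal sketch of the "storage plus retirement policy" idea. The names (TinyCache, factory) are made up for illustration; this is a toy, not MemoryCache itself, but it shows that the cache holds a reference and that the expiry check is what separates a cache from a leak:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical toy cache: a Dictionary plus an expiration time per entry.
// Without the Expires check, this would just be a memory leak that grows forever.
class TinyCache<T> where T : class {
    private readonly Dictionary<string, (T Value, DateTime Expires)> _items = new();

    public T Get(string key, Func<T> factory, int cacheMinutes = 5) {
        if (_items.TryGetValue(key, out var entry) && entry.Expires > DateTime.UtcNow)
            return entry.Value;                       // same reference, no copy made
        var value = factory();                        // cache miss: build it once
        _items[key] = (value, DateTime.UtcNow.AddMinutes(cacheMinutes));
        return value;
    }
}
```

Calling Get twice with the same key returns the very same object instance (the factory runs only on a miss), which is the same reference semantics MemoryCache gives you.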
No, MemoryCache does not make a copy. You basically store a reference to some object instance in the cache, and that is what you get back when you access an item in the cache.
I don't have a formal documentation link, but found out the "hard way" in practice, where I accidentally modified the cached object by just using the reference I got back (without copying it).
Also, studying the reference sources (http://referencesource.microsoft.com) shows that there is no automatic copying happening.
Depending on your application and needs, you might want to make sure that the types you cache are actually immutable by design.