 

MemoryCache Thread Safety, Is Locking Necessary?

For starters, let me just throw it out there that I know the code below is not thread safe (correction: it might be). What I am struggling with is finding an implementation that is, and one that I can actually get to fail under test. I am refactoring a large WCF project right now that needs some (mostly) static data cached, and it's populated from a SQL database. It needs to expire and "refresh" at least once a day, which is why I am using MemoryCache.

I know that the code below should not be thread safe, but I cannot get it to fail under heavy load, and to complicate matters a Google search shows implementations both ways (with and without locks), combined with debates over whether or not the locks are necessary.

Could someone with knowledge of MemoryCache in a multi-threaded environment let me know definitively whether or not I need to lock where appropriate, so that a call to Remove (which will seldom be called, but it's a requirement) will not throw during retrieval/repopulation?

public class MemoryCacheService : IMemoryCacheService
{
    private const string PunctuationMapCacheKey = "punctuationMaps";
    private static readonly ObjectCache Cache;
    private readonly IAdoNet _adoNet;

    static MemoryCacheService()
    {
        Cache = MemoryCache.Default;
    }

    public MemoryCacheService(IAdoNet adoNet)
    {
        _adoNet = adoNet;
    }

    public void ClearPunctuationMaps()
    {
        Cache.Remove(PunctuationMapCacheKey);
    }

    public IEnumerable GetPunctuationMaps()
    {
        if (Cache.Contains(PunctuationMapCacheKey))
        {
            return (IEnumerable) Cache.Get(PunctuationMapCacheKey);
        }

        var punctuationMaps = GetPunctuationMappings();

        if (punctuationMaps == null)
        {
            throw new ApplicationException("Unable to retrieve punctuation mappings from the database.");
        }

        if (punctuationMaps.Cast<IPunctuationMapDto>().Any(p => p.UntaggedValue == null || p.TaggedValue == null))
        {
            throw new ApplicationException("Null values detected in Untagged or Tagged punctuation mappings.");
        }

        // Store data in the cache
        var cacheItemPolicy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTime.Now.AddDays(1.0)
        };

        Cache.AddOrGetExisting(PunctuationMapCacheKey, punctuationMaps, cacheItemPolicy);

        return punctuationMaps;
    }

    // Go oldschool ADO.NET to break the dependency on the entity framework and need to inject the database handler to populate cache
    private IEnumerable GetPunctuationMappings()
    {
        var table = _adoNet.ExecuteSelectCommand("SELECT [id], [TaggedValue],[UntaggedValue] FROM [dbo].[PunctuationMapper]", CommandType.Text);
        if (table != null && table.Rows.Count != 0)
        {
            return AutoMapper.Mapper.DynamicMap<IDataReader, IEnumerable<PunctuationMapDto>>(table.CreateDataReader());
        }

        return null;
    }
}
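
For reference, the kind of lock-guarded read path being debated would look roughly like the sketch below (illustrative only; it reuses the fields of the class above, adds a hypothetical SyncRoot field, and omits the validation checks for brevity):

// Sketch of a lock-guarded variant of GetPunctuationMaps (illustrative only).
// SyncRoot is a new field; everything else reuses the members of the class above.
private static readonly object SyncRoot = new object();

public IEnumerable GetPunctuationMaps()
{
    var cached = Cache.Get(PunctuationMapCacheKey);
    if (cached != null)
    {
        return (IEnumerable) cached;
    }

    lock (SyncRoot)
    {
        // Double-check inside the lock so only one thread repopulates the cache.
        cached = Cache.Get(PunctuationMapCacheKey);
        if (cached != null)
        {
            return (IEnumerable) cached;
        }

        var punctuationMaps = GetPunctuationMappings();
        Cache.Add(PunctuationMapCacheKey, punctuationMaps,
            new CacheItemPolicy { AbsoluteExpiration = DateTime.Now.AddDays(1.0) });
        return punctuationMaps;
    }
}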
asked Nov 22 '13 by James Legan



2 Answers

The default MS-provided MemoryCache is entirely thread safe. A custom implementation that derives from MemoryCache may not be, but if you're using plain MemoryCache out of the box, it is thread safe. Browse the source code of my open source distributed caching solution to see how I use it (MemCache.cs):

https://github.com/haneytron/dache/blob/master/Dache.CacheHost/Storage/MemCache.cs
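
For what it's worth, a minimal stress-test sketch in this spirit (not part of the original answer; the class and key names are illustrative) hammers the default MemoryCache with concurrent adds, reads, and removes. Because each individual operation is thread safe, none of these calls should throw, although a Get can legitimately return null if another thread has just removed the key:

using System;
using System.Runtime.Caching;
using System.Threading.Tasks;

// Hypothetical stress-test harness; names are illustrative, not from the answer.
internal static class MemoryCacheStressSketch
{
    private const string Key = "punctuationMaps";

    public static void Run()
    {
        ObjectCache cache = MemoryCache.Default;

        Parallel.For(0, 100000, i =>
        {
            switch (i % 3)
            {
                case 0:
                    // Add the entry, or leave the existing one in place.
                    cache.AddOrGetExisting(Key, new[] { "value" },
                        new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddDays(1) });
                    break;
                case 1:
                    // Read; may return null if another thread just removed the key, but will not throw.
                    cache.Get(Key);
                    break;
                default:
                    // Remove; returns null when the key is absent rather than throwing.
                    cache.Remove(Key);
                    break;
            }
        });

        Console.WriteLine("Completed without exceptions.");
    }
}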

answered Sep 19 '22 by Haney


While MemoryCache is indeed thread safe, as other answers have specified, it does have a common multi-threading issue: if two threads try to Get from (or check Contains on) the cache at the same time, both will miss the cache, both will end up generating the result, and both will then add that result to the cache.

Often this is undesirable; the second thread should wait for the first to complete and use its result, rather than generating the result twice.

This was one of the reasons I wrote LazyCache, a friendly wrapper around MemoryCache that solves these sorts of issues. It is also available on NuGet.
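
As a rough sketch of the underlying idea (simplified, and not LazyCache's actual code): store a Lazy<T> in MemoryCache via AddOrGetExisting, so that racing threads all observe the same Lazy<T> and the value factory runs only once.

using System;
using System.Runtime.Caching;

// Hypothetical helper sketch; the extension method name GetOrAdd is illustrative.
public static class LazyCacheSketch
{
    public static T GetOrAdd<T>(this ObjectCache cache, string key, Func<T> valueFactory, CacheItemPolicy policy)
    {
        var newLazy = new Lazy<T>(valueFactory, isThreadSafe: true);

        // AddOrGetExisting returns null if nothing was cached under the key,
        // otherwise it returns the Lazy<T> that another thread already stored.
        var existingLazy = (Lazy<T>) cache.AddOrGetExisting(key, newLazy, policy);

        // Whichever Lazy<T> ended up in the cache, its Value is computed exactly once.
        return (existingLazy ?? newLazy).Value;
    }
}

With a helper along these lines, the question's GetPunctuationMaps body could shrink to a single Cache.GetOrAdd(PunctuationMapCacheKey, GetPunctuationMappings, cacheItemPolicy) call. One caveat of this naive sketch: if the factory throws, the faulted Lazy<T> stays cached (and rethrows) until the entry expires or is removed.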

answered Sep 23 '22 by alastairtree