
Using 'HttpContext.Current.Cache' safely

I am using Cache in a web service method like this:

    var pblDataList = (List<blabla>)HttpContext.Current.Cache.Get("pblDataList");

    if (pblDataList == null)
    {
        var PBLData = dc.ExecuteQuery<blabla>(@"SELECT blabla");

        pblDataList = PBLData.ToList();

        HttpContext.Current.Cache.Add("pblDataList", pblDataList, null,
            DateTime.Now.Add(new TimeSpan(0, 0, 15)),
            Cache.NoSlidingExpiration, CacheItemPriority.Normal, null);
    }

But I wonder: is this code thread-safe? The web service method is called by multiple requesters, and more than one requester may try to retrieve the data and add it to the cache at the same time while the cache is empty.

The query takes 5 to 8 seconds. Would introducing a lock statement around this code prevent any possible conflicts? (I know that multiple queries can run simultaneously, but I want to be sure that only one query is running at a time.)

asked Apr 14 '10 by Burak SARICA

2 Answers

The cache object is thread-safe, but HttpContext.Current will not be available from background threads. This may or may not apply to you; it isn't obvious from your snippet whether you are actually using background threads. But if you are now, or decide to at some point in the future, you should keep this in mind.

If there's any chance that you'll need to access the cache from a background thread, then use HttpRuntime.Cache instead.
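For example, a minimal sketch of the lookup from your code switched over to HttpRuntime.Cache (keeping the "pblDataList" key and types from the question; the Add call changes the same way):

    // HttpRuntime.Cache is the same underlying cache, but it does not rely on
    // HttpContext.Current, so it also works when there is no current request
    // (e.g. on a background thread).
    var pblDataList = (List<blabla>)HttpRuntime.Cache.Get("pblDataList");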

In addition, although individual operations on the cache are thread-safe, sequential lookup/store operations are obviously not atomic. Whether or not you need them to be atomic depends on your particular application. If it could be a serious problem for the same query to run multiple times, i.e. if it would produce more load than your database is able to handle, or if it would be a problem for a request to return data that is immediately overwritten in the cache, then you would likely want to place a lock around the entire block of code.
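Here is one sketch of that pattern, assuming a static lock field (cacheLock is a hypothetical name you'd declare at class level) and re-checking the cache inside the lock so that only the first request actually runs the query:

    // Declared at class level; static so it is shared across all requests.
    private static readonly object cacheLock = new object();

    var pblDataList = (List<blabla>)HttpRuntime.Cache.Get("pblDataList");

    if (pblDataList == null)
    {
        lock (cacheLock)
        {
            // Re-check inside the lock: another request may have populated
            // the cache while this one was waiting.
            pblDataList = (List<blabla>)HttpRuntime.Cache.Get("pblDataList");

            if (pblDataList == null)
            {
                pblDataList = dc.ExecuteQuery<blabla>(@"SELECT blabla").ToList();

                HttpRuntime.Cache.Add("pblDataList", pblDataList, null,
                    DateTime.Now.Add(new TimeSpan(0, 0, 15)),
                    Cache.NoSlidingExpiration, CacheItemPriority.Normal, null);
            }
        }
    }

The second null check is what makes this work: requests that were blocked on the lock will find the freshly cached list and skip the query instead of running it again.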

However, in most cases you would really want to profile first and see whether or not this is actually a problem. Most web applications/services don't concern themselves with this aspect of caching because they are stateless and it doesn't matter if the cache gets overwritten.

answered by Aaronaught


You are correct: the retrieve and add operations are not performed as a single atomic step. If you need to prevent the query from running more than once, you'll need to use a lock.

(Normally this wouldn't be much of a problem, but for a long-running query like yours it can be worthwhile to relieve the strain on the database.)

answered by Greg