I have a website where I cache a lot of info. I am seeing conflicting information about storing things in the ASP.NET cache.
For example, let's say I have this data structure:
Dictionary<string, List<Car>> mydictionary;
I could store the whole thing with a string key such as "MyDictionary" and then drill down once I pull out the object:
HttpContext.Cache.Add("MyDictionary",
mydictionary,
null,
Cache.NoAbsoluteExpiration,
new TimeSpan(10, 0, 0),
CacheItemPriority.Normal,
null);
var myDict = HttpContext.Cache["MyDictionary"] as Dictionary<string, List<Car>>;
The other thing I could do is break it down and store each item in my dictionary separately in the cache (given that the cache is a dictionary anyway):
Dictionary<string, List<Car>> mydictionary;
foreach (var item in mydictionary.Keys)
{
HttpContext.Cache.Add(item, mydictionary[item], null, Cache.NoAbsoluteExpiration, new TimeSpan(10, 0, 0), CacheItemPriority.Normal, null);
}
var myItem = "test";
var myList = HttpContext.Cache[myItem] as List<Car>;
Would the performance implications be very different? (I am assuming that everything is in memory anyway.)
Adding an additional answer here, as I feel the existing answers capture the 'what' but not enough of the 'why'.
The reason it's best to store entries individually in the cache has little to do with performance. Instead, it has to do with allowing the system to perform proper memory management.
There is a lot of logic in the ASP.NET cache to figure out what to do when it runs into memory pressure. In the end, it needs to kick out some items, and it needs to do this in the least disruptive way possible. Which items it chooses to kick out depends a lot on whether they were accessed recently. There are other factors, like what flags are passed at caching time; e.g. you can make an item non-removable, and it'll never be kicked out.
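To illustrate those flags, here is a minimal sketch of pinning an item so that memory pressure never evicts it, using `CacheItemPriority.NotRemovable` instead of the default `Normal`. The key and value names are made up for the example:

```csharp
using System.Web;
using System.Web.Caching;

// "CriticalLookup" and lookupTable are hypothetical; the point is the
// NotRemovable priority, which exempts the entry from memory-pressure eviction.
// (It can still be removed explicitly or expire if you set an expiration.)
HttpContext.Current.Cache.Add("CriticalLookup",
    lookupTable,
    null,                           // no dependency
    Cache.NoAbsoluteExpiration,
    Cache.NoSlidingExpiration,
    CacheItemPriority.NotRemovable, // never kicked out under memory pressure
    null);                          // no removal callback
```

Use this sparingly: every pinned entry shrinks the pool of memory the cache can reclaim for everything else.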
But going back to the question: if you store your entire dictionary as a single item, you leave the ASP.NET memory manager only two options: keep the entire thing alive, or kill the whole thing. That is, you completely lose the benefit of having your 'hot' items stay in the cache while your rarely accessed, memory-hogging items get kicked out. In other words, you lose any caching granularity.
Of course, you could choose to implement your own scheme in your dictionary to remove items that are rarely used. But at that point you're re-inventing the wheel, and your new wheel will not work as well since it won't coordinate with the rest of the system.
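Storing entries individually also means any one of them can be evicted on its own, so each read needs a cache-miss path. A minimal sketch, assuming a hypothetical `LoadCarsFromDatabase` loader and the same expiration settings as the question:

```csharp
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public List<Car> GetCars(string key)
{
    // Try the cache first; a null result means the entry was never added
    // or was evicted under memory pressure.
    var cars = HttpContext.Current.Cache[key] as List<Car>;
    if (cars == null)
    {
        cars = LoadCarsFromDatabase(key); // hypothetical data-access call
        HttpContext.Current.Cache.Add(key, cars, null,
            Cache.NoAbsoluteExpiration,
            new TimeSpan(10, 0, 0),       // 10-hour sliding expiration
            CacheItemPriority.Normal,
            null);
    }
    return cars;
}
```

With this pattern, the system is free to evict the cold lists while the hot ones stay cached, and a miss only costs a reload of the one list that was actually requested.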
As you say, there are two opposing opinions on this.
What it really boils down to is that cache entries should be stored at the granularity at which they are used. By that I mean: if you use every entry in your Dictionary every time you access it, then store it at the dictionary level.
If you only use one item from the dictionary at a time, and you don't walk the whole dictionary, then store the items individually.
So if you treat the entire collection as a single unit, cache it as a single unit. If the collection is randomly accessed and only certain items are needed at a time, then cache it at the level at which each piece is a single unit.
By far, the second way is better.
Think about working with many objects. Every time you need one value, you fetch just that value, instead of grabbing the entire dictionary and then extracting what you want from it.
And that's without mentioning the algorithms the cache uses to prioritize the most-used items (MRU, if I remember correctly).
I just recommend that you measure the performance gain before adopting this as the final implementation.
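A rough way to do that measurement is a `Stopwatch` loop over both schemes, assuming both have already been populated as in the question ("MyDictionary" for the whole-dictionary approach, "test" as one per-item key). Absolute numbers are meaningless outside your own environment; only the comparison matters:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Web;

const int iterations = 100000;

// Scheme 1: pull the whole dictionary out of the cache, then index into it.
var sw = Stopwatch.StartNew();
for (int i = 0; i < iterations; i++)
{
    var whole = (Dictionary<string, List<Car>>)HttpContext.Current.Cache["MyDictionary"];
    var cars = whole["test"];
}
sw.Stop();
Console.WriteLine("Whole dictionary: {0} ms", sw.ElapsedMilliseconds);

// Scheme 2: pull the one list directly by its own cache key.
sw.Restart();
for (int i = 0; i < iterations; i++)
{
    var cars = (List<Car>)HttpContext.Current.Cache["test"];
}
sw.Stop();
Console.WriteLine("Per-item: {0} ms", sw.ElapsedMilliseconds);
```

Both lookups are hash-table reads, so any raw-speed difference will likely be small; the real differences show up under memory pressure, which a micro-benchmark like this won't capture.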