Basically, if I want to do the following:
public class SomeClass { private static ConcurrentDictionary<..., ...> Cache { get; set; } }
Does this let me avoid using locks all over the place?
ConcurrentDictionary<TKey,TValue> (in System.Collections.Concurrent) is a thread-safe collection class. The documentation recommends using it whenever multiple threads might be attempting to access the elements concurrently.
The GetOrAdd function. The vast majority of the methods it exposes are thread safe, with the notable exception of one of the GetOrAdd overloads: TValue GetOrAdd(TKey key, Func<TKey, TValue> valueFactory); This overload takes a key, checks whether the key already exists in the dictionary and, if it does not, invokes valueFactory to produce the value. The valueFactory delegate runs outside the dictionary's internal lock, so it can be invoked more than once when several threads race on the same missing key, even though only one of the produced values ends up being stored.
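To make that concrete, here is a minimal sketch (the key, value type and "expensive" factory are made up for illustration): under contention the factory delegate may run more than once, yet only one result is stored.

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class GetOrAddDemo
{
    // Hypothetical cache; the key and value types are placeholders.
    static readonly ConcurrentDictionary<int, string> cache =
        new ConcurrentDictionary<int, string>();

    static void Main()
    {
        Parallel.For(0, 10, _ =>
        {
            string value = cache.GetOrAdd(42, key =>
            {
                // The valueFactory runs outside the dictionary's internal lock,
                // so under contention this line can print more than once.
                Console.WriteLine("Factory invoked for key " + key);
                return "expensive value for " + key;
            });
        });

        // Exactly one of the produced values was stored under the key.
        Console.WriteLine(cache[42]);
    }
}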
In .NET 4, ConcurrentDictionary used very poor locking management and contention resolution, which made it extremely slow. A plain Dictionary with custom locking, or even a test-and-set approach that copy-on-writes the whole dictionary, was faster.
ConcurrentDictionary is a thread-safe collection class for storing key/value pairs. It uses locking internally to give you a thread-safe class. It exposes different methods than the Dictionary class: you can use TryAdd, TryUpdate, TryRemove, and TryGetValue to perform CRUD operations on a ConcurrentDictionary.
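As a rough sketch of those Try* methods in use (the key names and values are purely illustrative):

using System;
using System.Collections.Concurrent;

class TryMethodsDemo
{
    static void Main()
    {
        var dict = new ConcurrentDictionary<string, int>();

        // Create: add only if the key is not already present.
        bool added = dict.TryAdd("visits", 1);

        // Read: fetch without throwing if the key is missing.
        if (dict.TryGetValue("visits", out int current))
            Console.WriteLine("visits = " + current);

        // Update: succeeds only if the stored value still equals the comparison value.
        bool updated = dict.TryUpdate("visits", newValue: 2, comparisonValue: 1);

        // Delete: remove and get the removed value back in one call.
        bool removed = dict.TryRemove("visits", out int removedValue);

        Console.WriteLine($"{added} {updated} {removed} {removedValue}");
    }
}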
Yes, it is thread safe, and yes, it lets you avoid using locks all over the place (whatever that means). Of course that only gives you thread-safe access to the data stored in the dictionary; if the data itself is not thread safe, you still need to synchronize access to it. Imagine, for example, that you have stored a List<T> in this cache. Now thread1 fetches this list (in a thread-safe manner, as the concurrent dictionary guarantees) and starts enumerating over it. At exactly the same time thread2 fetches this very same list from the cache (again in a thread-safe manner) and writes to it (for example, it adds a value). Conclusion: if you haven't synchronized the access in thread1, it will get into trouble.
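A minimal sketch of that scenario (the key and list contents are invented): the dictionary protects the lookup, but both tasks still have to agree on some lock around the List<T> itself, otherwise the enumeration in the first task can throw while the second task is adding.

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

class NestedListDemo
{
    static readonly ConcurrentDictionary<string, List<int>> cache =
        new ConcurrentDictionary<string, List<int>>();

    static void Main()
    {
        cache.TryAdd("numbers", new List<int> { 1, 2, 3 });

        var reader = Task.Run(() =>
        {
            // Fetching the list is thread safe thanks to the concurrent dictionary...
            List<int> list = cache["numbers"];

            // ...but enumerating it is not, so the list itself still needs a lock.
            lock (list)
            {
                foreach (int n in list)
                    Console.WriteLine(n);
            }
        });

        var writer = Task.Run(() =>
        {
            List<int> list = cache["numbers"];
            lock (list)
            {
                list.Add(4); // same instance, mutated from another thread
            }
        });

        Task.WaitAll(reader, writer);
    }
}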
As far as using it as a cache is concerned, that's probably not a good idea. For caching I would recommend what is already built into the framework, such as the MemoryCache class. The reason is that what lives in the System.Runtime.Caching assembly is explicitly built for caching: it handles things like automatic expiration of entries when you start running low on memory, callbacks when cached items are removed, and you can even distribute your cache over multiple servers using things like memcached, AppFabric, ..., all things you can't dream of with a concurrent dictionary.
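For comparison, here is a minimal sketch of MemoryCache from System.Runtime.Caching with an expiration policy (the key, value and timings are arbitrary):

using System;
using System.Runtime.Caching;   // reference the System.Runtime.Caching assembly

class MemoryCacheDemo
{
    static void Main()
    {
        ObjectCache cache = MemoryCache.Default;

        var policy = new CacheItemPolicy
        {
            // The entry is evicted automatically five minutes after it was added.
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5),

            // Optional callback fired when the entry is removed (expired, evicted, ...).
            RemovedCallback = args =>
                Console.WriteLine($"'{args.CacheItem.Key}' removed: {args.RemovedReason}")
        };

        cache.Set("user:42", "some expensive result", policy);

        // Returns null once the entry has expired or was evicted under memory pressure.
        var value = cache.Get("user:42") as string;
        Console.WriteLine(value);
    }
}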