I want a collection in Java which:
- maps Objects to Objects (not String or otherwise restricted keys only)
- is efficient to read from multiple threads simultaneously

It's OK if cache misses simultaneously in multiple threads cause redundant computations; the typical case is that the cache is mostly filled by one thread at first.
A synchronized block around a thread-unsafe hashtable fails the efficient-to-read criterion. Thread-local caches would be straightforward, but they make new threads expensive, since each thread holds a full copy of the cache.
Java 1.5 built-ins, or one or a few class files we can copy into our MIT-licensed project, are preferred over large external libraries.
Use java.util.concurrent.ConcurrentHashMap:
ConcurrentHashMap<Object, Object> table = new ConcurrentHashMap<Object, Object>();

public Object getFromCache(Object key)
{
    Object value = table.get(key);
    if (value == null)
    {
        // The key isn't in the cache yet, so compute a value for it.
        value = calculateValueForKey(key);
        // putIfAbsent returns the existing value if another thread inserted one first.
        Object fromCache = table.putIfAbsent(key, value);
        if (fromCache != null)
        {
            // Another thread won the race; use the value that is actually in the table.
            value = fromCache;
        }
    }
    return value;
}

/**
 * Calculates a new value to put into the cache.
 */
public abstract Object calculateValueForKey(Object key);
N.B. This is no longer a general solution for multithreaded caching, since it relies on the stated fact that the cached objects are immutable, so it does not matter which of several equivalent instances ends up in the cache or is returned to a caller.
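As a usage sketch (the class name SquareCache and the square-of-an-integer computation are invented for illustration, not part of the answer above), the pattern can be exercised like this:

import java.util.concurrent.ConcurrentHashMap;

public class SquareCache
{
    private final ConcurrentHashMap<Object, Object> table =
            new ConcurrentHashMap<Object, Object>();

    public Object getFromCache(Object key)
    {
        Object value = table.get(key);
        if (value == null)
        {
            value = calculateValueForKey(key);
            Object fromCache = table.putIfAbsent(key, value);
            if (fromCache != null)
            {
                value = fromCache; // another thread inserted first; reuse its value
            }
        }
        return value;
    }

    // Stands in for an expensive, side-effect-free computation.
    public Object calculateValueForKey(Object key)
    {
        int n = ((Integer) key).intValue();
        return Integer.valueOf(n * n);
    }

    public static void main(String[] args)
    {
        SquareCache cache = new SquareCache();
        System.out.println(cache.getFromCache(Integer.valueOf(7))); // computes and prints 49
        System.out.println(cache.getFromCache(Integer.valueOf(7))); // cache hit, prints 49 again
    }
}

The second lookup hits the cache and skips the computation; under contention the worst case is a redundant calculateValueForKey call, which the question explicitly allows.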
This is my own idea for a solution, but I'm not an expert on threaded programming, so please comment/vote/compare to other answers as you feel appropriate.
Use a thread-local variable (java.lang.ThreadLocal) which contains a per-thread hashtable used as a first-level cache. If the key is not found in this table, synchronized access is made to a second-level cache: a synchronized-access hashtable shared by all threads. In this way the calculation of the cache value is only ever done once and is shared among all threads, while each thread keeps a local copy of the mapping from keys to values. There is some memory cost (though less than having fully independent per-thread caches), but reads are efficient.
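Here is a minimal sketch of that two-level scheme, assuming Java 1.5. The class name TwoLevelCache and the abstract calculateValueForKey method are invented for illustration; the answer above describes the design only in prose.

import java.util.HashMap;
import java.util.Map;

public abstract class TwoLevelCache
{
    // Second-level cache: a single map shared by all threads,
    // guarded by synchronizing on it.
    private final Map<Object, Object> sharedCache = new HashMap<Object, Object>();

    // First-level cache: an unsynchronized per-thread map.
    private final ThreadLocal<Map<Object, Object>> localCache =
            new ThreadLocal<Map<Object, Object>>()
            {
                protected Map<Object, Object> initialValue()
                {
                    return new HashMap<Object, Object>();
                }
            };

    public Object getFromCache(Object key)
    {
        Map<Object, Object> local = localCache.get();
        Object value = local.get(key);
        if (value == null)
        {
            // Miss in the thread-local map: fall back to the shared map.
            synchronized (sharedCache)
            {
                value = sharedCache.get(key);
                if (value == null)
                {
                    // Holding the lock here is what makes the calculation
                    // happen at most once across all threads.
                    value = calculateValueForKey(key);
                    sharedCache.put(key, value);
                }
            }
            // Remember the mapping locally so later reads by this thread
            // never touch the lock again.
            local.put(key, value);
        }
        return value;
    }

    protected abstract Object calculateValueForKey(Object key);
}

Steady-state reads hit only the thread-local HashMap; the shared map's lock is taken only on a thread's first lookup of each key, which matches the memory-for-read-speed trade-off described above.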