I wrote a program that calculates Fibonacci numbers recursively, memoizing results in a ConcurrentHashMap via the computeIfAbsent() method.
The program works absolutely fine when I use small values like 8, 9, or 10, but it gets stuck in an endless loop when the value is increased from 10 to 20; the program never halts:
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class Test {
    static Map<Integer, Integer> concurrentMap = new ConcurrentHashMap<>();

    public static void main(String[] args) {
        System.out.println("Fibonacci result for 20 is " + fibonacci(20));
    }

    static int fibonacci(int i) {
        if (i == 0)
            return i;
        if (i == 1)
            return 1;
        // Memoize: compute the value only if the key is absent.
        return concurrentMap.computeIfAbsent(i, (key) -> {
            System.out.println("Value is " + key);
            return fibonacci(i - 2) + fibonacci(i - 1);
        });
    }
}
Can someone tell me why it gets stuck forever?
The ConcurrentHashMap class achieves thread-safety by locking only a portion of the map rather than the entire object: a writing thread needs the lock for just one segment (in Java 7 and earlier) or one bucket/bin (in Java 8 and later). In ConcurrentHashMap, read operations don't require any lock.
Keys in a ConcurrentHashMap are not kept in sorted order, so for cases where ordering is required, ConcurrentSkipListMap is a suitable choice (see the sketch below).
In ConcurrentHashMap, any number of threads can perform retrieval operations at the same time, but to update the map a thread must lock the particular segment (or bucket) it wants to operate on. This type of locking mechanism is known as segment locking or bucket locking.
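As a quick illustration of that ordering difference, here is a minimal sketch (the keys are arbitrary, chosen just to show the iteration order):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentSkipListMap;

public class OrderingDemo {
    public static void main(String[] args) {
        Map<Integer, String> hashMap = new ConcurrentHashMap<>();
        Map<Integer, String> skipList = new ConcurrentSkipListMap<>();
        for (int key : new int[] {42, 7, 19}) {
            hashMap.put(key, "v" + key);
            skipList.put(key, "v" + key);
        }
        // ConcurrentHashMap iterates in no particular order;
        // ConcurrentSkipListMap iterates in ascending key order.
        System.out.println("ConcurrentHashMap:     " + hashMap.keySet());
        System.out.println("ConcurrentSkipListMap: " + skipList.keySet());
    }
}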
ConcurrentHashMap is one of the collection classes we use most frequently in daily work; its hallmarks are high performance and thread-safety. However, there are two bugs that can impact the performance of this familiar map, and the problem lies in computeIfAbsent.
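The commonly cited workaround for the performance side of this is to do a lock-free get() first and fall back to computeIfAbsent() only on a miss, so a cache hit never touches the bin lock. A minimal sketch, where the cache and loadValue() names are hypothetical stand-ins rather than anything from the original post:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ReadBeforeCompute {
    static Map<String, String> cache = new ConcurrentHashMap<>();

    static String getOrLoad(String key) {
        // Fast path: plain reads on ConcurrentHashMap are lock-free,
        // so a hit never contends on the bin lock.
        String value = cache.get(key);
        if (value == null) {
            // Slow path: only a miss pays for computeIfAbsent's locking.
            value = cache.computeIfAbsent(key, ReadBeforeCompute::loadValue);
        }
        return value;
    }

    // Hypothetical expensive computation standing in for real work.
    static String loadValue(String key) {
        return "value-for-" + key;
    }

    public static void main(String[] args) {
        System.out.println(getOrLoad("alpha"));
        System.out.println(getOrLoad("alpha")); // second call takes the fast path
    }
}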
You are hitting a deadlock.
computeIfAbsent on a ConcurrentHashMap locks the bucket that the corresponding key maps to. If you attempt to calculate a Fibonacci number higher than the number of buckets in your map, the recursive calls will attempt to lock a bucket that is already locked further up the call stack. But of course, that lock cannot be released until all of the recursive calls have completed. Thus, your deadlock. (The JavaDoc explicitly warns that the mapping function must not attempt to update any other mappings of the same map; newer JDKs can even detect this situation and throw IllegalStateException: Recursive update instead of hanging.)
I would reconsider your decision to use a ConcurrentHashMap here; a sketch of one alternative follows.
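For example, here is a minimal sketch of one way to restructure the memoization so that the recursion happens outside any map operation, using plain get/put on a HashMap (which is fine for this single-threaded program):

import java.util.HashMap;
import java.util.Map;

public class FibonacciMemo {
    // A plain HashMap is enough here: main() is single-threaded, and
    // get/put take no locks, so the recursion can never self-deadlock.
    static Map<Integer, Integer> memo = new HashMap<>();

    static int fibonacci(int i) {
        if (i <= 1)
            return i;
        Integer cached = memo.get(i);
        if (cached != null)
            return cached;
        // Recurse first, store afterwards: no map operation is
        // in progress while the recursive calls run.
        int result = fibonacci(i - 2) + fibonacci(i - 1);
        memo.put(i, result);
        return result;
    }

    public static void main(String[] args) {
        System.out.println("Fibonacci result for 20 is " + fibonacci(20));
    }
}

With this structure, fibonacci(20) returns 6765 immediately instead of hanging.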