As per the code below, a HashMap's default initial capacity is 16 and its load factor is 0.75, so rehashing should occur when I insert the 13th element. Because I have provided a constant hashCode, it internally maintains a linked list, which should preserve insertion order. Up to the 10th element it works as expected, but when I insert the 11th element, the order gets shuffled. Can anybody help me understand why this happens only at the insertion of the 11th element?
import java.util.HashMap;
import java.util.Map;

class A {
    int a;

    A(int a) {
        this.a = a;
    }

    @Override
    public int hashCode() {
        return 7;
    }

    @Override
    public String toString() {
        return "" + a;
    }
}

class Base {
    public static void main(String[] args) {
        Map<Object, Integer> m = new HashMap<Object, Integer>();
        m.put(new A(1), 1);
        m.put(new A(2), 1);
        m.put(new A(3), 1);
        m.put(new A(4), 1);
        m.put(new A(5), 1);
        m.put(new A(6), 1);
        m.put(new A(7), 1);
        m.put(new A(8), 1);
        m.put(new A(9), 1);
        m.put(new A(10), 1);
        //m.put(new A(11), 1);
        System.out.println(m);
    }
}
Output which I am getting till 10th element:
{1=1, 2=1, 3=1, 4=1, 5=1, 6=1, 7=1, 8=1, 9=1, 10=1}
Output which I am getting after entering 11th element:
{4=1, 1=1, 2=1, 3=1, 5=1, 6=1, 7=1, 8=1, 9=1, 10=1, 11=1}
Shouldn't it maintain insertion order? Or, if it is using a red-black tree internally, which traversal is it using in this case?
According to the theory I have read, a hashCode() that returns a constant value gives us the same bucket index for every key, so all entries are stored in a single bucket. Since equals() returns false for distinct instances, objects that are logically the same are saved multiple times as separate entries.
From the documentation of System.identityHashCode(Object): "Returns the same hash code for the given object as would be returned by the default method hashCode(), whether or not the given object's class overrides hashCode(). The hash code for the null reference is zero." So even if hashCode() is overridden, that override does not affect the identity hash code.
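To illustrate the quoted behavior, here is a minimal sketch contrasting an overridden hashCode() with System.identityHashCode(). The class name K and the demo class are hypothetical, introduced only for this example:

```java
// Hypothetical class with a constant hashCode(), like class A in the question.
class K {
    @Override
    public int hashCode() {
        return 7; // constant for every instance
    }
}

public class IdentityHashDemo {
    public static void main(String[] args) {
        K k = new K();
        // The overridden method always returns the constant.
        System.out.println(k.hashCode());                  // 7
        // identityHashCode bypasses the override entirely;
        // its value is JVM-dependent, so no fixed output here.
        System.out.println(System.identityHashCode(k));
        // The identity hash code of null is specified to be zero.
        System.out.println(System.identityHashCode(null)); // 0
    }
}
```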
When two unequal objects have the same hash value, this causes a collision in the hash table, because both objects want to be in the same slot (sometimes called a bucket). The hash algorithm must resolve such collisions.
If multiple objects return the same value from hashCode(), it means that they would be stored in the same bucket. If many objects are stored in the same bucket it means that on average it requires more comparison operations to look up a given object.
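The collision behavior described above can be demonstrated with a small sketch. The key class FixedHashKey and the demo class are hypothetical names chosen for this example, assuming a constant hashCode() and the default (reference-based) equals():

```java
import java.util.HashMap;
import java.util.Map;

// Two distinct keys with the same hash code land in the same bucket,
// but because they are not equals(), both entries are kept.
class FixedHashKey {
    final int id;
    FixedHashKey(int id) { this.id = id; }
    @Override public int hashCode() { return 7; } // forces a collision
    // equals() is NOT overridden, so distinct instances are never equal
}

public class CollisionDemo {
    public static void main(String[] args) {
        Map<FixedHashKey, Integer> m = new HashMap<>();
        m.put(new FixedHashKey(1), 1);
        m.put(new FixedHashKey(2), 2);
        // Both entries survive the collision as separate mappings.
        System.out.println(m.size()); // 2
    }
}
```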
"Shouldn't it maintain insertion order? Or, if it is using a red-black tree internally, which traversal is it using in this case?"
There is no "should"; HashMap does not guarantee any order. What actually happens in the current implementation is determined by two constants, TREEIFY_THRESHOLD = 8 and MIN_TREEIFY_CAPACITY = 64. When the number of items in one bucket exceeds the former, the bucket is converted into a tree of nodes, unless the map's total capacity is smaller than the latter constant, in which case the capacity is doubled instead.
So when you insert the 9th object, the capacity is raised from 16 to 32; inserting the 10th causes a raise from 32 to 64; then inserting the 11th element causes the bucket to be converted into a tree.
This conversion always happens, whether there is an actual benefit or not. Since the objects have identical hash codes and do not implement Comparable, the map ends up using their identity hash codes to determine the order. This may result in a change of the order (in my environment, it does not).