I was trying to add Integers from two sets into a single set, both via a for loop and via the addAll() method provided by the Collection interface. For testing purposes, I populated two sets with Integers and then tried adding them to a third set.
Set<Integer> undId = new HashSet<Integer>();
Set<Integer> proxies = new HashSet<Integer>();
// Populate the two sets with Integers
for (int i = 0; i < 100; i++) {
    undId.add(i);
    proxies.add(i);
}
Method 1: add them to the third set using a for loop
Set<Integer> underlyings = new HashSet<Integer>();
for (Integer integer : undId)
    underlyings.add(integer);
for (Integer integer : proxies)
    underlyings.add(integer);
Method 2: add them to the third set using addAll()
underlyings.addAll(undId);
underlyings.addAll(proxies);
When I timed the operations using System.nanoTime(), the for-loop approach was about twice as fast for 100, 1,000, and 10,000 elements. When I increased the size to 1,000,000 or 10,000,000, the result reversed. I was wondering why this happens for larger sets. I am not sure how addAll() works internally, so any help in understanding the above would be appreciated. Thanks.
Before you do anything else, make sure you've read and understood the discussion here: Java benchmarking - why is the second loop faster?
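To illustrate why warm-up matters here, below is a minimal sketch of a more careful timing harness. The class name and the helper timeAddAll are my own invention for this example; the point is that you should warm up the JIT before measuring and take the best of several runs, rather than timing a single cold call.

```java
import java.util.HashSet;
import java.util.Set;

public class AddAllTiming {

    // Hypothetical helper: builds a source set of the given size and
    // times a single addAll into a fresh target set.
    static long timeAddAll(int size) {
        Set<Integer> source = new HashSet<Integer>();
        for (int i = 0; i < size; i++) {
            source.add(i);
        }
        Set<Integer> target = new HashSet<Integer>();
        long start = System.nanoTime();
        target.addAll(source);
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        // Warm-up runs so the JIT compiles the hot paths before measuring.
        for (int i = 0; i < 10; i++) {
            timeAddAll(100_000);
        }
        // Measured runs; take the minimum to reduce noise from GC and OS jitter.
        long best = Long.MAX_VALUE;
        for (int i = 0; i < 5; i++) {
            best = Math.min(best, timeAddAll(100_000));
        }
        System.out.println("addAll of 100000 elements took " + best + " ns");
    }
}
```

For anything beyond a quick sanity check, a proper harness such as JMH is the safer choice, since it handles warm-up, dead-code elimination, and statistics for you.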
I would expect addAll to be faster in some situations as it has more information to work with.
For example, on an ArrayList, addAll can allocate enough space for every incoming element in a single step, rather than having to reallocate the backing array multiple times when adding a large number of elements.
It would certainly not be slower: even a naive implementation of addAll would just do what your loop does, iterating through the source and adding the items one at a time.
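To make the contrast concrete, here is a sketch comparing the built-in addAll with a hand-rolled naive version on an ArrayList. The helper naiveAddAll is hypothetical and stands in for "a loop that adds one element at a time"; both approaches produce the same contents, but the naive loop may trigger several internal reallocations where the real ArrayList.addAll grows the backing array at most once.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class AddAllSketch {

    // Hypothetical naive addAll: adds one element at a time, so the
    // target's backing array may be reallocated several times as it grows.
    static <T> void naiveAddAll(List<T> target, List<T> source) {
        for (T element : source) {
            target.add(element);
        }
    }

    public static void main(String[] args) {
        List<Integer> source = new ArrayList<Integer>(Arrays.asList(1, 2, 3, 4, 5));

        // The real ArrayList.addAll can size the backing array for all
        // incoming elements in a single capacity check before copying.
        List<Integer> viaAddAll = new ArrayList<Integer>();
        viaAddAll.addAll(source);

        List<Integer> viaLoop = new ArrayList<Integer>();
        naiveAddAll(viaLoop, source);

        System.out.println(viaAddAll.equals(viaLoop)); // prints "true"
    }
}
```

For a HashSet the picture is less clear-cut, which is part of why the measured results in the question can swing either way depending on set size.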