I implemented search-result caching where the keys are of type State (a class holding 7 short ints) and the values are of type Score (a class holding 3 doubles). Using unordered_map was at least 20 times slower than map. Why?
Edit: Darn it! My hash function was
namespace std {
size_t hash<State>::operator()(State const& s) const {
    size_t retval = hash<short>()(s.s[0]);
    for (int i = 1; i < R; i += 2) { // 1 3 5
        int x = (static_cast<int>(s.s[i + 1]) << 16)
              + (static_cast<int>(s.s[i]));
        hash_combine(retval, x);
    }
    // bug: control falls off the end here -- "return retval;" is missing
}
}
I forgot to return retval, so every key hashed to the same value and everything collided! I wish unordered_map had a hash_function_quality() function that reported the average number of collisions.
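Such a diagnostic is easy to sketch on top of the standard bucket interface. The name hash_function_quality is the hypothetical one wished for above, implemented here as a free function; it reports the average number of elements per non-empty bucket, so 1.0 means no collisions at all and size() means everything collided into one bucket:

```cpp
#include <cassert>
#include <cstddef>
#include <unordered_map>

// Sketch of the wished-for diagnostic: average chain length across
// non-empty buckets. 1.0 = collision-free; size() = total collapse.
template <class UnorderedMap>
double hash_function_quality(const UnorderedMap& m) {
    std::size_t non_empty = 0;
    for (std::size_t b = 0; b < m.bucket_count(); ++b)
        if (m.bucket_size(b) > 0)
            ++non_empty;
    return non_empty == 0
        ? 0.0
        : static_cast<double>(m.size()) / static_cast<double>(non_empty);
}
```

A broken hash like the one above would have reported a quality of size() immediately, instead of hiding behind a 20x slowdown.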
The performance of unordered_map depends directly on your hashing function, but the relationship is not as straightforward as it might seem. Case in point: suppose you use the simplest possible hashing function:
std::size_t myHash(const MyObjectType& _object) { return 1; }
then what you'll end up with is a collection that behaves like a list rather than a hashed container. All the items will map to a single bucket, and you'll have to traverse the entire bucket until you reach the item you want (which takes O(N) time).
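You can observe this degeneration directly through unordered_map's bucket interface. The sketch below (ConstantHash and longest_chain are illustrative names) shows that a constant hash piles every element into one bucket, so the longest chain a lookup may have to walk equals the container's size:

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <unordered_map>

// A deliberately terrible hash: every key lands in the same bucket.
struct ConstantHash {
    std::size_t operator()(int) const { return 1; }
};

// Size of the largest bucket, i.e. the longest chain a single
// lookup may have to traverse.
template <class Map>
std::size_t longest_chain(const Map& m) {
    std::size_t worst = 0;
    for (std::size_t b = 0; b < m.bucket_count(); ++b)
        worst = std::max(worst, m.bucket_size(b));
    return worst;
}
```

With ConstantHash, longest_chain equals the element count, so every find() is a linear scan; with the default std::hash&lt;int&gt;, the chains stay short.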
What you need to do is look at two things:
1. How fast your hash function computes a hash.
2. How evenly it distributes keys across buckets (i.e., how many collisions it produces).
Either of those by itself can and will kill the performance.
std::unordered_map is commonly slow for a small number of elements because of the hash function: it takes a fixed(-ish) amount of time per lookup, but that can be a significant amount of time nonetheless.

std::map, on the other hand, is simpler than std::unordered_map. The time it takes to access an element depends on the number of elements, but less and less so as that number grows. And the constant factor c hidden in std::map's big-O bound is commonly very small too, compared to std::unordered_map's.

In general, prefer std::map over std::unordered_map, unless you have a specific reason to use std::unordered_map. This holds particularly if you don't have a large number of elements.
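If in doubt for your own key type, a quick measurement settles the question. The sketch below is a rough micro-benchmark harness (time_lookups is an illustrative name, and the int keys are placeholders for your real key type); run it with both containers at the sizes you actually expect:

```cpp
#include <cassert>
#include <chrono>
#include <cstdio>
#include <map>
#include <unordered_map>

// Rough sketch: fill a container with n entries, then time n lookups.
// Returns elapsed microseconds. Substitute your real key/value types.
template <class Container>
long long time_lookups(Container& c, int n) {
    for (int i = 0; i < n; ++i) c[i] = i;
    auto start = std::chrono::steady_clock::now();
    long long sink = 0;
    for (int i = 0; i < n; ++i) sink += c.find(i)->second;
    auto stop = std::chrono::steady_clock::now();
    std::printf("sum=%lld ", sink); // keeps the loop from being optimized away
    return std::chrono::duration_cast<std::chrono::microseconds>(stop - start)
        .count();
}
```

The crossover point where hashing starts to pay off varies by standard-library implementation, key size, and hash quality, so measure rather than guess.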