I have found various resources listing the time complexity of the various C++ STL containers. Where can I find the space complexities involved in using the C++ STL containers?
I do know that for most containers the relationship is linear in the number of elements contained. But what about containers that use a hash function? Is it possible to make any guarantees in that case?
Unordered associative containers implement unsorted (hashed) data structures that can be quickly searched (O(1) amortized, O(n) worst-case complexity).
There are two sources of complexity bounds for every STL container. The first is what the standard mandates. A good (and almost always correct) source for that is cppreference.com, e.g. http://en.cppreference.com/w/cpp/container, if you don't have the standard itself. Second, anything not specified in the standard is implementation-defined. These implementations are mostly very efficient given their general-purpose nature.
To answer your question with a short answer: Yes, you can expect linear space. But the details are a bit more complicated.
A quick look into the Standard (23.2.1 General container requirements) says:
All of the complexity requirements in this Clause are stated solely in terms of the number of operations on the contained objects.
Section 23.2.5 (Unordered associative containers) states:
The worst-case complexity for most operations is linear, but the average case is much faster.
The standard goes on and defines certain aspects of unordered associative containers in more detail. When looking carefully at the operation complexities, we can infer something for the space. Digging a bit further (23.5.4.2 unordered_map constructors) reveals:
Constructs an empty unordered_map using the specified hash function, key equality function, and allocator, and using at least n buckets. If n is not provided, the number of buckets is implementation-defined. Then inserts elements from the range [f, l). max_load_factor() returns 1.0. Complexity: Average case linear, worst case quadratic.
Quadratic time happens with a pathologically bad hash function. The average case is what you should expect: linear time implies linear space. The worst case occurs when the hash map exceeds its maximum load factor and has to be rehashed, re-bucketing every element (hand-wavingly speaking).
For element access, we get a similar thing:
Complexity: Average case O(1), worst case O(size()).
Also, the standard requires the implementation to expose a bucketed data structure: elements are hashed into buckets. These buckets also need space, and depending on how you initialize the unordered_map, the number of buckets is implementation-defined. So efficient implementations will use O(n + N) space, where n is the number of elements and N is the number of buckets.
Hope that clarifies things a bit.
I was looking for the answer to the space complexity of the associative container set. Hope this helps. The content below partly answers the question.
set: In the post "space complexity of stl set?", due to the data structure of set (a red-black tree), the space complexity is usually O(n).
map: The data structures of map and set are both red-black trees, so the space complexity of map is also usually O(n), but it depends on the real-world case (see: space complexity of a map).