What is the complexity of std::vector<T>::clear() when T is a primitive type?

I understand that the complexity of the clear() operation is linear in the size of the container, because the destructors must be called. But what about primitive types (and POD)? It seems the best thing to do would be to set the vector size to 0, so that the complexity is constant.

If this is possible, is it also possible for std::unordered_map?
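
To make the question concrete, here is a minimal sketch of the three cases I am asking about (nothing here is specific to any particular implementation):

#include <string>
#include <unordered_map>
#include <vector>

int main() {
    std::vector<std::string> vs(1000, "x");
    vs.clear();                          // must run 1000 std::string destructors: O(n)

    std::vector<int> vi(1000, 42);
    vi.clear();                          // no destructors to run: could this be O(1)?

    std::unordered_map<int, int> m;
    for (int i = 0; i < 1000; ++i)
        m.emplace(i, i);
    m.clear();                           // same question for the hash map
}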

user1487088 asked Jun 27 '12 23:06


People also ask

What is the time complexity of vector erase?

Time Complexity: O(N) in the worst case, because every element after the erased position has to be shifted.

What is time complexity of vector in C++?

The time complexity to find an element in std::vector by linear search is O(N). It is O(log N) for std::map and O(1) for std::unordered_map . However, the complexity notation ignores constant factors. Different containers have various traversal overheads to find an element.

What is the complexity of vector?

The complexity (efficiency) of common operations on vectors is as follows:
Random access - constant 𝓞(1)
Insertion or removal of elements at the end - amortized constant 𝓞(1)
Insertion or removal of elements - linear in the distance to the end of the vector 𝓞(n)

What is the time complexity of removing the first element from a vector?

Erasing an element in a vector is O(n), as we have to remove the element and then shift all successive elements to fill the gap created. If a vector has n elements, then in the worst case we will need to shift n-1 elements, hence the complexity is O(n).


1 Answer

It seems the best thing to do would be to set the vector size to 0, so that the complexity is constant.

In general, the complexity of resizing a vector to zero is linear in the number of elements currently stored in it. Setting the vector's size to zero therefore offers no advantage over calling clear() - the two are essentially the same.
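
To see why, consider a sketch with an element type that has a non-trivial destructor (assuming nothing beyond what the standard requires): both calls must run every destructor, so both are linear.

#include <string>
#include <vector>

int main() {
    std::vector<std::string> a(1000, "payload");
    std::vector<std::string> b(1000, "payload");

    a.clear();      // destroys all 1000 strings: linear in size()
    b.resize(0);    // also destroys all 1000 strings: same cost
}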

However, at least one implementation (libstdc++, source in bits/stl_vector.h) gives you O(1) complexity for primitive types by dispatching on the triviality of the destructor through template specialization.

The implementation of clear() navigates its way to the std::_Destroy(from, to) function in bits/stl_construct.h, which performs a non-trivial compile-time optimization: it dispatches to an auxiliary class template _Destroy_aux that takes a bool template parameter. The primary template defines a static function called __destroy whose body loops over the range and invokes each element's destructor via std::_Destroy(ptr); an explicit specialization for true defines __destroy with an empty body.

The trick comes on line 136:

std::_Destroy_aux<__has_trivial_destructor(_Value_type)>::
__destroy(__first, __last);

The auxiliary class is instantiated based on the result of the __has_trivial_destructor check. This compiler intrinsic evaluates to true for built-in types and to false for types with a non-trivial destructor. As a result, the call to __destroy becomes a no-op for int, double, and other POD types.
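
The same dispatch can be sketched in portable C++ without the compiler intrinsic; this is a simplification for illustration, not the actual libstdc++ code, and the names destroy_range and destroy_aux are made up:

#include <iterator>
#include <type_traits>

template <bool TrivialDtor>
struct destroy_aux {                       // general case: run every destructor
    template <typename ForwardIt>
    static void destroy(ForwardIt first, ForwardIt last) {
        using T = typename std::iterator_traits<ForwardIt>::value_type;
        for (; first != last; ++first)
            first->~T();                   // O(n) destructor calls
    }
};

template <>
struct destroy_aux<true> {                 // trivially destructible elements
    template <typename ForwardIt>
    static void destroy(ForwardIt, ForwardIt) {}   // nothing to do: O(1)
};

template <typename ForwardIt>
void destroy_range(ForwardIt first, ForwardIt last) {
    using T = typename std::iterator_traits<ForwardIt>::value_type;
    destroy_aux<std::is_trivially_destructible<T>::value>::destroy(first, last);
}

With something like this in place, a clear()-style function can call destroy_range(begin(), end()) and the loop compiles away entirely for int, double, and similar types.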

std::unordered_map is different from vector in that it may need to deallocate the structures representing the "hash buckets" of POD objects, as opposed to the objects themselves*. An O(1) clear() is possible, but it is heavily implementation-dependent, so I would not count on it.


* The exact answer depends on the implementation: hash tables that resolve collisions with open addressing (linear probing, quadratic probing, etc.) may be able to release all buckets in O(1); implementations based on separate chaining have to delete their nodes one by one, though.
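
If you want to check what your own implementation does, a crude timing sketch along these lines (numbers will be noisy, but the trend is what matters) should show clear() staying flat for vector<int> while growing with n for vector<std::string>:

#include <chrono>
#include <cstdio>
#include <string>
#include <vector>

template <typename T>
long long clear_time_ns(std::size_t n, const T& value) {
    std::vector<T> v(n, value);
    auto start = std::chrono::steady_clock::now();
    v.clear();                                        // operation under test
    auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::nanoseconds>(stop - start).count();
}

int main() {
    for (std::size_t n : {100000, 1000000, 10000000}) {
        std::printf("n=%zu  int: %lld ns  string: %lld ns\n", n,
                    clear_time_ns<int>(n, 42),
                    clear_time_ns(n, std::string("some payload long enough to allocate")));
    }
}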

Sergey Kalinichenko answered Sep 22 '22 06:09