Possible Duplicate:
Why would I prefer using vector to deque
I am curious why std::vector is so much more popular than std::deque. Deque is almost as efficient in lookup, more efficient in insertion (without vector::reserve), and allows inserting/deleting at the front.
Herb Sutter once recommended that if you want to use vector, just prefer deque (I am paraphrasing). However, in a recent talk on Writing Modern C++ he again strongly recommends thinking of std::vector as the default container. According to the GOTW I linked earlier, even the standard has similar wording.
Is there a reason for this disparity? Is it just that vector is simpler and better known, or is there a technical reason? Or is it that vector is just a cooler name..?
I can't speak for anybody else, but I can for myself.
When I first read about std::deque, I thought it was cool enough that for a while, I treated it not only as the default container, but as nearly the only container I used.

Then somebody asked about why, and I expounded at length on its virtues and why it was the best container to use for practically everything, and how it was much more versatile than std::vector.

Fortunately, the guy questioning my decision on it was persuasive enough that I did some testing. Testing showed that in nearly every case, std::deque was slower than std::vector -- often by a substantial factor (e.g., around 2). In fact, of the code I'd written using std::deque, just replacing std::deque with std::vector gave a speedup in all but a handful of cases.

I have used std::deque in a few cases since then, but definitely don't treat it as the default any more. The simple fact is that in the usual implementation it's noticeably slower than std::vector for most purposes.

I should add, however, that I'm reasonably certain that with the right implementation, it could be nearly equivalent to std::vector in virtually all cases. Most use a representation that's undoubtedly great from an asymptotic viewpoint, but doesn't work out quite so wonderfully (for many purposes) in the real world.
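For what it's worth, a rough timing sketch along the lines of the testing described above might look like the following. The workload and sizes here are illustrative choices of mine, not the original tests, and a real benchmark would repeat runs, vary the operations, and guard against the compiler optimizing work away.

```cpp
#include <chrono>
#include <cstddef>
#include <deque>
#include <iostream>
#include <vector>

// Push n integers into a container and return the elapsed time in milliseconds.
// Only a rough illustration of one operation (growth via push_back).
template <typename Container>
double time_push_back(std::size_t n) {
    auto start = std::chrono::steady_clock::now();
    Container c;
    for (std::size_t i = 0; i < n; ++i)
        c.push_back(static_cast<typename Container::value_type>(i));
    auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(stop - start).count();
}

int main() {
    constexpr std::size_t n = 10'000'000;
    std::cout << "vector: " << time_push_back<std::vector<int>>(n) << " ms\n";
    std::cout << "deque:  " << time_push_back<std::deque<int>>(n)  << " ms\n";
}
```

The numbers will vary by standard library implementation, element type, and access pattern, so measure with your own workload before drawing conclusions.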
std::vector is very well understood, simple and is compatible with C (both in terms of the memory layout, and in terms of using pointers as iterators).

For some operations it is also more efficient than std::deque. Accessing elements by index is one example.

For a given task, it makes sense to use the simplest container that does the job well. In many cases that simplest container is std::vector.
People use std::vector over std::deque for a simple reason: they interface with a lot of C libraries, and those libraries have functions with parameters that require a pointer to contiguous storage, which std::deque doesn't (and can't) guarantee.

Another reason is that if you build code around std::deque and your requirements later change so that you have to support contiguous access, you will have a bit of refactoring to do.
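To make that concrete, here is a minimal sketch; legacy_average is a made-up stand-in for whatever C function demands contiguous storage. Because a deque's elements are not contiguous, the usual workaround is to copy into a vector first.

```cpp
#include <cstddef>
#include <deque>
#include <iostream>
#include <numeric>
#include <vector>

// Made-up stand-in for a legacy C function that requires a contiguous buffer.
double legacy_average(const double* data, std::size_t count) {
    return count ? std::accumulate(data, data + count, 0.0) / count : 0.0;
}

int main() {
    std::deque<double> samples = {1.0, 2.0, 3.0, 4.0};

    // No equivalent of vector::data() exists for deque, so copy the elements
    // into a vector to get a contiguous buffer the C API can accept.
    std::vector<double> contiguous(samples.begin(), samples.end());
    std::cout << legacy_average(contiguous.data(), contiguous.size()) << '\n';
}
```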
I am compelled to mention that there are libraries out there whose authors claim to have created more efficient vector implementations in order to overcome inefficiencies when capacity is increased.
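For completeness, the standard-library way to blunt that reallocation cost is the vector::reserve call the question already mentions; a minimal sketch:

```cpp
#include <cstddef>
#include <vector>

int main() {
    constexpr std::size_t n = 1'000'000;

    std::vector<int> v;
    v.reserve(n);  // One up-front allocation instead of repeated growth.

    for (std::size_t i = 0; i < n; ++i)
        v.push_back(static_cast<int>(i));  // No reallocation until size exceeds n.
}
```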