When writing C++ code for an embedded system with limited CPU and memory resources, the common rule of thumb is to instantiate objects on the stack and avoid using the heap unless it is really necessary. Doing this has many well-known benefits, but with the emergence of the STL and folks recommending std::vector as an efficient data structure, does it violate the rule of thumb I mentioned, since the vector will be using the heap?
Example: in the old days, one would declare static arrays with known sizes that would satisfy the usage. Nowadays, one would just use vectors.
I'm not really comfortable with this transition, since there is always the possibility of the vector failing to allocate the required memory (reminder: this is for embedded systems with limited memory). Using arrays with known sizes on the stack guarantees that space is set aside for them at compile time.
Calling reserve() helps somewhat, but it happens at run time.
So, is this a cause for concern, or am I just being paranoid? Vectors are definitely much easier to use, but for an embedded environment they might not be a good idea?
Note: This is not about dynamic vs fixed arrays, but more about how the data is allocated in memory, which is a big deal for my environment. As an example, say the array can grow or shrink between 1 and 10 elements. Some folks would create an array on the stack that covers the maximum size and NULL-terminate it depending on the current size (see the sketch below). This way fragmentation is avoided, and allocation is guaranteed at compile time. However, switching to vector made the code much cleaner, but at the expense of using the heap, and potentially having to deal with exceptions if allocation fails. This is what I am concerned about.
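For reference, a minimal sketch of the old pattern described above (the capacity of 10 and the use of 0 as the sentinel are illustrative assumptions):
#include <cstddef>

int main()
{
    // Fixed capacity reserved on the stack; 0 acts as the sentinel that
    // marks the end of the currently used portion.
    int values[10 + 1] = { 0 };
    std::size_t used = 0;
    values[used++] = 42;  // "grow" by one element
    values[used] = 0;     // keep the sentinel after the last element
    return 0;
}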
If you need a "dynamic" array, then std::vector is the natural solution; in general it should be the default container for everything. But if you want an array whose size is fixed at compile time (like a C-style array is), wrapped in a nice C++ object, then std::array may be the better choice.
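A minimal illustration of the difference (the sizes here are arbitrary):
#include <array>
#include <vector>

int main()
{
    std::array<int, 10> fixed{};  // storage is part of the object itself, on the stack
    std::vector<int> dynamic;     // element storage comes from the heap at run time
    dynamic.push_back(42);        // a vector can grow; a std::array cannot
    return 0;
}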
Note that in C++ both a vector and an array are homogeneous: each stores elements of a single type. The real differences lie elsewhere. A vector supports amortized constant-time insertion at the end and can grow and shrink at run time, while both offer constant-time random access. In exchange for managing its storage and growing dynamically, a vector carries extra bookkeeping and usually some unused capacity, whereas a plain array is as memory-efficient as it gets.
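To make the growth cost concrete, here is a small sketch; the exact capacity sequence is implementation-defined, but each time capacity is exceeded the vector reallocates and moves its elements:
#include <iostream>
#include <vector>

int main()
{
    std::vector<int> v;
    for (int i = 0; i < 16; ++i)
    {
        v.push_back(i);  // triggers a reallocation whenever size would exceed capacity
        std::cout << "size=" << v.size()
                  << " capacity=" << v.capacity() << '\n';
    }
    return 0;
}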
A std::vector can never be faster than an array, since it has (a pointer to the first element of) an array as one of its data members. But the difference in run-time speed is slim and vanishes in any non-trivial program. One reason this myth persists is examples that compare raw arrays with misused std::vectors.
I believe that you have forgotten one very important property of the STL containers: allocators.
An STL container (whether a vector or otherwise) obtains all of its memory from its allocator (apart from the very basic stack footprint that you can check with sizeof). It is therefore perfectly suitable in embedded development to provide a dedicated allocator, for example one that draws on a statically pre-allocated memory area and fails in a controlled way once that area is exhausted.
With the advent of C++11, you may even use stateful allocators, so that a single allocator type can point to different pools of memory.
Therefore, the use of std::vector, or even std::set or std::map, is not incompatible with a pre-allocation strategy; do remember, though, that unlike std::vector, the other STL containers generally have some per-item overhead which must be taken into account when sizing the memory area they should draw upon.
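As a sketch of that idea (the allocator name, the pool size, and the bump-pointer strategy are all illustrative assumptions, not taken from any particular library): a minimal allocator that hands out memory from a statically reserved buffer and fails loudly once the pool is exhausted. Alignment handling and deallocation are deliberately simplified.
#include <cstddef>
#include <new>
#include <vector>

// Minimal "bump" allocator over a fixed static pool (sketch only:
// no reclamation, alignment handling simplified).
template <typename T>
struct PoolAllocator
{
    using value_type = T;

    static char buffer[4096];   // the pre-allocated pool (one per T)
    static std::size_t offset;  // current high-water mark

    PoolAllocator() = default;
    template <typename U> PoolAllocator(const PoolAllocator<U>&) {}

    T* allocate(std::size_t n)
    {
        std::size_t bytes = n * sizeof(T);
        if (offset + bytes > sizeof(buffer))
            throw std::bad_alloc();  // pool exhausted: fail loudly
        T* p = reinterpret_cast<T*>(buffer + offset);
        offset += bytes;
        return p;
    }

    void deallocate(T*, std::size_t) {}  // bump allocator never reclaims
};

template <typename T> char PoolAllocator<T>::buffer[4096];
template <typename T> std::size_t PoolAllocator<T>::offset = 0;

template <typename T, typename U>
bool operator==(const PoolAllocator<T>&, const PoolAllocator<U>&) { return true; }
template <typename T, typename U>
bool operator!=(const PoolAllocator<T>&, const PoolAllocator<U>&) { return false; }

int main()
{
    std::vector<int, PoolAllocator<int>> v;
    v.reserve(64);  // 256 bytes come from the static pool, not the heap
    v.push_back(1);
    return 0;
}
The point is only that the container's element storage never touches the global heap; a production allocator would also handle alignment, reuse, and an exhaustion policy.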
"Depends" may be an understatement here. By using reserve with vectors, you can efficiently allocate space for the data and prevent unnecessary copies. The vector data structure itself will boil down to a heap allocated array with a size if the compiler is good enough.
Also, note I said heap. If you would rather place the data on the stack, you're stuck with fixed-size arrays (including std::array).
How a vector is put together also depends heavily on the compiler. Older compilers may (big emphasis on may) be less efficient. This is why, for embedded systems, you really need to know the architecture and compiler before making sweeping generalizations about what is "good" and what is "bad" to use.
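For example, a single reserve() up front bounds the container to one heap allocation, assuming the estimate of 10 elements holds:
#include <vector>

int main()
{
    std::vector<int> v;
    v.reserve(10);       // one heap allocation, done once
    for (int i = 0; i < 10; ++i)
        v.push_back(i);  // no reallocations, no copies of existing elements
    return 0;
}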
std::vector should be used for dynamically resized arrays. If you know the size at compile time, you can use std::array instead.
If you only know the approximate size of the array: std::dynarray was proposed for C++14, but it was ultimately dropped from the standard (it was moved to a separate technical specification and never adopted). The idea was an array whose size is fixed at construction and cannot change during the object's lifetime; when a std::dynarray was used without an allocator, additional optimisations would have been possible, as operator new might not be called and stack-based allocation could be used.
Using arrays with known sizes on the stack guarantees that space is set aside for them at compile time.
Wrong assumption.
When you have a deep chain of calls (not even recursive, just a stack with many levels), you cannot be sure that there is enough stack space left to hold the objects that have to be created there.
There are some basic compile-time checks (at least when the size of your object exceeds what size_t can represent), but I do not even think the standard makes them mandatory.
A -- somewhat excessive -- example is here:
#include <iostream>

// BigObject<N> recursively contains every smaller BigObject, so a single
// BigObject<900> holds roughly the sum of k*k unsigned longs for k = 1..900,
// on the order of two gigabytes on a 64-bit platform.
template <unsigned long size>
struct BigObject
{
    unsigned long array[size * size];
    BigObject<size - 1> object;
};

template <>
struct BigObject<0>
{
    unsigned long array[1];
};

// Every call pushes another BigObject<900> onto the stack.
BigObject<900>& recurse(BigObject<900>& object1, unsigned long n)
{
    if (n == 0)
    {
        return object1;
    }
    BigObject<900> object;
    return recurse(object, n - 1); // decrement so the recursion terminates
}

int main(int argc, char const *argv[])
{
    BigObject<900> object;
    recurse(object, 20);
    std::cout << sizeof(object) << std::endl;
    return 0;
}
http://ideone.com/pkHo43
It does crash, and it is not such a special case: I hit the same problem in a 32-bit desktop application (so without your memory constraints) by allocating a too-large array on the stack.
A std::vector is just a pointer and two size_t members when compiled with sufficient optimization flags. The memory for the size_t members is wasted when you know the size of the vector in advance and it will never change.
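You can check the footprint yourself; the exact numbers are implementation-defined, but these are typical on a 64-bit platform:
#include <array>
#include <iostream>
#include <vector>

int main()
{
    std::cout << sizeof(std::vector<int>) << '\n';   // commonly 24: pointer plus two size-like members
    std::cout << sizeof(std::array<int, 8>) << '\n'; // typically 32: just the elements, no bookkeeping
    return 0;
}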
I would say:
- std::array when the size is known at compile time,
- std::vector when it is not.
And as mentioned in the comments, you can use most of the functionality of e.g. <algorithm> on fixed-size arrays by using pointers as the begin/end iterators.
For example:
#include <algorithm>

int array[8] = { 1, 2, 3, 4, 5, 6, 7, 8 };
std::random_shuffle(array, array + 8); // deprecated in C++14; std::shuffle is the modern replacement