My C++ algorithm produces data of unknown size (it detects particles in an image one by one, and I cannot know how many particles will be detected before the algorithm finishes). So I would like to allocate, say, an array with 10000 elements first, and then, if necessary, allocate another 10000 elements several times during processing.
Here is what I tried; it doesn't work:
#include <iostream>
using namespace std;

int main() {
    int n = 3;
    int m = 3;
    float *a = new float[3];
    a[0] = 0;
    a[1] = 1;
    a[2] = 2;
    float *b = a + 2;      // b points at a[2], inside a's block
    b = new float[3];      // b now points to a separate, unrelated block
    b[0] = 4;
    b[1] = 5;
    cout << a[3] << endl;  // a[3] is out of bounds for a: undefined behavior
}
As a result, I got minus infinity. Of course, I could handle this with separate arrays, or allocate a huge amount of memory once. But I need to pass the full array of detected data to a function afterwards, so in the end I want one big array.
Still, is there a way to increase the size of a dynamically allocated array? In the toy example above, I want to increase the number of elements in a by 3, so that it has 6 elements.
In Matlab it is absolutely possible. What about C++?
Thanks
You should just use std::vector instead of raw arrays. It is implemented to grow efficiently. You can change its size with resize, append to it with push_back, or insert a range (or various other things) with insert to grow it.
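For instance, a minimal sketch of your toy example with a vector (the reserved chunk size and the stored values are just placeholders for illustration):

#include <iostream>
#include <vector>

int main() {
    std::vector<float> a;   // starts empty, grows automatically
    a.reserve(10000);       // optional: pre-allocate room so early growth is free

    // append detected values one by one (placeholder values)
    a.push_back(0);
    a.push_back(1);
    a.push_back(2);

    a.resize(6);            // a now has 6 elements; the new ones are value-initialized to 0

    std::cout << a.size() << " elements, a[3] = " << a[3] << '\n';
    // a.data() gives a contiguous float* if a legacy function needs a raw array
}

This prints "6 elements, a[3] = 0", and a.data() lets you pass the whole buffer to a function expecting a plain pointer.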
Changing the size of a manually allocated array is not possible in C++. Using std::vector over raw arrays is a good idea in general, even if the size does not change. Some arguments are the automated, leak-proof memory management, the additional exception safety as well as the vector knowing its own size.
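If you really must stay with raw new[], the only option is to allocate a larger block, copy the old elements over, and free the old block; a sketch of that idea, with sizes matching the toy example rather than the actual detector code:

#include <algorithm>
#include <iostream>

int main() {
    float *a = new float[3]{0, 1, 2};

    // "grow" a from 3 to 6 elements: allocate, copy, free the old block, repoint
    float *bigger = new float[6]{};   // zero-initialized
    std::copy(a, a + 3, bigger);
    delete[] a;
    a = bigger;

    a[3] = 4;
    a[4] = 5;

    std::cout << a[3] << '\n';        // prints 4
    delete[] a;
}

This is essentially what std::vector does for you internally, minus the bookkeeping and exception safety, which is why the vector is preferred.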