I read a blog post about vector "memory leaks". Given a vector, after inserting a large number of elements, deleting many of them (or even deleting all of them with clear()) does not change the capacity of the container; it still occupies the same memory. To avoid this, we should find a way to shrink the container's capacity so that it matches the data it currently holds.
So I wrote a test:
//code1
//why is this order best?
//memory 316
vector<string> vNum;
for (int i = 0; i < 100000; i++)
{
    vNum.push_back("asfdsafadfsfsafsafsafdsfsdaf");
}
//memory 22732
vNum.clear();
//memory 4904
vector<string>(vNum).swap(vNum);
//memory 612
Then I switched the order:
//code2
//memory 308
vector<string> vNum;
for (int i = 0; i < 100000; i++)
{
    vNum.push_back("asfdsafadfsfsafsafsafdsfsdaf");
}
//memory 23284
vector<string>(vNum).swap(vNum);
//memory 23324
vNum.clear();
//memory 4396
</vector>
Using either call on its own also seemed ineffective. Why does the order in code1 release the memory while the order in code2 does not?
I have to guess: are the "memory" comments based on numbers that your operating system reports when asked about your application's memory usage?
If so, those numbers do not directly reflect your application's actual memory usage. They show what the OS has reserved for your application. As long as no other application needs the space, the OS has no reason to shrink the memory it has reserved for yours.