Given the following code:
while(is_running)
{
std::vector<buffer> buffers;
// fill buffers
// use buffers
}
Do modern compilers perform the following transformation?
std::vector<buffer> buffers;
while(is_running)
{
// fill buffers
// use buffers
buffers.clear();
}
The only way to know with certainty would be to test, but I would be rather surprised to see an optimizer carry out this optimization.
To even begin to carry out this optimization, the compiler would have to either 1) know enough about the internals of the functions involved to "realize" (for example) that operator new and operator delete are basically mirror images of each other, or 2) generate all the code for all the functions inline (all the way down to the invocations of operator new and operator delete), and have enough intelligence to deduce the same conclusion from the code.
I can barely imagine the first, but don't recall having ever seen it. Given the complexities of a typical heap manager, the second strikes me as truly unbelievable.
Bottom line: I've been surprised before, and I'm sure I will be again -- but this would be a bigger surprise than most.