Herb Sutter's C++ Coding Standards says to avoid premature optimization and premature pessimization. But to me both sound like the same thing, so I'm hoping for some help clarifying these two concepts and the difference between them. If you can come up with some examples, that will benefit others as well.
Here is a good explanation of premature optimization, but I couldn't find one for premature pessimization.
What he means by premature pessimisation, I think, is just the opposite of premature optimisation: a fundamental disregard of which data structures and algorithms to use.
Premature optimisation is often concerned with minute details of algorithms that can well be tweaked later and don’t need to be paid attention to at the beginning.
Premature pessimisation, by contrast, concerns the high-level design of the code's architecture: an inherently inefficient interface for your library, for instance, cannot be fixed later by optimising, since the public interface is pretty much cast in stone.
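A minimal sketch of what such a baked-in pessimisation might look like (the Image type and the blur function are made up for illustration):

```cpp
#include <vector>

struct Image {
    std::vector<unsigned char> pixels;  // potentially megabytes of data
};

// Premature pessimisation: the public interface takes its argument by value,
// so every call copies all of the pixel data. Once client code depends on
// this signature, it is effectively cast in stone.
void blur(Image img);

// An equally readable declaration that avoids the copy would be:
// void blur(const Image& img);
```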
What Herb means is that when you are faced with two equally readable options, always choose the most efficient one.
Using std::vector::reserve() or the best standard container or algorithm is not premature optimization. However, not using them would be premature pessimisation.
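A minimal illustration of the reserve() point (the function itself is made up):

```cpp
#include <cstddef>
#include <vector>

std::vector<int> make_squares(std::size_t n) {
    std::vector<int> v;
    v.reserve(n);  // one allocation up front; no less readable than omitting it
    for (std::size_t i = 0; i < n; ++i)
        v.push_back(static_cast<int>(i * i));  // no reallocations while growing
    return v;
}
```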
Premature optimization is when you sacrifice readability for the sake of some "optimization" that might not even be worth it. Use a profiler to find out whether it is.
There are both small-scale and large-scale choices to be made when programming.
Pessimisation is when you write code in a way that "prevents the compiler from doing a good job". A typical example is not placing a function where it can be inlined, when the function is really small and simple (a getter or setter, for example). This can make the function take 10x the time it should, and it's such a simple thing to get right.
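For example (a made-up class; whether the compiler actually inlines the call is its decision, but defining the getter in the header at least makes inlining possible without link-time optimization):

```cpp
// point.h
class Point {
public:
    // Defined in the header: the body is visible at every call site, so the
    // compiler can inline it.
    int x() const { return x_; }

    // If this were only declared here and defined in point.cpp, an ordinary
    // build could not inline the call (short of link-time optimization).
    int y() const;

private:
    int x_ = 0;
    int y_ = 0;
};
```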
A pessimisation that I've found a few times on this site is to use "a /= 2;" when "a >>= 1;" is equally suitable. If we know that a is not negative, then shift right and divide have the same effect; but even when the compiler does optimise the divide, it nearly always produces extra code to cope with the "it may be negative" situation, and that extra code can be a real performance hit in some cases.
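A small illustration of the difference (exact code generation depends on the compiler and target; the example values assume the usual two's-complement behaviour):

```cpp
// For a signed value that may be negative, the compiler cannot turn the
// division into a plain shift: integer division truncates toward zero while
// an arithmetic shift rounds toward negative infinity (e.g. -3 / 2 == -1,
// but -3 >> 1 == -2), so it has to emit extra fix-up code.
int halve_signed(int a) { return a / 2; }

// For an unsigned (known non-negative) value, divide and shift really are
// the same operation, so either spelling compiles to a single shift.
unsigned halve_unsigned(unsigned a) { return a / 2; }
unsigned shift_unsigned(unsigned a) { return a >> 1; }
```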
Premature optimisation is when you unroll loops or otherwise make the code more complicated simply because you don't trust the compiler to do a good job - typically with no evidence that it actually won't.
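For instance (a made-up example; at -O2 or -O3 a modern compiler will typically unroll or vectorise the simple version on its own):

```cpp
#include <cstddef>

// Straightforward version: readable, and the optimizer is free to unroll or
// vectorise it as it sees fit.
int sum(const int* data, std::size_t n) {
    int total = 0;
    for (std::size_t i = 0; i < n; ++i)
        total += data[i];
    return total;
}

// Hand-unrolled version: harder to read and to get right, and usually no
// faster than what the compiler would have produced from the loop above.
int sum_unrolled(const int* data, std::size_t n) {
    int total = 0;
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4)
        total += data[i] + data[i + 1] + data[i + 2] + data[i + 3];
    for (; i < n; ++i)
        total += data[i];
    return total;
}
```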
Another example would be not using std::vector, but your own expandable array, because "vector is too slow" - without even having tested the code using std::vector.