In his insightful paper, Error and Exception Handling, Dave Abrahams says:
Make your exception class immune to double-destruction if possible. Unfortunately, several popular compilers occasionally cause exception objects to be destroyed twice. If you can arrange for that to be harmless (e.g. by zeroing deleted pointers) your code will be more robust.
I am not able to understand this particular guideline. Can someone explain it?
Like @Tony said, this guideline was meant as a protection against compiler bugs. It dates back to 2001 or so, when exception support was probably still a bit unstable. Since then, I think/hope most compilers have fixed this bug, so the guideline might not be very relevant anymore.
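For concreteness, here is a minimal sketch (class and member names are hypothetical) of what "zeroing deleted pointers" could look like: the destructor nulls the pointer it deletes, so if a buggy compiler destroys the thrown object a second time, the extra delete is a no-op.

```cpp
#include <cstring>
#include <exception>

// Hypothetical exception class following the "zero deleted pointers" advice.
class parse_error : public std::exception {
public:
    explicit parse_error(const char* msg) : msg_(copy(msg)) {}

    // Deep-copy the message so copies don't share (and double-delete) the buffer.
    parse_error(const parse_error& other) : msg_(copy(other.msg_)) {}

    const char* what() const noexcept override {
        return msg_ ? msg_ : "parse_error";
    }

    ~parse_error() override {
        delete[] msg_;
        msg_ = nullptr;  // a second (erroneous) destruction now deletes a null pointer, which is harmless
    }

private:
    static char* copy(const char* s) {
        if (!s) return nullptr;
        char* p = new char[std::strlen(s) + 1];
        std::strcpy(p, s);
        return p;
    }

    char* msg_;
};
```

This is only an illustration of the technique the quote mentions, not a recommendation: as noted below, double destruction is undefined behavior regardless, so such defensive code can only make a buggy implementation less likely to crash, not make the program correct.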
FWIW, this guideline has been eliminated from the CERT coding practices. In the discussion on this page, an interesting point is raised: destroying an object twice is UB anyway, so whatever you do to handle that in your classes will never make your program fully predictable.
However, if you really want your code to be portable across compilers (including old versions), you should probably take all these little glitches into account. For instance, Boost goes to great lengths to work around compiler bugs; they could simply write standard-compliant code and defer the responsibility for failures to implementations, but that would hinder the adoption of their libraries.
Whether you need to take the same care when writing your own code depends on your requirements, and basically boils down to this question: is supporting dozens of compilers really worth the amount of work it implies?