
"As a rule of thumb, make all your methods virtual" in C++ - sound advice?

Tags: c++, virtual

I just happened upon the statement in the title. The full quote is:

As a rule of thumb, make all your methods virtual (including the destructor, but not constructors) to avoid problems associated with omission of the virtual keyword.

I found this in the Wrox book Professional C++. You can google it to check.

Is there anything to it? I would have thought that you'd only provide select extension points, not extensibility by default; for instance, a 2001 article by Herb Sutter argues for exactly that. Has anything changed dramatically since then to make the opposite the ruling norm? (Note that I'm a C++ noob, so I haven't been following the discussion for the last decade.)

asked Mar 12 '12 by Lumi

3 Answers

Is there anything to it?

The advice is BAD; there is no question about it. Reading something like that would be enough to make me stay away from the book and its author.

You see, the virtual keyword indicates "you can or should override this method; it was designed for that".

For any non-trivial task, I cannot imagine a reasonable system of classes that would allow the user (i.e., another programmer) to override every single method of every derived class. It is normal to have a base abstract class with only virtual methods. However, once you start writing derived classes, there's no reason to slap virtual onto everything: some methods simply don't need to be extensible.
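For illustration, a minimal sketch (the class names are invented for this example) of a hierarchy that exposes exactly one extension point and keeps the rest non-virtual:

    #include <iostream>

    // Abstract base: only the intended extension point is virtual.
    class Shape {
    public:
        virtual ~Shape() = default;      // virtual destructor: deleting via Shape* is safe
        virtual double area() const = 0; // designed to be overridden
        void printArea() const {         // non-virtual: fixed behavior, not an extension point
            std::cout << "area = " << area() << '\n';
        }
    };

    class Circle : public Shape {
    public:
        explicit Circle(double r) : radius_(r) {}
        double area() const override { return 3.141592653589793 * radius_ * radius_; }
    private:
        double radius_;                  // implementation detail, no reason to virtualize access
    };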

Making everything virtual means that at any point in the code, no matter which method is called, you can never be sure the class will do what you want, because somebody could have overridden your method, breaking it in the process (according to Murphy's Law, it will happen). This makes your code unreliable and hard to maintain.

Basically, by following this advice you sacrifice code readability and reliability in exchange for avoiding a fairly uncommon typo. In my opinion, it is not worth it.

Another subtle trap is the way virtual methods behave in constructors: during base-class construction, a virtual call dispatches to the base version, not to the derived override.
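A minimal sketch of that constructor pitfall (class names invented for the example):

    #include <iostream>

    class Base {
    public:
        Base() {
            // While Base's constructor runs, the object is still a Base,
            // so this call dispatches to Base::init, not Derived::init.
            init();
        }
        virtual ~Base() = default;
        virtual void init() { std::cout << "Base::init\n"; }
    };

    class Derived : public Base {
    public:
        void init() override { std::cout << "Derived::init\n"; }
    };

    int main() {
        Derived d;   // prints "Base::init" - the override is silently not used
    }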

In comparison, a non-virtual method guarantees that no matter what happens, at this point in the code, it will always work as you expect (not counting the bugs you haven't discovered yet). That is, somebody else can't replace your method with a broken alternative.

The advice reminds me of a common error some newbie programmers tend to make: instead of developing a simple solution that fixes the problem, they get distracted and try to make the code universal and extensible. As a result, the project takes longer to finish, or never gets finished at all, because a universal solution for every possible scenario takes more effort and development time than a localized solution to the problem at hand.

Instead of following this "virtual" advice, I'd recommend sticking with Murphy's Law and the KISS principle. They have worked well for me. However, they are not guaranteed to work well for everybody else.

answered Oct 05 '22 by SigTerm

I don't agree with the principle.

In the past, some programmers avoided virtual for performance reasons. That concern is still somewhat valid, but far less of a problem on today's hardware. (Keep in mind that most other languages incur similar penalties these days. For instance, the 400 MHz iPhone 2G ran Objective-C, which performs dynamic dispatch on every method call.)

I think you should only use virtual on methods where it is useful and reasonable for a subclass to override them. To me, virtual serves as a hint to other programmers (or your future self): "this is a place where subclasses can sensibly customize behavior." If replacing the method in a subclass would be confusing or weird to implement, don't mark it virtual.
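One way to make such an extension point explicit is the Non-Virtual Interface (NVI) idiom, in the spirit of the Sutter article mentioned in the question. A minimal sketch (class names invented for this example):

    #include <iostream>
    #include <string>

    // The public API is non-virtual; the single customization point is a
    // protected virtual with a deliberately narrow job.
    class Logger {
    public:
        virtual ~Logger() = default;
        void log(const std::string& msg) {  // stable, non-overridable entry point
            doLog("[log] " + msg);          // pre/post processing stays under base-class control
        }
    protected:
        virtual void doLog(const std::string& line) = 0;  // the one intended hook
    };

    class ConsoleLogger : public Logger {
    protected:
        void doLog(const std::string& line) override {
            std::cout << line << '\n';
        }
    };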

Also, for simple setters and getters, virtual is probably a bad idea, because it inhibits inlining: a call through a base pointer must go through the vtable, so the compiler generally can't inline it.
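A sketch of what that looks like (hypothetical class):

    class Point {
    public:
        int x() const { return x_; }          // non-virtual getter: trivially inlined
        virtual int y() const { return y_; }  // virtual getter: a call through a Point*
                                              // goes via the vtable and usually can't be
                                              // inlined (unless the compiler devirtualizes it)
        virtual ~Point() = default;
    private:
        int x_ = 0;
        int y_ = 0;
    };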

answered Oct 05 '22 by StilesCrisis


There will be a tiny loss in performance, and a few bytes of memory are wasted per object (every object of a polymorphic class carries a hidden vtable pointer).
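A quick sketch of the memory cost (exact sizes vary by platform and compiler):

    #include <iostream>

    struct Plain    { int a; };                                 // no virtual functions: no vptr
    struct WithVtbl { int a; virtual ~WithVtbl() = default; };  // gains a hidden vtable pointer

    int main() {
        // On a typical 64-bit platform this prints 4 and 16:
        // the int, plus a hidden 8-byte vptr, padded to alignment.
        std::cout << sizeof(Plain) << ' ' << sizeof(WithVtbl) << '\n';
    }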

The real problem is that it makes the code less maintainable, because you are saying something about the function that isn't true. That can cause a lot of confusion.

answered Oct 05 '22 by Steve Wellens