I understand that programming to interfaces helps with loose coupling. However, is there a guideline that explains when it's most effective?
For example, I have a simple web application that collects data on employees, their training plans, and expenses, and computes each employee's expenses for the year, etc. This is a fairly simple application, and I could of course use an interface, but I wonder whether there would be any real benefit. I'd just be using it for the sake of using it.
It can always be argued that as the application grows in complexity and I pass objects around, it would make more sense to pass the type (interface) than an implementation. So, should I wait for the application to become complex, or use interfaces right away? I wonder whether following the "best practice" here would just turn out to be "this guy is overdoing things".
The reason to program to an interface is to insulate higher-level classes from changes in lower-level classes. If you don't anticipate a lower-level class changing at all, then it's reasonable not to program to an interface in that case. This article (PDF) elaborates on the idea, coining the term the Dependency-Inversion Principle.
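To make that concrete in terms of the expense application from the question, here is a minimal sketch (all class and method names are hypothetical, not from the original post): the high-level report class depends only on an interface, so replacing the storage layer never touches it.

```java
import java.util.List;

// Abstraction the high-level class depends on (Dependency-Inversion Principle).
interface ExpenseSource {
    List<Double> expensesFor(String employeeId);
}

// Low-level detail: one possible implementation, e.g. backed by a database.
class DatabaseExpenseSource implements ExpenseSource {
    @Override
    public List<Double> expensesFor(String employeeId) {
        // A real query would go here; hard-coded values for the sketch.
        return List.of(120.50, 45.00, 310.75);
    }
}

// High-level policy: knows nothing about where expenses come from.
class YearlyExpenseReport {
    private final ExpenseSource source;

    YearlyExpenseReport(ExpenseSource source) {
        this.source = source;
    }

    double totalFor(String employeeId) {
        return source.expensesFor(employeeId)
                     .stream()
                     .mapToDouble(Double::doubleValue)
                     .sum();
    }
}
```

If the data later moves from a database to, say, a REST service, you only write a new `ExpenseSource` implementation; `YearlyExpenseReport` stays untouched.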
If you ever get interested in Test-Driven Development, the need for "programming to an interface" really becomes apparent: if you want a class to be tested in isolation from its dependencies, you need to pass around interfaces instead of concrete objects.
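For instance, reusing the hypothetical `ExpenseSource` and `YearlyExpenseReport` from the answer above, a test can substitute an in-memory fake for the real dependency (this sketch assumes JUnit 5 on the classpath):

```java
import java.util.List;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class YearlyExpenseReportTest {

    // A fake implementation of the interface, so the test never
    // touches a database or the network.
    static class FakeExpenseSource implements ExpenseSource {
        @Override
        public List<Double> expensesFor(String employeeId) {
            return List.of(100.0, 200.0);
        }
    }

    @Test
    void totalsAllExpensesForAnEmployee() {
        YearlyExpenseReport report =
                new YearlyExpenseReport(new FakeExpenseSource());
        assertEquals(300.0, report.totalFor("emp-42"), 0.001);
    }
}
```

Because the class under test only knows about the interface, the test controls its inputs completely and runs fast and deterministically.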