When developing new software I normally get stuck on the redundancy vs. dependencies problem. That is, either accept a 3rd party library and take on a big dependency, or code it myself, duplicating all the effort but reducing the dependencies.
Recently I've been trying to come up with a metric for weighing up redundancy against dependencies in the code. For the most part, I've concluded that reducing redundancy increases the dependencies in your code, and reducing the dependencies in your code increases redundancy. The two very much counter each other out.
So my question is: what's a good metric you've used in the past, or still use, to weigh up dependencies against redundancy in your code?
One thing I think is really important: if you choose the dependencies route, you need the toolset in place so you can quickly examine all the routines and functions that use a specified function. Without that toolset, it seems like redundancy wins.
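To make the "examine all the routines that use a specified function" idea concrete, here is a minimal sketch of such a tool, assuming a plain-text search over a source tree is good enough (no real parsing, so matches inside comments and strings will show up too); the function name, file extensions, and source root are illustrative inputs, not anything from a specific project:

    #!/usr/bin/env python3
    """Rough "who calls this function?" helper: walks a source tree and
    prints every line that looks like a call to the given function."""
    import os
    import re
    import sys

    def find_call_sites(root, function_name, extensions=(".py", ".c", ".cpp", ".java")):
        """Yield (path, line_number, line) for lines that appear to call function_name."""
        pattern = re.compile(r"\b" + re.escape(function_name) + r"\s*\(")
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                if not name.endswith(extensions):
                    continue
                path = os.path.join(dirpath, name)
                try:
                    with open(path, encoding="utf-8", errors="ignore") as handle:
                        for lineno, line in enumerate(handle, start=1):
                            if pattern.search(line):
                                yield path, lineno, line.rstrip()
                except OSError:
                    continue  # unreadable file; skip it

    if __name__ == "__main__":
        root_dir, func = sys.argv[1], sys.argv[2]
        for path, lineno, line in find_call_sites(root_dir, func):
            print(f"{path}:{lineno}: {line}")

In practice an IDE's "find usages" or your language's index-based tooling does this better; the point is just that some such tool needs to exist before the dependencies route is workable.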
P.S. Following on from an article:
Article
I would definitely recommend reading Joel's essay on this:
"In Defense of Not-Invented-Here Syndrome"
For a dependency, the best metric I can think of would be "would the world grind to a halt if this disappeared?" For example, if C++'s STL magically went away, tons of programs would stop working. If .NET or Java disappeared, our economy would probably take a beating due to the number of products that stopped working...
I would think in those terms. Of course, many things are a shade of gray between "end of world" and "meh" if they disappeared. The closer the dependency is to potentially causing the end of the world if it disappeared, the more likely it is to be stable, have an active user base, have its issues well known, etc. The more users the better.
It's analogous to being a small consumer of some hardware component. Sometimes hardware goes obsolete. Why? Because no one uses it. If you're a small company and need to get a component for your product, you will pick what is commonly available--what the "big players" order in large quantities--hoping this means that (a) the component won't disappear, (b) the problems with the component are well known, (c) there's a large, informed user base and (d) it might cost less and be more readily available.
Sometimes though, you need that special flux-capacitor part and you take the risk that the company selling it to you might not care to keep producing flux-capacitors if you are only ordering 20 a year, and no one seems to care :). In this case, it might be worth developing your own flux capacitor instead of relying on that unreliable Doc Brown Inc. Just don't buy Plutonium from the Libyans.
If you've dealt in manufacturing something (especially when you're making far fewer than millions of them per year), you've had to deal with this problem. Software dependencies, I believe, need to be understood in very similar terms.
How to quantify this into a real metric? Roughly count how many people depend on something. If it's high, the risk of the dependency hurting you is much lower, and you can decide at what point the risk is too much.
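As a rough sketch of that heuristic, you could bucket candidate dependencies by an estimated user count; the library names, counts, and thresholds below are made-up illustrations, not data from any real package index:

    """Toy version of the "how many people depend on it?" metric."""

    # Hypothetical estimates of how many projects/users depend on each candidate.
    ESTIMATED_USERS = {
        "cpp-stl": 10_000_000,
        "popular-json-lib": 250_000,
        "flux-capacitor-sdk": 20,
    }

    def dependency_risk(user_count):
        """Map a rough user count to a risk bucket: the more users,
        the lower the risk that the dependency disappears or rots under you."""
        if user_count >= 1_000_000:
            return "end of the world if it vanished: very safe to depend on"
        if user_count >= 10_000:
            return "healthy user base: acceptable risk"
        return "niche: consider writing it yourself (redundancy wins)"

    for name, users in ESTIMATED_USERS.items():
        print(f"{name:20} {users:>10,} users -> {dependency_risk(users)}")

Where exactly you put the thresholds is a judgment call; the useful part is forcing yourself to estimate the number at all before taking the dependency.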
I hadn't really considered the criteria I use to decide whether or not to employ a third-party library before, but having thought about it, it's probably the following, in no particular order: