Do you think it's worth trading off some performance for code quality and maintainability? I remember a post by Jeff Atwood stating that hardware is cheap and developers are not. I think I'd like to change that to "Hardware is cheap, time is not."
I've noticed with an MVC project I've been working on lately that sometimes I lose DAYS just trying to squeeze a little extra performance out of my app, and I'm starting to think it's just not worth it. I've also found myself having trouble with designing an ASP.NET MVC application. I love IQueryable because it lets me keep appending to the query, which makes for fluent code (roughly the sketch below). But being able to do that seems to push more responsibility onto the controller/BLL.
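A trimmed-down example of what I mean (Product, AppDbContext and ProductsController are just placeholder names, using a plain Entity Framework model):

```csharp
using System.Data.Entity;   // Entity Framework 6
using System.Linq;
using System.Web.Mvc;       // ASP.NET MVC 5

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public class AppDbContext : DbContext
{
    public DbSet<Product> Products { get; set; }
}

public class ProductsController : Controller
{
    private readonly AppDbContext _db = new AppDbContext();

    public ActionResult Index(decimal? minPrice)
    {
        // IQueryable lets me keep appending to the query before it is
        // translated to SQL, which makes for fluent controller code...
        IQueryable<Product> products = _db.Products;

        if (minPrice.HasValue)
            products = products.Where(p => p.Price >= minPrice.Value);

        // ...but it also pulls query logic (arguably a BLL concern)
        // up into the controller.
        return View(products.OrderBy(p => p.Name).ToList());
    }
}
```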
So what do you think? In the case of web applications, would you be OK with trading off some performance for maintainable/cleaner code? Do you think it's worth prematurely trying to optimize everything you can? Because, as we've seen, you cannot predict all requirements.
Maintainability relates to the size, consistency, structure, and complexity of the codebase, and ensuring maintainable source code depends on a number of factors, such as testability and understandability. You can't use a single metric to ensure maintainability.
Correctness is the most important thing. If your code is not correct, then it does not matter how readable or performant it is. There are some products where the whole system, or part of it, revolves around performance; in those cases, your product cannot be considered correct if it does not meet its expected performance.
Donald Knuth (who credited the sentiment to Tony Hoare) famously wrote, "We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil."
The first part of the quote has been all but forgotten (it doesn't roll off the tongue as easily), and thus many inexperienced engineers don't take performance into consideration during the design phase of a software project. This is almost always a fatal mistake, as later on a badly designed application is very difficult to optimise due to fundamental design flaws. At the same time, there is no point trying to save CPU cycles by using clever tricks when the performance bottlenecks aren't known yet.
As to your question, I think an application that is properly designed to cope with its particular performance requirements won't need to be coded in an unmaintainable or "unclean" way. It's only once the performance bottlenecks are actually discovered (e.g. you find your application spends 90% of its time in 10% of the code) that you might want to apply optimisation tricks, sparingly and in small, isolated parts of the code, so that the rest remains maintainable and easy to understand.
The great thing about many Web applications is that performance can be drastically improved using various caching techniques. As you control the server environment (and, like you say, hardware is cheap), you can make sure you cache the hell out of the commonly used parts of your Web app. This doesn't really make for unmaintainable code if you put the caching behind an abstraction layer. Facebook is a good example of a Web application that famously exploits caching (memcached) to its advantage.
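As a minimal sketch of such an abstraction layer (the ICache and InMemoryCache names are made up for illustration, and it uses .NET's System.Runtime.Caching.MemoryCache rather than memcached):

```csharp
using System;
using System.Runtime.Caching;

public interface ICache
{
    T GetOrAdd<T>(string key, Func<T> factory, TimeSpan lifetime);
}

public class InMemoryCache : ICache
{
    private readonly MemoryCache _cache = MemoryCache.Default;

    public T GetOrAdd<T>(string key, Func<T> factory, TimeSpan lifetime)
    {
        // Return the cached value if present; otherwise build it once
        // and store it for the given lifetime.
        var cached = _cache.Get(key);
        if (cached != null)
            return (T)cached;

        var value = factory();
        _cache.Set(key, value, DateTimeOffset.Now.Add(lifetime));
        return value;
    }
}

// Usage from a controller or service (GetFeaturedProducts is hypothetical):
//   var products = _cache.GetOrAdd("products:featured",
//       () => _repository.GetFeaturedProducts(), TimeSpan.FromMinutes(5));
```

Because callers only see ICache, swapping the in-memory store for memcached or Redis later touches the implementation, not the calling code.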