 

Bleeding edge vs field-tested technology: how do you strike a balance? [closed]

I have been pondering this for some time. How do you pick a technology (I am not talking about Java vs .NET vs PHP) when you are planning a new project or maintaining an existing project in an organization?

Arguments for picking the latest technology

  1. It might overcome some of the limitations of the existing technology (think NoSQL vs RDBMS when it comes to scalability). Sometimes the latest technology is backward compatible, so you gain the new features without breaking the old functionality
  2. It may give a better user experience (maybe HTML5 for video, just a thought)
  3. It will cut down development time/cost and make maintenance of the code base relatively easy

Arguments for picking field-tested technology / against picking a bleeding-edge technology

  1. It has not stood the test of time. There can be unforeseen problems, and convoluted workarounds might lead to more problems during the maintenance phase, leaving the application a white elephant
  2. Standards might not yet be in place. Standards might change, and significant rework might be needed to make the project adhere to them. Choosing the field-tested technology avoids this effort
  3. The new technology might not be supported by the organization. Supporting a new (or, for that matter, a different) technology would require additional resources
  4. It might be hard to find qualified people with bleeding-edge technology experience

From a developer's perspective, I do not see a reason not to get your hands dirty with some new technology (in your spare time), but you might be limited to open source/freeware/developer editions

From an organization's perspective, it looks like a double-edged sword. Sit too long on a "field tested" technology and good people might move away (not to mention that there will always be people who prefer familiar technology and refuse to update their knowledge). Try an unconventional approach and you risk overrunning the budget/schedule, not to mention the unforeseen risks

TL;DR

Bottom line: when do you consider a technology mature enough that it can be adopted by an organization?

asked Jan 21 '23 by ram


1 Answer

Most likely you work with a team of people, and this should be taken into consideration as well. Some ways to test/evaluate technology maturity:

  1. Are your team and management receptive to using new technology at all? This might be your biggest barrier. If you get the sense that they aren't receptive, you can try big formal presentations to convince them... or just go try it out (see below).

  2. Do some Googling around for the problems people have with it. If you don't find much, that silence is exactly what you'll run into when you hit problems yourself: there will be little community knowledge to lean on.

  3. Find a new, small project with very low risk (e.g. something just you or a couple of people would use) and apply the new technology in skunkworks fashion to see how it pans out.

  4. Try to find the most mature of the immature. For instance, if you're thinking about a NoSQL-type data store: all of the NoSQL options are immature compared to an RDBMS like Oracle that has been around for decades, so look for the most mature of those options that has support available, either professionally or via support groups.

  5. The easiest project to start with is re-writing an existing piece of software. You already have your requirements: make it work just like the original. Pick a small piece of the software to re-write in the new technology, preferably something you can hammer with unit/load testing to see how it stands up (see the sketch below). I'm not advocating re-writing an entire application to prove it out, just a small, measurable chunk.
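
To make the "hammer it with unit/load testing" part concrete, here is a minimal sketch of the kind of throwaway harness I mean. It's Python only because that's quick to write; save_order is a hypothetical stand-in for whatever small piece you rewrote on top of the new technology, and the numbers it prints are illustrative, not a benchmark of any real product.

    # Minimal load-test sketch for a small rewritten component.
    # Assumption: save_order() wraps the new technology's client call
    # (e.g. writing one record to the new data store).
    import statistics
    import time
    from concurrent.futures import ThreadPoolExecutor


    def save_order(order_id: int) -> None:
        """Placeholder for the rewritten code path under evaluation."""
        time.sleep(0.005)  # simulate the call; replace with the real client call


    def timed_call(order_id: int) -> float:
        """Return the wall-clock duration of a single call."""
        start = time.perf_counter()
        save_order(order_id)
        return time.perf_counter() - start


    def run_load_test(requests: int = 1000, workers: int = 20) -> None:
        """Fire `requests` calls across `workers` threads and report latencies."""
        with ThreadPoolExecutor(max_workers=workers) as pool:
            latencies = sorted(pool.map(timed_call, range(requests)))
        p95 = latencies[int(len(latencies) * 0.95)]
        print(f"median={statistics.median(latencies) * 1000:.1f}ms "
              f"p95={p95 * 1000:.1f}ms max={latencies[-1] * 1000:.1f}ms")


    if __name__ == "__main__":
        run_load_test()

Run the same harness against the old implementation and the rewritten one with the same inputs; if the new technology can't match the old numbers even on a toy workload, you've learned that before committing a real project to it.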

answered May 16 '23 by MattS