I was curious how often other software developers reevaluate their development environments and tools. I used to work at a large corporation with rigid toolsets that everyone hated but could do nothing about, so nobody ever really updated their development environment.
Now that I'm at my own start-up, I find I could spend endless time evaluating new tools and development environments, but I really shouldn't and can't afford to. I've committed to spending one day a month looking at new development tools and trying them out to see whether they're worth switching to.
How often do you try out new IDEs, editors, bug tracking tools, or debuggers? And how often do you update to newer versions of your current ones?
It's an ongoing process, but I don't make major changes more often than every two years or so. A major change takes too much time, and the tradeoff generally isn't worth it. A major change might be defined as switching the whole target or compiler architecture and toolchain for an existing project.
Note that major changes can occur between projects: a new project can settle on a completely different architecture and toolchain at no significant cost. But take care not to go too bleeding-edge here; an evaluation process is needed to avoid selecting a setup that won't support the project later as it grows in complexity.
For minor changes, though, I simply upgrade my tools and environment as I find the opportunity and reason to do so.
-Adam
For me, upgrades are event-driven, not timer-driven. I keep my ear to the ground for new tools (libraries, IDEs, CASE tools, etc.) and evaluate them as they show up on my radar.
Working with Microsoft technologies, I move to the newest version if there's no compelling reason holding me back. With OSS, I use what I know unless there's something compelling pushing me forward.