Given a short sprint, is it ever acceptable to forgo TDD in order to "get things done" within the sprint?
For example, a given piece of work might need, say, a third of the sprint to design the object model around an existing implementation. In that scenario you could well end up with implemented code halfway through the sprint but no tests, since writing unit tests during this "design" stage would add significant effort and those tests would likely be thrown away several times before the final design is settled on.
You might then spend a day or two in the second week adding unit and integration tests after the fact.
Is this acceptable?
I would say that it's almost always acceptable to bypass a process if it means you complete a project that you otherwise wouldn't be able to complete. Processes should be there to help you; if a process is not helping, don't use it (after discussing it with the rest of the team first, of course).
However, bypassing TDD can easily have the exact opposite effect: you could write buggy code that needs a rewrite just before you ship, because final testing turns up critical problems you should have spotted sooner. So think carefully before doing so.
If you do skip unit testing to get something out the door, and you're lucky enough that it works, treat it as technical debt to be paid back as soon as possible.
If you can accurately code something without tests, why use unit tests at all? Unit tests should either help you write code faster or help you write better code. If they do neither, don't use them.
A two-week iteration isn't short for a lot of people; many of us are doing one-week iterations. Kent Beck is even encouraging daily deployments, and there are advantages to cleaning up the dev process so that it can be that responsive.
NEVER reduce TDD quality to get stuff out. It's so much harder to clean up later, and you just end up teaching the customer that they can pressure you into quick, dirty, hacked releases. They don't see the crap code that gets produced as a result, and they don't have to maintain it. If somebody tried to get me to do that I'd quit... and I have refused to work in places that "don't have time to test properly". That's not an excuse that works.
NOTE: When I write about TDD, I'm including functional tests. These are important because they should exercise scenarios that make sense to the customer in terms of recognizable user stories. I normally start work on a story with the functional test because it's the most important test: "test that the customer gets what they described". All the other tests might be up for negotiation, but when I'm leading a team I expect at least one functional test per story or it's (as the Scrum people say) "not done!" ;-)
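To illustrate, here's a minimal sketch of what a story-level functional test could look like, using Python and pytest; the `Checkout` and `Catalog` classes, the prices, and the story itself are purely illustrative assumptions, not anything from the question:

```python
# A story-level functional test: "customer gets what they described".
# Checkout, Catalog, and the prices are hypothetical, for illustration only.

from dataclasses import dataclass, field


@dataclass
class Catalog:
    prices: dict = field(default_factory=lambda: {"book": 10.00, "pen": 2.50})


@dataclass
class Checkout:
    catalog: Catalog
    items: list = field(default_factory=list)

    def add(self, sku: str) -> None:
        self.items.append(sku)

    def total(self) -> float:
        return sum(self.catalog.prices[sku] for sku in self.items)


def test_customer_can_buy_a_book_and_a_pen():
    # Story: "As a shopper I can add items to my basket and see the total."
    checkout = Checkout(Catalog())
    checkout.add("book")
    checkout.add("pen")
    assert checkout.total() == 12.50
```

The point isn't the implementation; it's that one test per story is written in the customer's terms, so "done" is verifiable.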
Don't think you can go in and add tests later - it's much more difficult to do that. (I have tried it both ways - believe me.) It really is cheaper to put tests in as you go, even if you have to refactor and rewrite, or throw some away as the system evolves.
You can't get quality code without having decent code coverage ALL the time.
Test coverage is the key phrase here: cover the stuff that could break, not zillions of meaningless tests. Write critical tests that cover the things you actually need to worry about.
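To make that concrete, here's a minimal sketch of the kind of focused test I mean, again in Python with pytest; `parse_quantity` and its behaviour are hypothetical examples, not anything from the original answer:

```python
# Hypothetical example: spend tests on the cases that could actually break
# (boundaries and bad input), not on restating the happy path over and over.

import pytest


def parse_quantity(text: str) -> int:
    """Parse a user-entered quantity; reject non-positive or non-numeric input."""
    value = int(text)  # raises ValueError for non-numeric input
    if value <= 0:
        raise ValueError("quantity must be positive")
    return value


def test_rejects_zero():
    with pytest.raises(ValueError):
        parse_quantity("0")


def test_rejects_garbage_input():
    with pytest.raises(ValueError):
        parse_quantity("ten")


def test_accepts_minimum_valid_quantity():
    assert parse_quantity("1") == 1
```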
If you can't get it out in time and that's a problem, you need to think about why.