
How much additional time does unit testing take? [closed]

I am looking for a study, if one exists, comparing the time spent on regular coding versus coding plus unit tests (not strict TDD just yet). I know the whole "saves you time in the long run" angle, but from a project-planning perspective, for a team that has never done it before, I need to be able to roughly estimate how much additional time to allocate.

Does such a study exist? Can anyone comment from experience?

Slappy asked Sep 21 '10 01:09


People also ask

How much time does unit testing take?

A typical time budget for writing unit tests is about one day for every feature that takes 3-4 days of heads-down coding, but that can vary with a lot of factors.

What does the timely rule of unit testing mean?

Timely: Unit tests should be written just before the production code that makes the test pass. This is something that you would follow if you were doing TDD (Test Driven Development), but otherwise it might not apply.

Does TDD take more time?

When you're doing TDD while learning it, it takes longer than if you weren't doing TDD. But this is just temporary; once you know TDD, you won't be slowed down by the learning process. While writing tests takes time, it can take the place of time-consuming manual testing.

How long should a unit test suite take to run?

Still, it seems as though a 10-second short-term attention span is more or less hard-wired into the human brain. Thus, a unit test suite used for TDD should run in less than 10 seconds. If it's slower, you'll be less productive because you'll constantly lose focus.

How much unit testing should you do?

Generally, teams budget this as a percentage of the work week, for example: from now on, spend 90% of your time writing code and 10% working on unit tests. The reasoning is that a rule like this will ensure the team does "enough" unit testing.

When do unit tests pay off?

Unit tests pay off at maintenance time. If you plan to have a long-lived application, you will spend more time maintaining it than you think you will now (if you have not yet experienced this, you will be surprised how long a successful project may live).

Should all teams be performing unit testing?

That doesn't mean all teams are, can, or should be performing unit testing, because time and time again one drawback of unit tests is clear: time. They take time, particularly in a codebase that's not set up to be unit tested.


5 Answers

I know that you're not interested in going for full TDD just yet, but I think the best performance and planning markers are going to be found in Test Driven Development case studies, such as those conducted by both Microsoft and IBM.

Case studies were conducted with three development teams at Microsoft and one at IBM that have adopted TDD. The results of the case studies indicate that the pre-release defect density of the four products decreased between 40% and 90% relative to similar projects that did not use the TDD practice. Subjectively, the teams experienced a 15–35% increase in initial development time after adopting TDD. Source.

That is part of a preface to a full comparison of normal development versus development using unit testing as a core principle. The upfront addition to development planning that we use is 20%, which includes both unit and integration tests. In all honesty, testing is vital to any successful piece of software; unit testing just removes some of the hard yards and manual effort. The ability to run a few hundred unit and integration tests upon finishing a bit of functionality, and to verify the system within a few seconds, is invaluable.

There are a number of different 'magic' numbers that people add to their estimates to incorporate the additional time to write unit tests, but it really just comes down to a simple trade-off. Essentially, you're balancing the increase in development time against the increase in bug-fixing/error-checking time, also taking into account the likelihood of downtime and how critical the system is (whether it is a primary revenue system or a non-essential one).

If you're interested in doing a little more reading, here is the complete Microsoft study. It's pretty short but gives some interesting findings. And if you're really keen, here is a unit testing slideshow that outlines the concepts and benefits in reasonable detail (previously this was a link to the source content of this presentation, but sadly that content is now gone).

JonVD answered Nov 18 '22 00:11


I can't comment on studies for this topic.

From experience, I'd say your magic number range is 30-40%, since your team is new to this. Your team will need to learn how to create mocks and fakes and get used to writing tests, in addition to setting up infrastructure; there are lots of up-front costs until your team gets up to speed. If your main language is C++, it takes more effort to write mocks and fakes than with C# (in my experience). If your project is all brand-new code, it will take less effort than working with existing code. If you can get your team quickly up to speed on testing, then TDD will prove less effort than writing tests after the fact. With enough experience, the time for tests is probably around 20%, yet another magic number. Pardon my lack of exact numbers; I don't have precise metrics from my experience.

Chris O answered Nov 17 '22 23:11


In all states of affairs and for all teams, this should hold:

TimeOf(Coding+Testing) < TimeOf(CodingWithoutTesting)

It shouldn't take additional time at all, or it becomes useless.

Dan Ganiev answered Nov 18 '22 00:11


I need to be able to roughly estimate how much additional time to allocate.

That's silly. There's no additional time.

Here's what you do. Take your existing testing budget: the 40% of development effort at the end of the project.

Spread most of this testing effort through the life of the project as unit testing. Call it 30% of the total effort, allocated everywhere.

Leave some of the testing effort at the end for "integration" and "performance" testing. Call it 10% of the total effort, allocated just at the end and just for integration and performance testing. Or User Acceptance Testing or whatever is left over that you didn't unit test.

There's no "additional". You have to do the testing anyway. You can either do it as first-class software, during the project, or you can scramble around at the end of the project doing a bad job.


The "cost" of TDD is -- at best -- a subjective impression. Read the excellent study summary carefully. "Subjectively, the teams experienced a 15–35% increase in initial development time after adopting TDD". [Emphasis added.]

Actually there is zero cost to TDD. TDD simply saves time.

It's much less expensive to do TDD than it is to create software other ways.

Why?

  1. You have to test anyway. Since you have to test, you may as well plan for testing by driving all of your development around the testing.

  2. When you have a broken test case, you have a focused set of tasks. When you are interrupted by phone calls, meetings, product demos, operational calls to support previous releases, it's easy to get distracted. When you have tests that fail, you get focus back immediately.

  3. You have to test anyway. It's either Code then Test or it's Test then Code. You can't escape the cost of testing. Since you can't escape, do it first.

  4. The idea that TDD has "additional" or "incremental" cost is crazy. Even a well-documented study like the one linked above can't -- actually -- compare the same project done two ways. The idea of "additional development time" cannot actually be measured. That's why it's a "subjective impression". And they're simply wrong about it.

When you ask programmers -- who are new to TDD -- if TDD slowed them down, they lie about it.

Yes. They Lie.

One. You made them change. Change is bad. Everyone knows that. They can't say it was easier, because it was "different".

Two. It calls management into question. We can't say bad things about our managers, that's bad. Everyone knows that. They can't say it was easier because previous things our managers demanded are now obviously wrong.

Three. Most programmers (and managers) think that there's a distinction between "real" code and "test" code. TDD takes longer to get to the "real" code because you spend your up-front time doing "test" code.

This "real" code vs. "test" code is a false distinction. Anyone who says this doesn't get how important testing is. Since testing is central to demonstrating that an application works, test code should be first-class. Making this distinction is wrong. Test code is real code.

The time spent writing test code is -- effectively -- time taken away from design of the real code. Rather than create "paper" designs, you are creating a working, living design in the form of test cases.
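
To make the idea of test cases as a living design concrete, here is a minimal sketch in C# using xUnit (one of the frameworks mentioned in a later answer). The SlugGenerator name and behaviour are invented purely for illustration: the test is written first and states what the code must do, and the production code is then written to make it pass.

    using Xunit;

    // Written first: this test is the "living design". It states the required
    // behaviour of a hypothetical slug generator before any production code exists.
    public class SlugGeneratorTests
    {
        [Fact]
        public void Generate_LowercasesTitleAndReplacesSpacesWithDashes()
        {
            var slug = SlugGenerator.Generate("Unit Testing Time");

            Assert.Equal("unit-testing-time", slug);
        }
    }

    // Written second, with just enough code to make the test above pass.
    public static class SlugGenerator
    {
        public static string Generate(string title) =>
            title.Trim().ToLowerInvariant().Replace(' ', '-');
    }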

TDD saves time.

Folks who say otherwise are resisting change and/or trying to protect management from appearing to be wrong and/or making a false distinction between real code and test code. All things that are simply wrong.

S.Lott answered Nov 17 '22 23:11


As someone who is currently working on his first project using unit tests (not full-blown TDD, but close), I would say that you should double the time it would usually take for your team to do their initial implementation.

Roy Osherove's book, The Art of Unit Testing, has an informal study that shows this, but also shows that when QA cycles are included, the overall release time was slightly lower when using unit tests, and there were far fewer defects in the code developed with unit tests.

I would make these suggestions:

  • Have your programmers read everything they can get on unit tests and TDD, especially anything they can find on how to design code that is test-friendly (use of dependency injection, interfaces, etc.). Osherove's book would be a great start.

  • Start evaluating unit testing frameworks (NUnit, xUnit, etc.) and mocking frameworks (Rhino Mocks, Moq, etc.), and have them choose ones to standardize on. The most popular are probably NUnit and Rhino Mocks, but I chose xUnit and Moq (a small sketch of what this looks like follows this list).

  • Don't let your programmers give up. It can be tough to change your mindset and overcome the natural resistance to change, but they need to work through that. Initially it may feel like unit tests just get in the way, but the first time they refactor sections of code on the fly and use unit tests to know, rather than hope, that they didn't break anything, it will be a revelation.

  • Finally, if possible, don't start unit tests on a large, high-pressure project with a tight deadline; this would likely lead to them not overcoming their initial difficulties and ditching unit tests in favor of just getting the code done.
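
As a rough illustration of the second bullet above, here is a small sketch of what a test can look like with xUnit and Moq; the IPriceProvider interface and OrderCalculator class are hypothetical, invented only for this example. It also shows the point of the first bullet: designing against an interface and injecting the dependency through the constructor is what lets the test substitute a fake and run without a real data source.

    using Moq;
    using Xunit;

    // Hypothetical interface and class, used only for illustration.
    public interface IPriceProvider
    {
        decimal GetPrice(string sku);
    }

    public class OrderCalculator
    {
        private readonly IPriceProvider _prices;

        // The dependency is injected through the constructor, so tests can pass a fake.
        public OrderCalculator(IPriceProvider prices) => _prices = prices;

        public decimal Total(string sku, int quantity) => _prices.GetPrice(sku) * quantity;
    }

    public class OrderCalculatorTests
    {
        [Fact]
        public void Total_MultipliesUnitPriceByQuantity()
        {
            // Arrange: Moq supplies a fake IPriceProvider, so no real data source is needed.
            var prices = new Mock<IPriceProvider>();
            prices.Setup(p => p.GetPrice("ABC")).Returns(9.99m);

            var calculator = new OrderCalculator(prices.Object);

            // Act
            var total = calculator.Total("ABC", 3);

            // Assert
            Assert.Equal(29.97m, total);
        }
    }

The same test shape carries over to NUnit or Rhino Mocks; only the attributes and mock-creation calls change.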

Hope this helps!

Jeff Ogata answered Nov 17 '22 23:11