How can I guarantee all unit tests pass before committing?

We've had problems recently where developers commit code to SVN that doesn't pass unit tests, fails to compile on all platforms, or even fails to compile on their own platform. While this is all picked up by our CI server (CruiseControl), and we've instituted processes to try to stop it from happening, we'd really like to stop the rogue commits from happening in the first place.

Based on a few other questions around here, it seems to be a Bad Idea™ to force this as a pre-commit hook on the server side mostly due to the length of time required to build + run the tests. I did some Googling and found this (all devs use TortoiseSVN):

http://cf-bill.blogspot.com/2010/03/pre-commit-force-unit-tests-without.html

That approach would solve at least two of the problems (though it wouldn't build on Unix), but it doesn't reject the commit if the tests fail. So my questions:

  • Is there a way to make a pre-commit hook in TortoiseSVN cause the commit to fail?
  • Is there a better way to do what I'm trying to do in general?
Morinar asked Aug 18 '11
2 Answers

There is absolutely no reason why your pre-commit hook can't run the Unit tests! All your pre-commit hook has to do is:

  • Checkout the code to a working directory
  • Compile everything
  • Run all the unit tests
  • Then fail the hook if the unit tests fail.
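A minimal sketch of such a server-side hook, assuming a standard `hooks/pre-commit` layout; the repository path, `trunk` location, and `make` targets are placeholders, not taken from the question:

```shell
#!/bin/sh
# Server-side Subversion pre-commit hook: REPOS/hooks/pre-commit
# Subversion invokes it with the repository path and transaction id.
REPOS="$1"
TXN="$2"

WORKDIR=$(mktemp -d) || exit 1
trap 'rm -rf "$WORKDIR"' EXIT

# Export the project as it exists in HEAD. The in-flight changes live
# only in the transaction, so a real hook would also overlay the
# changed files using `svnlook changed -t "$TXN"` / `svnlook cat`.
svn export -q "file://$REPOS/trunk" "$WORKDIR/src" || exit 1

cd "$WORKDIR/src" || exit 1

# Build and run the tests; a non-zero exit status rejects the commit,
# and anything written to stderr is shown to the committing user.
if ! make >/dev/null 2>&1 || ! make test >/dev/null 2>&1; then
    echo "Commit rejected: build or unit tests failed." >&2
    exit 1
fi

exit 0
```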

It's completely possible to do. And afterwards, everyone in your development shop will hate your guts.

Remember that in a pre-commit hook, the entire hook has to complete before it can allow the commit to take place and control can be returned to the user.

How long does it take to do a build and run through the unit tests? 10 minutes? Imagine doing a commit and sitting there for 10 minutes waiting for your commit to take place. That's the reason why you're told not to do it.

Your continuous integration server is a great place to do your unit testing. I prefer Hudson or Jenkins over CruiseControl. They're easier to set up, and their web pages are more user-friendly. Better still, they have a variety of plugins that can help.

Developers don't like it to be known that they broke the build. Imagine if everyone in your group got an email stating you committed bad code. Wouldn't you make sure your code was good before you committed it?

Hudson/Jenkins have some nice graphs that show you the results of the unit testing, so you can see from the webpage what tests passed and failed, so it's very clear exactly what happened. (CruiseControl's webpage is harder for the average eye to parse, so these things aren't as obvious).

One of my favorite Hudson/Jenkins plugins is the Continuous Integration Game. With this plugin, users are given points for good builds, fixing unit tests, and adding new passing unit tests. They lose points for bad builds and for breaking unit tests. A scoreboard shows every developer's points.

I was surprised how seriously developers took it. Once they realized their CI game scores were public, they became very competitive. They would complain when the build server itself failed for some odd reason and cost them 10 points for a bad build. But the number of failed unit tests dropped way, way down, and the number of unit tests written soared.

David W. answered Sep 17 '22

There are two approaches:

  1. Discipline
  2. Tools

In my experience, #1 can only get you so far.

So the solution is probably tools. In your case, the obstacle is Subversion. Replace it with a DVCS like Mercurial or Git. That will allow every developer to work on their own branch without the merge nightmares of Subversion.

Every once in a while, a developer will mark a feature or branch as "complete". That is the time to merge the feature branch into the main branch. Push that into a "staging" repository which your CI server watches. The CI server can then pull the last commit(s), compile and test them and only if this passes, push them to the main repository.

So the loop is: main repo -> developer -> staging -> main.
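A CI-side gatekeeper for that loop might look roughly like this; the repository URLs, working directory, and build commands below are all hypothetical stand-ins:

```shell
#!/bin/sh
# Sketch of the CI server's gatekeeper job: pull the latest commits
# from staging, build and test them, and push to main only on success.
STAGING=https://hg.example.com/staging
MAIN=https://hg.example.com/main

cd /ci/workdir/repo || exit 1

hg pull "$STAGING" || exit 1
hg update -C default || exit 1

# Replace `make` / `make test` with the project's real targets.
if make && make test; then
    hg push "$MAIN"
else
    echo "Staging commits rejected; main is untouched." >&2
    exit 1
fi
```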

There are many answers here which give you the details. Start here: Mercurial workflow for ~15 developers - Should we use named branches?

[EDIT] So you say you don't have the time to solve the major problems in your development process ... I'll let you guess how that sounds to anyone... ;-)

Anyway ... Use hg convert to get a Mercurial repo out of your Subversion tree. If you have a standard setup, that shouldn't take much of your time (it will just need a lot of time on your computer but it's automatic).
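The conversion itself is essentially one command; `convert` ships with Mercurial as a bundled extension, and the repository URL and target directory here are placeholders:

```shell
# Enable the bundled convert extension once:
cat >> ~/.hgrc <<'EOF'
[extensions]
convert =
EOF

# Convert the Subversion repository into a new Mercurial repo.
hg convert https://svn.example.com/repo/trunk project-hg

# Running the same command again later is incremental: it pulls in
# only the Subversion revisions that appeared since the last run.
hg convert https://svn.example.com/repo/trunk project-hg
```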

Clone that repo to get a work repo. The process works like this:

  • Develop in your second clone. Create feature branches for that.
  • If you need changes from someone, run the conversion again into the first clone, then pull from that into your second clone (that way, you always have a "clean" copy of the Subversion history in case you mess something up).
  • Now merge the Subversion branch (default) and your feature branch. That should work much better than with Subversion.
  • When the merge is OK (all the tests run for you), create a patch from a diff between the two branches.
  • Apply the patch to a local checkout from Subversion. It should apply without problems. If it doesn't, you can clean your local checkout and repeat. No chance to lose work here.
  • Commit the changes in subversion, convert them back into repo #1 and pull into repo #2.
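The steps above translate into roughly these commands (branch names, URLs, and directory names are illustrative, not prescribed):

```shell
# Work on a named feature branch in the second (work) clone.
cd work-repo
hg branch my-feature          # develop and commit here

# Bring in colleagues' changes: refresh clone #1 from Subversion,
# then pull it into the work repo.
hg convert https://svn.example.com/repo/trunk ../clean-repo
hg pull ../clean-repo

# Merge Subversion's state (the "default" branch) into the feature.
hg update my-feature
hg merge default              # run the unit tests now

# When the tests pass, turn the difference into a patch ...
hg diff -r default -r my-feature > feature.patch

# ... apply it to a plain Subversion checkout, and commit.
cd ../svn-checkout
patch -p1 < ../work-repo/feature.patch
svn commit -m "Feature X"
```

(`hg diff` emits `a/`/`b/` path prefixes, which is why `patch -p1` applies it cleanly.)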

This sounds like a lot of work but within a week, you'll come up with a script or two to do most of the work.

When you notice someone broke the build (the tests no longer pass for you), undo the uncommitted merge with hg update -C and continue working on your feature branch.

When your colleagues complain that someone broke the build, tell them that you don't have a problem. When people start to notice that your productivity is much better despite all the hoops you have to jump through, mention that "it would be much simpler if we scrapped SVN".

Aaron Digulla answered Sep 18 '22