A lot has already been said about Test Driven Development (TDD) by a lot of people, but I'd still like to add my 0.02 paisa.

We have an internal requirement to check in unit tests along with the product code, and the code coverage of those tests needs to be high. Most of our developers maintain over 85% code coverage.

In my sources I decided to try out TDD, using the following steps:

  1. Write the method's prototype so that it matches the design doc, and have it throw a NotImplementedException.
  2. Write unit tests using the Visual Studio Team System unit-test framework. I try to cover all the requirements, including the negative cases: for example, pass a null handle, catch the resulting ArgumentNullException, and verify that the correct argument is named in ArgumentNullException.ParamName.
  3. After that I run the tests. All of them obviously fail, with lots of X marks.
  4. Then I go about fixing each test failure by adding the corresponding functionality to the code.
  5. After each fix (or a couple of them) I re-run the tests and watch the red X marks turn into green "test passed" ticks.
  6. Once I'm done, I run with code coverage and add more tests, if required, to cover the blocks that weren't touched by the existing tests.
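The first two steps can be sketched roughly as follows. The RingBuffer class and its Append method are hypothetical names for illustration only, not from any real product:

```csharp
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Step 1: the prototype matches the design doc and just throws.
public class RingBuffer
{
    public int Append(byte[] data)
    {
        throw new NotImplementedException();
    }
}

// Step 2: a test written against the stub, covering the negative case.
// At this point the test fails (step 3), since the stub throws
// NotImplementedException instead of the expected ArgumentNullException.
[TestClass]
public class RingBufferTests
{
    [TestMethod]
    public void Append_NullData_ThrowsWithCorrectParamName()
    {
        var buffer = new RingBuffer();
        try
        {
            buffer.Append(null);
            Assert.Fail("Expected ArgumentNullException");
        }
        catch (ArgumentNullException e)
        {
            // Verify the exception names the offending argument.
            Assert.AreEqual("data", e.ParamName);
        }
    }
}
```

Making this test pass is then just a matter of adding the null check, `if (data == null) throw new ArgumentNullException("data");`, followed by the real implementation.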

Even though the system looks simple, it has helped me enormously by catching multiple bugs right at the beginning. Even trivial tests, like those for GetHashCode and operator overloads, found issues :) The fun of seeing all those X marks disappear one after another brings a childish zeal to get them done even faster.
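To make "trivial tests" concrete, here is the kind of check I mean, written against a hypothetical Point struct (the struct and test names are illustrative). The key contract is that two values that compare equal must also produce equal hash codes:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical value type with an overloaded == operator.
public struct Point
{
    public int X;
    public int Y;
    public Point(int x, int y) { X = x; Y = y; }

    public static bool operator ==(Point a, Point b) { return a.X == b.X && a.Y == b.Y; }
    public static bool operator !=(Point a, Point b) { return !(a == b); }

    public override bool Equals(object obj) { return obj is Point && this == (Point)obj; }
    public override int GetHashCode() { return X ^ (Y << 16); }
}

[TestClass]
public class PointTests
{
    [TestMethod]
    public void EqualValues_HaveEqualHashCodes()
    {
        var a = new Point(1, 2);
        var b = new Point(1, 2);
        Assert.IsTrue(a == b);                             // operator overload works
        Assert.AreEqual(a.GetHashCode(), b.GetHashCode()); // GetHashCode contract holds
    }
}
```

A typical bug such a test catches is an operator overload or hash function that accidentally ignores one of the fields.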

The other goodness is that after every later code change I can easily fire off these tests as a sanity check.