Roy Osherove has put together a Framework for testing FxCop rules. It's an interesting approach, firing up the FxCop engine for each rule that he wants to test.

There are a few things that I would like to see it support:

  1. The ability to specify positive tests (code that a rule should not fire on when run over it) in addition to the negative tests (code that a rule should fire on) that he already supports
  2. The ability to separate the actual test from the methods you want to run your rules over. Unfortunately, this approach will not scale when you are testing three different languages (C#, Visual Basic, and C++) across the 200+ rules that we ship.
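To make the positive/negative distinction concrete, here is a minimal sketch of the pattern in Python. This is not Roy's actual framework or the FxCop engine; the toy "rule" (which flags method names that start with a lowercase letter) and the snippets it runs over are invented purely for illustration:

```python
"""Illustrative sketch of positive vs. negative rule tests.
The 'rule' here is a hypothetical stand-in, not a real FxCop rule:
it flags method names that do not start with an uppercase letter."""
import re

def lowercase_method_rule(source: str) -> list[str]:
    """Return a violation message for each method whose name starts lowercase."""
    violations = []
    for match in re.finditer(r"\bvoid\s+(\w+)\s*\(", source):
        name = match.group(1)
        if name[0].islower():
            violations.append(f"Method '{name}' should be PascalCase")
    return violations

# Negative test: the rule SHOULD fire on this snippet.
bad = "class C { void doWork() { } }"
assert lowercase_method_rule(bad), "expected a violation"

# Positive test: the rule should NOT fire on this snippet.
good = "class C { void DoWork() { } }"
assert not lowercase_method_rule(good), "expected no violations"
```

Both kinds of test matter: the negative tests prove a rule catches what it should, while the positive tests guard against false positives, which are just as damaging to a rule's usefulness.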

Unfortunately, his approach is not going to work in the next version of FxCop/Visual Studio, for a couple of reasons: 1) we did some work in Orcas to reduce the visibility of APIs that did not need to be public for writing custom rules (for example, everything in FxCopCommon.dll is now internal), and 2) we also removed the Reflection 'bridge', that is, the ability to create CCI objects from their Reflection counterparts and vice versa.

Apart from the things mentioned above, it's a great start, and it's got us thinking: how many users actually automate the testing of custom rules? Is this something you see as important? Would you like to see how we do this internally?