I apologize for not getting this post out sooner. I was not feeling well most of last week.

In a previous post, I talked about how we used TFS to implement the practice of Quality Gates. In this post, I'll talk about some of the reports we used to track things from the top, CxO level, down to the individual features.

Here is our divisional dashboard:

[Image: divisional dashboard]

You may remember from a previous post about implementing process that we had value props, which traced to experiences, which traced to features. The above is a snapshot of progress across all of them for the entire developer division.

This top-level report shows progress against all the Value Props. Value Propositions, as you may remember, are divisional or departmental objectives, or pillars of the release. We had around ten of these for the Orcas (VS 2008) release.

Clicking on a value prop (see the gray box above) shows you progress on the experiences tied to that value prop:

[Image: experiences tied to the selected value prop]

Now you can see the progress of all the experiences and features associated with this value proposition. Drilling down to an Experience:

[Image: features for the selected experience]

shows the progress on the features for that experience. Again, we can drill down on a feature:

[Image: Web View of a Feature record]

and we see the Web View of the Feature record that I referred to in previous posts. With the Team System Web Access client, I don't see why this couldn't link to the actual Feature work item record itself.

This set of reports allows people at any level to view the status of the entire release. It also provides transparency, which is culture-changing at all levels, and for the better.
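The roll-up behind these reports is straightforward: each feature's progress aggregates into its experience, and each experience's progress aggregates into its value prop. Here is a minimal sketch of that idea in Python; the class names, the simple-average metric, and the sample data are all my own assumptions for illustration, not the actual TFS schema or report logic.

```python
# Hypothetical model of the value prop -> experience -> feature hierarchy.
# Progress rolls up as a simple average at each level (an assumption;
# the real reports may have weighted features differently).
from dataclasses import dataclass, field
from typing import List


@dataclass
class Feature:
    name: str
    percent_complete: float  # e.g., derived from quality-gate status


@dataclass
class Experience:
    name: str
    features: List[Feature] = field(default_factory=list)

    def percent_complete(self) -> float:
        if not self.features:
            return 0.0
        return sum(f.percent_complete for f in self.features) / len(self.features)


@dataclass
class ValueProp:
    name: str
    experiences: List[Experience] = field(default_factory=list)

    def percent_complete(self) -> float:
        if not self.experiences:
            return 0.0
        return sum(e.percent_complete() for e in self.experiences) / len(self.experiences)


# Illustrative data only (names invented for the example):
vp = ValueProp("Better Web Development", [
    Experience("AJAX tooling", [Feature("IntelliSense", 100), Feature("Debugging", 50)]),
    Experience("CSS designer", [Feature("Split view", 25)]),
])
print(vp.percent_complete())  # 50.0
```

Drilling down in the dashboard corresponds to walking one level deeper in this tree: the top report shows each `ValueProp`'s roll-up, a click expands its `Experience` roll-ups, and so on down to individual `Feature` records.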

That's it for this post. I have two more posts in this series: one on the issues we ran into, and another to answer the question, "Well, did it work?"

Thanks for listening!