Overall I think it’s a great thing, though there are some issues that need to be improved over time for it to be truly successful.  I may be a little close to the project since my co-workers are responsible for the Feedback Center, but I think I’m still allowed to have opinions and air them in this space. :-) Recently I was asked some specific questions that do a good job of framing my opinions on this project.

Do you like this kind of transparency?
I love this transparency and think it’s only the first step towards a more transparent future of developer products at Microsoft.  Why do I like it so much? 

In case it’s not clear… when you enter a bug through the Feedback Center, it is also opened directly in our internal bug tracking system alongside bugs found by our own testers.  Internally we can’t escape these bugs since they are in the database we scrutinize every day on our way to shipping the product.  You get to see when, how, and by whom your reported issues are fixed, and you get notified of their progress along the way.  We also require that team members provide quality responses to every reported issue to further encourage accountability and close the feedback loop.

Historically we’ve had crash reports and statistics that helped us decide which crashes to focus on based on the number of reports.  Triaging non-crashing issues has always led to debates over how we think customers might feel about an issue versus how many customers are actually affected and how severe they consider it to be.  This will eventually help us triage bugs much more efficiently and gain an even better understanding of the customer impact of bugs we find internally.

We will only get more transparent.  Soon you may be able to see our actual internal bug counts and even see internal bugs alongside customer-reported issues.  When we are in a world where you might have access to any build of Visual Studio, this will be even more important.

How does this compare with the closed, semi-private bug reporting on beta.microsoft.com?
I’ve used both, and the MSDN Feedback Center is much better simply because you can search, see, and vote on bugs reported by other users.  You also get to add your own comments and workarounds to issues.  With betaplace (beta.microsoft.com) you only get to see the issues that you report yourself.

Microsoft products will still have private betas for a long time, so there will always be a need for a non-public site, but some teams are talking about creating closed-group instances of the Ladybug application going forward because it is seen as a step up from existing solutions.   So you can imagine that beta.microsoft.com could eventually leverage the Ladybug solution.

What do you think about the quality of submitted bugs?
It could be improved.  I’ve spent a lot of time in my career here triaging bugs as a Test Lead, and I’ve also been involved in triaging bugs from the Feedback Center.  In general, Feedback Center bugs require more “translation” time to understand what customers are trying to report and what they may have been trying to do when they encountered the bug.  Some teams have suggested that the triage time per bug is up to 10 times that of bugs reported internally.  I’m not sure I buy that number (given the statistics you’ll see below), but it does take more time.  The extra time is probably required for the following reasons:

  • Lack of Additional Background Info: There is a set of information that each team feels should be collected when a bug is logged against their feature. When this information is missing, the issue requires an extra round trip with the customer to obtain it.  We would rather not burden every customer with these specific requests, since we want bug reporting to remain easy, but we may eventually run an ActiveX control that automatically fills out portions of a longer form when users attempt to file a specific bug. This information would include what profile settings you chose at first launch, what other versions of VS you have installed, what type of project you were working on, etc.
  • Lack of Clear Repro Steps: We should provide you guys with better bug report examples.  I’ve seen bugs that look like “Sometimes I seem to get an error message that says Foo” without explaining what the reporter was doing that led to the error message.  The Feedback Center is a community in its infancy, and the unspoken rules and expectations of a community have not been fully worked out yet.  Over time, with better examples and additional practice submitting bugs, it will improve.  The quality here is not unlike the quality of bugs you might get from a new hire testing the product and, like the new hire’s, I’m sure it will improve over time with a little guidance.
  • Lack of Expected Behavior Reports: A good bug should contain not only a description of the behavior experienced but also the behavior you expected.  Instead of just saying “When I click X, FOO occurs”, it is more helpful to say “When I click X, FOO occurs, but I was expecting BAR because of Y”.   I also believe this will get better over time.
  • Lack of Good Microsoft Responses: Yes, this is something we can improve on that will help quality.  A running joke on our test team was that we should all have been filing our bugs through betaplace or Ladybug as customers, because we’d get better explanations of why issues are “Won’t Fix” or “By Design”.  A tester internally will often just see these two-word responses, and it is up to the tester to push for a better explanation if they feel the bug is important.  Strangely enough, we felt we should encourage people to treat community members with a little more dignity than our own employees.
    • Providing clear explanations of bug resolutions is new for a lot of people.  Speaking as a former tester, I’ll say it’s about time.  So far we’ve had some rough spots and bits of information that get lost in translation.  Eventually we will adapt by improving our accountability, learning better “customer-speak”, and finding more common ground within this community.  This is new for us too.  We are investigating providing a direct “Contact the Developer” form for MSDN Feedback Center bugs so you don’t get the rather blind “Resolved by Microsoft” text.  You would then see who resolved your issue and be able to contact them directly.

What do the results look like so far?
So we’re listening and guaranteeing a response.  IMO this is only part of the solution.  In the end we’ll also be judged by what we do with what we hear.  It’s unfortunate that this feedback mechanism was opened so late in the Visual Studio 2005 cycle, because a lot of this great feedback will only start to be leveraged in the next version.  This is especially true for suggestions that would require more than a trivial amount of new development work.  In the meantime I’ll share some raw statistics, snapshotted today, covering the period since late June when the Feedback Center became public.

For comparison I thought it might be interesting to look at how these percentages compare to internal bugs and suggestions opened during the same time period.  For these, unfortunately, I don’t have a good measure of time to response, so the raw percentages will have to do.  Also, just assume the total numbers for internally reported issues are larger. :-)

MSDN Feedback Center Bug Stats

  • Total # Opened So Far                1679
  • Responded to                             75%
  • Avg # of Days to First Response  6 Days
  • Still Active                                   50%
  • Resolved Fixed                           24%
  • Resolved No Repro                     9%
  • Resolved By Design                    7%
  • Resolved Postponed                   6%
  • Resolved Won’t Fix                    4%

Internal Bug %’s for the same time period

  • Still Active                                   55%
  • Resolved Fixed                           20%
  • Resolved By Design                    8%
  • Resolved Won’t Fix                    5%
  • Resolved No Repro                     5%
  • Resolved Duplicate                      6%*
  • Resolved Postponed                   1%

MSDN Feedback Center Suggestion Stats

  • Total # Opened So Far                1174
  • Responded to                               79%
  • Avg # of Days to First Response  6 Days
  • Still Active                                   34%
  • Resolved Postponed                    27%
  • Resolved Fixed                           16%
  • Resolved Won’t Fix                    12%
  • Resolved By Design                    9%
  • Resolved No Repro                    2%

Internal Suggestion %’s for the same time period

  • Still Active                               52%
  • Resolved Won’t Fix                 17%
  • Resolved Fixed                        11%
  • Resolved Postponed                10%
  • Resolved By Design                 5%
  • Resolved Duplicate                   5%*
  • Resolved No Repro                  <1%

*If an MSDN Feedback bug is considered a duplicate, it is associated with the primary internal bug, so customer bugs can only be duplicates of other customer bugs.  That number has been very small, since customers can “+1” an existing bug rather than opening new entries, so the duplicate percentage for MSDN Feedback bugs was not worth reporting.

What’s important to note is that the jury is still out on the roughly half of these issues still marked as active.  Because of this, the resolution percentages are very likely to change over time.

What I learned through this is that, so far, we’ve actually fixed a higher percentage of customer reported issues than internally reported ones!  It will be interesting to see if this keeps up over time.
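For the curious, here’s a quick sanity check of that comparison as a sketch.  The percentages are the snapshot values from the bug tables above, and the “share of resolved bugs that were fixed” framing (which excludes still-active bugs) is my own way of slicing it, not an official metric:

```python
# Resolution percentages from today's snapshot (bug stats above).
# Still-active bugs (50% MSDN, 55% internal) are excluded so we can
# ask: of the bugs that reached a resolution, what fraction were fixed?

msdn = {"Fixed": 24, "No Repro": 9, "By Design": 7,
        "Postponed": 6, "Won't Fix": 4}
internal = {"Fixed": 20, "By Design": 8, "Won't Fix": 5,
            "No Repro": 5, "Duplicate": 6, "Postponed": 1}

def fixed_share_of_resolved(resolutions):
    """Fraction of resolved bugs that were resolved as Fixed."""
    return resolutions["Fixed"] / sum(resolutions.values())

print(f"MSDN Feedback Center: {fixed_share_of_resolved(msdn):.0%} of resolved bugs fixed")
print(f"Internal:             {fixed_share_of_resolved(internal):.0%} of resolved bugs fixed")
# MSDN:     24/50 = 48%
# Internal: 20/45 ≈ 44%
```

So the gap holds up even after setting aside the bugs still awaiting a verdict: customer-reported bugs are being fixed at a slightly higher rate among resolved issues, not just in the raw totals.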

Now that you’ve read this far I’d like to know some things:

  1. Are these numbers interesting to you guys?
  2. Would you like to see more?
  3. If so, what other information would you like to see?
  4. How would you like to see top contributors to the Feedback Center rewarded by Microsoft?