For over a year, we've been measuring a few key metrics in the MSDN Forums to monitor overall forum health.  We track them aggressively, send out biweekly mails about them, and use them as guideposts for deciding what we should or should not do with regard to the forums.  For example, our basic "reputation system" of having the top answerer lists in the forums is directly based on our desire to raise the overall "answer rate" of the forums.

Here are the key metrics we are currently watching:

  • Monthly Question Volume - How many questions are being asked in the forums?  Is it rising or falling?  How quickly is this number changing?
  • 2 Day Answer Rate  (Goal:  60%) - What percentage of questions are answered within 48 hours of first being asked?  This is our primary metric.  Think about it...if you asked a question on the site and it wasn't answered within 48 hours, would the answer still be of much use to you?  (DevDiv is currently at
  • 7 Day Answer Rate (Goal:  80%) - If the question doesn't get answered right away, are we at least being helpful in the long run to most of the people asking questions on the forums?  (A quick sketch of how these numbers could be computed follows this list.)
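
To make those definitions concrete, here's a minimal sketch of how the three numbers could be computed.  It's written in Python against a made-up question record with asked/answered timestamps--not our actual forum data or schema--so treat it as an illustration of the definitions, nothing more.

```python
from datetime import datetime, timedelta

def forum_health(questions, now=None):
    """Monthly question volume plus 2-day and 7-day answer rates.

    Each question is a dict with an 'asked' datetime and an 'answered'
    datetime (or None if nothing has been marked as an answer yet).
    """
    now = now or datetime.utcnow()

    # Monthly Question Volume: questions asked in the last 30 days.
    monthly = sum(1 for q in questions if q["asked"] >= now - timedelta(days=30))

    def rate(window_days):
        window = timedelta(days=window_days)
        # Only count questions old enough to have had the full answer window.
        eligible = [q for q in questions if q["asked"] <= now - window]
        if not eligible:
            return None
        answered = sum(
            1 for q in eligible
            if q["answered"] is not None and q["answered"] - q["asked"] <= window
        )
        return answered / len(eligible)

    return {
        "monthly_question_volume": monthly,
        "two_day_answer_rate": rate(2),    # goal: 60%
        "seven_day_answer_rate": rate(7),  # goal: 80%
    }
```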

That's pretty much it.  Our real goals on the forums (so far) have been based primarily on the number of questions that get a reply marked as an answer.  There have been problems with this approach.

  • No Quality Measures - Ugh...the old quantity vs. quality debate.  We're measuring the number of questions that get the little "answer" flag...but how many of those questions are answered with a good answer?  How many are just off-topic questions where the "answer" is "ask the question somewhere else"?  There are no true quality measures built into the system.
  • Over-incenting crazy-fast answering - In the same vein, this drive toward answer rates (and the Top Answerer lists on the live site) has driven some overly fast answers and answerers.  If quality isn't being measured, why not just blast through the questions as fast as you can?
  • Comparing Percentages is Comparing Apples to Oranges - Is it fair that I send out a report that compares the health of the VB forums to the health of the forums for a much smaller team?  Probably not.  There are order-of-magnitude differences from forum to forum.  If a given forum gets 3 questions a week, 2 of which get answered, is it really healthier than a forum that gets 90 questions a week but only sees 53 of them answered?  (See the sketch after this list.)
  • Living and Dying By the "Mark as Answer" Button - We really depend on moderators and product teams to do all of the answer marking.  I love the "mark as answer" feature in the forums, but the implementation forces us to depend on people actually marking things.  Not everything gets marked, and not everything gets unmarked when it should.  It's an explicit action, and let's face it--people are lazy.  Why should we believe that anybody is going to go out of their way just to mark something as answered?  Aren't there other metrics we could be tracking?
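
To put rough numbers on the apples-to-oranges problem from the third bullet, here's a back-of-the-envelope sketch using a plain binomial standard error (not anything we actually report) for the two hypothetical forums above:

```python
from math import sqrt

def rate_with_error(answered, asked):
    """Answer rate plus a rough binomial standard error for that rate."""
    p = answered / asked
    return p, sqrt(p * (1 - p) / asked)

small = rate_with_error(2, 3)    # the tiny forum from the example above
large = rate_with_error(53, 90)  # the busier forum

# Prints roughly: small forum: 67% ± 27%, large forum: 59% ± 5%
print(f"small forum: {small[0]:.0%} ± {small[1]:.0%}")
print(f"large forum: {large[0]:.0%} ± {large[1]:.0%}")
```

The small forum's "better" 67% comes with so much uncertainty that it tells us almost nothing, which is exactly why comparing raw percentages across forums of very different sizes is misleading.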

I'd like to follow up this post with some new proposed metrics I've been thinking about, but I'd like to kick off the discussion without tainting it (or making this post any longer...)

If you had to track just two or three numbers to monitor the health of an online community, what would you track?