At Microsoft, there is an obsession with measurement. If you can't measure it, it doesn't exist. As a result, we set up data collection mechanisms and try to interpret the data, even when the data isn't what we're really interested in; we act as if it is, because measuring is what we know how to do. (If all you have is a hammer...)

A classic example of this is trying to gauge the impact of blogging. Microsoft employees who are considering taking up the practice ask questions like this:

I want to measure the impact of my blog. I'd like to put a survey at the bottom of my blog that asks people "Did this blog posting prevent a call to Microsoft product support?" or "Was this blog posting helpful?" or "Rate this blog posting on a scale of 1 to 10." Then I can generate reports based on what people think so I can see how effective I am. Somebody in sales might ask "Did this blog posting convince you to buy a Microsoft product?" A developer might ask "Did this blog posting help you integrate your third-party product with Microsoft Windows?"

This smells like "I must make this quantitative and measurable so I can make it a review goal to increase my blog's 'impact' by 25%." In my opinion, blogging isn't like that. Blogging is more about creating an atmosphere. Sure, individual entries may solve specific problems, but the cumulative effect is the goal. Using a survey to measure the impact of a blog entry is like asking somebody to fill out a survey after you give them a ride home, so you can determine how much that one act affected their opinion of what a nice person you are.

Questions about measuring the impact of blogs will never go away, because Microsoft is all about measurement. Many people believe that if you can't measure it, then you can't claim credit for it on your annual performance review.