A former colleague of mine sent me an email last week thanking me for a book suggestion I had given him. He followed up by asking if I could give him some pointers on measuring quality.

I think a lot of testers attempt to measure quality – or think they are measuring quality – but in every case I can think of, they’re really measuring attributes (functionality, performance, compliance, etc.). My stance is that you can (and should) measure a number of criteria, but those criteria are all you are measuring – you’re not measuring quality.

Imagine if you could measure quality. Someone could ask “how is the quality of the current release?”, to which you could answer “we currently have 85% quality.” “Whew – only 15% more quality to go!”

Perhaps I get quality and value confused. Customers say they want quality, but I think what they really want is value. Customers don’t care if the pass rate was 99.98% and the product survived 100 hours of stress testing. They want something that works for them whenever they need it to. Maybe you can measure quality, and it’s value that you can’t measure. Since I don’t think I have any readers in “Value Assurance” roles, let’s assume that we’re trying to measure quality. How would you answer these questions?

  • Does a 100% test pass rate mean you have a quality product? 
  • Does 100% code coverage mean you have a quality product?
  • Does flawless performance after 1000 hours of stress testing mean you have a quality product?

Most seasoned engineers would say something like “not necessarily, but it is a positive metric” (disagreements welcome). If that’s the case, how do you measure quality?

Can you measure quality?

(note the single spaces after periods in this post)