Last week I asked you to see how many different interpretations of this sign you could devise. After an hour or so I had the following:
A number of questions occurred to me as I worked this exercise, which I believe are further evidence of the sign's ambiguity:
I started this process with an unstructured brainstorming session. When that ran out of steam I called on Jerry Weinberg's Mary Had A Little Lamb heuristic: I read the sign multiple times, emphasizing a different word or set of words each time. When *that* ran out of steam I called on Jerry's Mary Conned The Trader heuristic: I used synonyms and alternate definitions of each word in the sign as a jumping-off point for additional brainstorming. Voilà: multitudes of interpretations of the sign!
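The Mary Had A Little Lamb heuristic is mechanical enough to sketch in a few lines of code. This is just an illustration, not anything from Jerry's book: it takes a phrase (the nursery-rhyme line stands in for your sign or requirement) and produces one reading per word, shouting a different word each time.

```python
def emphasize_each_word(phrase):
    """Yield one variant per word, with that word upper-cased for stress.

    Reading each variant aloud forces a different interpretation of the
    phrase -- the essence of the Mary Had A Little Lamb heuristic.
    """
    words = phrase.split()
    for i in range(len(words)):
        variant = words[:]          # copy so each reading starts fresh
        variant[i] = variant[i].upper()
        yield " ".join(variant)

for reading in emphasize_each_word("Mary had a little lamb"):
    print(reading)
# MARY had a little lamb   (as opposed to whom?)
# Mary HAD a little lamb   (she doesn't anymore?)
# ...and so on, one stressed word per line
```

Each stressed word suggests a different question, which is exactly the brainstorming fuel the heuristic is meant to generate.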
"And how does this apply to testing?" I took thirteen words and found forty different interpretations. I wager that the specification for your feature contains rather more than thirteen words and so has rather more than forty possible interpretations. Searching out and highlighting these ambiguities - testing the specification, in other words - is one way testers can add value at those points in the project when there isn't any code to test. If these differences in understanding are not surfaced, each person is likely to work from a different interpretation, and difficulties will likely ensue. Bringing these assumptions to light allows the team to discuss them and agree to work from a single interpretation.
If you are interested in learning more about the grief ambiguity can cause as well as strategies for reducing ambiguity, pick up the excellent Exploring Requirements, which Jerry cowrote with Don Gause. It will be worth your while!
*** Want a fun job on a great team? I need a tester! Interested? Let's talk: Michael dot J dot Hunter at microsoft dot com. Great testing and coding skills required.