Once upon a time, I thought testing was about finding bugs.
Once upon a time, I thought I should be able to find every bug in my product.
Once upon a time, I thought testing was about ensuring no bugs escaped to customers.
Once upon a time, I thought every single test should be automated.
One day I stopped testing for a while and thought about why I believed these things. I started by making a list of every reason I could think of to automate a test. It was a long list. Reviewing it, however, I realized that every reason boiled down to one of two basic ones.
This got me thinking about why I was testing in the first place. Soon I realized that I wasn't testing to find bugs - I was testing *because* defects had been found and the team wanted to know how many other defects were present.
Upon further consideration I realized that was not exactly correct. I had learned through experience that I would never find every defect. I had also learned through experience that my management did not expect me to find every defect.
So why was I testing?
Aha! I was testing so that I could provide my opinion as to whether the product was ready to ship or not!
Upon further consideration I realized that was not exactly correct. I had learned through experience that my opinion as to whether the product was ready to ship might be overruled by people up my management chain.
So why was I testing?
Several similar cycles later, I came to a conclusion:
My team is building a product. The team is composed of human beings. Human beings are fallible and make mistakes, thus the team is fallible and will make mistakes. Some of these mistakes will take the form of defects. Some of these defects will prevent our product from serving its intended purpose well enough to meet the business goals my team has for our product. I am testing in order to provide information regarding how well our product serves its intended purpose. This information is used by people up my management chain to decide whether shipping our product or taking additional time to refine our product will provide the most business value.
Once I spelled this out all sorts of things suddenly made sense. For example, "refining" might mean fixing defects. It might also mean adding additional features, or expanding existing features, or cutting completed features. Now I understood why each of these might occur a week before our scheduled ship date. Now I also understood why we might ship with what I considered heinous problems.
With this realization I started re-evaluating everything I did in terms of business value. My quest to reduce the cost of UI automation stemmed in part from this, because lowering that cost meant my team and I could complete more testing in a shorter amount of time and thus provide deeper information to the people up our management chain more quickly. And in fact that has turned out to be true.
Of late, however, I find myself thinking that continuing this quest may not be worth the investment. The changes we have wrought seem small to me, especially in the face of the exponentially exploding complexity of software today. I find myself questioning the business value of all the time I spend automating tests, updating them to keep up with the product they are testing, and fixing defects in them and in the infrastructure they use. This time seems to me better spent using my brain to identify the biggest risks to the business value my product is meant to create, working to prevent those risks from becoming reality, exploring my product in search of triggers for those risks, and - yes - crafting automated tests as seems appropriate.
Of late, however, I find myself questioning the business value of even this approach. I do not see how it can keep up with the exponentially exploding complexity of the software which will be here tomorrow. I feel as though there is a quantum leap I can make which will put me ahead of this curve. I have not found it yet. I continue to search.
If you have any ideas how to find it, please let me know!
*** Want a fun job on a great team? I need a tester! Interested? Let's talk: Michael dot J dot Hunter at microsoft dot com. Great testing and coding skills required.