A while ago I said that a professional tester should split his or her time between testing the product and thinking about how to improve upon the current approaches to testing. Now that Orcas Beta1 is out, the emphasis for many testers is on running and re-running all of their automated tests to ensure that the product maintains a high bar of quality. In addition to that, however, it is time to begin thinking about how we can improve upon our current approaches to testing.
Specifically, it is time to think about how we can achieve four important goals:

1. Improving the quality of the product
2. Improving the quality of our testing
3. Reducing the time it takes to analyze automation
4. Reducing the time it takes to create automation
These four goals, of course, don't represent the only dimensions on which you can optimize your testing efforts, but they do represent the key areas. In the coming weeks I'll be sharing with you my plans for how we will achieve these four goals for the Visual Basic test team here at Microsoft. My suspicion is that while the challenges in VBQA are unique to Microsoft, the solutions should be applicable to all testers.
For now, let me just summarize what these goals entail.
Improving the quality of the product
While testing the product can uncover plenty of software defects, knowledge of those defects doesn't by itself translate into a better product. (That is, you have to fix the bugs too.) But there are some types of testing you can put into place that do result in a better product at the end. Ensuring that all code is FxCop clean is a great example of this, but testers can do more than just force developers to write unit tests...
Improving the quality of your testing
The tendency a lot of testers have (myself included) is to simply keep adding tests as a product grows without really thinking about what you are getting in return. By considering what your automation actually buys you in terms of verification and validation, you can find ways to get more from your testing investment. The post on this topic will center on understanding your test plan and how best to organize it for the biggest impact.
Reducing the time it takes to analyze automation
If you have ever had to deal with a lot of legacy tests, you know that the signal-to-noise ratio for automation can be pretty low. This is true even for new tests if the product's UI or core behavior changes often. Fortunately, there are many things you can do to make your automation more robust and reliable. And even when your tests do fail, there are some great techniques you can employ to automate the task of analyzing test-case failures.
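One such technique is bucketing failures by a normalized error signature, so that a hundred test failures caused by the same underlying issue only need to be investigated once. Here is a minimal, language-agnostic sketch of the idea in Python; the test names and failure messages are hypothetical, and a real harness would pull them from its own results log:

```python
import re
from collections import defaultdict

def signature(message):
    """Normalize a failure message into a bucket key by stripping
    run-specific details (hex handles, numbers, file paths)."""
    msg = re.sub(r"0x[0-9a-fA-F]+", "<addr>", message)
    msg = re.sub(r"\d+", "<n>", msg)
    msg = re.sub(r"[A-Za-z]:\\\S+", "<path>", msg)
    return msg

def bucket_failures(failures):
    """Group (test_name, message) pairs by normalized signature so a
    human only inspects one representative failure per bucket."""
    buckets = defaultdict(list)
    for test, message in failures:
        buckets[signature(message)].append(test)
    return dict(buckets)

# Hypothetical results from a nightly automation run.
failures = [
    ("OpenProject_01", "Timeout after 300 seconds waiting for window 0x00A3"),
    ("OpenProject_02", "Timeout after 312 seconds waiting for window 0x01F7"),
    ("BuildSolution_01", "Assert failed: expected 4 errors, got 5"),
]
buckets = bucket_failures(failures)
# The two timeouts collapse into a single bucket, leaving two
# distinct signatures to analyze instead of three raw failures.
```

The payoff grows with scale: when one product regression fails dozens of tests with near-identical messages, a signature-based roll-up turns a morning of log reading into a glance at a handful of buckets.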
Reducing the time it takes to create automation
The title pretty much sums it up. Even if you write stellar automation, if you can't outpace new feature development you're sunk. This post will cover some design patterns and philosophy for structuring test plans so that automating those tests is a breeze.
So there you have it. I look forward to making targeted improvements in VBQA to address all four of these issues, and hopefully by discussing those improvements here you'll be able to apply them to your own work.