Traditionally, p&p's primary audience has included developers and architects. Even though we have some coverage of the testing discipline (in particular, the Testing .NET Application Blocks - Version 1.0 guide and Performance Testing Guidance for Web Applications), it is a small portion of the p&p assets [link to catalog]. In our interactions with customers, we hear requests for good guidance on testing – all kinds of testing.
So, back in the fall of 2007, I put a number of projects related to test engineering and test automation on the patterns & practices backlog. Several other important projects took precedence (GAT/GAX, Unity, Enterprise Library). Now, after shipping GAT/GAX 1.4 and Unity 1.0, I am happy to kick off this project with a focus on acceptance test engineering.
The core team consists of Michael Puleio, Jon Bach, and me, Grigori Melnik. Michael is not a tester but a great developer with a passion for testing, test automation, and test tools. Jon is a professional tester; he is the manager of corporate intellect at Quardev Labs and a co-inventor of session-based test management for managing and measuring exploratory testing. I have devoted a number of years to researching executable acceptance test-driven development (with FIT) and the relationship between software requirements and acceptance tests (see this article for my stance on the subject).
The topics we plan to focus on in this project include:
- test objectives and strategy,
- product readiness/acceptance,
- defining and reconciling good-enough criteria in various industrial contexts,
- working with customers and customer-proxies,
- supporting stories/requirements with acceptance tests.
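To give a flavor of that last topic, an executable acceptance test turns a story's acceptance criteria into customer-readable examples that run against the system. A minimal sketch in Python (the discount story, thresholds, and `discounted_total` function are all hypothetical illustrations, not material from the guide; FIT would express the same examples as a table driving a fixture):

```python
# Hypothetical story: "Orders of $100 or more receive a 5% discount."

def discounted_total(order_total):
    """Stand-in for the system under test: apply a 5% discount at $100+."""
    if order_total >= 100:
        return round(order_total * 0.95, 2)
    return order_total

# Each tuple is one row of the customer's example table:
# (order total, expected total after discount)
examples = [
    (99.99, 99.99),    # just below the threshold: no discount
    (100.00, 95.00),   # at the threshold: discount applies
    (200.00, 190.00),  # well above the threshold
]

for total, expected in examples:
    assert discounted_total(total) == expected, (total, expected)

print("all acceptance examples pass")
```

The point is not the harness but the shape of the collaboration: the customer supplies the example rows, and the team wires them to the system so the story's "done" criteria run at the push of a button.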
We intend to support our guidance with case studies and exercises from the real world.
We are running this project as an agile project, with weekly iterations, standups, a collocated team, etc. This deserves a separate blog post, which I'll probably write later this month.
In the meantime, feel free to post your comments and thoughts on any specific (painful) aspects of acceptance testing that you would like help with. Also, if you have an interesting experience with acceptance testing that you'd like to share and perhaps be profiled in our guide as a case study, we'd like to hear about it.
FIT is a great tool; I wish MS p&p would extend it (or create a similar tool). I would suggest incorporating it into MOSS or Reporting Services. My challenge today: how would I test for data quality in my database?
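One way to frame such data-quality checks is as executable acceptance tests that query the database and assert that no rows violate the agreed rules. A sketch using Python's standard sqlite3 module purely for illustration (the `customers` table and the rules are hypothetical; the same pattern applies with any database driver, and each rule could equally be a row in a FIT table):

```python
import sqlite3

# In-memory database standing in for the real one.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, age INTEGER)"
)
conn.executemany(
    "INSERT INTO customers (email, age) VALUES (?, ?)",
    [("a@example.com", 34), ("b@example.com", 29), ("c@example.com", 61)],
)

def count_where(condition):
    """Count rows matching a SQL condition (used to count rule violations)."""
    return conn.execute(
        f"SELECT COUNT(*) FROM customers WHERE {condition}"
    ).fetchone()[0]

# Data-quality rules expressed as acceptance criteria:
# (description, SQL condition that counts violating rows).
rules = [
    ("emails must be present",         "email IS NULL OR email = ''"),
    ("emails must contain '@'",        "email NOT LIKE '%@%'"),
    ("ages must be plausible (0-120)", "age < 0 OR age > 120"),
]

for description, violation in rules:
    violations = count_where(violation)
    assert violations == 0, f"rule failed: {description} ({violations} rows)"

print("all data-quality rules pass")
```

Expressing each rule as "the count of violating rows must be zero" keeps the check declarative, so a domain expert can review the rule list without reading the harness.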
I read the article regarding FitNesse, and I am curious whether anyone has experience using FitNesse in an environment where user interaction is not completely text-based. We have found these tests to be incredibly high-maintenance with a rapidly changing program.
If anyone has suggestions, I would love to hear them!