Today I present guest writer Rich Grutzmacher, Program Manager on the Office User Experience team. He helped coordinate one of the long-term, real-world studies conducted on early Office 2007 builds.
For eight months, we followed the progress of a group of twelve people at a Fortune 500 company here in the Pacific Northwest who used Office 2007 to perform their daily work. These people graciously agreed to install Office 2007 Beta 1 on their main work computers and to let us track their feedback and impressions of the software over a long period of time.
The participants in the study were not software engineers or in any way associated with high-tech; they were just normal people who use Office to help get their jobs done.
During this extended usage study, we personally interviewed and videotaped each participant every 2 to 3 weeks to understand how they were using Office 2007 to perform their everyday work tasks. The participants in the study also sent us feedback using Send a Smile whenever they encountered a problem or discovered something about the product that they liked.
Given the issues one encounters while using early beta software, we considered nominating these participants for sainthood. Their willingness to endure the pain of using a beta product for an extended period of time to perform meaningful work provided us with valuable insights into how typical Office customers learn and use the Office 2007 user interface.
In this series of articles, I will share with you some of the lessons we learned from this study and how the information was used in combination with many other sources of customer research to improve Office 2007 before Beta 2 got out the door.
Now before you jump out of your chair and scream, "ONLY TWELVE USERS," let me explain a bit more about the purpose of this study.
Yes, the sample size for this specific study was small. And yes, the study only included people in one company in the Pacific Northwest. It is true that for these reasons the results of this study aren't statistically significant.
But the purpose of this study wasn't to provide quantitative results. Rather, the goal was to employ ethnographic research strategies that would allow us to understand qualitatively how the new Office 2007 user interface fit into the daily work lives of typical Office customers.
Several factors that contribute to the overall Office user experience cannot be accurately measured through satisfaction surveys or through two-hour structured studies in our usability labs.
A big part of our research strategy for the new user interface is the concept of triangulation: collecting data and feedback from many sources, with many different kinds of studies, and with widely varying sample sizes and time periods. This particular study was designed to watch how Beta 1 fared not just in the first hour or the first week, but over months and months of use.
We went to great lengths to select participants for this study who encompassed a range of characteristics found in Office customers and who used Office applications to perform a broad range of work-related tasks. For instance, we wanted some people who mostly used Word, and others who lived in Excel or PowerPoint. We looked for people with varying proficiency with Office and people in different kinds of jobs.
This selection process, combined with the high-touch methodologies we used, gave us incredibly valuable insight into how the user interface changes would fare in the real world--eight months before we would usually have even started to gather this kind of feedback. And because we got the feedback early, we were able to address deficiencies in the product and then measure the impact of the improvements in future betas.
Next time: Thoughts about training and migration tools from the study.