One of the main roles of the researcher is to provide quality, reliable
data to the product team. However, it is not quite as easy as it may
seem. Researchers, for lack of appropriate information or skill, run the
risk of generating low-quality data. There are multiple strategies that
can be used to ensure the highest quality of the data being collected,
including careful review of the research protocol and a generally
skeptical approach to one's research methodology. In this post, I'll
introduce the underlying reason why good researchers should always cast
a critical eye on the data they are being fed, and apply the same
scrutiny to their own data. I will then discuss some common threats to
good data collection I have encountered in recent weeks, in particular
the threat of bad recruitment.
Product units are so often caught up in results, and in the details that add up to results, that we sometimes forget the value proposition of the bigger picture: the overall User eXperience (UX). At Microsoft, we are increasingly working with UX professionals (designers, researchers, and product managers) to improve value and satisfaction for our customers. In the case of our team, Library eXperience (LEX), engaging with the UX team brings value on three levels:
Additionally, UX provides resources such as usage data to validate designs, and to better understand users' behaviors and attitudes in order to inspire and ground decisions (more on the three values of UX Research in an earlier post by Yann Riche).
The Visual Studio 2010 Help Viewer is a great example. As a result of group consolidation partway through a development cycle, the LEX team became the owners of a component in a shipping product, with the Beta release only months away. The Help component of VS 2010 had Executive sponsorship, meaning the team's charter was very clear: execute on expectations (see Jeff Braaten's "The Story of Help in Visual Studio 2010" for more details). After the Visual Studio 2010 release, LEX heard from the community about the unexpected impact the loss of Document Explorer had on developer productivity. The promise was made that we (LEX) would address this in the Service Pack (Dev 10 SP1). The race was on. We analyzed Document Explorer with the goal of distilling out only the key functions that were critical to support (the timeline and resources were relatively fixed). Were the results the right features? What did developers really need to support their local Help productivity? We decided to address these questions with three parallel and mutually supporting efforts:
With UX guidance, engaged our MVP customers in three Live Meetings to get their feedback throughout the design and development process
With UX feedback and wireframes, built a model: a prototype illustrating the features (resulting from the Document Explorer analysis) to show and use for feedback
Engaged UX to provide user studies and analysis
The first task for UX was to help us evaluate the value proposition. We asked the question, "does this effort map to usage patterns – will the envisioned application actually add value to the end user experience?" UX evaluated the Visual Studio 2010 release feedback specific to the Help Viewer, and decided that building a model and engaging the MVPs was a good start ("start" is the key word here). UX also provided guidance with planning and running our MVP meetings.
In parallel, we started working on the prototype. During this process, UX provided guidance from day one. We sent the list of what we thought were key Document Explorer features, and UX responded with analysis and wireframe proposals that allowed the developers to start building the prototype. During the development process, we held regular review meetings with UX for feedback (concerning both usability and general UI) and for continuous feature re-prioritization. Across this entire process, we also gathered feedback from Executive reviews, dogfooding, community forums, and blog comments. Each actionable input was considered in the feature work prioritization stack. With the progressive addition of bugs and potential features, we developed a cadence of periodically re-evaluating our feature priority stack, driven by user needs and desires.

In our first two meetings with MVPs, we presented the problem and our proposed solution set of features for feedback. In our final Live Meeting with MVPs, we presented them with the opportunity to use a functional prototype of the proposed Help Viewer application for evaluation and feedback. MVPs showed interest in learning our plans, and provided some useful information to help us prioritize features. These meetings also allowed the MVPs to confirm that all important features had been considered, and established a communication forum through which we could gather feedback on future efforts.
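The periodic re-evaluation of a feature priority stack described above can be thought of as re-sorting features by a value-per-cost score each time new feedback arrives. Here is a minimal sketch of that idea; the feature names, costs, and scoring rule are purely hypothetical illustrations, not the team's actual process:

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    name: str
    cost: int  # rough engineering cost, arbitrary units (hypothetical)
    votes: list = field(default_factory=list)  # +1 per user mention in feedback

    def score(self) -> float:
        # Value per unit cost: more mentions raise priority, higher cost lowers it.
        return sum(self.votes) / self.cost

def reprioritize(stack):
    """Re-sort the stack by descending score, as a team might after each
    round of MVP meetings, dogfooding, and forum feedback."""
    return sorted(stack, key=lambda f: f.score(), reverse=True)

# Hypothetical stack of candidate Help Viewer features.
stack = [
    Feature("keyword-index", cost=5, votes=[1, 1, 1, 1]),
    Feature("full-text-search", cost=8, votes=[1, 1, 1, 1, 1, 1]),
    Feature("favorites", cost=3, votes=[1]),
]
for f in reprioritize(stack):
    print(f.name, round(f.score(), 2))
```

The point of the sketch is only that prioritization is recomputed from aggregated user signals rather than fixed up front; any real scoring would weigh feedback sources and strategic goals far more carefully.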
During the same period, UX conducted a series of usability studies for the new local Help Viewer, which led to refinements in the current visual and interaction design of the application, and allowed the team to gain a much better understanding of how different types of customers rely on different features. For example, a keyword index is key to the productivity of expert professional developers, who rely on their extensive experience to retrieve information instead of re-finding it. Search is key to opportunistic developers, who would rather re-find information than store it when there is no clear indication it will be useful later. We also found that the lack of auto-completion and in-editor suggestions (IntelliSense) in C++ led those developers to rely particularly on the index, and to have very specific behaviors around accessing reference material. The interesting point was that there was no single set of features for "developers"; rather, each developer work style (developers vs. architects, opportunistic vs. systematic programmers, etc.) has its own set of needs. Based on usage pattern data, we needed to develop a prioritized set of features for our varied market segment needs.
Overall, UX helped us design our customer engagement strategy and worked with us throughout the planning, development, and release process. UX provided usage data from usability studies that allowed better evaluation and refinement of the local Help Viewer value proposition, the initial result of which was the new local Help Viewer in SP1. In the LEX team, we are continuing to engage with UX for future releases of our help assets, adopting an "early and often" approach to gathering user feedback and iterating on the proposed designs.
How does "Program Management" define the value UX brings to the application development process? Tough question, and one that I will not attempt to answer on behalf of the "Program Management" world as a whole. I will say that within the LEX team, we see UX value across our customer-facing work. This includes helping us prioritize features based on user need, with that prioritization premised on user studies and other customer engagement (with MVPs, in this case). It also includes UI guidance, from meeting internal graphic standards to usability standards. In summary, we see UX as an integral partner in the solution development process.