How We Evaluate the Experiences We Engineer

IEBlog

Internet Explorer Team Blog


How do you know when an experience is ready for consumers? This is something we ask ourselves all the time. In this post, we’ll cover how we set our experience goals for IE9 and how we measured (and continue to measure) our progress toward these goals throughout the development cycle.

We set experience goals for all of the products we ship. These goals operate at both the product level and the “experience” level (a meaningful unit of experience for people, rather than an individual feature). We organize these goals in what we call a “Confidence Model,” which evaluates product experiences across four dimensions – Useful, Usable, Desirable, and Principled. For each experience, across each dimension, we maintain a “confidence rating” of how close we are to meeting (or exceeding) the experience goals. Useful, usable, and desirable are common measures of experience across industries. In addition to those standards, we also evaluate our experiences against the Windows Experience Principles (more on these later). This is all an evolution of the process we described in the Windows 7 Engineering Blog.
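To make the shape of a confidence model concrete, here is a minimal sketch in Python. The class names, the 0.0–1.0 rating scale, and the readiness threshold are all assumptions made for illustration; this is not the team's actual tooling.

```python
from dataclasses import dataclass, field
from enum import Enum

class Dimension(Enum):
    USEFUL = "Useful"
    USABLE = "Usable"
    DESIRABLE = "Desirable"
    PRINCIPLED = "Principled"

@dataclass
class Experience:
    name: str
    # One confidence rating per dimension, from 0.0 (no confidence)
    # to 1.0 (fully confident the goal is met or exceeded).
    ratings: dict = field(default_factory=lambda: {d: 0.0 for d in Dimension})

    def rate(self, dimension: Dimension, rating: float) -> None:
        # Clamp ratings to the [0.0, 1.0] scale.
        self.ratings[dimension] = max(0.0, min(1.0, rating))

    def ready(self, threshold: float = 0.8) -> bool:
        # An experience is "ready" only when every dimension clears the bar.
        return all(r >= threshold for r in self.ratings.values())

nav = Experience("Navigation")
nav.rate(Dimension.USEFUL, 0.9)
nav.rate(Dimension.USABLE, 0.85)
nav.rate(Dimension.DESIRABLE, 0.9)
print(nav.ready())  # Principled is still at 0.0, so False
nav.rate(Dimension.PRINCIPLED, 0.95)
print(nav.ready())  # True
```

The key idea the sketch captures is that readiness is a conjunction: a high rating on one dimension cannot compensate for a low rating on another.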

What we mean by “Useful, Usable, Desirable, and Principled”

When we think of experiences on these four dimensions, we have specific definitions in mind:

“Useful” is the notion of value to people. We want the experiences in IE9 to help people do something they couldn’t do before; to save them time, energy, and/or effort; and to make things noticeably better in ways that matter to people.

“Usable” is what people typically think of when considering the “usability” of a product, site, or device. We measure how usable an experience is on the standard factors – task success, time to completion, ease of use and setup, comprehension, confidence, security, and so on.

“Desirable” is how well an experience evokes the intended emotional response and perception. We have general aspirations for any experience (e.g., love), but we also choose specific emotions each experience should elicit (e.g., feeling in control, efficient, connected, or that things are clean).

“Principled” is how well an experience expresses the Windows Experience Principles. We review each experience at regular points in the development cycle to make sure it expresses our Principles. These principles are inspired by data, informed by our values, and tweaked by experience. They are not strict rules, but rather values we aspire to, and they shape all the experiences we design into our products. We first talked about our “design principles” for Windows 7 back at PDC 2008. Since then, we have evolved them into our current Experience Principles. Later in this post, we’ll cover how we used one of these principles to inform a design decision in IE9.

How we set and track goals

Across those four dimensions, we set goals for each experience. These goals have multiple inputs, including previous experience research, design explorations, and technical investigations. The goals are generated, iterated on, and agreed upon by the entire engineering team. They become the team-wide agreement on “what success looks like” for an experience and the bar we all hold ourselves to. We then establish specific metrics for tracking progress toward each goal.

For example, here are some of the specific Useful goals for the general browsing experiences. People articulate the value of having a fast and fluid browsing experience. This means:

  • They can get to their sites in fewer steps than in previous versions of IE
  • They see that pages load faster than previous versions of IE
  • They save time on their common and frequent tasks compared with previous versions of IE
  • They are aware of what is slowing down their experience

Progress toward these goals is measured primarily through qualitative feedback from participants in our research labs and also through community feedback.

After goals for each experience are set, we track them throughout the development cycle. For IE9, as with all our products, we used a large set of research methods, but there are three we rely on the most – lab studies, field research (e.g., visits to people’s homes), and usage instrumentation. Each of these methods has its own strengths, and each gives us a different lens for understanding the experiences we are building.

During IE9 development, we conducted six different lab studies with about 60 participants. Lab studies are great for understanding experiences as we move from prototypes to working code, and they allow us to control for influences that might bias our results.

In addition to the lab studies, we went on many site visits to people’s homes. These visits give us a look at experiences over time, in everyday settings (e.g., living rooms), and produce specific examples of how people actually use our products in their lives. The insights we gained on our initial site visits in the summer of 2009 – around how people were using web sites like their other applications – were highly influential on our plans for what eventually became Pinned Sites.

Lastly, usage instrumentation of how people currently use IE8 and IE7 gave us the huge datasets (tens of millions of people and hundreds of millions of sessions, as Dean mentioned in his post) required to know the common and frequent behaviors and patterns of usage with our products. This data informed our decisions about which behaviors to optimize for in IE9.
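The usage-instrumentation step can be sketched in miniature: aggregate per-session action logs and surface the most frequent behaviors. The session records and action names below are invented for illustration; real browser telemetry is far richer than this.

```python
from collections import Counter

# Hypothetical per-session action logs; each inner list is one browsing
# session. These names and records are made up for the example.
sessions = [
    ["open_tab", "navigate_address_bar", "open_tab", "close_tab"],
    ["navigate_address_bar", "navigate_address_bar", "open_tab"],
    ["open_tab", "navigate_history", "navigate_address_bar"],
]

def frequent_actions(sessions, top_n=3):
    """Return the top_n most frequent actions across all sessions."""
    counts = Counter(action for session in sessions for action in session)
    return counts.most_common(top_n)

print(frequent_actions(sessions))
```

At real scale (hundreds of millions of sessions), the same counting would run as a distributed aggregation, but the question it answers is the same: which behaviors are common and frequent enough to be worth optimizing for.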

We are constantly evaluating our experiences with these methods, and the team always has a pulse on whether we are on the right trajectory to meet our goals. We make adjustments to our products and experiences based on our research, something we call “data-informed decision making.”

Goals for Internet Explorer 9

As Jane described in her post, we started with three overarching goals for the experiences in Internet Explorer 9 – Sites Shine, A Natural Extension of Windows 7, and Fast, Safe, and Reliable. Throughout development, we evaluated our progress against both these overarching goals and specific goals for different experiences.

For example, here are some of the specific Usable goals for our navigation experiences:

People can get where they want to go fast. This means people:

  • Successfully get to sites they want to visit
  • Successfully get to sites they have previously visited within a timeframe that meets or exceeds expectations
  • Accurately articulate when a page is finished loading
  • Successfully pin sites

It’s important to note that these goals are not for a specific feature. They span many features – the Address bar, the New Tab Page, Pinned Sites, and progress indication – because all of these contribute to the experience of navigating to sites with the browser.

For the last goal of “Successfully pin sites,” we originally had problems with the design. You could pin by dragging the icon from the Address bar to the Taskbar or by dragging the site from the New Tab Page, but not by dragging the tab the site was in to the Taskbar. Through our research, it became clear that this was one of the top ways people attempted to pin sites when first trying the feature. This video clip shows one representative participant whose first instinct was to drag tabs to the Taskbar to pin them (the red dot is from some eye-tracking research we were doing during that study):

Based on this research, we knew that enabling pinning sites by dragging the tab to the Taskbar would meet people’s expectations and remove one more hurdle to people having the sites they love, need, and want at their fingertips. We then verified this decision with further research. Here is a participant pinning a site by dragging a tab to the Taskbar in a later build:

Over the course of these studies, our confidence that we were on track increased as we made changes to the design and verified we were improving the experience. This occurred across most of the features of the product. This is just one example.

For another example, here are some of the specific Usable goals for our fluid browsing experiences (different from the ones for the navigation experiences above):

People can fluidly move among sites. This means people:

  • Successfully see sites side-by-side (with or without tabs)
  • Successfully find the functionality they use most often
  • Successfully get to their homepage after navigating to different sites
  • Successfully recognize whether a webpage is secure or not without prompting
  • Successfully queue multiple tabs

For the first goal of “Successfully see sites side-by-side (with or without tabs),” we evaluated many different designs to accomplish this goal. One of the tools we used to choose the design we built is the Experience Principle of “Reduce concepts to increase confidence.” This Principle is about taking advantage of what people already know and introducing new concepts only when necessary. We try to make only meaningful distinctions among concepts that people will understand and get value from.

As described in Jane’s earlier post, you can drag tabs out of Internet Explorer windows and directly Snap them to one side of the screen or the other on Windows 7. We deliberately built on what people were already using in Windows 7 and extended that experience to Internet Explorer 9. No relearning, no new concepts. People who use Windows 7 Snap already know how to use Snap with Internet Explorer 9.

These are just two examples of how we set goals for experiences and then evaluate their progress on the dimensions of Useful, Usable, Desirable, and Principled. We do this for all of the experiences and products across Internet Explorer, Windows, and Windows Live.

Setting goals for experiences across the four dimensions – Useful, Usable, Desirable, and Principled – and tracking our progress toward these goals across the development cycle is an important part of shipping the experiences we believe in. We hope you enjoy using IE9 beta. Please send your feedback in the comments and on Connect.

Jess Holbrook
User Experience Research Lead
Internet Explorer
