We’ve come a long way in engineering Windows 7 since we first provided an engineering preview of the work we are doing to support the touch interface paradigm back at the D: All Things Digital conference. We chose to kick off the discussion about engineering Windows 7 with touch scenarios because we know this is a long-lead effort that requires work across the full ecosystem to fully realize the benefit. For Windows 7, touch support builds on the advances in input technology that began with the Tablet PC work on Windows XP. Touch in Windows 7 requires improvements in hardware, driver software, the core Windows user experience, and of course application support. By having this support in an open platform, consumers and developers will benefit from a wide variety of choices in hardware, software, and PC form factors.

Quite a few folks have been a little skeptical of touch, often commenting about having fingerprints on their monitor or something along those lines. We think touch will become broadly available as the hardware evolves, and while it might be the primary input for some form factors (such as a wall-mounted display in a hospital, kiosk, or point of sale), it will also richly augment many scenarios, such as reading on a convertible laptop or a “kitchen PC”. One of my favorite experiences recently was watching folks at a computer retailer try one of the currently available all-in-one touch desktops and then move to another all-in-one and continue to interact with the screen—except the PC was not interacting back. The notion that you can touch a screen seems to be becoming second nature rather quickly.

This is our first post dedicated to the subject. It is a joint effort by several people from the touch team, mostly Reed Townsend, Dave Matthews, and Ian LeGrow.

-Steven
Windows Touch is designed to enhance how you interact with a PC. For those of us that have been living and breathing touch for the last two years we’re excited to be able to deliver the capability to people using Windows 7. In this blog we’re going to talk about what we’ve done to make Windows touchable. We approached this from a number of different directions: key improvements to the core Windows UI, optimizing for touch in key experiences, working with hardware partners to provide robust and reliable touch PCs, and providing a multitouch platform for applications.
With Windows 7 we have enriched the Windows experience with touch, making touch a first-class way to interact with your PC alongside the mouse and keyboard. We focused on common activities and refined them thoughtfully with touch in mind. You will have the freedom of direct interaction, like being able to reach out and slowly scroll a web page then flick quickly to move through it. With new touch-optimized applications from creative software developers you will be able to immerse yourself as you explore your photos, browse the globe, or go after bad guys in your favorite games.
While providing this touchable experience, we made sure you are getting the full Windows 7 experience and not a sub-set just for touch. We’ve been asked if we are creating a new Touch UI, or “Touch Shell” for Windows – something like Media Center that completely replaces the UI of Windows with a version that is optimized for touch. As you can see from the beta, we are focused on bringing touch through the Windows experience and delivering an optimized touch interface where appropriate. A touch shell for launching only touch-specific applications would not meet customers’ needs – there would be too much switching between “touch” mode and Windows applications. Instead, we focused our efforts on augmenting the overall experience so that Windows works great with touch.
We took a variety of approaches to support this goal – some broad, and some very targeted.
Overall, the Windows Touch features are designed to work together to deliver a great end-to-end touch experience. For example, the goal with IE8 was to deliver a seamless touch browsing experience; this includes panning, zooming, URL entry, and several interface enhancements. For this reason, all the new touch features require the presence of a multi-touch digitizer – more on that further down.
The Windows Touch gestures are the basic actions you use to interact with Windows or an application using touch. As we noted above, because the gestures are built into the core of Windows, they are designed to work with all applications, even ones that were never designed with touch in mind.
Our mantra with gestures has been “Predictable + Reliable = Habits”. To be predictable the action should relate to the result – if you drag content down, the content should move down. To be reliable, the gesture should do roughly the same action everywhere, and the gesture needs to be responsive and robust to reasonable variations. If these conditions are met then people are far more likely to develop habits and use gestures without consciously thinking about it.
We’ve intentionally focused on this small set of system-wide gestures in Win7. By keeping the set small we reduce misrecognition errors – making them more reliable. We reduce latencies since we need less data to identify gestures. It’s also easier for all of us to remember a small set! The core gestures are demonstrated in the video below.
For touch gestures, seeing them in action is important so here is a brief video showing the gestures in action:
In order to make the gestures reliable, we tuned the gesture detection engine with sample gesture input provided by real people using touch in pre-release builds; these tuned gestures are what you will see in the RC build. We have a rigorous process for tuning. Similar to our handwriting recognition data collection, we have tools to record the raw touch data from volunteers while they perform a set of scripted tasks. We collected thousands of samples from hundreds of people. These data were then mined looking for problems and optimization opportunities. The beauty of the system is that we can replay the test data after making any changes to the gesture engine, verifying improvements and guarding against regression in other areas.
This has led to several important optimizations. For example, we found that zooms and rotates were sometimes confused. Detecting zoom gestures only in applications that don’t use rotation has resulted in a 15% improvement in zoom detection.
Further analysis showed that many short gestures were going unrecognized. The gesture recognition heuristics needed to see 100ms or 5mm worth of data before making a decision about what gesture the user was performing. The concern that originally led to these limits was that making a decision about which gesture was being performed too early would lead to misrecognition. In fact, when we looked at the collected user data, we found we could remove those limits entirely – the gesture recognition heuristics performed very well in ambiguous situations. After applying the change and replaying the collected gesture sample data, we found zoom and rotate detection improved by about 6% each, and short scrolling improved by almost 20%!
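To give a flavor of how a record-and-replay tuning loop like this works, here is a small self-contained sketch. The thresholds and categories below are hypothetical illustrations, not the actual Windows 7 recognizer values; the point is that a classifier over recorded samples can be re-run after every change to verify improvements and guard against regressions.

```cpp
#include <cassert>
#include <cmath>
#include <string>
#include <vector>

// A recorded touch sample: position in mm, timestamp in ms.
struct Sample { double x, y, t; };

// Toy classifier illustrating the replay idea. The 2 mm and 0.5 mm/ms
// thresholds are made up for this sketch, not real recognizer tuning.
std::string Classify(const std::vector<Sample>& s) {
    if (s.size() < 2) return "tap";
    double dx = s.back().x - s.front().x;
    double dy = s.back().y - s.front().y;
    double dist = std::sqrt(dx * dx + dy * dy);  // mm traveled
    double dt = s.back().t - s.front().t;        // ms elapsed
    if (dist < 2.0) return "tap";                // barely moved
    double speed = dist / dt;                    // mm per ms
    return speed > 0.5 ? "flick" : "drag";
}
```

Replaying a corpus of collected samples through `Classify` after each engine change is then just a loop of assertions comparing the output against the label each volunteer's task was scripted to produce.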
Gestures are built into the system in such a way that many applications that have no awareness of touch respond appropriately. We have done this by creating default handlers that simulate the mouse or mouse wheel. Generally this gives a very good experience, but there are applications where some gestures don’t work smoothly or at all. In these cases the application needs to respond to the gesture message directly.
In Windows, several experiences have been gesture enabled. We’ve spent a considerable amount of effort on IE8 – ensuring scrolling and zooming are smooth and that back and forward are at your fingertips. Media Center, which is a completely custom interface ideally suited to touch, added smooth touch scrolling in galleries and the home screen. The XPS Viewer has gesture support that could become a model for many document viewing apps. Scrolling and zoom work as you would expect. When zooming out beyond a single page, pages start to tile so you can view many at a time. When zoomed out in that fashion, double tapping on any page jumps back to the default view of that page. A two-finger tap restores the view to 100% magnification. These predictable behaviors become habit forming quickly.
A major benefit of the Windows ecosystem is diversity – PCs come in all shapes and sizes. To help ensure that there is a great Windows Touch experience across the many different types of PCs we have defined a set of measurements and tests for Windows Touch that are part of the Windows Logo. We’ve been working with touch hardware partners since the beginning of Windows 7 to define the requirements and ensure they are ready for launch.
Our approach has been to provide an abstraction of the underlying hardware technology. We’ve specified requirements for the quantitative aspects of the device, such as accuracy, sample rate, and resolution, based on what is needed to successfully enable touch features. For example, we have determined the necessary accuracy values for a device so people can successfully target common UI elements like close boxes, or what sample rate and resolution are required to ensure quality gesture recognition.
The requirements form the basis for the Windows Touch logo program. For consumers, the logo tells you that the PC and all of its components are optimized for Windows. The component-level logo, which is what we grant to touch digitizers, helps OEMs choose a device that will deliver a great touch experience.
Based on the quantitative requirements, we built an interactive test suite that includes 43 separate tests, all validating the core requirements under different conditions. There are single point accuracy tests at various locations on the screen, including the corners, which are often harder for accuracy but critical to Windows. There are also several dynamic tests where accuracy is measured while drawing lines on the screen – see the screenshot below of Test 7. In this test, two lines are simultaneously drawn using touch along the black line from the start to the end. The touch tracings must remain within 2.5 mm of the black line between the start and end points. The first image below shows a passing test where the entire tracing is green (apologies for the fuzziness – these are foot-long tracings from a large screen that have been scaled down).
Figure 1: A passing line accuracy test from the Windows 7 Touch logo test tool
Not all devices pass the tests. Below is a screenshot of a device that is failing. This one has some noise – notice the deviation from the line in red. These errors need to be resolved before it would receive the logo. Errors like this can result in misrecognized gestures.
Figure 2: A failing line accuracy test from the Windows 7 Touch logo test tool
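The pass/fail check behind a test like this reduces to simple geometry: every sampled point of the tracing must lie within the tolerance of the reference line. The sketch below is our own illustration of that check, not the actual logo test tool; the 2.5 mm tolerance matches the requirement described above.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Pt { double x, y; };  // coordinates in mm

// Perpendicular distance from point p to the infinite line through a and b.
double DistToLine(Pt p, Pt a, Pt b) {
    double dx = b.x - a.x, dy = b.y - a.y;
    double len = std::sqrt(dx * dx + dy * dy);
    return std::fabs(dy * (p.x - a.x) - dx * (p.y - a.y)) / len;
}

// Pass if every sampled point of the tracing stays within tol mm of the
// reference line; a single noisy excursion (the red segments in Figure 2)
// fails the whole tracing.
bool TracingPasses(const std::vector<Pt>& trace, Pt start, Pt end, double tol) {
    for (const Pt& p : trace)
        if (DistToLine(p, start, end) > tol) return false;
    return true;
}
```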
To ensure repeatability of the tests, we’ve built a set of plastic jigs with tracing cut-outs; see the photo below. This particular jig is used for 5 of the tests and measures accuracy while tracing an arc.
Figure 3. Plastic jigs with tracing cut-outs for testing.
The testing tool is available to our partners now; we’re working closely with several of them to help tune the performance of their devices to meet the requirements and deliver a great touch experience. We have set up an in-house testing facility that will be testing every device submitted for Logo.
With the Release Candidate, OEMs and IHVs will be able to finalize the logo process for systems designed for Windows 7. Today we already have several hardware partners that have provided us with devices and drivers for testing.
We also want to talk a little about the touch platform for software developers. Windows 7 provides a rich touch platform for applications. We’ve already mentioned gestures; there’s also a lower-level platform that gives developers complete control over the touch experience. We think about it in a Good-Better-Best software stack.
The “good” bucket is what touch-unaware applications get for free from Windows 7. Windows provides default behaviors for many gestures, and will trigger those behaviors in your application in response to user input. For example, if someone tries touch scrolling over a window that is touch-unaware, we can detect the presence of various types of scrollbars and will scroll them. Similarly, when the user zooms, we inject messages that provide an approximation of the zoom gesture in many apps. As a developer you can ensure that the default gestures work just by using standard scrollbars and responding to ctrl-mouse wheel messages.
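For example, the zoom gesture arrives in touch-unaware apps as Ctrl+mouse-wheel input, where each standard wheel notch is 120 units (the Win32 `WHEEL_DELTA` constant). A sketch of how an app might turn that stream into whole zoom steps – our illustration, not code from Windows itself:

```cpp
#include <cassert>

// WHEEL_DELTA is the standard Win32 wheel increment: 120 per notch.
const int WHEEL_DELTA_STEP = 120;

// Map accumulated Ctrl+wheel deltas to whole zoom notches, carrying the
// remainder so high-resolution wheels that report partial notches still
// add up correctly over time.
int ConsumeZoomNotches(int* accumulated, int wheelDelta) {
    *accumulated += wheelDelta;
    int notches = *accumulated / WHEEL_DELTA_STEP;
    *accumulated -= notches * WHEEL_DELTA_STEP;
    return notches;  // positive = zoom in, negative = zoom out
}
```

An application that already handles Ctrl+wheel zoom this way gets the default touch zoom behavior with no touch-specific code at all.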
The “better” bucket is focused on adding direct gesture support and other small behavior and UI changes to make apps more touch-friendly. For instance, there is a new Win32 window message, WM_GESTURE (preliminary MSDN docs), that informs the application a gesture was performed over its window. Each message contains information about the gesture, such as how far the user is scrolling or zooming and where the center of the gesture is.
Applications that respond to gestures directly have full control over how they behave. For example, the default touch scrolling is designed to work in text-centric windows that scroll primarily vertically (like web pages or documents); dragging horizontally does selection rather than scrolling. In most applications this works well, but if an app scrolls primarily horizontally then the defaults would have to be overridden. Also, for some applications the default scroll can appear chunky. This is fine with a mouse wheel, but it feels unnatural with touch. Apps may also want to tune scrolling to end on boundaries, such as cells in a spreadsheet, or photos in a list. IE8 has a custom behavior where it opens a link in a new tab if you drag over it rather than click it.
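In a real WM_GESTURE handler, the application calls `GetGestureInfo` on the handle in `lParam`, switches on the gesture ID in the resulting `GESTUREINFO`, and then calls `CloseGestureInfoHandle` (or passes unhandled gestures to `DefWindowProc`). The self-contained sketch below models that dispatch with simplified stand-in types so it compiles anywhere; the `GID_*` values match the Windows 7 SDK, but `GestureEvent` is our stand-in for `GESTUREINFO`, not the real struct.

```cpp
#include <cassert>
#include <string>

// Gesture IDs as defined in the Windows 7 SDK.
enum GestureId {
    GID_BEGIN = 1, GID_END = 2, GID_ZOOM = 3, GID_PAN = 4,
    GID_ROTATE = 5, GID_TWOFINGERTAP = 6, GID_PRESSANDTAP = 7
};

// Simplified stand-in for GESTUREINFO: the real struct also carries the
// gesture center (ptsLocation) and gesture-specific ullArguments data.
struct GestureEvent { int id; long x, y; unsigned long long args; };

// Dispatch in the style of a WM_GESTURE handler's switch statement; a
// real app would scroll/zoom/rotate its content here.
std::string HandleGesture(const GestureEvent& g) {
    switch (g.id) {
        case GID_PAN:          return "pan";
        case GID_ZOOM:         return "zoom";    // args holds finger distance
        case GID_ROTATE:       return "rotate";  // args holds rotation angle
        case GID_TWOFINGERTAP: return "two-finger tap";
        case GID_PRESSANDTAP:  return "press and tap";
        default:               return "unhandled";  // forward to DefWindowProc
    }
}
```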
In addition to gestures, there are subtle optimizations applications can make for touch if they check to see if touch is in use. Many of the subtle touch behavior optimizations in Windows were enabled in this manner. Larger Jump List item spacing for touch, larger hot spots for triggering window arranging, and the press and hold behavior on the desktop Aero Peek button with touch are all features written with the mouse in mind, but when activated via touch use slightly different parameters.
Applications or features that fall into the “best” bucket are designed from the ground up to be great touch experiences. Apps in this bucket would build on top of WM_TOUCH – the window message that provides raw touch data to the application. Developers can use this to go beyond the core system gestures and build custom gesture support for their applications. They can also provide visualizations of the touch input (e.g. a raster editing application), build custom controls, and other things we haven’t thought of yet!
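With WM_TOUCH, the application retrieves a batch of contacts via `GetTouchInputInfo` and must release the handle with `CloseTouchInputHandle` (or pass the message to `DefWindowProc`). One detail worth showing: the `TOUCHINPUT` coordinates arrive in hundredths of a physical pixel, which is what the SDK's `TOUCH_COORD_TO_PIXEL` macro converts. The sketch below uses a simplified stand-in for `TOUCHINPUT` so it stays self-contained; it is our illustration of the decode step, not SDK code.

```cpp
#include <cassert>
#include <vector>

// Simplified stand-in for the Win32 TOUCHINPUT struct: x and y arrive
// in hundredths of a physical pixel, with a per-contact dwID.
struct TouchPoint { long x, y; unsigned long id; };

// Mirrors the effect of the SDK's TOUCH_COORD_TO_PIXEL macro.
long TouchCoordToPixel(long c) { return c / 100; }

// Convert one WM_TOUCH batch to pixel coordinates. In a real handler the
// batch comes from GetTouchInputInfo, and the handle must then be closed
// with CloseTouchInputHandle to avoid leaking it.
std::vector<TouchPoint> ToPixels(std::vector<TouchPoint> pts) {
    for (TouchPoint& p : pts) {
        p.x = TouchCoordToPixel(p.x);
        p.y = TouchCoordToPixel(p.y);
    }
    return pts;
}
```

The per-contact `id` is what lets a multitouch app track each finger across successive WM_TOUCH messages, which is the basis for custom gestures.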
We also provide a COM version of the Manipulations and Inertia APIs from Surface. The Manipulations API simplifies interactions where an arbitrary number of fingers are on an object into simple 2D affine transforms and also allows for multiple interactions to be occurring simultaneously. For instance, if you were writing a photo editing application, you could grab two photos at the same time using however many fingers you wanted and rotate, resize, and translate the photos within the app. Inertia provides a very basic physics model for applications and, in the example above, would allow you to “toss” the photos and have them decelerate and come to a stop naturally.
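The physics idea behind inertia is simple to sketch: once the fingers lift, the object keeps its release velocity and a constant deceleration bleeds it off until it stops. The toy model below illustrates that behavior in the spirit of the Inertia API; it is not the API's actual implementation, and the constants are arbitrary.

```cpp
#include <cassert>
#include <cmath>

// Toy "toss" model: integrate a decaying velocity in fixed time steps
// until the object comes to rest, returning the total distance coasted.
double CoastDistance(double velocity, double deceleration, double dt) {
    double distance = 0.0;
    double v = std::fabs(velocity);  // direction doesn't affect distance
    while (v > 0.0) {
        distance += v * dt;          // advance at the current speed
        v -= deceleration * dt;      // constant friction each time step
    }
    return distance;
}
```

A faster flick (larger release velocity) coasts farther before stopping, which is exactly the natural-feeling behavior described for the tossed photos above.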
We’ve previously demonstrated Microsoft Surface Globe, an interactive globe built in partnership with the Surface effort. Spinning the globe works as you would expect from a real-world globe, but with a touchable globe you can grab and stretch the view to zoom in, rotate, and move the view around. Interacting with the globe and exploring the world is the majority of the UI, and it is exceedingly easy to use with touch. Other features like search and adding markers to the map have also been designed with touch in mind.
Here’s another video to get an idea of what we’re talking about:
We’re eagerly looking forward to seeing new touch-optimized user interfaces and interactions. If you’re thinking about writing touch applications or adding touch support to your existing app, you should start with the MSDN documentation and samples.
We’ve noted several touch updates in the RC. If you have the Windows 7 Beta you can experiment with touch using a PC that supports multiple touch points. Please note that the multitouch PCs available today were developed while the Windows 7 requirements were still being defined, so while we believe they can support Windows 7’s requirements, only the maker of the PC can provide the logoed drivers for Windows 7 and support the PC on Windows 7. Keeping that caveat in mind, today there are a few multitouch PCs on the market:
To enable multitouch capabilities on these PCs running the Windows 7 Beta you will need to make sure you have the latest multitouch beta drivers. Remember these are pre-release drivers and are not supported by Microsoft, Dell or HP. And again, they still need to pass through the Windows Logo process we described above before they are final.
We often get asked about single-touch PCs. Will they work with Windows 7? There are many types of hardware available for touch and many screens and PCs can provide single touch (usually based on resistive touch technology). A single-touch PC will have the same functionality on Windows 7 as it does on Vista, but this functionality will not be extended to the Windows 7 capabilities. As we noted earlier, Windows Touch in Windows 7 is comprised of a collection of touch enhancements, several of which require multitouch, that work together to deliver a great end-to-end touch experience.
As form factors change and the demands of our user interfaces change, input methods change and grow as well. We’re excited about the unique benefits touch offers the user, and the new places and new ways it enables PCs to be used. We expect PCs of all form factors and price points to provide touch support and so it makes sense that these PCs will be able to take advantage of the full range of Windows 7 capabilities.
Windows 7 is designed to provide efficient ways to use multitouch for the most common and important scenarios, while being a natural and intuitive complement to the mouse and keyboard people use today.
Keep in Touch!
- Windows Touch Team
Judging only by the video provided, shouldn't the image viewer give rotate feedback in real time? That is, follow the input and snap to the nearest position at the end? The way it is done in the video seems a bit rough and more like a "traditional user experience".
If the application does not process the message, it must call DefWindowProc. Not doing so will cause the application to leak memory because the touch input handle will not be closed and associated process memory will not be freed.
As of today, the touch functionality under Windows 7 will cause many legacy business apps to crash.
The reason: Many existing business apps have been written in VB6 and use the common controls OCX of VB5/6 (comctl32.ocx and mscomctl.ocx) to create list views and treeviews.
These OCXs crash if any program runs that implements WinEvent hooks. The touch functionality, the TabletPC tools, narrator, and some more use WinEvent hooks, causing all VB6 apps to crash if they use the common control OCXs.
This problem is described in the KB article at http://support.microsoft.com/kb/896559.
In 2005, MS provided a "fix" for the components that should resolve the issues. However, it doesn't fix all issues, and as a result, the fix is useless.
Thus, MS must fix the comctl32.ocx and mscomctl.ocx components, or touch will not succeed in the business computing world.
I'm very excited for touch -- I've been using a Samsung Q1 as one of my main Win7 test machines. The experience is very good, even though it is only single-input touch. While I would really like inertial scrolling throughout Explorer on the device, I understand the technical limitations from talking to the Touch team in the newsgroups.
Touch is the perfect accent for a Win7 machine. While usable without it, adding it in makes Windows that much more... concrete, I guess.
Very good work, team.
Now, it's done with touching? Then, maybe you all can wake up and start adding the functionality that lots of users of NON touch devices (like, I don't know, maybe just 90% of Windows users?) are demanding?
Where is a customizable Windows explorer toolbar and statusbar?
Where is the possibility to choose the pictures to be imported from a digital camera, and not having to import everything or nothing?
Where is the possibility to "lock" a folder, and STOPS the really annoying windows explorer habit of changing the view styles?
Can we expand the All Programs in start menu? Or, the classical menu? Is it back?
Will Windows Live Messenger sit quietly in the tray, where it belongs, and not take space in my taskbar?
Does Windows Media Player 12 already have the same "search for radio" functionality that Windows Media Player TEN had, and then, I don't know why, was lost?
Can we change the "hardcoded" windows hotkeys? What If I want the winkey + E to open my libraries? what if I liked that behaviour? can I revert back, now it opens again in "My computer"?
Can I create a shortcut to "New Folder" in explorer?
and, so on... but, sorry, I see you all was soooooo busy working with touch, to please that 1% of users...
on the touch input - great job in putting in Win7. I've been wanting a touch-capable OS for a while now.
Since Win7 allows desktop slideshows, if I slide/flick left or right on the desktop, will that make it advance to the next/previous picture? That would be a great little feature.
How much time and money is MS going to waste on this before it is clear that this is doomed for failure?
Ok, first, the jigs are nice and repeatable but not for a human. Sure, you might be able to read a perfect curve, but many of us can't draw them - a few deliberately sloppy jigs would definitely be an asset to testing.
Except that no one wants a screen full of greasy fingerprint trails. I have seen people go psychotic with the Windex over just a SINGLE fingerprint.
All that aside, I have still yet to find a single touch sensitive display that isn't handheld.
Interesting idea for an iPhone or Palm, ok for a display you see from 20 feet away over the tv - SUCKY AS ALL GET OUT if you are in arms reach.
I'm not sure what will ruin the experience for me more. Dragging my fingers through other people's peanutbutter smears or knowing the screen I have to touch has probably been sneezed on by every sick person who has looked at that day.
And ya, I feel the same way when I have to use someone else's keyboard. I have seen some MIGHTY disgusting keyboards. Turning the screen into high touch zone.
Ya, ask the parent of any 2 year old how well that works out.
I bought a HP TX2 laptop last month and put the beta on it with the special driver for multi touch. Touch team you have done a splendid job, I have an iPhone and never thought that you would be able to get close to that on a computer.
What I like:
• The finger scrolling is great, I use that everywhere I can now
• I never noticed the extra spacing on the start bar lists, but now that I do I think that was brilliant
• The window placement stuff is great where you can drag windows to the top or edge of the screen.
What you should fix
• Scrolling does not work in Firefox (which is my favorite program)
• Some web pages don’t work very well. I use google maps a lot and it doesn’t work.
• Window resizing is hard, I need to use my mouse for that but actually I have most windows full screen so I just drag them to the top which is fine mostly
• Powerpoint doesn’t work very well with touch
• Picture rotation should show a picture rotating; it is confusing now
Where can I get the globe program? Is that available somewhere for download?
Sabu23, I believe they are using Live Maps.
Here's the link; click "3d" to download the browser app.
Hrm, I have not personally used a Single or Multi-touch PC in YEARS (yes, I did use a good ol' fashioned single touch back in the 90's for work for a while...) but I have to say, this looks pretty cool.
While I have to agree that touch should hardly be a pressing concern for you guys (what with over 98% of users using a mouse), I don't see what the problem with offering this functionality is.
With all that said, it would be very nice to see MacBook like multi-touch functionality on a trackpad for us current generation notebook users! The technology is there (I am 100% certain my synaptics pad supports multi-touch hardware functionality - I have seen a colleague do it on his notebook within Linux, and it says so on their website. Link: http://www.synaptics.com/solutions/technology/gestures), so it seems clear that this should be something that Windows7 users should be able to utilize in its entirety. Enhanced Gesture Recognition (EGR), Flick, Two-finger Flick, ChiralMotion™, and ChiralRotate™; they should all be compatible with Windows7.
A response from the team on this point would be excellent, as I have to say, this is one of the make or break aspects of Windows7 for me.
- AeonSlayer / Simon
I won't buy a Multitouch monitor :(
Tihiy relax, that's standard behavior of a default window procedure. (I think ;) )
Is this touch tech purely about touching your screen??
Because I am not really interested in that and I think 90%-99% of all Windows users will also not use this touch tech if you ask me. I don't believe in touching your screen; on a desktop this will really be a no-go (because you are further away from your screen) and I personally also don't want my laptop to support that.
No, what I want is full multi-touch support on my laptop's touchpad (yes, just as the competitor). That is in my eyes where Windows "touch" is really lacking, and that would be used by many, many if not all laptop users.
Or is this touchpad multi touch purely a hardware/driver issue? Why is it then in the windows pc world not really happening?
"Ok, first, the jigs are nice and repeatable but not for a human."
These jigs are used to test the ability of a digitizer to translate pen position to screen position. Since the pen detection process is stochastic, multiple trials need to be taken to ensure the digitizer works properly, and thus the tests need to be the same. This has nothing to do with drawing perfect curves free hand. If the digitizer passes this test, it should be able to translate anything accurately onto the screen. If you'd actually ever used a tablet, you'd know that drawing on a good screen is like drawing on paper.
"no one wants a screen full of greasy figureprint trails"
The way I deal with fingerprints is a quick swipe of a micro fibre cloth. Cleans them up quick enough. My screen is cleaner than most people's who don't have touch screens.
"Dragging my fingers through other people's peanutbutter smears or knowing the screen I have to touch has probably been sneezed on by every sick person who has looked at that day."
How is this a problem of touch screens only? Any input surface of the computer will be germy.
@AeonSlayer: I have "Two-finger Flick" on my Eee trackpad working on XP. They have drivers for Vista too. What is it that MS has to support, other than perhaps making a baseline driver so the OEM doesn't have to?
@Xepol: "Except that no one wants a screen full of greasy figureprint trails" -- actually get one before you over worry about such things. ATM machines have them. My iPhone was one. I thought it would be awful to not have a physical keyboard on it, but its fine. Same with my TouchSmart (meaning using the screen, not implying that I don't use the keyboard on the PC).