Last Friday at SIGCSE, Craig Mundie gave his keynote titled “Rethinking Computing.” As I see it, the key point he made was the idea of a natural user interface replacing the graphical user interface we commonly use today. As part of his talk he showed two demos.

The first was a computerized receptionist that Microsoft Research is working on. This receptionist uses cameras and microphones to gather information about the people who approach it. It tries to determine whether people are together or arriving separately, and who is saying what, and based on that information it books rides on the company shuttles. That is a fair amount of complex computation. Multiple processors, I believe he said eight, are working at 40% of capacity just scanning the hallway for people to determine if it has to go to work. But it results in a robot of sorts that can interact with people. Frankly, the demo feels like something out of a science fiction movie to me. But if this sort of thing becomes more practical, maybe the future is not so far away.

The second demo was of the Microsoft Surface device. This device also uses cameras to sense what is happening on the surface of the screen and to react accordingly. Multi-touch - touching multiple points on the screen at a time - is supported, so that not only can one person use both hands or all their fingers, but several people can all do things at the same time. You’ve seen stuff like this in the movies, but this is real. There was a Surface device in the Microsoft booth and it was a star attraction. The sample programs got quite a workout; they included software for handling online portfolios in schools, multi-player games and puzzles, and simple demos that showed the device reacting to specific objects placed down on it. The people trying it out seemed to be thinking of all sorts of additional possibilities as well.
I had been starting to think that I/O devices like Xbox 360 controllers, which are easily programmed against using XNA, were something new, but clearly they are only a fractional step into the future. It’s enough to start people thinking, though. Where are we going with interfaces between people and computers? CHI or HCI, depending on your preference (computer-human interaction or human-computer interaction), is going to change. We’re going to go beyond keyboards and mice just as we went beyond punch cards and line printers. I think we’re just starting.
I remember reading about a SmartBoard-like setup that Microsoft was working on: small cameras attached at the bottom of any surface to do basically the same thing as the Surface. Any sign of it?
There are a number of related research projects going on. I don't have dates or specific information on any of them at this time. I will say that Windows 7, now in beta, supports multi-touch using other technologies. I'm not sure how much like Surface it really is, though.