Samsung SUR40 with Microsoft PixelSense (samsunglfd.com/solution/sur40.do)
More than one attendee of the Multitouch and Surface Computing Workshop at CHI 2009 made sure that I was aware of a touch-based informational kiosk at the convention center… Note that there are three cues hinting to the user that it is touch-enabled. For applications that run on Microsoft Surface, I find that users tend to know they should use touch to interact with the unit when there is some "organic" or random movement of some UI objects. Our Water (Attract) is a great example. Additionally, the use of a "severed" hand (as seen in the photo) ought to be considered carefully before deployment in certain cultures.
Over 70 HCI students, researchers and practitioners—representing the best brains on multitouch research—gathered in a room at CHI 2009 and talked about how we—collectively as the multitouch research community—can advance the field. This Special Interest Group session was organized by our very own Daniel Wigdor and Joe Fletcher, along with Gerald Morrison from SMART Technologies. Edward Tze was also there to present a case study.
Jeff Han (Perceptive Pixel) and I joked about the very funny SNL skit with Fred Armisen. After the session, Jeff dropped by the Microsoft booth to give the Newsreader application a spin. (Photo credit: Leslie Scott, UX Researcher on MS Office!)
I was preparing our booth at the annual CHI conference when the staff of the Massachusetts Convention Center Authority (MCCA) showed up. These are the gentlemen and ladies who prepare and maintain the center for the rest of us; without them, there would be a few thousand very confused and caffeine-deprived HCI scientists and students. These guys spotted the Microsoft Surface and were absolutely intrigued, so I proceeded to give them a demo and had them interact with the unit. One guy said, "Gentlemen, this is the future!" They stuck around and played with the unit for quite a while, until their supervisor came and sent them back to work.