The Power of PixelSense™ - Microsoft PixelSense Blog - Site Home - MSDN Blogs

The Power of PixelSense™


Today marks the release of a video the Surface team put together highlighting the power of PixelSense™. Microsoft's PixelSense, in the new Samsung SUR40 for Microsoft Surface, allows a display to recognize fingers, hands, and objects placed on the screen, enabling vision-based interaction without the use of cameras. The individual pixels in the display see what's touching the screen, and that information is immediately processed and interpreted.

Think of it like the connection between the eye and the brain: you need both, working together, to see. In this case, the eye is the sensor in the panel; it picks up the image and feeds it to the brain, our vision input processor, which recognizes the image and does something with it. Taken as a whole, this is PixelSense technology.

We've gone behind the scenes to show you the creation of the technology and some of the people involved. It's a little longer than most web videos, but we wanted to go deeper than normal and really explain what's going on.

In conjunction with the video, let’s walk through the high-level steps of how PixelSense actually works:

  1. A contact (finger/blob/tag/object) is placed on the display.
  2. The IR backlight unit provides light (through the optical sheets, LCD, and protection glass) that hits the contact.
  3. Light reflected back from the contact is seen by the integrated sensors.
  4. The sensors convert the light signal into an electrical signal/value.
  5. Values reported from all of the sensors are used to create a picture of what is on the display.
  6. The picture is analyzed using image processing techniques.
  7. The output is sent to the PC. It includes the corrected sensor image and various contact types (fingers/blobs/tags).
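The steps above can be sketched in code. Here is a minimal Python sketch of steps 5–7, assuming a toy 8×8 grid of sensor intensity values and a simple threshold-plus-flood-fill contact finder. The names `raw`, `THRESHOLD`, and `find_contacts` are illustrative only; the real SUR40 pipeline runs in the panel's hardware and firmware, not in Python.

```python
# Steps 4-5: sensor values reported as a grid of reflected-IR intensities.
# Two bright regions represent two contacts resting on the display.
raw = [
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 7, 7, 0],
    [0, 0, 0, 0, 0, 7, 7, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
]

THRESHOLD = 5  # assumed cutoff separating contacts from background


def find_contacts(image, threshold=THRESHOLD):
    """Steps 6-7: threshold the sensor image and group bright pixels
    into connected components ("contacts") via iterative flood fill."""
    rows, cols = len(image), len(image[0])
    seen = set()
    contacts = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and (r, c) not in seen:
                # Flood-fill one contact starting from this bright pixel.
                stack, blob = [(r, c)], []
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                contacts.append(blob)
    return contacts


contacts = find_contacts(raw)
print(len(contacts))  # → 2
```

A real implementation would go further, classifying each blob by size and shape (finger vs. hand vs. tagged object) before reporting it to the PC, as described in step 7.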

Right now PixelSense is only available in the Samsung SUR40 for Microsoft Surface and we believe it’s going to change the way you interact with touch-enabled content.

  • So the question to ask next... when are we getting the 2.0 SDK? It's now Summer '11, so hopefully soon. Keep up the good work.

  • I love this technology :)

  • @shaggygu, technically, summer extends up to September 21. I won't be surprised at all if SDK 2.0 comes only after the Build conference on September 13, where the fate of WPF will be revealed.

  • @fred. Good point. I hope the SDK, along with some WPF goodness, will be announced. Fingers crossed!

  • Hi, how does PixelSense handle intense light (e.g., reflectors above it) or changing light conditions? IR cameras often get confused by intense light, thinking that a reflector is an IR blob. How does PixelSense handle that? Any experience using the SUR40 in daylight?
