by Asavin Wattanajantra

When the Microsoft Kinect came out, it extended the Xbox 360 audience, attracting people who weren't necessarily traditional gamers with a physical, fun and interactive experience. But it has also become extremely interesting to researchers looking to develop new technology we've only seen in the movies.

One of the most interesting science fiction films of the last few years was Minority Report. Set in 2054, the film shows John Anderton, the character played by Tom Cruise, using a multi-touch display in mid-air, making the gestures we now know so well from our smartphones.

More recently, the Iron Man films see Robert Downey Jr playing billionaire playboy philanthropist Tony Stark, who uses a similar gesture-based multi-touch computer system. With it he can interact with and manipulate 3D holographic models as he plays around with cool features for his Iron Man suit.

Perceptive Pixel shows off this type of technology on a large screen, but what about the hand-waving excellence we see in these films? That's a harder challenge, because in mid-air you're not actually touching anything.

But this year, a Microsoft Research machine-learning project showed how hand gestures could become a reality for controlling the Kinect you use with your Xbox at home. For the research, images of people's hands were used to train the Kinect to determine whether a hand in front of the sensor was open or closed.

Once the Kinect could detect a hand grip, a user could make hand gestures in the air to drive applications on Kinect for Windows. You could do things like open and close applications, draw and paint, or use mapping technology (as Cruise does in the movie).
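The open-versus-closed idea can be illustrated with a toy classifier. This is not Microsoft's actual pipeline (which trained on large sets of real hand images); it assumes a single hypothetical feature, the fraction of pixels in a hand patch that sit closer to the sensor than the palm, which tends to be higher when fingers are extended.

```python
# Toy sketch of open/closed hand classification, NOT the real Kinect system.
# The feature and thresholds here are illustrative assumptions.

def extract_feature(depth_patch, palm_depth):
    """Fraction of pixels nearer than the palm: extended fingers raise it."""
    near = sum(1 for d in depth_patch if d < palm_depth)
    return near / len(depth_patch)

def train(examples):
    """examples: list of (feature, label) pairs. Returns the mean feature
    per label, giving a simple nearest-centroid classifier."""
    sums, counts = {}, {}
    for feature, label in examples:
        sums[label] = sums.get(label, 0.0) + feature
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def classify(feature, centroids):
    """Pick the label whose average feature is closest."""
    return min(centroids, key=lambda label: abs(feature - centroids[label]))
```

A few labelled examples are enough to "train" this toy: open hands cluster at high feature values, closed fists at low ones, and a new patch is assigned to whichever cluster it lands nearest.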

Virtual reality on Kinect

Read enough science-fiction literature or watch enough movies and you'll encounter a reference to virtual reality. In Star Trek, characters use a holodeck that lets them act out their fantasies. In the cyberpunk novel Snow Crash, hackers plug into a collective, shared virtual-reality internet populated by other user-controlled avatars.

We're still quite a long way from this advanced virtual reality – a 3D Second Life-style online world we can interact with physically rather than just on a screen. But holograms such as the ones in Star Wars are already here, and augmented reality is something we're already seeing on our smartphones. Recent research by Microsoft is looking to push this even further.

The Kinect has attracted a lot of research interest because of its sensor, which functions as a depth camera. It projects infrared light into the environment and captures a scene's depth information using a structured-light technique. In practice, this enables applications such as 3D shape scanning and model generation.
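The geometry behind structured light is simple triangulation: the sensor knows where each projected dot should appear, measures how far it has shifted (the disparity), and converts that shift into distance. The sketch below shows the standard disparity-to-depth relation; the focal length and baseline values are illustrative assumptions, not the Kinect's real calibration.

```python
# Hedged sketch of structured-light triangulation. The sensor compares the
# observed position of a projected infrared dot with its expected position;
# the pixel shift (disparity) shrinks as the surface gets further away.

FOCAL_LENGTH_PX = 580.0   # assumed focal length in pixels (illustrative)
BASELINE_M = 0.075        # assumed projector-camera baseline in metres

def depth_from_disparity(disparity_px):
    """Triangulate depth in metres from pattern disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px
```

With these assumed numbers, a 43.5-pixel shift corresponds to a surface about one metre away, and halving the disparity doubles the distance – which is also why depth precision degrades for far-away objects.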

Recently, Microsoft researchers took this a step further, exploring the possibility of using the Kinect to capture a 3D scene in real time and reconstruct it. In the past this was done with laser scanners, which are expensive and slow, or with multiple colour cameras, which are inaccurate, particularly on surfaces with no texture.

One way is to move a standard Kinect camera around an indoor scene: the depth data is used to track the sensor's 3D pose and to reconstruct a geometrically precise 3D model of the physical scene in real time. A second way is to use multiple depth cameras with a 3D depth-reconstruction algorithm.
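The fusion step can be sketched in one dimension. This is far simpler than the real research system, but it shows the core trick: each point in space keeps a running average of "signed distance to the observed surface", so many noisy depth frames converge on a clean surface position. All names and values below are illustrative assumptions.

```python
# Minimal 1D sketch of depth-map fusion into a signed-distance grid.
# Each voxel averages its distance to the surface over many noisy frames.

def fuse(tsdf, weights, voxel_positions, observed_depth, truncation=0.1):
    """Fold one depth observation into the running signed-distance average."""
    for i, x in enumerate(voxel_positions):
        sd = observed_depth - x          # signed distance along the view ray
        if sd < -truncation:
            continue                     # voxel hidden well behind the surface
        sd = min(sd, truncation)         # clamp far-in-front observations
        tsdf[i] = (tsdf[i] * weights[i] + sd) / (weights[i] + 1)
        weights[i] += 1

def extract_surface(tsdf, voxel_positions):
    """The surface sits where the signed distance crosses zero."""
    for i in range(len(tsdf) - 1):
        if tsdf[i] > 0 >= tsdf[i + 1]:
            t = tsdf[i] / (tsdf[i] - tsdf[i + 1])  # interpolate the crossing
            return voxel_positions[i] + t * (voxel_positions[i + 1] - voxel_positions[i])
    return None
```

Feeding in three slightly noisy depth readings around 1.03 m recovers a surface estimate very close to 1.03 m, even though no single frame was exactly right; the full 3D systems do the same averaging over a volumetric grid while simultaneously tracking the camera pose.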

There are a number of practical applications for this technology, such as more realistic forms of augmented reality. The 3D virtual world could be directly overlaid onto, and interact with, its real-world representation. And because the Kinect also detects your movements, it's possible to interact with the virtually created scene: imagine interactive 3D video.

There's also potential for free-viewpoint TV, where you interactively control the viewpoint, generating views of a scene from any position. Imagine watching football and moving the camera wherever you wanted while the match was playing.

Technology moves quickly and sometimes unpredictably. Who could have guessed the path of mobile phone development, or that the internet would redefine our lives? And what will 2054, the year in which Minority Report is set, usher in? With researchers pushing back the boundaries of our reality daily, the future may come sooner than you think – and very likely on the “normal” devices you use at home. Is the toaster watching you yet?