Getting technology to do what you want can be challenging. Imagine building, in six weeks and from pre-defined parts, a remote-controlled robot that can perform various tasks in a competitive environment. That’s the challenge presented to the 2,500 teams of students who will be participating in the FIRST (For Inspiration and Recognition of Science and Technology) Robotics Competition.
The worldwide competition, open to students in grades 9-12, kicks off this morning with a NASA-televised event that includes pre-recorded opening remarks from Presidents Clinton and G.W. Bush; Dean Kamen, founder of FIRST and inventor of the Segway; and Alex Kipman, General Manager of Hardware Incubation for Xbox.
Last year, several FIRST teams experimented with Kinect’s natural user interface capabilities to control their robots. The difference this year is that the Kinect hardware and software are included in the parts kits teams receive to build their robots. Teams will be able to control their robots not only with joysticks, but also with gestures and possibly spoken commands.
The first part of the competition is the autonomous period, in which robots can be controlled only by sensor input and pre-programmed commands. This is when the depth and speech capabilities of Kinect will prove extremely useful.
To help teams understand how to incorporate Kinect technologies into the design of their robot controls for the 2012 competition, workshops are being held around the country. Students will use C# or C++ to program the driver stations of their robots to recognize and respond to gestures and poses.
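At its simplest, a driver-station gesture handler is a pose test over tracked joint positions. The sketch below is illustrative only: the `Joint` struct is a simplified stand-in for the skeleton data the Kinect for Windows SDK delivers, and the pose itself (both hands raised above the head) is a hypothetical robot command, not anything defined by FIRST or the SDK.

```cpp
// Simplified stand-in for an SDK skeleton joint (meters, sensor space).
struct Joint { float x, y, z; };

// Hypothetical pose test: both hands raised above the head could signal
// a driver-station command (e.g., "raise the arm mechanism").
bool handsAboveHead(const Joint& head, const Joint& leftHand, const Joint& rightHand) {
    return leftHand.y > head.y && rightHand.y > head.y;
}
```

A real handler would read joint positions each frame from the SDK’s skeleton stream and debounce the result over several frames before issuing a command.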
In addition, Microsoft Stores across the country are teaming up with FIRST robotics teams to provide Kinect tools, technical support, and assistance.
While winning teams get bragging rights, all participants gain real-world experience by working with professional engineers to build their team’s robot, using sophisticated hardware and software such as the Kinect for Windows SDK. Team members also gain design, programming, project management, and strategic thinking experience. Last but not least, more than $15 million in college scholarships will be awarded throughout the competition.
“The ability to utilize Kinect technologies and capabilities to transform the way people interact with computers already has sparked the imagination of thousands of developers, students, and researchers from around the world,” notes FIRST founder Dean Kamen. “We look forward to seeing how FIRST students utilize Kinect in the design and manipulation of their robots, and are grateful to Microsoft for participating in the competition as well as offering their support and donating thousands of Kinect sensors.”
This morning’s kickoff of the 2012 FIRST Robotics Competition was highly anticipated. Approximately 2,500 teams worldwide were given a kit of 600-700 discrete parts, including a Kinect sensor and the Kinect for Windows software development kit (SDK), along with the details and rules for this year’s game, Rebound Rumble. Learn how Kinect for Windows will play a role in this year’s game by watching the game animation.
Kinect for Windows team
Traditional digital animation techniques can be costly and time-consuming. But KinÊtre—a new Kinect for Windows project developed by a team at Microsoft Research Cambridge—makes the process quick and simple enough that anyone can become an animator and bring inanimate objects to life.
KinÊtre uses the skeletal tracking technology in the Kinect for Windows software development kit (SDK) for input, scanning an object as the Kinect sensor is slowly panned around it. The KinÊtre team then applies its expertise in cutting-edge 3-D image processing algorithms to turn the object into a flexible mesh that is manipulated to match user movements tracked by the Kinect sensor.
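Driving a scanned mesh with a tracked skeleton is, at its core, a mesh-deformation problem. A standard textbook technique is linear blend skinning, where each vertex moves by a weighted average of bone transforms. The sketch below is a minimal, hypothetical illustration of that general idea (with bones reduced to translations); the published KinÊtre system is considerably more sophisticated.

```cpp
#include <vector>
#include <cstddef>

struct Vec3 { float x, y, z; };

// A bone transform reduced to a pure translation for illustration;
// a real system uses full rigid (rotation + translation) transforms.
struct Bone { Vec3 translation; };

// Linear blend skinning: each vertex is displaced by the weighted
// average of its bones' transforms. weights[v][b] is the influence
// of bone b on vertex v; each vertex's weights sum to 1.
std::vector<Vec3> skin(const std::vector<Vec3>& rest,
                       const std::vector<Bone>& bones,
                       const std::vector<std::vector<float>>& weights) {
    std::vector<Vec3> out(rest.size());
    for (std::size_t v = 0; v < rest.size(); ++v) {
        Vec3 d{0.0f, 0.0f, 0.0f};
        for (std::size_t b = 0; b < bones.size(); ++b) {
            d.x += weights[v][b] * bones[b].translation.x;
            d.y += weights[v][b] * bones[b].translation.y;
            d.z += weights[v][b] * bones[b].translation.z;
        }
        out[v] = Vec3{rest[v].x + d.x, rest[v].y + d.y, rest[v].z + d.z};
    }
    return out;
}
```

In a KinÊtre-style pipeline, the bone transforms would be updated every frame from the Kinect skeleton, so the scanned mesh follows the user’s movements in real time.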
Microsoft has made deep investments in Kinect hardware and software. This enables innovative projects like KinÊtre, which is being presented this week at SIGGRAPH 2012, the International Conference and Exhibition on Computer Graphics and Interactive Techniques. Rather than targeting professional computer graphics (CG) animators, KinÊtre is intended to bring mesh animation to a new audience of novice users.
Shahram Izadi, one of the tool's creators at Microsoft Research Cambridge, told me that the goal of this research project is to make this type of animation much more accessible than it's been—historically requiring a studio full of trained CG animators to build these types of effects. "KinÊtre makes creating animations a more playful activity," he said. "With it, we demonstrate potential uses of our system for interactive storytelling and new forms of physical gaming."
This incredibly cool prototype reinforces the world of possibilities that Kinect for Windows can bring to life and even, perhaps, do a little dance.
Peter Zatloukal, Kinect for Windows Engineering Manager
This year’s Microsoft TechForum provided an opportunity for Craig Mundie, Microsoft Chief Research and Strategy Officer, to discuss the company’s vision for the future of technology as well as showcase two early examples of third-party Kinect for Windows applications in action.
Mundie was joined by Don Mattrick, President of the Microsoft Interactive Entertainment Business, and Mattrick’s Chief of Staff, Aaron Greenberg, who demonstrated both third-party Kinect for Windows applications. One of them, the Pathfinder Kinect Experience, enables users to stand in front of a large monitor and use movement, voice, and gestures to walk around the 2013 Nissan Pathfinder Concept: examining the exterior, bending down to inspect the wheels, viewing the front and back, and then stepping inside to experience the upholstery, legroom, dashboard, and other details.
Nissan worked with IdentityMine and Critical Mass to create the Kinect-enabled virtual experience, which was initially shown at the Chicago Auto Show in early February. The application is continuing to be refined, taking advantage of the Kinect natural user interface to enable manufacturers to showcase their vehicles in virtual showrooms.
“Using motion, speech, and gestures, people will be able to get computers to do more for them,” explains Greenberg. “You can imagine this Pathfinder solution being applied in different ways in the future - at trade shows, online, or even at dealerships - where someone might be able to test drive a physical car, while also being able to visualize and experience different configurations of the car through its virtual twin, accessorizing it, changing the upholstery, et cetera.”
Also demonstrated at TechForum was a new kind of shopping cart experience, which was developed by mobile application studio Chaotic Moon. This application mounts a Kinect for Windows sensor on a shopping cart, enabling the cart to follow a shopper - stopping, turning, and moving where and when the shopper does.
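A cart like this needs very little logic once the sensor reports where the shopper is. The sketch below is purely hypothetical steering logic, not Chaotic Moon’s implementation: `offsetX` is the shopper’s horizontal offset from the cart’s center line and `distance` the shopper’s depth, both in meters, as a Kinect sensor might report them.

```cpp
// Hypothetical follow-the-shopper steering for a Kinect-equipped cart.
enum class CartAction { Stop, Forward, TurnLeft, TurnRight };

CartAction follow(float offsetX, float distance) {
    const float followGap = 1.0f; // stay about 1 m behind the shopper
    const float deadZone = 0.2f;  // ignore small sideways offsets
    if (offsetX < -deadZone) return CartAction::TurnLeft;   // shopper drifted left
    if (offsetX > deadZone) return CartAction::TurnRight;   // shopper drifted right
    if (distance > followGap) return CartAction::Forward;   // close the gap
    return CartAction::Stop;                                // shopper stopped; close enough
}
```

The dead zone keeps the cart from oscillating when the shopper is nearly centered; a production system would smooth the tracked position over time as well.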
Chaotic Moon has tested its solution at Whole Foods in Austin, Texas, but the application is an early experiment, and there are no plans to introduce it in stores anytime soon. Conceivably, Kinect-enabled carts at grocery stores, shopping malls, or airports could make it easier for people to navigate and perform tasks hands-free. “Imagine how an elderly shopper or a parent with a stroller might be assisted by something like this,” notes Greenberg.
“The Kinect natural user interface has the potential to revolutionize products and processes in the home, at work, and in public places, like retail stores,” continues Greenberg. “It’s exciting to see what is starting to emerge.”
The following blog post was guest authored by Celeste Humphrey, business development consultant at nsquared, and Dr. Neil Roodyn, director of nsquared.
nsquared, a company based in Sydney, Australia, that is passionate about learning, technology, and creating awesome user experiences, has developed three new applications that take advantage of Kinect for Windows to provide users with interactive, natural user interface experiences.
At nsquared, we believe that vision-based interaction is the future of computing. The excitement we see in the technology industry regarding touch and tablet computing is a harbinger of the changes that are coming as smarter computer vision systems evolve.
Kinect for Windows has provided us with the tools to create some truly amazing products for education, hospitality, and events.
Education: nsquared sky spelling
We are excited to announce nsquared sky spelling, our first Kinect for Windows-based educational game. This new application, aimed at children aged 4 to 12, makes it fun for children to learn to spell in an interactive and collaborative environment. Each child selects a character or vehicle, such as a dragon, a biplane, or a butterfly, and then flies as that character through the sky to capture letters that complete the spelling of various words. The skeleton recognition capabilities of the Kinect for Windows sensor and software development kit (SDK) track the movement of the children as they stretch out their arms as wings to navigate their character through hoops alongside their wingman (another player). The color camera in the Kinect for Windows sensor allows each child to add their photo, thereby personalizing their experience.
nsquared sky spelling
Hospitality: nsquared hotel kiosk
The nsquared hotel kiosk augments the concierge function in a hotel by providing guidance to hotel guests through an intuitive, interactive experience. Guests can browse through images and videos of activities, explore locations on a map, and find out what's happening with a live event calendar. It also provides live weather updates and has customizable themes. The nsquared hotel kiosk uses the new gestures supported in the Kinect for Windows SDK 1.7, enabling users to use a “grip” gesture to drag content across the screen and a “push” gesture to select content. With its fun user interface, this informative kiosk provides guests an interactive alternative to the old brochure rack.
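The SDK 1.7 interaction model ships its own grip and push recognition, so a kiosk like this does not implement the detection itself. Still, the underlying idea of a “push” is easy to see in simplified form: the hand extends toward the sensor, well in front of the shoulder. The sketch below is an illustration of that idea only, not the SDK’s actual recognizer; the threshold value is an assumption.

```cpp
// Simplified "push" detection: the hand leads the shoulder toward the
// sensor by some margin. Illustrative only; the Kinect for Windows
// SDK 1.7 provides its own grip/push recognition.
bool isPush(float handZ, float shoulderZ) {
    const float pushThreshold = 0.4f;           // meters the hand must lead the shoulder
    return (shoulderZ - handZ) > pushThreshold; // smaller z = closer to the sensor
}
```

The SDK’s recognizer is far more robust, incorporating hand state and motion over time rather than a single depth comparison.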
Kinect for Windows technology enables nsquared to provide an interactive kiosk experience for less than half the price of a similarly sized touchscreen (see note).
nsquared hotel kiosk
Events: nsquared media viewer
The new nsquared media viewer application is a great way to explore interactive content in almost any environment. Designed for building lobbies, experience centers, events, and corporate locations, the nsquared media viewer enables you to display images and video by category in a stylish, customizable carousel. Easy to use, anyone can walk up and start browsing in seconds.
In addition to taking advantage of key features of the Kinect for Windows sensor and SDK, nsquared media viewer utilizes Windows Azure, allowing clients to view reports about the usage of the screen and the content displayed.
nsquared media viewer
Kinect for Windows technology has made it possible for nsquared to create applications that allow people to interact with content in amazing new ways, helping us take a step towards our collective future of richer vision-based computing systems.
Celeste Humphrey, business development consultant, and Dr. Neil Roodyn, director, nsquared
Note: Based on the price of a 65-inch touch overlay at approximately US$900, compared to the cost of a Kinect for Windows sensor at approximately US$250. For integrated touch solutions, the price can be far higher.
Since our announcement of Kinect for Windows version 1.5 in “What’s Ahead: A Sneak Peek,” a few questions have come up that I wanted to answer.
Some folks have thought that 1.5 includes new hardware. Version 1.5 is our new software release, coming out in the same timeframe in which we launch the current Kinect for Windows hardware in 19 additional countries. We will upgrade our software at a faster rate than we refresh our hardware.
We built version 1.5 of our software with 1.0 compatibility top of mind. Applications built using 1.0 will work on the same machine as an application built using 1.5 – this is something we plan to do always, ensuring that solutions built on older runtimes can always run side by side with solutions using newer runtimes. Furthermore, we have maintained API compatibility for developers: applications currently being built with the 1.0 SDK can be recompiled with the 1.5 SDK without any changes. No one has to wait for 1.5 to get a Kinect for Windows sensor or to start coding with the current SDK!
I love the enthusiasm for the 1.5 SDK and runtime, the new speech languages, and for the new countries we’re launching in – we can’t wait to deliver it to you.
Craig Eisler, General Manager, Kinect for Windows
The Imagine Cup competition—which recently concluded its tenth year—throws the spotlight on cutting-edge innovations. Two-thirds of the education-focused projects utilized Microsoft Kinect in a variety of ways, including interactive therapy for stroke victims, an automated cart to help make solo trips to crowded public places manageable for the disabled, and an application to help dyslexic children learn the alphabet.
Team Wi-GO of Portugal invented a Kinect-enabled cart to aid the disabled.
Students from 75 countries participated in the Imagine Cup Finals, held July 6 to 11 in Sydney, Australia, which featured more than 100 projects. Kinect for Windows played a significant role in this year's competition, with 28 Kinect-enabled projects across multiple categories—including Software Design, Game Design, Windows Azure, and a Fun Labs Challenge that was focused entirely on Kinect.
With the goal of using technology to help solve the world's toughest problems, students put Kinect to work providing the digital eyes, ears, and tracking capabilities needed for a range of potential new products and applications. We applaud all of the teams who incorporated Kinect for Windows into their projects this year! Here are highlights from a few of them:
"Imagine Cup is about giving students the resources and tools they need to succeed and then getting out of their way and letting them create," said Walid Abu-Hadba, corporate vice president of Microsoft's Developer and Platform Evangelism group. "Kinect in particular is unlocking a new class of interactive solutions. It's inspiring to watch the way students from a multitude of backgrounds find common ground as they combine their love of technology with their determination to make a difference. It's amazing."
We look forward to next year’s Imagine Cup. In the meantime, keep up the great work.
Kinect for Windows Team
• Kinect for Windows Gallery
• Imagine Cup website
• Imagine Cup winners and finalists
• Team wi-GO
• Team Whiteboard Pirates
• Team Flexify
• Italian Ingenium Team
• The D Labs
• Make a Sign
Students, teachers, researchers, and other educators have been quick to embrace Kinect’s natural user interface (NUI), which makes it possible to interact with computers using movement, speech, and gestures. In fact, some of the earliest Kinect for Windows applications to emerge were projects done by students, including several at last year’s Imagine Cup.
One project, from an Imagine Cup team in Italy, created an application for people with severe disabilities that enables them to communicate, learn, and play games on computers using a Kinect sensor instead of a traditional mouse or keyboard. Another innovative Imagine Cup project, done by university students in Russia, used the Kinect natural user interface to fold, rotate, and examine online origami models.
To encourage students, educators, and academic researchers to continue innovating with Kinect for Windows, special academic pricing on Kinect for Windows sensors is now available in the United States. The academic price is $149.99 through Microsoft Stores.
If you are an educator or faculty member at an accredited school, such as a university, community college, vocational school, or K-12 school, you can purchase a Kinect for Windows sensor at this price.
Find out if you qualify, and then purchase online or visit a Microsoft store in your area.
When we launched Kinect for Xbox 360 on November 4th, 2010, something amazing happened: talented Open Source hackers and enthusiasts around the world took the Kinect and let their imaginations run wild. We didn’t know what we didn’t know about Kinect on Windows when we shipped Kinect for Xbox 360, and these early visionaries showed the world what was possible. What we saw was so compelling that we created the Kinect for Windows commercial program.
Our commercial program is designed to allow our partners— companies like Toyota, Mattel, American Express, Telefonica, and United Health Group—to deploy solutions to their customers and employees. It is also designed to allow early adopters and newcomers alike to take their ideas and release them to the world on Windows, with hardware that’s supported by Microsoft. At the same time, we wanted to let our early adopters keep working on the hardware they’d previously purchased. That is why our SDK continues to support the Kinect for Xbox 360 as a development device.
As I reflect back on the past eleven months since Microsoft announced we were bringing Kinect to Windows, one thing is clear: The efforts of these talented Open Source hackers and enthusiasts helped inspire us to develop Kinect for Windows faster. And their continued ambition and drive will help the world realize the benefits of Kinect for Windows even faster still. From all of us on the Kinect for Windows team: thank you.
In March, ten startups will converge on Seattle to start developing commercial and gaming applications that utilize Kinect's innovative natural user interface (NUI). As part of the Microsoft Kinect Accelerator program, they will have three months and a wealth of resources—including access to Microsoft and industry mentors—to develop, and then present their applications to angel investors, venture capitalists, Microsoft executives, media, and influential industry leaders.
Since launching in late November, the Kinect Accelerator has received hundreds of applications from over forty countries, proposing transformative, creative innovations for healthcare, fitness, retail, training/simulation, automotive, scientific research, manufacturing, and much more.
Applications are still being accepted, and the Kinect Accelerator team encourages you to apply. Learn more about the application process.
The Kinect Accelerator program is powered by TechStars, one of the most respected technology accelerator programs in the world. Microsoft is working with TechStars to leverage the absolute best startup accelerator methodologies, mentors, and visibility. If you are considering building a business based on the capabilities of Kinect, this is a great opportunity for you.
Dave Drach, Managing Director, Microsoft Emerging Business Team, explains that the Kinect Accelerator program is looking for creative startups that have a passion for driving the next generation of computing. “Starting in the spring of 2012, they will have three months to bring their ideas to life. What will emerge will be applications and business scenarios that we’ve not seen before,” comments Drach.
Read more about the Kinect Accelerator program.
A unique clinic for treating children with cancer and blood disorders, alex’s place is designed to be a warm, open, communal space. The center—which is located in Miami, Florida—helps put its patients at ease by engaging them with interactive screens that allow them to be transported into different environments—where they become a friendly teddy bear, frog, or robot and control their character’s movements in real time.
"As soon as they walk in, technology is embracing them," said Dr. Julio Barredo, chief of pediatric services at alex's place in The Sylvester Comprehensive Cancer Center, University of Miami Health Systems.
The clinic—which opened its doors in May 2012—was conceived of and designed with this in mind, and the Kinect for Windows digital experience was part of the vision from day one. Created by Snibbe Interactive, Character Mirror was designed to fit naturally within this innovative, unconventional treatment environment. The goal is to help reinforce patients' mind-body connection with engaging play and entertainment, as well as to potentially reduce their fear of technology and the treatments they face. As an added benefit, nurses can observe a child's natural range of movement during play and more easily draw out answers to key diagnostic questions.
"I find the gestural interactive experiences we created for alex's place in Miami among the most worthwhile and satisfying in our history," said Scott Snibbe, founder and CEO of Snibbe Interactive. "Kids in hospitals are feeling lonely, scared, and bored, not to mention sick. Partnering with Alex Daly and Dr. Barredo, we created a set of magical experiences that encourage healthy, social, and physical activity among the kids.
"Kids found these experiences so pleasing that they actually didn't want to leave after their treatments were complete," Snibbe added. "We are very excited to roll out these solutions to more hospitals, and transform healthcare through natural user interfaces that promote social play and spontaneous physical therapy."