The following blog post was guest authored by Celeste Humphrey, business development consultant at nsquared, and Dr. Neil Roodyn, director of nsquared.
nsquared, a Sydney, Australia-based company that is passionate about learning, technology, and creating awesome user experiences, has developed three new applications that take advantage of Kinect for Windows to provide users with interactive, natural user interface experiences.
At nsquared, we believe that vision-based interaction is the future of computing. The excitement we see in the technology industry regarding touch and tablet computing is a harbinger of the changes that are coming as smarter computer vision systems evolve.
Kinect for Windows has provided us with the tools to create some truly amazing products for education, hospitality, and events.
Education: nsquared sky spelling
We are excited to announce nsquared sky spelling, our first Kinect for Windows-based educational game. This new application, aimed at children aged 4 to 12, makes it fun for children to learn to spell in an interactive and collaborative environment. Each child selects a character or vehicle, such as a dragon, a biplane, or a butterfly, and then flies as that character through the sky to capture letters that complete the spelling of various words. The skeleton recognition capabilities of the Kinect for Windows sensor and software development kit (SDK) track the movement of the children as they stretch out their arms as wings to navigate their character through hoops alongside their wingman (another player). The color camera in the Kinect for Windows sensor allows each child to add their photo, thereby personalizing their experience.
nsquared sky spelling
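The "arms as wings" steering described above can be reduced to simple geometry on the tracked hand joints. The sketch below is purely illustrative, with our own hypothetical names and coordinate conventions rather than nsquared or SDK code: the roll of the line between the two hands yields a banking angle for the on-screen character.

```python
# Illustrative only (our own names and conventions, not nsquared or SDK code):
# with arms outstretched as wings, the roll of the line between the two hand
# joints gives a banking angle for steering the on-screen character.
import math

def steering_angle(left_hand, right_hand):
    """Bank angle in radians from two (x, y, z) hand positions in meters.

    0 = level flight; positive = banking right (right hand lower).
    """
    dx = right_hand[0] - left_hand[0]
    dy = right_hand[1] - left_hand[1]
    return math.atan2(-dy, dx)
```

With level arms the bank is zero; dropping the right hand about 20 centimeters below the left produces a positive angle, banking the character to the right.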
Hospitality: nsquared hotel kiosk
The nsquared hotel kiosk augments the concierge function in a hotel by providing guidance to hotel guests through an intuitive, interactive experience. Guests can browse through images and videos of activities, explore locations on a map, and find out what's happening with a live event calendar. It also provides live weather updates and has customizable themes. The nsquared hotel kiosk uses the new gestures supported in the Kinect for Windows SDK 1.7, enabling users to use a “grip” gesture to drag content across the screen and a “push” gesture to select content. With its fun user interface, this informative kiosk provides guests an interactive alternative to the old brochure rack.
Kinect for Windows technology enables nsquared to provide an interactive kiosk experience for less than half the price of a similarly sized touchscreen (see note).
nsquared hotel kiosk
Events: nsquared media viewer
The new nsquared media viewer application is a great way to explore interactive content in almost any environment. Designed for building lobbies, experience centers, events, and corporate locations, the nsquared media viewer enables you to display images and video by category in a stylish, customizable carousel. Easy to use, anyone can walk up and start browsing in seconds.
In addition to taking advantage of key features of the Kinect for Windows sensor and SDK, nsquared media viewer utilizes Windows Azure, allowing clients to view reports about the usage of the screen and the content displayed.
nsquared media viewer
Kinect for Windows technology has made it possible for nsquared to create applications that allow people to interact with content in amazing new ways, helping us take a step towards our collective future of richer vision-based computing systems.
Celeste Humphrey, business development consultant, and Dr. Neil Roodyn, director, nsquared
Note: Based on the price of a 65-inch touch overlay at approximately US$900, compared to the cost of a Kinect for Windows sensor at approximately US$250. For integrated touch solutions, the price can be far higher.
By now, most of you likely have heard about the new Kinect sensor that Microsoft will deliver as part of Xbox One later this year.
Today, I am pleased to announce that Microsoft will also deliver a new-generation Kinect for Windows sensor next year. We’re continuing our commitment to equipping businesses and organizations with the latest natural user interface technology from Microsoft so that they, in turn, can develop and deploy innovative touch-free applications for their businesses and customers. A new Kinect for Windows sensor and software development kit (SDK) are core to that commitment.
Both the new Kinect sensor and the new Kinect for Windows sensor are being built on a shared set of technologies. Just as the new Kinect sensor will bring opportunities for revolutionizing gaming and entertainment, the new Kinect for Windows sensor will revolutionize computing experiences. The precision and intuitive responsiveness that the new platform provides will accelerate the development of voice and gesture experiences on computers.
Some of the key capabilities of the new Kinect sensor include:
The enhanced fidelity and depth perception of the new Kinect sensor will allow developers to create apps that see a person's form better, track objects with greater detail, and understand voice commands in noisier settings.
The new sensor tracks more points on the human body than previously, including the tip of the hand and thumb, and tracks six skeletons at once. This opens up a range of new scenarios, from improved "avateering" to experiences in which multiple users can participate simultaneously.
I’m sure many of you want to know more. Stay tuned; at BUILD 2013 in June, we’ll share details about how developers and designers can begin to prepare to adopt these new technologies so that their apps and experiences are ready for general availability next year.
A new Kinect for Windows era is coming: an era of unprecedented responsiveness and precision.
Bob Heddle, Director, Kinect for Windows
Photos in this blog by STEPHEN BRASHEAR/Invision for Microsoft/AP Images
Reflexion Health, founded with technology developed at the West Health Institute, realized years ago that assessing physical therapy outcomes is difficult for a variety of reasons, and took on the challenge of designing a solution to help increase the success rates of rehabilitation from physical injury.
In 2011, the Reflexion team approached the Orthopedic Surgery Department of the Naval Medical Center San Diego to help test its new Rehabilitation Measurement Tool (RMT), a software solution developed to make physical therapy more engaging, efficient, and successful. By using the Kinect for Windows sensor and software development kit (SDK), the RMT allows clinicians to measure patient progress. Patients often do much of their therapy alone, and without immediate feedback from a therapist it can be difficult for them to be certain that they are performing the exercises in a way that provides optimal benefit. The RMT can indicate whether exercises were performed properly and how frequently, and it gives patients real-time feedback.
Reflexion Health's Kinect for Windows-based tool helps measure how patients respond to physical therapy.
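The kind of real-time form feedback described above can be illustrated with a small geometric check: the angle at a joint, computed from three tracked joint positions, is compared against a therapist-set target range. This is a hypothetical Python sketch, not Reflexion Health's actual RMT code; joint positions are assumed to be (x, y, z) tuples in meters.

```python
# Hypothetical sketch, not Reflexion Health's RMT code: score an exercise by
# the angle at a joint, computed from three tracked (x, y, z) positions in
# meters, against a therapist-set target range.
import math

def joint_angle(a, b, c):
    """Angle at joint b, in degrees, between segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(p * q for p, q in zip(v1, v2))
    norm = math.sqrt(sum(p * p for p in v1)) * math.sqrt(sum(p * p for p in v2))
    return math.degrees(math.acos(dot / norm))

def exercise_ok(hip, knee, ankle, target=90.0, tolerance=10.0):
    """Real-time feedback: is the knee bend within tolerance of the target?"""
    return abs(joint_angle(hip, knee, ankle) - target) <= tolerance

print(exercise_ok((0, 1, 0), (0, 0, 0), (1, 0, 0)))    # True  (90-degree bend)
print(exercise_ok((0, 1, 0), (0, 0.5, 0), (0, 0, 0)))  # False (leg straight)
```

A real system would also log repetition counts and angle histories over a session, which is the kind of quantitative record the RMT makes available to clinicians.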
“Kinect for Windows helps motivate patients to do physical therapy—and the data set we gather when they use the RMT is becoming valuable to demonstrate what form of therapy is most effective, what types of patients react better to what type of therapy, and how to best deliver that therapy. Those questions have vexed people for a long time,” says Dr. Ravi Komatireddy, co-founder at Reflexion Health.
The proprietary RMT software engages patients with avatars and educational information, and a Kinect for Windows sensor tracks a patient’s range of motion and other clinical data. This valuable information helps therapists customize and deliver therapy plans to patients.
“RMT is a breakthrough that can change how physical therapy is delivered,” says Spencer Hutchins, co-founder and CEO of Reflexion Health. “Kinect for Windows helps us build a repository of information so we can answer rigorous questions about patient care in a quantitative way.” Ultimately, Reflexion Health has demonstrated how software could be prescribed, much as pharmaceuticals and medical devices are, and how it could possibly lower the cost of healthcare.
More information about RMT and the clinical trials conducted by the Naval Medical Center can be found in the newly released case study.
Kinect for Windows team
As you might imagine, working in a nuclear power plant provides special challenges. One crucial aspect for any project is the need to minimize employee exposure to radiation by applying a standard known as As Low As Reasonably Achievable—ALARA for short.
How this works: Plant ALARA managers work with the maintenance groups to estimate how much time is required to perform a task and, allowing for exposure limits, they determine how many employees may be needed to safely complete it. Today, that work is typically done with pen and paper. But new tools from Siemens PLM Software that incorporate the Kinect for Windows sensor could change this by providing a 3-D virtual interactive modeling environment.
Kinect for Windows is used to capture realistic movement for use in the Siemens Teamcenter solution for ALARA radiation planning.
The solution, piloted at a US nuclear power plant last year, is built on Siemens’ Teamcenter software, using its Tecnomatix Process Simulate product. Tecnomatix provides virtual 3-D human avatars, “Jack” and “Jill,” whose motions are driven by input from a Kinect for Windows sensor. This solution is helping to usher in a new era of industrial planning applications for employee health and safety in the nuclear industry.
"We're really revolutionizing the industry," said Erica Simmons, global marketing manager for Energy, Oil, and Gas Industries at Siemens PLM Software. "For us, this was a new way to develop a product in tandem with the industry associations. We created a specific use case with off-the-shelf technology and tested and validated it with industry. What we have now is a new visual and interactive way of simulating potential radiation exposure which can lead to better health and safety strategies for the plant."
Traditional pencil-and-paper planning (left) compared to the Siemens PLM Software Process Simulate on Teamcenter solution (right) with “Jack” avatar and Kinect for Windows movement input.
The Siemens Tecnomatix process planning application, integrated with the Kinect for Windows system, will give nuclear plant management the ability to better manage individual worker radiation exposure and optimize steps to reduce overall team exposure. As a bonus, once tasks have been recorded by using “Jack,” the software can be used for training. Employees can learn and practice an optimized task by using Kinect for Windows and Siemens “Jack” and “Jill”—safely outside of the radiation zone—until they have mastered it and are ready to perform the actual work.
"We wanted to add something more for the user of this solution in addition to our 3-D human avatars and the hazard zones created by our visual cartography; this led us to exploring what we could do with the Kinect for Windows SDK for this use case," said Dr. Ulrich Raschke, director of Human Simulation Technologies at Siemens PLM Software. “User feedback has been good so far; the addition of the Kinect for Windows system adds another level of interactivity to our application."
This Siemens solution grew out of a collaborative effort with Electric Power Research Institute (EPRI) and Fiatech industry association, which identified the need for more technologically advanced estimation tools for worker radiation dosage. Kinect for Windows was incorporated when the developers were tailoring the avatar system to the solution and exploring ways to make the user experience much more interactive.
"Collaboration with several key stakeholders and industry experts led to this innovative solution," said Phung Tran, senior project manager at EPRI. "We're pleased the industry software providers are using it, and look forward to seeing the industry utilize these new tools."
“In fact,” Tran added, “the tool is not necessarily limited to radiation work planning. It could help improve the management and execution of many operation, maintenance, and project-based tasks.”
Yes, it’s the moment many of you have been waiting for: Kinect for Windows SDK 1.7 is available for download! We’ve included a few photos of the key features: Kinect Interactions and Kinect Fusion. Or if you’re a developer, you can download the SDK and get started immediately.
A woman demonstrates the new Kinect Interactions, which are included in the Kinect for Windows SDK 1.7: counter-clockwise from top left: “push” to select, “grab” to scroll and pan, and wave to identify primary user. Two-handed zoom (top right) is not included but can be built with this new SDK.
Kinect Interactions are designed to let users intuitively do things like press their hand forward a few inches to push a button, or close their hands to “grip and pan” as seen here. Now you can untether yourself and move around a conference room naturally.
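A "push" of this kind can be detected by watching the tracked hand's distance from the sensor over a short window. The following is a deliberately simplified sketch of the idea; the function name and threshold are our own illustrative assumptions, and the real Kinect Interactions recognizer is far more robust.

```python
# Hypothetical sketch of "push" detection, not the actual Kinect Interactions
# recognizer: track the hand's distance from the sensor (z, in meters) over a
# short window and fire when it closes by a few inches (~5 cm here).

def detect_push(hand_z_history, threshold_m=0.05):
    """True if the hand moved toward the sensor by at least threshold_m."""
    return max(hand_z_history) - hand_z_history[-1] >= threshold_m

print(detect_push([0.60, 0.58, 0.54]))  # True
print(detect_push([0.60, 0.60, 0.59]))  # False
```

A production recognizer would also debounce the gesture, ignore sideways drift, and track which user's hand is the primary one, which is what the SDK's built-in interactions handle for developers.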
In this physical therapy scenario, Kinect for Windows enables a therapist to interact with the computer without leaving her patient’s side.
Customers can virtually try on merchandise, such as sunglasses, by using business solutions created with the new Kinect for Windows SDK 1.7. Even if a color, model, or size is not in stock, customers can still see what it looks like on them.
Kinect Fusion, a tool also included in Kinect for Windows SDK 1.7, can create highly accurate 3-D renderings of people and objects in real time.
Kinect Fusion makes it possible to create highly accurate 3-D renderings at a fraction of the price it would cost with traditional high-end 3-D scanners.
Kinect Fusion opens up a variety of new scenarios for businesses and developers, including augmented reality, 3-D printing, interior and industrial design, and body scanning for things like custom fitting and improved clothes shopping.
The Kinect for Windows Team
Today at Engadget Expand, I announced that Kinect for Windows SDK 1.7 will be made available this coming Monday, March 18. This is our most significant update to the SDK since we released the first version a little over a year ago, and I can’t wait to see what businesses and developers do with the new features and enhancements.
On Monday, developers will be able to download the SDK, developer toolkit, and the new and improved Human Interface Guidelines (HIG) from our website. In the meantime, here’s a sneak peek:
Kinect Interactions give businesses and developers the tools to create intuitive, smooth, and polished applications that are ergonomic and intelligently based on the way people naturally move and gesture. The interactions include push-to-press buttons, grip-to-pan capabilities, and support for smart ways to accommodate multiple users and two-person interactions. These new tools are based on thousands of hours of research, development, and testing with a broad and diverse group of people. We wanted to save businesses and developers hours of development time while making it easier for them to create gesture-based experiences that are highly consistent from application to application and utterly simple for end users. With Kinect Interactions, businesses can more quickly develop customized, differentiated solutions that address important business needs and attract, engage, and delight their customers.
Kinect for Windows Interactions transform how people interact with computers in settings ranging from retail to education, training, and physical therapy.
Kinect Fusion is one of the most affordable tools available today for creating accurate 3-D renderings of people and objects. Kinect Fusion fuses together multiple snapshots from the Kinect for Windows sensor to create accurate, full, 3-D models. Developers can move a Kinect for Windows sensor around a person, object, or environment and “paint” a 3-D image of the person or thing in real time. These 3-D images can then be used to enhance countless real-world scenarios, including augmented reality, 3-D printing, interior and industrial design, and body scanning for things such as improved clothes shopping experiences and better-fitting orthotics. Kinect Fusion is something many of our partners have been asking for and we’re thrilled to be delivering it now.
Kinect Fusion enables developers to create accurate 3-D renderings in real time.
The updated SDK also includes an enhanced developer toolkit and additional developer resources.
Seeing is believing
We demonstrated Kinect Interactions and Kinect Fusion live, onstage at Engadget Expand. You can watch the webcast of those demos now—and then come back to download the latest SDK on March 18. It’s fully compatible with all previous commercial releases, so we encourage everyone to upgrade to the new version. There’s no reason not to!
As always, we are constantly evolving the technology and want to know what you think. And we love hearing about the solutions you’re developing with Kinect for Windows, so please join us at Facebook and Twitter.
The Kinect for Windows sensor, together with the SDK, can help you create engaging applications that take natural voice and gesture computing to the next level.
Bob Heddle, Director, Kinect for Windows
Shortly after the commercial release of Kinect for Windows in early 2012, Microsoft announced the availability of academic pricing for the Kinect for Windows sensor to higher education faculty and students for $149.99 at the Microsoft Store in the United States. We are now pleased to announce that we have broadened the availability of academic pricing through Microsoft Authorized Educational Resellers (AERs).
Most of these resellers have the capability to offer academic pricing directly to educational institutions; academic researchers; and students, faculty, and staff of public or private K-12 schools, vocational schools, junior colleges, colleges, universities, and scientific or technical institutions. In the United States, eligible institutions are accredited by associations that are recognized by the US Department of Education and/or the State Board of Education. Academic pricing on the Kinect for Windows sensor is currently available through AERs in the United States, Taiwan, and Hong Kong SAR.
Within the academic community, the potential of Kinect for Windows in the classroom is generating a lot of excitement. Researchers in higher education collaborate with Microsoft Research on a variety of projects that involve educational uses of Kinect for Windows. The educator-driven community resource KinectEDucation encourages developers, teachers, students, enthusiasts, and other education stakeholders to help transform classrooms with accessible technology. One such development is a new product from Kaplan Early Learning Company, the Inspire-NG Move, bundled with the Kinect for Windows sensor. The bundle includes four educational programs for children ages three and older, which show children that hands-on, kinesthetic play with a purpose makes learning fun. The bundle currently sells for US$499.
“We’re excited about the new learning models that are enabled by Kinect for Windows,” stated Chris Gerblick, vice president of IT and Professional Services at Kaplan Early Learning Company. “We see the Inspire-NG Move family of products as excellent learning tools for both the classroom and the home.”
With the availability of academic pricing, we look forward to many developments from the academic community that integrate Kinect for Windows into interactive educational experiences.
Michael Fry, Business Development, Strategic Alliances, Kinect for Windows
Revealed in November as a future addition to the Kinect for Windows SDK, Kinect Fusion made a big impression at the annual TechFest event hosted by Microsoft Research this week in Redmond, Washington.
Kinect Fusion pulls depth data that is generated by the Kinect for Windows sensor and, from the sequence of frames, constructs a highly detailed 3-D map of objects or environments. The tool averages readings over hundreds or even thousands of frames to create a rich level of detail.
Kinect Fusion, shown during TechFest 2013, enables high-quality scanning and reconstruction of 3-D models using just a handheld Kinect for Windows sensor and the Kinect for Windows SDK.
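The frame-averaging idea described above is easiest to see in one dimension: repeated noisy depth readings of the same surface converge toward the true depth as frames accumulate. The sketch below shows only that intuition; the actual Kinect Fusion algorithm aligns each incoming frame and integrates a truncated signed distance function over a 3-D volume.

```python
# One-dimensional intuition only: the real Kinect Fusion aligns each frame and
# integrates a truncated signed distance function over a 3-D volume. Here,
# averaging aligned depth readings across frames cancels per-frame sensor noise.

def fuse(depth_frames):
    """Average aligned depth readings position-by-position across frames."""
    n = len(depth_frames)
    return [sum(frame[i] for frame in depth_frames) / n
            for i in range(len(depth_frames[0]))]

# Five noisy readings of a surface 2.0 m from the sensor.
frames = [[2.25], [1.75], [2.0], [2.5], [1.5]]
print(fuse(frames))  # [2.0]
```

Averaged over hundreds or thousands of frames, as the tool does, this is what turns a noisy consumer depth sensor into a source of richly detailed models.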
"The amazing thing about this solution is how you can take an off-the-shelf Kinect for Windows sensor and create 3-D models rapidly," said Shahram Izadi, senior researcher at Microsoft Research Cambridge. "Normally when you think of Kinect, you think of a static sensor in a living room. But with Kinect Fusion, we allow the user to hold the camera, explore their space, and rapidly scan the world around them."
When scanning smaller objects, you also have the option to simply move the object instead of the sensor.
The Cambridge researchers and Kinect for Windows team collaborated closely on Kinect Fusion to construct a tool that can enable businesses and developers to devise new types of applications.
"This has been a wonderful example of collaboration between Microsoft Research and our product group," said Kinect for Windows Senior Program Manager Chris White. "We have worked shoulder-to-shoulder over the last year to bring this technology to our customers. The deep engagement that we have maintained with the original research team has allowed us to incorporate cutting-edge research, even beyond what was shown in the original Kinect Fusion paper."
"This kind of collaboration is one of the unique strengths of Microsoft, where we can bring together world-class researchers and world-class engineers to deliver real innovation," White added. "Kinect Fusion opens up a wide range of development possibilities—everything from gaming and augmented reality to industrial design. We're really excited to be able to include it in a future release of the Kinect for Windows SDK."
Much like Build-A-Bear Workshop, Mattel has been watching the trends and finding that children are embracing digital media. How can the company keep a toy like the Barbie doll, launched in 1959, relevant in a world where tablet computers and smartphones dominate kids' wishlists?
Once again, Kinect for Windows has proved a formidable ally in bridging the gap between digital entertainment and traditional toys. In a six-month project for Mattel, Gun Communications and creative applications developer Adapptor built Barbie: the Dream Closet, which lets enthusiasts of all ages across Australia virtually try on a variety of Barbie outfits from different decades by using a Kinect for Windows-enabled "magic mirror." Have you ever wondered what you’d look like in one of Barbie's ball gowns, or as an astronaut, or a race car driver? With the Dream Closet, it's possible. Additionally, you can save and share photos over social media, or even take a photo home.
To build the application, each outfit was photographed on a Barbie doll, trimmed into its component parts, and then reconstructed dynamically on Barbie fans by the custom Dream Closet application, built in Microsoft XNA. The Kinect for Windows sensor and software development kit (SDK) make it easy to accurately determine the size of the user so that the virtual clothes and selection menus can be fitted to match.
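Fitting photographed outfit parts to a tracked user comes down to a scale factor derived from the skeleton, for example from shoulder width. The following is an illustrative sketch only; the function name, constants, and coordinate conventions are our own assumptions, not the Dream Closet's XNA code.

```python
# Hypothetical sketch, not the Dream Closet's XNA code: fit a photographed
# outfit sprite to the tracked user by scaling it to the user's shoulder
# width. All names and constants here are illustrative assumptions.

def overlay_scale(left_shoulder_x, right_shoulder_x, sprite_width_px,
                  pixels_per_meter=500.0):
    """Scale factor that maps the outfit art onto the user's on-screen body."""
    user_width_px = abs(right_shoulder_x - left_shoulder_x) * pixels_per_meter
    return user_width_px / sprite_width_px

# Shoulders 0.4 m apart map to 200 px on screen, so a 100 px-wide
# sprite needs to be roughly doubled in size.
scale = overlay_scale(-0.2, 0.2, 100.0)
```

The same skeleton-derived measurements can position the selection menus within comfortable arm's reach, which is part of what makes the experience feel fitted to each user.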
"If we would have had to write code from the ground up [versus using code provided in the SDK], it would have taken much longer, and the end result wouldn’t have been nearly as impactful," said Adapptor Managing Director Mark Loveridge. "The Kinect for Windows SDK doubled our development speed."
The result of Barbie: the Dream Closet? Increased customer brand loyalty and media coverage yielding more than 25 million impressions, a new case study reports.
"The impact of Kinect for Windows on the public and the Barbie brand is incredible," notes Mattel Marketing Director Amanda Allegos. "Kinect for Windows has given us a new way to reach existing Barbie fans and attract new ones in a way that's contemporary, interactive, and bridges both the digital and physical worlds."
Almost two years ago, Microsoft announced its intent to take Kinect beyond gaming and make it possible for developers and businesses to innovate with Kinect on computers. The Kinect for Windows team was born.
Shortly after that, I joined the team to oversee Program Management, and over the past year, we’ve shipped the Kinect for Windows sensor as well as multiple updates to the Kinect for Windows software development kit (SDK). Throughout it all, Craig Eisler has been leading our business.
This month, Craig is moving on to do other important work at Microsoft, and I am stepping in to lead the Kinect for Windows team. I am excited to maintain the amazing momentum we’ve seen in industries like healthcare, retail, education, and automotive. There have been more than 500,000 downloads of our free SDK, and the Kinect for Windows sensor can be purchased in 39 regions today.
Such rapid growth would not have been possible without the community embracing the technology. Thanks to all of you—business leaders, technical leaders, creative visionaries, and developers—Kinect for Windows has been deployed across the globe. The community is developing new ways for consumers to shop for clothing and accessories, interesting digital signage that delights and inspires customers, remote monitoring tools that make physical therapy easier, more immersive training and simulation applications across multiple industries, and touch-free computing tools that enable surgeons to view patient information without having to leave the operating room. The list goes on and on…and the list is growing every day.
We launched Kinect for Windows nearly one year ago, pioneering a commercial technology category that didn’t previously exist. I look forward to continuing to be at the forefront of touch-free computing and helping our partners develop innovative solutions that take the natural user interface vision even further. We’ve said it before, and I’ll say it again: this is just the beginning. I’m thrilled to continue the great foundational work we did in 2012 and look forward to a very productive 2013.