• Kinect for Windows Product Blog

    BUILDing business with Kinect for Windows v2


    BUILD—Microsoft’s annual developer conference—is the perfect showcase for inventive, innovative solutions created with the latest Microsoft technologies. As we mentioned in our previous blog, some of the technologists who have been part of the Kinect for Windows v2 developer preview program are here at BUILD, demonstrating their amazing apps. In this blog, we’ll take a closer look at how Kinect for Windows v2 has spawned creative leaps forward at two innovative companies: Freak’n Genius and Reflexion Health.

    Left: A student is choosing a Freak’n Genius character to animate in real time for a video presentation on nutrition. Right: Vera, by Reflexion Health, can track a patient performing physical therapy exercises at home and give her immediate feedback on her execution while also transmitting the results to her therapist.

    Freak’n Genius is a Seattle-based company whose current YAKiT and YAKiT Kids applications, which let users create talking photos on a smartphone, have been used to generate well over a million videos.

    But with Kinect for Windows v2, Freak’n Genius is poised to flip animation on its head, taking what has been highly technical, time-consuming, and expensive and making it instant, free, and fun. It’s performance-based animation without the suits, tracking balls, and room-size setups. Freak’n Genius has developed software that will enable just about anyone to create cartoons with fully animated characters by using a Kinect for Windows v2 sensor. The user simply chooses an on-screen character—the beta features 20 characters, with dozens more in the works—and animates it by standing in front of the Kinect for Windows sensor and moving. With its precise skeletal tracking capabilities, the v2 sensor captures the “animator’s” every twitch, jump, and gesture, translating them into movements of the on-screen character.
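    To make the idea concrete, here is a minimal sketch of the core mapping such an app performs: projecting a tracked joint’s camera-space position onto screen coordinates that drive a 2-D character. This is not Freak’n Genius’s actual code or the Kinect SDK API; the joint names, coordinate ranges, and screen size are all illustrative assumptions.

```python
# Hypothetical sketch: map tracked joints (meters, camera space) to pixels.
# Joint names and the tracked play-area ranges below are invented for the example.

def to_screen(joint_xyz, screen_w=1920, screen_h=1080,
              x_range=(-1.5, 1.5), y_range=(-1.0, 1.0)):
    """Project a camera-space joint (x right, y up) to screen pixels."""
    x, y, _z = joint_xyz
    # Normalize into [0, 1] within the assumed play area, then scale.
    u = (x - x_range[0]) / (x_range[1] - x_range[0])
    v = 1.0 - (y - y_range[0]) / (y_range[1] - y_range[0])  # flip y for screen
    return (round(u * screen_w), round(v * screen_h))

# Each new skeleton frame repositions the puppet's matching body parts.
skeleton = {"head": (0.0, 0.8, 2.0), "hand_right": (0.6, 0.2, 1.8)}
puppet = {name: to_screen(pos) for name, pos in skeleton.items()}
```

    In a real pipeline this mapping would run on every skeleton frame the sensor delivers, so the character mirrors the performer with no per-frame authoring.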

    What’s more, with the ability to create Windows Store apps, Kinect for Windows v2 stands to bring Freak’n Genius’s improved animation applications to countless new customers. Dwayne Mercredi, the chief technology officer at Freak’n Genius, says that “Kinect for Windows v2 is awesome. From a technology perspective, it gives us everything we need so that an everyday person can create amazing animations immediately.” He praises how the v2 sensor reacts perfectly to the user’s every movement, making it seem “as if they were in the screen themselves.” He also applauds the v2 sensor’s color camera, which provides full HD at 1080p. “There’s no reason why this shouldn’t fully replace the webcam,” notes Mercredi.

    Mercredi notes that YAKiT is already being used for storytelling, marketing, education reports, enhanced communication, or just having fun. With Kinect for Windows v2, Freak’n Genius envisions that kids of all ages will have an incredibly simple and entertaining way to express their creativity and humor while professional content creators—such as advertising, design, and marketing studios—will be able to bring their content to life either in large productions or on social media channels. There is also a white-label offering, giving media companies the opportunity to use their content in a new way via YAKiT’s powerful animation engine.

    While Freak’n Genius captures the fun and commercial potential of Kinect for Windows v2, Reflexion Health shows just how powerful the new sensor can be for the healthcare field. As anyone who’s ever had a sports injury or accident knows, physical therapy (PT) can be a crucial part of recovery. Physical therapists are rigorously trained and dedicated to devising a tailored regimen of manual treatment and therapeutic exercises that will help their patients mend. But increasingly, patients’ in-person treatment time has shrunk to mere minutes, and, as any physical therapist knows, once patients leave the clinic, many of them lose momentum, often struggling to perform the exercises correctly at home—or simply skipping them altogether.

    Reflexion Health, based in San Diego, uses Kinect for Windows to augment its physical therapy program and give therapists a powerful, data-driven new tool to help ensure that patients get the maximum benefit from their PT. Its application, named Vera, uses Kinect for Windows to track patients’ exercise sessions. The initial version of this app was built on the original Kinect for Windows, but the team eagerly—and easily—adapted the software to the v2 sensor and SDK. The new sensor’s improved depth sensing and enhanced skeletal tracking, which delivers information on more joints, allow the software to capture the patient’s exercise moves in far more precise detail. It provides patients with a model for how to do the exercise correctly, and simultaneously compares the patient’s movements to the prescribed exercise. The Vera system thus offers immediate, real-time feedback—no more wondering if you’re lifting or twisting in the right way. The data on the patient’s movements are also shared with the therapist, so that he or she can track the patient’s progress and adjust the exercise regimen remotely for maximum therapeutic benefit.
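    As a rough illustration of the kind of real-time check such a system might perform (a sketch only; Reflexion Health’s actual algorithms are not public, and the joint positions and thresholds here are invented), one can compute a joint angle from three tracked 3-D points and compare it to a prescribed range:

```python
import math

def angle_deg(a, b, c):
    """Angle at joint b (degrees) formed by 3-D points a-b-c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def feedback(hip, knee, ankle, target=(80, 100)):
    """Compare a knee bend (180 = straight leg) to an assumed target range."""
    bend = angle_deg(hip, knee, ankle)
    lo, hi = target
    if bend > hi:
        return "bend deeper"
    if bend < lo:
        return "don't go so deep"
    return "good rep"
```

    The same angle data, logged per repetition, is the sort of quantitative record a therapist could review remotely.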

    Not only does the Kinect for Windows application provide better results for patients and therapists, it also fills a need in an enormous market. PT is a $30 billion business in the United States alone—and a critical tool in helping to manage the $127 billion burden of musculoskeletal disorders. By extending the expertise and oversight of the best therapists, Reflexion Health hopes to empower and engage patients, helping to improve the speed and quality of recovery while also helping to control the enormous costs that come from extra procedures and re-injury. Moreover, support for Kinect for Windows v2 apps in the Windows Store stands to open up home distribution for Reflexion Health.

    Mark Barrett, a lead software engineer at Reflexion Health, is struck by the rewards of working on the app. Coming from a background in the games industry, he now enjoys using Kinect technology to “try and tackle such a large and meaningful problem. That’s just a fantastic feeling.” As a developer, he finds the improved skeletal tracking the v2 sensor’s most significant change, calling it a real step forward from the original Kinect for Windows. “It’s so much more precise,” he says. “There are more joints, and they’re in more accurate positions.” And while the skeletal tracking has made the greatest improvement in Reflexion Health’s app—giving both patients and clinicians more accurate and actionable data on precise body movements—Barrett is also excited about the new color camera and depth sensor, which together provide a much better image for the physical therapist to review. “You see such a better representation of the patient…It was jaw-dropping the first time I saw it,” he says.

    But like any cautious dev, Barrett acknowledges being apprehensive about porting the application to the Kinect for Windows v2 sensor. Happily, he discovered that the switch was painless, commenting that “I’ve never had a hardware conversion from one version to the next be so effortless and so easy.” He’s also been pleased to see how easy the application is for patients to use. “It’s so exciting to be working on a solution that has the potential to help so many people and make people’s lives better. To know that my skills as a developer can help make this possible is a great feeling.”

    From creating your own animations to building a better path for physical rehabilitation, the Kinect for Windows v2 sensor is already in the hands of thousands of developers. We can’t wait to make it publicly available this summer and see what the rest of you do with the technology.

    The Kinect for Windows Team


    Making Learning More Interactive and Fun for Young Children


    Although no two people learn in exactly the same way, the process of learning typically involves seeing, listening/speaking, and touching. For most young children, all three senses are engaged in the process of grasping a new concept.

    For example, when a red wooden block is given to a toddler, they hear the words “red” and “block,” see the color red, and also use their hands to touch and feel the shape of the wooden block.

    Uzma Khan, a graduate student in the Department of Computer Science at the University of Toronto, realized the Kinect natural user interface (NUI) could provide similar experiences. She used the Kinect for Windows SDK to create a prototype of an application that utilizes speech and gestures to simplify complex learning, and make early childhood education more fun and interactive.

    The application asks young children to perform an activity, such as identifying the animals that live on a farm. Using their hands to point to the animals on a computer screen, along with voice commands, the children complete the activities. To reinforce their learning, the application praises them when they make a correct selection.

    The speech and gesture recognition capabilities of Kinect let children not only learn by seeing, listening, and speaking, but also actively participate by selecting, copying, moving, and manipulating colors, shapes, objects, patterns, letters, numbers, and much more.

    The creation of applications to aid learning for people of all ages is one of the many ways we anticipate Kinect for Windows will be used to enable a future in which computers work more naturally and intelligently to improve our lives.

    Sheridan Jones
    Business and Strategy Director, Kinect for Windows


    Styku Smart Fitting Room Expected to Improve Online Shopping Experience, Reduce Returns


    Styku, a Kinect Accelerator startup, set out to transform clothes shopping for retailers, using the Kinect for Windows sensor and software development kit to develop its Smart Fitting Room quickly, a new case study reports.

    The technology will soon be used by Brooks Brothers, IM-Label, and other fashion retailers. Styku hopes to improve the shopping experience—reducing the problem of shoppers returning up to 40 percent of their online purchases and offering a faster, less expensive body scanning solution. Additionally, military apparel contractors appreciate the improved measurement capability of Kinect for Windows with the Styku software—estimated to be up to 400 percent more accurate—which could save soldiers' lives, thanks to better fitting body armor.

    Consumers scan their bodies with the Kinect for Windows sensor in a dressing room.

    Customers can quickly visualize the fit and fabric characteristics of garments over digital renderings of their bodies that are created by scanning their body with the Kinect for Windows sensor. The scan lasts only one second—reducing the risk that a fidgety customer will compromise the scan’s accuracy. Clothing is rendered in 3-D, and customers can use gestures to rotate the model, view a custom-fit color map, and compare multiple sizes.

    "Kinect for Windows had exactly the sensors that we needed, in a small package," said Pierre Du Charme, vice president of Software Engineering for Styku. "The SDK was easy to learn and gave us the tools to quickly implement a full-featured application."

    Kinect for Windows team


    Kinect for Windows: Developer Toolkit Update (v1.5.2)


    We’re pleased to announce the release of Developer Toolkit update v1.5.2, which includes:

    • WPFD3Dinterop. This new sample demonstrates DirectX 11 interoperability with Windows Presentation Foundation (WPF), which enables powerful DirectX rendering composed with quicker-to-develop WPF user interfaces.
    • Improved Kinect Studio playback. We’ve fixed a known issue from version 1.5.1 of Kinect Studio to ensure that depth-to-color mapping works properly, even if you play an XED clip on a machine without the identical Kinect sensor that recorded it.

    If you have already installed the Kinect for Windows SDK, simply download the new v1.5.2 Developer Toolkit Update. If you are new to Kinect for Windows, download both the Kinect for Windows SDK v1.5 and the Developer Toolkit v1.5.2.

    Rob Relyea
    Program Manager, Kinect for Windows


    Reflexion Health advancing physical therapy with Kinect for Windows


    Reflexion Health, founded with technology developed at the West Health Institute, realized years ago that assessing physical therapy outcomes is difficult for a variety of reasons, and took on the challenge of designing a solution to help increase the success rates of rehabilitation from physical injury.

    In 2011, the Reflexion team approached the Orthopedic Surgery Department of the Naval Medical Center San Diego to help test their new Rehabilitation Measurement Tool (RMT). This software solution was developed to make physical therapy more engaging, efficient, and successful. By using the Kinect for Windows sensor and software development kit (SDK), the RMT allows clinicians to measure patient progress. Patients often do much of their therapy alone, and because they lack immediate feedback from therapists, it can be difficult for them to be certain that they are performing the exercises in a manner that will provide them with optimal benefits. The RMT can indicate whether exercises were performed properly and how frequently, and give patients real-time feedback.

    Reflexion Health's Kinect for Windows-based tool helps measure how patients respond to physical therapy.

    “Kinect for Windows helps motivate patients to do physical therapy—and the data set we gather when they use the RMT is becoming valuable to demonstrate what form of therapy is most effective, what types of patients react better to what type of therapy, and how to best deliver that therapy. Those questions have vexed people for a long time,” says Dr. Ravi Komatireddy, co-founder at Reflexion Health.

    The proprietary RMT software engages patients with avatars and educational information, and a Kinect for Windows sensor tracks a patient’s range of motion and other clinical data. This valuable information helps therapists customize and deliver therapy plans to patients.

    “RMT is a breakthrough that can change how physical therapy is delivered,” Spencer Hutchins, co-founder and CEO of Reflexion Health says. “Kinect for Windows helps us build a repository of information so we can answer rigorous questions about patient care in a quantitative way.” Ultimately, Reflexion Health has demonstrated how software could be prescribed—similarly to pharmaceuticals and medical devices—and how it could possibly lower the cost of healthcare.

    More information about RMT and the clinical trials conducted by the Naval Medical Center can be found in the newly released case study.

    Kinect for Windows team


    Partners Deliver Custom Solutions that Use Kinect for Windows


    Kinect for Windows demos at Microsoft Worldwide Partner Conference

    Kinect for Windows partners are finding new business opportunities by helping to develop new custom applications and ready-made solutions for various commercial customers, such as the Coca-Cola Company, and vertical markets, including the health care industry.

    Several of these solutions were on display at the Microsoft Worldwide Partner Conference (WPC) in Toronto, Canada, where Kinect for Windows took the stage with two amazing demos as well as strong booth showings at the Solutions Innovation Center.

    "Being part of the WPC 2012 event was a great opportunity to showcase our Kinect-based 3-D scanner, and the response was incredibly awesome, both on stage when the audience would spontaneously clap and cheer in the middle of the scan, and in the Kinect for Windows trade show area where people would stand in line to get scanned," said Nicolas Tisserand, co-founder of the France-based Manctl, one of the 11 companies in the Microsoft Accelerator for Kinect program.

    Manctl's Skanect scanner software uses the Kinect sensor to build high quality 3-D digital models of people and objects, which can be sent to a 3-D printer to create detailed plastic extruded sculptures. "Kinect for Windows is a fantastic device, capable of so much more than just game control. It's making depth sensing a commodity," Tisserand added.

    A demo from übi interactive in Germany uses the Kinect sensor to turn virtually any surface into a 3-D touchscreen that can control interfaces, apps, and games. "Kinect for Windows is a great piece of hardware and it works perfect[ly] with our software stack," reported übi co-founder David Hajizadeh. "As off-the-shelf hardware, it massively reduced our costs and we see lots of opportunities for business applications that offer huge value for our customers."

    Snibbe Interactive created its SocialMirror Coke Kiosk to deliver a Kinect-based game in which players aim a stream of soda into a glass and then share videos of the experience with their social networks. "We were extremely excited to show off our unique Coca-Cola branded interactive experience and its unique ability to create instant ROI [return on investment] through our viral marketing component," reported Alan Shimoide, director of engineering at Snibbe.

    InterKnowlogy developed KinectHealth to assist doctors with motion-controlled access to patient records and surgery planning tools. "A true game changer, Kinect for Windows allows our designers and developers to think differently about business cases across many verticals," noted Kevin Custer, the director of strategic marketing and partnerships at InterKnowlogy. "Kinect for Windows is not just how we interact with computers, but it offers unique ways to add gesture and voice to our natural user-interface designed software—the combination of which is changing lives of customers and users alike."

    "Avanade has already delivered several innovative solutions using Kinect, and we expect that demand to keep growing," said Ben Reierson, innovation manager at Avanade, whose Kinect for Virtual Healthcare includes video chat for connecting clinics to remote doctors for online appointments. "Customers and partners are clearly getting more serious about the possibilities of Kinect and natural user interfaces."

    Kinect for Windows Team


    Kinect for Windows Shopping Solutions Showcased at National Retail Federation Expo


    Swivel Close-Up, a Kinect for Windows-based kiosk from FaceCake, lets customers visualize themselves in small accessories such as makeup, sunglasses, and jewelry.

    Microsoft Kinect for Windows has been playing an increasingly important role in retail, from interactive kiosks at stores such as Build-A-Bear Workshop, to virtual dressing rooms at fashion leaders like Bloomingdale's, to virtual showrooms at Nissan dealerships. This year's National Retail Federation (NRF) Convention and Expo, which took place earlier this week, showcased several solutions that provide retailers with new ways to drive customer engagement, sales, and loyalty.

    Trend watchers have noted significant shifts in how consumers shop—often blending online and in-store investigation by using phones, tablets, kiosks, and computers in addition to good old-fashioned salesperson interaction. Brick-and-mortar stores, which are facing vigorous competition from online resellers, are embracing new technologies like Kinect for Windows to help drive sales and retention—and to delight and surprise customers with fun, interactive shopping experiences. Even better, customers can get more accurate and personalized information about whether a specific product is right for them—whether it's an article of clothing or a piece of furniture—reducing dissatisfaction and inconvenient returns.

    Several Kinect for Windows-based retail solutions were showcased at this year’s National Retail Federation Convention and Expo. "This past holiday season, we’ve seen retailers get much more tech savvy in how they engage customers and offer more flexibility in how they shop," said Kinect for Windows Senior Channel Development Manager Michael Fry. "As the lines between traditional and digital shopping channels continue to blur, retailers must seek new ways to deliver the most value and earn loyalty through compelling, seamless experiences across all touch points with their customers. Technologies like Kinect for Windows help retailers engage customers with interactive shopping experiences that are not only fun, but also improve important bottom-line business results—increasing engagement, awareness, and brand value while making it easier for shoppers to select the best products for them."

    At a hospitality event during NRF, Kinect for Windows partner Avanade showed one such innovation: their "shoppable storefront," created for my-wardrobe.com in Norway. Customers can walk up to the showroom window and—even after business hours—interact with the Kinect for Windows sensor to browse the store catalog, view pricing, and scan a Quick Response (QR) code to quickly purchase the product online via mobile phone. See a video of how it works.

    "Consider the possibilities within the store, they're almost endless with a technology like Kinect for Windows," said John Konczal, director of service line marketing at Avanade. "You could build a guide for customers to find more information about products and quickly locate them in the store. If an item is not available, order it for shipment and pick-up at the nearest store. The interactivity, simplicity, and responsiveness of this technology can really help retailers differentiate their stores from the competition."

    Avanade also demonstrated Natural User Observation of Retail Displays (NUO), which provides a cost-effective solution for retailers by gathering real-time customer response and behavior. This allows retail managers to do things like determine where customers are spending their time in the store, identify trends, and gather demographic and customer behavior as they interact with store displays. Avanade reports that the solution integrates into existing store and back-office IT systems and provides dashboards and data-rich reporting for improving product placement, marketing effectiveness, and overall display performance.

    Another of our partners, FaceCake Marketing Technologies, Inc., which developed Swivel, the 3-D virtual dressing room that's been featured at Bloomingdale's, showed NRF attendees the newest enhancements to their Swivel software. The enhancements, which work in conjunction with the latest Kinect for Windows SDK, include face-tracking and a feature called real-time Compare, which allows you to contrast two looks in a full-motion visualization of yourself in two dresses (or any type of clothing) side-by-side. Sizing is now even more accurate, and FaceCake also added multi-user functionality that allows, for example, a bride to see herself, virtually, in various wedding dresses at the same time as her bridesmaids see themselves in their bridesmaid dresses.

    We also featured another exciting new product from FaceCake in our booth: Swivel Close-Up. This Kinect for Windows-based kiosk, which operates within a two-foot environment, lets customers try on accessories much smaller than clothing, including makeup, sunglasses, and jewelry. Earrings dangle and twist beautifully as a shopper tries them on virtually, and consumers now have the opportunity to try on a limitless number of lip colors without lipstick ever touching their lips.

    "We can now provide an extended Try-On solution that is real-time, 3-D, and full motion as opposed to just uploading a static image and then modifying it," said FaceCake CEO Linda Smith. "The result is a lifelike representation that's just like looking in a mirror—your dream dressing room mirror powered by Swivel and Kinect for Windows! It's both efficient and fun for the customer."

    One of the key themes of this year's NRF event was putting customers at the center of retail marketing, something that Kinect for Windows accomplishes readily, thanks to its ability to quickly entice customers into virtual shopping spaces within actual storefronts, making it easier than ever for them to find, experience, and purchase products that are right for them.

    "Staying competitive in retail today means putting customers at the heart of the business and seeking new ways to deliver value in the store," Fry said. "A Kinect for Windows retail display immediately puts the focus on the shopper, delivering uniquely personalized results that drive both sales and customer satisfaction."

    Kinect for Windows team


    Help Kinect for Windows Become Innovation of the Year


    Earlier this month, Geekwire announced the nominees for the 2012 Seattle 2.0 Startup Awards. We’re honored that Kinect for Windows was selected in the “Innovation of the Year” category, which recognizes technologies that are setting a course for “where the world is going and the way of the future.”

    Other nominees in this category are Symform, ExtraHop, LaserMotive, and Vioguard.

    If you’re developing with the Kinect for Windows SDK and sensor, or simply a fan of the technology, cast your vote and help us become Seattle’s startup innovation of the year.

    Voting ends Monday, April 23rd. Winners will be announced at the Seattle 2.0 Startup Awards bash on May 3 at the Experience Music Project (EMP) in Seattle.

    Vote today!

    Kinect for Windows Team


    Microsoft Accelerator for Kinect: The Road to Demo Day


    Twelve weeks ago, I announced that the Microsoft Accelerator for Kinect had opened its doors and the 11 participating teams had arrived in Seattle. Yesterday, the program concluded with Demo Day—an all-day event attended by more than 150 investors and journalists—where each of the startups presented their business plans and applications.

    The übi team, all the way from Munich, Germany, demonstrates its Kinect-enabled technology that allows users to turn almost any screen or surface into a touch-enabled display.

    From the beginning, we believed this program was going to be amazing: we had hoped to receive 100 to 150 applications, but ended up with nearly 500 from more than 60 countries. There were so many amazing, creative ideas from a whole range of talented, successful people. As I said in a previous post, narrowing the field to the finalists was super challenging.

    The teams who came here to Seattle—leaving jobs, families, university, and the comforts of their daily lives—did not disappoint. Their energy, drive, and innovative thinking were a constant source of inspiration to me and the folks across Microsoft who worked with them.

    There were a lot of great moments at Demo Day; here are just a few of many:

    • The pitches. To see all of the teams come so far in such a short period of time was phenomenal—from early, nascent ideas about their businesses to fully formed plans today. And it was especially gratifying to see so many of them already partially funded.
    • The demo booths. These photos speak for themselves. Check out more pictures from the event as well as videos of the whole three months.
    • The cheer. Last but not least, when the day came to close and it was time for everyone to box up their demo booths, all of the teams gave each other a spontaneous cheer that reverberated through the halls of Microsoft. The camaraderie and respect that had developed between the teams was palpable. That cheer was the sound of success, collaboration, and teamwork.

    GestSure Technologies prepares for event attendees. The GestSure team has collaborated to develop a Kinect-enabled solution for the operating room.

    I think all the Kinect Accelerator companies have done an outstanding job over the past 12 weeks and have bright futures ahead. These 11 teams are helping accelerate and push the boundary of what’s possible with Kinect for Windows, and inspiring others to think creatively about what the future looks like when Kinect-enabled, touch-free NUI experiences are commonplace.

    Thanks to all of the teams that participated in the Accelerator and to the many others who applied. Keep up the great work!

    Craig Eisler
    General Manager, Kinect for Windows


    The Kinect for Windows v2 sensor and free SDK 2.0 public preview are here


    Today, we began shipping thousands of Kinect for Windows v2 sensors to developers worldwide. And more sensors will leave the warehouse in coming weeks, as we work to fill orders as quickly as possible.

    Additionally, Microsoft publicly released a preview version of the Kinect for Windows SDK 2.0 this morning—meaning that developers everywhere can now take advantage of Kinect’s latest enhancements and improved capabilities. The SDK is free, and there are no fees for runtime licenses of commercial applications developed with it.

    The new sensor can track as many as six complete skeletons and 25 joints per person.

    We will be releasing a final version of the SDK 2.0 in a few months, but with so many of you eagerly awaiting access, we wanted to make the SDK available as early as possible. For those of you who were unable to take part in our developer preview program, now you can roll up your sleeves and start developing. And for anyone else out there who has been waiting—well, the wait is over!

    The new sensor’s key features include:

    • Improved skeletal tracking: The enhanced fidelity of the depth camera, combined with improvements in the software, has led to a number of skeletal tracking improvements. In addition to tracking as many as six complete skeletons (compared to two with the original sensor) and 25 joints per person (compared to 20), the tracked positions are more anatomically correct and stable—and the range of tracking is broader. This enables and simplifies many scenarios, including more stable avateering, more accurate body position evaluation, crisper interactions, and more bystander involvement in interactive scenarios.
    • Higher depth fidelity: With higher depth fidelity and a significantly improved noise floor, the v2 sensor gives you better 3D visualization, increased ability to see smaller objects and all objects more clearly, and more stable skeletal tracking.
    • 1080p HD video: The color camera captures full, beautiful 1080p video that can be displayed in the same resolution as the viewing screen, allowing for a broad range of powerful scenarios. In addition to improving video communications and video analytics applications, this provides a great input on which to build high-quality, augmented reality scenarios, digital signage, and more.
    • New active infrared capabilities: In addition to allowing the Kinect for Windows v2 sensor to see in the dark, the new infrared (IR) capabilities produce a lighting-independent view, which makes machine learning or computer-vision–based tasks much easier—because you don’t have to account for or model lighting-based variation. And, you can now use IR and color at the same time. We look forward to the many new and innovative uses that the community will develop to use this fundamentally new capability.
    • Wider/expanded field of view: The expanded field of view enables a larger area of a scene to be captured by the camera. As a result, users can be closer to the camera and still in view, and the camera is effective over a larger total area.

    With the ability to track new joints for hand tips and thumbs—as well as improved understanding of the soft connective tissue and body positioning—you get more anatomically correct positions for crisp interactions.
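    The anatomically correct joint positions described above are what make body-position evaluation practical. As a minimal illustration (not SDK code), the angle at a joint such as the elbow can be computed from three tracked 3D positions; the coordinates below are hypothetical camera-space values in meters:

    ```python
    import math

    def joint_angle(a, b, c):
        """Angle at joint b (degrees), given three 3-D joint positions
        (e.g., shoulder, elbow, wrist) as (x, y, z) tuples."""
        v1 = tuple(ai - bi for ai, bi in zip(a, b))   # vector b -> a
        v2 = tuple(ci - bi for ci, bi in zip(c, b))   # vector b -> c
        dot = sum(x * y for x, y in zip(v1, v2))
        norm = (math.sqrt(sum(x * x for x in v1)) *
                math.sqrt(sum(x * x for x in v2)))
        return math.degrees(math.acos(dot / norm))

    # Hypothetical positions for shoulder, elbow, and wrist:
    shoulder = (0.0, 0.4, 2.0)
    elbow    = (0.0, 0.1, 2.0)
    wrist    = (0.3, 0.1, 2.0)
    print(round(joint_angle(shoulder, elbow, wrist)))  # 90
    ```

    A physical therapy application like the one Reflexion Health demonstrates could compare such angles against a prescribed range of motion, frame by frame.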

    In addition to the new sensor’s key features, the Kinect for Windows SDK 2.0 includes:

    • Improved skeletal, hand, and joint orientation: With the ability to track as many as six people and 25 skeletal joints per person (including new joints for hand tips, thumbs, and shoulder center), plus improved understanding of soft connective tissue and body positioning, you get more anatomically correct positions for crisp interactions and more accurate avateering. These improved capabilities result in more lifelike avatars and open up new and better scenarios in fitness, wellness, education and training, entertainment, gaming, movies, communications, and more.
    • Support for new development environments: New Unity support enables faster, more cost-efficient, high-quality cross-platform development, letting developers build their apps for the Windows Store with tools they already know.
    • Powerful tooling: Thanks to Kinect Studio’s enhanced recording and playback features, developers can develop on the go, without needing a Kinect sensor with them at all times. And Gesture Builder lets developers define their own custom gestures, which the system learns to recognize by using machine learning rather than hand-written detection code. These features increase productivity and keep costs down.
    • Advanced face tracking: With significantly increased resolution, applications can capture a face with a 2,000-point mesh that looks more true to life. This means that avatars will look more lifelike.
    • Simultaneous multi-app support: New multi-app support enables more than one application to access a single sensor simultaneously. This means you could have a business intelligence app running at the same time as a training, retail, or education experience, letting you gather analytics in real time.
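    Gesture Builder learns recognizers from labeled recordings; for contrast, a hand-rolled gesture heuristic of the kind it replaces might look like this sketch. The joint names, the dictionary layout, and the y-up coordinate convention are all illustrative assumptions, not SDK types:

    ```python
    def hands_above_head(frame):
        """True if both hand-tip joints are above the head joint.
        `frame` maps joint names to (x, y, z) positions, y increasing upward."""
        head_y = frame["Head"][1]
        return (frame["HandTipLeft"][1] > head_y and
                frame["HandTipRight"][1] > head_y)

    def detect_gesture(frames, min_frames=3):
        """Fire once the pose holds for `min_frames` consecutive frames,
        a simple debounce against skeletal-tracking jitter."""
        run = 0
        for frame in frames:
            run = run + 1 if hands_above_head(frame) else 0
            if run >= min_frames:
                return True
        return False
    ```

    Even this toy detector needs hand-tuned thresholds and debouncing; the appeal of Gesture Builder is that such rules are inferred from example clips instead of being coded by hand.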

    When the final version of the SDK is available, people will be able to start submitting their apps to the Windows Store, and companies will be able to make their v2 solutions available commercially. We look forward to seeing what everyone does with the new NUI.

    The new SDK 2.0 public preview includes Unity support for faster, cost-efficient, and high quality support for cross-platform development, enabling developers to build their apps for the Windows Store using tools they already know.

    We’ve already shown you what several partners are working on, including Reflexion Health and Freak’n Genius. Most recently, Walt Disney Studios Motion Pictures has developed an interactive experience to help promote its upcoming movie, Planes: Fire & Rescue. One of seven experience kiosks will debut in London at the end of the week, in time for school holidays. Disney is confident it will receive an enthusiastic reception from users of all ages, creating an engaging experience associated with the Disney brand and, of course, sparking interest in the movie, which opens nationwide on August 8. Read more.

    We will showcase more partner solutions here in coming months, so stay tuned. In the meantime, order your new sensor, download the SDK 2.0 public preview, and start developing your NUI apps. And please join our Microsoft Virtual Academy to learn from our experts and jump start your development.

    The Kinect for Windows Team

