• Kinect for Windows Product Blog

    Dev Kit Program Update


    Back in June at the Build Conference we announced and started taking applications for our upcoming developer kit program.

    The response and interest we’ve seen have been tremendous: thousands of developers from 74 different countries applied.

    Mea Culpa

    When we announced the program we said we’d start notifying successful applicants in August. Many people interpreted that to mean that we’d be done with notifications in August. I apologize for not being clearer on this. We never intended to have all the notifications done in August. While we did start in August and have notified many developers of their acceptance to the program, there are still many more applicants to be notified.

    Over the coming weeks we will continue to let applicants know if they are admitted to the program, denied admission, or waitlisted (just like college :-)). Every applicant will hear something from us by the end of September. Anyone who is waitlisted will have a final decision by the end of the year.

    We are still planning to start sending out the developer kits in late November to all program participants.

    Again, apologies for any confusion. Please stay tuned…you will hear something from us soon!

    Ben

    @benlower | kinectninja@microsoft.com | mobile: +1 (206) 659-NINJA (6465)

  • Kinect for Windows Product Blog

    New Director, Same Direction: the Momentum Continues with Kinect for Windows


    Almost two years ago, Microsoft announced its intent to take Kinect beyond gaming and make it possible for developers and businesses to innovate with Kinect on computers. The Kinect for Windows team was born.

    Shortly after that, I joined the team to oversee Program Management, and over the past year, we’ve shipped the Kinect for Windows sensor as well as multiple updates to the Kinect for Windows software development kit (SDK). Throughout it all, Craig Eisler has been leading our business.

    This month, Craig is moving on to do other important work at Microsoft, and I am stepping in to lead the Kinect for Windows team. I am excited to maintain the amazing momentum we’ve seen in industries like healthcare, retail, education, and automotive. There have been more than 500,000 downloads of our free SDK, and the Kinect for Windows sensor can be purchased in 39 regions today.

    Such rapid growth would not have been possible without the community embracing the technology. Thanks to all of you—business leaders, technical leaders, creative visionaries, and developers—Kinect for Windows has been deployed across the globe. The community is developing new ways for consumers to shop for clothing and accessories, interesting digital signage that delights and inspires customers, remote monitoring tools that make physical therapy easier, more immersive training and simulation applications across multiple industries, and touch-free computing tools that enable surgeons to view patient information without having to leave the operating room. The list goes on and on…and the list is growing every day.

    We launched Kinect for Windows nearly one year ago—pioneering a commercial technology category that didn’t previously exist. I look forward to continuing to be at the forefront of touch-free computing and helping our partners develop innovative solutions that take the natural user interface vision even further. We’ve said it before and I’ll say it again: this is just the beginning. I’m thrilled to continue the great foundational work we did in 2012 and look forward to a very productive 2013.

    Bob Heddle
    Director, Kinect for Windows

  • Kinect for Windows Product Blog

    Using Kinect InteractionStream Outside of WPF


    Last month, with the release of version 1.7 of our SDK and toolkit, we introduced something called the InteractionStream.  Included in this release were two new samples, Controls Basics and Interaction Gallery, which, among other things, show how to use the new InteractionStream along with new interactions like Press and Grip.  Both of these new samples are written using managed code (C#) and WPF.

    One question I’ve been hearing from developers is, “I don’t want to use WPF but I still want to use InteractionStream with managed code.  How do I do this?”  In this post I’m going to show how to do exactly that.  I’m going to take it to the extreme by removing the UI layer completely: we’ll use a C# console app.

    The way our application will work is summarized in the diagram below:

    [Diagram: program flow showing sensor initialization, the FrameReady event handlers, and InteractionStream processing]

    There are a few things to note here:

    1. Upon starting the program, we initialize our sensor, interactions, and create FrameReady event handlers.
    2. Our sensor is generating data for every frame.  We use our FrameReady event handlers to respond and handle depth, skeleton, and interaction frames.
    3. The program implements the IInteractionClient interface, which requires us to implement a method called GetInteractionInfoAtLocation. This method gives us back information about interactions happening with a particular user at a specified location:
      public InteractionInfo GetInteractionInfoAtLocation(int skeletonTrackingId, InteractionHandType handType, double x, double y)
      {
          var interactionInfo = new InteractionInfo
          {
              IsPressTarget = false,
              IsGripTarget = false
          };

          // Map coordinates from [0.0,1.0] coordinates to UI-relative coordinates
          double xUI = x * InteractionRegionWidth;
          double yUI = y * InteractionRegionHeight;

          var uiElement = this.PerformHitTest(xUI, yUI);

          if (uiElement != null)
          {
              interactionInfo.IsPressTarget = true;

              // If UI framework uses strings as button IDs, use string hash code as ID
              interactionInfo.PressTargetControlId = uiElement.Id.GetHashCode();

              // Designate center of button to be the press attraction point
              //// TODO: Create your own logic to assign press attraction points if center
              //// TODO: is not always the desired attraction point.
              interactionInfo.PressAttractionPointX = ((uiElement.Left + uiElement.Right) / 2.0) / InteractionRegionWidth;
              interactionInfo.PressAttractionPointY = ((uiElement.Top + uiElement.Bottom) / 2.0) / InteractionRegionHeight;
          }

          return interactionInfo;
      }
    4. The other noteworthy part of our program is the InteractionFrameReady method.  This is where we process information about our users, route our UI events, and handle things like Grip and GripRelease; a minimal end-to-end sketch follows below.
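
    To make the flow concrete, here is a minimal sketch of how a console app might wire all of this together (it references Microsoft.Kinect.dll and the interaction toolkit assembly from the 1.7 toolkit). This is not the downloadable sample itself: ConsoleInteractionClient is a placeholder IInteractionClient that targets nothing, error handling is omitted, and the interaction handler just writes grip events to the console.

      using System;
      using System.Linq;
      using Microsoft.Kinect;
      using Microsoft.Kinect.Toolkit.Interaction;

      // Placeholder client: reports no press/grip targets. In a real app this is where
      // logic like the GetInteractionInfoAtLocation method shown above would live.
      public class ConsoleInteractionClient : IInteractionClient
      {
          public InteractionInfo GetInteractionInfoAtLocation(int skeletonTrackingId, InteractionHandType handType, double x, double y)
          {
              return new InteractionInfo { IsPressTarget = false, IsGripTarget = false };
          }
      }

      public static class Program
      {
          private static KinectSensor sensor;
          private static InteractionStream interactionStream;
          private static Skeleton[] skeletons;
          private static UserInfo[] userInfos;

          public static void Main()
          {
              // 1. Initialize the sensor, the interaction stream, and the FrameReady handlers.
              sensor = KinectSensor.KinectSensors.FirstOrDefault(s => s.Status == KinectStatus.Connected);
              if (sensor == null) { Console.WriteLine("No Kinect sensor detected."); return; }

              sensor.DepthStream.Enable(DepthImageFormat.Resolution640x480Fps30);
              sensor.SkeletonStream.Enable();

              skeletons = new Skeleton[sensor.SkeletonStream.FrameSkeletonArrayLength];
              userInfos = new UserInfo[InteractionFrame.UserInfoArrayLength];

              interactionStream = new InteractionStream(sensor, new ConsoleInteractionClient());

              sensor.DepthFrameReady += OnDepthFrameReady;
              sensor.SkeletonFrameReady += OnSkeletonFrameReady;
              interactionStream.InteractionFrameReady += OnInteractionFrameReady;

              sensor.Start();
              Console.WriteLine("Listening for interactions. Press Enter to exit.");
              Console.ReadLine();
              sensor.Stop();
          }

          // 2a. Feed every depth frame to the interaction stream.
          private static void OnDepthFrameReady(object sender, DepthImageFrameReadyEventArgs e)
          {
              using (DepthImageFrame frame = e.OpenDepthImageFrame())
              {
                  if (frame == null) return;
                  interactionStream.ProcessDepth(frame.GetRawPixelData(), frame.Timestamp);
              }
          }

          // 2b. Feed every skeleton frame, plus the accelerometer reading, to the interaction stream.
          private static void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
          {
              using (SkeletonFrame frame = e.OpenSkeletonFrame())
              {
                  if (frame == null) return;
                  frame.CopySkeletonDataTo(skeletons);
                  interactionStream.ProcessSkeleton(skeletons, sensor.AccelerometerGetCurrentReading(), frame.Timestamp);
              }
          }

          // 4. React to interaction frames: this is where you would route UI events;
          //    here we simply log Grip and GripRelease for each tracked hand.
          private static void OnInteractionFrameReady(object sender, InteractionFrameReadyEventArgs e)
          {
              using (InteractionFrame frame = e.OpenInteractionFrame())
              {
                  if (frame == null) return;
                  frame.CopyInteractionDataTo(userInfos);
              }

              foreach (UserInfo userInfo in userInfos.Where(u => u != null && u.SkeletonTrackingId != 0))
              {
                  foreach (InteractionHandPointer hand in userInfo.HandPointers)
                  {
                      if (hand.HandEventType == InteractionHandEventType.Grip)
                          Console.WriteLine("User {0}: {1} hand Grip", userInfo.SkeletonTrackingId, hand.HandType);
                      else if (hand.HandEventType == InteractionHandEventType.GripRelease)
                          Console.WriteLine("User {0}: {1} hand GripRelease", userInfo.SkeletonTrackingId, hand.HandType);
                  }
              }
          }
      }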

     

    I’ve posted some sample code that you may download and use to get started using InteractionStream in your own managed apps.  The code is loaded with tips in the comments that should get you started down the path of using our interactions in your own apps.  Thanks to Eddy Escardo Raffo on my team for writing the sample console app.

    Ben

    @benlower | kinectninja@microsoft.com | mobile: +1 (206) 659-NINJA (6465)

  • Kinect for Windows Product Blog

    Join Now, BUILD for Tomorrow


    Today at Microsoft BUILD 2013, we made two important announcements for our Kinect for Windows developer community.

    First, starting today, developers can apply for a place in our upcoming developer kit program. This program will give participants exclusive early access to everything they need to start building applications for the recently announced new generation Kinect for Windows sensor, including a pre-release version of the new sensor hardware and software development kit (SDK) in November, and a replacement unit of the final sensor hardware and firmware when it is publicly available next year. The cost for the program will be US$399 (or local equivalent). Applications must be received by July 31, and successful applicants will be notified and charged in August. Interested developers are strongly encouraged to apply early, as spots are very limited and demand for the new sensor is already great. Review complete program details and apply for the program.


    The upcoming Kinect for Windows SDK 1.8 will include more realistic color capture with Kinect Fusion.

    Additionally, in September we will again refresh the Kinect for Windows SDK with several exciting updates including:

    • The ability to extract the user from the background in real time
    • The ability to develop Kinect for Windows desktop applications by using HTML5/JavaScript
    • Enhancements to Kinect Fusion, including capture of color data and improvements to tracking robustness and accuracy

    The feature enhancements will enable even better Kinect for Windows-based applications for businesses and end users, and the convenience of HTML5 will make it easier for developers to build leading-edge touch-free experiences.

    This will be the fourth significant update to the Kinect for Windows SDK since we launched 17 months ago. We are committed to continuing to improve the existing Kinect for Windows platform as we prepare to release the new generation Kinect for Windows sensor and SDK.  If you aren’t already using Kinect for Windows to develop touch-free solutions, now is a great time to start. Join us as we continue to make technology easier to use and more intuitive for everyone.

    Bob Heddle
    Director, Kinect for Windows

    Key links 

  • Kinect for Windows Product Blog

    Jintronix makes rehabilitation more convenient, fun, and affordable with Kinect for Windows


    A stroke can be a devastating experience, leaving the patient with serious physical impairments and beset by concerns for the future. Today, that future is much brighter, as stroke rehabilitation has made enormous strides. Now, Jintronix offers a significant advance to help stroke patients restore their physical functions: an affordable motion-capture system for physical rehabilitation that uses Microsoft Kinect for Windows.

    The folks at Montreal- and Seattle-based Jintronix are tackling three major issues related to rehabilitation. First, and most importantly, they are working to improve patients’ compliance with their rehabilitation regimens, since up to 65% of patients fail to adhere fully—or at all—to their programs.[1] In addition, they are addressing the lack of accessibility and the high cost associated with rehabilitation. If you have just had a stroke, even getting to the clinic is a challenge, and the cost of hiring a private physical therapist to come to your home is too high for most people.

    Consider Jane, a 57-year-old patient. After experiencing a stroke eight months ago, she now has difficulty moving the entire right side of her body. Like most stroke victims, Jane faces one to three weekly therapy sessions for up to two years. Unable to drive, she depends on her daughter to get her to these sessions; unable to work, she worries about the $100 fee per visit, as she has exhausted her insurance coverage. If that weren’t enough, Jane also must exercise for hours daily just to maintain her mobility. Unfortunately, these exercises are very repetitive, and Jane finds it difficult to motivate herself to do them. 

    Jintronix tackles all of these issues by providing patients with fun, “gamified” exercises that accelerate recovery and increase adherence. In addition, Jintronix gives patients immediate feedback, which ensures that they perform their movements correctly. This is critical when the patient is exercising at home.

    For clinicians and insurers, Jintronix monitors and collects data remotely to measure compliance and provides critical information on how to customize the patient’s regimen. Thus patients can conveniently and consistently get treatment between clinic visits, from the comfort of their own homes, with results transmitted directly to their therapist. This has been shown to be an effective method for delivering care, and for people living in remote areas, this type of tele-rehabilitation has the potential to be a real game changer.[2] Moreover, a growing shortage of trained clinicians—the shortfall in the United States was estimated to be 13,500 in 2013 and is expected to grow to 31,000 by 2016—means that more and more patients will be reliant on home rehab[3].

    Motion capture lies at the heart of Jintronix. The first-generation Kinect for Windows camera can track 20 points on the body with no need for the patient to wear physical sensors, enabling Jintronix to track the patient’s position in three-dimensional space at 30 frames per second. Behind the scenes, Jintronix uses the data captured by the sensor to track such metrics as the speed and fluidity of patients’ movement. It also records patients’ compensation patterns, such as leaning the trunk forward to reach an object instead of extending the arm normally.
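
    Jintronix has not published its source code, but the kinds of metrics described here fall straight out of the 20-joint, 30-frames-per-second skeleton stream. Purely as an illustration (the MovementMetrics class and the constants below are invented for this sketch, not taken from Jintronix), hand speed and trunk lean could be estimated from consecutive Skeleton frames like this:

      using System;
      using Microsoft.Kinect;

      // Illustrative only: estimates two simple rehab-style metrics from skeleton frames.
      public static class MovementMetrics
      {
          private const double FrameIntervalSeconds = 1.0 / 30.0;   // skeleton stream runs at 30 fps

          // Speed of the right hand, in meters per second, between two consecutive frames.
          public static double RightHandSpeed(Skeleton previous, Skeleton current)
          {
              SkeletonPoint a = previous.Joints[JointType.HandRight].Position;
              SkeletonPoint b = current.Joints[JointType.HandRight].Position;
              double distance = Math.Sqrt(
                  Math.Pow(b.X - a.X, 2) + Math.Pow(b.Y - a.Y, 2) + Math.Pow(b.Z - a.Z, 2));
              return distance / FrameIntervalSeconds;
          }

          // Forward lean of the trunk, in degrees: the angle between the hip-to-shoulder
          // vector and vertical. Large values can indicate trunk compensation when reaching.
          public static double TrunkLeanDegrees(Skeleton skeleton)
          {
              SkeletonPoint hip = skeleton.Joints[JointType.HipCenter].Position;
              SkeletonPoint shoulder = skeleton.Joints[JointType.ShoulderCenter].Position;
              double dy = shoulder.Y - hip.Y;   // up
              double dz = shoulder.Z - hip.Z;   // toward/away from the sensor
              return Math.Atan2(Math.Abs(dz), dy) * 180.0 / Math.PI;
          }
      }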

    Jintronix then uses this data to place patients in an interactive game environment that’s built around rehabilitation exercises. For example, in the game Fish Frenzy, the patient's hand controls the movement of an on-screen fish, moving it to capture food objects that are placed around the screen in a specific therapeutic pattern, like a rectangle or a figure eight.

    There are other rehab systems out there that use motion capture, but they often require sensor gloves or other proprietary hardware that take a lot of training and supervision to use, or they depend on rigging an entire room with expensive cameras or placing lots of sensors on the body. “Thanks to Kinect for Windows, Jintronix doesn’t require any extra hardware, cameras, or body sensors, which keeps the price affordable,” says Shawn Errunza, CEO of Jintronix. “That low price point is extremely important,” notes Errunza, “as we want to see our system in the home of every patient who needs neurological and orthopedic rehab.”

    Jintronix developed the system by working closely with leaders in the field of physical rehabilitation, such as Dr. Mindy Levin, professor of physical and occupational therapy at McGill University. With strong support both on the research and clinical sides, the company designed a system that can serve a variety of patients in addition to post-stroke victims—good news for the nearly 36 million individuals suffering from physical disabilities in the United States[4].

    What’s more, Jintronix is a potential boon to the elderly, as it has been shown that seniors can reduce the risk of injury due to falls by 35% by following specific exercise programs.  Unfortunately, most home rehab regimens fail to engage such patients. A recent study of elderly patients found that less than 10 percent reported doing their prescribed physical therapy exercises five days a week (which is considered full adherence), and more than a third reported zero days of compliance.

    Jintronix is currently in closed beta testing in five countries, involving more than 150 patients at 60 clinics and hospitals, including DaVinci Physical Therapy in the Seattle area and the Gingras Lindsay Rehabilitation Hospital in Montreal. According to Errunza, “preliminary results show that the fun factor of our activities has a tangible effect on patients’ motivation to stay engaged in their therapy.”

    Jintronix is working to remove all the major barriers to physical rehabilitation by making a system that is fun, simple to use, and affordable. Jintronix demonstrates the potential of natural user interfaces (NUI) to make technology simpler and more effective—and the ability of Kinect for Windows to help high tech meet essential human needs.

    The Kinect for Windows Team

    Key links

     


    [1] http://physiotherapy.org.nz/assets/Professional-dev/Journal/2003-July/July03commentary.pdf

    [2] http://www.ncbi.nlm.nih.gov/pubmed/23319181

    [3] http://www.apta.org/PTinMotion/NewsNow/2013/10/21/PTDemand/

    [4] http://ptjournal.apta.org/content/86/3/401.full

  • Kinect for Windows Product Blog

    What's Up with CodePlex?


    Last week we announced the release of the source code of 22 Kinect for Windows sample applications.  The developer response has been terrific and much larger than we expected.

    Some publications claimed we had open sourced “all of the code for Kinect” or the “core code of Kinect”.  Neither of these is true.  We released source code for most of our sample applications.  It’s important to understand that sample code is not the same as core code.  The purpose of the sample applications is to give developers examples of how to use particular APIs and/or to give a good starting point for a new application.  The samples do use the core APIs and the Kinect for Windows platform, but we have not changed anything about the licensing of those underlying components.



    The samples we’ve released show how to do things like get raw infrared data from the sensor, build an interactive kiosk that changes content when a person is detected, and track a person’s facial movements.  The samples are one of the many areas in which we are investing to make it easy for new and seasoned developers alike to build applications using Kinect for Windows.
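
    As a flavor of what the samples cover: reading raw infrared data only takes enabling the color stream in an infrared format. The snippet below is a rough sketch of the idea rather than one of the released samples.

      using System;
      using System.Linq;
      using Microsoft.Kinect;

      // Minimal sketch: stream raw 16-bit infrared data from the sensor.
      public static class InfraredExample
      {
          public static void Main()
          {
              KinectSensor sensor = KinectSensor.KinectSensors
                  .FirstOrDefault(s => s.Status == KinectStatus.Connected);
              if (sensor == null) return;

              // Infrared comes through the color stream at 16 bits per pixel.
              sensor.ColorStream.Enable(ColorImageFormat.InfraredResolution640x480Fps30);
              sensor.ColorFrameReady += (sender, e) =>
              {
                  using (ColorImageFrame frame = e.OpenColorImageFrame())
                  {
                      if (frame == null) return;
                      var pixels = new byte[frame.PixelDataLength];
                      frame.CopyPixelDataTo(pixels);
                      // pixels now holds 640 x 480 infrared intensities, 2 bytes each.
                  }
              };

              sensor.Start();
              Console.ReadLine();
              sensor.Stop();
          }
      }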

    It was not our intention for our announcement to be misinterpreted, and it’s evident from the comments on many posts that readers understood the distinction.  It’s also been great to see the debate & discussions (I’m looking at you, Reddit :-)).

    We are following up with some publications to clarify our announcement and to request they update their posts.

     

    Ben

    @benlower | kinectninja@microsoft.com | mobile:  +1 (206) 659-NINJA (6465)

    Kinect for Windows:  @KinectWindows

  • Kinect for Windows Product Blog

    Kinect for Windows Academic Pricing Update


    Shortly after the commercial release of Kinect for Windows in early 2012, Microsoft announced the availability of academic pricing for the Kinect for Windows sensor to higher education faculty and students for $149.99 at the Microsoft Store in the United States. We are now pleased to announce that we have broadened the availability of academic pricing through Microsoft Authorized Educational Resellers (AERs).

    Most of these resellers have the capability to offer academic pricing directly to educational institutions; academic researchers; and students, faculty, and staff of public or private K-12 schools, vocational schools, junior colleges, colleges, universities, and scientific or technical institutions. In the United States, eligible institutions are accredited by associations that are recognized by the US Department of Education and/or the State Board of Education. Academic pricing on the Kinect for Windows sensor is currently available through AERs in the United States, Taiwan, and Hong Kong SAR.

    Within the academic community, the potential of Kinect for Windows in the classroom is generating a lot of excitement. Researchers and academics in higher education are collaborating with Microsoft Research on a variety of projects that involve educational uses of Kinect for Windows. The educator-driven community resource KinectEDucation encourages developers, teachers, students, enthusiasts, and other education stakeholders to help transform classrooms with accessible technology.
     
    One such development is a new product from Kaplan Early Learning Company, the Inspire-NG Move, which is bundled with the Kinect for Windows sensor. The bundle includes four educational programs for children ages three and older, and the programs let children experience how hands-on, kinesthetic play with a purpose makes learning fun. The bundle currently sells for US$499.

    “We’re excited about the new learning models that are enabled by Kinect for Windows,” stated Chris Gerblick, vice president of IT and Professional Services at Kaplan Early Learning Company. “We see the Inspire NG-Move family of products as excellent learning tools for both the classroom and the home.”

    With the availability of academic pricing, we look forward to many developments from the academic community that integrate Kinect for Windows into interactive educational experiences.

    Michael Fry
    Business Development, Strategic Alliances
    Kinect for Windows

    Key Links

     

  • Kinect for Windows Product Blog

    Kinect Accelerator Program Seeking Innovators


    In March, ten startups will converge on Seattle to start developing commercial and gaming applications that utilize Kinect's innovative natural user interface (NUI). As part of the Microsoft Kinect Accelerator program, they will have three months and a wealth of resources—including access to Microsoft and industry mentors—to develop, and then present their applications to angel investors, venture capitalists, Microsoft executives, media, and influential industry leaders.

    Since launching in late November, the Kinect Accelerator has received hundreds of applications from over forty countries, proposing transformative, creative innovations for healthcare, fitness, retail, training/simulation, automotive, scientific research, manufacturing, and much more.

    Applications are still being accepted, and the Kinect Accelerator team encourages you to apply. Learn more about the application process.

    The Kinect Accelerator program is powered by TechStars, one of the most respected technology accelerator programs in the world.  Microsoft is working with TechStars to leverage the absolute best startup accelerator methodologies, mentors, and visibility.  If you are considering building a business based on the capabilities of Kinect, this is a great opportunity for you.

    Dave Drach, Managing Director, Microsoft Emerging Business Team, explains that the Kinect Accelerator program is looking for creative startups that have a passion for driving the next generation of computing. “Starting in the spring of 2012, they will have three months to bring their ideas to life. What will emerge will be applications and business scenarios that we’ve not seen before,” comments Drach.

    Read more about the Kinect Accelerator program.

    Kinect for Windows team

  • Kinect for Windows Product Blog

    Microsoft’s Kinect Accelerator Begins Today


    I am pleased to announce that the finalists for our Kinect Accelerator have arrived in ever-sunny Seattle and today are launching into a three-month program to build new products and businesses using Kinect. I can’t wait to see what they come up with – using Kinect, these teams have the ability to reimagine the way products are used, and perhaps even revolutionize entire industries along the way.

    Kinect Accelerator is powered by TechStars, in close collaboration with the Microsoft BizSpark program; my team and I have been working closely with the BizSpark team and others in the Interactive Entertainment Business to help develop and bring this program to life. The response to the Kinect Accelerator has been phenomenal and we expect to see remarkable innovation coming out of the program.

    Craig Eisler and other executives from Microsoft and TechStars met in February to review program applications.

    We were hoping to receive 100 to 150 applications, with a goal of selecting the best ten. But the worldwide entrepreneurial community completely surprised us by submitting almost five hundred applications, with concepts spanning nearly 20 different industries, including healthcare, education, retail, entertainment, and more.

    There were so many clever and innovative ideas and so many great teams that it was super challenging to narrow things down – we spent many, many hours in a rigorous and highly energetic review process. We finally landed on 11 finalists from five countries, chosen based on their experience, qualifications, and the potential benefit that could result from their Kinect Accelerator projects.  The finalists are:

    • Freak'n Genius – Seattle, WA
    • GestSure Technologies – Toronto, Canada
    • IKKOS – Seattle, WA
    • Kimetric – Buenos Aires, Argentina
    • Jintronix Inc.  – Montreal, Canada
    • Manctl – Lyon, France
    • NConnex – Hadley, MA
    • Styku – Los Angeles, CA
    • übi interactive – Munich, Germany
    • VOXON – New York, NY
    • Zebcare – Boston, MA

    The Kinect Accelerator will be held in Microsoft’s state-of-the-art facility in Seattle’s vibrant South Lake Union neighborhood. Each team will be mentored by entrepreneurs and venture capitalists as well as leaders from Kinect for Windows, Xbox, Microsoft Studios, Microsoft Research, and other Microsoft organizations. The teams will spend the first several weeks ideating and refining their business concepts with input and advice from their mentors, followed by several weeks of design and development.  They will present their results at an event at the end of June.

    We were so amazed by the quality, caliber, and uniqueness of the applications and teams that we decided to reward the top 100 applicants that didn’t make it into the program with a complimentary Kinect for Windows sensor. I believe we are going to see great things from many of the folks that applied to the program and we wish them all the best.

    We will share more information about the Kinect Accelerator teams and their applications on this blog in coming months. And for more information on the Kinect Accelerator program in general, go to KinectAccelerator.com.

    Craig Eisler
    General Manager, Kinect for Windows

  • Kinect for Windows Product Blog

    Kinect for Windows Helps Girls Everywhere Dress Like Barbie


    I grew up in the UK, and my female cousins all had Barbie. In fact, Barbies – they had lots of Barbie dolls and a ton of accessories that they were obsessed with. I was more of a BMX kind of kid and thought my days of Barbie education were long behind me, but with a young daughter I’m beginning to realize that I have plenty more Barbie ahead of me, littered around the house like landmines. This time around, though, I’m genuinely interested, thanks to a Kinect-enabled application: the outfits in Barbie The Dream Closet not only scale to fit users but also let them turn sideways to see how they look from various angles.

    This week, Barbie lovers in Sydney, Australia, are being given the chance to do more than fantasize about how they’d look in their favorite Barbie outfit. Thanks to Mattel, Gun Communications, Adapptor, and Kinect for Windows, Barbie The Dream Closet is here.

    The application invites users to take a walk down memory lane and select from 50 years of Barbie fashions. Standing in front of Barbie’s life-sized augmented reality “mirror,” fans can choose from several outfits in her digital wardrobe—virtually trying them on for size.

    The solution, built with the Kinect for Windows SDK and the Kinect for Windows sensor, tracks users’ movements and gestures, enabling them to easily browse through the closet and select outfits that strike their fancy. Once an outfit is selected, Kinect for Windows skeletal tracking determines the position and orientation of the user. The application then rescales Barbie’s clothes, rendering them over the user in real time for a custom fit.
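
    Mattel and Adapptor have not shared their rendering code, but the scaling step is easy to picture. As a purely illustrative sketch (the OutfitScaling class and the reference shoulder width are invented here, not taken from the actual application), a uniform scale factor for an outfit overlay could be derived from the tracked shoulder joints:

      using System;
      using Microsoft.Kinect;

      // Illustrative only: derive an overlay scale from the user's tracked shoulder width.
      public static class OutfitScaling
      {
          // Returns a uniform scale factor for clothing artwork authored at a
          // reference shoulder width (in meters); the default value is an assumption.
          public static double ScaleFor(Skeleton skeleton, double referenceShoulderWidthMeters = 0.35)
          {
              SkeletonPoint left = skeleton.Joints[JointType.ShoulderLeft].Position;
              SkeletonPoint right = skeleton.Joints[JointType.ShoulderRight].Position;

              double shoulderWidth = Math.Sqrt(
                  Math.Pow(right.X - left.X, 2) +
                  Math.Pow(right.Y - left.Y, 2) +
                  Math.Pow(right.Z - left.Z, 2));

              return shoulderWidth / referenceShoulderWidthMeters;
          }
      }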

    One of the most interesting aspects of this solution is the technology’s ability to scale: menus, navigation controls, and clothing all dynamically adapt so that everyone from a little girl to a grown woman (and cough, yes, even a committed father) can enjoy the experience. To make this possible, each outfit was photographed on a Barbie doll, cut into multiple parts, and then built individually in the application.

    Of course, the experience wouldn’t be complete without the ability to memorialize it. A photo is taken and, with approval/consent from those photographed, is uploaded and displayed in a gallery on the Barbie Australian Facebook page. (Grandparents can join in the fun from afar!)

    I spoke with Sarah Sproule, Director at Gun Communications, about the genesis of the idea. She told me, “We started working on Barbie The Dream Closet six months ago, working with our development partner Adapptor. Everyone has been impressed by the flexibility and innovation Microsoft has poured into Kinect for Windows. Kinect technology has provided Barbie with a rich and exciting initiative that’s proving to delight fans of all ages. We’re thrilled with the result, as is Mattel – our client.”

    Barbie enthusiasts of all ages can enjoy trying on and posing in outfits.

    Barbie The Dream Closet opened to the public at Westfield Parramatta in Sydney today and will be there through April 15. On its first day, it drew enthusiastic crowds, with around 100 people experiencing Barbie The Dream Closet, and it’s expected to draw even larger crowds over the holidays. It’s set to appear in Melbourne and Brisbane later this year.

    Meanwhile, the Kinect for Windows team is just as excited about it as my daughter:

    “The first time I saw Barbie’s Dream Closet in action, I knew it would strike a chord,” notes Kinect for Windows Communications Manager Heather Mitchell. “It’s such a playful, creative use of the technology. I remember fantasizing about wearing Barbie’s clothes when I was a little girl. Disco Ken was a huge hit in my household back then…Who didn’t want to match his dance moves with their own life-sized Barbie disco dress? I think tens of thousands of grown girls have been waiting for this experience for years…Feels like a natural.”

    That’s the beauty of Kinect – it enables amazingly natural interactions with technology, and hundreds of companies are out there building amazing things; we can’t wait to see what they continue to invent.

    Steve Clayton
    Editor, Next at Microsoft
