• Kinect for Windows Product Blog

    The magic of Christmas, Kinect style


    Every December, British shoppers look forward to the creative holiday ad campaign from John Lewis, a major UK department store chain. It’s been a tradition for a number of years, is seen by millions of viewers in the UK annually, and won a coveted IPA Effectiveness Award in 2012. The retailer’s seasonal campaign traditionally emphasizes the joy of giving and the magic of Christmas, and this year’s ads continue that tradition, with a television commercial that depicts the loving relationship between a young boy and his pet penguin, Monty.


    The 2014 Christmas advertisement from John Lewis tells the story of a boy and his penguin—and the
    magic of giving just the right gift.

    But the iconic British retailer has added a unique, high-tech twist to the 2014 campaign: Monty’s Magical Toy Machine, an in-store experience that uses the Kinect for Windows v2 sensor to let kids turn their favorite stuffed toy into an interactive 3D model. The experience deftly plays off the TV ad, whose narrative reveals that Monty is a stuffed toy that comes alive in the boy’s imagination.

    Monty’s Magical Toy Machine experience, which is available at the John Lewis flagship store on London’s Oxford Street, plays to every child’s fantasy of seeing a cherished teddy bear or rag doll come to life—a theme that runs through children’s classics from Pinocchio to the many Toy Story movies. The experience has been up and running since November 6, with thousands of customers interacting with it to date. Customers have until December 23 to enjoy the experience before it closes.

    The toy machine experience was the brainchild of Microsoft Advertising, which had been approached by John Lewis to come up with an innovative, technology-based experience based on the store’s holiday ad. “We actually submitted several ideas,” explains creative solutions specialist Art Tindsley, “and Monty’s Magical Toy Machine was the one that really excited people. We were especially pleased, because we were eager to use the new capabilities of the Kinect v2 sensor to create something truly unique.”

    John Lewis executives loved the idea and gave Microsoft the green light to proceed. "We were genuinely excited when Microsoft presented this idea to us,” says Rachel Swift, head of marketing for the John Lewis brand. “Not only did it exemplify the idea perfectly, it did so in a way that was both truly innovative and charming.”  

    Working with the John Lewis team and creative agency adam&eveDDB, the Microsoft team came up with the design of the Magical Toy Machine: a large cylinder, surrounded by three 75-inch display screens, one of which is topped by a Kinect for Windows v2 sensor. It is on this screen that the animation takes place.

The enchantment happens here, at Monty's Magical Toy Machine. Two of the enormous display screens can be seen in this photo; the screen on the left has a Kinect for Windows v2 sensor mounted above and speakers positioned below.

The magic begins when the child’s treasured toy is handed over to one of Monty’s helpers. The helper then takes the toy into the cylinder, where, unseen by the kids, it is suspended by wires and photographed by three digital SLR cameras. The cameras rotate around the toy, capturing it from every angle. The resulting photos are then fed into a customized computer running Windows 8.1, which compiles them into a 3D image that is projected onto the huge screen, much to the delight of the toy’s young owner, who is standing in front of the display. This all takes less than two minutes.

Suspended by wires, the toy is photographed by three digital SLR cameras (two of which are visible here) that rotate around the toy and capture its image from every angle.

    The Kinect for Windows v2 sensor then takes over, bringing the toy’s image to life by capturing and responding to the youngster’s gestures. When a child waves at the screen, their stuffed friend wakens from its inanimate slumber—magically brought to life and waving back to its wide-eyed owner. Then, when the child waves again, their toy dances in response, amazing and enchanting both kids and parents, many of whom cannot resist dancing too.
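The code behind the installation isn’t published, but the wave detection can be sketched in a few lines: watch the tracked hand joint over a short window of body frames and flag a wave when the hand, held above the elbow, swings back and forth far enough. The joint names, window size, and thresholds below are illustrative assumptions, not the installation’s actual values.

```python
from collections import deque

# Illustrative sketch of wave detection from body-tracking data.
# Joint positions are assumed to arrive per frame in camera space (meters);
# all names and thresholds here are assumptions for illustration only.

class WaveDetector:
    def __init__(self, window=30, min_swings=3, min_travel=0.12):
        self.hand_x = deque(maxlen=window)   # recent horizontal hand positions
        self.min_swings = min_swings         # direction reversals required
        self.min_travel = min_travel         # side-to-side travel in meters

    def update(self, hand, elbow):
        """hand and elbow are (x, y, z) positions for one tracked body."""
        if hand[1] <= elbow[1]:              # the hand must be raised above the elbow
            self.hand_x.clear()
            return False
        self.hand_x.append(hand[0])
        if len(self.hand_x) < self.hand_x.maxlen:
            return False
        xs = list(self.hand_x)
        if max(xs) - min(xs) < self.min_travel:
            return False
        deltas = [b - a for a, b in zip(xs, xs[1:])]
        swings = sum(1 for d1, d2 in zip(deltas, deltas[1:]) if d1 * d2 < 0)
        return swings >= self.min_swings
```

Each new body frame would feed the hand and elbow joints into update(); when it returns True, the app can trigger the toy’s wake-up or dance animation.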

The Kinect for Windows SDK 2.0 plays an essential role in animating the toy. Having added a skeleton to the toy’s 3D model, the developers used the SDK to identify key sequences of movements, thus enabling the toy to mimic the lifelike poses and dances of a human being. Because the actions map to those of a human figure, Monty’s Magical Toy Machine works best on toys like teddy bears and dolls, whose bipedal form resembles a person’s. It also works best with soft-textured toys, whose surface features are more accurately captured in the photos.
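The retargeting idea behind that mimicry can also be sketched: copy the direction of each tracked human bone onto the matching bone of the toy’s rig, which is exactly why a roughly bipedal toy works best. The bone mapping and the planar angle math below are simplifying assumptions made for illustration; a production system would use full 3D joint orientations.

```python
import math

# Illustrative retargeting sketch: drive a toy rig from tracked human joints.
# Each bone is a (parent joint, child joint) pair; we copy its direction
# (reduced to a planar angle here for brevity) onto an assumed toy rig.

HUMAN_TO_TOY_BONES = {
    ("shoulder_right", "elbow_right"): "toy_arm_right_upper",
    ("elbow_right", "hand_right"): "toy_arm_right_lower",
    ("shoulder_left", "elbow_left"): "toy_arm_left_upper",
    ("hip_right", "knee_right"): "toy_leg_right_upper",
}

def bone_angle(parent, child):
    """Planar angle of the bone running from parent to child, in radians."""
    return math.atan2(child[1] - parent[1], child[0] - parent[0])

def retarget(human_joints):
    """human_joints: joint name -> (x, y) position from body tracking.
    Returns toy bone name -> rotation to apply to the toy's rig."""
    toy_pose = {}
    for (parent, child), toy_bone in HUMAN_TO_TOY_BONES.items():
        if parent in human_joints and child in human_joints:
            toy_pose[toy_bone] = bone_angle(human_joints[parent],
                                            human_joints[child])
    return toy_pose
```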

    The entire project took two months to build, reports Tindsley. “We began with scanning a real toy with Kinect technology, mapping it to create a surface representation (a mesh), then adding in texture and color. We then brought in a photogrammetry expert who created perfect 3D images for us to work with,” Tindsley recalls.

    Then came the moment of truth: bringing the image to life. “In the first trials, it took 12 minutes from taking the 3D scans of the toy to it ‘waking up’ on the screen—too long for any eager child or parent to wait,” said Tindsley. “Ten days later, we had it down to around 100 seconds. We then compiled—read choreographed and performed—a series of dance routines for the toy, using a combination of Kinect technology and motion capture libraries,” he recounts.


    A teddy bear named Rambo Ginge comes to life through Monty's Magical Toy Machine, and, as this video shows, even adults are enraptured to see their priceless toys come alive.

None of this behind-the-scenes technology matters to the children, who joyfully accept that somehow their favorite stuffed toy has miraculously come to life. Their looks of surprise and wonder are priceless.

    And the payoff for John Lewis? Brand loyalty and increased traffic during a critical sales period. As Rachel Swift notes, “The partnership with Microsoft allowed us to deliver a unique and memorable experience at a key time of year. But above all,” she adds, “the reward lies in surprising and delighting our customers, young and old.” Just as Monty receives the perfect Christmas gift in the TV ad, so, too, do the kids whose best friends come to life before their wondering eyes.

    The Kinect for Windows Team

    Key links



  • Kinect for Windows Product Blog

    Intel-GE Care Innovations uses Kinect solution to help elderly patients


    “I’ve fallen … and I can’t get up.” That line, from a low-budget 1980s TV commercial hawking a personal medical emergency call button, has been fodder for countless comedians over the years. But falls among the elderly are anything but a laughing matter, especially to Maureen Glynn, the director of behavioral innovation programming at Intel-GE Care Innovations.

“Falls are a major health concern among the elderly,” she says, and the statistics certainly back her up. In fact, the U.S. Centers for Disease Control and Prevention reports that each year one in three Americans over the age of 65 takes a spill, and the results can be devastating: broken bones, permanent disabilities, and complications that can lead to death. Indeed, falls are the leading cause of fatal and nonfatal injuries among older adults, with studies documenting that 20 to 30 percent of the elderly who fall suffer moderate to severe injuries. In 2003, for example, about 13,700 Americans 65 years or older died from falls, and another 1.8 million were treated in emergency departments for nonfatal fall injuries. Treating elderly patients who have fallen costs about $30 billion annually in the United States today, and experts estimate that this amount could more than double by 2020, given the aging population of Baby Boomers.

Under the watchful eye of the Kinect sensor, a patient performs her physical therapy regimen from the comfort and convenience of her own home.

What’s more, once an elderly individual has suffered a fall, he or she is much more likely to fall again without some sort of intervention. “I had a 76-year-old family member who fell five times, enduring repeated broken bones,” Glynn recounts. And while broken bones are no fun for anyone, they pose special problems in the elderly, whose ability to heal is often diminished. Seniors who break a hip—a common injury in falls among the elderly—may end up spending considerable time in the hospital and rehab, and may never attain full functionality again. Such sufferers become physically inactive, which, notes Glynn, “can lead to chronic mental and physical disease.”

    Glynn’s employer is determined to change this dismal picture. As its name clearly indicates, Intel-GE Care Innovations is a joint venture of two industry titans. Founded in 2011, the company seeks to transform the way care is delivered by connecting patients in their homes with care teams—thus enabling patients to live independently whenever possible. Augmenting the technological strengths of its parent companies with deep knowledge of the healthcare system, Intel-GE Care Innovations collects, aggregates, and analyzes data to provide insights that connect providers, payers, caregivers, and consumers—and brings the care continuum into the patient’s home. For example, the company has established the Care Innovations Validation Institute to improve standards for measuring and promoting remote care management solutions and services.

    One of the company’s latest products, RespondWell from Care Innovations, takes direct aim at the problem of falls among the elderly. As Glynn observes, “As a company dedicated to helping patients receive the healthcare they need while maintaining as much independence as possible, we saw the need for a home-based solution that helps older people recover from and avoid falls.”

The Kinect sensor monitors the patient’s performance, correcting improperly executed movements and awarding points for those done appropriately.

Responding to this need led the company to partner with RespondWell, a healthcare IT software company that, as CEO John Grispon explains, “specializes in activating patients and driving efficiencies. We motivate patients to follow through with their physical therapy by making the activities interactive and engaging.” The company’s antecedents were in the gaming world, having created one of the first fitness games for the original Xbox, back in 2005. From there, RespondWell moved into the physical therapy industry, determined to do for rehab what they had done for fitness: getting people up and moving by making the often onerous rehab exercises interactive and entertaining.

    Both Intel-GE Care Innovations and RespondWell saw Kinect as the logical platform for addressing fall prevention and rehabilitation among seniors. Recognizing how difficult it can be for older people to make daily visits to their therapist’s office, the teams at Intel-GE Care Innovations and RespondWell have created an interactive program that lets patients exercise in the comfort of their own home while providing Kinect-based gesture monitoring to ensure that they are performing their exercises correctly. The solution is sold to therapists and other healthcare providers.

    It works like this: the therapist evaluates the patient and then designs a program of exercises that are intended to restore functions and, equally important, to prevent future falls. The patient learns the movements under the watchful eye of the therapist—and the unblinking lens of the Kinect sensor, which faithfully tracks the patient’s skeletal positions throughout the exercises.

Using data captured by the Kinect sensor, the physical therapist can track a patient’s progress and adjust the exercise regimen as necessary.

    At home, patients call up their personalized exercise program on a Windows tablet or PC, which is connected to a Kinect sensor. They then perform the exercises, again under the view of the Kinect sensor, and the system analyzes their movements and provides instructions to correct any mistakes. The system not only corrects errors, but it rewards good performance with points, adding a competitive element that many patients find highly motivating. Glynn praises this positive reinforcement element of the solution, pointing out that it motivates patients without being overly gamified. She also points out that the solution not only monitors and coaches patients in the comfort of their own home, but that it also sends data about the performance to their therapist, who can adjust the exercises as needed.
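RespondWell’s own algorithms aren’t described in detail here, but the general shape of such movement analysis can be sketched: reduce the tracked skeleton to a few joint angles, compare them with the angles the therapist prescribed, and hand out points or corrections accordingly. The angle names, tolerance, and scoring below are illustrative assumptions, not the product’s actual rules.

```python
import math

# Illustrative sketch of pose checking against a therapist-defined reference.
# All joint names, tolerances, and point values are assumptions for illustration.

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by points a-b-c, each (x, y, z)."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def score_pose(patient, reference, tolerance_deg=15):
    """patient/reference: angle name -> degrees. Returns (points, feedback)."""
    points, feedback = 0, []
    for name, target in reference.items():
        error = abs(patient.get(name, 0.0) - target)
        if error <= tolerance_deg:
            points += 10                      # reward a correctly held angle
        else:
            feedback.append(f"adjust {name}: off by {error:.0f} degrees")
    return points, feedback

# Example: one tracked angle for a knee-bend exercise.
reference = {"right_knee": 90.0}
patient = {"right_knee": joint_angle((0.2, 0.9, 2.0), (0.2, 0.5, 2.0), (0.3, 0.1, 1.9))}
print(score_pose(patient, reference))
```

In a real session the reference angles would come from the therapist’s recorded demonstration, with the feedback strings driving the on-screen coaching and the points feeding the reward system.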

RespondWell from Care Innovations was developed on the Kinect v2 sensor, which Grispon enthusiastically endorses, “especially the enhanced field of view, which lets us get a good look at the patient even when he’s very close to the sensor.” He also praises the v2 sensor’s improved picture resolution and enhanced skeletal tracking, both of which boost the solution’s ability to precisely record patient movements. When asked how easy it was to port the code from the original Kinect sensor to the Kinect v2 sensor, Grispon quotes his lead developer, who says that the process was easy and offers this advice to other devs: “Smoothing is your friend—use it.” (Smoothing refers to filtering the tracked joint data to reduce frame-to-frame jitter, which makes the joints in the Kinect image easier for developers to work with.)
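One minimal way to smooth joint data, shown purely as an illustration of the idea rather than the filter RespondWell actually uses, is an exponential moving average applied to each joint:

```python
# Minimal smoothing sketch: an exponential moving average applied per joint.
# This illustrates the general idea of damping jitter in tracked joint data;
# it is not RespondWell's actual filter.

class JointSmoother:
    def __init__(self, alpha=0.3):
        self.alpha = alpha        # lower alpha = smoother but laggier
        self.state = {}           # joint name -> last smoothed (x, y, z)

    def smooth(self, joints):
        """joints: joint name -> raw (x, y, z). Returns the smoothed positions."""
        out = {}
        for name, raw in joints.items():
            prev = self.state.get(name, raw)
            out[name] = tuple(self.alpha * r + (1 - self.alpha) * p
                              for r, p in zip(raw, prev))
        self.state = out
        return out
```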

    Currently in pilot testing, RespondWell from Care Innovations is expected to be generally available by January 2015. Patients in the pilot program report that the solution makes physical therapy more enjoyable, while therapists are delighted that patients are more motivated to do their home exercises and that they are performing them more accurately. Both Glynn and Grispon stress that RespondWell from Care Innovations fills a vital need in our healthcare system. A system that not only helps seniors recover from falls but also helps prevent future tumbles could curb medical costs and offer the elderly a much improved quality of life. We’re pleased that Kinect can play a role in this effort.

    The Kinect for Windows Team

    Key links

  • Kinect for Windows Product Blog

    Telekinesis, Kinect for Windows style


    What science fiction fan doesn’t love the idea of telekinesis? We never tire of the illusion, but there’s nothing illusory about Microsoft tech evangelist Mike Taulty’s ability to move a ball without touching it, using only simple hand gestures.


Mike Taulty controls a Sphero with simple gestures—and a Kinect for Windows v2 sensor.

    The ball in question is the Sphero, a clever little robot that is normally controlled via a smartphone or tablet. The “magic” behind Taulty’s handiwork comes courtesy of Kinect for Windows v2, as he explains in the video clip above. By adding Kinect to the equation, Taulty made it possible to control Sphero without the need to use a tablet or smartphone. He created his ball-rolling app to demonstrate the potential of Windows 8.1 apps at the Native Summit conference in London this past September. He tied together the Kinect for Windows v2 sensor and the Sphero device with some JavaScript code, and voilà—he could control the rotation of the Sphero with his left hand and its direction of movement with his right.
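Taulty’s JavaScript isn’t reproduced in this post, but the gesture-to-command mapping he describes can be sketched: use the left hand’s height to set how fast the ball spins and the right hand’s angle around the body to set the direction it rolls. The function, value ranges, and joint inputs below are illustrative assumptions, not the Sphero API or Taulty’s code.

```python
import math

# Illustrative sketch of mapping two tracked hands to ball commands.
# Left hand height -> spin rate; right hand angle around the spine -> heading.
# All inputs are (x, y, z) camera-space positions in meters; the output would
# still need to be sent to the ball through whatever robot API is in use.

def hands_to_command(spine, left_hand, right_hand):
    # Spin: how far the left hand is raised above the spine, clamped to 0..1.
    spin = max(0.0, min(1.0, (left_hand[1] - spine[1]) / 0.5))
    # Heading: angle of the right hand around the body's vertical axis.
    heading = math.degrees(math.atan2(right_hand[0] - spine[0],
                                      right_hand[2] - spine[2])) % 360
    return spin, heading

# Example frame: left hand raised 0.4 m, right hand out to the side and forward.
print(hands_to_command((0.0, 0.4, 2.0), (0.0, 0.8, 2.0), (0.3, 0.4, 1.8)))
```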

    While Taulty calls his app “hacking for fun,” we see its potential in the real world. Imagine, for instance, how much more engaged a person could be with a digital display in a shopping center or public space if they could manipulate products or objects themselves. Imagine this simple application applied to a museum installation, an advertising display in a retail store, or a gaming arcade. We may not be able to control objects with our minds, but Kinect for Windows gives us the next best thing.

    The Kinect for Windows Team

    Key links

     

  • Kinect for Windows Product Blog

    Hackers shine in the Great White North


    Earlier this month, we traveled north for a developer hackathon at the Burnaby campus of the British Columbia Institute of Technology (BCIT), located in the heart of the Vancouver metropolitan area. The event, which was hosted by BCIT and co-sponsored by our team along with Occipital (makers of the Structure Sensor), drew nearly 100 developers, all eager to spend their weekend hacking with us. We were astonished by their creativity and energy—and their ability to cram so much hardware on each table!

Attendees hunched over keyboards and displays, hard at work on their projects.

    First place

    Team Bubble Fighter won the top prize ($250, five Kinect for Windows v2 sensors and carrying bags, and a Structure Sensor) for their game Bubble Fighter. The game, reminiscent of Street Fighter, allows two people to play against one another over a network connection. Game play includes special moves triggered by gestures, including jumping to make your avatar leap over projectiles.

Players really had to jump to avoid projectiles in Bubble Fighter.

    Second place

Team NAE took second place ($100, five Kinect for Windows v2 sensors and carrying bags, and a Structure Sensor) for their Public Speech Trainer, which helps users improve their public speaking, including training to avoid bad posture (think crossed arms) and distracting gestures. The app also provides built-in video recording, enabling users to review all of their prior training sessions.

    Third place

    The Eh-Team grabbed third place (five Kinect for Windows v2 sensors and carrying bags) for The Fitness Clinic, an app that provides real-time feedback on a user’s form during popular gym workouts.

We loved the great turnout, even if it meant that the hackers had just enough room on the table for all their gear.

    Other projects presented

    • 3D Object Recognition (team Rebel Without Clause), which uses Structure Sensor to scan objects and then provides object recognition
    • Enhanced Gaze-Tracking (team Gazepoint), which paired an external eye tracker with a Kinect v2 sensor to achieve enhanced gaze tracking
    • Fall Detection System* (team WelCam), an app that detects when an elderly person has fallen and sends email alerts to loved ones; the app also recognizes voice commands, which allows the fallen person to abort the email response by saying “ignore” or to emphatically solicit aid by saying “help”
    • Focal Length Finder* (team Sharp Corners), which determines the focal length of a lens, based on a calibration object
    • FusedFusion*, which marries the Kinect original and the v2 sensors into a single application, fusing their data into a shared Kinect Fusion volume
    • Joker (team Joker), which uses voice commands to open an Internet browser and other applications, and provides a Pong-like game powered by audio and body sensing
    • Reverse Dance Game (team The Peeps), a motion tracking game that adjusts sound based on players’ motions and gestures
    • The Red Ball Project (team NetKitties), which lets users toss a virtual ball using voice and gesture commands
    • Trenton (team New Team), which gives users hands-free control of applications by letting them utilize gestures to scroll windows and complete other tasks

Team WelCam demonstrated their fall detection app.

Thanks to our gracious hosts at BCIT, to all the attendees who came to hack and share ideas, and to our co-sponsor Occipital. I hope to see everyone again at a future event.

    Ben Lower, Developer Community Manager, Kinect for Windows

    Key links

    ____________________
    *Denotes projects awarded an Honorable Mention by the judges

     

  • Kinect for Windows Product Blog

    Re-imagining banking with Kinect for Windows


Someday, people might reminisce about the days when ATMs were the state of the art in offsite banking. That day might come sooner than expected, thanks in part to Kinect for Windows technology. Diebold, Incorporated, a worldwide leader in integrated service solutions, has unveiled a prototype of a standalone banking platform that promises to make self-service banking more convenient, intuitive, and secure, for both the customer and the financial institution.

Called the Responsive Banking Concept, the prototype is equipped with touch screens and sensing devices that simplify and protect offsite banking transactions. Kinect for Windows is one of the key underlying technologies, providing motion and voice sensing to help recognize customers and provide personalized service for even complex transactions. The Responsive Banking Concept prototype made its debut on November 2 at the 2014 Money 20/20 conference, a global event for financial service industries.

Designed to bring secure, personalized financial services to places where customers work and play, the full-scale Responsive Banking Concept could be implemented in airports, shopping malls, and other high-traffic areas, while smaller modular versions could be placed in retail shops. So someday soon, a Kinect-enabled installation might be your new personal banker.

                    The Kinect for Windows Team

    Key links

  • Kinect for Windows Product Blog

    Big day for Kinect developers


The Kinect Adapter for Windows enables you to connect a Kinect for Xbox One sensor to Windows 8.0 and 8.1 PCs and tablets

    Today, we're extremely excited to announce some major news about Kinect:

    • The full release of the Kinect for Windows software development kit 2.0, featuring over 200 improvements and updates to the preview version of the SDK. The SDK is a free download, and there are no fees for runtime licenses of commercial applications developed with the SDK.

• Ability to develop Kinect apps for the Windows Store. With commercial availability of SDK 2.0, you can develop and deploy Kinect v2 apps in the Windows Store for the first time. Access to the Windows Store enables you to reach millions of potential customers for your business and consumer solutions.

    • Availability of the US$49.99 Kinect Adapter for Windows that enables you to connect a Kinect for Xbox One sensor to Windows 8.0 and 8.1 PCs and tablets. This means that developers can use their existing Kinect for Xbox One sensor to create Kinect v2 solutions, and consumers can experience Kinect v2 apps on their computer by using the Kinect for Xbox One sensor they already own. The adapter is available in over two dozen markets—rolling out later today—and will be available in a total of 41 markets in coming weeks.

    You can find more details about these developments in Microsoft Technical Fellow Alex Kipman's post on the Official Microsoft Blog. As Alex says, "these updates are all part of our desire to make Kinect accessible and easy to use for every developer."  

    The Kinect for Windows Team

    Key links

  • Kinect for Windows Product Blog

    Hackers display their creativity at Amsterdam hackathon


We recently traveled to the Netherlands’ capital for our latest developer hackathon. The venue, Pakhuis de Zwijger, a former refrigerated warehouse located on one of Amsterdam’s many canals, made for a unique setting. Developers from all over Europe came for the 28-hour event, which was hosted by Dare to Difr. There were some very innovative projects, and we couldn’t have been happier with the energy of the attendees and the quality of their work.

The participants’ energy and creativity resulted in innovative projects during the Kinect for Windows hackathon in Amsterdam (September 5–6, 2014).


    First place

Team Hoog+Diep took the top prize (€1000 and one Kinect for Windows v2 sensor and carrying bag per team member) for their app My First Little Toy Story 3D, which allows users to capture playful adventures with favorite toys and share them as videos with friends. The app tracks the movement of dinosaurs and helicopters while the user plays with them, then it “magically” makes the user disappear from the video before sharing it.

Team Hoog+Diep took first place for their augmented play app, My First Little Toy Story 3D.


Team AK took second prize with their retail analytics solution, Clara.

Second place

Team AK earned second place (€500 and one Kinect for Windows v2 sensor and carrying bag per team member) for Clara, an app that provides real-time analytics for a retail store, showing how many shoppers came through and providing insights on customer behavior and product popularity.


Team motognosis took third place for their medical rehab and analysis solution, In exTremory.

Third place

Team motognosis won third place (one Kinect for Windows v2 sensor and carrying bag per team member) for their work on In exTremory, a “catch-the-shape” game for tremor analysis in clinical, rehabilitation, and home scenarios.


     Other projects presented

Developers from all over Europe came for the 28-hour event.

    • Hero (team Hero), a tool for emergency responders that enables distant risk assessment
    • AdoptAGeek (team KiMeet), which uses natural interactions to describe a user’s soul mate and provide a picture of the perfect match
    • Connect Your Home (team Connect Your Home), a building management system with tactile feedback
    • Midi Connector* (team Connector), a virtual band experience for two people: one rocking a mean air guitar and the other on drums
    • 1999: A Space Oddysey (sic; team LUSTlab), a personal virtual reality experience that involves the physical movement of the user
    • MeTricorder (team Metrilus), an app that makes the veins in the user’s arm clearly visible via projection mapping, thereby preventing multiple needle sticks during blood draws
• Docinector (team Metrilus), which automatically scans documents by using Kinect
    • Cool Guys Don't (team Michael Bay Fan Club), which demonstrates that cool guys don't look at explosions—really
    • Gesture Enabled Netsupport Webservice (team Netsupport), which lets users control Bing and Google maps in the browser by using gestures
    • BodyType (team Semper Five), a game in which players form letters by using the shape of their body
    • Sign Language Interpreter (team Sign Language Interpreter), which recognizes, transcribes, and teaches the gestures of sign language
• Defy Graphics* (team Defy Graphics), a dance game in which the environment in a club responds to what’s happening on the dance floor
    • Desert race (team Fireball), an immersive full-body experience for up to six players, who each drive their horse to the fullest by vigorously exercising and making key gestures
• Dorky Date (team ThreeManCamel), a first-person, physics-based dating simulator that plays up how awkward dating can be
    • Blijft dat zo (team Vastgoed), a physics collider and forces experience that uses joint position and particles


    Upcoming events

    Developers took advantage of the enhanced skeletal tracking capabilities of Kinect for Windows v2.

    Our next hackathon will take place in Vancouver, British Columbia, November 8–9; registration opens in October, so keep an eye on our blog.

    Thanks to all the attendees of the Amsterdam event and to our wonderful hosts at Dare to Difr. I look forward to watching the projects progress and to seeing you all again at a future event!

    Ben Lower, Developer Community Manager, Kinect for Windows

    Key links

    _________________
    *Denotes projects awarded an Honorable Mention by the judges

  • Kinect for Windows Product Blog

    Newly released: update to SDK 2.0 public preview


    Today we released another update to the Kinect for Windows SDK 2.0 public preview. This release contains important product improvements that add up to a more stable and feature-rich product. This updated SDK lets you get serious about finalizing your applications for commercial deployment and, later this year, for availability in the Windows Store. Please install, enjoy, and let us know what you think.

    The Kinect for Windows Team

    Key links

  • Kinect for Windows Product Blog

    Hacking away in Canada


A member of team Kwartzlab++ demonstrates his team's project VR Builder at the Kinect Hackathon in Kitchener, Ontario.

    Last week, we headed north to Canada for the latest stop on our Kinect Hackathon world tour: a three-day event (August 8–10) in Kitchener, Ontario, where developers gathered to develop applications* using Kinect for Windows v2. One of the three cities that make up the Regional Municipality of Waterloo, Kitchener has a booming tech community, fueled in part by the renowned computer science program at the University of Waterloo. So it was no surprise that the Kitchener attendees exhibited boundless energy and enormous creativity. Equally impressive was the hospitality of the people in Kitchener, especially Jennifer Janik and Rob Soosaar of Deep Realities, who were awesome hosts.

    And the winners* are…

Team CleanSweep took first place.

    • Team CleanSweep took first place (US$500 and three Kinect for Windows v2 sensors) for their app Turtle Curling, an augmented reality version of one of Canada’s favorite games: curling. And no, it doesn’t send real turtles sliding down the ice. It uses two Kinect v2 sensors and a TurtleBot to create an incredibly fun version of this unique Olympic sport.
    • Team Christie Digitalia took second place (US$250 and a Kinect for Windows v2 sensor) for their app Projection Cosplay, which turns anyone into a virtual superhero by using projection mapping. Imagine yourself as a supernaturally endowed crime fighter, the nemesis of virtual bad guys everywhere. 
    • Team Command Your Space took third place (US$100 and a Kinect for Windows v2 sensor) with Command Your Space, an app that enables online shoppers to see how furniture and accessories will fit into real-world environments, as seen by the Kinect sensor. It can also be used with 3D scans of your own furniture, allowing you to do a little virtual rearranging.

Hard at work: members of team BearHunterNinja (left) and team Titan (right)

    Other projects* presented

    • Angry Asteroid (team Pass/Fail), a game in the style of Angry Birds that uses Kinect motion controls
    • Art Jam (team REAPsters), a kinetic, interactive, multimedia experience in which users simultaneously interact with visual art and music using the Kinect sensor’s ability to detect motion
    • BearHunterNinja (team BearHunterNinja), an app that uses Kinect’s hand-state detection to enable the classic game of “rock, paper, scissors”; also implemented a variation of the game using custom, machine-learning gestures
    • BOHAH (team BOHAH), a therapeutic video game for children with disabilities
    • Bricktastic (team Bricktastic), who adapted their 3D brick-breaker mobile game to work with Kinect and Oculus Rift
    • ConnectConnect (team ConnectConnect), which networks together multiple Kinect sensors to allow sharing and combining of all the data in the same application, enabling more than six users and remote connections
    • Florb (team Titan), an app that lets you virtually fly, using your arms as thrusters
    • GIORP 5000 (team GIORPers), a proof of concept for an interactive retail clothing shopping experience
    • Half-Life 2 Mod for Kinect (team Barney’s Crabs), a Half-Life 2 mod with Kinect for Windows that enables movement and perspective changes
    • InteractionDemo (team Connecteraction), an app that powers experiments with Kinect data from the body, gestures, depth, and color
    • Speechy (team Speechy), a public speaking “training” program that uses Kinect to give you feedback on your posture, voice projection, and use of repeated words during presentations
    • Swish (team Focus on Fun), a marketing app that virtually dresses passersby in a store’s best outfit
    • Voice in Motion? (team Ace of Base?), an app that uses Kinect for Windows to interactively teach people American Sign Language (ASL)
    • VR Builder (team Kwartzlab++), an app that lets users build accurate 3D shapes that can then be placed in the user’s immediate area

    Upcoming events

    • Amsterdam, Netherlands (September 5–6): register at http://aka.ms/k4whackams
    • Vancouver, British Columbia (November 8): registration will open soon (keep an eye on our blog)

    Thanks to everyone who came to the event in Kitchener. I hope to see you at another event in the future!

    Ben Lower, Developer Community Manager, Kinect for Windows

    Key links

    _____________________
    *The names of the hackathon projects and teams are determined solely by the participants and are not intended to be used commercially.

  • Kinect for Windows Product Blog

    V2 meets 3D


As Microsoft Most Valuable Professional (MVP) James Ashley points out in a recent blog post, it’s a whole lot easier to create 3D movies with the Kinect for Windows v2 sensor and its preview software development kit (SDK 2.0 public preview). For starters, the v2 sensor captures up to three times more depth information than the original sensor did. That means you have far more depth data from which to construct your 3D images.

    The next big improvement is in the ability to map color to the 3D image. The original Kinect sensor used an SD camera for color capture, and the resulting low-resolution images made it difficult to match the color data to the depth data. (RGB+D, a tool created by James George, Jonathan Porter, and Jonathan Minard, overcame this problem.) Knowing that the v2 sensor has a high-definition (1080p) video camera, Ashley reasoned that he could use the camera's color images directly, without a workaround tool. He also planned to map the color data to depth positions in real-time, a new capability built into the preview SDK.
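The SDK performs that color-to-depth mapping for you, but the principle behind the resulting 3D imagery is simple to sketch: back-project each depth pixel into a 3D point using the depth camera’s intrinsics, then attach the color registered to that pixel. The intrinsic values below are placeholders, not the v2 sensor’s actual calibration.

```python
# Illustrative sketch of colored 3D capture: back-project each depth pixel to a
# 3D point and attach the registered color. Intrinsics are placeholder values,
# not the v2 sensor's calibration; the SDK normally does this mapping for you.

def depth_to_colored_points(depth, color, fx=365.0, fy=365.0, cx=256.0, cy=212.0):
    """depth: 2D list of millimeter readings; color: 2D list of (r, g, b)
    already registered to the depth image. Returns a list of colored points."""
    points = []
    for v, row in enumerate(depth):
        for u, d_mm in enumerate(row):
            if d_mm == 0:                 # zero means no depth reading
                continue
            z = d_mm / 1000.0             # meters
            x = (u - cx) * z / fx         # pinhole back-projection
            y = (v - cy) * z / fy
            points.append((x, y, z, color[v][u]))
    return points

# Tiny example: a 2 x 2 depth patch with matching colors.
depth = [[1500, 0], [1498, 1502]]
color = [[(200, 180, 170), (0, 0, 0)], [(198, 181, 172), (201, 179, 168)]]
print(len(depth_to_colored_points(depth, color)))   # 3 valid points
```

Collected over a sequence of frames, colored points like these form the point cloud that an app such as Ashley’s renders as 3D video.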

    Ashley shot this 3D video of his daughter Sophia by using Kinect for Windows v2 and a standard laptop.

    Putting these features together, Ashley wrote an app that enabled him to create 3D videos on a standard laptop (dual core Intel i5, with 4 GB RAM and an integrated Intel HD Graphics 4400). While he has no plans at present to commercialize the application, he opines that it could be a great way to bring real-time 3D to video chats.

    Ashley also speculates that since the underlying principle is a point cloud, stills of the volumetric recording could be converted into surface meshes that can be read by CAD software or even turned into models that could be printed on a 3D printer. He also thinks it could be useful for recording biometric information in a physician’s office, or for recording precise 3D information at a crime scene, for later review.

    Those who want to learn more from Ashley about developing cool stuff with the v2 sensor should note that his book, Beginning Kinect Programming with Kinect for Windows v2, is due to be published in October.

    The Kinect for Windows Team

    Key links
