• Kinect for Windows Product Blog

    Booting up customer relations with Kinect

    • 0 Comments

    Tension mounts as the place kicker trots onto the field. With only seconds left on the clock and a score of 12 to 14, this three-point field goal attempt will decide the game’s outcome. And who is the intrepid kicker, on whose kicking prowess so much rests? It’s any of a host of fans, each eager to boot the game-winning goal, thanks to some Kinect for Windows magic. The virtual field goal experience was a huge hit with football fans—watch the video below and see it in action.

    Yes, thousands of fans lined up for a chance to kick virtual field goals this past season at FedEx Field during home games of Washington, D.C.’s National Football League (NFL) franchise and at the annual Army-Navy grudge match at Baltimore’s M&T Bank Stadium. Part of an interactive Social Media Lounge, the virtual field goal setup consisted of a gaming wall (an enormous, high-definition screen composed of nine monitors) and a Kinect for Windows v2 sensor. The Kinect sensor tracked the kicker’s motion as he or she approached the virtual ball, gave it a boot, and then followed through with, one hopes, the classic high leg swing. Using this data, the Kinect-powered app computed the velocity, distance, and accuracy of the attempt and determined whether the kicker was the hero of the play. Unlike the NFL players, the fans got a little help: each kicker was allowed five attempts to send the ball through the uprights.
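
    MVP Interactive hasn't published how its app scores a kick, but the basic recipe is easy to sketch: sample the kicking foot's position from the Kinect body stream (camera-space joints arrive in metres, roughly 30 times per second), take the foot's peak speed during the swing as the speed imparted to the ball, and run that through a simple projectile model. Everything below (the function name, the fixed 35-degree launch angle, the no-drag physics) is illustrative rather than the production code.

```python
import math

def estimate_kick(foot_positions, timestamps, launch_angle_deg=35.0):
    """Rough kick estimate from tracked foot positions (metres) and frame times (seconds).

    A real app would also derive the launch direction from the foot's path at
    contact and compare the projected flight against the virtual uprights to
    judge accuracy.
    """
    # The peak foot speed over the swing approximates the speed given to the ball.
    peak_speed = 0.0
    for p0, p1, t0, t1 in zip(foot_positions, foot_positions[1:],
                              timestamps, timestamps[1:]):
        dt = t1 - t0
        if dt > 0:
            peak_speed = max(peak_speed, math.dist(p0, p1) / dt)

    # A simple no-drag projectile model turns speed into carry distance.
    g = 9.81
    angle = math.radians(launch_angle_deg)
    carry = (peak_speed ** 2) * math.sin(2 * angle) / g
    return peak_speed, carry
```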

    The Social Media Lounge was the brainchild of high-tech marketing firm MVP Interactive. In addition to the gaming wall, the Lounge enabled fans to create virtual bobble head figurines in their own likeness and to send email images of their kick attempts and bobble heads or post the images to Facebook or Twitter. The Army-Navy game event alone generated more than 67,000 social impressions (emails, Facebook posts and clicks, and Twitter clicks).

    It’s “thumbs up” from this fan as he prepares to attempt a virtual field goal for the Washington NFL team.

    The events were sponsored by multinational brewing company Anheuser-Busch at the Washington games and insurer USAA at the Army-Navy game. Anheuser-Busch used the experience to promote their Bud Light brand among digital-savvy millennials, a much-sought-after demographic for the brewer, while USAA, which offers financial products to military families, took advantage of the event to make a positive connection with potential customers.

    Anthony DiPrizio, chief technology officer at MVP Interactive, praises the capabilities of the Kinect for Windows v2 sensor. “At MVP Interactive, we work with and develop many cutting-edge technologies. We've found that the Kinect v2 sensor rivals and outperforms most commercial-grade sensors, which are 10 times the cost of the Kinect v2. We're very pleased with what Microsoft and the Kinect community have done to allow us to create interactions like the virtual field goal kick. The relentless effort of their development team to push out new SDK versions on a regular basis is truly game changing. We look forward to continuing to work with the device and push the boundaries of what's possible.”

    The Kinect for Windows Team

    Key links

  • Kinect for Windows Product Blog

    Original Kinect for Windows sensor sales to end in 2015

    • 3 Comments

    In October, we shipped the public release of the Kinect for Windows v2 sensor and its software development kit (SDK 2.0). The availability of the v2 sensor and SDK 2.0 means that we will be phasing out the sale of the original Kinect for Windows sensor in 2015.

    The move to v2 marks the next stage in our journey toward more natural human computing. The new sensor provides a host of new and improved features, including enhanced body tracking, greater depth fidelity, full 1080p high-definition video, new active infrared capabilities, and an expanded field of view. Likewise, SDK 2.0 offers scores of updates and enhancements, not the least of which is the ability to create and publish Kinect-enabled apps in the Windows Store. At the same time that we publicly released the v2 sensor and its SDK, we also announced the availability of the Kinect Adapter for Windows, which lets developers create Kinect for Windows applications by using a Kinect for Xbox One sensor. The response of the developer community to Kinect v2 has been tremendous: every day, we see amazing apps built on the capabilities of the new sensor and SDK, and since we released the public beta of SDK 2.0 in July, the community has been telling us that porting their original solutions over to v2 is smoother and faster than expected.

    The original Kinect for Windows sensor was a milestone achievement in the world of natural human computing. It allowed developers to create solutions that broke through the old barriers of mouse and keyboard interactions, opening up entirely new commercial experiences in multiple industries, including retail, education, healthcare, and manufacturing. The original Kinect let preschoolers play educational games by simply moving their arms; it coached patients through physical rehabilitation; it gave shoppers new ways to engage with merchandise and even try on clothes. The list of innovative solutions powered by the original Kinect for Windows goes on and on.

    We hope everyone will embrace the latest Kinect technology as soon as possible, but we understand that some business customers have commitments to the original sensor and SDK. If you’re one of them and need a significant number of original Kinect for Windows sensors, please contact us as soon as possible. We will do our best to fill your orders, but no more original sensors will be manufactured after the current stock sells out.

    All of us on the Kinect for Windows team are grateful to all of you in the community who jumped on this technology and showed us what it could do. We know that your proven track record doing great things with the original technology will only get better with v2—the improvements in quality from the original Kinect for Windows sensor to the v2 device are truly immense. And so, we’re cheered by the prospect of seeing all the amazing solutions you’ll create with the new and improved Kinect for Windows.

    The Kinect for Windows Team

    Key links

  • Kinect for Windows Product Blog

    The magic of Christmas, Kinect style

    • 0 Comments

    Every December, British shoppers look forward to the creative holiday ad campaign from John Lewis, a major UK department store chain. It’s been a tradition for a number of years, is seen by millions of viewers in the UK annually, and won a coveted IPA Effectiveness Award in 2012. The retailer’s seasonal campaign traditionally emphasizes the joy of giving and the magic of Christmas, and this year’s ads continue that tradition, with a television commercial that depicts the loving relationship between a young boy and his pet penguin, Monty.


    The 2014 Christmas advertisement from John Lewis tells the story of a boy and his penguin—and the
    magic of giving just the right gift.

    But the iconic British retailer has added a unique, high-tech twist to the 2014 campaign: Monty’s Magical Toy Machine, an in-store experience that uses the Kinect for Windows v2 sensor to let kids turn their favorite stuffed toy into an interactive 3D model. The experience deftly plays off the TV ad, whose narrative reveals that Monty is a stuffed toy that comes alive in the boy’s imagination.

    Monty’s Magical Toy Machine experience, which is available at the John Lewis flagship store on London’s Oxford Street, plays to every child’s fantasy of seeing a cherished teddy bear or rag doll come to life—a theme that runs through children’s classics from Pinocchio to the many Toy Story movies. The experience has been up and running since November 6, with thousands of customers interacting with it to date. Customers have until December 23 to enjoy the experience before it closes.

    The toy machine experience was the brainchild of Microsoft Advertising, which had been approached by John Lewis to come up with an innovative, technology-based experience based on the store’s holiday ad. “We actually submitted several ideas,” explains creative solutions specialist Art Tindsley, “and Monty’s Magical Toy Machine was the one that really excited people. We were especially pleased, because we were eager to use the new capabilities of the Kinect v2 sensor to create something truly unique.”

    John Lewis executives loved the idea and gave Microsoft the green light to proceed. "We were genuinely excited when Microsoft presented this idea to us,” says Rachel Swift, head of marketing for the John Lewis brand. “Not only did it exemplify the idea perfectly, it did so in a way that was both truly innovative and charming.”  

    Working with the John Lewis team and creative agency adam&eveDDB, the Microsoft team came up with the design of the Magical Toy Machine: a large cylinder, surrounded by three 75-inch display screens, one of which is topped by a Kinect for Windows v2 sensor. It is on this screen that the animation takes place.

    The enchantment happens here, at Monty's Magical Toy Machine. Two of the enormous display screens can be seen in this photo; the screen on the left has a Kinect for Windows v2 sensor mounted above and speakers positioned below.

    The magic begins when the child’s treasured toy is handed over to one of Monty’s helpers. The helper then takes the toy into the cylinder, where, unseen by the kids, it is suspended by wires and photographed by three digital SLR cameras. The cameras rotate around the toy, capturing it from every angle. The resulting photos are then fed into a customized computer running Windows 8.1, which compiles them into a 3D image that is projected onto the huge screen, much to the delight of the toy’s young owner, who is standing in front of the display. The whole process takes less than two minutes.

    Suspended by wires, the toy is photographed by three digital SLR cameras (two of which are visible here) that rotate around the toy and capture its image from every angle.

    The Kinect for Windows v2 sensor then takes over, bringing the toy’s image to life by capturing and responding to the youngster’s gestures. When a child waves at the screen, their stuffed friend wakens from its inanimate slumber—magically brought to life and waving back to its wide-eyed owner. Then, when the child waves again, their toy dances in response, amazing and enchanting both kids and parents, many of whom cannot resist dancing too.

    The Kinect for Windows SDK 2.0 plays an essential role in animating the toy. Having rigged the toy’s 3D model with a skeleton, the developers used the SDK to identify key sequences of movements, enabling the toy to mimic the lifelike poses and dances of a human being. Because the actions map to those of a human figure, Monty’s Magical Toy Machine works best on toys like teddy bears and dolls, which have a bipedal form like that of a person. It also functions best with soft-textured toys, whose surface features are more accurately captured in the photos.
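
    The wave that wakes the toy is a good example of the gesture logic involved. The body-tracking API reports joint positions rather than ready-made gestures such as a wave, so a common approach is to watch the hand oscillate from side to side while it stays above the elbow. The Python sketch below illustrates that idea with invented window sizes and thresholds; it assumes hand and elbow positions have already been read from the sensor's body frames and is not the installation's actual code.

```python
from collections import deque

class WaveDetector:
    """Toy wave detector over a short history of hand and elbow positions.

    Positions are (x, y, z) camera-space tuples in metres for one tracked body.
    The window size and thresholds are illustrative, not the values used in
    the John Lewis installation.
    """
    def __init__(self, window=30, min_crossings=3, min_swing=0.05):
        self.offsets = deque(maxlen=window)   # hand x relative to elbow x, one entry per frame
        self.min_crossings = min_crossings    # side-to-side passes required within the window
        self.min_swing = min_swing            # metres of lateral travel required

    def update(self, hand, elbow):
        if hand[1] <= elbow[1]:               # the hand must be raised above the elbow
            self.offsets.clear()
            return False
        self.offsets.append(hand[0] - elbow[0])
        if len(self.offsets) < self.offsets.maxlen:
            return False
        swing = max(self.offsets) - min(self.offsets)
        history = list(self.offsets)
        crossings = sum(1 for a, b in zip(history, history[1:]) if a * b < 0)
        return swing >= self.min_swing and crossings >= self.min_crossings
```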

    The entire project took two months to build, reports Tindsley. “We began with scanning a real toy with Kinect technology, mapping it to create a surface representation (a mesh), then adding in texture and color. We then brought in a photogrammetry expert who created perfect 3D images for us to work with,” Tindsley recalls.

    Then came the moment of truth: bringing the image to life. “In the first trials, it took 12 minutes from taking the 3D scans of the toy to it ‘waking up’ on the screen—too long for any eager child or parent to wait,” said Tindsley. “Ten days later, we had it down to around 100 seconds. We then compiled—read choreographed and performed—a series of dance routines for the toy, using a combination of Kinect technology and motion capture libraries,” he recounts.


    A teddy bear named Rambo Ginge comes to life through Monty's Magical Toy Machine, and, as this video shows, even adults are enraptured to see their priceless toys come alive.

    None of this behind-the-scenes high tech matters to the children, who joyfully accept that somehow their favorite stuffed toy has miraculously come to life. Their looks of surprise and wonder are priceless.

    And the payoff for John Lewis? Brand loyalty and increased traffic during a critical sales period. As Rachel Swift notes, “The partnership with Microsoft allowed us to deliver a unique and memorable experience at a key time of year. But above all,” she adds, “the reward lies in surprising and delighting our customers, young and old.” Just as Monty receives the perfect Christmas gift in the TV ad, so, too, do the kids whose best friends come to life before their wondering eyes.

    The Kinect for Windows Team

    Key links



  • Kinect for Windows Product Blog

    Intel-GE Care Innovations uses Kinect solution to help elderly patients

    • 0 Comments

    “I’ve fallen … and I can’t get up.” That line, from a low-budget 1980s TV commercial hawking a personal medical emergency call button, has been fodder for countless comedians over the years. But falls among the elderly are anything but a laughing matter, especially to Maureen Glynn, the director of behavioral innovation programming at Intel-GE Care Innovations.

    “Falls are a major health concern among the elderly,” she says, and the statistics certainly back her up. In fact, the U.S. Centers for Disease Control and Prevention reports that each year one in three Americans over the age of 65 takes a spill, and the results can be devastating: broken bones, permanent disabilities, and complications that can lead to death. Falls are the leading cause of fatal and nonfatal injuries among older adults, with studies documenting that 20 to 30 percent of the elderly who fall suffer moderate to severe injuries. In 2003, for example, about 13,700 Americans 65 years or older died from falls, and another 1.8 million were treated in emergency departments for nonfatal fall injuries. Treating elderly patients who have fallen costs about $30 billion annually in the United States today, and experts estimate that that amount could more than double by 2020, given the aging population of Baby Boomers.

    Under the watchful eye of the Kinect sensor, a patient performs her physical therapy regimen from the comfort and convenience of her own home.

    What’s more, once an elderly individual has suffered a fall, he or she is much more likely to fall again without some sort of intervention. “I had a 76-year-old family member who fell five times, enduring repeated broken bones,” Glynn recounts. And while broken bones are no fun for anyone, they pose special problems in the elderly, whose ability to heal is often diminished. Seniors who break a hip—a common injury in falls among the elderly—may end up spending considerable time in the hospital and rehab, and may never attain full functionality again. Such sufferers become physically inactive, which, notes Glynn, “can lead to chronic mental and physical disease.”

    Glynn’s employer is determined to change this dismal picture. As its name clearly indicates, Intel-GE Care Innovations is a joint venture of two industry titans. Founded in 2011, the company seeks to transform the way care is delivered by connecting patients in their homes with care teams—thus enabling patients to live independently whenever possible. Augmenting the technological strengths of its parent companies with deep knowledge of the healthcare system, Intel-GE Care Innovations collects, aggregates, and analyzes data to provide insights that connect providers, payers, caregivers, and consumers—and brings the care continuum into the patient’s home. For example, the company has established the Care Innovations Validation Institute to improve standards for measuring and promoting remote care management solutions and services.

    One of the company’s latest products, RespondWell from Care Innovations, takes direct aim at the problem of falls among the elderly. As Glynn observes, “As a company dedicated to helping patients receive the healthcare they need while maintaining as much independence as possible, we saw the need for a home-based solution that helps older people recover from and avoid falls.”

    The Kinect sensor monitors the patient’s performance, correcting improperly executed movements and awarding points for those done appropriately.

    Responding to this need led them to partner with RespondWell, a healthcare IT software company that, as CEO John Grispon explains, “specializes in activating patients and driving efficiencies. We motivate patients to follow through with their physical therapy, by making the activities interactive and engaging.” The company’s antecedents were in the gaming world, having created one of the first fitness games for the original Xbox, back in 2005. From there, RespondWell moved into the physical therapy industry, determined to do for rehab what they had done for fitness: getting people up and moving by making the often onerous rehab exercises interactive and entertaining.

    Both Intel-GE Care Innovations and RespondWell saw Kinect as the logical platform for addressing fall prevention and rehabilitation among seniors. Recognizing how difficult it can be for older people to make daily visits to their therapist’s office, the teams at Intel-GE Care Innovations and RespondWell have created an interactive program that lets patients exercise in the comfort of their own home while providing Kinect-based gesture monitoring to ensure that they are performing their exercises correctly. The solution is sold to therapists and other healthcare providers.

    It works like this: the therapist evaluates the patient and then designs a program of exercises that are intended to restore functions and, equally important, to prevent future falls. The patient learns the movements under the watchful eye of the therapist—and the unblinking lens of the Kinect sensor, which faithfully tracks the patient’s skeletal positions throughout the exercises.

    Using data captured by the Kinect sensor, the physical therapist can track a patient’s progress and adjust the exercise regimen as necessary.

    At home, patients call up their personalized exercise program on a Windows tablet or PC that is connected to a Kinect sensor. They then perform the exercises, again under the view of the Kinect sensor, and the system analyzes their movements and provides instructions to correct any mistakes. The system not only corrects errors but also rewards good performance with points, adding a competitive element that many patients find highly motivating. Glynn praises this positive reinforcement, pointing out that it motivates patients without making the exercises feel overly gamified. She also notes that the solution not only monitors and coaches patients in the comfort of their own home, but also sends performance data to their therapist, who can adjust the exercises as needed.
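
    Care Innovations has not published how RespondWell scores a movement, but the usual recipe for this kind of form checking is straightforward: compute the angle at a joint from three tracked joint positions and compare it with the range the therapist prescribed, awarding a point when the patient hits it. The sketch below illustrates that recipe with a squat as the example; the joint names, target range, and feedback strings are placeholders, not the product's algorithm.

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint b, given three (x, y, z) joint positions in metres."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    norm = math.sqrt(sum(x * x for x in v1)) * math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def score_squat(hip, knee, ankle, target=(80.0, 100.0)):
    """Award a point when the knee angle at the bottom of a squat falls in the prescribed range."""
    angle = joint_angle(hip, knee, ankle)
    if target[0] <= angle <= target[1]:
        return 1, "Good depth!"
    if angle > target[1]:
        return 0, "Bend a little deeper."
    return 0, "Not quite so low."
```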

    RespondWell from Care Innovations was developed on the Kinect v2 sensor, which Grispon enthusiastically endorses, “especially the enhanced field of view, which lets us get a good look at the patient even when he’s very close to the sensor.” He also praises the v2 sensor’s improved picture resolution and enhanced skeletal tracking, both of which boost the solution’s ability to precisely record patient movements. When asked how easy it was to port the code from the original Kinect sensor to the Kinect v2 sensor, Grispon quotes his lead developer, who says that the process was easy and offers this advice to other devs: “Smoothing is your friend—use it.” (Smoothing filters out frame-to-frame jitter in the tracked joint data, giving developers steadier, more usable joint positions.)
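
    Whatever helper the RespondWell developers relied on, the idea behind smoothing is simple: blend each new joint reading with the previous estimate so that single-frame jitter does not register as patient movement. A minimal exponential-smoothing sketch, with an illustrative blending factor, might look like this:

```python
class JointSmoother:
    """Exponentially smooth one joint's position to damp frame-to-frame jitter.

    An alpha closer to 1.0 tracks the raw data tightly; closer to 0.0 smooths
    more but adds lag. The default here is only a starting point.
    """
    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self.state = None

    def update(self, position):
        if self.state is None:
            self.state = tuple(position)
        else:
            self.state = tuple(self.alpha * p + (1 - self.alpha) * s
                               for p, s in zip(position, self.state))
        return self.state
```

    The trade-off is lag: heavier smoothing steadies the skeleton but makes the on-screen feedback trail the patient slightly, so the blending factor is worth tuning per joint and per exercise.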

    Currently in pilot testing, RespondWell from Care Innovations is expected to be generally available by January 2015. Patients in the pilot program report that the solution makes physical therapy more enjoyable, while therapists are delighted that patients are more motivated to do their home exercises and that they are performing them more accurately. Both Glynn and Grispon stress that RespondWell from Care Innovations fills a vital need in our healthcare system. A system that not only helps seniors recover from falls but also helps prevent future tumbles could curb medical costs and offer the elderly a much improved quality of life. We’re pleased that Kinect can play a role in this effort.

    The Kinect for Windows Team

    Key links

  • Kinect for Windows Product Blog

    Telekinesis, Kinect for Windows style

    • 0 Comments

    What science fiction fan doesn’t love the idea of telekinesis? We never tire of the illusion, but there’s nothing illusory about Microsoft tech evangelist Mike Taulty’s ability to move a ball without touching it, using only simple hand gestures.


    Mike Taulty controls a Sphero with simple gestures—and a Kinect for Windows v2 sensor.

    The ball in question is the Sphero, a clever little robot that is normally controlled via a smartphone or tablet. The “magic” behind Taulty’s handiwork comes courtesy of Kinect for Windows v2, as he explains in the video clip above. By adding Kinect to the equation, Taulty made it possible to control Sphero without the need to use a tablet or smartphone. He created his ball-rolling app to demonstrate the potential of Windows 8.1 apps at the Native Summit conference in London this past September. He tied together the Kinect for Windows v2 sensor and the Sphero device with some JavaScript code, and voilà—he could control the rotation of the Sphero with his left hand and its direction of movement with his right.
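
    Taulty's JavaScript is not reproduced here, but the heart of such a demo is a mapping from tracked joint positions to the heading-and-speed commands a Sphero understands. The Python sketch below shows one plausible mapping, with made-up arm-length constants; it is not Taulty's scheme, and delivering the resulting command to the ball is left to whichever Sphero client library is in use.

```python
def hands_to_sphero_command(left_hand, right_hand, spine_base):
    """Map tracked hand positions (camera-space metres) to a (heading_degrees, speed) pair.

    One plausible mapping: the left hand's lateral offset from the spine sets
    the heading, and how far the right hand is pushed toward the sensor sets
    the speed. Sending the command to the ball is left to the caller.
    """
    lateral = left_hand[0] - spine_base[0]        # metres left/right of the spine
    heading = (lateral / 0.6) * 180.0 % 360.0     # roughly one arm's length sweeps half a turn
    push = spine_base[2] - right_hand[2]          # z shrinks toward the sensor, so this grows as the hand extends
    speed = max(0.0, min(1.0, push / 0.5))        # clamp to the 0..1 range
    return heading, speed
```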

    While Taulty calls his app “hacking for fun,” we see its potential in the real world. Imagine, for instance, how much more engaged a person could be with a digital display in a shopping center or public space if they could manipulate products or objects themselves. Imagine this simple application applied to a museum installation, an advertising display in a retail store, or a gaming arcade. We may not be able to control objects with our minds, but Kinect for Windows gives us the next best thing.

    The Kinect for Windows Team

    Key links

     

  • Kinect for Windows Product Blog

    Hackers shine in the Great White North

    • 2 Comments

    Earlier this month, we traveled north for a developer hackathon at the Burnaby campus of the British Columbia Institute of Technology (BCIT), located in the heart of the Vancouver metropolitan area. The event, which was hosted by BCIT and co-sponsored by our team along with Occipital (makers of the Structure Sensor), drew nearly 100 developers, all eager to spend their weekend hacking with us. We were astonished by their creativity and energy—and their ability to cram so much hardware on each table!

    Attendees hunched over keyboards and displays, hard at work on their projects.

    First place

    Team Bubble Fighter won the top prize ($250, five Kinect for Windows v2 sensors and carrying bags, and a Structure Sensor) for their game Bubble Fighter. The game, reminiscent of Street Fighter, allows two people to play against one another over a network connection. Gameplay includes special moves triggered by gestures, such as jumping to make your avatar leap over projectiles.

    Players really had to jump to avoid projectiles in Bubble Fighter.

    Second place

    Team NAE took second place ($100, five Kinect for Windows v2 sensors and carrying bags, and a Structure Sensor) for their Public Speech Trainer, which helps users improve their public speaking, including training to avoid bad posture (think crossed arms) and distracting gestures. The app also provides built-in video recording, enabling users to review all of their prior training sessions.

    Third place

    The Eh-Team grabbed third place (five Kinect for Windows v2 sensors and carrying bags) for The Fitness Clinic, an app that provides real-time feedback on a user’s form during popular gym workouts.

    We loved the great turnout, even if it meant that the hackers had just enough room on the table for all their gear.

    Other projects presented

    • 3D Object Recognition (team Rebel Without Clause), which uses Structure Sensor to scan objects and then provides object recognition
    • Enhanced Gaze-Tracking (team Gazepoint), which paired an external eye tracker with a Kinect v2 sensor to achieve enhanced gaze tracking
    • Fall Detection System* (team WelCam), an app that detects when an elderly person has fallen and sends email alerts to loved ones; the app also recognizes voice commands, which allows the fallen person to abort the email response by saying “ignore” or to emphatically solicit aid by saying “help” (a rough sketch of the height-drop detection idea appears after this list)
    • Focal Length Finder* (team Sharp Corners), which determines the focal length of a lens, based on a calibration object
    • FusedFusion*, which marries the Kinect original and the v2 sensors into a single application, fusing their data into a shared Kinect Fusion volume
    • Joker (team Joker), which uses voice commands to open an Internet browser and other applications, and provides a Pong-like game powered by audio and body sensing
    • Reverse Dance Game (team The Peeps), a motion tracking game that adjusts sound based on players’ motions and gestures
    • The Red Ball Project (team NetKitties), which lets users toss a virtual ball using voice and gesture commands
    • Trenton (team New Team), which gives users hands-free control of applications by letting them utilize gestures to scroll windows and complete other tasks
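
    Team WelCam did not publish their detection logic, but the signal a Kinect gives you for this is the height of an upper-body joint over time: a fall shows up as a rapid drop of the head or spine toward floor level that then stays low. The sketch below illustrates that heuristic with invented thresholds; the email alert and the “ignore”/“help” voice handling are reduced to a single callback.

```python
import time

class FallMonitor:
    """Flag a fall when an upper-body joint drops quickly and then stays near the floor.

    Heights are metres above the floor plane; all thresholds are illustrative.
    on_fall is called once per suspected fall; that is where a real app would
    open its voice-confirmation window and, failing a response, send the email.
    """
    def __init__(self, on_fall, low_height=0.5, drop_seconds=1.0, dwell_seconds=3.0):
        self.on_fall = on_fall
        self.low_height = low_height        # below this, the person counts as "down"
        self.drop_seconds = drop_seconds    # how quickly the drop must happen
        self.dwell_seconds = dwell_seconds  # how long they must stay down before alerting
        self.samples = []                   # recent (timestamp, height) readings
        self.down_since = None
        self.alerted = False

    def update(self, head_height, now=None):
        now = time.monotonic() if now is None else now
        self.samples.append((now, head_height))
        self.samples = [(t, h) for t, h in self.samples if now - t <= self.drop_seconds]

        was_upright = any(h > 2 * self.low_height for _, h in self.samples)
        is_low = head_height < self.low_height

        if is_low and was_upright and self.down_since is None:
            self.down_since = now           # sudden drop detected; start the dwell timer
        if not is_low:
            self.down_since, self.alerted = None, False
        elif (self.down_since is not None and not self.alerted
              and now - self.down_since >= self.dwell_seconds):
            self.alerted = True
            self.on_fall()
```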

    Team WelCam demonstrated their fall detection app.

    Thanks to our gracious hosts at BCIT, to all the attendees who came to hack and share ideas, and to our co-sponsor Occipital. I hope to see everyone again at a future event.

    Ben Lower, Developer Community Manager, Kinect for Windows

    Key links

    ____________________
    *Denotes projects awarded an Honorable Mention by the judges

     

  • Kinect for Windows Product Blog

    Re-imagining banking with Kinect for Windows

    • 0 Comments

    Someday, people might reminisce about the days when ATMs were the state of the art in offsite banking. That day might come sooner than expected, thanks in part to Kinect for Windows technology. Diebold, Incorporated, a worldwide leader in integrated service solutions, has unveiled a prototype of a standalone banking platform that promises to make self-service banking more convenient, intuitive, and secure for both the customer and the financial institution.

    Called the Responsive Banking Concept, the prototype is equipped with touch screens and sensing devices that simplify and protect offsite banking transactions. Kinect for Windows is one of the key underlying technologies, providing motion and voice sensing to help recognize customers and provide personalized service for even complex transactions. The Responsive Banking Concept prototype made its debut on November 2 at the 2014 Money 20/20 conference, a global event for financial service industries.

    Designed to bring secure, personalized financial services to places where customers work and play, the full-scale Responsive Banking Concept could be implemented in airports, shopping malls, and other high-traffic areas, while smaller modular versions could be placed in retail shops. So someday soon, a Kinect-enabled installation might be your new personal banker.

    The Kinect for Windows Team

    Key links

  • Kinect for Windows Product Blog

    Big day for Kinect developers

    • 13 Comments

    The Kinect Adapter for Windows enables you to connect a Kinect for Xbox One sensor to Windows 8.0 and 8.1 PCs and tablets.

    Today, we're extremely excited to announce some major news about Kinect:

    • The full release of the Kinect for Windows software development kit 2.0, featuring over 200 improvements and updates to the preview version of the SDK. The SDK is a free download, and there are no fees for runtime licenses of commercial applications developed with the SDK.

    • Ability to develop Kinect apps for the Windows Store. With the commercial availability of SDK 2.0, you can develop and deploy Kinect v2 apps in the Windows Store for the first time. Access to the Windows Store enables you to reach millions of potential customers for your business and consumer solutions.

    • Availability of the US$49.99 Kinect Adapter for Windows that enables you to connect a Kinect for Xbox One sensor to Windows 8.0 and 8.1 PCs and tablets. This means that developers can use their existing Kinect for Xbox One sensor to create Kinect v2 solutions, and consumers can experience Kinect v2 apps on their computer by using the Kinect for Xbox One sensor they already own. The adapter is available in over two dozen markets—rolling out later today—and will be available in a total of 41 markets in the coming weeks.

    You can find more details about these developments in Microsoft Technical Fellow Alex Kipman's post on the Official Microsoft Blog. As Alex says, "these updates are all part of our desire to make Kinect accessible and easy to use for every developer."  

    The Kinect for Windows Team

    Key links

  • Kinect for Windows Product Blog

    Hackers display their creativity at Amsterdam hackathon

    • 2 Comments

    We recently traveled to the Netherlands’ capital for our latest developer hackathon. The venue, Pakhuis de Zwijger, a former refrigerated warehouse located on one of Amsterdam’s many canals, made for a truly distinctive setting. Developers from all over Europe came for the 28-hour event, which was hosted by Dare to Difr. There were some very innovative projects, and we couldn’t have been happier with the energy of the attendees and the quality of their work.

    The participants’ energy and creativity resulted in innovative projects during the Kinect for Windows hackathon in Amsterdam (September 5–6, 2014).


    First place

    Team Hoog+Diep took the top prize (€1000 and one Kinect for Windows v2 sensor and carrying bag per team member) for their app My First Little Toy Story 3D, which allows users to capture playful adventures with favorite toys and share them as videos with friends. The app tracks the movement of dinosaurs and helicopters while the user plays with them and then “magically” makes the user disappear from the video before it is shared.

    Team Hoog+Diep took first place for their augmented play app, My First Little Toy Story 3D.


    Team AK took second prize with their retail analytics solution, Clara.

    Second place

    Team AK earned second place (€500 and one Kinect for Windows v2 sensor and carrying bag per team member) for Clara, an app that provides real-time analytics for a retail store, showing how many shoppers came through and providing insights on customer behavior and product popularity.


    Team motognosis took third place for their medical rehab and analysis solution, In exTremory.

    Third place

    Team motognosis won third place (one Kinect for Windows v2 sensor and carrying bag per team member) for their work on In exTremory, a “catch-the-shape” game for tremor analysis in clinical, rehabilitation, and home scenarios.


     Other projects presented

    Developers from all over Europe came for the 28-hour event.

    • Hero (team Hero), a tool for emergency responders that enables distant risk assessment
    • AdoptAGeek (team KiMeet), which uses natural interactions to describe a user’s soul mate and provide a picture of the perfect match
    • Connect Your Home (team Connect Your Home), a building management system with tactile feedback
    • Midi Connector* (team Connector), a virtual band experience for two people: one rocking a mean air guitar and the other on drums
    • 1999: A Space Oddysey (sic; team LUSTlab), a personal virtual reality experience that involves the physical movement of the user
    • MeTricorder (team Metrilus), an app that makes the veins in the user’s arm clearly visible via projection mapping, thereby preventing multiple needle sticks during blood draws
    • Docinector (team Metrilus), which automatically scans documents by using Kinect
    • Cool Guys Don't (team Michael Bay Fan Club), which demonstrates that cool guys don't look at explosions—really
    • Gesture Enabled Netsupport Webservice (team Netsupport), which lets users control Bing and Google maps in the browser by using gestures
    • BodyType (team Semper Five), a game in which players form letters by using the shape of their body
    • Sign Language Interpreter (team Sign Language Interpreter), which recognizes, transcribes, and teaches the gestures of sign language
    • Defy Graphics* (team Defy Graphics), a dance game in which the environment in a club responds to what’s happening on the dance floor
    • Desert race (team Fireball), an immersive full-body experience for up to six players, who each drive their horse to the fullest by vigorously exercising and making key gestures
    • Dorky Date (team ThreeManCamel), a first-person, physics-based dating simulator that plays up how awkward dating can be
    • Blijft dat zo (team Vastgoed), a physics collider and forces experience that uses joint position and particles


    Upcoming events

    Developers took advantage of the enhanced skeletal tracking capabilities of Kinect for Windows v2.

    Our next hackathon will take place in Vancouver, British Columbia, November 8–9; registration opens in October, so keep an eye on our blog.

    Thanks to all the attendees of the Amsterdam event and to our wonderful hosts at Dare to Difr. I look forward to watching the projects progress and to seeing you all again at a future event!

    Ben Lower, Developer Community Manager, Kinect for Windows

    Key links

    _________________
    *Denotes projects awarded an Honorable Mention by the judges

  • Kinect for Windows Product Blog

    Newly released: update to SDK 2.0 public preview

    • 0 Comments

    Today we released another update to the Kinect for Windows SDK 2.0 public preview. This release contains important product improvements that add up to a more stable and feature-rich product. This updated SDK lets you get serious about finalizing your applications for commercial deployment and, later this year, for availability in the Windows Store. Please install, enjoy, and let us know what you think.

    The Kinect for Windows Team

    Key links
