• Kinect for Windows Product Blog

    Kinect for Windows Solution Leads to the Perfect Fitting Jeans

    • 7 Comments

    The Bodymetrics Pod is small enough to be used in retail locations, capturing customers’ unique body shapes so they can virtually select and purchase garments.

    The writer Mark Twain once said, “We are alike, on the inside.” On the outside, however, few people are the same. While two people might be the same height and wear the same size, the way their clothing fits their bodies can vary dramatically. As a result, up to 40% of clothing purchased both online and in person is returned because of poor fit.

    Finding the perfect fit so clothing conforms to a person’s unique body shape is at the heart of the Bodymetrics Pod. Developed by Bodymetrics, a London-based pioneer in 3D body-mapping, the Bodymetrics Pod was introduced to American shoppers for the first time today during Women’s Denim Days at Bloomingdale’s in Century City, Los Angeles. This is the first time Kinect for Windows has been used commercially in the United States for body mapping in a retail clothing environment.

    Bloomingdale’s, a leader in retail innovation, has one of the largest offerings in premium denim from fashion-forward brands like J Brand, 7 For All Mankind, Citizens of Humanity, AG, and Paige. The Bodymetrics service allows customers to get their bodies mapped and find jeans that fit and flatter their unique shape from the hundreds of jeans styles that Bloomingdale’s stocks.

    During Bloomingdale’s Denim Days, March 15 – 18, customers will be able to get their body mapped, and also become a Bodymetrics member. This free service enables customers to access an online account and order jeans based on their body shape.

    The Bodymetrics scanning technology maps hundreds of measurements and contours, which can be used for choosing clothing that perfectly fits a person’s body.

    “We’re very excited about bringing Bodymetrics to US shoppers,” explains Suran Goonatilake, CEO of Bodymetrics. “Once we 3D map a customer’s body, we classify their shape into three categories: emerald, sapphire, and ruby. A Bodymetrics Stylist will then find jeans that exactly match the body shape of the customer from the jean styles that Bloomingdale’s stocks.”

    The process starts with a customer creating a Bodymetrics account. They are then directed to the Bodymetrics Pod, a secure, private space, where their body is scanned by 8 Kinect for Windows sensors arranged in a circle. Bodymetrics’ proprietary software produces a 3D map of the customer’s body, and then calculates the shape of the person, taking hundreds of measurements and contours into account. The body-mapping process takes less than 5 seconds.
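    Bodymetrics’ shape-calculation software is proprietary, but the general idea of deriving a measurement from a scanned 3D point cloud can be sketched. The following is an illustrative example only (not Bodymetrics’ algorithm): it estimates one body circumference by slicing the cloud at a given height and walking the slice’s perimeter.

```python
import math

def circumference_at(points, height, tol=0.01):
    """Estimate a body circumference from a 3D point cloud (x, y, z in
    meters) by slicing near the given height and walking the perimeter."""
    # Keep points whose height (z) lies within tol of the slice plane.
    ring = [(x, y) for x, y, z in points if abs(z - height) <= tol]
    if len(ring) < 3:
        raise ValueError("not enough points in slice")
    # Order the slice points by angle around their centroid.
    cx = sum(x for x, _ in ring) / len(ring)
    cy = sum(y for _, y in ring) / len(ring)
    ring.sort(key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
    # Sum consecutive distances, closing the loop back to the first point.
    return sum(math.dist(ring[i], ring[(i + 1) % len(ring)])
               for i in range(len(ring)))
```

    Run against a synthetic cylinder of radius 0.15 m, the estimate comes out very close to the true circumference of about 0.94 m; a real scan would of course be noisier and require filtering.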

    Helping women shop for best-fitting jeans in department stores is just the start of what Bodymetrics envisions for their body-mapping technologies. The company is working on a solution that can be used at home. Individuals will be able to scan their body, and then go online to select, virtually try on, and purchase clothing that matches their body shape.

    Goonatilake explains, “Body-mapping is in its infancy. We’re just starting to explore what’s possible in retail stores and at home. Stores are increasingly looking to provide experiences that entice shoppers into their stores, and then allow a seamless journey from stores to online. And we all want shopping experiences that are personalized to us – our size, shape and style.”

    Even though people may not be identical on the outside, we desire clothing that fits well and complements our body shapes. The Kinect for Windows-enabled Bodymetrics Pod offers a retail-ready solution that makes the perfect fit beautifully simple.

    Kinect for Windows Team

  • Kinect for Windows Product Blog

    Microsoft’s Kinect Accelerator Begins Today

    • 2 Comments

    I am pleased to announce that the finalists for our Kinect Accelerator have arrived in ever-sunny Seattle and today are launching into a three-month program to build new products and businesses using Kinect. I can’t wait to see what they come up with – using Kinect, these teams have the ability to reimagine the way products are used, and perhaps even revolutionize entire industries along the way.

    Kinect Accelerator is powered by TechStars, in close collaboration with the Microsoft BizSpark program; my team and I have been working closely with the BizSpark team and others in the Interactive Entertainment Business to help develop and bring this program to life. The response to the Kinect Accelerator has been phenomenal and we expect to see remarkable innovation coming out of the program.

    Craig Eisler and other executives from Microsoft and TechStars met in February to review program applications.

    We were hoping to receive 100 to 150 applications, with a goal of selecting the best ten. But the worldwide entrepreneurial community completely surprised us by submitting almost five hundred applications with concepts spanning nearly 20 different industries, including healthcare, education, retail, entertainment, and more.

    There were so many clever and innovative ideas and so many great teams that it was extremely challenging to narrow things down – we spent many, many hours in a rigorous and highly energetic review process. We finally landed on 11 finalists from five countries, chosen based on their experience, qualifications, and the potential benefit that could result from their participation in the Kinect Accelerator. The finalists are:

    • Freak'n Genius – Seattle, WA
    • GestSure Technologies – Toronto, Canada
    • IKKOS – Seattle, WA
    • Kimetric – Buenos Aires, Argentina
    • Jintronix Inc.  – Montreal, Canada
    • Manctl – Lyon, France
    • NConnex – Hadley, MA
    • Styku - Los Angeles, CA
    • übi interactive – Munich, Germany
    • VOXON – New York, NY
    • Zebcare – Boston, MA

    The Kinect Accelerator will be held in Microsoft’s state-of-the-art facility in Seattle’s vibrant South Lake Union neighborhood.

    Each team will be mentored by entrepreneurs and venture capitalists, as well as leaders from Kinect for Windows, Xbox, Microsoft Studios, Microsoft Research, and other Microsoft organizations. The teams will spend the first several weeks ideating and refining their business concepts with input and advice from their mentors, followed by several weeks of design and development. They will present their results at an event at the end of June.

    We were so amazed by the quality, caliber, and uniqueness of the applications and teams that we decided to reward the top 100 applicants that didn’t make it into the program with a complimentary Kinect for Windows sensor. I believe we are going to see great things from many of the folks who applied to the program, and we wish them all the best.

    We will share more information about the Kinect Accelerator teams and their applications on this blog in coming months. And for more information on the Kinect Accelerator program in general, go to KinectAccelerator.com.

    Craig Eisler
    General Manager, Kinect for Windows

  • Kinect for Windows Product Blog

    Nissan Pathfinder Virtual Showroom is Latest Auto Industry Tool Powered by Kinect for Windows

    • 0 Comments

    Automotive companies Audi, Ford, and Nissan are adopting Kinect for Windows as the newest way to put a potential driver into a vehicle. Most car buyers want to get "hands on" with a car before they are ready to buy, so automobile manufacturers have invested in tools such as online car configurators and 360-degree image viewers that make it easier for customers to visualize the vehicle they want.

    Now, Kinect's unique combination of camera, body tracking capability, and audio input can put the car buyer into the driver's seat in more immersive ways than have been previously possible—even before the vehicle is available on the retail lot!

    The most recent example of this automotive trend is the 2013 Nissan Pathfinder application powered by Kinect for Windows, which was originally developed to demonstrate the new Pathfinder at auto shows before there was a physical car available.

    Nissan quickly recognized the value of this application for building buzz at local dealerships, piloting it in 16 dealerships in 13 states nationwide.

    "The Pathfinder application using Kinect for Windows is a game changer in terms of the way we can engage with consumers," said John Brancheau, vice president of marketing at Nissan North America. "We're taking our marketing to the next level, creating experiences that enhance the act of discovery and generate excitement about new models before they're even available. It's a powerful pre-sales tool that has the potential to revolutionize the dealer experience."

    Digital marketing agency Critical Mass teamed with interactive experience developer IdentityMine to design and build the Kinect-enabled Pathfinder application for Nissan. "We're pioneering experiences like this one for two reasons: the ability to respond to natural human gestures and voice input creates a rich experience that has broad consumer appeal," notes Critical Mass President Chris Gokiert. "Additionally, the commercial relevance of an application like this can fulfill a critical role in fueling leads and actually helping to drive sales on site."

    Each dealer has a kiosk that includes a Kinect for Windows sensor, a monitor, and a computer that’s running the Pathfinder application built with the Kinect for Windows SDK. Since the Nissan Pathfinder application first debuted at the Chicago Auto Show in February 2012, developers have made several enhancements, including a new pop-up tutorial and interface improvements, such as larger interaction icons and instructional text along the bottom of the screen, so that a customer with no Kinect experience can jump right in. "In the original design for the auto show, the application was controlled by a trained spokesperson. That meant aspects like discoverability and ease-of-use for first-time users were things we didn’t need to design for," noted IdentityMine Research Director Evan Lang.

    Now, shoppers who approach the Kinect-based showroom are guided through an array of natural movements—such as extending their hands, stepping forward and back, and leaning from side to side—to activate hotspots on the Pathfinder model, allowing them to inspect the car inside and out.

    The project was not, however, without a few challenges. The detailed Computer-Aided Design (CAD) model data provided by Nissan, while ideal for commercials and other post-rendered uses, did not lend itself easily to a real-time engine. "A lot of rework was necessary that involved 'retopologizing' the mesh," reported IdentityMine’s 3D Design Lead Howard Schargel. "We used the original as a template and traced over it to get a cleaner, more manageable polygon count. We were able to remove more than half of the original polygons, allowing for more fluid interactions and animations while still retaining the fidelity of the client's original model."

    And then, the development team pushed further. "The application uses a dedicated texture to provide a dynamic, scalable level of detail to the mesh by adding or removing polygons, depending on how close it is to the camera,” explained Schargel. “It may sound like mumbo jumbo—but when you see it, you won't believe it."

    You can see the Nissan Pathfinder app in action at one of the 16 participating dealerships or by watching our video case study.

    Kinect for Windows Team


  • Kinect for Windows Product Blog

    Kinect for Windows: Developer Toolkit Update (v1.5.1)

    • 0 Comments

    Back in May, we released the Kinect for Windows SDK/Runtime v1.5 in a modular manner, to make it easier to refresh parts of the Developer Toolkit (tools, components, and samples) without the need to update the SDK (driver, runtime, and basic compilation support).

    Today, we have realized that vision with the Developer Toolkit update v1.5.1. This update boosts Kinect Studio performance and stability, improves face tracking, and introduces offline documentation support. If you have already installed the SDK, simply download the new v1.5.1 Developer Toolkit Update. If you are new to Kinect for Windows, you will want to download both Kinect for Windows SDK v1.5 and Developer Toolkit v1.5.1.


    Rob Relyea
    Program Manager, Kinect for Windows

  • Kinect for Windows Product Blog

    Kinect for Windows expands its developer preview program

    • 0 Comments

    Last June, we announced that we would be hosting a limited, exclusive developer preview program for Kinect for Windows v2 prior to its general availability in the summer (northern hemisphere) of 2014. And a few weeks ago, we began shipping Kinect for Windows v2 Developer Preview kits to thousands of participants all over the world.

    It’s been exciting to hear from so many developers as they take their maiden voyage with Microsoft’s new generation NUI technology. We’ve seen early unboxing videos that were recorded all over the world, from London to Tokyo. We’ve heard about some promising early experiments that are taking advantage of the higher resolution data and the ability to see six people.  People have told us about early success with the new sensor’s ability to track the tips of hands and thumbs.  And some developers have even described how easy it’s been to port their v1 apps to the new APIs.

    Kinect for Windows v2 Developer Preview kit (Photo courtesy of Vladimir Kolesnikov [@vladkol], a developer preview program participant)

    But we’ve also heard from many people who were not able to secure a place in the program and are eager to get their hands on the Kinect for Windows v2 sensor and SDK as soon as possible. For everyone who has been hoping and waiting, we’re pleased to announce that we are expanding the program so that more of you can participate!

    We are creating 500 additional developer preview kits for people who have great ideas they want to bring to life with the Kinect for Windows sensor and SDK. Like before, the program is open to professional developers, students, researchers, artists, and other creative individuals.

    The program fee is US$399 (or local equivalent) and offers the following benefits:

    • Direct access to the Kinect for Windows engineering team via a private forum and exclusive webcasts
    • Early SDK access (alpha, beta, and any updates along the way to release)
    • Private access to all API and sample documentation
    • A pre-release version of the new generation sensor
    • A final, released sensor at launch next summer (northern hemisphere)

    Applications must be completed and submitted by January 31, 2014, at 9:00 A.M. (Pacific Time), but don’t wait until then to apply! We will award positions in the program on a rolling basis to qualified applicants. Once all 500 kits have been awarded, the application process will be closed.

    Learn more and apply now

    The Kinect for Windows Team


  • Kinect for Windows Product Blog

    Kinect for Windows Helps Girls Everywhere Dress Like Barbie

    • 2 Comments

    I grew up in the UK, and my female cousins all had Barbie. In fact, Barbies – they had lots of Barbie dolls and a ton of accessories that they were obsessed with. I was more of a BMX kind of kid and thought my days of Barbie education were long behind me, but with a young daughter I’m beginning to realize that I have plenty more Barbie ahead of me, littered around the house like landmines. This time around, though, I’m genuinely interested, thanks to a Kinect-enabled application.

    The outfits from Barbie The Dream Closet not only scale to fit users, but enable them to turn sideways to see how they look from various angles.

    This week, Barbie lovers in Sydney, Australia, are being given the chance to do more than fantasize about how they’d look in their favorite Barbie outfit. Thanks to Mattel, Gun Communications, Adapptor, and Kinect for Windows, Barbie The Dream Closet is here.

    The application invites users to take a walk down memory lane and select from 50 years of Barbie fashions. Standing in front of Barbie’s life-sized augmented reality “mirror,” fans can choose from several outfits in her digital wardrobe—virtually trying them on for size.

    The solution, built with the Kinect for Windows SDK and using the Kinect for Windows sensor, tracks users’ movements and gestures, enabling them to easily browse through the closet and select outfits that strike their fancy. Once an outfit is selected, the Kinect for Windows skeletal tracking determines the position and orientation of the user. The application then rescales Barbie’s clothes, rendering them over the user in real time for a custom fit.
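    As a rough sketch of the kind of computation involved (my own illustration; the application’s actual logic isn’t public), a garment overlay can be scaled and positioned from two tracked shoulder joints reported by skeletal tracking:

```python
import math

# Hypothetical garment metadata: the shoulder span, in pixels, that the
# garment artwork was authored for (not a value from the real application).
GARMENT_REF_SPAN = 120.0

def fit_garment(left_shoulder, right_shoulder, ref_span=GARMENT_REF_SPAN):
    """Return (scale, anchor) for rendering a garment over a tracked user:
    scale resizes the artwork so its shoulders match the user's, and
    anchor is the midpoint between the user's shoulders (screen pixels)."""
    span = math.dist(left_shoulder, right_shoulder)
    scale = span / ref_span
    anchor = ((left_shoulder[0] + right_shoulder[0]) / 2,
              (left_shoulder[1] + right_shoulder[1]) / 2)
    return scale, anchor
```

    For example, shoulders tracked at (100, 200) and (340, 200) are 240 pixels apart, so artwork authored for a 120-pixel span would be drawn at twice its size, centered at (220, 200).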

    One of the most interesting aspects of this solution is the technology’s ability to scale: menus, navigation controls, and clothing all dynamically adapt so that everyone from a little girl to a grown woman (and, cough, yes, even a committed father) can enjoy the experience. To make this possible, each outfit was photographed on a Barbie doll, cut into multiple parts, and then built individually in the application.

    Of course, the experience wouldn’t be complete without the ability to memorialize it. A photo is taken and, with approval/consent from those photographed, is uploaded and displayed in a gallery on the Barbie Australian Facebook page. (Grandparents can join in the fun from afar!)

    Barbie enthusiasts of all ages can enjoy trying on and posing in outfits.

    I spoke with Sarah Sproule, Director at Gun Communications, about the genesis of the idea. She told me, "We started working on Barbie The Dream Closet six months ago, working with our development partner Adapptor. Everyone has been impressed by the flexibility and innovation Microsoft has poured into Kinect for Windows. Kinect technology has provided Barbie with a rich and exciting initiative that's proving to delight fans of all ages. We're thrilled with the result, as is Mattel - our client."

    Barbie’s Dream Closet was opened to the public at the Westfield Parramatta in Sydney today and will be there through April 15. On its first day, it drew enthusiastic crowds, with around 100 people experiencing Barbie The Dream Closet, and it's expected to draw even larger crowds over the holidays. It’s set to visit Melbourne and Brisbane later this year.

    Meanwhile, the Kinect for Windows team is just as excited about it as my daughter:

    “The first time I saw Barbie’s Dream Closet in action, I knew it would strike a chord,” notes Kinect for Windows Communications Manager, Heather Mitchell. “It’s such a playful, creative use of the technology. I remember fantasizing about wearing Barbie’s clothes when I was a little girl. Disco Ken was a huge hit in my household back then…Who didn’t want to match his dance moves with their own life-sized Barbie disco dress? I think tens of thousands of grown girls have been waiting for this experience for years…Feels like a natural.”

    That’s the beauty of Kinect – it enables amazingly natural interactions with technology and hundreds of companies are out there building amazing things; we can’t wait to see what they continue to invent.

    Steve Clayton
    Editor, Next at Microsoft

  • Kinect for Windows Product Blog

    Inside the Newest Kinect for Windows SDK—Infrared Control

    • 0 Comments

    The Kinect for Windows software development kit (SDK) October release was a pivotal update with a number of key improvements. One important update in this release is how control of infrared (IR) sensing capabilities has been enhanced to create a world of new possibilities for developers.

    IR sensing is a core feature of the Kinect sensor, but until this newest release, developers were somewhat restrained in how they could use it. The front of the Kinect for Windows sensor has three openings, each housing a core piece of technology. On the left, there is an IR emitter, which transmits a factory calibrated pattern of dots across the room in which the sensor resides. The middle opening is a color camera. The third is the IR camera, which reads the dot pattern and can help the Kinect for Windows system software sense objects and people along with their skeletal tracking data.

    One key improvement in the SDK is the ability to control the IR emitter with a new API, KinectSensor.ForceInfraredEmitterOff. How is this useful? Previously, the sensor's IR emitter was always active when the sensor was active, which can cause depth detection degradation if multiple sensors are observing the same space. The original focus of the SDK had been on single sensor use, but as soon as innovative multi-sensor solutions began emerging, it became a high priority to enable developers to control the IR emitter. “We have been listening closely to the developer community, and expanded IR functionality has been an important request,” notes Adam Smith, Kinect for Windows principal engineering lead. “This opens up a lot of possibilities for Kinect for Windows solutions, and we plan to continue to build on this for future releases.”

    Another useful application is expanded night vision with an external IR lamp (wavelength: 827 nanometers). “You can turn off the IR emitter for pure night vision ("clean IR"),” explains Smith, “or you can leave the emitter on as an illumination source and continue to deliver full skeletal tracking. You could even combine these modes into a dual-mode application, toggling between clean IR and skeletal tracking on demand, depending on the situation. This unlocks a wide range of possibilities—from security and monitoring applications to motion detection, including full gesture control in a dark environment.”

    Finally, developers can use the latest version of the SDK to pair the IR capabilities of the Kinect for Windows sensor with a higher definition color camera for enhanced green screen capabilities. This will enable them to go beyond the default 640x480 color camera resolution without sacrificing frame rate. “To do this, you calibrate your own color camera with the depth sensor by using a tool like OpenCV, and then use the Kinect sensor in concert with additional external cameras or, indeed, additional Kinect sensors,” notes Smith. “The possibilities here are pretty remarkable: you could build a green screen movie studio with full motion tracking and create software that transforms professional actors—or even, say, visitors to a theme park—into nearly anything that you could imagine."

    Kinect for Windows team


  • Kinect for Windows Product Blog

    Windows Store app development is coming to Kinect for Windows

    • 9 Comments

    Today at Microsoft BUILD 2014, Microsoft made it official: the Kinect for Windows v2 sensor and SDK are coming this summer (northern hemisphere). With it, developers will be able to start creating Windows Store apps with Kinect for the first time. The ability to build such apps has been a frequent request from the developer community. We are delighted that it’s now on the immediate horizon—with the ability for developers to start developing this summer and to commercially deploy their solutions and make their apps available to Windows Store customers later this summer.

    The ability to create Windows Store apps with Kinect for Windows not only fulfills a dream of our developer community, it also marks an important step forward in Microsoft’s vision of providing a unified development platform across Windows devices, from phones to tablets to laptops and beyond. Moreover, access to the Windows Store opens a whole new marketplace for business and consumer experiences created with Kinect for Windows.

    The Kinect for Windows v2 has been re-engineered with major enhancements in color fidelity, video definition, field of view, depth perception, and skeletal tracking. In other words, the v2 sensor offers greater overall precision, improved responsiveness, and intuitive capabilities that will accelerate your development of voice and gesture experiences.

    Specifically, the Kinect for Windows v2 includes 1080p HD video, which allows for crisp, high-quality augmented scenarios; a wider field of view, which means that users can stand closer to the sensor—making it possible to use the sensor in smaller rooms; improved skeletal tracking, which opens up even better scenarios for health and fitness apps and educational solutions; and new active infrared detection, which provides better facial tracking and gesture detection, even in low-light situations.

    The Kinect for Windows v2 SDK brings the sensor’s new capabilities to life:

    • Windows Store app development: Being able to integrate the latest human computing technology into Windows apps and publish those to the Windows Store will give our developers the ability to reach more customers and open up access to natural user experiences in the home.
    • Unity Support: We are committed to supporting the broader developer community with a mix of languages, frameworks, and protocols. With support for Unity this summer, more developers will be able to build and publish their apps to the Windows Store by using tools they already know.
    • Improved anatomical accuracy: With the first-generation SDK, developers were able to track up to two people simultaneously; now, their apps can track up to six. The number of joints that can be tracked has increased from 20 to 25 joints per person, and joint orientation has also improved. The result is skeletal tracking that’s greatly enhanced overall, making it possible for developers to deliver new and improved applications with skeletal tracking, which our preview participants are calling “seamless.”
    • Simultaneous, multi-app support: Multiple Kinect-enabled applications can run simultaneously. Our community has frequently requested this feature and we’re excited to be able to give it to them with the upcoming release.

    Developers who have been part of the Kinect for Windows v2 Developer Preview program praise the new sensor’s capabilities, which take natural, human computing to the next level. We are in awe and humbled by what they’ve already been able to create.

    Technologists from a few participating companies are on hand at BUILD, showing off the apps they have created by using the Kinect for Windows v2. See what two of them, Freak’n Genius and Reflexion Health, have already been able to achieve, and learn more about these companies.

    The v2 sensor and SDK dramatically enhance the world of gesture and voice control that were pioneered in the original Kinect for Windows, opening up new ways for developers to create applications that transform how businesses and consumers interact with computers. If you’re using the original Kinect for Windows to develop natural voice- and gesture-based solutions, you know how intuitive and powerful this interaction paradigm can be. And if you haven’t yet explored the possibilities of building natural applications, what are you waiting for? Join us as we continue to make technology easier to use and more intuitive for everyone.

    The Kinect for Windows Team


  • Kinect for Windows Product Blog

    Kinect for Windows at Convergence of Style and Technology for New York Fashion Week

    • 5 Comments

    Kinect for Windows powers a new technology that virtually models the hottest styles in Bloomingdale’s during Fashion Week.

    This year, Kinect for Windows gives Fashion Week in New York a high-tech boost by offering a new way to model the latest styles at retail. Swivel, a virtual dressing room that is featured at Bloomingdale's, helps you quickly see what clothes look like on you—without the drudgery of trying on multiple garments in the changing room.

    Twenty Bloomingdale's stores across the United States are featuring Swivel this week— including outlets in Atlanta, Chicago, Miami, Los Angeles, and San Francisco. This Kinect for Windows application was developed by FaceCake Marketing Technologies, Inc.

    Also featured at Bloomingdale's during Fashion Week is a virtual version of a Microsoft Research project called The Printing Dress. This remarkable melding of fashion and technology is on display at Bloomingdale's 59th Street location in New York. The Printing Dress enables the wearer of the virtual dress to display messages via a projector inside the dress by typing on keys that are inlaid on the bodice. Normally, you wouldn't be able to try on such a fragile runway garment, but the Kinect-enabled technology makes it possible to see how haute couture looks on you.

    Bloomingdale's has made early and ongoing investments in deploying Kinect for Windows gesture-based experiences at retail stores: they featured another Kinect for Windows solution last March at their Century City store in Los Angeles, just six weeks after the launch of the technology. That solution by Bodymetrics uses shoppers’ body measurements to help them find the best fitting jeans. The Bodymetrics body mapping technology is currently being used at the Bloomingdale’s store in Palo Alto, California.

    "Merging fashion with technology is not just a current trend, but the wave of the future," said Bloomingdale's Senior Vice President of Marketing Frank Berman. "We recognize the melding of the two here at Bloomingdale's, and value our partnership with companies like Microsoft to bring exciting animation to our stores and website to enhance the experience for our shoppers."

    Here's how Swivel works: the Kinect for Windows sensor detects your body and displays an image of you on the screen. Kinect provides both the customer's skeleton frame and 3-D depth data to the Swivel sizing and product display applications. Wave your hand to select a new outfit, and it is nearly instantly fitted to your form. Next, you can turn around and view the clothing from different angles. Finally, you can snap a picture of you dressed in your favorite ensemble and—by using a secure tablet—share it with friends over social networks.
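    Gesture recognition like the "wave your hand" selection can be built on the skeleton stream by watching a hand joint over time. A minimal sketch (one plausible approach, not FaceCake's actual implementation) counts direction reversals of the hand's x-coordinate, ignoring movements too small to be deliberate:

```python
def is_wave(hand_x_positions, min_reversals=3, min_travel=0.1):
    """Detect a wave gesture from a sequence of hand x-coordinates
    (meters): count direction reversals whose travel exceeds a noise
    threshold, so jitter from a steady hand is not mistaken for a wave."""
    reversals = 0
    direction = 0
    last_extreme = hand_x_positions[0]
    for x in hand_x_positions[1:]:
        delta = x - last_extreme
        if abs(delta) < min_travel:
            continue                      # below noise threshold; ignore
        new_dir = 1 if delta > 0 else -1
        if direction and new_dir != direction:
            reversals += 1                # hand changed direction
        direction = new_dir
        last_extreme = x
    return reversals >= min_reversals
```

    A back-and-forth trace such as [0, 0.2, 0, 0.2, 0] registers as a wave, while slow drift or a hand held still does not.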

    The Printing Dress, a remarkable melding of fashion and technology, on display at Bloomingdale's in New York.

    Since Bloomingdale’s piloted the Swivel application last May, FaceCake has enhanced detection and identification so that the camera tracks the shopper (instead of forcing the shopper to move to accommodate the camera) and improved detection of different-sized people so that it can display more accurately how a garment would look when fitted to the customer.

    Swivel and Bodymetrics are only two examples of Kinect for Windows unleashing new experiences in fashion and retail. Others include:

    • One of the participants in the recent Microsoft Accelerator for Kinect program, Styku, LLC, has also developed virtual fitting room software and body scanner technology powered by Kinect for Windows. 
    • Mattel brought to life Barbie: The Dream Closet that makes it possible for anyone to try on clothes from 50 years of Barbie's wardrobe. 
    • Kimetric, another Kinect Accelerator participant, uses Kinect for Windows sensors strategically placed throughout a store to gather useful data, helping a retailer to better understand consumer behavior.

    With this recent wave of retail experiences powered by Kinect for Windows, we are starting to get a glimpse into the ways technology innovators and retailers will reimagine and transform the way we shop with new Kinect-enabled tools.

    Kinect for Windows Team


    Build-A-Bear Selects Kinect for Windows for "Store of the Future"

    • 0 Comments

    Build-A-Bear Workshop stores have been delivering custom-tailored experiences to children for 15 years in the form of make-your-own stuffed animals, but the company recently recognized that its target audience was gravitating toward digital devices. So it has begun advancing its in-store experiences to match the preferences of its core customers by incorporating digital screens throughout the stores—from the entrance to the stations where the magic of creating new fluffy friends happens.

    A key part of Build-A-Bear's digital shift is their interactive storefront that's powered by Kinect for Windows. It enables shoppers to play digital games on either a screen adjacent to the store entrance or directly through the storefront window simply by using their bodies and natural gestures to control the game.

    Children pop virtual balloons in a Kinect for Windows-enabled game at this Build-A-Bear store's front window.

    "We're half retail, half theme park," said Build-A-Bear Director of Digital Ventures Brandon Elliott. The Kinect for Windows platform instantly appealed to Build-A-Bear as "a great enabler for personalized interactivity."

    The Kinect for Windows application, launched at six pilot stores, uses skeletal tracking to enable two players (four hands) to pop virtual balloons (up to five balloons simultaneously) by waving their hands or by touching the screen directly. While an increasing number of retail stores use digital signage these days, Elliott noted: "What they're not doing is building a platform for interactive use."
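    The balloon-popping mechanic described above reduces to hit-testing tracked hand positions against on-screen balloons each frame. A minimal sketch of that logic (hypothetical Python, not the .NET code the stores actually run):

    ```python
    # Hypothetical hit-test for a balloon-popping game: any tracked hand
    # (up to four, for two players) within a balloon's radius pops it.

    def pop_balloons(hands, balloons, radius=30):
        """hands: list of (x, y) hand-joint positions in screen coordinates.
        balloons: list of (x, y) balloon centers.
        Returns the balloons that survive this frame."""
        def hit(balloon):
            bx, by = balloon
            return any((hx - bx) ** 2 + (hy - by) ** 2 <= radius ** 2
                       for hx, hy in hands)
        return [b for b in balloons if not hit(b)]

    hands = [(100, 100), (400, 250)]
    balloons = [(110, 105), (200, 200), (395, 260)]
    print(pop_balloons(hands, balloons))  # [(200, 200)]
    ```

    In the real application the hand positions would come from the sensor's skeletal-tracking stream (or from the touch layer on the glass), but the per-frame collision check is this simple at its core.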

    "We wanted something that we could build on, that's a platform for ever-improving experiences," Elliott said. "With Kinect for Windows, there’s no learning curve. People can interact naturally with technology by simply speaking and gesturing the way they do when communicating with other people. The Kinect for Windows sensor sees and hears them."

    "Right now, we're just using the skeletal tracking, but we could use voice recognition components or transform the kids into on-screen avatars," he added. "The possibilities are endless."
     
    Part of Build-A-Bear's vision is to create Kinect for Windows apps that tie into the seasonal marketing themes that permeate the stores. Elliott said that Build-A-Bear selected the combination of the Microsoft .NET Framework, Kinect for Windows SDK, and Kinect for Windows sensor specifically so that they can take advantage of existing developer platforms to build these new apps quickly.

    “We appreciate that the Kinect for Windows SDK is developing so rapidly. We appreciate the investment Microsoft is making to continue to open up features within the Kinect for Windows sensor to us,” Elliott said. "The combination of Kinect for Windows hardware and software unlocks a world of new UI possibilities for us."

    Microsoft developer and technical architect Todd Van Nurden and others at the Minneapolis-based Microsoft Technology Center helped Build-A-Bear with an early consultation that led to prototyping apps for the project.

    "The main focus of my consult was to look for areas beyond simple screen-tied interactions to create experiences where Kinect for Windows activates the environment. Screen-based interactions are, of course, the easiest but less magical than environmental ones," Van Nurden said. "We were going for magical."

    The first six Build-A-Bear interactive stores launched in October and November 2012 in St. Louis, Missouri; Pleasanton, California; Annapolis, Maryland; Troy, Michigan; Fairfax, Virginia; and Indianapolis, Indiana. Four of the stores have gesture-enhanced interactive signs at the entrance, while two had to be placed behind windows to comply with mall rules. Kinect for Windows can work through glass with the assistance of a capacitive sensor that enables the window to work as a touch screen, and an inductive driver that turns the glass into a speaker.

    So far, Build-A-Bear has been thrilled with what Elliott calls "fantastic" results. "Kids get it," he said. "We have a list of apps we want to build over the next couple of years. We can literally write an app for one computer in the store, and put it anywhere."

    Kinect for Windows team
