As you might imagine, working in a nuclear power plant provides special challenges. One crucial aspect for any project is the need to minimize employee exposure to radiation by applying a standard known as As Low As Reasonably Achievable—ALARA for short.
How this works: Plant ALARA managers work with the maintenance groups to estimate how much time is required to perform a task and, allowing for exposure limits, they determine how many employees may be needed to safely complete it. Today, that work is typically done with pen and paper. But new tools from Siemens PLM Software that incorporate the Kinect for Windows sensor could change this by providing a 3-D virtual interactive modeling environment.
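The staffing arithmetic behind this kind of dose planning can be sketched in a few lines. The following Python sketch uses made-up numbers and is an illustration of the general idea only, not Siemens' actual algorithm: total dose is the task duration times the dose rate in the work area, and the crew must be large enough that each worker's share stays under an individual limit.

```python
import math

def workers_needed(task_hours, dose_rate_mrem_per_hr, per_worker_limit_mrem):
    """Minimum crew size so that, with the task time split evenly,
    no individual worker exceeds their dose limit.

    Illustrative only: real ALARA planning also accounts for shielding,
    distance, spatially varying dose rates, and administrative margins.
    """
    total_dose = task_hours * dose_rate_mrem_per_hr
    return max(1, math.ceil(total_dose / per_worker_limit_mrem))
```

For example, a 10-hour task in a 50 mrem/hr field, measured against a hypothetical 200 mrem per-task limit, yields 500 mrem of total dose and therefore a minimum crew of three.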
Kinect for Windows is used to capture realistic movement for use in the Siemens Teamcenter solution for ALARA radiation planning.
The solution, piloted at a US nuclear power plant last year, is built on Siemens’ Teamcenter software, using its Tecnomatix Process Simulate productivity product. Siemens PLM Software Tecnomatix provides virtual 3-D human avatars—“Jack” and “Jill”—that model motion-controlled actions input with a Kinect for Windows sensor. This solution is helping to usher in a new era of industrial planning applications for employee health and safety in the nuclear industry.
"We're really revolutionizing the industry," said Erica Simmons, global marketing manager for Energy, Oil, and Gas Industries at Siemens PLM Software. "For us, this was a new way to develop a product in tandem with the industry associations. We created a specific use case with off-the-shelf technology and tested and validated it with industry. What we have now is a new visual and interactive way of simulating potential radiation exposure which can lead to better health and safety strategies for the plant."
Traditional pencil-and-paper planning (left) compared to the Siemens PLM Software Process Simulate on Teamcenter solution (right) with “Jack” avatar and Kinect for Windows movement input.
The Siemens Tecnomatix process planning application, integrated with the Kinect for Windows system, will give nuclear plant management the ability to better manage individual worker radiation exposure and optimize steps to reduce overall team exposure. As a bonus, once tasks have been recorded by using “Jack,” the software can be used for training. Employees can learn and practice an optimized task by using Kinect for Windows and Siemens “Jack” and “Jill”—safely outside of the radiation zone—until they have mastered it and are ready to perform the actual work.
"We wanted to add something more for the user of this solution in addition to our 3-D human avatars and the hazard zones created by our visual cartography; this led us to exploring what we could do with the Kinect for Windows SDK for this use case," said Dr. Ulrich Raschke, director of Human Simulation Technologies at Siemens PLM Software. “User feedback has been good so far; the addition of the Kinect for Windows system adds another level of interactivity to our application."
This Siemens solution grew out of a collaborative effort with the Electric Power Research Institute (EPRI) and the Fiatech industry association, which identified the need for more technologically advanced estimation tools for worker radiation dosage. Kinect for Windows was incorporated when the developers were tailoring the avatar system to the solution and exploring ways to make the user experience much more interactive.
"Collaboration with several key stakeholders and industry experts led to this innovative solution," said Phung Tran, senior project manager at EPRI. "We're pleased the industry software providers are using it, and look forward to seeing the industry utilize these new tools."
“In fact,” Tran added, “the tool is not necessarily limited to radiation work planning. It could help improve the management and execution of many operation, maintenance, and project-based tasks.”
Kinect for Windows team
Much like Build-A-Bear Workshop, Mattel has been watching the trends and finding that children are embracing digital media. How can the company keep a toy like the Barbie doll, launched in 1959, relevant in a world where tablet computers and smartphones dominate kids' wishlists?
Once again, Kinect for Windows has proved a formidable ally in bridging the gap between digital entertainment and traditional toys. In a six-month project for Mattel, Gun Communications and creative applications developer Adapptor built Barbie: the Dream Closet, which lets enthusiasts of all ages across Australia virtually try on a variety of Barbie outfits from different decades by using a Kinect for Windows-enabled "magic mirror." Have you ever wondered what you’d look like in one of Barbie's ball gowns, or as an astronaut, or a race car driver? With the Dream Closet, it's possible. You can also save and share photos over social media, or even take a photo home.
To build the application, each outfit was photographed on a Barbie doll, trimmed into its component parts, and then reconstructed dynamically on Barbie fans by the custom Dream Closet application, built in Microsoft XNA. The Kinect for Windows sensor and software development kit (SDK) make it easy to accurately determine the size of the user so the virtual clothes and selection menus can be fitted to match.
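Fitting photographed artwork to a tracked user comes down to a scale factor derived from skeleton joints. A minimal Python sketch of the idea, assuming shoulder joints reported as (x, y, z) positions in meters and a hypothetical reference width the artwork was authored for (this is an illustration, not Adapptor's code):

```python
import math

def joint_distance(a, b):
    """Euclidean distance between two skeleton joints given as (x, y, z) tuples."""
    return math.dist(a, b)

def outfit_scale(left_shoulder, right_shoulder, reference_width=0.35):
    """Scale factor for outfit artwork, relative to the (hypothetical)
    shoulder width in meters that the artwork was authored for."""
    return joint_distance(left_shoulder, right_shoulder) / reference_width
```

With live skeleton data, the same factor would be applied to both the outfit sprites and the on-screen selection menus so they track the user's size.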
"If we had had to write code from the ground up [versus using code provided in the SDK], it would have taken much longer, and the end result wouldn’t have been nearly as impactful," said Adapptor Managing Director Mark Loveridge. "The Kinect for Windows SDK doubled our development speed."
The result of Barbie: the Dream Closet? Increased customer brand loyalty and media coverage yielding more than 25 million impressions, a new case study reports.
"The impact of Kinect for Windows on the public and the Barbie brand is incredible," notes Mattel Marketing Director Amanda Allegos. "Kinect for Windows has given us a new way to reach existing Barbie fans and attract new ones in a way that's contemporary, interactive, and bridges both the digital and physical worlds."
A unique clinic for treating children with cancer and blood disorders, alex’s place is designed to be a warm, open, communal space. The center—which is located in Miami, Florida—helps put its patients at ease by engaging them with interactive screens that allow them to be transported into different environments—where they become a friendly teddy bear, frog, or robot and control their character’s movements in real time.
"As soon as they walk in, technology is embracing them," said Dr. Julio Barredo, chief of pediatric services at alex's place in The Sylvester Comprehensive Cancer Center, University of Miami Health Systems.
The clinic—which opened its doors in May 2012—was conceived of and designed with this in mind, and the Kinect for Windows digital experience was part of the vision from day one. Created by Snibbe Interactive, Character Mirror was designed to fit naturally within this innovative, unconventional treatment environment. The goal is to help reinforce patients' mind-body connection with engaging play and entertainment, as well as to potentially reduce their fear of technology and the treatments they face. As an added benefit, nurses can observe a child's natural range of movement during play and more easily draw out answers to key diagnostic questions.
"I find the gestural interactive experiences we created for alex's place in Miami among the most worthwhile and satisfying in our history," said Scott Snibbe, founder and CEO of Snibbe Interactive. "Kids in hospitals are feeling lonely, scared, and bored, not to mention sick. Partnering with Alex Daly and Dr. Barredo, we created a set of magical experiences that encourage healthy, social, and physical activity among the kids.
"Kids found these experiences so pleasing that they actually didn't want to leave after their treatments were complete," Snibbe added. "We are very excited to roll out these solutions to more hospitals, and transform healthcare through natural user interfaces that promote social play and spontaneous physical therapy."
The October release of the Kinect for Windows software development kit (SDK) was a pivotal update with a number of key improvements. Among the most important is enhanced control of the sensor's infrared (IR) capabilities, which opens a world of new possibilities for developers.
IR sensing is a core feature of the Kinect sensor, but until this newest release, developers were somewhat constrained in how they could use it. The front of the Kinect for Windows sensor has three openings, each housing a core piece of technology. On the left, there is an IR emitter, which transmits a factory-calibrated pattern of dots across the room in which the sensor resides. The middle opening is a color camera. The third is the IR camera, which reads the dot pattern and helps the Kinect for Windows system software sense objects and people, along with their skeletal tracking data.
One key improvement in the SDK is the ability to control the IR emitter with a new API, KinectSensor.ForceInfraredEmitterOff. How is this useful? Previously, the sensor's IR emitter was always active when the sensor was active, which can cause depth detection degradation if multiple sensors are observing the same space. The original focus of the SDK had been on single sensor use, but as soon as innovative multi-sensor solutions began emerging, it became a high priority to enable developers to control the IR emitter. “We have been listening closely to the developer community, and expanded IR functionality has been an important request,” notes Adam Smith, Kinect for Windows principal engineering lead. “This opens up a lot of possibilities for Kinect for Windows solutions, and we plan to continue to build on this for future releases.”
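With emitter control exposed, one practical way to run multiple sensors over the same space is to time-multiplex the emitters so that only one is lit during any given window of frames. The scheduling policy below is a Python sketch of that idea only—an assumption for illustration, not part of the SDK (in a real application, each slot change would set ForceInfraredEmitterOff on the idle sensors):

```python
import itertools

def emitter_schedule(sensor_ids, frames_per_slot=2):
    """Yield (frame_index, active_sensor) pairs: only one sensor's IR
    emitter is on at a time, rotating every frames_per_slot frames."""
    cycle = itertools.cycle(sensor_ids)
    active = next(cycle)
    for frame in itertools.count():
        if frame and frame % frames_per_slot == 0:
            active = next(cycle)
        yield frame, active
```

The slot length trades depth-frame latency per sensor against interference-free coverage; two frames per slot is an arbitrary illustrative choice.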
Another useful application is expanded night vision with an external IR lamp (wavelength: 827 nanometers). “You can turn off the IR emitter for pure night vision ("clean IR"),” explains Smith, “or you can leave the emitter on as an illumination source and continue to deliver full skeletal tracking. You could even combine these modes into a dual-mode application, toggling between clean IR and skeletal tracking on demand, depending on the situation. This unlocks a wide range of possibilities—from security and monitoring applications to motion detection, including full gesture control in a dark environment.”
Finally, developers can use the latest version of the SDK to pair the IR capabilities of the Kinect for Windows sensor with a higher definition color camera for enhanced green screen capabilities. This will enable them to go beyond the default 640x480 color camera resolution without sacrificing frame rate. “To do this, you calibrate your own color camera with the depth sensor by using a tool like OpenCV, and then use the Kinect sensor in concert with additional external cameras or, indeed, additional Kinect sensors,” notes Smith. “The possibilities here are pretty remarkable: you could build a green screen movie studio with full motion tracking and create software that transforms professional actors—or even, say, visitors to a theme park—into nearly anything that you could imagine."
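At its core, depth-based green screening is a per-pixel decision: keep the live color pixel when its depth reading falls inside the band where the player stands, otherwise show the substitute background. A simplified Python sketch of that rule, using flattened frames, millimeter depths, and made-up thresholds (real pipelines also need the color-to-depth calibration described above):

```python
def green_screen(color_px, background_px, depth_mm, near=800, far=2500):
    """Composite one pixel: keep the live color pixel when its depth
    falls inside the player band [near, far] mm, else use the background."""
    return color_px if near <= depth_mm <= far else background_px

def composite(color, background, depth, near=800, far=2500):
    """Apply the per-pixel rule across row-major frames (lists of pixels)."""
    return [green_screen(c, b, d, near, far)
            for c, b, d in zip(color, background, depth)]
```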
Build-A-Bear Workshop stores have been delivering custom-tailored experiences to children for 15 years in the form of make-your-own stuffed animals, but the company recently recognized that its target audience was gravitating toward digital devices. So it has begun advancing its in-store experiences to match the preferences of its core customers by incorporating digital screens throughout the stores—from the entrance to the stations where the magic of creating new fluffy friends happens.
A key part of Build-A-Bear's digital shift is its interactive storefront, powered by Kinect for Windows. It enables shoppers to play digital games on either a screen adjacent to the store entrance or directly through the storefront window, simply by using their bodies and natural gestures to control the game.
Children pop virtual balloons in a Kinect for Windows-enabled game at this Build-A-Bear store's front window.
"We're half retail, half theme park," said Build-A-Bear Director of Digital Ventures Brandon Elliott. The Kinect for Windows platform instantly appealed to Build-A-Bear as "a great enabler for personalized interactivity."
The Kinect for Windows application, launched at six pilot stores, uses skeletal tracking to enable two players (four hands) to pop virtual balloons (up to five balloons simultaneously) by waving their hands or by touching the screen directly. While an increasing number of retail stores use digital signage these days, Elliott noted: "What they're not doing is building a platform for interactive use."
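The heart of such a game is a simple hit test between tracked hand joints and balloon positions each frame, something like the following Python sketch (coordinates and the balloon radius are invented for illustration; this is not Build-A-Bear's actual code):

```python
import math

def hand_pops_balloon(hand, balloon_center, balloon_radius=0.12):
    """True when a tracked hand position (x, y) lies inside a balloon."""
    return math.dist(hand, balloon_center) <= balloon_radius

def pop_balloons(hands, balloons, radius=0.12):
    """Return the balloons that survive this frame. hands: list of (x, y)
    positions—up to four hands for two players; balloons: list of centers."""
    return [b for b in balloons
            if not any(hand_pops_balloon(h, b, radius) for h in hands)]
```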
"We wanted something that we could build on, that's a platform for ever-improving experiences," Elliott said. "With Kinect for Windows, there’s no learning curve. People can interact naturally with technology by simply speaking and gesturing the way they do when communicating with other people. The Kinect for Windows sensor sees and hears them."
"Right now, we're just using the skeletal tracking, but we could use voice recognition components or transform the kids into on-screen avatars," he added. "The possibilities are endless." Part of Build-A-Bear's vision is to create Kinect for Windows apps that tie into the seasonal marketing themes that permeate the stores. Elliott said that Build-A-Bear selected the combination of the Microsoft .NET Framework, Kinect for Windows SDK, and Kinect for Windows sensor specifically so that it can take advantage of existing developer platforms to build these new apps quickly.
“We appreciate that the Kinect for Windows SDK is developing so rapidly. We appreciate the investment Microsoft is making to continue to open up features within the Kinect for Windows sensor to us,” Elliott said. "The combination of Kinect for Windows hardware and software unlocks a world of new UI possibilities for us."
Microsoft developer and technical architect Todd Van Nurden and others at the Minneapolis-based Microsoft Technology Center helped Build-A-Bear with an early consultation that led to prototyping apps for the project.
"The main focus of my consult was to look for areas beyond simple screen-tied interactions to create experiences where Kinect for Windows activates the environment. Screen-based interactions are, of course, the easiest, but they are less magical than environmental ones," Van Nurden said. "We were going for magical."
The first six Build-A-Bear interactive stores launched in October and November 2012 in St. Louis, Missouri; Pleasanton, California; Annapolis, Maryland; Troy, Michigan; Fairfax, Virginia; and Indianapolis, Indiana. Four of the stores have gesture-enhanced interactive signs at the entrance, while two had to be placed behind windows to comply with mall rules. Kinect for Windows can work through glass with the assistance of a capacitive sensor that enables the window to work as a touch screen, and an inductive driver that turns the glass into a speaker.
So far, Build-A-Bear has been thrilled with what Elliott calls "fantastic" results. "Kids get it," he said. "We have a list of apps we want to build over the next couple of years. We can literally write an app for one computer in the store, and put it anywhere."
Swivel Close-Up, a Kinect for Windows-based kiosk from FaceCake, lets customers visualize themselves in small accessories such as makeup, sunglasses, and jewelry.
Microsoft Kinect for Windows has been playing an increasingly important role in retail, from interactive kiosks at stores such as Build-A-Bear Workshop, to virtual dressing rooms at fashion leaders like Bloomingdale's, to virtual showrooms at Nissan dealerships. This year's National Retail Federation (NRF) Convention and Expo, which took place earlier this week, showcased several solutions that provide retailers with new ways to drive customer engagement, sales, and loyalty.
Trend watchers have noted significant shifts in how consumers shop—often blending online and in-store investigation by using phones, tablets, kiosks, and computers in addition to good old-fashioned salesperson interaction. Brick-and-mortar stores, which are facing vigorous competition from online resellers, are embracing new technologies like Kinect for Windows to help drive sales and retention—and to delight and surprise customers with fun, interactive shopping experiences. Even better, customers can get more accurate and personalized information about whether a specific product is right for them—whether it's an article of clothing or a piece of furniture—reducing dissatisfaction and inconvenient returns.
"This past holiday season, we’ve seen retailers get much more tech savvy in how they engage customers and offer more flexibility in how they shop," said Kinect for Windows Senior Channel Development Manager Michael Fry. "As the lines between traditional and digital shopping channels continue to blur, retailers must seek new ways to deliver the most value and earn loyalty through compelling, seamless experiences across all touch points with their customers. Technologies like Kinect for Windows help retailers engage customers with interactive shopping experiences that are not only fun, but also increase important bottom-line business results—increasing engagement, awareness, and brand value while making it easier to select the best products for them."
At a hospitality event during NRF, Kinect for Windows partner Avanade showed one such innovation: their "shoppable storefront," created for my-wardrobe.com in Norway. Customers can walk up to the showroom window and—even after business hours—interact with the Kinect for Windows sensor to browse the store catalog, view pricing, and scan a Quick Response (QR) code to quickly purchase the product online via mobile phone. See a video of how it works.
"Consider the possibilities within the store, they're almost endless with a technology like Kinect for Windows," said John Konczal, director of service line marketing at Avanade. "You could build a guide for customers to find more information about products and quickly locate them in the store. If an item is not available, order it for shipment and pick-up at the nearest store. The interactivity, simplicity, and responsiveness of this technology can really help retailers differentiate their stores from the competition."
Avanade also demonstrated Natural User Observation of Retail Displays (NUO), a cost-effective solution that gathers real-time data on customer response and behavior. It allows retail managers to determine where customers spend their time in the store, identify trends, and gather demographic and behavioral data as customers interact with store displays. Avanade reports that the solution integrates into existing store and back-office IT systems and provides dashboards and data-rich reporting for improving product placement, marketing effectiveness, and overall display performance.
Another of our partners, FaceCake Marketing Technologies, Inc., which developed Swivel, the 3-D virtual dressing room that's been featured at Bloomingdale's, showed NRF attendees the newest enhancements to their Swivel software. The enhancements, which work in conjunction with the latest Kinect for Windows SDK, include face-tracking and a feature called real-time Compare, which allows you to contrast two looks in a full-motion visualization of yourself in two dresses (or any type of clothing) side-by-side. Sizing is now even more accurate, and FaceCake also added multi-user functionality that allows, for example, a bride to see herself, virtually, in various wedding dresses at the same time as her bridesmaids see themselves in their bridesmaid dresses.
We also featured another exciting new product from FaceCake in our booth: Swivel Close-Up. This Kinect for Windows-based kiosk, which operates within a two-foot environment, lets customers virtually try on smaller items such as makeup, sunglasses, and jewelry. Earrings dangle and twist beautifully as a shopper tries them on, and consumers can sample a limitless number of lip colors without lipstick ever touching their lips.
"We can now provide an extended Try-On solution that is real-time, 3-D, and full motion as opposed to just uploading a static image and then modifying it," said FaceCake CEO Linda Smith. "The result is a lifelike representation that's just like looking in a mirror—your dream dressing room mirror powered by Swivel and Kinect for Windows! It's both efficient and fun for the customer."
One of the key themes of this year's NRF event was putting customers at the center of retail marketing, something that Kinect for Windows accomplishes readily, thanks to its ability to quickly entice customers into virtual shopping spaces within actual storefronts, making it easier than ever for them to find, experience, and purchase products that are right for them.
"Staying competitive in retail today means putting customers at the heart of the business and seeking new ways to deliver value in the store," Fry said. "A Kinect for Windows retail display immediately puts the focus on the shopper, delivering uniquely personalized results that drive both sales and customer satisfaction."
Styku, a Kinect Accelerator startup, set out to transform clothes shopping by using the Kinect for Windows sensor and software development kit (SDK) to quickly develop its Smart Fitting Room, a new case study reports.
The technology will soon be used by Brooks Brothers, IM-Label, and other fashion retailers. Styku hopes to improve the shopping experience—reducing the problem of shoppers returning up to 40 percent of their online purchases and offering a faster, less expensive body scanning solution. Additionally, military apparel contractors appreciate the improved measurement capability of Kinect for Windows with the Styku software—estimated to be up to 400 percent more accurate—which could save soldiers' lives, thanks to better fitting body armor.
Customers can quickly visualize the fit and fabric characteristics of garments on digital renderings of their bodies, created by scanning them with the Kinect for Windows sensor. The scan lasts only one second—reducing the risk that a fidgety customer will compromise its accuracy. Clothing is rendered in 3-D, and customers can use gestures to rotate the model, view a custom-fit color map, and compare multiple sizes.
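A fit color map reduces to classifying, at each point on the body, the gap between the garment surface and the scanned body surface. A hypothetical Python sketch of that classification (the band thresholds are invented for illustration; Styku's actual values are not public):

```python
def fit_category(gap_cm):
    """Classify the garment-to-body gap at one sample point (centimeters)."""
    if gap_cm < 0.5:
        return "tight"
    if gap_cm < 3.0:
        return "good"
    return "loose"

def fit_map(gaps_cm):
    """Categories across sampled points; each maps to a color on the model."""
    return [fit_category(g) for g in gaps_cm]
```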
"Kinect for Windows had exactly the sensors that we needed, in a small package," said Pierre Du Charme, vice president of Software Engineering for Styku. "The SDK was easy to learn and gave us the tools to quickly implement a full-featured application."
Reflexion Health, founded with technology developed at the West Health Institute, realized years ago that assessing physical therapy outcomes is difficult for a variety of reasons, and took on the challenge of designing a solution to help increase the success rates of rehabilitation from physical injury.
In 2011, the Reflexion team approached the Orthopedic Surgery Department of the Naval Medical Center San Diego to help test their new Rehabilitation Measurement Tool (RMT), a software solution developed to make physical therapy more engaging, efficient, and successful. By using the Kinect for Windows sensor and software development kit (SDK), the RMT allows clinicians to measure patient progress. Patients often do much of their therapy alone, and without immediate feedback from therapists it can be difficult for them to be certain they are performing the exercises in a way that provides optimal benefit. The RMT can indicate whether exercises were performed properly and how frequently, and give patients real-time feedback.
Reflexion Health's Kinect for Windows-based tool helps measure how patients respond to physical therapy.
“Kinect for Windows helps motivate patients to do physical therapy—and the data set we gather when they use the RMT is becoming valuable to demonstrate what form of therapy is most effective, what types of patients react better to what type of therapy, and how to best deliver that therapy. Those questions have vexed people for a long time,” says Dr. Ravi Komatireddy, co-founder at Reflexion Health.
The proprietary RMT software engages patients with avatars and educational information, and a Kinect for Windows sensor tracks a patient’s range of motion and other clinical data. This valuable information helps therapists customize and deliver therapy plans to patients.
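Range-of-motion measurement from skeletal data typically reduces to angles between joint segments, for example a knee angle computed from hip, knee, and ankle positions. A Python sketch of that standard computation (joints as (x, y, z) tuples; this illustrates the general technique, not Reflexion's code):

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by segments b->a and b->c,
    e.g. the knee angle from hip (a), knee (b), and ankle (c)."""
    ba = [ai - bi for ai, bi in zip(a, b)]
    bc = [ci - bi for ci, bi in zip(c, b)]
    dot = sum(x * y for x, y in zip(ba, bc))
    cos = dot / (math.hypot(*ba) * math.hypot(*bc))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))
```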
“RMT is a breakthrough that can change how physical therapy is delivered,” Spencer Hutchins, co-founder and CEO of Reflexion Health says. “Kinect for Windows helps us build a repository of information so we can answer rigorous questions about patient care in a quantitative way.” Ultimately, Reflexion Health has demonstrated how software could be prescribed—similarly to pharmaceuticals and medical devices—and how it could possibly lower the cost of healthcare.
More information about RMT and the clinical trials conducted by the Naval Medical Center can be found in the newly released case study.
The following blog post was guest authored by Celeste Humphrey, business development consultant at nsquared, and Dr. Neil Roodyn, director of nsquared.
A company that is passionate about learning, technology, and creating awesome user experiences, nsquared has developed three new applications that take advantage of Kinect for Windows to provide users with interactive, natural user interface experiences. nsquared is located in Sydney, Australia.
At nsquared, we believe that vision-based interaction is the future of computing. The excitement we see in the technology industry regarding touch and tablet computing is a harbinger of the changes that are coming as smarter computer vision systems evolve.
Kinect for Windows has provided us with the tools to create some truly amazing products for education, hospitality, and events.
Education: nsquared sky spelling
We are excited to announce nsquared sky spelling, our first Kinect for Windows-based educational game. This new application, aimed at children aged 4 to 12, makes it fun for children to learn to spell in an interactive and collaborative environment. Each child selects a character or vehicle, such as a dragon, a biplane, or a butterfly, and then flies as that character through the sky to capture letters that complete the spelling of various words. The skeleton recognition capabilities of the Kinect for Windows sensor and software development kit (SDK) track the movement of the children as they stretch out their arms as wings to navigate their character through hoops alongside their wingman (another player). The color camera in the Kinect for Windows sensor allows each child to add their photo, thereby personalizing their experience.
nsquared sky spelling
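Steering a character by stretching out one's arms can be reduced to the roll of the line between the two tracked hands. A minimal sketch of that mapping (an assumption about how such a mechanic could work, not nsquared's implementation):

```python
import math

def wing_roll(left_hand, right_hand):
    """Roll angle (degrees) of the line between the hands, given (x, y)
    positions; positive values would bank the character one way,
    negative the other."""
    dx = right_hand[0] - left_hand[0]
    dy = right_hand[1] - left_hand[1]
    return math.degrees(math.atan2(dy, dx))
```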
Hospitality: nsquared hotel kiosk
The nsquared hotel kiosk augments the concierge function in a hotel by providing guidance to hotel guests through an intuitive, interactive experience. Guests can browse through images and videos of activities, explore locations on a map, and find out what's happening with a live event calendar. It also provides live weather updates and has customizable themes. The nsquared hotel kiosk uses the new gestures supported in the Kinect for Windows SDK 1.7, enabling users to use a “grip” gesture to drag content across the screen and a “push” gesture to select content. With its fun user interface, this informative kiosk provides guests an interactive alternative to the old brochure rack.
Kinect for Windows technology enables nsquared to provide an interactive kiosk experience for less than half the price of a similarly sized touchscreen (see note).
nsquared hotel kiosk
Events: nsquared media viewer
The new nsquared media viewer application is a great way to explore interactive content in almost any environment. Designed for building lobbies, experience centers, events, and corporate locations, the nsquared media viewer enables you to display images and video by category in a stylish, customizable carousel. Easy to use, anyone can walk up and start browsing in seconds.
In addition to taking advantage of key features of the Kinect for Windows sensor and SDK, nsquared media viewer utilizes Windows Azure, allowing clients to view reports about the usage of the screen and the content displayed.
nsquared media viewer
Kinect for Windows technology has made it possible for nsquared to create applications that allow people to interact with content in amazing new ways, helping us take a step towards our collective future of richer vision-based computing systems.
Celeste Humphrey, business development consultant, and Dr. Neil Roodyn, director, nsquared
Note: Based on the price of a 65-inch touch overlay at approximately US$900, compared to the cost of a Kinect for Windows sensor at approximately US$250. For integrated touch solutions, the price can be far higher.
The following blog post was guest authored by Anup Chathoth, co-founder and CEO of Ubi Interactive.
Ubi Interactive is a Seattle startup that was one of 11 companies from around the world selected to take part in a three-month Microsoft Kinect Accelerator program in the spring of 2012. Since then, the company has refined its software with more than 100 users and is now accepting orders.
Patrick Wirtz, an innovation manager for The Walsh Group, spends most of his time implementing technology that will enhance Walsh’s ability to work with clients. It’s a vital role at The Walsh Group, a general building construction organization founded in 1898 that has invested more than US$450 million in capital equipment and regularly employs more than 5,000 engineers and skilled tradespeople.
"It’s a powerful piece of technology," says Patrick Wirtz, shown here using Ubi in The Walsh Group offices. By setting up interactive 3-D blueprints on the walls, Walsh gives clients the ability to explore, virtually, a future building or facility.
In the construction industry, building information modeling (BIM) is a critical component of presentations to clients. BIM allows construction companies like The Walsh Group to represent the functional characteristics of a facility digitally. While this is mostly effective, Wirtz wanted something that would really “wow” his clients. He wanted a way for them to not only see the drawings, but to bring the buildings to life by allowing clients to explore the blueprints themselves.
Wirtz found the solution he had been seeking when he stumbled upon an article about Ubi. At Ubi Interactive, we provide the technology to transform any surface into an interactive touch screen. All the user needs is a computer running our software, a projector, and the Kinect for Windows sensor. Immediately, Wirtz knew Ubi was something he wanted to implement at Walsh: “I contacted the guys at Ubi and told them I am very interested in purchasing the product.” Wirtz was excited about the software and flew out to Seattle for a demo.
After interacting with the software, Wirtz was convinced that this technology could help The Walsh Group. “Ubi is futuristic-like technology,” he noted—but a technology that he and his colleagues are able to use today. Wirtz immediately saw the potential: Walsh’s building information models could now be interactive displays. Instead of merely presenting drawings to clients, Walsh can now set up an interactive 3-D blueprint on the wall. Clients can walk up to the blueprint and discover what the building will look like by touching and interacting with the display. In use at Walsh headquarters since June 2012, Ubi Interactive brings client engagement to an entirely new level.
Similarly, Evan Collins, a recent graduate of California Polytechnic State University, used the Ubi software as part of an architecture show he organized. The exhibition showcased 20 interactive displays that allowed fifth-year architecture students to present their thesis projects in a way that captivated audience members. Collins said the interactive displays "allowed audience members to choose what content they interacted with instead of listening to a static slideshow presentation."
Twenty Ubi Interactive displays at California Polytechnic State University
Wirtz’s and Collins’ cases are just two ways that people are currently using Ubi. Because the solution is so affordable, people from a wide range of industries have found useful applications for the Ubi software. Wirtz said, “I didn’t want to spend $10,000. I already had a projector and a computer. All I needed to purchase was the software and a $250 Kinect for Windows sensor. With this small investment, I can now turn any surface into a touch screen. It’s a powerful piece of technology.”
In addition to small- and mid-sized companies, several Fortune 500 enterprises like Microsoft and Intel are also using the software in their conference rooms. And the use of the technology goes beyond conference rooms.
At Ubi Interactive, it is our goal to make the world a more interactive place. We want human collaboration and information to be just one finger touch away, no matter where you are. By making it possible to turn any surface into a touch screen, we eliminate the need for screen hardware, reducing costs and enabling interactive displays in places where they were not previously feasible—such as on walls in public spaces. Our technology has the potential to change how people interact with information on a global scale. After a private beta evaluation with more than 50 organizations, the Ubi software is now available for ordering at ubi-interactive.com.
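Turning a projected surface into a touch screen with a depth sensor generally works by calibrating the per-pixel depth of the empty surface and then flagging pixels where something sits within a small threshold in front of it. A simplified Python sketch of the general idea, using flattened frames and an illustrative threshold (this is not Ubi's actual pipeline):

```python
import statistics

def calibrate_surface(depth_frames):
    """Per-pixel surface depth (mm): median over frames of the empty scene,
    which suppresses single-frame sensor noise."""
    return [statistics.median(px) for px in zip(*depth_frames)]

def touches(depth_frame, surface, threshold_mm=20):
    """Indices of pixels where something sits just in front of the
    calibrated surface, i.e. a candidate fingertip contact."""
    return [i for i, (d, s) in enumerate(zip(depth_frame, surface))
            if 0 < s - d <= threshold_mm]
```

A full system would additionally cluster neighboring touch pixels into contacts and map them through the projector calibration to screen coordinates.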
Anup Chathoth, co-founder and CEO, Ubi Interactive