Download Research Tools
When we released the Lab of Things a year ago, we knew that it would benefit researchers experimenting with connected devices in various domains. It has been very gratifying to see how the Lab of Things has helped accelerate research aimed at enabling people with disabilities to live more independent lives.
Essentially, the Lab of Things is a research platform that enables the deployment of connected devices and sensors at scale. By providing a client-side set of components called HomeOS, the Lab of Things frees researchers from having to develop the complete software stack for deploying their experiments. HomeOS provides a simple yet powerful environment for device connectivity and experiment execution. The Lab of Things also comes with a set of cloud services for updating, monitoring, and storage, allowing researchers to scale up deployments and deploy in geographically diverse locations. These features lower the barriers for testing new devices and understanding their behavior in a quick, stable, and repeatable fashion.
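The separation the platform enforces between device drivers and research apps can be sketched conceptually. Note that this is an illustrative toy in Python only: HomeOS is a .NET framework, and its real driver and app interfaces differ. The class and method names here are invented to show the idea that drivers wrap device specifics while apps consume an abstract role.

```python
class LightDriver:
    """Hypothetical driver: hides a device's protocol behind a simple role."""
    def __init__(self):
        self.is_on = False

    def turn_on(self):
        # A real driver would speak the device's wire protocol here.
        self.is_on = True

    def turn_off(self):
        self.is_on = False


class MotionLightApp:
    """Hypothetical app: reacts to sensor events through the driver's role,
    so the same app logic works with any light whose driver exposes
    turn_on/turn_off, regardless of the underlying hardware."""
    def __init__(self, light):
        self.light = light

    def on_motion(self, detected):
        if detected:
            self.light.turn_on()
        else:
            self.light.turn_off()


light = LightDriver()
app = MotionLightApp(light)
app.on_motion(True)
print(light.is_on)  # True
```

Because the app never touches device specifics, swapping in a different light requires only a new driver, which is the property that lets researchers avoid rebuilding the software stack for each experiment.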
Researchers have been using the Lab of Things to develop new technologies. Professor Nilanjan Banerjee of the University of Maryland at Baltimore County and I recently had the opportunity to describe some of these exciting new technologies during an online webcast of the 2014 Microsoft Research Faculty Summit.
Professor Banerjee was an early Lab of Things adopter. He approached us shortly after we released it, speaking passionately about developing sensors that could help people with limited mobility lead more independent lives by enabling them to control the environment in their home and workplaces. The Lab of Things and its underlying HomeOS seemed the perfect platform for his project. It would allow him to test his ideas quickly and adapt his design as necessary.
He started working with Buz Chmielewski, who became a quadriplegic after a surfing accident 25 years ago. Buz helped Professor Banerjee test his design and arrive at a more usable sensor. After nearly a year, Professor Banerjee and his colleagues had developed a sensor that detects gestures and uses them to activate lights and other appliances in the home. The sensor can be sewn into almost anything in the environment, such as clothing or bedding.
The sensor design for this project was developed by Professor Ryan Robucci and his team. With the Lab of Things, Professor Robucci was able to develop and test the sensor components quickly without having to develop the accompanying software. Also part of this project was Dr. Sandy McCombe-Waller of the University of Maryland School of Medicine, who specializes in rehabilitation of people with various forms of injuries and disabilities. She helped the team understand the various types of mobility issues involved and, with the Lab of Things, was able to test various sensor designs quickly.
Over the past year, the Lab of Things has also grown in what it offers. Recently we added support for the Arduino hardware prototyping board, opening up the Lab of Things to a whole new world of experimentation with new sensors and devices. The Lab of Things also supports web calls to services such as Weather Underground. All of the apps and drivers are available as sample code for users to adapt.
—Arjmand Samuel, Senior Research Program Manager, Microsoft Research
We’re here at the Microsoft Conference Center in Redmond, Washington, where the first day of the 2014 Microsoft Research Faculty Summit is underway. The event kicked off with an opening keynote from Harry Shum, executive vice president of Microsoft’s Technology and Research group, during which he highlighted major efforts at Microsoft Research. Two of significance are the integration of Microsoft Academic Search into Bing with Cortana (Microsoft’s new personal digital assistant), and major improvements in computer vision via deep-learning techniques.
Cortana is powered by Bing, and its tight integration of academic data into Bing search results means that Cortana will become a researcher’s dream assistant. Instead of treating information from the academic community in a separate search engine—as competitors do—Bing, and therefore Cortana, will treat scholarly information as a first-class citizen in search results.
We were especially delighted and entertained when the computer-vision announcement was accompanied by the appearance of three show dogs on stage—a visual link to one of the project’s more arresting achievements: the ability to distinguish between the breeds. The system, code-named “Project Adam,” not only can “see” that an image is a dog, but it can accurately determine which of two very similar dog breeds it’s looking at. Project Adam uses deep-learning techniques to deliver a highly efficient, highly scalable distributed system that can perform computer-vision recognition and categorization tasks at world-record levels of performance.
Harry Shum, Microsoft executive vice president of Technology and Research, greets a dog that took the stage as part of a demonstration of Project Adam, a world-record computer-vision effort to train a computer system that can, for example, identify dogs by breed.
Following the keynote, Peter Lee joined Harry on stage to recognize the 2014 Microsoft Research Faculty Fellows. These seven early-career academics are pursuing some of the most exciting, high-impact areas of computer science, and we’re pleased that their fellowships will free them to devote their energies to research. The crowd was enthusiastic, as most of them know all too well the burdens of grant writing that the Faculty Fellowship alleviates.
Perhaps most exciting to us, however, is knowing that this keynote was shared live with viewers around the world through our live stream of the Faculty Summit. Not only did our online audience get to watch the opening keynote and pose questions during the Q&A, they were also treated to an “online extra”: an extended, post-keynote interview with Harry.
Up next is the session with Jeannette Wing, where our online attendees will be able to learn firsthand what the scientific community deems to be hot topics. And throughout the course of the day, our online audience will be treated to eight in-depth interviews that will cover cutting-edge developments in online education, wildlife conservation, and the Internet of Things—just to mention a few of the topics. So fire up your web browser and tune in to the Faculty Summit—there are still hours of great content awaiting you.
—Judith Bishop, Director of Computer Science, Microsoft Research
As part of Microsoft Research’s commitment to encouraging and supporting the up-and-coming generation of researchers worldwide, Microsoft Research Asia sponsored "Korea Day at Microsoft Research 2014" on June 9 in Beijing, China. The event was the culmination of a 10-month research project competition, which began in August 2013 when we selected creative, collaborative research projects from the top eight universities in Korea. Each of the 21 selected teams illustrated their research and outcomes through posters and displays during Korea Day, which more than 150 Microsoft Research Asia representatives attended.
Korean researchers at Microsoft Research Asia
Among the many projects of merit, one in particular stood out to the judges—a cell phone app project named “NUGU: A Group-based Intervention App for Improving Self-Regulation on Smartphone Usage.” The app, developed by Professor Uichin Lee’s team from KAIST, is designed to help users reduce their cell phone usage through positive reinforcement in the form of an awards system. For example, if a user sets a goal not to use his or her smartphone for an hour during study time, the app will award points upon successful completion of the goal.
Project NUGU: A Group-based Intervention App for Improving Self-Regulation on Smartphone Usage
“I created this app after getting an idea from a psychological theory of an individual’s behavior being greatly affected by the people around them," says Minsam Go, a fourth-year PhD student at KAIST. "It lets the user earn points after completing the goal and makes the user compete with friends on who has higher points, which encourages the user to spend less time on his or her smartphone.”
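The goal-and-leaderboard mechanic described above can be sketched in a few lines. This is a hypothetical illustration only: the post does not publish NUGU's actual scoring rules, so the point values, function names, and ranking scheme below are invented.

```python
def award_points(goal_minutes, minutes_abstained, points_per_goal=10):
    """Award points only if the user stayed off the phone for the full goal
    (all-or-nothing scoring is an assumption, not NUGU's documented rule)."""
    return points_per_goal if minutes_abstained >= goal_minutes else 0


def rank_friends(scores):
    """Sort a {name: points} mapping into a leaderboard, highest score first,
    mirroring the friendly competition the student describes."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)


# One friend meets a one-hour goal; the other checks the phone after 30 minutes.
scores = {"Alice": award_points(60, 75), "Bob": award_points(60, 30)}
print(rank_friends(scores))  # [('Alice', 10), ('Bob', 0)]
```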
The app won the first prize award of US$3,000, which Dr. Hsiao-Wuen Hon, managing director of Microsoft Research Asia, presented to the team at the Korea Day event.
Left: Professor Uichin Lee, KAIST, and Hsiao-Wuen Hon, managing director of Microsoft Research Asia
Right: Professor Seung-won Hwang, POSTECH, and Hyunseek Lee, senior vice president, National IT Industry Promotion Agency
Second place went to Professor Seung-won Hwang's POSTECH team, which presented “Spatiality and Temporality Footprint for Entity Understanding.” Two teams tied for third place: Professor Hyogon Kim’s Korea University team, with “Software Defined Radio on a Smartphone,” and Professor Nam-Jong Paik’s Seoul National University team, with “Stroke Recovery with Kinect.”
Left: Professor Nam-Jong Paik, Seoul National University College of Medicine, and Tim Pan, university relations director of Microsoft Research Asia
Right: Professor Hyogon Kim, Korea University, and Tim Pan
Recognizing the great work done by the university teams was just the beginning of the event. We wanted to encourage participants—from both Microsoft Research Asia and the universities—to build collaborative relationships and share their research insights.
Many of the day's presentations demonstrated the technologies that will play a role in Korea's future IT competitiveness. For example, Professor Seungyong Lee's team displayed a technology involving Kinect for Windows. It uses the Kinect sensor to scan a room, and then uses spatial positioning technology to create a computerized 3D panorama of the room and its contents that can be viewed from any angle. “Kinect was originally developed for Xbox, but it has been used in various research projects other than video games, as it has an ability to measure the depths of space," Lee notes. "The technology enables the user to transfer 3D interior space into the computer as it is.”
Interactive 3D Floor Plan Reconstruction from RGB-D Images, from Professor Seungyong Lee’s team
The technology is similar to an Internet mapping service such as Street View, but applied to indoor areas. Unlike Street View, which stitches together several 2D photos, the technology enables the user to perceive the depth and 3D structure of a space. “From now on, the interior images of buildings can be transferred to a computer like this," Lee remarks. For example, "It can be used for firefighters to see the burning building’s interior before they begin the rescue, or the general public can use it to rearrange their home furniture, or for interior design.”
Another notable project presented during the event came from Korea University Professor Haechang Lim's team. The group developed a system that uses natural language and big data to assess commercial brands—or people—on Twitter.
"In the past, a survey had to be done manually to examine the customer image on brands, which costs a lot of money and time," explains Mincheol Yang, a third-year doctorate student. "Now, the consumers are voluntarily expressing their opinions on social network services."
Korea Day participants gathered in the Microsoft Research Asia Sky Garden
We enjoyed seeing the results of the excellent research and project work on Korea Day. All of the teams worked very hard and did a great job. We extend our congratulations to all the participants, and we look forward to seeing great things from all of you in the future.
—Miran Lee, Principal Research Program Manager, Microsoft Research Asia