
Microsoft Research Connections Blog

The Microsoft Research Connections blog shares stories of collaborations with computer scientists at academic and scientific institutions to advance technical innovations in computing, as well as related events, scholarships, and fellowships.

  • Microsoft Research Connections Blog

    How to Say “Fourth Paradigm” in Portuguese


    Through my work with academics in Brazil, I have witnessed an increasing awareness of the importance of computing in advancing scientific research in such areas as bioenergy, biodiversity, climate change, and plant physiology. In order to advance these fields, scientists need to deal with increasingly complex projects that require the expertise of a multidisciplinary team, and computing is a key element in this effort.

    From data acquisition to data management, visualization, and modeling, researchers confront the need for new tools to enable innovative investigations. At the Microsoft Research-FAPESP Institute, I’ve seen programs such as BIOEN (a bioenergy research program), BIOTA (a biodiversity project), and the Research Program on Global Climate Change, all of which share the need to access and manage massive amounts of data.

    An electronic version of The Fourth Paradigm was released in Portuguese on August 15, 2011.

    In light of this need, the Microsoft Research-FAPESP Institute launched the Portuguese translation of The Fourth Paradigm: Data-Intensive Scientific Discovery, a wide-ranging collection of essays on the process and promise of data-intensive science. An outgrowth of the thinking of the late Microsoft researcher Jim Gray, The Fourth Paradigm sets out the parameters of twenty-first-century eScience.

    The launch of the Portuguese edition took place on November 3, 2011, at FAPESP in São Paulo. Professor Carlos Henrique de Brito Cruz, FAPESP’s scientific director, opened the launch event, observing that “Science advances mostly through the development and application of new instruments. Computing power, the cloud, and other facilities constitute a big new instrument that allows researchers to obtain and analyze gigantic data sets in a way which was not possible a few years ago. The Fourth Paradigm deals with this fascinating window of opportunity for science and a Portuguese translation will contribute to the visibility of the authors’ ideas in Brazil.”

    Professor Roberto Marcondes Cesar, Jr., who supervised the translation into Portuguese, then spoke about eScience in Brazil. “The Brazilian computer science community has been working together with domain scientists for decades in fields such as astronomy, geoscience, bioenergy, and medicine—to name but a few. Different expressive results addressing relevant problems for the country have been achieved and the Brazilian CS [computer science] researchers proceed to increase the collaboration results both in volume and quality. In this sense, the Portuguese translation of The Fourth Paradigm represents an important step in disseminating eScience methods and opportunities, both to attract CS researchers and students to the field and to draw the attention of domain scientists who may benefit from interdisciplinary research.”

    These comments set the stage for a talk by Dan Fay, the director of Microsoft Research Connections’ Earth, Energy, and Environment activities, who said, in part: “For scientists, access to massive amounts of data can be a blessing and a curse—finding the significant nuggets of information that will lead to insights in the huge volumes of data is the problem. Big data is as much challenge as opportunity. When you have data sets as large as a petabyte, that’s always going to be difficult to move around and analyze… The science of big data is as much about asking the right questions, so that scientists collect the right data, as it is trying to sift through data after the fact.”

    Microsoft Research Connections is proud to partner with FAPESP in the pursuit of data-intensive research, as together we explore the use of computing technology to meet the social and economic needs in the Brazilian state of São Paulo. Oh, and this is how you say “fourth paradigm” in Portuguese: o quarto paradigma.

    Juliana Salles, Senior Research Program Manager, Microsoft Research Connections

    Learn More

  • Microsoft Research Connections Blog

    MIXing It Up: the Kinect for Windows SDK


    Kinect for Windows SDK beta

    Back in February at TechForum, Craig Mundie, Microsoft's chief research and strategy officer, and Don Mattrick, president of Microsoft's Interactive Entertainment Business (IEB), announced that Microsoft Research and IEB would release a non-commercial Kinect for Windows software development kit this spring. Addressing a growing body of academic researchers and enthusiasts who are eager to build applications employing Kinect's natural user interface, Mundie and Mattrick offered tantalizing promises of access to Kinect's system capabilities, including audio, system APIs, and direct control of the Kinect sensor.

    Today at the MIX developer conference in Las Vegas, Scott Guthrie, corporate vice president of the Microsoft .NET Developer Platform, unveiled three key features of the upcoming Kinect for Windows SDK: robust skeletal tracking, advanced audio capabilities, and an XYZ depth camera. He also announced the launch of a new website for the SDK, where you can subscribe to a newsfeed and be notified as soon as the SDK is available for download.

    Our hope is that this "starter kit" for application developers will make it easier for the academic research and enthusiast communities to create even richer experiences by using Kinect technology. Here are a few details on each of the SDK's ground-breaking NUI features:

    • Robust skeletal tracking will provide high-performance capabilities for tracking the skeletal image of one or two people moving within the Kinect field of view.
    • Advanced audio will enable great sound capabilities by using a four-element microphone array with sophisticated acoustic noise and echo cancellation. The advanced audio will also include beam formation to identify the sound source and integration with the Windows speech recognition API.
    • XYZ depth camera will provide a standard color camera stream along with depth data indicating the distance of the object from the Kinect camera. This will give developers access to the raw data and enable the creation of novel interfaces by using the unaltered data.
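
    To give a rough sense of what consuming these streams might look like, here is a minimal C++ sketch that polls the skeletal stream. It is not official sample code; the names used (NuiInitialize, NuiSkeletonGetNextFrame, and related constants) are those of the released Kinect for Windows SDK's native API, so the beta announced here may differ in detail.

        // Minimal sketch: poll the Kinect skeletal stream via the native NUI API.
        // Assumes the released Kinect for Windows SDK headers; error handling is reduced.
        #include <windows.h>
        #include <NuiApi.h>
        #include <cstdio>

        int main()
        {
            // Initialize the runtime for skeletal tracking; color, depth, and audio
            // flags can be OR'ed in as needed.
            if (FAILED(NuiInitialize(NUI_INITIALIZE_FLAG_USES_SKELETON)))
                return 1;

            for (int i = 0; i < 300; ++i)                             // ~10 seconds at 30 fps
            {
                NUI_SKELETON_FRAME frame = {0};
                if (SUCCEEDED(NuiSkeletonGetNextFrame(100, &frame)))  // 100 ms timeout
                {
                    for (int s = 0; s < NUI_SKELETON_COUNT; ++s)
                    {
                        const NUI_SKELETON_DATA& skel = frame.SkeletonData[s];
                        if (skel.eTrackingState != NUI_SKELETON_TRACKED)
                            continue;
                        const Vector4& head =
                            skel.SkeletonPositions[NUI_SKELETON_POSITION_HEAD];
                        printf("skeleton %d head at (%.2f, %.2f, %.2f) m\n",
                               s, head.x, head.y, head.z);
                    }
                }
            }

            NuiShutdown();
            return 0;
        }

    Depth and color frames are read through analogous image-stream calls in the same API.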

    As is often the case, the sum of these features is greater than the parts. By combining the audio, depth, and image data, developers will have great opportunities to build deeper NUI experiences. And just to give his audience a taste of what these features will enable, Guthrie demoed a version of the WorldWide Telescope that you can interact with by using gestures—a feature built on the SDK platform.

    MIX was an ideal setting for announcing the new SDK features, as this annual gathering brings together developers, designers, UX experts, and business professionals who are creating some of the most innovative consumer sites on the web and beyond. The SDK feature announcements will be highlighted to the academic research community this week at the Microsoft Research Software Summit in Paris.

    So, it's onward and upward with the Kinect for Windows SDK. We're confident that this non-commercial SDK will fuse the work of Microsoft Research with the creativity of the academic research and enthusiast communities to deliver NUI applications that will revolutionize our relationship with computers.

    Learn More

    Tony Hey, Corporate Vice President, Microsoft Research Connections

  • Microsoft Research Connections Blog

    Surface + Robotics = Life-Saving Possibilities


    In the realm of applied research, perhaps nothing is more satisfying than working on projects that can help save lives.  Such is the case with a unique project at the University of Massachusetts Lowell that combines Microsoft Surface and Microsoft Robotics Developer Studio in a Human-Robot Interaction (HRI) application to create novel remote controls for rescue robots.  To the best of our knowledge, this is the first time these two technologies have been used together—tell us if you know of others! Once perfected, this approach could enable emergency responders to safely maneuver rescue robots through buildings damaged by earthquakes, fire, or even terrorist attacks.

    The groundbreaking work was dramatically presented on the Web in August, when doctoral candidate Mark Micire posted a live video of his PhD defense showing how to control swarms of robots using the Surface table as a touch controller. A new, higher-quality video of the thesis defense and an overview video have recently been posted online. The overview shows how a team of rescue robots could be controlled remotely by using the Surface table and a device known as the DREAM Controller (a lovely acronym for Dynamically Resizing, Ergonomic, And Multi-touch Controller).

    The system could be a tremendous boon for emergency responders, who now must often wait 12 to 24 hours to obtain geo-referenced data that combine notes from rescue workers in the field with paper maps and building plans. During Hurricane Katrina, for example, many response groups were still using hand-drawn paper maps.  Additionally, robot cameras sent video only to the site operators—not immediately to the command staff.

    The proposed system would obviate these problems by creating a common computing platform that would bring all this information to the command staff, enabling them to more effectively utilize rescue robots. As Micire describes in his presentation, "A single-robot operator control unit and a multi-robot command and control interface [can be] used to monitor and interact with all of the robots deployed at a disaster response. Users can tap and drag commands for individual or multiple robots through a gesture set designed to maximize ease of learning."
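
    To make the gesture-to-command idea concrete, here is a purely hypothetical C++ sketch (not the UMass Lowell code, and not a Microsoft Robotics Developer Studio or Surface API) showing how a completed drag gesture on the table might be translated into a waypoint command for the selected robot:

        // Purely hypothetical sketch: turn a completed tap-and-drag gesture on a
        // touch table into a waypoint command for the robot whose icon was dragged.
        // Type and function names are illustrative only.
        #include <cstdio>

        struct Point { float x, y; };          // table coordinates, in metres

        struct DragGesture {                   // produced by the touch-input layer
            int   robotId;                     // robot icon the drag started on
            Point end;                         // where the finger was lifted
        };

        // In a real system this would go out over the robot control link; here we log.
        void sendWaypoint(int robotId, Point target)
        {
            printf("robot %d -> move to (%.2f, %.2f)\n", robotId, target.x, target.y);
        }

        // Map table coordinates to map coordinates (identity here, for brevity).
        Point tableToMap(Point p) { return p; }

        void onDragCompleted(const DragGesture& g)
        {
            sendWaypoint(g.robotId, tableToMap(g.end));
        }

        int main()
        {
            DragGesture g{2, {3.5f, 1.2f}};    // operator drags robot 2's icon
            onDragCompleted(g);
            return 0;
        }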

    An example of the burgeoning research field of NUI—or Natural User Interaction—this work "illustrates just one of the many exciting new directions enabled by advanced technologies in the human-computer interface," says External Research's NUI Theme Director, Kristin Tolle. The project, which was supervised by UMass Lowell's renowned robotics expert, Professor Holly Yanco, also demonstrates the great synergy that can arise from collaborations between Microsoft Research and leading academic institutions. By equipping Yanco and Micire's research with cutting-edge tools, this collaboration has helped put a potentially life-saving technology in the offing.

    This work was partly supported by a grant from Microsoft Research under our Human-Robot Interaction RFP (Request For Proposals).

    Stewart Tansley, Senior Research Program Manager, External Research, a division of Microsoft Research

    Learn More
