
Microsoft Research Connections Blog

The Microsoft Research Connections blog shares stories of collaborations with computer scientists at academic and scientific institutions to advance technical innovations in computing, as well as related events, scholarships, and fellowships.

December, 2010

  • Microsoft Research Connections Blog

    Virtual Fire System Aids Firefighters in Wildfire Combat and Prevention

    • 1 Comment

    When a wildfire strikes, every second counts. Time lost can all too often be measured in lost life, deforestation, and property damage. Enter the Virtual Fire application, based on Microsoft Bing Maps, ESRI ArcGIS, and other software. This web geographic information system (GIS) platform is designed to support wildfire early warning, control, and civil protection by sharing information and tools produced by the Geography of Natural Disasters Laboratory at the University of the Aegean in Greece.

    With these new tools, firefighting personnel, emergency crews, and other authorities can design an operational plan to contain the forest fire, pinpointing the best ways to put it out with new levels of precision. Fire management professionals can locate fire service vehicles and other resources online and in real time. Fire patrol aircraft use Global Positioning System (GPS) tracking and communications to send coordinates for each item to Virtual Fire, which depicts them on a web GIS. Cameras can augment this data by transmitting images of high-risk areas into the Virtual Fire system.
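
    To illustrate the kind of data flow just described, here is a minimal sketch of how a GPS-equipped unit might report its position to a web GIS service. The endpoint URL and payload fields are hypothetical, purely for illustration; the actual Virtual Fire interfaces are not documented in this post.

        # Minimal sketch: a GPS-equipped unit reporting its position to a web GIS.
        # The endpoint and payload schema are hypothetical, for illustration only.
        import json
        import time
        import urllib.request

        GIS_ENDPOINT = "https://example.org/virtualfire/api/positions"  # hypothetical

        def report_position(unit_id: str, lat: float, lon: float) -> None:
            """Send one GPS fix (WGS84 coordinates) as a JSON payload."""
            payload = {
                "unit": unit_id,           # e.g., a fire service vehicle or aircraft
                "lat": lat,
                "lon": lon,
                "timestamp": time.time(),  # seconds since the Unix epoch
            }
            req = urllib.request.Request(
                GIS_ENDPOINT,
                data=json.dumps(payload).encode("utf-8"),
                headers={"Content-Type": "application/json"},
                method="POST",
            )
            with urllib.request.urlopen(req) as resp:
                resp.read()  # server acknowledges and can plot the fix on the map

        # Example: a patrol aircraft over Lesvos reporting its current position.
        report_position("aircraft-07", 39.2085, 26.2924)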

    One of the compelling advantages of Virtual Fire is that it enables fire management professionals to take advantage of GIS capabilities without extensive training on complicated GIS applications. The platform enables end-users to query the databases and get answers immediately, locate points of interest in high-resolution satellite images, and download information to their portable computers or GPS devices.

    But the Virtual Fire application offers services beyond simple coordination of emergency efforts. Remote automatic weather stations and a weather forecasting system based on the SKIRON weather model (developed by the Atmospheric Modeling and Weather Forecasting group at the University of Athens) provide crucial data needed for fire prevention and early warning. Virtual Fire provides a geographical representation of fire-risk potential and identifies high-risk areas across local regions daily, based on a high-performance computing (HPC) pilot application that runs on Windows HPC Server.
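
    As a rough illustration of how weather observations can be turned into a daily risk rating, the sketch below computes the Angström index, a simple published fire-danger index based only on air temperature and relative humidity. Virtual Fire's actual HPC risk model is more elaborate (it also draws on the SKIRON forecasts, vegetation, and topography), so treat this purely as a conceptual example.

        # Conceptual sketch: rating daily fire danger from forecast weather.
        # Uses the simple Angstrom index; Virtual Fire's actual model is more
        # elaborate and also accounts for vegetation and topography.

        def angstrom_index(temp_c: float, rel_humidity_pct: float) -> float:
            """Angstrom fire-danger index: lower values mean higher danger."""
            return rel_humidity_pct / 20.0 + (27.0 - temp_c) / 10.0

        def danger_rating(index: float) -> str:
            """Map the index to a coarse rating (conventional thresholds)."""
            if index < 2.0:
                return "extreme"
            if index < 2.5:
                return "high"
            if index < 4.0:
                return "moderate"
            return "low"

        # Example: a hot, dry summer day in the forecast.
        idx = angstrom_index(temp_c=34.0, rel_humidity_pct=25.0)
        print(f"Angstrom index = {idx:.2f} -> {danger_rating(idx)} fire danger")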

    "Virtual Fire hosts and visualizes models used for predicting forest fire risk and behavior to understand how the fire is likely to spread, based on the actual meteorological data, vegetation, and landscape morphology," says Kostas Kalabokidis, geography professor at University of the Aegean and principal investigator of the Virtual Fire initiative. "These prediction data—along with a plethora of other information spanning roads, location of water tanks, the positioning of aircrafts and vehicles, vegetation types, and weather data—will be visualized over online maps such as Bing Maps. This will enable fire fighters in control centers, or on-site via handheld devices, to more effectively manage forest fires and deal with any other emergencies situations that may arise."

    The system runs on servers donated by Hewlett-Packard (three quad-core nodes: one head node and two compute nodes). Using the FARSITE and FlamMap fire-behavior software (created by the Missoula Fire Sciences Laboratory), the system produces maps on demand that graphically represent the spread and intensity of a forest fire at different times and places. In addition, user feeds and email messages provide effective communication between users and administrators for reporting events.
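
    FARSITE and FlamMap implement detailed physical fire-behavior models; purely to illustrate the idea of mapping spread over time, the toy sketch below runs a simple cellular-automaton fire across a random fuel grid. It is not the algorithm those tools use.

        # Toy cellular-automaton sketch of fire spread over a fuel grid.
        # FARSITE/FlamMap use far more sophisticated physical models; this only
        # illustrates producing a spread map for successive time steps.
        import random

        random.seed(42)
        SIZE = 12
        FUEL, BURNING, BURNED = ".", "*", "x"

        # Random fuel map: ~80% of cells carry burnable fuel, the rest are bare.
        grid = [[FUEL if random.random() < 0.8 else " " for _ in range(SIZE)]
                for _ in range(SIZE)]
        grid[SIZE // 2][SIZE // 2] = BURNING  # ignition point

        def step(grid, spread_prob=0.45):
            """Advance one time step: burning cells may ignite fueled neighbors."""
            nxt = [row[:] for row in grid]
            for r in range(SIZE):
                for c in range(SIZE):
                    if grid[r][c] == BURNING:
                        nxt[r][c] = BURNED
                        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                            rr, cc = r + dr, c + dc
                            if (0 <= rr < SIZE and 0 <= cc < SIZE
                                    and grid[rr][cc] == FUEL
                                    and random.random() < spread_prob):
                                nxt[rr][cc] = BURNING
            return nxt

        for hour in range(4):  # render the spread map at four time steps
            print(f"t = {hour} h")
            print("\n".join("".join(row) for row in grid), end="\n\n")
            grid = step(grid)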

    During the course of its development, the Virtual Fire platform delivered some early successes in combating and even deterring wildfires. On July 8, 2009, an extremely dangerous wildfire broke out on Lesvos Island. The Virtual Fire system—still at an early stage, with only the fire-risk probability index and the weather forecasting and monitoring components operational—provided the fire service with a better grasp of local topography, details of current and imminent weather, and the high-risk prediction map. This resulted in a prompt initial response that prevented the fire from growing out of control and encroaching on nearby sensitive ecological preserves and a military base camp. Virtual Fire successfully predicted the fire risk for the particular area where the event took place, which led to its adoption as a preferred fire-risk prediction tool in 2010.

    During the 2010 fire season (April to October), no serious fires broke out on Lesvos Island, in contrast to other Greek islands such as Samos. Almost all fire events were promptly contained: fires were not allowed to grow large and yielded to initial suppression efforts. Evidence currently under investigation suggests that Virtual Fire played an important role in these improved results, giving the local fire service valuable decision-support information to combine with its own considerable operational experience and knowledge.

    Coordinating Prefecture Board of Lesvos, Mytilene, in Greece

    The results of the Virtual Fire initiative were presented July 6, 2010, at the Coordinating Prefecture Board of Lesvos, Mytilene, in Greece. Event attendees included the prefect and counsellors of Lesvos Prefecture, mayors and representatives of the Municipalities of Lesvos Island, heads of Civil Protection, officers and fire fighters of the North Aegean and Mytilene Fire Services, staff of Lesvos Forest Service, commanders and officers of military and public service authorities, representatives of social services and fire-fighting volunteer organizations of Lesvos Island, and the partners of the project from the University of the Aegean, University of Athens, Microsoft Research, Microsoft Hellas, and Microsoft Innovation Center—Greece. For more information, read the press release.

    —Scarlet Schwiderski-Grosche, Research Program Manager, External Research division of Microsoft Research, Cambridge

  • Microsoft Research Connections Blog

    Bioinformatics Tools Promote Life-Saving Research

    • 0 Comments

    On November 30, I appeared on Health Tech Today, where I chatted with Dr. Bill Crounse about the Microsoft Biology Foundation and how it will help scientists advance their research. This interview marks yet another opportunity for Microsoft External Research to spread the word about our open-source work in developing tools that, as Dr. Crounse noted, provide researchers with "the potential to discover amazing things and solve big problems." 

    Health Tech Today Video: Beatriz Diaz Acosta

    The heart of the interview was a discussion of the overarching goal of the Microsoft Biology Foundation, which is to develop a set of tools that enable researchers to more easily and effectively collaborate and thereby expedite new discoveries. I explained how researchers around the world have come up with their own data formats—their own unique standards on how to encode, share, and work with data. They have developed all these individual languages, and now this profusion of tongues is impeding collaboration. In essence, we have a scientific cacophony, with researchers speaking different languages through the data.

    Through the Microsoft Biology Foundation, we're providing the building blocks to translate these many data files and formats into a common language. In addition to these file parsers, we also provide standard algorithms to assemble and align genetic sequence data, thereby obviating the need for each researcher to create his or her own unique version. Through these efforts, we're relieving researchers of onerous translation chores and allowing them to focus on solving real problems, like designing new drugs and developing new vaccines, which will ultimately save lives.
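
    To make the "common language" idea concrete, here is a minimal sketch of two parsers that read different on-disk formats into one shared record type that downstream algorithms can consume. The sketch is in Python and the tabular format is invented for illustration; the Microsoft Biology Foundation itself is a .NET library with its own parser and algorithm APIs.

        # Minimal sketch of the "common language" idea: different on-disk
        # formats are parsed into one shared in-memory record type.
        # Illustrative only; not the Microsoft Biology Foundation API.
        from dataclasses import dataclass

        @dataclass
        class SequenceRecord:
            """Common representation shared by all parsers."""
            identifier: str
            sequence: str

        def parse_fasta(text: str) -> list:
            """Parse FASTA: '>' header lines followed by sequence lines."""
            records, header, chunks = [], None, []
            for line in text.splitlines():
                if line.startswith(">"):
                    if header is not None:
                        records.append(SequenceRecord(header, "".join(chunks)))
                    header, chunks = line[1:].split()[0], []
                elif line.strip():
                    chunks.append(line.strip())
            if header is not None:
                records.append(SequenceRecord(header, "".join(chunks)))
            return records

        def parse_tabular(text: str) -> list:
            """Parse a hypothetical tab-separated 'id<TAB>sequence' format."""
            return [SequenceRecord(*line.split("\t", 1))
                    for line in text.splitlines() if line.strip()]

        # Both formats yield identical records, so assembly, alignment, and
        # visualization code no longer cares about the original file format.
        fasta = ">seq1 example\nACGTACGT\nACGT\n"
        tab = "seq1\tACGTACGTACGT\n"
        assert parse_fasta(fasta)[0].sequence == parse_tabular(tab)[0].sequence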

    As I discussed with Dr. Crounse, we're doing this by building an extensive and expandable library on the Microsoft .NET Framework, a technology that gives us the advantage of being interoperable with many different programming languages. I pointed out our goal of using tools like Deep Zoom and Pivot to facilitate interactive visualization of massive quantities of data, such as the billions of base pairs found in the human genome. These new technologies have the potential to make it easier and more intuitive to identify patterns and spot outliers in the data. Such discoveries could lead to breakthroughs against some of the most feared diseases, including AIDS. I noted, in fact, how we are using the Microsoft Biology Foundation library to support David Heckerman's work in the eScience team at Microsoft Research, where they are working diligently to create a vaccine against HIV.

    I finished my interview with an appeal for medical researchers and biologists to visit www.research.microsoft.com/bio, where they can learn more about the Microsoft Biology Initiative and its open-source tools. As I noted, we need feedback from those working on the frontlines of medicine and biology in order to improve our tools and move forward in our quest to improve human health.

    —Beatriz Diaz Acosta, Senior Research Program Manager, Health and Wellbeing, the External Research division of Microsoft Research


  • Microsoft Research Connections Blog

    Multicore Workshop Attendees Work to Integrate Software and Hardware for Optimal Performance and New Applications

    • 0 Comments

    Attendees at the Second Barcelona Multicore Workshop

    The latest innovations in multicore technology are meaningless if the software you run is not written to take advantage of the advanced hardware design. To help address this and other issues, attendees at the Second Barcelona Multicore Workshop (BMW) met October 21-22, 2010, to critically examine developments in computer chip technology in the two years since the highly successful 2008 workshop.

    Today, sequential chips have been almost entirely superseded by multicore processors. The hardware community is focused on designing these processors to maximize potential performance. Meanwhile, software developers need to know how best to program machines that use this multicore technology, particularly when it is used for desktop workloads or on mobile devices rather than traditional scientific applications.

    To help understand and address these concerns in a multidisciplinary manner, representatives from the Barcelona Supercomputing Center, HiPEAC, and Microsoft Research, along with academics and researchers from Europe, Asia, and the United States, met to cross-fertilize ideas across the hardware and software communities. Many participants reported that the workshop sparked new plans for collaboration, including company partnerships with academia and the sharing of valuable tools and ideas.

    Among the key discussions were:

    • Parallel programming models for the Barrelfish research operating system that Microsoft Research has developed with ETH Zurich in Switzerland. Barrelfish treats the internals of a multiprocessor machine as a distributed system: each core runs independently, and the cores communicate via message passing (see the sketch after this list). Project leaders are working with the Barcelona Supercomputing Center to use its experience with the StarSs programming model to write parallel programs that run on Barrelfish.
    • How developers are using low-power vector processors to apply ideas originally developed for high-performance computing to applications such as face and speech recognition, machine learning, and column-store databases that might run in the cloud or on future mobile devices.
    • Panelists in the session "Can Software Keep Up with the Pace of Hardware Development?" discussed how to prepare the software industry for multicore and heterogeneous hardware trends. One major discussion explored whether processor designers could help by focusing less on specific applications and single-use benchmarks and more on the operating system and the need for hardware to efficiently support many different processes on the machine at the same time. "There is a growing sense among those who do research into system software that computer architects—those who design processors and other system components—need to change their focus," reports Timothy Roscoe of ETH Zurich. "This is partly because many commercially important workloads are now OS-intensive, and some current processor designs incur a high overhead when switching to kernel mode, and partly because as chips become more parallel, the need to coordinate multiple tasks and communicate between multiple applications on cores becomes a key performance bottleneck."
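
    Barrelfish's multikernel design treats the cores of a machine as nodes in a distributed system: they share no state by default and coordinate through explicit messages. The sketch below mimics that structure with operating-system processes and pipes in Python; it is an analogy to the model, not Barrelfish code.

        # Analogy to the multikernel model: independent workers (standing in
        # for per-core kernels) share no state and coordinate only via
        # messages. A Python sketch of the structure, not Barrelfish code.
        from multiprocessing import Pipe, Process

        def core_kernel(core_id, conn):
            """Each 'core' runs its own loop and replies to incoming messages."""
            while True:
                msg = conn.recv()              # explicit message passing,
                if msg == "shutdown":          # never shared memory
                    conn.close()
                    return
                op, payload = msg
                if op == "square":
                    conn.send((core_id, payload * payload))

        if __name__ == "__main__":
            # "Boot" four independent cores, each with its own message channel.
            channels = []
            for core_id in range(4):
                parent_end, child_end = Pipe()
                Process(target=core_kernel, args=(core_id, child_end)).start()
                channels.append(parent_end)

            # Distribute work purely by message passing and collect replies.
            for core_id, conn in enumerate(channels):
                conn.send(("square", core_id + 1))
            for conn in channels:
                print("reply from core %d: %d" % conn.recv())
                conn.send("shutdown")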

    —Tim Harris, Senior Researcher, System and Networking Group at Microsoft Research Cambridge
