Dan on eScience & Technical Computing @ Microsoft

eScience & Technical Computing - Web Services and Scientific Research

  • Dan on eScience & Technical Computing @ Microsoft

    3D Journal on TabletPC

    • 1 Comments

    A pretty cool research project – the 3D Journal Project from Cornell's Computational Synthesis Lab – demonstrating live 3D sketching on the Tablet PC.

  • Dan on eScience & Technical Computing @ Microsoft

    New Workshop - Introduction to Web Services for Clusters Workshop

    • 1 Comments

    The Cornell Theory Center is adding Web Services training to its current Windows HPC cluster training – February 10-11, 2005, at the CTC-Manhattan training facility.

    Web services enable certain classes of high-performance computing (HPC) applications, specifically those that are very loosely-coupled, to distribute computation and data from a desktop or mobile device to remote servers or "workers."

    Workshop topics include:

    • Setting up and running Network Load Balancing (NLB)
    • Writing and installing a Web Service and client
    • Adding an Excel front-end to a Web Service

    After attending the "Introduction to .NET and Web Services Technical Training Workshop," technologists will be able to design and deploy an integrated solution using Microsoft .NET.

    For technologists who are interested in high-performance computing clusters for tightly coupled applications (Message Passing Interface, or MPI-based, clusters), CTC is offering a "Windows High-Performance Computing Technical Training" workshop at the same location on February 8-9.

    These workshops are for representatives of companies, universities, and government agencies who want to learn more about implementing and using high-performance computing on Windows-based clusters.
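
    To make the loosely-coupled pattern described above concrete, here is a minimal sketch of a desktop client fanning independent work items out to remote HTTP "workers" and collecting the results. It is written in Python rather than the .NET stack the workshop covers, and the service URL and payload shape are hypothetical.

    ```python
    # Minimal sketch (not from the workshop materials) of the loosely-coupled
    # pattern: a desktop client posts independent work items to remote HTTP
    # "workers" and gathers the partial results. Endpoint and payload are
    # hypothetical.
    import json
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    WORKER_URL = "http://cluster.example.org/MonteCarloService/run"  # hypothetical

    def run_remote(work_item: dict) -> dict:
        """POST one independent work item to a worker and return its JSON result."""
        body = json.dumps(work_item).encode("utf-8")
        req = urllib.request.Request(
            WORKER_URL, data=body, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    # Because the tasks share no state, they can be dispatched concurrently
    # from a desktop (or mobile) front end.
    work_items = [{"seed": s, "iterations": 100_000} for s in range(32)]
    with ThreadPoolExecutor(max_workers=8) as pool:
        results = list(pool.map(run_remote, work_items))
    print(f"collected {len(results)} partial results")
    ```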

  • Dan on eScience & Technical Computing @ Microsoft

    Free E-Learning Online Training for SQL and VS

    • 1 Comments

    Great opportunity for eScience folks looking to get updated on SQL Server 2005 and Visual Studio 2005.

    Free* in-depth online training on the newest features of Visual Studio 2005 and SQL Server 2005, with hands-on virtual labs and offline functionality.

    * Microsoft E-Learning for Visual Studio 2005 is free until November 8, 2005. Microsoft E-Learning for SQL Server 2005 is free until November 1, 2006. Internet connection time charges may apply.   

    • Visual Studio 2005 Learning Resources: http://lab.msdn.microsoft.com/vs2005/learning

    • SQL Server 2005 Learning Resources: http://www.microsoft.com/technet/sql/learning/

  • Dan on eScience & Technical Computing @ Microsoft

    Windows Bioinformatics Applications Server - Cornell Theory Center

    • 1 Comments

    The Cornell Theory Center hosts a list of bioinformatics-related software programs that run on Windows, with source code available via Visual Studio .NET 2003.

    http://www.tc.cornell.edu/wba

  • Dan on eScience & Technical Computing @ Microsoft

    Lang .NET 2006 Symposium (July 31-Aug 2)

    • 1 Comments

    Just ran across the Lang .NET 2006 Symposium - it's a forum for discussion of programming languages, managed execution environments, compilers, multi-language libraries, and integrated development environments. The conference will be held on the Microsoft campus, July 31 through Aug. 2.

  • Dan on eScience & Technical Computing @ Microsoft

    MSR eScience Workshop Agenda

    • 1 Comments

    The agenda for the MSR eScience Workshop is now available. Along with all the fabulous talks and speakers, we are excited to have Tony Hey (VP of TCI) speaking and Jim Gray (MSR BARC) participating. For all you scientists and researchers interested in attending - don't forget to register.

  • Dan on eScience & Technical Computing @ Microsoft

    The BioTeam Delivers Informatics Solution on Microsoft CCS

    • 1 Comments

    This is great news for the bio researchers...info at the iNquiry Bioinformatics Portal for Microsoft Compute Cluster Server - they also mention "The Scientific Desktop"

    "The scientific desktop" aims to build a bridge between common computing tools and everyday scientific computing tasks. One example is an Excel add-in that enables researchers to launch Blast queries directly from Excel and then get results back into their spreadsheet application.

    HPCWire: The BioTeam Delivers Informatics Solution on Microsoft CCS

    The BioTeam, a consulting collective that delivers informatics solutions to the life science industry, has announced that it is releasing its iNquiry software on Microsoft's new Windows Compute Cluster Server 2003.

    The challenge is that many informatics problems are data-intensive and require high computational power to solve. However, the individuals who work on these problems are not always experts in scalable computing. Science departments and other groups that are getting into high-performance computing for the first time need a platform that is powerful, easy to use, and cost-effective. The installation, customization, and ongoing support of a scalable commodity cluster have traditionally presented a formidable challenge to busy scientists and limited budgets.

    BioTeam's iNquiry software platform enables the rapid deployment of a ready-to-use cluster and web portal for use in life science informatics settings. It comes preconfigured with many open source scientific applications and can be extended to support additional commercial, open-source or internally developed applications.

    Microsoft Windows Compute Cluster Server (CCS) 2003 is a new HPC operating system specifically designed for group and departmental-level deployment.

    "Windows Computer Cluster Server 2003 has everything needed to quickly deploy a Windows-based cluster," said Michael Athanas, founding partner of The BioTeam. "The combination of Microsoft's product and iNquiry will be a solid and compelling compute solution platform for researchers and scientists."

    Source: The BioTeam Delivers Informatics Solution on Microsoft CCS

  • Dan on eScience & Technical Computing @ Microsoft

    Data Mining Addins for Office 2007 (Excel & Visio)

    • 1 Comments

    I think of the Data Mining Add-ins for Excel 2007 as data mining tools for the common man - they let anyone use SQL Server Analysis Services without having to know how to program the back end. They allow you to look for Key Influencers, Detect Categories, Highlight Exceptions, and so on. This can be a real benefit for scientists looking to do things like data cleaning directly from Excel.

  • Dan on eScience & Technical Computing @ Microsoft

    SecPAL Preview Release for Microsoft .NET

    • 1 Comments

    Grid folks and security researchers should be interested in the SecPAL preview release - the goal of the SecPAL project is to develop a language for expressing decentralized authorization policies, and to investigate language design and semantics, as well as related algorithms and analysis techniques.

    SecPAL Preview Release for Microsoft .NET

    The Security Policy Assertion Language (SecPAL) provides a flexible and robust declarative authorization language developed for large-scale Grid Computing Environments. This installable MSI includes a preview release of the .NET implementation of SecPAL, a developer document describing the SecPAL programming model, and scenario-based samples intended to support evaluation of SecPAL.

    Source: SecPAL Preview Release for Microsoft .NET

  • Dan on eScience & Technical Computing @ Microsoft

    [Data Service] Microsoft Project Codename "Astoria"

    • 1 Comments

    Time to think about how scientific data could utilize the Astoria service...

    The goal of Microsoft Codename Astoria is to enable applications to expose data as a data service that can be consumed by web clients within a corporate network and across the internet. The data service is reachable over HTTP, and URIs are used to identify the various pieces of information available through the service. Interactions with the data service happen in terms of HTTP verbs such as GET, POST, PUT and DELETE, and the data exchanged in those interactions is represented in simple formats such as XML and JSON.
    We are delivering this first early release of Astoria as a Community Tech Preview you can download and also as an experimental online service you can access over the internet.

    Source: Microsoft Project Codename "Astoria"
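
    For a feel of what consuming such a service might look like from a scientific application, here is a minimal sketch of the HTTP interaction pattern described above. The service address and entity names are hypothetical (borrowed from the familiar Northwind sample), not a live Astoria endpoint.

    ```python
    # Illustrative only: a plausible Astoria-style address, not a real endpoint.
    # The point is the interaction model: resources named by URIs, manipulated
    # with plain HTTP verbs, returned as XML or JSON.
    import json
    import urllib.request

    BASE = "http://example.org/northwind.svc"  # hypothetical Astoria data service

    # GET a single entity, asking for JSON instead of the default XML
    req = urllib.request.Request(
        f"{BASE}/Customers('ALFKI')", headers={"Accept": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        customer = json.load(resp)
    print(customer)

    # Create a new entity with POST; PUT and DELETE follow the same pattern
    new_order = json.dumps({"OrderID": 1, "CustomerID": "ALFKI"}).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE}/Orders", data=new_order,
        headers={"Content-Type": "application/json"}, method="POST"
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status)
    ```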

  • Dan on eScience & Technical Computing @ Microsoft

    Family.Show from Vertigo

    • 1 Comments

    Every once in a while you run across a really neat application that uses the latest technologies...while providing a really compelling user experience, and in this case it's mapping your family tree. The experience is so intuitive - the one feature I'd like is to be able to grab my family contacts from Outlook and drop them directly into the app - that would probably make building up these relationships even easier.

    The other good news is that it's a ClickOnce install, it even runs in Firefox, and the source code is available.

    I see Dead People, with Windows Presentation Foundation

    For a hobby that revolves around dead people, genealogy is remarkably popular: it's the fastest-growing scene in North America. And a perfect study for Vertigo's next Windows Presentation Foundation (WPF) reference application for Microsoft.

    Our designers employed every trick in the WPF book – styles, resources, templates, data binding, animation, transforms – to present an innovative visualization of the classic family tree, freeing our developers to concentrate on behind-the-scenes features like XPS, a P/Invoke wrapper for Windows Vista common dialogs, and ClickOnce for WPF.

    Source: Vertigo: Family.Show

  • Dan on eScience & Technical Computing @ Microsoft

    Silverlight 1.1 2D Physics

    • 1 Comments

    Very cool demo of Silverlight and a 2D physics engine...

    Silverlight 1.1 2D Physics


    Are you tired of physics demos yet?  I hope not, because I've just made another one. It's using the Managed Bullet Physics library I ported to WPF, tweaked slightly to run with Silverlight 1.1. This might be the world's first cross-platform physics demo! (umm, except Flade of course, and probably a million others I'm unaware of).

    Source: Silverlight 1.1 2D Physics « Chris Cavanagh’s Blog

  • Dan on eScience & Technical Computing @ Microsoft

    Delivering End-to-End High-Productivity Computing

    • 1 Comments

    Here's a very good paper outlining the work by Marc Holmes (MTC) and Simon Cox on producing an end-to-end solution using WinCCS (HPC) and many other technologies - it would be great to get feedback from others on whether they have set up their WinCCS cluster this way, or on other architectures they are using for scientific research.

    Here are the technologies they have incorporated -

    Delivering End-to-End High-Productivity Computing

    by Marc Holmes and Simon Cox

    Summary: Performing a complex computational science and engineering calculation today is more than about just buying a big supercomputer. Although HPC traditionally stands for "high-performance computing," we believe that the real end-to-end solution should be about "high-productivity computing." What we mean by "high-productivity computing" is the whole computational and data-handling infrastructure, as well as the tools, technologies, and platforms required to coordinate, execute, and monitor such a calculation end-to-end.

    Many challenges are associated with delivering a general high-productivity computing (HPC) solution for engineering and scientific domain problems. In this article, we discuss these challenges based on the typical requirements of such problems, propose various solutions, and demonstrate how they have been deployed to users in a specific end-to-end environmental-science exemplar. Our general technical solution will potentially translate to any solution requiring controlling and interface layers for a distributed service-oriented HPC service.

    <...>

    Conclusion

    Architecting for high-productivity computing is not just a case of ensuring the "best" performance in order to compute results as quickly as possible; that is more of an expectation than a design feature. In the context of the overall value stream, the architecture must drive value from other areas, such as ease of access and decreasing cost of specialist skills to operate the system.

    A successful architecture for high-productivity computing solutions involves consideration of the overall process alongside the computationally intensive activities, and, therefore, might use several integrated components to perform the individual aspects of the process. Microsoft Cluster Compute Server Edition is easy to include as a service gateway inside a general n-tier application structure, and is simple to integrate via command-line or API hooks.

    Other available technologies can provide the basis for an HPC solution. In particular, Windows Workflow Foundation is well-suited to provide an application interface to CCS, because the features and extensibility of WF, such as persistence and tracking, lend themselves to the requirements of HPC-based solutions. The use of WF also opens up the available choices of user-experience technologies to be applied in a given domain.

    Source: Delivering End-to-End High-Productivity Computing
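
    The conclusion's point about command-line hooks is easy to picture with a small sketch: a thin wrapper that a Windows Workflow Foundation activity (or any service layer) could call to hand work to the cluster scheduler. The executable name and flags below are assumptions for illustration, not the documented CCS interface.

    ```python
    # Sketch of the "command-line hook" integration style the paper's conclusion
    # mentions. The scheduler command name and its flags are assumptions made for
    # illustration only.
    import subprocess

    def submit_job(command: str, processors: int = 4) -> str:
        """Submit a command to the cluster scheduler and return its raw output."""
        result = subprocess.run(
            ["job", "submit", f"/numprocessors:{processors}", *command.split()],
            capture_output=True, text=True, check=True,
        )
        return result.stdout  # typically echoes the new job's identifier

    print(submit_job("mpiexec mymodel.exe input.dat", processors=16))
    ```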

  • Dan on eScience & Technical Computing @ Microsoft

    SecPAL v1.1 Now Available

    • 1 Comments

    Just got a note from Jason that SecPAL 1.1 is available - this is great - take a look at the CodePlex site for more details. Jason, Blair, and team, you're doing great things...let's see if the community recognizes what you've been up to...

    Summary of New Features in the SecPAL Research Release v1.1

    SecPAL v1.1 is a minor release of SecPAL that maintains compatibility with our first research release of SecPAL. Changes for v1.1 include:
    New / Upgraded Features
    • We have updated the SecPAL grammar to a new and much more readable grammar. I will post a longer explanation in the coming weeks, but in short, when you read a SecPAL policy, conditions will now be prefaced by an "IF" statement, and constraints will now be prefaced by a "WHERE" statement. These changes, along with improved readability of fact qualifiers, should make the English representation of your policies / assertions much simpler to read. Note: This change should have no impact on existing policies, as it only affects the output of policies / tokens when you call .ToString() on them.
    • The 'CanActAs' predicate can now be used as a conditional fact within an assertion.
    • No breaking changes were made to APIs, so any SecPAL-dependent code that you have written should behave the same.

    SecPAL v1.1 Now Available

    Just a quick note to let everyone know that we have just released a minor update to our SecPAL library. In addition to a couple of minor bug fixes there are two features which I think you are really going to like. The first is an update to our grammar - making it much clearer what conditions and constraints are. The second (which was actually a bug fix) is that our graphical proof graphs now work.

    Source: The Hogg Blog : SecPAL v1.1 Now Available

  • Dan on eScience & Technical Computing @ Microsoft

    Windows-Based Supercomputers Make Top 500 List

    • 1 Comments

    Great to see two Windows CCS machines on the new Top 500 list.

    Also look for the Microsoft Excel add-in used to drive all the LINPACK benchmarks - it should be released soon at http://windowshpc.net

    Windows-Based Supercomputers Make Top 500 List

    Illustrating the ability of Windows to deliver productivity with scalability and performance, Windows Compute Cluster Server 2003 appeared on the computing industry’s semiannual top 500 list of the world’s most powerful supercomputers, released earlier today.

    Windows Compute Cluster Server 2003 served as the underlying operating system for a new HPC cluster at Mitsubishi UFJ Securities, which is a part of one of the largest financial services institutions in Japan. Its expanding derivatives business will leverage a Windows-based clustering to enhance risk-management practices and reduce simulation times. Mitsubishi UFJ Securities chose the Microsoft Windows platform because of the power, familiarity, and ease-of-development in Microsoft Visual Studio® 2005 and Visual C++®. Its top 500 benchmark on Windows Compute Cluster Server was run on a 448-node IBM BladeCenter HS21 cluster with 1,760 processors, and placed at 193 on the top 500 list. The benchmark result of 6.52 trillion computations per second (teraflops) demonstrates the power of the advanced simulation environment that will enable the pricing, risk management and product development of its derivatives offering. The result marks one of the top supercomputers for the financial services industry.

    Windows Compute Cluster Server also served as the underlying operating system for a new HPC cluster at Microsoft’s datacenter in Tukwila, Wash., which ranked 106 in the top 500.

    This system achieved 8.99 teraflops on 256 compute nodes and 2,048 processing cores of 64-bit Intel Xeon 5300 quad-core processors, powering Dell PowerEdge 1955 blade servers and Cisco infiniband switches.

    To achieve this result, Microsoft ran the benchmark over Windows Compute Cluster Server using a Microsoft Excel 2007 add-in created specifically to drive all the parameters and results required of the top 500 LINPACK benchmark. The Excel 2007 workbook contains all the necessary data needed to perform the LINPACK benchmark. This Excel add-in will be available within 90 days at http://windowshpc.net/default.aspx.

    Source: Microsoft Technical Fellow Keynotes on the Reinvention of Computing: Microsoft’s high-performance computing business gains adoption across industries.
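
    For readers wondering how the teraflops figures above relate to the benchmark itself: HPL, the Top 500 LINPACK implementation, solves a dense n-by-n linear system and credits the run with 2/3·n³ + 2·n² floating-point operations. The sketch below turns a problem size and a wall-clock time into a teraflops figure; the inputs are made up for illustration, not taken from the published runs.

    ```python
    # LINPACK/HPL credits a run with 2/3*n**3 + 2*n**2 floating-point operations
    # for a dense n-by-n system. Sustained performance is that count divided by
    # elapsed time. The n and seconds below are illustrative, not the real runs.
    def linpack_teraflops(n: int, seconds: float) -> float:
        flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
        return flops / seconds / 1e12

    print(f"{linpack_teraflops(n=400_000, seconds=5000):.2f} TFLOPS")
    ```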

  • Dan on eScience & Technical Computing @ Microsoft

    Wikipedia Explorer beta

    • 1 Comments

    I saw this post about the Wikipedia Explorer using WPF on Steve Clayton's blog - it's really easy to use - I like the network mode, seeing all the relevant links - I also like how it adds columns to the page as you maximize the window...I see this as a model for how scientific data/papers should be viewed.

    "Using the latest WPF technologies, Dot Net Solutions has crafted an application to browse Wikipedia which we have dubbed Wikipedia Explorer. Compared to the standard text only view of articles, Wikipedia Explorer deals with and displays the relationships between the articles.
    With the display of the data, the application allows 3 forms of view. An initial Document layout displays the article's content as it would be displayed in Wikipedia itself. The real value of the application however, is in the extra 3DExplorer and Network view modes.

    Within the 3DExplorer mode, the main article is displayed in the centre of the screen with all linked articles shown around in a helix structure for quick navigation. Scrolling through the articles is as easy as scrolling with your mouse wheel."

    What you get is a VERY powerful visualization of the flat but hypertexted Wikipedia. The 3D explorer is funky but the Network mode is just awesome. Put in a search term, switch to Network mode and watch the app build out the web of links before your very eyes. I think Tim has done a stunning job here but you can check for yourself, as Wikipedia Explorer can be run as a ClickOnce application (note that you do need to install the .NET 3.0 redistributable package).

    Source: Steve Clayton: Geek In Disguise : Wikipedia Explorer beta

  • Dan on eScience & Technical Computing @ Microsoft

    Technology Review: The Year in Infotech and MSR SenseWeb

    • 1 Comments

    It's great to see the MSR SenseWeb project picked as one of 2006's most significant advances in information technology by Technology Review.  Check out the SensorMap and you can see the JHU Life Under Your Feet project sensors on the map.

    Geotagging. GPS is becoming a more common feature in mobile phones, cameras, and cars. The result is a world of people, pictures, cars, and data trails on maps. A Microsoft research project aggregates disparate sensor data to map the world in real time.

    Source: Technology Review: The Year in Infotech

  • Dan on eScience & Technical Computing @ Microsoft

    Windows Server 2003 Compute Cluster Solution: Beta 1 Available Now

    • 1 Comments

    Beta 1 of the Windows Compute Cluster Solution was announced today - www.microsoft.com/hpc.  You can nominate yourself for the beta at http://www.microsoft.com/windowsserver2003/hpc/beta.mspx

  • Dan on eScience & Technical Computing @ Microsoft

    Scripting for Compute Cluster Server

    • 1 Comments

    Check out the new site on the TechNet Script Center, Scripting for Compute Cluster Server at http://www.microsoft.com/technet/scriptcenter/hubs/ccs.mspx.

  • Dan on eScience & Technical Computing @ Microsoft

    Word 2007 - Academic features: citation & bibliography tools

    • 1 Comments

    Here’s a detailed description of what citation and bibliography features are available in Word 2007.

    Academic features: citation & bibliography tools

  • Dan on eScience & Technical Computing @ Microsoft

    [Article] Next-Generation Data Access: Making the Conceptual Level Real

    • 1 Comments

    Interesting article on MSDN - Next-Generation Data Access: Making the Conceptual Level Real

    Eliminate the impedance mismatch for both applications and data services like reporting, analysis, and replication offered as part of the SQL Server product by raising the level of abstraction from the logical (relational) level to the conceptual (entity) level.
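
    The article is about the ADO.NET Entity Data Model, but the gap it describes is language-neutral. As a rough analogue (mine, not the article's API): at the logical level an application sees rows and joins; at the conceptual level it sees entities and relationships.

    ```python
    # Language-neutral analogue of the "impedance mismatch" the article discusses,
    # not the ADO.NET API. Logical level: rows and joins. Conceptual level: objects
    # whose relationships are part of the model.
    import sqlite3
    from dataclasses import dataclass

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE orders   (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
        INSERT INTO customer VALUES (1, 'Contoso Labs');
        INSERT INTO orders   VALUES (10, 1, 99.5);
    """)

    # Logical (relational) level: the join is the application's problem.
    rows = conn.execute(
        "SELECT o.id, o.total, c.name FROM orders o "
        "JOIN customer c ON c.id = o.customer_id"
    ).fetchall()

    # Conceptual (entity) level: the relationship is part of the model.
    @dataclass
    class Order:
        id: int
        total: float
        customer_name: str  # navigation a mapping layer would supply automatically

    orders = [Order(*row) for row in rows]
    print(orders[0].customer_name)  # application code now speaks in entities
    ```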

  • Dan on eScience & Technical Computing @ Microsoft

    [Papers] Supporting Finite Element Analysis with a Relational Database Backend; There is Life beyond Files

    • 1 Comments

    In a discussion I had today about ways to advance a scientific problem, I was reminded of Jim Gray and Gerd Heber's trilogy, Supporting Finite Element Analysis with a Relational Database Backend.  The three papers are a really good resource for understanding how databases can be used to address scientific challenges.

    Part I: There is Life beyond Files

    We show how to use a Relational Database Management System in support of Finite Element Analysis. We believe it is a new way of thinking about data management in well-understood applications to prepare them for two major challenges - size and integration (globalization). Neither extreme size nor integration (with other applications over the Web) was a design concern 30 years ago when the paradigm for FEA implementation first was formed. On the other hand, database technology has come a long way since its inception and it is past time to highlight its usefulness to the field of scientific computing and computer-based engineering. This series aims to widen the list of applications for database designers and for FEA users and application developers to reap some of the benefits of database development.

    Part II: Database Design and Access

    This is Part II of three articles on using databases for Finite Element Analysis (FEA). It discusses (1) database design, (2) data loading, (3) typical use cases during grid building, (4) typical use cases during simulation (get and put), (5) typical use cases during analysis (also covered in Part III), and some performance measures of these cases. It argues that using a database is simpler to implement than custom data schemas, has better performance because it can use data parallelism, and better supports FEA modularity and tool evolution because of database schema evolution, data independence, and self-defining data.

    Part III: OpenDX – Where the Numbers Come Alive

    In this report, we show a unified visualization and data analysis approach to Finite Element Analysis. The example application is visualization of 3D models of (metallic) polycrystals. Our solution combines a mature, general purpose, rapid-prototyping visualization tool, OpenDX (formerly known as IBM Visualization Data Explorer) [1,2], with an enterprise-class relational database management system, Microsoft SQL Server [3]. Substantial progress can be made with established off-the-shelf technologies. This approach certainly has its limits and we point out some of the shortcomings which require more innovative products for visualization, data-, and knowledge management. But, overall, the approach is a substantial improvement in the FEA lifecycle, and probably will work for other data-intensive sciences wanting to visualize and analyze massive simulation or measurement datasets.
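
    The trilogy's argument is easiest to appreciate with a toy example. The sketch below is mine, not from the papers: a minimal relational layout for a mesh plus nodal results, and the kind of ad hoc question that becomes a one-line query once the data lives in tables instead of custom files.

    ```python
    # Toy illustration (mine, not from the papers) of the "life beyond files" idea:
    # keep the finite-element mesh and results in relational tables so standard
    # queries replace ad hoc parsing code. Schema and values are made up.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE node    (node_id INTEGER PRIMARY KEY, x REAL, y REAL, z REAL);
        CREATE TABLE element (elem_id INTEGER PRIMARY KEY, n1 INTEGER, n2 INTEGER,
                              n3 INTEGER, n4 INTEGER);          -- tetrahedra
        CREATE TABLE result  (node_id INTEGER, step INTEGER, temperature REAL);
    """)
    db.executemany("INSERT INTO node VALUES (?,?,?,?)",
                   [(1, 0, 0, 0), (2, 1, 0, 0), (3, 0, 1, 0), (4, 0, 0, 1)])
    db.execute("INSERT INTO element VALUES (1, 1, 2, 3, 4)")
    db.executemany("INSERT INTO result VALUES (?,?,?)",
                   [(1, 0, 20.0), (2, 0, 24.5), (3, 0, 22.1), (4, 0, 30.2)])

    # "Which nodes exceeded 23 degrees at step 0?" -- a one-line query instead of
    # a file-format-specific scan.
    hot = db.execute(
        "SELECT node_id, temperature FROM result WHERE step = 0 AND temperature > 23"
    ).fetchall()
    print(hot)
    ```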

  • Dan on eScience & Technical Computing @ Microsoft

    Create Your Own E-Learning

    • 1 Comments

    Ran across the release of the Learning Content Development System (LCDS).  It would seem to be a good way to package up scientific demos/simulations and get them out in short-course form...

    What is the LCDS?

    The Learning Content Development System (LCDS) is a tool that enables you to create high quality, interactive, online courses. Virtually anyone can publish e-learning courses by completing the easy-to-use LCDS forms that seamlessly generate highly customized content, interactivities, quizzes, games, and assessments—as well as Silverlight-based animations, demos, and other multimedia. Register to download the free LCDS release, then start creating your own e-learning courses today!

    Register to download the free LCDS tool

    What does the LCDS offer?

    With the LCDS, you can:

    • Develop and deliver content quickly, while it is timely and relevant.
    • Distribute your content via the Web or in a learning management system.
    • Deliver Web content that conforms to Sharable Content Object Reference Model (SCORM) 1.2, and which can be hosted in a learning management system.
    • Upload or attach your existing content. (LCDS supports multiple file formats.)
    • Choose from a wide variety of forms for authoring rich e-learning content.
    • Develop your course structure and easily rearrange it at any time.

    Create Your Own E-Learning

  • Dan on eScience & Technical Computing @ Microsoft

    Microsoft Releases Speech Recognition Macros for Windows Vista

    • 1 Comments

    With the combination of a Vista Tablet PC and these Speech Recognition Macros (thanks Blake), could that be the basis for an Electronic Lab Notebook (ELN)?  It would be interesting to see if having those different modes of interaction (text, touch, pen, speech, etc.) would be enough for scientists...

    The Windows Speech Recognition Macros tool (aka WSRMacros) extends the usefulness of the speech recognition capabilities already included in Windows Vista. Users can now create powerful macros that are triggered by spoken commands. These macros can perform a single task or a series of tasks. Macros can range from something as simple as inserting your mailing address to something as complex as providing a completely different speech interaction, utilizing a number of built-in capabilities or custom JScript/VBScript actions.

    The Windows Speech Recognition Macros tool (Technical Preview) extends the usefulness of the speech recognition capabilities in Windows Vista. Users can create powerful macros that are triggered by spoken commands which can perform a series of tasks from as simple as inserting your mailing address to as complex as providing a completely different speech interaction with applications. (Thanks Rob)

    The Road to Know Where: Microsoft Releases Speech Recognition Macros for Windows Vista

  • Dan on eScience & Technical Computing @ Microsoft

    Windows HPC - Computational Finance Pilot

    • 1 Comments

    The Windows HPC team has just made public their Computational Finance Pilot, where they are enabling execution of computational finance models for university courses - there is a good paper on the implementation.  It would be good to see if the same type of implementation could be used for offering up science-based services.  The pilot comprises the following components:

    • Computing Resources - A 64-node/256-core compute cluster with 5 TB of storage and a low-latency interconnect
    • Central Market Dataset - Historical market data, including 5 years of intraday equity tick data for the S&P 500, daily and fundamental data for 10,000 stocks, and mortgage-backed securities pool data
    • Microsoft® Office SharePoint® Server 2007 Web Portal - A Microsoft Office SharePoint Server 2007 portal to publish, browse and monitor models
    • Excel® HPC Task Pane - A Microsoft Excel 2007 user interface for model input and results
    • Model Execution Status Notifications - Model execution workflow with status email notifications

    Introducing the Microsoft HPC++ CompFin Lab

    The Microsoft HPC++ CompFin Lab integrates Microsoft HPC Server, a central market data database and Microsoft productivity products to provide university courses and research with an online service to publish, execute and manage computational finance models. Read more about it on our home page located here.

    Windows HPC Community - CompFin Pilot
