More case studies on .NET 4 parallelism support


When .NET 4 was launched, we blogged about several case studies published regarding usage of .NET 4 to parallelize applications.  Quite recently, several additional case studies have been published.  I love reading these in order to better understand how folks are applying this technology, and parallelism in general, to improve their solutions, to make their customers’ lives better, and to truly improve the world.  Here are a few… enjoy!

Institute Makes Strides Toward Solving Genetic Mysteries With Parallel Development
“Casa Sollievo della Sofferenza in San Giovanni Rotondo and its affiliate institute CSS-Mendel in Rome conduct vital genetic research to help further the fight against diseases such as neurodegenerative disease, diabetes, and cancer. The institute relies on bioinformatics solutions to aid its research projects by looking for discrepancies among DNA samples. To process more genetic data at greater speeds, the institute used Microsoft parallel development tools, such as the support for Parallel Programming in Microsoft .NET Framework 4, to create a set of software plug-ins for the Ocean Workbench, a bioinformatics platform designed to model, check, and simulate biological models. With that, researchers can spend their time more efficiently and experience a shorter time-to-result. As a result, they can push the boundaries of their work and have a positive effect on the lives of patients.”

Mining Services Company Implements Parallel Processing with Only Six Lines of Code
“SGS Geometallurgy, a division of SGS Canada, Inc., helps mining companies optimize plant design, forecast production, and minimize risk. FLEET™, the group’s old solution for simulating the flow of ore through a processing plant, required seven seconds per mine block regardless of how powerful the system running the application was. SGS used the new parallel processing tools in Microsoft Visual Studio 2010 and the .NET Framework 4 to build its Integrated Geometallurgical Simulator, or IGS™, which simulates 20 blocks per second per processor core—more than a million blocks per hour on a modern four-processor server. SGS adopted the new parallel processing tools in just a few days and can now easily parallelize its applications. Ease of parallelization has also enabled the company to simplify its code and deliver compelling new features, giving SGS a greater edge in winning new business.”
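A "six lines of code" figure is plausible because the TPL can turn a sequential loop into a parallel one with essentially a one-line change. Here is a minimal sketch of that kind of conversion; the SimulateBlock method and the ore-grade values are hypothetical stand-ins, not SGS's actual code:

```csharp
using System;
using System.Threading.Tasks;

class Simulator
{
    // Hypothetical stand-in for simulating one mine block.
    static double SimulateBlock(double oreGrade)
    {
        return Math.Sqrt(oreGrade) * 2.0;
    }

    static void Main()
    {
        double[] blocks = { 1.0, 4.0, 9.0 };
        var results = new double[blocks.Length];

        // Sequential version:
        // for (int i = 0; i < blocks.Length; i++) results[i] = SimulateBlock(blocks[i]);

        // Parallel version: the TPL distributes iterations across all available cores.
        Parallel.For(0, blocks.Length, i => results[i] = SimulateBlock(blocks[i]));

        Console.WriteLine(string.Join(",", results)); // 2,4,6
    }
}
```

Writing into results[i] from concurrent iterations is safe here because each iteration touches a distinct index.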

Digital Field Solutions
Hosted Solution Provider Uses Parallel Programming to Quadruple Server Performance
“Digital Field Solutions provides digital pen-and-paper solutions, which capture and process handwriting electronically to simplify data capture and automate forms processing. As the demand for its hosted service grew, the company had a choice: figure out how to get more performance out of existing hardware, or purchase an additional server. The company quadrupled the workload capacity of its server by modifying its code for parallel processing, thereby enabling the server’s workload to be distributed across all four processor cores instead of one. Digital Field Solutions was able to implement parallel processing quickly and easily, with minimal developer effort. The company’s customers are benefiting from fast and reliable service, and Digital Field Solutions saved an estimated £12,000 per year in hardware costs.”

Powerful Price Search Engine Driven by Task Parallel Library
“What differentiates the PriceSpider search engine is its real-time product and pricing information. Each day, the PriceSpider data crawl fetches dozens of terabytes of up-to-the-minute product data, including images, for hundreds of thousands of products. To retrieve and process all that information faster and more efficiently, PriceSpider took advantage of the parallel-programming capabilities provided in Microsoft Visual Studio 2010 Premium with Team Foundation Server and Microsoft .NET Framework 4, including the Task Parallel Library (TPL). By converting from its manual parallelization process to the more efficient TPL, PriceSpider minimized bottlenecks in the crawling process, saved time and energy, and ensured that customers have continuous access to up-to-the-minute product images and information.”
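The shift the case study describes, from hand-managed threads to the TPL, typically looks like the following sketch. FetchAndProcess and the URL list are hypothetical stand-ins; a real crawler would perform network I/O:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Crawler
{
    // Hypothetical stand-in for downloading and parsing one product page.
    static int FetchAndProcess(string url)
    {
        return url.Length; // pretend this is "bytes processed"
    }

    static void Main()
    {
        string[] urls = { "http://a", "http://bb", "http://ccc" };
        var results = new ConcurrentBag<int>();

        // Before: new Thread(...) per batch, hand-rolled work queues, explicit joins.
        // After: Parallel.ForEach partitions the input and sizes the worker pool itself.
        Parallel.ForEach(urls, url => results.Add(FetchAndProcess(url)));

        Console.WriteLine(results.Count); // 3
    }
}
```

ConcurrentBag<T> is used instead of List<T> because multiple iterations may add results at the same time.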

Massachusetts General Hospital
Researchers Reduce Processing Times by Factor of 20 to Improve Colon Cancer Screening
“To make colon cancer screening more broadly accessible, Massachusetts General Hospital (MGH) sought to reduce the processing time required to electronically cleanse and view a three-dimensional (3-D) model of the colon from an hour to five minutes. By working together with Microsoft, Intel, and Vectorform, MGH reduced processing times to about two minutes, achieving the performance required to support mass colon cancer screening. The screening approach pioneered by MGH can also drastically decrease costs and is far more patient-friendly, avoiding the many negatives of an optical colonoscopy. It also improves convenience for radiologists, who can navigate and interpret the 3-D images of a colon by using finger gestures on a touchscreen-enabled PC, and improves diagnostic accuracy through computer-aided identification of potential polyps.”

Leave a Comment
  • Hi

    Nice article about parallel programming with .NET.

    Thanks for the post.

  • Hi Stephen,

    We've just invested considerable effort in designing a dynamic Partitioner that could be very useful in the .NET Framework.

    Essentially, it partitions the work based on a rolling window/buffer of items, applying a uniqueness constraint to prevent clashing items from being processed in parallel.

    The constructor looks like this:

    UniqueConstraintPartitioner<TItem>(IEnumerable<TItem> dataSource, Func<TItem, object> uniqueConstraint, int maxBufferedItems)

    I've included a sample of how it's used below.

    Please email me and let me know if you're interested in the details. My email is mhano at the domain of deltalateral with the tld of .com.



    var baseFeederCollection = new BlockingCollection<Package>(1);

    // Start feeding work into the collection in the background.
    Task.Factory.StartNew(() =>
    {
        Parallel.ForEach(enumerableStreamOfPackages, baseFeederCollection.Add);
        baseFeederCollection.CompleteAdding();
    });

    // Create a partitioner to break up the work.
    var uniqueConstraintPartitioner = new UniqueConstraintPartitioner<Package>(
        baseFeederCollection.GetConsumingEnumerable(),
        package => package.CustomerKey,
        100 /* maxBufferedItems */);

    // Process all the work in parallel, letting the .NET task scheduler
    // manage the degree of parallelism automatically.
    Parallel.ForEach(uniqueConstraintPartitioner, package =>
    {
        // NOTE: do work here.
        Console.WriteLine("processing," + package.CustomerKey + "," + package.DocumentID);
    });

  • Hi,

    I find the new Async/Await CTP add-on extremely exciting. It resolves a basic problem very elegantly and closes the loop on a bunch of parallel problems where we all knew we could gain but never did, because the resulting code would have been unmaintainable.

    Now, I have a question for you. How does your team plan to address the WCF proxies generated by ChannelFactory? Those proxies do not contain Task-oriented methods, only synchronous service operation methods, so I do not see how we could obtain a task to async/await on.

    This problem is dear to me: we are currently designing a large SOA system, will be implementing it using WCF technologies in a couple of months, and will likely be able to pick up the RTW version of the Async/Await add-on in the middle of implementation.

    Could you comment on this issue?



  • Hi Vincent-Phillipe-

    I'm very glad you like the new async features.  Regarding the WCF proxies, yes, we do plan to add that support to Visual Studio.
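    In the meantime, one common bridge (a sketch of one possible interim approach, not the product plan) is to generate the proxy with the asynchronous Begin/End pattern enabled and wrap each pair with Task.Factory.FromAsync; the IOrderService contract below is hypothetical:

```csharp
using System;
using System.Threading.Tasks;

// Hypothetical service contract surface, as ChannelFactory produces
// when the contract is defined with AsyncPattern = true.
interface IOrderService
{
    IAsyncResult BeginGetStatus(int orderId, AsyncCallback callback, object state);
    string EndGetStatus(IAsyncResult result);
}

static class OrderServiceExtensions
{
    // Bridge the Begin/End pair into a Task<string> so callers can await it.
    public static Task<string> GetStatusAsync(this IOrderService proxy, int orderId)
    {
        return Task<string>.Factory.FromAsync(
            (callback, state) => proxy.BeginGetStatus(orderId, callback, state),
            proxy.EndGetStatus,
            null /* state */);
    }
}
```

    With a wrapper like this, an unmodified proxy can be consumed as `string status = await proxy.GetStatusAsync(42);`.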

  • Mhano, thanks for sharing info about your solution.  I'm glad to hear that the custom partitioning support was useful to you.

  • Excellent!  That rocks!


  • Thanks for sharing these case studies. We have also used the Task Parallel Library (TPL) and experienced similarly dramatic improvements in the performance of our Microsoft CRM data conversion tool. Our total runtime went from 14 hours down to 2 hours 30 minutes!  What is amazing is how easy it was to learn and apply. I think this technology has one of the best returns on investment of anything I have tried in years. This reduction in duration made a big difference in our ability to use and test our program.  For those interested in speeding up data loads of Microsoft CRM, here is my post summarizing the issues we faced.

  • This is great to see, David!  Thanks for taking the time to share your experiences and successes.
