Colin Thomsen's Microsoft Blog

I'm a developer working on the code profiler that ships with Visual Studio 2010 Premium and Ultimate editions. At a previous company I worked on computer vision software for face and gaze tracking.

  • Performance: Find Application Bottlenecks With Visual Studio Profiler


    If you're a subscriber to MSDN Magazine, take a look at the article in the March 2008 issue (Vol 23, No 4, page 81), which describes how to use the Visual Studio 2008 profiler to improve the performance of an application. A couple of members of the profiler team examine a Mandelbrot fractal drawing program in some detail. They isolate and fix several performance problems in the code, speeding up program execution approximately tenfold.

    UPDATE: You can read the article here.

  • Developer Dogfooding at Microsoft


    I hadn't heard the term 'dogfooding' much before I started here, but it has already been explained well elsewhere, so take a look here. The basic idea is that if you're not happy using your own product (i.e. eating your own dogfood), then why should you expect your customers to be? Working at Microsoft gives you incredible scope to dogfood a wide variety of products.

    As a Microsoft employee, I should be using Internet Explorer, Vista, Office, etc., and I am. This doesn't mean I can't also run alternative products, or use them when a Microsoft product doesn't provide the functionality I need.

    As a Microsoft developer, I should be using Team Foundation Server for bug tracking and source control. I should be developing Visual Studio using Visual Studio. I should be profiling my code using the VSTS profiling tools. Fortunately, I am, although not exclusively, and the same probably isn't true in some other parts of the company.

    The main reason I think this is a good idea is that we get to feel any pain that customers would. We have extra incentive to fix problems instead of ignoring them, and we often catch problems early, before customers even see them.

    I'll admit it, the process can be painful. The pain typically increases as you get closer to the bleeding edge of technology. For example, my Visual Studio dogfooding experience involves running the latest build of VSTS while developing. There are issues which delay my development, but facing these issues every day helps me drive improvements to the product. Imagine if your source control system went down - you'd want it fixed pretty quickly and that's just what we want from our TFS dogfood server.

    Here are a few things that I think need to happen for successful dogfooding:

    • The process must not be voluntary. As an individual dev I must use a pre-release version of TFS. As a Microsoft employee my computer is automatically updated to use the latest updates before they are pushed out to customers. There isn't a choice.
    • There must be a feedback mechanism. If things are broken it must be easy to report this and critical breaks must be fixed quickly.
    • Things must actually get better. Limit the audience for really unstable dogfooding. For example, we don't make devs outside the VS team build their own VS from last night's source. They get a 'Last Known Good' build of a release that has had extra testing carried out on it.

    If you're an application developer, are you using your own alpha/beta software before it is released to the public?

  • PDC 2008 - See the Sessions


    This year, if you didn't get a chance to go to the Professional Developers Conference (PDC), there is still a wealth of information available to you. The most valuable resource, I think, is the videos of all the PDC sessions. Here are a few of the sessions that I've viewed and found most interesting:

    • Improving .NET Application Performance and Scalability, starring my boss Steve Carroll and Ed Glass, this session covers a bunch of new Visual Studio 2010 Profiler features.
    • Visual Studio Debugger Tips & Tricks, with speaker John Cunningham, a Microsoft Development Manager (and Steve's boss), covering features in Visual Studio 2008, 2008 SP1, and features to look forward to in Visual Studio 2010. Note to self: 'if you ever ship without symbols, I would fire you'.
    • Microsoft Visual Studio Team System: Software Diagnostics and Quality for Services, featuring Habib and Justin, who are also folks from the diagnostics team. The most exciting demo from this talk shows off the cool new Historical Debugging feature. It also features the new Test Impact Analysis feature, which can tell you which tests you should run after changing your code.
    • Framework Design Guidelines, by the authors of the book of the same name, Krzysztof Cwalina and Brad Abrams. If you write managed code, this is a must-see session.

    If you'd like to try some of the Visual Studio 2010 features for yourself, you can download the newest CTP here.

  • Why Performance Matters


    Everybody likes to think that what they're working on is important, so here is why I think performance (and measuring performance) matters, and why it matters now more than ever.

    In the past, chip manufacturers like Intel and AMD did a lot of the performance work for us software guys by consistently delivering faster chips that often made performance issues simply disappear. This has meant that many developers treat performance issues as secondary concerns that will get better all by themselves if they wait long enough. Unfortunately, free lunches don't last forever, as Herb Sutter discusses.

    Chip manufacturers can no longer keep increasing the clock speeds of their chips to boost performance, so instead they are reducing clock speeds and increasing the number of cores to take computing to new levels of energy-efficient performance. The result, which is discussed on one of Intel's blogs, is that some applications will actually run less quickly on newer multicore hardware. This will surely shock some software consumers when they upgrade to a better machine and find it running software slower than before.

    Clearly we need to change our applications so they can take advantage of multiple cores. Unfortunately this introduces a lot of complexity, and there are many competing opinions about how we should do it. Some seasoned developers are pretty negative about using multithreading in any application. Some academics suggest we use different language constructs altogether, to avoid the inherent nondeterminism of programming with threads.

    Consider too that the types of applications we develop are changing. Sure, traditional rich-client applications are still very popular, but there is also demand for lightweight web-based clients that communicate with a central server (or servers). The software running on those servers must cater for many users and has very strict performance requirements.

    So how does all this fit in with performance measurement? Well, developers now have to write concurrent applications that are difficult to understand and develop. They write web-delivered applications that must respond promptly to many concurrent users, and it is unlikely that just upgrading the hardware will fix any performance problems that crop up. The free lunch is basically over, so now we have to pay. One way to minimize the cost of our lunches is to be able to 'debug', or resolve, dynamic software issues using a profiler, just as you would use a debugger to fix issues with program correctness.

    One of the main benefits of using a profiler instead of manually inspecting the code is that it avoids the common 'gut feel' approach to performance optimization. For example, a developer sees a loop like:

    for (int i=0; i < some_vector.size(); ++i)

    So they decide to optimize by making a temporary so that the size() function doesn't get called for every iteration of the loop:

    const int some_vector_length = some_vector.size();
    for (int i=0; i < some_vector_length; ++i)

    The number of lines of code has now increased by one. If the length of the vector is always small, this is unlikely to buy much in the way of performance. Even worse, a developer may start doing things like loop unrolling when the real cause of the performance problem is something they haven't noticed. As the complexity of the code goes up, maintenance costs increase. With a profiler, it is much easier to isolate the cause of a performance problem without wasting time optimizing code that barely impacts the application's performance.
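    To make the 'measure, don't guess' point concrete, here is a minimal sketch (a hypothetical timing harness of my own, written in modern C++, not part of the original example) that times both loop variants. On most compilers the difference is usually negligible, because `size()` is trivially inlined; a profiler would point you at the code that actually dominates the run time instead:

    ```cpp
    #include <cassert>
    #include <chrono>
    #include <cstdio>
    #include <vector>

    // Variant 1: call size() on every iteration.
    long long sum_with_size_call(const std::vector<int>& v) {
        long long sum = 0;
        for (std::size_t i = 0; i < v.size(); ++i) sum += v[i];
        return sum;
    }

    // Variant 2: hoist the length into a local first.
    long long sum_with_hoisted_size(const std::vector<int>& v) {
        long long sum = 0;
        const std::size_t n = v.size();
        for (std::size_t i = 0; i < n; ++i) sum += v[i];
        return sum;
    }

    // Time a callable and return elapsed wall-clock milliseconds.
    template <typename F>
    long long time_ms(F f) {
        auto t0 = std::chrono::steady_clock::now();
        f();
        auto t1 = std::chrono::steady_clock::now();
        return std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();
    }

    int main() {
        std::vector<int> v(10000000, 1);
        long long s1 = 0, s2 = 0;
        long long ms1 = time_ms([&] { s1 = sum_with_size_call(v); });
        long long ms2 = time_ms([&] { s2 = sum_with_hoisted_size(v); });
        assert(s1 == s2);  // both variants must compute the same sum
        std::printf("size() each iteration: %lld ms\n", ms1);
        std::printf("hoisted length:        %lld ms\n", ms2);
        return 0;
    }
    ```

    If the two numbers come out roughly equal, the 'optimization' wasn't worth the extra line of code.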

    Before I get too carried away, I should clarify that performance matters, but only if the performance is poor. For example, if you're working on a User Interface (UI), then according to Jakob Nielsen, if the response time to a user action is less than 0.1 seconds, the user will feel that the system is reacting immediately to their action. If you're working on a computer game, the performance requirement might be that the frame rate must be at least 30 Hz. In both cases the user will notice if the performance requirement isn't met, but they will probably not notice or care about performance once it is.
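    Those thresholds translate directly into time budgets. A quick sketch of the arithmetic (the 0.1 s and 30 Hz figures are the only inputs taken from the text; the helper name is my own):

    ```cpp
    #include <cstdio>

    // Convert a minimum frame rate (Hz) into a per-frame time budget (ms).
    double frame_budget_ms(double min_frame_rate_hz) {
        return 1000.0 / min_frame_rate_hz;
    }

    int main() {
        double ui_budget_ms = 0.1 * 1000.0;          // Nielsen's 0.1 s "instant" threshold
        double game_budget_ms = frame_budget_ms(30); // 30 Hz minimum frame rate
        std::printf("UI action budget:  %.1f ms\n", ui_budget_ms);   // 100.0 ms
        std::printf("Game frame budget: %.1f ms\n", game_budget_ms); // 33.3 ms
        return 0;
    }
    ```

    In other words, a game has roughly a third of the 'instant' UI budget to produce every single frame, which is why frame-rate work is so sensitive to small regressions.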

    If you haven't used a profiler before, go and try out a Community Technology Preview (CTP) of Orcas, which will be the next version of Visual Studio. For the full experience you should avoid the VPC images, which have reduced profiler functionality. Some day, maybe soon if not already, you'll have to fix a performance problem with your code, and using a profiler might help.

  • Link: Beginners Guide to Performance Profiling


    The Visual Studio 2010 MSDN documentation includes some more detailed examples (including screenshots) than previous versions. Here's a decent intro to profiling:
    Beginners Guide to Performance Profiling

  • Tech-Ed 2007


    Tech-Ed 2007 starts tomorrow, and the Profiler Team is sending a few people to sunny Orlando for the event. This is great news for me because my boss, Steve Carroll, is away for the week (just kidding, Steve), but it is really great news for folks at Tech-Ed, because he'll be there presenting with Marc Popkin-Paine:

    DEV313 - Improving Code Performance with Microsoft Visual Studio Team System [N210 E]
    June 7, 9:45 AM - 11:00 AM

    I believe they'll be demoing a few new Orcas features and giving a pretty good introduction to profiling. If you didn't know Visual Studio Team System has a profiler, or you don't think performance is important, you should definitely check this out.

    If you're not lucky enough to be able to make it to Orlando this year, be sure to take a look at Virtual Tech Ed, which will include webcasts and other content from some of the sessions. One that jumps out at me is MSDN Webcast: A Lap around Microsoft Visual Studio Code Name "Orcas" (Level 200).

    UPDATE: Steve is already helping people at Tech Ed. If you're there and you're interested in performance go and have a chat with him in the Technical Learning Center.

  • Visual Studio 2008, Beta 2 (now with some of my code)


    Today we released Beta 2 of VS2008. This is the first public release from Microsoft that contains a nontrivial amount of code that I wrote (even though I haven't written too much code just yet). I had barely synced up the source tree and only fixed a couple of bugs when we released Beta 1, but now I've found my feet and am contributing more.

    The major release announcements have focused on the flashier (and admittedly very cool) aspects of the Beta, like LINQ and some of the HTML editing and JavaScript debugging features. However, we Profiler folks have also been toiling away adding new features and fixing bugs. Look out for things like the following (some of these already featured in Beta 1, but they just keep getting better):

    • A promotion to the new Developer menu
    • Hot path - find the critical path/paths through your call trees
    • Noise reduction - trim and/or fold your call trees so that they are easier to examine. See above for folding example.
    • Comparison reports - compare subsequent profiler runs to determine if code changes are improving performance
    • x64 OS support - profile on x64 Vista or W2K3 server 

    If you can, please download it and let us know what you think. If you don't have the time at least take a look at the overview video showing some of the major features. You should also check out Ian's entry about controlling data collection while profiling. Hopefully I'll have time to go through some of the new profiler-specific features soon.

  • VS2010: Using the keyboard to profile an application (Alt-F2 shortcut)


    In announcing the Visual Studio Beta 2 profiler features, Chris mentioned that we have a new option on the Debug menu called ‘Start Performance Analysis’ which has the Alt-F2 keyboard shortcut. This makes it easier than ever to start profiling your application. The new menu item has the following behavior:

    • You must have a Visual Studio Solution open in order to enable it.
    • If you have a solution open, but do not have a launchable current performance session, Start Performance Analysis launches the Performance Wizard.
    • If you have a solution open and have a launchable current performance session, Start Performance Analysis starts profiling.

    Let’s use this new functionality to profile an application that I prepared earlier.

    1. Open the solution with ‘Alt-F, J, Enter’.
    2. Start Performance Analysis with ‘Alt-F2’, which brings up the wizard.
    3. Press ‘Enter’ to choose the default ‘CPU Sampling’ profiling method and move to the target selection page.
    4. Press ‘Enter’ to select the only launchable project in the solution and move to the final wizard page.
    5. Press ‘Enter’ to finish the wizard and start profiling.
    6. The report opens when profiling finishes.


    If you wish to profile again, pressing Alt-F2 will start profiling with the Performance Session that was created after step #4.

  • Tech-Ed 2008 Demos


    Last year my boss took a trip to sunny Orlando to present at Tech-Ed and to offer help and suggestions in the Technical Learning Center (TLC). This year I'm lucky enough to be attending with a couple of other folks (Habib and Tim) and since I'm not an official Speaker I'll be spending most of my time hanging out in the Application Lifecycle Management (ALM) demo station for Visual Studio 2008 Team System, Development Edition.

    We've prepared a few demos covering things like:

    • Profiling using Instrumentation Mode on a Virtual PC image.
    • Collecting Allocation and Object Lifetime information.
    • Analyzing Performance Reports.
    • Using Code Analysis to improve your code.
    • Enabling Code Analysis Check-In Policies.

    We're also looking forward to discussing your specific scenarios so if you're at Tech-Ed and interested in diagnostic tools and solving performance problems we'd love to chat with you.

  • Tools of the Trade


    I've been thinking about what some of the most important tools are for me while coding. Here are a few:

    • Good IDE - syntax highlighting, integrated builds, source control integration, search facility, debugger and profiler built-in. I use VSTS.
    • Source control/bug tracking system. I use TFS (typically a dogfood version of TFS).
    • Windows Task Manager.
      I use task manager to:
      - View CPU usage
      - Kill processes
      - Start a new explorer.exe if I ever kill explorer.exe. Do this from the Applications tab.
    • Process Explorer.
      I use Process Explorer like task manager, but it can also:
      - Find what process has a handle (e.g. a file) open. This can be handy if you want to delete a file but a process has it locked.
      - Find out what DLLs a process has loaded
      - List the environment variables for a running process
    • Process Monitor
      I use Process Monitor to record file, registry and process activity. This is very useful when debugging issues in complex programs like VSTS which have a lot of registry interactions.
    • DebugView
      Display debugging output from programs without having to attach a debugger. This is very useful if you want to run your program outside a debugger and still want to see all those debug prints.
    • Media Player. I like to listen to music while I code.
    • Outlook. It is somewhat sad that I spend a fair percentage of my day reading emails, scheduling or checking up on meetings and writing notes in an Outlook journal, but I do and so I have Outlook open all the time.
    • Internet Explorer. I need to use MSDN a lot and do web searches. I also read RSS feeds of relevant blogs with IE.
    • Regedit.
    • Remote Desktop. I work on different machines pretty regularly and Remote Desktop makes switching between machines easy.
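    As a small illustration of the DebugView item above: on Windows, anything passed to the Win32 `OutputDebugStringA` API shows up in DebugView (or an attached debugger) without any setup in the program itself. This sketch wraps that call in a `debug_print` helper of my own naming, with a stderr fallback on other platforms so the example stays self-contained:

    ```cpp
    #include <cstdio>
    #include <string>

    #ifdef _WIN32
    #include <windows.h>
    #endif

    // Emit a debug message: visible in DebugView on Windows, stderr elsewhere.
    // Returns the message so callers can inspect what was emitted.
    std::string debug_print(const std::string& msg) {
    #ifdef _WIN32
        OutputDebugStringA((msg + "\n").c_str()); // DebugView captures this
    #else
        std::fprintf(stderr, "%s\n", msg.c_str()); // fallback for non-Windows builds
    #endif
        return msg;
    }

    int main() {
        debug_print("starting work");
        debug_print("work finished");
        return 0;
    }
    ```

    Run the program normally (no debugger attached) with DebugView open and both messages appear in its window.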