Colin Thomsen's Microsoft Blog

I'm a developer working on the code profiler that ships with Visual Studio 2010 Premium and Ultimate editions. At a previous company I worked on computer vision software for face and gaze tracking.

Posts
  • Colin Thomsen's Microsoft Blog

    Scripting for C#

    • 1 Comment

    Have you ever wanted to quickly run a C# application without having to set up a new project in Visual Studio and configure all the settings? A fellow developer here at Microsoft has written a tool called Code Runner .NET that allows just that. It isn't scripting exactly, because the code is still compiled before being run rather than interpreted, but you don't have to maintain project or solution files or worry about binaries.

    To try it out I downloaded the tool and installed it to c:\temp\cr. In a subdirectory (mytest) I created a file test.csr after referring to the Getting Started Guide:

        using System;
        using Microsoft.Tools.CodeRunner;

        public class Program
        {
            public static int Main(string[] args)
            {
                if (args.Length == 0)
                {
                    // TODO: Fill out usage information
                    Console.WriteLine("Usage: {0}", ScriptEnvironment.ScriptPath.FileAndExtension);
                    return 0;
                }

                // TODO: Script code goes here...

                return 0;
            }
        }

    My directory was completely empty aside from this file:

    C:\temp\cr\mytest>dir /b
    test.csr

    I could then run the file as follows:

    C:\temp\cr\mytest>..\csr test.csr
    Usage: test.csr

    That's it. The directory was still clean after the run:

    C:\temp\cr\mytest>dir /b
    test.csr
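
    Under the hood, a tool like this essentially compiles the source file into an in-memory assembly and then invokes its entry point, which is why nothing is left behind on disk. Here is a rough sketch of that idea using the standard CSharpCodeProvider API (purely to illustrate the concept; it is not how Code Runner .NET is actually implemented, and it leaves out the extra references and helpers, such as ScriptEnvironment, that Code Runner provides):

        using System;
        using System.CodeDom.Compiler;
        using Microsoft.CSharp;

        class CompileAndRun
        {
            static void Main(string[] args)
            {
                // args[0] is the path to the C# source file to compile and run.
                var provider = new CSharpCodeProvider();
                var parameters = new CompilerParameters
                {
                    GenerateExecutable = true,
                    GenerateInMemory = true   // nothing is written next to the source file
                };
                CompilerResults results = provider.CompileAssemblyFromFile(parameters, args[0]);

                if (results.Errors.HasErrors)
                {
                    foreach (CompilerError error in results.Errors)
                        Console.WriteLine(error);
                    return;
                }

                // Forward any remaining arguments to the compiled program's Main.
                string[] scriptArgs = new string[args.Length - 1];
                Array.Copy(args, 1, scriptArgs, 0, scriptArgs.Length);
                results.CompiledAssembly.EntryPoint.Invoke(null, new object[] { scriptArgs });
            }
        }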

    You can also debug your code in Visual Studio using a nifty tool called Scaffold, which creates the requisite csproj and sln files and conveniently cleans up after we're done:

    C:\temp\cr\mytest>..\Scaffold.exe test.csr
    Response file 'C:\temp\cr\csc.rsp' processed
    File 'C:\temp\cr\mytest\Scaffold_78D592CA\test.csproj' created
    File 'C:\temp\cr\mytest\Scaffold_78D592CA\test.csproj.user' created
    File 'C:\temp\cr\mytest\Scaffold_78D592CA\test.sln' created
    Starting Visual Studio
    Waiting for Visual Studio to exit

    We can now debug as usual:

    [screenshot: debugging the script in Visual Studio]

    When Visual Studio closes:

    Sub-directory 'C:\temp\cr\mytest\Scaffold_78D592CA\' was deleted

    Code Runner .NET has already been updated to support Visual Studio 2008, so download it from CodePlex now.

  • Colin Thomsen's Microsoft Blog

    Noise Reduction in the VS2008 Profiler

    • 1 Comment

    One of the new profiler features in Visual Studio Team System (VS2008) is called Noise Reduction. This feature is intended to make it easier to review the Call Tree view by reducing the amount of data that is displayed, while still showing the most important functions.

    To illustrate this new feature I wrote a very simple native C++ application that uses TR1 from the Feature Pack Beta. In this simple app I create some shared_ptrs in a for loop after calling a recursive function a few times. If you're not familiar with TR1, take a look at the VC blog for more information.

        #include <tchar.h>
        #include <memory>

        class A
        {
        public:
            A(int v, int w) : b(v), c(w)
            {}
        private:
            int b;
            int c;
        };

        void recurse(int v)
        {
            if (v > 0)
            {
                recurse(--v);
            }
            else
            {
                A someA(1,2);
                for (int i=0; i < 100; ++i)
                {
                    std::tr1::shared_ptr<int> a(new int(2));
                    std::tr1::shared_ptr<A> b(new A(2,3));
                    std::tr1::shared_ptr<A> c(b);
                }
            }
        }

        int _tmain(int argc, _TCHAR* argv[])
        {
            recurse(5);
            return 0;
        }

     

    I profiled the application using Instrumentation Mode as shown below. Shifting to the 'Call Tree' view and expanding it, I see my main function, which calls recurse 6 times (the initial call from main plus 5 recursive calls from inside the if statement in recurse). There are also a lot of other calls that have very little inclusive time (e.g. __RTC_CheckESP).

    default

     

    To reduce the noise in the call tree, I enable Noise Reduction by clicking the icon on the far right of the toolbar (it looks like a checklist). In the dialog I enable trimming, using a threshold of 3%.

    trim_dlg

     

    After enabling this option I expand the call tree again. Many of the extra calls are gone and the calls to recurse are much easier to see, as shown below.

    trim

     

    I still think I can do a little better, so I open the Noise Reduction dialog again and select 'Enable Folding'.

    fold_dlg

     

    The resulting call tree can now be completely expanded and shown without scroll bars. Functions with significant exclusive time include __security_init_cookie (28.40%), the shared_ptr setup for A (10.23%), the shared_ptr refcount destructor (11.30%) and the last call to recurse (7.58%).

    fold

     

    Using Noise Reduction in a larger application should make it easier for you to find performance problems.

    Details:

    • Trimming removes leaf nodes in the call tree whose Elapsed Inclusive Time % (or Inclusive Sample % if you use sampling mode) is lower than the threshold.
    • Folding combines (folds) a child node up into its parent node if it is the only child and its Elapsed Inclusive Time % (or Inclusive Sample % if you use sampling mode) is within the threshold of its parent's. The intention is to fold simple forwarding functions that don't have much influence on the performance of your code.

    Trimming is applied first and then folding, which is why the calls to recurse are all folded.
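
    To make these two rules concrete, here is a minimal sketch of how trimming and folding could be applied to a call tree. This is only an illustration of the rules above, with a made-up Node type; it is not the profiler's actual implementation:

        using System;
        using System.Collections.Generic;

        // Hypothetical call-tree node used only for this illustration.
        class Node
        {
            public string Name;
            public double InclusivePercent;   // Elapsed Inclusive Time % (or Inclusive Sample %)
            public List<Node> Children = new List<Node>();
        }

        static class NoiseReduction
        {
            // Trimming: remove leaf nodes whose inclusive percentage is below the threshold.
            public static void Trim(Node node, double threshold)
            {
                node.Children.RemoveAll(c => c.Children.Count == 0 && c.InclusivePercent < threshold);
                foreach (Node child in node.Children)
                    Trim(child, threshold);
            }

            // Folding: merge an only child up into its parent when their inclusive
            // percentages are within the threshold of each other. Applied after trimming.
            public static void Fold(Node node, double threshold)
            {
                while (node.Children.Count == 1 &&
                       Math.Abs(node.InclusivePercent - node.Children[0].InclusivePercent) <= threshold)
                {
                    Node onlyChild = node.Children[0];
                    node.Name += " / " + onlyChild.Name;   // remember what was folded into this node
                    node.Children = onlyChild.Children;
                }
                foreach (Node child in node.Children)
                    Fold(child, threshold);
            }
        }

    Running Trim(root, 3.0) followed by Fold(root, 3.0) on a tree like the one above would first drop the tiny helper leaves and then collapse the chain of recurse calls, which matches the behavior shown in the screenshots.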

  • Colin Thomsen's Microsoft Blog

    Tip: VS2008 - Understanding Performance Targets

    • 1 Comment
    default_wizard_output_slnexplorer

    If you have a solution that contains multiple projects it is important to know what the 'Targets' group in the Performance Explorer is used for. The PeopleTrax solution shown on the right has 4 projects, with 3 of them compiling to managed DLLs and 1 compiling to an executable.

    After running the Performance Wizard to create a Performance Session the Performance Explorer contains a single target as shown below.

    default_wizard_output_perfexplorer 

    Only the project that compiles to an executable is listed in the 'Targets' folder (for other project types like websites it would include the default launch project). What about the other 3 projects? As this tip explains, it depends upon the type of profiling you wish to do.

    Sampling

    With sampling there is no need to add the additional projects to your targets list. We do not modify assemblies when sampling, and we automatically attempt to collect data for any assemblies loaded by the PeopleTrax target. The only exception is if you wish to collect data for multi-process scenarios and therefore need to launch multiple targets.

    Instrumentation

    For instrumentation, if you wish to collect data for the additional projects they should be added to your targets list as follows:

    1. In the Performance Explorer, right-click on the 'Targets' folder:
      add_target_project_rightclick
    2. Choose 'Add Target Project' to display a dialog:
      add_target_project_dialog 
    3. Select the assemblies you wish to collect Instrumentation data for and choose OK.

    The selected projects will now be modified (instrumented) when you start profiling. You can selectively disable instrumentation for certain projects by right-clicking on the target and unchecking the 'Instrument' option.

    targets_launchable_trace_properties_crop
    Instrumentation properties for a specific target.

  • Colin Thomsen's Microsoft Blog

    The Honeymoon Is Over

    • 1 Comment

    I've been here at Microsoft for more than 6 months so I guess you could say that I've passed through the Honeymoon Phase. By now the initial joy and excitement should be starting to wear off and I should be settling into a monotonous routine.

    Well I'm happy to say that it hasn't happened so far. I'm still learning a lot, including things like:

    • Shipping big products is fun. We get to think about cool new ideas; some of them we implement ourselves and some get implemented by other smart folks.
    • Shipping big products is hard. We have to worry about things like localization, corner-case scenarios and crashes that smaller products just don't need to consider. All of this takes time and there can be stretches where you're fixing strings or working in high-contrast mode.
    • Our debugging tools are cool. For most of the bugs I need to fix my primary tool is Visual Studio. It is a good sign that even working with less stable dogfood versions is better than using another tool.
    • Bug/Feature Triage is important. We have so many people using our products that all kinds of bugs are reported, from serious (crashes) to suggestions (please improve this feature by...). If we did everything that was asked of us, we would never have a stable version to release. However, triaging can be much more lenient in the early stages of development. We go through stages:
      • Code review - any change you make must pass a code review. The reviewer might say 'hey, why are we fixing this bug!' and it may not be accepted.
      • Tell mode - closer to a release our team leads will go along to a meeting (called a shiproom meeting) and they will say "hey, we're fixing these bugs". If a lead goes along and says "we changed the font size from 9 to 10 points" without a good reason there might be some raised eyebrows.
      • Ask mode - even closer to release, before a bug fix is submitted, it has to go to the shiproom and be approved. Usually only certain classes of bugs will be approved (blocking bugs, localization bugs, etc.). It is important that this 'bug bar' is known so that developers/leads know whether to attempt to fix a bug or not.

        All of this means that the number of bugs we fix decreases as we get closer to a release, which gives the product time to stabilize and be thoroughly tested. At the same time, more minor bugs get a chance to be fixed early in the release cycle.
    • Company Meetings are exciting. There was a lot of shouting, collective back-slapping and cool demos. It was amazing that the crowd filling a third of a baseball stadium was all from the same company.
    • Seattle summers are great. There is so much talk about how rainy Seattle is, but over summer the weather is warm but not really hot and it doesn't rain all that much. Daylight hours are long and it is perfect for getting out and about.

    I also like hearing about new features and products and being able to try them out before they're distributed to customers. Let's see how the next 6 months go.

  • Colin Thomsen's Microsoft Blog

    Microsoft Blogs I Read

    • 1 Comment

    There are a lot of Microsoft bloggers, literally thousands of them. When I first joined Microsoft I wasn't sure who to read. I've gradually built up a list based on interesting product and feature announcements and people I've met. Here they are:

    Profiling

    • Our Team Blog
    • IanWho's Blog. Written by a fellow dev on the profiler team, Ian has probably written the most about profiling across the team.
    • joc's bLog. Written by my boss's boss.
    • mgoldin's blog. Written by a senior dev on my team. Find out about the differences between the various types of samples, and more.
    • My Code Does What?!. A relatively new blog about profiling by another fellow dev.
    • scarroll's Blog. Written by my boss.

    Technical

    • bharry's WebLog. Written by a Technical Fellow (read more about this) with a huge amount of experience who has a big focus on TFS.
    • Greggm's Weblog. Written by a senior dev on the Debugger team. Has many advanced debugger tips.
    • Mark Russinovich. Mark wrote some cool Sysinternals tools and now blogs fascinating posts about his investigations into problems he finds every day just using his PC.
    • Rico Mariani's Performance Tidbits. Written by a senior Microsoftie who has been here for a long time. Gives tips for analyzing performance and provides guidelines to use in writing .NET code.
    • ScottGu's Blog. Find out about LINQ, ASP.NET AJAX etc. etc. This blog has many examples including screenshots and source code.
    • Somasegar's WebLog. As the corporate VP of DevDiv, Soma covers a lot of Visual Studio features and other developer tools.

    Other

    That's just some of the Microsoft blogs I read. Are there other 'must-reads' that I'm missing?

  • Colin Thomsen's Microsoft Blog

    Visual Studio Team System Chat – December 5th

    • 0 Comments

    Join members of the Visual Studio Team System product group to discuss features available in Team Foundation Server, Team Suite, Architecture Edition, Development Edition, Database Edition, and Test Edition. In addition, discuss what's new in these editions for Visual Studio 2008.

     

    We will be holding two sessions:

     

    Join the chat on Wednesday, December 5th, 2007 from 10:00am - 11:00am Pacific Time.


                    -and-

    Join the chat on Wednesday, December 5th, 2007 from 4:00pm - 5:00pm Pacific Time.

    --- 

    I'll be in the Wed 4 pm - 5 pm chat to answer any questions related to profiling. Another member of the profiler team will be online for the earlier chat.

    ---

  • Colin Thomsen's Microsoft Blog

    Tip: VS2008 – Finding and Setting Properties (Right-Click)

    • 0 Comments

    The Visual Studio Profiler has many properties and options and this tip shows you where to find most of them. Future posts may cover some of the specific properties in more detail.

    Performance Session:
    session_properties
    Select an existing Performance Session in the Performance Explorer to see its properties in the Properties Window. If the Properties Window is hidden, press ‘F4’ or go to ‘View->Properties Window’.

    Performance Report:
    report_properties

    Select a Performance Report in the Performance Explorer to view many properties including Collection, ETW, General, Machine Information, Performance Counters, Process, Thread and Version Information.

     

    Performance Session Properties (and Options):

    session_properties_1
    To adjust Performance Session properties:
    1. Right-click on the Performance Session (Performance1 in this example).
    2. Select ‘Properties’.

    Properties for Performance1 are shown below. There are different categories of properties on the left (e.g. General, Launch, Sampling, …).

    session_properties_2

     

    Performance Targets:

    target_properties_1
    To adjust Performance Target properties:
    1. Right-click on the Target (ConsoleApplication3 in this example).
    2. Select ‘Properties’.

    Adjust the properties for the Performance Target as required. These properties do not often need to be changed, with the possible exception of the Instrumentation property ‘Exclude small functions from instrumentation’.

    target_properties_2

     

    Tools –> Options –> Performance Tools:

    Some global options can be configured using the Visual Studio Options dialog, which is accessed via:

    Tools –> Options –> Performance Tools

    tools_options

    That’s all the properties I can think of, but I’m probably still missing some. The most important point of this tip is that right-clicking with the mouse is often the way to access important contextual information.

  • Colin Thomsen's Microsoft Blog

    PDC 2008 - See the Sessions

    • 0 Comments

    If you didn't get a chance to go to the Professional Developers Conference (PDC) this year, there is still a wealth of information available to you. The most valuable resource, I think, is the videos of all the PDC sessions. Here are a few of the sessions that I've viewed and found most interesting:

    • Improving .NET Application Performance and Scalability, starring my boss Steve Carroll and Ed Glass. This session covers a bunch of new Visual Studio 2010 profiler features.
    • Visual Studio Debugger Tips & Tricks, with speaker John Cunningham, a Microsoft development manager (and Steve's boss), covering features in Visual Studio 2008, 2008 SP1, and features to look forward to in Visual Studio 2010. Note to self: 'if you ever ship without symbols, I would fire you'.
    • Microsoft Visual Studio Team System: Software Diagnostics and Quality for Services, featuring Habib and Justin, who are also folks from the diagnostics team. The most exciting demo from this talk shows off the cool new Historical Debugging feature. It also features the new Test Impact Analysis feature, which can tell you which tests you should run after changing your code.
    • Framework Design Guidelines, by the guys who wrote the book of the same name, Krzysztof Cwalina and Brad Abrams. If you write managed code this is a must-see session.

    If you'd like to try some of the Visual Studio 2010 features for yourself, you can download the newest CTP here.

  • Colin Thomsen's Microsoft Blog

    VS2010: Attaching the Profiler to a Managed Application

    • 0 Comments

    Before Visual Studio 2010, in order to attach the profiler to a managed application, certain environment variables had to be set using vsperfclrenv.cmd. An example profiling session might look like this:

    • vsperfclrenv /sampleon
    • [Start managed application from the same command window]
    • vsperfcmd /start:sample /output:myapp.vsp /attach:[pid]
    • [Close application]

    If the environment variables were not correctly set, when attempting to attach you would see this message:
    old_attach_warning

    The profiling environment for ConsoleApplication2 is not set up correctly. Use vsperfclrenv.cmd to setup environment variables. Continue anyway?

    The generated report would typically look something like the report below. The warning at the bottom of the page indicates the problem and the report itself would typically not be useful since no managed modules or functions would be resolved correctly.

    old_attach_badreport
    Report with 'CLRStubOrUnknownAddress' and 'Unknown Frame(s)' and the warning ‘It appears that the file was collected without properly setting the environment variables with VSPerfCLREnv.cmd. Symbols for managed binaries may not resolve’.

    Fortunately the Common Language Runtime (CLR) team provided us with a new capability to attach to an already running managed application without setting any environment variables. For more detailed information take a look at David Broman’s post.

    Caveats:

    • We only support attach without environment variables for basic sampling. It will not work for Allocation or Object Lifetime data collection, and Instrumentation attach is not possible. Concurrency (resource contention) attach is supported.
    • The new attach mechanism only works for CLR V4-based runtimes.
    • The new attach mechanism will work if your application has multiple runtimes (i.e. V2 and V4  SxS), but as noted above, you can only attach to the V4 runtime. I’ll write another post about the profiler and Side by Side (SxS).
    • The old environment-variable-based attach still works, so you can still use that if you prefer.

    The new procedure for attaching the profiler to a managed application in Visual Studio 2010 goes like this:

    • Launch your app (if it isn’t already running)
    • Attach to it, either from the command-line or from the UI.
    • When you’re finished, detach or close the app to generate a report.
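
    For comparison with the old flow above, a command-line attach session might look something like this (just a sketch; myapp.vsp and [pid] are placeholders):

    • vsperfcmd /start:sample /output:myapp.vsp /attach:[pid]
    • [Exercise the application]
    • vsperfcmd /detach
    • vsperfcmd /shutdown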

    new_attach_report

    If you want to diagnose any issues with attach, the CLR V4 runtime provides diagnostic information via the Event Log (view with Event Viewer) and the profiler also displays information there:

    new_attach_eventlog

    Event Log: ‘Loading profiler. Running CLR: v4.0.21202. Using ‘Profile First’ strategy’

    There are two .NET Runtime messages regarding the attach, the first indicating that an attach was requested and the second that the attach succeeded. The VSPERF message describes which CLR is being profiled.

  • Colin Thomsen's Microsoft Blog

    VS2010: Using the keyboard to profile an application (Alt-F2 shortcut)

    • 0 Comments

    In announcing the Visual Studio 2010 Beta 2 profiler features, Chris mentioned that we have a new option on the Debug menu called ‘Start Performance Analysis’, which has the Alt-F2 keyboard shortcut. This makes it easier than ever to start profiling your application. The new menu item has the following behavior:

    • You must have a Visual Studio Solution open in order to enable it.
    • If you have a solution open, but do not have a launchable current performance session, Start Performance Analysis launches the Performance Wizard.
    • If you have a solution open and have a launchable current performance session, Start Performance Analysis starts profiling.

    Let’s use this new functionality to profile an application that I prepared earlier.

    1. Open the solution with ‘Alt-F, J, Enter’:
      1_open_project 
    2. Start Performance Analysis with ‘Alt-F2’, which brings up the wizard:
      2_alt-f2_wizard
    3. Press ‘Enter’ to choose the default ‘CPU Sampling’ profiling method and move to the target selection page:
      3_enter_next
    4. Press ‘Enter’ to select the only launchable project in the solution and move to the final wizard page:
      4_enter_finish
    5. Press ‘Enter’ to finish the wizard and start profiling:
      5_profiling
    6. The report will open when profiling finishes:
       6_report

     

    If you wish to profile again, pressing Alt-F2 will start profiling with the Performance Session that was created after step #4.
