Colin Thomsen's Microsoft Blog

I'm a developer working on the code profiler that ships with Visual Studio 2010 Premium and Ultimate editions. At a previous company I worked on computer vision software for face and gaze tracking.

Posts
  • Colin Thomsen's Microsoft Blog

    Scripting for C#

    • 1 Comment

    Have you ever wanted to quickly run a C# application without having to set up a new project in Visual Studio and configure all the settings? A fellow developer here at Microsoft has written a tool called Code Runner .NET that allows just that. It isn't scripting exactly, because the code is still compiled before being run rather than interpreted, but you don't have to maintain project or solution files or worry about binaries.

    To try it out I downloaded the tool and installed it to c:\temp\cr. I created a file test.csr after referring to the Getting Started Guide:

    using System;
    using Microsoft.Tools.CodeRunner;

    public class Program
    {
        public static int Main(string[] args)
        {
            if (args.Length == 0)
            {
                // TODO: Fill out usage information
                Console.WriteLine("Usage: {0}", ScriptEnvironment.ScriptPath.FileAndExtension);
                return 0;
            }

            // TODO: Script code goes here...

            return 0;
        }
    }

    My directory was completely empty aside from this file:

    C:\temp\cr\mytest>dir /b
    test.csr

    I could then run the file as follows:

    C:\temp\cr\mytest>..\csr test.csr
    Usage: test.csr

    That's it. The directory was still clean after the run:

    C:\temp\cr\mytest>dir /b
    test.csr

    You can also debug your code in Visual Studio using a nifty tool called Scaffold, which creates the requisite csproj and sln files and conveniently cleans up after we're done:

    C:\temp\cr\mytest>..\Scaffold.exe test.csr
    Response file 'C:\temp\cr\csc.rsp' processed
    File 'C:\temp\cr\mytest\Scaffold_78D592CA\test.csproj' created
    File 'C:\temp\cr\mytest\Scaffold_78D592CA\test.csproj.user' created
    File 'C:\temp\cr\mytest\Scaffold_78D592CA\test.sln' created
    Starting Visual Studio
    Waiting for Visual Studio to exit

    We can now debug as usual:

    image

    When Visual Studio closes:

    Sub-directory 'C:\temp\cr\mytest\Scaffold_78D592CA\' was deleted

    Code Runner .NET has already been updated to support Visual Studio 2008, so download it from CodePlex now.


    VS2010: New Profiler Summary Page

    • 0 Comments

    With Visual Studio 2010 we want to make it easier for customers to find and fix performance issues with their code. One of the first things we looked at was the view that shows up after profiling an application – the Summary Page.

    I’ll describe a few features of the new summary page using the PeopleTrax application, which you can download from CodeBox. I won’t describe collecting the profiling data since this is already covered on MSDN. The summary page for a sample profiling run is shown below.

    summary_sampling

    I’ll describe the individual pieces of the report below:

    • The header shows the type of data collected (Sample, Instrumentation, etc.) and also how many samples were collected.
      sampling__header
    • The hot path is shown right on the front page.
      sampling hotpath 
      We can see straight away that the ReadLine() function is an interesting one to look at. Clicking on the hyperlink will take you directly to the Function Details page. There are also related-views links to navigate directly to the ‘Call Tree’ and ‘Functions’ views.
    • Functions that are doing the most individual work (have the highest Exclusive Samples) are listed with visual ‘sparklines’ (bar charts) to indicate relative importance.
      sampling exclusive functions
      Clicking on the hyperlink for a function takes you directly to the Function Details page for that function.
    • Notifications that change based on the state of the report are shown in a new ‘Notifications’ area. In the example below, clicking Show All Code will reanalyze the report with Just My Code turned off. ‘View Guidance’ will bring up the Error List to display guidance from the new Rules and Guidance profiler feature.
       sampling notifications
    • Action links are shown in the ‘Report’ area and they do not depend on the state of the report. One useful link we added is ‘Toggle Full Screen’ which shows the report in full-screen mode.
      sampling actions
    • A chart of CPU usage (for sampling, although we also show contention data in Concurrency mode) enables filtering of the report data. Highlight a region on the chart and choose ‘Filter by selection’ to show data from only that period of time. For example, with the chart below it may be useful to filter between 16 and 24 seconds.
      sampling chart
    • At the top of the report above the header, we still have the profiler toolbar which is useful for navigating (see the left and right arrows and also the dropdown for jumping between views) and for executing some of the actions that are also listed on the Summary Page.
      sampling toolbar 

    Hopefully these new features on the front page will make it quicker and easier than ever to diagnose your performance issues.


    C# for C++ Devs: Generics vs Templates for Primitive Types

    • 3 Comments

    I was trying to write some type-generic (almost) code in C# using a pattern that I commonly use in C++. A very simple version of what I was trying to do looks something like:

    class B
    {};

    template<typename T>
    int convert(T value)
    {
        return (int)value;
    }

    int main(int argc, char* argv[])
    {
        convert(3.5f);
        convert(11.5);
        // The line below would fail with "error C2440: 'type cast' : cannot convert from 'B' to 'int'"
        //convert(B());
        return 0;
    }

    In C++ this compiles and runs just fine, as long as you don't uncomment the call to convert for class B. Template code is only generated when needed, at compile time, and as long as the substituted type supports the required operations (in this case, a cast to int), everything is just fine.

    The equivalent C# program using generics looks like this:

    class Program
    {
        static int convert<T>(T value)
        {
            return (int)value;
        }

        static void Main(string[] args)
        {
            convert(11.5);
        }
    }

    Unfortunately it doesn't compile. I get an error: Cannot convert type 'T' to 'int'.

    This is due to the way generics are handled in C#. Instead of code being generated for each type at compile time, generics are resolved to the particular type at runtime (at JIT time, in fact). This means the convert function shown above must be valid for every possible type argument at compile time, and it isn't, since an arbitrary type cannot be cast to an integer.

    It is possible to use constraints if you are not using a primitive type, but there doesn't seem to be a nice way to support 'generic across primitive types' in C#. Am I missing something?


    C# for C++ Devs: Structs vs Classes

    • 0 Comments

    I'm from a C++ background and now I'm working quite a bit more with C# so I'm learning new things all the time. One thing that baffled me recently was caused by the difference between structs and classes in C#.

    In C++ the only difference between struct and class is the default member accessibility. For example, in the code below A::f() is private, whereas B::f() is public:

    class A
    {
        void f();
    };

    struct B
    {
        void f();
    };

    That's the only difference. Structs can have member functions, and classes can contain only data members. For C#, things are different, as I found out recently.

    In C#, structs are value types, whereas classes are reference types. What this means in practice is that anywhere you pass a struct to a function as a parameter, return it, or assign it, you are working with a copy.

    The confusing piece of code for me was equivalent to the following:

    struct Animal
    {
        public int Spots;
    }

    class Program
    {
        static void Main(string[] args)
        {
            List<Animal> allAnimals = new List<Animal>();
            allAnimals.Add(new Animal());
            allAnimals.Add(new Animal());

            foreach (Animal animal in allAnimals)
            {
                animal.Spots = 5;
            }

            Debug.WriteLine(String.Format("First animal spots: {0}", allAnimals[0].Spots));
        }
    }

    When I compiled the code above I got the error:

    error CS1654: Cannot modify members of 'animal' because it is a 'foreach iteration variable'

    How strange, I thought. OK, maybe in a foreach loop you can't modify public members. Let's try calling a function instead: 

    struct Animal
    {
        public void setSpots(int NewSpots)
        {
            Spots = NewSpots;
        }

        public int Spots;
    }

    class Program
    {
        static void Main(string[] args)
        {
            List<Animal> allAnimals = new List<Animal>();
            allAnimals.Add(new Animal());
            allAnimals.Add(new Animal());

            foreach (Animal animal in allAnimals)
            {
                animal.setSpots(5);
            }

            Debug.WriteLine(String.Format("First animal spots: {0}", allAnimals[0].Spots));
        }
    }

    So the compile error went away, but the message printed out was:

    First animal spots: 0

    I was expecting 5 here. After reading a little bit about structs and classes in C#, the penny dropped. Each iteration through allAnimals was getting a copy of the animal and calling setSpots. If I changed the definition of Animal to a class instead of struct, I could use the original code.

    class Animal
    {
        public int Spots;
    }

    class Program
    {
        static void Main(string[] args)
        {
            List<Animal> allAnimals = new List<Animal>();
            allAnimals.Add(new Animal());
            allAnimals.Add(new Animal());

            foreach (Animal animal in allAnimals)
            {
                animal.Spots = 5;
            }

            Debug.WriteLine(String.Format("First animal spots: {0}", allAnimals[0].Spots));
        }
    }

    Incidentally, unlike in C++, struct members in C# do not default to public accessibility; as with classes, they default to private.


    Noise Reduction in the VS2008 Profiler

    • 1 Comment

    One of the new profiler features in Visual Studio Team System (VS2008) is called Noise Reduction. This feature is intended to make it easier to review the Call Tree view by reducing the amount of data that is displayed, while still showing the most important functions.

    To illustrate this new feature I wrote a very simple native C++ application that uses TR1 from the Feature Pack Beta. In this simple app I create some shared_ptrs in a for loop after calling a recursive function a few times. If you're not familiar with TR1, take a look at the VC blog for more information.

    #include <tchar.h>
    #include <memory>

    class A
    {
    public:
        A(int v, int w) : b(v), c(w)
        {}
    private:
        int b;
        int c;
    };

    void recurse(int v)
    {
        if (v > 0)
        {
            recurse(--v);
        }
        else
        {
            A someA(1,2);
            for (int i=0; i < 100; ++i)
            {
                std::tr1::shared_ptr<int> a(new int(2));
                std::tr1::shared_ptr<A> b(new A(2,3));
                std::tr1::shared_ptr<A> c(b);
            }
        }
    }

    int _tmain(int argc, _TCHAR* argv[])
    {
        recurse(5);
        return 0;
    }

     

    I profiled the application using Instrumentation Mode as shown below. Switching to the 'Call Tree' view and expanding it, I see my main function, which calls recurse 6 times (the initial call from main plus 5 recursive calls from inside the if statement in recurse). There are also many other calls with very little inclusive time (e.g. __RTC_CheckESP).

    default

     

    To reduce the noise in the call tree I choose to enable noise reduction by clicking on the icon on the far right of the toolbar (it looks like a checklist). In the dialog I enable trimming, using a threshold of 3%.

    trim_dlg

     

    After enabling this option I expand the call tree again and many of the extra calls are gone and the calls to recurse are much easier to see as shown below.

    trim

     

    I still think I can do a little better so I open up the Noise Reduction dialog again and 'Enable Folding'.

    fold_dlg

     

    The resulting call tree can now be completely expanded and shown without scroll bars. I can see that the functions with significant exclusive time include __security_init_cookie (28.40%), the shared_ptr setup for A (10.23%), the shared_ptr refcount destructor (11.30%) and the last call to recurse (7.58%).

    fold

     

    Using Noise Reduction in a larger application should make it easier for you to find performance problems.

    Details:

    • Trimming removes leaf nodes in the call tree that have Elapsed Inclusive Time % (or Inclusive Sample % if you use sampling mode) lower than the threshold.
    • Folding combines (folds) a child node up to its parent node if it is the only child and the Elapsed Inclusive Time % (or Inclusive Sample % if you use sampling mode) is within Threshold of its parent. The intention is to fold simple forwarding functions that don't have much influence on the performance of your code.

    Trimming is applied first and then folding, which is why the calls to recurse are all folded.


    VS2010: Investigating a sample profiling report (Function Details)

    • 0 Comments

    I’ve already discussed how the new profiler summary page makes it easier to discover and fix performance issues, but sometimes a little more investigation is required. This time I’m going to drill in with another new VS2010 feature called the Function Details View.

    I’m going to use PeopleTrax application again, which you can download from CodeBox. I have a report which I collected earlier using the Sampling method of profiling, since the application appears to be CPU bound (at least on a single core).

    In this report I take a look at the Hot Path and see that People.GetNames is calling two hot functions ‘StringReader.ReadLine()’ and ‘String.Trim()’.

    1_summary

    I’d like to see where these expensive calls are in my code so I click on GetNames (shown in purple above) to bring up the new Function Details view.

    2_function_details_splitscreen

    There are a few things to note about this new view.

    • The title at the top is for People.GetNames().
    • There is a dropdown to switch between ‘Inclusive Samples %’ and ‘Inclusive Samples’.
    • There are some blue boxes showing callers and callees (I’ll cover this more later).
    • The code is shown with hot lines and an annotated margin.

    In the default view we would need to drag the splitter (shown in purple) to see all of the boxes. Instead, we can use the toolbar option to ‘Split Screen Vertically’, also highlighted in purple.

    3_function_details_splitscreen2

    Now we can clearly see where the calls to ReadLine() and Trim() are in the code on the right-hand side. There is also a metrics table in the ‘Function Performance Details’ section and there are some related links.

    At this point, let’s look at the blue boxes. The left-hand box shows the callers of this function. The height of the box represents the relative cost for the metric that is currently chosen (in this case Inclusive Samples). Since we have a single caller, it takes up the entire box.

    On the right we have another blue box that contains several sections:

    • Function Body (87 samples). The height of this box represents the relative number of samples in the body itself (Exclusive Samples).
    • ReadLine (727 samples). This is the function with the most Inclusive Samples. Again, the height is proportional so it is also the biggest box.
    • Trim (642 samples). The function with the second most Inclusive Samples. Slightly smaller height.
    • Other functions with smaller sample counts.

    Each of these boxes (aside from Function Body) is clickable, allowing navigation of the Caller/Callees.

    Looking at the code I can’t see an easy way to simplify it and I don’t control either Trim() or ReadLine(), so now let’s navigate one level up in the callers by clicking on GetPeople.

    6_function_details_clickable

    Clicking causes us to navigate to the GetPeople Function Details view:

    7_function_details_navigate

    From the code on the right-hand side we can see that 89.2% of the samples occur in the two highlighted calls to GetNames. From the loop structure it seems like a good idea to avoid making these GetNames calls inside the for loop. Followers of the PeopleTrax application will notice that this is the first optimization suggestion for this application: cache the GetNames results in the constructor. The next step in this investigation would be to change the code, collect another profile and compare the reports, but I’ll leave that up to you.

    NOTE: line-level information is only available for sample-profiling. Since this information is not available in Instrumentation mode, highlighting and margin annotation is also not available for Instrumentation mode.


    There's a Profiler in Visual Studio?

    • 3 Comments

    I've just started working on the profiler team in Visual Studio Team System. Many people are surprised to hear that Visual Studio actually has a profiler, and that it has had one since Visual Studio 2005. To use the profiler you need either Visual Studio Team System for Developers or Visual Studio Team Suite.

    So where exactly is the profiler? Well in VS2005 it is buried in the 'Tools' menu under 'Performance Tools'. Fortunately the profiler is being promoted in the next release of Visual Studio to become part of a new 'Developer' menu. You can see it on my colleague Ian's blog in Figure 1.1.

    I must admit that I hadn't used the VS2005 profiler extensively before joining this team, so I went in search of more information about it. Here are a few of the things I found:

    • Videos - (Part 1, Part 2)
      Ian stars in videos from 2005 that show a semi-real example of how you might use a profiler to improve performance. He discusses what is probably the most confusing thing for new users: what exactly is the difference between sampling and instrumentation?
    • Profiler Blog
      Includes posts from other members of the team.
    • Ian's Blog
      Interest from Ian's blog articles led to the videos above. Be sure to check out some of the new features that are part of the next VS Beta.
    • Tech Notes
      More detailed articles about profiling and other aspects of VS2005.
    • MSDN Forum
      Here you can either ask questions (and developers do actually monitor and answer) or look at past answers.

    That's it for now.


    Tip: Fixing VSPerfASPNetCmd metabase errors

    • 0 Comments

    VSPerfASPNetCmd is a new Visual Studio 2010 tool that helps you profile ASP.Net websites from the command-line. Recently I noticed an error message which didn’t cover one common situation so I thought I’d write about it. Here’s an example:

    > VSPerfASPNetCmd.exe http://localhost
    Microsoft (R) VSPerf ASP.NET Command, Version 10.0.0.0
    Copyright (C) Microsoft Corporation. All rights reserved.

    Error
    VSP 7008: ASP.net exception: "The website metabase contains unexpected information or you do not have permission to access the metabase.  You must be a member of the Administrators group on the local computer to access the IIS metabase. Therefore, you cannot create or open a local IIS Web site.  If you have Read, Write, and Modify Permissions for the folder where the files are located, you can create a file system web site that points to the folder in order to proceed."

    The information in the error is correct and it is worth checking to make sure that you are running from an elevated command prompt, but it does miss a common configuration issue. In order to query for information from the IIS metabase, certain IIS components need to be installed.

    To check this in Windows 7:

    1. Open ‘Control Panel\Programs\Programs and Features’ (or run ‘appwiz.cpl’).
    2. Choose ‘Turn Windows features on or off’.
    3. In the ‘Internet Information Services’ section, make sure that the following options are selected.

     

    IIS Configuration Options

    The non-default options include:

    • IIS 6 Scripting Tools
    • IIS 6 WMI Compatibility
    • IIS Metabase and IIS 6 configuration compatibility
    • ASP.NET
    • Windows Authentication

    Visual Studio 2010

    • 2 Comments

    This week Microsoft is starting to talk about Visual Studio 2010. One of the best resources I've seen has a bunch of videos from various Microsoft folks about some of the new features and the high-level goals of the next release of Visual Studio. Take a look at the Channel 9 site.

    There isn't much about profiling just yet, although some of the other features being developed by the diagnostics team including 'Eliminate No-Repro Bugs' and 'Test Impact Analysis' are covered. For more details about those features, you may also want to keep an eye on John's blog.

    We'll also start talking about some of our new profiling features on the main profiler blog. If you're going to PDC, be sure to check out my boss Steve Carroll's (and Ed's) session.

    You'll get to see how some of our new profiler features make finding and solving performance problems easier.


    Performance: Inserting Marks Using Code

    • 0 Comments

    Ian previously covered using the VS 2008 Data Collection Control to choose when to collect data. The Data Collection Control can also be used to insert marks into the performance report, but sometimes it is convenient to modify the source code to do this automatically.

    Consider a typical application (PeopleTrax) where I am interested in gathering profiler data only between when a button is clicked and when data is displayed. The application is shown below.

    pre_click

    After the 'Get People' button is clicked, data is displayed after just over 6 seconds. This seems a little excessive so I want to focus my performance investigation in this area.

    post_click

    To filter the data so that it only shows information collected between those two points, I could use the Data Collection Control, but maybe I'm planning to run a complicated scenario and don't want to have to remember to insert the marks manually. Instead, it is possible to modify the original code to request that the profiler insert marks at the required locations.

    The Profiler API is available for managed code in an assembly that can be added directly to the project from \Program Files\Microsoft Visual Studio 9.0\Team Tools\Performance Tools.

    add_ref_zoom_border

    After adding the reference it shows up in the 'References' list for the PeopleTrax project.

    add_ref_done_zoom_border

    I can then use functions in the Profiler API to control the profiler. This might include starting or stopping data collection or in this case, inserting marks into the datastream. This is easily achieved as shown below.

    mark_in_code

    I can then profile the application and when I open the Performance Report and switch to Marks View I see that the marks have been correctly inserted. We can also see that the time elapsed between the marks is about 6.5 seconds, which corresponds with the measurement that is already displayed in the PeopleTrax UI.

    marks_view_zoom_border

    I can use the marks to filter the report to only show profiling data for the time between the two inserted marks and then start my performance investigation.

    filter_on_marks_border
