With Visual Studio 2010 we want to make it easier for customers to find and fix performance issues with their code. One of the first things we looked at was the view that shows up after profiling an application – the Summary Page.
I’ll describe a few features of the new summary page using the PeopleTrax application, which you can download from CodeBox. I won’t describe collecting the profiling data since this is already covered on MSDN. The summary page for a sample profiling run is shown below.
I’ll describe the individual pieces of the report below:
Hopefully these new features on the front page will make it quicker and easier than ever to diagnose your performance issues.
I’ve already discussed how the new profiler summary page makes it easier to discover and fix performance issues, but sometimes a little more investigation is required. This time I’m going to drill in with another new VS2010 feature called the Function Details View.
I’m going to use the PeopleTrax application again, which you can download from CodeBox. I have a report that I collected earlier using the Sampling method of profiling, since the application appears to be CPU-bound (at least on a single core).
In this report I take a look at the Hot Path and see that People.GetNames is calling two hot functions ‘StringReader.ReadLine()’ and ‘String.Trim()’.
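For context, the hot loop inside GetNames has roughly this shape — a paraphrase for illustration, not the exact PeopleTrax source:

```csharp
using System.Collections.Generic;
using System.IO;

static class NameParser
{
    // Every iteration hits the two hot functions from the Hot Path:
    // StringReader.ReadLine() and String.Trim().
    public static List<string> GetNames(string rawData)
    {
        var names = new List<string>();
        using (var reader = new StringReader(rawData))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                names.Add(line.Trim());
            }
        }
        return names;
    }
}
```

A loop like this is linear in the input, but it re-runs in full every time GetNames is called, which is why those two framework functions dominate the samples.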
I’d like to see where these expensive calls are in my code so I click on GetNames (shown in purple above) to bring up the new Function Details view.
There are a few things to note about this new view.
In the default view we would need to drag the splitter (shown in purple) to see all of the boxes. Instead, we can use the toolbar option to ‘Split Screen Vertically’, also highlighted in purple.
Now we can clearly see where the calls to ReadLine() and Trim() are in the code on the right-hand side. There is also a metrics table in the ‘Function Performance Details’ section and there are some related links.
At this point, let’s look at the blue boxes. The left-hand box shows the callers of this function. The height of the box represents the relative cost for the metric that is currently chosen (in this case Inclusive Samples). Since we have a single caller, it takes up the entire box.
On the right we have another blue box that contains several sections:
Each of these boxes (aside from Function Body) is clickable, allowing you to navigate through the callers and callees.
Looking at the code I can’t see an easy way to simplify it and I don’t control either Trim() or ReadLine(), so now let’s navigate one level up in the callers by clicking on GetPeople.
Clicking causes us to navigate to the GetPeople Function Details view:
From the code on the right-hand side we can see that 89.2% of the samples occur in the two highlighted calls to GetNames. Given the loop structure, it seems like a good idea to avoid making these GetNames calls inside the for loop. Followers of the PeopleTrax application will notice that this is the first optimization suggestion for this application – cache the GetNames calls in the constructor. The next step in this investigation would be to change the code, collect another profile and compare the reports, but I’ll leave that up to you.
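If you do try it, the caching change itself is small. A hedged sketch, assuming a People class along these lines (the field and property names here are mine, not the actual PeopleTrax source):

```csharp
using System.Collections.Generic;

class People
{
    // Cache the result of the expensive parse once, instead of calling
    // GetNames repeatedly inside the GetPeople loop.
    private readonly IList<string> _names;

    public People()
    {
        _names = GetNames(); // the ReadLine()/Trim() work now runs exactly once
    }

    public IList<string> CachedNames
    {
        get { return _names; }
    }

    private static IList<string> GetNames()
    {
        // ...the existing StringReader.ReadLine()/Trim() parsing goes here;
        // stubbed out so this sketch stands alone.
        return new List<string>();
    }
}
```

The parse cost is paid once at construction rather than on every iteration of the caller's loop.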
NOTE: line-level information is only available for sampling profiles. Since this information is not available in instrumentation mode, highlighting and margin annotations are also unavailable there.
With Visual Studio 2010 we want to make it clearer when you are profiling your application within the Visual Studio IDE. To accomplish this we added a new control which we call ‘Profiling In Progress’ or PIP, which we show when you launch the profiler or attach the profiler to a running application.
Our goal was to show something simple and lightweight while profiling and to make a few simple operations possible. PIP in VS2010 is shown below:
Currently profiling. Pause profiling. Stop profiling or exit the application to generate a report.
The progress/spin control indicates that the UI is responding and there are two links:
After profiling completes, PIP changes to show that we are about to open the report.
Preparing to open report… This might take a while if your symbol server is slow or not responding.
The new guidance feature in the VS2010 profiler will look familiar to people who have used the static code analysis tools in previous versions. However, instead of statically analyzing your code, the profiler runs it and analyzes the results to provide guidance to fix some common performance issues.
Probably the best way to introduce this feature is via an example. Let’s assume you have written a simple application as follows:
using System;

namespace TestApplication
{
    class Program
    {
        static void Main(string[] args)
        {
            BadStringConcat();
        }

        private static void BadStringConcat()
        {
            string s = "Base ";
            for (int i = 0; i < 10000; ++i)
            {
                s += i.ToString();
            }
        }
    }
}
If you profile this application in Sampling Mode you’ll see an Action Link called ‘View Guidance’ on the Summary Page:
Clicking on this link brings up the Error List, which is where you would also see things like compiler errors and static code analysis warnings:
DA0001: System.String.Concat(.*) = 96.00; Consider using StringBuilder for string concatenations.
As you can see there is one warning, DA0001, about excessive use of String.Concat. The number 96.00 is the percentage of inclusive samples spent in this function.
Double-clicking on the warning in the Error List switches to the Function Details View. Navigating up one level of callers, we see that BadStringConcat is calling Concat (96% of Inclusive Samples) and doing some work itself (4%). The String.Concat call is not a direct call, but looking at the Function Code View you can see a ‘+=’ call on a string triggers the call.
The DA0001 rule suggests fixing the problem by changing the string concatenation to use a StringBuilder but I’ll leave that up to the reader. Instead, I’ll cover some of the other aspects of rules.
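For completeness, here is a minimal sketch of the StringBuilder version (the method and class names are mine, not part of the profiler output):

```csharp
using System.Text;

static class StringConcatFix
{
    // Produces the same string as BadStringConcat, but appends into a single
    // growable buffer instead of allocating a new string on every iteration.
    public static string GoodStringConcat()
    {
        var sb = new StringBuilder("Base ");
        for (int i = 0; i < 10000; ++i)
        {
            sb.Append(i);
        }
        return sb.ToString();
    }
}
```

The difference matters because string += copies the entire accumulated string on each iteration (quadratic total work), while StringBuilder.Append is amortized linear — which is exactly the pattern DA0001 is flagging.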
One of the more important questions is what to do if you wish to turn off a given rule (or even all rules). The answer is to open the ‘Tools/Options’ dialog and, in the Performance section, navigate to the new ‘Rules’ subsection:
Here you can see that I’ve started searching by typing ‘st’ in the search box at the top. This dialog can be used to turn off rules (by clicking the checkboxes on the left), or to change rule categories to ‘Warning’, ‘Information’ or ‘Error’. The only effect is to change how the rule is displayed in the Error List.
If you have a situation where you are sharing a profiler report (VSP file) with team members, sometimes it might be useful to let them know that a warning is not important or has already been considered. In this case you can right-click on the Error List and choose ‘Suppress Message’.
The rule warning is crossed out and you can choose to save the VSP file so that the next time it is opened, the suppression is shown:
That’s it for now. I plan on covering a little more about rules in a future post, including more details about the rules themselves, how you can tweak thresholds and even write your own rules.
VSPerfASPNetCmd is a new Visual Studio 2010 tool that helps you profile ASP.Net websites from the command-line. Recently I noticed an error message which didn’t cover one common situation so I thought I’d write about it. Here’s an example:
> VSPerfASPNetCmd.exe http://localhost
Microsoft (R) VSPerf ASP.NET Command, Version 10.0.0.0
Copyright (C) Microsoft Corporation. All rights reserved.
Error VSP 7008: ASP.net exception: "The website metabase contains unexpected information or you do not have permission to access the metabase. You must be a member of the Administrators group on the local computer to access the IIS metabase. Therefore, you cannot create or open a local IIS Web site. If you have Read, Write, and Modify Permissions for the folder where the files are located, you can create a file system web site that points to the folder in order to proceed."
The information in the error is correct and it is worth checking to make sure that you are running from an elevated command prompt, but it does miss a common configuration issue. In order to query for information from the IIS metabase, certain IIS components need to be installed.
To check this in Windows 7, open Control Panel –> Programs –> Turn Windows features on or off, and expand the Internet Information Services node:
The non-default options include:
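If you prefer to script the check, DISM can report and enable the relevant feature from an elevated prompt. The feature name below is the standard Windows optional-feature name for the IIS 6 metabase compatibility layer, but verify it on your SKU with /Get-Features first:

```
dism /online /Get-Features | findstr /i "IIS"
dism /online /Enable-Feature /FeatureName:IIS-Metabase
```

After enabling the feature, re-run VSPerfASPNetCmd from an elevated prompt.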
The Visual Studio 2010 MSDN documentation includes more detailed examples (including screenshots) than previous versions. Here's a decent introduction to profiling: Beginners Guide to Performance Profiling
The ‘Just My Code’ feature in the profiler has a few differences from the ‘Just My Code’ feature in the debugger, so this post should provide a useful introduction.
Here’s a very simple program I’ll use in this post.
using System;

class Program
{
    static void Main(string[] args)
    {
        Foo();
    }

    private static void Foo()
    {
        double d = 0;
        for (int i = 0; i < 100000000; ++i)
        {
            d += Math.Sqrt(i);
        }
    }
}
Typically when profiling you are most interested in optimizing code that you either wrote or you have control over. Sure, sometimes there will be issues in the frameworks that you are using or in other binaries, but even then you often control the calls into those frameworks. Just My Code or JMC is intended to filter the data that is displayed in profiler reports so that more of the code you control shows up in the reports and the reports are more manageable.
For example, the Call Tree after collecting sampling data for the simple program above, with JMC off, is shown below:
With the default JMC options, this reduces down to:
There are two conditions for code being considered ‘My Code’ by the profiler, and both are applied at the module level (the Module Name column in the screenshots above). In the example above, this means the checks are made against the clr.dll, mscoree.dll, mscoreei.dll and ConsoleApplication1.exe binaries.
Modules considered ‘My Code’:
You can temporarily toggle JMC on or off on the profiler Summary Page in the Notifications area using ‘Show All Code’ or ‘Hide All Code’ (shown in red below):
The default setting may be configured as discussed in the following section.
Use Tools –> Options –> Performance Tools –> General and set options in the ‘Just My Code’ section:
The default has JMC on, showing one level of non-user callee functions. This is why, in the example above with JMC on, we see the call to COMDouble::Sqrt(double) showing up in the call tree.
It is also possible to show one level of non-user code calling user code, which in the example above would add one level of the non-user code that calls Main, as shown below:
When you instrument binaries for profiling, you have already performed some level of JMC filtering: only binaries that you instrument, plus first-level calls into other binaries, show up in the instrumentation report, so JMC is not really necessary there.
The GM of Visual Studio Ultimate, Cameron Skinner, recently gave a talk at Tech-Ed 2011 about Application Lifecycle Management. It is worth taking a look at if you're interested in some of the new features being created for the next release of Visual Studio, such as the Team Navigator, shown below:
Every so often, on days like today, I need to debug on a machine where I don't have a debugger installed. There are a few options in this case, but one of the most convenient of these is to use Visual Studio's remote debugging facility.
The procedure is pretty convenient with Visual Studio 2010, since all you have to do is:
Debug as you usually would. Thanks to Gregg for writing a blog post reminding me how to do this. You might also be interested in his post about Remote Debugging without domain accounts.
If you're a subscriber to msdn magazine, take a look at the article in the March 2008, Vol 23, No 4 issue on Page 81 which describes how to use the Visual Studio 2008 profiler to improve the performance of an application. A couple of members of the profiler team examine a Mandelbrot fractal drawing program in some detail. They isolate and fix several performance problems in the code, speeding up program execution approximately tenfold.
UPDATE: You can read the article here.