I’m with a customer in the labs this week doing some performance work, and one of our tasks was to programmatically read the load test database and retrieve a table of summary data from a test run. My customer has instrumented their application well, and came prepared with 3 months of statistical data showing which service calls were used over that period. These service calls map (fairly) nicely onto the unit tests we have, so from this usage data we can project out to any number of potential users and see whether the system will cope.

As an example, their instrumentation data can be grouped (it’s all in SQL) to produce a summary table something like the following (I’ve sketched the sort of grouping query just after the table)…

Method Name                        % calls   Actual calls
AddOnCostWithNameExistsTest           3.5           7865
CanDeletePriceParentTest              2.56          6684
CheckIfCustomerValidForLinkTest       7.55         19012
CheckIfExchangeRateExistsTest         4.34         12042
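
As a sketch of that grouping, assuming the instrumentation writes one row per call into a table such as ServiceCallLog (a name I’ve made up purely for illustration), something like the following would produce the summary above…

-- ServiceCallLog and MethodName are hypothetical names; substitute your own
SELECT MethodName,
       COUNT(*) AS ActualCalls,
       -- each method's share of all calls, as a percentage
       ROUND(100.0 * COUNT(*) / SUM(COUNT(*)) OVER (), 2) AS PercentCalls
  FROM ServiceCallLog
 GROUP BY MethodName
 ORDER BY PercentCalls DESC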

So, we used these percentages to define the test mix in Visual Studio. Then we ran some short tests and I wanted to be able to output something similar from the actual results in the load test database in order to compare them.

The new Export To Excel tool in VS2010 is excellent for comparing one run to another, but that wasn’t what I needed here: I wanted to compare a run against my existing data. In my simple mind, if I have 3 months of calls (for 200 users), then a short load test using the percentages defined above will tell me how many calls the system can handle in, say, 8 hours. I can then extrapolate that out to 3 months and say, with a reasonable degree of accuracy, whether the system should be able to support more users.
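
To put some made-up numbers on that: suppose the 3 months of production data (200 users) contain 1,000,000 calls, and an 8-hour run on the rig at the same test mix completes 50,000 calls. Three months is roughly 90 days, or 270 eight-hour windows, so the rig is sustaining about 50,000 × 270 = 13,500,000 calls over the equivalent period. That’s 13.5 times the production volume, so crudely assuming linear scaling the system ought to cope with something like 13.5 × 200 = 2,700 users.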

My problem, however, was getting the data out of the load test database in a form that was usable to me. I’m a big fan of referential integrity in the database: not only does it ensure you can’t bung in nonsensical data, but tools such as SQL Management Studio can extract the schema and show you the links. After some poking around I identified the two tables that were of use to me. Needless to say, there were no foreign keys between them, most probably for insert performance (i.e. recording test results as fast as possible). Oh well. They are as follows…

[Schema capture: the LoadTestCase and LoadTestTestSummaryData tables, linked on LoadTestRunId and TestCaseId]

There’s also the LoadTestRun table, which is the root as far as test runs are concerned; from it you can get the date/time of the test run and other data (such as the description of the test, once you’ve annotated your run). A quick select on this table will give you a list of all the load tests you have executed, and from this you need the LoadTestRunId…

      SELECT * FROM LoadTestRun
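
If you’ve executed a lot of runs, something slightly more targeted is handy. The column names below (LoadTestName, StartTime, Description) are from my memory of the VS2010 schema, so do check them against your own LoadTestRun table…

      -- most recent runs first; pick out the LoadTestRunId you need
      SELECT LoadTestRunId, LoadTestName, StartTime, Description
        FROM LoadTestRun
       ORDER BY StartTime DESC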

With the unique load test run ID it’s then a simple job to select the summary data. I created a stored proc to do this as follows…

CREATE PROCEDURE GetTestSummary (@loadTestRunId INT) AS
BEGIN
  SET NOCOUNT ON
  -- Per-test summary for a single run: join the summary figures
  -- to the test case names recorded for that run
  SELECT LTC.TestCaseName, LTTSD.TestsRun, ROUND(LTTSD.Average, 2) AS 'Average (s)',
         ROUND(LTTSD.Minimum, 2) AS 'Minimum (s)', ROUND(LTTSD.Maximum, 2) AS 'Maximum (s)',
         ROUND(LTTSD.Median, 2) AS 'Median (s)'
    FROM LoadTestTestSummaryData LTTSD
    INNER JOIN LoadTestCase LTC
      ON LTTSD.TestCaseId = LTC.TestCaseId
      AND LTTSD.LoadTestRunId = LTC.LoadTestRunId
    WHERE LTTSD.LoadTestRunId = @loadTestRunId
    ORDER BY LTC.TestCaseName ASC
END
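
Calling it is then just a case of passing in the run ID found above (the 42 here is only a placeholder)…

      EXEC GetTestSummary @loadTestRunId = 42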

This produces just the output I was looking for…

[Results grid: TestCaseName, TestsRun and the rounded Average/Minimum/Maximum/Median timings in seconds]

So, I can now extrapolate these figures out to see how many of these tests could run in three months, and from that work out the likely user load the system could handle. It’s not perfect, as table sizes will change over time, but it gives me a good ballpark figure.

One other very useful piece of data we can extract from the customer’s instrumentation is the timings (luckily, these were recorded too). I can use the frequency of calls together with the execution times to judge which methods I should really concentrate on tuning.
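
As a sketch of that ranking, and again assuming a hypothetical instrumentation table (MethodTimings, with CallCount and AvgDurationMs columns; substitute whatever your schema actually uses), ordering by total time spent surfaces the best tuning candidates…

-- MethodTimings, CallCount and AvgDurationMs are hypothetical names
SELECT MethodName,
       CallCount,
       AvgDurationMs,
       -- frequency x duration = where the system actually spends its time
       CallCount * AvgDurationMs AS TotalTimeMs
  FROM MethodTimings
 ORDER BY TotalTimeMs DESC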

Hope this helps someone.

 

Original Post by Morgan Skinner on 18/01/2011 here: http://blogs.msdn.com/b/morgan/archive/2011/01/18/reading-summary-results-from-the-vs2010-load-test-database.aspx