When running an automated performance test against a web application, it is important to verify that your results are true and accurate. Automated performance tests that are poorly written or incomplete will often cause unwarranted errors within the application under test. Record-and-playback performance tools make test script generation extremely fast and easy, but the resulting scripts are not always accurate, and steps should be taken to ensure that the script is valid before running tests and reporting results.
The following checklist provides several ways to verify your results and ensure a clean, accurate performance test:
A simple way to verify your test script's validity is to execute a manual walk-through of your application scenario and save all of the logs mentioned above. Next, clear all of the logs and execute a single iteration of your automated performance test with a single user. Now compare the two sets of logs and look for any discrepancies.
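The comparison itself can be automated. The sketch below is a minimal illustration using Python's standard `difflib` module; the file names and log contents are hypothetical, and a real run would read the saved log files instead of using in-memory lists:

```python
import difflib

def log_discrepancies(manual_lines, automated_lines):
    """Return unified-diff lines between two log captures."""
    return list(difflib.unified_diff(manual_lines, automated_lines,
                                     "manual.log", "automated.log",
                                     lineterm=""))

# Illustrative log contents; note the status-code discrepancy on /search.aspx.
manual = ["GET /login.aspx 200", "POST /search.aspx 200"]
automated = ["GET /login.aspx 200", "POST /search.aspx 500"]

for line in log_discrepancies(manual, automated):
    print(line)
```

Any `-`/`+` pairs in the output point at requests where the automated script diverged from the manual walk-through.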
The following SQL statement can be used to get a row count for every user table within a database:
-- sysindexes holds one row per heap or index; rowcnt is the row count
-- SQL Server maintains for the table (it can be slightly approximate)
select object_name(id), rowcnt from sysindexes
where indid in (0,1)                  -- 0 = heap, 1 = clustered index
and object_name(id) not like 'sys%'   -- exclude system tables
order by 1
Now you can count the number of times "action=search" appears in the IIS log to get the total number of times a search was completed. Compare this number to the table row counts from item 2 above.
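A minimal sketch of that count, in Python; the log file name, its contents, and the "action=search" marker are illustrative assumptions, not taken from a real IIS deployment:

```python
# Write a tiny sample IIS (W3C format) log so the sketch is self-contained.
sample = """#Fields: date time cs-uri-stem cs-uri-query
2006-01-01 10:00:01 /search.aspx action=search&term=widget
2006-01-01 10:00:02 /home.aspx -
2006-01-01 10:00:03 /search.aspx action=search&term=gadget
"""
with open("ex060101.log", "w") as f:
    f.write(sample)

def count_requests(log_path, marker):
    """Return the number of non-header log lines containing marker."""
    with open(log_path) as log:
        return sum(1 for line in log
                   if not line.startswith("#") and marker in line)

print(count_requests("ex060101.log", "action=search"))  # 2
```

If this number does not match the corresponding table row count, some search requests either failed silently or never reached the database.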
Appending a query string parameter to requests within your test script is extremely helpful when testing web services, where you POST to the same .asmx page while invoking different methods. Append a query string parameter that indicates the name of the method being invoked, for example:
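A minimal sketch of the idea (the endpoint path and method names here are hypothetical): the extra query string is ignored by the service but makes each call identifiable in the IIS log.

```python
from urllib.parse import urlencode

def tag_request(endpoint, method):
    """Append a marker query string naming the invoked web-service method."""
    return f"{endpoint}?{urlencode({'method': method})}"

print(tag_request("/Services/CustomerService.asmx", "GetCustomer"))
# /Services/CustomerService.asmx?method=GetCustomer
```

With the URLs tagged this way, the log-counting technique above can distinguish GetCustomer calls from UpdateCustomer calls even though both POST to the same page.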
Verifying the overall validity of the test results is a critical part of performance testing, as code changes to the application may be made based on these findings. The integrity of the test must be indisputable.