Lately, I have been doing a great deal of performance testing, and honestly, I am glad I had the opportunity and time to get good at it. I have only been working at MS for one and a half years, so a lot of the tasks we do here every day are still kinda new and exciting to me. Performance testing, for example, is not as easy as some people may think. There are so many factors that can affect both the test itself and the results. Most importantly, the purpose of a performance test is to measure an application's scalability and validate its architecture. Sometimes I am even surprised by the conclusions we come to. All I have to say is, “Performance testing is a science!”

What makes performance testing a science in my mind is that the tester has to understand what he/she is going after, what steps he/she needs to take to get there, and most importantly, how to interpret the many numbers that come back from running the test. Here is a simple list of things a typical performance test would need (a small sketch tying these pieces together follows the list):

- Client(s) as the computers that run the test
- Target as the object a tester wants to measure
- Performance Counters as the metrics a tester needs to pay attention to
- Duration as the length of the test, which can vary dramatically from a few minutes to several days; in some extreme cases, even a few months
- Warmup and CoolDown Time as the time the Target gets to prepare before being stressed and “attacked,” and to settle down afterward
- Numbers as the results a tester gets back from running the performance test
- Conclusion as the analysis of those numbers
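
To make the list a little more concrete, here is a minimal, single-threaded sketch of what one test client could do: warm the Target up, drive load for a fixed Duration while recording the Numbers, cool down, and print a summary to feed the Conclusion. This is not how our actual tooling works; the `TARGET_URL`, `hit_target`, and `run_phase` names are made up purely for illustration.

```python
import statistics
import time
import urllib.request

# Hypothetical Target endpoint -- substitute whatever you are measuring.
TARGET_URL = "http://localhost:8080/api/health"

WARMUP_SECONDS = 10      # let the Target reach a steady state before measuring
DURATION_SECONDS = 60    # measurement window (real runs can last days)
COOLDOWN_SECONDS = 10    # let the Target settle before reading final counters


def hit_target(url: str) -> float:
    """Issue one request against the Target and return its latency in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        response.read()
    return time.perf_counter() - start


def run_phase(seconds: float, record: bool, latencies: list) -> None:
    """Drive load for a fixed number of seconds, optionally recording Numbers."""
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        latency = hit_target(TARGET_URL)
        if record:
            latencies.append(latency)


def main() -> None:
    latencies: list = []

    run_phase(WARMUP_SECONDS, record=False, latencies=latencies)   # Warmup
    run_phase(DURATION_SECONDS, record=True, latencies=latencies)  # Duration
    time.sleep(COOLDOWN_SECONDS)                                   # CoolDown

    # The "Numbers" -- raw results that still need a human to draw the Conclusion.
    print(f"requests:    {len(latencies)}")
    print(f"throughput:  {len(latencies) / DURATION_SECONDS:.1f} req/s")
    print(f"avg latency: {statistics.mean(latencies) * 1000:.1f} ms")
    print(f"p95 latency: {statistics.quantiles(latencies, n=20)[-1] * 1000:.1f} ms")


if __name__ == "__main__":
    main()
```

A real run would, of course, use many Clients in parallel, collect Performance Counters from the Target machine itself, and last far longer, but the shape of the loop stays the same.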

If you haven't noticed yet, I was trying to create a cool acronym out of the first letters of each term. Have a nice day, all! Hopefully you'll find this post interesting.

P.S. My friend and I have been going to Applebee's lately for its Happy Hour menu. All I have to say is, “Man, it is good!”