I submitted some changes to an automation script a few days ago and failed my code review. Code reviews are the mechanism we use to ensure we all adhere to coding guidelines - Pascal casing vs. camel casing, for instance. A peer will look over my code to ensure I'm following our agreed-upon best practices. There is nothing special about the OneNote test team best practices. We simply follow the .NET Framework guidelines.

Anyway, my first programming experiences were on old 8-bit computers. My first formal training came on a Commodore PET with interpreted BASIC. With so little memory, and with the slowness of that first generation of home computers, you used any optimization you could discover. One of these had to do with variables. The value of each variable was stored in a table, and each time BASIC had to retrieve a variable's value, it started at the beginning of the table and searched through it until it found the entry it needed. Imagine you wanted to organize your tools in little bins on a wall. Each time you needed a tool, you started at the top-left bin and looked in each bin until you found the tool you needed. If your awl was in the last bin, you would have to look through every bin every time you needed the awl. Clearly, if you were woodworking, you would want awls and chisels near the beginning of your bins, and wrenches and sheet metal tools near the end, to save yourself time.

To a running program, this meant that if you used a variable constantly, you wanted it at the beginning of that table so that when the computer (interpreter) went to look up its value, it did not have to search far through the table each time. A variable that was only used once or twice was better located at the end of the table. Since all variables were what we now call "global," this worked fairly well.
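The lookup scheme can be sketched in a few lines of modern code. This is a simplified model in Python, purely for illustration - the names set_var and get_var are mine, and a real 8-bit interpreter did all of this in machine code over raw memory:

```python
# A simplified model of an old BASIC interpreter's variable table:
# a flat list of [name, value] pairs, searched from the front on every access.

table = []  # variables appear here in the order they are first used

def set_var(name, value):
    """Update a variable, creating it at the end of the table on first use."""
    for entry in table:
        if entry[0] == name:
            entry[1] = value
            return
    table.append([name, value])  # first use: appended to the end

def get_var(name):
    """Linear search from the front; returns (value, comparisons made)."""
    steps = 0
    for entry in table:
        steps += 1
        if entry[0] == name:
            return entry[1], steps
    raise NameError(name)

set_var("SC", 0)   # created first -> front of the table
set_var("AL", 0)
set_var("SH", 3)

value, steps = get_var("SC")
print(steps)  # 1 comparison: cheap, because SC was created first
value, steps = get_var("SH")
print(steps)  # 3 comparisons: the last variable created costs the most
```

Every access to SH walks past SC and AL first, which is exactly why you wanted your hottest variables created earliest.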

The mechanism for adding a variable to the table was pretty simple. The first time a variable was used, it was added to the table. If it was the first variable in the program, it became the first entry in the table. A pretty simple way to order variables in an optimized fashion was to declare all your variables at the beginning of the program - think of something like this from a space-based shooting game:

10 DIM SCORE, ALIENS, SHIPS

In this case, the variable "SCORE" would represent the running score. Hopefully, the player would cause it to go up repeatedly, making it a candidate for the most used variable. The number of "ALIENS" killed would be the second-quickest value to find, and so on.

This also meant there could be many lines of code between the creation of a variable and its first actual use. Also, memory was at a premium, so "SCORE" would just as likely have been "S." That made it harder to tell what a variable was for when reading only a line at a time. Still, the performance benefit of this practice tended to outweigh the cost in readability, and most books and articles I saw recommended putting all variables at the beginning of the code.

Fast forward to my code review.

I had a method like this:

void Verify()
{
    int itemCount = 0;

    //code
    //code
    //do something - about 40 lines worth of code, etc…

    itemCount = GetCountOfItems();

    //now verify my count is correct...
}

And I was gently corrected to change my code to this:

void Verify()
{
    //code
    //code
    //do something - about 40 lines worth of code, etc…

    int itemCount = GetCountOfItems();

    //now verify my count is correct...
}

The obvious change was to move the declaration of the variable to just before its first use. No big deal - I made the change, resubmitted, and passed. My old 8th-grade programming skills let me down, though. The biggest irony here is that the Commodore, the machine on which I learned to program, ran Microsoft BASIC.

Just a day in the life of a tester.

Questions, comments, concerns and criticisms are always welcome,

John