Larry Osterman's WebLog

Confessions of an Old Fogey

Types of testers and types of testing



In yesterday’s “non admin” post, Mat Hall made the following comment:

"Isn't testing the whole purpose of developing as non-admin?"

Remember, Larry is lucky enough that the REAL testing of his work is done by someone else. The last time I did any development in a team with dedicated testers, my testing was of the "it compiles, runs, doesn't break the build, and seems to do what I intended it to". I then handed it over to someone else who hammered it to death in completely unexpected ways and handed it back to me...
 

Mat’s right, and it served as a reminder to me that not everyone lives in the ivory tower with the resources of a dedicated test team.  Mea culpa.

Having said that, I figured that a quick discussion about the kinds of testers and the types of tests I work with might be interesting.  Some of this is software test engineering 101, some of it isn’t.

In general, there are four different kinds of testing done on our products.

The first type of testing is static analysis tools.  These are tools like FxCop and PREfast that developers run on our code daily; they help catch errors before the code ever leaves the developers' machines.  Gunnar Kudrjavets has written a useful post about the tools we use that can be found here.

The second is the responsibility of the developer – before a feature can be deployed, we need to develop a set of unit tests for that feature.  For some components, these tests can be quite simple.  For example, the waveOutGetNumDevs() unit test is relatively simple, because the API doesn’t take any parameters and thus has a relatively limited set of scenarios.  Other components have quite involved unit tests; the unit tests in Exchange Server that exercise email delivery can be quite complicated.

In general, a unit test functions as a “sniff test” – it’s the responsibility of the developer to ensure that the basic functionality continues to work.

The next type of testing done is component tests.  These are typically suites of tests designed to thoroughly exercise a component.  Continuing the waveOutGetNumDevs() example above, the component test might include tests that involve plugging in and removing USB audio devices to verify that waveOutGetNumDevs() handles device arrival and removal correctly.   Typically a component covers more than a single API – all of the waveOutXxx APIs might be considered a single component, for example.

And the last type of testing done is system tests.  The system tests are the ones that exercise the product as a whole.  So there won’t be a waveOutXxx() system test, but the waveOutGetNumDevs() API would be tested as part of a system test.  A system test typically involves cross-component tests; for example, it would test the interaction between the mixerXxx APIs and the waveOutXxx APIs.

System tests include stress tests and non-stress tests; both are critical to the process.

Now for types of testers.  There are typically three kinds of testers in a given organization. 

The first type of tester is the developer herself.  She’s responsible for knowing what needs to be tested in her component, and it’s her job to ensure that her component can be tested.  It’s surprising how easy it is to have components that are essentially untestable, and those are usually the areas that have horrid bugs.

The second type of tester is the test developer.  A test developer is responsible for coding the component and system tests mentioned above.  A good test developer is a rare beast; it takes a special kind of mindset to be able to look at an API and noodle out how to break it.  Test developers also design and implement the test harnesses that are used to support the tests.  For whatever reason, each team at Microsoft has its own pet favorite test harness; nobody has yet been able to come up with a single harness that makes everyone happy, so teams tend to pick their own and run with it.  There are continuing efforts to at least rationalize the output of the various harnesses, but that’s an ongoing process.

The third type of tester is the test runner.  This sounds like a button-presser job, but it’s not.  Many of the best testers I know do nothing but run tests; their worth is in their ability to realize that something’s wrong and to function as the first line of defense in tracking down a bug.  Since the test runner is the first person to encounter a problem, they need to have a thorough understanding of how the system fits together so that (at a minimum) they can determine who to call in to look at a bug.

One of the things to keep in mind is that the skill sets for each of those jobs are different, and they are ALL necessary.  I’ve worked with test developers who don’t have the patience to sit there, install new builds, and run them.  Similarly, most of the developers I’ve known don’t have the patience to design thorough component tests (some do, and the ability to write tests for your component is one of the hallmarks of a good developer IMHO).

 

  • So what kind of tester am I?

    Larry: Hey I just finished this component and checked it in.
    Me: Cool! What's it do?
    Larry: Here, let me show you...
    Me: Why do you have a blue screen? Seems to me that your code doesn't work correctly.
    Larry: Hmmm, I must have something loaded that I shouldn't have. Let me reboot, and then I'll show you.
    Me: Okie.
    Larry: Wait, now my machine won't boot.
    Me: Ok, let me go get a coke and maybe it'll work while I'm gone.
    <returning later>
    Me: How goes it?
    Larry: Great, here it is up on the screen.
    Me <rounding the desk>: All I see is a blue screen.
    Larry: $#*)#&#

    The Wife - breaking everyone's code for the last 20 years. I call a company to order a product and their network goes down. I visit the bank to deposit a check and the terminal of the person who is helping me goes dead. I walk down the hallways at Microsoft and errors appear but magically disappear as I leave. When I was a tester, even the testing apps wouldn't run for me. All the subtle bugs happened only when I was around.

    And no, watches don't tend to run on me either. I must have a seriously magnetic personality. :-)
  • Dear, you're the kind of tester that causes developers to cringe in horror when you walk near :)

    And the kind of tester that gives a good name to the testing organization, because you find the stuff that our customers find but we never do.
  • LOL, you know I have the other type of personality: people come to me with problems in my code, I walk up to look at it, and the problem goes away and can't be reproduced.

    On a more serious note: excellent post, Larry. I have a question for you, as I'm doing more and more agile development with unit tests and so on. I'm now experimenting with test-driven development, where you write the test harnesses before you write the code under test. It's a different mindset, and it takes a little more thought to anticipate what can break code you haven't written yet.

    Any thoughts or opinions on test-driven development?
  • Jeff, the simple answer is that I think that TDD is a fascinating idea, and it's got some merits.

    I personally don't believe that developing the test BEFORE writing the code is that important, but on the other hand, developing the test WHILE you're writing the code is a really cool idea.

    We're currently working with a TDD-style paradigm in our group, and so far I'm encouraged.
  • A dev whose testing consists of "it compiles, runs, doesn't break the build, and seems to do what I intended it to" is probably a hack that I never want to work with. A developer's testing has to go beyond "F7 F5 check it in" because it's the dev's job to deliver not just code, but *working* code.

    Rather than continue on with a full commentary, I refer the reader to _Writing Solid Code_, which is a great book that every programmer should read if they want to progress from "code generator" to "respected professional developer".
  • Note (since I know I'll get flamed if I don't clarify this) I wasn't flaming Mat Hall specifically. I was just reusing his quote.
  • Ya know, Mike - I totally forgot to put in the _Writing Solid Code_ plug. You're absolutely right, it's an awesome resource, as is _Code Complete_.
  • 9/23/2004 1:16 PM Larry Osterman

    > And the kind of tester that gives a good
    > name to the testing organization,

    That sounds like a theoretical possibility. It does sound like Microsoft's testing organization could get a good name by hiring a few hundred people just like her.

    > because you find the stuff that our
    > customers find but we never do.

    Actually quite a lot still waits for customers to find, and then MS reneges on warranties. If customers pay fees to submit bug reports then sometimes MS fixes them in subsequent versions of the products ... hmm yeah some get fixed in service packs too ... but some of them don't get fixed and customers still get left with entire destroyed hard disk partitions etc.

    By the way some customers use non-US versions of Microsoft products. Depending on what the bug is, sometimes it didn't get repaired at all (as mentioned above), but sometimes it got tested and repaired in the US version only. For example US Windows 2000 handles Japanese fonts in Japanese applications better than Japanese Windows 2000 does, for exactly this reason.
  • This is great
  • "Note (since I know I'll get flamed if I don't clarify this) I wasn't flaming Mat Hall specifically. I was just reusing his quote."

    No problem -- I was somewhat simplifying the issue, and left myself open for abuse. Obviously I had to do some sort of testing to make sure it wasn't going to go down in flames and did what I expected it to do, but I left the job of testing marginal/unusual use cases, stress testing, and running under a range of platforms/user types [the source of the original comment] to people who were much better at it than me.

    I think of it as being a bit like proofreading -- when I write something then I do my damndest to make sure it makes sense, is free of grammar and spelling errors, etc. However, it's almost inevitable that something's going to slip through the net, as when I read it back I know what it was supposed to say and as a consequence tend to see what I meant instead of what I wrote. Having a fresh pair of eyes give it a once over (especially a pair trained to spot these kinds of things) is always helpful.

    Now that I'm a one-man shop for my entire department, I no longer have this luxury, which on the one hand is a pain -- I don't particularly enjoy doing extensive testing -- but on the other hand I've found I tend to code a lot more carefully and think about the odd things the users are going to get up to before I even put finger to keyboard. Still, life would be much easier if users didn't insist on doing things you didn't expect them to do. ("It says 'Select a text file to process', so why on earth are you trying to open a PDF? Of *course* it crashes!")
  • So will Microsoft ever make PreFAST and PreFIX available to the industry at large? Besides the driver version of PreFAST, that is.

    I know there are similar tools available commercially from other vendors; but my general attitude is that if it's good enough for Microsoft, then...:)
  • JMW, I don't know the answer to that, it's a good question.
  • PREfast will be available in Visual Studio Team System (VSTS) next year, along with FxCop for managed code. For more information about VSTS, see Rob Caron's blog (see the comment/trackback above you ;))

    In this testing context, the "Visual Studio 2005 Team Test Edition" is most interesting. Here is a link to its homepage:
    http://lab.msdn.microsoft.com/vs2005/teamsystem/tester/default.aspx. From here you'll find links to more info as well.