After we released Pex recently, I came across a couple of interesting blog posts by people who tried out Pex, for example Ben Hall, Peter, and Stan. They all ran Pex on a small example. In this post, I want to run Pex on a more complicated piece of code: the .NET ResourceReader, and all the other classes that it uses under the hood.

The ResourceReader takes a stream of data and splits it up into individual chunks that represent resources. In that sense, the ResourceReader is a big parser.
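(To make the setting concrete, here is a minimal sketch of everyday ResourceReader usage -- my own illustration, not code from this post's test project: a ResourceWriter produces a well-formed .resources blob, and the ResourceReader enumerates its entries.)

    using System;
    using System.Collections;
    using System.IO;
    using System.Resources;

    class ResourceRoundTrip
    {
        static void Main()
        {
            // Write a well-formed .resources blob into memory...
            MemoryStream buffer = new MemoryStream();
            using (ResourceWriter writer = new ResourceWriter(buffer))
            {
                writer.AddResource("greeting", "hello");
                writer.Generate();
            }

            // ...and let the ResourceReader split it back into entries.
            using (ResourceReader reader =
                new ResourceReader(new MemoryStream(buffer.ToArray())))
            {
                foreach (DictionaryEntry entry in reader)
                    Console.WriteLine("{0} = {1}", entry.Key, entry.Value);
            }
        }
    }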

Writing a parameterized unit test

I started by creating a new C# test project in Microsoft Visual Studio, and adding a reference to Microsoft.Pex.Framework. (If you try this at home, you need to install Pex first, of course.)

Then I wrote a very simple parameterized unit test for the resource reader. It takes any (non-null) array of bytes, creates an UnmanagedMemoryStream with the bytes, and then decodes it with the ResourceReader. It's not a particularly expressive test, since it doesn't contain any assertions. All this test is really saying is that, no matter what the test inputs are, the code shouldn't throw any exception. (No exception at all? I'll investigate this in more detail later.)

 
    [PexClass, TestClass]
    public partial class ResourceReaderTest
    {
        [PexMethod]
        public unsafe void ReadEntries([PexAssumeNotNull]byte[] data)
        {
            fixed (byte* p = data)
                using (UnmanagedMemoryStream stream =
                    new UnmanagedMemoryStream(p, data.Length))
                {
                    ResourceReader reader = new ResourceReader(stream);
                    foreach (var entry in reader) { /* just reading */ }
                }
        }
    }

Let's run Pex.

Run Pex Exploration

I can now let Pex analyze this parameterized unit test in depth by right-clicking on the parameterized unit test and selecting Run Pex Exploration.

[Screenshot: the Run Pex Exploration context menu entry]

(I am running our latest internal Pex bits, and we just renamed the "Pex It" menu item to "Run Pex Exploration". So if you just see a "Pex It" menu item, use that.)

Pex now generates test inputs. In a nutshell, Pex runs the code, monitors what it's doing, and uses a constraint solver to determine more relevant test inputs that will take the program along different execution paths.
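To get an intuition for what that means, consider this little hypothetical parser fragment (my own illustration; the ResourceReader's real checks are more involved):

    using System;

    static class Illustration
    {
        // Hypothetical parser fragment -- not code from the ResourceReader.
        public static void Parse(byte[] data)
        {
            // Run 1: Pex tries a trivial input (say, an empty array) and
            // observes that execution takes the "data.Length < 4" branch.
            if (data.Length < 4)
                throw new ArgumentException("too short");

            // Run 2: to explore the other path, Pex negates that branch
            // condition and asks its constraint solver for an input with
            // at least four bytes.
            if (data[0] != 0xCE)
                throw new ArgumentException("bad magic");

            // Run 3: negating "data[0] != 0xCE" yields an input whose first
            // byte is 0xCE, so execution reaches this point -- and so on,
            // ever deeper into the code.
        }
    }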

That sounds great, but the result is a little bit disappointing: Pex reports only two test inputs: an empty array, and an array with one element that is zero.

[Screenshot: Pex Results showing the two generated test inputs]

(Again, I have to say that I am running our latest internal Pex bits. If you see a big black empty region in the lower right part of the Pex Results view, titled "dynamic coverage", then you can get rid of this waste of screen real estate by pressing the button that I circled in red:

[Screenshot: the Pex Results toolbar, with the button that hides the "dynamic coverage" graph circled in red]

We won't show this "dynamic coverage" graph by default anymore. I might talk more about this feature in the future. And now, back to the ResourceReader.)

Pex tells us in bold letters on a red background that it encountered 3 Uninstrumented Methods.

What does that mean? Well, Pex performs a dynamic analysis, which means that it runs the code and monitors what the code is actually doing. However, Pex doesn't monitor all code, but only code that Pex instruments. Pex doesn't instrument all code by default, since the instrumentation and the monitoring come with a huge performance overhead. And often it is not necessary to monitor all the code that is running. For example, when you want to test your algorithm, and your algorithm happens to write its progress to the console, then you still only want to test your algorithm, but not all the supporting code that hides behind Console.WriteLine.
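In that situation you could explicitly tell Pex not to worry about the uninstrumented code behind Console, using the same assembly-level attribute that will show up again below. (Console is just my example here; later in this post I only apply this attribute to GC.)

    using System;
    using Microsoft.Pex.Framework.Suppression;

    // Don't report uninstrumented methods on Console as warnings.
    [assembly: PexSuppressUninstrumentedMethodFromType(typeof(Console))]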

Configuring what Pex analyzes: Instrumentation settings

But back to the ResourceReader: Let's click on the 3 Uninstrumented Methods warning.

[Screenshot: the list of the three uninstrumented methods]

We see that Pex instrumented neither UnmanagedMemoryStream nor ResourceReader. No wonder Pex's analysis didn't find any really interesting test inputs! Pex also didn't monitor what happened in a call to GC.SuppressFinalize, but this relates to the .NET garbage collector, and is probably not so important. (In fact, Pex has a built-in list of classes that don't need to be monitored, and we might add the GC class to this list in the future.)

Let's select UnmanagedMemoryStream..ctor (".ctor" is the .NET metadata name for constructors),

[Screenshot: the UnmanagedMemoryStream..ctor entry selected in the uninstrumented methods list]

and click on Instrument type, and repeat for ResourceReader..ctor. For GC.SuppressFinalize, I click on Ignore uninstrumented method instead. Pex persists my choices as assembly-level attributes in a new file called PexAssemblyInfo.cs in the Properties folder. It now looks like this:

    using Microsoft.Pex.Framework.Instrumentation;
    using System.IO;
    using System.Resources;
    using Microsoft.Pex.Framework.Suppression;
    using System;

    [assembly: PexInstrumentType(typeof(UnmanagedMemoryStream))]
    [assembly: PexInstrumentType(typeof(ResourceReader))]
    [assembly: PexSuppressUninstrumentedMethodFromType(typeof(GC))]

I let Pex run again. You can re-run the last exploration by clicking on the Play button:

[Screenshot: the Play button in the Pex Results view]

Pex did a little bit more this time -- three test cases! And another 7 Uninstrumented Methods warnings...

[Screenshot: Pex Results showing three test cases and the new warnings]

What are the uninstrumented methods this time?

[Screenshot: the new list of seven uninstrumented methods]

As you can see, Pex's analysis already proceeded much deeper into the code, and now Pex hits all kinds of other methods. Most of the methods sound relevant. So I repeat the game and let Pex instrument all of those types. Then I let Pex run again. Boom! Pex still reports some uninstrumented methods (and lots of other things which I'll explain another time), but it no longer stops so quickly. Instead, Pex is having a ball, and it keeps producing more and more test cases:

[Screenshot: Pex Results filling up with more and more generated test cases]

(If you run this example yourself, the precise results you get will vary. There is some non-deterministic code involved in the analysis.)

Pex even managed to produce a valid resource file! (here, row 215 with the green checkmark)

When you double-click on any row, Pex shows the generated unit test code. Here is the test that constructs the valid resource file:

    [TestMethod]
    [PexGeneratedBy(typeof(ResourceReaderTest))]
    public void ReadEntriesByte_20080604_134738_032()
    {
        byte[] bs0 = new byte[55];
        bs0[0] = (byte)206;
        bs0[1] = (byte)202;
        bs0[2] = (byte)239;
        bs0[3] = (byte)190;
        bs0[7] = (byte)64;
        bs0[12] = (byte)2;
        bs0[20] = (byte)7;
        bs0[24] = (byte)128;
        bs0[25] = (byte)128;
        bs0[32] = (byte)4;
        this.ReadEntries(bs0);
    }

The resource file is 55 bytes long, and most bytes are zero, the default value of the byte type. All values that are not zero were carefully chosen by Pex to pass all the file-validation code of the ResourceReader.
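In fact, the first four bytes are no accident: 206, 202, 239, 190 are 0xCE, 0xCA, 0xEF, 0xBE, the little-endian encoding of 0xBEEFCACE -- the magic number at the beginning of every .resources file. A little sanity check of my own (not part of the generated test):

    using System;

    class MagicCheck
    {
        static void Main()
        {
            byte[] bs0 = new byte[55];
            bs0[0] = 206; bs0[1] = 202; bs0[2] = 239; bs0[3] = 190;

            // On a little-endian machine, the first four bytes decode to the
            // .resources magic number that the ResourceReader validates first.
            Console.WriteLine("{0:X}", BitConverter.ToUInt32(bs0, 0)); // BEEFCACE
        }
    }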

Configuring the test oracle: Allowed exceptions

All the other rows are tagged with an ugly red cross, indicating that the test failed. Did these tests really fail? Let's see. In all cases, the code throws either an ArgumentNullException, ArgumentException, BadImageFormatException, IOException, NotSupportedException, or a FormatException.

These exceptions don't sound so unreasonable, considering that Pex tried obviously ill-formed test inputs in the course of its exploration.

If you look around the documentation of the ResourceReader (here, here, and here), you will find that most of these exceptions are in fact documented somewhere (although the documentation could be more explicit, and it's missing NotSupportedException).

I can inform Pex that these exceptions are okay. The easiest way to do that is to select a failing test case in the table, and then click on Allow It. Again, my choice is persisted as attributes in PexAssemblyInfo.cs:

    [assembly: PexAllowedExceptionFromAssembly(typeof(ArgumentNullException), "mscorlib")]
    [assembly: PexAllowedExceptionFromAssembly(typeof(BadImageFormatException), "mscorlib")]
    [assembly: PexAllowedExceptionFromAssembly(typeof(ArgumentException), "mscorlib")]
    [assembly: PexAllowedExceptionFromAssembly(typeof(NotSupportedException), "mscorlib")]
    [assembly: PexAllowedExceptionFromAssembly(typeof(IOException), "mscorlib")]
    [assembly: PexAllowedExceptionFromAssembly(typeof(FormatException), "mscorlib")]
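(As an aside: if you don't want to allow these exceptions assembly-wide, Pex also offers a method-level PexAllowedException attribute that scopes the decision to a single parameterized test. I am writing this one from memory, so treat the exact namespace as an assumption:)

    using Microsoft.Pex.Framework;
    using Microsoft.Pex.Framework.Validation; // namespace assumed

    [PexMethod]
    [PexAllowedException(typeof(BadImageFormatException))]
    public unsafe void ReadEntries([PexAssumeNotNull]byte[] data)
    {
        // ... same body as before
    }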

[Screenshot: the Allow It action in the Pex Results view]

When you run Pex again, all the test cases whose exceptions are marked as allowed are now tagged with friendly green checkmarks, indicating that they passed.

But what is that? An OverflowException... (here, the red cross in row 98) Yet another undocumented exception...

What can we learn from this exercise?

You have seen how to configure code instrumentation settings, and how to tell Pex that certain exceptions are okay.

But I think there is a much more interesting insight: by using Pex, we discovered some unexpected ways in which some complicated library code behaves -- library code that we build our applications on. Documentation will always be imperfect, and we have to test our applications to make sure that everything works correctly together. By using a white-box testing tool like Pex, which tries many relevant test inputs and not just one exemplary test input, we discovered behaviors (here, exceptions) we were not aware of before!

Stay tuned for more

That's it for today. Another time I might talk about the constraints that Pex had to solve to create a valid resource file, how to get an idea of how much code was analyzed, how to visualize that Pex makes progress, and how to write more expressive parameterized unit tests.

This posting is provided "AS IS" with no warranties, and confers no rights.