The span of time between when one version of Windows is "finished" and when planning and execution of the next version begins is both precious and fleeting.  It's often the time when I do all those projects I've been wanting to do but couldn't justify based upon the needs of the prior product.  This time around I hit upon a goal out of the blue, and had time to plan and execute a fairly substantial task in a very short amount of time.

Put another way, in the turmoil and tumult as people look to change jobs, work to influence future plans and all that, it's easy for a guy like me to become completely invisible and finally get something done for a change.

It's quite obvious that WDF has a lot of USB support built into it, and we need to test that.  To do so, we began with tests based around the FX2 device, but that left a lot of cases we couldn't get at (multiple interfaces and alternate settings, for instance, plus edge cases around pipe count and interface count).  We tried to supplement that with other devices, but they added quite a burden in additional programming time- there's never enough of that around, after all.

Also, economical as the FX2 is, testing with physical devices brings its own hassles: cables break, devices get dropped, spilled on, or simply wear out, and on and on.  Getting one on every machine in a lab, particularly one shared with hundreds or thousands of other people, can be a daunting management task.  But a software simulation of a device is there when you need it and gone when you don't.  If it breaks, you can examine it in as much detail as you can any of the software it is talking to- and with the same tools you already know how to use.

There's more I could say (and occasionally have), but yes, I find simulation attractive- I believe it also brings cost benefits and efficiency gains- and you can defer the "well, better make sure it works with the real thing" task to an integration step that will rarely fail if your testing with simulated hardware is effective.  I can envision someone focused on a specific device chiming in at this point about how the inevitable timing differences will cause that integration step to fail a lot.  It may well do so- for you, because you're focused on a different task than I am.  I want to make sure that when you call a KMDF or UMDF USB DDI, we send the correct IOCTLs and make the correct interface calls to the USB stack in a timely fashion, and that we handle the responses correctly.  I don't need to worry about whether the stack and what's behind it is faster or slower because of how it's implemented.  Now, I'm not saying I don't care whatsoever about such timings- but that class of errors is not my top or even second priority.  My experience so far has been that when hardware-related issues do occur, it is rarely because we have a bug in the WDF code.  In the most obvious case I can recall, the bug could have been caught with simulation, but the simulation tools lacked the features necessary to surface it.

[That's a much longer lead-in to my primary idea than I'd originally envisioned- my editor's hat is obviously in need of more wearing.]

We began (during Win7, this was) with DSF for USB simulation- we had a functioning FX2 simulator, so I incorporated that into our standard test setup- in a fashion that also allowed you to have the real thing attached, in which case the real one would get used instead.  We found some DSF bugs- the USB code was already fairly stable, of course, but regression testing matters, too.  We then introduced several more simulators to cover some of those other cases- all of them basically imaginary devices- some of them non-functional other than reporting descriptor sets so we could poke at the code supporting interface selection and pipe counts and so on.

I can't explain why, but this year I decided to stop using DSF for this.  Hence I had to do a quickie build-it-yourself USB simulation framework and build a half dozen device simulators in the space of a month or so- and while I wish that had been a full man-month of effort, it was really more like a couple of man-weeks.  I largely succeeded, so now I'm consuming cloud space and possibly your time by talking about it.

The initial choices

  • Simulators would run in the kernel- test apps could always use IOCTLs or other means to control them
  • I would use my earlier hardware simulation concept- the device simulator would report a PDO with the device IDs, and that PDO would look and act like a PDO reported by "USB" (rolling bus and hub and controller into a single glob).
  • Simulators would be KMDF drivers, but:
    • I would build a library with the core function, including driver entry point and device management, etc.
    • Simulators would be responsible only for describing their descriptors to the core and managing their pipes- they could determine interfaces selected and so on to do that, but they didn't need to be involved directly.
    • I would use C++ and inheritable polymorphic classes to glue the two together.
    • That set of assumptions resembled a native code DSF USB simulated device's construction closely enough that I could borrow the basics and maybe even some code quickly when porting the simulators- I could work on the core function once, and it would work for all of them.  If I did it correctly, anyway.
  • I would rely upon MSDN and public samples and headers- I wasn't going to dive into confidential code to get the job done- if I did, I'd have to assume I could never talk about what I was doing...

Oh well, I seem to have run out of time, and given a choice between four-day weekend and finishing this, time away from work won easily.  Who knows, I may even write another installment sometime.  Even more unlikely, someone might actually notice that I did or did not.

Ahh well, I'll leave by pointing to an odd attempt of mine to be jazzy (yes, the volume's pretty low- I'm not claiming to be much of a recording engineer, either):