As we are getting close to releasing VS Whidbey, I was thinking about my contributions to this release, and one of the major items on the list was testing C# Edit and Continue (hereafter EnC). This is one of the primary features I was responsible for testing in Whidbey. I thought of writing a short summary of how we tested this feature. Below is the result of that exercise.

Brief history

Sometime in January 2004, when we were halfway into the Whidbey product cycle, a decision was made to provide the Edit and Continue feature for C# in Whidbey. This was based on overwhelming feedback from customers asking for EnC for C#; it was the #1 feature request listed on MSDN feedback.

EnC is one of those features that's very simple to use but complicated once you get into the details of its design and implementation. It touched the core areas of VS - the CLR, compiler, editor and debugger - making testing very challenging. However, our initial understanding was that since VB already had support for this (in Whidbey), it was just a matter of making changes to the C# compiler and C# editor, as the support in the CLR and debugger already existed (and had been tested). It's another matter that this assumption turned out to be (at least partially) incorrect in the end.

Initially the test team comprised me and my colleague Daigo (he blogs in Japanese and can be found here). Santosh joined the team some time later.

Test strategy

Quality metric

We started off by coming up with a quality metric that would help us drive our testing effort. These are the two primary attributes we wanted to target:


  • EnC should reliably succeed or fail. If EnC does not succeed, the user should always be able to resume the original debugging session. However, when the original debugging session is resumed, code changes will remain.
  • EnC should accurately reflect supported code changes in the new executable image, while maintaining the purity of the rest of the image. The new image should be accurately reflected in the resumed debugging session.

Pieces making up EnC

After talking to the developers about the different components making up this feature and the details of their inner workings, we came up with the following breakdown of the pieces that made up EnC:

  • Rude edit detection: Feature that detects disallowed edits during EnC. This feature is implemented by the C# editor.
  • Local mapping: Feature that enables tracking of local variables (moving, adding, removing locals) during EnC. This feature is implemented by the compiler.
  • UI interaction: All the UI elements associated with EnC, such as the display of rude-edit squiggles, read-only markers, error dialogs, etc. This feature is implemented by the editor and debugger.
  • IP remapping: Feature that calculates and resets the instruction pointer (IP) to the next active statement during EnC. This feature is implemented by the debugger.
  • Line mapping: Feature that calculates and tracks the movement of statements in the program being debugged during EnC. This feature is implemented by the debugger.
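
To make the first item concrete, here is a small illustration (the class and method names are invented for this example, not from our test suite) of the kind of distinction the rude edit detection has to make: changing a statement inside a method body is generally a supported edit, while changing a method's signature mid-session is a classic rude edit.

```csharp
class Calculator
{
    // Supported edit: while stopped at a breakpoint, changing this
    // statement (say, "a + b" to "a + b + 1") can be applied and
    // the debugging session continues with the new code.
    public int Add(int a, int b)
    {
        return a + b;
    }

    // Rude edit: changing this method's signature during the session,
    // e.g. adding an "int offset" parameter, is detected by the
    // editor and flagged as a disallowed edit.
    public int Scale(int value, int factor)
    {
        return value * factor;
    }
}
```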


Once we had the list of sub-features making up the overall EnC feature, our next task was to come up with an approach to test these pieces individually and also test the feature as a whole. The list below summarizes that approach:

  • Identifying the high-risk areas: We focused our initial testing on the new code added to support C# EnC, mainly rude edit detection, local mapping and line mapping. The modifications to the other pieces were limited and not very significant.
  • Data-driven testing: This methodology is ideal when you have a lot of different data values to exercise the feature with, but the sequence of steps to execute is pretty much identical. It required some upfront investment in coming up with the test framework, but once it was up and running, automating the tests was extremely fast.
  • Testability: We got our developers to provide testability hooks in the product, which made testing highly productive and the tests more reliable.
  • Reusing existing infrastructure: For the common code, we reused the already-written (and already automated) tests for VB, which helped us increase our coverage quickly.
  • Exploratory testing: We adopted directed exploratory testing to test the integration scenarios (i.e., testing EnC the way users would use it). This helped us find some really good issues early in the product cycle.
  • Stress testing: We invested a good amount of time in building a stress tool and doing stress testing. Among other things, this comprised determining the upper limit on the number of edits that could be made in a single EnC session. It helped us identify some critical performance bottlenecks in our implementation, which eventually got fixed.
  • Effective use of code coverage: We used code coverage regularly to identify test holes and beef up our automation coverage.
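
The data-driven approach above can be sketched roughly as follows. Everything here is invented for illustration (this is not our actual framework): each test case reduces to one data row - the edit to attempt and the expected verdict - fed through a single driver loop, since the execution steps are identical for every row.

```csharp
using System;

class EncDataDrivenSketch
{
    static void Main()
    {
        // Each row is one test case: the edit to attempt and the
        // verdict we expect (true = edit allowed, false = rude edit).
        var cases = new (string Edit, bool ExpectAllowed)[]
        {
            ("Change a statement inside a method body", true),
            ("Add a new local variable",                true),
            ("Change a method signature",               false),
            ("Delete a method",                         false),
        };

        // One driver loop covers the whole table; adding coverage
        // means adding rows, not writing new test code.
        foreach (var (edit, expectAllowed) in cases)
        {
            bool allowed = ApplyEditUnderDebugger(edit);
            Console.WriteLine($"{edit}: expected {expectAllowed}, got {allowed}");
        }
    }

    // Stand-in for the real work: in an actual framework this would
    // use the product's testability hooks to apply the edit in a
    // live debugging session. Here it just simulates the verdict.
    static bool ApplyEditUnderDebugger(string edit)
    {
        return !edit.Contains("signature") && !edit.Contains("Delete");
    }
}
```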

Lessons learnt

  • Using diverse methods of testing ('test styles') gives best results
  • Close and constant interaction with the developers helps keep the crucial two-way feedback going
  • Whitebox testing is extremely effective and helps in identifying the right 'targets' for testing
  • Reusing existing tools and infrastructure makes testing more productive
  • Testing != Automation. Automation is just one aspect of overall testing and not an end in itself.