"Which versions of a managed debugger (eg, Visual Studio) can debug which versions of the CLR? And How?"
This is the fundamental debugger versioning question.
Here is the "debugging stack" from a versioning perspective, including the protocols between each of the layers and the process boundaries:
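A rough sketch of that stack, reconstructed from the components discussed below (the exact layering is an approximation):

```
Debugger UI (e.g., Visual Studio)                  \
    | public COM API: ICorDebug                     | debugger process
mscordbi.dll (the ICorDebug implementation)        /
    | private CLR debugging protocol (cross-process)
mscorwks.dll (the CLR, with its in-process          \
              debugging services)                    | debuggee process
Managed app (IL + metadata), produced by a compiler /
```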
So there are potentially 6 different components that could be versioned here!
Which CLR gets loaded?
The compiler produces an executable that runs in a separate process. The CLR version loaded into that process is determined by shim / loader / config policy. The debugger is agnostic here, although it can certainly influence the outcome, such as by laying down config files before launching the app. Since managed executables are just IL opcodes and metadata, which are well specified, there are versioning options to compile an app for .NET version X but run it on .NET version Y. For example, an app compiled for .NET 1.1 could be run on .NET 2.0.
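One concrete way a config file steers the loader is the `<supportedRuntime>` element in the app's `.config` file. A minimal example, assuming an app named `myapp.exe` (the version strings are the well-known RTM build numbers for .NET 2.0 and 1.1):

```
<!-- myapp.exe.config: ask the loader to bind to .NET 2.0 -->
<configuration>
  <startup>
    <supportedRuntime version="v2.0.50727" />
    <!-- fall back to 1.1 if 2.0 is not installed -->
    <supportedRuntime version="v1.1.4322" />
  </startup>
</configuration>
```

The loader honors `<supportedRuntime>` entries in order, which is why listing 2.0 first makes it the preferred runtime.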
Which ICorDebug gets loaded?
Once the CLR version (mscorwks.dll) is determined, the next question becomes: which mscordbi.dll does the debugger load?
We chose to version the debugging interface at ICorDebug instead of at the private protocol:
- ICorDebug is already a public COM API, and already has a versioning story through mechanisms like QueryInterface.
- It reduces test combinations. Allowing an arbitrary mscordbi.dll to be mixed and matched with an arbitrary mscorwks.dll would produce an ever-growing test matrix. If mscordbi.dll needed to debug multiple versions of mscorwks, it would have ever-increasing complexity.
- It lets the CLR innovate freely on the private CLR debugging protocol. One advantage is that it lets us adjust the protocol's chattiness.
The consequence of this is that the mscordbi.dll must be picked to match the version of the mscorwks.dll that is loaded. This match is made when the ICorDebug object is first created, via CreateDebuggingInterfaceFromVersion.
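A sketch of what that creation path looks like from a debugger's perspective. This is Windows-only C++ against the CLR hosting/debugging headers (mscoree.h, cordebug.h), so it is illustrative rather than runnable here, and all error handling beyond HRESULT checks is omitted; the helper name `AttachToManagedProcess` is made up for this example:

```
// Sketch: pick the mscordbi.dll that matches the debuggee's CLR.
#include <windows.h>
#include <mscoree.h>
#include <cordebug.h>

HRESULT AttachToManagedProcess(HANDLE hProcess, ICorDebug **ppCordb)
{
    // 1) Ask the shim which CLR version string (e.g. L"v2.0.50727")
    //    is loaded in the debuggee process.
    WCHAR szVersion[64];
    DWORD cchUsed = 0;
    HRESULT hr = GetVersionFromProcess(hProcess, szVersion,
                                       ARRAYSIZE(szVersion), &cchUsed);
    if (FAILED(hr)) return hr;

    // 2) The shim loads the mscordbi.dll matching that version string
    //    and hands back its ICorDebug implementation.
    IUnknown *pUnk = NULL;
    hr = CreateDebuggingInterfaceFromVersion(CorDebugVersion_2_0,
                                             szVersion, &pUnk);
    if (FAILED(hr)) return hr;

    hr = pUnk->QueryInterface(IID_ICorDebug, (void **)ppCordb);
    pUnk->Release();
    return hr;
}
```

The key point is step 2: the debugger never chooses mscordbi.dll by path; it hands the shim the debuggee's version string and gets back the matching implementation.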
The debugger generally needs to be the latest version so that it can understand everything it sees in the debuggee. For example, generics were added in V2; a V1.1 debugger would get very confused at seeing generics in a V2 app. There is always a debate about whether this confusion can be mitigated with "graceful degradation", generally by:
- having the debugger just ignore what it doesn't understand (e.g., not showing generic methods in the callstack), or
- finding a way to approximate V2 constructs with V1 constructs.
Sometimes these techniques can work, but it's a very slippery slope.
This is why VS2003 can't debug .NET 2.0 apps.