If you ever have the pleasure of working here at Microsoft, you’ll find there is certainly no shortage of passionate people who have great ideas and want to share them with you. I was recently introduced to one such team, which presented a new methodology built on the best aspects of currently established methodologies (e.g. MSF for Agile, Scrum, CMMI, Extreme/Pair Programming). Their new methodology is generically called Iterative Solution Development (ISD), and it’s an interesting take on shipping working software.

The Solution Development Center (SDC) is the group ultimately responsible for this team and can best be described as:

“Enabling the sale and predictable delivery of custom application development solutions by providing the people, process and tools that best leverage Microsoft technologies using our software development expertise.”

While there are many aspects of this new approach that could take a week or more to discuss, there was one part (Converge to Confidence) that I found intriguing. This principle deals specifically with estimation and how to determine hours and duration of engineering time more effectively.

The principle basically states that you must keep iterating until you’ve reached an acceptable Confidence Factor (or deviation ratio) within a certain risk tolerance (generally 92–95%). Any commitment to a faster schedule should be considered a risk, but more importantly, this process should force discussions around scope, assumptions and risks, which must be documented as part of the estimate.
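The post doesn’t publish a formula for the Confidence Factor, but the iterate-until-confident loop can be sketched. This is purely a hypothetical illustration, assuming the confidence of a round is measured as one minus the spread of the team’s estimates relative to their mean; the actual ISD calculation may differ.

```python
# Hypothetical "Converge to Confidence" check: the confidence factor here is
# assumed to be 1 - (spread of estimates / mean estimate), so 1.0 means all
# estimators agree exactly. This is an illustration, not the ISD formula.

def confidence_factor(estimates):
    """Return a 0-1 score; higher means the round's estimates agree more."""
    mean = sum(estimates) / len(estimates)
    spread = max(estimates) - min(estimates)
    return 1.0 - spread / mean if mean else 0.0

def converged(round_estimates, tolerance=0.92):
    """True once a round's estimates fall within the chosen risk tolerance."""
    return confidence_factor(round_estimates) >= tolerance

# Three rounds of hour estimates from the role players (Expert, Blind, Dummy,
# Another Expert, Contributor) -- made-up numbers for the sketch.
rounds = [
    [40, 80, 60, 55, 70],   # round 1: wide spread, many open questions
    [55, 70, 60, 58, 65],   # round 2: scope discussions narrow the range
    [60, 64, 62, 61, 63],   # round 3: estimates have converged
]
for i, r in enumerate(rounds, 1):
    print(f"round {i}: confidence {confidence_factor(r):.2f}, "
          f"converged={converged(r)}")
```

With these numbers only the third round clears the 92% tolerance, which is the point of the principle: committing to a schedule after round one would be accepting a documented risk, not a confident estimate.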

Having seen people “lick their finger and stick it in the wind” to enter an estimate, I always cringe when I run a report to obtain these numbers. Granted, an estimate is just that, an estimate, but seeing this group’s attitude about putting some context around how to improve the validity of estimating was refreshing.

The concept of role-based assignments has been around for some time, so this part of the process capitalizes on it and involves five roles:

  • Expert
    • Knows most about the problem (scope)
    • Understands the solution (what)
  • Blind
    • Knows the solution components (how)
    • Does not need to know the specific problem
  • Dummy
    • Does not need to know the specific problem
    • Does not need to know the solution components
    • Must ask questions and force explanations
  • Another Expert
    • Any skills necessary or applicable to the problem or solution
  • Contributor
    • Any skills necessary or applicable to the problem or solution

After assigning these roles and going through multiple rounds of refining the scope and re-estimating, you’ll end up with a chart showing the estimates converging from round to round.

Ironically, the “Dummy” is the most essential role in this process. The “Dummy” should ask the most questions and help drive out the details of the items discussed during each round.

Having taken part in these estimation sessions, I consider the value this process brings invaluable.

Although there are other methodologies available, cherry-picking the best areas from them and customizing the result can only benefit the ALM community as a whole. Some may say that having yet another methodology to follow may fragment what has already been refined and is in use, but I would counter that having more options rather than fewer, and the ability to grow a process, is a good thing. Therefore, I foresee this new methodology gaining traction and evolving over time precisely around the consulting aspect of application development, as it lends itself quite easily to that area. The beauty of this methodology/practice is its adaptability: while there are certain procedures that need to be followed to successfully complete a project, most are general guidelines and can be applied and adjusted accordingly.