Software Engineering, Project Management, and Effectiveness
Yesterday's snowfall in Redmond was interesting for me. During my drive home, it was pretty dark, icy and cold. As I came up Old Redmond Road, I saw an object coming towards me, moving somewhat erratically, that looked too small to be a car.
It wasn't a car at all. It was a cross-country skier making his way down the middle of the road, followed by a trail of cars. I'm not sure at what point the street looked like good skiing and I don't know if he had an exit strategy, but he did seem to be having fun and going the speed limit.
I had my digital camera with me, but I forgot to use it. I was more focused on skating my car down the right side of the road. By the time I got home, I had a bunch of "mental snapshots" of various scenes along the way home, but nothing in hand to share.
That got me thinking of the MyLifeBits project. MyLifeBits is effectively software for "lifelogging," or archiving your life on disk. Although it's a bit extreme for me, there are times when I wish I automatically had more than just the mental snapshots.
When I last met with Rob Caron to walk him through Guidance Explorer, one of the concepts that piqued his interest was test cases for content. He suggested I blog it, since it's not common practice and could benefit others. I agreed.
If you're an author or a reviewer, this technique may help you. You can create explicit test-cases for the content. Simply put, these are the "tests for success" for a given piece of content. Here's an example of a few test cases for a guideline:
Test Cases for Guidelines
What to Do
Additional Tests to Consider When Writing a Guideline
Benefits to Authors and Reviewers
The test cases serve as checkpoints that help both authors and reviewers produce more effective guidance. While you probably ask many of these questions implicitly, making them explicit turns them into a repeatable practice for yourself and others. I've found questions to be the best encapsulation of a test because they set the right frame of mind. If you're an author, you can start writing guidance by addressing the questions. If you're a reviewer, you can efficiently check for the most critical pieces of information. How much developer guidance exists that does not answer the why or the when? Too much. As I sift through the guidance I've produced over the years, I can't believe how many times I've failed to make the why or when explicit.
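As a minimal sketch of the idea, the "tests for success" can be written down as explicit questions and checked mechanically against a draft. The field names and questions below are my own illustration, not the actual Guidance Explorer schema:

```python
# Hypothetical sketch: encoding content test cases as explicit questions,
# then checking a draft guideline against them. Field names ("why",
# "when", "how") are illustrative, not the real Guidance Explorer schema.

GUIDELINE_TEST_CASES = [
    ("why",  "Does the guideline explain WHY to follow it?"),
    ("when", "Does the guideline explain WHEN it applies?"),
    ("how",  "Does the guideline show HOW to apply it?"),
]

def review(guideline: dict) -> list[str]:
    """Return the test-case questions the draft fails to answer."""
    return [question for field, question in GUIDELINE_TEST_CASES
            if not guideline.get(field, "").strip()]

draft = {
    "title": "Validate input",
    "how":   "Constrain, reject, and sanitize input.",
    # "why" and "when" are missing
}
for failure in review(draft):
    print("FAIL:", failure)
```

An author can draft by answering each question in turn; a reviewer can run the same checklist over the finished item.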
I'm a fan of the test-driven approach to guidance, and here are my top reasons why:
Examples of Test Cases for Guidance
I've posted examples of our test cases for guidance on Channel 9.
Today we published 238 new guidance items in Guidance Explorer. If you use the offline client, it should automatically synchronize to our online store.
We're in the process of performing a guidance sweep. The approach to the sweep is twofold:
1. Make existing guidance available in Guidance Explorer.
2. Identify user experience issues with the information models and tool design.
Benefits in GE
Making existing guidance available in Guidance Explorer involves refactoring existing security guidance and performance guidance. The benefits of having the guidance available in Guidance Explorer include:
How We Improve Our Guidance
An underlying strategy in GE was to help users quickly hunt and gather relevant items rather than try to guess their context and what they need. In other words, it's a tool to help smart people, versus a smart tool that might get in your way. This was an important decision because we had to pick a problem we knew we could help solve directly and add value.
The feedback from customers on existing guidance was that it was great stuff, but there were three key problems:
1. It's a copy-and-paste exercise to grab just the guidance you need.
2. It's not atomic enough (monoliths over bite-sized chunks).
3. Many of the items, while they read well, were not actionable enough.
That's why we took the following measures on our guidance:
As we port existing guidance to our updated schemas, we often find guidance items lacking key information such as why or how, or example code.
Guidance Explorer in Practice
What's been great so far is that some folks in the field have let me know how they've been using it for customer engagements. Apparently the ability to customize guidance has resonated very well. One consultant in particular has used Guidance Explorer on several engagements to save time and effort. He uses GE as a general-purpose rules and guidelines store. He's also tailored guidelines and checklists for different audience levels (executives, development leads, architects, developers, PMs) and for different activities (design reviews, code reviews, and deployment reviews).
A few customers have let me know they are using the UNC share scenario to create guidance libraries for their team development. They told me they like the idea that it is like a simple typed-wiki that you can act on. The fact that they can create views and print out docs from the library has been the main appeal.
The other benefit that more customers are appreciating is the templates for guidelines and checklists. They like that the templates simplify both authoring and sharing prescriptive guidance. Anybody who has authored guidelines or checklists knows it's challenging to write actionable guidance that can be reused. What we're sharing in Guidance Explorer is the benefit of experience and lessons learned over years of producing reusable guidance for various audiences.
R&D Project
As a reminder and to keep things in perspective, Guidance Explorer is an R&D project. While there are immediately tangible benefits, the real focus is on learning about the user experience so that patterns & practices can improve its ability to author and share guidance, and make progress on removing bottlenecks in the creation of prescriptive guidance for the software industry.
Feedback
You can send feedback on GE directly to the team at email@example.com
I've had to hunt down Practices Checker for .NET Framework 1.1 a few times, so now I'm posting it. It was an R&D project to help automate the search for and discovery of coding practices and configuration settings that do not adhere to the ASP.NET 1.1 performance checklist.
It may seem a bit after the fact, given it is .NET 1.1, but there were a few reasons for this:
1. Our focus was more on testing how to codify our library of practices than on a specific version.
2. We figured adding rules and versions would be easy once we understood the feasibility and work required.
3. Our field was still performing code reviews for customers using .NET 1.1, so we could immediately test the impact.
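To illustrate the kind of check a tool like this automates, here is a minimal sketch in Python (the shipped tool targeted ASP.NET 1.1 and had many more rules). The single rule shown, flagging `debug="true"` in the compilation element, is a well-known performance checklist item; the function name and message are my own:

```python
# Illustrative sketch of an automated practices check: scan an ASP.NET
# web.config for a configuration setting that violates a performance
# checklist item (debug builds left enabled). One hypothetical rule only.

import xml.etree.ElementTree as ET

def check_debug_disabled(web_config_xml: str) -> list[str]:
    """Return findings for config settings that fail the checklist."""
    findings = []
    root = ET.fromstring(web_config_xml)
    compilation = root.find("./system.web/compilation")
    if compilation is not None and \
            compilation.get("debug", "false").lower() == "true":
        findings.append(
            "compilation debug='true': disable debug builds in "
            "production (performance checklist).")
    return findings

sample = """<configuration>
  <system.web>
    <compilation debug="true" />
  </system.web>
</configuration>"""

for finding in check_debug_disabled(sample):
    print(finding)
```

The point of codifying practices this way is that each checklist item becomes a small, repeatable rule rather than a line in a document someone has to remember to apply.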
What You Need to Know
Key Takeaways
The takeaways for me in this project were:
I'm continuing to explore various options to manage a library of building codes/practices/rules and then map out which tools can check these items, and where in the life cycle they should be checked. I've been informally referring to this problem as "policy verification through the life cycle."
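One minimal way to picture "policy verification through the life cycle" is a library of practices, each mapped to the tools that can verify it and the life-cycle stage where the check belongs. All of the entries and tool names below are illustrative examples of the shape of the data, not an actual patterns & practices catalog:

```python
# Hypothetical sketch: a library of practices mapped to life-cycle
# stages and verifying tools. Entries are illustrative, not a real catalog.

POLICIES = [
    {"practice": "Disable debug compilation in production",
     "stage": "deployment", "tools": ["Practices Checker"]},
    {"practice": "Validate all user input",
     "stage": "code review", "tools": ["FxCop", "manual review"]},
    {"practice": "Threat model new features",
     "stage": "design", "tools": ["manual review"]},
]

def checks_for_stage(stage: str) -> list[tuple[str, list[str]]]:
    """List which practices to verify, and with what, at a given stage."""
    return [(p["practice"], p["tools"]) for p in POLICIES
            if p["stage"] == stage]

for practice, tools in checks_for_stage("code review"):
    print(f"{practice}: {', '.join(tools)}")
```

Even this toy structure makes the gaps visible: any practice whose tools list is empty, or whose stage has no tooling, is a candidate for automation.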
I missed blogging this at the time of release. The Security Toolbar for VS.NET 2005 was a short R&D project to connect developers to security guidance on MSDN from within VS.NET. The toolbar has direct links to our indexes for Security Engineering, threat modeling, how-tos, and checklists.
While this first version is simply a set of links, it's a stepping stone to adding more functionality. We wanted a way to deliver incremental functionality with a simple interface. The toolbar can be notified when there's new content or a new version of the toolbar itself.
The most interesting learning for me was the trade-offs in user experience when designing a toolbar:
In the end, we went with a hybrid model to take the best of the best: