Mike Swanson

  • Mike Swanson's Blog

    Code Review and Complexity

    • 35 Comments

    For the past year-and-a-half, I have helped manage the development team responsible for the NxOpinion diagnostic software. Although the methodology we're using for the project isn't 100% Agile, we have borrowed and blended a number of Agile tenets that have afforded us many benefits (Boehm & Turner's Balancing Agility and Discipline is a good book about effectively balancing traditional and Agile methodologies). We are using two techniques that aren't normally talked about when discussing Agile software development: formal code review and code metrics. A recent event prompted me to write this article about how we relate these two techniques on the NxOpinion project.

    Code Review

    One of the practices of eXtreme Programming (or "XP", an instance of Agile software development) is pair-programming, the concept that two people physically work side-by-side at a single computer. The idea is that by having two people work on the same logic, one can type the code while the other watches for errors and possible improvements. In a properly functioning XP pair, partners change frequently (although I've heard of many projects where "pair-programming" means two people are stuck together for the entire length of the project...definitely not XP's concept of pair-programming). Not only does this pairing directly influence code quality, but the constantly changing membership naturally has the effect of distributing project knowledge throughout the entire development team. The goal of pair-programming is not to make everyone an expert in all specialties, but the practice does teach everyone who the "go to" people are.

    Advocates of XP will often argue that pair-programming eliminates the need for formal code review because the code is reviewed as it is being written. Although I do believe that there is some truth to this, I think it also misses some key points. On the NxOpinion project, we have a set of documented coding standards (based on Microsoft's Design Guidelines for Class Library Developers) that we expect the development team to adhere to. Coding standards are part of the XP process, but in my experience, just because something is documented doesn't necessarily mean that it will be respected and followed. We use our formal code review process to help educate the team about our standards and to build respect for why those standards exist. After a few meetings, most of this checking can be automated with tools, and requiring code to pass a standards check before a review is scheduled works well. Of course, the primary reason we formally review code is to subjectively comment on other possible ways to accomplish the same functionality, simplify its logic, or identify candidates for refactoring.

    Because we write comprehensive unit tests, much of the time that we would traditionally spend reviewing for correct functionality is no longer necessary. Instead, we focus on improving code that has already been shown to work. Unlike a more traditional approach, we do not require all code to be formally reviewed before it is integrated into the system (frankly, XP's notion of collective code ownership would make such a requirement unrealistic). So, if a formal code review process has benefits, but we don't need to spend the time reviewing everything in the system, how do we decide what to formally review?

    There are two key areas that we focus on when choosing code for review:

    • Functionality that is important to the proper operation of the system (e.g. core frameworks, unique algorithms, performance-critical code, etc.).
    • Code that has a high measured complexity.

    As an example, for the NxOpinion applications, most of our data types inherit from a base type that provides a lot of common functionality. Because of its placement in the hierarchy, it is important that our base type functions in a consistent, reliable, and expected manner. Likewise, the inference algorithms that drive the medical diagnostics must work properly and without error. These are two good examples of core functionality that is required for correct system operation. For other code, we rely on code complexity measurements.

    Code Complexity

    Every day at 5:00pm, an automated process checks out all current source code for the NxOpinion project and calculates its metrics. These metrics are stored as checkpoints that each represent a snapshot of the project at a given point in time. In addition to trending, we use the metrics to gauge our team productivity. They can also be used as a historical record to help improve future estimates. Related to the current discussion, we closely watch our maximum code complexity measurement.

    In 1976, Tom McCabe published a paper arguing that code complexity is defined by its control flow. Since that time, others have identified different ways of measuring complexity (e.g. data complexity, module coupling, algorithmic complexity, calls-to and called-by, etc.). Although these other methods are effective in the right context, it seems to be generally accepted that control flow is one of the most useful measurements of complexity, and high complexity scores have been shown to be a strong indicator of low reliability and frequent errors.

    The Cyclomatic Complexity computation that we use on the NxOpinion project is based on Tom McCabe's work and is defined on page 395 of Steve McConnell's book Code Complete (a second edition of Steve's excellent book has just become available):

    • Start with 1 for the straight path through the routine
    • Add 1 for each of the following keywords or their equivalents: if, while, repeat, for, and, or
    • Add 1 for each case in a case statement

    So, if we have this C# example:

        while (nextPage != true)
        {
            if ((lineCount <= linesPerPage) && (status != Status.Cancelled) && (morePages == true))
            {
                // ...
            }
        }

    In the code above, we start with 1 for the routine, add 1 for the while, add 1 for the if, and add 1 for each && for a total calculated complexity of 5. Anything with a complexity greater than about 10 is an excellent candidate for simplification and refactoring. Minimizing complexity is a great goal for writing high-quality, maintainable code.
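
    To make the counting rules concrete, here is a minimal sketch of my own (not a tool we use on the project, which targets C#) that applies the same McCabe-style counting to Python code using the standard ast module. The while/if snippet above, translated to Python, scores 5 just like the C# version:

    ```python
    import ast

    def cyclomatic_complexity(source):
        """Count decision points per McCabe's rules: start with 1 for the
        straight path, add 1 for each branching keyword, and add 1 for
        each additional and/or operand in a condition."""
        complexity = 1  # the straight path through the routine
        for node in ast.walk(ast.parse(source)):
            if isinstance(node, (ast.If, ast.While, ast.For, ast.IfExp)):
                complexity += 1
            elif isinstance(node, ast.BoolOp):
                # "a and b and c" is one BoolOp with three operands: +2 paths
                complexity += len(node.values) - 1
        return complexity

    # The C# example above, translated to Python:
    snippet = """
    while not next_page:
        if line_count <= lines_per_page and status != CANCELLED and more_pages:
            pass
    """
    print(cyclomatic_complexity(snippet))  # 5
    ```

    Note that the snippet is only parsed, never executed, so the undefined names inside it don't matter; this also means the measurement can run against code that doesn't yet compile cleanly.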

    Some advantages of McCabe's Cyclomatic Complexity include:

    • It is very easy to compute, as illustrated in the example
    • Unlike other complexity measurements, it can be computed early in the development lifecycle (which makes it Agile-friendly)
    • It provides a good indicator of the ease of code maintenance
    • It can help focus testing efforts
    • It makes it easy to find complex code for formal review

    It is important to note that a high complexity score does not automatically mean that code is bad. However, it does highlight areas of the code that have the potential for error. The more complex a method is, the more likely it is to contain errors, and the more difficult it is to completely test.

    A Practical Example

    Recently, I was reviewing our NxOpinion code complexity measurements to determine what to include in an upcoming code review. Without divulging all of the details, the graph of our maximum complexity metric looked like this:

    As you can plainly see, the "towering monolith" in the center of the graph represents a huge increase in complexity (it was this graph that inspired this article). Fortunately for our team, this is an abnormal occurrence, but it made it very easy for me to identify the code for our next formal review.

    Upon closer inspection, the culprit of this high measurement was a method that we use to parse mathematical expressions. Similar to other parsing code I've seen in the past, it was cluttered with a lot of conditional logic (ifs and cases). After a very productive code review meeting that produced many good suggestions, the original author of this method was able to re-approach the problem, simplify the design, and refactor a good portion of the logic. As represented in the graph, the complexity measurement for the parsing code decreased considerably. As a result, it was easier to test the expression feature, and we are much more comfortable about the maintenance and stability of its code.

    Conclusion

    Hopefully, I've been able to illustrate that formal code review coupled with complexity measurements provides a very compelling technique for quality improvement, and it is something that can easily be adopted by an Agile team. So, what can you do to implement this technique on your project?

    1. Find a tool that computes code metrics (specifically complexity) for your language and toolset
    2. Schedule the tool so that it automatically runs and captures metrics every day
    3. Use the code complexity measurement to help identify candidates for formal code review
    4. Capture the results of the code review and monitor their follow-up (too many teams forget about the follow-up)
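
    As a rough illustration of steps 2 and 3, here is a hypothetical Python sketch (the function names, threshold, and log format are my own; our NxOpinion tooling targeted C#, not Python) that scans a source tree, appends a daily maximum-complexity checkpoint to a CSV log, and returns the functions that exceed the "about 10" threshold:

    ```python
    import ast
    import csv
    import datetime
    import pathlib

    REVIEW_THRESHOLD = 10  # the "about 10" rule of thumb from earlier

    def function_complexities(source):
        """Yield (function name, McCabe-style complexity) per function."""
        for func in ast.walk(ast.parse(source)):
            if not isinstance(func, ast.FunctionDef):
                continue
            score = 1  # the straight path through the routine
            for node in ast.walk(func):
                if isinstance(node, (ast.If, ast.While, ast.For, ast.IfExp)):
                    score += 1
                elif isinstance(node, ast.BoolOp):
                    score += len(node.values) - 1  # each and/or adds a path
            yield func.name, score

    def checkpoint(src_root, log_path="complexity_log.csv"):
        """Append today's maximum complexity to the log and return
        (file, function, score) tuples that warrant formal review."""
        results = []
        for path in pathlib.Path(src_root).rglob("*.py"):
            for name, score in function_complexities(path.read_text()):
                results.append((str(path), name, score))
        maximum = max((score for _, _, score in results), default=0)
        with open(log_path, "a", newline="") as log:
            csv.writer(log).writerow([datetime.date.today().isoformat(), maximum])
        return [r for r in results if r[2] > REVIEW_THRESHOLD]
    ```

    Scheduling a script like this with cron or Windows Task Scheduler at 5:00pm would approximate the daily checkpoint process described earlier, and graphing the logged maximums over time makes spikes like the "towering monolith" easy to spot.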

    Good luck, and don't forget to let me know if this works for you and your team!

    References

    Boehm, Barry and Turner, Richard. 2003. Balancing Agility and Discipline: A Guide for the Perplexed. Boston: Addison-Wesley.
    Extreme Programming. 2003. <http://www.extremeprogramming.org/>
    Fowler, Martin. 1999. Refactoring: Improving the Design of Existing Code. Boston: Addison-Wesley.
    McCabe, Tom. 1976. "A Complexity Measure." IEEE Transactions on Software Engineering, SE-2, no. 4 (December): 308-20.
    McConnell, Steve. 1993. Code Complete. Redmond: Microsoft Press.
    Martin, Robert C. 2002. Agile Software Development: Principles, Patterns, and Practices. Upper Saddle River, New Jersey: Prentice Hall.

    Cardamom Bread and Pepparkakor Cookies

    • 35 Comments

    I thought about titling this post, Man Found Dead with Cardamom Bread Recipe Stuffed in His Mouth. However, after considering the situation realistically for a moment, I realized that my grandparents probably wouldn't knock me off for sharing two of our secret Swedish holiday recipes. But, if I turn up missing, you know who to look for! :-)

    Every year around the holidays, I look forward to these two tasty treats. I've eaten cardamom bread and pepparkakor cookies during Christmastime for as long as I can remember. The smell of either of them baking immediately brings back warm memories for me. About 10 years ago, I asked my grandparents for these recipes, and they were kind enough to provide them. I'm posting both of them here so that others can enjoy their fantastic flavor.

    Cardamom Bread

    Here's the recipe that I use for cardamom bread. The original recipe is the version that I received from my grandparents. The modified recipe is my own conversion for use with a bread machine. I've made this recipe tens of times, and I'm always pleased with the results.

    Original Recipe                      Modified for Bread Machine
    ¾ cup milk                           ½ cup milk
    ¼ cup butter                         3 tablespoons butter
    1 egg                                1 egg
    1/3 cup sugar                        ¼ cup sugar
    ½ teaspoon salt                      ½ teaspoon salt
    3 cups flour                         2¼ cups flour (bread flour)
    2 teaspoons yeast                    1½ teaspoons active dry yeast
    1½ - 2 teaspoons cardamom            1½ teaspoons cardamom
      (3 teaspoons if using powder)
    1. Microwave milk and butter for approximately 50 seconds
    2. Make dough using all ingredients (manually or with bread machine on "dough" mode)
    3. Divide into 3 rolled strips, cover with cloth, and allow to rest for 10 minutes
    4. Braid dough and top with light sugar coating (not included in above ingredients)
    5. Allow to rise for 40-50 minutes under plastic wrap
    6. Bake at 350 degrees for 18-20 minutes
    7. Remove from oven and place on cooling rack

     Cardamom bread is good toasted or plain, buttered or not. If you're like me, you won't be able to limit yourself to just one or two slices. :-)

    Pepparkakor Cookies

    Done properly, pepparkakor cookies (a Swedish twist on ginger cookies) are relatively thin and crisp. Pepparkakor cookie dough is my favorite, with chocolate chip cookie dough coming in a close second (at least the Nestlé Toll House recipe). Yeah, I know...raw eggs, Salmonella, etc. Call me crazy, but kids and adults have been eating raw cookie dough since the dawn of time (okay...maybe not quite that long), and as far as I know, kids aren't keeling over in the kitchen. But hey, I'm no doctor, so proceed at your own risk.

    1 cup butter
    1 egg
    1 cup white sugar
    ½ teaspoon salt
    1 teaspoon ginger
    2 tablespoons milk
    3 tablespoons molasses (I prefer the “Dark Full Flavor” kind)
    2 teaspoons baking soda
    3 cups flour
    2 teaspoons cinnamon
    1. Cream butter with sugar
    2. Add egg, milk, and molasses
    3. Mix everything else in
    4. Refrigerate the dough overnight
    5. Roll dough onto floured surface until approximately 1/8" thick, and cut into shapes
    6. Bake at 350 degrees until done (approximately 8 minutes)

    If you end up making either of these recipes, or if you have similar recipes that you'd like to share, please leave feedback. I'm very curious to hear what you think!

    File Formats for Conversion to XAML

    • 34 Comments

    As WPF application development continues to pick up steam, it's becoming more and more important to consider which file formats make the most sense for conversion to XAML. Just based on discussions I've had with many of you regarding my Illustrator export plug-in, I know that there are other formats out there that—due to lack of tool support—are very difficult or next-to-impossible to convert to XAML.

    So, I'd like to identify a list of "top x" file formats that would help ease the pain for both WPF designers and developers. What tool does your company use to create 2D or 3D content? What file format(s) does it make the most sense to convert? If you don't have an opinion (possibly because you don't work with those tools), please forward this to your design staff. I'm very interested in feedback.

    Interviewing at Microsoft

    • 33 Comments

    I am frequently asked about the interview process at Microsoft, and although I’m usually more than happy to relate my individual story and provide some general tips, I can’t provide nearly the insight that two of our recruiters, Gretchen Ledgard and Zoë Goldring (both responsible for the JobsBlog), provide in this 20-minute Channel 9 video. You’ll hear them talk about dress code, pre-interview tips, the actual interview day, logic questions, coding questions, what we’re looking for in people, whether or not you need a degree, etc. This is the first of at least two video segments to be published, so expect a second part sometime soon. On a related note, Chris Sells maintains a page about Interviewing at Microsoft that is worth reading. And for those who haven’t heard my Microsoft interview story, read on…

    I always thought that I would either work for myself or for Microsoft. After being an independent consultant for many years, creating the industry’s first uninstall application and the first nationwide movie showtime web site, working at Donnelly Corporation for exactly a year (to the minute), and doing lots of other mildly interesting things, I decided that it was time to send in a resume. I knew that Microsoft received thousands of resumes each day (according to Gretchen, we now receive around 6,000 per day), so I had to do something that would show my passion for the company, demonstrate my “out of the box” thinking, and grab their attention.

    You know those life-sized celebrity cardboard cutouts that adorn the occasional geek office? I decided to build a cardboard cutout of me and send it along with my resume as the “model Microsoft employee.” To figure out how large it could be, I visited the FedEx office and asked for the maximum dimensions for something shipped next-day air. Although I don’t recall the exact numbers, it was something like 170 inches for combined length and girth. Not only did I want it to be as close to actual size as possible, but I wanted it to make a splash when it was delivered to the HR department in Redmond. After all, how many next-day air packages have you received that were much bigger than a standard letter?

    So, I had some professional photographs taken of me holding a mouse and keyboard, then had the image professionally printed and mounted on foam core. Using a cardboard celebrity cutout that I purchased as a template, I proceeded to remove the extra foam core and create the folding flaps that would allow it to stand on its own. Then, I created an advertising slick sheet to accompany my package that explained the model Microsoft employee and how I was obviously a perfect fit.

    Of course, I still spent a lot of time polishing my resume, and it served as the “meat” of my job application. The cardboard cutout was simply a way to get noticed among the thousands of resumes that are received by the company each day. After sandwiching everything between two sides of a cardboard refrigerator box and carefully taping around the edges, I managed to squeeze it in my car and take it down to the local FedEx office. The FedEx employee who helped me was fascinated by the story behind the contents of my package and proceeded to measure the length and girth with a small chain he kept behind the counter. Boy, was it close. I had neglected to consider the thickness that all of that foam core and cardboard would add to the measurement, and I barely squeezed by. Whew!

    About two weeks of stomach churning passed before I finally received a letter from Microsoft in my mailbox. It was a personal note from the Vice President of Human Resources, and he was writing to say that in all of his years at Microsoft, he had never seen a resume quite like mine. He was impressed that I was able to make my job application stand out (no pun intended), and apparently, it was the talk of the whole department (he had it standing outside of his office). He went on to say that my resume would be added to the database and considered for all open positions (I hadn’t applied for a specific job code). He wished me the best of luck and hoped to see me as a future employee.

    I never heard anything else from Microsoft about that resume, and it didn’t end up getting me a job (surprise, surprise). However, I was proud of the fact that I had tried, and I cherished the letter that I received as a response. Unfortunately, I can’t seem to find that letter, or I’d post a copy of it right here. If I do manage to dig it up, I’ll be sure to update this post.

    So, there must be another interview story, right? I mean, I do work for Microsoft, so there has to be more! Of course there is, but it’s not quite as interesting as this one, and it has a much better outcome. The funny thing is, a lot of people have confused this story with the fact that I work for the company, so I often hear that “you’re the guy who sent the cardboard cutout to get the job, right?” That’s when I smile and proceed to tell my story.

    Update: I was able to find the photo I used to produce the cutout...I look pretty silly. I haven't found the letter yet, but I think I know where it is.

    Update #2: I found the letter! I'll scan it and add it to this post later tonight.

    Update #3: I've added the scanned letter below.

    PDC05 Sessions Online

    • 31 Comments

    If you were unable to attend the Microsoft Professional Developers Conference 2005 (PDC05) in Los Angeles this year, never fear; 209 breakout sessions, panels, and symposia are now available online. Each session includes a video of the presenter, a navigable index of the content, the PowerPoint presentation itself, and video of any demos. We'll be hosting this content for free, for anyone, for six full months.

    And, due to popular blogger and e-mail demand, you can also download each session individually for offline viewing. Just click the Download Presentation link that appears beneath the session information (full session zip files average around 150MB each).

    All of the sessions include downloadable PowerPoint presentations, and many of them also include materials (code samples, Visual Studio solutions, papers, etc.).

    If the thought of spending 46 full days downloading almost 27GB of content via a 56K dial-up connection doesn’t sound appealing, you can order the 4 dual-layer 8.5GB DVD set here (attendee price: $199, non-attendee price: $499). As a bonus, the DVD set also includes nearly 4GB of Channel 9 video content that was produced for PDC05. All attendees will automatically receive the DVD set, and it is expected to begin shipping in early November.

    In case you missed my earlier post, you can right-click on the speaker video, choose Play Speed, then Fast to save yourself some time by watching the presentation at a higher speed. Since there are over 250 hours of content, this can be a big time-saver.

    Enjoy!
