J.D. Meier's Blog

Software Engineering, Project Management, and Effectiveness

August 2007

  • J.D. Meier's Blog

    New Release: patterns & practices Performance Testing Guidance for Web Applications


    We released the final version of our patterns & practices Performance Testing Guidance for Web Applications.  This guide provides an end-to-end approach for implementing performance testing. Whether you're new to performance testing or looking for ways to improve your current performance-testing approach, you will gain insights that you can tailor to your specific scenarios.  The main purpose of the guide is to be a relatively stable backdrop to capture, consolidate and share a methodology for performance testing.  Even though the topics addressed apply to other types of applications, we focused on explaining from a Web application perspective to maintain consistency and to be relevant to the majority of our anticipated readers.

    Key Changes Since Beta 1

    • Added forewords by Alberto Savoia and Rico Mariani.
    • Integrated more feedback and insights from customer reviews (particularly Chapters 1-4, 9, 14, and 18).
    • Integrated learnings from our Engineering Excellence team.
    • Refactored and revamped the performance testing types.
    • Revamped and improved the test execution chapter.
    • Revamped and improved the reporting chapter.
    • Revamped the stress testing chapter.
    • Released the guide in HTML pages on our CodePlex Wiki.


    • Learn the core activities of performance testing.
    • Learn the values and benefits associated with each type of performance testing.
    • Learn how to map performance testing to Agile.
    • Learn how to map performance testing to CMMI.
    • Learn how to identify and capture performance requirements and testing objectives based on the perspectives of system users, business owners of the system, and the project team, in addition to compliance expectations and technological considerations.
    • Learn how to apply principles of effective reporting to performance test data.
    • Learn how to construct realistic workload models for Web applications based on expectations, documentation, observation, log files, and other data available prior to the release of the application to production.
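A workload model of the kind described above can be roughed out directly from access logs. Here is a minimal sketch that derives the traffic mix (each page's share of total requests) from Common Log Format lines; the log entries and paths are made up for illustration:

```python
from collections import Counter

# Hypothetical access-log lines in Common Log Format.
log_lines = [
    '10.0.0.1 - - [12/Aug/2007:10:00:01] "GET /home HTTP/1.1" 200 512',
    '10.0.0.2 - - [12/Aug/2007:10:00:03] "GET /search HTTP/1.1" 200 2048',
    '10.0.0.1 - - [12/Aug/2007:10:00:07] "POST /checkout HTTP/1.1" 200 128',
    '10.0.0.3 - - [12/Aug/2007:10:00:09] "GET /home HTTP/1.1" 200 512',
]

def workload_mix(lines):
    """Return each request path's share of total traffic."""
    # The request line sits between the first pair of double quotes;
    # its second token is the path.
    paths = Counter(line.split('"')[1].split()[1] for line in lines)
    total = sum(paths.values())
    return {path: count / total for path, count in paths.items()}

for path, share in sorted(workload_mix(log_lines).items()):
    print(f"{path}: {share:.0%}")
```

The resulting percentages can seed the user-scenario weights in a load test, to be refined with expectations, documentation, and observation as the guide suggests.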

    Why We Wrote the Guide

    • To consolidate real-world lessons learned around performance testing.
    • To present a roadmap for end-to-end performance testing.
    • To narrow the gap between state of the art and state of the practice.


    • Managing and conducting performance testing in both dynamic (e.g., Agile) and structured (e.g., CMMI) environments.
    • Performance testing, including load testing, stress testing, and other types of performance related testing.
    • Core activities of performance testing: identifying objectives, designing tests, executing tests, analyzing results, and reporting.

    Features of the Guide

    • Approach for performance testing.  The guide provides an approach that organizes performance testing into logical units to help you incrementally adopt performance testing throughout your application life cycle.
    • Principles and practices.  These serve as the foundation for the guide and provide a stable basis for recommendations. They also reflect successful approaches used in the field.
    • Processes and methodologies.  These provide steps for managing and conducting performance testing. For simplification and tangible results, they are broken down into activities with inputs, outputs, and steps. You can use the steps as a baseline or to help you evolve your own process.
    • Life cycle approach.  The guide provides end-to-end guidance on managing performance testing throughout your application life cycle, to reduce risk and lower total cost of ownership (TCO).
    • Modular.  Each chapter within the guide is designed to be read independently. You do not need to read the guide from beginning to end to benefit from it. Use the parts you need.
    • Holistic.  The guide is designed with the end in mind. If you do read the guide from beginning to end, it is organized to fit together in a comprehensive way. The guide, in its entirety, is better than the sum of its parts.
    • Subject matter expertise.  The guide exposes insight from various experts throughout Microsoft and from customers in the field.


    • Part I, Introduction to Performance Testing
    • Part II, Exemplar Performance Testing Approaches
    • Part III, Identify the Test Environment
    • Part IV, Identify Performance Acceptance Criteria
    • Part V, Plan and Design Tests
    • Part VI, Execute Tests
    • Part VII, Analyze Results and Report
    • Part VIII, Performance-Testing Techniques


    • Chapter 1 – Fundamentals of Web Application Performance Testing
    • Chapter 2 – Types of Performance Testing
    • Chapter 3 – Risks Addressed Through Performance Testing
    • Chapter 4 – Web Application Performance Testing Core Activities
    • Chapter 5 – Coordinating Performance Testing with an Iteration-Based Process
    • Chapter 6 – Managing an Agile Performance Test Cycle
    • Chapter 7 – Managing the Performance Test Cycle in a Regulated (CMMI) Environment
    • Chapter 8 – Evaluating Systems to Increase Performance-Testing Effectiveness
    • Chapter 9 – Determining Performance Testing Objectives
    • Chapter 10 – Quantifying End-User Response Time Goals
    • Chapter 11 – Consolidating Various Types of Performance Acceptance Criteria
    • Chapter 12 – Modeling Application Usage
    • Chapter 13 – Determining Individual User Data and Variances
    • Chapter 14 – Test Execution
    • Chapter 15 – Key Mathematic Principles for Performance Testers
    • Chapter 16 – Performance Test Reporting Fundamentals
    • Chapter 17 – Load-Testing Web Applications
    • Chapter 18 – Stress-Testing Web Applications

    Our Team

    Contributors and Reviewers

    • External Contributors and Reviewers: Alberto Savoia; Ben Simo; Cem Kaner; Chris Loosley; Corey Goldberg; Dawn Haynes; Derek Mead; Karen N. Johnson; Mike Bonar; Pradeep Soundararajan; Richard Leeke; Roland Stens; Ross Collard; Steven Woody
    • Microsoft Contributors / Reviewers: Alan Ridlehoover; Clint Huffman; Edmund Wong; Ken Perilman; Larry Brader; Mark Tomlinson; Paul Williams; Pete Coupland; Rico Mariani


  • J.D. Meier's Blog

    Performance Threats


    Rico and I have long talked about performance threats.  I finally created a view that shows how you can think of performance issues in terms of vulnerabilities, threats, and countermeasures.  See Performance Frame v2.

    In this case, the vulnerabilities, threats and countermeasures are purely from a technical design standpoint.  To rationalize performance against other quality attributes and against goals and constraints, you can use performance modeling and threat modeling.  To put it another way, evaluate your design trade-offs against the acceptance criteria for your usage scenarios, considering your user, system, and business goals and constraints.

  • J.D. Meier's Blog

    Blog Improvements


    I did a few things to try to improve browsing and findability:

    I was surprised by how many of my posts related to productivity.  Then again, I focus heavily on productivity with my mentees.  I think personal productivity is an important tool for turning their great ideas, hopes, and dreams into results.  If it's not already their strength, I want to make sure it's at least not a liability.

    On my Book Share blog, I changed themes, reorganized key features, and created a "best of" list.  While it may sound simple here, I actually went through quite a bit of trial and error.  I tested many, many user experience patterns and relied heavily on feedback from a trusted set of reviewers.  Although I used a satisficing strategy, I did try to make browsing the content as efficient and effective as possible.  I was surprised by how many subtle patterns and practices there are for blog layouts.  Maybe more surprising was how many anti-patterns there are.

  • J.D. Meier's Blog

    Cutting Questions


    How do you cut to the chase?  How do you clear the air of ambiguity and get to facts?  Ask cutting questions.

    My manager, Per, doesn't ask a lot of questions.  He asks the right ones.  Here are some examples:

    • Who's on board?  Who are five customers that stand behind you?
    • Next steps?
    • What does your gut say?
    • Is it working?  Is it effective?
    • What would "x" say? (for example, what would your peers say?)
    • What's their story?
    • Where's your prioritized list of scenarios?

    As simple as it sounds, having five separate customers stand behind you is a start.  I'm in the habit of litmus checking my path early on to see who's on board or to find the resistance.  As customers get on board, my confidence goes up.  I've also seen this cutting question work well with startups.  I've asked a few startups about their five customers.  Some had great ideas, but no customers on board.  The ones that had at least five are still around.

    At the end of any meeting, Per never fails to ask "next steps?", and the meeting quickly shifts from talk to action.

    "Is it working?" is a pretty cutting question.  It's great because it forces you to step back and reflect on your results and consider a change in approach.

  • J.D. Meier's Blog

    Vision, Mission, Values


    There's a lot to be said for well-crafted vision and mission statements.   I've been researching and leaving a trail at The Bookshare.

    In a Nutshell

    • Mission - who are you? what do you do?
    • Vision - where do you want to go?
    • Values - what do you value? what's important? (your corporate culture)

    How Do You Craft Them

    1. You start by figuring out the values.  You figure out the values by observing how your organization prioritizes and how they spend their time.  There can be a gap between what folks say they value and what they actually do.  Actions speak louder than words.
    2. Once you know your culture and values, you can figure out your mission -- who you are and what you do.  What unique value does your organization bring to the table?  What is your unique strength?  In a world of survival of the fittest, this is important to know and to leverage.
    3. Now that you know who you are, you can figure out where you want to go.

    A good vision statement is a one-liner you can repeat in the halls.  Nobody has to memorize it.  It's easy to say and easy to grok.  The same goes for a mission statement.  You might need to add another line or two to your mission statement to disambiguate, but if folks don't quickly get what you do from your mission statement -- it's not working.

    How Do You Use Them

    • Use a mission statement to quickly tell others what you do.
    • Use a vision statement to inspire and rally the team.  It should be on the horizon, but achievable and believable.
    • Use a mission statement as a gauge for success. 
    • Set goals and objectives that tell you whether you're accomplishing your mission and moving toward or away from your vision.
    • Use your mission to remind you what you do (and what you don't) and to help you prioritize.
    • Craft a personal mission and vision statement to help you get clarity on what you want to accomplish.
    • Use your personal vision and mission statements to help you stay on your horse, or get back on, when you get knocked down, or lose your way.

    I'm a fan of using reference examples (lots of them) to get a sense of what works and what doesn't.  The Man on a Mission blog is dedicated to mission statements and has plenty of real-life examples to walk through. 

  • J.D. Meier's Blog

    Daily Syncs


    On my teams we do a daily sync meeting.  It's 10 minutes max.  We go around the team with three questions:

    1. What did you get done?
    2. What are you getting done next?
    3. Where do you need help?

    We stay out of details (that's for offline and follow-up).  It's a status meeting focused more on accomplishments and progress than on reporting activities (lots of folks are doing lots of things, so it's crisper to focus on accomplishments).  The more distributed the team, the more important the meeting.

    Keys to Results

    • 10-minute timebox.  The 10-minute bar is apparently a big factor in how folks view the meeting, based on feedback from folks who have been in longer meetings (1/2 hour or more).  The 10-minute max is key because it keeps a fast pace and energy high (vs. another meeting of blah, blah, blah).  We can always finish earlier (in fact, one of my teams was regularly finishing the meeting in under 2 minutes for a 5-person team).
    • Daily.  Daily is important.  Having them daily means everybody can structure their day consistently.  Daily also means it's easy to build a routine and reduce friction points.  It also means that team members have a reliable forum for getting help if needed.

    The best pattern that has worked over time is ...

    • Mondays - we define the most important outcomes for the week (the few big things that matter, no laundry lists).  This is actually closer to a 1/2 hour (max) meeting.
    • Daily - we do a daily checkpoint meeting (this is about execution, bottlenecks, and awareness).
    • Fridays - we reflect on lessons learned and make any improvements to project practices.

    Another way of thinking about this is ... "if this were the end of the week, what would you feel good about having completed?"  "Each day, are we getting closer or further, or do we need to readjust priorities or expectations?" ...  "What did we learn and what can we improve?"


  • J.D. Meier's Blog

    Execution Checklists


    Execution checklists are a simple but effective technique for improving results.  Rather than a to-do list, it's a focused checklist of steps in sequence to execute a specific task.  I use Notepad to start.  I write the steps.  On each execution of the steps, by myself or a teammate, we improve the steps as we learn.  We share our execution checklists in Groove or in a Wiki.

    Key Scenarios
    There are two main scenarios:

    1. You are planning the work to execute.  In this case, you're thinking through what you have to get done.  This is great when you feel over-burdened or if you have a mind-numbing, routine task that you need to get done.  This can help you avoid task saturation and it can also help you avoid silly mistakes while you're in execution mode.
    2. You are paving a path through the execution.  In this case, you're leaving a trail of what worked.  This works great for tasks that you'll have to perform more than once or you have to share best practices across the team.   

    I encourage my teams to create execution checklists for any friction points or sticking spots we hit.  For example, if there's a tough process with lots of movable parts, we capture the steps and tune them over time as we gain proficiency.  As simple as this sounds, it's very effective whether it's for a personal task, a team task, or any execution steps you want to improve.
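If you wanted to keep such a checklist somewhere more structured than Notepad, a minimal sketch might look like the following; the task and steps are illustrative, not from an actual checklist:

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    text: str
    notes: list = field(default_factory=list)  # lessons learned on past runs

@dataclass
class ExecutionChecklist:
    task: str
    steps: list

    def run(self):
        """Walk the steps in order, surfacing any lessons captured earlier."""
        for i, step in enumerate(self.steps, 1):
            print(f"{i}. {step.text}")
            for note in step.notes:
                print(f"   note: {note}")

    def learn(self, step_index, lesson):
        """Bake a lesson from this run into the checklist for next time."""
        self.steps[step_index].notes.append(lesson)

# Illustrative checklist for a document build.
build = ExecutionChecklist("Build guide PDF", [
    Step("Sync the latest chapter sources"),
    Step("Regenerate the table of contents"),
    Step("Export to PDF and spot-check page breaks"),
])
build.learn(2, "Check landscape tables; they clipped last build")
build.run()
```

The point is the loop: each execution both follows the steps and improves them, so the checklist gets sharper every time it's used.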

    One of my most valuable execution checklists is steps for rebuilding my box.  While I could rebuild my box without it, I would fumble around a bit and probably forget some key things, and potentially get reminded the hard way.

    The most recent execution checklist I made was for building the PDF for our Team Development with Visual Studio Team Foundation Server guide.  There were a lot of manual steps and there was plenty of room for error.  Each time I made a build, I baked the lessons learned into the execution checklist.  By the time I got to the successful build, there was much less room for error simply by following the checklist.

  • J.D. Meier's Blog

    New Release: patterns & practices Team Development with Team Foundation Server Guide


    Today we released the final version of our patterns & practices Team Development with Visual Studio Team Foundation Server guide.  It's our Microsoft playbook for Team Foundation Server.  It shows you how to make the most of Team Foundation Server.  It's a compendium of proven practices, product team recommendations, and insights from the field.

    Key Changes Since Beta 1

    • We added guidelines for build, project management, and reporting.
    • We added practices at a glance for build, project management, and reporting.
    • We added a chapter to summarize key Visual Studio 2008 changes.
    • We revamped our Internet access strategies.
    • We did a full sweep of the guide.
    • We completed more thorough product team reviews for key chapters.

    Contents at a Glance

    • Part I, Fundamentals
    • Part II, Source Control
    • Part III, Builds
    • Part IV, Large Project Considerations
    • Part V, Project Management
    • Part VI, Process Templates
    • Part VII, Reporting
    • Part VIII, Setting Up and Maintaining the Team Environment
    • Part IX, Visual Studio 2008 Team Foundation Server


    • Ch 01 – Introducing the Team Environment
    • Ch 02 – Team Foundation Server Architecture
    • Ch 03 – Structuring Projects and Solutions in Source Control
    • Ch 04 – Structuring Projects and Solutions in Team Foundation Source Control
    • Ch 05 – Defining Your Branching and Merging Strategy
    • Ch 06 – Managing Source Control Dependencies in Visual Studio Team System
    • Ch 07 – Team Build Explained
    • Ch 08 – Setting Up Continuous Integration with Team Build
    • Ch 09 – Setting Up Scheduled Builds with Team Build
    • Ch 10 – Large Project Considerations
    • Ch 11 – Project Management Explained
    • Ch 12 – Work Items Explained
    • Ch 13 – Process Templates Explained
    • Ch 14 – MSF for Agile Software Development Projects
    • Ch 15 – Reporting Explained
    • Ch 16 – Installation and Deployment
    • Ch 17 – Providing Internet Access to Team Foundation Server
    • Ch 18 – What’s New in Visual Studio 2008 Team Foundation Server

    Our Team

    Contributors and Reviewers

    • External Contributors / Reviewers: David P. Romig, Sr; Dennis Rea; Eugene Zakhareyev; Leon Langleyben; Martin Woodward; Michael Rummier; Miguel Mendoza; Mike Fourie; Quang Tran; Sarit Tamir; Tushar More; Vaughn Hughes
    • Microsoft Contributors / Reviewers:  Aaron Hallberg; Ahmed Salijee; Ajay Sudan; Ajoy Krishnamoorthy; Alan Ridlehoover; Alik Levin; Ameya Bhatawdekar; Bijan Javidi; Bill Essary; Brett Keown; Brian Harry; Brian Keller; Brian Moore; Buck Hodges; Burt Harris; Conor Morrison; David Caufield; David Lemphers; Doug Neumann; Edward Jezierski; Eric Blanchet; Eric Charran; Graham Barry; Gregg Boer; Grigori Melnik; Janet Williams Hepler; Jeff Beehler; Jose Parra; Julie MacAller; Ken Perilman; Lenny Fenster; Marc Kuperstein; Mario Rodriguez; Matthew Mitrik; Michael Puleio; Nobuyuki Akama; Paul Goring; Pete Coupland; Peter Provost; Granville (Randy) Miller; Rob Caron; Robert Horvick; Rohit Sharma; Ryley Taketa; Sajee Mathew; Siddharth Bhatia; Tom Hollander; Tom Marsh; Venky Veeraraghavan


  • J.D. Meier's Blog

    Improvement Frame


    As a mentor at work, I like to checkpoint results.  While I can do area-specific coaching, I tend to take a more holistic approach.  For me, it's more rewarding to find ways to unleash somebody's full potential and improve their overall effectiveness at Microsoft.  Aside from checking against specific goals, I use the following frame to gauge progress.

    Improvement Frame

    Thinking / Feeling

    • Do you find your work rewarding?
    • Are you passionate about what you do?
    • Are you spending more time feeling good?
    • What thoughts dominate your mind now?
    • Is your general outlook more positive or negative?
    • Do you have more energy or less in general?
    • Are you still worried about the same things?
    • Are you excited about anything?
    • Have you changed your self-talk from inner-critic to coach?

    Situation

    • Are you spending more time working on what you enjoy?
    • What would you rather be spending more time doing?
    • Do you have the manager you want?
    • Do you have the job you want?
    • Are you moving toward or away from your career goals?
    • If your situation were never going to change, what one skill would you need to make the most of it?

    Time / Task Management

    • Are you driving your day or being driven?
    • Are you spending less time on administration?
    • Are you getting your "MUSTs" done?
    • Are you dropping the ball on anything important?
    • Do you have a task management system you trust?
    • Are you avoiding using your head as a collection point?
    • How are you avoiding biting off more than you can chew?
    • How are you delivering incremental value?

    Domain Knowledge

    • Have you learned new skills?
    • Have you sharpened your key strengths?
    • Have you reduced your key liabilities?
    • What are you the go-to person for?
    • What could you learn that would make you more valuable to your team?

    Strategies / Approaches

    • What are you approaching differently than in the past?
    • How are you more resourceful?
    • How are you finding lessons in everything you do?
    • How are you learning from everybody that you can?
    • How are you improving your effectiveness?
    • How are you modeling the success of others?
    • How are you tailoring advice to make it work for you?

    Relationships

    • Are you managing up effectively?
    • Are your priorities in sync with your manager's?
    • Has your support network grown or shrunk?
    • How are you participating in new circles of influence?
    • How are you spending more time with people that catalyze you?
    • How are you working more effectively with people that drain you?
    • How are you leveraging more mentors and area-specific coaches?

    I've found this frame very effective for quickly finding areas that need work or sticking points.  It's also very revealing in terms of how much dramatic change there can be.  While situations or circumstances may not change much, I find that changes in strategies and approaches can have a profound impact.  My take on this is that while you can't always control what's on your plate, you can control how you eat it.

  • J.D. Meier's Blog



    I showed a colleague of mine one of my tricks for building slide decks faster.  It's a divide-and-conquer approach I've been using for a few years.  I do what I call "one-sliders."

    Whenever I build a deck, such as for milestone meetings, I create a set of single-slide decks.  I name each slide appropriately (vision, scope, budget, ... etc.)  I then compose the master deck from the slides.

    Here are the benefits that might not be obvious:

    • It's easy to jump to a particular slide without manipulating a heavy deck, which helps when I'm first building the deck.
    • It encourages quick, focused reviews with the right people (e.g., I can pair with our CFO on the budget slide without hunting through a deck).
    • It encourages sharing with precision.  I share the relevant slide vs. "see slide 32" in a 60-slide deck.
    • I end up with a repository of reusable slide nuggets.  I find myself drawing from my "one-slider" depot regularly.
    • Doing a slide at a time encourages thinking in great slides.  It's similar to thinking in great pages in a Wiki (a trick Ward taught me).

    The biggest impact though is that now I find myself frequently sharing concise one-sliders, and getting points across faster and simpler than blobby mails.
