Steve Lange @ Work

Steve Lange's thoughts on application lifecycle management, Visual Studio, and Team Foundation Server

  • Steve Lange @ Work

    My 2 cents on Areas and Iterations in Team Foundation Server

    • 27 Comments

There’s not a huge amount of best-practice info out there regarding areas and iterations.  One interesting place to start is a blog post describing how the Visual Studio team uses them (http://blogs.msdn.com/ericlee/archive/2006/08/09/when-to-use-team-projects.aspx).

     

    So here are my 2 cents (you can see how much that's worth these days!) on Areas and Iterations. 

     

    Areas

To me, areas are ways of tagging or organizing objects within a Team Project.  Typically, areas are used to define logical, physical, or functional boundaries.  It’s a way to slice and dice an otherwise large project effort into more manageable, reportable, and easily identifiable pieces. 

     

    For example, let’s say we have a tiered web application managed in a single TFS project called “MySite”.  There are 3 major components to this app:  the web site, a web service, and a database.  If this is a decent-sized application, you might have 1,200 tasks in the system for this project.  But how do you know to which component a given task belongs?  What if I only wanted to see tasks for the web service piece?  Areas are a convenient way to handle this.  Set up areas like this:

     

    MySite

       \Web Site

       \Web Service

       \Database

     

    Now you can specify an area of assignment for each task (work item), making it easy to effectively filter what you want to look at/work on.  You can use areas in both queries and reports as well.

     

    You may optionally want to further dissect those major components to be even more specific:

     

    MySite

       \Web Site

          \Layout & Design

          \Navigation

          \Pages

             \Contact Us

             \Homepage

             \Links

       \Web Service

          \Performance

          \Security

       \Database

          \Performance

          \Security

          \Schema

     

    One final aspect of Areas to consider is security.  You can set security options on each Area node which can dictate not only who can change the areas, but also who can view or edit work items in a particular Area.

     

    Iterations

    So if you think of Areas as slicing and dicing by “space”, think of Iterations as slicing and dicing by “time”.  Iterations are like “phases” of a lifecycle, which can dissect the timeline of a project effort into more manageable time-based pieces. 

     

So going back to the “MySite” example, say the project management team wants to split the entire project into 3 cycles: Phase 1, Phase 2, and Phase 3.  Your Iterations can mirror that:

     

    \MySite

       \Phase 1

       \Phase 2

       \Phase 3

     

These Iterations can be phases within the entire life of a project, or phases within a given release of a project.  So if “MySite” is going to have multiple releases over time, my Iterations might look like this:

     

    \MySite

       \Release 1.0

          \Phase 1

          \Phase 2

          \Phase 3

       \Release 2.0

          \Phase 1

          \Phase 2

          \Phase 3

     

    Now you have categorization options for both space and time (now if only we had a continuum..) for your project, allowing you to assign your tasks or other work items not only to the appropriate functional area (Area), but also to the phase (time cycle) of the project.
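
If you want to see this slicing and dicing from code, here's a rough sketch using the TFS object model (the server URL and the WIQL are illustrative; adjust them to your own environment):

    using System;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.WorkItemTracking.Client;

    class AreaIterationQuery
    {
        static void Main()
        {
            // Illustrative server URL; point this at your own TFS instance.
            TeamFoundationServer tfs = TeamFoundationServerFactory.GetServer("http://tfsserver:8080");
            WorkItemStore store = (WorkItemStore)tfs.GetService(typeof(WorkItemStore));

            // Slice by "space" (Area Path) and "time" (Iteration Path) at once:
            // all tasks for the web service piece scheduled for Release 1.0, Phase 2.
            string wiql =
                @"SELECT [System.Id], [System.Title]
                  FROM WorkItems
                  WHERE [System.TeamProject] = 'MySite'
                    AND [System.WorkItemType] = 'Task'
                    AND [System.AreaPath] UNDER 'MySite\Web Service'
                    AND [System.IterationPath] UNDER 'MySite\Release 1.0\Phase 2'";

            foreach (WorkItem task in store.Query(wiql))
            {
                Console.WriteLine("{0}: {1}", task.Id, task.Title);
            }
        }
    }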

  • Steve Lange @ Work

    VS/TFS 2012 Tidbits: Requesting a Code Review on Code Already Checked in

    • 18 Comments

    As the Visual Studio family of products (Visual Studio, TFS, Test Professional) nears its 2012 release, I thought I’d bring some short hits – tidbits, if you will – to my blog. Some of these are pretty obvious (well-documented, or much-discussed), but some may be less obvious than you’d think. Either way, it’s always good to make sure the word is getting out there. Hope you enjoy!

    Requesting a Code Review on Code Already Checked in

    There’s been great hype about the new built-in code review capabilities in TFS 2012, and for good reason. The process is easy, effective, and most of all, audited.

[screenshot]

But did you know that “My Work” is not the only place from which you can kick off a code review?  You can also request a review of code that’s already been checked in.  Go to the file in Source Control Explorer, then view its history.  In the History window, right-click on the changeset and select “Request Review”.

[screenshot]

    This will load up the New Code Review form in Team Explorer:

[screenshot]

Notice that it not only brings in the files from the changeset (5 of them, in this example), but also any work items that were related to this changeset.  The check-in comments are used to populate the title of the code review, as well as the optional description.

    Off ya go!

  • Steve Lange @ Work

    It’s Official: VS 2010 Branding & Pricing

    • 12 Comments

    Microsoft just announced final branding and pricing for the Visual Studio 2010 lineup!  Here’s what it looks like (you can call this either the stadium or Lego view):

[Screenshot: the Visual Studio 2010 product lineup]

    BRANDING

    There are three minor changes to product names, listed below:

Old Name → New Name

Microsoft Visual Studio Test Elements 2010 → Microsoft Visual Studio Test Professional 2010

Microsoft Visual Studio Team Lab Management 2010 → Microsoft Visual Studio Lab Management 2010

Microsoft Test and Lab Manager* → Microsoft Test Manager 2010*

    * Not available as a separate product for purchase.

    PRICING

    Below is the suggested pricing (USD) for each of the 2010 products.

Product | Buy | Upgrade | Buy w/ 1-yr MSDN | Renew w/ 1-yr MSDN
Visual Studio 2010 Ultimate | - | - | $11,899 | $3,799
Visual Studio 2010 Premium | - | - | $5,469 | $2,299
Visual Studio 2010 Professional | $799 | $549 | $1,199 | $799
Visual Studio Test Professional 2010 | - | - | $2,169 | $899
Visual Studio Team Foundation Server 2010 | $499 | $399 | - | -
Visual Studio Team Foundation Server 2010 CAL | $499 | - | - | -
Visual Studio Load Test Virtual User Pack 2010 (1000 Virtual Users) | $4,499 | - | - | -

    * Subscription contents vary by purchased product.

    A couple things to note:

    • TFS 2010 and a TFS 2010 CAL are included with every MSDN subscription
• The above prices are suggested list prices.  Companies buying development tools licenses usually go through volume licensing, which usually results in lower prices.

    Not sure what product has what?

    Visual Studio 2010 lineup - from the Rangers 2010 Quick Reference Guide

    Here’s another angle:

    Visual Studio 2010 lineup 

    For more details on each feature, you can view a matrix here.

  • Steve Lange @ Work

    Panels vs. Context: A Tale of Two Visual Studios and a Practical Explanation of the Value of CodeLens

    • 11 Comments

    If you have Visual Studio 2013 Ultimate, you know CodeLens is amazing.  If you don’t know what CodeLens is, I hope this helps.  I have a lot of customers who ask me about CodeLens, what it is, and how valuable I think it is for an organization.  Here’s my response.

It’s really a tale of two Visual Studios, if you think about it.

    A Visual Studio Full of Panels

    Let’s say you’re looking at a code file, specifically a method.  Your Visual Studio environment may look like this:

[screenshot]

    I’m looking at the second Create method (the one that takes a Customer).  If I want to know where this method may be referenced, I can “Find All References”, either by selecting it from the context menu, or using Shift + F12. Now I have this:

[screenshot]

Great!  Now, if I decide to change this code, will it still work?  Will my tests still work?  In order to figure that out, I need to open my Test Explorer window.

[screenshot]

    Which gives me a slightly more cluttered VS environment:

[screenshot]

    (Now I can see my tests, but I still need to try and identify which tests actually exercise my method.)

    Another great point of context to have is knowing if I’m looking at the latest version of my code.  I’d hate to make changes to an out-of-date version and grant myself a merge condition.  So next I need to see the history of the file.

[screenshot]

Cluttering my environment even more (because I don’t want to take my eyes off my code, I need to snap it somewhere else), I get this:

[screenshot]

    Okay, time out. 

Yes, this looks pretty cluttered, but I can organize my panels better, right?  I can move some panels to a second monitor if I want, right?  Right on both counts.  By doing so, I can get a multi-faceted view of the code I’m looking at.  However, what if I start looking at another method, or another file?  The “context” of those other panels doesn’t follow what I’m doing.  Therefore, if I open the EmployeesController.cs file, my “views” are out of sync!

[screenshot]

    That’s not fun.

    A Visual Studio Full of Context

So all of the above illustrates two main benefits of something like CodeLens.  CodeLens inserts easy, powerful, at-a-glance context for the code you’re looking at.  If it’s not turned on, turn it on in Options:

[screenshot]

    While you’re there, look at all the information it’s going to give you!

    Once you’ve enabled CodeLens, let’s reset to the top of our scenario and see what we have:

[screenshot]

    Notice an “overlay” line of text above each method.  That’s CodeLens in action. Each piece of information is called a CodeLens Indicator, and provides specific contextual information about the code you’re looking at.  Let’s look more closely.

[screenshot]

    References

[screenshot]

References shows you exactly that – references to this method.  Click on that indicator and you can see and do some terrific things:

[screenshot]

    It shows you the references to this method, where those references are, and even allows you to display those references on a Code Map:

[screenshot]

    Tests

[screenshot]

As you can imagine, this shows you tests for this method.  This is extremely helpful in understanding the viability of a code change.  This indicator lets you view the tests for this method, interrogate them, and run them.

[screenshot]

    As an example, if I double-click the failing test, it will open the test for me.  In that file, CodeLens will inform me of the error:

[screenshot]

Dramatic pause: This CodeLens indicator is tremendously valuable in a TDD (Test-Driven Development) workflow.  Imagine putting your test file and code file side by side, turning on “Run Tests After Build”, and using the CodeLens indicator to get immediate feedback about your progress.

    Authors

[screenshot]

This indicator gives you very similar information to the next one, but lists the authors of this method for at-a-glance context.  Note that the latest author is the one noted in the CodeLens overlay.  Clicking on this indicator provides several options, which I’ll explain in the next section.

[screenshot]

    Changes

[screenshot]

The Changes indicator tells you about the history of the file as it exists in TFS, specifically changesets.  First, the overlay tells you how many recent changes there are to this method in the current working branch.  Second, if you click on the indicator you’ll see there are several valuable actions you can take right from that context:

[screenshot]

    What are we looking at?

    • Recent check-in history of the file, including Changeset ID, Branch, Changeset comments, Changeset author, and Date/time.
    • Status of my file compared to history (notice the blue “Local Version” tag telling me that my code is 1 version behind current).
    • Branch icons tell me where each change came from (current/parent/child/peer branch, farther branch, or merge from parent/child/unrelated (baseless)).

    Right-clicking on a version of the file gives you additional options:

[screenshot]

    • I can compare my working/local version against the selected version
    • I can open the full details of the Changeset
    • I can track the Changeset visually
    • I can get a specific version of the file
    • I can even email the author of that version of the file
    • (Not shown) If I’m using Lync, I can also collaborate with the author via IM, video, etc.

This makes it a heck of a lot easier to understand the churn or velocity of this code.

    Incoming Changes

[screenshot]

    The Incoming Changes indicator was added in 2013 Update 2, and gives you a heads up about changes occurring in other branches by other developers.  Clicking on it gives you information like:

[screenshot]

    Selecting the Changeset gives you the same options as the Authors and Changes indicators.

    This indicator has a strong moral for anyone who’s ever been burned by having to merge a bunch of stuff as part of a forward or reverse integration exercise:  If you see an incoming change, check in first!

    Work Items (Bugs, Work Items, Code Reviews)

[screenshot]

I’m lumping these last indicators together because they are effectively filtered views of the same larger content: work items.  Each of these indicators gives you information about work items linked to the code in TFS.

[screenshot]

[screenshot]

    Knowing if/when there were code reviews performed, tasks or bugs linked, etc., provides fantastic insight about how the code came to be.  It answers the “how” and “why” of the code’s current incarnation.

     

    A couple final notes:

    • The indicators are cached so they don’t put unnecessary load on your machine.  As such they are scheduled to refresh at specific intervals.  If you don’t want to wait, you can refresh the indicators yourself by right-clicking the indicators and choosing “Refresh CodeLens Team Indicators”

[screenshot]

    • There is an additional CodeLens indicator in the Visual Studio Gallery – the Code Health Indicator. It gives method maintainability numbers so you can see how your changes are affecting the overall maintainability of your code.
    • You can dock the CodeLens indicators as well – just know that if they dock, they act like other panels and will be static.  This means you’ll have to refresh them manually (this probably applies most to the References indicator).
    • If you want to adjust the font colors and sizes (perhaps to save screen real estate), you can do so in Tools –> Options –> Fonts and Colors.  Choose “Show settings for” and set it to “CodeLens”.

     

    I hope you find this helpful!

  • Steve Lange @ Work

    Team Foundation Server vs. Team Foundation Service

    • 10 Comments

    You’ve probably found a few comparisons on the interwebs comparing the “traditional”, on-premise TFS with the new cloud-hosted Team Foundation Service.  I get asked about this a lot – as a result, I thought I’d share the slide deck I used to drive this conversation.  Please let me know if you have any questions!

     

    Basically, TF Service is a nice way to get up and running quickly, without worrying about infrastructure, backups, etc.  What you lose is some customization, lab management, and SSRS reporting.

    Happy developing!

  • Steve Lange @ Work

    Requirements Management in TFS: Part 1 (of 4): Overview

    • 10 Comments

There are several schools of thought on how to "do RM", ranging from the very lightweight (whiteboards, sticky notes, cocktail napkins, etc.) to the robust (formal elicitation, authoring, validation, and management of specifications).  Chances are your organization falls somewhere in between. 

Past dictates future (thanks, Dr. Phil), and the same applies to how teams approach requirements management.  Basically, if you're used to managing your requirements using Word documents (and believe me, you're in the majority), that's most likely what you'll reach for when starting a new project.

    The historically mainstream ways to manage requirements (Word, Excel, email) can be efficient (people know the tools, and again, it's how it's been done before) and satisfactory.  But with the application development projects of today becoming increasingly complex and distributed (both architecture and project teams), this process becomes more difficult to manage.  Throw in your regulation/compliance package-of-the-day and you quickly realize you need more.  Key capabilities around collaboration, audit/history, and traceability rise to the top of the priority list.

    As a result, the idea of managing requirements as individual elements (rather than parts of a larger specification document) is becoming increasingly popular.

    I hear this quite often:  "How does Team System support Requirements Management?"  Visual Studio Team System, or more specifically Team Foundation Server, possesses the plumbing needed for the above-mentioned capabilities out of the box as part of its inherent architecture.  TFS provides work item tracking to allow items (bugs, tasks, or in this case, requirements) to be treated as individually managed objects with their own workflows, attributes, and traceability.  However, while the infrastructure is there, TFS wasn't designed specifically to support a requirements management process. 

But if you are looking at Team Foundation Server to manage your development process, I would suggest that you take a peek at how it can be used to support your business analysts from a requirements management perspective as well.  Again, although it's not specifically targeted at business analysts (it is on the radar, however; see Team System Futures), many of the capabilities of TFS can help support a productive RM process.

    This series will take a look at a few different ways that TFS can support requirements management.  In Part 2 I'll show a couple of ways to do this using TFS "natively" (without any add-ins/plug-ins); and in Part 3 I'll briefly discuss some 3rd party solutions that support requirements more directly yet still integrate with Team System.  And we'll close the loop in Part 4 with a summary.

    Next:  TFS - Out of the Box


  • Steve Lange @ Work

    Requirements Management in TFS: Part 2 (of 4): TFS Out of the Box

    • 10 Comments

    This is Part 2 of the series, Requirements Management in TFS.  For a brief background, please see Part 1: Overview.

In this part, we'll discuss a couple of the primary ways to support requirements in Team Foundation Server: the Team Portal and Work Item Tracking.

    Team Portal (SharePoint)

[Screenshot: Team Explorer's Documents folder]

If you use some kind of document format for authoring and tracking your specifications (Word, PDF, etc.), you may already be using something like a SharePoint site to store them.  Or some kind of repository, even if it's a network share somewhere.  Team Foundation Server creates SharePoint-based (specifically Windows SharePoint Services) web portals automatically when you create a new project.  These portals provide an easy, web-based way for interested parties to "check in" on a project's status, participate in discussions, post announcements, view process guidance documentation, and submit supporting documents and files to a document library.

    It's the document library that provides a natural fit for bringing your specifications a little more in line with the rest of the application lifecycle.  The document library allows analysts to remain in their comfort application (i.e. Word), but submit changes to requirements to a location that is much more accessible to those that will consume those requirements (architects, developers, testers, etc.).  Team Foundation Server users can access the document library from within Visual Studio Team System, TFS Web Access or other interfaces, thereby allowing them to more readily react to new changes and provide feedback as necessary. 

[Screenshot: Document Library in TFS Portal]

And now that the requirements documents are in the document library and managed (indirectly) by TFS, you can easily leverage the linking capabilities of TFS to add traceability between your specifications and source code, defects, tasks, and test results.  (How do you link work items to items in the document library?  Work items can be linked to hyperlinks, so you can simply link to the URL of the specific file in the document library in SharePoint.)  This adds some real tangibility to your reporting in that you can now view reports from TFS that, for example, show all development tasks and their related requirements spec, changed source code, and validated test results.

    Bug linked to a changeset and work item
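
Incidentally, if you'd rather create those hyperlink links programmatically than click them in by hand, the work item object model makes it a one-liner per item.  A quick sketch (the work item ID and document URL here are made up for illustration):

    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.WorkItemTracking.Client;

    class LinkSpecToWorkItem
    {
        static void Main()
        {
            TeamFoundationServer tfs = TeamFoundationServerFactory.GetServer("http://tfsserver:8080");
            WorkItemStore store = (WorkItemStore)tfs.GetService(typeof(WorkItemStore));

            // Made-up work item ID and document library URL; substitute your own.
            WorkItem task = store.GetWorkItem(1234);
            task.Links.Add(new Hyperlink("http://tfsserver/sites/MySite/Shared%20Documents/Requirements.doc"));
            task.Save();
        }
    }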

    Easy, right?  Well, yes and no.  There are some definite drawbacks to this approach (I'm leaving it up to you to decide if the good outweighs the bad), the primary being that you still don't have any more granular control over individual requirements changes than you did before.  Changes are still tracked, and linked, at the document level.  This can be challenging if you need to track changes to individual requirements (change tracking in Word will only take you so far) for auditing and compliance reasons.

Benefits:

• Analysts remain in comfort application (Word, etc.)
• SharePoint is a "natural" extension in Word/Office applications
• Requirement specs more easily consumed by other roles in lifecycle.
• Provides basic mechanism to enable traceability and better "cross-artifact" reporting.

Drawbacks:

• Lack of item-level granularity.
• Document-level linking only (can't link to an individual requirement inside a specification document).
• Document workflow is managed by SharePoint, whereas workflow for other lifecycle artifacts is managed by TFS.

    Work Item Tracking

[Screenshot: Requirements queries]

Team Foundation Server's work item tracking feature is a major moving part of the system.  Work Item Tracking (WIT) provides a robust yet flexible way to track any item of record throughout a lifecycle.  Some work item types provided with TFS include: bug, task, risk, scenario, or (to the point of this article) requirement.  Work items are managed in the TFS database alongside code, builds, test results, etc., and provide a proper level of granularity for controlling change and traceability.

In the previous example, the SharePoint project portal approach couldn't control changes to individual requirements, nor could it link to those individual elements.  Leveraging WIT in TFS addresses both of these shortcomings.  You can create and customize your own types of work items, allowing teams to have complete control over what types of work items are used, their fields, workflow, and even UI.  Say, for example, your team typically leverages three types of requirements: Business, Functional, and Non-Functional.  TFS allows you to create custom work item types that represent each of these categories of requirements.

Now that your requirements are managed as work items in TFS, you can take advantage of all the benefits of the work item tracking system (see the benefits below).

Requirements work items are accessed in exactly the same manner as any other work item.

Since work items are primarily accessed by way of queries in Team Explorer, teams can easily filter which requirements are displayed and accessed at certain points.

    Reporting gets a considerable leg up using the work item tracking approach.

    The biggest challenge with this approach (in my opinion) is the shift in mindset.  In case you didn't notice, I haven't mentioned using Word in this section.  WIT gets more granular than Word does for managing item-level changes, and there is not currently a Microsoft-provided integration to Word from TFS.  There is often considerable resistance to change in that, "without the document, what will we do?"

Benefits:

• All changes will be recorded and audited
• Links can be created between individual requirements and other work items (of any type), source code, test results, and hyperlinks
• Workflow is enforced and controlled in the same manner as all other work item types
• Supporting information (screenshots, documents, UML diagrams, etc.) can be attached
• Reporting can be much more granular (showing requirement implementation rates, impact analysis, scope creep).

Drawbacks:

• Change of interface may meet resistance (i.e. no more Word!)
• Customization work is most likely involved (creating custom work item types, fields, & workflow).

     

    Getting Into the Code

And lastly, if you're really into it, you can tap the Team Foundation Server SDK to get really creative.  For example, you could write a custom lightweight interface for business analysts to enter and track requirements work items in TFS, or create a custom report (although you might be better off doing that through the server's reporting mechanism, SQL Server Reporting Services).  I have a little app that creates a "coverage analysis" spreadsheet in Excel that shows coverage (i.e. links) between work item types (for example, I can see if there are any business requirements that have no corresponding functional requirements).
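
To give a flavor of that coverage-analysis idea, here's a stripped-down sketch.  The work item type names are assumptions (matching the custom types discussed earlier), and this version just prints to the console instead of building a spreadsheet:

    using System;
    using System.Linq;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.WorkItemTracking.Client;

    class CoverageCheck
    {
        static void Main()
        {
            TeamFoundationServer tfs = TeamFoundationServerFactory.GetServer("http://tfsserver:8080");
            WorkItemStore store = (WorkItemStore)tfs.GetService(typeof(WorkItemStore));

            // Hypothetical custom work item type names.
            WorkItemCollection bizReqs = store.Query(
                @"SELECT [System.Id], [System.Title] FROM WorkItems
                  WHERE [System.TeamProject] = 'MySite'
                    AND [System.WorkItemType] = 'Business Requirement'");

            foreach (WorkItem biz in bizReqs)
            {
                // Walk the related work items and see if any is a Functional Requirement.
                bool covered = biz.Links.OfType<RelatedLink>()
                    .Select(link => store.GetWorkItem(link.RelatedWorkItemId))
                    .Any(related => related.Type.Name == "Functional Requirement");

                if (!covered)
                    Console.WriteLine("No functional coverage: {0} - {1}", biz.Id, biz.Title);
            }
        }
    }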

    Next:  TFS - Partner Integrations


  • Steve Lange @ Work

    Requirements Management in TFS: Part 4 (of 4): Summary

    • 10 Comments

    Every organization approaches the concept of "requirements" differently.  Factors include general history, skill set, complexity, and agility.  Many development organizations are adopting Team Foundation Server to help improve team communication & collaboration, project control & visibility, and generally a more integrated experience across the various actors in the application lifecycle. 

The more pervasive TFS becomes in an organization, the more I'm asked about managing requirements within the confines of Team System.  Some shops want to know how to integrate more RM-specific applications into the platform, while others want to leverage TFS as much as possible and wait until Microsoft releases a requirements management solution (I know, I know, Word is the most widely-used requirements tool in the world - but I think you know what I mean by now!).

    If you're trying to choose which path to take (TFS-only or a partner integration), here are a few basic considerations:

     

TFS Only

Benefits:
• Affordability (only a TFS CAL is required)
• Full integration with the rest of the application lifecycle (existing infrastructure is leveraged for reporting & visibility)
• Consistent capture & storage mechanism for all project artifacts.

Drawbacks:
• Lack of specific focus on the analyst role
• Interface may be a bit "heavy" and counter-intuitive for analysts.

Partner Integrations

Benefits:
• Can immediately provide requirements-specific capabilities (rich text, use case diagramming, etc.)
• Many can trace/link to work items in TFS, providing end-to-end visibility

Drawbacks:
• Cost (most partner tools require their own licenses, and each user still requires a TFS CAL from Microsoft; maintenance costs may be a factor as well)
• An additional skill set is required for the partner tool

Other requirements-related resources can be found in the various parts of this series.

Well, I hope you at least found this series worth the time it took you to read it.  I welcome any comments and feedback, as this topic is always shifting in perception, intention, and schools of thought.


  • Steve Lange @ Work

    FREE EVENT: “A Day in the Life of Scrum” with Team System

    • 9 Comments

    *** UPDATED 4/30/2009 ***

    • The Denver date has changed to June 4th
    • The Phoenix date has been updated!  The new date is June 2nd, and the updated registration information is below. 

If you attended the Agile & Scrum Essentials event series last fall, then you’ve been expecting this second round!  And if you missed it, now you can catch up!

    Please join Microsoft and Neudesic for a day in the life of Scrum with Visual Studio Team System 2008 and Team Foundation Server!  Agile methods are a set of development processes intended to create software in a lighter, faster, more people-centric way. Many development teams have adopted "agile" methodologies to manage change and to improve software quality. These methodologies promote continuous integration as a practice to build and test software products incrementally as new features are included, bugs are fixed, and code is refactored.

If you missed the first series of Agile & Scrum Essentials last fall, here’s your chance to attend the follow-on event, where we’ll briefly revisit the basics of Agile and Scrum and provide a walkthrough of how to configure Visual Studio Team System 2008 and Team Foundation Server for Scrum.  Participants will become familiar with how key artifacts are managed within this popular process template for enacting Scrum in organizations.

Join us for this interactive event as we explore a “day in the life of a Sprint” that will give you a practical perspective of how Scrum teams leverage Visual Studio Team System for end-to-end management of the planning, execution, and control of Scrum projects.  The day will end with an overview of what’s coming in Visual Studio Team System 2010!

    Please register today for the event nearest you!

    DATE CITY REGISTRATION INFO
    3/19/2009 Mountain View, CA Click here to register with invitation code: 38B820
    6/4/2009 Denver, CO Click here to register with invitation code: 02B7F8
6/2/2009 (updated!) Phoenix, AZ Click here to register with invitation code: 4DEAA2
    4/2/2009 Bellevue, WA Click here to register with invitation code: 46F263
    4/7/2009 Salt Lake City, UT Click here to register with invitation code: FF5466
    4/9/2009 Portland, OR Click here to register with invitation code: ED7794
    4/14/2009 San Diego, CA Click here to register with invitation code: 1A8639
    4/15/2009 Irvine, CA Click here to register with invitation code: E4995A
    4/16/2009 Los Angeles, CA Click here to register with invitation code: A61EB4

    You can also call 1.877.MSEVENT (1.877.673.8368) and provide the appropriate invitation code to register.

    I will be at the Denver, Phoenix, and Salt Lake City venues and hope to see you there!

    Did I mention this event is FREE?


  • Steve Lange @ Work

    The “Ultimate” Event: Visual Studio 2010 & Team Foundation Server 2010

    • 9 Comments


Join us for a sneak peek of Microsoft® Visual Studio® 2010, which will be a landmark release of the premier development toolset for Windows®, Web, and Cloud development.
The Ultimate Event is your exclusive opportunity to hear about Visual Studio 2010 from experts before the product is launched this year. Microsoft has made significant investments in and improvements to the Modeling and Testing/QA tools in Visual Studio 2010. At this event you’ll get a comprehensive overview of Visual Studio 2010 and Team Foundation Server 2010, which is the Application Lifecycle Management (ALM) core of Visual Studio. We’ll present enhancements in version control, reporting, project management, and build management. 
    Spend the day with us to learn how to take software development to the next level with Visual Studio 2010!

    Agenda

    Time Topic
    8:30 AM-9:00 AM Registration, Welcome
    9:00 AM-10:30 AM Lap Around VS 2010
    10:45 AM-12:00 PM Agile Management with TFS
    12:00 PM-12:30 PM Lunch
    12:30 PM-1:45 PM No More "No Repro"
    2:00 PM-3:15 PM Architecture for Everyone

    I hope to see you there!

    Venues:

Date Location Event ID
3/2/10 Bellevue, WA 1032439179
3/2/10 San Diego, CA 1032439178
3/4/10 Los Angeles, CA 1032439180
3/9/10 Mountain View, CA 1032439176
3/9/10 Irvine, CA 1032439181
3/10/10 Phoenix, AZ 1032439183
3/11/10 Salt Lake City, UT 1032439996
3/11/10 Portland, OR 1032439182
3/16/10 Denver, CO 1032439184
3/16/10 San Francisco, CA 1032439177

  • Steve Lange @ Work

    Running Code Metrics as Part of a TFS 2010 Build – The Poor Man’s Way

    • 8 Comments

Code Metrics, not to be confused with code analysis, has always been tough, if not impossible, to run as part of a build in Team Foundation Server.  Previously, the only way to run code metrics was to do so inside Visual Studio itself.

    In January, Microsoft released the Visual Studio Code Metrics PowerTool, a command line utility that calculates code metrics for your managed code and saves the results to an XML file (Cameron Skinner explains in detail on his blog). The code metrics calculated are the standard ones you’d see inside Visual Studio (explanations of metric values):

    • Maintainability Index
    • Cyclomatic Complexity
    • Depth of Inheritance
    • Class Coupling
    • Lines Of Code (LOC)

Basically, the power tool adds a Metrics.exe file to an existing Visual Studio 2010 Ultimate, Visual Studio 2010 Premium, or Team Foundation Server 2010 installation.

    So what does this mean?  It means that you can now start running code metrics as part of your builds in TFS.  How?  Well, since this post is titled “The Poor Man’s Way”, I’ll show you the quick and dirty (read: it works but is not elegant) way to do it.

    As a note, Jakob Ehn describes a much more elegant way to do it, including a custom build activity, the ability to fail a build based on threshold, and better parameterization.  I really like how flexible it is!  Below is my humble, quick & dirty way.

The steps below will add a sequence (containing individual activities) to the build process workflow that will run just prior to copying binaries to the drop folder.  (These steps are based on modifying DefaultBuildTemplate.xaml.)

1. Open the build process template you want to edit (it may be simpler to create a new template, based on DefaultBuildProcessTemplate.xaml, to work with).
2. Expand the activity “Run On Agent”.
3. Expand the activity “Try, Compile, Test and Associate Changesets and Work Items”.
  1. Click on “Variables”, find BuildDirectory, and set its scope to “Run On Agent”.
4. In the “Finally” area, expand “Revert Workspace and Copy Files to Drop Location”.
5. From the toolbox (Control Flow tab), drag a new Sequence onto the designer, just under/after “Revert Workspace for Shelveset Builds”. (Adding a sequence will allow you to better manage/visualize the activities related to code metrics generation.)
  1. In the Properties pane, set the DisplayName to “Run Code Metrics”.
6. From the toolbox (Team Foundation Build Activities), drag a WriteBuildMessage activity into the “Run Code Metrics” sequence.
  1. In the Properties pane:
    1. set DisplayName to Beginning Code Metrics
    2. set Importance to Microsoft.TeamFoundation.Build.Client.BuildMessageImportance.Normal (or adjust to .High if needed)
    3. set Message to "Beginning Code Metrics: " & BinariesDirectory
7. From the toolbox, drag an InvokeProcess activity into the sequence below the “Beginning Code Metrics” activity (this activity will actually execute code metrics generation).
  1. In the Properties pane:
    1. set DisplayName to Execute Code Metrics
    2. set FileName to """<path to Metrics.exe on the build machine>"""
    3. set Arguments to "/f:""" & BinariesDirectory & "\<name of assembly>"" /o:""" & BinariesDirectory & "\MetricsResults.xml""" (you can also omit the assembly name to run metrics against all assemblies found)
    4. set WorkingDirectory to BinariesDirectory
8. (optional) From the toolbox, drag another InvokeProcess activity below “Execute Code Metrics” (this activity will copy the XSD file to the binaries directory).
  1. In the Properties pane:
    1. set DisplayName to Copy Metrics XSD file
    2. set FileName to "xcopy"
    3. set Arguments to """<path to MetricsReport.xsd>"" """ & BinariesDirectory & """"
9. Save the XAML file and check it in to TFS.

The sequence you just added should look like this (boxed in red):

[Screenshot: workflow after adding the code metrics sequence]

    You basically have a sequence called “Run Code Metrics” which first spits out a message to notify the build that code metrics are beginning.

    Next, you actually execute the Metrics.exe executable via the InvokeProcess activity, which dumps the results (XML) file in the Binaries directory (this makes it simpler to eventually copy into the drop folder).

    The “Copy Metrics XSD file” activity is another InvokeProcess activity which brings along the appropriate XSD file with the metrics result file.  This is optional of course.

    After you run a build using this updated template, your drop folder should have something like this:

    Drop folder after running build with new template

Pay no attention to the actual binaries – it’s the presence of MetricsReport.xsd and MetricsResults.xml that matters.

    Pretty cool, but there’s one annoyance here!  The metrics results are still in XML, and aren’t nearly as readable as the results pane inside Visual Studio:

    MetricsResults.xml on top, Code Metrics Results window in VS on bottom

    Don’t get me wrong – this is a huge first step toward a fully-baked out-of-VS code metrics generator.  The actual report generation formatting will surely be improved in future iterations.

I decided to take one additional step and write a simple parser and report generator to take the XML results and turn them into something prettier, like HTML.

Before I dive into code, this is the part where I remind you that I’m not (nor have I ever been) a developer by trade, so the code in this blog is purely for functional example purposes.

    I created a relatively simple console application to read in a results XML file, parse it, and spit out a formatted HTML file (using a style sheet to give some control over formatting).

    I’m posting the full example code to this post, but below are the highlights:

    I first created some application settings to specify the thresholds for Low and Moderate metric values (anything above ModerateThreshold is considered “good”).

    Settings to specify Low and Moderate metric thresholds

    I created a class called MetricsParser, with properties to capture the results XML file path, the path to output the report, and a path to a CSS file to use for styling.

    To store individual line item results, I also created a struct called ResultEntry:

        struct ResultEntry
        {
            public string Scope { get; set; }
            public string Project { get; set; }
            public string Namespace { get; set; }
            public string Type { get; set; }
            public string Member { get; set; }
            public Dictionary<string, string> Metrics { get; set; }
        }

    I then added:

private List<ResultEntry> entries = new List<ResultEntry>();

    which captures each code metrics line item.

    If you look at the results XML file, you can see that in general the format cascades itself, capturing scope, project, namespace, type, then member.  Each level has its own metrics.  So I wrote a few methods which effectively recurse through all the elements in the XML file until a complete list of ResultEntry objects is built.

    private void ParseModule(XElement item)
    {
        string moduleName = item.Attribute("Name").Value;

        // Record the module (project) level metrics first.
        ResultEntry entry = new ResultEntry
        {
            Scope = "Project",
            Project = moduleName,
            Namespace = "",
            Type = "",
            Member = ""
        };
        List<XElement> metrics = (from el in item.Descendants("Metrics").First().Descendants("Metric")
                                  select el).ToList<XElement>();
        entry.Metrics = GetMetricsDictionary(metrics);
        entries.Add(entry);

        // Then recurse into each namespace in the module.
        List<XElement> namespaces = (from el in item.Descendants("Namespace")
                                     select el).ToList<XElement>();
        foreach (XElement ns in namespaces)
        {
            ParseNamespace(ns, moduleName);
        }
    }
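
For completeness, the GetMetricsDictionary helper referenced above just flattens the Metric elements into name/value pairs.  Assuming the power tool's Name/Value attribute layout, it can be as simple as:

    private Dictionary<string, string> GetMetricsDictionary(List<XElement> metrics)
    {
        // Flatten <Metric Name="..." Value="..."/> elements into a simple lookup.
        Dictionary<string, string> result = new Dictionary<string, string>();
        foreach (XElement metric in metrics)
        {
            result[metric.Attribute("Name").Value] = metric.Attribute("Value").Value;
        }
        return result;
    }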

    Bada-bing, now we have all our results parsed.  Next, to dump them to an HTML file.

I simply used HtmlTextWriter to build the HTML, then write it to a file.  If a valid CSS file was provided, the CSS was embedded directly into the HTML header:

    #region Include CSS if available

    string cssText = GetCssContent(CssFile);
    if (cssText != string.Empty)
    {
        writer.RenderBeginTag(HtmlTextWriterTag.Style);
        writer.Write(cssText);
        writer.RenderEndTag();
    }

    #endregion

    After that, I looped through my ResultEntry objects, inserting them into an HTML table, applying CSS along the way.  At the end, the HTML report is saved to disk, ideally in the build’s binaries folder.  This then allows the report to be copied along with the binaries to the TFS drop location.
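
The row-writing loop itself is nothing fancy.  Here's roughly what mine looks like, where the CSS class applied to the Maintainability Index cell comes from the Low/Moderate threshold settings shown earlier (the setting names and CSS class names are my own placeholders):

    private void WriteEntryRow(HtmlTextWriter writer, ResultEntry entry)
    {
        writer.RenderBeginTag(HtmlTextWriterTag.Tr);

        // Scope/project/namespace/type/member columns.
        foreach (string cell in new[] { entry.Scope, entry.Project, entry.Namespace, entry.Type, entry.Member })
        {
            writer.RenderBeginTag(HtmlTextWriterTag.Td);
            writer.Write(cell);
            writer.RenderEndTag();
        }

        // One column per metric; color-code the Maintainability Index by threshold.
        foreach (KeyValuePair<string, string> metric in entry.Metrics)
        {
            int value;
            string cssClass = "good";
            if (metric.Key == "MaintainabilityIndex" && int.TryParse(metric.Value, out value))
            {
                if (value < Properties.Settings.Default.LowThreshold) cssClass = "low";
                else if (value < Properties.Settings.Default.ModerateThreshold) cssClass = "moderate";
            }
            writer.AddAttribute(HtmlTextWriterAttribute.Class, cssClass);
            writer.RenderBeginTag(HtmlTextWriterTag.Td);
            writer.Write(metric.Value);
            writer.RenderEndTag();
        }

        writer.RenderEndTag();  // closes the <tr>
    }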

    Code Metrics Results HTML Report

    You’ll notice that this layout looks much like the code metrics in Visual Studio if exported to Excel.

    So again, not the most sophisticated solution, but one that a pseudo-coder like me could figure out.  You can expand on this and build all of this into a custom build activity which would be much more portable.

The full code for MetricsParser is attached to this post.

Again, I recommend looking at Jakob’s solution as well.  He puts a much more analytical spin on build-driven code metrics by allowing you to specify your own thresholds to help pass or fail a build.  My solution is all about getting a pretty picture.

    Happy developing!

  • Steve Lange @ Work

    Visual Studio 2012 Launch Roadshow!

    • 8 Comments


    If you’re not heading to Seattle for the Visual Studio 2012 Launch Event on September 12th, don’t worry: We’re coming to you!

Be our guest and attend in person to experience all of the incredible new capabilities of Visual Studio 2012 firsthand.

    UPDATE

    For those of you who have attended the events so far and are looking for the slides/content, look no further! Everything is here:  http://aka.ms/VS2012Roadshow

    I’ll be there, will you?

    Discover how Visual Studio 2012 allows you to collaborate better and be more agile. See how it helps you turn big ideas into more compelling apps. Experience how it integrates best practices that accelerate development and deployment.  You’ll enjoy several sessions which will take Visual Studio, Team Foundation Server, and Test Professional through their paces to show off what’s possible with this incredible release!

    Register today for a city near you (dates and locations listed below), we hope to see you there!

    Cities & Dates

9/18 Denver, CO
9/25 Lehi, UT
10/2 Tempe, AZ
10/9 San Diego, CA
10/10 Irvine, CA
10/16 Mountain View, CA
10/17 San Francisco, CA
10/25 Portland, OR
10/30 Boise, ID

Registration/check-in begins at 8:30 AM.  The event runs from 9:00 AM to 4:00 PM.

  • Steve Lange @ Work

    One-Click Check-in on Southwest Airlines with your Windows Mobile Phone

    • 6 Comments

NOTE:  About a year or two after this posting, Southwest updated their mobile site, so this no longer works.

    Okay, I just had to share this nugget of a time-saver (If you know about it already, then this won't seem very original..).  I got this tip from a colleague of mine, so I'm not taking credit here, but rather just passing it along.

If you haven't flown Southwest Airlines before, it's open seating: first-come, first-served, based upon passengers' order of check-in.  That means that if you check in first, you board first. 

First 60 to check in: A boarding pass (numbered 1-60)
Second 60 to check in: B boarding pass (numbered 1-60)
Everyone else: C boarding pass (numbered 1-60)

You can check in 24 hours before departure.  So what do you do if you want an "A" boarding pass but aren't at your computer to check in online?

Southwest Airlines has a mobile website which allows you to check in via your phone and then print your boarding pass at the airport.  So that saves you some time.  You go to the site with your Windows Mobile phone, enter your first name, last name, and confirmation number, and you're all set.

    Check-in page on SWA's mobile site

    Fill in your information, and (assuming you're within the 24-hour check-in window) you'll arrive here:

[screenshot]

    Click "Check In All", and you'll be checked into your flight:

[screenshot]

    Then just either print your boarding pass later on your printer, or do it at a kiosk at the airport.

    But wait, there's more..

    But what if you don't have the confirmation handy, say, while you're driving in your car? 

    You can link to the check-in page's submission directly by embedding your name and confirmation number in the below URL:

    http://mobile.southwest.com/cgi-bin/wireless?action=checkin&first=FIRSTNAME&last=LASTNAME&pnr=CONFCODE

    Following the link directly will take you to the "Checkin Availability" page where all you need to do is click the "Check In All"  button.

    What I do is save the "template" URL as part of my Outlook Contact entry for Southwest.  When I book a flight and add the flight to my calendar, I put the completed URL in the calendar entry, then set a 1 day (24 hour) reminder for the flight. 

    When I get the reminder, I simply open the calendar entry, click the link, and check-in.  It takes less than 30 seconds.

    Another way to store the completed URL is to create an Outlook task ("Check in for tomorrow's flight") with the URL, with a reminder or due date set for 24 hours before the flight.

    And since my Windows Mobile device automatically syncs with Exchange, my calendar and task entries, including their reminders, are readily accessible from my phone.

Lastly, I also use TripIt to organize and share my travel itineraries with family and friends.  I can add the direct check-in URL to my itinerary and access it on my mobile phone via TripIt's mobile site.

  • Steve Lange @ Work

    A Mock Business Plan for the New Microsoft Stores

    • 5 Comments

    “Coming Soon, to a Mall Near You”

So if you haven’t checked your favorite news site already, Microsoft has announced plans to open retail stores (for real – we’ve even hired a VP to do it).

    Initial, knee-jerk thoughts vary greatly, from the “what are they thinking” to “hey, that could work”..

    I’m mixed on this one.  The Microsoftie in me thinks this is a bold but needed step to start correcting the negative perception of Microsoft products in the eyes of the consumer (Vista sucks, right?  Microsoft’s evil, right?).  The amateur economist in me can’t help but be wary of venturing into retail when that industry is hurting so badly.

[Image: possible signage]

Since the state of the economy has been discussed to an almost numbing degree, let’s look at the possible positive (and humorous, of course) scenarios surrounding the “Microsoft Store”.

First of all, what will it be called?  Should we follow Apple’s lead and just call it the “Microsoft Store”?  (Actually, if we’re really following Apple we wouldn’t have a name, just the Windows or Vista logo.)  Here are a few thoughts:

    • Microsoft Store
    • MSStore
    • Windows
    • The Mojave Store
    • Hotfix

    The nay-sayers are wondering what the heck will actually be sold in the store.  It’s not like we can “sell” Windows Live, SkyDrive, or Photosynth.  Well, it sounds like the store will be stocked with new computers (Dell, HP, etc.) loaded with Vista (actually, probably Windows 7 by the time the stores are fully operational), software packages (i.e. Office), Xboxes and Zunes.  All the typical stuff, right?  Ahh, not so fast.  A real hidden bonus for this retail idea is the opportunity to showcase a lot of physical products (i.e. hardware, what you can touch) that the typical consumer may not know about.  Let’s look at some of the possibilities (including some obvious ones):

    • Xbox:  Duh!  Have plenty of Xboxes to sell, and have several set up, networked together and online.  Also showcase how users can watch Netflix movies, and connect to Media Center PC’s.
• Zune:  Another “duh”, right?  The Zune, right out of the gate, unfortunately had to bear a “this product is crappy” moniker simply because of the Microsoft logo on it.  If you haven’t actually played with one before, here’s your opportunity!
    • Gaming Products:  Huh?  That’s right.  Did you know that Microsoft cranks out some killer accessories to boost your gaming experience?  Like the Sidewinder mouse & keyboard, and Reclusa keyboard.
    • Communications Hardware:  There are some really great available webcams and headsets.  I have a LifeCam NX-6000 for my laptop and it works terrific given its form factor.
    • Mice & Keyboards:  Beyond just the standard ones, try the wireless presenter mouse or Explorer Mini-Mouse.
• Cell Phones:  Unless you’re a corporate guy/gal, you may not really know that Microsoft provides an OS for smartphones/PDA’s called Windows Mobile.  Why not use a storefront to showcase some of the cooler phones running Windows Mobile?
    • Surface:  Sure, no one will really be able to actually buy one, but putting a Surface machine or two in a store will bring people in the door, GUARANTEED.  Encourage folks to put their phones on it and display pictures, view YouTube videos, play games, etc.  Put it smack-dab in the middle of the store.
    • Mediaroom:  Microsoft Mediaroom isn’t a light investment either, but it provides a “whoah, that’s cool” factor which will bring people in the door (“butts in seats”, as we presenters call it).

Now, what should the PC’s in the store have on them?  Okay, okay – BESIDES Windows and Office.  Here’s a short list of software & services that should be readily available for any shopper who sidles up to a machine, including what the “Microsoft Guru” should be ready to show:

Product/Service: What to Demonstrate

• Live Products (Live Writer, Live Photo Gallery, Live Messenger, Live Mesh, etc.):  Have some sample LiveID’s already set up so shoppers can browse the various Live services, such as Spaces, SkyDrive, and Photos.  Show how the different services work together (example: use Live Writer to post to a blog on Spaces, pulling pictures from Live Photo Gallery or even Facebook).  Demonstrate how you can use Live Mesh to easily push photos from your PC in Colorado to Grandma in California.
• Photosynth:  Seriously, this is a killer app if you like to take pictures.  Show it off with existing collections, or take a battery of pictures of the store and watch it work.
• Windows Home Server:  Why not?  Show how WHS can automatically back up all the computers in your house, and restore them from crashes in just a few clicks.  On the more fun side, demonstrate how to serve up websites & photo albums.
• Media Center:  Show how you can record TV right to your PC, and access/broadcast those shows in other areas of your house.
• AutoCollage:  Take eight pictures of the store, and show how easily you can drop them into a collage.
• Songsmith:  Create a song on the fly.

    There are several more, but this is a good start, I think.

    Take a page from the Apple folks and surround all the set up PC’s with complementary products, such as Windows Mobile phones, Zunes, digital picture frames, etc.

    Now of course, you’ll want to stock the shelves with all the software we offer, including OS’s, Office, Streets & Trips, OneCare, etc.

Lastly, there should be an “ask the expert” station where you can discuss any Microsoft-related product issue with (presumably) an expert.  There shouldn’t just be sales-oriented people in the store, but rather technical support types who can put a smile on a customer’s face.  The store employees will also need a thick skin, as there will undoubtedly be anti-Microsoft folks (justified or not) walking in for the sake of whining & moaning.  (As a former tech support guy, I assure you they’re out there.)

These “gurus” should hold regularly-scheduled workshops:  “Get the most out of your photos”, “How to back up my PC”, “Tell me about Internet Explorer”.. those kinds of things.

    So we’ve covered signage, inventory, and personnel.  What about store layout?  I have no idea what this will actually look like, but here’s a rough thought:

    Okay, I got carried away.. possibility for MS Store Layout? (by Visio)

    The key to getting people in the store will be to move the rows of stocked software (boring to look at) to the back and bring the cool stuff to the front, i.e. Xbox and Surface.  If a shopper walking by glances inside and sees some people on a couch having a blast playing video games, and a small crowd of people going nuts on a Surface, that person will have a hard time not venturing inside to check it out.

    Okay, so I’ve gone a little overboard here.  I had a little time on my hands and found myself getting surprisingly excited by this concept.  To start changing perception, Microsoft needs to be tangible and approachable.  This could be a great start!

  • Steve Lange @ Work

    I told you it was coming! “Team System Big Event”

    • 5 Comments

I mentioned this before, but couldn’t give away the details until now..  We’re covering all aspects of Application Lifecycle Management, and I hope to see you there.  We plan to pack each venue, so tell your friends..  There will be presenters from both Microsoft and the development community, including Team System MVP’s, so you’re bound to be entertained and learn something in the process!

    Be sure to register below, and download the attached PDF to distribute as you see fit!

    How do you take an idea from conception to completion? How can you truly do more with less?

Please join us for this FREE, unique, invitation-only event to discover how both products and processes help your organization succeed in today’s environment. We will explore how Team System assists teams across the board to be successful in today’s tough times. This “breakthrough” event will not only provide you with best practices around development and testing, but will also demonstrate key capabilities of both Visual Studio Team System 2008 and the upcoming 2010 release. It’s a day that promises to have something for everyone!

    Team System Big Event

     

    SESSIONS

    • Test Driven Development: Improving .NET Application Performance & Scalability
      • This session will demonstrate how to leverage Test Driven Development in Team System. We’ll highlight both writing unit tests up front as well as creating test stubs for existing code.
    • "It Works on My Machine!" Closing the Loop Between Development & Testing
      • In this session, we will examine the traditional barriers between the developer and tester; and how Team System can help remove those walls.
    • Treating Databases as First-Class Citizens in Development
      • Team System Database Edition elevates database development to the same level as code development. See how Database Edition enables database change management, automation, comparison, and deployment.
    • Architecture without Big Design Up Front
      • Microsoft Visual Studio Team System 2010 Architecture Edition introduces new UML designers: use cases, activity diagrams, and sequence diagrams that can visualize existing code; layering to enforce dependency rules; and physical designers to visualize, analyze, and refactor your software. See how VSTS extends UML logical views into physical views of your code. Learn how to create relationships from these views to work items and project metrics, how to extend these designers, and how to programmatically transform models into patterns for other domains and disciplines.
    • Development Best Practices & How Microsoft Helps
      • Sometimes development teams get too bogged down with the details. Take a deep breath, step back, and re-acquaint yourself with a review of current development best practice trends, including continuous integration, automation, and requirements analysis; and see how Microsoft tools map to those practices.
    • "Bang for Your Buck" Getting the Most out of Team Foundation Server
      • Today’s IT budgets are forcing teams to do as much as they can with as little as possible. Why not leverage Team Foundation Server to its full potential? In this session we’ll highlight some capabilities of TFS that you may or may not already know about to help you maximize productivity.

    Welcome: 8:00 AM

    Seminar: 8:30 AM - 5:00 PM

    REGISTRATION

    Denver, CO April 22, 2009 Click here to register with invitation code: DD1A7F
    Mountain View, CA April 28, 2009 Click here to register with invitation code: 80D459
    Irvine, CA April 30, 2009 Click here to register with invitation code: A86389
    Portland, OR May 5, 2009 Click here to register with invitation code: 2DC0A9
    Phoenix, AZ May 7, 2009 Click here to register with invitation code: 90BC47

    To Register by Phone – Call 1.877.MSEVENT (1.877.673.8368) with invitation code.

    PS - This event is Free!

  • Steve Lange @ Work

    Creating a Data-Driven Web Test against a Web Service

    • 4 Comments

    Okay, I'm sure some of you will tell me, "Yeah, I know this already!", but I've been asked this several times.  So in addition to pointing you to the MSDN documentation, I thought I'd give my own example.

    The more mainstream recommendation for testing a web service is to use a unit test.  Code up the unit test, add a reference to the service, call the web service, and assert the results.  You can then take the unit test and run it under load via a load test.
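    For context, here's a minimal sketch of that unit test approach against the sample method we'll create below (the proxy type "localhost.Service1" is just whatever Visual Studio generates when you add the web reference, so your name may differ):

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class HelloServiceTests
    {
        [TestMethod]
        public void HelloToPerson_ReturnsGreeting()
        {
            // Proxy class generated by adding a web reference to the service;
            // the "localhost" namespace is an assumption from the default naming.
            localhost.Service1 proxy = new localhost.Service1();

            // Call the web method and assert on the returned greeting.
            string result = proxy.HelloToPerson("Steve");
            Assert.AreEqual("Hello, Steve", result);
        }
    }

    You can then point a load test at this unit test to run it under stress.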

    However, what if you want a more visual test?  Well, you can use a web test to record interaction with a web service.  This is actually documented in the MSDN Library here, but below is my simple example.

    Here's what we're going to do:

    1. Create the web service
    2. Create the web test
    3. Execute the web test (to make sure it works)
    4. Create the test data data source
    5. Bind it to the web test
    6. Run the test again

    First, we create a web service.  In my example, it's the sample "Hello, World" service and I've created one additional method called "HelloToPerson":

    <WebMethod()> _
        Public Function HelloToPerson(ByVal person As String) As String
            Return "Hello, " & person
        End Function

    As you can see, the method will simply say hello to the passed person's name.
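    For the C# crowd, the equivalent method is simply:

    [WebMethod]
    public string HelloToPerson(string person)
    {
        // Return a greeting for the supplied name.
        return "Hello, " + person;
    }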

    Now, let's create a web test to exercise this web method (Test->New Test, select Web Test), creating a test project in the process if you don't already have one in your solution.  I named my web test WebServiceTest.webtest.

    As soon as Internet Explorer opens with the web test recorder in the left pane, click the "Stop" button in the recorder.  This will return you to Visual Studio's web test editor with an empty test.

    Web test with no requests

    Now launch Internet Explorer, go to your web service (.asmx), and select the method to test (again, in this example it's "HelloToPerson").  Examine the SOAP 1.1 message.  In my example, the message looks like this:

    POST /Service1.asmx HTTP/1.1
    Host: localhost
    Content-Type: text/xml; charset=utf-8
    Content-Length: length
    SOAPAction: http://tempuri.org/HelloToPerson

    <?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
       <soap:Body>
          <HelloToPerson xmlns="http://tempuri.org/">
             <person>string</person>
          </HelloToPerson>
       </soap:Body>
    </soap:Envelope>

    We'll need to refer back to this information shortly - specifically, the SOAPAction header value and the XML portion of the message (everything from <?xml on down).

    Right-click on the root node (WebServiceTest in my example) and select "Add Web Service Request."

    Add a Web service Request

    In the URL property of the new request, enter the URL of the web service (by default this value is populated with http://localhost/).

    Specifying the correct URL for the web service

    Now, let's make sure we use a SOAP request.  Right-click the request and select "Add Header".

    Adding a header to the request

    Enter "SOAPAction" in the name field.  In the value field, enter the value of SOAPAction in the message from your web service.  For my example, it's "http://tempuri.org/HelloToPerson" (color-coded in blue)

    Adding the SOAPAction header to the request

    Next, select the String Body node:

    • In the Content Type property, specify "text/xml"
    • In the String Body property, copy/paste the XML portion of the SOAP 1.1 message of your web service method (everything from <?xml on down).  At this time, be sure to replace any parameters with actual values you want to test (in this example, my parameter is "person", so I enter "Steve" instead of "string").

    Entering the XML portion of the SOAP message, specifying a real value for the 'person' parameter

    The properties dialog for the String Body node

    Now, right-click on the web service request and select "Add URL QueryString Parameter."

    Adding a URL QueryString Parameter

    In the QueryString Parameter node, specify "op" as the name and the name of your method as the value.  In this example, it's "HelloToPerson".

    Viewing the added QueryString Parameter

    Finally, let's run the test and see the results!

    Viewing the test results

    As you can see, the test passed, and the "Web Browser" panel shows the returned SOAP envelope with the correct results.

    Now for some more fun.  Let's make this a data-driven test so we can pass different values to the web method.

    We'll create a simple data source so that we can pass several names to this method (very helpful so we don't have to record multiple tests against the same method).  You can use a database, XML file, or CSV (text) file as a data source.  In my example, I'm going to use an XML file:

    <?xml version="1.0" encoding="utf-8" ?>
    <TestData>
      <Name>Steve</Name>
      <Name>Mickey</Name>
      <Name>Devin</Name>
      <Name>Joe</Name>
      <Name>Eric</Name>
    </TestData>

    Save this file as "Names.xml" in your test project. 

    To make this data source available to the web test, right click on the web test and select "Add Data Source" (you can also click the corresponding toolbar button).

    Adding a data source

    Provide a name for the data source (for me, it's "Names_DataSource") and select XML file for the data source type.

    Selecting the data source type

    Next, provide the path to the XML file, then select the data table containing your test data.  You'll know if you select it correctly since you'll get a preview of your data.

    Selecting the XML file

    Check the boxes next to the data tables you want to be available for your tests.  In my example, I only have one ("Names").

    image 

    Click Finish (if you're asked to include any files in your test project, just click yes to the prompts).

    Now your XML data is available to bind to your web test.

    Data source is now available to your test.

    Finally, let's put this data source to work.  We want to bind the name values in the data source to the "person" parameter for my web service call.  If you recall, that value is specified in the String Body property.  So we inject the following syntax (using the values appropriate for this example) into the String Body property:

    {{DataSourceName.TableName.ColumnName}}, so for my example, I use {{Names_DataSource.Name.Name_Text}}

     image
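    Written out in full, the String Body is the same envelope we pasted earlier, with the literal name swapped for the binding expression:

    <?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
       <soap:Body>
          <HelloToPerson xmlns="http://tempuri.org/">
             <person>{{Names_DataSource.Name.Name_Text}}</person>
          </HelloToPerson>
       </soap:Body>
    </soap:Envelope>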

    Now we just need to tell the web test to execute once for each value in my data source.  We can do this two ways:

    If you'll mostly run this test in a single pass (not iterating through the data source), you can simply run your test and use "Edit Run Settings" to change your iteration settings on a one-off basis.

    Editing test run settings

    Again, note that doing it this way affects only the next test run, and the setting will not be saved.

    If you want to specify that you want to use the data source by default, you need to open the LocalTestRun.testrunconfig file in your Solution Items folder.

     Finding the .testrunconfig file

    Opening the .testrunconfig file will give you the below dialog.  Select Web Test on the left, then click the radio button to specify "One run per data source row."  Click Apply then Close.

    image

    Now for the beautiful part.  Go back to your web test and run it again.  This time instead of a single run, it will automatically execute a test run for each row in your data source. 

    Viewing test results with multiple runs

    Notice results for each run, including pass/fail information, and the resulting SOAP envelope with the appropriate method result in each (I've highlighted the second run to show that "Mickey" was used in this run).

    Happy Testing! 

  • Steve Lange @ Work

    VS 2012 ALM Tidbits: The Feedback Client’s Two Modes

    • 4 Comments

    As the Visual Studio family of products (Visual Studio, TFS, Test Professional) nears its 2012 release, I thought I’d bring some short hits – tidbits, if you will – to my blog. Some of these are pretty obvious (well-documented, or much-discussed), but some may be less obvious than you’d think. Either way, it’s always good to make sure the word is getting out there. Hope you enjoy!

    The Feedback Client’s Two Modes

    One of the “new” new features of TFS 2012 is the addition of the Microsoft Feedback Client (download) for collecting feedback from stakeholders, end users, etc. This tool integrates with TFS to provide a mechanism to engage those stakeholders and more seamlessly include their insights in the lifecycle.

    It’s important to know that this new tool provides a mechanism for collecting feedback in two distinct manners, voluntary and requested. The rest of this post will walk through each of these “modes”.

    Regardless of the mode used to provide feedback, this feedback gets stored in TFS as a work item (of type Feedback Response) which then gets all the benefits of being a work item (auditing, assignment, linking, reporting, etc.). As you can imagine, this is a much more effective way of tracking feedback than email, lists, and forms. We’ll talk about that (plus licensing) toward the end of this post.

    Voluntary Feedback Mode

    This mode is used naturally by a stakeholder (I’m using the term “stakeholder” to mean anyone who may want a say in how a product evolves) to provide unsolicited feedback about a product or application. This means that if a stakeholder is using an application and thinks of an idea to improve it (or maybe even wants to report a problem), they can fire up the Feedback Client and submit notes, annotated screenshots, and video or audio recordings.

    Voluntary feedback

    In this screenshot, I provide “voluntary” feedback that I should be more prominently featured on Bing. Yes, I’m that way.. ;)

    This is an incredibly light and easy way for a stakeholder to feel like they have a say/vote in the direction of an application.

    Requested Feedback Mode

    As the name implies, this kind of feedback is given in response to a request for feedback from another user. Requesting feedback begins in Team Web Access on a project’s home page, by clicking on the “Request feedback” link under Activities. 

    Request feedback

    The requestor fills out the Request Feedback form:

    Request feedback form

    This sends the following email to all included stakeholders (yes, you can send a single request to multiple recipients, as well as request multiple items of feedback in a single request):

    Feedback request email

    When the stakeholder clicks the link in the email, the Feedback Client will launch and walk the stakeholder through the process.

    Requested feedback in Feedback Client

    Once the feedback is submitted, everything shoots back into TFS and is automatically linked to the Feedback Request work item.

    Response linked to Request

    Looking at the feedback response in my example:

    Feedback Response work item

    Okay, Now What?

    Now that you have feedback in TFS, what do you do with it?

    Several things, actually.  First, leverage the linking capabilities of work items to associate feedback with the appropriate task, backlog item, bug, or whatever. In my example, I linked my feedback request to a PBI:

    image

    This provides an even more cohesive story for “covering” the PBI.  Now not only can you see from a PBI all the tasks, storyboards, bugs, etc. related to it, but you have a way to track “sign-off”, or at least unofficial support from stakeholders, about the “doneness” of the backlog item.

    Something else you may want to do is create a few shared queries to help you view and track feedback.

    Feedback queries

    In this example, I created 4 queries to help me manage feedback (again, just an example):

    • All Feedback – Flat list showing all feedback responses (voluntary or requested).
    • Feedback Requests & Responses – Direct links query showing all feedback request and any associated responses.
    • Feedback without PBI – Flat list showing all feedback requests and responses that are not associated with a Product Backlog Item.
    • Unsolicited Feedback – Flat list showing all voluntary feedback.
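    For reference, the “All Feedback” query above boils down to a simple flat-list WIQL query on the Feedback Response work item type (a sketch using core System fields; adjust the columns to taste):

    SELECT [System.Id], [System.Title], [System.CreatedBy], [System.CreatedDate]
    FROM WorkItems
    WHERE [System.TeamProject] = @project
      AND [System.WorkItemType] = 'Feedback Response'
    ORDER BY [System.CreatedDate] DESC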

    Lastly, if stakeholder feedback is important to you, add one of your feedback queries as a Team Favorite, which will make it show up on your team’s home page.

    Team Favorites

    Licensing

    • To provide feedback (i.e. use the Microsoft Feedback Client), there is no licensing requirement at all. The Feedback Client tool is free to download, and there is no TFS CAL requirement to use it.
    • To request feedback (i.e. solicit feedback from others), you need to be part of one of the following licensing groups: Visual Studio Premium, Visual Studio Ultimate, or Visual Studio Test Professional.

     

    There’s plenty of documentation on stakeholder feedback, but something that can fall through the cracks is the fact that there are indeed two modes of using this capability.

    Hope this helps!

  • Steve Lange @ Work

    VS/TFS 2012 Tidbits: Merging Changes by Work Item

    • 4 Comments

    As the Visual Studio family of products (Visual Studio, TFS, Test Professional) nears its 2012 release, I thought I’d bring some short hits – tidbits, if you will – to my blog. Some of these are pretty obvious (well-documented, or much-discussed), but some may be less obvious than you’d think. Either way, it’s always good to make sure the word is getting out there. Hope you enjoy!

    Merging Changes by Work Item

    This is something that existed in VS 2010, but it wasn’t talked about as much.  While it’s pretty straightforward to track changes merged across branches by changeset, sometimes it’s even more effective to track merges by work item (i.e. show me where changes associated with a work item have been merged/pushed to other branches).

    Let’s catch up. Consider the relatively simple branch hierarchy below:

    image

    A work item has been assigned to Julia, Task #80.

    image

    Julia makes some code changes, and checks in against (linking to) the work item (Task #80).

    She checks in two separate changes, so the task ends up linked to two discrete changesets.

    Now, it’s easy to go ahead and track an individual changeset by selecting the option from the History window.

    image

    That’s all well and good, but if I didn’t know the exact changeset ID (#17), or if there were multiple changesets associated with the task, this tracking process becomes less effective.

    What Julia can do is right-click on the work item and select “Track Work Item”.    (Note that this option will be disabled if there are no changesets linked to the work item.)

    image

    She can also click the “Track Work Item” button at the top of the work item form:

    image

    I get a much clearer picture now of all the work and where it’s been applied, and the “Tracking” visualization will now include all changesets (in my case, 2 changesets) in the window.

    Now I know exactly what changes to merge.  I merge them, and now I can see that the entire work item has been merged to Main from Dev (i.e. both changesets were merged).

    image

    And just as effectively, I can see these changes in the Timeline Tracking view:

    image

    So that’s it! Tracking by work item is pretty easy to do, and paints a much clearer picture of how changes tied to a work item can be, or have been, applied across branches.

    Again, I know this isn’t exactly a new feature, but there are a lot of people out there who are looking for ways to “merge by work item” and aren’t aware of this feature.
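    One side note: if you live in the command line, the same selective merge can be driven with tf.exe once the tracking view has told you which changesets a work item covers.  A sketch, using changeset 17 from the example above (the branch paths here are placeholders for your own):

    tf merge /recursive /version:C17~C17 $/MyProject/Dev $/MyProject/Main

    Repeat (or widen the version range) for each changeset linked to the work item, then check in the merge.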

  • Steve Lange @ Work

    Ordered Tests in TFS Build

    • 4 Comments

    In an earlier article I discussed how to use an Ordered Test to control the execution order of Coded UI Tests (the same can be applied to other test types as well).  I received a few follow-up questions about how to do this in TFS Build so tests run in a particular order as part of a build.

    Here’s one way that’s remarkably easy.

    In my example, I have a project called JustTesting, which contains just a test project with 3 unit tests (which will always pass, BTW).

    image

    I put those tests into an ordered test:

    image

    In Solution Items, I open up my JustTesting.vsmdi file, create a new test list (called Ordered Tests), and add my ordered test to it.

    image

    Once that’s done, I check everything into TFS (my Team Project’s name is “Sample CMMI”).

    Next, I set up a build definition (in Team Explorer, right-click Builds, and select “New Build Definition”).  Set whatever options you want (name, trigger, workspace, build defaults) but stop at “Process”.

    In the section named “2. Basic”, you’ll see that by default the Automated Tests field is set to (something like): “Run tests in assemblies matching **\*test*.dll using settings from $/Sample CMMI/JustTesting/Local.testsettings”. 

    image

    Click on the ellipsis on the right of that to open the Automated Tests dialog:

    image

    Remove the entry you see (or leave it if you wish to include that test definition), and then click “Add”.

    In the Add/Edit Test dialog, select the option for “Test metadata file (.vsmdi)”.  Use the browse button to find and select your desired .vsmdi file.  In my example, it’s JustTesting.vsmdi.

    Uncheck “Run all tests in this VSMDI file”, then check the box next to your test list containing the ordered test.  In my example, the test list is called “Ordered Tests”.  Your dialog should look something like this:

    image

    Click OK, and your Automated Tests dialog should look like this:

    image

    Click OK again, then save your build definition.

    Queue a new build using this definition.  Once complete, look at the build report to see your test results.

    image

    image

    It’s a few steps, but nothing ridiculous.  And I didn’t have to hack any XML files or do any custom coding.

    Hope this helps!

  • Steve Lange @ Work

    Thank you, Denver! Goodnight!

    • 4 Comments

    Thanks to the roughly 100 of you who attended the Denver VS.Net User Group.  While I’m sure you all showed up primarily for the free food and door prizes, I appreciate the level of interaction during my presentation last night (“Team Foundation Server: Today & Tomorrow”). 

    As promised, here is the presentation I used last night (posted on SkyDrive):

    Please send me feedback or any other questions you might have!

  • Steve Lange @ Work

    Data-Driven Tests in Team System Using Excel as the Data Source

    • 3 Comments

    There is some documentation to explain this already, but below is a step-by-step that shows how to use an Excel spreadsheet as a Data Source for both unit and web tests.

    First, let’s set the stage.  I’m going to use a solution containing a class library and a web site. 

    image

    The class library has a single class with a single method that simply returns a “hello”-type greeting. 

    namespace SimpleLibrary
    {
        public class Class1
        {
            public string GetGreeting(string name)
            {
                return "Hello, " + name;
            }
        }
    }
    For my VB friends out there:
    Namespace SimpleLibrary
        Public Class Class1
            Public Function GetGreeting(ByVal name As String) As String
                Return "Hello, " & name
            End Function
        End Class
    End Namespace

    Unit Testing

    So now I’m going to create a unit test to exercise the “GetGreeting” method.  (As always, tests go into a Test project.  I’m calling mine “TestStuff”.)

    image

    Here’s my straightforward unit test:

    [TestMethod()]
    public void GetGreetingTest()
    {
       Class1 target = new Class1();
       string name = "Steve";
       string expected = "Hello, " + name;
       string actual;
       actual = target.GetGreeting(name);
       Assert.AreEqual(expected, actual);
    }

    In VB:

    <TestMethod()> _
    Public Sub GetGreetingTest()
       Dim target As Class1 = New Class1
       Dim name As String = "Steve"
       Dim expected As String = "Hello, " & name
       Dim actual As String
       actual = target.GetGreeting(name)
       Assert.AreEqual(expected, actual)
    End Sub

    I’ll run it once to make sure it builds, runs, and passes:

    image

    I have an Excel file whose Sheet1 contains a simple table of test data – a “FirstName” column header followed by a handful of names:

    image

    Nothing fancy, but I reserve the right to over-simplify for demo purposes.  :)

    To create a data-driven unit test that uses this Excel spreadsheet, I basically follow the steps you’d find on MSDN, with the main difference being in how I wire up my data source.

    I click on the ellipsis in the Data Connection String property for my unit test.

    image

    Follow these steps to set up the Excel spreadsheet as a test data source for a unit test.

    • In the New Test Data Source Wizard dialog, select “Database”. 
    • Click “New Connection”.
    • In the “Choose Data Source” dialog, select “Microsoft ODBC Data Source” and click “Continue”.  (For additional details about connection strings & data sources, check this out.)
      image
    • In “Connection Properties”, select the “Use connection string” radio button, then click “Build”.
    • Choose whether you want to use a File Data Source or a Machine Data Source.  For this post, I’m using a Machine Data Source.
    • Select the “Machine Data Source” tab, select “Excel Files”, and click OK.
    • Browse to and select your Excel file.
      image
    • Click “Test Connection” to make sure everything’s golden.
      image
    • Click OK to close “Connection Properties”.
    • Click Next
    • You should see the worksheets listed in the available tables for this data source.
      image
    • In my example, I’ll select “Sheet1$”
    • Click “Finish”
    • You should get a message asking if you want to copy your data file into the project and add as a deployment item.  Click Yes.
      image
    • You should now see the appropriate values in Data Connection String and Data Table Name properties, as well as your Excel file listed as a deployment item:
      image 
    • Now I return to my unit test, note that it’s properly decorated, and make a change to the “name” variable assignment to reference my data source (accessible via TestContext):
      [DataSource("System.Data.Odbc", "Dsn=Excel Files; 
      dbq=|DataDirectory|\\ExcelTestData.xlsx;defaultdir=C:\\TestData; 
      driverid=1046;maxbuffersize=2048;pagetimeout=5", "Sheet1$", 
      DataAccessMethod.Sequential), 
      DeploymentItem("TestStuff\\ExcelTestData.xlsx"), TestMethod()]
              public void GetGreetingTest()
              {
                  Class1 target = new Class1();
                  string name = TestContext.DataRow["FirstName"].ToString();
                  string expected = "Hello, " + name;
                  string actual;
                  actual = target.GetGreeting(name);
                  Assert.AreEqual(expected, actual);
              }
    Again, in VB:
    <DataSource("System.Data.Odbc", "Dsn=Excel Files;
    dbq=|DataDirectory|\ExcelTestData.xlsx;defaultdir=C:\TestData;
    driverid=1046;maxbuffersize=2048;pagetimeout=5", "Sheet1$", 
    DataAccessMethod.Sequential)> 
    <DeploymentItem("TestStuff\ExcelTestData.xlsx")> <TestMethod()> _
        Public Sub GetGreetingTest()
            Dim target As Class1 = New Class1
            Dim name As String = TestContext.DataRow("FirstName").ToString()
            Dim expected As String = "Hello, " + name
            Dim actual As String
            actual = target.GetGreeting(name)
            Assert.AreEqual(expected, actual)
        End Sub
    • Now, running the unit test shows me that it ran a pass for each row in my sheet:
      image

    Yippee!

    Web Testing

    You can achieve the same thing with a web test.  So I’m going to first create a simple web test that records me navigating to the website (at Default.aspx), entering a name in the text box, clicking submit, and seeing the results.  After recording, it looks like this:

    image

    See “TxtName=Steve”?  The value is what I want to wire up to my Excel spreadsheet.  To do that:

    • Click on the “Add Data Source” toolbar button.
    • Enter a data source name (I’m using “ExcelData”)
    • Select “Database” as the data source type, and click Next
    • Go through the same steps as in the Unit Testing section to set up a data connection to the Excel file.  (Note:  If you’ve already done the above, and therefore the Excel file is already in your project as a deployment item, browse to and select the copy of the Excel file that’s in your test project.  That will save you the hassle of re-copying and overwriting the file.)
    • You’ll now see a Data Sources node in the web test:
      image
    • Select the parameter you want to wire to the data source (in my case, TxtName), and view its properties.
    • Click the drop-down arrow in the Value property, and select the data field you want to use.
      image
    • Now save and run your web test again.  If you haven’t used any other data-driven web tests in this project, you’ll notice that there was only one pass.  That’s because your web test run configuration is set to a fixed run count (1) by default.  To iterate over the data source for the next run only, click “Edit run settings” and select “One run per data source row”.  To make sure all rows in data sources are always leveraged, edit your .testrunconfig file to specify as much.
      image
    • Now run it again, and you should see several passes in your test results:
      image

    That’s it in a simple nutshell!  There are other considerations to keep in mind such as concurrent access, additional deployment items, and perhaps using system DSNs, but this should get you started.

  • Steve Lange @ Work

    Querying the TFS Database to Check TFS Usage

    • 3 Comments

    Why would you want to know how many users are actually using Team Foundation Server?  Well, for starters:

    • You want to make sure that each user in your environment using TFS is properly licensed with a TFS CAL (Client Access License). 
    • You want to show management just how popular TFS is in your environment.
    • You want to request additional hardware for TFS, and want to show current usage capacity.

    But, what if your users are spread out all over the world, so you can’t just send a simple email asking, “Hey, are you using TFS?”

    One relatively straightforward way is to ask your TFS server’s database.  TFS logs activity in a database ‘TfsActivityLogging’, specifically in a table ‘tbl_Command’.

    NOTE:  It’s not supported to go directly against the database, so take note of 2 things:

    1. Be very careful!
    2. Be clear that this isn’t supported.  This process works, but only in the absence of a supported way to query TFS usage.  Just because I work for Microsoft doesn’t mean you can get official support from MS on this.

    All that out of the way, the simple way to do this is to use Excel:

    Open Excel.

    Go to the Data tab and select ‘From Other Sources’ in the ‘Get External Data’ group, and select ‘From SQL Server’.

    image

    The Data Connection Wizard will open.  Follow the steps to connect to the SQL Server that’s used by TFS, selecting the ‘TfsActivityLogging’ database and the contained ‘tbl_Command’ table.

    image

    Enter the SQL Server name that TFS uses.  For the below, my SQL server is at ‘tfsrtm08’.

    image

    Select the ‘TfsActivityLogging’ database, then select the ‘tbl_Command’ table. Click Next.

    image

    Click Finish.

    Select how you’d like to import the table’s data.  For this example, I’m choosing ‘PivotTable Report’.

    image

    Now you’re ready to get the data you want:

    Listing All Users Who Have Touched TFS

    In the ‘PivotTable Field List’ panel on the right, select the ‘IdentityName’ field.  Your spreadsheet should look something like this:

    image

    If you just want a list of users that have touched TFS, then you’re done (in my example, I really only have 2 accounts, and one is the TFSSERVICE account that actually runs TFS).

    However, if you want a little extra information about your users’ activities, you can do a couple of extra things.

    List Users and Their Relative Activity Levels

    Add the ‘ExecutionCount’ field to the ‘Values’ section of the PivotTable, and you’ll see the number of commands each user has run against TFS (some minor, like gets, and others major, like changing ACLs):

    image

    List Users and Their Specific Activity Levels

    First add the ‘ExecutionCount’ field to the ‘Values’ section of the PivotTable, then add the ‘Command’ field to the ‘Row Labels’ section:

    image

    (Again, remember that some of these commands are less significant than others, but still indicate user activity.)

    List Users and Their Clients

    Add the ‘UserAgent’ field to the ‘Row Labels’ section of the PivotTable:

    image

    List Users and Their Last Activity Time

    Add ‘IdentityName’ to the ‘Row Labels’ section of the PivotTable and ‘StartTime’ to the ‘Values’ section.  Then click ‘Count of StartTime’ (in the Values section) and select ‘Value Field Settings’.  Change the ‘Summarize the value field by’ value to ‘Max’.

    image

    Click ‘Number Format’ and set the format to ‘Date’.  Click OK.  You’ll now see the last activity date for each user.

    image 

    I hope this helps!

    Other Tips:

    • You’ll probably see (like in my example) the built-in accounts and their activities (i.e. TFSSERVICE, perhaps TFSBUILD as well).  You may want to filter those out of your report.
    • I’ve heard conflicting reports about how much data the ‘tbl_Command’ table retains (some say just the preceding week).  In my example, I queried the ‘Min’ start times for logged activities and went back over 5 months.  Just something to think about:  your mileage may vary greatly.  (Apparently a clean-up job is supposed to run periodically which trims this table.)
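    One last thought: if you’d rather script this check than build PivotTables, a minimal C# sketch along these lines should work (the same unsupported-query caveat applies; the server name is a placeholder, and the column names are the ones used above):

    using System;
    using System.Data.SqlClient;

    class TfsUsageCheck
    {
        static void Main()
        {
            // Point this at the SQL Server instance hosting your TFS databases.
            string connectionString =
                "Data Source=tfsrtm08;Initial Catalog=TfsActivityLogging;Integrated Security=true";

            // Read-only query: who has touched TFS, how often, and when last.
            string sql = @"SELECT IdentityName,
                                  SUM(ExecutionCount) AS TotalExecutions,
                                  MAX(StartTime)      AS LastActivity
                           FROM   tbl_Command
                           GROUP  BY IdentityName
                           ORDER  BY LastActivity DESC";

            using (SqlConnection connection = new SqlConnection(connectionString))
            using (SqlCommand command = new SqlCommand(sql, connection))
            {
                connection.Open();
                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine("{0}\t{1}\t{2}",
                            reader["IdentityName"],
                            reader["TotalExecutions"],
                            reader["LastActivity"]);
                    }
                }
            }
        }
    }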
  • Steve Lange @ Work

    Thoughts on TFS Project Collections

    • 3 Comments

    New to TFS 2010, Team Project Collections (TPCs) provide an additional layer of project organization/abstraction above the Team Project level (see the MSDN article, “Organizing Your Server with Project Collections”).

    I’ve been asked numerous times over the past couple of months about the intention of project collections, their flexibility and limitations.  Below are simply my educated thoughts on the subject.  Please do your due diligence before deciding how you wish (or wish not) to implement project collections in your environment.

    You can use collections to more tightly couple related projects, break up the administrative process, or to dramatically increase scale.  But the primary design goal behind introducing project collections is around isolation (of code, projects, or groups within an organization) in a way that provides all the benefits of TFS, scoped to a defined level within a single instance of TFS.  You’re effectively partitioning TFS.

     Basic project collection structure

     

    If you have ever used TFS 2005 or 2008, think of it this way.  A project collection effectively compartmentalizes all the capabilities you’ve grown to love in a single TFS 2005/2008 instance:

    Project collection compartmentalization

    I won’t go into how you create/edit/delete project collections.  Just know that you can.  (BTW – for those of you upgrading from an earlier version of TFS, your existing projects will go into a single default project collection, named “Default Collection”.  Original, right?)

    Consider this (over-simplified) example.  I have 4 projects in my server, currently in a single (“default”) collection:

    Single collection view

    Say Project A and Project B are used by “Division A” in my company, and Agile1 and Sample Win App are used by “Division B”.  Project A and Project B share some code and leverage the same user base.  The assets in each division’s projects are in no way related to the other division’s.  Consequently, I’d love to take advantage of project collections and separate our divisions’ stuff.

    A more practical implementation of project collections: I build out my collections using the TFS Administration Console to look like this:

    Viewing my project collections in the admin console

    Once that’s done, I can ultimately end up with such a structure that my desired projects are contained in their respective organization’s collection:

    Division A’s stuff:

    Division A's collection

    Division B’s stuff:

    Division B's collection

    Now each division’s stuff is effectively compartmentalized.  No shared process templates, no shared user base, and no shared database (which means one division’s screw-up won’t affect another division’s work).

    Okay, so I lied a little – I earlier said I wouldn’t go into detail about how to CRUD collections.  But I will mention one thing here, which will add context to the above scenario.  In the above, I had a single collection that I effectively wanted to split into two collections (i.e. go from “Default Collection” to “Division A” and “Division B”).  This is surprisingly easy to do (more complicated than drag & drop, but not ridiculous either).  The documentation for splitting a collection lists 15 main steps to accomplish this, but basically what you’re doing is cloning a collection and then deleting what you don’t want.

    See?  I told you it would be a simple example.  But if you expand this to think of a TFS environment with 100 projects (instead of my puny four), you get the point.

    This all sounds pretty cool, right?  It. Is. Very. Cool.  Project collections can be used for various purposes in your TFS deployment (consolidating related development efforts, scaling the SQL backend, mapping TFS hierarchy to organization hierarchy, etc.).  However, with flexibility comes complexity.  If you had fun sitting around a conference table debating how to structure your TFS 2005/2008 project hierarchy (perhaps consulting our branching guidance document or patterns & practices?), project collections add a new element to consider for 2010.  Below I’ve outlined some of the main considerations for you and your team to think about before taking advantage of project collections in TFS 2010.

    For Systems Administrators:  Pros & Cons

    Pros

    • Flexibility to back up/restore collections individually.  This can reduce downtime, as restoring one collection will not impact users of other collections.
    • Since each collection is contained in its own database, these databases can be moved around a SQL infrastructure to increase scale and load balancing.
    • Could help consolidate IT resources.  If your organization currently leverages several TFS instances simply to isolate environments between departments, collections can allow the same TFS instance to be used while still providing this isolation.

    Cons

    • Again, with flexibility comes complexity.  Since project collections use their own databases, each one must be backed up (and restored) individually.  Also, other admin tasks such as permissions and build controller configuration grow proportionately as additional collections are created.
    • Users and permissions need to be administered separately for each project collection (this may also be a project admin consideration).
    • There are more databases to restore in the event a full disaster recovery is needed.

    For Project Administrators:  Pros & Cons

    Pros

    • Organizational hierarchy.  If your organization has multiple divisions/departments, you can break up your TFS project structure to reflect that organizational view.  This makes it much easier for users to identify (or constrain via permissions) which projects belong to their department.
    • Projects grouped in the same project collection can leverage similar reports (“dashboards”), work item types, etc.  They can also inherit source code from other grouped projects.

    Cons

    • In Visual Studio, you can only connect to one collection at a time.  While it’s relatively trivial to simply connect to a different collection, you can’t view team projects in Team Explorer that reside in different project collections.
    • Relationship-specific operations you enjoy across team projects cannot span project collections.  This means that there are several things you cannot do across collection boundaries, such as:
      • Branch/merge source code (you can do this cross-project, but not cross-collection)
      • Query work items (i.e. you can’t build a query that will show you all bugs across multiple collections)
      • Link items (i.e. you can’t link a changeset in one collection to a task in another collection)
    • Process templates are customized and applied at the project collection level, not the TFS server level

    What does it boil down to?

    It’s really about your need for isolation.  Do you ideally want to isolate by application/system, organization, or something else?  Do you foresee a need to share code, work items, or other assets across projects?  It’s a fun little decision tree:

     Basic, over-simplified decision tree

    So that’s it!  The devil is always hiding in the details, so do your own research and use your own discretion when deciding how to adopt project collections into your TFS deployment.  I anticipate more guidance on this topic to come out as TFS 2010 installations propagate throughout the world.

    For more resources and practical guidance on using Team Foundation Server, see the TFS team’s blog on MSDN.

    I hope this helps you somewhat!  And thanks for reading!

  • Steve Lange @ Work

    August 2010 - Steve’s Monthly Developer Tools Newsletter (First Installment!)

    • 3 Comments

    [UPDATE – To allow comments and better tracking, I’m going to be publishing my newsletter as a regular blog post instead of a static page.]

    It happens often:  I meet with a customer who asks a terrific question which makes me think, “Man, I have a lot of other customers who’d love to know about that as well!”

    So I’ve decided to (try and) put together a monthly newsletter which provides announcements, tips, event notices, and other information that I think will interest you.  (And yes, I’m open to ideas/topics as well!)

    Earlier today, I posted my first installment for August 2010.  As I post more, I’ll tag each newsletter post so you can browse the archive.  While I will be posting these newsletters online, I will also send notifications to some of you (and already have!).  If you’d like to be notified of new newsletters, send me an email or fill out the contact form and let me know.  (Yep, opt in.  I don’t want to just spam.)

    I hope to publish at the beginning of each month, detailing news from the past month and covering upcoming items for the next month.

  • Steve Lange @ Work

    Microsoft’s Visual Studio ALM is a leader in the Gartner Magic Quadrant

    • 3 Comments

    The brilliant minds at Gartner have positioned Microsoft in the “leader” quadrant for Application Lifecycle Management, in their June 5th, 2012 publication, “Magic Quadrant for Application Life Cycle Management” (available for 12 months following publication).

    Their evaluation was based on Visual Studio 2010 and Team Foundation Server 2010. I can’t wait to see what they think of the 2012 version once it releases!

    Magic Quadrant for Application Life Cycle Management (Gartner June 2012) 

    I’ll let you read the report (Microsoft section) for full details, but notable quotes include:

    “By virtue of its position in the market as a provider of key platforms and development tools, Microsoft acts as an overall thought leader in the ALM market”

    “Unlike all of the other tools in this Magic Quadrant, Microsoft's is the only one that tightly binds its versioning system to the rest of the ALM planning tool.”

    “..the company has made good strides with support for Eclipse and the ability to extend TFS with Java code.”

    This is truly a great accomplishment for our teams at Microsoft.  Congratulations to all!
