Buck Hodges

Visual Studio Online, Team Foundation Server, MSDN

Posts
  • Buck Hodges

    Team Foundation Version Control client API example for TFS 2010 and newer

    • 42 Comments

    Over six years ago, I posted a sample on how to use the version control API.  The API changed in TFS 2010, but I hadn’t updated the sample.  Here is a version that works with 2010 and newer and is a little less aggressive on cleanup in the finally block.

    This is a really simple example that uses the version control API.  It shows how to create a workspace, pend changes, check in those changes, and hook up some important event listeners.  This sample doesn't do anything useful, but it should get you going.

    You have to supply a collection URL and a team project path as arguments.

    The main difference from the old sample is that this version uses the TfsTeamProjectCollection class rather than the TeamFoundationServer class used prior to TFS 2010.

    You'll need to add references to the following TFS assemblies to compile this example.

    Microsoft.TeamFoundation.VersionControl.Client.dll
    Microsoft.TeamFoundation.Client.dll

    Code Snippet
    using System;
    using System.Collections.Generic;
    using System.Diagnostics;
    using System.IO;
    using System.Text;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.VersionControl.Client;

    namespace BasicSccExample
    {
        class Example
        {
            static void Main(string[] args)
            {
                // Verify that we have the arguments we require.
                if (args.Length < 2)
                {
                    String appName = Path.GetFileName(Process.GetCurrentProcess().MainModule.FileName);
                    Console.Error.WriteLine("Usage: {0} collectionURL teamProjectPath", appName);
                    Console.Error.WriteLine("Example: {0} http://tfsserver:8080/tfs/DefaultCollection $/MyProject", appName);
                    Environment.Exit(1);
                }

                // Get a reference to our Team Foundation Server.
                TfsTeamProjectCollection tpc = new TfsTeamProjectCollection(new Uri(args[0]));

                // Get a reference to Version Control.
                VersionControlServer versionControl = tpc.GetService<VersionControlServer>();

                // Listen for the Source Control events.
                versionControl.NonFatalError += Example.OnNonFatalError;
                versionControl.Getting += Example.OnGetting;
                versionControl.BeforeCheckinPendingChange += Example.OnBeforeCheckinPendingChange;
                versionControl.NewPendingChange += Example.OnNewPendingChange;

                // Create a workspace.
                Workspace workspace = versionControl.CreateWorkspace("BasicSccExample", versionControl.AuthorizedUser);

                String topDir = null;

                try
                {
                    String localDir = @"c:\temp\BasicSccExample";
                    Console.WriteLine("\r\n--- Create a mapping: {0} -> {1}", args[1], localDir);
                    workspace.Map(args[1], localDir);

                    Console.WriteLine("\r\n--- Get the files from the repository.\r\n");
                    workspace.Get();

                    Console.WriteLine("\r\n--- Create a file.");
                    topDir = Path.Combine(workspace.Folders[0].LocalItem, "sub");
                    Directory.CreateDirectory(topDir);
                    String fileName = Path.Combine(topDir, "basic.cs");
                    using (StreamWriter sw = new StreamWriter(fileName))
                    {
                        sw.WriteLine("revision 1 of basic.cs");
                    }

                    Console.WriteLine("\r\n--- Now add everything.\r\n");
                    workspace.PendAdd(topDir, true);

                    Console.WriteLine("\r\n--- Show our pending changes.\r\n");
                    PendingChange[] pendingChanges = workspace.GetPendingChanges();
                    Console.WriteLine("  Your current pending changes:");
                    foreach (PendingChange pendingChange in pendingChanges)
                    {
                        Console.WriteLine("    path: " + pendingChange.LocalItem +
                                          ", change: " + PendingChange.GetLocalizedStringForChangeType(pendingChange.ChangeType));
                    }

                    Console.WriteLine("\r\n--- Checkin the items we added.\r\n");
                    int changesetNumber = workspace.CheckIn(pendingChanges, "Sample changes");
                    Console.WriteLine("  Checked in changeset " + changesetNumber);

                    Console.WriteLine("\r\n--- Checkout and modify the file.\r\n");
                    workspace.PendEdit(fileName);
                    using (StreamWriter sw = new StreamWriter(fileName))
                    {
                        sw.WriteLine("revision 2 of basic.cs");
                    }

                    Console.WriteLine("\r\n--- Get the pending change and check in the new revision.\r\n");
                    pendingChanges = workspace.GetPendingChanges();
                    changesetNumber = workspace.CheckIn(pendingChanges, "Modified basic.cs");
                    Console.WriteLine("  Checked in changeset " + changesetNumber);
                }
                finally
                {
                    if (topDir != null)
                    {
                        Console.WriteLine("\r\n--- Delete all of the items under the test project.\r\n");
                        workspace.PendDelete(topDir, RecursionType.Full);
                        PendingChange[] pendingChanges = workspace.GetPendingChanges();
                        if (pendingChanges.Length > 0)
                        {
                            workspace.CheckIn(pendingChanges, "Clean up!");
                        }

                        Console.WriteLine("\r\n--- Delete the workspace.");
                        workspace.Delete();
                    }
                }
            }

            internal static void OnNonFatalError(Object sender, ExceptionEventArgs e)
            {
                if (e.Exception != null)
                {
                    Console.Error.WriteLine("  Non-fatal exception: " + e.Exception.Message);
                }
                else
                {
                    Console.Error.WriteLine("  Non-fatal failure: " + e.Failure.Message);
                }
            }

            internal static void OnGetting(Object sender, GettingEventArgs e)
            {
                Console.WriteLine("  Getting: " + e.TargetLocalItem + ", status: " + e.Status);
            }

            internal static void OnBeforeCheckinPendingChange(Object sender, ProcessingChangeEventArgs e)
            {
                Console.WriteLine("  Checking in " + e.PendingChange.LocalItem);
            }

            internal static void OnNewPendingChange(Object sender, PendingChangeEventArgs e)
            {
                Console.WriteLine("  Pending " + PendingChange.GetLocalizedStringForChangeType(e.PendingChange.ChangeType) +
                                  " on " + e.PendingChange.LocalItem);
            }
        }
    }

  • Buck Hodges

    Updating a team project to use new features after upgrading to TFS 11 Beta

    • 3 Comments

    [UPDATE 3/12/12] Aaron pointed out Ewald's recent detailed blog post that walks through adding optional metadata beyond the MSDN article: http://blogs.msdn.com/b/visualstudioalm/archive/2012/03/06/get-your-agile-project-fixed-after-an-upgrade-from-tfs2010-to-tfs11-beta.aspx.

    You can use Team Foundation Server 11 Beta to upgrade your existing TFS 2010 or 2008 server and run it in production.  We’ll support upgrading from TFS 11 Beta to RC (you’ll very likely need to go from Beta to RC to RTM in order to follow the supported path).  As Jason said in his beta blog post, TFS 11 Beta is “go-live,” which means you can use it in production and receive support (details).

    Once you upgrade your server, you’ll want to enable new features that require things like new work item types.  We are working on changes that will hopefully make this easier in the RC.  Here’s the documentation you’ll need to make the necessary changes.

    Updating an Upgraded Team Project to Access New Features

    After you upgrade to Visual Studio 11 Team Foundation Server Beta, you can still access the data from team projects that you created in the previous release. By downloading a localized zip file, extracting the files, and running the provided batch file of commands, you can update your team project to use the following new tools for managing the application lifecycle:

    read more…

  • Buck Hodges

    Permission error with creating a team project from VS 2010 on TFS 2012

    • 28 Comments

    You must use Visual Studio Team Explorer 2012 (included in every Visual Studio 2012 edition, or available as a separate download) to create a team project on a TFS 2012 server.  If you use VS 2010, you will get an error about not having permission.  The error message is very misleading, because the problem is not your permissions.

    ---------------------------
    Microsoft Visual Studio
    ---------------------------
    TF30172: You do not have permission to create a new team project.
    ---------------------------
    OK   Help  
    ---------------------------

  • Buck Hodges

    Listing the work items associated with changesets for a path

    • 4 Comments

    Philip wrote a simple app to list the work items associated with the changesets for a given path, and it’s in some ways an enhanced update of Naren’s post.

    Given a URL to a collection and a server path (e.g., $/myproject/coolthing), it will list the work items that are associated with the most recent 25 checkins.  This sample shows how to use the linking service to convert the work item artifact URIs that are stored with the changesets to get the core work item fields (ID, assigned to, state, type, and title).

    It will produce output like the following.

    Id: 352694 Title: Improve performance of queuing servicing jobs on Azure.

    You will need to reference the following DLLs to build this, all of which are found on the .NET tab of the Add Reference dialog in Visual Studio 2010.

    • Microsoft.TeamFoundation.Client.dll
    • Microsoft.TeamFoundation.Common.dll
    • Microsoft.TeamFoundation.VersionControl.Client.dll
    using System; 
    using System.Collections.Generic; 
    using System.Diagnostics; 
    using Microsoft.TeamFoundation; 
    using Microsoft.TeamFoundation.Client; 
    using Microsoft.TeamFoundation.VersionControl.Client; 
    
    namespace ListWorkItems 
    { 
        class Program 
        { 
            static void Main(string[] args) 
            { 
                if (args.Length < 2)
                { 
                    Console.WriteLine("Usage: listworkitems <URL for TFS> <server path>"); 
                    Environment.Exit(1); 
                } 
    
                TfsTeamProjectCollection tpc = new TfsTeamProjectCollection(new Uri(args[0]));
                VersionControlServer vcs = tpc.GetService<VersionControlServer>(); 
    
                // Get the changeset artifact URIs for each changeset in the history query
                List<String> changesetArtifactUris = new List<String>(); 
    
                foreach (Object obj in vcs.QueryHistory(args[1],                       // path we care about ($/project/whatever) 
                                                        VersionSpec.Latest,            // version of that path
                                                        0,                             // deletion ID (0 = not deleted) 
                                                        RecursionType.Full,            // entire tree - full recursion
                                                        null,                          // include changesets from all users
                                                        new ChangesetVersionSpec(1),   // start at the beginning of time
                                                        VersionSpec.Latest,            // end at latest
                                                        25,                            // only return this many
                                                        false,                         // we don't want the files changed
                                                        true))                         // do history on the path
                { 
                    Changeset c = obj as Changeset; 
                    changesetArtifactUris.Add(c.ArtifactUri.AbsoluteUri); 
                } 
    
                // We'll use the linking service to get information about the associated work items
                ILinking linkingService = tpc.GetService<ILinking>(); 
                LinkFilter linkFilter = new LinkFilter(); 
                linkFilter.FilterType = FilterType.ToolType; 
                linkFilter.FilterValues = new String[1] { ToolNames.WorkItemTracking };  // we only want work items
    
                // Convert the artifact URIs for the work items into strongly-typed objects holding the properties rather than name/value pairs 
                Artifact[] artifacts = linkingService.GetReferencingArtifacts(changesetArtifactUris.ToArray(), new LinkFilter[1] { linkFilter });
                AssociatedWorkItemInfo[] workItemInfos = AssociatedWorkItemInfo.FromArtifacts(artifacts);
    
                // Here we'll just print the IDs and titles of the work items
                foreach (AssociatedWorkItemInfo workItemInfo in workItemInfos)
                { 
                    Console.WriteLine("Id: " + workItemInfo.Id + " Title: " + workItemInfo.Title); 
                } 
            } 
        } 
    
        internal class AssociatedWorkItemInfo
        { 
            private AssociatedWorkItemInfo() 
            { 
            } 
    
            public int Id 
            { 
                get 
                { 
                    return m_id; 
                } 
            } 
    
            public String Title 
            { 
                get 
                { 
                    return m_title; 
                } 
            } 
    
            public String AssignedTo 
            { 
                get 
                { 
                    return m_assignedTo; 
                } 
            } 
    
            public String WorkItemType 
            { 
                get 
                { 
                    return m_type; 
                } 
            } 
    
            public String State 
            { 
                get 
                { 
                    return m_state; 
                } 
            } 
    
            internal static AssociatedWorkItemInfo[] FromArtifacts(IEnumerable<Artifact> artifacts)
            { 
                if (null == artifacts)
                { 
                    return new AssociatedWorkItemInfo[0];
                } 
    
                List<AssociatedWorkItemInfo> toReturn = new List<AssociatedWorkItemInfo>(); 
    
                foreach (Artifact artifact in artifacts)
                { 
                    if (artifact == null)
                    { 
                        continue; 
                    } 
    
                    AssociatedWorkItemInfo awii = new AssociatedWorkItemInfo();
    
                    // Convert the name/value pairs into strongly-typed objects containing the work item info 
                    foreach (ExtendedAttribute ea in artifact.ExtendedAttributes)
                    { 
                        if (String.Equals(ea.Name, "System.Id", StringComparison.OrdinalIgnoreCase)) 
                        { 
                            int workItemId; 
    
                            if (Int32.TryParse(ea.Value, out workItemId))
                            { 
                                awii.m_id = workItemId; 
                            } 
                        } 
                        else if (String.Equals(ea.Name, "System.Title", StringComparison.OrdinalIgnoreCase)) 
                        { 
                            awii.m_title = ea.Value; 
                        } 
                        else if (String.Equals(ea.Name, "System.AssignedTo", StringComparison.OrdinalIgnoreCase)) 
                        { 
                            awii.m_assignedTo = ea.Value; 
                        } 
                        else if (String.Equals(ea.Name, "System.State", StringComparison.OrdinalIgnoreCase)) 
                        { 
                            awii.m_state = ea.Value; 
                        } 
                        else if (String.Equals(ea.Name, "System.WorkItemType", StringComparison.OrdinalIgnoreCase)) 
                        { 
                            awii.m_type = ea.Value; 
                        } 
                    } 
    
                    Debug.Assert(0 != awii.m_id, "Unable to decode artifact into AssociatedWorkItemInfo object."); 
    
                    if (0 != awii.m_id)
                    { 
                        toReturn.Add(awii); 
                    } 
                } 
    
                return toReturn.ToArray(); 
            } 
    
            private int m_id; 
            private String m_title; 
            private String m_assignedTo; 
            private String m_type; 
            private String m_state; 
        } 
    }
  • Buck Hodges

    How to get the TFS objects used in our own UI integration

    • 3 Comments

    Philip, a dev on version control, recently helped with a question on how to get the TFS objects we use in our UI.  I thought I’d post it since others may find it useful.

    We recently had a request from a customer for a VS add-in that would be able to access the same TfsTeamProjectCollection and VersionControlServer objects that our own UI integration (such as the Team Explorer and Pending Changes toolwindow) are using. In this particular case the customer wanted to hook the BeforeCheckinPendingChange event from the VersionControlServer object and take a specific action when that occurred. But the framework shown in this piece of sample code is generic -- you can use it to get the very same VersionControlServer or WorkItemStore object that our integration is using to connect to TFS.

    The trick here is to hook the ProjectContextChanged event on the TeamFoundationServerExt extensibility object. While that extensibility point won't give you the TfsTeamProjectCollection object directly, we can ask the TfsTeamProjectCollectionFactory's static GetTeamProjectCollection method to retrieve it from a runtime cache. The cache is keyed by URI -- which (handily) is provided by TeamFoundationServerExt. By the time the ProjectContextChanged event fires, the ActiveProjectContext.DomainUri property has already been updated.

    All the services in the TFS client object model are owned by the TfsTeamProjectCollection. Once we have it, we can call GetService to request the VersionControlServer object. There's only one per TfsTeamProjectCollection; the same holds true for WorkItemStore, IBuildServer, or any of the other client object model services you may be familiar with.

    Happy extending!

    using System;
    using System.Diagnostics;
    using System.Windows.Forms;
    using Extensibility;
    using EnvDTE;
    using EnvDTE80;
    using Microsoft.TeamFoundation.Common;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.VersionControl.Common;
    using Microsoft.TeamFoundation.VersionControl.Client;
    using Microsoft.VisualStudio.TeamFoundation;
    using Microsoft.VisualStudio.TeamFoundation.VersionControl;

    namespace MyAddin1
    {
        /// <summary>The object for implementing an Add-in.</summary>
        /// <seealso class='IDTExtensibility2' />
        public class Connect : IDTExtensibility2
        {
            /// <summary>Implements the constructor for the Add-in object. Place your initialization code within this method.</summary>
            public Connect()
            {
            }

            /// <summary>Implements the OnConnection method of the IDTExtensibility2 interface. Receives notification that the Add-in is being loaded.</summary>
            /// <param term='application'>Root object of the host application.</param>
            /// <param term='connectMode'>Describes how the Add-in is being loaded.</param>
            /// <param term='addInInst'>Object representing this Add-in.</param>
            /// <seealso class='IDTExtensibility2' />
            public void OnConnection(object application, ext_ConnectMode connectMode, object addInInst, ref Array custom)
            {
                _applicationObject = (DTE2)application;
                _addInInstance = (AddIn)addInInst;

                try
                {
                    m_tfsExt = _applicationObject.GetObject("Microsoft.VisualStudio.TeamFoundation.TeamFoundationServerExt") as TeamFoundationServerExt;

                    if (null != m_tfsExt)
                    {
                        m_tfsExt.ProjectContextChanged += new EventHandler(m_tfsExt_ProjectContextChanged);

                        if (null != m_tfsExt.ActiveProjectContext)
                        {
                            // Run the event handler without the event actually having fired, so we pick up the initial state.
                            m_tfsExt_ProjectContextChanged(null, EventArgs.Empty);
                        }
                    }
                }
                catch (Exception ex)
                {
                    MessageBox.Show(ex.Message);
                }
            }

            /// <summary>Implements the OnDisconnection method of the IDTExtensibility2 interface. Receives notification that the Add-in is being unloaded.</summary>
            /// <param term='disconnectMode'>Describes how the Add-in is being unloaded.</param>
            /// <param term='custom'>Array of parameters that are host application specific.</param>
            /// <seealso class='IDTExtensibility2' />
            public void OnDisconnection(ext_DisconnectMode disconnectMode, ref Array custom)
            {
                // Unhook the ProjectContextChanged event handler.
                if (null != m_tfsExt)
                {
                    m_tfsExt.ProjectContextChanged -= new EventHandler(m_tfsExt_ProjectContextChanged);
                    m_tfsExt = null;
                }
            }

            /// <summary>Implements the OnAddInsUpdate method of the IDTExtensibility2 interface. Receives notification when the collection of Add-ins has changed.</summary>
            /// <param term='custom'>Array of parameters that are host application specific.</param>
            /// <seealso class='IDTExtensibility2' />
            public void OnAddInsUpdate(ref Array custom)
            {
            }

            /// <summary>Implements the OnStartupComplete method of the IDTExtensibility2 interface. Receives notification that the host application has completed loading.</summary>
            /// <param term='custom'>Array of parameters that are host application specific.</param>
            /// <seealso class='IDTExtensibility2' />
            public void OnStartupComplete(ref Array custom)
            {
            }

            /// <summary>Implements the OnBeginShutdown method of the IDTExtensibility2 interface. Receives notification that the host application is being unloaded.</summary>
            /// <param term='custom'>Array of parameters that are host application specific.</param>
            /// <seealso class='IDTExtensibility2' />
            public void OnBeginShutdown(ref Array custom)
            {
            }

            /// <summary>
            /// Raised by the TFS Visual Studio integration package when the active project context changes.
            /// </summary>
            /// <param name="sender"></param>
            /// <param name="e"></param>
            private void m_tfsExt_ProjectContextChanged(Object sender, EventArgs e)
            {
                try
                {
                    if (null != m_tfsExt.ActiveProjectContext &&
                        !String.IsNullOrEmpty(m_tfsExt.ActiveProjectContext.DomainUri))
                    {
                        SwitchToTfs(TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri(m_tfsExt.ActiveProjectContext.DomainUri)));
                    }
                    else
                    {
                        SwitchToTfs(null);
                    }
                }
                catch (Exception ex)
                {
                    MessageBox.Show(ex.Message);
                }
            }

            private void SwitchToTfs(TfsTeamProjectCollection tfs)
            {
                if (Object.ReferenceEquals(m_tfs, tfs))
                {
                    // No work to do; could be a team project switch only
                    return;
                }

                if (null != m_tfs)
                {
                    m_tfs.GetService<VersionControlServer>().BeforeCheckinPendingChange -= new ProcessingChangeEventHandler(VersionControlServer_BeforeCheckinPendingChange);
                    m_tfs = null;
                }

                if (null != tfs)
                {
                    m_tfs = tfs;
                    m_tfs.GetService<VersionControlServer>().BeforeCheckinPendingChange += new ProcessingChangeEventHandler(VersionControlServer_BeforeCheckinPendingChange);
                }
            }

            private void VersionControlServer_BeforeCheckinPendingChange(Object sender, ProcessingChangeEventArgs e)
            {
                if (null != e.PendingChange &&
                    !String.IsNullOrEmpty(e.PendingChange.ServerItem))
                {
                    MessageBox.Show("About to check in: " + e.PendingChange.ServerItem);
                }
            }

            private DTE2 _applicationObject;
            private AddIn _addInInstance;

            private TeamFoundationServerExt m_tfsExt;
            private TfsTeamProjectCollection m_tfs;
        }
    }

  • Buck Hodges

    A tool to find duplicate copies in a build

    • 2 Comments

    As part of our builds, quite a few projects copy files to the binaries directory or other locations.  These can be anything from image files to test scripts.  To have our builds complete more quickly, we use the multi-process option (/maxcpucount) of msbuild to build projects in parallel.

    This all sounds normal, so what’s the problem?  In a large team, people will sometimes inadvertently add statements to different project files that copy files to the same destination.  When those project files have no references to each other, directly or indirectly, msbuild may build them in parallel.  If it does happen to run those projects in parallel on different nodes and the copies happen at the same time, the build breaks because one copy succeeds and one fails.  Since the timing is not going to be the same on every build, the result is random build breaks.  Build breaks suck.  They drain the productivity of the team and are frustrating.

    Whether the build is continuous integration or gated checkin, these breaks may happen randomly.  They are most likely to happen on incremental builds where the percentage of time spent during the build on doing copies is much higher than a clean build.  Tracking them down as they happen is painful.

    So, I wrote a simple tool to find cases in the log where the destination is the same for more than one copy operation.  The comment in the header explains what the code is looking for.  Running this on the normal verbosity msbuild logs from a clean build ensures that all of the copies are in the log for analysis.  We also build what we call partitions separately, resulting in the number of log files being a multiple of the number of partitions being built (a partition is a subset of the source and is typically a top-level directory in the branch).

    In our internal builds, we record multiple log files for our builds, including minimal, normal, and detailed.  When there’s a problem, you can start with the smaller build logs and increase to the more verbose logging as needed.

    I’m posting this for any of you who might run into the same thing.  Keep in mind that there are other things, such as antivirus software, that can interfere with the build process and result in errors for files being copied.

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.IO;
    using System.Text;
    using System.Text.RegularExpressions;
    using System.Threading.Tasks;
    
    /*
     * This tool finds cases where more than one file is copied to the same target.  This will cause
     * build breaks when msbuild executes the copies in parallel because the copies are independent
     * (there are no dependencies).  This typically occurs in incremental builds because incremental
     * builds do a lot less work (not nearly as much to build), resulting in the copies being a much
     * higher percentage of the build activities and more likely to collide.  Gated checkin,
     * continuous integration, and developer/tester builds are almost always incremental, not clean.
     * These issues are still possible in regular clean builds, such as done nightly by the build lab.
     * 
     * These race conditions are difficult to debug manually.  Since msbuild records all of the copies
     * made via the copy task, we can use the log file to identify cases where the same destination
     * path is used in more than one copy.
     * 
     * Use the *.normal.* logs from a clean build with this tool.
     * 
     * The best thing to do is to ensure that each file copy to a particular destination is done by
     * one and only one project.  When that is the case, you are guaranteed not to have problems
     * with two copies colliding and breaking your build.
     * 
     * Here's example output from buildr.suitesrc.normal.log that shows a copy failure.  Here two
     * copies were executed in parallel and the second one failed, causing the build to fail.
     * 
        48>Project "D:\a1\dd\alm\tfs_core\Admin\Servicing\Data\dirs.proj" (48) is building "D:\a1\dd\alm\tfs_core\Admin\Servicing\Data\Azure\Microsoft.TeamFoundation.Data.Azure.csproj" (55) on node 8 (BuildLinked target(s)).
    
        55>_CopyOutOfDateSourceItemsToOutputDirectory:
             Copying file from "D:\a1\dd\alm\tfs_core\Admin\Servicing\Data\ReleaseManifest.xml" to "D:\a1\binaries.x86ret\bin\i386\ReleaseManifest.xml".
    
    
        48>Project "D:\a1\dd\alm\tfs_core\Admin\Servicing\Data\dirs.proj" (48) is building "D:\a1\dd\alm\tfs_core\Admin\Servicing\Data\SqlServer\Microsoft.TeamFoundation.Data.csproj" (53) on node 4 (BuildLinked target(s)).
    
        53>_CopyOutOfDateSourceItemsToOutputDirectory:
             Copying file from "D:\a1\dd\alm\tfs_core\Admin\Servicing\Data\ReleaseManifest.xml" to "D:\a1\binaries.x86ret\bin\i386\ReleaseManifest.xml".
    
        53>D:\a1\dd\tools\x86\managed\v4.5\Microsoft.Common.targets(3516,5): error MSB3021: Unable to copy file "D:\a1\dd\alm\tfs_core\Admin\Servicing\Data\ReleaseManifest.xml" to "D:\a1\binaries.x86ret\bin\i386\ReleaseManifest.xml". Access to the path 'D:\a1\binaries.x86ret\bin\i386\ReleaseManifest.xml' is denied. [D:\a1\dd\alm\tfs_core\Admin\Servicing\Data\SqlServer\Microsoft.TeamFoundation.Data.csproj]
      
     * 
     * Note that there may be multiple copies in a sequence.
     * 
        291>_CopyOutOfDateSourceItemsToOutputDirectoryAlways:
             Copying file from "D:\a1\dd\suitesrc\TFS\common\deploytools\httpcfg.exe" to "D:\a1\binaries.x86ret\SuiteBin\i386\TFS\Tests\httpcfg.exe".
             Copying file from "D:\a1\dd\suitesrc\TFS\common\deploytools\makecert.exe" to "D:\a1\binaries.x86ret\SuiteBin\i386\TFS\Tests\makecert.exe".
             Copying file from "D:\a1\dd\suitesrc\TFS\common\deploytools\winhttpcertcfg.exe" to "D:\a1\binaries.x86ret\SuiteBin\i386\TFS\Tests\winhttpcertcfg.exe".
           CopyFilesToOutputDirectory:
             Copying file from "D:\int\641\194\suitesrc\tfshttpsconfig.csproj_80399372\objr\x86\TfsHttpsConfig.exe" to "D:\a1\binaries.x86ret\SuiteBin\i386\TFS\Tests\TfsHttpsConfig.exe".
    
     * Nodes are reused by msbuild.  The result is that a given node may process many projects, so it's not
     * possible to scan and pair up all of the nodes and project files at once.  In the code below, 
     * you will see that it always tracks the most recent node for that reason.
     * 
     */
    
    namespace FindBadCopies
    {
        class Program
        {
            static void Main(string[] args)
            {
                if (args.Length < 1)
                {
                    Console.WriteLine("Usage: findbadcopies <logfile>\r\n");
                    Console.WriteLine(
    @"This tool scans a build log, such as buildr.suitesrc.normal.log, and produces a
    list of file paths that are the targets of more than one copy and shows which
    project files initiated each copy.  These redundant file copies are prone to
    fail periodically in incremental builds, such as gated check ins and CI builds,
    because copies are a higher percentage of the operations in the build, making
    it more likely that two collide.");
    
                    return;
                }
    
                ProcessFile(args[0]);
            }
    
            private static void ProcessFile(String fileName)
            {
                Dictionary<int, String> nodeTable = new Dictionary<int, String>(1000);
                Dictionary<String, int> pathTable = new Dictionary<String, int>(1000, StringComparer.InvariantCultureIgnoreCase);
                String previousLine;
    
                string[] text = File.ReadAllLines(fileName);
    
                // Process all of the lines in the file, skipping the first line (we need the previous line,
                // and the first line in the file isn't important to this tool).
                int lastNode = 0;
                for (int i = 1; i < text.Length; i++)
                {
                    previousLine = text[i - 1];
    
                    // Record most recent node.  The text that appears with it can be different
                    // (see sample log data).
                    string prevLinePattern = @"([0-9]+)[>]";
                    Match match = Regex.Match(previousLine, prevLinePattern, RegexOptions.IgnoreCase);
                    if (match.Success)
                    {
                        lastNode = Int32.Parse(match.Groups[1].Value);
                    }
    
                    // If the line is recording the start of a project, add it to the table.
                    string pattern = @"[0-9]+[>]Project ""[^""]+"" \([0-9]+\) is building ""([^""]+)"" \(([0-9]+)\)";
                    match = Regex.Match(text[i], pattern, RegexOptions.IgnoreCase);
                    if (match.Success)
                    {
                        int node = Int32.Parse(match.Groups[2].Value);
                        String projectPath = Path.GetFullPath(match.Groups[1].Value);
    
                        // Because nodes are reused, we are only keeping the project path for the most recent use
                        // of a given node.
                        nodeTable[node] = projectPath;
    
                        // If we matched a project line, it can't be a copy line.
                        continue;
                    }
    
                    // If the line is one that records a copy, see if there was an earlier copy made to
                    // the same target path.  First, try the output of a copying task.
                    string copyingPattern = @"Copying file from ""[^""]+"" to ""([^""]+)""";
                    match = Regex.Match(text[i], copyingPattern, RegexOptions.IgnoreCase);
                    if (match.Success)
                    {
                        String targetPath = null;
                        try
                        {
                            targetPath = Path.GetFullPath(match.Groups[1].Value);
                        }
                        catch (Exception e)
                        {
                            // There is a file in the test tree that uses non-English chars that causes
                            // GetFullPath() to throw (TODO: understand why), so we keep the raw text.
                            // Console.WriteLine(match.Groups[1].Value);
                            targetPath = match.Groups[1].Value;
                        }
    
                        // If we have already seen the target path, then we have a duplicate copy path
                        // target to report.
                        int otherNode;
                        if (pathTable.TryGetValue(targetPath, out otherNode))
                        {
                            Console.ForegroundColor = ConsoleColor.Cyan;
                            Console.WriteLine("{0}", targetPath);
                            Console.ResetColor();
    
                            Console.WriteLine("      {0}", nodeTable[otherNode]);
                            Console.WriteLine("      {0}", nodeTable[lastNode]);
                            Console.WriteLine();
                        }
    
                        pathTable[targetPath] = lastNode;
                    }
                }
            }
        }
    }
  • Buck Hodges

    How to subscribe to checkins not under a particular path

    • 3 Comments

    Nick Kirchem, who works on the TFS web access team, recently answered a question about email subscriptions for checkin alerts.  The question was, how do I subscribe to checkin alerts for changes that are not under a particular folder?

    Here’s how to do it.

    bissubscribe /eventType CheckinEvent /address someone@domain.com /deliveryType EmailHtml /server http://myserver:8080/tfs/DefaultCollection "'Artifacts/Artifact[@ArtifactType=\"VersionedItem\"][not(starts-with(translate(@ServerItem, \"ABCDEFGHIJKLMNOPQRSTUVWXYZ\", \"abcdefghijklmnopqrstuvwxyz\"), \"$/devdiv/feature/build/qa\"))]' <> null"

    Let’s break it down.

    • /eventType – Here we want CheckinEvent.  There are others.  Rather than list them, I’d recommend the Alerts Explorer that is part of the Team Foundation Server Power Tools.  You can use it to discover more, as each different type has different things you can filter on.
    • /address – The email address to use.
    • /deliveryType – We want HTML-formatted email.
    • /server – Here I’ve used the URL to the default collection on a TFS 2010 server.  You’ll need to edit it to match the server and collection you need to use.
    • expression – The expression is somewhat hard to read; an expanded version appears after this list.
      • The path we want to filter out is listed last, which is $/devdiv/feature/build/qa in this case.  You must use lowercase for your path.
      • The translate XPath function is used to normalize the casing of the server path to be all lower case.  This is important because XPath is case sensitive.
      • The starts-with XPath function tests to see if the path is one we are interested in (in this case to filter out).
      • The not XPath function inverts the test to see if it is not under the path we want to filter out.
      • The quotation marks are escaped so that they can be inside quotation marks on the Windows command prompt.
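
    For readability, here is the same filter expression with the command-prompt escaping removed and whitespace added.  This only shows the structure; it is not a form you can pass to bissubscribe directly.

        Artifacts/Artifact
            [@ArtifactType="VersionedItem"]
            [not(starts-with(
                translate(@ServerItem, "ABCDEFGHIJKLMNOPQRSTUVWXYZ", "abcdefghijklmnopqrstuvwxyz"),
                "$/devdiv/feature/build/qa"))]
        <> null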

    Normally I recommend using the Alerts Explorer power tool rather than doing this by hand.  However, Alerts Explorer does not support this particular kind of filter, so bissubscribe is the way to go here.

    Nick has also written a feature for TFS 11 that allows you to edit alerts inside the product through the web interface.  An early version of it is in the TFS 11 CTP release that came out in concert with the Windows //build/ conference.  Here’s a screen shot of it.  Note that you can only see it in the UI if you enable email in the TFS Administration Console.  Since then he’s made it easier to use and made it possible to administer other users’ alerts, if you are an administrator.

    [screenshot: the web-based alerts editor]

    Related post: Adding a path filter to a CheckinEvent subscription using bissubscribe

  • Buck Hodges

    How to delete a team project from Team Foundation Service (tfs.visualstudio.com)

    • 64 Comments

    [UPDATE 9/13/13] You can now use the web UI to delete a team project.

    [UPDATE 5/14/13] Updated the URLs and version of VS (used to say preview)

    The question came up as to how to delete a team project in the Team Foundation Service (TFService).  When I first tried it, it didn’t work.  Then I realized it’s the one case where you have to explicitly specify the collection name.  It’s surprising because in hosted TFS each account has only one collection.  You cannot create multiple collections currently as you can with on-premise TFS (this will change at some point in the future).  Incidentally, you cannot delete a collection right now either.

    You must have installed the Visual Studio 2012 RTM or newer build to do this (you can also use the standalone Team Explorer 2012).  Even with the patch to support hosting, the 2010 version of tfsdeleteproject.exe will not work.

    If you leave off the collection name, here’s the error you will see when trying to delete a team project called Testing.

    C:\project>tfsdeleteproject /collection:https://buckh-test2.visualstudio.com Testing
    Team Foundation services are not available from server https://buckh-test2.visualstudio.com.
    Technical information (for administrator):
      HTTP code 404: Not Found

    With DefaultCollection added to your hosting account’s URL, you will get the standard experience with tfsdeleteproject and successfully delete the team project.

    C:\project>tfsdeleteproject /collection:https://buckh-test2.visualstudio.com/DefaultCollection Testing

    Warning: Deleting a team project is an irrecoverable operation. All version control, work item tracking and Team Foundation build data will be destroyed from the system. The only way to recover this data is by restoring a stored backup of the databases. Are you sure you want to delete the team project and all of its data (Y/N)?y

    Deleting from Build ...
    Done
    Deleting from Version Control ...
    Done
    Deleting from Work Item Tracking ...
    Done
    Deleting from TestManagement ...
    Done
    Deleting from LabManagement ...
    Done
    Deleting from ProjectServer ...
    Done
    Warning. Did not find Report Server service.
    Warning. Did not find SharePoint site service.
    Deleting from Team Foundation Core ...
    Done

    This is the error you will get when using tfsdeleteproject 2010, even with the patch for hosting access.

    C:\Program Files\Microsoft Visual Studio 10.0\VC>tfsdeleteproject /collection:https://buckh-test2.visualstudio.com/DefaultCollection Testing2

    Warning: Deleting a team project is an irrecoverable operation. All version control, work item tracking and Team Foundation build data will be destroyed from the system. The only way to recover this data is by restoring a stored backup of the databases. Are you sure you want to delete the team project and all of its data (Y/N)?y

    TF200040: You cannot delete a team project with your version of Team Explorer. Contact your system administrator to determine how to upgrade your Team Explorer client to the version compatible with Team Foundation Server.

  • Buck Hodges

    Now on Twitter: tfsbuck

    • 2 Comments

    With the build conference last week, I got a Twitter account and started following the comments and responding to questions.  I’m @tfsbuck.

  • Buck Hodges

    TFS 2010 SP1 Cumulative Update 1 available (again)

    • 4 Comments

    Brian posted about the cumulative update for TFS (the TFS SKU, not the client/VS) back in June.  After it was released we learned of a couple of bugs in it, including one where upgrades would not work in certain cases.  Last Thursday, we re-released it with all of the known bugs fixed.  We understand where we went wrong, and we apologize for the inconvenience.

    You should apply this update to the server, build machine, proxy, and SharePoint extensions – all of the things that the TFS SKU installs.  SP1 is a prerequisite (i.e., this patch does not include SP1).

    Download: http://www.microsoft.com/download/en/details.aspx?id=26211

  • Buck Hodges

    Patch to improve perf and reliability of the Workflow Designer

    • 4 Comments

    Today the .NET team is releasing a cumulative update patch.  This has all of the QFEs up until a couple of months ago rolled into one patch.  Included as part of that is a patch for WPF that improves the performance of the Windows Workflow Designer and fixes a hang that a number of folks have hit.  I had a few customers try it out, and they were happy with the improvements.  I recommend this update to you if you work with the WF Designer (e.g., editing the workflow for Team Build definitions).  There are still perf issues even with this fix, and the WF Designer team has made some very good perf improvements for the next release.

    You can find a complete list of the issues fixed at KB 2468871 under More Information.  There are also six features related to ASP.NET and Silverlight listed after the issues.

    Here is the download page: http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=3556.

    Enjoy!

    P.S.  This is completely unrelated to the TFS cumulative update that Brian has written about here.  I recommend both.

  • Buck Hodges

    Knowing which thread BackgroundWorker will use for its events

    • 2 Comments

    [UPDATE 7/19/2011]  Stephen pointed me to his article covering this and more in the February issue of MSDN Magazine, and I recommend it: http://msdn.microsoft.com/en-us/magazine/gg598924.aspx.

    We hit this recently, so I thought I’d post this email from Chad, a developer on version control, for anyone else who may have missed this subtlety.

    Today we discovered some of our code was making an incorrect assumption about the behavior of BackgroundWorker, so I thought it might be useful to send a note detailing what we found.

    Our code assumed BackgroundWorker would always call ProgressChanged and RunWorkerCompleted on the UI thread.  This mistake was based on the assumption that BackgroundWorker saved off the SynchronizationContext for the thread on which it was created.

    After reviewing the BackgroundWorker code, we found that it actually saves the SynchronizationContext for the thread where RunWorkerAsync is called (by calling AsyncOperationManager.CreateOperation).  Then, ProgressChanged and RunWorkerCompleted are called on that thread if it is still running.  If not, the events appear to be called on a random thread.

    This of course leads to a crash when there are attempts to update UI from the wrong thread.  If you are relying on BackgroundWorker to return you to the UI thread, make sure to only call RunWorkerAsync from the UI thread.

    Hope this helps!

    -Chad
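
    Here is a minimal sketch of the pattern Chad describes (the form and handler names below are illustrative, not from our code): because RunWorkerAsync is called on the UI thread, the BackgroundWorker captures the UI thread’s SynchronizationContext, and ProgressChanged and RunWorkerCompleted are raised back on the UI thread.

    using System;
    using System.ComponentModel;
    using System.Threading;
    using System.Windows.Forms;

    public class WorkerForm : Form
    {
        private readonly BackgroundWorker m_worker = new BackgroundWorker();

        public WorkerForm()
        {
            m_worker.WorkerReportsProgress = true;
            m_worker.DoWork += OnDoWork;                    // runs on a thread pool thread
            m_worker.ProgressChanged += OnProgressChanged;  // raised via the captured SynchronizationContext
            m_worker.RunWorkerCompleted += OnCompleted;     // same as ProgressChanged

            // This call happens on the UI thread, so the worker captures the UI
            // SynchronizationContext.  Calling RunWorkerAsync from a background
            // thread would capture that thread's context instead (or none at all),
            // and the events would no longer come back on the UI thread.
            m_worker.RunWorkerAsync();
        }

        private void OnDoWork(object sender, DoWorkEventArgs e)
        {
            for (int i = 1; i <= 10; i++)
            {
                Thread.Sleep(100);
                m_worker.ReportProgress(i * 10);
            }
        }

        private void OnProgressChanged(object sender, ProgressChangedEventArgs e)
        {
            // Safe to touch the UI here only because RunWorkerAsync was called on the UI thread.
            Text = "Progress: " + e.ProgressPercentage + "%";
        }

        private void OnCompleted(object sender, RunWorkerCompletedEventArgs e)
        {
            Text = "Done";
        }
    }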

  • Buck Hodges

    Updates to our docs on MSDN last month

    • 0 Comments

    The fine folks who write documentation for our product are woefully outnumbered.  Every month they release updates to the docs, adding new topics and enhancing existing ones.  You can find the latest set of updates described on their blog.

  • Buck Hodges

    Ewald’s posts on TFS Build 2010

    • 3 Comments

    Ewald Hofman, an ALM MVP, has written a great series of blog posts on Team Build in Team Foundation Server 2010.  The 2010 release introduces Windows Workflow as the overall orchestrator of the build process.  Ewald walks you through quite a few topics related to this and the other new features.  Check it out!

    1. Part 1: Introduction
    2. Part 2: Add arguments and variables
    3. Part 3: Use more complex arguments
    4. Part 4: Create your own activity
    5. Part 5: Increase AssemblyVersion
    6. Part 6: Use custom type for an argument
    7. Part 7: How is the custom assembly found
    8. Part 8: Send information to the build log
    9. Part 9: Impersonate activities (run under other credentials)
    10. Part 10: Include Version Number in the Build Number
    11. Part 11: Speed up opening my build process template
    12. Part 12: How to debug my custom activities
    13. Part 13: Get control over the Build Output
    14. Part 14: Execute a PowerShell script
    15. Part 15: Fail a build based on the exit code of a console application
    16. Part 16: Specify the relative reference path
  • Buck Hodges

    Be a developer at Microsoft in Durham, NC

    • 0 Comments

    Do you want to be part of a team of talented developers and build great software?  Here’s your chance to join the TFS team.  In addition to development positions in Redmond (work item tracking client team), I have openings on our development team here in North Carolina.  Please follow one of the links to apply online.

    Job Category: Software Engineering: Development
    Location: United States, NC, Durham
    Job ID: 753545
    Product: Visual Studio Team System
    Division: Server & Tools Business

    Visual Studio Team Foundation Server (TFS) is leading the way in improving the success of software projects, and we want your help! TFS provides software development teams with project and bug management, version control, and build automation. We are now building our services in the cloud using Windows and SQL Azure platforms to make TFS available 24x7 over the internet.

    This is also an opportunity to live on the east coast in North Carolina (Raleigh/Durham) and work on cutting-edge product development for Microsoft!

    Are you passionate about building a great version control experience? Developers interact with version control more than any other part of the system, so you have the opportunity to have a big impact. The position will require you to have or gain extensive knowledge of one or more of these technologies: Visual Studio packages, WPF, WCF, and C#/.NET Framework. Version control in TFS makes use of a wide range of technologies, so you’ll have the opportunity to learn new stuff and go deep to become an expert in one or more of these areas.

    We’re looking for a developer who seeks big challenges as part of a strong, agile team and has both great collaboration skills and an ability to also work independently to deliver well thought out solutions to tough problems. You must have 3 or more years of experience developing production software using C/C++, C#, or Java and a strong background in object-oriented design and algorithms. A BS in Computer Science, Computer Engineering, Electrical Engineering, or equivalent is required.

    If you enjoy building software with a broad range of technologies and being part of a great team that’s making software development better, join TFS!

    If you are passionate about build or web access instead, you can still apply to one of the positions above, and we can discuss team fit as part of the interview process.

  • Buck Hodges

    Making debugging easier: Source Indexing and Symbol Server

    • 0 Comments

    Have you ever tried to debug an issue in old binaries when you don’t remember which version of the source they correspond to?  Have you debugged without symbols because no one saved them?  Here’s how to make your life easier.

    One of the great features in Team Foundation Server 2010 Build is the ability to have your builds automatically indexed with source server and the symbols stored in symbol server.  Ed Blankenship has posted a great write up on how to configure and use this feature from the build to debugging in Visual Studio.

    Source Server and Symbol Server Support in TFS 2010

    As Jim Lamb announced in June 2009, TFS 2010 introduces support for Source Server and Symbol Server as part of the default automated build process template. This is a really key feature addition but I have found that many developers ask about why it would be so important and why it would help them. Ultimately, we are starting to have more and more tools that need access to the symbol file information and the original source code that was used for compilation. For example, some of the tools that come to mind are:

    By setting up Source Server and Symbol Server support during your build process, you’ll be able to work with assemblies & executables that come from the build servers and still use tools that need information from them.

    more…

    [UPDATE 4/12/2011]  Ewald Hofman pointed out that I missed Cameron’s excellent debugging series posts.  In Cameron’s second post, he points out how to work around an issue with using minidumps with VS 2010 SP1.

    Check it out!

  • Buck Hodges

    OData service for TFS

    • 3 Comments

    Brian Keller has released a new OData service for TFS.  He does a great job explaining it, and he also includes a video demo.

    OData Service for Team Foundation Server 2010

    What the heck is an OData Service for Team Foundation Server 2010?
    I’m glad you asked. The purpose of this project is to help developers work with data from Team Foundation Server on multiple types of devices (such as smartphones and tablets) and operating systems. OData provides a great solution for this goal, since the existing Team Foundation Server 2010 object model only works for applications developed on the Windows platform. The Team Foundation Server 2010 application tier also exposes a number of web services, but these are not supported interfaces and interaction with these web services directly may have unintended side effects. OData, on the other hand, is accessible from any device and application stack which supports HTTP requests. As such, this OData service interacts with the client object model in the SDK (it does not manipulate any web services directly).

    What is OData?
    OData exposes a way to work with data over the web. If you’re new to OData, I suggest spending a few minutes at http://www.odata.org/ reading about this evolving standard. It uses interfaces similar to REST, so that you can programmatically consume and manipulate data from any device or application stack which supports HTTP requests. DPE has been working with several organizations (such as PayPal, Facebook, and Netflix) and product groups to enable OData where it makes sense to do so. Team Foundation Server was an obvious choice since it not only allows developers to extend TFS in new and interesting ways, but it also allows us to further showcase support for this evolving standard with the developer community at large.

    more…

    Enjoy!

  • Buck Hodges

    Professional Team Foundation Server 2010 is now out!

    • 3 Comments

    Professional Team Foundation Server 2010

    In the year since the release of TFS 2010, we’ve seen a run of great new books, all by authors who really know their subject matter extremely well.  At the beginning of the year, Sayed Ibrahim Hashimi and William Bartholomew published Using MSBuild and Team Foundation Build, the book on MSBuild and TFS Build.

    Then Mickey Gousset, Brian Keller, Ajoy Krishnamoorthy, and Martin Woodward brought us Professional Application Lifecycle Management.  My copy of that book came in handy when I wrote a post on using the code metrics power tool with TFS Build.  It covers the full range of the VS ALM 2010 product.

    Professional Team Foundation Server 2010, written by Ed Blankenship, Martin Woodward, Grant Holliday, and Brian Keller, is now out.  I got my copy the other day and highly recommend it.  Martin wrote a great blog post on the book, and in it he describes the differences between Professional ALM and Professional TFS.

    People have asked us what’s the difference between the ALM book and the Pro TFS book.  The ALM book was deliberately written as an overview to the huge amount of functionality available in the entire Visual Studio Application Lifecycle Management suite.  Though there are a couple of chapters, the Team Build one in particular, that get pretty technical – the Pro ALM book tries to keep things approachable by everyone.

    The Pro TFS 2010 book is a deep dive on TFS.  We tried to make it so that you can pick up the book having never used TFS before and by the end of it not only know how to use TFS but also how to administer a complex TFS instance and even use it to study for the TFS Administration exam.  I’ve learnt something from every single chapter in the Pro TFS book, but I would also hope that someone new to TFS could pick up the book and learn just enough to get going and then come back for more over time.

    They’ve included information on every major area of TFS, along with some coverage of the test features that integrate with TFS.  One of the things that makes the book stand out is that it covers features of the product you may not even know about.  For example, did you know you can use Active Directory to automatically configure version control proxies for your distributed teams (check out chapter 24)?  Want to understand your server’s health and diagnose performance issues (see chapter 21)?

    Jeff Levinson’s Software Testing with Visual Studio 2010 covers the testing features of VS ALM 2010, which was a huge area of focus for us in the 2010 release.  In it he covers creating test cases, reporting, and lab management, which is a powerful and complex new feature in 2010.

    In May we’ll get Professional Scrum with TFS 2010, so stay tuned for more.

  • Buck Hodges

    How to reject checkins with code analysis violations

    • 0 Comments

    Andrew Hall wrote a great post on the Code Analysis Team Blog about how to use the code analysis checkin policy with gated checkin in Team Foundation 2010 Build to reject checkins that have code analysis warnings or errors.  He shows you how to configure the rule set and set up the gated build definition to enforce the code analysis rules you’ve chosen.
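
    Andrew’s post walks through the Visual Studio UI. If you prefer to script the build definition side, here’s a rough C# sketch of my own (not from his post) that creates a gated check-in build definition with the TFS 2010 build client API. The collection URL, team project, branch mapping, and drop share are all placeholders, and the code analysis rule set itself is still configured in the project as Andrew describes.

    using System;
    using Microsoft.TeamFoundation.Build.Client;
    using Microsoft.TeamFoundation.Client;

    class GatedBuildExample
    {
        static void Main()
        {
            // Placeholder collection URL and team project name.
            var tpc = new TfsTeamProjectCollection(new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
            IBuildServer buildServer = tpc.GetService<IBuildServer>();

            IBuildDefinition definition = buildServer.CreateBuildDefinition("MyProject");
            definition.Name = "Gated checkin with code analysis";
            definition.ContinuousIntegrationType = ContinuousIntegrationType.Gated;  // reject checkins that break the build

            // Map the branch to build; the server and local paths are examples.
            definition.Workspace.AddMapping("$/MyProject/Main", "$(SourceDir)", WorkspaceMappingType.Map);

            definition.BuildController = buildServer.QueryBuildControllers()[0];
            definition.DefaultDropLocation = @"\\fileserver\drops";              // placeholder drop share
            definition.Process = buildServer.QueryProcessTemplates("MyProject")[0];  // default template

            definition.Save();
        }
    }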

    Preventing check-ins to TFS that contain code analysis warnings

    Recently we have received several questions regarding Visual Studio Code Analysis integration with Team Foundation Server’s check-in policy and build server, so I thought it would be helpful to clarify the behavior and expose some relatively hidden functionality.

    more…

    Enjoy!

  • Buck Hodges

    We’re here, and we have a sign to prove it

    • 2 Comments

    Yeah, seven years in this location, and the building had no outward markings to indicate that our office is here.  It’s official.  We’re here.  Really.

    [Photos: the new signs on the building]
  • Buck Hodges

    How to turn on compression for TFS 2010 web services

    • 1 Comment

    In the past, we’ve turned on compression for the SOAP responses for the TFS web services.  In TFS 2010, you must do it manually.  In the future, I hope we have it turned on by default.  It’s particularly good for teams that aren’t at the same location as the TFS server.  For users on a high-speed corporate network, it’s not likely to matter.

    Grant wrote a post on how to turn it on: TFS2010: How to enable compression for SOAP traffic.
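
    If you want a quick way to check whether compression is actually on after following Grant’s steps, here’s a rough smoke test in C# (my own sketch). It sends a GET with an Accept-Encoding header to one of the TFS web service endpoints (the URL below is just an example) and prints the Content-Encoding of the response.

    using System;
    using System.Net;

    class CompressionCheck
    {
        static void Main()
        {
            // Example endpoint; substitute your own collection URL.
            var request = (HttpWebRequest)WebRequest.Create(
                "http://tfsserver:8080/tfs/DefaultCollection/Services/v3.0/LocationService.asmx");
            request.Credentials = CredentialCache.DefaultCredentials;
            request.Headers["Accept-Encoding"] = "gzip";  // ask the server for a compressed response

            using (var response = (HttpWebResponse)request.GetResponse())
            {
                // Prints "gzip" when dynamic compression is enabled for the TFS web services.
                Console.WriteLine("Content-Encoding: {0}", response.Headers["Content-Encoding"]);
            }
        }
    }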

  • Buck Hodges

    How to distribute custom checkin policies and work item controls using the power tools

    • 4 Comments

    Custom checkin policies and custom work item controls are great ways to take advantage of the extensibility of TFS.  You can use checkin policies to enforce certain standards on checkins (even in your builds).  Custom work item controls allow you to add controls to your work item forms that present data in particular way, access other systems, etc.  However, there’s no mechanism in Team Explorer to download and install these.
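
    If you’ve never built one, a custom checkin policy is just a class that derives from PolicyBase in Microsoft.TeamFoundation.VersionControl.Client.dll. Here’s a minimal sketch (the classic “require a comment” check) to show the shape of the DLL you’d end up distributing:

    using System;
    using Microsoft.TeamFoundation.VersionControl.Client;

    [Serializable]
    public class RequireCommentPolicy : PolicyBase
    {
        public override string Type
        {
            get { return "Require checkin comment"; }
        }

        public override string TypeDescription
        {
            get { return "Requires that every checkin have a comment."; }
        }

        public override string Description
        {
            get { return "Please provide a comment describing your change."; }
        }

        // This simple policy has no settings to edit.
        public override bool Edit(IPolicyEditArgs args)
        {
            return true;
        }

        public override PolicyFailure[] Evaluate()
        {
            string comment = PendingCheckin.PendingChanges.Comment;
            if (String.IsNullOrEmpty(comment))
            {
                return new PolicyFailure[] { new PolicyFailure(Description, this) };
            }
            return new PolicyFailure[0];
        }
    }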

    Youhana has written a post on how to use a feature in the power tools that not many folks know about.  By creating a couple of version control folders in each team project, you can have folks use the Team Members node in Team Explorer to download and install them.  This means that your users don’t need to know where to put the files on disk or which registry entries to create to make them work.  There’s no auto-update mechanism right now, so users will need to repeat the process if you subsequently update the DLLs.  The Team Foundation Server Power Tools must be installed on each machine where you want to use this feature.

    Distributing custom check-in policies & WIT controls using team members

    The team members component of the TFS power tools (available here) has a feature to help TFS users distribute custom check-in policies and WIT controls. Basically, the administrator would add the dlls containing the policies and components to a special folder in version control and users then can install the components using the “personal settings” dialog in team members. These are the detailed steps:

    more…

    Enjoy!

  • Buck Hodges

    Moving work item description fields to HTML

    • 0 Comments

    Neno’s been blogging a lot this month, and many of his posts have helpful tools associated with them.  The post below caught my eye as particularly useful.  We’ll be using HTML fields more going forward, and he has a tool to help you move your existing work items to use an HTML field for the Description.
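
    Neno’s tool does the heavy lifting, but to show roughly what the move involves, here’s a small C# sketch of my own using the work item tracking client API that copies the plain-text Description into an HTML description field. The collection URL, team project, and work item type are assumptions, and the target HTML field must already exist on the work item type definition.

    using System;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.WorkItemTracking.Client;

    class DescriptionMigration
    {
        static void Main()
        {
            // Placeholder collection URL and team project.
            var tpc = new TfsTeamProjectCollection(new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
            var store = tpc.GetService<WorkItemStore>();

            WorkItemCollection items = store.Query(
                "SELECT [System.Id] FROM WorkItems " +
                "WHERE [System.TeamProject] = 'MyProject' AND [System.WorkItemType] = 'Task'");

            foreach (WorkItem workItem in items)
            {
                workItem.Open();
                // Copy the plain-text description into the HTML field used by the Scrum template.
                workItem.Fields["Microsoft.VSTS.Common.DescriptionHtml"].Value =
                    workItem.Fields["System.Description"].Value;
                workItem.Save();  // fails if the work item has other validation errors
            }
        }
    }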

    Enriching your Work Item Descriptions by Moving them to a HTML field

    In the Visual Studio Scrum 1.0 process template (and most likely in future process templates), Microsoft is using HTML fields with rich formatting for the work item description fields.

    In VS Scrum 1.0…

    • Product Backlog Items and Tasks are using Microsoft.VSTS.Common.DescriptionHtml.
    • Bugs are using Microsoft.VSTS.TCM.ReproSteps instead.

    You can customize your current process template and add a new HTML description today.

    more…

    Enjoy!

  • Buck Hodges

    VS 2010 SP1 crashes when viewing build on a TFS 2008 server

    • 4 Comments

    Unfortunately, we introduced a regression into Visual Studio 2010 SP1 in the process of fixing a performance issue in the build details view that a number of customers had reported (viewing the log was really slow for larger builds). We made this change late in SP1, and I apologize for the inconvenience. If you hit the problem, there is a patch available, and I want to make sure you know about it.

    The fix is available at http://connect.microsoft.com/VisualStudio/Downloads/DownloadDetails.aspx?DownloadID=34824.

    KB2522890 - VS10 SP1 crashes on build details from TFS 2008 build explorer

    Issue Description
    Visual Studio 2010 SP1 crashes or shows the following error when attempting to view a build report on a TFS 2008 server:

    "TF50316: The following name is not valid. Verify that the name does not exceed the maximum character limit, only contains valid characters, and is not a reserved name"

    Additional Information about the issue resolved by this Hotfix can be found in its Knowledge Base article at http://support.microsoft.com/kb/2522890

  • Buck Hodges

    Managing TFS 2010: How to clean up test attachment data

    • 0 Comments

    Test attachment data generated by the new testing features in VS 2010 can add a large amount of data to your TFS server.  In fact, we discovered on our own “dogfood” server that test data was taking up more space than the version control data.  You can read more about it in Grant’s post here.

    You can use the Test Attachment Cleaner for Visual Studio Ultimate 2010 & Test Professional 2010 to delete old test data to reduce the size.  Here’s the description from that page.

    Overview:

    In Visual Studio 2010, with the introduction of Visual Studio Test Professional 2010 & Visual Studio Premium/Ultimate 2010 SKUs, testers can author manual and automated Test cases, configure the different diagnostic data collectors (as part of Test Settings), associate the Test Settings with Test Plan/Suites and then execute these test cases as part of Test Runs. The execution of a Test Run (whether automated or manual) generates a bunch of diagnostic data, which may be captured either automatically by the system or manually by the tester. This diagnostic data is critical in eliminating the “no repro” bug scenarios between the testers and developers.

    However, the downside of this rich diagnostic data capture is that the system/user generated diagnostic data, over a period of time, can grow at a rapid pace and start taking up database space. With Visual Studio 2010, the database administrator has little or no control over what data gets attached as part of Test Runs – i.e., there are no policy settings he can control to limit the size of the data capture and no retention policy to determine how long to hold this data before initiating a cleanup. In such scenarios, the Admin has no mechanism to:

    1. Determine which set of diagnostic captures is taking up how much space AND

    2. Reclaim the space for runs which are no longer relevant from a business perspective.

    The “Test Attachment Cleaner” power tool fills this void by addressing both of the points above.
