Buck Hodges

Visual Studio Online, Team Foundation Server, MSDN

  • Buck Hodges

    Power Toy: tfpt.exe


    [UPDATE 8/9/2007]  I fixed the broken link to the power tools page. 

    [UPDATE 9/8/2006]  TFPT is now available in its own small download: http://go.microsoft.com/?linkid=5422499!  You no longer need to download the VS SDK.  You can find more information about the September '06 release here.

    Back at the start of October, I wrote about the tfpt.exe power toy.  The beta 3 version has been released with the October SDK release.  In the future, we plan to have a better vehicle for delivering it.

    Here's the documentation, minus the screenshots, in case you are trying to decide whether to download the SDK.  The documentation is included in the SDK release as a Word document, including screenshots of the various dialogs (yes, most commands have a GUI, but you can still use the commands from scripts by specifying the /noprompt option).

    Review

    The only command not documented is the review command, which is very handy for doing code reviews.  When you run "tfpt review" you get a dialog with a list of your pending changes that you can check off as you diff or view each one.

    I hope you find these useful.  Please leave a comment, and let us know what you think.

    Team Foundation PowerToys


    The Team Foundation PowerToys (TFPT) application provides extra functionality for use with the Team Foundation version control system. The Team Foundation PowerToys application is not supported by Microsoft.

    Five separate operations are supported by the TFPT application: unshelve, rollback, online, getcs, and uu. They are all invoked at the command line using the tfpt.exe application. Some of the TFPT commands have graphical interfaces.

    Unshelve (Unshelve + Merge)

    The unshelve operation supported by tf.exe does not allow shelved changes and local changes to be merged together. TFPT’s more advanced unshelve operation allows this to occur under certain circumstances.

    If an item in the local workspace has a pending change that is an edit, and the user uses TFPT to unshelve a change from a shelveset, and that shelved change is also an edit, then the changes can be merged with a three-way merge.
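    As a rough sketch of how a three-way merge combines two edits against a common base, here is a toy line-by-line merge in Python. The `merge3` helper and its assumption that all three versions have the same number of lines are illustrative only; TFPT uses a real diff-based merge tool.

```python
def merge3(base, local, shelved):
    """Toy three-way merge: for each line, keep whichever side changed it.

    Assumes all three versions have the same number of lines; a real
    merge tool first aligns the versions with a diff algorithm.
    """
    merged = []
    for b, l, s in zip(base, local, shelved):
        if l == s:       # both sides agree (or neither changed the line)
            merged.append(l)
        elif l == b:     # only the shelved change touched this line
            merged.append(s)
        elif s == b:     # only the local change touched this line
            merged.append(l)
        else:            # both changed the same line differently: conflict
            raise ValueError("conflict: %r vs %r" % (l, s))
    return merged
```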

    In all other cases where changes exist both in the local workspace and in the shelveset, the user can choose between the local and shelved changes, but no combination of the changes can be made. To invoke the TFPT unshelve tool, execute

    tfpt unshelve

    at the command line. This will invoke the graphical interface for the TFPT unshelve tool:

    Running TFPT Unshelve on a Specified Shelveset

    To skip this dialog, you can specify the shelveset name and owner on the command line, with

    tfpt unshelve shelvesetname;shelvesetowner

    If you are the owner of the shelveset, then specifying the shelveset owner is optional.

    Selecting Individual Items Within a Shelveset for Unshelving

    If you specify a shelveset on the command line as in “Running TFPT Unshelve on a Specified Shelveset,” or if you select a shelveset in the window above and click Details, you are presented with the Shelveset Details window, where you can select individual changes within a shelveset to unshelve.

    You can check and uncheck the boxes beside individual items to mark or unmark them for unshelving. Click the Unshelve button to proceed.

    Unshelve Conflicts

    When you press the Unshelve button, all changes in the shelveset for which there is no conflicting local change will be unshelved. You can see the status of this unshelve process in the Command Prompt window from which you started the TFPT unshelve tool.

    There may, however, be conflicts which must be resolved for the unshelve to proceed. If any conflicts are encountered, the conflicts window is displayed:

    Edit-Edit Conflicts

    To resolve an edit-edit conflict, select the conflict in the list view and click the Resolve button. The Resolve Unshelve Conflict window appears.

    For edit-edit conflicts, there are three possible conflict resolutions. 

    • Taking the local change abandons the unshelve operation for this particular change (it would be as if the change had not been selected for unshelving).
    • Taking the shelved change first undoes the local change, and then unshelves the shelved change. This results in the local change being completely lost.
    • Clicking the Merge button first attempts to auto-merge the two changes together, and if it cannot do so without conflict, attempts to invoke a pre-configured third-party merge tool to merge the changes together. The local change is not lost by choosing to merge – if the merge fails, the local change remains.

    The Auto-Merge All Button

    The Auto-Merge All button is enabled when there are edit-edit conflicts remaining that are unresolved. Clicking the button goes through the edit-edit conflicts and attempts to auto-merge the changes together. For each conflict, if the merge succeeds, then the conflict is resolved. If not, then the conflict is marked as “Requires Manual Merge.” In order to resolve conflicts marked as “Requires Manual Merge,” you must select the conflict and click the Resolve… button. Clicking the Merge button will then start the configured third-party merge tool. If no third-party merge tool is configured, then the conflict must be resolved by selecting to take the local change or take the shelved change.

    Generic Conflicts

    Any other conflict (a local delete with a shelved edit, for example) is a generic conflict that cannot be merged.

    There is no merge option for generic conflicts. You must choose between keeping the local change and taking the shelved change.

    Aborting the Unshelve Process

    Because the unshelving process makes changes to the local workspace, and because the potential exists to discover a problem halfway through the unshelve process, the TFPT Unshelve application makes a backup of the local workspace before starting its execution if there are pending local changes. This backup is stored as a shelveset on the server. In the event of an abort, all local pending changes are undone and the backup shelveset is unshelved to the local workspace. This restores the workspace to the state it was in before the unshelve application was run.

    The backup shelveset is named by adding _backup and then a number to the name of the shelveset that was unshelved. For example, if the shelveset TestSet were unshelved, the backup shelveset would be named TestSet_backup1. Up to 9 backup shelvesets can exist for each shelveset.
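    The naming scheme can be sketched as a small helper. The function name `next_backup_name` is hypothetical, and the real tool's slot-picking logic may differ; this only illustrates the documented naming and nine-slot limit.

```python
def next_backup_name(shelveset, existing):
    """Pick the next free backup name: TestSet -> TestSet_backup1 ... _backup9.

    `existing` is the set of shelveset names already on the server.
    Raises when all nine backup slots are taken (the documented limit).
    """
    for n in range(1, 10):
        candidate = "%s_backup%d" % (shelveset, n)
        if candidate not in existing:
            return candidate
    raise RuntimeError("all 9 backup shelvesets exist for " + shelveset)
```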

    With the backup shelveset, changes made during an unshelve operation can be undone after the unshelve is completed but before the changes are checked in, by undoing all changes in the workspace and then unshelving the backup shelveset:

    tf undo * /r

    tf unshelve TestSet_backup1


    Rollback

    Sometimes it may be necessary to undo a checkin of a changeset. This operation is not directly supported by Team Foundation, but with the TFPT rollback tool you can pend changes which attempt to undo any changes made in a specified changeset.

    Not all changes can be rolled back, but in most scenarios the TFPT rollback command works. In any event, the user is able to review the changes that TFPT pends before checking them in.

    To invoke the TFPT rollback tool, execute

    tfpt rollback

    at the command line. This will invoke the graphical user interface (GUI) for the TFPT rollback tool. Please note that there must not be any changes in the local workspace for the rollback tool to run. Additionally, a prompt will be displayed to request permission to execute a get operation to bring the local workspace up to the latest version.

    The Find Changesets window is presented when the TFPT rollback tool is started. The changeset to be rolled back can be selected from the Find Changesets window.

    Specifying the Changeset on the Command Line

    The Find Changesets window can be skipped by supplying the /changeset:changesetnum command line parameter, as in the following example:

    tfpt rollback /changeset:3

    Once the changeset is selected, either by using the Find Changesets window or specifying a changeset using a command-line parameter, the Roll Back Changeset window is displayed.

    Each change is listed with the type of change that will be counteracted by a rollback change.

    To roll back a:             The tool pends a:

    Add, Undelete, or Branch    Delete
    Edit                        Edit
    Delete                      Undelete
    Rename                      Rename (back to the original name)
    Unchecking a change in the Roll Back Changeset window marks it as a change not to be rolled back. There are cases involving rolling back deletes which may result in unchecked items being rolled back. If this occurs, clear warnings to indicate this are displayed in the command prompt window. If this is unsatisfactory, undo the changes pended by the rollback tool.

    When the changes to roll back have been checked appropriately, pressing the Roll Back button starts the rollback. If no failures or merge situations are encountered, then the changes should be pended and the user returned to the command prompt:

    Merge scenarios can arise when a rollback is attempted on a particular edit change to an item that occurred in between two other edit changes. There are two possible edit rollback scenarios: 

    1. An edit is being rolled back on an item, and the edit to roll back is the latest change to the content of the item. 

    This is the most common case. Most rollbacks are performed on changesets that were just checked in. If the edit was just checked in, it is unlikely that another user has edited it in the intervening time.

    To roll back this change, an edit is pended on the item, and the content of the item is reverted to the content from before the changeset to roll back. 

    2. An edit is being rolled back on an item, and the edit to roll back is not the latest change to the content of the item. 

    This is a three-way merge scenario, with the version to roll back as the base, and the latest version and the previous version as branches. If there are no conflicts, then the changes from the change to roll back (and only the change to roll back) are extracted from the item, preserving the changes that came after the change to roll back. 

    In the event of a merge scenario, the merge window is displayed:

    To resolve a merge scenario, select the item and click the Merge button. An auto-merge will first be attempted, and if it fails, the third-party merge tool (if configured) will be invoked to resolve the merge. If no third-party merge tool is configured, and the auto-merge fails, then the item cannot be rolled back:

    The Auto-Merge All button attempts an auto-merge on each of the items in the merge list, but does not attempt to invoke the third-party merge tool.


    Any changes which fail to roll back will also be displayed in the same window.


    Online

    With Team Foundation, a server connection is necessary to check files in or out, to delete files, to rename files, etc. The TFPT online tool makes it easier to work without a server connection for a period of time by providing functionality that informs the server about changes made in the local workspace.

    Non-checked-out files in the local workspace are by default read-only. The user is expected to check out the file with the tf checkout command before editing the file.

    When working offline with the intent to sync up later by using the TFPT online tool, users must adhere to a strict workflow: 

    • Users without a server connection manually remove the read-only flag from files they want to edit. Non-checked-out files in the local workspace are by default read-only, and when a server connection is available the user must check out the file with the tf checkout command before editing the file. When working offline, the DOS command “attrib -r” should be used.
    • Users without a server connection add and delete files they want to add and delete. If not checked out, files selected for deletion will be read-only and must be marked as writable with “attrib -r” before deleting. Files which are added are new and will not be read-only.
    • Users must not rename files while offline, as the TFPT online tool cannot distinguish a rename from a deletion at the old name paired with an add at the new name.
    • When connectivity is re-acquired, users run the TFPT online tool, which scans the directory structure and detects which files have been added, edited, and deleted. The TFPT online tool pends changes on these files to inform the server what has happened.  
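    The scan in the last step can be approximated by walking the workspace for files whose read-only attribute has been cleared. This is only a sketch of the detection idea; the actual tool also compares what it finds against the server's view of the workspace.

```python
import os
import stat

def find_writable_files(root):
    """Return files under root whose owner-write bit is set.

    Files that were never checked out stay read-only, so a writable
    file is a candidate for a pended edit (or add).
    """
    writable = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            if os.stat(full).st_mode & stat.S_IWUSR:
                writable.append(full)
    return sorted(writable)
```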

    To invoke the TFPT online tool, execute 

    tfpt online

    at the command line. The online tool will begin to scan your workspace for writable files and will determine what changes should be pended on the server.

    By default, the TFPT online tool does not detect deleted files in your local workspace, because to detect deleted files the tool must transfer significantly more data from the server. To enable the detection of deleted files, pass the /deletes command line option.

    When the online tool has determined what changes to pend, the Online window is displayed.

    Individual changes may be deselected here if they are not desired. When the Pend Changes button is pressed, the changes are actually pended in the workspace.

    Important Note: If a file is edited while offline (by marking the file writable and editing it), and the TFPT online tool pends an edit change on it, a subsequent undo will result in the changes to the file being lost. It is therefore not a good idea to try pending a set of changes to go online, decide to discard them (by doing an undo), and then try again, as the changes will be lost in the undo. Instead, make liberal use of the /preview command line option (see below), and pend changes only once.

    Preview Mode

    The Online window displayed above is a graphical preview of the changes that will be pended to bring the workspace online, but a command-line version of this functionality is also available. By passing the /preview and /noprompt options on the command line, a textual representation of the changes that the TFPT online tool thinks should be pended can be displayed.

    tfpt online /noprompt /preview


    The TFPT online tool by default operates on every file in the workspace. Its focus can be more directed (and its speed improved) by including only certain files and folders in the set of items to inspect for changes. Filespecs (such as *.c, or folder/subfolder) may be passed on the command line to limit the scope of the operation, as in the following example:

    tfpt online *.c folder\subfolder

    This command instructs the online tool to process all files with the .c extension in the current folder, as well as all files in the folder\subfolder folder. No recursion is specified. With the /r (or /recursive) option, all files matching *.c in the current folder and below, as well as all files in the folder\subfolder folder and below will be checked. To process only the current folder and below, use

    tfpt online . /r


    Many build systems create log files and/or object files in the same directory as source code which is checked in. It may become necessary to filter out these files to prevent changes from being pended on them. This can be achieved through the /exclude:filespec1,filespec2,… option.

    With the /exclude option, certain filemasks may be filtered out, and any directory name specified will not be entered by the TFPT online tool. For example, there may be a need to filter out log files and any files in object directories named “obj”.

    tfpt online /exclude:*.log,obj

    This will skip any file matching *.log, and any file or directory named obj.
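    The matching rule can be sketched as follows. The `is_excluded` helper is hypothetical, and matching each mask against every path component is one plausible reading of the behavior described above, not TFPT's actual matching code.

```python
import fnmatch

def is_excluded(path, excludes):
    """Check a path against /exclude-style filespecs.

    A mask such as '*.log' is matched against the file name, and a bare
    name such as 'obj' excludes any file or directory component with
    that name.
    """
    parts = path.replace("\\", "/").split("/")
    for spec in excludes:
        if any(fnmatch.fnmatch(part, spec) for part in parts):
            return True
    return False
```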

    GetCS (Get Changeset)

    The TFPT GetCS tool gets all the items listed in a changeset at that changeset version.

    This is useful in the event that a coworker has checked in a change which you need to have in your workspace, but you cannot bring your entire workspace up to the latest version. You can use the TFPT GetCS tool to get just the items affected by that changeset, without having to inspect the changeset, determine the files listed in it, and manually list those files to a tf.exe get command.

    There is no graphical user interface (GUI) for the TFPT GetCS tool. To invoke the TFPT GetCS tool, execute

    tfpt getcs /changeset:changesetnum

    at the command line, where changesetnum is the number of the changeset to get.

    UU (Undo Unchanged)

    The TFPT UU tool removes pending edits from files which have not actually been edited.

    This is useful in the event that you check out fifteen files for edit, but only actually make changes to three of them. You can back out your edits on the other twelve files by running the TFPT UU tool, which compares hashes of the files in the local workspace to hashes the server has to determine whether or not the file has actually been edited.
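    The hash comparison at the heart of UU can be sketched like this (the helper name is illustrative; the server keeps an MD5 hash of each file version's content):

```python
import hashlib

def is_unchanged(local_content, server_md5_hex):
    """Compare the MD5 of the local file's content to the server's hash.

    If the hashes match, the pending edit did not actually change the
    file and can safely be undone.
    """
    return hashlib.md5(local_content).hexdigest() == server_md5_hex
```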

    There is no graphical user interface (GUI) for the TFPT UU tool. To invoke the TFPT UU tool, execute

    tfpt uu

    at the command line. You can also pass the /changeset:changesetnum argument to compare the files in the workspace to a different version.


    Help for each TFPT tool, as well as all its command-line switches, is available at the command line by running

    tfpt help

    or for a particular command, with

    tfpt help <command>


    tfpt <command> /?

  • Buck Hodges

    Brian Harry on when to ship TFS

    About halfway through a thread on unlocking files locked by another user, Brian wrote a long response in the Team Foundation Forum about deciding what changes to make at this point and knowing when to ship.  It's a good read.  I expect that Brian will likely turn it into a blog post at some point.
  • Buck Hodges

    Validating XML characters in SOAP messages


    I've written about using the SoapHttpClientProtocol subclasses generated by wsdl.exe several times over the last year, including handling authentication, HTTP response codes, and setting timeouts properly.  Today I needed to change the code in TFS to better handle characters that are not allowed in XML.

    The problem is that if you have a method on your web service that takes a String parameter, someone may call that method with a string that contains characters that are not allowed in XML.  That input may come from a command line switch or a text box in a GUI.

    The XmlWriter used by SoapHttpClientProtocol is XmlTextWriter.  XmlTextWriter doesn't do any character validation.  If the string passed to WriteString() includes characters that are not valid for XML, your XML output will include them as well.  Characters below 32 (except for tab, carriage return, and new line), the UTF-8 BOM, and invalid surrogate pairs are not allowed by the XML standard.
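    The XML 1.0 character rule just described can be written down directly. This is a sketch of the check in Python, not the framework's implementation:

```python
def is_valid_xml_char(ch):
    """True if ch is allowed in an XML 1.0 document.

    Allowed: tab, LF, CR, 0x20-0xD7FF, 0xE000-0xFFFD, and the astral
    planes.  This excludes the other control characters below 32, lone
    surrogates (0xD800-0xDFFF), and 0xFFFE/0xFFFF.
    """
    cp = ord(ch)
    return (cp in (0x9, 0xA, 0xD)
            or 0x20 <= cp <= 0xD7FF
            or 0xE000 <= cp <= 0xFFFD
            or 0x10000 <= cp <= 0x10FFFF)
```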

    The XmlReader used by the ASP.NET web services does do character validation.  If it finds an invalid XML character, the web service will respond with HTTP 400 Bad Request.  That doesn't help the user figure out what's going on.

    Elena Kharitidi suggested overriding the GetWriterForMessage() method from SoapHttpClientProtocol in the subclass that was generated by wsdl.exe and providing a character-validating XmlWriter.

    The documentation shows an example of creating a subclass of XmlTextWriter to check the characters.  However, it would be better to be able to use a framework class to do it without rolling our own.  Fortunately, there is such a class in the framework.

    With a little poking around, I found the XmlCharCheckingWriter class that is internal to the framework.  Now we just need to get the framework to give us an instance of that class.  A little more poking around and experimentation resulted in the piece of code shown below.

    If you run it under the debugger and put a breakpoint on the call to base.GetWriterForMessage(), you'll see that the base SoapHttpClientProtocol method returns an XmlTextWriter.  If you step down to the XmlWriter.Create() call, you'll see that the framework gives us the XmlCharCheckingWriter instance that we want in response to the CheckCharacters setting being true.

    Now, if you add the following code to your wsdl.exe-generated subclass of SoapHttpClientProtocol, you'll get an ArgumentException on the client when trying to write the invalid XML in the SOAP message.  The exception message will state that there is an invalid character.  The result is a significant improvement over getting a generic HTTP 400 Bad Request from the web service.

    // Override this method in order to put in character validation.
    protected override XmlWriter GetWriterForMessage(SoapClientMessage message,
                                                     int bufferSize)
    {
        XmlWriter writer = base.GetWriterForMessage(message, bufferSize);

        // Choose the encoding the same way the framework code does.
        Encoding encoding = RequestEncoding != null ? RequestEncoding :
                                                      new UTF8Encoding(false);

        // We want the character validation to be done on the client side
        // rather than getting an obscure HTTP 400 Bad Request message
        // from the server (the XmlReader used by the web services does 
        // character validation, while the writer used in the base class
        // does not).
        // We create this second XmlWriter to get an XmlCharCheckingWriter
        // instance.  The Create(XmlWriter, XmlWriterSettings) code path 
        // does that (we don't need the overhead in an XmlWellformedWriter).
        XmlWriterSettings xws = new XmlWriterSettings();
        xws.Encoding = encoding;
        xws.Indent = false;
        xws.NewLineHandling = NewLineHandling.None;
        xws.CheckCharacters = true;      // make sure each char is valid for XML
        writer = XmlWriter.Create(writer, xws);
        return writer;
    }
  • Buck Hodges

    Getting email when someone overrides a policy


    James Manning today pointed out a post from Marcel de Vries showing how to register for email when someone checks in after overriding a policy failure.  While any policy failures and override comment are included in the standard check-in email, this allows you to get an email specifically when someone overrides a policy failure.  It's only a few lines of code, and it uses the built-in event system support for sending emails.  He even provides a link to a zip file with the solution.

    How to receive email on a Team Foundation check in policy violation

    [Update] He mentions in the post that the delivery of the events is guaranteed: "the implementation also has a guaranteed delivery system using SQL server."  Though the events do get queued for delivery in a table in the SQL DB, there's no guarantee that an event gets recorded, because the recording of the event is not transactional with respect to the work that generated the event.  It's pretty robust, but it's not guaranteed.

  • Buck Hodges

    Beta 3 refresh released


    Beta 3 of TFS has now been released and should be available on MSDN today (or very soon :-).  The differences between beta 3 and beta 3 refresh are small for most things.  Jeff Beehler wrote about some of the beta 3 refresh differences two weeks ago.  The most important reason for the beta 3 refresh is that it is built against and uses all of the final release versions of Visual Studio 2005, .NET 2.0, and SQL Server 2005.  Beyond that, many of the fixes in it were to enhance the setup experience and to address a few localization issues.  So, if you are wondering whether some particular bug is fixed, the answer is most likely to be no, with the exception of some issues that Jeff mentioned.

    If you have beta 3 installed, you'll want to read the following guide to successfully upgrade without losing your existing data.

    Migrating to Visual Studio 2005 Team Foundation Server Beta 3 Refresh

  • Buck Hodges

    Changing the encoding of a pending add works for RTM

    I had mentioned in my September 10th post on file type detection that you couldn't change the encoding on a pending add without undoing the add and re-adding it.  While it won't work in beta 3 or beta 3 refresh, you'll be able to change the encoding on a pending add in the next public release.  That means you can use the edit command with the /type option from the command line or the properties dialog in the GUI to tweak the encoding before checking it in.
  • Buck Hodges

    Displaying the sizes and dates of files in the server


    Have you wished that the output of the dir command from tf.exe would show you the dates and sizes of the files like cmd.exe's dir command?  Even though tf.exe won't do that in version 1, the version control API provides the call necessary.  This little app is similar to my last post on displaying labels on a file.  You can find a simple example of creating a workspace, pending changes, and so forth in my September API example.

    The GetItems() method is the key part of the app.  Here's the declaration on the VersionControlServer class.

    public ItemSet GetItems(String path, RecursionType recursion)

    For this overload of the GetItems() method, there's no version parameter, so it defaults to get the Item objects for the latest version in the repository.  There is another overload that takes the version, and you could pass "new WorkspaceVersionSpec(wsInfo.Name, wsInfo.OwnerName)" as the version to get the Item objects corresponding to the versions in the workspace.

    The first argument is the path on which we want data.  That can be either a local path or a server path.

    The recursion parameter specifies how deep we want to go.  If the path is a directory, specifying RecursionType.OneLevel retrieves only the items directly contained in that directory, just like specifying the wildcard "*" would.  If the path is a file, there is no difference between one level of recursion and none because a file can't have children.  The default for this app is one level of recursion, which is the same as what both cmd.exe's dir command and tf.exe's dir command use.

    If the user specifies /r, the app passes RecursionType.Full.  For a directory, that will return all of its descendants.  If the path doesn't match a directory, the server will recursively find all items under path that match.  The path it uses is the directory portion of the path, and the pattern is the file name portion of the path.  So, if the user specifies "c:\project\readme.txt /r" the server will return all files called readme.txt underneath c:\project, recursively.
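    The directory/pattern split can be sketched against a flat list of paths. The `recursive_find` helper and the forward-slash paths are illustrative only; the server performs this matching against its own item tree.

```python
import fnmatch
import os

def recursive_find(path, all_items):
    """Split a non-directory path into (directory, pattern) and match the
    pattern against every item under that directory, recursively.

    `all_items` is a flat list of paths standing in for server items.
    """
    directory, pattern = os.path.split(path)
    prefix = directory.rstrip("/") + "/"
    return [item for item in all_items
            if item.startswith(prefix)
            and fnmatch.fnmatch(os.path.basename(item), pattern)]
```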

    In the app, we only want the items for display, which is the Items property on the ItemSet object returned by GetItems().  The other two properties on ItemSet, QueryPath and Pattern, indicate how the server used the specified path to do the matching.  In our previous example of "c:\project\readme.txt /r" the result would be QueryPath set to the server path corresponding to c:\project and Pattern set to readme.txt.

    Here's an example of running the app.

    D:\ws1>D:\code\projects\DirSize\DirSize\bin\Debug\DirSize.exe . /r
    10/25/2005 11:40:25 AM    <DIR>             $/testproj
    10/25/2005 03:59:50 PM    <DIR>             $/testproj/A
    10/25/2005 03:59:50 PM    <DIR>             $/testproj/B
    10/26/2005 11:04:34 PM                 7996 $/testproj/out.txt
    10/25/2005 04:55:50 PM                    6 $/testproj/A/a.txt

    2 files, 3 folders, 8002 bytes

    To build it, you can create a Windows console app in Visual Studio, drop this code into it, and add the following references to the VS project.


    using System;
    using System.Globalization;
    using System.IO;
    using Microsoft.TeamFoundation;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.VersionControl.Client;
    using Microsoft.TeamFoundation.VersionControl.Common;

    namespace DirSize
    {
        class Program
        {
            static void Main(string[] args)
            {
                // Check and get the arguments.
                String path;
                RecursionType recursion;
                VersionControlServer sourceControl;
                GetPathAndRecursion(args, out path, out recursion, out sourceControl);

                Item[] items = null;
                try
                {
                    // Get the latest version of the information for the items.
                    ItemSet itemSet = sourceControl.GetItems(path, recursion);
                    items = itemSet.Items;
                }
                catch (TeamFoundationServerException e)
                {
                    // We couldn't contact the server, the item wasn't found,
                    // or there was some other problem reported by the server,
                    // so we stop here.
                    Console.Error.WriteLine(e.Message);
                    Environment.Exit(1);
                }

                if (items.Length == 0)
                {
                    Console.WriteLine("There are no items for " + path);
                }
                else
                {
                    Array.Sort(items, Item.Comparer);

                    long totalBytes = 0;
                    int numFiles = 0, numFolders = 0;
                    String format = "{0,-25} {1,-8} {2, 8} {3}";
                    foreach (Item item in items)
                    {
                        // In addition to the information printed, the Item
                        // object has the changeset version and MD5 hash of the
                        // file content as properties on item.
                        if (item.ItemType == ItemType.File)
                        {
                            numFiles++;
                            totalBytes += item.ContentLength;
                            Console.WriteLine(format,
                                              item.CheckinDate.ToLocalTime().ToString("G"),
                                              String.Empty, item.ContentLength,
                                              item.ServerItem);
                        }
                        else
                        {
                            numFolders++;
                            Console.WriteLine(format,
                                              item.CheckinDate.ToLocalTime().ToString("G"),
                                              "<DIR>", String.Empty, item.ServerItem);
                        }
                    }

                    Console.WriteLine();
                    Console.WriteLine("{0} files, {1} folders, {2} bytes",
                                      numFiles, numFolders, totalBytes);
                }
            }

            private static void GetPathAndRecursion(String[] args, out String path,
                                                    out RecursionType recursion,
                                                    out VersionControlServer sourceControl)
            {
                if (args.Length > 2 || args.Length == 1 && args[0] == "/?")
                {
                    Console.WriteLine("Usage: dirsize");
                    Console.WriteLine("       dirsize [path] [/r]");
                    Console.WriteLine();
                    Console.WriteLine("With no arguments, shows the size information for the current directory.");
                    Console.WriteLine("If a path is specified, it shows the size information for that path.");
                    Console.WriteLine("If /r is specified, all of the children of the path will be included.");
                    Console.WriteLine();
                    Console.WriteLine("Examples: dirsize $/secret");
                    Environment.Exit(1);
                }

                // Figure out the server based on either the argument or the
                // current directory.
                WorkspaceInfo wsInfo = null;
                if (args.Length < 1 || args.Length == 1 && args[0] == "/r")
                {
                    path = Environment.CurrentDirectory;
                }
                else
                {
                    path = args[0];
                    try
                    {
                        if (!VersionControlPath.IsServerItem(path))
                        {
                            wsInfo = Workstation.Current.GetLocalWorkspaceInfo(path);
                        }
                    }
                    catch (Exception e)
                    {
                        // The user provided a bad path argument.
                        Console.Error.WriteLine(e.Message);
                        Environment.Exit(1);
                    }
                }
                if (wsInfo == null)
                {
                    wsInfo = Workstation.Current.GetLocalWorkspaceInfo(Environment.CurrentDirectory);
                }

                // Stop if we couldn't figure out the server.
                if (wsInfo == null)
                {
                    Console.Error.WriteLine("Unable to determine the server.");
                    Environment.Exit(1);
                }

                TeamFoundationServer tfs =
                    TeamFoundationServerFactory.GetServer(wsInfo.ServerName);  // RTM: wsInfo.ServerUri.AbsoluteUri
                sourceControl = (VersionControlServer)tfs.GetService(typeof(VersionControlServer));

                // Pick up the recursion, if supplied.  By default, we want the
                // item and its immediate children, if it is a folder.  If the
                // item is a file, then only the file will be returned.
                recursion = RecursionType.OneLevel;
                if (args.Length == 1 && args[0] == "/r" ||
                    args.Length == 2 && args[1] == "/r")
                {
                    recursion = RecursionType.Full;
                }
            }
        }
    }

    [Update 11/4/2005] I added some explanation of the version used with the GetItems() call and the HashValue property of the Item class.
  • Buck Hodges

    Displaying the labels on a file, including label comments


    Unfortunately, there's not a fast, efficient way to see the list of labels in the system with the full comment without also seeing a list of all of the files included in a label.  You also can't efficiently answer the question, "What labels involve foo.cs?"  While this won't be changed for v1, you can certainly do it using code.  I mentioned on the TFS forum that I'd try to put together a piece of code to do this.  The result is the code below.

    The code to do this is really simple, but I ended up adding more to it than I originally intended.  All that's really necessary here is a call to QueryLabels() to get the information we need.

    Let's look at the QueryLabels() call in a little detail, since it is the heart of the app.  Here is the method declaration from the VersionControlServer class.

    public VersionControlLabel[] QueryLabels(String labelName,
                                             String labelScope,
                                             String owner,
                                             bool includeItems,
                                             String filterItem,
                                             VersionSpec versionFilterItem)

    By convention, methods that begin with "Query" in the source control API allow you to pass null to mean "give me everything."  In the code below, I don't want to filter by labelName or owner, so I set those to null to include everything.

    If the user specified a server path for the scope (the scope is always a server path, never a local path), we'll use it; otherwise we'll use the root ($/).  The scope of a label is, effectively, the part of the tree where it has ownership of that label name.  In other words, by specifying the label scope, $/A and $/B can have separate labels named Foo, but no additional label named Foo can be created under $/A or under $/B.  For this program, setting the scope simply narrows the part of the tree it will include in the output.  For example, running this with a scope of $/A would show only one label called Foo, but running it with $/ as the scope (or omitting the scope) would result in two Foo labels being printed.
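    The scoping rule above can be sketched with a toy model (Python, purely illustrative; the real uniqueness check happens on the TFS server, and the key/query shapes here are assumptions for the sketch):

```python
class LabelStore:
    """Toy model of label-name uniqueness per scope (illustrative only)."""

    def __init__(self):
        self.labels = {}  # (name, scope) -> label metadata

    def create(self, name, scope):
        # Within a single scope, a label name must be unique.
        if (name, scope) in self.labels:
            raise ValueError(f"label {name} already exists at scope {scope}")
        self.labels[(name, scope)] = {"name": name, "scope": scope}

    def query(self, scope_filter):
        # Narrowing the scope narrows the part of the tree in the output.
        prefix = scope_filter.rstrip("/") + "/"
        return [meta for (name, scope), meta in self.labels.items()
                if scope == scope_filter or scope.startswith(prefix)]

store = LabelStore()
store.create("Foo", "$/A")
store.create("Foo", "$/B")      # fine: a different scope
print(len(store.query("$/")))   # 2 -- both Foo labels
print(len(store.query("$/A")))  # 1 -- only the Foo scoped at $/A
```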


    D:\ws1>tf label Foo@$/testproj/A A
    Created label

    D:\ws1>tf label Foo@$/testproj/B B
    Created label

    Foo (10/25/2005 4:00 PM)
    Foo (10/25/2005 4:00 PM)

    The most important parameter here is actually includeItems.  By setting this parameter to false, we'll get the label metadata without getting the list of files and folders that are in the label.  This saves both a ton of bandwidth as well as load on the server for any query involving real-world labels that include many thousands of files.

    The remaining parameters are filterItem and versionFilterItem.  The filterItem parameter allows you to specify a server or local path whereby the query results will only include labels involving that file or folder.  It allows you to answer the question, "What labels have been applied to file foo.cs?"  The versionFilterItem is used to specify what version of the item had the specified path.  It's an unfortunate complexity that's due to the fact that we support rename (e.g., A was called Z at changeset 12, F at changeset 45, and A at changeset 100 and beyond).  Before your eyes glaze over (they haven't already, right?), I just set that parameter to latest.
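    The rename example can be made concrete with a toy lookup (Python, illustrative only, not the TFS API): given an item's rename history, find the path it had at a given changeset, which is exactly the ambiguity versionFilterItem resolves. The history data below is invented to match the example in the text.

```python
def path_at_changeset(rename_history, changeset):
    """rename_history: list of (changeset, server_path) sorted by changeset,
    recording the path the item had from that changeset onward."""
    path = None
    for cs, server_path in rename_history:
        if cs <= changeset:
            path = server_path
        else:
            break
    return path

# The item was called Z at changeset 12, F at changeset 45, and A from 100 on.
history = [(12, "$/proj/Z"), (45, "$/proj/F"), (100, "$/proj/A")]
print(path_at_changeset(history, 12))    # $/proj/Z
print(path_at_changeset(history, 60))    # $/proj/F
print(path_at_changeset(history, 150))   # $/proj/A
```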

    Here's an example of using the program with the tree mentioned earlier.  I modified the Foo label on A to have a comment, so it has a later modification time.

    D:\ws1>tf label Foo@$/testproj/A /comment:"This is the first label I created."
    Updated label Foo@$/testproj/A

    Foo (10/25/2005 4:05 PM)
       Comment: This is the first label I created.
    Foo (10/25/2005 4:00 PM)

    Then I added a file under A, called a.txt, and modified the label to include it.  Running the app on A\a.txt, we see that it is only involved in one of the two labels in the system.

    D:\ws1>tf label Foo@$/testproj/A A\a.txt
    Updated label

    D:\ws1>d:\LabelHistory\LabelHistory\bin\Debug\LabelHistory.exe A\a.txt
    Foo (10/25/2005 4:56 PM)
       Comment: This is the first label I created.

    To build it, you can create a Windows console app in Visual Studio, drop this code into it, and add the following references to the VS project (this list follows the code's using directives).

    • Microsoft.TeamFoundation.Client
    • Microsoft.TeamFoundation.Common
    • Microsoft.TeamFoundation.VersionControl.Client
    • Microsoft.TeamFoundation.VersionControl.Common

    using System;
    using System.IO;
    using Microsoft.TeamFoundation;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.VersionControl.Client;
    using Microsoft.TeamFoundation.VersionControl.Common;

    namespace LabelHistory
    {
        class Program
        {
            static void Main(string[] args)
            {
                // Check and get the arguments.
                String path, scope;
                VersionControlServer sourceControl;
                GetPathAndScope(args, out path, out scope, out sourceControl);

                // Retrieve and print the label history for the file.
                VersionControlLabel[] labels = null;
                try
                {
                    // The first three arguments here are null because we do not
                    // want to filter by label name, scope, or owner.
                    // Since we don't need the server to send back the items in
                    // the label, we get much better performance by omitting
                    // those through setting the fourth parameter to false.
                    labels = sourceControl.QueryLabels(null, scope, null, false,
                                                       path, VersionSpec.Latest);
                }
                catch (TeamFoundationServerException e)
                {
                    // We couldn't contact the server, the item wasn't found,
                    // or there was some other problem reported by the server,
                    // so we stop here.
                    Console.Error.WriteLine(e.Message);
                    Environment.Exit(1);
                }

                if (labels.Length == 0)
                {
                    Console.WriteLine("There are no labels for " + path);
                }
                else
                {
                    foreach (VersionControlLabel label in labels)
                    {
                        // Display the label's name and when it was last modified.
                        Console.WriteLine("{0} ({1})", label.Name,
                                          label.LastModifiedDate.ToLocalTime());

                        // For labels that actually have comments, display it.
                        if (label.Comment.Length > 0)
                        {
                            Console.WriteLine("   Comment: " + label.Comment);
                        }
                    }
                }
            }

            private static void GetPathAndScope(String[] args,
                                                out String path, out String scope,
                                                out VersionControlServer sourceControl)
            {
                // This little app takes either no args or a file path and optionally a scope.
                if (args.Length > 2 ||
                    args.Length == 1 && args[0] == "/?")
                {
                    Console.WriteLine("Usage: labelhist");
                    Console.WriteLine("       labelhist path [label scope]");
                    Console.WriteLine("With no arguments, all label names and comments are displayed.");
                    Console.WriteLine("If a path is specified, only the labels containing that path");
                    Console.WriteLine("are displayed.");
                    Console.WriteLine("If a scope is supplied, only labels at or below that scope");
                    Console.WriteLine("will be displayed.");
                    Console.WriteLine("Examples: labelhist c:\\projects\\secret\\notes.txt");
                    Console.WriteLine("          labelhist $/secret/notes.txt");
                    Console.WriteLine("          labelhist c:\\projects\\secret\\notes.txt $/secret");
                    Environment.Exit(1);
                }

                // Figure out the server based on either the argument or the
                // current directory.
                WorkspaceInfo wsInfo = null;
                if (args.Length < 1)
                {
                    path = null;
                }
                else
                {
                    path = args[0];

                    try
                    {
                        if (!VersionControlPath.IsServerItem(path))
                        {
                            wsInfo = Workstation.Current.GetLocalWorkspaceInfo(path);
                        }
                    }
                    catch (Exception e)
                    {
                        // The user provided a bad path argument.
                        Console.Error.WriteLine(e.Message);
                        Environment.Exit(1);
                    }
                }

                if (wsInfo == null)
                {
                    wsInfo = Workstation.Current.GetLocalWorkspaceInfo(Environment.CurrentDirectory);
                }

                // Stop if we couldn't figure out the server.
                if (wsInfo == null)
                {
                    Console.Error.WriteLine("Unable to determine the server.");
                    Environment.Exit(1);
                }

                TeamFoundationServer tfs =
                    TeamFoundationServerFactory.GetServer(wsInfo.ServerName);  // RTM: wsInfo.ServerUri.AbsoluteUri);
                sourceControl = (VersionControlServer)tfs.GetService(typeof(VersionControlServer));

                // Pick up the label scope, if supplied.
                scope = VersionControlPath.RootFolder;
                if (args.Length == 2)
                {
                    // The scope must be a server path, so we convert it here if
                    // the user specified a local path.
                    if (!VersionControlPath.IsServerItem(args[1]))
                    {
                        Workspace workspace = wsInfo.GetWorkspace(tfs);
                        scope = workspace.GetServerItemForLocalItem(args[1]);
                    }
                    else
                    {
                        scope = args[1];
                    }
                }
            }
        }
    }

    [Update 10/26] I added Microsoft.TeamFoundation.Common to the list of assemblies to reference.

    [Update 7/12/06]  Jeff Atwood posted a VS solution containing this code and a binary.  You can find it at the end of http://blogs.vertigosoftware.com/teamsystem/archive/2006/07/07/Listing_all_Labels_attached_to_a_file_or_folder.aspx.

  • Buck Hodges

    Web Load Testing webcast Tuesday (Oct. 25) at 1:00 pm PST


    If you want to learn more about web load testing in VSTS, you'll want to check out Ed Glas' webcast on Tuesday, Oct. 25.  Brian Harry has been using VSTS load testing developed by Ed's group to answer the question, "How many users will your Team Foundation Server support?"  Find out how you can do the same for your own web site or web service.

    MSDN Webcast: Load and Web Testing with Microsoft Visual Studio 2005 Team System (Level 200)    

    Start Time:  Tuesday, October 25, 2005 1:00 PM (GMT-08:00) Pacific Time (US & Canada)
    End Time:    Tuesday, October 25, 2005 2:00 PM (GMT-08:00) Pacific Time (US & Canada)

    Event Description
    Products: Visual Studio
    Recommended Audience: Developer
    Language: English-American
    Description: By using Microsoft Visual Studio 2005 Team System as a platform, you can better manage the software development life cycle. You have the flexibility to customize and extend this platform to meet organizational needs. In this webcast, gain a general understanding of the Web and load testing features in Visual Studio 2005.

    Presenter: Ed Glas, Group Manager, Microsoft Corporation

  • Buck Hodges

    Team Foundation Beta 3 Virtual PC is headed your way

    The beta 2 Virtual PC image was hugely popular.  The new Team Foundation beta 3 VPC image is now making its way up to MSDN.  So, later today or Monday you'll be able to download and run a single-server-and-client beta 3.  Just make sure you have plenty of RAM (say 2 GB or more), since you'll be running everything on one machine.
  • Buck Hodges

    Beta 3 known bug when comparing large files: "Invalid access code (bad parameter)."


    In the forum, Carl Daniel reported getting the message "Invalid access code (bad parameter)" when comparing two different versions of a file in Team Foundation Source Control Beta 3.

    Unfortunately, this is a bug that was discovered and fixed after beta 3 was released.  The problem is that large files (on the order of 500 KB) will crash the diffmerge.exe tool.  I'm not certain of the exact size that begins to trigger the problem.

    If you hit this bug, you'll need to work around it by using another diff tool until a newer public release fixes the problem (the upcoming beta 3 refresh will still have this bug).  Of course, you could also reduce the size of the file, but that's not often an option.

  • Buck Hodges

    September dogfood statistics


    You can find the latest dogfood statistics on Brian's blog.

    [Update 10/12]  John Lawrence also has the statistics as usual, including Excel charts.  I saw Brian's, and I didn't think to check there.

  • Buck Hodges

    SQL error in get with beta 3


    On the forum, a couple of users have run into this issue with get, so I thought I would mention it here.

    There's a known issue in beta 3 where mapping a server path to a local path that exceeds the NTFS limit results in a SQL error message.  The RTM code will give you a nice message to that effect rather than the SQL error message.

    A database error occurred (SQL error 8152) ---> String or binary data would be truncated.

    MyServer.TFSVersionControl..prc_Get: Database Update Failure - Error 8152 executing EXECUTESQL statement for #versionedItems

    The statement has been terminated.

    If you run into this problem, you can fix it by editing your workspace mappings to change the local path to be shorter.

    For example, if you have "$/project/some_path" mapped to "c:\documents and settings\user\visual studio projects\some_path" then change the local path to something shorter, such as "c:\projects\some_path" and then run get again.  To modify your mappings in VS, use File -> Source Control -> Workspaces, select your workspace, and click Edit.
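    You can estimate whether a mapping will hit the limit before running get. A rough sketch (Python; 260 is the classic Windows MAX_PATH, used here as an assumed limit, and the mapping translation is a simplification of what the client does):

```python
MAX_PATH = 260  # classic Windows path limit (assumption for illustration)

def mapped_local_path(server_item, server_root, local_root):
    """Translate a server path under server_root to its local path under local_root."""
    relative = server_item[len(server_root):].lstrip("/")
    return local_root.rstrip("\\") + "\\" + relative.replace("/", "\\")

def fits(server_item, server_root, local_root, limit=MAX_PATH):
    return len(mapped_local_path(server_item, server_root, local_root)) <= limit

long_root = "c:\\documents and settings\\user\\visual studio projects\\some_path"
short_root = "c:\\projects\\some_path"
item = "$/project/some_path/" + "sub/" * 55 + "file.cs"   # a deeply nested item
print(fits(item, "$/project/some_path", long_root))   # False -- too long
print(fits(item, "$/project/some_path", short_root))  # True  -- shorter root fits
```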

  • Buck Hodges

    Ed Hintz has started blogging

    Ed Hintz has started blogging with a post that describes some of the differences between working with SourceSafe and working with TFS.  Ed's the dev lead for the source control client team and is the person to whom I report.  So, check it out and be sure to let him know about the things you do and don't like in the TFS source control integration for VS 2005.  Your feedback will not likely change v1 at this point, but if you want to influence the next release, now is the time to speak up.
  • Buck Hodges

    TFS version control command line: tf.exe


    For folks who like command lines or those who've found the typical command line experience a little spartan, you should try tf.exe.

    One interesting feature of our command line is that it uses dialogs for a number of commands when you are running it interactively (i.e., you don't specify /noprompt or /i).  For instance, you get the same check-in dialog from the checkin command in tf.exe as you get from VS.

    The following commands bring up the same dialogs you'd find in VS.

    • changeset
    • checkin
    • difference (runs the external diff viewer)
    • get (takes you to the conflict dialog as necessary)
    • history
    • resolve
    • shelve
    • unshelve
    • view (runs the associated editor)
    • workspace

    That makes working with the command line a lot more convenient than you might otherwise think.  You can find the command line docs on MSDN at http://msdn2.microsoft.com/en-us/library/cc31bk2e(en-us,vs.80).aspx.

  • Buck Hodges

    Brian Harry posts about the changes between beta 2 and beta 3


    A while back I wrote a little about what's changed since beta 2 and the release of beta 3.

    Brian Harry, the person who heads Team Foundation (he is the Product Unit Manager), has just written a post about what changed.  In What's new with TFS Beta 3?, he covers the whole product, as you might expect.  You'll want to read it.

    One of the things he mentions is something new, tfpt.exe.  That is a power toy that provides some very useful features that didn't make it into the regular product.  We've been using it internally now for a little while.  I'll post more about it when it is available in the next couple of weeks.  Its features include the following.

    • merging changes in a shelveset with changes in your workspace
    • rolling back a changeset
    • checking out all writable files in your workspace that aren't checked out
    • a dialog focused on code reviews
    • undoing unchanged files
    • getting just the files in a particular changeset
  • Buck Hodges

    Using Source Code Control in Team Foundation


    Chris Menegay's article on MSDN, Using Source Code Control in Team Foundation, provides a good overview of some of the major features.

    As a part of Visual Studio 2005 Team System and the new Team Foundation Server, Microsoft is providing a true enterprise-class source code control system. Team Foundation Version Control (TFVC) is provided as part of Team Foundation Server and offers integrated source control for Visual Studio 2005. Make no mistake about it: TFVC does not share any heritage with Visual SourceSafe. TFVC was written from the ground up to solve limitations of using VSS on large development projects. Instead of relying on a file system as the repository, TFVC leverages Microsoft SQL Server 2005 as a robust, scalable, high-performance storage mechanism.

    Team Foundation Version Control provides all of the basic functionality of other source control mechanisms and most of the enhancements listed for Visual SourceSafe 2005. In addition, TFVC provides new advanced features, including shelving, check-in policies, and integration with the new work item tracking system. These features are described later in this article.

  • Buck Hodges

    VSS converter for TFS beta 3

    Akash Maheshwari, PM for the source control converters for TFS, has posted a set of articles on using the VSS converter with TFS beta 3.  You'll want to check that out if you are converting from VSS, because that's the most up-to-date source of information.
  • Buck Hodges

    Why are there two Source Control entries on the File menu?


    If you add the Code Analysis check-in policy to your Team Project, you'll see two entries for Source Control on the File menu in VS, as shown below.

    All of the standard menu items for TFS source control are under the second Source Control entry.  The issue is cosmetic and is related to a change we made in the TFS source control provider used in VS.  This should be fixed by the time version 1 is released.

    On a related note, Eric Lee has a post regarding an issue with solutions that were bound to source control in TFS beta 2.  If you have a solution that was originally bound to source control in beta 2, you can remove the bindings by editing the .sln file, as he describes.  Alternatively, if you open your solution without editing it, you should get a dialog stating that the "associated source control plug-in is not installed or could not be initialized."  Choosing to "permanently remove source control association bindings" will also remove the bindings from the solution file.

    After the solution file no longer has the old bindings, you can then re-bind the solution using File -> Source Control -> Change Source Control if it's already checked in or right-click on the solution and choose Add to Source Control to put it in TFS beta 3.

  • Buck Hodges

    The 10,000th Changeset


    Back on July 13, I wrote about the dates of changeset "milestones" in our dogfood system.  Today we hit changeset 10,000.  The time between 9000 and 10000 was longer than usual because of check-in restrictions in place leading up to beta 3 (bug fixes had to be individually reviewed and approved in ask mode).

    Here's the full list.  The rate of changesets being created should increase as we continue to add more teams to the system.

    Changeset 10000 was checked in on September 26
    Changeset 9000 was checked in on September 2
    Changeset 8000 was checked in on August 16
    Changeset 7000 was checked in on July 30
    Changeset 6000 was checked in on July 15
    Changeset 5000 was checked in on June 29
    Changeset 4000 was checked in on June 13
    Changeset 3000 was checked in on May 20
    Changeset 2000 was checked in on April 26
    Changeset 1000 was checked in on March 15
    Changeset 1 (creation of the server - that upgrade started with a fresh server installation) was checked in on December 10, 2004
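    The check-in rate between milestones falls straight out of the dates above; a quick sketch (Python, dates copied from the list, all in 2005):

```python
from datetime import date

# Milestone changeset -> date checked in, from the list above.
milestones = {
    10000: date(2005, 9, 26),
    9000:  date(2005, 9, 2),
    8000:  date(2005, 8, 16),
}

def rate(a, b):
    """Average changesets per day between two milestone changesets."""
    days = (milestones[b] - milestones[a]).days
    return (b - a) / days

print((milestones[10000] - milestones[9000]).days)  # 24 days for the last thousand
print(round(rate(9000, 10000), 1))                  # 41.7 per day
print(round(rate(8000, 9000), 1))                   # 58.8 per day before restrictions
```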

  • Buck Hodges

    Active Directory is no longer required with Team Foundation Beta 3


    There have been a lot of requests from folks wanting to use TFS without Active Directory.  Since several questions and answers in comments on Jeff Beehler's blog dealt with this, I thought it worth mentioning in a post that TFS beta 3 supports workgroup installations, and Active Directory is not required in that configuration.

    [Update 09/22/05]  Doug's comment below addresses another long-standing issue: "Additionally, beta 3 now supports Windows 2000 Active Directories as well as the previously supported Windows Server 2003 version."

  • Buck Hodges

    Team Foundation Beta 3 has been released!


    Today we signed off on Team Foundation Beta 3!  If you used beta 2, beta 3 is a vast improvement.  Beta 3 should hopefully show up on MSDN in about two days.  You may remember that beta 3 has the go-live license and will be supported for migration to the final release version 1, which means this is the last time you have to start from scratch.

    With beta 3, single-server installation is once again supported!  I know many people didn't install the July CTP because of the lack of a single-server installation.  With each public release, installation has gotten easier and more reliable, and this is the best installation thus far.

    I wrote about what changed between beta 2 and the July CTP.  That's still a good summary.  Between the July CTP and beta 3, we fixed a lot of bugs, further improved performance across the product, improved the handling of authenticating with the server using different credentials (e.g., when you're not on the domain), improved installation, and more.

    If you have distributed teams, be sure to try out the source control proxy server.  It's one of the features we have to support distributed development.

    While you are waiting on TFS to show up, you'll want to make sure you already have Visual Studio 2005 Team Suite Release Candidate (build 50727.26) and SQL Server 2005 Standard (or Enterprise) Edition September CTP (build 1314.06, which uses the matching 2.0.50727.26 .NET Framework).

    TFS beta 3 is build 50727.19.  The reason the minor number differs from the VS RC minor number is that TFS beta 3 was built in a different branch.  The major build number stopped changing at 50727 (the July 27, 2005 build) for all of Visual Studio, and only the minor number changes now.
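    That major build number encodes a date (last digit of the year, then month and day); a one-liner sketch, assuming the scheme is exactly y-mm-dd:

```python
from datetime import date

def vs_build_number(d):
    """Major build number in the y-mm-dd form described above (assumed scheme)."""
    return f"{d.year % 10}{d.month:02d}{d.day:02d}"

print(vs_build_number(date(2005, 7, 27)))  # 50727
```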

    Here's a list of my recent posts that are particularly relevant to beta 3.

    This one will need to be updated (URLs changed):  TFS Source Control administration web service.

    [Update 9/22/05]  Updated links and SQL info.

  • Buck Hodges

    Working remotely with TFS Version Control

    Here in North Carolina, we have a 10Mb connection to the outside world (it was a 3Mb connection until a few months ago), so we know what it's like to use TFS remotely.  In addition to minimizing the number of requests from the client to the server, TFS Version Control contains several features that aid in working in a situation where the connection to the server is much slower than the local network: compressed files, compressed SOAP responses, and a caching proxy server.
    Every file that we upload as part of checking in or shelving is compressed using GZip.  If a file is larger after being compressed, which may be due to it being in a compressed format (e.g., ZIP or JPEG) already or being encrypted, the file will be uploaded without being compressed.  When files are downloaded, they are still compressed as they were when they were uploaded.
    Communication between the client and server in TFS uses HTTP for everything, and it uses SOAP for everything other than file uploads and downloads.  We support IIS 6 compression of the SOAP responses (requires IIS settings for asmx pages -- I expect this is in the installation guide, but I haven't made time to go check), so the communication with the remote server is compressed.  Since SOAP is XML, responses compress very well.
    The caching proxy in version 1 caches versions of files, and it does not handle any other parts of the client-server communications.  When a user configures a client to use a proxy server (in VS, Tools -> Options, click on Source Control and then Team Foundation Source Control to get to the proxy settings), the client still calls the server for the list of files to download (e.g., diff, get, view, undo) and calls the proxy server to download the file contents.
    The first user to download a file must wait until the file is transferred over the remote link, but subsequent downloads are served from the proxy's local disk copy.  To prevent the lag for the first user, you could, for example, set up a workspace that exists for the purpose of populating the cache by having a Windows task scheduled to run "tf get" every so often (say 30 minutes) or hook into the check-in event from the mid-tier and kick off a get in the workspace (see Continuous Integration Demo Code from Doug Neumann's TLN301 PDC Talk for an example of hooking into the check-in event generated by the app tier).
    The amount of disk space used by the proxy is configurable, and it removes the least recently used files first when it needs to reclaim disk space.  With large hard drives being cheap, it's not hard to have a caching proxy server cache every version of every file in the server and not have to delete anything.
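    The eviction policy is least-recently-used, bounded by total bytes; a minimal sketch of that bookkeeping (Python, illustrative only, modeled on the behavior described above):

```python
from collections import OrderedDict

class ProxyCache:
    """LRU file cache bounded by total bytes."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.files = OrderedDict()  # key -> size; order = least recent first
        self.used = 0

    def get(self, key):
        if key not in self.files:
            return False             # miss: the caller downloads over the WAN
        self.files.move_to_end(key)  # hit: mark as most recently used
        return True

    def put(self, key, size):
        if key in self.files:
            self.used -= self.files.pop(key)
        self.files[key] = size
        self.used += size
        while self.used > self.capacity:         # reclaim space, LRU first
            _, evicted_size = self.files.popitem(last=False)
            self.used -= evicted_size

cache = ProxyCache(capacity_bytes=100)
cache.put("a;v1", 60)
cache.put("b;v1", 30)
cache.get("a;v1")           # touch a, so b becomes least recently used
cache.put("c;v1", 40)       # over capacity: evicts b, not a
print(sorted(cache.files))  # ['a;v1', 'c;v1']
```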
    Here in North Carolina, we have a proxy server set up that is on our 100Mb network, so downloads are local while the rest of the communication is still over the WAN link.  We also take advantage of the IIS compression for the SOAP responses (it's turned on for our dogfood server).
    I wrote back in March that our experience with using TFS had been good, and it's gotten significantly better since then due to features like the proxy, compressed SOAP responses, and the performance optimizations that have been made.
    [Update 10/01/05]  When you install the server, SOAP response compression is automatically configured.  You don't need to make any changes to take advantage of it.
  • Buck Hodges

    Outlook macro to create work item and changeset hyperlinks


    [NOTE: If this post looks truncated, scroll down or make the browser window wider.  The style sheet causes this.]

    Ed Hintz, a dev lead on the TFS Version Control team, sends a lot of email with TFS work item numbers.  Work items can be viewed (not edited) in a web browser using a URL constructed from the server name and the work item number.  Our dogfood system is a single server, so the server is always the same.  So, Ed wrote a macro for Outlook that will convert the current selection to a hyperlink to the work item.

    Here's the code for the macro, which he bound to Alt+L.  The hyperlinks generated will work for beta 3, but you would need to tweak the hyperlink for earlier releases.

    Sub LinkToWorkItem()
        ' Convert the current selection to a work item hyperlink
        ActiveDocument.Hyperlinks.Add Anchor:=Selection.Range, Address:= _
            "http://TFserver:8080/WorkItemTracking/Workitem.aspx?artifactMoniker=" _
            & Selection.Text, _
            SubAddress:="", ScreenTip:="", TextToDisplay:=Selection.Text
    End Sub

    To do the same thing for changesets, use the following.

    Sub LinkToChangeset()
        ' Convert the current selection to a changeset hyperlink
        ActiveDocument.Hyperlinks.Add Anchor:=Selection.Range, Address:= _
            "http://TFserver:8080/VersionControl/VersionControl/Changeset.aspx?artifactMoniker=" _
            & Selection.Text & "&webView=true", _
            SubAddress:="", ScreenTip:="", TextToDisplay:=Selection.Text
    End Sub

    If you have never created a macro in Outlook, create a new mail message, click in the body (otherwise, the Macros menu is disabled), go to Tools -> Macro -> Macros, enter the name "LinkToWorkItem" in the dialog's text box, and click Create.  In the Visual Basic Editor that opens up, paste the body of the LinkToWorkItem macro into the subroutine, save, and close the VB Editor.  To associate it with Alt+L, go to Tools -> Customize, click the Keyboard button, scroll down to Macros in the Categories list box, click in the "Press new shortcut key" text box, and press Alt+L.  Now click Assign, Close, and Close.  In the new mail message you have up, type a number, select it, and hit Alt+L to test it.

    Now when you want to turn a plain number into a work item or changeset link, highlight the number and run the macro via the shortcut you've assigned.

    [Update 10/03/05]  Updated code comment.

  • Buck Hodges

    Continuous Integration Demo Code from Doug Neumann's TLN301 PDC Talk


    Doug Neumann's TLN301 presentation, VSTS: Behind the Scenes of Visual Studio 2005 Team Foundation Server (slides), featured a demonstration of how to use the server's check-in event notification to kick off a build for a continuous integration build system using Team Build.  A number of people asked for it, so we've decided to post it here.

    Doug's demonstration used the July CTP, but since beta 3 will hopefully be released this week, the code has been modified to run on beta 3.  You'll need to create a web service and put the following code into it.  Here are the assemblies you'll need to reference (this list follows the code's using directives).

    • Microsoft.TeamFoundation.Build.Common
    • Microsoft.TeamFoundation.Build.Proxy
    • Microsoft.TeamFoundation.Client
    • Microsoft.TeamFoundation.WorkItemTracking.Client
    • Microsoft.TeamFoundation.WorkItemTracking.Common

    The code is intentionally simplified to meet the needs of a demo (e.g., a real continuous integration system would need more intelligence in handling check-in events that come in while the current build is running, etc.), but it's a good example of how to hook into the Team Foundation server's events and build useful extensions.

    Once you've built and deployed your web service using VS 2005, you'll need to add a subscription for your web service.  The check-in event is sent to your continuous integration web service via SOAP.  The following command, with changes as necessary for the name of your machine (here, I use localhost), must be run on the application tier to add an event subscription for your service.

    bissubscribe /eventType CheckinEvent /userId mydomain\myusername /address http://localhost:8080/ContinuousBuild/Service.asmx /deliveryType Soap /domain localhost

    Now when you check code into your server, your continuous integration service will kick off a build.

    The Team Build team plans to post a more elaborate continuous integration example.

    [UPDATE 2/20/06]  I updated the version numbers in the SoapDocumentMethod attribute to work with RC and RTM releases.

    using System;
    using System.Collections;
    using System.Collections.Generic;
    using System.Collections.Specialized;
    using System.ComponentModel;
    using System.Configuration;
    using System.Diagnostics;
    using System.Globalization;
    using System.IO;
    using System.Reflection;
    using System.Threading;
    using System.Web;
    using System.Web.Services;
    using System.Web.Services.Protocols;
    using System.Xml;
    using System.Xml.Serialization;
    using Microsoft.Win32;
    using Microsoft.TeamFoundation.Build.Common;
    using Proxy = Microsoft.TeamFoundation.Build.Proxy;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.WorkItemTracking.Client;
    using Microsoft.TeamFoundation.WorkItemTracking.Common;
    [WebService(Namespace = "http://tempuri.org/")]
    [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
    public class Service : System.Web.Services.WebService
    {
        public Service()
        {
        }

        [WebMethod]
        [SoapDocumentMethod("http://schemas.microsoft.com/TeamFoundation/2005/06/Services/Notification/03/Notify",
                            RequestNamespace = "http://schemas.microsoft.com/TeamFoundation/2005/06/Services/Notification/03")]
        public void Notify(string eventXml)     // Do not change the name of this argument.
        {
            // Queue the build on a worker thread so the notification call returns quickly.
            ThreadPool.QueueUserWorkItem(CallBuild, eventXml);
        }

        public void CallBuild(object state)
        {
            string eventXml = (string)state;

            // De-serialize the event to pull out the team project and the owner of the check-in.
            XmlDocument Xmldoc = new XmlDocument();
            Xmldoc.LoadXml(eventXml);
            string teamProject = Xmldoc.DocumentElement["TeamProject"].InnerText;
            string owner = Xmldoc.DocumentElement["Owner"].InnerText;

            // NOTE: hard-coded info for the demo
            string teamFoundationServer = "http://localhost:8080";
            string buildType = "Continuous Integration Build";
            string buildMachine = "localhost";
            string buildDirectoryPath = "c:\\builds";

            Proxy.BuildController controller = Proxy.BuildProxyUtilities.GetBuildControllerProxy(teamFoundationServer);
            Proxy.BuildStore store = Proxy.BuildProxyUtilities.GetBuildStoreProxy(teamFoundationServer);

            Proxy.BuildParameters buildParams = new Proxy.BuildParameters();
            buildParams.TeamFoundationServer = teamFoundationServer;
            buildParams.TeamProject = teamProject;
            buildParams.BuildType = buildType;
            buildParams.BuildDirectory = buildDirectoryPath;
            buildParams.BuildMachine = buildMachine;

            string buildUri = controller.StartBuild(buildParams);

            // Wait until the build completes.
            BuildConstants.BuildStatusIconID status;
            bool buildComplete = false;
            do
            {
                Thread.Sleep(2000);    // polling interval is arbitrary
                Proxy.BuildData bd = store.GetBuildDetails(buildUri);
                status = (BuildConstants.BuildStatusIconID)bd.BuildStatusId;
                buildComplete = (status == BuildConstants.BuildStatusIconID.BuildSucceeded ||
                    status == BuildConstants.BuildStatusIconID.BuildFailed ||
                    status == BuildConstants.BuildStatusIconID.BuildStopped);
            } while (!buildComplete);

            if (status == BuildConstants.BuildStatusIconID.BuildFailed)
            {
                // Create a work item for the developer who checked in.
                CreateWorkItem(teamFoundationServer, teamProject, owner);
            }
        }

        public void CreateWorkItem(string server, string projectName, string owner)
        {
            TeamFoundationServer tfs = TeamFoundationServerFactory.GetServer(server);
            WorkItemStore store = (WorkItemStore)tfs.GetService(typeof(WorkItemStore));
            WorkItemTypeCollection workItemTypes = store.Projects[projectName].WorkItemTypes;

            // Enter the work item as a bug.
            WorkItemType wit = workItemTypes["bug"];
            WorkItem workItem = new WorkItem(wit);

            workItem.Title = "The changes submitted have caused a build break - please investigate";

            // Assign the work item to the user who checked in, stripping the domain prefix.
            string[] ownerSplit = owner.Split('\\');
            owner = ownerSplit[ownerSplit.GetLength(0) - 1];
            workItem.Fields["System.AssignedTo"].Value = owner;
            workItem.Fields["Microsoft.VSTS.Common.Priority"].Value = 1;
            workItem.Save();
        }
    }
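    For reference, the service reads only two elements out of the check-in event XML that bissubscribe delivers.  A trimmed sketch of what that payload might look like is below; the element names are the ones the code reads, the values are placeholders, and a real CheckinEvent carries many more fields (changeset number, comment, date, and so on):

    ```xml
    <CheckinEvent>
      <!-- Only the two elements the service reads are shown here;
           the remaining fields of a real CheckinEvent are omitted. -->
      <TeamProject>MyTeamProject</TeamProject>
      <Owner>MYDOMAIN\myusername</Owner>
    </CheckinEvent>
    ```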