Buck Hodges

Visual Studio ALM (VSALM, formerly VSTS) - Team Foundation Service/Server (TFS) - MSDN

  • Buck Hodges

    How to add the Team Foundation assemblies to the .NET tab in the VS Add Reference dialog


    To write an app using the Team Foundation API, you're going to need to reference assemblies that are installed in the GAC.  However, the .NET tab of the Add Reference dialog in VS doesn't automatically list assemblies just because they are in the GAC, so you can't simply pick them from that list.

    The GAC'ed Team Foundation assemblies are also copied to the PrivateAssemblies folder under the VS directory.  When you want to add a reference to a TFS assembly in a VS solution, you can choose Browse and find the assembly there.

    To make it more convenient, you can also add the TFS assemblies to the .NET tab in the Add Reference dialog.  This knowledge base article describes how to add an assembly to the list in the .NET tab.

    Based on that, here's a simple batch script that will add all of the GAC'ed Team Foundation assemblies to the list.  There are probably assemblies in this list you'll never need to use, so feel free to trim it down.  You can copy the text to a file called register.bat and run it.  The batch script assumes that you installed VS in the normal Program Files directory.  Since this script modifies your registry, all of the usual disclaimers apply: back it up beforehand, etc.

    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.Build.Common /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.Client /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.Common /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.Common.Library /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.VersionControl.Client /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.VersionControl.Common /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.VersionControl.Common.Integration /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.WorkItemTracking.Client /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.WorkItemTracking.Client.Cache /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.WorkItemTracking.Client.DataStore /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.WorkItemTracking.Client.Provision /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.WorkItemTracking.Client.QueryLanguage /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.WorkItemTracking.Client.RuleEngine /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f
    reg add HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.WorkItemTracking.Proxy /ve /d "%programfiles%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies" /f

    After running the script, you should see the GAC'ed Team Foundation assemblies listed in the .NET tab.
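
    If you later want to remove the entries, you can delete the same keys.  Here's a sketch following the same pattern as register.bat (I haven't shown every key; the same registry disclaimers apply):

    ```shell
    reg delete HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation /f
    reg delete HKCU\Software\Microsoft\.NETFramework\AssemblyFolders\Microsoft.TeamFoundation.Client /f
    rem ...and so on for each key added by register.bat
    ```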

    [Update 1/12]  I fixed some errors in the first couple of paragraphs.

  • Buck Hodges

    Migrating CVS and Subversion repositories to Team Foundation Server


    A number of users have asked about converting existing CVS and Subversion repositories to Team Foundation Server.  While we have no plans to produce such a converter ourselves, we had hoped third parties would.  Today someone from Component Software posted an announcement of such a converter.

    Here's the description from their web page.

    CS-Converter is a powerful and reliable conversion tool, used to migrate from various legacy version control systems (such as RCS, CVS, Subversion and Visual SourceSafe) to Microsoft Team Foundation Version Control (TFVC) system.

    Disclaimer: I don't have any relationship with the company, haven't tried the software, am not endorsing it, etc., etc.

    [UPDATE 10/20/2009] With the Component Software product being discontinued, I wanted to mention that there is now another company with a tool to convert CVS and Subversion to TFS, called Timely Migration.

  • Buck Hodges

    Locks based on file types (extensions) and shelving


    Recently, someone asked about locks, shelving, and buddy builds (i.e., shelving your changes and unshelving and building them in another workspace to make sure everything builds cleanly).

    I am trying to add a number of files into our source control.  The list of files I want to add contains a few each of .ico and .bmp, and one .xls file, all of which get locked (locked to prevent check-out) when I add them, presumably because they cannot be merged if someone else checks them out and changes them.

    The fact that they get locked seems to deny the possibility of buddy building, even on my own second machine, because they are locked on a by-workspace rather than by-user basis.

    I tried selecting the binary items in the hierarchy under the Source Control Explorer and right-clicking to choose the “unlock” command, but it always says that the file could not be found in my workspace.

    What am I missing?  A preference setting or checkbox somewhere that does not lock added binary files?  Is there some way to turn off or override my own lock?  Is it really not possible to unshelve shelvesets containing binary files to a second machine?

    Locking and shelving, while keeping the changes in your workspace, don't mix.  The files that are configured to be locked via file type extension (in VS, Team -> Team Foundation Server Settings -> Source Control File Types) are locked exclusively.

    An exclusive checkout lock prevents any other changes from being pended on the file involved.  The file type locking mechanism prevents users from unlocking files that are locked via that mechanism.  So, to do a buddy build, you would need to shelve and undo, which is best accomplished either by unchecking the “Preserve local changes” checkbox in the GUI shelve dialog or by using the /move option on the shelve command.  Alternatively, you can change the file types to allow multiple checkout.

    The problem is even worse for users where exclusive checkout is turned on for an entire team project, as even plain text files are locked exclusively in that case.  As with the file type extension locking mechanism, users cannot unlock files that are locked due to the team project setting.  When you shelve, you need to move the changes to the shelveset (don't keep them in your workspace).

    For us, the file type locking causes problems both with buddy builds and with a check-in system we use called gauntlet.  With gauntlet, we shelve our changes and submit them to gauntlet for it to build them and check them in.  It can't unshelve any item that requires an exclusive lock if you still have the pending change in your own workspace.  As a result, we've turned off exclusive checkout based on file extensions by changing each to allow multiple checkout.
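
    As a sketch, the shelve-and-undo workflow for a buddy build looks like this on the command line (the shelveset name here is made up):

    ```shell
    rem Shelve the pending changes and undo them locally in one step
    rem (the /move option), which releases this workspace's exclusive locks.
    tf shelve /move BuddyBuildChanges

    rem In the second workspace (e.g., on the build machine), pull them in.
    tf unshelve BuddyBuildChanges
    ```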

  • Buck Hodges

    Disabling or changing output colors used by the command line (tf.exe)


    In the forum, a user asked how to turn off the colors used in the output of the version control command line for Team Foundation (tf.exe).  I thought I'd repost it here.


    When tf.exe (beta 3 refresh version) displays an error message, the text is colored yellow on black. My command prompt windows usually have black text on a white background. Is it possible to have tf.exe display all output without changing the color, e.g. via an environment variable?


    You can change or turn off the coloring by changing the settings in tf.exe.config (in the same location as tf.exe).

    Here is the list of display settings.

    Display.FallbackWidth - the width the command line uses when the output is not going to the console; it is used in column calculations and separators (defaults to 80)
    Display.DisableColors - turns coloring off when set to true (defaults to false)
    Display.ErrorBackground - the background of error or warning text (defaults to black)
    Display.ErrorForeground - the foreground of error or warning text (defaults to yellow)
    Display.InfoBackground - the background of informational text (defaults to black)
    Display.InfoForeground - the foreground of informational text (defaults to cyan)

    You can turn off coloring altogether by adding the following to your tf.exe.config file.

          <add key="Display.DisableColors" value="true" />

    If you simply want to alter the coloring to make it look better, you could use something like the following.  The color choices are black, blue, cyan, darkblue, darkcyan, darkgray, darkgreen, darkmagenta, darkred, darkyellow, gray, green, magenta, red, white, and yellow.

          <add key="Display.ErrorForeground" value="blue" />
          <add key="Display.ErrorBackground" value="white" />

    Your tf.exe.config file will end up looking something like the following.

    <?xml version="1.0"?>
    <configuration>
      <runtime>
        <gcConcurrent enabled="true" />
        <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
          <publisherPolicy apply="yes" />
          <probing privatePath="PrivateAssemblies" />
        </assemblyBinding>
      </runtime>
      <appSettings>
        <add key="Display.DisableColors" value="true" />
      </appSettings>
    </configuration>


    [Update 1/7] Fixed formatting issues.

  • Buck Hodges

    Beta TFS MSSCCI plugin now available for VB6, VC6 (not yet VS 2002/2003)


    Brian Harry announced the availability of the beta TFS MSSCCI plugin for VB6, VC6 (not yet VS 2002/2003).

    You can download it at http://www.microsoft.com/downloads/details.aspx?familyid=32202966-EF04-442F-8C5C-88BDF15F551C&displaylang=en.


  • Buck Hodges

    How to improve command line performance


    If you are using the tf.exe command line in scripts to convert from another version control system or to run other tasks that involve a lot of calls, there are several factors that have a large impact on the performance.  Other than the command files, all of the following applies to Visual Studio as well, but VS has the significant advantage of amortizing one-time overhead across a long execution time.  The command line pays one-time costs every time it runs, so it's important to minimize those costs when performance matters.

    Check LAN connection settings (applies now and for RTM)

    First, check your LAN connection settings in Internet Explorer (Tools -> Internet Options -> Connections -> LAN Settings).  Often, the best settings are either to have no boxes checked or to have both of the bottom two checkboxes checked, "Use a proxy server" and "Bypass proxy server for local addresses."  The reason is that the .NET 2.0 framework network code gets its settings from the settings in IE.  Prior to the December CTP, there was no way to override this.

    How much difference does it make?  It makes a 1 - 2 second difference per tf.exe execution on our network.  Of course, these settings may not work on your network, either for tf.exe or IE, depending upon your network configuration; you'll need to test it.

    Beginning with the December CTP, there is an optional registry setting that you can use to tell the Team Foundation client to bypass the proxy server without changing your IE settings.  In HKCU (per user) or HKLM (global), you can create the registry entry Software\Microsoft\VisualStudio\8.0\TeamFoundation\RequestSettings\BypassProxyOnLocal of type string with the value "true" to get the improved performance.
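
    For example, the per-user form of that setting could be created with reg.exe like this (a sketch based on the description above):

    ```shell
    reg add HKCU\Software\Microsoft\VisualStudio\8.0\TeamFoundation\RequestSettings /v BypassProxyOnLocal /t REG_SZ /d true /f
    ```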

    Avoid running tf.exe on the application tier (applies now and for RTM)

    You might think that running tf.exe on the application tier would give you the best performance.  After all, the network communication would be minimized, right?  Well, it doesn't work that way.

    Team Foundation has a registration service that the client uses in order to get to the various services, including version control.  When a client app, such as tf.exe, VS, or your own custom application, needs to use version control, the TF client code must request the service definition from the registration service on the server.  To avoid constantly requesting service registration information that rarely changes, the client-side registration code maintains a cache and only makes the server call when the cache is out of date.

    However, when the client code executes on the application tier, the client-side registration code detects that it is running on the application tier and does not cache the service registration information.  Additionally, there's a bug (that won't be fixed for RTM due to being postponed to v2) that results in the client-side registration code requesting the registration information twice.  So, every execution of tf.exe invokes two registration service information requests when run on the application tier.

    The result is that tf.exe is faster when run on a machine other than the application tier (both now and for RTM).  So how much difference does this make?  It can save 1 - 3 seconds.  How much it saves really depends on some additional factors.  The call to get the registration information may take a few hundred milliseconds or so (multiplied by two for the app tier).  The rest of the savings comes from the SOAP proxy's serialization assembly not being generated.  The client-side SOAP proxies other than version control use the standard SOAP proxy support provided by the .NET 2.0 framework, where the client-side proxy class is generated by wsdl.exe at build time.  At runtime, the framework uses reflection to dynamically generate, compile, and load the serialization assembly the first time the proxy class is used.  As you might guess, that's very expensive for the command line.

    The best setup would be a client machine that's connected to an application tier via a fast, local network switch.

    Use NGen with tf.exe (applies to beta 3 refresh and earlier)

    With .NET 2.0, NGen has improved substantially.  NGen generates native image dlls for .NET assemblies and executables (you can find more details on NGen here).  The result is that less time is required to load and run the assemblies.

    The December CTP and beyond will use NGen during setup to create the native images.  For beta 3 and beta 3 refresh, you can use ngen.exe to reduce the time to load and run tf.exe.  You only need to run ngen.exe once.  Bring up a Visual Studio 2005 Command Prompt (Start -> All Programs -> Microsoft Visual Studio 2005 -> Tools -> Visual Studio 2005 Command Prompt), change to the %ProgramFiles%\Microsoft Visual Studio 8\Common7\IDE directory, and execute the following command.

    ngen install tf.exe

    NGen will examine tf.exe's dependencies and generate native images for them.  The result will be a faster start up time for every execution of tf.exe.  How much time does it save?  This may save up to 200 ms.

    Generate command files for sequences of tf.exe commands (applies now and for RTM)

    Team Foundation Server uses Windows authentication (NTLM) to authenticate users.  That means that every initial request from tf.exe must go through the NTLM handshake.  Since many tf.exe commands make only one or two requests, that means that most requests incur the authentication overhead.  For a long sequence of tf.exe commands, that overhead can really add up.

    If you are going to run a sequence of tf.exe commands, you can put them into a command file.  Preliminary documentation for command files is available here on MSDN.  The documentation refers to some additional commands that are not available now, such as cd, rem, quit, and exit.  You can also run "tf @" and start typing or piping input, as it reads from the standard input stream (experiment with it).

    When you execute a command file, only the first request incurs the authentication overhead.  After that, the authenticated connection is reused[1].  How much time does this save?  Well, there's no quick answer here.  It's the product of the authentication overhead multiplied by the number of authentication requests you can avoid by using command files.  Additionally, it's the runtime startup overhead multiplied by the number of times you would have invoked tf.exe (this isn't nearly as significant if you've used ngen.exe as explained previously).
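
    As a sketch, a command file contains one command per line without the leading "tf" (the file name and server path below are made up), and you'd run it with "tf @commands.tfc":

    ```shell
    status
    history /stopafter:5 $/MyProject
    get /recursive $/MyProject
    ```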


    We've covered the most important command line performance issues that you can control: LAN settings (or TFS registry setting for RC/RTM), not running tf.exe on the application tier, running ngen on the beta 3 tf.exe, and using command files to avoid authentication overhead.  If you follow the recommendations discussed above, the tf.exe performance should be noticeably better.


    [1]  Technically, it's the first request on each thread.  Uploads, downloads, and pending adds (if you are adding more than 200 files at a time) use multiple background threads.  If all of the authenticated connections are in use, a thread will open a new connection that must be authenticated.  The authenticated connections are part of a pool (connection group) that are reused as needed.

  • Buck Hodges

    Backing up and restoring Team Foundation


    Rob Caron posted a link to the re-launched VSTS User Education team blog.  It includes a post of the procedures for backing up and restoring Team Foundation Server.  That includes the databases, SharePoint, reports, etc.  It also includes restoring TFS to a different server.  There have been lots of requests for this in the forum, so now's the time to give it a try.

    They have also posted documentation on permissions and the default groups, roles, and permissions.

    Since this is pre-release documentation, be sure to email them with any problems in the documentation.

    [Update 3/2/06]  You can find updated procedures for restoring TFS at http://blogs.msdn.com/vstsue/articles/511396.aspx.

  • Buck Hodges

    Team Foundation Version Control API documentation


    I've mentioned the Team Foundation Version Control client API example before as good way to get started with the API, and there are examples with labels and items too.  You can find documentation for the classes and methods in the Team Foundation Version Control API in the October Visual Studio SDK release.  Once you install it, you can find the Version Control API doc in
    C:\Program Files\Visual Studio 2005 SDK\2005.10\VisualStudioTeamSystemIntegration\Version Control\Object Model (Help 1.x format).zip.

  • Buck Hodges

    The "Filter by solution" button


    You are probably familiar with the Pending Changes window if you've been using Visual Studio with Team Foundation (View -> Other Windows -> Pending Changes).  Have you ever wanted to have the Pending Changes window show only the pending changes for files in the currently open solution and not all of the other changes in your workspace?

    There's a button, called "Filter by solution," that does exactly that.  It's located between the Refresh button and the Workspace selector.

    The button acts like a toggle.  When it's in the "active" state, the Pending Changes window restricts the list of pending changes to those involved in the currently open solution.  If you don't have a solution open, the button is disabled.


  • Buck Hodges

    Warehouse troubleshooting guide

    Bryan MacFarlane posted a troubleshooting guide for warehouse problems in the MSDN Team Foundation Forum.  If you are having problems with your warehouse or reports, it's a good resource to try to figure out what's going wrong.
  • Buck Hodges

    Source Control Explorer now shows more information about the state of items


    One of the problems that users encounter when using Source Control Explorer is knowing what state the file or folder is in, particularly if it is shown in gray rather than black text.  In beta 3 refresh and earlier, gray text meant that the file or folder was unmapped, cloaked, deleted, or not downloaded.  There was no way to tell a deleted file from a cloaked file, for instance.  It often left users asking the question, "Why is that file (or folder) gray and how do I change that?"

    Recently, we added more information to what Source Control Explorer shows simply by changing the "Latest" column to say more than yes or no. 

    In the screen shot below, you can now see that I have the latest of the folder Mine (the folder itself; we'd have to go into the folder to see the state of the contents); the folder Misc is cloaked; the folder Old has been deleted; and I do not have the latest version of foo.cs.  If I had added an unmapped folder to the example below, you'd see "Not Mapped" for that folder.

    By default, viewing deleted files and folders is turned off, in which case I wouldn't see the folder Old.  To have Source Control Explorer display deleted files and folders, go to Tools -> Options -> Source Control -> Visual Studio Team Foundation and check the view deleted items setting.

  • Buck Hodges

    Using VS to get the URI for a Team Project


    Someone asked a question today about getting the URI for a Team Project.  James Manning's Team Project code sample showed how to do it using the API.  However, that was overkill in this case.  Bill Essary pointed out that you can select the Team Project in Team Explorer, right-click on it, and choose Properties (rather than right-click, you could just hit F4).  The Properties window shows the URI as the Url property (yes, apparently no one noticed that).

    The URI you get will look something like the following.  All Team Foundation URIs start with "vstfs:///" and are translated to actual server URLs as needed.


    The URI isn't needed for any normal day-to-day tasks.  Occasionally, you may need it for some advanced admin activity.  TfsSecurity, for example, can accept Team Project URIs for some options.

  • Buck Hodges

    New CTP of Team Foundation is available (NOT supported by go-live)


    Jeff Beehler has just written a post about the newly-released December CTP of TFS.  Jeff describes some of what changed, but probably the most important thing is that it is NOT supported under the go-live license.  There will not be any migration scripts to move to it or from it.

    The reason for that is that this release hasn't been through much formal testing.  Sure, we upgraded our dogfood system with it, so we think it works well enough.  But the goal here was to get a drop out for folks to test out all of the changes we've made, though not in production use.

    If you do try it out and find a bug, or if you find a bug in beta 3, please file it through the MSDN Product Feedback Center.  All bugs filed there get reviewed and assigned to the appropriate dev.  The bugs from PFC are automatically imported into TFS Work Item Tracking in our dogfood system.

  • Buck Hodges

    C# 3.0 features


    I just read Ian Griffiths' post on C# 3.0 and LINQ - Expression Trees (linked from Jomo Fisher's C# 3.0 post, which was linked in John Rivard's Why Visual Studio targets only one version of the .NET Framework, which was in a link from Soma's Multi-targeting of .NET FX in Visual Studio post, which I received in an email notification).  He provides a link to the C# 3.0 spec, which I read through quickly before reading his post.  I hadn't looked at any of this stuff that was evidently unveiled publicly at PDC '05, so this probably isn't news for a lot of people.

    Anyway, reading Ian's blog post tied the whole thing together for me, including the expression trees for query optimization, among other things.  When I was reading through the spec, some of it was pretty apparent, but I see now why each of the features is there given the context of LINQ.  I can only imagine what obtuse code some folks will write with extension methods.

  • Buck Hodges

    Work Item Tracking will use display names for RTM


    In beta 3 (including refresh) and earlier, work items have always been assigned to user names.  For some organizations, that's a real problem, as the user names do not have any relationship to the user's real name.  We have fixed this for RTM.  Brian Harry sent the following email about it, and I thought it was worth sharing (substituting Joe Developer for an actual employee's name).  It's another example of where customer feedback has had a significant impact on the product.

    We went round and round on what to do about this.  For most of the product cycle TFS has used peoples' aliases everywhere.  I don't think we even asked the question very hard about whether or not that was the right thing.  It's what we've always used internally and it seemed obvious that's what TFS would use.
    A few months ago we started hearing significant feedback from customers that this is not workable.  We did a bunch of research and found that many customers have obtuse aliases like nm39756 and that no one in the organization can recognize them.  Many customers said this was a significant adoption blocker for them.  Working together with customers and trying to balance addressing their issue against the cost of changes at this late point in the product cycle, we decided to make a change.  In version control, build, and a few other areas we still use users' aliases.  However, in work item tracking we use display names.  I expect that in a future version we will unify this to make them consistent.
    It is true that display names need not be unique.  This can cause some confusion, however work item tracking isn't the only place in an organization where having multiple people with the same display name can create a problem.  It makes address books hard to use, email, etc.  What we do internally is to qualify duplicate names.  For example, there are multiple Joe Developers at Microsoft.  The one in our organization has the display name "Joe Developer (VSTS)".  This is the "best practice" we will recommend to customers.
    We investigated adding support for automatically qualifying display names when we import them from AD (for example by using their alias).  However we decided that the solution would require a while (and customer feedback) to get right and decided not to do it in this version.
    The duplicate display names do not create security problems because all security decisions in the system are made based on the user's SID and not any of the human-readable strings.  Further, the UI we use to manipulate security uses the Windows UI to fully resolve user names or works with aliases (which are unique).

    I hope this helps you understand where things are and what we recommend to customers.

  • Buck Hodges

    How to diff Word documents


    John Lawrence responded to someone's question about how to diff MS Word documents, and I thought the answer would help a few more folks.

    Team Foundation enables you to specify different diff and merge tools for different file types.

    We don’t have a built in compare tool for .doc files but I did a quick search and found quite a few available for download (e.g. http://www.documentlocator.com/download/difftools.htm). I don’t know if we have any specific recommendations.

    I installed DiffDoc (because it was free… http://www.softinterface.com/MD/MD.htm) and configured it to be my diff tool for .doc files:

    1. Navigate to Tools->Options->Source Control->Visual Studio Team Foundation
    2. Click the Configure User Tools button
    3. In the dialog that pops up, click Add
    4. Enter ‘.doc’ (no apostrophes) in the Extension field to indicate you’re adding a comparison tool for .doc files
    5. For the Command field, click “…” and navigate to your diff tool exe (in my case “C:\Program Files\Softinterface, Inc\DiffDoc\DiffDoc.exe”)
    6. Then enter the command line parameters to drive your specific diff tool. %1 expands to be the original file, %2 expands to the new one. In this case, enter "/M%1 /S%2" in the Arguments text box (without the quotation marks).

    That should do it – next time you try and view a diff of .doc files, it will launch this tool and you should be able to compare them.

  • Buck Hodges

    Team Foundation Beta 3 Virtual PC is available


    The last time I wrote about the VPC for beta 3, it didn't make it up to MSDN like it was supposed to.  Well, a beta 3 VPC is available now, and a beta 3 refresh VPC will be available as soon as it makes its way through the infrastructure (yeah, I know, trust me at your own risk).

    The beta 3 VPC is under Visual Studio 2005 -> Visual Studio 2005 Team System Release Candidate -> Visual Studio 2005 Team System Release Candidate VPC (English).  The description is copied below.


    Thank you for evaluating Visual Studio 2005 Team Suite RC and Visual Studio 2005 Team Foundation Server Beta 3. This self-extracting VPC is provided as-is, as governed by the terms of the included EULA.

    VPC Contents

    • Microsoft Windows Server 2003 Standard Edition
    • Microsoft Visual Studio 2005 Team Suite RC
    • Microsoft Visual Studio 2005 Team Foundation Server Beta 3
    • Microsoft .NET Framework 2.0 Redistributable Package RC
    • Microsoft SQL Server 2005 Community Technology Preview
    • Microsoft Office 2003 Standard Edition

    AdventureWorks Demo
    This VPC has been prepared with a sample demo. To access the demo files, open the solution (.sln) file located in the C:\AdventureWorksDemo folder. The sample code is intended to demonstrate features of Visual Studio Team System, and therefore may exhibit poor coding conventions, security vulnerabilities, etc. The AdventureWorks application should not, and may not, be used in any form for production applications.

    Recommended System Requirements

    • PC with 2.0 gigahertz or faster processor
    • 1.5 GB RAM minimum
    • 10 GB available hard disk space
    • Super VGA (800 x 600) or higher video
    • DVD-ROM drive
    • Microsoft Virtual PC 2004 software

    NOTE: For best performance, the VPC should be extracted to a second high-speed hard drive.

  • Buck Hodges

    Hold the shift key and double click


    I wrote this back in August, but these shortcuts weren't available in the July CTP.  Now that beta 3 is out, you may want to give them a try.

    I've learned two shortcuts recently that I didn't know existed in TFS source control integration in VS.  These may or may not work based on the build you are using.

    Double clicking the Source Control node in the Team Explorer will bring up the Source Control Explorer window.  What I learned recently is that holding the shift key and double clicking Source Control (after selecting only that node) will bring up the Pending Changes window.

    The other shortcut is more useful.  You may already know that double clicking a file in the checkin window will bring it up in the editor.  If you hold the shift key while double clicking a pending edit, it will launch your diff viewer so you can see what's changed.  That's kind of handy.

  • Buck Hodges

    1,000,000 files in the dogfood system


    Jeff Beehler just checked in another 100,000 files, pushing us over 1,000,000 files.  We now have 1,042,659 files in the dogfood system.  Brian Harry sent out a dogfood statistics update in email today, prior to Jeff's check-in.  Watch for the latest stats on John Lawrence's blog.

  • Buck Hodges

    Go to Changeset coming for RTM


    You still won't be able to drag and drop in Source Control Explorer (some changes are just too large at this point), but you will be able to go to a changeset (without going through the back door using Get Specific Version solely for changeset look up).  In the next drop, the Edit -> Go to menu item will bring up the Find Changeset dialog when the Source Control Explorer has focus.  Also, it should appear on the Team menu.

    How about a power toy VS plugin to fill in the gaps?

    There are lots of things we'd like to do, but we can't at this point in the cycle.  We are thinking about putting together a VS power toy plugin with some of the things that didn't make it into v1, for version control and work item tracking.  What power toy features would help you get more done and have a happier v1 experience?

  • Buck Hodges

    How to work around the diff viewer problem in Team Explorer Beta 3


    If you install only the Team Explorer (and not VSTS), there is a known issue (7.3) where the diff viewer won't work.  It also affects the built-in merge tool, but I think that's less painful for users who just need Team Explorer.

    The tool is diffmerge.exe, and it uses msdiff.dll. The msdiff.dll needs several VC runtime DLLs that are missing because they normally get installed as part of the VS installation.

    To work around this problem in beta 3, you can hook up a third-party diff tool.  Just go to Tools->Options->Source Control->Visual Studio Team Foundation Server and click on Configure User Tools to specify the diff tool of your choice.  Enter "*" for the Extension, enter the path to your diff viewer's executable for the Command, and leave the Arguments as "%1 %2" to hook it up.

    For example, you could use windiff from the Windows XP tools.  I think you can extract the tools on one machine and just copy windiff.exe over.

    You can find some docs for it here.

  • Buck Hodges

    How to upgrade from TFS Beta 3 to RTM


    Allen Clark has posted the document on upgrading beta 3 to RTM so you'll know what to expect.

    Preparing to upgrade to RTM

    If you’re using Beta 3 Refresh, you may be wondering what you can do now to make the server upgrade to RTM as painless as possible. Here are a few things that will help.

    • If you customize work item types, don’t create work item field reference names that have more than 70 characters, or that begin with a number.
    • If it’s not too late, deploy Team Foundation Server on the hardware you’re going to use at RTM.
    • If you are going to deploy to new hardware, go ahead and install Team Foundation Server on the new hardware and try it out just to make sure you don't have configuration issues. You can uninstall Team Foundation prior to upgrading.
    • Keep track of the customizations you’re making to the process templates; you’ll need to do these again before you’ll be able to create new projects with your custom process templates.

    I’ve included the specification below so that you can get a better idea of what the upgrade will be like and let us know if you feel like we haven’t covered your scenario. Keep in mind that this isn’t final, but the actual upgrade will be similar to what’s described here.

    Allen Clark

    Program Manager, Team Foundation Server

    Design Goals

    When V1 ships, customers must be able to upgrade their data without support from Microsoft.

    • Upgrading to RTM will not be effortless, but the work required must be reasonable.
    • Customers should not get into a server-down state unexpectedly.
    • Minimize the impact on the Setup team.
  • Buck Hodges

    Power Toy: tfpt.exe


    [UPDATE 8/9/2007]  I fixed the broken link to the power tools page. 

    [UPDATE 9/8/2006]  TFPT is now available in its own small download: http://go.microsoft.com/?linkid=5422499!  You no longer need to download the VS SDK.  You can find more information about the September '06 release here.

    Back at the start of October, I wrote about the tfpt.exe power toy.  The beta 3 version has been released with the October SDK release.  In the future, we plan to have a better vehicle for delivering it.

    Here's the documentation, minus the screenshots, in case you are trying to decide whether to download the SDK.  The documentation is included in the SDK release as a Word document, including screenshots of the various dialogs (yes, most commands have a GUI, but you can still use the commands from scripts by specifying the /noprompt option).

    Review

    The only command not documented is the review command, which is very handy for doing code reviews.  When you run "tfpt review" you get a dialog with a list of your pending changes that you can check off as you diff or view each one.

    I hope you find these useful.  Please leave a comment, and let us know what you think.

    Team Foundation PowerToys


    The Team Foundation PowerToys (TFPT) application provides extra functionality for use with the Team Foundation version control system. The Team Foundation PowerToys application is not supported by Microsoft.

    Five separate operations are supported by the TFPT application: unshelve, rollback, online, getcs, and uu. They are all invoked at the command line using the tfpt.exe application. Some of the TFPT commands have graphical interfaces.

    Unshelve (Unshelve + Merge)

    The unshelve operation supported by tf.exe does not allow shelved changes and local changes to be merged together. TFPT’s more advanced unshelve operation allows this to occur under certain circumstances.

    If an item in the local workspace has a pending change that is an edit, and the user uses TFPT to unshelve a change from a shelveset, and that shelved change is also an edit, then the changes can be merged with a three-way merge.

    In all other cases where changes exist both in the local workspace and in the shelveset, the user can choose between the local and shelved changes, but no combination of the changes can be made. To invoke the TFPT unshelve tool, execute

    tfpt unshelve

    at the command line. This will invoke the graphical interface for the TFPT unshelve tool:

    Running TFPT Unshelve on a Specified Shelveset

    To skip this dialog, you can specify the shelveset name and owner on the command line, with

    tfpt unshelve shelvesetname;shelvesetowner

    If you are the owner of the shelveset, then specifying the shelveset owner is optional.

    Selecting Individual Items Within a Shelveset for Unshelving

    If you specify a shelveset on the command line as in “Running TFPT Unshelve on a Specified Shelveset,” or if you select a shelveset in the window above and click Details, you are presented with the Shelveset Details window, where you can select individual changes within a shelveset to unshelve.

    You can check and uncheck the boxes beside individual items to mark or unmark them for unshelving. Click the Unshelve button to proceed.

    Unshelve Conflicts

    When you press the Unshelve button, all changes in the shelveset for which there is no conflicting local change will be unshelved. You can see the status of this unshelve process in the Command Prompt window from which you started the TFPT unshelve tool.

    There may, however, be conflicts which must be resolved for the unshelve to proceed. If any conflicts are encountered, the conflicts window is displayed:

    Edit-Edit Conflicts

    To resolve an edit-edit conflict, select the conflict in the list view and click the Resolve button. The Resolve Unshelve Conflict window appears.

    For edit-edit conflicts, there are three possible conflict resolutions. 

    • Taking the local change abandons the unshelve operation for this particular change (it would be as if the change had not been selected for unshelving).
    • Taking the shelved change first undoes the local change, and then unshelves the shelved change. This results in the local change being completely lost.
    • Clicking the Merge button first attempts to auto-merge the two changes together, and if it cannot do so without conflict, attempts to invoke a pre-configured third-party merge tool to merge the changes together. The local change is not lost by choosing to merge – if the merge fails, the local change remains.
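    In rough pseudocode terms, the three outcomes look like this (an illustrative Python sketch, not TFPT's actual code; `resolve` and `merge_tool` are made-up names):

    ```python
    # Illustrative sketch of the three edit-edit conflict resolutions
    # described above.  Not TFPT's implementation.

    def resolve(resolution, local_content, shelved_content, merge_tool):
        """Return the workspace content after resolving an edit-edit conflict."""
        if resolution == "take_local":
            return local_content           # unshelve abandoned for this item
        if resolution == "take_shelved":
            return shelved_content         # local change is lost
        if resolution == "merge":
            merged = merge_tool(local_content, shelved_content)
            # A failed merge (None here) preserves the local change.
            return merged if merged is not None else local_content
        raise ValueError("unknown resolution: " + resolution)

    print(resolve("merge", "local", "shelved", lambda a, b: None))  # local
    ```

    The key point is the last branch: choosing Merge is the only non-destructive option, because a failed merge leaves the local change in place.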

    The Auto-Merge All Button

    The Auto-Merge All button is enabled when there are edit-edit conflicts remaining that are unresolved. Clicking the button goes through the edit-edit conflicts and attempts to auto-merge the changes together. For each conflict, if the merge succeeds, then the conflict is resolved. If not, then the conflict is marked as “Requires Manual Merge.” In order to resolve conflicts marked as “Requires Manual Merge,” you must select the conflict and click the Resolve… button. Clicking the Merge button will then start the configured third-party merge tool. If no third-party merge tool is configured, then the conflict must be resolved by selecting to take the local change or take the shelved change.
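    Here's a rough Python sketch of that pass (illustrative only; `try_auto_merge` stands in for the real auto-merge logic):

    ```python
    # Illustrative sketch of the Auto-Merge All pass: attempt an auto-merge
    # on each edit-edit conflict, and mark failures for manual resolution.

    def auto_merge_all(conflicts, try_auto_merge):
        statuses = {}
        for conflict in conflicts:
            merged = try_auto_merge(conflict)
            statuses[conflict] = "Resolved" if merged else "Requires Manual Merge"
        return statuses

    statuses = auto_merge_all(
        ["a.cs", "b.cs"],
        lambda c: c == "a.cs",   # pretend only a.cs merges cleanly
    )
    print(statuses)  # {'a.cs': 'Resolved', 'b.cs': 'Requires Manual Merge'}
    ```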

    Generic Conflicts

    Any other conflict (a local delete with a shelved edit, for example) is a generic conflict that cannot be merged.

    There is no merge option for generic conflicts. You must choose between keeping the local change and taking the shelved change.

    Aborting the Unshelve Process

    Because the unshelving process makes changes to the local workspace, and because the potential exists to discover a problem halfway through the unshelve process, the TFPT Unshelve application makes a backup of the local workspace before starting its execution if there are pending local changes. This backup is stored as a shelveset on the server. In the event of an abort, all local pending changes are undone and the backup shelveset is unshelved to the local workspace. This restores the workspace to the state it was in before the unshelve application was run.

    The backup shelveset is named by adding _backup and then a number to the name of the shelveset that was unshelved. For example, if the shelveset TestSet were unshelved, the backup shelveset would be named TestSet_backup1. Up to 9 backup shelvesets can exist for each shelveset.
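    The naming scheme is simple enough to sketch (illustrative Python; `backup_name` is a made-up helper, not part of TFPT):

    ```python
    # Sketch of the backup-shelveset naming scheme described above:
    # _backup1 through _backup9 are appended to the unshelved shelveset's name.

    def backup_name(shelveset, existing_shelvesets):
        for n in range(1, 10):           # up to 9 backups per shelveset
            candidate = "%s_backup%d" % (shelveset, n)
            if candidate not in existing_shelvesets:
                return candidate
        raise RuntimeError("all 9 backup slots are in use")

    print(backup_name("TestSet", set()))                  # TestSet_backup1
    print(backup_name("TestSet", {"TestSet_backup1"}))    # TestSet_backup2
    ```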

    With the backup shelveset, changes made during an unshelve operation can be undone after the unshelve is completed but before the changes are checked in, by undoing all changes in the workspace and then unshelving the backup shelveset:

    tf undo * /r

    tf unshelve TestSet_backup1


    Rollback

    Sometimes it may be necessary to undo a checkin of a changeset. This operation is not directly supported by Team Foundation, but with the TFPT rollback tool you can pend changes which attempt to undo any changes made in a specified changeset.

    Not all changes can be rolled back, but in most scenarios the TFPT rollback command works. In any event, the user is able to review the changes that TFPT pends before checking them in.

    To invoke the TFPT rollback tool, execute

    tfpt rollback

    at the command line. This will invoke the graphical user interface (GUI) for the TFPT rollback tool. Please note that there must not be any changes in the local workspace for the rollback tool to run. Additionally, a prompt will be displayed to request permission to execute a get operation to bring the local workspace up to the latest version.

    The Find Changesets window is presented when the TFPT rollback tool is started. The changeset to be rolled back can be selected from the Find Changesets window.

    Specifying the Changeset on the Command Line

    The Find Changesets window can be skipped by supplying the /changeset:changesetnum command line parameter, as in the following example:

    tfpt rollback /changeset:3

    Once the changeset is selected, either by using the Find Changesets window or specifying a changeset using a command-line parameter, the Roll Back Changeset window is displayed.

    Each change is listed with the type of change that will be counteracted by a rollback change.

    For example, to roll back an Add, Undelete, or Branch, the tool pends a Delete.
    Unchecking a change in the Roll Back Changeset window marks it as a change not to be rolled back. There are cases involving rolling back deletes which may result in unchecked items being rolled back. If this occurs, clear warnings to indicate this are displayed in the command prompt window. If this is unsatisfactory, undo the changes pended by the rollback tool.

    When the changes to roll back have been checked appropriately, pressing the Roll Back button starts the rollback. If no failures or merge situations are encountered, then the changes should be pended and the user returned to the command prompt:

    Merge scenarios can arise when a rollback is attempted on a particular edit change to an item that occurred between two other edit changes. There are two possible edit rollback scenarios: 

    1. An edit is being rolled back on an item, and the edit to roll back is the latest change to the content of the item. 

    This is the most common case. Most rollbacks are performed on changesets that were just checked in. If the edit was just checked in, it is unlikely that another user has edited it in the intervening time.

    To roll back this change, an edit is pended on the item, and the content of the item is reverted to the content from before the changeset to roll back. 

    2. An edit is being rolled back on an item, and the edit to roll back is not the latest change to the content of the item. 

    This is a three-way merge scenario, with the version to roll back as the base, and the latest version and the previous version as branches. If there are no conflicts, then the changes from the change to roll back (and only the change to roll back) are extracted from the item, preserving the changes that came after the change to roll back. 

    In the event of a merge scenario, the merge window is displayed:

    To resolve a merge scenario, select the item and click the Merge button. An auto-merge will first be attempted, and if it fails, the third-party merge tool (if configured) will be invoked to resolve the merge. If no third-party merge tool is configured, and the auto-merge fails, then the item cannot be rolled back:

    The Auto-Merge All button attempts an auto-merge on each of the items in the merge list, but does not attempt to invoke the third-party merge tool.


    Any changes which fail to roll back will also be displayed in the same window.


    Online

    With Team Foundation, a server connection is necessary to check files in or out, to delete files, to rename files, etc. The TFPT online tool makes it easier to work without a server connection for a period of time by providing functionality that informs the server about changes made in the local workspace.

    When working offline with the intent to sync up later by using the TFPT online tool, users must adhere to a strict workflow: 

    • Users without a server connection manually remove the read-only flag from files they want to edit. Non-checked-out files in the local workspace are by default read-only, and when a server connection is available the user must check out the file with the tf checkout command before editing the file. When working offline, the DOS command “attrib –r” should be used.
    • Users without a server connection add and delete files they want to add and delete. If not checked out, files selected for deletion will be read-only and must be marked as writable with “attrib –r” before deleting. Files which are added are new and will not be read-only.
    • Users must not rename files while offline, as the TFPT online tool cannot distinguish a rename from a deletion at the old name paired with an add at the new name.
    • When connectivity is re-acquired, users run the TFPT online tool, which scans the directory structure and detects which files have been added, edited, and deleted. The TFPT online tool pends changes on these files to inform the server what has happened.  
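    To get a feel for the scan, here's a rough Python sketch of the writable-file detection (an illustration of the idea, not the tool's actual logic):

    ```python
    # Sketch of the online tool's scan described above: walk the workspace
    # and report files whose read-only flag has been removed, i.e. the
    # candidates for pending an edit or add on the server.

    import os
    import stat

    def find_writable_files(root):
        writable = []
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                mode = os.stat(path).st_mode
                if mode & stat.S_IWRITE:     # read-only flag removed
                    writable.append(path)
        return sorted(writable)
    ```

    A real scan also has to compare against server state to tell edits from adds (and, with /deletes, to detect deletions), which is why that option costs extra server traffic.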

    To invoke the TFPT online tool, execute 

    tfpt online

    at the command line. The online tool will begin to scan your workspace for writable files and will determine what changes should be pended on the server.

    By default, the TFPT online tool does not detect deleted files in your local workspace, because to detect deleted files the tool must transfer significantly more data from the server. To enable the detection of deleted files, pass the /deletes command line option.

    When the online tool has determined what changes to pend, the Online window is displayed.

    Individual changes may be deselected here if they are not desired. When the Pend Changes button is pressed, the changes are actually pended in the workspace.

    Important Note: If a file is edited while offline (by marking the file writable and editing it), and the TFPT online tool pends an edit change on it, a subsequent undo will result in the changes to the file being lost. It is therefore not a good idea to try pending a set of changes to go online, decide to discard them (by doing an undo), and then try again, as the changes will be lost in the undo. Instead, make liberal use of the /preview command line option (see below), and pend changes only once.

    Preview Mode

    The Online window displayed above is a graphical preview of the changes that will be pended to bring the workspace online, but a command-line version of this functionality is also available. By passing the /preview and /noprompt options on the command line, a textual representation of the changes that the TFPT online tool thinks should be pended can be displayed.

    tfpt online /noprompt /preview


    The TFPT online tool by default operates on every file in the workspace. Its focus can be more directed (and its speed improved) by including only certain files and folders in the set of items to inspect for changes. Filespecs (such as *.c, or folder/subfolder) may be passed on the command line to limit the scope of the operation, as in the following example:

    tfpt online *.c folder\subfolder

    This command instructs the online tool to process all files with the .c extension in the current folder, as well as all files in the folder\subfolder folder. No recursion is specified. With the /r (or /recursive) option, all files matching *.c in the current folder and below, as well as all files in the folder\subfolder folder and below will be checked. To process only the current folder and below, use

    tfpt online . /r


    Many build systems create log files and/or object files in the same directory as source code which is checked in. It may become necessary to filter out these files to prevent changes from being pended on them. This can be achieved through the /exclude:filespec1,filespec2,… option.

    With the /exclude option, certain filemasks may be filtered out, and any directory name specified will not be entered by the TFPT online tool. For example, there may be a need to filter out log files and any files in object directories named “obj”.

    tfpt online /exclude:*.log,obj

    This will skip any file matching *.log, and any file or directory named obj.
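    The matching behaves roughly like standard filemask globbing; here's an illustrative Python sketch (`is_excluded` is a made-up helper, not part of TFPT):

    ```python
    # Sketch of /exclude matching as described above: a filemask skips
    # matching files, and a bare name also matches a directory, pruning it.

    import fnmatch

    def is_excluded(name, excludes):
        return any(fnmatch.fnmatch(name, pattern) for pattern in excludes)

    excludes = ["*.log", "obj"]
    print(is_excluded("build.log", excludes))  # True
    print(is_excluded("obj", excludes))        # True: directory name matches
    print(is_excluded("main.c", excludes))     # False
    ```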

    GetCS (Get Changeset)

    The TFPT GetCS tool gets all the items listed in a changeset at that changeset version.

    This is useful in the event that a coworker has checked in a change which you need to have in your workspace, but you cannot bring your entire workspace up to the latest version. You can use the TFPT GetCS tool to get just the items affected by that changeset, without having to inspect the changeset, determine the files listed in it, and manually pass those files to a tf.exe get command.

    There is no graphical user interface (GUI) for the TFPT GetCS tool. To invoke the TFPT GetCS tool, execute

    tfpt getcs /changeset:changesetnum

    at the command line, where changesetnum is the number of the changeset to get.

    UU (Undo Unchanged)

    The TFPT UU tool removes pending edits from files which have not actually been edited.

    This is useful in the event that you check out fifteen files for edit, but only actually make changes to three of them. You can back out your edits on the other twelve files by running the TFPT UU tool, which compares hashes of the files in the local workspace to hashes the server has to determine whether or not the file has actually been edited.
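    The idea is easy to sketch in Python (illustrative only; the choice of MD5 here reflects what TFS-era version control used for file hashes, but treat that detail as an assumption):

    ```python
    # Sketch of the UU comparison described above: hash the local file and
    # compare it to the hash the server has for the checked-out version.

    import hashlib

    def is_unchanged(local_path, server_hash):
        with open(local_path, "rb") as f:
            local_hash = hashlib.md5(f.read()).hexdigest()
        return local_hash == server_hash
    ```

    Files whose hashes still match the server's copy have no real edits, so their pending edit can be undone safely.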

    There is no graphical user interface (GUI) for the TFPT UU tool. To invoke the TFPT UU tool, execute

    tfpt uu

    at the command line. You can also pass the /changeset:changesetnum argument to compare the files in the workspace to a different version.


    Help for each TFPT tool, as well as all its command-line switches, is available at the command line by running

    tfpt help

    or for a particular command, with

    tfpt help <command>

    or

    tfpt <command> /?

  • Buck Hodges

    Brian Harry on when to ship TFS

    About halfway through a thread on unlocking files locked by another user, Brian wrote a long response in the Team Foundation Forum about deciding what changes to make at this point and knowing when to ship.  It's a good read.  I expect that Brian will likely turn it into a blog post at some point.
  • Buck Hodges

    Validating XML characters in SOAP messages


    I've written about using the SoapHttpClientProtocol subclasses generated by wsdl.exe several times over the last year, including handling authentication, HTTP response codes, and setting timeouts properly.  Today I needed to change the code in TFS to better handle characters that are not allowed in XML.

    The problem is that if you have a method on your web service that takes a String parameter, someone may call that method with a string that contains characters that are not allowed in XML.  That input may come from a command line switch or a text box in a GUI.

    The XmlWriter used by SoapHttpClientProtocol is XmlTextWriter.  XmlTextWriter doesn't do any character validation.  If the string passed to WriteString() includes characters that are not valid for XML, your XML output will also.  The characters below 32, except for tab, carriage return, and new line, the UTF-8 BOM, and invalid surrogate pairs are not allowed by the XML standard.
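    Here's that rule, as given by the XML 1.0 Char production, in a small Python sketch (illustrative; surrogate pairing and the BOM handling mentioned above are separate checks on top of this):

    ```python
    # The XML 1.0 Char production: below U+0020 only tab, LF, and CR are
    # legal, the surrogate range U+D800-U+DFFF is excluded, and U+FFFE and
    # U+FFFF are not characters.

    def is_valid_xml_char(cp):
        """Return True if the code point is allowed in XML 1.0 content."""
        return (cp in (0x09, 0x0A, 0x0D)
                or 0x20 <= cp <= 0xD7FF       # excludes the surrogate range
                or 0xE000 <= cp <= 0xFFFD
                or 0x10000 <= cp <= 0x10FFFF)

    print(is_valid_xml_char(0x08))  # False: backspace is rejected
    print(is_valid_xml_char(0x09))  # True: tab is allowed
    ```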

    The XmlReader used by the ASP.NET web services does do character validation.  If it finds an invalid XML character, the web service will respond with HTTP 400 Bad Request.  That doesn't help the user figure out what's going on.

    Elena Kharitidi suggested overriding the GetWriterForMessage() method from SoapHttpClientProtocol in the subclass that was generated by wsdl.exe and providing a character-validating XmlWriter.

    The documentation shows an example of creating a subclass of XmlTextWriter to check the characters.  However, it would be better to be able to use a framework class to do it without rolling our own.  Fortunately, there is such a class in the framework.

    With a little poking around, I found the XmlCharCheckingWriter class that is internal to the framework.  Now we just need to get the framework to give us an instance of that class.  A little more poking around and experimentation resulted in the piece of code shown below.

    If you run it under the debugger and put a breakpoint on line 5, you'll see that the base SoapHttpClientProtocol method returns an XmlTextWriter.  If you step down to the XmlWriter.Create() call, you'll see that the framework gives us the XmlCharCheckingWriter instance that we want in response to the CheckCharacters setting being true.

    Now, if you add the following code to your wsdl.exe-generated subclass of SoapHttpClientProtocol, you'll get an ArgumentException on the client when trying to write the invalid XML in the SOAP message.  The exception message will state that there is an invalid character.  The result is a significant improvement over getting a generic HTTP 400 Bad Request from the web service.

    // Override this method in order to put in character validation.
    protected override XmlWriter GetWriterForMessage(SoapClientMessage message,
                                                     int bufferSize)
    {
        XmlWriter writer = base.GetWriterForMessage(message, bufferSize);
        // Choose the encoding the same way the framework code does.
        Encoding encoding = RequestEncoding != null ? RequestEncoding :
                                                      new UTF8Encoding(false);
        // We want the character validation to be done on the client side
        // rather than getting an obscure HTTP 400 Bad Request message
        // from the server (the XmlReader used by the web services does 
        // character validation, while the writer used in the base class
        // does not).
        // We create this second XmlWriter to get an XmlCharCheckingWriter
        // instance.  The Create(XmlWriter, XmlWriterSettings) code path 
        // does that (we don't need the overhead in an XmlWellformedWriter).
        XmlWriterSettings xws = new XmlWriterSettings();
        xws.Encoding = encoding;
        xws.Indent = false;
        xws.NewLineHandling = NewLineHandling.None;
        xws.CheckCharacters = true;      // make sure char is valid for XML
        writer = XmlWriter.Create(writer, xws);
        return writer;
    }