Buck Hodges

Visual Studio Online, Team Foundation Server, MSDN

March 2007

  • Buck Hodges

    Configuring Team Foundation Server to use fully-qualified domain names


    This week the following question came up.  I've seen this come up before, and there are probably forum threads on it, but I figured I'd post it here.  Bill Essary provided the answer to the question.  As always, keep notes on what you do so that you can undo it if necessary.


    Is there a way to configure TFS to use fully-qualified domain names (FQDN, e.g., tfsserver.mycompany.com) for TFS, WSS, and Reporting Services?


    1) Run "tfsadminutil activateat <FullyQualifiedDomainName>"

    2) Update the following registry key with the FQDN: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\8.0\TeamFoundation\ReportServer

    3) To force FQDN references in e-mail notifications, set TFSUrlPublic in the TFS root web.config file to http://<FQDN>:8080.

    There are a handful of other places where the TFS URL is stored, but they typically don't matter if the goal is to ensure that all public access to the server is done via the FQDN.  The ones that are missed govern communication local to the TFS AT (e.g., the TFS scheduler prodding the TFS warehouse).  If you want to get them all, use the SSL-only configuration topic for TFS as a guide.

    4) Add the domain to the Intranet Zone or Trusted Sites list in IE for all clients (see KB 303650)
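
    The first three steps above can be sketched from a command prompt on the application tier.  This is only a sketch: tfsserver.mycompany.com is a placeholder, and you should verify the existing registry data and web.config layout in your own environment before changing anything.

```shell
# Sketch only -- run on the TFS application tier; tfsserver.mycompany.com is a placeholder.
# 1) Point the AT at the fully-qualified name.
TfsAdminUtil ActivateAT tfsserver.mycompany.com

# 2) Inspect, then update, the Reporting Services URL stored under this key
#    (the exact value layout is an assumption; check what's there first).
reg query "HKLM\SOFTWARE\Microsoft\VisualStudio\8.0\TeamFoundation\ReportServer"

# 3) In the TFS root web.config, set TFSUrlPublic so e-mail notifications use the FQDN:
#    <add key="TFSUrlPublic" value="http://tfsserver.mycompany.com:8080" />
```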

    If you are using TFS 2008 with Windows SharePoint Services 3.0 (or Microsoft Office SharePoint Server 2007), you will need to do the following as well.

    After following the steps above with TFS 2008 + WSS 3.0, you will observe that when you try to access the team portal using http://FQDN/sites/TeamProjectName, you are automatically redirected to http://NETBIOSNAME/sites/TeamProjectName.

    This behavior is by design and is caused by alternate access mapping. To avoid it, you will have to create a custom alternate access mapping that has the FQDN as both the internal URL and the public URL.

    1. Open WSS3.0 Central Administration
    2. Click on Operations tab
    3. Click on Alternate access mapping
    4. Click the Add Internal URLs button
    5. In the dropdown select the default website (port 80)
    6. Enter the FQDN in the textbox
    7. Set the Zone to Custom
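
    If you prefer the command line, the same mapping can be sketched with stsadm; both URLs here are placeholders for your server names.

```shell
# Sketch only: an equivalent alternate access mapping via stsadm (WSS 3.0).
# http://tfsserver and http://tfsserver.mycompany.com are placeholders.
stsadm -o addalternatedomain -url http://tfsserver -incomingurl http://tfsserver.mycompany.com -urlzone Custom
```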

    Additional step required when .NET 3.5 SP1 is installed 

    If you have Service Pack 1 for .NET 3.5 installed on your 2008 server (and if you do, make sure you also have SP1 for TFS 2008 installed), you will need to make an additional change.  KB 926642 describes what you will need to do.  The registry change is probably the simplest approach, but you will need to decide which approach is best for you based on the security tradeoffs.
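
    The registry approach from KB 926642 amounts to a single value; this is a sketch, and you should weigh the security trade-offs described in the KB before applying it.

```shell
# Sketch of the KB 926642 registry change. This disables the loopback check
# machine-wide; the more targeted BackConnectionHostNames approach is
# described in the same KB.
reg add HKLM\SYSTEM\CurrentControlSet\Control\Lsa /v DisableLoopbackCheck /t REG_DWORD /d 1 /f
```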

    The need for this setting is due to the fact that .NET 3.5 SP1 includes changes to support Windows features that defeat reflection attacks.  Unfortunately, that causes problems with the way FQDN support works in TFS.  You can find additional background on the Windows security settings at Getting Caught by Loopback.  You can also read about the impact on SQL Reporting Services at Reporting Services HTTP 401 (Unauthorized) - Host Headers require your attention.

    [UPDATE February 27, 2008] Ladislau Szomoru provided additional steps required when making this change with TFS 2008 and SharePoint 3.0.

    [UPDATE October 17, 2008] I've added information about handling the new security checks enabled by .NET 3.5 SP1 that will prevent FQDN from working without taking additional steps.


  • Buck Hodges

    Better integrating other build tools into your build


    Aaron Hallberg wrote a great post today showing how to use a custom task to better integrate other build tools, such as devenv (aka Visual Studio), as part of your build in Team Build.

    Building Non-MSBuild Projects With Team Build

    Building non-MSBuild projects in Team Build has never been a particularly nice experience...  You can make it work, typically by invoking DevEnv or some other tool using an Exec task, but you don't get any output in your build report or build log, etc.  Additionally, it was recently pointed out to me (by D. Omar Villareal from Notion Solutions) that when a build type is set up that only builds non-MSBuild projects, the resulting builds don't end up in default Build Reports generated by TFS, since the relevant information doesn't get generated for the Warehouse!

    To solve some of these issues, I've put together a custom MSBuild task for compiling non-MSBuild projects in Team Build.  The basic idea here is to execute an external executable (like the Exec task does), redirect stdout and stderr to the build log, and insert some minimal data into the Team Build database.  I had it inherit from the task base class I put together way back in August of last year.  As always, no guarantees are made about the awesomeness, or lack thereof, of this sample code.


    If you decide that you want to have this show up as a separate build step, you can add the calls to AddBuildStep() followed by UpdateBuildStep() as he showed in an earlier post.


  • Buck Hodges

    Version Control Server blog: Orcas destroy feature, merging with labels


    The Version Control Server blog has shown some signs of life.  Bill Tutt has written a couple of posts.  The first one is about the Orcas version control feature that lots of customers have requested: destroy.  The destroy command, which is only available from the tf.exe command line, will permanently remove the specified files and folders from your database (delete just hides them really).

    Destroy: A new feature for Orcas

    Destroy gives you the ability to permanently delete version control files/folders from TFS.  It can also be used to destroy the file contents while preserving the change set history.


    His second post covers the subtle problem of performing a merge where one, or both, of the versions specified is a label (LsomeLabelName).  While he doesn't mention it in the post, specifying a workspace as the version to merge works the same way as specifying a label.  In other words, just as you generally don't want to merge using a label as the version, you also don't want to use a workspace version specifier (W or WsomeWorkspacename;someOwner).  The problem is that deleted files can't be labeled, nor can they be in your workspace at the version at which they were deleted.  This is in fact why merging always uses the latest version by default, rather than what you have in your workspace.

    If this all seems a bit arcane, just follow the simple rule of not merging using labels for the versions, unless you spend the time to really understand what you are getting.
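
    In tf.exe terms, the distinction looks roughly like the following; the server paths, label, and workspace names are placeholders.

```shell
# Placeholders: $/Project/Main, $/Project/Branch, MyLabel, MyWorkspace;someOwner
# Recommended: merge up to the latest version (the default).
tf merge $/Project/Main $/Project/Branch /recursive

# Works, but easy to get wrong: label and workspace version specifiers
# can't include deleted items, so the merge may silently miss changes.
tf merge /version:LMyLabel $/Project/Main $/Project/Branch /recursive
tf merge "/version:WMyWorkspace;someOwner" $/Project/Main $/Project/Branch /recursive
```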

    The Deceptive Allure of Merging with Labels

    Labels are an interesting part of a version control system. In version 1 of TFS Version Control, labels do not contain deleted items.



  • Buck Hodges

    Stop piling on when the build breaks: Build checkin policy for Continuous Integration in Orcas


    Last fall, Clark Sell wrote a blog post called Stop, the build is broken!! that introduced a checkin policy that reported errors when the build was broken.  If you are using continuous integration where every checkin starts a build, you want folks to stop and fix build breaks when they occur, rather than pile on more checkins and perhaps make the problem worse (or at least harder to sort out).

    Since we've added support for continuous integration in Team Build for Orcas (screencasts, demo), we thought that it was a really great idea, and we've added a simple checkin policy in Orcas Team Build that does this (it will be in beta 1, but it is not in the March Orcas CTP).  It works differently than his does (he was constrained to what v1 had to offer), and it only works with Orcas clients (not with TFS 2005 clients, which would see an error message about not finding the checkin policy).

    Here's what the policy does.

    1. Request from the server a list of build definitions affected by this check in
    2. For each build definition returned where the last build was not "good," create a checkin policy error message containing the build definition's name and the user that triggered the build.

    If the policy detects a broken CI build, you'll get a message like the following when you attempt to check in.

    The last build of definition WebProjects_SimpleWebService, triggered by user buck, failed.

    A "good" build is one where compilation and testing were successful.  If something goes wrong after the test phase, it's still considered a good build.  This notion of a good build is the same as it was in v1, and it has some shortcomings.  We're going to refine it and make it more flexible in the release after Orcas.

    There's nothing to configure for this checkin policy, so you aren't stuck with maintaining a list of build definitions for the checkin policy to monitor.  The first step calls the same code on the server that is used by the continuous integration feature.  Based on the list of pending changes' server paths involved in the checkin and the workspace mappings for each of the build definitions, the server is able to quickly determine which build definitions are affected by your changes.  It's all automatic!

    See Walkthrough: Customizing Check-in Policies and Notes for how to enable a checkin policy for a team project.

    We're interested in your feedback, so post a comment and let us know what you think.

    Low-level details

    If you want to see how it works and see a little of the new Orcas Build API, I'll explain the details of how it works.  If you aren't interested in the low-level details, feel free to skip this.

    Here is all of the code that isn't just "boilerplate" checkin policy code.

    To prevent being called repeatedly in a short time span, it uses a timer to ensure that a minimum of 10 seconds elapses between calls.  There's nothing special about 10 seconds, and we may even lengthen it to a minute.  The important part is that since this policy makes at least one web service call, it needs to make sure that being evaluated often doesn't cause too many web service calls and create a performance problem.

    The first thing that the policy's Evaluate() method does is get a reference to the central object in the Orcas Team Build API, IBuildServer.  Next it gets the list of pending changes that are going to be checked in.

    Then it calls GetAffectedBuildDefinitions(), which does what I described in step 1 earlier.  It's a new web service method on the Orcas server that determines which build definitions are affected by changes to a list of server paths.  Having the workspace mappings for the build definitions stored in the Orcas database, rather than in the old WorkspaceMapping.xml file, is what makes this and continuous integration efficient and automatic.  Otherwise, you'd have to manually specify what paths affect each build definition, which would be a maintenance headache.

    After getting the affected build definitions, it checks to see if the artifact URI for the last build is the same as the artifact URI for the last good build.  If those are set to the same URI, the last build was good.  Otherwise, the most recent build was not a good build.  Here we also check to see whether the build type is a continuous integration build, either every checkin (Individual) or a set of checkins over some time period (Batch).

    If we have any broken builds, we need to get the details for the build so that we can report who may have broken the build.  For CI builds where it is building each checkin individually, it is really the person that broke the build (assuming this is the first broken build).  For CI builds where it's building the checkins from a period of time, such as the last 30 minutes, it might be the person who broke the build or it may not, since more than one person may have checked in.  Regardless, that's a good person to start with when investigating the broken build.

    public override void Initialize(IPendingCheckin pendingCheckin)
    {
        m_timer = new Stopwatch();
    }

    public override PolicyFailure[] Evaluate()
    {
        if (Disposed)
            throw new ObjectDisposedException(null);

        IBuildServer buildServer = (IBuildServer) PendingCheckin.GetService(typeof(IBuildServer));

        // If there are any pending changes, determine whether there are build definitions
        // affected for which the last build was not a good build.  Make sure that we don't
        // call this rapidly in succession.
        List<PolicyFailure> failures = new List<PolicyFailure>();
        PendingChange[] pendingChanges = PendingCheckin.PendingChanges.CheckedPendingChanges;

        if (pendingChanges.Length > 0 &&
            (!m_timer.IsRunning || m_timer.ElapsedMilliseconds >= 10000))
        {
            IBuildDefinition[] definitions = buildServer.GetAffectedBuildDefinitions(
                PendingChange.ToServerItems(pendingChanges));

            List<Uri> brokenBuilds = new List<Uri>();
            List<IBuildDefinition> brokenBuildDefs = new List<IBuildDefinition>();

            foreach (IBuildDefinition definition in definitions)
            {
                // Since this policy is geared toward folks using continuous integration,
                // only fail for build definitions that have a CI trigger.
                if (definition.LastBuildUri != definition.LastGoodBuildUri &&
                    (definition.ContinuousIntegrationType == ContinuousIntegrationType.Batch ||
                     definition.ContinuousIntegrationType == ContinuousIntegrationType.Individual))
                {
                    brokenBuilds.Add(definition.LastBuildUri);
                    brokenBuildDefs.Add(definition);
                }
            }

            if (brokenBuilds.Count > 0)
            {
                // Look up the broken builds to see who triggered them.
                IBuildDetail[] buildDetails = buildServer.QueryBuildsByUri(brokenBuilds.ToArray(),
                                                                           null, QueryOptions.None);

                // Create a failure for each broken build, skipping any build that wasn't
                // returned due to insufficient permissions or being deleted.
                for (int i = 0; i < buildDetails.Length; i++)
                {
                    if (buildDetails[i] != null)
                    {
                        String requestedFor = UserNameUtil.MakePartial(buildDetails[i].RequestedFor,
                            PendingCheckin.PendingChanges.Workspace.VersionControlServer.AuthenticatedUser);

                        failures.Add(new PolicyFailure(ResourceStrings.Format(
                            ResourceStrings.BuildPolicyBuildBroken,
                            brokenBuildDefs[i].Name, requestedFor), this));
                    }
                }
            }

            // Restart the throttle timer only after we actually hit the server.
            m_timer.Reset();
            m_timer.Start();
        }

        return failures.ToArray();
    }

    [NonSerialized]
    private Stopwatch m_timer;
  • Buck Hodges

    Orcas: Unit testing to be available in Visual Studio Professional


    Naysawn Naderi wrote a post today stating that many of the unit test features in Team System will be included in Visual Studio Professional in Orcas.

    Here's the entire post, but I've highlighted the part below that I think you'll want to read and perhaps give the team your feedback on.  If folks want this, you may want to open an item on Connect and then post here (or somewhere else) so that other folks can vote on it.

    Before you get too excited, beta 1 isn't out, and it isn't standing at the door waiting to be released.  I think that comment is just there so that whenever it does come out, you'll know that those features were consciously omitted.

    Unit Testing Trickling into Pro!

    Due to popular demand we have decided to add the majority of the unit testing features of Team System to the Pro Sku of Visual Studio.  With the release of Orcas, the support for authoring, generating and running unit tests, will become available to the users of the Pro Sku of Visual Studio. Pro customers will also take advantage of the some of the unit testing improvements we have added into Orcas, specifically generating for generics, performance improvements, the ability to unit test devices and better IDE integration (I’ll try my best to blog on the details soon). We are in general very open to the concept of trickling down other functionality introduced in Team System into other Skus over time, so please let us know if you feel that other items should trickle down as well. Keeping this pattern keeps us on our toes to ensure that we are always adding high value features higher up the stack. We love hearing your feedback and take your suggestions very seriously (I’m not just saying this - I have been continually surprised at how much time is spent on user’s feedback).

    Again, we are very excited about the trickle down as we hope that it will introduce the concept of unit testing to the average .Net developer.  Our team hopes that every developer will see the major benefits of unit testing and will regularly author and execute the tests throughout the product life cycle.    

    To the beta users: you may notice that a few pieces of the unit testing puzzle are missing from the Pro versions of Beta 1 – specifically, authoring test lists, remotely executing tests and generating code coverage results.  We have been debating whether some of these features should also trickle down and would be very interested in your feedback.  For example, the ability to author test lists has been excluded from the Pro Sku since many felt that its chief benefit comes to those who author test lists and run them as a part of a Build Verification Test prior to checking code into Team Foundation Server. Some, however, feel that it is still convenient to organize tests in a list regardless of check-ins.  How do you feel? Does the test list editor (formerly called Test Manager) belong in Pro?


  • Buck Hodges

    Outlook 2007: Viewing all RSS feeds in a single folder like an RSS "inbox"


    Since I've started using Outlook 2007 to manage my RSS feeds, I've wanted to have all of the posts show up in a single folder, much like new mail shows up in the inbox.  I don't like having to scroll through the list of folders and visit each one containing an unread post.  It's just too tedious unless you have only a couple of feeds that you read.  So, I finally set up an Outlook Search Folder to give me the folder view I've wanted.

    Here's what I did.

    1. Right click on Search Folders, and choose New Search Folder.
    2. Choose Create a custom Search Folder and click the Choose button
    3. Give it a name and then click the Criteria button
    4. Click on the Advanced tab
    5. Click the Field drop down button, select Frequently-used fields, and click on RSS Feed
    6. Set the Condition to be "is not empty" and click the Add to List button
    7. Click the Field drop down button, select All Mail fields, and click on In Folder
    8. Set the Condition to be "doesn't contain," enter "deleted" as the Value (without the quotation marks), and click the Add to List button
    9. Hit OK on all of the dialogs
    10. Sort the new search folder by Received, or however you like to see your items sorted
    11. Drag your new search folder up to the Favorite Folders list to make it convenient



  • Buck Hodges

    VSTS profiler: Installing only the command line tool


    I'm posting this mostly to remind myself of where it is.  This came across on the internal VSTS discussion alias today.

    Visual Studio Profiler has a standalone installation: look for vs_profiler.exe on the Visual Studio installation CD/DVD. Installing that package allows users to collect, analyze, and report performance data from the command line. The package does not include any GUI. It is a lightweight installation, suitable for production or testing environments.

    See the following article for more details on how to profile applications from the command line:


    If you are just getting started with profiling in VSTS, you may want to check out Jeff Atwood's quick walkthrough of profiling from the GUI: Simple Profiling with Team Developer.
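
    A typical sampling session from the command line looks roughly like the following.  The tool names come from the standalone profiler package, but the exact options vary by version, and MyApp.exe is a placeholder, so treat this as a sketch.

```shell
# Sketch of command-line sampling with the standalone VS profiler (options vary by version).
VSPerfCLREnv /sampleon                      # set up the environment for profiling managed code
VSPerfCmd /start:sample /output:myapp.vsp   # start the monitor, writing to myapp.vsp
VSPerfCmd /launch:MyApp.exe                 # MyApp.exe is a placeholder for your app
VSPerfCmd /shutdown                         # stop collection once the app exits
VSPerfReport myapp.vsp /summary:all         # turn the .vsp file into summary reports
```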


  • Buck Hodges

    How to get a complimentary copy of Visual Studio 2005 Standard Edition


    According to the following Microsoft web page, you can get a complimentary VS 2005 Standard Edition by attending "labcasts."  You'll have to go to the web page and read the rules, etc.  I thought I'd pass this along, since I stumbled across it tonight.

    The Mouse is Mightier than the Sword

    Defy All Challenges With Microsoft Visual Basic 2005 -- Microsoft Labcasts Show You How

    A Hands-On Experience

    Interested in leveraging your existing development skills to become more versatile and productive? Attend our new Visual Basic 2005 Labcast Series. Experience for yourself how to turn Microsoft Visual Basic 2005 into a mighty weapon against your development challenges.

    The Visual Basic 2005 Labcasts make it easy for you to evaluate Visual Basic 2005 in a convenient and virtual environment. Get in-depth training and one-on-one real-time assistance from Visual Basic experts. Select from six informative labcasts and find out how to develop creative and modern breakthrough applications by combining the power of Visual Basic 6.0 with Visual Basic 2005.

    Learn how to

    • Modernize and simplify installation of Visual Basic 6.0 applications
    • Add new data-bound forms to Visual Basic 6.0 applications
    • Introduce simple background threading to Visual Basic 6.0 applications
    • View and edit SQL Server™ data
    • Create data-centric Web applications with your Visual Basic desktop development skill
    • Keep Microsoft Word documents in sync with your database

    For a limited time, when you attend two Visual Basic 2005 Labcasts and submit the evaluation form for each session, you will be eligible for a complimentary* copy of Visual Studio 2005 Standard Edition.


  • Buck Hodges

    VSTS future releases: Orcas, Rosario, and Power Tools


    A new high-level roadmap for Orcas, Rosario, and Power Tools is now available.  The Visual Studio Team System Future Releases web page covers these topics at varying levels of detail, with the greatest amount of detail being supplied for Orcas, of course.  The Orcas TFS information is basically the same as what we've published before.

    Here's what it says about Rosario.

    Visual Studio Team System code name "Rosario"

    The next major release of Visual Studio Team System is code-named “Rosario” and will be released following the “Orcas” release. In this exciting release, we will be delivering new innovations to build on our award-winning Application Lifecycle Management (ALM) solution. Some of the major scenarios and features in Visual Studio Team System code-named “Rosario” will include:

    • Joint prioritization and management of IT projects through integration with Microsoft Project Server
    • Project management across multiple projects for proactively load balancing resources according to business priorities
    • Full traceability (inc. hierarchical work items) to track project deliverables against business requirements and the ability to conduct rapid impact analysis of proposed changes
    • Comprehensive metrics and dashboards for shared visibility into project status and progress against deliverables
    • Powerful new features to enable developers and testers to quickly identify, communicate, prioritize, diagnose and resolve bugs
    • Integrated test case management to create, organize and manage test cases across both the development and test teams
    • Testing automation and guidance to help developers and testers focus on business-level testing rather than repetitive, manual tasks
    • Quality metrics for a ‘go/no-go’ release decision on whether an application is ready for production and has been fully tested against business requirements
    • Rapid integration of remote, distributed, disconnected and outsourced teams into the development process
    • Easy customization of process and guidance from Microsoft and partners to match the way your team works
    • Improvements to multi-server administration, build and source control

    Only the last bullet mentions build specifically, but we have significant plans for Team Build in Rosario.  It's also involved to various degrees in several other bullets.

  • Buck Hodges

    Web interface now available for TFS: Microsoft acquires TeamPlain


    We have never had a web interface for Team Foundation Server work item tracking, much to the astonishment of our customers.  Well, there's a fix for that now.  And you can download it today (see below for details).

    Brian Harry wrote a post this morning about the acquisition of TeamPlain.

    Microsoft Acquires TeamPlain!

    Today we are announcing that Microsoft has acquired DevBiz Business Solutions, the makers of the popular TeamPlain Web Access for Team System.  TeamPlain is a web front end for VSTS that enables users to access the majority of TFS functionality from within a Web browser.  The focus of TeamPlain is on work item tracking but it also includes some valuable version control capabilities (like viewing history/change sets, diffing files, browsing the source base, etc.), some SharePoint integration, Reporting services integration, and some upcoming build support.  TeamPlain gives VSTS a new avenue to reach a broader array of people within the development team who don’t use Visual Studio today and don’t want to install Visual Studio clients on their machines.  It also improves reach by enabling some access from non-Windows clients.


    When will it be available to our customers?  Here's the answer, quoted from the same post (emphasis is mine).

    TeamPlain will become Microsoft Visual Studio Team System Web Access.  Effective today, TeamPlain is available, at no additional charge, to users who own a Team Foundation Server and can be downloaded from here.  It will be accessible by any user properly licensed with a TFS CAL.  Support will continue to be provided by the current staff via the DevBiz online forums.

    Over the next few months, we will be rebranding TeamPlain as a Microsoft product and running it through our release process.  When that is complete, we will be releasing it as a VSTS Power Tool, transitioning support to the Microsoft forums and beginning CSS (phone) support.  In the Orcas timeframe, we will be releasing Team System Web Access as an official, documented, localized, and officially supported component of Team Foundation Server.

    TeamPlain also provides some access to TFS version control over the web.

    So, if you are a TFS customer, please download it, start using it, and let us know what you think of it.  As with everything else, the sooner you send in your feedback, the more likely you'll see changes in upcoming releases.

  • Buck Hodges

    Vista sidebar build monitor and PowerShell scripts for Team Build Orcas


    Jason Prickett, a developer on Team Build, has written some very nice posts.  All of these posts only work with Team Build Orcas.

    The first one shows a very cool Vista sidebar gadget that Jason fiddled with a while back as a side project and finally got around to finishing off and making available.  His post has a screenshot as well.

    Monitoring Build Status in your Windows Vista Sidebar

    So, I have been wanting to create a sidebar gadget for a while, but I also wanted it to be something useful. In this post I have attached the source code for a sidebar gadget that monitors the build status of a particular build definition.

    Key Features:

    • Each instance of the sidebar gadget monitors a particular build definition.
    • An icon represents the status of the most recent build for that definition.
    • The text is a hyperlink to the build report for the most recent build.
    • It automatically polls (based on a JavaScript timer) the server to get the latest info.
    • As you can see, you can create as many instances as your sidebar will hold!


    Then he has two neat posts showing how to use PowerShell with the Orcas Team Build client API.  Apparently, James has got him hooked on PowerShell.

    Average Build Size

    So, I haven't actually gotten a request for this, but I thought it might be useful :)

    Let's say you are using the new features of Team Build like drop management and continuous integration. So far you have the binaries for 12 builds sitting on your server for definition X. You notice you are almost out of disk space on that server and need to purchase some more. But how much? After looking at your drop management policy for definition X, you see that the maximum number of builds you will need to keep is 20. But how much disk space does that correspond to? You could go to the drop location and sum up the size of every folder, but is there a faster way? Of course there is!

    In the following PowerShell script, I get the list of builds for definition X and then sum up the size of all files in the drop locations.
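
    Jason's script runs against real drop shares via the build OM.  As a rough stand-in for the idea, here is a small shell sketch that fabricates a tiny drop tree and averages the per-build sizes; the folder layout and file sizes are made up purely for illustration.

```shell
# Stand-in sketch: fabricate a drop tree, then average the bytes per build folder.
drop=$(mktemp -d)
mkdir -p "$drop/Build_1" "$drop/Build_2"
head -c 1024 /dev/zero > "$drop/Build_1/app.dll"   # 1 KB of fake binaries
head -c 3072 /dev/zero > "$drop/Build_2/app.dll"   # 3 KB of fake binaries

total=0; count=0
for d in "$drop"/*/; do
  sz=$(find "$d" -type f -exec cat {} + | wc -c)   # bytes of files in this build's drop
  total=$((total + sz)); count=$((count + 1))
done
echo "average build size: $((total / count)) bytes"
```

    The same loop-and-sum shape carries over directly to the PowerShell version against a real drop location.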


    Team Build OM thru PowerShell - Example GetAffectedBuildDefinitions

    Well, it's been a long time since I have had the time (okay made the time) to blog on something interesting. To make up for that, I have a quick entry that includes all kinds of interesting tidbits.

    This entry is based on Team Build's next version, which is available in a CTP release right now and will be in Beta very soon (don't ask, I don't know when). In the next version of Team Build, we have included an Object Model (OM) that wraps all the functionality of the Web Services. This makes writing your own apps that need build information much easier.

    As I talk about how to use the methods and objects in the OM, I will show examples using PowerShell commands. PowerShell is an incredible, relatively new shell for Windows that allows you to manipulate not just text output by command-line apps, but real objects returned from any managed library. If you want to know more (and you should), go to http://www.microsoft.com/PowerShell

    So, here's a quick example:

    Problem: What Build Definitions are building this C Sharp project?


  • Buck Hodges

    Aaron's back and has new posts on Team Build


    Aaron Hallberg took a month off after the birth of his daughter, Stella.  Now that he's back, he's got a couple of interesting posts.

    The first one applies to TFS 2005 (aka v1) and shows you how to pass custom properties to each solution that you build.  It's something that a lot of folks have needed to be able to do, but the solution with v1 is not good.  If you choose to do this, you will have to fix your build when you upgrade to Orcas (notice that I didn't say if you upgrade to Orcas ;-)).  In Orcas, we've made this really easy to do, so all of this only applies to TFS 2005 (v1).

    Passing custom properties to individual solutions in Team Build

    Gautam Goenka posted an article on this topic way back on April 20, 2006.  It included a targets file which overrode the standard Team Build CoreCompile target and allowed user-specified properties to be passed into the MSBuild task that Team Build uses to build the solutions in the SolutionToBuild item group.  This approach is fine if you want to pass the same custom property values into each solution in the SolutionToBuild item group, but what if you want to pass different property values into each solution?

    Manish Agarwal posted an article that could help get you started here.  His goal was to enable redirecting assemblies to solution-specific subdirectories, but it was easily extendable to passing other user-specified properties on a solution-specific basis.  Unfortunately, it also has some problems, including breaking the calculation of errors/warnings during the compilation phase of the build.

    Before pressing on, I should say that we do not recommend overriding the Core* targets in a Team Build build.  The primary reason here is that you will almost certainly be broken after you upgrade to Orcas if you override these targets, since most of them will be changing radically in that new version.  See this post by Buck Hodges for more details here.  The good news here is that the issues that caused people to override the Core* targets in Team Build v1 have been addressed in Orcas, so you should no longer find it necessary to do this sort of thing.

    Having said all that, attached you will find a new CoreCompile override that will allow you to pass custom property values into each solution via solution-specific metadata.  For example, if you wanted each solution to put its binaries into a solution-specific subdirectory (rather than the default behavior, where all binaries end up in a single directory), you could do the following:
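    Since the attachment isn't reproduced in this archive, here is a sketch of what that solution-specific metadata might look like.  The CustomProperties metadata name is hypothetical -- the real name is whatever the attached CoreCompile override reads -- and the paths are illustrative:

    ```xml
    <ItemGroup>
      <SolutionToBuild Include="$(SolutionRoot)\App\App.sln">
        <!-- Hypothetical metadata consumed by the custom CoreCompile override -->
        <CustomProperties>OutDir=$(BinariesRoot)\App\</CustomProperties>
      </SolutionToBuild>
      <SolutionToBuild Include="$(SolutionRoot)\Lib\Lib.sln">
        <CustomProperties>OutDir=$(BinariesRoot)\Lib\</CustomProperties>
      </SolutionToBuild>
    </ItemGroup>
    ```

    The override would then pass each solution's metadata value as properties on the MSBuild task invocation for that solution, which is what makes per-solution values possible where a plain global property cannot.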


    In his most recent post, Aaron describes some of the lower-level changes that we made to the targets file and build process in Orcas (v2) that significantly improve your ability to customize and extend the build process, including how to pass custom properties to each solution being built.  A great deal of what we have done here has been based on the feedback we've gotten from the forum, blogs, and other discussions with customers.

    Orcas Changes

    The March CTP of Orcas is now available (Buck has a post on this here), and with it comes your first chance to play around with the new and improved Team Build.  I'm pretty excited about the changes in the build process (defined in Microsoft.TeamFoundation.Build.targets), so I figured I'd mention some of them:



  • Buck Hodges

    Guidance on using branches in TFS version control


    Jeff Beehler announced the initial release of guidance on using branching and merging in your development process with TFS.  This has been a pretty popular request from folks.

    Branching guidance now available

    I'm extremely excited to announce the availability of branching guidance for Team Foundation Server.  This was a collaborative effort between the TFS product team and members of our "Ranger" program, who are focused on accelerating adoption of V1.  While the product documentation will tell you how each of our tools works, it doesn't provide insight into best practices for using them.  The guidance that we recently released attempts to fill that gap in the area of version control and how best to work with branches.

    Here's the high level overview:

    • Introduction
    • Parallel Development
    • Branching Defined
    • Creating Isolation in Team Foundation Server
    • General Branching Structure Guidance
    • Branching Strategies
    • Broad Areas of Branching Isolation
    • Creating Your Branching Strategy
    • Defining Your Code Promotion Model
    • Feature Crews: How Microsoft Does It
    • End-to-End Implementation Scenario
    • Appendix

  • Buck Hodges

    TFS shipped one year ago today!


    As Jeff Beehler points out, we shipped TFS one year ago today.  In the intervening time, we reorganized and completed the TFS development work on Orcas (minus a few design change requests (DCRs)), which you can try out in the March Orcas CTP (since it's based on TFS code from January, not everything is in there, such as build scheduling).  If you want to see a demo of the Orcas Team Build features, check out this Channel 9 interview with Jim Lamb and me.


  • Buck Hodges

    Orcas Team Build: WCF web services replace .NET remoting


    In Visual Studio Team Foundation Server 2005 (v1), the communication from the application tier (AT) to the build computer used .NET remoting.  This was the only component in Team Foundation Server that used .NET remoting, as all other communication used web services (see Team Foundation Server Security Architecture).

    As part of the .NET 3.0 framework that shipped in the Vista timeframe (the Orcas framework is .NET 3.5 while VS 2005 shipped .NET 2.0, if you are trying to keep the version numbers straight), Windows Communication Foundation (WCF) introduced the ability to create web services that run without IIS being installed.  This is great because the build computer is a case where we'd like to use web services, but we didn't want to require IIS on the build computer.

    Moving to web services means that the communication from the AT to the build computer is now SOAP/XML over HTTP rather than a binary .NET protocol.  It also helped us improve our error handling, which is cleaner and easier with web services and can work the way it does in the rest of the product.  Going forward, it should help with versioning (the version of the service is based on the URL to the web service).

    Moving beyond Orcas, we have bigger plans for the build agent (an Orcas build agent is a build computer running the build Windows service listening on a particular port -- basically, a computer name and port pair).  Using WCF fits much better into those plans.
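    For the curious, self-hosting a WCF service over HTTP without IIS takes only a few lines.  This is a minimal sketch of the pattern, not the actual build agent contract -- the interface, port, and binding below are purely illustrative:

    ```csharp
    // Sketch of WCF self-hosting (no IIS required), in the style the
    // Orcas build agent uses.  Requires a reference to System.ServiceModel.
    using System;
    using System.ServiceModel;

    [ServiceContract]
    public interface IBuildAgent
    {
        [OperationContract]
        string Ping();
    }

    public class BuildAgent : IBuildAgent
    {
        public string Ping() { return "alive"; }
    }

    class Program
    {
        static void Main()
        {
            Uri address = new Uri("http://localhost:9191/BuildAgent");
            using (ServiceHost host = new ServiceHost(typeof(BuildAgent), address))
            {
                // BasicHttpBinding speaks SOAP/XML over HTTP -- no IIS needed.
                host.AddServiceEndpoint(typeof(IBuildAgent), new BasicHttpBinding(), "");
                host.Open();
                Console.WriteLine("Listening on " + address);
                Console.ReadLine();
            }
        }
    }
    ```

    Because the endpoint is plain HTTP, probing it with telnet (as shown later in this post) returns an HTTP response rather than the binary .NET remoting preamble.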

    For a whole lot of other reasons, a v1 build computer cannot be used with an Orcas (v2) Team Foundation Server (AT).  You must use an Orcas build agent with an Orcas AT.  Starting with the first beta of Orcas (it's not in the March CTP), the build report will tell you if the build agent you specified is not an Orcas build agent.

    If you want to manually determine whether a particular service on a build computer is using .NET remoting or WCF web services, you can simply telnet to it, type some random characters, and hit Enter twice.  The output is different and is shown below.  The first one shows v1 and the second v2.

    c:\> telnet whidbey-buildcomputer 9191

    .NET☺☻☻♥☺♥☺☺hServer encountered an internal error. To get more info turn on
    customErrors in the server's config file.

    Connection to host lost.

    c:\> telnet orcas-buildagent 9191

    HTTP/1.1 400 Bad Request
    Content-Type: text/html
    Date: Fri, 16 Mar 2007 20:23:18 GMT
    Connection: close
    Content-Length: 35

    <h1>Bad Request (Invalid Verb)</h1>

    Connection to host lost.


  • Buck Hodges

    New add-in: Viewing other users' pending changes from within Visual Studio


    Fairly often, someone asks for the ability to see the pending changes in the system from within Visual Studio.  Running "tf status /user:* $/ /r" (or whatever you want) just isn't appealing to everyone.

    Ognjen Bajic posted about Ekobit's new free VS add-in that will allow you to view the pending changes in the system from within Visual Studio.

    List Pending Changes Browser Screenshot

    for Microsoft® Visual Studio® 2005 Team Edition for Software Developers
    (List Pending Changes Browser is free software (licence agreement))

    If you have ever needed to find out who has checked out files from your project in Visual Studio Team System source control, then the only solution was to browse through all the folders of your project in the Visual Studio Source Control window. Because this is such a common issue, Ekobit has developed a small Add-In for Visual Studio 2005 which can help you solve this problem.

    Check it out!


  • Buck Hodges

    Channel 9 interview and demo of Orcas Team Build


    The interview, which contains a demo, has finally been posted (you may remember it being mentioned in December).  I'd like to say thanks to Brian Keller for doing this!

    Continuous Integration with Team Build “Orcas”

    Jim Lamb and Buck Hodges on the Team Foundation Server team show off the new Continuous Integration support they are building for the "Orcas" release of Team Foundation Server! ‘nuff said. Check out the demo!

    see the video...

    This demo was done with a build near the end of the feature crew for CI, and I'm happy to say that it went really well (a demo is usually a good test!).  You'll know that the demo is real when we make a wrong choice and have to figure out what we did wrong.  It didn't take us long to figure it out, but we went and improved the experience of dealing with incorrect workspace mappings after this.  Making it easier to debug problems is one of the things that we've tried to do in Orcas.

    Scheduled builds are supported as well, but they didn't exist when this interview occurred.

    You can try it out for yourself in the March Orcas CTP.  It doesn't have build scheduling either, but it's got everything else you'll see in the interview.

    Be sure to check out Brian's other interviews of folks in the North Carolina office.

    [UPDATE 4/26/2007]  The final two interviews have been posted!


  • Buck Hodges

    Accessing Team Foundation version control from Ant and Java-based CruiseControl


    Martin Woodward of Teamprise announced the Ant and Java-based CruiseControl support for Team Foundation version control in his post on the release of Teamprise 2.1.  There's lots of good stuff in there, but this stood out to me in particular.  If you are using TFS and Java, this is a great addition.  They've even made the Ant and CruiseControl support available for anyone, not just Teamprise customers!

    Teamprise 2.1 Released

    Yesterday we made version 2.1 of Teamprise publicly available.  If you've previously installed Teamprise, then I recommend that you upgrade to this version as it includes a number of bug fixes that, while not critical, are definitely useful.  People who purchased a Teamprise Client Suite for version 1 or 2.0 get the upgrade free of charge.  For the full release notes, look here.

    <good stuff omitted -- check out the full post>

    Ant Tasks

    We now have a suite of Ant integrations with TFS to perform various useful tasks.  For more information, see the reference.  We've been using these for a while internally, and one of the biggest areas customers have been asking for help with is integrated build automation.  The Ant tasks allow you to talk to Team Foundation Server version control from Ant.

    One of the important things to note about the Ant tasks is that they work by calling out to the "tf" command line client.  We have written the tasks to be compatible with both the Microsoft tf.exe client and our own Command Line Client that comes as part of the Teamprise Client Suite.  Therefore, even if you are not a Teamprise customer, you can still download and use the Ant tasks on your Windows system with Microsoft's client installed.  If you want to run that same Ant script on your Unix / Mac boxes, then you can, using our client.

    Cruise Control Integration

    While TFS was in Beta 2, I wrote a CruiseControl.NET integration to TFS to ease adoption of the new tool as a version control and work item repository for existing .NET 1.1 projects.  Now I have written a similar integration to the original Java based CruiseControl (version 2.6) build server - again to ease adoption for existing Java projects.  In common with the Ant Tasks, this integration talks to TFS via the command line so you can use it with either the Microsoft or Teamprise command line clients.  For more information see the manual.


  • Buck Hodges

    Configuring the build to use the version control proxy


    If you have a version control proxy at a remote site and you wish to have Team Build use the version control proxy, you'll need to modify the registry of the account that's running the build service on the build computer.  Here's what you'll need to put into the registry.  You'll need to change "someproxy" to your proxy's name.  Also, if you need to do this with Orcas, you'll need to change 8.0 to 9.0 for this to have the desired effect.

    Windows Registry Editor Version 5.00
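    The key contents didn't survive in this archive.  As a sketch, and assuming the proxy setting lives in the standard per-user source control key for TFS 2005 (the path and value names are my best recollection -- verify them against a client where the proxy has been configured through the UI):

    ```
    [HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\8.0\TeamFoundation\SourceControl\Proxy]
    "Enabled"="True"
    "Url"="http://someproxy:8081"
    ```

    As noted above, replace "someproxy" with your proxy's name, and change 8.0 to 9.0 for Orcas.  Remember that this must go into the hive of the account running the build service, not your own.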




    The client code that reads the registry to get the proxy setting does not look at HKLM.  It only reads from HKCU.


  • Buck Hodges

    I'm in Redmond


    I don't travel much.  It's been almost exactly a year since I was in Redmond.  In March 2006, Ed Hintz and I gave a presentation on version control.

    This time Jim Lamb and I are here to learn about some of the internal build systems as part of our planning for the future.  Well, Jim's here to party with the PMs as well.  I just get to go to meetings.  :-(

  • Buck Hodges

    TFS Workspaces: 2005 and Orcas


    Martin Woodward has done a great job explaining workspaces and working folder mappings in a pair of recent posts.  For those who are new to TFS or haven't taken the time to think about the concepts and uses, it's a good idea to read them.

    While his posts apply to both Team Foundation Server 2005 and Orcas, version control adds two new advanced capabilities to working folder mappings in Orcas.  I call them advanced because many users won't have any need for them, as with Martin's comment about not having a need for cloaks at Teamprise.  However, if you need these features, they are really great.

    One-level mappings

    The first new feature is the one-level mapping.  In 2005 and by default in Orcas, mapping a server path to your local disk maps the entire tree under that folder to your local disk.  In other words, a working folder mapping is recursive.

    If you work on a large code base, you may want to map a top-level directory's immediate children and then selectively map subdirectories under it.  In 2005, the only choice was to map the top-level directory and cloak whatever you didn't want to get.  If you have a large number of directories or the new directories appear rather frequently, it's a painful solution.

    To specify a one-level working folder mapping, you simply add an asterisk to the end of it.  For example, the code that the TFS group works on lives in $/orcas/pu/tsadt in our own server (it hosts lots of projects for the Developer Division).  There are a lot of directories under $/orcas/pu/tsadt that most TFS developers don't need in order to work.  However, we all need the files in $/orcas/pu/tsadt itself (not recursively) in order to build.

    So, with Orcas TFS, we can specify the mappings as follows.

    $/orcas/pu/tsadt/*                  c:\code\orcas
    $/orcas/pu/tsadt/ddsuites/*         c:\code\orcas\ddsuites
    $/orcas/pu/tsadt/ddsuites/src/vset  c:\code\orcas\ddsuites\src\vset
    $/orcas/pu/tsadt/edev               c:\code\orcas\edev
    $/orcas/pu/tsadt/public             c:\code\orcas\public
    $/orcas/pu/tsadt/tools              c:\code\orcas\tools
    $/orcas/pu/tsadt/vscommon           c:\code\orcas\vscommon
    $/orcas/pu/tsadt/vset               c:\code\orcas\vset

    As you can see, the first two mappings are one-level mappings, meaning that only the files directly contained by the specified directory are downloaded.  Then, I map the directories that I care about.  Note that order is not important here: I didn't have to specify the one-level mappings first.  It just turned out that way when I sorted them.

    Unless I'm forgetting something (I'm writing this on an airplane), that should pretty much set up a standard TFS developer's workspace.  I'll spare you the mappings we had to use with 2005; this list is a lot shorter.
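    If you prefer the command line to the workspace editor, the same mappings can be set up with tf workfold.  This is a sketch -- I believe the Orcas tf client accepts the trailing /* on the server path, and the workspace name here is made up, so check tf workfold /? on your own install:

    ```
    tf workfold /map "$/orcas/pu/tsadt/*" c:\code\orcas /workspace:MyWorkspace
    tf workfold /map "$/orcas/pu/tsadt/vset" c:\code\orcas\vset /workspace:MyWorkspace
    ```

    Each /map call adds one working folder mapping to the named workspace, so the full list above is just a handful of these commands.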

    Mappings under cloaks

    The second new capability is active mappings under cloaked mappings.  In TFS 2005 we checked for this and treated it as an error.  There actually wasn't a good reason to disallow it.  We had put that rule in place back in 2003 when the design was different, and we just never removed the check.  Funny how that happens.

    In TFS 2005 the following would be illegal.  However, in Orcas it is supported.

    $/orcas/pu/tsadt                      c:\code\orcas
    (cloaked) $/orcas/pu/tsadt/framework
    $/orcas/pu/tsadt/framework/win/core   c:\code\orcas\core

    Here I've chosen to map all of tsadt, cloak the framework subdirectory, and then map framework\win\core.  Now, you have to be careful when you do this sort of thing, because you can quickly create a hairball set of mappings.  When you need it, though, it can be really useful.

    Why use these features?

    So why would anyone need to use either of these two features?  One reason is disk space.  We work in feature crews when developing new features, so you end up with multiple branches (copies, effectively) of $/orcas/pu/tsadt on your disk.  Disk drives are big these days, but you can run out of space with these multi-gigabyte trees mapped to your disk plus the gigabytes of binaries and installation files that they produce when you build them.

    Another reason is bandwidth.  We work in North Carolina, and our entire office shares a 10 Mbps line.  Even with the version control proxy, it's worth the time and effort to make sure that you don't bring down that extra gigabyte if you don't need it.

    Unless you work with really large branches, you won't care about these features.  But if you need them, you'll love them (and you can try them out with the March Orcas CTP)!


  • Buck Hodges

    TFS migration toolkit spec posted


    The Team Foundation Server migration toolkit spec has been published.  Particularly if you are interested in writing version control or work item tracking conversion tools, be sure to check it out and give them feedback.


  • Buck Hodges

    Eric Lee's Orcas Team Build Screencasts


    Eric Lee of Counterpunch Software (former Softie) has started posting about Orcas features in the March CTP.

    First, he goes through a list of new features with some screenshots.

    A Baker's Dozen of New Features in Orcas

    There is nothing like having a new version of Visual Studio to play around with :) Not that Visual Studio 2005 is all that old, but the March CTP of Orcas was just released. I picked up the VPC version last night and gave it a whirl.

    So far it looks great! I was expecting a very small upgrade to VS 2005, but there are some really substantial features here.

    Here are a few features that caught my eye in no particular order.

    Then he posted a couple of nice Orcas Team Build screencasts.  Watch these to see continuous integration and more in action.  They're short, and he keeps moving, so you won't get bored.  It's cool to watch it work for someone else, since no one outside of DevDiv had used it until the March CTP release.

    Continuous Build!!

    Hey guys,

    One of the most frequent feature requests made to the Team Foundation Server development team is for continuous build.

    Continuous build can take many forms, but the general idea is to automatically kick off a team-wide build when a developer makes a check-in. Some development teams like to gather up a few check-ins before kicking off the build.

    The idea is to try and catch changes that break the build as soon as possible.

    The exciting news is that with the Orcas release of Team Foundation Server, there is a feature in the build called Triggers. These triggers enable you to determine when a build should be kicked off. One such trigger will start a build with each checkin; so we finally have Continuous Build!

    I made a quick, casual video about the feature, check it out if you get the chance.

    Click here to view the screencast


    Build Agents, Queuing and more...

    Hey guys,

    The Team Build development team must have been working pretty hard since the last release. The March CTP has a really mature-feeling build of this feature.

    It really looks like a lot of thought has been put into how to make Team Build easier to use and more powerful.

    There are a number of things besides continuous build that are worth a look - build agents are managed in a much better way, there is a build definition editor, builds are 'queued' rather than just started, and build results are much easier to work with.

    I made a casual, grab-bag style video of these new features; check it out if you get the chance.

    Click here to view the screencast


  • Buck Hodges

    How to enable a checkin policy via the version control API


    I recently needed to test a new checkin policy that I wrote.  In order to do that, I needed to enable a checkin policy for the unit test's team project using the version control API.  I was a little surprised when it wasn't quite as obvious as I had hoped, and I had to poke around in the source to figure it out.  So, I thought I'd post the code snippet.

    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.VersionControl.Client;

    TeamFoundationServer tfs = new TeamFoundationServer("http://yourserver:8080");
    VersionControlServer vcs = (VersionControlServer) tfs.GetService(typeof(VersionControlServer));

    TeamProject tp = vcs.GetTeamProject("SomeTeamProject");
    MyCustomPolicy customPolicy = new MyCustomPolicy();
    bool foundCustomPolicy = false;
    foreach (PolicyType policyType in Workstation.Current.InstalledPolicyTypes)
    {
        if (policyType.Name == customPolicy.Type)
        {
            tp.SetCheckinPolicies(new PolicyEnvelope[] { new PolicyEnvelope(customPolicy, policyType) });
            foundCustomPolicy = true;
        }
    }

    if (!foundCustomPolicy)
    {
        throw new ApplicationException("Something's wrong -- MyCustomPolicy is not installed");
    }


  • Buck Hodges

    Visual Studio SDK 4.0 is now available (includes updates to TFS extensibility documentation)


    The Visual Studio SDK 4.0 was released today.  Here's the announcement.

    On behalf of the VS Tools Ecosystem team, we are pleased to announce that after months of hard work, we have completed shipping the VS SDK 4.0 RTM release! We have done tons of work to make this release friendly to developers who are new to Visual Studio extensibility. It is available for immediate download on the Microsoft Download Center.

    We are excited to present some of the new features included in this release:

    • VS SDK Browser – the new entry-point to the entire SDK; includes new QuickStart Tutorials and a completely revamped sample browsing experience
    • Package Load Analyzer – allows developers to easily debug package load failures
    • Toolbox Installer redistributable package – lets component vendors simplify Toolbox deployment; a sample is included
    • TFS Contents – new and updated TFS samples and documentation
    • Sandcastle – a new set of tools for generating managed class library documentation
    • Updated Setup experience
    • And much more!

    Here's what's new in it for TFS in particular.

    Team Foundation Server Extensibility

    • New Work Item Custom Control API Sample and Help Documentation
      • Demonstrates how to use the Work Item Custom Control APIs introduced in Visual Studio 2005 SP1.
    • New Version Control Merge History API Documentation
      • Demonstrates how to use the enhanced QueryMergesWithDetails API introduced in Visual Studio 2005 SP1.
    • Updated Team Explorer PcwTESample
      • This sample can be found in \VisualStudioTeamSystemIntegration\Team Explorer and Project Creation\PcwTESample\.
      • Fixed problems that prevented the sample from building.
    • Updated Team Foundation Server Collectibles Sample
      • This sample can be found in \VisualStudioTeamSystemIntegration\Team Foundation Core Services\ExtendingTeamFoundationSample\.
      • Re-written to use best practices.
      • Updated ReadMe.doc to reflect new architecture.
    • All Team Foundation Server Public Assemblies Updated to Visual Studio 2005 SP1
      • These assemblies can be found in \VisualStudioIntegration\Common\Assemblies\.
    • Updated Team Foundation Server BisSubscribe.exe Tool
      • This tool was updated to the RTM version and can be found in \VisualStudioTeamSystemIntegration\Utilities\EventSubscriptionTool\.
      • The SDK version of this tool now provides unsubscribe capability.
    • Added Visual Studio Team Explorer Package GUIDs, Menu IDs, and Group IDs Header Files
      • The newly provided header files contain the constants required for integrating with existing Team Explorer shortcut menus when you create Visual Studio Integration Package projects. The TFS_VC_IDs.h file can be found in \VisualStudioTeamSystemIntegration\Version Control\ and the TFS_WIT_IDs.h file can be found in \VisualStudioTeamSystemIntegration\Work Item Tracking\.

