Visual Studio Build Process, VC++ Libraries and Check-in Process

Hi again. My name is Ale Contenti and I’m the dev lead of the VC++ Libraries team. You might have seen me previously in my Channel 9 interviews. I’m now in Italy for a short vacation, and I’ll be in Barcelona for TechEd starting Nov 5. I’m really excited to go there and mingle with a lot of VC++ and Visual Studio users! After TechEd, Steve Teixeira (Group Program Manager for Visual C++) and I will be in Munich, Germany to meet with customers.

Today I want to talk about how we check in changes to the VC++ Libraries, and how the libraries are used by the rest of VC++ and Visual Studio.

You may know that the entire Visual Studio product is built in our lab every night starting at 9pm (usually excluding Friday and Saturday nights). The VC++ Libraries are among the first things built. Why? Because these libraries are immediately used by the rest of the Visual Studio components!

One of the first libraries built is the C-RunTime (or CRT). The CRT consists of three main parts:

·         The header files (like stdio.h)

·         The import libraries (like msvcrt.lib)

·         The DLLs (like msvcr90.dll)

We build these three parts and publish them to a binaries output folder, much as happens for a normal VC++ project.

Additionally, the header files and the import libraries are “published” to another folder. The rest of the Visual Studio components (e.g. the VC++ compiler cl.exe or the C# compiler csc.exe) are built using these newly published include files and import libraries.

This has a few interesting implications:

·         The entire Visual Studio is always built with the latest VC++ Libraries (CRT, STL, ATL and MFC)

·         The entire Visual Studio will load the latest VC++ DLLs at runtime

·         If there is a regression in the VC++ Libraries (for example an issue in widely used functionality), the effects will be seen in the rest of Visual Studio as well

·         Each change to the VC++ Libraries headers and libs needs to be carefully tested to make sure the build is not broken and the Visual Studio components work correctly

To make sure each VC++ Libraries check-in will not break the rest of Visual Studio, we rely on:

·         An extensive set of regression tests

·         An automatic check-in system which ensures there are no build breaks and no test failures

The regression tests are usually updated with the unit tests which accompany the code changes.

The automatic check-in system uses more than 20 machines to build different parts of Visual Studio and then runs a lot of tests against the newly built binaries. When everything looks OK (build and tests), the automatic system (we call it “the gauntlet” :-)) will take care of checking in the changes and sending mail with the check-in details.

I hope this was interesting and insightful. Let me know if you have any questions or comments.

Ale Contenti

VC++ Libraries Dev Lead

  • Ale,

    Thanks, that was very interesting.  I find it intriguing how different products are built and go together.

    I know you have a bunch of custom test/build tools, but do you guys use Visual Studio Team System as your source/build foundation?  Or, leverage any of the TS workflow capabilities?


  • Has enough time passed so we can now pile on?

    Could you reduce the code bloat in the statically linked libraries? Seriously, the startup code alone needs serious refactoring. (And I'm pretty sure the MFC libraries could be trimmed, er refactored to reduce size, by 20%.)

  • "If there is a regression in the VC++ Libraries (for example an issue in a widely used functionality), the effects will be seen in the rest of Visual Studio as well"

    That needs a "sometimes".  This kind of test is a useful one, one of many, but it is not as big a test as you think it is.

    One time I made some adjustments to a C compiler to get it running on a new platform and got it working as a self-compiler on the new platform.  I thought I'd tested it pretty well, but I was wrong.  The result still failed to compile a Fortran compiler.  It turned out that at the time of a successful self-compile, I'd made less than half of the adjustments that the C compiler really needed.  Add a few applications and the number of necessary adjustments would surely increase further (that didn't happen because the company was testing me, not making a product out of it).

  • I'm curious to know how you use an SCM system to do this. Could you comment on how the code you want to check in ends up on the 20+ test machines before actually being checked in? Have you built your own 'pre'-SCM system or something like that?

  • Since Ale is out of the office right now at TechEd Europe, I'll try to step in and answer any questions I know about (and maybe even some that I don't ;)).

    Jared - right now we've got kind of a mix between our old custom toolsets and VSTS / TFS.  The problem is that across the division there is still a lot of custom tooling based on our internal source control and bug tracking systems for instance (that MS has been using for years now).  We have source control mirrored in both TFS and our custom system so that people can use either one, as we do with our bug tracking (or at least we did at one point - things may have changed since I last investigated).  We do track features and that sort of thing in VSTS.  Hopefully we will move more fully to Team Foundation Server in the next release as more tools are migrated over.  

    Joe - Unfortunately, refactoring for a library is usually not a very good option, since ideally we maintain as much backwards compatibility with previous versions as possible (either binary or source compatibility, depending on the scenario).  I do agree that there are bits and pieces we could trim, and that we actually do trim as a result of our code-coverage testing initiatives (for instance, Gopher support in MFC's networking classes might be a candidate), but in general it's hard to remove things or make them work differently without breaking users, unless we can give them an equal benefit.  Anyway, I'm not sure how that relates to our build and check-in processes, but I guess I'm just saying that it's a hard issue to deal with, although maybe not impossible.  As disk space and bandwidth become cheaper, hopefully we're still getting a free lunch in terms of our customers' hardware getting better and faster and able to deal with it (really, the VC++ libraries are quite small compared to, say, .NET or Java).  We're definitely open to feedback in this area, though.

    Norman - you are definitely correct.  My current job (SDET on libraries) exists because you are correct.  Luckily, self-build is not our only verification mechanism; we have lots of additional testing.  Maybe Ale's point is just that by building the rest of VS with our libs, we get at least some immediate coverage.

    Legolas - our source control system has the concept of a "changelist" (I think it's called a "shelveset" in TFS), which is basically a list of files that have been changed that we intend to check in.  My understanding is that the gauntlet system takes that changelist, applies it to its local source control enlistment, builds the product configurations and then deploys it on a bunch of machines where tests can be run.  Then, if all the tests pass, the gauntlet machine will check in the changelist for you.
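    That flow can be sketched as a simple pipeline in which every stage must pass before the check-in happens. This is only an illustration of the idea; all the function names below are made up, not the real gauntlet internals:

```c
/* Hypothetical sketch of a gauntlet-style pre-check-in pipeline:
 * every stage must succeed before the change is submitted.
 * All names are illustrative; this is not the real system. */
#include <stddef.h>

typedef int (*stage_fn)(void);   /* each stage returns 1 on success */

static int apply_changelist(void) { return 1; }  /* stub: sync + patch   */
static int build_configs(void)    { return 1; }  /* stub: build product  */
static int run_tests(void)        { return 1; }  /* stub: deploy + test  */
static int submit_and_mail(void)  { return 1; }  /* stub: check in, mail */

int run_gauntlet(void)
{
    stage_fn stages[] = { apply_changelist, build_configs,
                          run_tests, submit_and_mail };
    for (size_t i = 0; i < sizeof stages / sizeof stages[0]; ++i)
        if (!stages[i]())
            return 0;            /* any failure blocks the check-in */
    return 1;                    /* all stages passed: change is in */
}
```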

    Anyway, I'm not the expert on any of these areas, but that's my understanding of them.  Hopefully when Ale gets back he can correct anything I got wrong :).  Thanks for the interest.  

    Ben Anderson

    Visual C++ Libraries Team

    This posting is provided "AS IS" with no warranties, and confers no rights.

  • Thanks Ben for answering a bunch of questions while I'm here in Barcelona!

    Let me say the atmosphere here is great! Lots of people, lots of talks, lots of VS users. It's just super cool. And TechEd is wonderfully organized!

    Anyway, back to the questions. I agree with everything Ben said.

    Also, about the verification Norman is talking about: it's not only a self-build. Each new Visual Studio component does use the newly built CRT. You're correct, though, in saying this is not complete coverage. As soon as we discover missing coverage in our tests, we beef them up with adequate regression testing, as Ben pointed out.

    About refactoring the libs, Joe: Ben has a good point. I'm interested in which code bloat you're talking about in the CRT startup. Most of the things we do in the CRT startup are pretty basic, like initializing the streams, the global variables, the command line, etc. Is there something you don't use?
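    For readers unfamiliar with those startup steps, they can be sketched roughly as follows. This is an illustrative C sketch only; the names and ordering are invented for the sketch and do not match the real VC++ CRT internals:

```c
/* Rough, illustrative sketch of what a CRT entry point does before
 * handing control to main(). All names here are made up; the real
 * VC++ CRT routines (and their order) differ. */
static int    fake_argc;
static char **fake_argv;

static void init_streams(void) { /* set up stdin/stdout/stderr here */ }
static void init_globals(void) { /* run static initializers, etc.   */ }
static void init_cmdline(int argc, char **argv)
{
    fake_argc = argc;           /* the real CRT parses this from the OS */
    fake_argv = argv;
}

/* stand-in for the user's main() */
static int user_main(int argc, char **argv)
{
    (void)argv;
    return argc > 0 ? 0 : 1;
}

/* stand-in for the real entry point: init everything, then call main */
int crt_startup(int argc, char **argv)
{
    init_streams();
    init_globals();
    init_cmdline(argc, argv);
    return user_main(fake_argc, fake_argv);
}
```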

    Thanks for all the comments.

    Ale Contenti

    VC++ Libraries Dev Lead

  • "there is still a lot of custom tooling based on our internal source control and bug tracking systems for instance (that MS has been using for years now)"

    You mean you weren't dogfooding Source Safe?

  • Hi,

    I have a problem building a large VC++ project. My project is very big; I have all its obj files in a directory, but when I build the project, it starts to recreate all of them again, which takes a long time.

    Is there any way to only link the previously created obj files in the VC++ IDE?

    (I can't use the Link.exe command line, because my input files are very numerous and I can't name all of them on the command line.)

    My e-mail is



    Sara, this looks like a dependency problem in the project settings. Obj files should not be rebuilt if no changes were made to the cpp files.

    More information is required, though, to identify what is causing this. Can you open up a forum thread on the Visual C++ General Forum and provide more information there? Please try the following steps:

    - from Tools > Options > Projects and Solutions > VC++ Project Settings, set "Build Logging" = Yes and "Show Environment in Log" = Yes

    - clean and rebuild the project

    - build the project a second time

    - take a look at buildlog.htm and identify which files are being rebuilt during the second build

    - check timestamps for those files (any cpp datetime should be greater than the datetime of its corresponding obj file)

    - if nothing is obviously wrong at this stage, it might be a problem with the file settings, and it would be useful for us to take a look at the command line switches for that file and the environment variables, so please post these on the forum thread too.
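    The timestamp rule in the steps above (as corrected in the follow-up comment below: a cpp must not be newer than its obj) can be expressed as a tiny check. The stat()-based wrapper is a POSIX-flavored illustration, and the helper names are hypothetical; the real build uses the project's own dependency data:

```c
/* Sketch of the up-to-date check: an obj file is current only if its
 * source cpp is not newer than it. The helper names and the stat()
 * wrapper are illustrative, not part of the VC++ build system. */
#include <sys/stat.h>
#include <time.h>

/* core rule: the cpp datetime must not be greater than the obj datetime */
int obj_up_to_date(time_t cpp_mtime, time_t obj_mtime)
{
    return cpp_mtime <= obj_mtime;
}

/* convenience wrapper that reads the two timestamps from disk */
int files_up_to_date(const char *cpp_path, const char *obj_path)
{
    struct stat cpp_st, obj_st;
    if (stat(cpp_path, &cpp_st) != 0 || stat(obj_path, &obj_st) != 0)
        return 0;               /* a missing file means: must rebuild */
    return obj_up_to_date(cpp_st.st_mtime, obj_st.st_mtime);
}
```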


    Marian Luparu

    Visual C++ IDE

  • Oops... My comment regarding timestamps should read like this (changes marked between **):

    - check timestamps for those files (any cpp datetime should *not* be greater than the datetime of its corresponding obj file)

    Sorry for the error.

