Hello, this is Andy Rich from the Visual C++ front-end team. Today, I’ll be discussing the use of precompiled header files (aka PCH files) in our new intellisense architecture.
Back in May, Boris briefly mentioned an intellisense optimization based on precompiled header technology. This post will elaborate on that comment by providing a glimpse into how Intellisense PCH files (or iPCH files) work. We’ve all become accustomed to precompiled headers improving build throughput, and now in Visual Studio 2010, we use the same technology to improve intellisense performance in the IDE.
The VC++ 2010 intellisense engine mimics the command-line compiler by using a translation unit (TU) model to service intellisense requests. A typical translation unit consists of a single source file plus the header files it includes (and the headers those headers include, and so on). The intellisense engine exists to answer questions for the user, such as what a particular type is, what the exact signature of a function is (and its overloads), or which variables available in the current scope begin with a particular substring.
In order to provide this information, the intellisense engine must first parse the TU just as the command-line compiler does, recursively parsing all #include files listed at the top of the source file before parsing the rest of the file. Thanks to C++ scoping rules, we know we can skip every method body except the one you are currently in, but, other than this optimization, the rest of the translation unit must be parsed to give an accurate answer. We refer to this as the “pre-parse.”
Pre-parses are not always required, as users spend much of their time writing code inside a local scope. By carefully tracking user edits, we can determine whether a change requires a new pre-parse; when one does, we throw away the old pre-parse and start again.
So, even when you aren’t editing header files, they must be repeatedly re-parsed as part of the pre-parse. As a translation unit grows, these parses require progressively more CPU and memory, and intellisense performance drops. Parsing is slow, and parsing a lot of code (as in a complex translation unit) can be very slow; 3 seconds is not uncommon, and that is simply too long for an intellisense response.
Luckily, there is an optimization developed for command-line compilers that can also be applied to the intellisense compiler: pre-compiled headers (PCHs). The PCH model presupposes that your translation units mostly share a lot of the same common includes. You inform the compiler of this set of headers, and it builds a pre-compiled header file. On subsequent compilations, instead of re-compiling this set of headers, the compiler loads the PCH file and then proceeds to compile the unique portion of your translation unit.
There are a few caveats to this model. First, the “common” portion of your headers must be the first thing compiled in each translation unit, and the headers must appear in the same order in every translation unit. Most developers refactor their headers to provide a single common header for this purpose; this is what stdafx.h in the Visual C++ project templates is intended for.
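As a sketch of that layout (the file names are the Visual C++ template defaults, and /Yc and /Yu are the real cl.exe switches the IDE drives behind the scenes; the particular headers listed are just placeholders):

```cpp
// stdafx.h -- the "common" header every .cpp includes first; the PCH is
// built from exactly this set of includes.
#pragma once
#include <string>
#include <vector>
#include <map>
// ...plus large, rarely edited headers such as <windows.h>

// stdafx.cpp -- exists only to create the PCH (compiled with /Yc"stdafx.h")
#include "stdafx.h"

// widget.cpp -- an ordinary translation unit (compiled with /Yu"stdafx.h");
// stdafx.h must be the very first include so the compiler can substitute the
// precompiled state for it and parse only what follows.
#include "stdafx.h"
#include "widget.h"   // project-specific, frequently edited headers come after
```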
In general, if you have PCH set up for use with the build compiler, the intellisense compiler is able to pick up those PCH options and generate a PCH that can be used. Because the intellisense compiler uses a different PCH format from the build compiler, separate PCH files are created for the use of the intellisense compiler. These files are typically stored under your main solution directory, in a subdirectory labeled ‘ipch’. (Future releases may have the command-line and intellisense compilers share these PCHs, but for now, they are separate.)
The intellisense compiler can load these iPCH files to save not only parse time, but memory as well: all translation units that share a common PCH will share the memory for the loaded PCH, further reducing the working set. So, a properly set-up PCH scheme for your solution can make your intellisense requests execute much more rapidly and reduce memory consumption.
Here are some important things to keep in mind when configuring your project for iPCH:
· iPCH and build compiler PCH share the same configuration settings (configurable on a per-project or per-file basis through “Configuration Properties->C/C++->Precompiled Headers”).
· The iPCH should represent the largest set of common headers possible, except for commonly edited headers.
· All translation units should include the common headers in the same order. This is best configured through a single header file that includes all the others, with each .cpp file including only this “master” header file.
· You can have different iPCH files for different translation units – but only one iPCH file can be used for any given translation unit.
· The intellisense compiler will not create an iPCH file if there are errors in the headers it covers – open the ‘error list’ window, look for any intellisense errors, and eliminate them to get PCH working.
· If you feel that an iPCH has somehow become corrupted, you can shut down the IDE and delete the iPCH directory.
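The “single master header” pattern from the list above can be sketched as follows (all file names here are hypothetical):

```cpp
// common.h -- hypothetical "master" header: the one file every .cpp in the
// project includes first, so all translation units see the same headers in
// the same order and can share one iPCH.
#pragma once
#include <vector>
#include <string>
#include "engine.h"       // hypothetical stable project headers
#include "network.h"

// render.cpp -- starts with the master header, then local includes
#include "common.h"
#include "render_local.h"

// audio.cpp -- same common prefix, so it reuses the same iPCH as render.cpp
#include "common.h"
#include "audio_local.h"
```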
I just want to say that even though VS2005/VS2010 are slower than VC6, they provide a much better user experience when it comes to functionality. Very happy about the introduction of iterator debugging, a standard STL, code analysis, and C++/CLI interop to integrate with .NET code. And I look forward to intellisense starting to work again (it didn't work in VC6 either), and to the "auto" keyword for easier use of template classes.
But yes, the pure .NET developers are the ones getting the most out of the new Visual Studio releases. And if I were going to start a new project, I would definitely do it in C# instead of native C++.
Btw, I was happy when I read this:
> Suggestions to “delete your .ncb file” (due
> to DB corruption) in order to restore
> functionality of Browsing/Intellisense are a
> thing of the past.
And then you suddenly say:
> If you feel that an iPCH has somehow become
> corrupted, you can shut down the IDE and
> delete the iPCH directory.
Brand new caches that can become corrupt :)
<i>Brand new caches that can become corrupt</i>
Is it not better to have had caches and lost them, than to never have had caches at all?
> Is it not better to have had caches and lost
> them, than to never have had caches at all?
I love caches, I cannot live without them. But I would prefer that they didn't get corrupted on a daily basis (like the NCB file does), or at least that they could self-repair.
Speaking of C++/CLI support: I've stumbled on a post on Microsoft's Connect saying that Managed Incremental Builds are being dropped from Visual Studio 2010. Is this really true?
This was introduced with such fanfare in Visual Studio 2008, along with claims of how it improved build times. So why is it being dropped now? Is this part of a trend of downgrading the C++/CLI experience?
I know some people really dislike C++/CLI but it can be incredibly useful and in our situation unavoidable, so I hope it does not become deprecated anytime soon.
This is where I found out it was being cut:
Regarding the C++/CLI "dropdown", I guess the problem is that supporting C++/CLI is becoming prohibitively expensive for MS, so they have no choice but to focus on the most common scenario (i.e. producing managed libraries with heavy interop stuff) and drop all the convenience features. After all, we have no choice but to follow, because no one else in the industry is developing C++/CLI compilers (see http://www.mono-project.com/CPlusPlus; in short, C++/CLI is great, but way too complex).
The bottom line: if you rewrite your apps in C# and use C++/CLI only when you have no choice, you’ll be fine. Intellisense will eventually be supported, and the compiler will be robust and reasonably fast.
> .NET is where the business is for Microsoft, so it is obviously their choice, and they allocate resources and time to develop that rather than the age-old C++
Keep in mind that both .NET and C++ are heavily used within Microsoft, and aren't just external products. Even VS2010, for all its WPF goodness, still has a _lot_ of native code in it, and that's all C++, not to mention Windows etc. And the quality of developer tools translates directly to the quality of products.
Visual Assist (a plugin) can add intellisense to C++/CLI projects (it works with VS2010 beta 2). In addition, as far as I know, Microsoft is also working on this, but it will not be done before the VS2010 release, so it will arrive in a later version or service pack.
Is there a fix available for the problem with PCH on VS 2008 and Windows 7?
Yes, the problem in VS2008 is fixed, and a patch is available here: https://connect.microsoft.com/VisualStudio/Downloads/DownloadDetails.aspx?DownloadID=25785&wa=wsignin1.0 (This patch goes on top of the VS2008 SP1 compiler, so SP1 is a prerequisite.)
For anyone who is interested, you can read more about the PCH rebasing issue here: http://blogs.msdn.com/vcblog/archive/2009/11/12/visual-c-precompiled-header-errors-on-windows-7.aspx
What are the requirements if I write a plugin for Visual Studio 2010 that plugs in a different C++ compiler?
Say the RealView ARM compiler, GCC for Windows, or the Xbox 360 compiler?
Oops, my question was unclear.
Will Intellisense work automatically if I write such a plugin?
Microsoft, once again you suck. In earlier releases we just ticked a box and it did the precompiled headers automatically. Now we have to screw around with a whole mess of stuff to compensate for the fact that you guys couldn't code your way out of a wet paper bag. You're useless.
Boffo - I was simply explaining in detail how PCHs are used by our compiler and intellisense compiler, and trying to give a simplified overview of the PCH model. (And offer some tips on optimizing your PCH experience.)
Nothing has actually changed for how PCH is configured through the IDE. You should still be able to check a box and have it work exactly as it did before.
(Though hopefully now you have more information about what selecting that box does behind the scenes...)
Kingofthebongo, a plugin like you describe should not affect IntelliSense at all in Visual Studio 2010. If you are asking whether the compiler you add will be used to provide IntelliSense, the answer is no. That would likely require quite a bit more work on your part. But the existing IntelliSense provider should continue to work just fine.