Hi, I am James Wang, an SDET on the VC++ Compiler Front End team. Currently, I am working on designing the test architecture for the new IntelliSense engine. I am responsible for designing tests that make sure the IntelliSense engine gives correct answers for quick info, member list, parameter help, etc.
Currently we have a test suite that directly tests the IntelliSense feature. One of its drawbacks, however, is that it targets only a very limited set of scenarios, so test coverage becomes a major issue. Given the vast variety of C/C++ constructs, testing every scenario is an impossible, or at least very time-consuming, task. In the meantime, we do have ~100,000 tests targeting the compiler and libraries that cover the majority of C/C++ constructs. Wouldn't it be better to adapt these tests for the purpose of IntelliSense engine testing?
To accomplish this, I wrote a tool that will analyze any arbitrary C/C++ code and generate IntelliSense test sites. The tool will also generate the expected results for each test site. So for any arbitrary C/C++ code, we can generate a set of self-verifying IntelliSense tests for quick info, member list, parameter help, etc. As a result, we can reuse the ~100,000 tests for the purpose of IntelliSense testing.
Two things make this possible:
1. A better intermediate language (IL) representation. After parsing the source code, the new IntelliSense engine generates a better IL and exposes a wealth of APIs for traversing and querying various compiler artifacts. This allows me to figure out where to create IntelliSense test sites and what the expected results should be. For example, while traversing the IL, if I see a field operator (->), I may want to generate a member list test. For the expected results, I can query the type of the left operand of the field operator and retrieve the expected member list.
2. A componentized IntelliSense engine. The new IntelliSense engine exposes APIs that allow engine-level testing, so I can do IntelliSense testing without the IDE. Otherwise, running a large number of IntelliSense tests would be impractical due to the time needed to start and shut down the IDE for each test.
Hopefully, this new approach will improve the test coverage for the new IntelliSense engine and drive up its quality.
are you saying this will work better in orcas? in vs 2005, with my project, it isn't working _at_ _all_. (but the same goes for "go to definition": also doesn't work at all...)
My biggest problem is not so much accuracy, but performance and scalability. On a dual-processor Opteron 256 workstation with 4 gigabytes of memory (running Windows XP 64), we are still forced to disable IntelliSense, because it takes minutes to respond. It is not possible to use the editor, because even the displaying of tooltips brings everything to a standstill. SP1 for VS2005 did not fix these issues for us.
Your tests should without doubt establish baseline performance metrics. IntelliSense was okay in VS 2003 and even worked in VS 2005 Beta 1 for the same project we are using now, but with the final VS 2005 release I have to agree with Thomas, it just does not work.
My problem with IntelliSense is that it just stops working at some point. The first time you rebuild a project it's usually OK, but after a while editing, building, debugging... it seems like it's just forgotten stuff.
I don't know how you design test cases to catch this, other than asking for users with broken IntelliSense and somehow gathering data from them.
Right now, for example, most IntelliSense is working OK in the file I'm looking at, but one particular object puts up "Type of expression to the left of . or -> is not a class, struct or union" when it actually is a valid object (the code compiles and runs).
We use VS2005 with a huge C++ project. IntelliSense for the most part works ok although I'd have to concur with the views expressed above. It can slow your machine down an awful lot whilst generating the IntelliSense (although this is better since SP1 as it now runs in the background more readily). Also it does indeed stop working after a while (sorry I can't be more specific about what triggers this). I usually end up deleting my .ncb file when this happens.
Whilst we're on the topic of IntelliSense, will Orcas have decent support for XML comment blocks in unmanaged code ? The support in C# is great, but is somewhat lacking in the old, dusty world of native C++ :)
I also have issues with VS2005 native C++ IntelliSense. SP1 has slightly improved updating performance, but overall my satisfaction is very low. Maybe with VS2005, you guys bit off more than you could chew too quickly. I'm not sure what happened, but with VS2003, I had a much more reliable, performant and productive overall experience (same goes for VS6). Hopefully, Orcas will address many burdensome issues with VS2005, including IntelliSense.
I also see the issue where after time, IntelliSense seems to break and not find things that obviously exist (and sometimes have existed for many weeks).
I do appreciate that you guys are working on this.
Wow. As always with C++, any post discussing Intellisense brings up performance and accuracy issues. I'm going to clarify a couple of things on behalf of my colleague James here.
The first thing to note is that the work James is discussing here targets our infrastructure for the release *following* VS2008 (indeed, in our world, beta2 equates to being ready to start working on the next version). In this release, currently dubbed Orcas+1 (how ingenious), we are planning a huge overhaul of the Intellisense & browsing system to address once and for all the performance issues that have plagued us.
In the meantime of course, we want the experience with VS2008 to be as great as possible. To that end, I want to respond to the specific issues you have brought up. The first thing is that our Intellisense system tries very hard to understand your code in the way that it is built. The downside is that it breaks down when it doesn't understand precisely how the code is built (e.g. a missing macro definition). The simplest way to diagnose this is to generate a log file by adding /acplog:<filepath> to the advanced C++ options, which we can look at for population errors.
The second and IMHO most egregious problem is performance. Intellisense, by its very nature, "overreacts" to edits because it needs to recompute the meaning of everything that may have been affected by the edit. When modifying a header file, the effects can be pretty horrible, especially on a single-proc machine. We have improved this somewhat but not significantly (we're fixing this architecture for post VS2008 as I said though).
If some of you are willing to work with us in a tight loop, we are currently profiling our existing Intellisense functionality for any possible performance gains. Please contact me directly at email@example.com so I can include you in this process.
Visual C++ PM
My concern is the size of the IntelliSense file. For a fairly typical project, it creates a database file of 18 MB just for IntelliSense. It's absolutely unacceptable, and it makes my PC really, really slow.
My suggestion is that you provide an option to disable IntelliSense if you can't make it fast enough.
For me the problem isn't speed, or failures when the code isn't well defined, but failures when the code is properly written.
Sometimes I'll write a line of code where Intellisense works, then press enter to write a new line starting with the exact same object and find that Intellisense doesn't have any info to give me, despite the previous line working perfectly and being well-formed code!
Equally mystifying is Intellisense's complete lack of ability to work on certain areas of the codebase - it just seems unable to parse entire subsystems, meaning you can't use it at all when dealing with certain classes.
I also experience the gradual degradation of IntelliSense in VS2005, to where it stopped working at all over the last few days. A web search led me to a discussion of this problem and a solution: deleting the .ncb file, which did the trick. (The search also brought me here.) Perhaps there should be a menu option in Visual Studio that would delete the file and rebuild it.
Andrew, could you try setting up the logging I mentioned and we can investigate your specific failures? Often, it's the case that we don't grok all the source due to incomplete macro context.
My experience is similar to Andrew's. I have been fighting with IntelliSense for the better part of a year. Just experiencing exactly the same occurrence as Andrew has sent me once again to look for answers. One thing that would help is a knowledge base of information about what not to do, so as to be able to "clean up" code so that IntelliSense "works." Though it is odd to modify code so that the tool works correctly, I would be willing to deal with that just to not be annoyed.
Another of my problems has now appeared in two places and has to do with a namespace being somehow shown nested within itself in the class view. This has now occurred in two different projects and is accompanied by IntelliSense not working for variables that both compile correctly and, to my eye, do not put serious strain on the language syntax.
Sometimes deleting the .ncb file and letting it rebuild does help for a while, sometimes it doesn't. I can put the mouse cursor on an identifier, right-click and choose either "go to declaration"[*] or "go to definition"[*] and it says there's no declaration or definition. Still compiles OK.
I can right-click and choose "go to definition"[*] and it goes to the declaration in a .h file which isn't the definition in a .cpp file. Have to find the .cpp file myself and find the definition myself. If the definition is a function body then sometimes the combo boxes above the code can be used in jumping to the function definition, sometimes not.
I can right-click and select "browse all references to this identifier"[*] and the resulting list will be empty, no references at all, not even the instance that I just clicked on.
[* If any of these approximate quotes is too obscure to be recognized, please say, and I'll post what it actually says in Japanese.]
Could it be possible to add a button to manually update the IntelliSense data? Currently I manage really big projects, and it is a pain to work while IntelliSense is updating the NCB.
VS has no idea when it should update the NCB. I KNOW when I want it. Automatic updating in the background just "locks" my weak computer. It is really annoying. That, plus the autorecovery, makes my big project simply a pain.
I am experiencing the problems listed in the above posts. As Bubu suggested, it would be nice to make IntelliSense manually updatable. Enabling a "Manual Rebuild" mode (somewhere buried in the VS settings screens) would also prevent IntelliSense from "going nuts" and rebuilding itself when you're trying to do something else (like type).
Then when you go for a coffee break, you could press the "Rebuild Intellisense" button (aka. the "Hammer My Computer" button.)
Maybe a related useful feature would be a semi-manual mode where IntelliSense is rebuilt at the end of a compilation. I know you guys would like to keep IntelliSense always updated (and independent of compilation), but this automatic rebuilding does not seem to work out unless you have a new and powerful workstation.
VS2005 SP1 has slightly improved Intellisense Rebuilding performance, but it's not enough.
Hmm, seems like an interesting post. IMHO, we shouldn't really be putting too much onto the native C/C++ compiler's IntelliSense. The basic architecture, as far as native-language IntelliSense is concerned, is flawed. Why do I say it's flawed? I can't explain technically, since I did not work on it, but from the looks of it, it's not very worthwhile. (Well okay, "flawed" is too harsh a word; maybe "weak" should go there.)
The biggest problem is the non-standard legacy C/C++ codebase. I think it's very unfair to be harsh to the IntelliSense devs (or MSFT, for that matter). They do their best to make something that helps, so one should appreciate their efforts. They can only do so much.
I had a discussion with Steven Texiera (program manager for VC++) and he told me how a lot of the core compiler is still very OLD. It just works great, and no one can really touch it or change it. As far as my suggestion goes, I say redo it. Back in 1992/93 the business models and programming styles were different. People had a boost of productivity with innovative and performant dev products from MSFT. But that was it (looking at the native C/C++ lines). Now we need more, and granny's compiler is... well... falling prey to the generation gap.

I think by redoing the compiler, or at least some major parts of it, to incorporate support for IntelliSense, or by exporting valuable information, or for that matter incorporating on-the-fly compilation of chunks of code, we could overcome the IntelliSense problem. Right now it's like making granny ride the Tour de France. She's really wise, but give her a break :) Because legacy code is here to stay; we have to deal with that. Between the user and the code are the compiler and, well, the IDE. We can't touch legacy code. Doing fixes on the front is not going to be very efficient. Redoing the compiler (at least we could research into that, see if it is any help... if not, we've got nothing to lose, do we?) might help. Going on like we are right now would be a waste, trust me. C++0x would be coming around 2k8, so obviously Microsoft would be there to embrace it, and a lot of effort will be diverted to making a better model for that, and this will be neglected.
A more far-fetched idea could be inventing self-learning (in the true sense) machines. Ah... but then... you people should've called it when people were writing non-standard code. (I'm too young for that :p)