Hi there! I’m John Socha-Leialoha and I’m a big fan of TDD. When I asked around about writing C++ tests, I was told there isn’t any support for C++ unit tests in Visual C++. Not true. Check this blog post where I tell you what’s required in order to write unit tests for native C++ code using Visual Studio.
[Go to John’s full article…]
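For readers who haven't clicked through, the technique John describes boils down to compiling the native code under test into a static .lib (without /clr) and writing the test class in C++/CLI, which can call the native functions directly through IJW. A minimal sketch with hypothetical names (`Add`, `AddTests`); the C++/CLI part only compiles under MSVC with /clr, so it is guarded here:

```cpp
// Native code under test: would live in a static .lib, compiled WITHOUT /clr.
int Add(int a, int b) {
    return a + b;
}

#ifdef __cplusplus_cli
// Test side: compiled WITH /clr, using the VS unit-testing attributes.
// The class and method names below are illustrative, not from John's article.
using namespace Microsoft::VisualStudio::TestTools::UnitTesting;

[TestClass]
public ref class AddTests {
public:
    [TestMethod]
    void AddsTwoNumbers() {
        Assert::AreEqual(5, Add(2, 3));  // IJW call straight into native code
    }
};
#endif
```

The key point of the debate below is that only the test class flips to /clr; the .lib containing `Add` keeps its original compile settings.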
Riiiight... Testing native C++ through C++/CLI...
Or for those serious about TDD in C++ there is www.boost.org/.../test or code.google.com/.../googletest
Or UnitTest++, which I like very much! The distribution contains a test suite for the framework, with Visual Studio project files.
Thanks, guys, for these great suggestions. Let me add my own :-)
WinUnit: http://winunit.codeplex.com/, whose author published an article about it in MSDN Magazine a couple of years ago: msdn.microsoft.com/.../cc136757.aspx
The good thing about the "natural" approach taken by VS out of the box is that you can link your tests with the reporting that Team Foundation Server (TFS) offers, like code coverage, or with processes like MSF for Agile reports, etc. It's already there; you can use it, or do some extra work to plug in your testing tool of preference.
I just have a question for you, Gregory, as by reading your comment it looks like you're saying that testing native code through C++/CLI is not a serious approach. I'm just curious about what you think is invalid in that.
While this is an interesting trick, I personally wouldn't use any approach that requires the code under test to be compiled with fundamentally different compiler settings, and flipping /clr is pretty serious. One case where I saw IJW fail was where function pointers were being exported to initialization code through init_seg(); the /clr-compiled code emitted metadata tokens instead of function pointers, causing the init routine to crash.
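For context, the failing pattern looked roughly like this (a portable sketch; the real code used the MSVC-specific #pragma init_seg, and every name here is illustrative). Under /clr, the address taken at static-initialization time is where IJW broke, coming back as a metadata token rather than a raw function pointer:

```cpp
#include <vector>

using InitFn = void (*)();

// Registry of init-time callbacks (the real code filled this via #pragma init_seg).
std::vector<InitFn>& InitRegistry() {
    static std::vector<InitFn> registry;
    return registry;
}

struct AutoRegister {
    explicit AutoRegister(InitFn fn) { InitRegistry().push_back(fn); }
};

int g_subsystemReady = 0;
void InitSubsystem() { g_subsystemReady = 1; }

// Static-init-time registration: under /clr, &InitSubsystem is the problem spot.
static AutoRegister s_reg(&InitSubsystem);

void RunAllInits() {
    for (InitFn fn : InitRegistry()) fn();  // crashes if fn is a metadata token
}
```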
Another downside to doing unit testing this way is that by moving code from the EXE/DLL into a LIB you take a hit on link speed, since changes to a static lib defeat incremental linking.
I appreciate your feedback. Maybe there's a misunderstanding on my side (I should ask the author of the article, although he's on leave these days), but I believe that the code you are testing, the code in the static library, is not compiled under /clr. In that sense, your tested logic still runs outside the managed sandbox. Am I wrong? I agree with you, though, that what you would normally ship as an .exe or .dll now has to go into a .lib file in order to be tested. The produced code, however, shouldn't differ regardless of its destination.
Your other two comments (the init_seg phase and link speed) may be true, although I'm unsure how applicable they are in the context of unit testing. Don't get me wrong: I just want to recall the purpose of unit testing. Based on a vendor-neutral definition like the one you may find in Wikipedia (en.wikipedia.org/.../Unit_testing), "unit testing is a method by which individual units of source code are tested to determine if they are fit for use. A unit is the smallest testable part of an application."
Based on that definition, unit testing is intended to test the functional correctness of a small portion of code, not a whole application (where other kinds of tests fit better). With that in mind, it's expected that the two problems you mentioned should rarely occur (to avoid saying "won't occur at all"). I do agree that unit tests are arranged into collections and run all at once, typically in batch processes like nightly builds. If that is the case, I feel the link-time hit you mentioned wouldn't be noticeable in the overall process. Please correct me if I'm wrong there.
Thanks for the feedback, guys!
> Maybe there's a misunderstanding on my side (I should ask the author of the article, although he's on leave these days), but I believe that the code you are testing, the code in the static library, is not compiled under /clr. In that sense, your tested logic still runs outside the managed sandbox. Am I wrong?
This separation only works for functions that are actually compiled into the .lib; it doesn't work for inline functions and templates. IIRC, #pragma managed/unmanaged is particularly ineffective in the latter case.
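To illustrate Gregory's point with a hypothetical header: anything defined inline or as a template is compiled by whichever translation unit includes it, so a /clr test TU compiles this code under /clr no matter where the .lib boundary sits:

```cpp
// mathutil.h (hypothetical): header-only code never lands in the .lib as
// standalone native object code; the including TU owns the instantiation.

template <typename T>
T Clamp(T value, T lo, T hi) {
    // Instantiated in the including TU; in a /clr test TU this becomes /clr code.
    return value < lo ? lo : (value > hi ? hi : value);
}

inline int Twice(int x) {
    return x * 2;  // likewise compiled into the including (possibly /clr) TU
}
```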
I've been using this technique for about a year now and it seems to be working good so far. The only recent downside I've met was the introduction of the C++0x nullptr keyword, which we cannot use because the unit tests compiled with /clr interpret nullptr as a managed keyword. We would have to use __nullptr instead on our code to avoid breaking the unit tests.
You're right, Ricardo. In fact, that consideration is common in C++/CLI-to-native interoperability. If you take a look at the official documentation about nullptr in C++/CLI (available at msdn.microsoft.com/.../4ex65770(v=VS.100).aspx) you'll find:
"The __nullptr keyword is a Microsoft-specific keyword that has the same meaning as nullptr, but applies to only native code. If you use nullptr with native C/C++ code and then compile with the /clr compiler option, the compiler cannot determine whether nullptr indicates a native or managed null pointer value. To make your intention clear to the compiler, use nullptr to specify a managed value or __nullptr to specify a native value."
I want to point out that the native functionality being tested, as opposed to its managed testing counterpart, is not compiled with /clr, and therefore any reference to the native nullptr (the traditional NULL or (void*)0 before C++0x) can remain nullptr. In other words, despite the consideration, you don't need to touch the logic under test to make it suitable for this testing framework.
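One common way to keep a single header usable from both sides is a small guard; __nullptr is MSVC-specific and only meaningful under /clr, so this sketch (with hypothetical names) falls back to plain nullptr elsewhere:

```cpp
// Header shared between native code and a /clr test project (hypothetical).
#ifdef __cplusplus_cli
    #define NATIVE_NULL __nullptr  // force the native meaning under /clr
#else
    #define NATIVE_NULL nullptr    // plain native compile: nullptr is unambiguous
#endif

struct Node {
    Node* next = NATIVE_NULL;
};

// True for a node that exists and has no successor.
bool IsTail(const Node* n) {
    return n != NATIVE_NULL && n->next == NATIVE_NULL;
}
```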
This is still mostly bogus unless your DLL is intended to be used from .NET. Unit testing does not mean you more or less test the code as intended, but that you test it exactly as intended. There is a big difference. So, for example, how do you know whether an error was caused by your code or by .NET interop? (I use a .NET interface to my native DLLs for system tests, but not for unit tests; otherwise the testing would be hopelessly incomplete.)
What great news: in order to write unit tests for your C++ code in VS, all you need is to use some non-C++ (and effectively OS-dependent?) framework which, based on the comments here and the actual blog post, doesn't even seem to be suitable for testing C++ code.
@Joe While it's desirable to keep the testing context as production-like as possible, "unit testing" per se is all about proving the correctness of a single, isolated piece of functionality (like a function or method), without much said about the environment (at least not in its formal definition). To give a couple of examples: unit tests for mobile application components aren't usually run on the mobile device itself, not even in a device-emulated environment. Likewise, we typically use XML files to mimic databases or web service responses, thus avoiding connectivity issues, etc., and that mocking technique doesn't invalidate unit tests either.
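That mocking idea can be sketched in plain C++: the unit under test depends on an interface, and the test substitutes a fake that returns canned data instead of hitting a live service (all names here are hypothetical):

```cpp
#include <string>

// Hypothetical dependency the production code would satisfy with a web service.
struct IQuoteSource {
    virtual std::string FetchQuoteXml() = 0;
    virtual ~IQuoteSource() = default;
};

// Unit under test: pure logic, no connectivity involved.
bool HasQuote(IQuoteSource& source) {
    return !source.FetchQuoteXml().empty();
}

// Test double: canned XML standing in for the live response.
struct FakeQuoteSource : IQuoteSource {
    std::string FetchQuoteXml() override { return "<quote symbol='MSFT'/>"; }
};
```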
Still, your point about potential failures in the interop mechanism is valid, although do we really know of such interop inconsistencies today? My point here is about giving some credit to the current state of this IJW interop technique instead of discarding it right away: it has been out there for years, maturing with every new version of VS, and is today the preferred approach for managed/unmanaged interoperability.
@kdprw: Since VS is already an OS-dependent tool, basing its out-of-the-box testing mechanism on an OS-dependent technology like C++/CLI doesn't add or subtract anything, does it? ;-)
"Since VS is already an OS-dependent tool, basing its out-of-the-box testing mechanism on an OS-dependent technology like C++/CLI doesn't add or subtract anything, does it? ;-)"
It is fairly common to want to port a project to other platforms, which may require other compilers, tools, and environments with the same code base. If you want to open-source anything, you cannot control what people are going to run it on. If you're building libraries for commercial release, it's a bit clunky to have unit tests that are so environment-specific.
Not being able to bring unit tests across easily, say to a Unix-based system, seems like a fairly major stopping point. Even if you are certain you don't need to port your project today, it's a little harder to say you'll never need to. If you can't easily bring your unit tests across, that's a deal-breaker for me. And that's beyond the method itself feeling clunky, contrived, and setup-heavy.
While this method may be "interesting", I think I'll be sticking with one of the several in-language implementations. I've recently been using Boost's testing framework with some success, though I'm still playing with others.
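For comparison, even without a framework, a self-registering in-language harness fits in a few lines of portable C++. This is a toy sketch of the general shape, not Boost.Test's actual machinery:

```cpp
#include <cstdio>
#include <functional>
#include <string>
#include <vector>

struct TestCase {
    std::string name;
    std::function<bool()> body;
};

// Global registry the test files append to.
std::vector<TestCase>& AllTests() {
    static std::vector<TestCase> tests;
    return tests;
}

// Runs every registered case, printing one line each; returns the failure count.
int RunAllTests() {
    int failures = 0;
    for (const TestCase& t : AllTests()) {
        const bool ok = t.body();
        std::printf("[%s] %s\n", ok ? "PASS" : "FAIL", t.name.c_str());
        if (!ok) ++failures;
    }
    return failures;
}
```

Being plain C++, a harness like this compiles with the same settings as the code under test and ports anywhere the code does.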
Sorry, Diego, but you're wrong. You test native code with native code; anything else is NOT testing, but self-indulgent crap. Perhaps this explains so much buggy Microsoft code.
@Michael Hamilton, your point is certainly valid. In the cross-platform scenario you mentioned, this out-of-box approach is not adequate, and probably one of the frameworks other readers suggested above would be better suited. The good news about those alternative testing frameworks is that many can be hooked into VS as extensions.
@Joe, I wish that were the reason for our buggy code: then it would just be a matter of switching to CppUnit and we'd be leading the race again! ;-D
Seriously, if you can find a definition specifying that the validity of a unit test depends on the tester portion being built in the same environment as the tested part, I'll concede you're right. Otherwise, as long as it fails when it's expected to fail and works fine when it's expected to, sorry for disagreeing here, friend. Related question: if you had to unit test a stored procedure for correctness, should the test itself be another stored procedure?
Thanks for linking this, Diego. (I'm assuming it was you, as you're answering all the Qs.)
When I first looked at using VS Test I recoiled at the unfamiliar C++/CLI code, but John's post is a great getting started guide. Very clear and to the point. I think I'll give it a go with my next project.