Michael Howard has a FAQ on this here – there's also more information on this and related defenses in one of my chapters in Writing Secure Code for Windows Vista. One thing I'd like to point out about enabling this defense – and several others, like NX and SafeSEH – is that you get them by default in 64-bit code. I suspect many of you will get more testing done on 32-bit in the near term, and by turning these on you narrow the differences between the two platforms. While I seriously doubt that setting HeapEnableTerminationOnCorruption will cause any regressions, you want to maximize your chances of finding any that do turn up.

There's one question Michael's FAQ doesn't answer that this inquiring mind would like to know: just exactly what triggers this? I suppose I should go look at the source and see where it branches, but among the sample apps I wrote for WSCV, I couldn't find any whose corruption wouldn't abort the app even without this flag set. If someone has some info on this that I can post, please send it.

It was interesting – the heap management in Vista is much, much more robust than previous versions. For example, you can malloc 3 buffers right in a row, then use the first pointer to just trash all 3 buffers. On XP and Win2k3, the app just trundles right along until you try to free the trashed buffers, and then it dies in a possibly exploitable manner. On Vista, as soon as you try to free the first buffer, the app is in the exception handler. Even though I'm not sure how much extra HeapEnableTerminationOnCorruption does for you, it's still a good thing to have set.

Speaking of which, here's a handy way to do this on cross-platform code without making unneeded API calls:

#include <windows.h>

#if !defined(_WIN64) && _WIN32_WINNT >= 0x0600

// 32-bit code built for Vista or later – the flag isn't on by
// default here, so set it explicitly.
void SetHeapTerminate(HANDLE HeapHandle)
{
    // Return value intentionally ignored.
    (void)HeapSetInformation(HeapHandle, HeapEnableTerminationOnCorruption, NULL, 0);
}

#else // 64-bit (on by default) or prior to Vista (not available)

void SetHeapTerminate(HANDLE HeapHandle)
{
    // Avoid W4 unreferenced-parameter warnings -
    HeapHandle;
}

#endif