At Microsoft, we have a number of internal tools and websites that help improve our interaction with the community. One such tool lets us browse the feedback provided about MSDN Library topics. This feedback, which at times can be very telling, is submitted via the rating scale and comments box at the bottom of each page, under the heading "How would you rate the usefulness of this content?".

Earlier this week I was using this tool to browse through my team's topics, trying to glean how customers rated our documentation, when one particular topic stood out. The topic, the description of the rule Move P/Invokes to NativeMethods class, had a high number of comments (over 10) and a particularly low average rating (1.75 out of 9). After reading through the comments, it was blatantly obvious why it was rated so low: it was missing a code sample showing how to actually fix the violation. One commenter summed it up:
I already have an example of what violates the rule in my own code! How about an example of the fix instead?
After reading through the topic, I wholeheartedly agree, and I've filed a bug internally for User Education (UE) to fix it. In the meantime, however, if you head over to the sister MSDN Wiki site (which the standalone FxCop now links to), you'll find a content block I've added that greatly expands on the topic, showing exactly what it is missing: the correct way to fix this violation.
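For readers who don't want to click through, the gist of the fix is this: the rule wants P/Invoke declarations gathered into a dedicated static class named NativeMethods (or SafeNativeMethods / UnsafeNativeMethods, depending on the security characteristics of the functions being wrapped), rather than declared on ordinary classes. A minimal sketch of the before and after (MessageBeep is just a stand-in here for whatever native function your code actually wraps):

```csharp
using System.Runtime.InteropServices;

// Violation: a P/Invoke declared on an ordinary, publicly visible class.
//
// public class MyUtilities
// {
//     [DllImport("user32.dll")]
//     public static extern bool MessageBeep(uint uType);
// }

// Fix: move the declaration into a static class named NativeMethods.
// The class should not be public, and should contain only the
// P/Invoke declarations themselves.
internal static class NativeMethods
{
    [DllImport("user32.dll")]
    internal static extern bool MessageBeep(uint uType);
}
```

Grouping the declarations this way lets FxCop's security rules reason about all of your unmanaged entry points in one place, which is why the class name matters to the rule.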
In conclusion, if you come across a topic that is missing information or contains incorrect information, please spend the 30 seconds it takes to fill out the comments at the bottom of the page - it certainly helps us determine which topics need work.