In the second section of the Channel 9 tour videos, Rico Mariani talks about performance and I once again butcher his last name (I once made the same mistake at a BillG review, sorry man <sigh>; it's pronounced "Mary-annie", got it now <g>).

One thing we've seen a lot is that it is really easy to take one or two lines of managed code and really blow up your working set and startup time.  It's a double-edged sword: on the one hand, you get a really easy-to-use class library; on the other, you don't feel the pain that would otherwise make you think twice about using an expensive feature.

Two concrete examples are the XML serializer and compiled regular expressions.  I've seen requests for Assembly.Unload() that are really due to having these extra dynamically generated assemblies around.  This performance report on the internal Headtrax application explains how that application invoked the C# compiler 16 times at startup!  (This is discussed in the interview section with Scoble, coming up in a new segment.)  If your application is slow to start, check whether you are making this very easy mistake.
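To make the pitfall concrete, here's a sketch of what those "one or two lines" can look like (the Order type is a placeholder, not from any real app).  Both lines look cheap, but each can cause code generation and a new assembly load behind the scenes:

```csharp
using System.Text.RegularExpressions;
using System.Xml.Serialization;

class Startup
{
    static void Run()
    {
        // RegexOptions.Compiled emits IL for the pattern into a
        // dynamically generated assembly that can never be unloaded.
        Regex zip = new Regex(@"^\d{5}$", RegexOptions.Compiled);

        // The simple XmlSerializer(Type) constructor caches its generated
        // assembly, but overloads like this one (taking
        // XmlAttributeOverrides) generate a fresh temporary assembly on
        // every call -- construct this in a loop and you invoke the C#
        // compiler over and over.
        XmlSerializer s = new XmlSerializer(typeof(Order),
                                            new XmlAttributeOverrides());
    }
}
```

The fix for the serializer case is usually to construct it once with the simple constructor (or cache the instance yourself); for Regex, only pay for Compiled when the pattern is hot enough to justify it.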

We've kicked around a bunch of ideas on this in the past.  Obviously our first goal is that you always get the fastest solution by default (you should precompile your XML serializer code, for example).  We've toyed with giving you a "red/yellow/green" gradient in the docs or IntelliSense that would hint at where you are using an expensive feature.  Profilers are also useful for figuring this kind of stuff out, but they do require work and planning.
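For the serializer case, the precompilation step can be done at build time with the sgen.exe tool from the .NET Framework SDK (MyApp.dll here is a placeholder for your own assembly):

```shell
# Pre-generate the XML serialization assembly at build time instead of
# paying for code generation at startup.
sgen /assembly:MyApp.dll

# This produces MyApp.XmlSerializers.dll next to MyApp.dll; at run time,
# XmlSerializer loads the pregenerated assembly instead of invoking the
# C# compiler.
```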

So my question for you:  how much power is too much?  Should we be making the library much more "in your face" or harder to use for things that are really going to cost you?  What are your favorite "feature-rich" APIs that might be causing this unexpected/unwanted overhead?