For a compiler, the common optimization tradeoff is code size vs. speed.
The conventional wisdom is to optimize for code size, because that tends to reduce page faults, get better cache usage, and thus ultimately be faster. (I've just exhausted my knowledge on that topic. Head over to Rico's blog if you need more.)
I think an often-missed optimization goal is simplicity.
In my opinion, prefer to optimize for simplicity. Why?
"...about 97% of the time: Premature optimization is the root of all evil". (That's Rico, quoting Knuth, quoting Tony Hoare). I'm not advocating being dumb: common sense should still rule. For example, if you need to look up in a data structure, don't quadruple-index it. But at least use a single indexing scheme that you expect to cover the common case. [Update, fixed quote thanks to Peter]
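To make the indexing point concrete, here's a minimal sketch of the "single indexing scheme" idea. The record type and field names are hypothetical, not from the post; the point is that one map keyed on the expected common lookup beats maintaining four parallel indexes you may never need:

```cpp
#include <string>
#include <unordered_map>

// Hypothetical record type; the names are illustrative only.
struct Employee {
    std::string id;
    std::string name;
    std::string department;
};

// One index, keyed on the field we expect lookups to use most often.
// If lookups by name or department later prove hot, profile first,
// then add a second index; don't start with four.
class EmployeeTable {
public:
    void add(const Employee& e) { byId_[e.id] = e; }

    const Employee* findById(const std::string& id) const {
        auto it = byId_.find(id);
        return it == byId_.end() ? nullptr : &it->second;
    }

private:
    std::unordered_map<std::string, Employee> byId_;
};
```

The simple version is easier to keep correct (one structure to update on insert and remove), and that simplicity is exactly the optimization goal being argued for.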
Some practical cases:
The irony is that some of these simplifications are not only easier to understand, but can perform faster too. For example, a binary search can wreck your cache lines, whereas a linear search might play well with them. Or you may try to optimize something yourself (e.g., inline assembly) where the C++ optimizer could do it better.
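The search example above can be sketched as follows. This is an illustration under assumed conditions, not a benchmark claim; whether the linear scan actually wins depends on array size and hardware, so measure before choosing:

```cpp
#include <algorithm>
#include <vector>

// Linear scan: one predictable forward pass over contiguous memory,
// friendly to the hardware prefetcher and branch predictor. Often
// competitive with (or faster than) binary search on small arrays.
int linearFind(const std::vector<int>& v, int target) {
    for (std::size_t i = 0; i < v.size(); ++i)
        if (v[i] == target) return static_cast<int>(i);
    return -1;
}

// Binary search: fewer comparisons asymptotically, but it jumps
// around the array, which can miss the cache on large data and
// produce hard-to-predict branches.
int binaryFind(const std::vector<int>& v, int target) {
    auto it = std::lower_bound(v.begin(), v.end(), target);
    if (it != v.end() && *it == target)
        return static_cast<int>(it - v.begin());
    return -1;
}
```

Both return the same result on sorted input; the simpler one is also the one whose behavior is easiest to reason about, which is the point.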