I had a bit of a bummer morning back at MMS 2008. I woke up early (fortunately the conference was in Pacific and my body was in Central, so it was less torturous than getting up early normally is for me) and went down to get set up nice and early. I decided to add a bunch of extra demos at the last minute, you see, so I figured I could use the time.

Unfortunately, the presenter who was going to have the room after me decided to get there even earlier than I did.

So I got to sit and wait.

With about half an hour to go, and my new demos completely untried, I started getting set up. After getting all fired up and ready to go, I plugged into the projector system. If you have ever presented at a Microsoft conference, you know that it's not just a VGA cable - there's also that silly USB cable they insist you have to have but can never quite explain why you can't live without. Normally, from what I can tell, it doesn't do anything I would want.

But that morning, it was a special surprise.

Plugging in the USB cable made the keyboard stop working correctly - neither the external keyboard they provided nor the keyboard sitting happily attached to my Tablet PC would respond. I couldn't type any longer.

Now, fortunately I do have a Tablet PC, and the writing functionality *did* still work. I wasn't hosed after all! But I was adding demos on the fly, and I reverted to typing things out of habit on two separate occasions. Oops. So I fixed it, and did a "mouse and ink only" demo on application compatibility. Overall, I think I recovered pretty well, or at least decently.

But it didn't matter - I got slammed. In the reviews:

"Gee a working keyboard can't be that tough to come by."

Apparently it wasn't enough that I had to slog through demos while somebody else's equipment horked my box - I then had to get a bad review for it as well.

Which brings me to my point:

Your reviews affect only the person delivering the session.

I'm fairly certain that nobody else is digging into my session reviews looking for equipment failures (although they probably do aggregate them). I suspect that KVM is still floating around and will re-emerge to bite somebody else. Why? Because I don't have any influence over the equipment. I don't run conferences. I just show up and plug in.

The same is true for food. I've had quite a few conference reviews telling me how awful the lunch was. Except ... I don't make lunch at conferences. In fact, if you aren't handed a microwave meal for lunch, it's extremely unlikely that I made your lunch at all - conference or not. Telling me you didn't enjoy the lunch will not get you a better lunch next time.

Giving me a bad score because of bad food, faulty equipment, inadequate snacks, or a lack of ice cream doesn't get any of those things fixed, but it can keep me away next time. Conference planners like to keep people who get high scores and dump those who don't. Why? Because they want you to be happy and come to the next conference, and that starts with great presenters.

So, low scores are your way to attempt to vote somebody off of the island. If everyone agrees with you, it just might happen. And you should do that - we want you to tell us if we've wasted your time so we stop doing that!

But that brings me to my next point:

People reading evaluations do not reserve the top score for God.

I've done this one myself - but it works a little differently in Conference-Land than I would have thought. I would reserve the top score for "absolute perfection", which of course nobody could ever achieve. But, just in case the greatest presenter in the world had the best session of their life on the day I happened to be in the audience, I'd have that score in reserve to let them know.

Conference organizers don't think like that. If you have a 5-point Likert scale, they expect you to average above a 4. If you don't consistently do that, they start thinking of getting you off of the island. They don't view 5 as reserved for God and 4 as the best a mortal could hope for. They view 5 as an excellent session delivered by a mortal. Top box is expected, not something that should be out of reach almost all of the time.

This doesn't mean that you should go handing out top box scores all of the time. People should always earn their scores. But if you are thinking to yourself, "self, that session rocked so hard - I've never seen a better one - I'm giving that presenter the best a mortal can hope for, an amazing 4 out of 5!" then you don't actually communicate that sentiment. Rather, you communicate "that was OK - better than average, at least. Try harder next time."

For, while evals may seem like a place to vent after a tiring week, they're actually more than that. You can try to vote people off of the island if they are just plain awful. You can try to tell people, "I loved the session and I'd like to come again, but the next time I do, fix this problem because it was really annoying." Just make sure that you are speaking the same language as the people reading it, so the right people interpret it the right way, and what you get next time is more likely to be what you want (if you didn't get it the first time).

Oh, and written comments? Always the best feedback. I love them.

I'm looking forward to your feedback from TechEd 2008!

Oh yeah - I'm gunning to participate in the Windows Vista Bloggers' Panel hosted by Mark Russinovich at TechEd IT Pro week. For some reason they want me to perseverate in reminding you of this. I cannot explain why - seems a bit odd to me. But I hope you'll join in anyway and fling some really tough questions at the panel. It's not too late to register to attend TechEd - head on over to http://www.microsoft.com/events/teched2008/itpro/registration/regprocess.mspx.