Larry Osterman's WebLog

Confessions of an Old Fogey

Why no Easter Eggs?

  • Comments 35

Yesterday's post caused a bit of a furor in the comments thread.  A large number of people leaving comments (and others) didn't understand why the OS division has a "no Easter Eggs" policy.

If you think about it, it's not really that surprising.  One of the aspects of Trustworthy Computing is that you can trust what's on your computer.  Part of that means that there's absolutely NOTHING on your computer that isn't planned.  If the manufacturer of the software that's on every desktop in your company can't stop its developers from sneaking undocumented features into the product (even features as relatively benign as an Easter Egg), how can you be sure that they've not snuck some other undocumented feature into the code?

Even mandating that you have access to the entire source code to the product doesn't guarantee that - first off, nobody in their right mind would audit all 10 million+ lines of code in the product before deployment, and even if you DID have the source code, that doesn't mean anything - Ken Thompson made that quite clear in his Turing Award lecture.  Once you've lost the trust of your customers, they're gone - they're going to find somewhere else to take their business.

And there are LOTS of businesses and governments that have the sources to Microsoft products.  Imagine how they'd react if (and when) they discovered the Easter Egg code, especially after being told that it was a "Special Surprise" for our users.  Their only reaction would be to wonder what other "Special Surprises" were in the code.

It's even more than that.  What happens when the Easter Egg has a security bug in it?  It's not that implausible - the NT 3.1 Easter Egg had a bug in it.  The Easter Egg was designed to be triggered when someone typed in I LOVE NT, but apparently it could also be triggered by any anagram of "I LOVE NT" - as a result, "NOT EVIL" was also a trigger.
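A bug like that is easy to introduce if the trigger logic tracks which letters were typed rather than the exact sequence.  A minimal sketch of the failure mode (hypothetical - this is not the actual NT code):

```python
# Hypothetical sketch of how an anagram bug can creep into a trigger:
# comparing letter *counts* instead of the exact typed sequence.
from collections import Counter

TRIGGER = "I LOVE NT"

def buggy_trigger(typed):
    # Ignores order: any anagram of the trigger phrase matches.
    return Counter(typed.replace(" ", "")) == Counter(TRIGGER.replace(" ", ""))

def exact_trigger(typed):
    # Correct check: the exact phrase, in order.
    return typed == TRIGGER
```

Under the buggy check, "NOT EVIL" (an anagram of "I LOVE NT") fires the egg just as readily as the intended phrase.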

Going still further, Easter Eggs are perceived as a symptom of bloat, and lots of people get upset when they find them.  From Adequacy.org:

Now if you followed the link above and read the article you may be thinking to yourself...
  • Is this what MS developers do when they should be concentrating on security?
  • How often do they audit their code?
  • What's to stop someone from inserting malicious code?
  • Is this why I pay so much for Windows and MS Office?
  • I know other non-MS software contains EEs but this is ridiculous.
  • One more reason why peer review is better as EEs and malicious code can be removed quickly.
  • Is this why security patches take so long to be released?
  • This is TrustWorthy Computing!?!
From technofile:

    Even more disturbing is the vision of Microsoft as the purveyor of foolishness. Already, the cloying "Easter eggs" that Microsoft hides away in its software -- surprise messages, sounds or images that show off the skill of the programmers but have no usefulness otherwise -- are forcing many users to question the seriousness of Microsoft's management.
       A company whose engineers can spend dozens or even hundreds of hours placing nonsensical "Easter eggs" in various programs would seem to have no excuse for releasing Windows with any bugs at all. Microsoft's priorities are upside down if "Easter egg" frills and other non-essential features are more important than getting the basic software to work right.

    From Agathering.net:

    "and some users might like to know exactly why the company allows such huge eggs to bloat already big applications even further"

I've been involved in Easter Eggs in the past - the Exchange 5.0 POP3 and NNTP servers had Easter Eggs in them.  In our case, we actually followed the rules - we filed a bug in the database ("Exchange POP3 server doesn't have an Easter Egg"), we had the PM write up a spec for it, and the test lead developed test cases for it.  We even contacted the legal department to determine how we should reference the contingent staff who were included in the Easter Egg.

But it didn't matter - we still shouldn't have done it.  Why?  Because it was utterly irresponsible.  We didn't tell the customers about it, and that was unforgivable, ESPECIALLY in a network server.  What would have happened if there had been a buffer overflow or other security bug in the Easter Egg code?  How could we POSSIBLY explain to our customers that the reason we allowed a worm to propagate on the Internet was the vanity of our developers?  Why on EARTH would they trust us in the future?

Not to mention that we messed up.  Just like the NT 3.1 Easter Egg, we had a bug in our Easter Egg, and we would send the Easter Egg out in response to protocol elements other than the intended ones.  When I was discussing this topic with Raymond Chen, he pointed out that his real-world IMAP client hit this bug - and he was more than slightly upset at us for it.
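That kind of failure is typical of overly loose command matching in a server.  A hypothetical sketch (the trigger verb and dispatch logic here are invented for illustration - this is not the actual Exchange code):

```python
# Hypothetical sketch: an easter-egg trigger that matches too loosely,
# so ordinary protocol traffic from a real client can fire it.

EGG_VERB = "XEGG"  # invented trigger verb for illustration

def buggy_dispatch(line):
    # Substring match: any line *containing* the verb fires the egg,
    # including legitimate commands whose arguments happen to contain it.
    if EGG_VERB in line.upper():
        return "egg"
    return "normal"

def fixed_dispatch(line):
    # Exact match on the command verb (the first token) only.
    verb = line.strip().split(" ", 1)[0].upper()
    return "egg" if verb == EGG_VERB else "normal"
```

With the buggy version, a perfectly ordinary command whose argument happens to contain the trigger string gets the Easter Egg response instead of normal handling - exactly the sort of thing a real-world client can stumble into.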

It's about trust.  It's about being professional.  Yeah, it's cool seeing your name up there in lights.  It's cool when developers get a chance to let loose and show their creativity.  But it's not cool when doing it costs us the trust of our customers.

Thanks to Raymond, KC and Betsy for their spirited email discussion that inspired this post, and especially to Raymond for the awesome links (and the dirt on my broken Easter Egg).

Edit: Fixed some html weirdness.

Edit2: s/anacronym/anagram/

    • >> The explanation you gave is not even childish - some thing sillier than that. Asking for removing easter eggs and accepting that as a proof that they'll never get malware embedded in the software for instance.

      Ummm, he never said that...
    • > One of the aspects of Trustworthy Computing
      > is that you can trust what's on your computer.

      Trustworthy computing is a DRM initiative. That sentence should read:

      + One of the aspects of Trustworthy Computing
      + is that Microsoft can trust what's on your
      + computer.
    • "A good example of how these things can get out of control, is the "NSA_Key" issue from some years ago. Presumably someone chose that name as a joke, or without considering its potential interpretation. Whatever the rationale behind the name, it caused a firestorm of protest about "NSA backdoors" in Microsoft products. Oops! "

      It's more than just that, Windows used to have entry points named:

      Death
      Resurrection
      PrestoChangoSelector
      TabTheTextOutForWimps
      WinOldAppHackOMatic
      UserSeeUserDo
      Bunny_351
      Brute
      FixUpBogusPublisherMetaFile

      I don't think these names are innately bad _private_ names, but to have them exposed in export tables was pretty bad.
    • Of course, the easy way around all of the easter egg stuff is for MS to start putting credits in the About screen for each product. That way the devs get their names up in lights... and everyone's happy.

      That doesn't address the "cool factor" of writing a cool easter egg though, of course.
    • People will complain no matter what you do; it's pointless to try to pander to everybody. The no-easter-eggs policy just reinforces in my mind that Microsoft is nothing more than faceless, humorless, Big Corporate, and therefore, not trustworthy.

      There's nothing at all wrong with them as long as they aren't "snuck in" and go through the same quality review that everything else does.
    • In fact, the EE is nothing more than just one feature. If we get any MS product, there are many features which are never used by a particular user. It may be because the user does not know about them, or if he does know, he can't imagine how he can use them and what they are added for. In case of EE, the usage is quite obvious - a bit of fun. So EE looks even better compared with other unused features. The same is true about security: unused features carry more security risk compared with EE, because EE is quite simple (usually a bit of UI).
    • SPJ wrote:
      "People who have even a little common sense and are that suspicious will ask for source. They know that's the only way to be sure."

      Read the link Larry gave to Ken Thompson's lecture. Even if you compile from source, you *cannot* be sure what the program will do.
    • mschaef:

      http://blogs.msdn.com/oldnewthing/archive/2003/10/15/55296.aspx
    • re adequacy.org

      Have you read that site before, or did you just google for contrary opinions? It sits well alongside the story on AMD: http://www.adequacy.org/public/stories/2002.1.28.153048.268.html

      Not everything is as it seems....
    • me: Actually, Raymond gave me the links, and I used them. Afterwards, I learned that Adequacy was an intentional troll site, but it doesn't really matter - even though that one is a troll post, the others aren't.

    • Carlos -
      From the Ken Thompson article - "Figure 6 shows a simple modification to the compiler that will deliberately miscompile source whenever a particular pattern is matched. If this were not deliberate, it would be called a compiler "bug." Since it is deliberate, it should be called a "Trojan horse." "

      In my post, I said we need the compiler source, the library source, the OS source, etc. to be sure enough to trust a program. You _can_ trust source code - what you can't trust are infected compiler binaries that produce untrustworthy code from trustworthy source. So if you have the source for everything, you can theoretically verify that it does what it is supposed to do and just that.

      Oh and yes, the CPU which executes the code needs to be trusted!
    • SPJ wrote:
      > Oh and yes, the CPU which executes the code needs to be trusted!

      Indeed, I gather that some of the CPU microcode update procedures are now well-known publicly. So I guess, with enough smarts, you /could/ trojan a modern CPU. For those not aware, google on "microcode update" (including the quotes) for lots of relevant hits.
    • A good story about "why no easter eggs" is the tale of the Adobe Photoshop easter egg that was snuck in by a developer and therefore wasn't tested. Of course, it had a globalization bug which caused it to crash on systems that used double-byte characters, so there was much embarrassment when, one April 1st, the application became useless in various squiggly-text-using countries around the world. The embarrassment was so great in Japan that Adobe actually went to the trouble of printing stickers apologizing for the problem and slapped them on the boxes sold in Japan.
    • I applaud this policy as a security engineer.
    • Well, this year I didn't miss the anniversary of my first blog post.
      I still can't quite believe it's...