I just posted about Internet Explorer's focus on backwards compatibility over on the IE team blog.
There's plenty of bad HTML on the web today, and by that I mean markup with missing closing tags, overlapping tags and implied tags.
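To make those three categories concrete, here's a small sketch using Python's standard-library `html.parser` (purely as an illustration of tolerant parsing; it is of course not IE's actual code). The sample string has implied `</li>` tags and an overlapping `<b>`/`<i>` pair, and the parser works through it without complaint:

```python
from html.parser import HTMLParser

class TagLogger(HTMLParser):
    """Records start/end tag events so we can see what a tolerant parser accepts."""
    def __init__(self):
        super().__init__()
        self.events = []

    def handle_starttag(self, tag, attrs):
        self.events.append(("start", tag))

    def handle_endtag(self, tag):
        self.events.append(("end", tag))

# Bad HTML: the <li> close tags are implied, and <b>/<i> overlap
# instead of nesting properly.
bad_html = "<ul><li>one<li>two</ul><b>bold <i>both</b> italic</i>"

parser = TagLogger()
parser.feed(bad_html)  # no exception raised: the parser tolerates the errors
print(parser.events)
```

A tolerant parser like this simply reports the tags in the order they appear; it's then up to the consumer (here, our logger; in a browser, the tree builder) to decide how to recover a sensible document structure from them.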

I thought I'd add some historical perspective.

I remember back in 1996 when we started coding up the then-new Trident rendering engine for Internet Explorer 4. At that time there was already a lot of bad HTML content on the web. Indeed, the percentage of bad content was probably higher then, because fewer good HTML editing tools were available. We knew that if we couldn't render existing content on the internet, our browser would immediately be rejected by our customers. So we coded in a built-in tolerance for bad HTML. I particularly recall our developers pulling their hair out trying to match the table rendering algorithm of the then-dominant Netscape browser.

The fact is that content on the internet can and does live forever, and any browser must continue to be able to render that content. Does that mean we should encourage such content? Clearly not, but it's important that developers know they can rely on rendering behavior not changing as the internet evolves.