One of the major investments we’re making during the IE9 project is support for more web standards. Web developers all around the world consistently gave us feedback that they wanted to use the same pages with the same markup across browsers. By working closely with the W3C and its members on the newest web standards, we can make that dream a reality for web developers.
Today, we released an updated version of the IE9 Platform Preview build. In conjunction with implementing support for several more web standards, we developed more test cases. These new test cases are available as usual on the IE Testing Center. We’re formally submitting these 118 new test cases to the W3C for review, feedback, and inclusion into the official W3C test suites for each of these web standards.
One of the questions that we hear each time we write, publish, and submit new test cases to the W3C is, “How should I think about the IE Testing Center?” There are several dimensions to the IE Testing Center, so I’ll go through them one by one.
The IE Testing Center is part of the current IE project. Like we did during the IE8 project, we will make these test cases available immediately to the web community and submit these cases to the W3C working groups for inclusion into the official test suites.
There are two main tables on the IE Testing Center. The first table is merely a results rollup of the second table. The second table has links to each test case we’ve developed during the IE9 project for each standards specification.
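The relationship between the two tables can be sketched in a few lines of code. This is an illustrative model only, assuming a simple Pass/Fail outcome per test; the specification names and test IDs below are made up, not actual IE Testing Center data.

```python
# Roll up per-test Pass/Fail results (the second table) into per-spec
# pass rates (the first table). All data here is illustrative.
results = {
    "DOM Level 2": {"test-001": "Pass", "test-002": "Pass", "test-003": "Fail"},
    "ECMAScript 5": {"test-101": "Pass", "test-102": "Pass"},
}

def pass_rate(tests):
    """Return the percentage of tests in one specification that pass."""
    passed = sum(1 for outcome in tests.values() if outcome == "Pass")
    return 100.0 * passed / len(tests)

# The rollup table: one summary percentage per specification.
rollup = {spec: pass_rate(tests) for spec, tests in results.items()}
for spec, rate in sorted(rollup.items()):
    print(f"{spec}: {rate:.0f}%")
```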
The columns (aka Browsers)
The columns represent the most recent, broadly available version of each of the major browser engines: Gecko, WebKit, Trident, and Presto. Because there are two major WebKit-based browsers and they don’t always ship the same version of the engine, Google Chrome and Apple Safari have separate columns. Based on feedback from other W3C members, we also added IE8 to the table for consistency.
The rows (aka Standards)
The rows of the first table cover the core technologies that web developers have told us matter most to them among the modern web technologies currently under development.
Proposed HTML5 features have received a lot of attention in recent months. In practice, the functionality described in the W3C’s HTML5 specification actually depends heavily on many other W3C specifications. To make sure that HTML5 works correctly, it’s important to test some of these other foundational technologies as well. This is basically the greater metropolitan area of “HTML5”, which includes HTML5 itself plus the suburbs of SVG 1.1 2nd Edition, CSS3, DOM L2 and L3, and ECMAScript 5.
The second table includes links to each test case that Microsoft has submitted to each W3C working group for inclusion into its official test suites. These are a proper subset of all the cases officially submitted to the working groups.
The cells (aka the results)
Each cell in the first table is a summary of the pass rate for each specification across each of the major shipping browser versions, compared against the most recent IE9 Platform Preview. The cell coloring is simply Microsoft Excel 2007’s Green – Yellow – Red conditional formatting color scale, which provides a smooth gradient from red to green as the pass percentage increases.
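A color scale like this is just linear interpolation between three endpoint colors. Here is a minimal sketch, assuming red at 0%, yellow at 50%, and green at 100%; the RGB endpoint values are assumptions for illustration, not necessarily Excel’s exact defaults.

```python
# Sketch of a red-yellow-green conditional-formatting color scale:
# linearly interpolate from red (0%) through yellow (50%) to green (100%).
# The RGB endpoints are assumed values, not verified Excel defaults.
RED, YELLOW, GREEN = (248, 105, 107), (255, 235, 132), (99, 190, 123)

def lerp(a, b, t):
    """Linearly interpolate between two RGB triples for t in [0, 1]."""
    return tuple(round(x + (y - x) * t) for x, y in zip(a, b))

def cell_color(pass_pct):
    """Map a 0-100 pass percentage onto the red-yellow-green scale."""
    if pass_pct <= 50:
        return lerp(RED, YELLOW, pass_pct / 50)
    return lerp(YELLOW, GREEN, (pass_pct - 50) / 50)

print(cell_color(0))    # red endpoint: a 0% pass rate
print(cell_color(100))  # green endpoint: a 100% pass rate
```

A spreadsheet applies the same idea per cell; splitting the gradient at the yellow midpoint keeps middling pass rates visually distinct from both extremes.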
Each cell in the second table is the test result for a specific test case on a given browser. These are simply listed as Pass/Fail and colored Green/Red, respectively.
Another question that comes up a lot is “Why is the IE9 Platform Preview so green while others aren’t?” When we decide to implement a given web standard, we methodically walk through the specification and build the test cases for the spec while also building the implementation. This resembles test-driven development, which works well for web standards as long as there is a comprehensive test suite. When we fundamentally change a test case based on feedback, it usually lowers the current IE9 Platform Preview’s pass rate.
Feedback on the test cases
Thank you,

Jason Upton
Test Manager, Internet Explorer