“HTML5” is huge. Its many specifications sit at different stages of maturity: First Public Working Draft, Working Draft, Candidate Recommendation, Proposed Recommendation, and finally Recommendation.
As we've said many times before, it's important to get it right.
Browser makers have a big responsibility toward developers: it's wrong to claim “standard support” for a specification that is still changing. Or, at the very least, they should make clear it is still a work in progress (for example, by using vendor prefix extensions).
Rushing the implementation of a feature and calling it a “done deal” is dangerous, and in some circumstances can lead to unpleasant results. Today we've seen an example of this with an important specification: Web Sockets.
Web Sockets enable Web applications to maintain bidirectional communications with server-side processes. The specification (from the WebApps WG) is currently in the “Working Draft” stage; there is also a dependency on the Web Sockets Protocol, discussed in the IETF hybi mailing list.
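For readers who haven't seen the API, the Working Draft exposes a small, event-based interface in the browser. The sketch below wires the `onopen`/`onmessage`/`onclose` handlers through a hypothetical helper, `createEchoClient`, which is not part of any spec; it accepts any socket-like object, so the wiring can be illustrated (and exercised) without a live server.

```javascript
// A minimal sketch of the draft WebSocket API surface.
// `createEchoClient` is a hypothetical helper, not part of any spec:
// it attaches handlers to any object exposing send/onopen/onmessage/onclose.
function createEchoClient(socket, log) {
  socket.onopen = function () {
    socket.send('hello');            // client -> server push
  };
  socket.onmessage = function (event) {
    log.push('got: ' + event.data);  // server -> client push
  };
  socket.onclose = function () {
    log.push('closed');
  };
  return socket;
}

// In a real page, assuming the browser ships the draft API:
//   var log = [];
//   createEchoClient(new WebSocket('ws://example.com/chat'), log);
```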
Chrome 4 was the first to implement it in a final, released build at the end of 2009, followed by Safari 5.0.2. Firefox was planning to support it in the 4 beta, and Opera in version 11, but neither implementation ever shipped in a production release (credit to both teams for monitoring the spec's status before going live). IE doesn't implement this spec in its current builds.
Over time the specification changed, and each browser tried to adapt by releasing successive implementation updates. Some developers even built “The Ultimate HTML5 Browser Support Test” based on specs like this one that are still in the Working Draft stage. A few users commented on the IE Blog about IE being “the only browser” not supporting it yet.
On Nov 26th, Adam Barth shared the results of his experiments with the IETF.
“The Upgrade-based handshake is vulnerable to attack in network configurations involving transparent (or intercepting) proxies.”
In other words, the current protocol used by Web Sockets may be insecure and unstable.
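To see why, consider the shape of the Upgrade-based handshake (headers abbreviated; this is a sketch of the draft protocol, not the full exchange). A transparent proxy that doesn't understand the `Upgrade` header may keep parsing the connection as ordinary HTTP, so attacker-controlled bytes sent over the socket can be misinterpreted as new HTTP requests and used, for example, to poison the proxy's cache:

```http
GET /chat HTTP/1.1
Host: example.com
Upgrade: WebSocket
Connection: Upgrade
Origin: http://example.com
```

The server answers with “HTTP/1.1 101 WebSocket Protocol Handshake”, and both sides then switch to raw WebSocket framing on the same connection; everything hinges on every intermediary understanding that switch.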
Based on this discovery, Firefox and Opera promptly did the right thing, announcing they would disable Web Sockets in future releases until a solution is found. At the time I'm writing this post, I haven't seen any announcement from Google or Apple yet, although I'm confident they will soon follow Mozilla's and Opera's lead.
In developer language, this is what I call a “breaking change”: changing (or even worse, removing) support for a feature from one version to the next, possibly breaking all the applications developers built on the assumption that the feature (a supposed “web standard”) would keep being supported going forward.
After today's announcements, I wonder how many apps using Web Sockets will need to be taken offline, “frozen”, or rewritten, at least until the spec gets more solid and secure. Did these developers use browser feature detection, or did they just assume that any version after (for example) Chrome 4 would support the feature in the same way?
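Feature detection for Web Sockets is cheap, and it is the minimum defense against exactly this kind of churn. The helper below is a hypothetical sketch (`MozWebSocket` is the prefixed constructor name Firefox later used); it takes the global object as a parameter only so it can be tested outside a browser:

```javascript
// Hypothetical detection helper: returns the name of the available
// WebSocket constructor on `global` (e.g. window), or null if none.
function detectWebSocket(global) {
  if (typeof global.WebSocket !== 'undefined') return 'WebSocket';
  if (typeof global.MozWebSocket !== 'undefined') return 'MozWebSocket'; // vendor-prefixed
  return null;
}

// Usage in a page:
//   if (detectWebSocket(window)) { /* use sockets */ } else { /* fall back */ }
```

Note that detection only tells you a constructor exists, not which draft of the protocol it speaks; it protects against removal, but not against silent behavior changes.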
What about existing enterprise solutions, for example Kaazing, that rely on the assumption that “most browsers” already support Web Sockets? Will they just fall back to the Flash-based layer (as they do for IE today) for every browser?
Personally, I like Web Sockets. I'm looking forward to seeing them available in all browsers. But I also care about consistent implementations that work the same way (interoperable, secure, stable) across every browser, over time. I don't want to write code today, fall into the “(non) Web Standard trap”, and then have to rewrite that code a year from now because the implementation wasn't quite ready for prime time and has been removed or changed.
I don’t want to see the “IE6 phenomenon” happening again to “Chrome4” or others. Do you?
Someone was so disappointed today that they created a short humorous video…
Time to go back to W3C and IETF to discuss what went wrong and look at what we can do to accelerate the progress of this (and other) specs...
I wouldn't compare it to the IE6 catastrophe, but I am curious what the solution to web sockets will be.
That's a weird comparison when one company auto-updates properly and the other has zero auto-updating aside from the OS level.
This argument might make more sense if Chrome didn't have seamless updates. If WebSockets (or any other "alpha" feature) are deemed unsafe or problematic, Google pushes an update out.
I think you are making a good point. But are "seamless updates" in the browser "enough" in this scenario?
We are talking about a feature that has been available for more than one year now.
It wasn't labeled as "incomplete" to developers. It wasn't using vendor prefix extensions (like, say, the newest CSS3 features... or the new performance APIs).
I think it's good that browser vendors experiment with work-in-progress specifications, but at the same time they have a responsibility toward developers to explain clearly that "it might change" in the future.
Otherwise, no matter how quickly you update the browser, developers' websites will break.
What do you think?
It should be pointed out that Opera 11 hasn't actually shipped yet; the WebSockets implementation has only been in alpha/beta builds. When 11 actually ships, it will be behind a preference, just like Firefox 4.
@Mike: thanks for the correction; for some reason I thought Opera 11 was already out. I've updated the post above.
It's not the standard that's at fault, it's the proxy servers.
As a previous commenter stated, the problem is not in the WebSocket protocol itself; it is in the caching proxies that can be corrupted. This is a subtle, but important distinction.
WebSockets can be used to trigger the problem, but so can Flash and Java, and neither of those technologies is being disabled. In fact, in Adam Barth's paper, he used Flash and Java sockets to run the tests, not WebSockets.
The protocol needs to be updated to avoid causing this issue, and it is sensible of the browser vendors to pause while this is done. However it is also important not to be an alarmist about this and condemn WebSockets to the scrapheap. The problems will be removed from the protocol (to accommodate vulnerable proxies), and WebSockets will continue to be a great option for pushing data to connected clients.
@Max Williams: From what I've read, nobody is being alarmist or condemning WebSockets to the scrapheap. What people are saying, quite rightly, is that the specs need more time to be fully vetted before being pushed out as production-ready implementations, WebSockets included. That's not a bad thing, especially when it comes to the safety of users.
Is it then your position, Giorgio, that IE8 should have vendor-prefixed its use of postMessage, querySelectorAll, and XDomainRequest?
I do not believe that Chrome 4's use of WebSockets will hold back the web because:
- Chrome 4 had few users, as a proportion of the web
- It has virtually zero users now
- Unlike with IE6, Google has demonstrated that it isn't going to disband the team after releasing Chrome 4, letting it grow in market share while failing to keep up with the direction of standards and developer techniques.
Even though they're a competitor, I think Google has done a good job with updating users, living with the cost of removing speculative features once they are outdated by standardized ones, and setting developer expectation.
As Mike mentioned, Chrome auto-updates so I don't think we need to worry about having a ton of old clients out there.

I think it's important to keep a few things in mind. First, WebSockets server support is still not trivial to set up. You have to jump through hoops at the moment, and in jumping through those hoops you will probably figure out that WebSockets is still in development. Yes, it will change, but if you've managed to set it up you can hopefully manage to upgrade your server code.

We wanted to ship WebSockets so that people could experiment and provide feedback as the spec was being developed. Trying to design by committee in a vacuum without any implementation experience often doesn't go well. The cost of doing that is that people have to be willing to move along to the next version when it's ready. We have made it clear from the Chrome side that we intend to ship the updated version of the protocol.
It's important to note that the research paper Adam Barth et al published does not demonstrate a working attack against the actual WebSocket implementation, but rather against one part of the protocol taken in isolation. There are other parts of the protocol that would make an actual attack more complicated in practice. Given that their experiment was conducted with existing, shipping plugins, and given that even so it only targets a small percentage of users, the likelihood of someone actually taking the effort to try to exploit someone using the WebSocket code shipped in Chrome (against which an attack has not yet been demonstrated) seems like a bit of a stretch at the moment. We have already detailed a proposal for a more secure version, and are addressing various concerns that have been raised by others in the standards community.
Senior Product Manager, Google Chrome team,
Editor, WebSocket Protocol
@Mike: I agree with you, Chrome did a good job shifting users from version to version. I also like how they use Canary builds to experiment with the platform, and I'm OK if their Canary build (with some flag enabled) doesn't display a website… that's where I have the most fun as a developer: experimenting and pushing the boundaries until the product breaks. The same applies to IE Platform Previews, Firefox Nightlies, and so on… :)
As you noted, today they are still a small (but growing) proportion of the web. Adding, changing or removing what you called a "speculative feature" between versions probably won't have a large impact on public websites.
Actually, in an ideal world, it should have almost zero impact. If the browser doesn't screw up (I'm thinking about us in the past…) AND a developer correctly uses feature detection and other best practices, his website will be solid and ready to "resist" over time, regardless of which features each browser supports (or stops supporting).
In my experience, this is not always the case. There are developers (or tools?) that sometimes make assumptions based on what's available "today". [Honestly, I admit I made the same mistake in my early years of web development.]
In this scenario, it doesn't matter that in December 2009 Chrome was at version 4 and today most Chrome users are on 8. What matters is which features were available that December, because developers building websites made their design and architecture decisions based on the features available at the time, perhaps without considering that the same supposedly standard feature could change a year later. And today all those "ex-Chrome4-now-Chrome8" users risk finding a broken site. [And by the way, there is no roll-back button…]
I'm obviously thinking about the worst-case scenario, but these things happened in the past, and unfortunately keep happening today. Even recently, I found a website using Canvas that worked in only one browser. The code was fine; I'm sure it would have worked on Trident, Gecko and WebKit, but because the developer wasn't doing feature detection properly (note: this is a euphemism :)), it didn't.
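The canvas case is the textbook argument for feature detection: probe for the capability itself instead of sniffing the browser name. Here is a minimal sketch (the `doc` parameter stands in for `document` only so the function can be tested without a browser):

```javascript
// Returns true if `doc` (normally window.document) can create a canvas
// element with a working 2D drawing context.
function supportsCanvas2d(doc) {
  var el = doc.createElement('canvas');
  return !!(el.getContext && el.getContext('2d'));
}

// Usage in a page:
//   if (supportsCanvas2d(document)) { draw(); } else { showFallback(); }
```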
IE has a long history. The IE team did amazing things in the past, and yes, they also made big mistakes (you mentioned just one; I can name many more ;)). Everyone makes their own mistakes. I'm ambitious, but I also like to be realistic. As long as we learn from each other's mistakes, I'll be happy. :)
Btw, I like your terminology of "speculative features". I wonder, however, how to define the boundary between a "speculative feature" and a "feature".
Should browsers play it safe in public builds and take 100% of the risks in "developer" builds? Or, looking at it from other points of view: how much risk is a user willing to take with their browser? How much risk does a developer weigh when building his website?
@Ian: oops, I just saw your message. Thanks for the update here; it's really appreciated. It's great to see the conversations already happening at the IETF; it must have been two very busy days :). Other readers can follow along at www.ietf.org/.../threads.html (Ian, please correct me if I'm missing other locations).
> You have to jump through hoops at the moment, and in jumping through those hoops you will probably figure out that WebSockets is still in development.
You are obviously in a good position to know how Web Sockets as a "speculative feature" (ok ok, I'll stop calling it that :)) are being used at the moment. I'd be interested to know how many developers are already using Web Sockets. Is this something you are tracking?
Btw, in case it wasn't clear: I genuinely love the idea of having sockets in a browser. I can think of some interesting scenarios coming out of it, and I'm looking forward to them reaching a solid level of stability. I appreciate the efforts of all browsers (and, in general, the whole IETF community) to make this possible!
Well, IE9 is going to have un-prefixed <video> and <audio>, and those specifications might yet change such that what's in IE9RC isn't compliant (even IE9 final, since I think that will be before HTML5 reaches Recommendation status). I don't think they're wrong to do so, either!
Sites can be broken by any change, any bug fix (including, as all ES5 implementors have found, improving standards compliance), any addition of a feature or change in the settings of a feature (like ActiveX activation). IE6 is a problem because people were encouraged to write to features that were never submitted for standardization, identified bugs in standards support weren't fixed, and performance and security weren't improved. And then it had many years to metastasize across web sites, documentation, tooling and similar. In a standards-based web, there are multiple implementations to break ties and improve site testing, and developer expectation isn't that you can write it and forget it for 5 years.
The bigger risk is that people don't implement ahead of standardization, and get broad developer, implementor and user feedback to shape the final form. I would much rather see Microsoft participating in the discussions earlier and more actively, and proposing things like site integration points for standardization, than waiting to implement anything until it's Fully Standardized.
> I would much rather see Microsoft participating in the discussions earlier and more actively, and proposing things like site integration points for standardization, than waiting to implement anything until it's Fully Standardized.
Great, we are on track then ;-)
Standardizing HTML6: blogs.msdn.com/.../standardizing-html6-through-the-w3c-my-trip-to-tpac-2010.aspx
Standards Development at W3C TPAC: blogs.msdn.com/.../web-standards-development-at-the-w3c-tpac-2010.aspx