Over the last two days I’ve been attending the ODF Interoperability Workshop, a fascinating event that brought together ODF implementers from many countries to talk about the issues and collaborate on interoperability testing. The workshop web site covers the details of the agenda, provides a variety of related content (including the presentations), and lists the objectives of the event:
The aim is to provide a low-level hands-on interoperability testing environment in which vendors and community members can fine tune the interoperability capabilities of their ODF implementations and make test cases, recommendations and create best practices for implementors. The ultimate goal is to achieve full seamless interoperability for the entire feature set of ODF across all suppliers, platforms and supported technologies. The workshop is meant for people who write and architect the code to handle the actual ODF in applications - desktop editors and viewers, online apps, mobile, etc. Participants should represent every major team behind the various competing ODF products, their direct (technical) management and community leaders, as well as the members of the ODF OIC committee.
It was a productive two days, both in terms of what was accomplished in the official activities of the event and also in terms of the networking opportunities it provided.
The first day started with a speech by Frank Heemskerk, the Minister of Foreign Trade for the Netherlands. He discussed the Dutch government’s policy on the use of open standards, and made a direct appeal to the attendees to “go beyond compliancy and help achieve broad-based open standards."
Mr. Heemskerk was followed by Ineke Schop, Program Manager for Netherlands in Open Connection. Ms. Schop described her view of the goals and aspirations of open standards in general, as well as some of the specific steps being taken by her organization to deliver on those objectives.
After that we got into the details of working through ODF interoperability issues. There were a variety of sessions by implementers and members of the ODF TC and OIC TC, and you can find all the details in the agenda posted on the workshop web site. Note that the presentations are also included in the online agenda – several have already been posted, and the rest will be available soon. Video interviews were recorded with many of the attendees, and those should be available soon as well.
It was great to see some old friends again, and I also met many people I had previously known only through their voices on ODF TC calls or their online presence in the ODF community, including Oliver-Rainer Wittmann (Sun), Mingfei Jia (IBM), Marc Maurer (AbiWord), Zaheda Bhorat (Google), and many others. My colleague Peter Amstein, the chief architect of our ODF support, was also in attendance, and it was an opportunity for him to get to know the people behind many other ODF implementations.
During the afternoon of each day, we did interoperability testing and had many informal discussions about specific technical issues. Some of the tests were based on specific issues that people already knew about, and at other times we worked through specific scenarios that OIC TC chair Bart Hanssens had defined, as well as scenarios that attendees created. This testing identified varying interpretations of the spec, bugs, and other issues that can now be resolved to improve the overall state of interoperability. I won’t discuss the specific details of those tests, because we were asked to conduct ourselves in accordance with the Chatham House Rule and not name specific products in post-event blogging or reporting. This policy was in place to ensure that the event could be productive and results-oriented, and I’d say it worked very well – all of the implementers were open and pragmatic about working through issues that came up in testing.
There were many bloggers and Twitterers in attendance, so I expect others will post their thoughts on the event after everyone gets back home; I noticed that Floschi already has a nice summary posted.
It was a very useful event, and I’d like to give special thanks to Fabrice Mous and Michiel Leenaars, who worked tirelessly to provide a great experience for the attendees.
Wow! That looked like a busy 2-day hack-fest!
Will we get more data on the scenarios described here: http://plugtest.opendocsociety.org/doku.php ?
Embedded spreadsheets in text documents and in presentations are left blank (no implementation bugs listed)... Is that because of a radically different approach to the problem from suite to suite (one embeds a complete ODF document, another simply copies over a 'table' element, following draft ODF 1.2...), which made testing irrelevant?
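For context on the two embedding approaches mentioned above: an ODF package is a zip archive, and the "complete sub-document" style stores the embedded spreadsheet in its own sub-directory with its own content.xml, referenced from the main document. A minimal sketch of that layout (file names and markup here are illustrative, not taken from any particular suite):

```python
import io
import zipfile

# Build a tiny in-memory zip that mimics the "embed a complete
# sub-document" layout: the embedded spreadsheet lives in its own
# sub-directory ("Object 1/") with its own content.xml. The main
# content.xml would reference it via something like
# <draw:frame><draw:object xlink:href="./Object 1"/></draw:frame>,
# whereas the alternative approach inlines a table:table element
# directly in the main content.xml instead.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as odt:
    odt.writestr("content.xml", "<office:document-content/>")
    odt.writestr("Object 1/content.xml", "<office:document-content/>")

# Inspecting the package reveals the embedded sub-document.
with zipfile.ZipFile(buf) as odt:
    embedded = [n for n in odt.namelist() if "/" in n]
print(embedded)  # ['Object 1/content.xml']
```

The same inspection works on a real .odt saved by a suite that uses sub-document embedding; a suite that copies the table inline would show no such sub-directory.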
About my earlier comment on your previous blog post: yes, I understand you value documentation in a development process; indeed, documentation is extremely important, especially when aiming for interoperability (in some cases, access to source code would help); taking part in and actively pushing these plug-fests forward is GOOD.
About a 'filter' (let's change the word to parser/loader): would it be possible, in the future, to have access to alpha, beta, nightly, etc. builds of the ODF parser/loader for Office 2007 SP2, so that someone on the other side of the world who can't speak English can try it?
Looking at other suites, several have localized versions and use localized mailing lists for user comments and support (some also have localized developer mailing lists for major languages); that allows a Vietnamese user to report a bug in, say, glyph alignment, and discuss it and how reproducible it is, BEFORE it is translated into a main developers' bug report (by then, it has been well documented, and sometimes the faulty code detected).
For example, I recently reported a valid bug in an MS product in my native language (which is one of the 10 most spoken languages in the world); the answer that followed, and the bug report itself (apart from my submission), were translated into English. I was able to follow up and continue until the bug report was closed, but it all took place in English.
Here, you'd have already lost most of the world's population that doesn't speak English: how can you discuss and document a bug, something that requires precision and understanding, if one side doesn't speak the language of the other? Localized support would be nice at first, even if informal (mailing lists).
When you don't have code or can't compile, you can at least make use of daily builds, and send said build to a bug reporter to test if the bug was solved; in many cases, if you simply give access to daily builds publicly, you can allow users that submitted a bug to maintain and close their bug themselves - offloading your own staff's workload, and reducing inertia (the delay between a bug appearing/changing and developers being notified).
"non-registered non-paying customers" could be: users of previous MS Office versions who want to test future builds without going through the hurdle of registering for Connect beta programs (many are invite-only, so they are not even sure they CAN register) or of paying for an MSDN subscription. So yes, in contrast to paying, registered customers :p
Why would they want to do that? To test a future product's advancement, and see how it fits against their requirements. If they think it matches mostly well, but notice an annoying bug, THEN they may want to register and report said bug - not before.
Mitch: the participants agreed to keep the plugtest wiki alive, so make sure you'll visit it on a regular basis ;-)
@Bart: I'll certainly keep an eye on the wiki. Note that a complete description of an interoperability problem is a very good thing; openly available trial versions of solutions to said problem are an even better thing (thus my lengthy comment above ^^)