There is a lot of interesting (and once confidential) stuff that came out of the Mix conference this week.

Jon Udell's "Watching Anders Hejlsberg reinvent the relationship between programs and data" ... offers an enthusiastic summary:

A lot of the time, when we use the web, we’re effectively performing joins among data sources. You visit one site to look up some data, then you grab some of it and plug it into another site. If you’re lucky, somebody will have built a mashup, on a third site, to facilitate that join. But what if your browser had the data manipulation chops to help you do that mashup directly? I’m hoping that technologies like Silverlight and LINQ will enable things like that to happen.

Jon doesn't elaborate a whole lot on what "reinventing the relationship between programs and data" means.  Fortunately, Anders' LINQ presentation itself is now online.  Some points he makes about the new relationship between programs and data include:

Data != Objects - Now you can query data without it being in a database ... LINQ offers "native support for queries as a first class concept". C# and VB now offer the same expressive power as SQL and XQuery built right into the languages.  LINQ unifies the whole notion of data-driven programming and raises it to a higher level of abstraction.
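
The talk's examples are in C#, but the flavor of querying a plain in-memory collection (no database anywhere) can be sketched with a Python comprehension, which is declarative in much the same spirit as a LINQ query expression. The data here is made up for illustration:

```python
# Hypothetical in-memory data -- just plain objects, no database.
people = [
    {"name": "Ada", "age": 36},
    {"name": "Grace", "age": 45},
    {"name": "Alan", "age": 41},
]

# Rough analogue of the C# query
#   from p in people where p.Age > 40 orderby p.Name select p.Name
over_40 = sorted(p["name"] for p in people if p["age"] > 40)
print(over_40)  # ['Alan', 'Grace']
```

The point is the same one Anders makes: the query is a first-class expression in the language, operating directly on whatever collection is at hand.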

Imperative->Declarative - Programmers can focus on "what, not how" without having to learn a new language such as SQL or XQuery.  Imperative approaches have run their course; we won't see 10x productivity improvements anytime soon, nor will compilers be able to automatically adapt imperative code to new hardware.  But the declarative approach will support 10x improvements in the performance of data-driven applications, and the "what, not how" approach allows compilers to adapt to multicore architectures and so forth.  The key is to shift to a more declarative, query-oriented programming style.  LINQ supports this.
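
The imperative/declarative contrast is easy to see side by side. A minimal sketch (in Python rather than C#, with invented data): the imperative version dictates the exact sequence of steps, while the declarative version states only the result wanted, leaving the runtime free to reorder, fuse, or parallelize the evaluation:

```python
data = [3, 1, 4, 1, 5, 9, 2, 6]

# Imperative: spell out *how*, one step at a time.
imperative = []
for n in data:
    if n % 2 == 0:
        imperative.append(n * n)

# Declarative: state *what* -- squares of the even numbers.
declarative = [n * n for n in data if n % 2 == 0]

assert imperative == declarative == [16, 4, 36]
```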

OO programming languages have no notion of projection; you have to do it with tedious declarations and imperative transformations.  LINQ makes this much easier, with anonymous types and functional construction.  In the demo, Anders uses this approach to transform XML data into business objects with very little code, and no schemas or code generators.
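
The demo itself is in C#, but the shape of that projection -- XML in, business objects out, no schema or code generator in between -- can be sketched in a few lines of Python with invented sample data:

```python
import xml.etree.ElementTree as ET
from dataclasses import dataclass

# Hypothetical XML input, standing in for the demo's data feed.
XML = """<contacts>
  <contact><name>Ada</name><email>ada@example.org</email></contact>
  <contact><name>Alan</name><email>alan@example.org</email></contact>
</contacts>"""

@dataclass
class Contact:
    name: str
    email: str

# Project each <contact> element directly into a business object.
root = ET.fromstring(XML)
contacts = [
    Contact(name=c.findtext("name"), email=c.findtext("email"))
    for c in root.findall("contact")
]
print(contacts[0].name)  # Ada
```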

I'd elaborate a bit on the last point.  Lots of technologies exist that leverage some explicit mapping from raw data to programming objects; that mapping might be a conceptual model, a schema, an ontology, or an adapter.  For example, XQuery is used to integrate and query over diverse data sources, but only after the data has been explicitly transformed from its native representation (e.g. SQL tables) to an XQuery data model.  LINQ to Objects and LINQ to XML, on the other hand, exploit the IEnumerable&lt;T&gt; interface (think monads/monoids) that is implicit in almost all collections of data, or exposed as axes in XML documents.  To me, that's the essence of LINQ's reinvention of the relationship between programs and data: whether data is conceptualized as an object graph, a database, an XML document (or a catalog, an LDAP directory, ad infinitum), developers can write smaller, cleaner, more parallelizable programs with much less concern for the plumbing.
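
The "implicit interface" idea has a close analogue in Python's iterator protocol: any source that can be iterated -- a list, a lazy generator, a parsed XML axis -- plugs into the same composable, lazily-evaluated query operators, much as LINQ operators compose over anything exposing IEnumerable&lt;T&gt;. A small sketch:

```python
from itertools import islice

def squares():
    """A lazy, unbounded source -- no collection materialized anywhere."""
    n = 0
    while True:
        yield n * n
        n += 1

# A filter composed over the source; nothing runs until consumed.
pipeline = (s for s in squares() if s % 3 == 0)
first_three = list(islice(pipeline, 3))
print(first_three)  # [0, 9, 36]
```

The pipeline never asks what kind of source it is querying; that indifference to the plumbing is the property described above.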


Putting all this into a realistic application, Aaron Dunnington and Tim Scudder of the Data Programmability / XML team gave a Mix presentation that exploits the XML features in Silverlight clients and LINQ on the server.  Their demo is called "The Socializer", showing how Silverlight applications can bring all sorts of social networking data together and display it in a visually appealing way.  Technologies used in the demo include Silverlight, C# 3.0, LINQ to Objects, LINQ to XML, RDF, FOAF and more.


Needless to say, all this Silverlight love from unexpected parties isn't going unchallenged.  The most popular statement of the opposing side seems to be the fine rant from Mark Pilgrim.  His most trenchant bits are unquotable, but the gist of it seems to be that Adobe and Microsoft (and Sun's "alternative to Ajax") signal a return to the bad old days of proprietary technology and vendor lock-in.  Astonishingly enough, I see it differently: the purely standards-based web is quite functional and interoperable for relatively static content, but the "Web 2.0" demands for browser-centered applications that are dynamic, attractive, and portable are very difficult to meet with currently standardized technologies.  As Dan Ingalls of Sun put it:

AJAX sort of deals with all of the old way of doing things. It makes it simpler, which is great, but underneath it’s still all this junky HTML, Document Object Model, CSS, all that stuff, where 30 years ago, we knew how to do that stuff cleanly with a dynamic programming language and a simple graphics model.

That's NOT to say that Ajax is dead.  As far as I know, there was a lot of Microsoft Ajax support announced at Mix, and more in the pipeline.  I hope and pray we (inside and outside MS) don't repeat the IE6 mistake of declaring "junky HTML, DOM, CSS, all that stuff" obsolete and stopping maintenance of the standards and implementations.  It will live on, and improve, for some time... while LINQ/Silverlight/Apollo/Flair/whatever mature and potentially drive a new generation of web standards.  But that new generation of web standards won't be invented by committees; it will be invented by developers, polished by competition, and then tidied up by committees, just as Web 1.0 came about.

It's interesting watching various pundits and analysts read the tea leaves of the numerous LINQ subprojects, Silverlight versions, and incubation projects such as Volta to figure out the "Microsoft" master plan for pulling it all together.  Just as customer pain with today's technology drives diverse innovation across the industry, so it does within Microsoft, and the process resembles evolution in action more than intelligent design by a supreme architect.  I don't think we're seeing a silly season so much as a Cambrian explosion of new ideas from all sorts of places, and Father Darwin alone knows how it will end up.