The Improvement Map database runs on SQL Server 2008 and has a straightforward design of 27 tables. Thanks to the Entity Framework, the need for stored procedures is reduced to just one. The database contains one view, which consolidates data from 10 tables and serves as an index for the advanced filtering capabilities. There is also a full-text index that supports the keyword search capability.
The Data Access Layer (DAL) for the application is made up of an Entity Data Model (EDM) built using the Entity Framework (EF). It exposes the tables/view as entity classes and provides an object model to program and query against. One of the biggest benefits of using EF for our DAL was the speed of development and the ability to leverage two other technologies in the .NET Framework 3.5 – ADO.NET Data Services and Dynamic Data (which I describe next). And my favorite feature: NO TEDIOUS CRUD STORED PROCEDURES TO WRITE AND TEST!
One of the biggest shifts in development when moving from traditional web-based applications to a RIA platform like Silverlight is accessing data. Getting data to the client is primarily done through web services - via REST, SOAP, RSS and others. Usually you write a web service that exposes all of the CRUD operations you need to consume your data model. To write and test all of this code is a tremendous amount of mundane work. The alternative is to use a tool like ADO.NET Data Services which was introduced in .NET 3.5. It allows you to automatically generate a RESTful web service that exposes your EDM for consumption over HTTP.
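As a sketch of how little code that takes: an ADO.NET Data Service is just a class deriving from `DataService<T>` over the EDM's object context. The context name `ImprovementMapEntities` below is a placeholder, not the actual class from our project.

```csharp
// Minimal ADO.NET Data Service (.svc code-behind) exposing an EDM over REST.
// "ImprovementMapEntities" is a hypothetical name for the EF object context.
using System.Data.Services;

public class ImprovementMapDataService : DataService<ImprovementMapEntities>
{
    public static void InitializeService(IDataServiceConfiguration config)
    {
        // Expose every entity set read-only; you would tighten this
        // per entity set (and add write rights) as needed.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
    }
}
```

With only this in place, each entity set becomes addressable over HTTP (for example `/ImprovementMapData.svc/Improvements`), with filtering, ordering and paging driven by query-string options.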
As with most data-driven applications, we faced the task of how we were going to get the data into the database and manage it over time. One option was to write an application that would allow users to add, edit and delete the data – which for the most part would not be very mentally challenging.
Or we could use a tool like ADO.NET Dynamic Data which was also introduced in .NET 3.5. It enables you to automatically create a fully functional web site from an EDM.
The web site provides all of the screens needed to manage the data in the database – lists, add, edit and delete screens. The Dynamic Data application can be customized using data annotations and partial classes. You can also completely replace the auto-generated screens with your own if your requirements dictate specialized behavior.
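Wiring an EDM into Dynamic Data is done once at application startup. The sketch below (the sort of code that lives in Global.asax.cs) uses a placeholder context name; `ScaffoldAllTables` is what turns on the auto-generated list/add/edit/delete screens for every table.

```csharp
// Registering an EDM with ASP.NET Dynamic Data (Global.asax.cs sketch).
// "ImprovementMapEntities" is a placeholder for the generated object context.
using System.Web.DynamicData;
using System.Web.Routing;

public static class DynamicDataSetup
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        MetaModel model = new MetaModel();
        model.RegisterContext(typeof(ImprovementMapEntities),
            new ContextConfiguration { ScaffoldAllTables = true });

        // One route covering every table/action, served by the
        // built-in scaffold page templates.
        routes.Add(new DynamicDataRoute("{table}/{action}.aspx")
        {
            Constraints = new RouteValueDictionary(
                new { action = "List|Details|Edit|Insert" }),
            Model = model
        });
    }
}
```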
Logging errors in a traditional web application is pretty straightforward because your code is running on the server and it can be given access to a logging provider. With the Silverlight application running in the browser, any errors that occur need to be sent back to the server for logging. We considered creating a web service, but for simplicity's sake we decided to create a simple ASP.NET page that can be called to save the exception details using log4net. This worked out great because all of the other applications use log4net and we were able to leverage the existing infrastructure.
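The server side of that round trip can be as small as the sketch below: a plain ASP.NET page that writes whatever exception details the Silverlight client posts to log4net. The form field names are illustrative, not taken from the actual application.

```csharp
// ASP.NET page that receives exception details posted by the
// Silverlight client and writes them to log4net.
// Field names ("message", "stackTrace") are illustrative.
using System;
using System.Web.UI;
using log4net;

public partial class LogError : Page
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(LogError));

    protected void Page_Load(object sender, EventArgs e)
    {
        string message = Request.Form["message"];
        string stackTrace = Request.Form["stackTrace"];

        Log.Error(string.Format("Silverlight client error: {0}\n{1}",
                                message, stackTrace));
    }
}
```

On the client, a `WebClient` POST to this page from the application's unhandled-exception handler completes the loop.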
One of the current limitations of Silverlight that people often mention is the inability to print. While Silverlight as a platform does not provide a printing mechanism, it is easy to use other tools that provide this functionality. We chose a tool called Tall PDF to dynamically generate a PDF on the server and send it back to the user's browser for viewing and printing. The nice thing about generating a PDF is that you have complete control over the format of the printed version. In traditional HTML-based web applications, controlling the format of printed output from the browser's print mechanism is often difficult and limited.
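The delivery side of that approach is ordinary ASP.NET plumbing; only the PDF generation itself is product-specific. In this sketch, `RenderReport` stands in for the Tall PDF calls, whose API is not shown here.

```csharp
// HTTP handler returning a server-generated PDF to the browser.
// RenderReport() is a placeholder for the Tall PDF generation code.
using System.Web;

public class ReportPdfHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        byte[] pdf = RenderReport(context.Request.QueryString["id"]);

        context.Response.ContentType = "application/pdf";
        // "inline" opens the PDF in the browser;
        // "attachment" would force a download instead.
        context.Response.AddHeader("Content-Disposition",
            "inline; filename=improvement-map.pdf");
        context.Response.BinaryWrite(pdf);
    }

    private byte[] RenderReport(string id)
    {
        // Tall PDF document-building calls would go here.
        throw new System.NotImplementedException();
    }
}
```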
We wanted to be able to provide personalization and permissions based on the user. We have a custom security framework that all of the other applications use, and we needed to integrate it to provide centralized configuration and single sign-on capability. We wrapped the security framework in an ASP.NET membership provider and then utilized a class in the .NET Framework called System.Web.ApplicationServices.AuthenticationService. This class allows you to configure a membership provider and expose it as a WCF web service. The service is secured using SSL to mitigate the security risks of transmitting users' credentials in plain text. Now users can log in to the main web site or the Silverlight application and be authenticated in both.
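The wiring for that service is mostly configuration. A sketch of the standard setup, from memory rather than our actual config: a .svc file hosting the framework's AuthenticationService, plus a web.config section that enables it and requires SSL.

```xml
<!-- Authentication.svc -->
<%@ ServiceHost Language="C#"
    Service="System.Web.ApplicationServices.AuthenticationService"
    Factory="System.Web.ApplicationServices.ApplicationServicesHostFactory" %>

<!-- web.config: enable the service; requireSSL keeps credentials
     from crossing the wire in plain text. -->
<system.web.extensions>
  <scripting>
    <webServices>
      <authenticationService enabled="true" requireSSL="true" />
    </webServices>
  </scripting>
</system.web.extensions>
```

Because the service sits in front of the membership provider, any client that can call WCF over HTTPS – the Silverlight application included – authenticates against the same custom security framework as the main site.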
The main IHI.org site has a wealth of information pertinent to the data in the tool. We wanted an easy way for the content authors to be able to relate content on the web site to data in the tool without duplication. So we modified the public site’s CMS to allow an author to tag content with special tags created for the tool. The tool then makes a call to a dynamic RSS feed passing in a tag as a parameter. The feed is then generated on the fly and passed back to the tool for display. This allows the authors to manage the content in their normal workflow.
Leveraging some of the tools I mentioned earlier (Entity Framework, ADO.NET Data Services and Dynamic Data) freed us up to spend a lot more time on designing and developing a compelling and intuitive user experience (UX). We estimate that the use of those tools saved us at least 3 months of development and testing time. It allowed us to be more agile and responsive to user feedback and the dreaded “requirements clarifications.” We were also able to make modifications to the data model much later in the project lifecycle than is typically recommended.
The application development started before the final version of Silverlight 3 was released, so we developed in Silverlight 2 hoping we could upgrade to Silverlight 3 in the middle of the project. Immediately after the release of version 3, we attempted an upgrade of the project; it went smoothly and required only a few minor coding changes. We were eager to try out some of the new v3 features like element-to-element binding, visual effects, GPU acceleration, animation easing functions, increased performance, better XAP compression (25% smaller), and merged resource dictionaries.
One of the features provided by the ADO.NET team is a Data Services library specifically designed for consuming an ADO.NET Data Service (ADO-DS) from Silverlight. In Silverlight you write LINQ queries against the data source as if it were local. The query is automatically converted to a RESTful URL and sent to the ADO-DS, which returns the data asynchronously as an ATOM feed. The data is then parsed and rehydrated into .NET objects and handed back for you to use in the application. The ATOM format is very verbose and can create large payloads, but we turned on HTTP compression and reduced a feed that was 1.5 MB to about 60 KB – about the size of a medium image on a web page. ADO-DS has the capability to send data using the more succinct JSON format, but the Silverlight client library only understands ATOM in this release.
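The client-side pattern looks roughly like the sketch below. The entity (`Improvement`), its properties, and the context/service names are illustrative stand-ins, not the actual types from the project.

```csharp
// Querying an ADO.NET Data Service from Silverlight.
// The LINQ query is translated to a RESTful URL and executed
// asynchronously; entity and property names are hypothetical.
using System;
using System.Data.Services.Client;
using System.Linq;

public class ImprovementLoader
{
    private readonly ImprovementMapEntities context =
        new ImprovementMapEntities(
            new Uri("/ImprovementMapData.svc", UriKind.Relative));

    public void LoadImprovements(string topic)
    {
        var query = (DataServiceQuery<Improvement>)
            (from i in context.Improvements
             where i.Topic == topic
             orderby i.Name
             select i);

        // Roughly: /Improvements?$filter=Topic eq 'topic'&$orderby=Name
        query.BeginExecute(OnLoaded, query);
    }

    private void OnLoaded(IAsyncResult result)
    {
        var query = (DataServiceQuery<Improvement>)result.AsyncState;
        foreach (Improvement i in query.EndExecute(result))
        {
            // hand the rehydrated objects to a ViewModel here
        }
    }
}
```

All calls are asynchronous because Silverlight's networking stack does not allow blocking the UI thread.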
We wanted to architect the application in a modular and testable fashion so we looked at the guidance from Microsoft’s Patterns and Practices group on building modular Silverlight (and WPF) applications called PRISM2. It provides code and documentation that helps you build modular applications. You can pick and choose the parts of PRISM you want to use and you can ignore or replace portions with your own. The elements that we leveraged were: a decoupled event bus, dependency injection, UI region management and a commanding infrastructure.
The Improvement Map application does not appear immense, but with 17 different views and user controls, and 5 services (traffic logging, exception logging, security, user settings, content feeds), building, testing and maintaining/enhancing it would have been difficult without a modular architecture.
The initial learning curve for PRISM was not that steep and the reference implementation provided as part of the guidance served as a great example to see how the concepts can be implemented in an actual application.
To continue with the goal of a modular architecture, we knew we needed to have a clear separation of UI from the business logic and data layers.
We looked at several design patterns and chose the awkwardly named Model-View-ViewModel (M-V-VM) pattern, which has become very popular with developers because it is tailored specifically to work with the rich databinding provided in Silverlight and WPF.
The idea is that each View has a ViewModel which is a model (data) that is tailored for display in the View and optimized for databinding. The Views are pretty dumb and their primary job is to display data and allow a user to interact with it.
The one weak area with Silverlight and M-V-VM is the lack of the rich commanding infrastructure that WPF provides. PRISM2 has a commanding implementation that we utilized but in many cases we resorted to simple event handlers in the view that call methods on the ViewModel to initiate commands/actions.
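A minimal ViewModel in this style looks like the sketch below (class and property names are illustrative). The View binds to `Title` and stays in sync via `PropertyChanged`; a simple event handler in the View's code-behind calls `Select()`, standing in for a full commanding infrastructure.

```csharp
// Minimal M-V-VM ViewModel: bindable state plus UI-free behavior.
// Names are hypothetical, not from the actual application.
using System.ComponentModel;

public class MapItemViewModel : INotifyPropertyChanged
{
    private string title;

    public string Title
    {
        get { return title; }
        set
        {
            if (title == value) return;
            title = value;
            OnPropertyChanged("Title");   // keeps databinding in sync
        }
    }

    // Invoked from a plain event handler in the View's code-behind,
    // in place of a WPF-style command binding.
    public void Select()
    {
        // selection logic lives here, testable without any UI
    }

    public event PropertyChangedEventHandler PropertyChanged;

    protected void OnPropertyChanged(string name)
    {
        PropertyChangedEventHandler handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(name));
    }
}
```

Because the ViewModel has no reference to any UI type, it can be unit tested directly.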
Given that applications built in RIA technologies do not have separate web pages like an HTML-based web application does, there is no inherent traffic logging provided by the web server. So you have to explicitly implement a mechanism to track user activity as users move through your Silverlight application.
Our other applications all send their traffic to Google Analytics and we wanted to be able to include the Improvement Map traffic data in that same dataset. So we devised a “fake” URL scheme that looks like normal web traffic but is actually being manufactured by the application.
Whenever a part of the application wants to track usage, it publishes a traffic event using the PRISM event bus, the EventAggregator. The traffic logging service (which is injected via Unity) subscribes to this event and, when notified, sends off a request to Google to store the data.
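In PRISM 2 terms, that publish/subscribe wiring looks roughly like this. The event name, payload, and service class are illustrative; `CompositePresentationEvent<T>` and `IEventAggregator` are the PRISM 2 types.

```csharp
// Decoupled traffic logging via PRISM 2's EventAggregator.
// Publisher and subscriber never reference each other directly.
using Microsoft.Practices.Composite.Events;
using Microsoft.Practices.Composite.Presentation.Events;

// The payload is the "fake" page URL manufactured by the application.
public class TrafficEvent : CompositePresentationEvent<string> { }

public class TrafficLoggingService
{
    public TrafficLoggingService(IEventAggregator eventAggregator)
    {
        eventAggregator.GetEvent<TrafficEvent>()
                       .Subscribe(OnTraffic, ThreadOption.BackgroundThread);
    }

    private void OnTraffic(string fakeUrl)
    {
        // send the fabricated page view off to Google Analytics here
    }
}

// Anywhere in a module, recording a "page view":
// eventAggregator.GetEvent<TrafficEvent>().Publish("/improvement-map/search");
```

Because publishers only know about the event type, new modules can emit traffic events without any reference to the logging service.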