Last time, I spoke a bit about the general process of enabling search engines to index content in a Silverlight web application.  This time, I’d like to deep-dive into the SilverlightStore SEO Example application available online.

Overview of the Sample Application

The SilverlightStore application is a super simple application—intentionally light on functionality—meant to provide a clear overview of how to SEO-enable a Silverlight site.  It uses Silverlight 3, ASP.NET and the new .NET RIA Services.

Figure 1 – screenshot of the SilverlightStore sample application.

The sample application uses a simple XML file (“/App_Data/Products.xml”) containing information about Microsoft products.  This information is used to populate a collection of POCO Product objects (via LINQ) that are exposed to our Silverlight client using a .NET RIA Services DomainService.

The DomainService exposes product information to our Silverlight client and also to our ASP.NET controls (via the DomainDataSource control).
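To make that flow concrete, here is a minimal sketch of what a DomainService like this could look like.  Only the names ProductsDomainService, GetProducts, and the Name/Summary/Price/ImageSmall properties come from the article; the Product class shape, the LINQ-to-XML loading, and the namespaces are assumptions based on the .NET RIA Services preview bits registered in the markup below, not the exact sample code.

```csharp
// Sketch only: a LINQ-over-XML DomainService like the one described above.
// Class shape and loading details are assumptions, not the sample's exact code.
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.DomainServices; // .NET RIA Services preview bits
using System.Xml.Linq;

public class Product
{
    public string Name { get; set; }
    public string Summary { get; set; }
    public decimal Price { get; set; }
    public string ImageSmall { get; set; }
}

[EnableClientAccess]
public class ProductsDomainService : DomainService
{
    // Populate the POCO collection from /App_Data/Products.xml via LINQ to XML.
    public IEnumerable<Product> GetProducts()
    {
        XDocument doc = XDocument.Load(
            HttpContext.Current.Server.MapPath("~/App_Data/Products.xml"));

        return from p in doc.Descendants("Product")
               select new Product
               {
                   Name = (string)p.Element("Name"),
                   Summary = (string)p.Element("Summary"),
                   Price = (decimal)p.Element("Price"),
                   ImageSmall = (string)p.Element("ImageSmall")
               };
    }
}
```

Because both the Silverlight client and the ASP.NET controls query this one class, any change to the query logic automatically flows to every consumer.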

Logical Flow
Figure 2 – Logical separation of components in the sample application.

The beauty of this flow is that our content comes from one centralized source: the DomainService.  One could replace the DAL with the ADO.NET Entity Framework, LINQ to SQL, NHibernate, etc. without breaking the flow.  Similarly, one can bring on new clients (such as an AJAX client) and plug them into the DomainService.

The example application contains a bit of Silverlight code to consume the DomainService and display products and product details—I won’t be getting into those details here*.

*Unless one would find it interesting, in which case please do let me know.

Generation of Down-Level Content

Let’s dig into the good stuff.  This application uses the same Master to Content Page relationship described in part one so I won’t dwell on that point.  What I do want to review is the use of the ASP.NET DomainDataSource control used to query our DomainService for entities and generate the corresponding down-level content.

If you crack open the Default.aspx page, you’ll notice the use of the DomainDataSource and how it refers to the ProductsDomainService.GetProducts() method.  We then pair this data source control with a traditional ASP.NET ListView control to render all of our products inside of the Silverlight &lt;OBJECT&gt; tag defined in the MasterPage.

<%@ Register TagPrefix="ria"
    Namespace="System.Web.DomainServices.WebControls"
    Assembly="System.Web.DomainServices.WebControls" %>

<asp:Content ID="DownLevelContent" ContentPlaceHolderID="SilverlightDownLevelContent" runat="server">
  <h2>Products</h2>
  <ria:DomainDataSource runat="server" ID="ProductsDomainDataSource"
      DomainServiceTypeName="SilverlightStore.ProductsDomainService"
      SelectMethod="GetProducts" />
  <asp:ListView runat="server" DataSourceID="ProductsDomainDataSource" DataKeyNames="Name">
    <LayoutTemplate>
      <ul class="product-list">
        <asp:PlaceHolder id="itemPlaceholder" runat="server" />
      </ul>
    </LayoutTemplate>
    <ItemTemplate>
      <li class="product">
        <span class="image">
          <a href="Product.aspx?Name=<%# Eval("Name")%>">
            <img src="<%# Eval("ImageSmall")%>" alt="<%# Eval("Name")%>" />
          </a>
        </span>
        <br />
        <span class="name">
          <a href="Product.aspx?Name=<%# Eval("Name")%>">
            <%# Eval("Name")%>
          </a>
        </span>
        <span class="summary"><%# Eval("Summary")%></span>
        <span class="price"><%# Eval("Price", "{0:c}")%></span>
      </li>
    </ItemTemplate>
  </asp:ListView>
</asp:Content>

The Default.aspx page renders a product listing that contains links to the Product.aspx product detail page.  If you dig into the Product.aspx page, you’ll notice that it uses the DomainDataSource control in a similar fashion but displays all of the product details.

If you disable JavaScript (or Silverlight specifically) and load the Default.aspx page in your browser, you’ll notice that the down-level experience provides you with the same content you’d see in Silverlight.  You’re able to review all of the products and dig into the details—you just miss out on some of the fancy transitions and effects.  Not only are we making this content indexable by search engines, we’re also broadening our client reach.

Sitemaps and Robots

Indexable content is all well and good … but how do you let the search engines know you exist?  An organic approach is to have external sites link to you; search engine crawlers will naturally stumble upon your site and begin indexing it.  (In fact, this will happen no matter what you do.)  But why leave things to chance?

If you’re serious about SEO, there are two components you must add to your site:

  1. A Robots.txt file.
  2. One or more XML sitemaps.

The Robots.txt file is used to define directives for search engine crawlers to obey. (Note: crawlers are not required to obey these directives—but most will.)  You can define which user-agents are allowed access to which areas of your site.  You can even declare the location of your sitemaps.  (We’ll get into sitemaps in a moment.)
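As a sketch, a robots.txt for a site like this one might look as follows.  The paths and sitemap address are illustrative assumptions, not taken from the sample:

```
# Allow all crawlers, but keep them out of the raw data folder.
User-agent: *
Disallow: /App_Data/

# Declare where the sitemap lives (address is illustrative).
Sitemap: http://www.example.com/Sitemap.aspx
```

Note that robots.txt must live at the root of the site for crawlers to find it.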

For a real world example of a robots.txt file, I’d recommend you check out Amazon’s robots.txt.  You’ll notice that it defines a few areas of the site that should not be indexed and also provides some sitemap links.

XML Sitemap

Sitemaps are used to provide search engines with … well, a map of your site.  :)  You can define all of the important entry point URLs that you want search engines to index.

You should definitely provide links to your common entry points in your sitemap, along with additional information to indicate the update frequency and priority.  That’s not to say crawlers won’t continue to crawl your site without sitemaps—they will—but a sitemap allows you to express more information about your site than crawling alone can discover.
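For reference, a single-entry sitemap conforming to the standard looks like this (the URL and values are illustrative, not from the sample):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is required; the other three elements are optional hints. -->
    <loc>http://www.example.com/Product.aspx?Name=SomeProduct</loc>
    <lastmod>2009-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

This is exactly the shape of XML the Sitemap.aspx page below generates—one &lt;url&gt; entry per product.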

If you look at the “Sitemap.aspx” page inside of the SilverlightStore sample application, you’ll notice that it uses the DomainDataSource and an ASP.NET Repeater control to iterate through all of the products and generate XML (conforming to the sitemap standard) with links to all of the product detail pages.

<%@ Register TagPrefix="ria"
    Namespace="System.Web.DomainServices.WebControls"
    Assembly="System.Web.DomainServices.WebControls" %>

<ria:DomainDataSource runat="server" ID="ProductsDomainDataSource"
    DomainServiceTypeName="SilverlightStore.ProductsDomainService"
    SelectMethod="GetProducts" />
<asp:Repeater runat="server" DataSourceID="ProductsDomainDataSource">
  <HeaderTemplate>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  </HeaderTemplate>
  <ItemTemplate>
    <url>
      <loc><%= new Uri(this.Request.Url, "Product.aspx?Name=").ToString()%><%# Eval("Name")%></loc>
      <lastmod><%# Eval("LastModified")%></lastmod>
      <changefreq><%# Eval("ChangeFrequency")%></changefreq>
      <priority><%# Eval("Priority")%></priority>
    </url>
  </ItemTemplate>
  <FooterTemplate>
    </urlset>
  </FooterTemplate>
</asp:Repeater>

Putting It All Together – Deep Links

Now we have a Silverlight application with content, the corresponding down-level HTML content, a robots.txt and sitemap to help guide search engine indexing.  What’s left?  We need a good deep-linking story!

Now that search engines can index the down-level deep-links to our product detail pages, how do we respond to those deep-links if a user visits with Silverlight installed?  We need to convey the deep-link information contained in the URL to the Silverlight application.

In the SilverlightStore application, we do this by passing the deep-link information to our Silverlight client using InitParams.

If a user arrives at our site via a link like http://&lt;site&gt;/Product.aspx?ProductID=123, we can handle this inside of Product.aspx by passing the query information to the Silverlight control.  (This is done in the sample application by appending to the SeoSilverlightApplication.InitParams property.  If you’re rendering your own &lt;OBJECT&gt; tag, you’ll want to handle this &lt;PARAM&gt; modification yourself.)
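If you are rendering your own &lt;OBJECT&gt; tag, the modification amounts to emitting the query value into the initParams &lt;PARAM&gt;.  A sketch—the .xap path and the “deepLink” key name are illustrative assumptions, not the sample’s actual names:

```xml
<object data="data:application/x-silverlight-2," type="application/x-silverlight-2">
  <param name="source" value="ClientBin/SilverlightStore.xap" />
  <!-- Pass the deep-link through to the Silverlight application.
       "deepLink" is an illustrative key name. -->
  <param name="initParams"
         value="deepLink=<%= Server.UrlEncode(Request.QueryString["Name"]) %>" />
</object>
```

InitParams values are comma-delimited key=value pairs, so be sure to encode any user-supplied values before emitting them.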

The InitParams values are treated as a dictionary collection inside of the Silverlight application and you can access these by either listening to the Application.Startup event or by registering your own application service.  (The sample application uses an application service called DeepLinkService.)
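Reading those values out of the Startup event is straightforward.  A sketch—the “deepLink” key and the NavigateToProduct helper are illustrative assumptions, not the sample’s DeepLinkService:

```csharp
// In App.xaml.cs: pull the deep-link out of the InitParams dictionary.
// "deepLink" is an assumed key name; NavigateToProduct is a hypothetical helper.
private void Application_Startup(object sender, StartupEventArgs e)
{
    string deepLink;
    if (e.InitParams != null && e.InitParams.TryGetValue("deepLink", out deepLink))
    {
        // Hand the value to whatever navigates to the right product view.
        // (The sample application wraps this in an application service instead.)
        NavigateToProduct(deepLink);
    }
}
```

Registering an application service (as the sample’s DeepLinkService does) keeps this plumbing out of App.xaml.cs, but the dictionary access is the same.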

The only thing left is to determine how to map URL deep-link information to Silverlight state.  In the sample application, we pass the product name as a deep-link and the Silverlight application understands that it must construct its own internal XAML Uri to load the product.  There are plenty of other ways of doing this, such as making use of the HttpRequest PathInfo string in coordination with UriMapping inside of Silverlight 3.
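For the UriMapping approach, Silverlight 3’s navigation framework lets you declare the mapping in XAML.  A sketch—the Uri patterns and view path are illustrative assumptions:

```xml
<!-- UriMapper from the Silverlight 3 navigation framework
     (System.Windows.Controls.Navigation assembly). -->
<navigation:Frame Source="/Home">
  <navigation:Frame.UriMapper>
    <uriMapper:UriMapper>
      <!-- Map a friendly deep-link Uri onto the product details view. -->
      <uriMapper:UriMapping Uri="/Product/{name}"
                            MappedUri="/Views/ProductDetails.xaml?Name={name}" />
    </uriMapper:UriMapper>
  </navigation:Frame.UriMapper>
</navigation:Frame>
```

The {name} token is captured from the incoming Uri and forwarded to the mapped page’s query string, which keeps the deep-link handling declarative.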


Once again, we see that Silverlight 3, ASP.NET and .NET RIA Services together provide a powerful array of tools that can be used to quickly build a compelling Silverlight application that is both indexable and discoverable.