At last night’s meeting of the Learn Silverlight group I’m a part of, the topic of search visibility for Silverlight pages came up. I had heard a few times that XAML being plain text, and the Silverlight .xap container being a standard zip, was supposed to make things easier to parse and search, but I hadn’t seen an actual example of this. There was quite a bit of interest in figuring out how to index a site made up primarily of Silverlight, with the content accessed through the container, so we began poking around.
As with many of the questions that are asked in our group, we started looking for answers by seeing how Vertigo did it. I remembered that you could share links to specific pieces of memorabilia on the Hard Rock site, so we went there. I picked a random piece of memorabilia and grabbed its permalink.
From the link, we could see that they were just passing an argument to the base page. Loading this page, we could see that the title and meta tags had changed, but it was basically the same page. It made sense, but how were they exposing this to the search engines? I navigated over to their robots.txt file, and there was the answer: they had defined a sitemap, which contained links to every item on the site. Each item was just an argument to the same page, but the search engine was seeing each one as a unique page. It didn’t need to read the Silverlight control at all. The Silverlight control was able to show the item based on the page argument, and the page prettied itself up with a title and some meta content to make itself relevant to the search engines. The same thing could be done inside the Silverlight page, but they were removing the necessity to do so. They were basically taking the same approach as they would for any site with dynamic content. Well played, Vertigo.
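To make the idea concrete, here’s a minimal sketch of that sitemap trick in Python. The URL pattern and item ids are hypothetical examples (not Vertigo’s actual ones); the point is just that every deep-linkable item gets its own `<url>` entry, all pointing at the same page with a different query-string argument, so a crawler indexes each item without ever touching the Silverlight control:

```python
# Sketch: emit a sitemap that gives each deep-linkable item its own URL.
# Page URL and item ids below are made-up examples.
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(page_url, item_ids):
    """Return sitemap XML: one <url>/<loc> entry per item, all hitting
    the same host page with a different argument."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for item_id in item_ids:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = f"{page_url}?item={item_id}"
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    print(build_sitemap("http://example.com/memorabilia.aspx",
                        ["guitar-123", "jacket-456"]))
```

The last piece is the one we found on the Hard Rock site: a `Sitemap:` line in robots.txt pointing at the generated file, so the crawler knows where to find the full list of item permalinks.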