In this blog post we are going to walk through an example of how to extend the SEO Toolkit functionality. Let's pretend our company has a large Web site that includes several images, and we want to make sure all of them comply with a certain standard: say, all of them should be smaller than 1024x768 pixels and have a quality of no less than 16 bits per pixel. Additionally, we would like to be able to write custom queries that let us further analyze the contents of the images and filter based on directories and more.
To do this we will extend the SEO Toolkit crawling process to perform the additional processing for images, adding the following new capabilities:
A crawler module is a class that extends the crawling process in Site Analysis to provide custom functionality while processing each URL. By deriving from this class you can easily raise your own set of violations or add your own data and links to any URL.
It includes three main methods:
Create a Class Library in Visual Studio and add the code shown below.
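The full listing belongs here; as a reference, the following is a minimal sketch of the shape such a module takes. The CrawlerModule base class, the override names, the ResponseStream property, and the way the crawler hands a UrlInfo to Process are assumptions reconstructed from the description in this post, not verbatim API; only the System.Drawing calls are standard .NET:

```csharp
using System;
using System.Drawing;
using Microsoft.Web.Management.SEO.Crawler;

namespace SampleCrawlerModule {

    // Sketch only: base-class and member names are assumptions,
    // taken from the description in this post rather than the exact API.
    internal class ImageExtension : CrawlerModule {

        public override void BeginAnalysis() {
            // Register the three custom fields ("Image Width", "Image Height",
            // "Image Pixel Format") with the report here, specifying their
            // display text and type, as described below.
        }

        public override void Process() {
            // Obtain the UrlInfo being processed from the crawler context,
            // skip non-image content types, and run the checks:
            // ProcessImage(urlInfo);
        }

        // The validation logic itself uses standard System.Drawing calls.
        private static void ProcessImage(UrlInfo url) {
            using (Image image = Image.FromStream(url.ResponseStream)) {
                if (image.Width > 1024 || image.Height > 768) {
                    // raise the custom "image too large" violation
                }
                if (Image.GetPixelFormatSize(image.PixelFormat) < 16) {
                    // raise the custom "low color depth" violation
                }
            }
        }

        public override void EndAnalysis() {
            // nothing to clean up in this sample
        }
    }
}
```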
As you can see, in BeginAnalysis the module registers three new properties with the Report using the Crawler property. This is only required if you want to provide custom display text or use a type other than string. Note that the current version only allows primitive types such as Integer, Float, DateTime, etc.
During the Process method it first makes sure that it only runs for known image content types, and then it performs the validations, raising a set of custom violations that are defined in the Violations static helper class. Note that we load the content from the Response Stream, which is the property that contains the raw content received from the server. If you were analyzing text, the Response property would contain the content instead (this is based on Content Type, so HTML, XML, CSS, etc., will be kept in this String property).
When running inside IIS Manager, a crawler module first needs to be registered as a standard UI module, and then inside its initialization it needs to register itself using the IExtensibilityManager interface. To keep the code as simple as possible, everything is added in a single file. Add a new file called "RegistrationCode.cs" and include the contents below:
This code defines a standard UI IIS Manager module and in its client-side initialize method it uses the IExtensibilityManager interface to register the new instance of the Image extension. This will make it visible to the Site Analysis feature.
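The registration file takes roughly the following shape. Treat it as a sketch: the Module base class, the Initialize signature, and the RegisterExtension call are reconstructed from the description above, and ImageExtension stands for the crawler module class created earlier (the name is illustrative):

```csharp
using System;
using Microsoft.Web.Management.Client;
using Microsoft.Web.Management.SEO.Crawler;

namespace SampleCrawlerModule {

    // A standard IIS Manager client-side UI module. Its only job is to
    // register the ImageExtension instance with the extensibility manager
    // so that Site Analysis will call it while crawling.
    internal class RegistrationCode : Module {

        protected override void Initialize(IServiceProvider serviceProvider,
                                           ModuleInfo moduleInfo) {
            base.Initialize(serviceProvider, moduleInfo);

            IExtensibilityManager extensibilityManager =
                (IExtensibilityManager)GetService(typeof(IExtensibilityManager));
            extensibilityManager.RegisterExtension(
                typeof(CrawlerModule), new ImageExtension());
        }
    }
}
```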
To test it we need to add the UI module to Administration.config, which also means that the assembly needs to be registered in the GAC.
To strongly name the assembly
In Visual Studio you can do this easily through the "Project->Properties" menu: select the "Signing" tab, check "Sign the assembly", and choose a key file. If you don't have one, just choose New and specify a name.
After this you can compile, and you should be able to add the assembly to the GAC.
To GAC it
If you have the Windows SDK installed, you should be able to call gacutil as in my case:
"\Program Files\Microsoft SDKs\Windows\v6.0A\bin\gacutil.exe" /if SampleCrawlerModule.dll
(Note: you could also just open Windows Explorer, navigate to c:\Windows\assembly and drag & drop your file in there; that will GAC it automatically.)
Finally, to see the right name that should be used in Administration.config, run the following command:
"\Program Files\Microsoft SDKs\Windows\v6.0A\bin\gacutil.exe" /l SampleCrawlerModule
In my case it displays:
SampleCrawlerModule, Version=18.104.22.168, Culture=neutral, PublicKeyToken=6f4d9863e5b22f10, …
Finally register it in Administration.config
Open Administration.config in Notepad using an elevated instance, find the </moduleProviders> and add a string like the one below but replacing the right values for Version and PublicKeyToken:
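The entry takes roughly the following shape. The name is illustrative, RegistrationCode stands for the UI module class created above, the PublicKeyToken is the one gacutil printed in my case, and Version must be replaced with the value from your own gacutil /l output:

```xml
<add name="SEOSampleCrawlerModule" type="SampleCrawlerModule.RegistrationCode, SampleCrawlerModule, Version=1.0.0.0, Culture=neutral, PublicKeyToken=6f4d9863e5b22f10" />
```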
After registration you should now be able to launch IIS Manager and navigate to Search Engine Optimization. Start a new analysis of your Web site. Once it completes, any violations will show up in the Violations Summary or any other report. For example, see below all the violations in the "Images" category.
Since we also extended the metadata by including the new fields (Image Width, Image Height, and Image Pixel Format), you can now use them with the Query infrastructure to easily create a report of all the images:
And since they are standard fields, they can be used in Filters, Groups, and any other functionality, including exporting data. So for example the following query can be opened in the Site Analysis feature and will display an average of the width and height of images summarized by type of image:
And of course violation details are shown as specified, including Recommendation, Description, etc:
As you can see, extending the SEO Toolkit using a Crawler Module allows you to provide additional information, whether Metadata, Violations, or Links, for any document being processed. This can be used to add support for content types not supported out of the box, such as PDF, Office documents, or anything else that you need. It can also be used to extend the metadata by writing custom code to wire data from other systems into the report, giving you the ability to exploit this data using the Query capabilities of Site Analysis.
The IIS SEO Toolkit includes a lot of functionality built in, such as built-in violation rules, processing of different content types (like HTML, CSS, RSS, etc.) and more. However, it might not do everything you need it to do; for example, it might not process a set of documents that you use, or it might not gather all the information that you are interested in while processing a document. The good news is that it includes enough extensibility to let you build on top of its rich capabilities and easily provide additional ones using .NET.
There are three main extensibility points in this first release, including:
This is the first of a series of extensibility blog entries for the IIS SEO Toolkit where I will cover all of the extensibility points mentioned above.
Two weeks ago I presented at DevConnections the talk "AMS10: Developing and Deploying for the Windows Web App Gallery", here are the slides.
Download the Web Application Gallery Talk slides here.
A few final links:
Microsoft Web Platform: http://www.microsoft.com/web/
Download Web PI: http://www.microsoft.com/web/downloads/platform.aspx
Submit your Applications at: http://www.microsoft.com/web/gallery/developer.aspx
In the new version of the IIS SEO Toolkit we added two new reports that are very interesting, both from an SEO perspective as well as from a user experience and site organization perspective. These reports are located in the Links category of the reports.
This report shows a summary of all the redirects that were found while crawling the Web site. The first column (Linking-URL) is the URL that was visited that resulted in redirection to the Linked-URL (second column). The third column (Linking-Status code) specifies what type of redirection happened based on the HTTP status code enumeration. The most common values will be MovedPermanently/Moved which is a 301, or Found/Redirect which is a 302. The last column shows the status code for the final URL so you can easily identify redirects that failed or that redirected to another redirect.
This report is interesting because redirects might affect your search engine rankings and give your users the perception that your site is slower. For more information on redirects see: Redirects, 301, 302 and IIS SEO Toolkit
This is probably one of my favorite reports since it is almost impossible to find this type of information in any other 'easy' way.
The report basically tells you how hard it is for users that land on your home page to get to any of the pages in your site. For example, the image below shows that it takes 5 clicks for a user to get from the home page of my site to the XGrid.htc component.
This is very valuable information because you will be able to understand how deep your Web site is; in my case, if you were to walk the entire site and lay out its structure in a hierarchical diagram, it would basically be 5 levels deep. Remember, you want your site to be shallow so that it is easily discoverable and crawled by Search Engines.
Even more interesting, you can double-click any of the results and see the list of clicks that the user has to make to get to the page.
Note that it shows the URL, the Title of the page, and the Text of the Link you need to click to get to the next URL (the one with a smaller index). So, as you can see in my case, the user needs to go to the home page and click the link with text "XGrid", which takes them to the /XGrid/ URL (index 3), where they then need to click the link with text "This is a new...", etc.
Note that as you select the URLs in the list it will highlight in the markup the link that takes you to the next URL.
The data of this report is powered by a new type of query we call a Route Query. This is interesting because you can customize the report to add different filters, change the start URL, and more.
For example, let's say I want to figure out all the pages that a user can get to when they land on a specific page of my site, say http://www.carlosag.net/Tools/XGrid/editsample.htm:
In the Dashboard view of a Report, select the option 'Query->New Routes Query'. This will open a new Query tab where you can specify the Start URL that you are interested in.
As you can see, this report clearly shows that if a user visits my site and lands on this page, they will basically be blocked and only able to see 8 pages of the entire site. This is a clear example of where a link to the Home page would be beneficial.
Another common scenario this query infrastructure can be used for is finding ways to direct traffic from your most common pages to your conversion pages; this report will let you figure out how difficult or easy it is to get from any page to your conversion pages.
One question that I've been asked several times is: "Is it possible to schedule the IIS SEO Toolkit to run automatically every night?". Other related questions are: "Can I automate the SEO Toolkit so that as part of my build process I'm able to catch regressions on my application?", or "Can I run it automatically after every check-in to my source control system to ensure no links are broken?", etc.
The good news is that the answer is YES! The bad news is that you have to write a bit of code to make it work. Basically, the SEO Toolkit includes a managed code API that can start the analysis just like the user interface does, and you can call it from any application you want using managed code.
In this blog I will show you how to write a simple command application that will start a new analysis against the site provided in the command line argument and process a few queries after finishing.
The most important type included is a class called WebCrawler. This class takes care of all the process of driving the analysis. The following image shows this class and some of the related classes that you will need to use for this.
The WebCrawler class is initialized through the configuration specified in the CrawlerSettings class. It also contains two methods, Start() and Stop(), which start and stop the crawling process in a set of background threads. Through the Report property you can gain access to the CrawlerReport, which represents the results (whether completed or in progress) of the crawling process. It has a method called GetUrls() that returns all the UrlInfo items. UrlInfo is the most important class; it represents a URL that has been downloaded and processed and holds all the metadata, such as Title, Description, ContentLength, ContentType, and the set of Violations and Links that it includes.
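Before the full listing, here is a sketch of how these classes fit together in a console application. Members other than those just described (Start, Stop, Report, IsRunning, GetUrls, and the CrawlerSettings constructor taking a Uri) are assumptions and may differ from the real API:

```csharp
using System;
using System.Threading;
using Microsoft.Web.Management.SEO.Crawler;

namespace SEORunner {
    class Program {
        static void Main(string[] args) {
            // Configure the crawl from the command-line URL and give the
            // report a unique name so repeated runs do not collide.
            CrawlerSettings settings = new CrawlerSettings(new Uri(args[0]));
            settings.Name = "SEORunner " +
                DateTime.Now.ToString("yyyy-MM-dd HH-mm-ss");

            // Start the background worker threads and poll until done.
            WebCrawler crawler = new WebCrawler(settings);
            crawler.Start();

            Console.WriteLine("Processed - Remaining - Download Size");
            while (crawler.IsRunning) {
                Thread.Sleep(1000);
                // print progress counters from crawler.Report here
            }

            // crawler.Report is the CrawlerReport; GetUrls() enumerates
            // every UrlInfo for the summary queries described later.
        }
    }
}
```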
If you are not using Visual Studio, you can just save the contents above in a file, call it SEORunner.cs and compile it using the command line:
C:\Windows\Microsoft.NET\Framework\v3.5\csc.exe /r:"c:\Program Files\Reference Assemblies\Microsoft\IIS\Microsoft.Web.Management.SEO.Client.dll" /optimize+ SEORunner.cs
After that you should be able to run SEORunner.exe and pass the URL of your site as an argument; you will see output like:
Processed - Remaining - Download Size
56 - 149 - 0.93 MB
127 - 160 - 2.26 MB
185 - 108 - 3.24 MB
228 - 72 - 4.16 MB
254 - 48 - 4.98 MB
277 - 36 - 5.36 MB
295 - 52 - 6.57 MB
323 - 25 - 7.53 MB
340 - 9 - 8.05 MB
358 - 1 - 8.62 MB
362 - 0 - 8.81 MB
Start URL: http://www.carlosag.net/
Start Time: 11/16/2009 12:16:04 AM
End Time: 11/16/2009 12:16:15 AM
Status Code summary
OK - 319
MovedPermanently - 17
Found - 23
NotFound - 2
InternalServerError - 1
The most interesting method above is RunAnalysis. It creates a new instance of CrawlerSettings and specifies the start URL. Note that it also specifies that we should consider internal all the pages that are hosted in the same directory or subdirectories. We also set a unique name for the report and use the same directory as the IIS SEO UI so that opening IIS Manager will show the reports just as if they were generated by it. Then we finally call Start(), which will start the number of worker threads specified in the WebCrawler.WorkerCount property. We then just wait for the WebCrawler to be done by polling the IsRunning property.
The remaining methods just leverage LINQ to perform a few queries and output things like a report aggregating all the URLs processed by status code.
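As an illustration, a status-code summary like the one in the output above can be produced with a LINQ group-by over GetUrls(). The StatusCode property name is an assumption; GetUrls() is the CrawlerReport method mentioned earlier:

```csharp
// Prints one line per HTTP status code with the number of URLs
// that returned it, most frequent first.
static void WriteStatusCodeSummary(CrawlerReport report) {
    var summary = from url in report.GetUrls()
                  group url by url.StatusCode into g
                  orderby g.Count() descending
                  select new { Code = g.Key, Count = g.Count() };

    Console.WriteLine("Status Code summary");
    foreach (var row in summary) {
        Console.WriteLine("  {0} - {1}", row.Code, row.Count);
    }
}
```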
As you can see, the IIS SEO Toolkit crawling APIs allow you to write your own application to start the analysis against your Web site, which can then be integrated with the Windows Task Scheduler, your own scripts, or your build system to allow for continuous integration.
Once the report is saved locally, it can be opened using IIS Manager for further analysis, just like any other report. This sample console application can be scheduled using the Windows Task Scheduler so that it runs every night or at any time. Note that you could also write a few lines of PowerShell to automate it entirely from the command line without writing C# code, but that is left for another post.
Today we are announcing the final release of the IIS Search Engine Optimization (SEO) Toolkit v1.0. This version builds upon the Beta 1 and Beta 2 versions and is 100% compatible with them, so any report you currently have continues to work in the new version. The new version includes a set of bug fixes and new features such as:
This version can upgrade both the Beta 1 and Beta 2 versions, so go ahead and try it, and PLEASE provide us with feedback and any additional things you would like to see in the next version at the SEO Forum on the IIS Web site.
Click here to install the IIS SEO Toolkit.
Yesterday I presented the session "AMS04: Boost Your Site's Search Ranking with the IIS Search Engine Optimization Toolkit" at ASP.NET Connections. It was fun to talk to a few attendees who had several questions about the tool and SEO in general. It is always really interesting to learn about all the unique environments and types of applications being built and how the SEO Toolkit can help them.
Here are the IIS SEO Toolkit slides that I used.
Here you can find the IIS SEO Toolkit download.
And by far the easiest way to get it installed is using the Microsoft Web Platform Installer.
Please send any questions and feedback to the IIS SEO Toolkit Forums.
And by the way, stay tuned for the RTW version of IIS SEO Toolkit coming SOON.
One of my favorite features in the IIS Search Engine Optimization (SEO) Toolkit is what we call Report Comparison. Report Comparison basically allows you to compare two different versions of the results of crawling the same site to see what changed in between. This is a really convenient way to track not only changes in terms of SEO violations but also any attributes of the pages, such as Title, Heading, Description, Links, Violations, etc.
There are a couple of ways to get to this feature.
1) Use the Compare Reports task. While in the Site Analysis reports listing, you can select two reports using Ctrl+Click, and if both reports are compatible (e.g. they use the same Start URL) the "Compare Reports" task will be shown. Clicking it will get you the comparison.
2) Use the Compare to another report menu item. While in the Dashboard view of a Report you can use the "Report->Compare To Another Report" menu item which will show a dialog where you can either select an existing report or even start a new analysis to compare with.
In both cases you will get the Report Comparison Page displaying the results as shown in the next image.
The Report Comparison page includes a couple of "sections" with data. At the very top it includes links showing the Name and the Date when the reports were run. If you click them, the report opens directly, just as if you had used the Site Analysis report listing view.
The next section shows a lot of interesting built-in data such as:
Whenever you click the links you get a query dialog that you can customize just like any query in the Query Builder: you can add/remove columns, add filters, etc.
My favorite one is the "Modified URLs" source, where you can add filters that compare URLs coming from the two different reports.
Note that when you double-click any of the rows (or use "right-click -> Compare Details") you get a side-by-side comparison of everything in the URL:
Again, you can use any of the tabs to see side-by-side things like the content of the pages, the links both versions have, the violations, or pretty much everything that you can see for just one report.
Finally, you can also right click on the Query dialog and choose "Compare Contents". This will launch whatever File Comparison tool you have configured using the "Edit Feature Settings". In this case I have configured WinDiff.exe which shows something like:
As you can see, Report Comparison is a powerful feature that allows you to keep track of changes between two different reports, making it easy to understand over time how your site has been affected by changes. For site managers it makes it possible to query and maintain a history of all the changes. You can imagine an automated build process that runs an IIS SEO Toolkit crawl whenever a build is made, stores the report somewhere, and potentially annotates it with the build number; that way you could even correlate code changes with the Web site crawling results.
Next week I will be presenting at the ASP.NET Connections event in Las Vegas the following topics:
I will also be participating in a session called: "Q&A session with Scott Guthrie and the ASP.NET and VWD teams at DevConnections" on Wednesday.
It should be fun. If you are around stop by the Microsoft Web Platform booth where I will be hanging around the rest of the time trying to answer any questions and getting a chance to learn more about how you use IIS or any problems you might be facing.