Posts
  • CarlosAg Blog

    IIS Admin Pack Technical Preview 2 Released

    • 13 Comments

Today we are releasing Technical Preview 2 of the IIS Admin Pack, an update of the release we made in February.

    Install the Admin Pack and Database Manager today!

    Admin Pack (x86):  http://www.iis.net/downloads/default.aspx?tabid=34&i=1682&g=6

    Database Manager (x86):  http://www.iis.net/downloads/default.aspx?tabid=34&g=6&i=1684

    Admin Pack (x64): http://www.iis.net/downloads/default.aspx?tabid=34&i=1683&g=6

    Database Manager (x64):  http://www.iis.net/downloads/default.aspx?tabid=34&g=6&i=1685

    New Features:

    There are a lot of interesting features we've added to almost every component for this release:

    Database Manager

1. Specify your own Connections. During TP1 we heard that being able to specify your own database connection information, rather than having us read it automatically from the connectionStrings section, was a very important feature; with TP2 you can now do it. We've also added a switch that lets administrators disable this feature in case they have concerns and want to keep enforcing the "only read the connectionStrings from config" behavior, preventing users from adding their own.
2. Extensibility. In this release we are making the APIs public so you can write your own database provider that plugs in your own database and reuses all of our UI and remoting. All you need to do is implement a few functions (CreateTable, DeleteTable, InsertRow, DeleteRow, etc.) and your provider will be ready to manage your database remotely over HTTPS.
    3. Support for MySQL. In the upcoming weeks we will be releasing support for MySQL using the extensibility model mentioned above.
4. Small things. A new toolbar in the Connections panel simplifies discovery of commands.
5. Use of SMO. In this release we are using SQL Server Management Objects for all the schema manipulation, which means that things like scripts exported from SQL Server, including GO statements, will now work in the Query window.
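To give an idea of the shape of such a database provider, here is a minimal sketch in C#. Note that this is illustrative only: the base-class name and method signatures shown (DatabaseProvider, CreateTable, DeleteTable) are assumptions for the sake of the example, not the actual TP2 API.

    // Hypothetical provider skeleton: the real base class and signatures
    // may differ, but the idea is that you implement schema and row
    // operations and the Database Manager UI/remoting does the rest.
    public class MyDatabaseProvider : DatabaseProvider {
        public override void CreateTable(string connectionString, string tableName) {
            // Issue a CREATE TABLE statement against your own database engine.
        }

        public override void DeleteTable(string connectionString, string tableName) {
            // Issue a DROP TABLE statement against your own database engine.
        }
        // ... InsertRow, DeleteRow, etc. follow the same pattern.
    }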

    Configuration Editor

1. Choose where configuration is read: the editor now allows you to specify where you want to read and write your configuration. This feature is great for advanced users who really understand the inheritance of our distributed configuration system and want to take advantage of it. When you go to a site, application or anywhere else, you will by default get the same experience, where we read configuration from the deepest configuration path; however, you can now use the "From:" combo box to tell us where you really want the configuration read from. For example, the following image shows what the options look like for a folder underneath Default Web Site. As you can see, you can choose whether to use locationPaths or go directly to the web.config. This plays really nicely with locking, allowing you to lock items for delegated users but still be able to change things yourself pretty easily. This change also works with script generation, so when you generate your scripts you can customize where configuration is read from and written to.
2. Lots of small things: all the changes you make are now shown in bold until you commit them. We enhanced the locking functionality to better reflect when attributes/elements are locked, and made several minor bug fixes for script generation, collections without keys, etc.

    IIS Reports

1. No longer depends on LogParser. TP1 used LogParser for parsing logs. This version no longer uses it, which means no additional installs. We also measured performance and see an increase of up to 40%, which means faster reports. (On my machine, logs of up to 6.4 GB, or 17 million rows, take about 2 minutes to generate a report; we see about 5-7 seconds for 1 million rows.)
    2. Better Reports. We took a look at the set of reports and we extended the list of reports as well as added new filters for them, for example, the status code report now lets you drill down and see which URL's generated a particular Status code, etc.
    3. Export to CSV and XML.
4. Extensibility: For this release, just as for Database Manager, we've made the APIs of the UI public so that you can extend the list and add your own set of reports by writing a class derived from either TableReportDefinition or ChartReportDefinition. Just by overriding a method and returning a DataTable, we take care of the formatting, adding a chart, and letting your users export to HTML, MHTML, CSV, XML, etc.
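As a sketch of what such a report could look like in C# (the override name and constructor shown here are illustrative assumptions, not taken from the shipped API; only the TableReportDefinition base class and the "return a DataTable" contract come from the description above):

    // Hypothetical custom report: derive from TableReportDefinition and
    // return a DataTable; the toolkit handles formatting, charting and export.
    public class SlowestUrlsReport : TableReportDefinition {
        protected override DataTable GetData() {
            DataTable table = new DataTable("SlowestUrls");
            table.Columns.Add("Url", typeof(string));
            table.Columns.Add("AverageTimeMs", typeof(int));
            // Fill the table from the parsed log data here...
            return table;
        }
    }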

     UI Extensions

    1. Bug fixes for all the features like Request Filtering, FastCGI, ASP.NET Authorization Rules, and ASP.NET Error Pages.

As you can see, extensibility is a big theme now, and in my following posts I will show how to extend both IIS Reports and Database Manager.

  • CarlosAg Blog

    The evil WinForms Splitter

    • 1 Comments
    Beware of SplitPosition.
    Today I spent quite some time debugging an issue in the new product I am working on.
Well, to summarize: what I was seeing in our UI was that, for some reason, certain information I expected to be there when a TreeNode was expanded just wasn’t there. It was completely surprising to me, since in that particular code path we do not start multiple threads, use Application.DoEvents, or anything like that; basically all we do is a simple assignment in the TreeView AfterSelect event, something like:

            private void OnTreeViewAfterSelect(object sender, TreeViewEventArgs e) {
                _myObject = DoSomeProcessing();
            }
     
    However, for some reason in another event handler of our TreeView, _myObject was not set. How can this be?
     
Well, after quite some interesting time with VS 2005 (which rocks!), the problem turned out to be an interesting race condition caused by (believe it or not) a WinForms Splitter. What was happening is that DoSomeProcessing changed some properties, which caused the UI to perform a layout, and inside that code we set the SplitPosition property of the Splitter. Well, surprise-surprise: Splitter calls Application.DoEvents in its property setter!
What DoEvents does is basically let Windows pop the next message from the message pump and process it, so the next event was actually fired before the assignment ran, and _myObject ended up not being set.
     
    To illustrate the problem with a simple sample, try this code:
(Just copy the code, paste it into Notepad, save it as TestApp.cs, and compile it with “csc.exe /target:winexe TestApp.cs”.)
     
    using System;
    using System.Drawing;
    using System.Threading;
    using System.Windows.Forms;

    namespace TestApp {
        public class Form1 : Form {

            private TreeView _treeView;
            private Label _label;
            private Splitter _splitter;
            private Button _someButton;

            [STAThread]
            static void Main() {
                Application.Run(new Form1());
            }

            public Form1() {
                InitializeComponent();

                // Just add some nodes...
                TreeNode node = _treeView.Nodes.Add("Node 1");
                node.Nodes.Add("Node 1.1");
                node.Nodes.Add("Node 1.2");
                _treeView.Nodes.Add("Node 2");
            }

            private void InitializeComponent() {
                _treeView = new TreeView();
                _splitter = new Splitter();
                _label = new Label();
                _someButton = new Button();

                SuspendLayout();

                // treeView1
                _treeView.Dock = DockStyle.Left;
                _treeView.Location = new Point(5, 28);
                _treeView.TabIndex = 1;
                _treeView.AfterSelect += new TreeViewEventHandler(OnTreeViewAfterSelect);
                _treeView.BeforeSelect += new TreeViewCancelEventHandler(OnTreeViewBeforeSelect);

                // splitter
                _splitter.Location = new Point(126, 28);
                _splitter.TabIndex = 1;
                _splitter.TabStop = false;

                // label1
                _label.BackColor = SystemColors.Window;
                _label.BorderStyle = BorderStyle.Fixed3D;
                _label.Dock = DockStyle.Fill;
                _label.Location = new Point(129, 28);
                _label.TabIndex = 2;

                // button1
                _someButton.Dock = DockStyle.Top;
                _someButton.Location = new Point(5, 5);
                _someButton.TabIndex = 0;

                // Form
                ClientSize = new Size(500, 400);
                Controls.Add(_label);
                Controls.Add(_splitter);
                Controls.Add(_treeView);
                Controls.Add(_someButton);
                ResumeLayout(false);
            }

            private void OnTreeViewAfterSelect(object sender, TreeViewEventArgs e) {
                _label.Text = "Node selected:" + e.Node.Text;
            }

            private void OnTreeViewBeforeSelect(object sender, TreeViewCancelEventArgs e) {
                // Just sleep 500ms to simulate some work
                Thread.Sleep(500);

                // Now update the SplitPosition
                _splitter.SplitPosition = 100;

                // simulate 500ms of more work ...
                Thread.Sleep(500);
            }
        }
    }



     
    Run it and select nodes in the TreeView, and notice how ugly everything works.
    Basically, every time you select a different node you get ugly flickering: you can see the selection jump from the newly selected node to the last selected node, and then back to the new one.
     
    Well, luckily in Visual Studio 2005, there is a new class called SplitContainer that simplifies everything.
    It even adds new features, such as letting you set a minimum size for both the left and the right panel, and many more. Best of all, there is no Application.DoEvents in its code, so you can have code that behaves deterministically.
    Bottom line, you do want to use SplitContainer if at all possible.  
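    As a rough sketch (assuming the same _treeView and _label fields as the sample above), replacing the Splitter with a SplitContainer in InitializeComponent boils down to something like this:

        // Sketch: same TreeView/Label layout, but hosted in a SplitContainer
        // instead of TreeView + Splitter + docked Label.
        SplitContainer container = new SplitContainer();
        container.Dock = DockStyle.Fill;
        container.Orientation = Orientation.Vertical; // panels side by side
        container.Panel1MinSize = 100;                // minimum width for the tree
        container.SplitterDistance = 120;

        _treeView.Dock = DockStyle.Fill;
        _label.Dock = DockStyle.Fill;
        container.Panel1.Controls.Add(_treeView);     // tree on the left
        container.Panel2.Controls.Add(_label);        // label fills the right

        Controls.Add(container);

    Setting container.SplitterDistance does not pump messages, so your event handlers run to completion in the order you expect.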
  • CarlosAg Blog

    Razor Migration Notes 1: Moving a SitemapPath Control to ASP.NET Web Pages

    • 2 Comments

After many years I decided that it is time to rewrite my Web site using Razor. A bit of history: I started it around 2003 using ASP.NET 1.1. When .NET 2.0 came around in 2005 I migrated to it, and it was great being able to leverage features like MasterPages, Themes, Sitemaps, and many others. Honestly it is a pretty simple Web site, mostly content, with very few controls: a Sitemap, my own custom Menu control, and a couple more. Last week it was moved to .NET 4.0, and it feels like it's about time to go back and update it a bit, both in look and features. So this (if time permits) will be the first of a series of migration notes that I write as I move it to ASP.NET Razor (aka WebPages). Do note that this is not meant to be a best practice in any way; I would never claim I could make such a thing. These are only my personal notes as I discover more details of the ASP.NET WebPages features and move my own implementation to use them.

So with that, one of the first things I faced during this migration was the use of a Sitemap control (asp:SiteMapPath) in my MasterPage (future post about moving from MasterPages coming). I knew about the Sitemap API, so I decided to write a simple Sitemap helper that I can now use anywhere in Razor. The code is pretty simple: it basically generates an unordered list of links using <ul> and <li> with <a> inside, and uses CSS to lay them out in a way that I liked.

    SitemapPath Control in WebForms

    The original code I was using in my MasterPage looked like the following:

    <asp:SiteMapPath CssClass="HeaderText" runat="server" ID="siteMap" ShowToolTips="true" NodeStyle-ForeColor="White" CurrentNodeStyle-Font-Bold="true" />

    And generated the following markup:

    <span id="siteMap" class="HeaderText"><a href="#siteMap_SkipLink"><img alt="Skip Navigation Links" height="0" width="0" src="http://blogs.msdn.com/WebResource.axd?d=S2jbW9E-HYlS0UQoRCcsm94KUJelFI6yS-CQIkFvzT6fyMF-zCI4oIF9bSrGjIv4IvVLF9liJbz7Om3voRpNZ8yQbW3z1KfqYr4e-0YYpXE1&amp;t=634219272564138624" style='border-width:0px;' /></a><span><a title='Home' href='/' style='color:White;'>Home</a></span><span> &gt; </span><span><a title='Free tools for download' href='/Tools/' style='color:White;'>Tools</a></span><span> &gt; </span><span style='color:White;font-weight:bold;'>Code Translator</span><a id='siteMap_SkipLink'></a></span>

    Which looks like the following in the browser:

    image

    I used some CSS to set the color, and background and other stuff, but still to set the last item to bold required me to use a property in the Sitemap to get it to look the way I wanted.

    My Sitemap Helper in Razor

    Since I was familiar with the Sitemap API and my goal was to change as “little” as possible as part of this first migration, I decided to write a Sitemap helper that I can use in my Layout pages. The code in the Page is as simple as it gets, you just call @Helpers.Sitemap() and that’s it (added the Div below to get some context in the markup, but that was already there with the SitemapPath control anyway):

    <div class="bannerPath">
    @Helpers.Sitemap()
    </div>

This new helper version generates the markup below. I don’t know about you, but I can sure make more sense of what it says, and I imagine search engines will as well. I decided to use more semantically correct markup: a <nav> element to signal the navigational section, containing a list of links.

    <nav>
        <ul class="siteMap">
            <li><a href="http://blogs.msdn.com/" title="Home">Home</a>&nbsp;&gt;&nbsp;</li>
            <li><a href="http://blogs.msdn.com/Tools/" title="Free tools for download">Tools</a>&nbsp;&gt;&nbsp;</li>
            <li><span>Code Translator</span></li>
        </ul>
    </nav>

    And it looks like the following in the browser (I decided to remove the underlining, and have more padding, and a new font, but all of that is CSS):

    image

    The Sitemap helper code

The code for the Sitemap helper was pretty simple: just use the SiteMap API to get the current node. Since I’m picky and wanted to generate the markup in the “right” order (note you could use CSS to float the items to the right instead), I used a Stack to push the nodes while traversing up the hierarchy. Finally, just generate the <li> elements.

    @helper Sitemap()
    {
        SiteMapNode currentNode = SiteMap.CurrentNode;
        <nav>
        <ul class="siteMap">
        @if (currentNode != null)
        {
            // Push into a stack to reverse them
            var node = currentNode;
            var nodes = new Stack<SiteMapNode>();
            while (node.ParentNode != null)
            {
                nodes.Push(node.ParentNode);
                node = node.ParentNode;
            }
           
            while(nodes.Count != 0)
            {
                SiteMapNode n = nodes.Pop();
                <li><a href="@n.Url" title="@n.Description">@n.Title</a>&nbsp;&gt;&nbsp;</li>
            }
            <li><span>@currentNode.Title</span></li>
        }
        else
        {
            <li><span>@Page.Title</span></li>
        }
        </ul>
        </nav>
    }

     

    To make it look the way I wanted I used the following CSS:

    .siteMap { float:right; font-size:11px; color:White; display:inline; margin-top:3px; margin-bottom:3px; margin-left:0px; margin-right:10px; }
    .siteMap li,span { float:left; list-style-type:none; padding-left:5px; border-width:0px; }
    .siteMap span { font-weight:bold; }
    .siteMap a,a.Visited { color:White; text-decoration:none; }

     

    Conclusion

The SitemapPath control gives you a really easy way to put together a navigation control based on the Sitemap APIs (and the Web.sitemap file in my case). Creating a simple ASP.NET Razor helper is actually pretty easy, since all the functionality needed is there in the base APIs, and although it required some code (about 20 lines), I now feel like I have more control over my markup, can style it any way I want using CSS, and render cleaner markup.

I’m sure there are better ways to do this, but as I said, the goal of this first pass is to push my site out soon with as few changes as possible while keeping the same functionality.

  • CarlosAg Blog

    Using the SEO Toolkit to generate a Sitemap of a remote Web Site

    • 3 Comments

The SEO Toolkit includes a set of features (like the Robots Editor and Sitemap Editor) that only work when you have a local copy of your Web site. The reason behind this is that we have to know where to save the files we generate (like Robots.txt and Sitemap XML files) without having to ask for physical paths, as well as to verify that the functionality is applied correctly, such as only allowing Robots.txt in the root of a site, etc. Unfortunately, this means that if you have a remote server and cannot run a local copy, you cannot use those features. (Note that you can still use the Site Analysis tool, since that will crawl your Web site regardless of platform or framework and store the report locally just fine.)

    The Good News

    The good news is that you can technically trick the SEO Toolkit into thinking you have a working copy locally and allow you to generate the Sitemap or Robots.txt file without too much hassle (“too much” being the key).

For this sample, let's assume we want to create a Sitemap for a remote Web site; in this case I will use my own (http://www.carlosag.net/), but you can specify your own. Below are the steps you need to follow to enable those features for any remote Web site (even if it is running on another version of IIS or any other Web server).

    Create a Fake Site

    • Open IIS Manager (Start Menu->InetMgr.exe)
    • Expand the Tree until you can see the “Sites” node.
    • Right-click the “Sites” node and select “Add Web Site…”
    • Specify a Name (in my case I’ll use MySite)
    • Click “Select” to choose the DefaultAppPool from the Application Pool list. This will avoid creating an additional AppPool that will never run.
    • Specify a Physical Path where you will want the Robots and Sitemap files to be saved. I recommend creating just a temporary directory that clearly states this is a fake site. So I will choose c:\FakeSite\ for that.
    • Important. Set the Host name so that it matches your Web Site, for example in my case www.carlosag.net.
    • Uncheck the “Start Web site immediately”, since we do not need this to run.
    • Click OK

This is how my Create site dialog looks:

    image

    Use the Sitemap Editor

Since we now have a site that the SEO Toolkit thinks is local, you should be able to use the features as usual.

    • Select the new site created above in the Tree
    • Double-click the Search Engine Optimization icon in the Home page
    • Click the link “Create a new Sitemap”
    • Specify a name, in my case Sitemap.xml
    • Since this is a remote site, you will see that the physical-location option shows an empty list, so change the “URL structure” option to use “<Run new Site Analysis>…”, or choose an existing analysis if you already have one.
    • If creating a new one, just specify a name and click OK (I will use MySite). At this point the SEO Toolkit starts crawling the remote site to discover links and URLs; when it is done, it presents the virtual namespace structure you can work with.
    • After the crawling is done, you can check any files you want to include in your Sitemap and leverage the server response to define the changed date and all the other features, as if the content were local. Then click OK.

This is the way the dialog looks once it has discovered my remote Web site's URLs:

    image

You will find your Sitemap.xml generated in the physical directory specified when creating the site (in my case c:\FakeSite\Sitemap.xml).

    Use the Robots Editor

    Just as with the Sitemap Editor, once you prepare a fake site for the remote server, you should be able to use the Robots Editor and leverage the same Site analysis output to build your Robots.txt file.

    image

    Summary

    In this blog I show how you can use the Sitemap and Robots Editor included in the SEO Toolkit when working with remote Web sites that might be running in different platforms or different versions of IIS.

  • CarlosAg Blog

    Adding IIS Manager Users and Permissions using PowerShell

    • 2 Comments

Today somebody asked in the IIS.net Forums how they could automate the process of adding IIS Manager users and their permissions using a script or a command line, and I thought it would be useful to post something that hopefully will be easy to find and refer to.

One way they found to do it was by editing configuration directly; however, they were not getting the password encrypted.

The first thing I would like to highlight is that the password is not encrypted; it is actually stored as a hash. This means that just entering the password in clear text will not work; the only way it will work is if you calculate the same hash our current implementation does.

Having said that, manually adding the users is also not a good idea, since the IIS Manager authentication functionality is extensible and its storage can be replaced, for example to store the users in SQL Server or any other backend. Our built-in implementation stores them in Administration.config, but at any given time someone could have a different provider, which means your code would not work either.

So then what is the right way? The right way is to use the existing APIs we surface in Microsoft.Web.Management.dll, in particular Microsoft.Web.Management.Server.ManagementAuthentication and Microsoft.Web.Management.Server.ManagementAuthorization. Using these APIs ensures the right provider is called with the correct arguments, so you do not have to implement or know any details about its implementation.

These types are really easy to consume from managed code, but that does mean you have to write code. The good news is that through PowerShell this gets as simple as it can possibly get.
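For reference, a minimal C# sketch of the same calls looks like this (compile with a reference to Microsoft.Web.Management.dll and run elevated; the user name and password are of course placeholders):

    using Microsoft.Web.Management.Server;

    class AddIisManagerUser {
        static void Main() {
            // Create the IIS Manager user through whatever provider is configured
            ManagementAuthentication.CreateUser("MyUser", "ThePassword");

            // Grant the user (not read-only) access to Default Web Site
            ManagementAuthorization.Grant("MyUser", "Default Web Site", false);
        }
    }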

So just launch PowerShell (make sure it is running elevated as an administrator).

Here is how you add a user and grant them access to Default Web Site:

    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Web.Management") 
    [Microsoft.Web.Management.Server.ManagementAuthentication]::CreateUser("MyUser", "ThePassword")
    [Microsoft.Web.Management.Server.ManagementAuthorization]::Grant("MyUser", "Default Web Site", $FALSE)
  • CarlosAg Blog

    Creating a Setup Project for IIS Extensions using Visual Studio 2008

    • 2 Comments

    Introduction

IIS 7 provides a rich extensibility model. Whether you are extending the server or the user interface, one critical thing is to provide a simple setup application that can install all the required files, add any registration information required, and modify the server settings as needed by the extension.
    Visual Studio 2008 provides a set of project types called Setup and Deployment projects specifically for this kind of applications. The output generated for these projects is an MSI that can perform several actions for you, including copying files, adding files to the GAC, adding registry keys, and many more.
    In this document we will create a setup project to install a hypothetical runtime Server Module that also includes a User Interface extension for IIS Manager.
    Our setup will basically perform the following actions:
    •    Copy the required files, including three DLLs and an HTML page.
    •    Add a couple of registry keys.
    •    Add the managed assemblies to the GAC.
    •    Modify applicationHost.config to register a new module.
    •    Modify administration.config to register a new UI extension for InetMgr.
    •    Create a new sample application that exposes the HTML pages.
    •    Finally, remove the changes from both configuration files during uninstall.

    Creating the Setup Project

    Start Visual Studio 2008. In the File Menu, select the option New Project.
    In the New Project Dialog, expand the Other Project Types option in the Project type tree view.
    Select the option Setup and Deployment type and select the option Setup Project. Enter a name for the Project and a location. I will use SampleSetup as the name.

    image

    Adding Files to the Setup

    • Select the menu View->Editor->File System. This will open the editor where you can add all the files that you need to deploy with your application. In this case I will just add an html file that I have created called readme.htm.
    • To do that, right-click the Application Folder directory in the tree view and select the option Add File. Browse to your files and select all the files you want to copy to the setup folder (by default <Program Files>\<Project Name>).

    Adding assemblies to the GAC

    Adding assemblies to the setup is done in the same File System editor, however it includes a special folder called Global Assembly Cache that represents the GAC in the target system.
    In our sample we will add to the GAC the assemblies that have the runtime server module and the user interface modules for IIS Manager. I have created the following set of projects:

    1. SampleModule.dll that includes the runtime module on it.
    2. SampleModuleUI.dll that contains the server-side portion of the IIS Manager extension (ModuleProvider, ModuleService, etc).
    3. SampleModuleUIClient.dll that contains the client side portion of the IIS Manager extension (Module, ModulePage, TaskLists, etc).


    Back in Visual Studio,

    • Select the menu option View->Editor->File System
    • Right-click the root node in the tree view titled File System on Target Machine, select the option Add Special Folder, and then select Global Assembly Cache Folder.
    • Right-click the newly added GAC folder, choose the option Add File, browse to the DLL, and choose OK. Another option is to use Add Assembly and the "Select Component" dialog to add it.
      Visual Studio will recognize the dependencies the assembly has and try to add them to the project automatically. However, certain assemblies, such as Microsoft.Web.Administration or any other system assemblies, should be excluded because they will already be installed on the target machine.
    • To ensure that you don't ship system assemblies, in the Solution Explorer expand the Detected Dependencies folder, right-click each of the assemblies that shouldn't be packaged, and select the option Exclude. (In our case we will exclude Microsoft.Web.Administration.dll, Microsoft.Web.Management.dll, Microsoft.ManagementConsole.dll and MMCFxCommon.dll.)

    After completing this, the project should look as follows:

    image

    Adding Registry Keys

    Visual Studio also includes a Registry editor that helps you add registry keys on the target machine. For this sample I will just add a registry key at HKEY_LOCAL_MACHINE\Software\My Company\Message. For that:
    Select the menu option View->Editor->Registry.
    Expand the HKEY_LOCAL_MACHINE node and drill down to Software\[Manufacturer].
    [Manufacturer] is a variable that holds the name of the company; it can be set by selecting the SampleSetup node in Solution Explorer and using the Property Grid to change it. There are several other variables defined, such as Author, Description, ProductName, Title and Version, that help whenever dynamic text is required.
    Right-click [Manufacturer] and select the option New String Value. Enter Message as the name. To set the value, select the item in the list view and use the Property Grid.
    After completing this, the project should look as follows:

    clip_image002

    Executing Custom Code

    To support custom code executed while running the setup application, Visual Studio (more precisely, MSI) supports the concept of Custom Actions. These Custom Actions include running an application or a script, or executing code from a managed assembly.
    For our sample, we will create a new project where we will add all the code to read and change configuration.
    Select the option File->Add->New Project.
    Select the Class Library template and name it SetupHelper.

    image

    • Since we will be creating a custom action, we need to add a reference to System.Configuration.Install to be able to create the custom action. Use the Project->Add Reference. And in the .NET Tab select the System.Configuration.Install and press OK.
    • Since we will also be modifying server configuration (for registering the HTTP Module in ApplicationHost.config and the ModuleProvider in administration.config) using Microsoft.Web.Administration we need to add a reference to it as well. Again use the Project->Add Reference, and browse to <windows>\system32\inetsrv and select Microsoft.Web.Administration.dll
    • Rename the file Class1.cs to SetupAction.cs and make the class name SetupAction. This class needs to inherit from System.Configuration.Install.Installer, which is the base class for all custom actions; it has several methods that you can override to add custom logic to the setup process. In this case we will add our code in the Install and Uninstall methods.
    using System;
    using System.ComponentModel;
    using System.Configuration.Install;

    namespace SetupHelper {

        [RunInstaller(true)]
        public class SetupAction : Installer {

            public override void Install(System.Collections.IDictionary stateSaver) {
                base.Install(stateSaver);

                // Add the UI module extension to administration.config
                InstallUtil.AddUIModuleProvider(
                    "SampleUIModule",
                    "SampleUIModule.SampleModuleProvider, SampleUIModule, Version=1.0.0.0, Culture=neutral, PublicKeyToken=12606126ca8290d1"
                );

                // Add a Server Module to applicationHost.config
                InstallUtil.AddModule(
                    "SampleModule",
                    "SampleModule.SampleModule, SampleModule, Version=1.0.0.0, Culture=neutral, PublicKeyToken=12606126ca8290d1"
                );

                // Create a web application
                InstallUtil.CreateApplication(
                    "Default Web Site",
                    "/SampleApp",
                    Context.Parameters["TargetDir"]
                );
            }

            public override void Uninstall(System.Collections.IDictionary savedState) {
                base.Uninstall(savedState);

                InstallUtil.RemoveUIModuleProvider("SampleUIModule");
                InstallUtil.RemoveModule("SampleModule");
                InstallUtil.RemoveApplication("Default Web Site", "/SampleApp");
            }
        }
    }

As you can see, the code above is actually really simple; it just calls helper methods in a utility class called InstallUtil, which is shown at the end of this entry. You will also need to add the InstallUtil class to the project to be able to compile it. The only interesting piece of code above is how we pass the TargetDir from the setup project to the custom action through the Parameters property of the InstallContext type.
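To give a feel for what one of those helpers does, here is a sketch of an AddModule implementation using the Microsoft.Web.Administration API. This is only an illustration of the technique, not necessarily the actual InstallUtil code from the end of this entry:

    using Microsoft.Web.Administration;

    internal static class InstallUtilSketch {
        // Registers a module in the <system.webServer>/<modules> section
        // of applicationHost.config.
        public static void AddModule(string name, string type) {
            using (ServerManager serverManager = new ServerManager()) {
                Configuration config = serverManager.GetApplicationHostConfiguration();
                ConfigurationSection modulesSection = config.GetSection("system.webServer/modules");
                ConfigurationElementCollection modules = modulesSection.GetCollection();

                ConfigurationElement add = modules.CreateElement("add");
                add["name"] = name;
                add["type"] = type;
                modules.Add(add);

                // Persist the change to applicationHost.config
                serverManager.CommitChanges();
            }
        }
    }

The RemoveModule counterpart would find the element with the matching name in the same collection and remove it before committing.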

    Configuring the Custom Action

    To use our new custom action we need to add the SetupHelper output to our setup project. To do that:
    Select the option View->Editor->File System
    Right-click the Application Folder node and select the option Add Project Output... and select the SetupHelper project in the Project drop down.

    image

    After doing this, the DLL will be included as part of our setup.

    Adding the Install Custom Action

    Select the option View->Editor->Custom Actions
    Right-click the Install node and select the option Add Custom Action…, then drill down into the Application Folder and select the Primary output from SetupHelper.

    image

    Click OK and type a name such as InstallModules

    Now, since we want to pass the TargetDir variable to be used as the physical path for the web application that we will create within our Installer-derived class, select the custom action and go to the Property Grid. There is a property called CustomActionData, which is used to pass data to the installer class and uses the format "/<name>=<value>". For our example we will set it to: /TargetDir="[TARGETDIR]\"

    image
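    The CustomActionData string shown above is what populates the Context.Parameters dictionary used inside the Install method. As a rough illustration only of the "/<name>=<value>" convention (CustomActionDataDemo is a hypothetical helper written for this sketch, not the real InstallContext parsing code):

    ```csharp
    using System;
    using System.Collections.Generic;

    static class CustomActionDataDemo {
        // Illustrative parser for the "/<name>=<value>" CustomActionData format.
        // NOT the actual InstallContext implementation, just a sketch of the convention.
        public static Dictionary<string, string> Parse(string data) {
            var result = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
            foreach (string pair in data.Split(new[] { " /" }, StringSplitOptions.RemoveEmptyEntries)) {
                string entry = pair.TrimStart('/');
                int eq = entry.IndexOf('=');
                if (eq > 0) {
                    // Strip the optional surrounding quotes from the value
                    string value = entry.Substring(eq + 1).Trim('"');
                    result[entry.Substring(0, eq)] = value;
                }
            }
            return result;
        }
    }
    ```

    With this sketch, parsing /TargetDir="C:\Temp\" yields a dictionary whose "TargetDir" entry is C:\Temp\, which is the shape of data the custom action reads via Context.Parameters["TargetDir"].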

    Adding the Uninstall Custom Action

    In the same editor, right-click the Uninstall node and select the option Add Custom Action…, again drill down into the Application Folder and select the Primary output from SetupHelper.
    Press OK and type a name such as UninstallModules.
    After doing this the editor should look as follows:

    image

    Building and Testing the Setup

    Finally we can build the solution by using the Build->Rebuild Solution menu option.
    This will create a file called SampleSetup.msi in the folder SampleSetup\SampleSetup\Debug.
    You can now run this MSI and it will walk you through the installation process. The default user interface can be configured to add or remove steps, and you can also provide a banner logo for the windows and set many more options from View->Editor->User Interface.

    image

    Visual Studio provides different packaging mechanisms for the setup application. You can change them through the Project Properties dialog, where you get the option to use:
    1)    Loose uncompressed files. This option packages all the files by copying them unchanged into a file system structure. This is a good packaging option for CD- or DVD-based setups.
    2)    A single setup file. This option packages all the files within the MSI file.
    3)    Cabinet files. This option creates a set of CAB files that can be used in scenarios such as diskette-based setups.

    You can also customize all the setup properties using the property grid, such as DetectNewerInstalledVersion, which warns users if a newer version is already installed, or RemovePreviousVersions, which automatically removes older versions whenever the user installs a newer one.

     

    64-bit considerations

    It turns out that the managed-code custom action will fail on 64-bit platforms because it is executed as a 32-bit custom action. The following blog post talks about the details and shows how you can fix the issue:

    http://blogs.msdn.com/heaths/archive/2006/02/01/64-bit-managed-custom-actions-with-visual-studio.aspx

     

     

    Summary

    Visual Studio 2008 provides a simple option to create setup applications that can run custom code through custom actions. In this document we created a simple custom action that installs modules and InetMgr extensions using this support.

    For the latest information about IIS 7.0, see the IIS 7 Web site at http://www.iis.net

    InstallUtil

    This is the class used by the SetupAction class we created to make the actual changes in configuration. As you can see, it only has six public methods: AddModule, AddUIModuleProvider, CreateApplication, RemoveApplication, RemoveModule, and RemoveUIModuleProvider. The remaining methods are just helpers that facilitate reading configuration.

    using System;
    using Microsoft.Web.Administration;

    namespace SetupHelper {

        public static class InstallUtil {

            /// <summary>
            /// Registers a new Module in the Modules section inside ApplicationHost.config
            /// </summary>
            public static void AddModule(string name, string type) {
                using (ServerManager mgr = new ServerManager()) {
                    Configuration appHostConfig = mgr.GetApplicationHostConfiguration();
                    ConfigurationSection modulesSection = appHostConfig.GetSection("system.webServer/modules");
                    ConfigurationElementCollection modules = modulesSection.GetCollection();

                    if (FindByAttribute(modules, "name", name) == null) {
                        ConfigurationElement module = modules.CreateElement();
                        module.SetAttributeValue("name", name);
                        if (!String.IsNullOrEmpty(type)) {
                            module.SetAttributeValue("type", type);
                        }

                        modules.Add(module);
                    }

                    mgr.CommitChanges();
                }
            }

            public static void AddUIModuleProvider(string name, string type) {
                using (ServerManager mgr = new ServerManager()) {

                    // First register the Module Provider
                    Configuration adminConfig = mgr.GetAdministrationConfiguration();

                    ConfigurationSection moduleProvidersSection = adminConfig.GetSection("moduleProviders");
                    ConfigurationElementCollection moduleProviders = moduleProvidersSection.GetCollection();
                    if (FindByAttribute(moduleProviders, "name", name) == null) {
                        ConfigurationElement moduleProvider = moduleProviders.CreateElement();
                        moduleProvider.SetAttributeValue("name", name);
                        moduleProvider.SetAttributeValue("type", type);
                        moduleProviders.Add(moduleProvider);
                    }

                    // Now register it so that all Sites have access to this module
                    ConfigurationSection modulesSection = adminConfig.GetSection("modules");
                    ConfigurationElementCollection modules = modulesSection.GetCollection();
                    if (FindByAttribute(modules, "name", name) == null) {
                        ConfigurationElement module = modules.CreateElement();
                        module.SetAttributeValue("name", name);
                        modules.Add(module);
                    }

                    mgr.CommitChanges();
                }
            }

            /// <summary>
            /// Create a new Web Application
            /// </summary>
            public static void CreateApplication(string siteName, string virtualPath, string physicalPath) {
                using (ServerManager mgr = new ServerManager()) {
                    Site site = mgr.Sites[siteName];
                    if (site != null) {
                        site.Applications.Add(virtualPath, physicalPath);
                    }
                    mgr.CommitChanges();
                }
            }

            /// <summary>
            /// Helper method to find an element based on an attribute
            /// </summary>
            private static ConfigurationElement FindByAttribute(ConfigurationElementCollection collection, string attributeName, string value) {
                foreach (ConfigurationElement element in collection) {
                    if (String.Equals((string)element.GetAttribute(attributeName).Value, value, StringComparison.OrdinalIgnoreCase)) {
                        return element;
                    }
                }

                return null;
            }

            public static void RemoveApplication(string siteName, string virtualPath) {
                using (ServerManager mgr = new ServerManager()) {
                    Site site = mgr.Sites[siteName];
                    if (site != null) {
                        Application app = site.Applications[virtualPath];
                        if (app != null) {
                            site.Applications.Remove(app);
                            mgr.CommitChanges();
                        }
                    }
                }
            }

            /// <summary>
            /// Removes the specified module from the Modules section by name
            /// </summary>
            public static void RemoveModule(string name) {
                using (ServerManager mgr = new ServerManager()) {
                    Configuration appHostConfig = mgr.GetApplicationHostConfiguration();
                    ConfigurationSection modulesSection = appHostConfig.GetSection("system.webServer/modules");
                    ConfigurationElementCollection modules = modulesSection.GetCollection();
                    ConfigurationElement module = FindByAttribute(modules, "name", name);
                    if (module != null) {
                        modules.Remove(module);
                    }

                    mgr.CommitChanges();
                }
            }

            /// <summary>
            /// Removes the specified UI Module by name
            /// </summary>
            public static void RemoveUIModuleProvider(string name) {
                using (ServerManager mgr = new ServerManager()) {
                    // First remove it from the sites
                    Configuration adminConfig = mgr.GetAdministrationConfiguration();
                    ConfigurationSection modulesSection = adminConfig.GetSection("modules");
                    ConfigurationElementCollection modules = modulesSection.GetCollection();
                    ConfigurationElement module = FindByAttribute(modules, "name", name);
                    if (module != null) {
                        modules.Remove(module);
                    }

                    // Now remove the ModuleProvider
                    ConfigurationSection moduleProvidersSection = adminConfig.GetSection("moduleProviders");
                    ConfigurationElementCollection moduleProviders = moduleProvidersSection.GetCollection();
                    ConfigurationElement moduleProvider = FindByAttribute(moduleProviders, "name", name);
                    if (moduleProvider != null) {
                        moduleProviders.Remove(moduleProvider);
                    }

                    mgr.CommitChanges();
                }
            }
        }
    }
  • CarlosAg Blog

    Using IIS Manager Users Authentication in your Web Application

    • 2 Comments

    Today in the IIS.NET Forums someone asked whether it is possible to use IIS Manager Users authentication in the context of a Web Application, so that something like WebDAV could use the same credentials you use for IIS Manager Remote Administration.

    IIS Manager Remote Administration allows you to connect and manage your Web Site using credentials that are not Windows users, but simply a combination of user name and password. This is implemented through a provider model, where the default implementation we ship uses the Administration.config file (%windir%\system32\inetsrv\config\administration.config) as the storage for these users. However, you can easily derive from a base class to authenticate against a database or any other user store if needed. This also means you can build your own application and call our APIs (ManagementAuthentication).

    Even better in the context of a Web Site running in IIS 7.0 you can actually implement this without having to write a single line of code.

    Disclaimer: Out of the box, Administration.config only grants read permissions to administrators. This means a Web Application will not be able to access the file, so you need to change the ACLs on the file to give your application read permissions; make sure you limit the read access to the minimum required, as shown below.

    Here is how you do it:

    1. First, make sure your Web Site is using SSL. (In IIS Manager, right-click your Web Site, select Edit Bindings, and add an SSL binding.)
    2. So that we can restrict permissions further, make your application run in its own Application Pool; this way the ACL changes only affect your application pool and nothing else. Using IIS Manager, go to Application Pools, add a new application pool running in Integrated mode, and give it a name you can easily remember, say WebMgmtAppPool (we will use this in the permissions below).
    3. Disable Anonymous Authentication in your application. (Using IIS Manager, drill down to your application, double-click the Authentication feature, and disable Anonymous Authentication and any other enabled authentication module.)
    4. Enable the Web Management Basic Authentication module in your application by adding a Web.config file with the following contents:
      <configuration>
        <system.webServer>
          <modules>
            <add name='WebManagementBasicAuthentication'
                 type='Microsoft.Web.Management.Server.WebManagementBasicAuthenticationModule, Microsoft.Web.Management, Version=7.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' />
          </modules>
        </system.webServer>
      </configuration>
    5. Modify the ACL's in the required configuration files:
      1. Give read access to the config directory so we can access the files using the following command line (note that we are only giving permissions to the Application Pool)
        icacls %windir%\system32\inetsrv\config /grant "IIS AppPool\WebMgmtAppPool":(R)
      2. Give read access to the redirection.config:
        icacls %windir%\system32\inetsrv\config\redirection.config /grant "IIS AppPool\WebMgmtAppPool":(R)
      3. Finally give read access to administration.config:
        icacls %windir%\system32\inetsrv\config\administration.config /grant "IIS AppPool\WebMgmtAppPool":(R)
    6. At this point you should be able to navigate to your application using any browser and you should get a prompt for credentials that will be authenticated against the IIS Manager Users.

    What is also nice is that you can use URL Authorization to further restrict permissions on your pages for these users. For example, if I didn't want a particular IIS Manager user (say MyIisManagerUser) to access the Web Site, I could just configure this in the same web.config:

    <configuration>
      <system.webServer>
        <security>
          <authorization>
            <add accessType="Deny" users="MyIisManagerUser" />
          </authorization>
        </security>
      </system.webServer>
    </configuration>

    If you want to learn more about remote administration and how to configure it you can read: http://learn.iis.net/page.aspx/159/configuring-remote-administration-and-feature-delegation-in-iis-7/

  • CarlosAg Blog

    Announcing: IIS SEO Toolkit v1.0 release

    • 3 Comments

    Today we are announcing the final release of the IIS Search Engine Optimization (SEO) Toolkit v1.0. This version builds upon the Beta 1 and Beta 2 versions and is 100% compatible with them, so any report you currently have continues to work in the new version. The new version includes a set of bug fixes and new features such as:

    1. Extensibility. In this version we are opening a new set of APIs that allow you to develop extensions for the crawling process, including the ability to augment the metadata in the report with your own, extend the set of tasks provided in the Site Analysis and Sitemaps user interface, and more. More on this in an upcoming post.
    2. New Reports. Based on feedback we added a Redirects summary report in the Links section, as well as a new Link Depth report that lets you easily find the "most hidden" pages in your site; in other words, if a user landed at your site's home page, how many clicks would it take to reach a particular page?
    3. New Routes Query. We added a new type of query called Routes. This is the underlying data that powers the Link Depth report mentioned above; it is also exposed as a new query type so that you can create your own queries, customizing the start page, filtering, grouping, and so on.
    4. New option to opt out of keeping a local cache of files. We added a switch in the "Advanced Settings" of the New Analysis dialog to disable keeping the files stored locally. A report run this way is faster and consumes a lot less disk space than one that caches the files. The only side effect is that you lose the "Content" tab, the contextual position of links, and the Word Analysis feature; everything else continues to work just as in any other report.
    5. HTML metadata is now stored in the report. Leveraging the extensibility mentioned in item 1, the HTML parser now stores the content of all HTML META tags so that you can later use them in your own queries, whether to filter, group, or export data. This gives you very interesting options if you have metadata such as Author or anything custom.
    6. Several Bug Fixes:
      1. Internal URLs linked by External URLs now are also included in the crawling process.
      2. Groupings in queries should be case sensitive
      3. Show contextual information (link position) in Routes
      4. The Duplicate detection logic should only include valid responses (do not include 404 NOT Found, 401, etc)
      5. Canonical URLs should support sub-domains.
      6. Several Accessibility fixes. (High DPI, Truncation in small resolutions, Hotkeys, Keyboard navigation, etc).
      7. Several fixes for Right-To-Left languages. (Layout and UI)
      8. Help shortcuts enabled.
      9. New Context Menus for Copying content
      10. Add link position information for Canonical URLs
      11. Remove x-javascript validation for this release
      12. Robots algorithm should be case sensitive
      13. Many more.
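    The Link Depth report described in item 2 is essentially a shortest-click-path computation. As a rough illustration only (this is not the toolkit's actual implementation, and ComputeDepths is a hypothetical name), the idea can be sketched as a breadth-first search over the site's link graph:

    ```csharp
    using System;
    using System.Collections.Generic;

    static class LinkDepthDemo {
        // Computes the minimum number of clicks from the home page to each page,
        // as a breadth-first search over a link graph. Illustrative sketch only.
        public static Dictionary<string, int> ComputeDepths(
            string home, Dictionary<string, string[]> links) {
            var depths = new Dictionary<string, int> { { home, 0 } };
            var queue = new Queue<string>();
            queue.Enqueue(home);
            while (queue.Count > 0) {
                string page = queue.Dequeue();
                string[] targets;
                if (!links.TryGetValue(page, out targets)) continue;
                foreach (string target in targets) {
                    // First time we reach a page is, by BFS order, the shortest route
                    if (!depths.ContainsKey(target)) {
                        depths[target] = depths[page] + 1;
                        queue.Enqueue(target);
                    }
                }
            }
            return depths;
        }
    }
    ```

    For example, if the home page links to /about and /about links to /history, then /history is two clicks deep, which is the kind of number the Link Depth report surfaces.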

    This version can upgrade both the Beta 1 and Beta 2 versions, so go ahead and try it, and PLEASE provide us with feedback and any additional things you would like to see in the next version at the SEO Forum on the IIS Web site.

    Click here to install the IIS SEO Toolkit.

  • CarlosAg Blog

    Application Request Routing and the IIS 7.0 Web Management Service

    • 1 Comments

    Yesterday I was having a conversation with Anil Ruia, who happens to be the developer of ARR (Application Request Routing), and based on customer feedback we discussed the idea of using ARR in the context of IIS Remote Management. It answers a question several people have asked me before, so I thought it would be fun to try it out.

    Basically, the question I got asked was "Can I have a single entry point exposed for Remote Management?", or in other words, "Can I provide users with remote administration through a single server name like management.myhostingcompany.com, instead of having to give them the specific machine name where their site lives?". So far the answer was "not easily"; however, with ARR and URL Rewrite we will see how easy it is to achieve this.

    The only thing you need for this to work is to install the new URL Rewrite and ARR modules, both available here: http://blogs.iis.net/bills/archive/2008/07/09/new-iis7-releases-url-rewrite-application-routing-and-load-balancing-and-powershell-cmd-lets.aspx.

    Background

    The Web Management Service (WMSvc) is the service that enables remote administration for IIS 7.0 Manager, providing an HTTPS end-point that exposes Web Service-like functionality to manage the Web Server (IIS) remotely. This service uses HTTPS for its communication and exposes several configuration options: giving access to non-Windows users (what we call IIS Manager Users), providing a list of IP restrictions, supporting only local connections, and many more that can be managed using the Management Service feature inside IIS Manager.

    To enable remote administration you typically need to: 1) configure a valid certificate for SSL, 2) allow remote connections, and 3) start the WMSvc service, all of which can be done in IIS Manager. Once you have successfully enabled the remote service, you should be able to connect remotely from a different machine.

    Note: If you are using Windows Vista, Windows XP, or Windows 2003 to connect to a Windows Server 2008 you need to download and install the client to do this: http://www.iis.net/downloads/default.aspx?tabid=34&g=6&i=1626

    However, one of the drawbacks is that in order to connect to a Web Site, the end user needs to know the machine name as well as the name of the Web Site they will be connecting to, and sometimes it would be better to resolve these dynamically. The following image shows the information required when connecting to a Web Site. Note that if connecting to an application you also need to enter the name of the application.

    ConnectingToSite

    This can reduce the flexibility of your deployment options: your customers now have specific knowledge of the physical machine, which limits your ability to move the site to a different machine or even change the name of the site where it is hosted.

    ARR and URL Rewrite to the rescue.

    ARR has several very interesting capabilities that are really useful for this scenario. First, we can configure it to act as a proxy and forward requests to another server where they actually get processed. This is the simplest configuration option, and it allows you to have something similar to the next image:

    WMSvcRouting

    To set up this configuration where a front-end management server forwards the IIS Remote Management requests to another server running WMSVC you have to:

    1. Install ARR and URL Rewrite on the server that is intended to be used as the front end for management requests. Let's call this ServerA.
    2. Create a new Web Site.
      1. Navigate to IIS Manager->Site
      2. Click Add Web Site.
      3. In the dialog set: Site name: ManagementSite, Binding: https, Port: 8172, choose a valid SSL certificate, and specify a physical path. Click OK.
    3. Configure URL Rewrite to Route requests to the IIS Management Service running in the other computer.
      1. Navigate to IIS Manager->Sites->Management Site->URL Rewrite Module
      2. Click Add Rule
      3. Set: Name: RouteWMSvc, Pattern:.*, Rewrite URL:https://<RemoteServer>:8172/{R:0}, Stop Processing rules: Checked.
      4. This should generate a web.config with content similar to the following (note that my backend, i.e. the RemoteServer, is carlosag2-iis below):

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="RouteWMSvc" stopProcessing="true">
                  <match url=".*" />
                  <action type="Rewrite" url="https://carlosag2-iis:8172/{R:0}" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>

    4. Now you can run IIS Manager on any client machine, specify ServerA as the machine name, and specify any web site on the RemoteServer; all requests will be forwarded to the WMSvc running on the remote server.

    Now, that is interesting. The scenario it enables is to have WMSvc IP request filtering on the RemoteServer allow calls only from the management server, where you can do further configuration. It also means you can have a single public SSL certificate on the management server and use privately issued certificates (or potentially even self-signed certificates) on the RemoteServer, since you control installing the certificate into the management server. Finally, customers no longer use the physical name of the RemoteServer machine but instead connect to the management server, allowing you to move them to another machine without having to update your clients.

    Troubleshooting: If you are having trouble testing this, the best thing to do is enable Failed Request Tracing on the ManagementSite, which will tell you exactly what is going on. For example, you will get entries like:

    Warning: ModuleName="ApplicationRequestRouting", Notification="EXECUTE_REQUEST_HANDLER", HttpStatus="502", HttpReason="Bad Gateway", HttpSubStatus="3", ErrorCode="2147954575", ConfigExceptionInfo=""

    If you look up the ErrorCode, it is actually ERROR_WINHTTP_SECURE_FAILURE, which most likely means you have issues with the certificate. In my case, just to test this, I generated a self-signed certificate on the RemoteServer with the name of the machine (carlosag2-iis) and then installed that certificate, using the MMC Certificates snap-in on the management server, into the Trusted Root Certification Authorities store. Disclaimer/Warning: this is something you should only do for testing purposes, or if you know what you are doing.

    More Advanced Stuff... Dynamically choosing the machine

    Now, trying to push the capabilities of this further, I decided to address another closely related request we've heard: "Can I have a single management server and dynamically route the requests to the machine where a particular site lives?"

    The following picture represents this, where the Management Server dynamically resolves the server that it should talk to using the URL Rewrite Maps functionality.

    WMSvcRoutingMultiple

    It turns out this is really simple using URL Rewrite: you can write a rewrite rule that matches the site name included in the query string and use the Rewrite Maps support to figure out the machine where that site lives. The following shows such a rule:

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
      <system.webServer>
        <rewrite>
          <rules>
            <rule name="RouteWMSvc" stopProcessing="true">
              <match url=".*" />
              <conditions>
                <add input="{QUERY_STRING}" pattern="Site=([^&amp;]+)" />
              </conditions>
              <action type="Rewrite" url="https://{ServersTable:{C:1}}:8172/{R:0}" appendQueryString="true" />
            </rule>
          </rules>
          <rewriteMaps>
            <rewriteMap name="ServersTable">
              <add key="CarlosAgWebSite" value="carlosag2-iis" />
              <add key="SomeOtherUserSite" value="carlosag1-iis" />
              <add key="SomeOtherUserSite2" value="carlosag3-iis" />
            </rewriteMap>
          </rewriteMaps>
        </rewrite>
      </system.webServer>
    </configuration>

    Basically, URL Rewrite matches every request and uses the condition entry to parse the query string and find the site name within it. Using the map ServersTable to resolve the machine name from the site name, it then rewrites the request to the machine where the site is currently located. This routes "https://localhost:8172/Service.axd?...&Site=CarlosAgWebSite" to "https://carlosag2-iis:8172/Service.axd?...&Site=CarlosAgWebSite". The end result is that you can update this table dynamically at any time and have ARR route the requests to the right machine, giving you complete flexibility in the deployment of sites.
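    To make the routing logic concrete, here is a small self-contained C# sketch that mimics what the rule does. This is illustrative only (it is not how URL Rewrite itself is implemented, and WMSvcRouterDemo is a hypothetical name); the table simply mirrors the ServersTable entries above:

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.Text.RegularExpressions;

    static class WMSvcRouterDemo {
        // Mirrors the <rewriteMap name="ServersTable"> entries from the rule above.
        static readonly Dictionary<string, string> ServersTable =
            new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase) {
                { "CarlosAgWebSite", "carlosag2-iis" },
                { "SomeOtherUserSite", "carlosag1-iis" },
                { "SomeOtherUserSite2", "carlosag3-iis" },
            };

        // Extracts the site name from the query string with the same pattern the
        // rule's condition uses, resolves the backend through the table, and
        // rebuilds the URL pointing at that server's WMSvc port.
        public static string Rewrite(string pathAndQuery) {
            Match m = Regex.Match(pathAndQuery, "Site=([^&]+)");
            string server;
            if (m.Success && ServersTable.TryGetValue(m.Groups[1].Value, out server)) {
                return "https://" + server + ":8172" + pathAndQuery;
            }
            return pathAndQuery; // unknown site: leave the request untouched
        }
    }
    ```

    With this sketch, a request for /Service.axd?x=1&Site=CarlosAgWebSite resolves to the carlosag2-iis backend, just as the rewrite rule does.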

    One thing to note is that URL Rewrite is just one way to make the ARR scenario work. You could also write your own module with any dynamic behavior, such as going to a database or a provisioning system, and rewrite the URL programmatically in a way that ARR will understand and route automatically.

    It is also worth mentioning that ARR has many more features than just this, such as load balancing of requests and other interesting capabilities that I will try to get back to in a future post.

    With all this you can imagine several benefits: a single public end-point for the remote management of multiple servers; only one valid certificate needed on public-facing machines; the ability to relocate sites at will, since customers never really know the real machine name where their site lives; and the option to use a similar technique to rewrite even the site name and give them a friendly name, such as their user name.

    Acknowledgements: I want to thank Anil Ruia and Daniel Vasquez Lopez, who helped figure out a few issues during this post, and Ruslan Yakushev and Won Yoo for reviewing its technical accuracy.

  • CarlosAg Blog

    IIS SEO Toolkit - Start new analysis automatically through code

    • 9 Comments

    One question that I've been asked several times is: "Is it possible to schedule the IIS SEO Toolkit to run automatically every night?". Other related questions are: "Can I automate the SEO Toolkit so that as part of my build process I'm able to catch regressions on my application?", or "Can I run it automatically after every check-in to my source control system to ensure no links are broken?", etc.

    The good news is that the answer is YES! The bad news is that you have to write a bit of code to make it work. Basically, the SEO Toolkit includes a managed-code API to start the analysis just like the user interface does, and you can call it from any application you want using managed code.

    In this blog post I will show you how to write a simple console application that starts a new analysis against the site provided as a command-line argument and processes a few queries after finishing.

    IIS SEO Crawling APIs

    The most important type included is a class called WebCrawler. This class takes care of the entire process of driving the analysis. The following image shows this class and some of the related classes you will need to use.

    image

    The WebCrawler class is initialized with the configuration specified in the CrawlerSettings. It also contains two methods, Start() and Stop(), which start and stop the crawling process in a set of background threads. Through the Report property you can gain access to the CrawlerReport, which represents the results (whether completed or in progress) of the crawling process. It has a method called GetUrls() that returns all the UrlInfo items. UrlInfo is the most important class: it represents a URL that has been downloaded and processed, and it carries all the metadata such as Title, Description, ContentLength, ContentType, and the set of Violations and Links it includes.

    Developing the Sample

    1. Start Visual Studio.
    2. Select the option "File->New Project".
    3. In the "New Project" dialog select the template "Console Application", enter the name "SEORunner" and press OK.
    4. Using the menu "Project->Add Reference" add a reference to the IIS SEO Toolkit Client assembly "c:\Program Files\Reference Assemblies\Microsoft\IIS\Microsoft.Web.Management.SEO.Client.dll".
    5. Replace the code in the file Program.cs with the code shown below.
    6. Build the solution.
    using System;
    using System.IO;
    using System.Linq;
    using System.Net;
    using System.Threading;
    using Microsoft.Web.Management.SEO.Crawler;

    namespace SEORunner {

        class Program {

            static void Main(string[] args) {
                if (args.Length != 1) {
                    Console.WriteLine("Please specify the URL.");
                    return;
                }

                // Create a URI class
                Uri startUrl = new Uri(args[0]);

                // Run the analysis
                CrawlerReport report = RunAnalysis(startUrl);

                // Run a few queries...
                LogSummary(report);
                LogStatusCodeSummary(report);
                LogBrokenLinks(report);
            }

            private static CrawlerReport RunAnalysis(Uri startUrl) {
                CrawlerSettings settings = new CrawlerSettings(startUrl);
                settings.ExternalLinkCriteria = ExternalLinkCriteria.SameFolderAndDeeper;

                // Generate a unique name
                settings.Name = startUrl.Host + " " + DateTime.Now.ToString("yy-MM-dd hh-mm-ss");

                // Use the same directory as the default used by the UI
                string path = Path.Combine(
                    Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments),
                    "IIS SEO Reports");

                settings.DirectoryCache = Path.Combine(path, settings.Name);

                // Create a new crawler and start running
                WebCrawler crawler = new WebCrawler(settings);
                crawler.Start();

                Console.WriteLine("Processed - Remaining - Download Size");
                while (crawler.IsRunning) {
                    Thread.Sleep(1000);
                    Console.WriteLine("{0,9:N0} - {1,9:N0} - {2,9:N2} MB",
                        crawler.Report.GetUrlCount(),
                        crawler.RemainingUrls,
                        crawler.BytesDownloaded / 1048576.0f);
                }

                // Save the report
                crawler.Report.Save(path);

                Console.WriteLine("Crawling complete!!!");

                return crawler.Report;
            }

            private static void LogSummary(CrawlerReport report) {
                Console.WriteLine();
                Console.WriteLine("----------------------------");
                Console.WriteLine(" Overview");
                Console.WriteLine("----------------------------");
                Console.WriteLine("Start URL:  {0}", report.Settings.StartUrl);
                Console.WriteLine("Start Time: {0}", report.Settings.StartTime);
                Console.WriteLine("End Time:   {0}", report.Settings.EndTime);
                Console.WriteLine("URLs:       {0}", report.GetUrlCount());
                Console.WriteLine("Links:      {0}", report.Settings.LinkCount);
                Console.WriteLine("Violations: {0}", report.Settings.ViolationCount);
            }

            private static void LogBrokenLinks(CrawlerReport report) {
                Console.WriteLine();
                Console.WriteLine("----------------------------");
                Console.WriteLine(" Broken links");
                Console.WriteLine("----------------------------");
                foreach (var item in from url in report.GetUrls()
                                     where url.StatusCode == HttpStatusCode.NotFound &&
                                           !url.IsExternal
                                     orderby url.Url.AbsoluteUri ascending
                                     select url) {
                    Console.WriteLine(item.Url.AbsoluteUri);
                }
            }

            private static void LogStatusCodeSummary(CrawlerReport report) {
                Console.WriteLine();
                Console.WriteLine("----------------------------");
                Console.WriteLine(" Status Code summary");
                Console.WriteLine("----------------------------");
                foreach (var item in from url in report.GetUrls()
                                     group url by url.StatusCode into g
                                     orderby g.Key
                                     select g) {
                    Console.WriteLine("{0,20} - {1,5:N0}", item.Key, item.Count());
                }
            }
        }
    }

     

    If you are not using Visual Studio, you can simply save the code above in a file named SEORunner.cs and compile it from the command line:

    C:\Windows\Microsoft.NET\Framework\v3.5\csc.exe /r:"c:\Program Files\Reference Assemblies\Microsoft\IIS\Microsoft.Web.Management.SEO.Client.dll" /optimize+ SEORunner.cs

     

    After that you should be able to run SEORunner.exe, passing the URL of your site as an argument; you will see output like:

    Processed - Remaining - Download Size
           56 -       149 -      0.93 MB
          127 -       160 -      2.26 MB
          185 -       108 -      3.24 MB
          228 -        72 -      4.16 MB
          254 -        48 -      4.98 MB
          277 -        36 -      5.36 MB
          295 -        52 -      6.57 MB
          323 -        25 -      7.53 MB
          340 -         9 -      8.05 MB
          358 -         1 -      8.62 MB
          362 -         0 -      8.81 MB
    Crawling complete!!!
    
    ----------------------------
     Overview
    ----------------------------
    Start URL:  http://www.carlosag.net/
    Start Time: 11/16/2009 12:16:04 AM
    End Time:   11/16/2009 12:16:15 AM
    URLs:       362
    Links:      3463
    Violations: 838
    
    ----------------------------
     Status Code summary
    ----------------------------
                      OK -   319
        MovedPermanently -    17
                   Found -    23
                NotFound -     2
     InternalServerError -     1
    
    ----------------------------
     Broken links
    ----------------------------
    http://www.carlosag.net/downloads/ExcelSamples.zip

     

    The most interesting method above is RunAnalysis. It creates a new instance of CrawlerSettings and specifies the start URL. Note that it also specifies that all pages hosted in the same directory or its subdirectories should be considered internal. We also set a unique name for the report and use the same directory the IIS SEO UI uses, so that opening IIS Manager will show the reports just as if the UI had generated them. We then call Start(), which launches the number of worker threads specified in the WebCrawler::WorkerCount property, and wait for the WebCrawler to finish by polling the IsRunning property.

    The remaining methods leverage LINQ to run a few queries over the results, such as a report aggregating all the processed URLs by status code.
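    You can extend the sample with queries of your own in the same style. As a hypothetical example (not part of the original sample), the following method would list the ten internal pages with the most violations, assuming the Violations collection on UrlInfo exposes a Count property:

```csharp
// Hypothetical extra query for the SEORunner sample: top offenders by violation count.
private static void LogTopViolations(CrawlerReport report) {
    Console.WriteLine();
    Console.WriteLine("----------------------------");
    Console.WriteLine(" Pages with most violations");
    Console.WriteLine("----------------------------");
    foreach (var item in (from url in report.GetUrls()
                          where !url.IsExternal
                          orderby url.Violations.Count descending
                          select url).Take(10)) {
        Console.WriteLine("{0,5:N0} - {1}", item.Violations.Count, item.Url.AbsoluteUri);
    }
}
```

    Call it from Main alongside the other Log* methods; it uses only System.Linq and the Microsoft.Web.Management.SEO.Crawler namespace already imported by the sample.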

    Summary

    As you can see, the IIS SEO Toolkit crawling APIs make it easy to write your own application to run an analysis against your Web site, and that application can be integrated with the Windows Task Scheduler, your own scripts, or your build system to enable continuous integration.

    Once the report is saved locally, it can be opened in IIS Manager for further analysis just like any other report. This sample console application can be scheduled using the Windows Task Scheduler so that it runs every night or at whatever time you choose. Note that you could also automate it with a few lines of PowerShell instead of writing C# code, entirely from the command line, but that is left for another post.
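    For example, a nightly run can be registered with the built-in schtasks.exe command. The task name, install path, and URL below are placeholders; substitute your own:

```shell
schtasks /Create /TN "Nightly SEO Analysis" /SC DAILY /ST 02:00 /TR "C:\Tools\SEORunner.exe http://www.example.com/"
```

    The report lands in the "IIS SEO Reports" folder under My Documents for the account the task runs as, so make sure you open IIS Manager as that same user to see it.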
