• CarlosAg Blog

    IIS SEO Tip - Do not stress your server, limit the number of concurrent requests


The other day somebody asked me if there was a way to limit the amount of work that Site Analysis in the IIS SEO Toolkit causes on the server. This is interesting for a couple of reasons:

• You might want to reduce the load that Site Analysis causes on your server at any given time
• You might have a denial-of-service detection system, such as our Dynamic IP Restrictions IIS module, that will start failing requests based on the number of requests received in a certain amount of time
• Or, if you, like me, have to go through a proxy, it might have a configured limit on the number of requests per minute you are allowed to issue

In Beta 1 we do not support the Crawl-delay directive of the Robots Exclusion Protocol; in future versions we will look at adding support for it. The good news is that Beta 1 does have a configurable setting that can help you achieve these goals, called Maximum Number of Concurrent Requests.

    To set it:

    1. Go to the Site Analysis Reports page
2. Select the option "Edit Feature Settings..." as shown in the next image
3. In the "Edit Feature Settings" dialog you will see the Maximum Number of Concurrent Requests option, which you can set to any value from 1 to 16. The default value is 8, which means that at any given time we will issue at most 8 concurrent requests to the server.
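To get a feel for what this setting controls, here is a minimal Python sketch (not the Toolkit's actual implementation) of capping crawl concurrency with a semaphore; `fetch` is a stand-in for a real HTTP request:

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

MAX_CONCURRENT = 8  # mirrors the Toolkit's default setting

_semaphore = threading.Semaphore(MAX_CONCURRENT)
_lock = threading.Lock()
_active = 0
_peak = 0

def fetch(url):
    """Stand-in for an HTTP request; tracks how many run at once."""
    global _active, _peak
    with _semaphore:  # at most MAX_CONCURRENT callers get past this point
        with _lock:
            _active += 1
            _peak = max(_peak, _active)
        time.sleep(0.005)  # pretend to download the page
        with _lock:
            _active -= 1

def crawl(urls):
    """Crawl with more worker threads than the cap to show the throttle."""
    with ThreadPoolExecutor(max_workers=16) as pool:
        list(pool.map(fetch, urls))
    return _peak
```

No matter how many worker threads the pool spins up, the semaphore guarantees the server never sees more than `MAX_CONCURRENT` requests in flight.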

    Redirects, 301, 302 and IIS SEO Toolkit


In the URL Rewrite forum somebody posted the question "are redirects bad for search engine optimization?". The answer is: not necessarily. Redirects are an important tool for Web sites, and used in the right context they are actually a required tool. But first, a bit of background.

    What is a Redirect?

A redirect, in simple terms, is a way for the server to indicate to a client (typically a browser) that a resource has moved, which it does using an HTTP status code and an HTTP Location header. There are different types of redirects, but the most commonly used are:

    • 301 - Moved Permanently. This type of redirect signals that the resource has permanently moved and that any further attempts to access it should be directed to the location specified in the header
    • 302 - Redirect or Found. This type of redirect signals that the resource is temporarily located in a different location, but any further attempts to access the resource should still go to the same original location.

Below is an example of a 302 response sent from the server when requesting a page that has been temporarily moved:

    HTTP/1.1 302 Found
    Connection: Keep-Alive
    Content-Length: 161
    Content-Type: text/html; charset=utf-8
    Date: Wed, 10 Jun 2009 17:04:09 GMT
    Location: /sqlserver/2008/en/us/default.aspx
    Server: Microsoft-IIS/7.0
    X-Powered-By: ASP.NET
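To make the distinction concrete, here is a small hypothetical helper (written for illustration, not part of any IIS API) showing how a crawler might interpret a redirect response from just its status code and Location header:

```python
def classify_redirect(status_code, location):
    """Decide how a crawler should treat a redirect response."""
    if status_code in (301, 302) and location is None:
        # A redirect without a Location header breaks the model entirely.
        return "broken"
    if status_code == 301:
        # Moved Permanently: index the new location and credit it with links.
        return "permanent"
    if status_code == 302:
        # Found: keep the original location indexed; the move may be undone.
        return "temporary"
    return "not-a-redirect"
```

The "broken" case corresponds to one of the Site Analysis violations discussed below: a redirection response that never includes a Location header.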


    So what do redirects mean for SEO?

One of the most important factors in SEO is the concept of organic linking; in simple words, it means that your page gets extra points for every link that external Web sites have pointing to it. So now imagine the Search Engine bot is crawling an external Web site and finds a link pointing to your page (say /some-page), and when it tries to visit that page it runs into a redirect to another location (say /somepage). Now the Search Engine has to decide whether it should add the original "some-page" to its index, whether it should "add the extra points" to the new location or to the original location, or whether it should just ignore it entirely. Well, the answer is not that simple, but a simplification of it could be:

• if you return a 301 (Permanent Redirect) you are telling the search engine that the resource moved to a new location permanently and that all further traffic should be directed to that location. This clearly means that the search engine should ignore the original location (some-page), index the new location (somepage), add all the "extra points" to it, and treat any further references to the original location as if they were references to the new one.
• if you return a 302 (Temporary Redirect) the answer can depend on the search engine, but it is likely to index the original location and ignore the new location altogether (unless it is directly linked in other places), since the move is only temporary and the server could at any given point stop redirecting and start serving the content from the original location. This of course makes it very ambiguous how to deal with the "extra points"; they will likely be added to the original location and not the new destination.


    Enter IIS SEO Toolkit

The IIS Search Engine Optimization Toolkit has a couple of rules that look for different patterns related to redirects. The Beta version includes the following:

1. The redirection did not include a location header. Believe it or not, there are a couple of applications out there that do not generate a Location header, which completely breaks the model of redirection. If your application is one of them, the Toolkit will let you know.
2. The redirection response results in another redirection. In this case it detected that your page (A) links to another page (B), which caused a redirection to another page (C), which resulted in another redirection to yet another page (D). It is trying to let you know that the number of redirects could significantly impact the SEO "bonus points", since the organic linking could be broken by all this jumping around, and that you should consider linking directly from (A) to (D), or whatever page is supposed to be the final destination.
3. The page contains unnecessary redirects. In this case it detected that your page (A) links to another page (B) in your Web site that resulted in a redirect to another page (C) within your Web site. Note that this is an informational rule, since there are valid scenarios where you would want this behavior, such as tracking page impressions or login pages, but in many cases you do not need them. Since we detect that you own all three pages, we are suggesting you look and see whether it wouldn't be better to just change the markup in (A) to point directly to (C) and avoid the (B) redirection entirely.
4. The page uses a refresh definition instead of using redirection. Finally, IIS SEO will flag when it detects that the refresh meta-tag is being used as a means of causing a redirection. This practice is not recommended, since the tag does not carry any semantics for search engines on how to process the content, and in many cases it is actually considered a tactic to confuse search engines, but I won't go there.

So what does it look like? In the image below I ran Site Analysis against a Web site and it found a few of these violations (2 and 3).


Notice that when you double-click a violation it will show you the details as well as give you direct access to the related URLs, so that you can look at the content and all the relevant information about them to make a decision. From that menu you can also see which other pages link to the pages involved, as well as launch them in the browser if needed.


Similarly, for all the other violations it tries to explain the reason each is being flagged, as well as the recommended actions to follow for each of them.

The IIS Search Engine Optimization Toolkit can also help you find all the different types of redirects and the locations where they are being used in a very easy way: just select Content->Status Code Summary in the Dashboard view and you will see all the different HTTP status codes received from your Web site. Notice in the image below how you can see the number of redirects (in this case 18 temporary redirects and 2 permanent redirects). You can also see how much content they accounted for, in this case about 2.5 KB. (I've seen Web sites generate a large amount of useless content in redirect traffic; talk about wasted bandwidth.) You can double-click any of those rows to see the details of the URLs that returned that status code, and from there you can see who links to them, etc.


    So what should I do?

    1. Know your Web site. Run Site Analysis against your Web site and see all the different redirects that are happening.
2. Try to minimize redirections. If possible, with the knowledge gained in step 1, look for places where you can update your content to reduce the number of redirects.
3. Use the right redirect. Understand the intent of the redirection you are trying to do and make sure you use the right semantics (is it permanent or temporary?). Whenever possible, prefer permanent redirects (301).
4. Use URL Rewrite to easily configure them. URL Rewrite allows you to configure a set of rules, using both regular expressions and wildcards, that live along with your application (no administrative privileges required) and can set the right redirection status code. A must for SEO. More on this in a future blog.


So going back to the original question: "are redirects bad for Search Engine Optimization?" Not necessarily; they are an important tool used by Web applications for many reasons, such as:

• Canonicalization. To ensure that users access your site consistently with or without the www. prefix, use permanent redirects.
• Page impressions and analytics. Use temporary redirects to ensure that the original link is preserved and counters work as expected.
• Content reorganization. Whether you are changing your host due to a brand change or just renaming a page, make sure to use permanent redirects to keep your page rankings.
• etc.

Just make sure you don't abuse them with redirects to redirects, unnecessary redirects, or infinite loops, and make sure you use the right semantics.


    Canonical Formats and Query Strings - IIS SEO Toolkit


Today somebody running the IIS SEO Toolkit's Site Analysis feature saw a lot of violations of "The page contains multiple canonical formats." The reason, apparently, is that he uses query string parameters to pass contextual or other information between pages. This of course yields the question: does that mean that, in general, query strings are bad news SEO-wise?

    Well, the answer is not necessarily.

I will start by clarifying that this violation in Site Analysis means that our algorithm detected that two URLs look like the same content; note that we make no assumptions based on the URL itself (including query string parameters). This kind of situation is bad for a couple of reasons:

1. Since they look like the same page, search engines will probably choose one of them, index it as the real content, and discard the other. The problem is that you are leaving this decision to the search engines, which means some might choose the wrong version and end up using the one with query string parameters instead of the clean one (not likely, though). Or even worse, they might end up indexing both of them as if they were different.
2. When other Web sites look at your content and add links to it, some of them might end up using the URL with different query string parameters and some of them not. This means that organic linking will not give you the benefits it otherwise would. Remember, search engines add you "extra" points when somebody external references your page, but now you'll be splitting the earnings between "two pages" instead of a single canonical form.

Query strings by themselves do not pose a terrible threat to SEO; most modern search engines deal with them just fine. It is the organic linking and the potential abuse of query strings that could give you headaches.

Remember, search engines can make no assumptions when a single "page" serves tons of content through a single absolute path and variations of query strings. This is typical in many cases, such as when using index.php, where pretty much every page on the site is served by the same resource, just with variations of query strings or path information.


    So what should I do?

Well, there are several things you could do, but probably one of the easiest is to just tell search engines (more specifically, crawlers or bots) not to index pages whose query string variations really are meant only for the application to pass state, not to specify different content. This can be done using the Robots Exclusion Protocol, using wildcard matching to tell them not to follow any URLs that contain a '?'. Note that you should make sure you are not blocking URLs that actually are supposed to be indexed. For this you can run the Site Analysis feature again; it will flag an informational message for each URL that is not visited due to the robots exclusion file.

    User-agent: *
    Disallow: /*?
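As a sketch of how a crawler might evaluate that rule, here is a simplified wildcard matcher written only for illustration (real crawlers implement the full protocol): '*' matches any run of characters and a trailing '$' anchors the end of the URL.

```python
import re

def is_blocked(path, disallow_patterns):
    """Return True if path matches any wildcard Disallow pattern."""
    for pattern in disallow_patterns:
        anchored = pattern.endswith("$")
        core = pattern[:-1] if anchored else pattern
        # Escape everything except '*', which becomes '.*'.
        regex = ".*".join(re.escape(part) for part in core.split("*"))
        regex = "^" + regex + ("$" if anchored else "")
        if re.match(regex, path):
            return True
    return False
```

With the robots file above, `is_blocked("/index.php?id=1", ["/*?"])` is true while the clean `/index.php` stays crawlable, which is exactly the behavior we want.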


In summary, try to keep canonical formats yourself; don't leave any guesses to search engines, because some of them might get it wrong. There is a new way of specifying the canonical form in your markup, but it is "very recent" (as in 2009) and some search engines do not support it (I believe the top three do, though): the new rel="canonical":

    <link rel="canonical" href="" />

In the Beta 2 version of the IIS SEO Toolkit we will support this tag and have better detection of these canonical issues. So stay tuned.

Another way to solve this is to use URL Rewrite to redirect or rewrite your URLs, get rid of the query strings, and use more SEO-friendly URLs.


    Are you caching your images and scripts? IIS SEO can tell you


One easy way to enhance the experience of users visiting your Web site is to increase the perceived performance of navigating your site by reducing the number of HTTP requests required to display a page. There are several techniques for achieving this, such as merging scripts into a single file or merging images into one big image, but by far the simplest of all is making sure that you cache as much as you can in the client. This will not only speed up rendering but will also reduce load on your server and reduce your bandwidth consumption.

Unfortunately, the different types of caches and the different ways of setting them can be quite confusing and esoteric. So my recommendation is to pick one way and use it all the time, and that way is using the HTTP 1.1 Cache-Control header.
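As a rough illustration of that recommendation (a simplification; real cacheability rules have many more nuances), a response opts in to client caching when it carries a Cache-Control max-age or an Expires header, and opts out with no-cache or no-store:

```python
def allows_client_caching(headers):
    """Rough check: does this response opt in to client caching?

    headers: dict mapping header name to value (case-sensitive for brevity).
    """
    cache_control = headers.get("Cache-Control", "")
    if "no-cache" in cache_control or "no-store" in cache_control:
        return False  # response explicitly opts out
    if "max-age" in cache_control:
        return True   # the HTTP 1.1 way recommended above
    return "Expires" in headers  # the older HTTP 1.0 mechanism
```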

So first of all, how do I know if my application is being well behaved and sending the right headers so browsers can cache its content? You can use a network monitor or tools like Fiddler or WFetch to look at all the headers and figure out if they are getting sent correctly. However, you will soon realize that this process won't scale for a site with hundreds if not thousands of scripts, styles and images.

Enter Site Analysis - IIS Search Engine Optimization Toolkit

To figure out if your images are sending the right headers, follow these steps:

1. Install the IIS Search Engine Optimization Toolkit
    2. Launch InetMgr.exe (IIS Manager) and crawl your Web Site. For more details on how to do that refer to the article "Using Site Analysis to crawl a web site".
    3. Once you are in the Site Analysis dashboard view you can start a New Query by using the Menu "Query->New Query" and add the following criteria:
      1. Is External - Equals - False -> To only include the files that are coming from your Web site.
      2. Status code - Equals - OK -> To include only successful requests
3. Content Type Normalized - Begins With - image/ -> To include only images
4. Headers - Not Contains - Cache-Control: -> To include only the ones that do not have the Cache-Control header specified
5. Headers - Not Contains - Expires: -> To include only the ones that do not have the Expires header
      6. Press Execute, and this will display all the images in your Web site that are not specifying any caching behavior.

    Alternatively you can just save the following query as "ImagesNotCached.xml" and use the Menu "Query->Open Query" for it. This should make it easy to open the query for different Web sites or keep testing the results when making changes:

<?xml version="1.0" encoding="utf-8"?>
<query dataSource="urls">
  <expression field="IsExternal" operator="Equals" value="False" />
  <expression field="StatusCode" operator="Equals" value="OK" />
  <expression field="ContentTypeNormalized" operator="Begins" value="image/" />
  <expression field="Headers" operator="NotContains" value="Cache-Control:" />
  <expression field="Headers" operator="NotContains" value="Expires:" />
  <field name="URL" />
  <field name="ContentTypeNormalized" />
  <field name="StatusCode" />
</query>
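The same five criteria can also be expressed as a few lines of ordinary code. Here is a hypothetical sketch that filters a list of crawled records; the field names mirror the query above and are not a real Toolkit API:

```python
def images_without_caching(records):
    """Filter crawl records down to internal, successful image
    responses that set neither Cache-Control nor Expires."""
    return [
        r for r in records
        if not r["IsExternal"]
        and r["StatusCode"] == "OK"
        and r["ContentTypeNormalized"].startswith("image/")
        and "Cache-Control:" not in r["Headers"]
        and "Expires:" not in r["Headers"]
    ]
```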

    How do I fix it?

In IIS 7 this is trivial to fix: you can just drop a web.config file in the same directory as your images, scripts and CSS styles, specifying the caching behavior for them. The following web.config will send the Cache-Control header so that the browser caches the responses for up to 7 days.

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <staticContent>
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>

You can also do this through the UI (IIS Manager) by going into the "HTTP Response Headers" feature -> Set Common Headers..., or through any of our APIs using managed code, JavaScript, or your favorite language:

Furthermore, using the same query above in the Query Builder you can group by directory and find the directories where it is really worth adding this. It is just a matter of clicking the "Group By" button and adding URL-Directory to the Group By clauses. Not surprisingly, in my case it flags the App_Themes directory where I store 8 images.



    Finally, what about 304's?

One thing to note is that even if you do not do anything, most modern browsers will use conditional requests to reduce latency when they have a copy in their cache. As an example, imagine the browser needs to display logo.gif as part of displaying test.htm and that image is available in its cache; the browser will issue a request like this:

    GET /logo.gif HTTP/1.1
    Accept: */*
    Referer: http://carlosag-client/test.htm
    Accept-Language: en-us
    User-Agent: (whatever-browser-you-are-using)
    Accept-Encoding: gzip, deflate
    If-Modified-Since: Mon, 09 Jun 2008 16:58:00 GMT
    If-None-Match: "01c13f951cac81:0"
    Host: carlosagdev:8080
    Connection: Keep-Alive

Note the use of the If-Modified-Since header, which tells the server to only send the actual data if it has changed after that time. In this case it hasn't, so the server responds with a status code 304 (Not Modified):

    HTTP/1.1 304 Not Modified
    Last-Modified: Mon, 09 Jun 2008 16:58:00 GMT
    Accept-Ranges: bytes
    ETag: "01c13f951cac81:0"
    Server: Microsoft-IIS/7.0
    X-Powered-By: ASP.NET
    Date: Sun, 07 Jun 2009 06:33:51 GMT
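The server's side of this exchange boils down to comparing validators. Here is a simplified sketch of that decision; real servers such as IIS parse and compare HTTP dates rather than string-matching them, and handle many more edge cases:

```python
def respond_not_modified(request_headers, etag, last_modified):
    """Return True if a 304 Not Modified response is appropriate.

    request_headers: dict of request header name -> value.
    etag, last_modified: the resource's current validators, as strings.
    """
    if_none_match = request_headers.get("If-None-Match")
    if if_none_match is not None:
        # The ETag comparison takes precedence over the date comparison.
        return if_none_match == etag
    if_modified_since = request_headers.get("If-Modified-Since")
    if if_modified_since is not None:
        # Simplification: exact string match instead of date parsing.
        return if_modified_since == last_modified
    return False  # unconditional request: send the full 200 response
```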

Even though this helps, you can imagine that it still requires a whole roundtrip to the server, which, even with a short response, can have a significant impact if rendering of the page is waiting for it. This is the case for a CSS file that the browser needs to resolve before displaying the page correctly, or an <img> tag that does not include dimensions (width and height attributes) and so requires the actual image to determine the required space (one reason why you should always specify the dimensions in markup to increase rendering performance).


To summarize, with the IIS Search Engine Optimization Toolkit you can easily build your own queries to learn more about your own Web site, letting you easily find details that would otherwise be tedious to gather. In this case I showed how easy it is to find all the images that are not specifying any caching headers; you can do the same thing for scripts (Content Type Normalized equals application/javascript) or styles (Content Type Normalized equals text/css). This way you can increase rendering performance and reduce the overall bandwidth of your Web site.


    Announcing: IIS Search Engine Optimization Toolkit Beta 1


    Today we are releasing the IIS Search Engine Optimization Toolkit. The IIS SEO Toolkit is a set of features that aim to help you keep your Web site and its content in good shape for both Users and Search Engines.

    The features that are included in this Beta release include:

    • Site Analysis. This feature includes a crawler that starts looking at your Web site contents, discovering links, downloading the contents and applying a set of validation rules aimed to help you easily troubleshoot common problems such as broken links, duplicate content, keyword analysis, route analysis and many more features that will help you improve the overall quality of your Web site.
    • Robots Exclusion Editor. This includes a powerful editor to author Robots Exclusion files. It can leverage the output of a Site Analysis crawl report and allow you to easily add the Allow and Disallow entries without having to edit a plain text file, making it less error prone and more reliable. Furthermore, you can run the Site Analysis feature again and see immediately the results of applying your robots files.
    • Sitemap and Sitemap Index Editor. Similar to the Robots editor, this allows you to author Sitemap and Sitemap Index files with the ability to discover both physical and logical (Site Analysis crawler report) view of your Site.

Check out ScottGu's great blog post about the IIS SEO Toolkit, or this simple video of some of its capabilities.

    Run it in your Development, Staging, or Production Environments

One of the problems with many similar tools out there is that they require you to publish the updates to your production sites before you can even use them, and of course they would never be usable for intranet or internal applications that are not exposed to the Web. The IIS Search Engine Optimization Toolkit can be used internally in your own development or staging environments, giving you the ability to clean up the content before publishing to the Web. This way your users do not need to pay the price of broken links once you publish, and you will not need to wait for those tools or search engines to crawl your site to finally discover you broke something.

For developers this means that they can now easily look at the potential impact of removing or renaming a file: easily check which files refer to a page, and which files can be removed because they are only referenced by that page.

    Run it against any Web application built on any framework running in any server

One thing that is important to clarify is that you can target and analyze your production sites if you want to, and you can target Web applications running on any platform, whether it's ASP.NET, PHP, or plain HTML files, running on your local IIS or on any other remote server.

Bottom line: try it against your Web site, look at the different features, and give us feedback for additional reports, options, violations, content to parse, etc. Post any comments or questions in the IIS Search Engine Optimization Forum.

The IIS SEO Toolkit documentation is also available, but remember this is only Beta 1, so we will be adding more features and content.

    IIS Search Engine Optimization Toolkit


    IIS Manager Online Help Link gets updated


While using IIS Manager, did you ever wonder what configuration section a given UI page changes? Or whether there is a way to automate it using scripts or the command line?

Well, if you use IIS Manager 7.0 you might have noticed that we have a link on every page called Online Help, and if you've ever clicked it you would have noticed that it takes you to the IIS 7 Operations Guide. You might ask yourself whether it was really that important to place that link on every page; the answer is that we had other reasons to add it there.

Back when we were designing the UI, we realized we wanted to provide the best content we could for each page and potentially be able to update it as more content became available; for that reason we added this link. However, the content was not ready at the time, so we pointed it to our operations guide instead.

But the good news is that we are updating those links to point to their respective entries in the Configuration Reference that was recently published. This means that now, on any page in IIS Manager, if you click Online Help we will point you to the configuration section that the UI changes, which will give you details about what the section looks like and ways to change it using scripts, AppCmd, etc.

    Online Help

An interesting thing: this new routing mechanism is brought to you by our very own URL Rewrite module, with a simple rewrite map and a couple of rules. If you haven't looked into it, you should definitely download it and give it a try; you'll soon realize there are so many things you can do without writing any code that you'll love it.


    Calling Web Services from Silverlight using IIS 7.0 and ARR


During this PDC I attended Ian's presentation about WPF and Silverlight, where he demonstrated the high degree of compatibility that can be achieved between a WPF desktop application and a Silverlight application. One of the differences he demonstrated concerned consuming Web Services: since Silverlight applications execute in a sandboxed environment, they are not allowed to call arbitrary Web Services or issue HTTP requests to servers other than the originating server, or a server that exposes a cross-domain manifest stating that it allows clients from that domain.

Then he moved on to show how you can work around this architectural difference by writing your own Web Service or HTTP endpoint that basically takes the request from the client and, using code on the server, calls the real Web Service. This way the client sees only the originating server, which allows the call to succeed, and the server can freely call the real Web Service. Funny enough, while searching for a quote service I ran into an article by Dino Esposito in MSDN Magazine where he explains the same issue and also builds a "compatibility layer", which again is just code (more than 40 lines of it) acting as a proxy to call a Web Service (except he uses the JSON serializer to return the values).

The obvious disadvantage is that you have to write code that only forwards the request and returns the response, acting essentially as a proxy. Of course this can be very simple, but if the Web Service you are trying to call has any degree of complexity, with custom types being sent around, or if you actually need to consume several of its methods, then it quickly becomes a big maintenance nightmare: keeping them in sync when they change, doing error handling properly, and dealing with differences when reporting network issues, SOAP exceptions, HTTP exceptions, etc.

So after looking at this, I immediately thought about ARR (Application Request Routing), a new extension for IIS 7.0 that you can download for free from IIS.NET for Windows Server 2008 and that, among many other things, is capable of doing this kind of routing without writing a single line of code.

This blog tries to show how easy it is to implement this using ARR. Here are the steps to try it (below you can find the software required); note that if you are only interested in what is really new, just go to the 'Enter Application Request Routing and IIS 7.0' section below to see the configuration that fixes the Web Service call.

    1. Create a new Silverlight Project (linked to an IIS Web Site)
      1. Launch Visual Web Developer from the Start Menu
      2. File->Open Web Site->Local IIS->Default Web Site. Click Open
      3. File->Add->New Project->Visual C#->Silverlight->Silverlight Application
4. Name: SampleClient, Location: c:\Demo. Click OK
      5. On the "Add Silverlight Application" dialog choose the "Link this Silverlight control into an existing Web site", and choose the Web site in the combo box.
      6. This will add a SampleClientTestPage.html to your Web site which we will run to test the application.
    2. Find a Web Service to consume
1. In my case I searched for a Stock Quote Web Service and found one to use
    3. Back at our Silverlight project, add a Service Reference to the WSDL
      1. Select the SampleClient project in the Solution Explorer window
      2. Project->Add Service Reference and type in the Address and click Go
      3. Specify a friendly Namespace, in this case StockQuoteService
      4. Click OK
    4. Add a simple UI to call the Service
      1. In the Page.xaml editor type the following code inside the <UserControl></UserControl> tags:
2.     <Grid x:Name="LayoutRoot" Background="Azure">
         <Grid.RowDefinitions>
           <RowDefinition Height="30" />
           <RowDefinition Height="*" />
         </Grid.RowDefinitions>
         <Grid.ColumnDefinitions>
           <ColumnDefinition Width="50" />
           <ColumnDefinition Width="*" />
           <ColumnDefinition Width="50" />
         </Grid.ColumnDefinitions>
         <TextBlock Grid.Column="0" Grid.Row="0" Text="Symbol:" />
         <TextBox Grid.Column="1" Grid.Row="0" x:Name="_symbolTextBox" />
         <Button Grid.Column="2" Grid.Row="0" Content="Go!" Click="Button_Click" />
         <ListBox Grid.Column="0" Grid.Row="1" x:Name="_resultsListBox" Grid.ColumnSpan="3">
           <ListBox.ItemTemplate>
             <DataTemplate>
               <StackPanel Orientation="Horizontal">
                 <TextBlock Text="{Binding Path=Name}" FontWeight="Bold" Foreground="DarkBlue" />
                 <TextBlock Text=" = " />
                 <TextBlock Text="{Binding Path=Value}" />
               </StackPanel>
             </DataTemplate>
           </ListBox.ItemTemplate>
         </ListBox>
       </Grid>
      3. Right click the Button_Click text above and select the "Navigate to Event Handler" context menu.
      4. Enter the following code to call the Web Service
5.     private void Button_Click(object sender, RoutedEventArgs e)
       {
           var service = new StockQuoteService.StockQuoteSoapClient();
           service.GetQuoteCompleted += service_GetQuoteCompleted;
           // Kick off the asynchronous call with the symbol entered by the user
           service.GetQuoteAsync(_symbolTextBox.Text);
       }
      6. Now, since we are going to use XLINQ to parse the result of the Web Service which is an XML then we need to add the reference to System.Xml.Linq by using the Project->Add Reference->System.Xml.Linq.
      7. Finally, add the following function to handle the result of the Web Service
8.     void service_GetQuoteCompleted(object sender, StockQuoteService.GetQuoteCompletedEventArgs e)
       {
           var el = System.Xml.Linq.XElement.Parse(e.Result);
           _resultsListBox.DataContext = el.Element("Stock").Elements();
       }
    5. Compile the application. Build->Build Solution.
    6. At this point we are ready to test our application, to run it just navigate to http://localhost/SampleClientTestPage.html or simply select the SampleClientTestPage.html in the Solution Explorer and click View In Browser.
7. Enter a stock symbol (say MSFT) and press Go!. Verify that it breaks: you will see a small "Error in page" message with a warning icon in the status bar. If you click it and select show details, you will get a dialog with the following message:
    8. Message: Unhandled Error in Silverlight 2 Application An exception occurred during the operation, making the result invalid. 

    Enter Application Request Routing and IIS 7.0

1. Ok, so now we are running into the cross-domain issue, and unfortunately we don't have a cross-domain manifest on the remote server. Here is where ARR can help us call the service without writing more code
    2. Modify the Web Service configuration to call a local Web Service instead
      1. Back in Visual Web Developer, open the file ServiceReferences.ClientConfig
2. Modify the address="" attribute to be address="http://localhost/stockquote.asmx" instead; it should look like:
      3.     <client>
        <endpoint address="http://localhost/stockquote.asmx"
        ="basicHttpBinding" bindingConfiguration="StockQuoteSoap"
        ="StockQuoteService.StockQuoteSoap" name="StockQuoteSoap" />
    3. This will cause the client to call the Web Service on the same originating server. Now we can configure an ARR/URL Rewrite rule to route the Web Service requests to the original end-point:
      1. Add a new Web.config to the http://localhost project (Add new item->Web.config)
      2. Add the following content, setting the action url to the address of the real Web Service end-point:
      3. <?xml version="1.0" encoding="UTF-8"?>
         <configuration>
           <system.webServer>
             <rewrite>
               <rules>
                 <rule name="Stock Quote Forward" stopProcessing="true">
                   <match url="^stockquote.asmx$" />
                   <action type="Rewrite" url="" />
                 </rule>
               </rules>
             </rewrite>
           </system.webServer>
         </configuration>
    4. This rule basically uses a regular expression to match the requests for StockQuote.asmx and forwards them to the real Web Service.
    5. Compile everything by running Build->Rebuild Solution
    6. Back in your browser, refresh the page to get the new version of the application, enter MSFT in the symbol box and press Go!
    7. And Voila!!! everything works.


    One of the features offered by ARR is to provide proxy functionality to forward requests to another server. One of the scenarios where this functionality is useful is when using it from clients that cannot make calls directly to the real data, this includes Silverlight, Flash and AJAX applications. As shown in this blog, by just using a few lines of XML configuration you can enable clients to call services in other domains without having to write hundreds of lines of code for each method. It also means that I get the original data and that if the WSDL were to change I do not need to update any wrappers. Additionally if using REST based services you could use local caching in your server relying on Output Cache and increase the performance of your applications significantly (again with no code changes).
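
    As a sketch of that last point: if the forwarded end-point were a REST service responding to GET requests, an IIS output cache profile in the same Web.config could serve repeated requests without re-contacting the origin. The 30-second duration below is an illustrative assumption (and note that SOAP POST requests like the ones in this sample are not cached this way):

    ```xml
    <configuration>
      <system.webServer>
        <caching>
          <profiles>
            <!-- Cache proxied responses for 30 seconds (illustrative value) -->
            <add extension=".asmx" policy="CacheForTimePeriod" duration="00:00:30" />
          </profiles>
        </caching>
      </system.webServer>
    </configuration>
    ```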

    Software used

    Here is the software I installed to do this sample (amazing that all of it is completely free):

    1. Install Visual Web Developer 2008 Express
    2. Install Silverlight Tools for Visual Studio 2008 SP 1
    3. Install Application Request Routing for IIS 7.
  • CarlosAg Blog

    Creating a Setup Project for IIS Extensions using Visual Studio 2008



    IIS 7 provides a rich extensibility model. Whether you are extending the server or the user interface, one critical thing is to provide a simple setup application that can install all the required files, add any registration information required, and modify the server settings as required by the extension.
    Visual Studio 2008 provides a set of project types called Setup and Deployment projects specifically for this kind of application. The output generated by these projects is an MSI that can perform several actions for you, including copying files, adding files to the GAC, adding registry keys, and many more.
    In this document we will create a setup project to install a hypothetical runtime Server Module that also includes a User Interface extension for IIS Manager.
    Our setup will basically perform the following actions:
    •    Copy the required files, including three DLLs and an HTML page.
    •    Add a couple of registry keys.
    •    Add the managed assemblies to the GAC.
    •    Modify applicationHost.config to register a new module.
    •    Modify administration.config to register a new UI extension for InetMgr.
    •    Create a new sample application that exposes the HTML pages.
    •    Finally, remove the changes from both configuration files during uninstall.

    Creating the Setup Project

    Start Visual Studio 2008. In the File Menu, select the option New Project.
    In the New Project Dialog, expand the Other Project Types option in the Project type tree view.
    Select the option Setup and Deployment type and select the option Setup Project. Enter a name for the Project and a location. I will use SampleSetup as the name.


    Adding Files to the Setup

    • Select the menu View->Editor->File System. This will open the editor where you can add all the files that you need to deploy with your application. In this case I will just add an html file that I have created called readme.htm.
    • To do that, right click the Application Folder directory in the tree view and select the option Add File. Browse to your files and select all the files you want to copy to the setup folder (by default <Program Files>\<Project Name>).

    Adding assemblies to the GAC

    Adding assemblies to the setup is done in the same File System editor, however it includes a special folder called Global Assembly Cache that represents the GAC in the target system.
    In our sample we will add to the GAC the assemblies that have the runtime server module and the user interface modules for IIS Manager. I have created the following set of projects:

    1. SampleModule.dll that includes the runtime module on it.
    2. SampleModuleUI.dll that contains the server-side portion of the IIS Manager extension (ModuleProvider, ModuleService, etc).
    3. SampleModuleUIClient.dll that contains the client side portion of the IIS Manager extension (Module, ModulePage, TaskLists, etc).

    Back in Visual Studio,

    • Select the menu option View->Editor->File System
    • Right-click the root node in the tree view titled File System on Target Machine, select the option Add Special Folder, and then select the option Global Assembly Cache Folder.
    • Right click the newly added Global Assembly Cache folder, choose the option Add File, browse to the DLL and choose OK. Another option is to use Add Assembly and the "Select Component" dialog to add it.
    • Visual Studio will recognize the dependencies that the assembly has and will try to add them to the project automatically. However, certain assemblies, such as Microsoft.Web.Administration or any other System assemblies, should be excluded because they will already be installed on the target machine.
    • To ensure that you don't ship system assemblies, in the Solution Explorer expand the Detected Dependencies folder and right click each of the assemblies that shouldn't be packaged and select the option Exclude. (In our case we will exclude Microsoft.Web.Administration.dll, Microsoft.Web.Management.dll, Microsoft.ManagementConsole.dll and MMCFxCommon.dll)
      After completing this, the project should look as follows:


    Adding Registry Keys

    Visual Studio also includes a Registry editor that helps you add registry keys on the target machine. For this sample I will just add a string value named Message under HKEY_LOCAL_MACHINE\Software\My Company. For that:
    Select the menu option View->Editor->Registry.
    Expand the HKEY_LOCAL_MACHINE node and drill down to Software\[Manufacturer].
    [Manufacturer] is a variable that holds the name of the company; it can be set by selecting the SampleSetup node in Solution Explorer and using the Property Grid to change it. There are several other variables defined, such as Author, Description, ProductName, Title and Version, that help whenever dynamic text is required.
    Right click [Manufacturer] and select the option New String Value. Enter Message as the name. To set the value you can select the item in the List View and use the Property Grid to set its value.
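
    For reference, the installed result is equivalent to importing this .reg fragment (the Manufacturer is shown here as "My Company", and the value data is just an illustrative string):

    ```
    Windows Registry Editor Version 5.00

    [HKEY_LOCAL_MACHINE\SOFTWARE\My Company]
    "Message"="Hello from SampleSetup"
    ```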
    After completing this, the project should look as follows:


    Executing Custom Code

    To support custom code being executed when the setup application runs, Visual Studio (more precisely, MSI) supports the concept of Custom Actions. These Custom Actions include running an application or a script, or executing code from a managed assembly.
    For our sample, we will create a new project where we will add all the code to read and change configuration.
    Select the option File->Add->New Project.
    Select the Class Library template and name it SetupHelper.


    • Since we will be creating a custom action, we need to add a reference to System.Configuration.Install. Use Project->Add Reference, and in the .NET tab select System.Configuration.Install and press OK.
    • Since we will also be modifying server configuration (for registering the HTTP Module in ApplicationHost.config and the ModuleProvider in administration.config) using Microsoft.Web.Administration we need to add a reference to it as well. Again use the Project->Add Reference, and browse to <windows>\system32\inetsrv and select Microsoft.Web.Administration.dll
    • Rename the file Class1.cs to SetupAction.cs and make the class name SetupAction. This class needs to inherit from System.Configuration.Install.Installer, which is the base class for all custom actions; it has several methods that you can override to add custom logic to the setup process. The class should also be marked with the [RunInstaller(true)] attribute so the setup engine can find it. In this case we will add our code in the Install and Uninstall methods.
    using System;
    using System.ComponentModel;
    using System.Configuration.Install;

    namespace SetupHelper {

        [RunInstaller(true)]
        public class SetupAction : Installer {

            public override void Install(System.Collections.IDictionary stateSaver) {
                base.Install(stateSaver);

                // Register the UI extension in Administration.config
                InstallUtil.AddUIModuleProvider(
                    "SampleUIModule",
                    "SampleUIModule.SampleModuleProvider, SampleUIModule, Version=, Culture=neutral, PublicKeyToken=12606126ca8290d1");

                // Add a Server Module to applicationHost.config
                InstallUtil.AddModule(
                    "SampleModule",
                    "SampleModule.SampleModule, SampleModule, Version=, Culture=neutral, PublicKeyToken=12606126ca8290d1");

                // Create a web application using the TargetDir passed through CustomActionData
                InstallUtil.CreateApplication(
                    "Default Web Site", "/SampleApp", this.Context.Parameters["TargetDir"]);
            }

            public override void Uninstall(System.Collections.IDictionary savedState) {
                base.Uninstall(savedState);

                InstallUtil.RemoveApplication("Default Web Site", "/SampleApp");
                InstallUtil.RemoveModule("SampleModule");
                InstallUtil.RemoveUIModuleProvider("SampleUIModule");
            }
        }
    }

    As you can see, the code above is really simple: it just calls helper methods in a utility class called InstallUtil that is shown at the end of this entry. You will also need to add the InstallUtil class to the project to be able to compile it. The only interesting piece of code above is how we pass the TargetDir from the Setup project to the custom action through the Parameters property of the InstallContext type.

    Configuring the Custom Action

    To be able to use our new Custom Action we need to add the SetupHelper output to our setup project. For that:
    Select the option View->Editor->File System
    Right-click the Application Folder node and select the option Add Project Output... and select the SetupHelper project in the Project drop down.


    After doing this, the DLL will be included as part of our setup.

    Adding the Install Custom Action

    Select the option View->Editor->Custom Actions
    Right-click the Install node and select the option Add Custom Action…, then drill down into the Application Folder and select the Primary output from SetupHelper.


    Click OK and type a name such as InstallModules.

    Now, since we want to pass the TargetDir variable to be used as the physical path for the web application that we will create within our Installer derived-class, select the custom action and go to the Property Grid. There is a property called CustomActionData. This property is used to pass any data to the installer parameters class, and uses the format “/<name>=<value>”. So for our example we will set it to: /TargetDir="[TARGETDIR]\"


    Adding the Uninstall Custom Action

    In the same editor, right-click the Uninstall node and select the option Add Custom Action…, again drill down into the Application Folder and select the Primary output from SetupHelper.
    Press OK and type a name such as UninstallModules.
    After doing this the editor should look as follows:


    Building and Testing the Setup

    Finally we can build the solution by using the Build->Rebuild Solution menu option.
    This will create a file called SampleSetup.msi in the folder SampleSetup\SampleSetup\Debug.
    You can now run this MSI and it will walk you through the installation process. The user interface provided by default can be configured to add new steps or remove the current ones. You can also provide a banner logo for the windows and set many more options from View->Editor->User Interface.


    Visual Studio provides different packaging mechanisms for the setup application. You can change it through the Project Properties dialog where you get the option to use:
    1)    As loose uncompressed files. This option packages all the files by just copying them into a file system structure where the files are copied unchanged. This is a good packaging option for CD or DVD based setups
    2)    In setup file. This option packages all the files within the MSI file
    3)    In cabinet files. This option creates a set of CAB files that can be used in scenarios such as diskette based setup.

    You can also customize all the setup properties using the property grid, such as DetectNewerInstalledVersion, which warns users if a newer version is already installed, or RemovePreviousVersions, which automatically removes older versions whenever a newer one is installed.


    64-bit considerations

    It turns out that the managed-code custom action will fail on 64-bit platforms because it is executed as a 32-bit custom action. The following blog post talks about the details and shows how you can fix the issue:




    Visual Studio 2008 provides a simple option to easily create setup applications that can run custom code through Custom Actions. In this document we created a simple custom action to install modules and InetMgr extensions through this support.

    For the latest information about IIS 7.0, see the IIS 7 Web site at


    This is the class used from the SetupHelper project we created to do the actual changes in configuration. As you can see, it only has six public methods: AddModule, AddUIModuleProvider, CreateApplication, RemoveApplication, RemoveModule, and RemoveUIModuleProvider. The other methods are just helpers to facilitate reading configuration.

    using System;
    using Microsoft.Web.Administration;

    namespace SetupHelper {

    public static class InstallUtil {

    /// <summary>
    /// Registers a new Module in the Modules section inside ApplicationHost.config
    /// </summary>
    public static void AddModule(string name, string type) {
        using (ServerManager mgr = new ServerManager()) {
            Configuration appHostConfig = mgr.GetApplicationHostConfiguration();
            ConfigurationSection modulesSection = appHostConfig.GetSection("system.webServer/modules");
            ConfigurationElementCollection modules = modulesSection.GetCollection();

            if (FindByAttribute(modules, "name", name) == null) {
                ConfigurationElement module = modules.CreateElement();
                module.SetAttributeValue("name", name);
                if (!String.IsNullOrEmpty(type)) {
                    module.SetAttributeValue("type", type);
                }
                modules.Add(module);
            }
            mgr.CommitChanges();
        }
    }

    /// <summary>
    /// Registers a ModuleProvider and Module in Administration.config
    /// </summary>
    public static void AddUIModuleProvider(string name, string type) {
        using (ServerManager mgr = new ServerManager()) {
            // First register the Module Provider
            Configuration adminConfig = mgr.GetAdministrationConfiguration();

            ConfigurationSection moduleProvidersSection = adminConfig.GetSection("moduleProviders");
            ConfigurationElementCollection moduleProviders = moduleProvidersSection.GetCollection();
            if (FindByAttribute(moduleProviders, "name", name) == null) {
                ConfigurationElement moduleProvider = moduleProviders.CreateElement();
                moduleProvider.SetAttributeValue("name", name);
                moduleProvider.SetAttributeValue("type", type);
                moduleProviders.Add(moduleProvider);
            }

            // Now register it so that all Sites have access to this module
            ConfigurationSection modulesSection = adminConfig.GetSection("modules");
            ConfigurationElementCollection modules = modulesSection.GetCollection();
            if (FindByAttribute(modules, "name", name) == null) {
                ConfigurationElement module = modules.CreateElement();
                module.SetAttributeValue("name", name);
                modules.Add(module);
            }

            mgr.CommitChanges();
        }
    }

    /// <summary>
    /// Create a new Web Application
    /// </summary>
    public static void CreateApplication(string siteName, string virtualPath, string physicalPath) {
        using (ServerManager mgr = new ServerManager()) {
            Site site = mgr.Sites[siteName];
            if (site != null) {
                site.Applications.Add(virtualPath, physicalPath);
                mgr.CommitChanges();
            }
        }
    }
    /// <summary>
    /// Helper method to find an element based on an attribute
    /// </summary>
    private static ConfigurationElement FindByAttribute(ConfigurationElementCollection collection, string attributeName, string value) {
        foreach (ConfigurationElement element in collection) {
            if (String.Equals((string)element.GetAttribute(attributeName).Value, value, StringComparison.OrdinalIgnoreCase)) {
                return element;
            }
        }
        return null;
    }

    public static void RemoveApplication(string siteName, string virtualPath) {
        using (ServerManager mgr = new ServerManager()) {
            Site site = mgr.Sites[siteName];
            if (site != null) {
                Application app = site.Applications[virtualPath];
                if (app != null) {
                    site.Applications.Remove(app);
                    mgr.CommitChanges();
                }
            }
        }
    }
    /// <summary>
    /// Removes the specified module from the Modules section by name
    /// </summary>
    public static void RemoveModule(string name) {
        using (ServerManager mgr = new ServerManager()) {
            Configuration appHostConfig = mgr.GetApplicationHostConfiguration();
            ConfigurationSection modulesSection = appHostConfig.GetSection("system.webServer/modules");
            ConfigurationElementCollection modules = modulesSection.GetCollection();
            ConfigurationElement module = FindByAttribute(modules, "name", name);
            if (module != null) {
                modules.Remove(module);
            }
            mgr.CommitChanges();
        }
    }

    /// <summary>
    /// Removes the specified UI Module and its ModuleProvider by name
    /// </summary>
    public static void RemoveUIModuleProvider(string name) {
        using (ServerManager mgr = new ServerManager()) {
            // First remove it from the sites
            Configuration adminConfig = mgr.GetAdministrationConfiguration();
            ConfigurationSection modulesSection = adminConfig.GetSection("modules");
            ConfigurationElementCollection modules = modulesSection.GetCollection();
            ConfigurationElement module = FindByAttribute(modules, "name", name);
            if (module != null) {
                modules.Remove(module);
            }

            // now remove the ModuleProvider
            ConfigurationSection moduleProvidersSection = adminConfig.GetSection("moduleProviders");
            ConfigurationElementCollection moduleProviders = moduleProvidersSection.GetCollection();
            ConfigurationElement moduleProvider = FindByAttribute(moduleProviders, "name", name);
            if (moduleProvider != null) {
                moduleProviders.Remove(moduleProvider);
            }

            mgr.CommitChanges();
        }
    }
    }
    }

  • CarlosAg Blog

    Microsoft Web Platform Installer Beta Released


    Today we are releasing a new Web Site where users can get a one-stop shop for learning about the Microsoft Web Platform. This is part of a bigger effort to make it easier to get started building and running Web Applications on Windows and IIS.

    As part of this, a new tool called the Web Platform Installer Beta is also being released to help you get started, installing all the software that you need from a single place without having to hunt around for installers, links or anything else. Just launch the tool, choose the software and configuration you are interested in, and it takes care of validating and installing prerequisites.

    This tool will let you easily set up your development machines for building Web Applications quite nicely. It will also help you discover new tools, applications, features and betas as they get released from several sources, including IIS, ASP.NET and Visual Web Developer, and more as we continually make new software available through updates to the feed that the tool consumes.

    Download page:

    Link to Run it: 

    Here are a few snapshots of the tool:

    This is the start page where you can choose to install everything available or customize the installation (Your Choice).

    WebPI Start Page

    In this page you can customize the selection, browse the current list of products, and check or uncheck any product you want to install.


    There are a couple more pages, and finally the progress page where the tool downloads any required files and installs them, so that you can get the whole Web Platform installed easily at once.


    Some of the products and features that the Beta supports installing and configuring include:

    1. IIS (Ability to granularly configure each of the features of IIS)
    2. IIS Extensions (such as the out-of-band releases that we have made available, including Bit Rate Throttling, Web Playlist, Microsoft Web Deployment, FTP 7.0 Server, URL Rewrite, and more)
    3. .NET Framework 3.5
    4. SQL Express 2008
    5. SQL Server Driver for PHP
    6. Visual Web Developer 2008 Express
    7. Windows Installer 4.5
    8. And more

    So as you can see, everything you need to build Web Applications: from a Web Server (IIS), to a development tool (Visual Web Developer), to a database (SQL Server Express), and many more, all for free.

    So go ahead and try the tool and give us feedback (remember this is a Beta) so it can only get better :)

  • CarlosAg Blog

    Using IIS Manager Users Authentication in your Web Application


    Today in the IIS.NET Forums a question was asked: is it possible to use IIS Manager Users authentication in the context of a Web Application, so that you could have, say, something like WebDAV using the same credentials you use for IIS Manager Remote Administration?

    IIS Manager Remote Administration allows you to connect to manage your Web Site using credentials that are not Windows users, but instead just a combination of user name and password. This is implemented following a provider model, where the default implementation we ship uses our Administration.config file (%windir%\system32\inetsrv\config\administration.config) as the storage for these users. However, you can easily implement the base class to authenticate against a database or any other user store if needed. This means you can build your own application and call our APIs (ManagementAuthentication).
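
    As a sketch of calling those APIs from your own code (the type names come from Microsoft.Web.Management.Server; the user name and password are made up, and this assumes your process runs elevated so it can read and write Administration.config):

    ```csharp
    using Microsoft.Web.Management.Server;

    // Create an IIS Manager user and then validate a set of credentials.
    // Both calls go through the configured provider, which by default
    // stores users in Administration.config.
    ManagementAuthentication.CreateUser("MyIisManagerUser", "SomeStrongPassword!");

    bool valid = ManagementAuthentication.AuthenticateUser(
        "MyIisManagerUser", "SomeStrongPassword!");
    ```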

    Even better in the context of a Web Site running in IIS 7.0 you can actually implement this without having to write a single line of code.

    Disclaimer: Administration.config out-of-the-box only grants administrators permission to read the file. This means that a Web Application will not be able to access it, so you need to change the ACLs on the file to give your application read access; make sure that you limit that access to the minimum required, such as below.

    Here is how you do it:

    1. First make sure that your Web Site is using SSL. (Use IIS Manager, right click your Web Site, select Edit Bindings and add an SSL binding.)
    2. So that we can restrict permissions further, make your application run in its own Application Pool; this way the ACL changes only affect your application pool and nothing else. Using IIS Manager, go to Application Pools and add a new application pool running in Integrated mode, and give it a name you can easily remember, say WebMgmtAppPool (we will use this in the permissions below).
    3. Disable Anonymous Authentication in your application. (Use IIS Manager, drill-down to your application, double click the Authentication feature and disable Anonymous Authentication and any other authentication module enabled).
    4. Enable the Web Management Authentication Module in your application. You can add a Web.config file with the following contents:
       <configuration>
         <system.webServer>
           <modules>
             <add name="WebManagementBasicAuthentication"
                  type="Microsoft.Web.Management.Server.WebManagementBasicAuthenticationModule, Microsoft.Web.Management, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
           </modules>
         </system.webServer>
       </configuration>
    5. Modify the ACL's in the required configuration files:
      1. Give read access to the config directory so we can access the files using the following command line (note that we are only giving permissions to the Application Pool)
        icacls %windir%\system32\inetsrv\config /grant "IIS AppPool\WebMgmtAppPool":(R)
      2. Give read access to the redirection.config:
        icacls %windir%\system32\inetsrv\config\redirection.config /grant "IIS AppPool\WebMgmtAppPool":(R)
      3. Finally give read access to administration.config:
        icacls %windir%\system32\inetsrv\config\administration.config /grant "IIS AppPool\WebMgmtAppPool":(R)
    6. At this point you should be able to navigate to your application using any browser and you should get a prompt for credentials that will be authenticated against the IIS Manager Users.

    What is also nice is that you can use URL Authorization to further restrict permissions in your pages for these users. For example, if I didn't want a particular IIS Manager user (say MyIisManagerUser) to access the Web Site, I can just configure this in the same web.config:

    <security>
      <authorization>
        <add accessType="Deny" users="MyIisManagerUser" />
      </authorization>
    </security>

    If you want to learn more about remote administration and how to configure it you can read:
