• CarlosAg Blog

    Managing ASP.NET Configuration Settings using IIS Manager


    Today somebody asked how to manage some ASP.NET configuration settings, such as changing the trust level of the application, adding a few application settings, and changing compilation settings to debug. I thought it would be trivial to find an article showing the features we added in IIS 7.0 to manage those, but to my surprise I was not able to find anything that clearly showed it, so I decided to write this quickly for anyone that is not aware.


    With the release of IIS 7.0 (included in Windows Vista and Windows Server 2008), and of course in IIS 7.5 (Windows 7 and Windows Server 2008 R2), we added a set of features for managing some of the configuration of common ASP.NET features inside IIS Manager itself. Those features include:

    1. .NET Authorization Rules 1 – Manages the authorization rules for ASP.NET; this is particularly useful when using Classic Mode. This UI basically manages the system.web/authorization section in a graphical way.
    2. .NET Compilation – This exposes the settings used by the ASP.NET compilation engine, such as list of assemblies, Debug settings, VB settings (Option Strict, Option Explicit), Temp directory, etc. This UI saves all the settings in the system.web/compilation section.
    3. .NET Error Pages 1 – Allows you to manage the ASP.NET custom errors, exposing the system.web/customErrors section.
    4. .NET Globalization – Allows you to manage the globalization settings such as file encoding, UI culture, etc. This modifies the system.web/globalization section.
    5. .NET Providers 2 – Allows you to manage the different provider configuration for the ASP.NET providers, such as Roles, Membership and Profile. (system.web/membership, system.web/profile, system.web/roleManager, etc).
    6. .NET Users, .NET Roles and .NET Profile 2 – Configure options that track settings for ASP.NET applications. All of these features use the ASP.NET runtime configuration to let you manage their settings, such as adding users, roles and profile settings. They do not modify configuration; instead they use the configured provider (such as SqlMembershipProvider, SqlRoleProvider, WindowsTokenRoleProvider, etc.).
    7. .NET Trust Levels – Allows you to configure the security trust level policy for the application. Modifies the system.web/trust section.
    8. Application Settings – Allows you to manage the name/value pair stored in the .NET appSettings section.
    9. Connection Strings – Configures the database connection strings that can be used by ASP.NET applications. Manages the connectionStrings section.
    10. Machine Key – Allows you to modify the machine key and other related settings stored in system.web/machineKey section.
    11. Pages and Controls – Allows you to modify settings from the system.web/pages section, such as Base class, Namespaces, and Controls registered.
    12. Session State – Allows you to configure the session state settings such as connection string, cookie configuration and other configuration included in system.web/sessionState.
    13. SMTP E-mail – Configure the SMTP settings such as Server, Delivery mode, or Pickup directory.

    1 – These features are included in Windows 7 and Windows Server 2008 R2, but can be installed for Windows Vista and Windows Server 2008 when downloading the Administration Pack for IIS7.

    2 – Note that these features require hosting the ASP.NET runtime, and due to technical limitations only application pools configured to run .NET version 2.0 will show these features. This means that if you configure your application pool to run .NET 4.0 (in IIS 7.0 and IIS 7.5) you will not see those features. As a workaround you could temporarily change the application pool to run 2.0, make your changes, and switch it back to 4.0 (of course, not recommended for production environments).

    These features are not meant to expose all the settings included in ASP.NET, and they only include configuration settings up to .NET 2.0. I should also add that IIS includes a generic Configuration Editor that allows you to manage a lot more configuration settings from ASP.NET, IIS, and more. In the image below you can see a lot more sections, like webParts, trace, siteMap, and others:


    The best thing is that you can apply the changes immediately, or you can make the changes and just generate the script to automate them later using JavaScript, managed code, or AppCmd.exe.
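    To make this concrete, here is a sketch of the kind of web.config these features produce; the appSettings key/value pair and the Medium trust level are made-up examples, but the sections match the ones listed above:

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
      <!-- Written by the Application Settings feature (key/value are hypothetical) -->
      <appSettings>
        <add key="SiteTitle" value="My Site" />
      </appSettings>
      <system.web>
        <!-- Written by the .NET Compilation feature when Debug is set to True -->
        <compilation debug="true" />
        <!-- Written by the .NET Trust Levels feature when choosing Medium trust -->
        <trust level="Medium" />
      </system.web>
    </configuration>
    ```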


  • CarlosAg Blog

    Using the SEO Toolkit to generate a Sitemap of a remote Web Site


    The SEO Toolkit includes a set of features (like the Robots Editor and Sitemap Editor) that only work when you are working with a local copy of your Web site. The reason behind this is that we have to know where to save the files we generate (like Robots.txt and Sitemap XML files) without having to ask for physical paths, as well as to verify that the functionality is applied correctly, such as only allowing Robots.txt in the root of a site, etc. Unfortunately this means that if you have a remote server for which you cannot run a local copy, then you cannot use those features. (Note that you can still use the Site Analysis tool, since that will crawl your Web site regardless of platform or framework and will store the report locally just fine.)

    The Good News

    The good news is that you can technically trick the SEO Toolkit into thinking you have a working copy locally and allow you to generate the Sitemap or Robots.txt file without too much hassle (“too much” being the key).

    For this sample, let's assume we want to create a Sitemap for a remote Web site; in this case I will use my own Web site, but you can specify your own. Below are the steps you need to follow to enable those features for any remote Web site (even if it is running on other versions of IIS or any other Web server).

    Create a Fake Site

    • Open IIS Manager (Start Menu->InetMgr.exe)
    • Expand the Tree until you can see the “Sites” node.
    • Right-click the “Sites” node and select “Add Web Site…”
    • Specify a Name (in my case I’ll use MySite)
    • Click “Select” to choose the DefaultAppPool from the Application Pool list. This will avoid creating an additional AppPool that will never run.
    • Specify a Physical Path where you will want the Robots and Sitemap files to be saved. I recommend creating just a temporary directory that clearly states this is a fake site. So I will choose c:\FakeSite\ for that.
    • Important: set the Host name so that it matches your Web site's host name.
    • Uncheck “Start Web site immediately”, since we do not need this site to run.
    • Click OK
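    For reference, the steps above end up producing a site entry in applicationHost.config roughly like the one below; the site id and the www.example.com host name are placeholders, so use the host name of your real remote site:

    ```xml
    <site name="MySite" id="2" serverAutoStart="false">
      <application path="/" applicationPool="DefaultAppPool">
        <virtualDirectory path="/" physicalPath="C:\FakeSite" />
      </application>
      <bindings>
        <!-- The host name must match the remote Web site -->
        <binding protocol="http" bindingInformation="*:80:www.example.com" />
      </bindings>
    </site>
    ```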

    This is what my Create Web Site dialog looks like:


    Use the Sitemap Editor

    Now that we have a site that the SEO Toolkit thinks is local, you should be able to use the features as usual.

    • Select the new site created above in the Tree
    • Double-click the Search Engine Optimization icon in the Home page
    • Click the link “Create a new Sitemap”
    • Specify a name, in my case Sitemap.xml
    • Since this is a remote site, you will see that the physical location option shows an empty list, so change the “URL structure” option to use “<Run new Site Analysis>…”, or if you already have a Site Analysis report you can choose that.
    • If creating a new one, just specify a name and click OK (I will use MySite). At this point the SEO Toolkit starts crawling the remote site to discover links and URLs; when it is done it will present the virtual namespace structure for you to work with.
    • After the crawling is done, you can check any files you want to include in your Sitemap and leverage the server response to define the changed date and all the other features as if the content were local, and click OK.

    This is the way the dialog looks after it discovered my remote Web site's URLs:


    You will find your Sitemap.xml generated in the physical directory specified when creating the site (in my case c:\FakeSite\Sitemap.xml).

    Use the Robots Editor

    Just as with the Sitemap Editor, once you prepare a fake site for the remote server, you should be able to use the Robots Editor and leverage the same Site analysis output to build your Robots.txt file.



    In this post I showed how you can use the Sitemap and Robots Editor included in the SEO Toolkit when working with remote Web sites that might be running on different platforms or different versions of IIS.

  • CarlosAg Blog

    Free SEO Analysis using IIS SEO Toolkit


    In my spare time I’ve been thinking about new ideas for the SEO Toolkit, and it occurred to me that rather than continuing to figure out more reports and better diagnostics against random fake sites, it could be interesting to openly offer a free SEO analysis report to anyone who wants one, and test drive some of these ideas against real sites.

    • So what is in it for you? I will analyze your site looking for common SEO errors, create a digest of actions to take and other things (like generating a diagram of your site, layer information, etc.), and deliver it to you in the form of an email. If you agree, I will post some of the results (hiding identifying information like site, URL, etc., so that it is anonymous if needed).
    • And what is in it for me? Well, I will crawl your Web site (once or twice at most, with a limit set to a few hundred pages) using the SEO Toolkit, to test drive some ideas and reporting features that I’m starting to build, and to continue investigating common patterns and errors.

    So if you want in, just post your URL in the comments of this blog (make sure you are reading this from my blog directly, otherwise you might be posting comments in some syndicating site). I will only take the first few sites (if successful I will start another batch in the future) and I will work through them one by one in the following days. Make sure to include a way to contact you, whether using the MSDN user infrastructure or an email address, so that I can contact you with the results.

    Alternatively, I will also take URLs via Twitter, so hurry up and let me know if you want me to look at your site.

  • CarlosAg Blog

    IIS SEO Toolkit and W3C Validation Service


    One thing that I’ve been asked several times about the SEO Toolkit is whether it does a full standards validation on the markup and content that is processed, and if not, to add support for more comprehensive standards validation, in particular XHTML and HTML 4.01. Currently the markup validation performed by the SEO Toolkit is really simple; its main goal is to make sure that the markup is correctly organized, for example that things like <b><i>Test</b></i> are not found in the markup. The primary reason is to make sure that basic blocks of markup are generally "easy" to parse by search engines and that the semantics will not be terribly broken if a link, text or style is not correctly closed (since all of them would affect SEO).

    So the first thing I would say is that we have heard the feedback and are looking at what we could possibly add in future versions, however why wait, right?

    One thing that many people do not realize is that the SEO Toolkit can be extended to add new violations, new metadata and new rules to the analysis process. So, during a demo I gave a few weeks ago, I decided to write a sample that consumes the online W3C Markup Validation Service from the SEO Toolkit.


    You can download the SEOW3Validator including the source code at

    How to install it

    To run it you just need to:

    1. Unzip the contents in a folder.
    2. Install the SEOW3Validator.dll assembly in the GAC:
      1. Open a Windows Explorer window and navigate to c:\Windows\assembly
      2. Drag and Drop the SEOW3Validator.dll to the c:\Windows\assembly explorer window.
      3. Alternatively you can just run gacutil.exe /i SEOW3Validator.dll, usually located at C:\Program Files\Microsoft SDKs\Windows\v6.0A\bin or v7.0A.
      4. If you have problems with this, you could try just copying the assembly to the GAC (copy SEOW3Validator.dll c:\Windows\assembly\GAC_MSIL\SEOW3Validator\\SEOW3Validator.dll)
    3. Register the moduleProvider in Administration.config: in an elevated prompt open C:\Windows\System32\Inetsrv\config\Administration.config and add the following line inside <moduleProviders>, right before the closing </moduleProviders>:
    4.   <add name="SEOW3Validator" type="SEOW3Validator.SEOW3ValidatorModuleProvider, SEOW3Validator, Version=, Culture=neutral, PublicKeyToken=995ee9b8fa017847" />

    You should be able to now run the SEO Toolkit just as before but now you will find new violations, for example in my site I get the ones below. Notice that there are a new set of violations like W3 Validator – 68, etc, and all of them belong to the W3C category. (I would have liked to have better names, but the way the W3 API works is not really friendly for making this any better).


    And when double clicking any of those results you get the details as reported by the W3 Validation Service:


    The Code

    The code is actually pretty simple. The main class, SEOW3ValidatorExtension, derives from CrawlerModule and overrides the Process method to call the W3C Validation Service, sending the actual markup in the request; this means it does not matter whether your site is on an intranet or on the Internet, it will work. For every warning and error returned by the validator it adds a new violation to the SEO report.

    The code looks like this:

        W3Validator validator = new W3Validator();
        // The second argument is elided here; presumably it passes the response markup.
        W3ValidatorResults results = validator.Validate(context.UrlInfo.FileName, /* ... */);

        foreach (W3ValidatorWarning warning in results.Warnings) {
            // Adds a violation to the SEO report for each warning (body elided).
        }

        foreach (W3ValidatorError error in results.Errors) {
            // Adds a violation to the SEO report for each error (body elided).
        }
    I created a helper class, W3Validator, that basically encapsulates the consumption of the W3C Validation Service. The code is far from what I would like it to be; however, there are some "interesting" decisions in the way the API is exposed. I would probably have designed the service differently and not returned the results formatted as HTML, since this is actually an API/Web service that can be presented somewhere other than a browser. So a lot of the code just re-formats the results to look "decent", but to be honest I did not want to spend too much time on it, so everything was put together quite quickly. Also, regarding the names I used for violations: I did not want to hard-code specific message IDs, and since the error message was different for all of them even within the same message ID, it was not easy to provide better messages. Anyway, overall it is pretty usable and should be a good way to do W3C validation.

    Note that one of the cool things you get for free is that, since these are stored as violations, you can re-run the report and use the Compare Report feature to see your progress while fixing them. Also, since they are stored as part of the report, you will not need to keep running the validator over and over; instead you can just open the report and continue looking at them, analyze the data in Reports and Queries, export them to Excel, etc.

    Hopefully this will give you a good example on some of the interesting things you can achieve with the SEO Toolkit and its extensibility.

  • CarlosAg Blog

    IIS SEO Toolkit Available in 10 Languages


    A couple of months ago I blogged about the release of the v1.0.1 of the IIS Search Engine Optimization Toolkit. In March we released the localized versions of the SEO Toolkit so now it is available in 10 languages: English, Japanese, French, Russian, Korean, German, Spanish, Chinese Simplified, Italian and Chinese Traditional.

    Here are all the direct links to download it.

    Name Language Download URL
    IIS SEO Toolkit 32bit english
    IIS SEO Toolkit 64bit english
    IIS SEO Toolkit 32bit ja-jp
    IIS SEO Toolkit 64bit ja-jp
    IIS SEO Toolkit 32bit fr-fr
    IIS SEO Toolkit 64bit fr-fr
    IIS SEO Toolkit 32bit ru-ru
    IIS SEO Toolkit 64bit ru-ru
    IIS SEO Toolkit 32bit ko-kr
    IIS SEO Toolkit 64bit ko-kr
    IIS SEO Toolkit 32bit de-de
    IIS SEO Toolkit 64bit de-de
    IIS SEO Toolkit 32bit es-es
    IIS SEO Toolkit 64bit es-es
    IIS SEO Toolkit 32bit zh-cn
    IIS SEO Toolkit 64bit zh-cn
    IIS SEO Toolkit 32bit it-it
    IIS SEO Toolkit 64bit it-it
    IIS SEO Toolkit 32bit zh-tw
    IIS SEO Toolkit 64bit zh-tw

    Here is a screenshot of the SEO Toolkit running in Spanish.


    If you want to see the download pages in the Microsoft Download Center you can click the links below:

    IIS Search Engine Optimization Toolkit - 32bit ja-jp
    IIS Search Engine Optimization Toolkit - 32bit fr-fr
    IIS Search Engine Optimization Toolkit - 32bit ru-ru
    IIS Search Engine Optimization Toolkit - 32bit ko-kr
    IIS Search Engine Optimization Toolkit - 32bit de-de
    IIS Search Engine Optimization Toolkit - 32bit es-es
    IIS Search Engine Optimization Toolkit - 32bit zh-cn
    IIS Search Engine Optimization Toolkit - 32bit it-it
    IIS Search Engine Optimization Toolkit - 32bit zh-tw

    To learn more about the SEO Toolkit you can visit:

    And for help or to provide us feedback, you can visit the IIS.NET SEO Forum.

  • CarlosAg Blog

    Setting up a Reverse Proxy using IIS, URL Rewrite and ARR


    Today there was a question in the forums asking how to expose two different Internet sites from another site, making them look as if they were subdirectories of the main site.

    So, for example, the goal was to have a site expose a /company1 and a /company2 folder, and have the content for each one served from a different external Web site. Furthermore, we would like to have the responses cached on the server for performance reasons. The following image shows a simple diagram of this:

    Reverse Proxy Sample 

    This sounds easy, since it’s just about routing or proxying every single request to the correct servers, right? Wrong!!! If only it were that easy. It turns out the most challenging thing is that in this case we are modifying the structure of the underlying URLs and the original layout on the servers, which breaks relative paths, and of course images, style sheets (CSS), JavaScript files and other resources are not shown correctly.

    To try to clarify this, imagine that a user's browser requests a page under /company1/, and based on the specification above the request is proxied/routed to the backend server. So far so good. However, imagine that the markup returned in this HTML turns out to have an image tag like “<img src=/some-image.png />”. The problem is that the browser will resolve that relative path using the base path of the original request it made, resulting in a request for /some-image.png at the root of our site instead of under the right “company1” folder, which would be /company1/some-image.png.

    Do you see it? Basically the problem is that any relative paths, and for that matter absolute paths as well, need to be translated to the new URL structure imposed by the original goal.
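    To make the problem concrete, this is the kind of translation the outbound rewriting needs to perform on the markup (paths taken from the example above):

    ```html
    <!-- Markup as returned by the backend server -->
    <img src="/some-image.png" />

    <!-- Markup the browser needs to receive so the path resolves correctly -->
    <img src="/company1/some-image.png" />
    ```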

    So how do we do it then?

    Enter URL Rewrite 2.0 and Application Request Routing

    URL Rewrite 2.0 includes the ability to rewrite the content of a response as it is getting served back to the client which will allow us to rewrite those links without having to touch the actual application.

    Software Required:


    1. The first thing you need to do is enable Proxy support in ARR.
      1. To do that just launch IIS Manager and click the server node in the tree view.
      2. Double click the “Application Request Routing Cache” icon
      3. Select the “Server Proxy Settings…” task in the Actions panel
      4. Make sure the “Enable Proxy” checkbox is checked. What this does is allow any request on the server that is rewritten to a non-local server to be routed to the right place automatically, without any further configuration.
    2. Configure URL Rewrite to route the right folders and their requests to the right site. But rather than bothering you with UI steps I will show you the configuration and then explain step by step what each piece is doing.
    3. Note that for this post I will only take care of Company1, but you can imagine the same steps apply for Company2. To test this you can just save the configuration file below as web.config in your inetpub\wwwroot\ or in any other site root.
    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
      <system.webServer>
        <rewrite>
          <rules>
            <rule name="Route the requests for Company1" stopProcessing="true">
              <match url="^company1/(.*)" />
              <conditions>
                <add input="{CACHE_URL}" pattern="^(https?)://" />
              </conditions>
              <!-- The backend host name is omitted here; the rewrite URL includes it,
                   e.g. {C:1}://backend-host/{R:1} -->
              <action type="Rewrite" url="{C:1}://{R:1}" />
              <serverVariables>
                <set name="HTTP_ACCEPT_ENCODING" value="" />
              </serverVariables>
            </rule>
          </rules>
          <outboundRules>
            <rule name="ReverseProxyOutboundRule1" preCondition="ResponseIsHtml1">
              <!-- The backend host name in this pattern is omitted here -->
              <match filterByTags="A, Area, Base, Form, Frame, Head, IFrame, Img, Input, Link, Script" pattern="^http(s)?://.../(.*)" />
              <action type="Rewrite" value="/company1/{R:2}" />
            </rule>
            <rule name="RewriteRelativePaths" preCondition="ResponseIsHtml1">
              <match filterByTags="A, Area, Base, Form, Frame, Head, IFrame, Img, Input, Link, Script" pattern="^/(.*)" negate="false" />
              <action type="Rewrite" value="/company1/{R:1}" />
            </rule>
            <preConditions>
              <preCondition name="ResponseIsHtml1">
                <add input="{RESPONSE_CONTENT_TYPE}" pattern="^text/html" />
              </preCondition>
            </preConditions>
          </outboundRules>
        </rewrite>
      </system.webServer>
    </configuration>

    Setup the Routing

    <rule name="Route the requests for Company1" stopProcessing="true">
      <match url="^company1/(.*)" />
      <conditions>
        <add input="{CACHE_URL}" pattern="^(https?)://" />
      </conditions>
      <action type="Rewrite" url="{C:1}://{R:1}" />
      <serverVariables>
        <set name="HTTP_ACCEPT_ENCODING" value="" />
      </serverVariables>
    </rule>

    The first rule is an inbound rewrite rule that captures all the requests to the /company1/* folder, so if using the Default Web Site, anything going to http://localhost/company1/* will be matched by this rule and rewritten to the backend server, respecting HTTP vs HTTPS traffic.

    One thing to highlight, which took me a bit of time, is the “serverVariables” entry in that rule, which overwrites the Accept-Encoding header. The reason I do this is that if you do not remove that header, the response will likely be compressed (gzip or deflate), and outbound rewriting is not supported in that case; you will end up with an error message like:

    HTTP Error 500.52 - URL Rewrite Module Error.
    Outbound rewrite rules cannot be applied when the content of the HTTP response is encoded ("gzip").

    Also note that, for security reasons, you need to explicitly allow this server variable before the rule can set it. See enabling server variables here.
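    As a sketch, in applicationHost.config (or through the URL Rewrite UI) the variable can be allowed like this:

    ```xml
    <system.webServer>
      <rewrite>
        <allowedServerVariables>
          <add name="HTTP_ACCEPT_ENCODING" />
        </allowedServerVariables>
      </rewrite>
    </system.webServer>
    ```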


    Outbound Rewriting to fix the Links

    The last two rules rewrite the links, scripts and other resources so that the URLs are translated to the right structure. The first one rewrites absolute paths, and the last one rewrites relative paths. Note that relative paths using “..” will not work with these rules, but you can easily extend the rule above; I was too lazy to do that, and since I never use those when I create a site, it works for me :)

    Setting up Caching for ARR

    A huge added value of using ARR is that with a couple of clicks we can now enable disk caching, so that responses are cached locally on the server and not every single request ends up paying the price of going to the backend servers.

    1. To do that just launch IIS Manager and click the server node in the tree view.
    2. Double click the “Application Request Routing Cache” icon
    3. Select the “Add Drive…” task in the Actions panel.
    4. Specify a directory where you want to keep your cache. Note that this can be any subfolder in your system.
    5. Make sure that “Enable Disk Cache” checkbox is marked in the Server Proxy Settings mentioned above.

    As easy as that, you will now see caching working, and your site will act as a container of other servers on the Internet. Pretty cool, huh? :)

    So in this post we saw how, with literally a few lines of XML, URL Rewrite and ARR let us enable a proxy/routing scenario with the ability to rewrite links and, furthermore, with caching support.

  • CarlosAg Blog

    SEO made easy with IIS URL Rewrite 2.0 SEO templates


    A few weeks ago my team released version 2.0 of URL Rewrite for IIS. URL Rewrite is probably the most powerful rewrite engine for Web applications. It gives you many features, including inbound rewriting (i.e. rewrite the URL, redirect to another URL, abort requests, use of maps, and more), and in version 2.0 it also includes outbound rewriting, so that you can rewrite URLs or any markup as the content is being sent back, even if it is generated using PHP, ASP.NET or any other technology.

    It also includes a very powerful user interface that allows you to test your regular expressions and, even better, a set of templates for common types of rules. Some of those rules are incredibly valuable for SEO (Search Engine Optimization) purposes. The SEO rules are:

    1. Enforce Lowercase URLs. It will make sure that every URL uses only lower case and, if not, it will redirect with a 301 to the lower-case version.
    2. Enforce a Canonical Domain Name. It will help you specify what domain name you want to use for your site and it will redirect the traffic to the right host name.
    3. Append or Remove the Trailing Slash. It will make sure your URLs either include or omit the trailing slash, depending on your preference.
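    As a sketch of what these templates produce, the Enforce Lowercase URLs template generates a rule along these lines (the rule name may differ slightly in your version):

    ```xml
    <rule name="LowerCaseRule1" stopProcessing="true">
      <match url="[A-Z]" ignoreCase="false" />
      <action type="Redirect" url="{ToLower:{URL}}" />
    </rule>
    ```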


    For more information on the SEO Templates look at:

    What is really cool is that you can use the SEO Toolkit to run it against your application and you probably will get some violations around lower-case, or canonical domains, etc. And after seeing those you can use URL Rewrite 2.0 to fix them with one click.

    I have personally used it in my Web site; try the following three URLs, and all of them will be redirected to the canonical form, and you will see URL Rewrite in action:


    Note that in the end those templates just translate to web.config settings that become part of your application and can be XCOPY-deployed with it. This works with ASP.NET, PHP, or any other server technology, including static files. Below is the output of the Canonical Host Name rule, which I use on my Web site’s web.config.

    <?xml version="1.0" encoding="UTF-8"?>
    <rule name="CanonicalHostNameRule1">
      <match url="(.*)" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="^www\.carlosag\.net$" negate="true" />
      </conditions>
      <action type="Redirect" url="http://www.carlosag.net/{R:1}" />
    </rule>

    There are many more features that I could talk about, but for now this was just a quick SEO-related post.

  • CarlosAg Blog

    Analyze your IIS Log Files - Favorite Log Parser Queries


    The other day I was asked if I knew about a tool that would allow users to easily analyze IIS log files, to process and look for specific data in a way that could easily be automated. My recommendation was that if they were comfortable with a SQL-like language, they should use Log Parser. Log Parser is a very powerful tool that provides a generic SQL-like language on top of many types of data, such as IIS logs, Event Viewer entries, XML files, CSV files, the file system and others. It allows you to export the result of the queries to many output formats, such as CSV (Comma-Separated Values), XML, SQL Server, charts and others, and it works well with IIS 5, 6, 7 and 7.5.

    To use it you just need to install it and use the LogParser.exe that is found in its installation directory (on my x64 machine it is located at: C:\Program Files (x86)\Log Parser 2.2).

    I also thought of sharing some of my favorite queries. To run them, just execute LogParser.exe and make sure to specify that the input is an IIS log file (-i:W3C); for ease of use, in this case we will export to a CSV file that can then be opened in Excel (-o:CSV) for further analysis:

    LogParser.exe -i:W3C "Query-From-The-Table-Below" -o:CSV
    Number of hits per client IP, including a reverse DNS lookup (SLOW):

        SELECT c-ip As Machine,
               REVERSEDNS(c-ip) As Name,
               COUNT(*) As Hits
        FROM c:\inetpub\logs\LogFiles\W3SVC1\*
        GROUP BY Machine, Name
        ORDER BY Hits DESC

    Sample output:

        Machine    Name      Hits
        127.X.X.X  MACHINE2  1
    Top 25 file types:

        SELECT TOP 25
               EXTRACT_EXTENSION(cs-uri-stem) As Extension,
               COUNT(*) As Hits
        FROM c:\inetpub\logs\LogFiles\W3SVC1\*
        GROUP BY Extension
        ORDER BY Hits DESC

    Sample output:

        Extension  Hits
        gif        52127
        bmp        20377
        axd        10321
        txt        460
        htm        362
    Top 25 URLs:

        SELECT TOP 25
               cs-uri-stem as Url,
               COUNT(*) As Hits
        FROM c:\inetpub\logs\LogFiles\W3SVC1\*
        GROUP BY cs-uri-stem
        ORDER BY Hits DESC

    Sample output:

        Url                                   Hits
        /WebResource.axd                      10318
        /favicon.ico                          8523
        /Tools/CodeTranslator/Translate.ashx  6519
        /App_Themes/Silver/carlosag.css       5898
        /images/arrow.gif                     5720
    Number of hits per hour for the month of March:

        SELECT QUANTIZE(TO_LOCALTIME(TO_TIMESTAMP(date, time)), 3600) AS Hour,
               COUNT(*) AS Hits
        FROM c:\inetpub\logs\LogFiles\W3SVC1\*
        WHERE date>'2010-03-01' and date<'2010-04-01'
        GROUP BY Hour

    Sample output:

        Hour               Hits
        3/3/2010 10:00:00  33
        3/3/2010 11:00:00  5
        3/3/2010 12:00:00  3
    Number of hits per method (GET, POST, etc.):

        SELECT cs-method As Method,
               COUNT(*) As Hits
        FROM c:\inetpub\logs\LogFiles\W3SVC1\*
        GROUP BY Method

    Sample output:

        Method   Hits
        GET      133566
        POST     10901
        HEAD     568
        OPTIONS  11
    Number of requests made by user:

        SELECT TOP 25
               cs-username As User,
               COUNT(*) as Hits
        FROM c:\inetpub\logs\LogFiles\W3SVC1\*
        WHERE cs-username Is Not Null
        GROUP BY User

    Sample output:

        User           Hits
        Administrator  566
        Guest          1
    Extract values from the query string (d and t) and use them for aggregation:

        SELECT TOP 25
               EXTRACT_VALUE(cs-uri-query,'d') as Query_D,
               EXTRACT_VALUE(cs-uri-query,'t') as Query_T,
               COUNT(*) As Hits
        FROM c:\inetpub\logs\LogFiles\W3SVC1\*
        GROUP BY Query_D, Query_T
        ORDER BY Hits DESC

    Sample output:

        Query_D          Query_T      Hits
        Value in Query1  Value in T1  1556
        Value in Query2  Value in T2  938
        Value in Query3  Value in T3  877
        Value in Query4  Value in T4  768
    Find the slowest 25 URLs (on average) in the site:

        SELECT TOP 25
               cs-uri-stem as URL,
               MAX(time-taken) As Max,
               MIN(time-taken) As Min,
               AVG(time-taken) As Average
        FROM c:\inetpub\logs\LogFiles\W3SVC1\*
        GROUP BY URL
        ORDER BY Average DESC

    Sample output:

        URL                    Max    Min    Average
        /Test/Default.aspx     23215  23215  23215
        /WebSite/Default.aspx  5757   2752   4178
        /Remote2008.jpg        3510   3510   3510
        /wordpress/            6541   2      3271
        /RemoteVista.jpg       3314   2      1658
    List the count of each status and substatus code:

        SELECT TOP 25
               STRCAT(STRCAT(TO_STRING(sc-status), '.'), TO_STRING(sc-substatus)) As Status,
               COUNT(*) AS Hits
        FROM c:\inetpub\logs\LogFiles\W3SVC1\*
        GROUP BY Status
        ORDER BY Status ASC

    Sample output:

        Status  Hits
        200     144
        304     38
        400     9
        403.14  10
        404     64
        404.3   2
        500.19  23
    List all the requests by user agent:

        SELECT cs(User-Agent) As UserAgent,
               COUNT(*) as Hits
        FROM c:\inetpub\logs\LogFiles\W3SVC1\*
        GROUP BY UserAgent

    Sample output:

        UserAgent                            Hits
        iisbot/1.0+(+                        104
        Mozilla/4.0+(compatible;+MSIE+8.0;…  77
        Microsoft-WebDAV-MiniRedir/6.1.7600  23
        DavClnt                              1
    List all the Win32 error codes that have been logged:

        SELECT sc-win32-status As Win32-Status,
               WIN32_ERROR_DESCRIPTION(sc-win32-status) as Description,
               COUNT(*) AS Hits
        FROM c:\inetpub\logs\LogFiles\W3SVC1\*
        WHERE sc-win32-status<>0
        GROUP BY Win32-Status, Description
        ORDER BY Win32-Status ASC

    Sample output:

        Win32-Status  Description                                 Hits
        2             The system cannot find the file specified.  64
        13            The data is invalid.                        9
        50            The request is not supported.               2

    A final note: any time you deal with dates and times, remember to use the TO_LOCALTIME function to convert the log times (which are recorded in UTC) to your local time; otherwise you will find it very confusing when entries seem to be reported at the wrong times.
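    For readers doing the conversion by hand, here is a minimal Python sketch of what TO_LOCALTIME does: interpret the log's date and time fields as UTC and convert them to the machine's local time zone. The timestamp used is made up:

    ```python
    from datetime import datetime, timezone

    # W3C logs record time in UTC; mimic TO_LOCALTIME by converting a
    # hypothetical log timestamp to the machine's local time zone.
    def to_localtime(date_s, time_s):
        utc = datetime.strptime(f"{date_s} {time_s}", "%Y-%m-%d %H:%M:%S")
        return utc.replace(tzinfo=timezone.utc).astimezone()

    local = to_localtime("2010-01-15", "23:30:00")
    print(local.isoformat())
    ```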

    If you need any help you can always visit the Log Parser Forums to find more information or ask specific questions.

    Any other useful queries I missed?

  • CarlosAg Blog

    IIS SEO Toolkit: Find warnings of HTTP content linked by pages using HTTPS


    Are you a developer/owner/publisher of a site that uses HTTPS (SSL) for secure access? If you are, please continue reading.

    Have you ever visited a Web site that is secured using SSL (Secure Sockets Layer) just to get an ugly Security Warning message like:

    Do you want to view only the webpage content that was delivered securely?

    This webpage contains content that will not be delivered using a secure HTTPS connection, which could compromise the security of the entire webpage.


    How frustrating is this for you? Do you think end users know what the right answer to the question above is? Honestly, the Yes/No buttons and the phrasing of the question feel like they would lead me to click the wrong option.

    What this warning is basically trying to tell the user is that even though they navigated to a page you thought was secured using SSL, the page is consuming resources that come from an unsecured location. These could be scripts, style sheets, or other objects that potentially pose a security risk, since they could be tampered with in transit or come from a different location altogether.

    As a site owner/developer/publisher you should always make sure you are not exposing your customers to such a bad experience, leaving them with a question they cannot possibly answer correctly. If they choose Yes, they get an incomplete experience: broken images, broken scripts, or something worse. If they choose No, it is even worse, since you are effectively teaching them to ignore these warnings, which in some cases can be real signs of security issues.

    Bottom line: any issue like this should be treated as a bug and fixed in the application whenever possible.

    But the big question is: how do you find these issues? The answer is simple yet extremely time consuming: navigate to every single page of your site using SSL and, as you do, examine every resource in the page (styles, objects, scripts, etc.) to see whether its URL points to a non-HTTPS location.
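    That manual inspection can be partly automated. The following Python sketch (standard library only, with a made-up page) scans HTML served over HTTPS for sub-resources referenced over plain HTTP; it illustrates the idea, not a replacement for the toolkit:

    ```python
    from html.parser import HTMLParser

    class MixedContentFinder(HTMLParser):
        # Tags whose src/href pulls in sub-resources (not plain navigation links).
        RESOURCE_ATTRS = {"img": "src", "script": "src", "link": "href",
                          "iframe": "src", "object": "data"}

        def __init__(self):
            super().__init__()
            self.insecure = []

        def handle_starttag(self, tag, attrs):
            attr = self.RESOURCE_ATTRS.get(tag)
            url = dict(attrs).get(attr) if attr else None
            if url and url.startswith("http://"):
                self.insecure.append((tag, url))

    # Hypothetical page retrieved over HTTPS: one insecure image, one secure script.
    html = ('<html><img src="http://example.com/a.jpg">'
            '<script src="https://example.com/ok.js"></script></html>')
    finder = MixedContentFinder()
    finder.feed(html)
    print(finder.insecure)
    ```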

    Enter the IIS Search Engine Optimization (SEO) Toolkit.

    The good news is that finding these issues with the SEO Toolkit is extremely simple.

    1. Start a new analysis in the IIS SEO Toolkit using the HTTPS URL of your site, for example:
    2. Once the analysis is done, select the option “Query->Open Query” and open the following XML file:
    3. <?xml version="1.0" encoding="utf-8"?>
       <query dataSource="links">
         <expression field="LinkingUrl" operator="Begins" value="https://" />
         <expression field="LinkedUrl" operator="Begins" value="http://" />
         <expression field="LinkType" operator="NotEquals" value="Link" />
         <expression field="LinkType" operator="NotEquals" value="Rss" />
         <field name="LinkingUrl" />
         <field name="LinkedUrl" />
         <field name="LinkedStatus" />
         <field name="LinkType" />
       </query>
    4. Opening that query will show a Query Window listing all the links in your site that have this problem. Note that the query simply looks for all resources whose linking URL begins with HTTPS and whose target URL uses HTTP, excluding normal navigation links and RSS links (since they do not trigger the warning).
    5. This is what my quick repro looks like. Note that it tells you the type of resource (an image and a style sheet in this case). Additionally, if you double-click a row it shows you exactly the place in the markup where the problem occurs so you can easily fix it.
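    For clarity, the filter the saved query applies can be restated in Python. The link records below are invented, but the conditions mirror the XML expressions: LinkingUrl begins with https://, LinkedUrl begins with http://, and LinkType is neither Link nor Rss:

    ```python
    # Hypothetical (LinkingUrl, LinkedUrl, LinkType) records from a crawl report.
    links = [
        ("https://example.com/", "http://example.com/style.css", "Style"),
        ("https://example.com/", "http://example.com/logo.png", "Image"),
        ("https://example.com/", "http://other.com/page", "Link"),
        ("https://example.com/", "https://example.com/ok.js", "Script"),
    ]

    # Keep only sub-resources fetched over plain HTTP from an HTTPS page,
    # skipping plain navigation links and RSS links (they don't warn).
    mixed = [
        (linking, linked, kind)
        for linking, linked, kind in links
        if linking.startswith("https://")
        and linked.startswith("http://")
        and kind not in ("Link", "Rss")
    ]
    for row in mixed:
        print(row)
    ```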



    Using the IIS SEO Toolkit and its powerful Query Engine, you can easily detect conditions on your site that would otherwise take an incredible amount of time to find and would be prohibitively expensive to check constantly.

  • CarlosAg Blog

    Announcing: IIS SEO Toolkit v1.0.1


    Last week we released a refresh for the IIS Search Engine Optimization (SEO) Toolkit v1.0. This version is a minor update that includes fixes for all the important bugs reported in the IIS.NET SEO Forum.

    Some of the fixes included in this version are:

    1. Pages sending the XHTML content type 'application/xhtml+xml' are not parsed correctly as HTML causing their links and violations to be empty.
    2. Report Analysis fails if the META tags include certain characters.
    3. <style> tag is not parsed correctly if it is empty causing Invalid Markup violations to be flagged incorrectly.
    4. Memory is not released when the "Store Copies of analyzed web pages locally" option is unchecked.
    5. HTML with leading empty lines and doctype fails to parse correctly causing their links and violations to be empty.
    6. Internal Link criteria option of "host: <sitename> and subdomains (*.<sitename>)" fails to work as expected under certain configurations.
    7. System.NullReferenceException when the content attribute is missing in a META tag.
    8. Windows authentication does not work with servers configured with NTLM or Kerberos only challenge.
    9. External META tags are stored in the report making it cumbersome to use the important ones.
    10. Several localization related bugs.
    11. DTD error when navigating to the Sitemap and Sitemap Index User Interface.
    12. And others…

    This release is compatible with v1.0 RTM and will upgrade it if already installed. So go ahead and install the new version using the Web Platform Installer by clicking:


    Learn more about it at:
