Posts
  • CarlosAg Blog

    Extending the TreeView in IIS 7 in Windows Vista

    • 8 Comments

    Extending the Hierarchy Tree View in InetMgr

InetMgr exposes several extensibility points that developers can use to plug in their own features and make them look and feel just like the built-in functionality. One of those extensibility points is the hierarchy tree view, which is exposed mainly through three classes:

1. HierarchyService. This class handles the entire hierarchy; an instance is provided by the UI and you can get a reference to it through a ServiceProvider. It is used to manipulate the tree view programmatically, exposing methods to perform actions such as Select, Delete, Refresh, etc.
2. HierarchyInfo. This abstract class represents a node in the tree view; the Web Sites node, the Default Web Site node and the Connections node are all examples of HierarchyInfo instances. It has properties like Text and Image and lets you react to actions such as selection, deletion, etc. Developers extending the tree view will need to create their own derived classes to implement the behavior, as explained below.
3. HierarchyProvider. This abstract class is the base class for all features that want to extend the tree view. HierarchyService queries each of the registered providers to build the tree view. Developers who wish to add their own nodes should register a HierarchyProvider through the IExtensibilityManager interface.

To extend the tree view with your own set of nodes or context menu tasks, you need to perform the following actions:

    1. Create a class that derives from HierarchyProvider and handles the GetChildren and/or GetTaskItems to provide any nodes or tasks as needed.
    2. Register the HierarchyProvider using the IExtensibilityManager, this is typically done during the Module initialization phase.
3. Handle navigation and synchronization as needed.

     

    Tasks illustrated in this walkthrough include:

    • Creating a HierarchyProvider and creating HierarchyInfo classes
    • Registering a HierarchyProvider
    • Testing the new feature
Note:
This walkthrough is a continuation of the previous three walkthroughs on how to extend InetMgr.

    Task 1: Creating a HierarchyProvider

    HierarchyProvider is the base class that developers need to inherit from in order to get calls from the UI whenever a node needs to be loaded. This way they can choose to add nodes or tasks to the HierarchyInfo node that is passed as an argument.

     

    To create a HierarchyProvider
1. Back in Microsoft Visual Studio 2005, in the ExtensibilityDemo solution, select Add New Item from the Project menu. In the Add New Item dialog, select the Class template and type DemoHierarchyProvider.cs as the name for the file.
    2. Change the code so that it looks as follows:
      using System;
      using Microsoft.Web.Management.Client;

      namespace ExtensibilityDemo {

          internal class DemoHierarchyProvider : HierarchyProvider {

              public DemoHierarchyProvider(IServiceProvider serviceProvider)
                  : base(serviceProvider) {
              }

              public override HierarchyInfo[] GetChildren(HierarchyInfo item) {
                  if (item.NodeType == HierarchyInfo.ServerConnection) {
                      return new HierarchyInfo[] { new DemoHierarchyInfo(this) };
                  }

                  return null;
              }

              internal class DemoHierarchyInfo : HierarchyInfo {

                  public DemoHierarchyInfo(IServiceProvider serviceProvider)
                      : base(serviceProvider) {
                  }

                  public override string NodeType {
                      get { return "DemoHierarchyInfo"; }
                  }

                  public override bool SupportsChildren {
                      get { return false; }
                  }

                  public override string Text {
                      get { return "Demo Page"; }
                  }

                  protected override bool OnSelected() {
                      return Navigate(typeof(DemoPage));
                  }
              }
          }
      }

    The code above creates a class derived from HierarchyProvider that implements the base GetChildren method, verifying that the node being expanded is a ServerConnection; if so, it returns a DemoHierarchyInfo instance that will be added under that connection. The DemoHierarchyInfo class simply specifies its NodeType (a non-localized string that identifies the type of the node), SupportsChildren (false, so that no + sign is offered in the tree view) and Text (the localized text displayed in the tree view). Finally, it overrides the OnSelected method and navigates to the DemoPage.

    Task 2: Registering the HierarchyProvider

    In this task we will register the hierarchy provider created in the previous task so that the HierarchyService starts calling this type to extend the tree view.

    To register the provider
    1. Back in Microsoft Visual Studio 2005, open the file DemoModule.cs and add the following code at the end of the Initialize method to register the provider:
      IExtensibilityManager extensibilityManager =
          (IExtensibilityManager)GetService(typeof(IExtensibilityManager));

      extensibilityManager.RegisterExtension(
          typeof(HierarchyProvider),
          new DemoHierarchyProvider(serviceProvider));
    2. The entire code should look as follows:
      protected override void Initialize(IServiceProvider serviceProvider,
                                         ModuleInfo moduleInfo) {
          base.Initialize(serviceProvider, moduleInfo);

          IControlPanel controlPanel =
              (IControlPanel)GetService(typeof(IControlPanel));

          ModulePageInfo modulePageInfo =
              new ModulePageInfo(this, typeof(DemoPage), "Demo", "Demo Page");

          controlPanel.RegisterPage(modulePageInfo);

          IExtensibilityManager extensibilityManager =
              (IExtensibilityManager)GetService(typeof(IExtensibilityManager));

          extensibilityManager.RegisterExtension(
              typeof(HierarchyProvider),
              new DemoHierarchyProvider(serviceProvider));
      }

    Task 3: Testing the new feature

    To test the feature

    1. Compile everything using Build Solution from the Build Menu and run InetMgr.exe from the <Windows>\System32\InetSrv directory.
    2. Connect to localhost using the TreeView and expand the server connection node.
    3. This will show the new node underneath the connection. When you click on it, it will navigate to the demo page just as expected:

       

    4. Furthermore, the breadcrumb at the top of the UI will automatically discover it, and you will be able to navigate to the page by clicking it, as well as by using the text editing and IntelliSense features it provides.

       

       

     

    Next Steps

    In this lab, you learned how to extend the tree view to customize any node and add your own nodes to it. You can also override the GetTasks method to provide context menu tasks for existing nodes, and the SyncSelection method to customize how synchronization of navigation works.
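    As a rough illustration of the GetTasks override mentioned above, here is a minimal sketch that contributes a context menu task to the demo node created in this walkthrough. Treat the member names used here (GetTasks, TaskItem, MethodTaskItem, and the OnRefreshDemo handler name) as assumptions to verify against the Microsoft.Web.Management.Client documentation; this is a sketch, not the walkthrough's own code:

      // Hedged sketch: a provider that adds a context menu task to the
      // "DemoHierarchyInfo" node. Member names are illustrative assumptions.
      internal class DemoTaskProvider : HierarchyProvider {

          public DemoTaskProvider(IServiceProvider serviceProvider)
              : base(serviceProvider) {
          }

          public override TaskItem[] GetTasks(HierarchyInfo item) {
              if (item.NodeType == "DemoHierarchyInfo") {
                  return new TaskItem[] {
                      // MethodTaskItem(methodName, text, category): the named
                      // method is invoked on the task owner when clicked.
                      new MethodTaskItem("OnRefreshDemo", "Refresh Demo", "Tasks")
                  };
              }
              return null;
          }
      }

    Such a provider would be registered the same way as the hierarchy provider, through IExtensibilityManager.RegisterExtension(typeof(HierarchyProvider), ...).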

  • CarlosAg Blog

    Using Microsoft.Web.Administration in Windows PowerShell

    • 7 Comments

    A couple of months ago I wrote about using LINQ with Microsoft.Web.Administration to manage and query IIS 7.0 configuration. Somebody came back to me and said that LINQ was very cool but very much developer-oriented, and that on a production server without VS or .NET 3.5 it wouldn't be an option. That is a very valid comment, so I decided to show similar things with a tool that ships with Windows and is more IT-oriented: Windows PowerShell.

    So in this blog I will quickly mention some of the things you can easily do with Microsoft.Web.Administration inside Windows PowerShell.

    To start working with Microsoft.Web.Administration, the first thing you need to do is load the assembly. That is quite easy using the methods of the Assembly type.

    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Web.Administration")

    Once you have the assembly available then you will need to create an instance of our ServerManager class that gives you access to the entire configuration system.

    $iis = new-object Microsoft.Web.Administration.ServerManager

    The above line basically declares a variable called $iis that we will be able to use for all of our configuration tasks.

    Now to more interesting stuff.

    Getting the list of Sites

    Getting the list of sites is as easy as accessing the Sites collection; this will output all the information about the sites:

    PS C:\ > $iis = new-object Microsoft.Web.Administration.ServerManager
    PS C:\ > $iis.sites
    ApplicationDefaults : Microsoft.Web.Administration.ApplicationDefaults
    Applications : {DefaultAppPool, DefaultAppPool, DefaultAppPool, DefaultAppPool...}
    Bindings : {}
    Id : 1
    Limits : Microsoft.Web.Administration.SiteLimits
    LogFile : Microsoft.Web.Administration.SiteLogFile
    Name : Default Web Site
    ServerAutoStart : True
    State : Started

    However, we can also specify the information we care about and the format we want, for example:

    PS C:\ > $iis = new-object Microsoft.Web.Administration.ServerManager
    PS C:\ > $iis.sites | select-object Id, Name, State
    Id Name State
    -- ---- -----
    1 Default Web Site Started
    2 Site2 Started
    3 Site3 Started

    You can also use the where-object command to filter down to only the sites that are stopped, and then start them:

    PS C:\ > $iis = new-object Microsoft.Web.Administration.ServerManager
    PS C:\ > $iis.sites | where-object {$_.State -eq "Stopped"} | foreach-object { $_.Start() }

    OK, now let's imagine I want to find all the applications that are configured to run in the default application pool and move them to run in my NewAppPool. This is easy to do in a few lines:

    PS C:\ > $iis = new-object Microsoft.Web.Administration.ServerManager
    PS C:\ > $iis.sites | foreach {
    $_.Applications | where { $_.ApplicationPoolName -eq 'DefaultAppPool' } |
    foreach { $_.ApplicationPoolName = 'NewAppPool' }
    }
    PS C:\ > $iis.CommitChanges()

    Now let's say I want to find the top 20 distinct URLs among all the requests running in all my worker processes that have taken more than one second:

    PS C:\ > $iis = new-object Microsoft.Web.Administration.ServerManager
    PS C:\ > $iis.WorkerProcesses | foreach {
    $_.GetRequests(1000) | sort TimeElapsed -descending |
    select-object Url -unique -first 20 }

    OK, finally let's say I want to display a table of all the applications running under DefaultAppPool and show whether anonymous authentication is enabled or not. (This one is almost on the edge of "you should do it differently", but it is OK if you are only reading a single value from the section.)

    PS C:\ > $iis = new-object Microsoft.Web.Administration.ServerManager
    PS C:\ > $iis.Sites | foreach {
    $_.Applications | where { $_.ApplicationPoolName -eq 'DefaultAppPool' } |
    select-object Path,@{Name="AnonymousEnabled"; Expression = {
    $_.GetWebConfiguration().GetSection("system.webServer/security/authentication/anonymousAuthentication").GetAttributeValue("enabled")
    }}
    }

    Again, the interesting thing is that you can now access all the functionality of Microsoft.Web.Administration from Windows PowerShell very easily, without needing to compile code or anything else. It does take some time to get used to the syntax, but once you do you can do very fancy stuff.

  • CarlosAg Blog

    IIS 7.0 Manager for Windows XP, 2003 and Vista SP1

    • 7 Comments

    Last Wednesday we released the IIS Manager 7.0 client for Windows XP SP2, Windows Server 2003 and Windows Vista SP1.
    This is basically the IIS 7.0 Manager GUI that provides the ability to connect remotely to a Windows Server 2008 running the Web Management Service (WMSVC) to manage IIS 7.0 remotely.

    There are several key differences in this version of IIS Manager and its remote infrastructure:

    1) It allows for the first time users without administrative privileges to connect and manage their web sites and applications remotely

    2) It runs over SSL instead of DCOM, which makes it a firewall-friendly feature that is easy to set up.

    3) It runs as a smart client, which means that if a new feature is installed on the server, clients will automatically download the updated version.

    You can download it from:

    IIS.NET Web Site
    x86:
    http://www.iis.net/downloads/default.aspx?tabid=34&i=1626&g=6

    x64:
    http://www.iis.net/downloads/default.aspx?tabid=34&i=1633&g=6

    Microsoft.com/Downloads
    http://www.microsoft.com/downloads/details.aspx?FamilyID=32c54c37-7530-4fc0-bd20-177a3e5330b7&displaylang=en

    To learn more about remote management and how to install it:
    http://learn.iis.net/page.aspx/159/configuring-remote-administration-and-feature-delegation-in-iis-7/

    Now, to really show you what this is, I created a very simple demo that briefly shows the remote management capabilities over SSL. (Below there is a transcript in case my accent makes my English difficult to understand. :))

    Transcript:

    The purpose of this demonstration is to show you how easy it is to manage IIS 7.0 running in Windows Server 2008, from any machine that has Windows XP or Windows 2003 or Windows Vista by downloading the IIS Manager 7.0 that runs on all of those platforms.

    Now, today I am not going to focus on the details of how to configure it and how to setup the server to support remote management, but mainly just focus on the client aspect.

    One of the most interesting aspects of this remote management infrastructure is that it now uses HTTPS to communicate with the server, making it a nice firewall-friendly remote management feature. Another key aspect is that it allows users without administrative privileges to connect and manage their Web sites or applications in a delegated way, where an administrator can restrict which options they can modify.

    OK, so to show you this I have here a Windows Server 2008 machine with IIS 7.0 installed, and as you would expect I can manage it locally quite easily using IIS Manager, whether it's adding a Web site or managing configuration for both IIS and ASP.NET.

    This is all good, but it turns out I don't want to connect locally; instead I want to connect to the server remotely from my development machine and still have the same experience as if I were logged on locally.

    To show this, I have here a Virtual PC image running a clean install of Windows XP SP2, the only thing it has installed additionally is the .NET Framework 2.0 which is the only requirement for the installation of IIS Manager 7.

    I have already downloaded the IIS Manager installer, which is only about 3MB, and which you can find at http://www.iis.net or http://microsoft.com/downloads.

    Installing it is really simple and fast, just double click the icon and click next…

    Once installed I can now connect to any machine running Windows Server 2008 that has been configured to support remote management. To do that I just need to choose the option “Connect To Server/Site/Application” from the File Menu or the Start Page.

    Today, I will not drill down on the multiple differences between these connections, so for now I will just show how you can connect and manage the entire server by using a Windows Administrator account.

    Another interesting feature of the remote management platform is that if a new feature built on top of the UI management extensibility API is installed on the server, when I connect again it will automatically prompt me to download the new functionality, and I can choose which features to install.

    To summarize, IIS Manager 7.0 for Windows XP SP2, Windows Server 2003 SP1 and Windows Vista SP1 is available now; it only depends on the .NET Framework 2.0, and it allows you to connect to a remote server and have the same rich experience as if you were local, using the new SSL remoting architecture.

  • CarlosAg Blog

    Looking for an Icon Editor?

    • 6 Comments

    If you are like me and every now and then develop applications that you want to create an icon for, and you don't have the money to spend on a nice commercial tool (or would rather spend it on Halo 3, Rock Band or Guitar Hero III), then you probably face the problem that tools don't tend to support saving bitmaps as icons.

    In the past I was using the old icon editor that came with... mhh... was it Visual Basic 3? Life used to be good, because back then icons were mostly 16 colors, you mostly cared about the 16x16 and 32x32 sizes, and the tool was decent for those scenarios. But now that icons can use thousands of colors and sizes up to 256x256, a 32x32 16-color icon certainly doesn't give your application a professional look.

    So I decided I was going to build my own icon editor and started coding it in managed code. In the middle of it, while searching the internet for the icon format and reading all the details of headers, APIs, etc., a search for a specific struct yielded a result that basically stopped me from writing any more code.

    Basically there is a sample in MSDN that has all the support that I need.

    It supports creating an icon and adding multiple formats to it, from 16x16 16-color, through 48x48 24-bit color, up to 128x128 32-bit color. It then lets you import a bitmap and mark the transparent colors (the XOR and AND masks) and everything. It additionally supports extracting icons from files; although I'm sure it could do a better job with PNG-compressed files and other things, it's still a million times better than my old icon editor.

    The nice thing is that it even comes with the source code in C++, in case you are interested in learning more about icon files.

    The article is from September 1995, but it's still great.

    http://msdn2.microsoft.com/en-us/library/ms997538.aspx

    The code and executable can be downloaded from here:

    http://download.microsoft.com/download/win95/utility/1.0/w9xxp/en-us/4493.exe

    Disclaimer: I do not know how good or bad the source code is, nor do I mean to recommend it over any other commercial tool; I'm just saying it's good enough for my "hobbyist" projects and you might find it useful.

  • CarlosAg Blog

    See you at PDC

    • 6 Comments

    Well, I'm really excited to be heading to PDC'05. It sure promises to be an interesting one, where we can finally show off the many new technologies we have been working so hard to create during these past few months.

    I will be there along with many of my teammates from IIS, trying to show off the really cool new features we have built into the next version of IIS, which I'm sure will rock your mind if you are a Web lover like me.

    So I really hope to meet a lot of you guys out there and try to answer any questions you might have.

     

  • CarlosAg Blog

    Internet Information Services (IIS) 7.0 Manager for Windows XP, 2003 and Windows Vista SP1 RC0 is available for download

    • 6 Comments

    NOTE: RTM has been released see the following blog: http://blogs.msdn.com/carlosag/archive/2008/03/04/IISManagerForWindowsXPand2003andVista.aspx 

    With the release of Windows Server 2008 RC0, in IIS we are also releasing the ability to manage the Web Server, the new FTP Server and the new modules remotely using IIS Manager 7.0.

    In past Betas we shipped similar functionality under a different name; however, this is now the real mechanism through which we will support remote administration from different Windows versions once the final version of Windows Server 2008 ships.

    The reason this release in particular is exciting is that for the first time all the UI extensibility is enabled on these platforms, making it possible to build your own UI modules, install them on the server, and have the clients that connect to your server automatically download the new functionality and use it as if it were part of the IIS Manager release.

    Another reason this is important to us is that this is the first time we are releasing x64 support, which is required for customers using Windows Vista 64-bit edition or any other 64-bit version of Windows.

    You can download and install them from:

    x86

    http://www.iis.net/downloads/default.aspx?tabid=34&g=6&i=1524

    x64

    http://www.iis.net/downloads/default.aspx?tabid=34&i=1525&g=6

    Note: This RC0 version will not be able to connect to any older build of Windows Server 2008, including Beta 3, so if you still need to manage a Beta 3 server you will need to install the Beta 3 build of the tool, which can safely live side-by-side with the RC0 build.

  • CarlosAg Blog

    Very funny blog to read, Linux Hater's blog

    • 6 Comments

    One of the things I try to do regularly is read blogs that are not necessarily pro-Microsoft, and one of my favorites is Miguel de Icaza's blog. The other day a post of his caught my attention, about a blog he says he is a "fan of" named "Linux Hater's blog". So of course I decided to give it a read and started going through the entries, and I just could not stop laughing, and before I noticed I had been reading for almost an hour. I do have to warn you, though: the vocabulary used in their entries is... um... let's say fluid.

    Disclaimer: I'm not saying their entries are right or wrong; I just literally couldn't stop laughing at some of the comments and replies. In some cases I do think they are a bit biased, and in some cases they have a good message that could help advance the software industry in general.

    Anyway, if you want to read a very "funny-colorful-passionate" blog, I think it's worth a read: http://linuxhaters.blogspot.com/

  • CarlosAg Blog

    Faster IIS Web Sites Provisioning using Microsoft Web Administration

    • 6 Comments

    Yesterday I got an email about some performance numbers one of our customers was running into when remotely creating Web sites, applications, application pools and other objects in IIS using Microsoft.Web.Administration. In case you don't know, Microsoft.Web.Administration is a .NET library that exposes the IIS configuration system.

    The code was using Microsoft.Web.Administration from PowerShell to create a single application pool and a new Web site, and finally assign the root application to the new application pool, measuring each of the tasks. The script was then run repeatedly until 10,000 sites were created. They then built a very nice Excel spreadsheet with a chart of how long each operation took, which clearly showed how fast (or slow) things were depending on the number of sites already in IIS.

    Funny enough, the code was doing almost exactly what we used to measure IIS 7.0 provisioning performance, and we ran into very similar numbers a couple of years ago but probably never shared them. The results back then showed a trend like the one below: creating the application pool was quite fast (less than 400ms even with 10,000 sites), the application pool assignment took almost 10 times longer (almost 4 seconds), and creating the sites took almost 20 times longer (up to 8 seconds).

    [Chart: sample timing data]

    Our test code looked something like this (removing all the Timing code):

    using (ServerManager m = new ServerManager()) {
        ApplicationPool pool = m.ApplicationPools.Add("MyPool");

        Site site = m.Sites.Add("MySite", ... @"d:\inetpub\wwwroot");
        site.Applications["/"].ApplicationPoolName = "MyPool";

        m.CommitChanges();
    }

    Back in those days (2005) the numbers were even worse; you would literally have to wait minutes for a site to be created. But we profiled it, understood exactly what the performance behavior was, and made it better.

    Here is how to make it faster

    1. Most importantly, batch as many changes as you can so that you call CommitChanges only once; for example, creating the 10,000 sites and committing once will probably average 1,000 sites per second. More importantly, you will reduce the load on the server, since every time the configuration changes the server has to process the change, costing additional CPU cycles in WAS and W3SVC.
    2. Application pool creation looked slower at the beginning when using ServerManager locally only because it was the first operation in our test script. When you create a ServerManager, nothing interesting actually happens other than activating a COM object that parses the schema; the real XML parsing of files happens the first time you request a section. This means the first operation (in this case, accessing the application pool collection) pays the price of parsing machine.config, root web.config and applicationHost.config, the file that grows considerably as new sites and pools are added.
    3. Using ServerManager remotely (OpenRemote) will definitely be several times slower, because every time a property is set or anything is changed the code hits the network to set the value on the remote objects; the client holds only proxies. This means that setting appPool.Name actually goes over the network to set the value in the dllhost.exe hosting DCOM.
    4. Finally, the most important one: the reason site creation is extremely slow is the way our test code created the site and the application. In particular, it used the method m.Sites.Add(), which has several overloads, most of them providing "friendly but non-performant" combinations. None of them receives a site ID, which forces us to calculate one automatically every time a site is added. We have implemented two algorithms for this (selected by a registry key called IncrementalSiteIDCreation), and both need to read all the existing sites first to make sure we don't generate an ID that is already in use. This means that if you have 4,000 sites we have to read all 4,000 sites, which over the network translates to literally thousands of round trips.
      So, bottom line: use m.Sites.CreateElement() followed by m.Sites.Add(newSite), which will not auto-generate a site ID and so does not incur the costs mentioned. Something like the snippet below. Notice that there is a little more code, since you also need to create the root application and set the bindings, but this proves useful anyway: setting the application pool name requires the application object, and you save all the lookups on the Sites list as well.
      using (ServerManager m = new ServerManager()) {
          ApplicationPool pool = m.ApplicationPools.Add("MyPool");

          Site site = m.Sites.CreateElement("site");
          site.Name = "MySite";
          site.Id = "MySite".GetHashCode();

          Application app = site.Applications.Add("/", @"d:\inetp...");
          app.ApplicationPoolName = "MyPool";

          m.Sites.Add(site);
          m.CommitChanges();
      }

    After all that story, what is important to realize is that there are ways to make this much faster, and you should use them if you are going to do any sort of massive site creation. With the two changes outlined above (specify the site ID yourself, and don't look up the site again to get the application) the results were incredible: we were back to under 350ms for all the tasks.
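    To tie the advice together, here is a minimal sketch of batching many site creations into a single CommitChanges call, combining points 1 and 4 above. The site count, site names, bindings and paths are made up for illustration; adjust them to your environment:

      using System;
      using Microsoft.Web.Administration;

      class BatchProvisioning {
          static void Main() {
              const int siteCount = 100; // hypothetical batch size

              using (ServerManager m = new ServerManager()) {
                  ApplicationPool pool = m.ApplicationPools.Add("MyPool");

                  for (int i = 0; i < siteCount; i++) {
                      string name = "MySite" + i;
                      Site site = m.Sites.CreateElement("site");
                      site.Name = name;
                      site.Id = 1000 + i; // supply the ID ourselves, no lookups
                      site.Bindings.Add("*:80:" + name, "http");

                      Application app = site.Applications.Add("/", @"d:\inetpub\wwwroot");
                      app.ApplicationPoolName = "MyPool";

                      m.Sites.Add(site);
                  }

                  // One commit for the whole batch: WAS/W3SVC process the
                  // configuration change once instead of once per site.
                  m.CommitChanges();
              }
          }
      }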

  • CarlosAg Blog

    IIS 7.0 Remote Administration and Database Manager Video

    • 5 Comments

    DiscountASP.net published a very nice video that shows how you can enable IIS Manager and Database Manager and other modules for their customers.

    If you don't use DiscountASP.net as your ISP, it is at least interesting to see what IIS 7.0's remote administration capabilities over HTTPS and delegated management look like. You can also see the Database Manager in action, which you can download for free from http://learn.iis.net/page.aspx/416/basics-of-database-manager/

    The first couple of minutes show how they expose this functionality to their customers; if you just want to see the IIS 7.0 features running, seek to minute 2:00.

    http://iis7test.com/iis7_modules/iis7_modules.html

  • CarlosAg Blog

    SEO Tip - Beware of the Login pages - add them to Robots Exclusion

    • 5 Comments

    A lot of sites today let users sign in to see some sort of personalized content, whether it's a forum, a news reader, or an e-commerce application. To simplify their users' lives, they usually offer the ability to log on from whatever page of the site the user is currently looking at. Similarly, to keep navigation simple, Web sites usually generate a dynamic link so users can go back to the page they were on before visiting the login page, something like: <a href="/login?returnUrl=/currentUrl">Sign in</a>.

    If your site has a login page, you should definitely consider adding it to the robots exclusion list, since it is a good example of something you do not want a search engine crawler to spend its time on. Remember, crawlers give your site a limited amount of time, and you really want them to focus on what is important.

    Out of curiosity I searched for login.php and login.aspx and found over 14 million login pages… that is a lot of useless content in a search engine.

    Another big reason is that having this kind of URL, varying per page, means there will be hundreds of variations crawlers need to follow (/login?returnUrl=page1.htm, /login?returnUrl=page2.htm, etc.), basically doubling the work for the crawler. Even worse, if you are not careful you can easily cause an infinite loop by including the same "login" link on the login page itself: you get /login?returnUrl=login as the link, and clicking that yields /login?returnUrl=login?returnUrl=login... and so on, with an ever-changing URL for each page on your site. Note that this is not hypothetical; it is a real example from a few famous Web sites (which I will not disclose). Of course crawlers will not crawl your site infinitely; they are not that silly and will stop after seeing the same resource /login a few hundred times, but this still reduces the time they spend looking at what really matters to your users.

    IIS SEO Toolkit

    If you use the IIS SEO Toolkit, it will detect when the same resource (like login.aspx) is being requested too many times with only the query string varying, and will give you a violation like: "Resource is used too many times."

     

    So how do I fix this?

    There are a few fixes, but by far the best thing to do is just add the login page to the Robots Exclusion protocol.

    1. Add the URL to /robots.txt. You can use the IIS Search Engine Optimization Toolkit to edit the robots file, or just drop a file with something like:
      User-agent: *
      Disallow: /login
    2. Alternatively (or additionally)  you can add a rel attribute with the nofollow value to tell them not to even try. Something like:
      <a href="/login?returnUrl=page" rel="nofollow">Log in</a>
    3. Finally make sure to use the Site Analysis feature in the IIS SEO Toolkit to make sure you don't have this kind of behavior. It will automatically flag a violation when it identifies that the same "page" (with different Query String) has already been visited over 500 times.

    Summary

    To summarize, always add the login page to the robots exclusion protocol file; otherwise you will end up:

    1. sacrificing valuable "search engine crawling time" in your site.
    2. spending unnecessary bandwidth and server resources.
    3. potentially even blocking crawlers from your content.