CarlosAg Blog: Posts

WinSxS is huge… Free up a few gigabytes with DISM


I was running out of disk space on C: and was unable to install a small piece of software that I needed, so I decided to clean up a bit. For that I like using WinDirStat (http://windirstat.info/), which very quickly lets you find where the big files and folders are. In this case I found that my C:\Windows\winsxs folder was over 12 GB in size. One way to reclaim some of that disk space is to clean up the files that were backed up when a service pack was installed. To do that in Windows 7 you can run the following DISM command:

    dism /online /cleanup-image /spsuperseded /hidesp


That freed up 4 GB on my machine, and now I can move on.

Disclaimer: I only ran this on my Windows 7 machine and it worked great; I have not tried it on Server SKUs, so run it at your own risk.
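As a side note, on Windows 8 and later DISM also exposes a more general component-store cleanup option. A hedged sketch of that command (it does not apply to Windows 7):

dism /online /cleanup-image /startcomponentcleanup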


    It has been a long time since last post


Wow, I just realized that in the last 6 months I’ve only had a chance to post 2 items, and I think it is about time to get this going again.

So why so much silence? Well, about 8 months ago a couple of big changes happened in my division, as described in this link. As part of that transition my responsibilities changed: I went from being the Development Manager for the Web Platform (IIS, WebMatrix, WebDeploy, etc…) to taking on a new role and starting a new team that we called the Azure UX team. Our team is in charge of reimagining the Windows Azure user experience, and we set out on a mission to make it even better. As part of that we’ve been super busy delivering a set of projects, some of which we released in December, such as our brand new Windows Azure Web site at http://www.windowsazure.com/, the new Windows Azure Billing Web site at https://account.windowsazure.com/, and several updates to the Windows Azure Management Portal at http://windows.azure.com/, and some that have not yet been released.

These last months have been quite a ride. I feel privileged to have been part of the IIS family and community and its many releases since even before Windows Vista shipped: IIS 7, 7.5, ARR, URL Rewrite, Web Deploy, WebMatrix, WebPI, IIS Media Services, FTP, WebDAV, SEO Toolkit, WFF, FastCGI, WinCache, AdminPack, PowerShell, Database Manager, Dynamic IP Restrictions, the many more releases my team delivered, and obviously the awesome new version in Windows 8. I will always remain close to all of them: the people, the products, and the community.

At the same time, I’m extremely happy and excited to now have a chance to be part of such an amazing group of people in the AUX team: a fast-paced, dedicated, and professional group working on a new mission that will deeply influence the shape of the cloud.


    Using Windows Authentication with Web Deploy and WMSVC


By default in Windows Server 2008, when you are using the Web Management Service (WMSVC) with Web Deploy (also known as MSDeploy), your deployments are performed using Basic authentication. If you want to enable Windows authentication, you will need to set a registry key so that the Web Management Service also supports NTLM. To do this, update the registry on the server by adding a DWORD value named "WindowsAuthenticationEnabled" under HKEY_LOCAL_MACHINE\Software\Microsoft\WebManagement\Server, and set it to 1. If the Web Management Service is already started, the setting takes effect only after the service is restarted.
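If you prefer to script this, a minimal sketch of the same change from an elevated command prompt, including the service restart (WMSvc is the service name of the Web Management Service):

reg add "HKLM\Software\Microsoft\WebManagement\Server" /v WindowsAuthenticationEnabled /t REG_DWORD /d 1 /f
net stop WMSvc
net start WMSvc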

    For more details on other configuration options see:

    http://technet.microsoft.com/en-us/library/dd722796(WS.10).aspx


    Razor Migration Notes 3: Use app_offline.htm to deploy the new version


This is the third post in the series:

    1: Moving a SitemapPath Control to ASP.NET Web Pages

    2: Use URL Rewrite to maintain your Page rankings (SEO)


ASP.NET has a nice feature to help with deployment: you can drop an HTML file named app_offline.htm into the root of your application, and ASP.NET will unload all the assemblies and code it has loaded, letting you easily delete binaries and deploy the new version while still serving customers the friendly message you provide telling them that your site is under maintenance.

One caveat, though, is that Internet Explorer users might still see the “friendly” error page that IE displays instead of your nice message. This happens because of a page-size check that IE performs. See Scott’s blog on how to work around that problem: App_Offline.htm and working around the IE Friendly Errors.
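To make that concrete, here is a minimal app_offline.htm sketch; the padding comment reflects the workaround from Scott's post, which pads the file past the size below which IE substitutes its own friendly error page:

<html>
  <head><title>Site under maintenance</title></head>
  <body>
    <h1>We will be right back</h1>
    <p>The site is being updated and will be back online shortly.</p>
    <!-- Padding: add enough extra characters here to push the file past
         512 bytes so Internet Explorer renders this page instead of its
         friendly error message. -->
  </body>
</html>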

Note: the live site is now running on .NET 4.0, all using Razor.


    Razor Migration Notes 2: Use URL Rewrite to maintain your Page rankings (SEO)


This is the second note in the series:

    1: Moving a SitemapPath Control to ASP.NET Web Pages

My current Web site was built using ASP.NET 2.0 and WebForms, which means all of my pages have the .aspx extension. As I move each page to ASP.NET Web Pages, its extension changes to .cshtml, and while I’m sure I could configure things so they keep their .aspx extensions, this is a good opportunity to “start clean”. Furthermore, ASP.NET Web Pages also lets you access pages without any extension at all, so if you have /my-page.cshtml you can also get to it using just /my-page. Given that I will go through this migration anyway, I decided to use the clean URL format (no extension) and, in the process, get better URLs for SEO purposes. For example, today one of my URLs looks like http://www.carlosag.net/Articles/configureComPlus.aspx; this is a good time to enforce lower-case semantics, get rid of that ugly camel casing, and use a much more standard, search-engine-friendly format with “-”, like http://www.carlosag.net/articles/configure-com-plus.

Use URL Rewrite to keep your page ranking and avoid broken links

The risk, of course, is that if you just change the URLs of your site you will end up not only with lots of 404s (Not Found), but your page ranking will be reset and you will lose all the “juice” that external links and history have given it. The right way to do this is to perform a permanent redirect (301) from the old URL to the new URL; this way search engines (and browsers) know that the content has permanently moved to a new location and should “pass all the page ranking” to the new page.

There are many ways to achieve this, but I happen to like URL Rewrite a lot, so I decided to use it. I basically created one rule that uses a rewrite map (think of it as a dictionary) to match the URL; if it matches, the rule performs a permanent redirect to the new URL. So, for example, if /aboutme.aspx is requested, it will 301 to /about-me:

<?xml version="1.0"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Redirect for OldUrls" stopProcessing="true">
          <match url=".*"/>
          <conditions>
            <add input="{OldUrls:{REQUEST_URI}}" pattern="(.+)"/>
          </conditions>
          <action type="Redirect" url="{C:1}" appendQueryString="true" redirectType="Permanent"/>
        </rule>
      </rules>
      <rewriteMaps>
        <rewriteMap name="OldUrls">
          <add key="/aboutme.aspx" value="/about-me"/>
          <add key="/soon.aspx?id=1" value="/coming-soon"/>
          <add key="/Articles/configureComPlus.aspx" value="/articles/configure-com-plus"/>
          <add key="/Articles/createChartHandler.aspx" value="/articles/create-aspnet-chart-handler"/>
          <add key="/Articles/createVsTemplate.aspx" value="/articles/create-vs-template"/>
          ...
        </rewriteMap>
      </rewriteMaps>
    </rewrite>
  </system.webServer>
</configuration>


Note that I could also have created a simple rule that just changes the extension (sketched below); however, I decided that I also wanted to change the page names. The best thing is that you can do this incrementally, rewriting each URL only once its new page is ready, and even switch back to the old one later if any problems occur.
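For completeness, a minimal sketch of what that simpler rule might look like; the rule name is made up for illustration, and ToLower is a built-in URL Rewrite function, so this redirects any .aspx request to the lower-case, extensionless form of the same path:

<rule name="DropAspxExtension" stopProcessing="true">
  <match url="^(.*)\.aspx$"/>
  <action type="Redirect" url="{ToLower:{R:1}}" redirectType="Permanent"/>
</rule>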

    Summary

Using URL Rewrite you can easily keep your SEO standing and avoid broken links. You can achieve a lot more as well; check out: SEO made easy with IIS URL Rewrite 2.0 SEO templates – CarlosAg


    Razor Migration Notes 1: Moving a SitemapPath Control to ASP.NET Web Pages


After many years I decided that it is time to rewrite my Web site using Razor. A bit of history: I started it around 2003 using ASP.NET 1.1. When .NET 2.0 came around in 2005 I migrated to it, and it was great being able to leverage features like MasterPages, Themes, Sitemaps, and many others. Honestly it is a pretty simple Web site, mostly content, with very few controls: a Sitemap, my own custom Menu control, and a couple more. Last week it was moved to .NET 4.0, and it feels like it’s about time to go back and update it a bit, both in look and in features. So this (if time permits) will be the first in a series of migration notes on things I discover as I move the site to ASP.NET Razor (aka WebPages). Do note that this is not meant to be a best practice in any way; I would never claim I could produce such a thing. These are only my personal notes as I discover more details of the ASP.NET WebPages features and move my own implementation over to them.

So with that, one of the first things I faced during this migration was the use of a sitemap control (asp:SiteMapPath) in my MasterPage (a future post about moving away from MasterPages is coming). I knew about the SiteMap API, so I decided to write a simple Sitemap helper that I can now use anywhere in Razor. The code is pretty simple: it generates an unordered list of links using <ul> and <li> with <a> inside, and uses CSS to lay them out the way I liked.

    SitemapPath Control in WebForms

    The original code I was using in my MasterPage looked like the following:

    <asp:SiteMapPath CssClass="HeaderText" runat="server" ID="siteMap" ShowToolTips="true" NodeStyle-ForeColor="White" CurrentNodeStyle-Font-Bold="true" />

    And generated the following markup:

    <span id="siteMap" class="HeaderText"><a href="#siteMap_SkipLink"><img alt="Skip Navigation Links" height="0" width="0" src="http://blogs.msdn.com/WebResource.axd?d=S2jbW9E-HYlS0UQoRCcsm94KUJelFI6yS-CQIkFvzT6fyMF-zCI4oIF9bSrGjIv4IvVLF9liJbz7Om3voRpNZ8yQbW3z1KfqYr4e-0YYpXE1&amp;t=634219272564138624" style='border-width:0px;' /></a><span><a title='Home' href='/' style='color:White;'>Home</a></span><span> &gt; </span><span><a title='Free tools for download' href='/Tools/' style='color:White;'>Tools</a></span><span> &gt; </span><span style='color:White;font-weight:bold;'>Code Translator</span><a id='siteMap_SkipLink'></a></span>

    Which looks like the following in the browser:

[Screenshot: the SiteMapPath control rendered in the browser]

I used some CSS to set the color, background, and other things, but setting the last item to bold still required me to use a property on the SiteMapPath control to get it to look the way I wanted.

    My Sitemap Helper in Razor

Since I was familiar with the SiteMap API and my goal was to change as “little” as possible in this first migration, I decided to write a Sitemap helper that I can use in my Layout pages. The code in the page is as simple as it gets: you just call @Helpers.Sitemap() and that’s it (I added the div below to give some context in the markup, but it was already there with the SiteMapPath control anyway):

    <div class="bannerPath">
    @Helpers.Sitemap()
    </div>

This new helper version generates the markup below. I don’t know about you, but I can make a lot more sense of what it says, and I imagine search engines will as well. I decided to use more semantically correct markup: a <nav> element to signal the navigational section, and a list of links.

    <nav>
        <ul class="siteMap">
            <li><a href="http://blogs.msdn.com/" title="Home">Home</a>&nbsp;&gt;&nbsp;</li>
            <li><a href="http://blogs.msdn.com/Tools/" title="Free tools for download">Tools</a>&nbsp;&gt;&nbsp;</li>
            <li><span>Code Translator</span></li>
        </ul>
    </nav>

And it looks like the following in the browser (I decided to remove the underlining, add more padding, and use a new font, but all of that is CSS):

[Screenshot: the Razor Sitemap helper rendered in the browser]

    The Sitemap helper code

The code for the Sitemap helper is pretty simple: it just uses the SiteMap API to get the current node. Since I’m picky and wanted to generate the markup in the “right” order (note you could use CSS to float the items to the right instead), I used a Stack to push the nodes while traversing up the hierarchy, then pop them to generate each <li>.

    @helper Sitemap()
    {
        SiteMapNode currentNode = SiteMap.CurrentNode;
        <nav>
        <ul class="siteMap">
        @if (currentNode != null)
        {
            // Push into a stack to reverse them
            var node = currentNode;
            var nodes = new Stack<SiteMapNode>();
            while (node.ParentNode != null)
            {
                nodes.Push(node.ParentNode);
                node = node.ParentNode;
            }
           
            while(nodes.Count != 0)
            {
                SiteMapNode n = nodes.Pop();
                <li><a href="@n.Url" title="@n.Description">@n.Title</a>&nbsp;&gt;&nbsp;</li>
            }
            <li><span>@currentNode.Title</span></li>
        }
        else
        {
            <li><span>@Page.Title</span></li>
        }
        </ul>
        </nav>
    }


    To make it look the way I wanted I used the following CSS:

.siteMap { float: right; font-size: 11px; color: White; display: inline;
           margin-top: 3px; margin-bottom: 3px; margin-left: 0px; margin-right: 10px; }
.siteMap li, span { float: left; list-style-type: none; padding-left: 5px; border-width: 0px; }
.siteMap span { font-weight: bold; }
.siteMap a, a.Visited { color: White; text-decoration: none; }


    Conclusion

The SiteMapPath control gives you a really easy way to put together a navigation control based on the SiteMap APIs (and the Web.sitemap file in my case). Creating a simple ASP.NET Razor helper is actually pretty easy, since all the functionality needed is there in the base APIs, and although it required some code (about 20 lines), I now feel I have more control over my markup, can style it any way I want using CSS, and render cleaner markup.

I’m sure there are better ways to do this, but as I said, the goal of this first pass is to push my site out soon with as few changes as possible while keeping the same functionality.


    Get IIS bindings at runtime without being an Administrator


Today there was a question on StackOverflow asking whether it is possible to read IIS binding information, such as port and protocol, from within the ASP.NET application itself, in order to handle redirects from HTTP to HTTPS in a way that is reliable without assuming ports 80/443.

    Turns out this is possible in the context of the IIS worker process by using Microsoft.Web.Administration.

The following function takes care of that by reading the worker process's isolated configuration file and finding the HTTP-based bindings.

    private static IEnumerable<KeyValuePair<string, string>> GetBindings(HttpContext context) {
        // Get the site name
        string siteName = System.Web.Hosting.HostingEnvironment.SiteName;

        // Get the sites section from the AppPool.config
        Microsoft.Web.Administration.ConfigurationSection sitesSection =
            Microsoft.Web.Administration.WebConfigurationManager.GetSection(null, null, "system.applicationHost/sites");

        foreach (Microsoft.Web.Administration.ConfigurationElement site in sitesSection.GetCollection()) {
            // Find the right site
            if (String.Equals((string)site["name"], siteName, StringComparison.OrdinalIgnoreCase)) {

                // For each binding see if it is http-based and return the port and protocol
                foreach (Microsoft.Web.Administration.ConfigurationElement binding in site.GetCollection("bindings")) {
                    string protocol = (string)binding["protocol"];
                    string bindingInfo = (string)binding["bindingInformation"];

                    if (protocol.StartsWith("http", StringComparison.OrdinalIgnoreCase)) {
                        string[] parts = bindingInfo.Split(':');
                        if (parts.Length == 3) {
                            string port = parts[1];
                            yield return new KeyValuePair<string, string>(protocol, port);
                        }
                    }
                }
            }
        }
    }


If you want to try it, you can use the following page: just save it as test.aspx and add the function above. The result is a simple table that shows the protocol and port to be used:

<%@ Page Language="C#" %>
<%@ Import Namespace="System.Collections.Generic" %>
<script runat="server">
    protected void Page_Load(object sender, EventArgs e) {
        Response.Write("<table border='1'>");
        foreach (KeyValuePair<string, string> binding in GetBindings(this.Context)) {
            Response.Write("<tr><td>");
            Response.Write(binding.Key);
            Response.Write("</td><td>");
            Response.Write(binding.Value);
            Response.Write("</td></tr>");
        }
        Response.Write("</table>");
    }
</script>



    Also, you will need to add Microsoft.Web.Administration to your compilation assemblies inside the web.config for it to work:

<?xml version="1.0"?>
<configuration>
  <system.web>
    <compilation debug="true">
      <assemblies>
        <add assembly="Microsoft.Web.Administration, Version=7.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"/>
      </assemblies>
    </compilation>
  </system.web>
</configuration>

    Managing ASP.NET Configuration Settings using IIS Manager


Today somebody asked how to manage some ASP.NET configuration settings, such as changing the trust level of the application, adding a few application settings, and turning on debug compilation. I thought it would be trivial to search the Web for an article showing the features we added in IIS 7.0 to manage those, but to my surprise I was not able to find anything that clearly shows them, so I decided to write this up quickly for anyone who is not aware of them.

[Screenshot: the ASP.NET feature icons in IIS Manager]

With the release of IIS 7.0 (included in Windows Vista and Windows Server 2008), and of course IIS 7.5 (Windows 7 and Windows Server 2008 R2), we added a set of features for managing some of the configuration of common ASP.NET features inside IIS Manager itself. Those features include:

1. .NET Authorization Rules (1) – Manages the authorization rules for ASP.NET; this is particularly useful when using Classic mode. This UI manages the system.web/authorization section graphically.
2. .NET Compilation – Exposes the settings used by the ASP.NET compilation engine, such as the list of assemblies, debug settings, VB settings (Option Strict, Option Explicit), the temp directory, etc. This UI saves all the settings in the system.web/compilation section.
3. .NET Error Pages (1) – Allows you to manage the ASP.NET custom errors, exposing system.web/customErrors.
4. .NET Globalization – Allows you to manage globalization settings such as file encoding, UI culture, etc. This modifies the system.web/globalization section.
5. .NET Providers (2) – Allows you to manage the provider configuration for the ASP.NET providers, such as Roles, Membership, and Profile (system.web/membership, system.web/profile, system.web/roleManager, etc.).
6. .NET Users, .NET Roles, and .NET Profile (2) – Configure options that track settings for ASP.NET applications (what this post is about). These features use the ASP.NET runtime to manage their settings, such as adding users, roles, and profile entries. They do not modify configuration; instead they use the configured provider (such as SqlMembershipProvider, SqlRoleProvider, WindowsTokenRoleProvider, etc.).
7. .NET Trust Levels – Allows you to configure the security trust level policy for the application. Modifies the system.web/trust section.
8. Application Settings – Allows you to manage the name/value pairs stored in the .NET appSettings section.
9. Connection Strings – Configures the database connection strings that can be used by ASP.NET applications. Manages the connectionStrings section.
10. Machine Key – Allows you to modify the machine key and other related settings stored in the system.web/machineKey section.
11. Pages and Controls – Allows you to modify settings from the system.web/pages section, such as the base class, namespaces, and registered controls.
12. Session State – Allows you to configure session state settings such as the connection string, cookie configuration, and other settings included in system.web/sessionState.
13. SMTP E-mail – Configures the SMTP settings such as server, delivery mode, or pickup directory, included in the system.net/mailSettings/smtp section.

(1) These features are included in Windows 7 and Windows Server 2008 R2, and can be installed on Windows Vista and Windows Server 2008 by downloading the Administration Pack for IIS 7.

(2) Note: these features require hosting the ASP.NET runtime, and due to technical limitations only application pools configured to run .NET 2.0 will show them. This means that if you configure your application pool to run .NET 4.0 (on IIS 7.0 and IIS 7.5) you will not see these features. As a workaround you can temporarily switch the application pool to 2.0, make your changes, and switch it back to 4.0 (of course, not recommended for production environments).
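For reference, a hedged sketch of that temporary switch using AppCmd.exe ("DefaultAppPool" is a placeholder for your application pool's name):

appcmd.exe set apppool "DefaultAppPool" /managedRuntimeVersion:v2.0
rem ... make your changes in IIS Manager, then switch back:
appcmd.exe set apppool "DefaultAppPool" /managedRuntimeVersion:v4.0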

These features are not meant to expose every setting in ASP.NET, and they only include configuration settings up to .NET 2.0. I should also add that IIS includes a generic Configuration Editor that allows you to manage many more configuration sections from ASP.NET, IIS, and beyond; in the image below you can see many more sections like webParts, trace, siteMap, and others:

[Screenshot: the Configuration Editor showing additional sections such as webParts, trace, and siteMap]

The best thing is that you can apply the changes immediately, or make the changes and generate the code to automate them later using JavaScript, managed code, or AppCmd.exe.

[Screenshot: generating the automation script in IIS Manager]
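As one concrete illustration, a hedged sketch of the AppCmd.exe route, turning off debug compilation for a site ("Default Web Site" is a placeholder name):

appcmd.exe set config "Default Web Site" /section:system.web/compilation /debug:false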


    Using the SEO Toolkit to generate a Sitemap of a remote Web Site


The SEO Toolkit includes a set of features (like the Robots Editor and the Sitemap Editor) that only work when you are working with a local copy of your Web site. The reason is that we have to know where to save the files we generate (like the Robots.txt and Sitemap XML files) without having to ask for physical paths, and we have to verify that the functionality is applied correctly, such as only allowing Robots.txt in the root of a site. Unfortunately this means that if you have a remote server and cannot run a local copy, you cannot use those features. (Note that you can still use the Site Analysis tool, since it will crawl your Web site regardless of platform or framework and store the report locally just fine.)

    The Good News

The good news is that you can technically trick the SEO Toolkit into thinking you have a working copy locally, which lets you generate the Sitemap or Robots.txt file without too much hassle (“too much” being the key).

For this sample, let's assume we want to create a Sitemap for a remote Web site; in this case I will use my own Web site (http://www.carlosag.net/), but you can specify your own. Below are the steps you need to follow to enable these features for any remote Web site (even one running on another version of IIS or any other Web server).

    Create a Fake Site

• Open IIS Manager (Start Menu -> InetMgr.exe).
• Expand the tree until you can see the “Sites” node.
• Right-click the “Sites” node and select “Add Web Site…”.
• Specify a name (in my case I’ll use MySite).
• Click “Select” and choose DefaultAppPool from the application pool list. This avoids creating an additional AppPool that will never run.
• Specify a physical path where you want the Robots and Sitemap files to be saved. I recommend creating a temporary directory that clearly states this is a fake site, so I will use c:\FakeSite\.
• Important: set the host name so that it matches your Web site, in my case www.carlosag.net.
• Uncheck “Start Web site immediately”, since we do not need this site to run.
• Click OK.

This is how my Add Web Site dialog looks:

[Screenshot: the Add Web Site dialog]

    Use the Sitemap Editor

Now that we have a site that the SEO Toolkit thinks is local, you can use the features as usual.

• Select the new site created above in the tree.
• Double-click the Search Engine Optimization icon on the Home page.
• Click the link “Create a new Sitemap”.
• Specify a name, in my case Sitemap.xml.
• Since this is a remote site, the physical location option shows an empty list, so change the “URL structure” to use “<Run new Site Analysis>…”, or choose an existing analysis if you already have one.
• If creating a new one, just specify a name and click OK (I will use MySite). At this point the SEO Toolkit starts crawling the remote site to discover links and URLs; when it is done, it presents the virtual namespace structure for you to work with.
• After the crawling is done, check any files you want to include in your Sitemap and leverage the server response to define the change date and all the other features as if the content were local, then click OK.

This is the way the dialog looks after it discovered my remote Web site's URLs:

[Screenshot: the Sitemap Editor showing the discovered URLs]

You will find the generated Sitemap.xml in the physical directory specified when creating the site (in my case c:\FakeSite\Sitemap.xml).

    Use the Robots Editor

Just as with the Sitemap Editor, once you prepare a fake site for the remote server, you can use the Robots Editor and leverage the same site analysis output to build your Robots.txt file.

[Screenshot: the Robots Editor]

    Summary

In this post I showed how you can use the Sitemap and Robots Editors included in the SEO Toolkit with remote Web sites that might be running on different platforms or different versions of IIS.


    Free SEO Analysis using IIS SEO Toolkit


In my spare time I’ve been thinking about new ideas for the SEO Toolkit, and it occurred to me that rather than continuing to try out new reports and diagnostics against random fake sites, it could be interesting to openly offer a free SEO analysis to anyone who wants one, and test-drive some of the ideas against real sites.

• So what is in it for you: I will analyze your site looking for common SEO errors, create a digest of recommended actions and other output (like a diagram of your site, layer information, etc.), and deliver it to you by email. If you agree, I will post some of the results (anonymized if needed, hiding identifying information like the site, URLs, etc.).
• And what is in it for me: I get to crawl your Web site (once or twice at most, with a limit of a few hundred pages) using the SEO Toolkit, to test-drive some ideas and reporting features I’m starting to build, and to continue investigating common patterns and errors.

So if you want in, just post your URL in the comments of this blog (make sure you are reading this blog from a URL inside http://blogs.msdn.com/carlosag/, otherwise you might be posting comments on some syndicating site). I will only take the first few sites (if this is successful I will start another batch in the future), and I will work through them one by one over the following days. Make sure to include a way to contact you, whether through the MSDN user infrastructure or an email address, so that I can send you the results.

Alternatively, I will also take URLs via Twitter at http://twitter.com/CarlosAguilarM, so hurry up and let me know if you want me to look at your site.
