Writing ... or Just Practicing?

Random Disconnected Diatribes of a p&p Documentation Engineer

  • Writing ... or Just Practicing?

    An Upper Case of Indecisive Instruction

    • 2 Comments

    A couple of weeks ago I was ruminating on how somebody in our style guidance team here at Microsoft got a new Swiss army knife as a holiday-time gift, and instead of a tool for removing stones from horses' hooves it has one for removing capital letters and hyphens from documentation. Meanwhile, the people in the development teams obviously got handkerchiefs or a pair of slippers instead, because they are still furiously delivering capital letters whenever they get the chance.

    As you will probably have noticed, the modern UI style for new products uses all capital letters in top-level navigation bars and menus. I guess your view of this is based on personal preference combined with familiarity with the old-fashioned initial-capital style; I've seen a plethora of comments and they seem to be fairly evenly balanced between like and dislike. Personally I quite like the new style, especially in interfaces such as the new Windows Azure Preview Management Portal. It looks clean and smart, and fits in really well.

    Meanwhile my editor and I have been pondering on how we cope with this in our documentation. No doubt some official style guidance will soon surface to resolve our predicament, but in the meantime I've been experimenting with possibilities for our Hands-on Labs. I started out with the obvious approach that matches the way we currently document steps that refer to UI elements (bearing in mind the accessibility guidelines described in It Feels Like I've Been Snookered).

    For example:

        Choose +NEW, select CLOUD SERVICE, and then choose QUICK CREATE.

    But written down on virtual paper that does look a bit awkward and "shouty". Perhaps I should just continue to use the initial capitalized form:

        Choose New, select Cloud Service, and then choose Quick create.

    However, that doesn't match the UI and one of the rules is that the text should reflect what the UI looks like to make it intuitive and easy for users. Maybe I can just use ordinary words instead, in a kind of chatty informal way, so that they don't actually need to match the UI:

        Choose new, select cloud service, and then choose quick create.

    But that looks wrong and may even be confusing. Perhaps I should just abandon any attempt to specify the actual options:

        Create a new cloud service without using a custom template.

    Though that just seems vague and unhelpful. Of course, you might assume that a user would already know how to create a new cloud service, so it's redundant anyway. But something more complicated may not be so obvious without more specific guidance about where to start from:

        Open the management window for your Windows Azure SQL Database.

    I did suggest to my editor that we simply run with something like:

        Choose the part of the window that contains what appears to be some
        text that would say "cloud services" if it was all lowercase, and then...

    Ahh, but wait! In a non-web-based application UI I can use shortcut keys, like this:

        Press Alt-F, then N, then press P.

    Oh dear, that violates the accessibility rules, and doesn't work in a web page anyway. Maybe I'll just go with:

        Get the person sitting next to you to show you how to create a new cloud service.

    And, as a bonus, this approach may even foster team cohesiveness and encourage agile pair programming. Though you probably can't call it guidance...

  • Writing ... or Just Practicing?

    Web Sites or Cloud Services?

    • 0 Comments

    The latest update to the range of Windows Azure services includes a nifty feature called Web Sites that provides a really great way to deploy your own websites to the cloud. It's quick and easy, you can progressively update the site by uploading individual files, and it's cheap. In fact at the moment, if you can get onto the public preview program, it's actually free!

    Here at p&p, as part of the regular update cycle for our series of guides about Windows Azure, we're looking at Web Sites as a way of deploying the example applications used by the fictitious companies described in our guides. In today's fast changing business environment, you'd expect that the ability to rapidly build, test, deploy, and update applications using a wide range of tools and deployment methods would be a huge advantage. However, so far, I'm not sure.

    Business-critical applications typically have a life cycle that requires them to pass through several rigidly enforced processes. The code must undergo rigorous pre-deployment testing, validation, and certification. It must be deployed to a staging environment that is as near identical to the final runtime environment as possible, where it must undergo another full test cycle before release. It needs to be performance tested under load, and tested with real data over the connections and networks it will use when running live. Finally, it must be versioned so that bugs and updates can be tracked, and full integration testing can occur for each deployed version.

    But what about prototyping, proof of concept, and initial development? This seems to be an ideal scenario for Web Sites. Unlike with Cloud Services, developers can use any tool they wish, and any of several development languages and deployment methods. Updates to the running application are quick and easy without requiring a full packaging and deployment cycle, and the changes are visible in the application almost instantly.

    However, unless you pay for reserved instances rather than using the default (free) shared instance model, load and performance testing is unlikely to provide a realistic result. And the cost of a reserved instance is about the same as a Cloud Services instance. You still get the advantages of support for lots of different development tools and languages, but your application will need to be modified later to run under the .NET Framework using Visual Studio's Cloud project structure if you want to move to Cloud Services.

    We're still evaluating the options here, and I'd welcome any feedback around this topic. If you are building business applications for Windows Azure, let us know your thoughts...

  • Writing ... or Just Practicing?

    Can I Afford the Cloud?

    • 0 Comments

    Like many people I'm trying to evaluate whether I can save money by moving my lightly-loaded, community-oriented websites to Windows Azure instead of running them all on my own hardware (a web server in my garage). With the advent of the low-priced Web Sites model in Windows Azure (which I rambled on about last week), it seems like it should be a lot more attractive in financial terms than using the setup I have now.

    At the moment I use a business-level ADSL connection that provides 16 fixed IP addresses and an SLA that makes the connection suitable for exposing websites and services over the Internet. With one exception, the websites and services run on a single Hyper-V hosted instance of Windows Server 2008 R2, on a server that also hosts four other Hyper-V VMs. The web server VM exposes five websites, while also providing DNS services for my sites and acting as a secondary for a colleague (who provides the secondary DNS for my sites). The exception is the website running on a very old Dell workstation that displays local weather information captured from a weather monitoring station in my back garden.

    In theory Windows Azure should provide a simple way to get rid of the Hyper-V hosted web server, and allow me to avoid paying the high costs of the business ADSL connection. I have a cable Internet connection that I use for my daily access to the wider world, and I could replace the ADSL connection with a simpler package at a considerably reduced cost to maintain backup and failover connectivity. I'd need to find a solution for the weather web server because it requires USB connectivity to the weather station hardware (which is why it runs on a separate server), but that's not a major issue and could be solved by posting data to a remote server over an ordinary Internet connection.

    So I started evaluating possible savings. Using a separate Cloud Services Web role for each site is a non-starter because the cost, together with one Windows Azure SQL Database server, is four times what I pay for the ADSL connection. Even taking into account the saving from running one less on-premises Hyper-V instance, it doesn't make sense for my situation. And I'll still need a DNS server, though I can switch to using a hosted service from another company for a few dollars per month to resolve (if you'll pardon the pun) that issue.

    But I can run multiple sites in one Cloud Services Web role by using host headers, which gives me a marginal saving against the cost of the ADSL connection. Of course, according to the Windows Azure SLA I should deploy two instances of the role, which would double the cost. However, the expected downtime of a single role instance is probably less than I get using my own ADSL connection when you consider maintenance and backup for the Hyper-V role I use now.

    Using a Virtual Machine seems like a sensible alternative because I can set it up as a copy of the existing server; in fact I could probably export the existing VHD as it stands and run it in Windows Azure with only minor alterations. Of course, I'd need SQL Server in the Virtual Machine as well as a DNS server, but that's all fully supported. If I could get away with running a small instance Virtual Machine, the cost is about the same as I pay for the ADSL connection. However, with only 1.75 GB of memory a small instance might struggle (the existing Hyper-V instance has 2.5 GB of memory and still struggles occasionally). A medium size instance with 3.5 GB of memory would be better, but the costs would be around double the cost of my ADSL line.

    So what about the new Windows Azure Web Sites option? Disregarding the currently free shared model, I can run all five sites in one small reserved instance and use a commercial hosted DNS service. Without SQL Server installed, the 1.75 GB of memory should be fine for my needs. I also get a free shared MySQL database in that cost, but it would mean migrating the data and possibly editing the code to work with MySQL instead of SQL Server. A Windows Azure SQL Database for up to five GB of data costs around $26 per month, so the difference over a year (more than $300) is significant, but familiarity with SQL Server and ease of maintenance and access using existing SQL Server tools would probably be an advantage.

    Interestingly, Cloud Services and reserved Web Sites costs are the same when using Windows Azure SQL Database. However, the advantage of easier deployment from a range of development environments and tools would make Web Sites a more attractive option. It would also be useful for the weather website because the software I use to interface with it (Cumulus) can automatically push content to the website using FTP over any Internet connection.

    So, summarizing all this I came up with the following comparisons (in US dollars excluding local taxes):

    Configuration and approximate cost per month:

    • Existing on-premises Hyper-V hosted VM including SQL Server and DNS, plus connectivity and ancillary costs over and above a simple ADSL connection: $95.00
    • Cloud Services (5 instances), SQL Database, externally hosted DNS, 1 GB bandwidth: $458.00
    • Cloud Services (2 instances), SQL Database, externally hosted DNS, 1 GB bandwidth: $204.00
    • Cloud Services (single instance), SQL Database, externally hosted DNS, 1 GB bandwidth: $117.00
    • Virtual Machine (medium) including SQL Server and DNS, 1 GB bandwidth: $198.00
    • Virtual Machine (small) including SQL Server and DNS, 1 GB bandwidth: $115.00
    • Web Sites (two reserved instances), SQL Database, externally hosted DNS, 1 GB bandwidth: $204.00
    • Web Sites (one reserved instance), SQL Database, externally hosted DNS, 1 GB bandwidth: $117.00
    • Web Sites (two small reserved instances), MySQL database, externally hosted DNS, 1 GB bandwidth: $178.00
    • Web Sites (one small reserved instance), MySQL database, externally hosted DNS, 1 GB bandwidth: $92.00

    I only included one GB of outbound bandwidth because that's all I need based on average traffic volumes. However, bandwidth costs are so low that even if I use ten times the estimated amount it adds only one dollar to the monthly costs. Also note that these costs are based on the July 2012 price list for Windows Azure services, and do not take into account current discounts on some services. For example, there is a 33% discount on the reserved Web Sites instance at the time of writing.

    It looks like the last of these, one small reserved Web Sites instance with a MySQL database and externally hosted DNS, is the most attractive option if I can manage with MySQL instead of Windows Azure SQL Database. However, what's interesting is that I can achieve a saving for my specific limited requirements, and that's without taking into account the hidden ancillary costs of my on-premises setup such as maintaining and patching the O/S, licensing costs, electricity and use of space, etc.

    And if the hardware in my garage fails, which I know it will some day, the cost of fixing or renewing it...

  • Writing ... or Just Practicing?

    I Nearly Found Nirvana In The Cloud

    • 0 Comments

    So it's been a week of semi-fruitful searching for lots of people. In China there's a team setting out on a million-pound expedition in the mountains and forests of Hubei province to find the Yeren or Yeti that's supposedly been sighted hundreds of times. In Geneva, scientists have revealed that they've probably found the Higgs boson particle they've been searching for over the last fifty years. Meanwhile, as I mentioned last week, I've been seeking a way to rid myself of the cost and hassle of maintaining my own web servers.

    The Geneva team reckons there's only a one in 1.7 million chance that what they've found is not the so-called "God particle", but they need to examine it in more detail to be absolutely sure. I just hope that the level of probability for the expedition team in China will be more binary in nature. I guess that being faced with a huge black hairy creature that's half man and half gorilla (and which hopefully, unlike the Higgs boson, exists for more than a fraction of a second) will prompt either a definite "yes it exists" or an "it was just a big bear" response.

    Meanwhile, my own search for Windows Azure-based heaven has been only partially successful so far. A couple of days playing with Windows Azure technology have demonstrated that everything they say about it is pretty much true. It's easy, quick, and mostly works really well. But unfortunately, having overcome almost all of the issues I originally envisaged, I fell at the last fence.

    The plan was to move five community-style websites to Windows Azure Web Sites, with all the data in a Windows Azure SQL Database server. Two of the sites consist of mainly static HTML pages, and these were easy to open in Web Matrix 2 and upload to a new Windows Azure Web Site using the Web Deploy feature in Web Matrix. They just worked. A third site is HTML, but the static pages and graph images are re-generated once an hour by my Cumulus weather station software. However, Cumulus can automatically deploy these generated resources using FTP, and it worked fine with the FTP publishing settings you can obtain from the Windows Azure Management Portal for your site.

    The likely problem sites were the other two that use ASP.NET and data that is currently stored in SQL Server. Both use ASP.NET authentication, so I needed to create an ASPNETDB database in my Windows Azure SQL Database server, and two other databases as well. However, my web server runs SQL Server 2005 and I couldn't get Management Studio to connect to my cloud database server. In the end I resorted to opening the databases in the Server Explorer window in Visual Web Developer and creating scripts to build the databases and populate them with the data from the existing tables. Then I could create the new databases in the Windows Azure SQL Database management portal and execute the script in the Query page. I had to make some modifications to the script (such as removing the FILLFACTOR attributes for tables), but it was generally easy for the ASPNETDB and another small database.

    However, problems arose when I looked at the generated script for the local village residents' group website. This is based on the old Club Starter Site, much modified to meet our requirements, and is fiendishly complicated. It also stores all the images in the database instead of as disk files. The result is that the SQL script was nearly 72 MB, which you won't be surprised to hear cannot be copied into the Query page of the management portal. However, I was able to break it up into smaller pieces and load it into a Visual Studio 2008 database query window, connect to the Windows Azure database, and execute each part separately. It was probably the most time-consuming part of the whole process.

    Then, of course, comes testing time. Will the ASP.NET authentication work with the hashed passwords in the new ASPNETDB database, or is there some salt value that is machine-specific? Thankfully it did work, so I don't have to regenerate accounts for all the site members and administrators. In fact, it turns out that almost everything worked just fine. A really good indication that the Web Sites feature does what it says on the tin.

    However, there were three things left I needed to resolve. Firstly, I found that one site that generates RSS files to disk could no longer do so, because you obviously can't set write permission on the folders in a Windows Azure Web Site. The solution was to change the code that generated the RSS file so it stores the result in a Windows Azure SQL Database table, and to add an ASP.NET page that reads it and sends it back with ContentType = "text/xml". That works, but it means I need to change all the links to the original RSS file, and the few people who may be subscribing to it won't find it - though I can leave an XML file with the same name in the site that redirects to the new ASP.NET page.
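
    For what it's worth, the replacement page only needs a few lines of code. Here's a rough C# sketch of the idea; the table and column names (RssContent, FeedName, FeedXml) and the "SiteDb" connection string name are placeholders I've invented for the example rather than the site's real schema:

        // Rss.aspx.cs - a minimal sketch of serving the stored feed from the database.
        using System;
        using System.Configuration;
        using System.Data.SqlClient;
        using System.Web.UI;

        public partial class Rss : Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                string connectionString =
                    ConfigurationManager.ConnectionStrings["SiteDb"].ConnectionString;

                string feedXml;
                using (var connection = new SqlConnection(connectionString))
                using (var command = new SqlCommand(
                    "SELECT FeedXml FROM RssContent WHERE FeedName = @name", connection))
                {
                    command.Parameters.AddWithValue("@name", "news");
                    connection.Open();
                    feedXml = (string)command.ExecuteScalar();
                }

                // Send the stored XML back just as the old disk file would have been served.
                Response.Clear();
                Response.ContentType = "text/xml";
                Response.Write(feedXml);
                Response.End();
            }
        }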

    Secondly, I need to be able to send email from the two ASP.NET sites so users can get a password reset email, and administrators are advised of new members and changes made to the site content. There's no SMTP server in Windows Azure so I was faced with either paying for a Virtual Machine just to send email (in which case I could have set up all the websites and SQL Server on it), or finding a way to relay email through another provider. It turns out that you can use Hotmail for this, though you do need to log into the account you use before attempting to relay, and regularly afterwards. So that was another issue resolved.
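
    If you want to try the same trick, the relay code itself is nothing special. This is only a sketch using System.Net.Mail, with a placeholder account name and password; the server name and port are the ones commonly quoted for Hotmail, but check the settings for your own account before relying on them:

        // MailRelay.cs - a sketch of relaying site email through a Hotmail account.
        using System.Net;
        using System.Net.Mail;

        public static class MailRelay
        {
            public static void Send(string to, string subject, string body)
            {
                // The "from" address must be the Hotmail account you are relaying through.
                using (var message = new MailMessage("myaccount@hotmail.com", to, subject, body))
                {
                    var client = new SmtpClient("smtp.live.com", 587);
                    client.EnableSsl = true;
                    client.Credentials = new NetworkCredential("myaccount@hotmail.com", "my-password");
                    client.Send(message);
                }
            }
        }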

    The final issue to resolve was directing traffic from the domains we use now to the new Windows Azure Web Sites. Adding CNAME records for "www" to my own DNS server was the first step, ahead of investigating a move to an external DNS provider. It's a part-fix because I really want to redirect all requests except for email (MX records) to the new site, but Windows DNS doesn't seem to allow that. However, there are DNS providers who will map a CNAME to a root domain, so that will be the answer.

    Unfortunately, this was where it all fell apart. Windows Azure Web Sites obviously uses some host-header-style routing mechanism because requests using the redirected URL just produce a "404 Not Found" response. Checking the DNS settings showed that DNS resolution was working and that it was returning the correct IP address. But accessing the Azure-hosted sites using the resolved IP address also produced the "404 Not Found" response. Of course, thinking about this, I suppose I can't expect to get a unique IP address for my site when it's hosted on a shared server. The whole point of shared servers in Windows Azure is to provide a low cost environment where one IP address serves all of the hosted sites. Without the correct URL, the server cannot locate the site.

    According to several blog posts from people intimately connected with the technology there will be a capability to use custom domain names with shared sites soon, though probably at extra cost. The only solution I can see at the moment is to set up a redirect page in each website on my own server that specifies the actual Windows Azure URL, so that routing within Windows Azure works properly. But that means I still need to maintain my own web server!

    Meanwhile, here are a few gotchas I came across that might save you some hassle if you go down the same route as I did:

    • When you configure a Windows Azure SQL Database server, don't use an email address that includes "@" as the user name. The actual user name for the database will be [your user name]@[server name], and the extra "@" seems to confuse the management portal.
    • Also avoid using characters that need to be HTML encoded (such as "&") in database passwords. It gets complicated when you need to specify the password in configuration files and in tools that connect to the database.
    • If you use Web Matrix, you can download the publishing settings for each site from the portal. However, once you import them into Web Matrix it seems to be impossible to edit them or re-import them if you need to change the settings. The only solution I found was to delete the site (but not the content) in the Web Matrix "My Sites" dialog and then open the site again as a folder.
    • If you find that database connections are failing, or you get connection errors in your pages, check the connection string settings. I had a weird problem with one site where every time I deployed the Web.config file from Web Matrix it did a fancy transform on it and changed the connection strings. I had the databases configured as linked resources, but despite checking these and all other settings several times I couldn't find any reason. They were correct in the Web Matrix Database page and in the Configure page of Windows Azure management portal for the site. It only happened on one site, and only after I switched it from .NET Framework version 2.0 to version 4.0. The other site where I did the same never suffered this problem. The kludge solution I found was to open the site in Remote View in Web Matrix, open the Web.config file from there, correct the error, and save it again. But it means doing this every time I edit the file and redeploy it.
    • If you switch a .NET 2.0 site to .NET 4.0, and that site turns off request validation (in individual pages or in Web.config) so that it can accept HTML input (such as blog posts), you need to add <httpRuntime requestValidationMode="2.0"> to the <system.web> section in Web.config. If possible, turn off request validation only in the specific pages that must accept HTML input, and ensure you always validate the content before saving or displaying it.

    So was the whole "migrate to Azure" exercise a waste of time and effort? No, because I know that I have a solution that will, in time, let me get rid of my web server and the expensive business-level ADSL connection with its fixed IP addresses. And in less than two days I learned a lot about Windows Azure as well. However, what's becoming obvious is that I probably need to go down the road of using reserved instead of shared instances, or even Cloud Services instead of Web Sites. But that just raises the question of cost all over again.

    Though, just to cheer me up, a colleague I brainstormed with during the process did point out that what I was really doing was Yak shaving, so I don't feel so bad now...

  • Writing ... or Just Practicing?

    Fully Cloud-Enabled!

    • 0 Comments

    Yes, another episode in my continuing onslaught on the cloud. But this week it's a heartwarming story of intrepid adventure and final success. At last I'm fully resident in the cloud - or, to be more precise, several clouds. And I might even have saved some money as well...

    Over the past couple of weeks I've been blethering about getting rid of my very expensive and not always totally reliable ADSL connection by moving all the various stuff attached to it into the cloud. This includes several websites and the DNS services for my own and a colleague's root TLDs. Previous episodes described the cost verification exercise and the experimental migration, but ended with the thorny issue of traffic redirection to the new sites. And I still had the DNS issue to (again, please pardon the pun) resolve.

    After looking at several commercial DNS hosting specialists, the situation seemed bleak. While they all appear fully equipped to satisfy my requirements, the cost of hosting DNS services for around 25 domains was prohibitive in my situation. An average quote of somewhere between five and eight US dollars per domain per month meant that the cost of DNS alone would be more than I pay now for all my on-premises infrastructure and connectivity. And I didn't much fancy using a free DNS service with no SLA.

    But then I discovered a web hosting provider that does offer full domain management services as an add-on to their very reasonably priced packages. A quick calculation showed that paying GoDaddy.com for a fixed IP address website and all the associated frippery (such as email and other stuff that I don't need), plus the cost of their premium DNS hosting package, was around a tenth of the cost of my ADSL connection. It seemed like the perfect solution, and after signing up and spending a day setting up the DNS records in their superb web interface it all worked fine. They support secondary DNS as master and slave, so I was able to create secondary domains for my colleague's TLDs as well as configuring my own domains to use his DNS server as a secondary. Then it was just a matter of changing the IP addresses of my domains and my own root DNS server entry at Network Solutions to point to their DNS servers.

    However, this still didn't solve the problem of redirecting traffic to my Windows Azure websites. I can set up CNAME records in the new DNS server for subdomains, but (as I discovered last week) that doesn't help because Windows Azure Web Sites depends on host headers to locate the sites. But now I have a hosted website with a fixed IP address at GoDaddy, so I can move all the redirection pages from my own web server to this hosted site. As all the domains now point to a single website I'll need to do some fancy stuff to detect the requested domain name and redirect to the correct site on Windows Azure. But I had the forethought to specify a Windows host for the site when I set it up, so I can use ASP.NET for that. Easy!

    So I pointed all the root and "www" records in DNS to the fixed IP address of my GoDaddy site and set up a simple default ASP.NET page there that extracts the requested URL as a Uri instance from the Request.Url property, parses out the domain, and does a Response.Redirect to the appropriate page on the matching Windows Azure website. There's no need for visitors to see a "we have moved" redirection page, and Windows Azure gets the correct domain name in the request so that it can find my site.
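
    In case it helps anyone doing the same thing, the core of that default page is only a few lines of C#. This is a sketch rather than the exact code I deployed; the domain names and the *.azurewebsites.net targets are placeholders for my real sites:

        // Default.aspx.cs on the GoDaddy site - a sketch of the domain-based redirect.
        using System;
        using System.Collections.Generic;
        using System.Web.UI;

        public partial class _Default : Page
        {
            // Map each incoming domain to the matching Windows Azure Web Site.
            private static readonly Dictionary<string, string> Targets =
                new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
                {
                    { "www.example-village.org", "http://example-village.azurewebsites.net/" },
                    { "www.example-weather.net", "http://example-weather.azurewebsites.net/" }
                };

            protected void Page_Load(object sender, EventArgs e)
            {
                // Request.Url holds the full URL the visitor asked for; we only need the host name.
                string host = Request.Url.Host;

                string target;
                if (Targets.TryGetValue(host, out target))
                {
                    // Send the browser straight to the matching Windows Azure site.
                    Response.Redirect(target, true);
                }
            }
        }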

    But there's another problem. Requests to anything other than the root of a domain (such as requests for specific pages from search engine results) don't work because the site can't find the specified page or file. The default page doesn't get called in this case. Instead, the server just sends back a 404 "Not Found" page. However, GoDaddy allows you to specify your own 404 error page, so I pointed this at an ASP.NET page that parses the original request (the bit after the "404;" in the request string) and builds a new URL pointing to the appropriate Windows Azure site, together with the full path and query string of the requested page or file. It displays a "page moved" message for five seconds (yes, I know this is annoying) and then does a client-side redirect using a META refresh instruction. So that's the problem solved!
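
    Here's a sketch of what that 404 page does, again with a placeholder target site; the real page looks up the correct Windows Azure site from the domain in the original request, in the same way as the default page above:

        // NotFound.aspx.cs - a sketch of the custom 404 page. The original request arrives
        // in the query string after "404;", for example:
        //   /NotFound.aspx?404;http://www.example-village.org:80/photos/2012.htm
        using System;
        using System.Web.UI;

        public partial class NotFound : Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                string query = Request.Url.Query;
                int marker = query.IndexOf("404;", StringComparison.Ordinal);
                if (marker < 0) return;

                // Rebuild the request against the new site, keeping the path and query string.
                var original = new Uri(query.Substring(marker + 4));
                string target = "http://example-village.azurewebsites.net" + original.PathAndQuery;

                // Show the "page moved" message for five seconds, then redirect client-side.
                Response.Write("<html><head><meta http-equiv='refresh' content='5;url=" + target +
                    "' /></head><body><p>This page has moved to <a href='" + target + "'>" + target +
                    "</a>. You will be redirected in a few seconds.</p></body></html>");
                Response.End();
            }
        }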

    Err, not quite. Some of my sites are ASP.NET, and a request for a non-existent page doesn't result in a 404 "not found" error. Instead, the ASP.NET handler creates a 500 "code execution" error. And GoDaddy doesn't allow you to specify the default error page for this. But you can specify the error page in Web.config, or (as I did) just use a global error handler in Global.asax to redirect to a custom ASP.NET error page. My custom error page pulls out the original URL, does the same parsing to build the correct URL on the Windows Azure site, and returns a "page moved" message with a client-side META redirect.
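
    The global handler itself is just as short. ErrorMoved.aspx is a name I've made up for this sketch; it's an ASP.NET page that does the same parsing and META redirect as the 404 page, using the URL passed to it:

        // Global.asax.cs - a sketch of routing ASP.NET errors to the custom "moved" page.
        using System;
        using System.Web;

        public class Global : HttpApplication
        {
            protected void Application_Error(object sender, EventArgs e)
            {
                // Remember the request that failed, clear the error, and show our own page instead.
                string failedUrl = Request.Url.PathAndQuery;
                Server.ClearError();
                Response.Redirect("~/ErrorMoved.aspx?original=" + HttpUtility.UrlEncode(failedUrl), true);
            }
        }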

    So that's it! My own web and DNS server is switched off, and everything is going somewhere else. A quick call to my ISP means that my expensive business ADSL connection has been replaced by a much simpler business package at a considerably lower price. I even managed to persuade the nice sales guy to cancel the downgrade fee and give me a single fixed IP address on the new service so I can still run a web server for testing, or whatever else I need, should the situation arise.

    Was it worth the effort? The original on-premises connection cost (converted to US dollars and including local taxes) was $2,148 per year, and that doesn't cover the cost of running the server itself, maintenance, upgrades, and other related stuff. At the moment I'm using Shared mode for the Windows Azure sites, which (together with a Windows Azure SQL Database) is free for a year. My new ADSL connection package is $696 per year, and the GoDaddy hosting package (Windows website and premium DNS service) is only $186 per year, so the annual cost at the moment is less than $900 - a saving of almost 60%!

    Of course, when Windows Azure Web Sites becomes a chargeable service (see Pricing Details) I'll need to review this, but the stated aim is to provide a competitive platform so even when using SQL Database I should still see a saving. And I can still investigate moving to a shared MySQL hosted database to reduce the cost. Meanwhile I'm finally free of DNS Amplification attacks, web server vulnerability attacks, and all my inbound ports are closed. I also have one less server to run, manage, monitor, maintain, upgrade, and try to keep cool in summer.

    All I need to do now is out-source my day job and I can spend the next few years lazing on some remote foreign sun-kissed beach - preferably one that's got Wi-Fi...
