Random Disconnected Diatribes of a p&p Documentation Engineer
At one time you had to work in a museum to be a curator, but the wonders of information technology mean that now we can all exhibit our technical grasp of complicated topics and enlighten the general population by identifying the best resources to answer even the most complex of questions.
I'm talking about the new Curah! website here. The idea is simple: a resource that gathers together the questions most commonly asked about computing topics; each with a carefully and lovingly crafted set of links to the most useful blogs, reference documents, tools, and other information that offers a solution to the question.
Anyone can register and create a curation, and the site is optimized for search engines to make it easy to find answers. It's still in beta as I write this, but already has hundreds of answers to common questions. The great thing is that the curations are not just a set of links like you'd get from the usual search engines, which tend to optimize the list based on keywords in the resources, the number of links to them from other pages, and the newness of the content. None of these factors can provide the same level of usefulness as a list compiled by an expert in the relevant topic area, who regularly creates and uses this kind of information.
My interest in the Curah! site also comes about partly because I am part of the group that defined the original vision and got it started. I've also added a few curations of my own, which are centered on the topic area that I now seem to have been permanently assigned to - Windows Azure application design and deployment. My regular reader will probably have noticed this from the rambling posts on this blog in the past.
However, one point that concerned me was that, having created my own curations, I am now responsible for maintaining them. As I plan to create more in the future, I was beginning to wonder if I would end up spending all of every Monday just checking and updating them as the target resources move, disappear, or I discover new ones. What I needed was some type of automated tool that would make this job easier. So I built one.
CurahCheck is a simple console-based utility that checks one or more curations on the Curah! site by testing all of the links in each curation ID you specify. It can display the curation title and the linked page titles so you can confirm that each curation is valid and that all of the linked resources are still available. It can be run interactively, or automatically from a scheduled task.
The utility generates a log file containing details of the checks and any errors found. It can also generate an HTML page for your website that shows the results of the most recent check and the contents of the log file. If you have access to an email server, the utility can send warning email messages when an error is detected in any of the views it scans.
If you are a Curah! curator you can download the utility from here, and use and modify it as you wish. The source project and code for Visual Studio 2012 are included. Before you use it, you'll need to edit the settings in the configuration file to suit your requirements - the file contains full details of each setting and its effect on the program's behavior.
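For anyone curious about the core idea of a link checker like this, here is a minimal sketch in Python - not the actual CurahCheck code, and the stub fetcher is invented purely for illustration:

```python
from urllib.request import Request, urlopen
from urllib.error import URLError

def check_links(urls, fetch=None):
    """Return a dict mapping each URL to True (reachable) or an error string.

    `fetch` can be swapped out for testing; by default it issues a real
    HTTP GET and only inspects the status code.
    """
    if fetch is None:
        def fetch(url):
            # A browser-like User-Agent avoids some 403s from picky servers.
            req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
            with urlopen(req, timeout=10) as resp:
                return resp.status
    results = {}
    for url in urls:
        try:
            status = fetch(url)
            results[url] = True if status == 200 else f"HTTP {status}"
        except (URLError, OSError) as err:
            results[url] = str(err)
    return results

# Example with a stub fetcher, so no network access is needed:
def stub(url):
    if "gone" in url:
        raise OSError("name not resolved")
    return 200

print(check_links(["http://example.com/ok", "http://gone.invalid/"], fetch=stub))
```

The real utility also has to log results, build the HTML report, and send warning emails, but everything hangs off a reachability check like this one.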
Of course, the usual terms and conditions about me not being responsible for any side-effects of using the program, such as your house falling down, your children being eaten by a dinosaur, or your computer bursting into flames, still apply...
No, this post isn't about parental difficulties and I didn't spell "paternal" wrong in the title, although I admit it is about problems with relationships. More specifically, the relationship between design patterns and pretty much everything else. And, based on previous experience of dabbling in this area, how I hate design patterns.
For more years than I care to remember I've been driven by circumstance to describe, document, present, and generally discuss software design patterns. Initially it was just patterns related directly to ASP.NET, where common ones such as Factory, Singleton, MVP/MVC, and Publish/Subscribe were obvious - and generally built into the framework, or easy to implement. We could never agree on a structure for documenting patterns, never mind the actual definition of each pattern. Or which implementations to show, and in what programming languages.
Then I got involved in Enterprise Library, and more design patterns surfaced in my world: Builder, Adapter, Decorator, and Lazy Initialization. All good solid patterns that are well documented and easy to use in Enterprise Library. I even wrote code samples to demonstrate them, together with some tenuously humorous descriptions that attempted to relate the guy who comes to paint your house to the way the Decorator pattern works. Needless to say, those documents never saw the light of day.
But now I'm back in the mire of design patterns again, paddling furiously to try to stay afloat while writing semi-comprehensible verbiage around patterns in Windows Azure. Some of these seem so vague and newly invented that you might think they were giving out prizes for finding new ones. I've reached the state of wondering what design patterns really are, and whether many of the new examples I'm trying to document are just techniques, guidance, general advice for implementation, or made-up stuff that sounds like it might be useful.
According to most reputable resources, a software design pattern is "... a general reusable solution to a commonly occurring problem within a given context in software design" and "... a description or template for how to solve a problem, which can be used in many different situations." But then the definition typically continues with "... they are formalized best practices that guide a programmer on the implementation, not complete designs or solutions that can be transformed directly into code."
So is something that's Windows Azure specific, such as how you perform health verification checking for an application, a design pattern? Or is it just a technique? Or guidance? It certainly doesn't fit the idea of a generally reusable solution or template that can be used in many different situations - it's pretty specifically a technique for checking if an application is alive. But it is, I guess, formalized best practice and definitely not a complete design.
In fact there's nothing I can find on the web that seems to relate to a "Health Verification Pattern", or anything around "probe" or "ping" that fits the scenario. Yet it doesn't seem like something that somebody just made up for fun either. There are features in Windows Azure Traffic Manager and Windows Azure Management Services to do health verification, even if it is just a simple probe on a specified URL and port.
Of course, what's clever is that you can have the target of the probe do some internal checking of the operation and availability of the resources the application uses, and maybe some validation of the internal state, then decide whether to send back a "200 OK" or a "500 Internal Error" status code (or some other error code). Though you do need to do it in a timely way so that the probing service doesn't think you've gone away altogether, and flag your application as "failed."
For example, with Traffic Manager you get just ten seconds, including network latency while the request and response traverse the Internet, before it gets fed up waiting. So there's no point in doing things like checking a dozen database connections, or validating checksums of every order in your system, because you probably won't have time. And is there any point in sending back a detailed error message if something has gone wrong in the application? You'd need a custom client to handle it. But surely the application will already contain instrumentation, such as error handlers and performance counters, that will flag up any failed connections or errant behavior within the code at runtime?
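As a rough illustration of the idea - not any actual Azure or Traffic Manager API; the check functions and the nine-second budget below are invented for the sketch - a health-check handler that runs quick internal checks against a deadline might look something like this:

```python
import time

def health_status(checks, budget_seconds=9.0):
    """Run quick internal checks and map the outcome to an HTTP status code.

    Stops early if the time budget is nearly exhausted, so the probing
    service (e.g. Traffic Manager's ten-second window) always gets an
    answer rather than a timeout that flags the application as failed.
    """
    deadline = time.monotonic() + budget_seconds
    for check in checks:
        if time.monotonic() >= deadline:
            return 500  # out of time: safer to report unhealthy
        try:
            if not check():
                return 500  # a check reported a problem
        except Exception:
            return 500  # a check itself failed
    return 200  # everything we had time to verify looks healthy

# Hypothetical quick checks, standing in for real resource tests:
def database_ok():
    return True  # e.g. open and close one pooled connection

def cache_ok():
    return True  # e.g. a single ping round-trip

print(health_status([database_ok, cache_ok]))
```

The point of the deadline is exactly the trade-off described above: each check must be cheap, because a slow but thorough answer is indistinguishable from a dead application as far as the probe is concerned.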
Maybe it really is similar to a paternal relationship after all. Whenever I probe our son to ask if he's OK, all I ever get back is "yeah, cool". The real-world equivalent of "200 OK" I suppose...
My rather staid daily newspaper occasionally makes an attempt to be cool and trendy by squeezing an article about technology and lifestyle between the reports of war, famine, crime, and pictures of the Royal Family. But it was still a bit of a shock yesterday to see the headline "42% of People Admit to Nomophobia."
At first I assumed it was another kind of attention deficit disorder they'd identified in kids, or something you caught from watching too many reality TV shows. But after perusing the article I quickly grasped the real meaning: fear of being without your mobile phone. And, from reading more, it seems there is an acute version of the phobia where you're not only without your phone, but you can't remember where you left it.
I suppose I've never come across this condition before because I always know where my mobile phone is. It's at the back of the third drawer down in the kitchen cabinet next to the sink. And if I did forget, there'd be no point in dialing the number from another phone and trying to trace the sound because it's turned off. So it looks as though I should be suffering from chronic nomophobia. Something else I can ask my doctor about during my next visit.
Yet, strangely, I don't feel any symptoms or stress. I guess some people that don't know me will say it's because I don't have much interest in technical gadgets. But that's obviously not true – we have a ton of them in our house, everything from a computer-powered TV to a fully automated weather station (with added solar intensity recording) to a robot vacuum cleaner. And plenty more gizmos and electronic wizardry in between.
But, somehow, I can't get excited about all this new portable and wearable stuff – though that's probably because I hardly ever go anywhere. I have a wristwatch that is guaranteed to be 100% accurate because it gets its time from a radio transmitter in Rugby, but I can't remember when I last wore it. And my phone is a proper smartphone, even if it is three or so years old, though it only ever gets turned on about once a month. I have a Windows Surface tablet, but I've never found any reason to take it past the front door - for some reason it seems to stop working once I get a hundred yards away from the wireless router.
So will I be a customer for some new wearable technology? I already wear spectacles, and none of the photos of people using Google Glass show it perched on top of an existing pair of prescription spectacles - maybe you can get a prescription Google Glass, or one that fixes to existing spectacles? And I doubt that, even with spectacles, my aging eyes are good enough to read anything useful on the one-inch-square screen of a smart watch. Perhaps they'll bring out a smart watch that projects the display, two feet square, onto a nearby wall so that everyone else can read my email at the same time.
Or maybe my prescription Google Glass will have a zoom feature so I can see the screen of my smart watch...
Much as I complain about some TV documentaries being dumbed down (for example, showing a clip of an explosion every time the presenter mentions The Big Bang in case you can't remember what an explosion looks and sounds like), I have to admit that a recent episode of the BBC Horizon series was an excellent in-depth examination of the latest nightmare scenario.
"Defeating the Hackers" explored two recent high-profile cases in detail; the hacking of Wired journalist Mat Honan, who had all of his online presence infiltrated, and the Stuxnet attack on Iran's nuclear plant. It also explained in layman's terms how SSL encryption works, and how the ongoing development of quantum computers will render our current secure communication techniques obsolete.
Of course, anyone following the current events in regard to online privacy and government access to our personal data will already be wondering if there is any security left. Or risk travelling through a UK airport where it seems that all of your digital belongings are open to detailed examination and confiscation. But that's another story.
Anyway, getting back to the Horizon documentary, most of the topics are probably well known to most IT people. But there was one that I hadn't come across before: Ultra Paranoid Computing. It's obviously not a mainstream topic. Wikipedia doesn't know about it and there's little on the web. However, I did find one article on the National Science Foundation site that covers the same ground as the TV program.
Ultra Paranoid Computing attempts to deal with the scenario where every other computer on the planet has been taken over by malware (I guess that's where the "ultra-paranoid" bit comes in - I thought I was paranoid but I never considered this one). As well as the nightmare scenario of all of our utilities (water, electricity, gas, telephone) being hacked and disrupted, and global finance being completely broken, we need to protect ourselves by finding a way to securely identify users and other computers.
However, all of the techniques we currently use for this can, they say, be defeated. The new quantum computers will crack passwords and certificate keys instantly, and be able to read encrypted data. Even fingerprints and retina scans can be imitated, the program suggested, and so a new way of identifying ourselves - which cannot be replicated - is required.
The NSF article mentions an approach called Rubber Hose Resistant Passwords. I couldn't help getting visions of trying to log on with an elastic stocking by waving a leg in front of some specialist detector, but I'm going to assume that's not the case (I couldn't get the video that explains it to play). But typically our identity will need to be confirmed by some technique that makes use of physical attributes.
In the TV program, they showed an interesting approach using the guitar from the Microsoft Xbox 360 Guitar Hero game. You play a tune several times until the computer has built up a pattern of your timing, mistakes, and responses; and this becomes your physical passkey. You just need to play the same song again (in exactly the same way, of course) to log in. Maybe companies will have a central guitar station where you go to sign into the network every morning. Or, more likely, everyone will turn up for work disguised as an itinerant rock star with a guitar slung across their back, like they showed in the program.
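The enrolment-and-matching idea behind that guitar passkey can be sketched in a few lines. The note timings, tolerance value, and function names below are all invented for illustration - nothing here comes from the actual system shown in the program:

```python
def enroll(performances):
    """Average several practice runs (lists of note timestamps, in seconds)
    into a timing template for the user."""
    n = len(performances[0])
    return [sum(p[i] for p in performances) / len(performances)
            for i in range(n)]

def matches(template, attempt, tolerance=0.08):
    """Accept only if every note lands within `tolerance` seconds of the
    template - i.e. the song is played 'in exactly the same way'."""
    if len(attempt) != len(template):
        return False
    return all(abs(a - t) <= tolerance for a, t in zip(attempt, template))

# Three practice runs of a four-note riff build the template:
runs = [[0.00, 0.50, 1.02, 1.48],
        [0.00, 0.52, 1.00, 1.50],
        [0.00, 0.48, 0.98, 1.52]]
template = enroll(runs)

print(matches(template, [0.00, 0.50, 1.01, 1.49]))  # the owner's rendition
print(matches(template, [0.00, 0.30, 0.80, 1.90]))  # an impostor's attempt
```

A real behavioral biometric would use far richer features (pressure, mistakes, rhythm variance) and a statistical model rather than a fixed tolerance, but the shape of the scheme is the same: many samples in, one template out, fuzzy comparison at login.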
Talking of disguises, I suppose I should keep up my usual tradition of helping to publicize the results of the best joke competition at this year's Edinburgh Festival. Jack: "I'm thinking of going to a fancy dress party disguised as a Mediterranean island." John: "Don't be Sicily!"
Meanwhile, I wonder if I can put in for promotion from just being paranoid to being "ultra-paranoid." Though I doubt it comes with a pay raise...
As summer continues to exhibit its typical level of weather unpredictability here in Ye Olde England, the effects of our harsh spring have faded and everything is in bloom as though nothing untoward had happened. Everywhere you look, the countryside in this part of our green and pleasant land seems to be at its best.
Even the usual populations of bees and butterflies are evident, despite warnings from experts that their existence was under threat. Though there is a marked shortage of people's favourite insect - the red and black spotted ladybird - due, they say, to the low number of aphids compared to previous years. And, although we recently had a spell of very hot and dry weather, even the lawns are looking pristine; while the assorted shrubs in my garden are defeating any attempt to keep them under control. From my "alternative office", a desk in the conservatory, even the dull days are filled with the wonderful sights and sounds of an English summer.
Best of all, however, is the confirmation that our local wildlife has survived the winter and still considers our garden to be a welcome stop on their night-time (and sometimes daytime) travels. The local fox family seems to have produced two cubs this year, rather than the more usual three, and there's evidence of young badgers in the woods next door. Though, so far, none of the light-coloured variety like the one we sadly lost some weeks ago.
As usual, my wildlife camera has been keeping watch. This time I set it to movie mode rather than still picture mode. The results aren't perfect at night because it takes a couple of seconds to start up the infra-red LEDs and stabilize the picture after detecting movement. And, typically, only one in fifty of the movies captures anything of interest - especially when we seem to be on the main route that all of our neighbours' cats use during their night-time constitutionals.
But, in case you are interested, I've posted a short video of extracts. The quality is not great, as the original was over 60 MB and so I reduced the frame size to make it more manageable. See if you can figure out what is shown in the first two clips; something scooting down and back up a small shrub, and then what might be a bat flying slowly past the camera.
According to a news item this week it seems that both CBS television and the City of London may need to do some serious rebranding, and they might even have to rename the Hubble Space Telescope. Though mathematicians and physicists will no doubt be pleased because now they won’t have to invent weird stuff to make their equations work at the instant of creation.
According to the article, some scientists and astronomers are coming round to the opinion of Christof Wetterich at Heidelberg University that there never was a "Big Bang" - instead, Edwin Hubble was wrong and the Universe is pretty much static and is not expanding. Christof suggests that the red shift in the color of stars, which Hubble used as proof of the expansion, is due instead to changes in atomic mass rather than acceleration away from us.
Therefore the Universe couldn't have started from a singularity that exploded. Though they do admit that it might be a combination of the two factors; the Universe is slowly expanding as well as the atoms in it changing their mass. So, maybe there was a Big Bang but - like a car running out of petrol - everything gradually came to a stop. Perhaps this lends support to Stephen Hawking's suggestion that the Universe will stop expanding and start to contract again, leading to the Big Crunch where everything goes back to being a singular point - and then explodes in a new Big Bang.
But I suppose it will make space travel much simpler in the future. Instead of worrying how humans will survive the long journey to the nearest solar system, we can just wait till it comes nearer...
So I finally got round to reading Bill Bryson's book "A Short History of Nearly Everything". OK, so it's not quite as entertaining as some of his travel guides, but it is amazingly full of things that make you go "Wow" and "Can you believe it?" I especially liked the bits about soft drink cans and a man named Norman.
I can vaguely remember learning about Avogadro's Number (the number of molecules in a couple of grams of hydrogen) when I was studying chemistry a great many years ago, and I know it's a big number. A really big number. Much bigger than the kind of numbers we computer people usually play with. And the book also shows how bad we are at explaining how big our numbers are.
For example, we often talk about the lack of publicly available IPv4 addresses and explain that the new IPv6 mechanism will provide enough for everyone on the planet to have a trillion each. Unless you can envisage how many people live on this planet, and what a trillion looks like, it's all pretty uninformative. But when you hear how Avogadro's Number is described in the book you get a much more realistic impression of its size. Supposedly, when converted into the same number of soft drink cans, there would be enough to cover the entire planet with a stack two hundred miles high. Now you really get that "big number" feeling.
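Out of curiosity, the soft-drink-can claim is easy to sanity-check with some back-of-the-envelope arithmetic. The can dimensions and the simple square packing below are my own assumptions, not figures from the book:

```python
# Rough sanity check of the soft-drink-can claim. Assumptions: a standard
# can is about 6.6 cm across and 11.5 cm tall, the Earth's surface area
# is about 5.1e14 square metres, and the cans pack in a square grid.
AVOGADRO = 6.022e23
can_diameter = 0.066          # metres
can_height = 0.115            # metres
earth_surface = 5.1e14        # square metres
stack_height = 200 * 1609.0   # 200 miles in metres

cans_per_square_metre = 1 / can_diameter**2   # one layer, square packing
layers = stack_height / can_height
capacity = earth_surface * cans_per_square_metre * layers

print(f"{capacity:.1e} cans in the stack")
print(f"ratio to Avogadro's number: {capacity / AVOGADRO:.2f}")
```

Under these assumptions the stack holds a few times 10^23 cans - within a factor of two of Avogadro's number, so Bryson's image holds up remarkably well for a popular-science illustration.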
Bill's book also talks a lot about biology, and illustrates how flighty and unreliable we computer programmers are. We casually skip from one technology to another, flip between programming languages, and wander through the forests of patterns and frameworks that make our life easy. It just demonstrates how little real concentration we have compared to the guy named Norman who worked in the Natural History Museum in London.
Norman steadfastly spent forty-two years studying one species of plant, St. John's Wort. And after he retired he still came into work one day a week to continue his task. Imagine what it would be like if you had to spend most of your working life just producing better implementations of the Singleton design pattern.
Still it could be worse. I read in the newspaper this week that the team at CERN using the Large Hadron Collider think they've discovered a whole new Standard Model on which all physics is based, possibly rendering the old one obsolete. You have to feel sorry for Peter Higgs, who's been waiting nearly fifty years for them to find a Higgs boson.
Now someone has to tell him that actually it's all just wiggly bits of string...
This week has been an interesting combination of learning and re-acquaintance opportunities. Learning because I finally got fully switched over to Windows 8, at the same time as discovering how many parts of your body are involved in the simple act of walking.
A couple of weeks ago I suffered a recurrence of the trapped nerve syndrome associated with sciatica, which left me hobbling about with a stick like the old man I guess I'm turning into. The process of walking across a room became a whole new re-acquaintance experience, which clearly identified all of the muscles and tendons that distinguish the upright posture of the human race from the four-legged approach used by most of the rest of the animal kingdom. And it's an experience I really don't want to repeat.
Meanwhile, the concurrent learning experience has been with Windows 8 and Office 365, now that my new company computer has arrived (yes, I did decide it was safe to use despite the warnings from last week). Coincidentally, my non-work email provider upgraded their systems last week so that I'm now on Exchange Server 2013. In both cases it's been like learning to walk again as I figure out how to do things in Windows 8, Office 2013, and the Outlook Web App that were second nature in previous versions.
For example, I prefer to turn off the Preview pane and open messages in a new window to avoid downloading all the crud in the junk emails in my Inbox. However, I can’t get Outlook Web App to open the next message when I close the current one, no matter what option settings I choose. And when I reply to a message it leaves the existing one open instead of closing it automatically. Previous versions of Outlook Web Access managed to do this. And as much as I’ve got used to the Modern interface style, the shortcut menu looks very odd without capitalized words. I’m not sure why, but it seems to make it harder to find the option you want.
In Office 2013 I’ve generally come to terms with the new version of Word, which is the application I use most. But this week I tackled Visio for the first time, and it hurt. Screen updates seem really slow, and often all I get when dragging items is a grey outline. OK, so the computer isn’t the fastest in the world (Intel Core 2 Duo and on-board graphics) but it managed OK in Visio 2010. And just drawing a simple line arrow the first time took ages until I discovered that they’ve moved them to the “Connector” option in the ribbon.
So even after a week of creating documents and schematics I often still feel like I’m lost in Office 2013. Compared to the changes between earlier versions (such as from 2007 to 2010), the latest upgrade sometimes seems like a step too far. Colleagues who have been switched for a while say they find the new version easier to use, and more productive, but I suspect it will take some time before I’m fully competent with it. I wonder if the typical learning process means everything seems harder at first as you try to do things the way you’re accustomed to, and before you discover the new way to do them that provides the productivity increase.
But I am converting nicely to Windows 8, though like many people I do miss the Start button in the desktop. Hitting the Windows key and then Windows key + Q to get the search box just to start an application seems a retrograde step, and I’m looking forward to the 8.1 refresh that will solve this. Otherwise, all was going really well until I came to install our custom Word add-in that we use to generate p&p documents. The installer politely informed me that it needed .NET 3.5, and helpfully provided a link to download it. Except that you can’t do this on Windows 8; you have to enable .NET 3.5 in the “Turn Windows features on and off” section of the Programs and Features dialog.
So I do this and get error 0x800F0906 (download failure). It seems that it's a common problem with .NET 3.5 and many other Windows 8 features if, like me, you use Windows Server Update Services (WSUS) to manage patching machines on your network. The solution (and a description of why it occurs) is provided in this blog post. You need to change the Group Policy setting named "Specify settings for optional component installation and component repair", which is located under Computer Configuration\Administrative Templates\System, to Enabled and then set the "Contact Windows Update directly to download repair content instead of Windows Server Update Services (WSUS)" checkbox. If your domain controllers still run Windows Server 2008 you'll have to apply the Group Policy setting locally on each Windows 8 computer.
And if you are considering choosing between sciatica and a Windows/Office upgrade, I'd suggest the latter. It's far less painful…