Random Disconnected Diatribes of a p&p Documentation Engineer
After the problems with network location ignorance the other week, and being an inquisitive type, I decided to dig a little deeper and see if I could identify why my server was unsure about the type of network it was connected to. For some while I've had occasional issues with web browsing where page requests immediately throw up an error that a URL cannot be found, but refreshing the page in the browser works fine. And, of course, the odd Event Log message that "Name resolution for the name [some domain name] timed out after none of the configured DNS servers responded."
I keep coming back to the conclusion that there is a DNS error somewhere in my network setup, but I've never been able to trace it. However, after some experimenting with nslookup I discovered that querying for a domain outside my network without adding a period (.) to the end of the domain name resulted in a response giving that domain name with my own FQDN domain appended to it, and it always resolved to the address 22.214.171.124. This seems wrong. For example, querying for "microsoft.com" returns "microsoft.com.[my-own-domain].com" with the IP address 126.96.36.199, whereas it should return something like 188.8.131.52. But appending a period to the query ("microsoft.com.") gives the correct result.
So I query the weird IP address and it resolves to "advancedsearch.virginmedia.com". Which, if you query it as a DNS server ("nslookup microsoft.com advancedsearch.virginmedia.com") just times out. It isn't a DNS server. I use NTL Business Cable as one of my ISPs, and they are a branch of Virgin Media, so I can see where the IP address is coming from. I also have two valid Virgin Media DNS servers I can use, so I repeat the lookups specifying one of these to try and discover where the strange behavior is coming from.
It turns out that the Virgin Media DNS servers have a neat trick: if they receive a request for a domain they can't find, they automatically return the IP address of the Virgin Advanced Web Search page. Because the resolver retries a failed lookup by appending the machine's own domain suffix (I assume this happens because the default setting in the DNS tab of the network connection properties is to append the primary suffixes when resolving unqualified names), the Virgin Media DNS server responds to that retry with the requested domain name plus the machine's FQDN, and the IP address of the advanced web search page.
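If you fancy seeing the suffix trick in code, here's a minimal Python sketch of the behavior. The function name, the suffix list, and the ordering of the candidates are my own invention for illustration - this isn't the Windows resolver's actual logic - but it shows why the trailing period short-circuits the whole game:

```python
def candidate_names(name, search_suffixes):
    """Mimic resolver suffix handling: a trailing period marks the
    name as fully qualified, so no suffixes are appended."""
    if name.endswith("."):
        return [name]
    # Unqualified name: try it with each configured suffix appended,
    # then fall back to the bare name itself.
    return [f"{name}.{suffix}." for suffix in search_suffixes] + [name + "."]

# "microsoft.com" becomes a lookup for "microsoft.com.my-domain.com." -
# which a "helpful" ISP server resolves to its search page instead of
# returning a non-existent domain error.
print(candidate_names("microsoft.com", ["my-domain.com"]))
print(candidate_names("microsoft.com.", ["my-domain.com"]))
```

With the trailing period the candidate list collapses to the single fully qualified name, which is why "microsoft.com." gave me the correct answer.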
My internal DNS servers have forwarders set up for resolution of external domain names, and I had added the two Virgin Media DNS servers to the list along with the DNS servers of my other ISP (British Telecom). Repeating the tests against the BT DNS servers shows that they don't do any fancy tricks. Looking up a non-existent domain simply returns a "Non-existent domain" message. So I removed the Virgin Media DNS servers from the list of forwarders in my internal DNS, and that stopped the weird "advanced search" behavior from happening. And, so far, it also seems to have stopped the problems of failed lookups and "Name resolution timed out after none of the configured DNS servers responded" errors in the server's event logs.
Removing the Virgin Media DNS servers from the list of forwarders also looks like it has stopped the occurrence of excessive non-TCP requests being sent out onto the net from my domain controllers. My ISA Server occasionally reported that they were opening more than the maximum configured number of non-TCP connections, and these turned out to be DNS lookups. But, of course, it could all just be a wild coincidence.
But I can't help wondering why, for a business connection where you'd expect people to run their own DCs and DNS servers, they decided it's a good idea to return the address of a web search page from a DNS lookup query. Perhaps they get paid a bonus for each click-through...
Footnote: If you are looking for alternative DNS servers to use, you might like to try the Google ones (see http://code.google.com/speed/public-dns/). I'm using these at the moment, with no problems detected so far.
I suppose it isn't easy switching a whole country over from old-fashioned analogue TV to the wondrous delights of multi-channel digital, but here in Britain it's hard to see how they could have made more of a pig's ear of it. Not only is it taking five years, but they seem to be doing it as an unpredictable series of updates; each of which is just significant enough to be really annoying. It's as though they want people to switch over to cable or BSkyB.
For example, sometimes they move a few of the channels around without telling us so that I get an earful when Coronation Street doesn't get recorded. Or they reduce the power on one of the multiplexes for testing so that a random selection of channels turns into a series of static and pixelated scenes that only update every ten seconds. Then, occasionally, they change the channel name just enough so that the Media Center guide listings don't match and my wife misses Emmerdale.
OK, so they assure us that when we reach the magical "final switchover" date and they turn off the analogue signal the digital transmitter power will be increased ten-fold. Or maybe, as my aerial/dish man suggested, four-fold. He also suggested it probably won't make any difference to me due to my "fringe location", and I don't think he was talking about the result of my last haircut. I suppose it's a bit like when they assured us that paying into a pension scheme all your working life would provide for "a comfortable retirement".
Anyway, I finally reached the end of my terrestrial tether last month after re-tuning Media Center twice in a week and still only getting half of the usual channels; and decided to install a DVB-S satellite card instead. We already have a dish hanging off the side of the house (see I Need a Wii...) so the cost of the upgrade was reasonably minimal and worth experimental evaluation.
Unfortunately our Media Center is old enough to have only a PCI socket, not one of those groovy new PCI/e ones, and there seems to be only one dual-tuner satellite card around now (the Pinnacle PcTv 4000i) that doesn't need the "/e" thing. But after I upgraded the Media Center from Vista to Windows 7 and slotted in the new satellite card it found drivers on Windows Update and all seemed to be hunky dory. It "just worked" and now we have over 300 channels of garbage instead of the forty-something that (on a good day) our terrestrial digital antenna and DVB-T card could find. Amazing. And Media Center in Windows 7 is really easy to set up compared to Vista, detecting the correct satellite and channels automatically.
The one snag is that, after the machine resumes from sleep mode, the new card's tuners often aren't detected until the drivers and the Media Center services have been restarted. It could well be that my computer is partly to blame - like me it's no Spring chicken and might not be equipped with all the current bells and whistles necessary to fully support suspend mode for complex drivers and multimedia subsystems. Installing the latest version of the PcTv TV card drivers did minimize the occurrence of the problem, so obviously Pinnacle recognized the issue and made efforts to fix it (just a shame they didn't put the latest drivers on Windows Update).
However, determined to finally resolve this issue, I followed the various forum threads and blog posts describing different solutions. In the end, I decided that a custom solution was the only way round the problem and set about creating the necessary bits of code. And discovered just how awkward coping with recalcitrant devices and suspend modes can be...
You can restart a device using Microsoft's DevCon utility, and there are utilities available on the web that can execute scripts and programs when the computer sleeps, hibernates, and wakes up. Batch files can stop and start the required applications and services so that the device can properly reset. However, you need to run DevCon and stop/start the services under the context of an administrative account, whereas - if you follow good practice and run under a standard user account - you have to restart the Media Center shell under the context of the current user (or else you can't see it!). So you have a problem of how to switch execution context for the different operations.
After much experimentation, I combined several of the techniques described to create a utility that does the job. I created a Windows Service that runs under the LOCALSYSTEM account, detects suspend and resume events, and runs scripts when they occur. It also writes messages to the Application Event Log when the events are detected. The service uses a configuration file to specify the scripts/programs to execute and the maximum timeout. It also has a "debug" mode that outputs additional information.
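At its heart the service is little more than a dispatch table from power events to scripts. Here's a minimal sketch of that idea in Python - the configuration keys, script names, and timeout values are made up for illustration (the real thing is a .NET Windows Service driven by an XML configuration file):

```python
import subprocess

# Illustrative stand-in for the service's configuration file: which
# script to run for each power event, and the maximum seconds to allow.
CONFIG = {
    "suspend": {"script": "on-suspend.cmd", "timeout": 30},
    "resume":  {"script": "on-resume.cmd",  "timeout": 60},
}

def handle_power_event(event, run=subprocess.run):
    """Look up the script configured for a suspend or resume event
    and run it, enforcing the configured timeout."""
    entry = CONFIG[event]
    # The real service also writes an Application Event Log entry here,
    # before and after the script executes.
    return run([entry["script"]], timeout=entry["timeout"])
```

The `run` parameter is injected only so the dispatch logic can be exercised without actually launching scripts; the real service just runs the configured command under the LOCALSYSTEM account.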
The trick is to allow the scripts that run under the system account to do the stuff that requires administrative privileges, such as stopping and starting the Media Center scheduler and receiver services, and restarting the device and its drivers. The service writes an event with ID=4 when it has finished executing the "resume" script, so you can create a task in Task Scheduler that is triggered when this event is added to the Application Event Log. The task is configured to run as a specific user (in the case of our Media Center, my wife's domain account) and it executes another batch file that uses the start command to start the Media Center shell maximized.
So the sequence of events is:
1. The service, running under the LOCALSYSTEM account, detects the suspend event and runs a script that stops the Media Center scheduler and receiver services.
2. On resume, the service runs a script that restarts the TV card device and its drivers (using DevCon) and then restarts the services.
3. When the "resume" script completes, the service writes an event with ID=4 to the Application Event Log.
4. The Task Scheduler task triggered by that event runs, under the current user's account, a batch file that starts the Media Center shell maximized.
You can download the service as a Visual Studio 2010 project and as a compiled assembly, together with the scripts and other files, if you want to try this on your own system.
The main problem with all this background activity is that the resume operations take anything between ten and twenty seconds to complete, and if you "Green Button" to manually start the Media Center Shell too early it may not detect the tuners if the Receiver service is not fully started. So now I use a desktop background that has a large "Please wait" message to warn impatient people with itchy fingers who think computers should work just like other domestic appliances. And you probably want to consider turning the attached sound system off when you're not watching anything or you'll be serenaded by the charmingly ethereal Media Center start-up tune when it decides to wake up in the middle of the night to record some seriously bad 1960's SciFi movie.
All in all, it seems a lot of effort to get round a glitch in a combination of software, hardware, drivers, BIOS, and operating system that doesn't always behave as it should. Though a regular recycling of the scheduler and receiver services and the TV card drivers doesn't seem like a bad idea. According to many blogs, that's pretty much what the optional "Daily Optimize" feature in Media Center does anyway.
Of course, the extremely non-typical sense of smug satisfaction resulting from having resolved any issue even remotely connected to computers was soon dissipated. Just nine days after finally completing coding, debugging and testing my solution to ensure it was performing perfectly, the BBC effectively did the "/e" thing by moving all its HD channels from DVB-S to DVB-S2. Which my satellite card can't receive.
I wonder if I can make a pair of external USB satellite receivers work with Media Center....?
Everybody loves a Terry Pratchett quote, so I'll start this week with "In the beginning there was nothing, which exploded". It came to mind as I read in the science section of the newspaper about how those amazing people at CERN in Switzerland have managed to create a (rather small) handful of anti-hydrogen atoms, and then kept them alive for a little over 16 minutes. Before they, too, exploded.
It seems like they are trying to answer the question about where all the anti-matter that must have been around at the beginning of the Universe (when all of the nothing exploded) has gone. Theory says there should be equal amounts of matter and anti-matter, but the anti-matter has just disappeared. Or, at least we haven't found any yet; which is probably a good thing because - rather like Linux and Windows users at a computer conference - if you let matter and anti-matter get too close together they violently interact. Then cancel each other out and disappear with a loud bang.
Of course, Terry Pratchett also has a highly rational theory for why there is not enough matter lying around in the Universe. He explained that, according to current scientific theory, nine-tenths of the mass of the Universe is unaccounted for (as I mentioned in Reference to the Universe Class some while ago). The missing nine tenths is, in fact, just the paperwork. And because you can never find the paperwork, we're wasting our time looking for it.
Anti-matter is, as the name suggests, the opposite of matter. Instead of having a positive nucleus and negative electrons, it has positive electrons (positrons) and a negative nucleus. But what I want to know is how they can be sure that the stuff we have in our dimension of the space-time continuum isn't actually "anti-matter", and what they made is really "matter". How do they know that what we are made of is the real one?
It seems obvious that people living in one of the other parallel dimensions in the Universe would think that they have the real "matter". And probably all of the other stuff missing from our Universe as well, which they use in anti-matter engines to power their Starships and Battle Cruisers. Though I just remembered that's in Star Trek, so it may not actually be real.
As to the lack of anti-matter in our Universe, I wonder if the reason they haven't found it yet is because we in the IT industry are hiding it all in our software, hardware, documentation, and networks. Obviously much of what's on the internet today doesn't really matter at all to anyone, and I could easily be convinced that nine tenths of the software on my computer exhibits the same characteristic. The fact that I need over one and a half million bytes of program just to write this post must be an indication of how much anti-code it contains (to give you an idea of how big a number that is, bear in mind that one and a half million days ago they were still building the pyramids in Egypt).
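A quick back-of-the-envelope check on that pyramid claim, in case you think I'm exaggerating (the day count comes from the paragraph above; the rest is simple division):

```python
days = 1_500_000
years = days / 365.25            # average days per year
# Roughly 4,100 years, which lands us around 2100 BC - comfortably
# within Egypt's pyramid-building era.
print(round(years))
```

So a byte a day since the pyramids still wouldn't add up to the program I'm typing this into.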
Of course, we all know that - unless you are actually a genius programmer - your own code also contains a welter of nasty stuff such as anti-variables and anti-functions that automatically adopt or return exactly the opposite result you expect. And, of course, your code could well be riddled with anti-patterns as well. Meanwhile, in terms of documentation, it's not untypical to discover help files that contain everything except the stuff you need to know (anti-information), including a ton of content that has no real meaning and would not matter a jot if omitted.
In fact the project I'm currently working on is displaying the after-effects of anti-matter content in terms of documentation. We're updating a guide about Windows Azure to reflect the current release, and removing all the anti-content that has turned out to be unnecessary, misleading, and in one or two cases just plain wrong. Of course, when we wrote the original version it was completely anti-matter-free. It's just that the changes between the beta version of Azure that we used, and the current release version, have converted parts of the content into anti-documentation.
But maybe there's a lesson here for our heroic group of scientists at CERN. Instead of needing a multi-billion dollar particle accelerator and massive magnetic fields to capture some wonderful new anti-particle, they could just sit around drinking coffee and wait for the release version to come out...
They've been advertising the book "In the Land of Invented Languages" by Arika Okrent on The Register web site for a while, and I finally caved in and bought a copy. And I have to say it's quite an amazing book. It really makes you think about how languages have evolved, and how we use language today. It even contains a list of the 500 most well-known invented languages; and a whole chapter that explores the origins and syntax of Klingon.
Even the chapter titles tempt you to explore the contents. There's a whole chapter devoted to the symbolic language representation for human excrement (though the word they use in the title is a little more graphic), and another called "A Calculus of Thought" that describes mathematical approaches to and analysis of language. Though the chapter title I liked best is "A Nudist, a Gay Ornithologist, a Railroad Enthusiast, and a Punk Cannabis Smoker Walk Into a Bar...". Meanwhile the chapter on Klingon explains that "Hab SoSlI' Quch" is a useful term for insulting someone ("Your Mother has a smooth forehead").
The book ranges widely over topics such as how languages work, and the many different ways that people have tried over the years to categorize languages into a set of syntactic representation trees that separate the actual syntax from the underlying meaning. A bit like we use CSS to separate the UI representation from the underlying data in web pages. It raises an interesting point that, if every language could be categorized into a tree like this, translation from one to another should be really easy.
For example, the problem we have with words such as "like" that could mean two completely different things ("similar to" or "have affection for") would go away because the symbolic representation and the location within the syntax tree would be different for each meaning. Except that you'd have to figure out how to convert the original text into the symbolic tree representation first, so it's probably no advantage...
But it struck me that the book makes little mention of the myriad invented languages that we use in the IT world every day. Surely Visual Basic, C#, Smalltalk, LISP, and even HTML and CSS are invented languages? OK, so we tend to use them to talk to a machine rather than to each other (though I've met a few people who could well be the exception that proves the rule), but they are still languages as such. And the best part is that they already have a defined symbolic tree that includes every word in the language, because that's how code compilers work.
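Python makes this point easy to demonstrate, because its compiler will hand you that symbolic tree directly. A small sketch (any language with an accessible parser would do just as well):

```python
import ast

# Parse the classic one-liner into the compiler's symbolic tree.
tree = ast.parse('print("Hello World")')
call = tree.body[0].value

# The tree records *what* is meant - a call to a named function with
# one string argument - independently of how it was spelled.
print(type(call).__name__)   # Call
print(call.func.id)          # print
print(call.args[0].value)    # Hello World
```

Swap the surface keywords for French ones and, provided the parser knew about them, the tree underneath could stay exactly the same - which is really the book's syntax-versus-meaning argument in miniature.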
However, it seems that our computer-related invented languages are actually resolutely single-language in terms of globalization. A web search for print("Hello World") returns 12,600,000 matches, whereas imprimer("Bonjour tout le monde") finds nothing even remotely related. It looks as though, at least in the IT world, we are actually forcing everybody to learn US English - even if it's only computer language keywords.
Does this mean that computer programming is harder for people whose first language is not English, because they need to learn what words like "print", "add", "credential", "begin", "file", and more actually mean before they can choose the correct keywords? Or does learning a language such as Visual Basic or C# make it easier to learn English as a spoken language? Are there enough words in these computer languages to make yourself understood if those are the only English words you know? I guess it would be a very limited conversation.
So maybe we should consider expanding the range of reserved words in our popular computing languages to encompass more everyday situations. Working on the assumption that, in a few years' time everyone will need to be computer literate just to survive, eventually there would be no need for language translation. We could just converse using well-understood computer languages.
Finally let Me End this POST DateTime.Now && JOIN Me Next(Week) To continue...